

Exploring the World of Live XR Theater

The last three years may feel as though they’ve gone by pretty quickly. A few short years ago, we were seeing an explosion of interest and production in XR theater and live virtual entertainment. The pandemic had emptied a lot of theaters, leaving audiences and entertainers alike in need of new outlets.

Now it’s 2023. Theaters are open again. But that doesn’t mean XR theater has gone anywhere. Far from being a temporary fix to carry us through a period of isolation, live VR entertainment is stronger than ever. It remains a way to explore new avenues of storytelling and even bring new audiences into traditional entertainment venues.

Understanding Immersive Theater

Before we dive in, a quick clarifying note may be required. While some readers will hopefully come from a theater background, most are likely more familiar with XR terminology, so one particular term might be confusing.

When ARPost describes an experience as “immersive,” we’re usually talking about a 3D virtual environment that is spatially explored either by physical movement in augmented or mixed reality, or through spatial navigation in virtual reality. However, XR does not have a monopoly on the word.

“Immersive theater” is a term from the live entertainment world that far predates XR and XR theater. In this form of immersive theater, participants converse with actors, manipulate props, and physically move through sets that might take up an entire building. While the pandemic played a part in the growth of XR theater, its roots are in immersive theater.

“Due to our familiarity with the genre of immersive theatre, and some of our team members had prior experience performing in and being audience members in VR theatre shows, the transition from in real life (IRL) to VR was very natural,” Ferryman Collective founding member Stephen Butchko told ARPost.

Ferryman Collective, one of the premier production companies in XR theater, was founded during the pandemic, but its founding members had already been performing immersive theater in live venues for years. In fact, one of Ferryman Collective’s first major productions, The Severance Theory: Welcome to Respite, began life as an in-person immersive theater production.

From Gaming to XR Theater

The Under Presents, released in 2019, might be the first major piece of XR theater. Tender Claws, the development studio behind the production, had been exploring innovative digital productions and engagements for four years already, but The Under Presents is where our story begins.

The experience, built as a game that sometimes featured live actors, introduced countless viewers to live XR theater. It also inspired other artists at a time when the theater community was in dire need of something new and different.

“Born out of the Pandemic”

“Ferryman Collective was born out of the pandemic and brought together by the magic of The Under Presents, or ‘TUP’, as we affectionately call it,” Ferryman Collective founding member Deirdre Lyons told ARPost. “The pandemic shut everything down in 2020 except for TUP, as people performed and participated in it from home.”

In 2019, Lyons was one of Tender Claws’ VR actors – a job that she still holds while also producing, directing, and acting in productions by Ferryman Collective. A number of Ferryman Collective’s members met while working on TUP.

The live show was only supposed to run for three months, but the run was extended due to its high popularity. The live component of the app and game was eventually closed, leaving actors free to work on other projects, with Tender Claws’ second major XR theater production, Tempest, coming out the following year.

Ferryman Collective’s first production, PARA, a horror story about a dubious AI startup, came out in the autumn of 2020. The show was written by Braden Roy, and was directed by Roy and Brian Tull, who had also met working on TUP. Roy also wrote Ferryman Collective’s second production, Krampusnacht, directed by Roy, Tull, and Lyons in the winter of 2020-2021.

XR Theater Meets Immersive Theater

Ferryman Collective learned a lot from PARA and Krampusnacht. The latter got the collective their first award nomination, with a run that was extended four times to keep up with interest. However, the collective’s breakout production was The Severance Theory: Welcome to Respite – an XR adaptation of a pre-pandemic live immersive theater production.

“Having experienced quiet moments of contemplation with other audience members within my experience as an actor on TUP, I knew that this medium had the potential for a profound connection,” said Lyons. “Having done some voiceover work on The Severance Theory: Welcome to Respite […] I felt this piece could be that kind of powerful experience in VR.”

Lyons reached out to the play’s creator, Lyndsie Scoggin, who had also been sidelined by the pandemic. Scoggin went from not owning a headset to writing and directing the XR theater adaptation, which took on a life of its own.

“The IRL version of [Welcome to Respite] was performed for one audience member who plays a seven-year-old kid named Alex,” Butchko told ARPost. “In the VR version, we are able to include up to nine additional audience members who are put into invisible avatars and play the alternate aspects of Alex’s personality, the Alter Egos.”

Bringing in Participants

Ferryman Collective’s approach to Welcome to Respite brings in more participants per show, but it also allows the participants to control the story as a group: each one gets a vote to determine the actions taken by the singular Alex over the course of the play.

Expanding the scale of XR theater audiences is one of the pioneering pursuits of “scrappy storyteller” Brandan Bradley. Bradley has been exploring XR since 2017 but really dove into it during the pandemic. During this time he has launched his own projects and XR theater productions and has also acted in productions by Ferryman Collective.

“The pandemic brought about this collision of my two loves: interactive media and fine arts,” Bradley told ARPost in a 2020 interview.

NON-PLAYER CHARACTER - a VR Musical - Brandan Bradley

Bradley’s current production, NPC, brings in a group decision dynamic similar to Welcome to Respite. Bradley plays a side character in a video game who, when the main character dies, turns to the audience for guidance. The audience consists of four “on-stage” participants who interact with him directly, and a larger “seated audience” that watches the action unfold.

Expanding the Audience

Splitting the audience like this does a number of things for Bradley. Traditional immersive theater experiences might only have the participating audience – and most XR theater still works that way. From a strictly box office perspective, bringing in the “seated audience” allows Bradley to sell significantly more tickets per performance.

There’s also an audience accommodation aspect. While the “seated audience” might be interested in seeing a story that is shaped by the audience, shaping the story themselves might not be their cup of tea. Further, the “seated audience” can join on more widely affordable and available devices – including a web browser.

“There is a large contingency of the audience that enjoys a more passive role – like a Twitch chat come to life,” Bradley told me over coffee at AWE. “My mom, who will never put on goggles, is willing to join on the keyboard.”

Bradley’s OnBoardXR – a sort of workshop and venue for XR entertainers to begin developing and testing live performances – uses a similar ticketing model. In a lobby, audience members choose different avatars to signal to the actors the degree to which they feel comfortable participating.

NPC and OnBoardXR take place in-browser and can be joined in a headset, on a desktop, or even on a mobile phone. Ferryman Collective performs in VRChat for similar options. This is a departure from Tender Claws’ VR-only productions.

“All of us would love to be The Under Presents […] but the price point is outrageous and the timetable is untenable for someone who just wants to keep producing […] we’re kind of ‘Off Broadway,’” said Bradley. “This is the balance that we’re all doing. There are things we would all love to do with more robust tools […] right now it’s more important to have more participants.”

Exploring Affordances

Anytime anything is brought into virtual reality, there are benefits and barriers. Live theater is no different. Set and prop design, construction, and storage can be a lot easier – so much so that no XR production ever needs to be permanently ended. A show can be revived at any time because everything exists as files rather than physical objects that must be stored.

However, physicality and expression can be a trade-off. A character may be fantastically designed for VR, but controlling it and expressing through it isn’t always easy – even with special avatars with controller-activated expressions.

“Emotions within the scene must be conveyed through the actor’s voice and sometimes stylized gestures[…],” said Butchko. “Things that we cannot do easily or convincingly are eat, drink, and lay down. Those were all found in the IRL production of [Welcome to Respite], but could not be used in the VR version due to technical limitations.”

Further, if you’re still comparing XR theater with a typical play instead of immersive theater, there are a few more details that you might have missed. Some in-person immersive theater involves physical contact between actors and participants, or at least involves participants physically interacting with sets and props.

“Not all immersive shows have physical actor-audience contact but there’s still the physicality of the structure and props that can’t be replicated without building a physical space,” Tull told ARPost. “Smell and taste are noticed less, though the potpourri of an old mansion or a gin and tonic at a seedy speakeasy go a long way in completing the illusion.”

Tull further commented that, even when “physical actor-audience contact” is involved, “the visual immersion of virtual reality can almost replicate the intimacy of actual touch.” I certainly found this to be the case.

Exploring Emotion

As a participant in Ferryman Collective’s Gumball Dreams, I had an actor reach out and virtually put his hand on my chest. If an actor had physically done this in an IRL production, I dare say it would have made me immensely uncomfortable. But, in VR, it came across as intended – a moving, intimate gesture between characters in a story.

Gumball Dreams has an amusing name and a brightly colored and stylized virtual world. However, the actual story is an incredibly moving exploration of mortality and consciousness. Similar themes exist in NPC, while Welcome to Respite explores the experience of psychological disorders. What makes XR theater so conducive to these heavy topics?

“At a media level, when you’re inviting the kind of immersion that VR affords, you want to do more than just comedy,” said Bradley. “There is an emotional intimacy that we experience in VR that we haven’t experienced anywhere else and don’t have words for and that’s the next degree of the storytelling experience.”

In this year’s AWE panel discussion on “XR Entertainment: The Next Generation of Movie Makers and Performers”, Ferryman Collective performer and producer Whitton Frank gave a description of XR theater that also explains its draw for audiences and entertainers alike.

“You are given a character and you are a part of the play […] you’re having emotional experiences with another human being which is why, I think, people get excited about this,” said Frank. “That is the way forward – to show people the world in a way that they haven’t seen it before.”

Find an XR Theater Experience

So, how do you know when and which XR theater experiences are available? It’s still a pretty niche field, but it’s close-knit. Start out by following groups like Tender Claws, OnBoardXR, and Ferryman Collective. Then (before or after the show), talk to the other audience members. Some will likely be new to it themselves, but others will be able to point you in the right direction.



Challenges Behind Applying Real-World Laws to XR Spaces and Ensuring User Safety

Immersive technologies bridging the gap between the physical and digital worlds can create new business opportunities. However, they also give rise to new challenges in regulation and in applying real-world laws to XR spaces. According to a World Economic Forum report, we have been relatively slow to innovate new legal frameworks for emerging technologies like AR and VR.

Common Challenges of Applying Laws to AR and VR

XR technologies like AR and VR are already considered beneficial and are used in industries like medicine and education. However, XR still harbors risks to human rights, according to an Electronic Frontier Foundation (EFF) article.

Issues like data harvesting and online harassment pose real threats to users, and self-regulation when it comes to data protection and ethical guidelines is insufficient in mitigating such risks. Some common challenges that crop up when applying real-world laws to AR and VR include intellectual property, virtual privacy and security, and product liability.

There’s also the need for a new framework tailored to fit emerging technologies, but legislative attempts at regulation may face several hurdles. It’s also worth noting that while regulation can help keep users safe, it may also potentially hamper the development of such technologies, according to Digikonn co-founder Chirag Prajapati.

Can Real-World Laws Be Applied to XR Spaces?

In an interview with IEEE Spectrum in 2018, Robyn Chatwood, an intellectual property and information technology partner at Dentons Australia, gave an example of an incident that occurred in a VR space where a user experienced sexual assault. Unfortunately, Chatwood remarked that there are no laws saying that sexual assault in VR is the same as in the real world. When asked when she thinks these issues will be addressed, Chatwood remarked that, in several years, another incident could draw more widespread attention to the problems in XR spaces. It’s also possible that, through increased adoption, society will begin to recognize the need to develop regulations for XR spaces.

On a more positive note, the trend toward regulations for XR spaces has been changing recently. For instance, Meta has rolled out a minimum distance between avatars in Horizon Worlds, its VR social media platform. This boundary prevents other avatars from getting into your avatar’s personal space. The system works by halting a user’s forward movement as they approach that boundary.
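For a rough sense of how such a mechanic can work, here is a minimal sketch of personal-space enforcement. The radius value and function names are illustrative assumptions for this article, not Meta’s actual implementation.

```python
import math

PERSONAL_SPACE_RADIUS = 1.2  # illustrative radius in meters; not Meta's real value

def clamp_forward_movement(pos, step, other_avatars):
    """Cancel a forward step if it would carry this avatar inside
    another avatar's personal-space boundary."""
    next_pos = (pos[0] + step[0], pos[1] + step[1])
    for other in other_avatars:
        if math.dist(next_pos, other) < PERSONAL_SPACE_RADIUS:
            return pos  # halt movement at the boundary
    return next_pos

# A step toward a nearby avatar is halted; a step away goes through.
print(clamp_forward_movement((0.0, 0.0), (0.5, 0.0), [(1.0, 0.0)]))   # (0.0, 0.0)
print(clamp_forward_movement((0.0, 0.0), (-0.5, 0.0), [(1.0, 0.0)]))  # (-0.5, 0.0)
```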

There are also new laws being drafted to protect users in online spaces. In particular, the UK’s Online Safety Bill, which had its second reading in the House of Commons in April 2022, aims to protect users by ensuring that online platforms have safety measures in place against harmful and illegal content and covers four new criminal offenses.

In the paper, The Law and Ethics of Virtual Assault, author John Danaher proposes a broader definition of virtual sexual assault, which allows for what he calls the different “sub-types of virtual sexual assault.” Danaher also provides suggestions on when virtual acts should be criminalized and how virtual sexual assault can be criminalized. The paper also touches on topics like consent and criminal responsibility for such crimes.

There’s even a short film that brings to light pressing metaverse concerns. Privacy Lost aims to educate policymakers about the potential dangers, such as manipulation, that come with emerging technologies.

While many legal issues in the virtual world are resolved through criminal courts and tort systems, according to Gamma Law’s David B. Hoppe, these approaches lack the necessary nuance and context to resolve such legal disputes. Hoppe remarks that real-world laws may not have the specificity to tackle new privacy issues in XR spaces, and shares that a more nuanced legal strategy and tailored legal documents are needed to help protect users in XR spaces.

Issues with Existing Cyber Laws

The novelty of AR and VR technologies makes it challenging to implement legislation. However, for users to maximize the benefits of such technologies, their needs should be considered by developers, policymakers, and organizations that implement them. While cyber laws are in place, persistent issues still need to be tackled, such as challenges in executing sanctions for offenders and the lack of adequate responses.

The United Nations Office on Drugs and Crime (UNODC) also cites several obstacles to cybercrime investigations: user anonymity afforded by technology; attribution, or determining who or what is responsible for the crime; and traceback, which can be time-consuming. The UNODC also notes that the lack of coordinated national cybercrime laws and international standards for evidence can hamper cybercrime investigations.

Creating Safer XR Spaces for Users

Based on guidelines provided by the World Economic Forum, there are several key questions that legislators should consider. These include how laws and regulations apply to XR conduct governed by private platforms and how rules can apply when an XR user’s activities have direct, real-world effects.

The XR Association (XRA) has also provided guidelines to help create safe and inclusive immersive spaces. Its conduct policy tips to address abuse include creating tailored policies that align with a business’ product and community and including notifications of possible violations. Moreover, the XRA has been proactive in rolling out measures for the responsible development and adoption of XR. For instance, it has held discussions on user privacy and safety in mixed reality spaces, zeroing in on how developers, policymakers, and organizations can better promote privacy, safety, and inclusion, as well as tackle issues that are unique to XR spaces. It also works with XRA member companies to create guidelines for age-appropriate use of XR technology, helping develop safer virtual spaces for younger users.

Other Key Players in XR Safety

Aside from the XRA, other organizations are also taking steps to create safer XR spaces. X Reality Safety Intelligence (XRSI), formerly known as X Reality Safety Initiative, is one of the world’s leading organizations focused on providing intelligence and advisory services to promote the safety and well-being of ecosystems for emerging technologies.

It has created a number of programs that help tackle critical issues and risks in the metaverse, focusing on aspects like diversity and inclusion, trustworthy journalism, and child safety. For instance, the organization has shown support for the Kids PRIVACY Act, legislation that aims to implement more robust measures to protect younger users online.

XRSI has also published research and shared guidelines to create standards for XR spaces. It has partnered with Standards Australia to create the first-ever Metaverse Standards whitepaper, which serves as a guide for standards in the metaverse to protect users against risks unique to the metaverse. These are categorized as Human Risks, Regulatory Risks, Financial Risks, and Legal Risks, among other metaverse-unique risks.

The whitepaper is a collaborative effort that brings together cybersecurity experts, VR and AR pioneers, strategists, and AI and metaverse specialists. One of its authors, Dr. Catriona Wallace, is the founder of the social enterprise The Responsible Metaverse Alliance. Cybersecurity professional Kavya Pearlman, the founder and CEO of XRSI, is also one of its authors. Pearlman works with various organizations and governments, advising on policymaking and cybersecurity to help keep users safe in emerging technology ecosystems.

One such issue highlighted by the XRSI is the risk that comes with XR data collection in three areas: medical XR and healthcare, learning and education, and employment and work. Its report highlights how, as emerging technologies create new privacy and safety concerns, risks such as the lack of inclusivity, the lack of equality in education, and the lack of experience in using data collected in XR spaces are cropping up.

In light of these issues, the XRSI has created goals and guidelines to help address these risks. Some of the goals include establishing a standards-based workflow to manage XR-collected data and adopting a new approach to classifying such data.

The EU is also taking steps to ensure data protection in emerging technologies, with new EU laws aiming to complement the GDPR’s requirements for XR technologies and services. Moreover, the EU data protection law applies to most XR technologies, particularly for commercial applications. It’s possible that a user’s explicit consent may be required to make data processing operations legitimate.

According to the Information Technology & Innovation Foundation (ITIF), policymakers need to mitigate so-called regulatory uncertainty by making it clear how and when laws apply to AR and VR technologies. The same ITIF report stresses that they need to collaborate with stakeholder communities and industry leaders to create and implement comprehensive guidelines and clear standards for AR and VR use.

However, while creating safer XR spaces is of utmost importance, the ITIF also highlights the risks of over-regulation, which can stifle the development of new technologies. To mitigate this risk, policymakers can instead focus on developing regulations that help promote innovation in the field, such as creating best practices for law enforcement agencies to tackle cybercrime and focusing on funding for user safety research.

Moreover, the ITIF also provides some guidelines regarding privacy concerns from AR in public spaces, as well as what steps leaders and policymakers could take to mitigate the risks and challenges that come with the use of immersive technologies.

The EFF also shares that governments need to enact or update data protection legislation to protect users and their data.

There is still a long way to go when applying real-world laws to XR spaces. However, many organizations, policymakers, and stakeholders are already taking steps to help make such spaces safer for users.



Highlighting the Top 3 XR Trends of 2023 [Insights From This Year’s AWE USA]

The 2023 edition of AWE USA not only lived up to its reputation but also reached new heights, reportedly marking its largest event to date. From cutting-edge hardware to new, groundbreaking technology and software solutions, this year had it all.

3 Trends That Will Shape the Future of XR

Let’s dive in and explore the three main trends that stood out and are bound to shape the narrative for the future of XR.

Main Focus on AR

There was a lot of discussion this year about artificial intelligence and how it will enable XR rather than replace it. Just like with the metaverse last year, AI became a new hot topic, but in terms of hardware, the spotlight was clearly on AR.

There were, of course, some notable VR-first devices presented: Lenovo announced their new ThinkReality VRX headset, which is now available for purchase ($1,299). I had a chance to give it a try and was impressed with its large sweet spot, visual clarity, and high degree of comfort. The headset includes a cooling system that takes the heat away from your face and makes the inside feel almost air-conditioned.

ThinkReality VRX

HTC presented their modular HTC Vive XR Elite ($1,099), for which they won a “Best Headworn Device” award. It can be worn either like a traditional headset with a head strap or like glasses, with an external power source instead of the battery in the back. In detached form, the Vive XR Elite weighs only 270 grams.

These devices were the exception rather than the rule, however, and pale in comparison to the number of AR devices showcased this year. Just on the main floor, we had Vuzix promoting their Ultralite turnkey AR solution, Sightful with a screenless Spacetop AR laptop, XREAL presenting XREAL Air glasses, and Magic Leap returning with Magic Leap 2. Right next to those were CREAL with their unique light field display, and Tilt Five. In the lobby, Zappar was demonstrating its $75 Cardboard-inspired device.

And that’s just the hardware; the list doesn’t include smartphone-based solutions like Snapchat’s SnapAR and Snap Lenses or Ffface.me digital clothing. Many software providers were experimenting with AR as well. Whether it was enterprise and training applications or entertainment like the laser-tag-inspired Laser Limbo, the focus on augmented reality was prevalent.

Laser-tag-inspired Laser Limbo

Subjectively, I found the XREAL and Tilt Five glasses to be the most promising choices in terms of their usefulness and affordability. Tilt Five ($359) offers six degrees of freedom and a wide 110° field of view, plus a whole range of tabletop applications and games. It also comes with a tracked controller.

Tilt Five

The XREAL Air ($488 with XREAL Beam) might only have three degrees of freedom and a smaller FOV of 46°, but it makes up for it with its versatility. It weighs only 79 grams and is compatible with phones, consoles, and laptops. Almost any device with a screen can be beamed into the glasses. For those looking to start experimenting with AR, both offer a good and inexpensive entry point.

The Renaissance of Haptics

It was hard to ignore the sheer volume of haptic-related products at AWE. There was a surge of novel startups and original concepts along with many industry veterans returning to show off their latest progress.

I did not expect haptics to have such a strong showing and was positively taken aback. Bhaptics were busy presenting their new TactGlove, and Contact CI came out with a new product called Maestro. The most established player in the space, HaptX, was there as well.

Among newer entrants, SenseGlove was celebrating their €3.25M Series A funding with a newly updated Nova 2 haptic glove. Weart demoed their TouchDIVER glove capable of not only feedback but also temperature variations, while OWO showed off their latest haptic vest that uses electrical impulses to simulate sensations. Fluid Reality stole the show with its electroosmotic device that uses an electric field to create feedback.

Fluid Reality

There were too many to list but even this short rundown underscores how noticeable haptics were this year. Most of these products target industrial and business markets, with the notable exceptions being the OWO vest ($499) and Bhaptics (also $499). Both devices have their strengths and weaknesses, though I have to give extra points to OWO for taking a bold, unique approach and allowing users to configure the vest so that it can simulate discomfort as well as other unpleasant feedback. This can result in a significantly more visceral experience and a heightened feeling of presence that’s hard to replicate using other methods.

OWO Haptic Vest

Seeing all the new and creative ways to model and recreate tactile data left me impressed with what’s to come, but at the same time, underwhelmed with the more conventional approaches.

Full resistance feedback, which restricts your movement, felt detached and did not properly mirror what I was seeing inside the headset. That was the case for both SenseGlove Nova and the high-end HaptX.

Their feedback, while indeed powerful, felt very mechanical and arbitrary. There are two paradigms at play here: one tries to nail the fidelity but approximate the sensation, while the other tries to provide the exact, realistic sensation at the cost of fidelity.

New Optics Solutions Are Coming

There were a number of booths dealing with optics and display solutions this year. It’s possible the latest push into AR helped supercharge this progress in optics. Many booths had some kind of developer kit or proof-of-concept ready. Visitors would come and literally peer into the future through these stationary prototypes.

One example was Ant Reality demonstrating their mixed waveguide solution called Crossfire. While the displays (ranging in field of view from 56° to 120°) were impressive, what made them unique was their ability to do both AR and VR. At the press of a button, the surroundings would go completely dark, turning the augmented overlay into an immersive experience. Magic Leap 2 is known for offering what is called segmented dimming, but in the case of the Crossfire, the glasses would become completely opaque despite the AWE show floor being exceptionally bright.

Ant Reality demonstrating their prototypes

Another first was a light field display incorporated into an AR headset, courtesy of CREAL. Light field displays promise to solve a lot of issues, most notably incorrect focal depth. Harnessing the direction of light can produce outstanding results, but shrinking light field tech to fit into a glasses form factor still proves tricky. CREAL’s headset is an important, pioneering step in this field.

CREAL’s LFD headset

Another interesting innovation came from a company called Hypervision. Their claim to fame is an ultra-wide display capable of achieving a 240° field of view that matches human vision. To make this happen, Hypervision used not one, not two, but four pancake lenses. Vertically, the screen covers 95°, so it doesn’t quite match the human eye top to bottom, but horizontally there’s full peripheral vision. While the stitching between the screens was slightly perceptible, the ability to achieve human FOV in such a small form factor is a massive step forward.

Hypervision

Overall, this means that the future generations of XR devices will have access to a wide variety of new, next-gen optics and display solutions, most of which are not even known to the general public. Display tech doesn’t follow Moore’s Law so it’s always difficult to make any specific predictions, but there’s clearly no stagnation in the field and some of the breakthroughs we saw this year are truly exciting.

Closing Thoughts

These are just some of the main trends and shifts we saw this year. There was a notable increase in 3D spatial display panels, such as Leia Lume Pad 2, Sony’s Spatial Display, Looking Glass, and a human-sized holographic box by ARHT.

This forms part of a larger trend of broadening the definition of spatial computing, which is sometimes expanded to include other real-world tools and technologies like visualizations, projection mapping, and 3D screens.

What also caught my eye was a noticeable reduction in locomotion solutions. Gone are the days of omnidirectional treadmills and big simulator seats. The only two exceptions were the unconventional EXIT SUIT, which suspends the wearer slightly above the ground, allowing them to run in the air, sit, fly, and do a range of other motions (and for which the team won this year’s “AWEsome” award), and the Freeaim shoes, which act like rollers, pushing the wearer backward as they walk.

This was the last AWE hosted in Santa Clara. From next year on, the event is moving to the Long Beach Convention Center. This shift to a new, bigger venue highlights the constant growth of the XR space and that’s one trend that speaks for itself.

Guest Post


About the Guest Author(s)

Mat Pawluczuk

Mat Pawluczuk is an XR / VR writer and content creator.



How Eye Tracking Contributes to XR Analytics and Experience

Near-to-eye displays offer powerful new ways to understand what the wearer is doing – and maybe even thinking. Right now, the use of XR analytics like eye tracking is largely limited to enterprise, including use cases like education and assessment, though eye tracking also enables new input modes and display improvements.

To learn more about the present and potential future of this technology, ARPost spoke with hardware manufacturer Lumus and XR analytics specialists Cognitive3D.

Why Do We Need XR Analytics?

XR analytics can be broken down generally into learning about an experience and learning about the person in that experience.

Learning About Experiences

Learning about an experience is important for people building XR applications – both in terms of building applications that people will want to use, and in building applications that people will be able to use.

“The stakes are much higher in creating high-quality content,” Cognitive3D founder and CEO Tony Bevilacqua said in an interview with ARPost. “That means creating something that’s comfortable, that’s not going to make you sick, that’s going to be accessible and well-received.”

This kind of thing is important for anyone building anything, but it is crucial for people building in XR, according to Bevilacqua. When a gamer experiences a usability problem in a console game or mobile app, they’re likely to blame that specific game or app and move on to a different one. However, XR is still new enough that people aren’t always so understanding.

“A bad experience can create attrition not just for an app, but for the platform itself,” said Bevilacqua. “That headset might go back into the box and stay there.”

Of course, developers are also interested in “success metrics” for their experiences. This is an issue of particular importance for people building XR experiences as part of advertising and marketing campaigns where traditional metrics from web and mobile experiences aren’t as useful.

“We all kind of know that opening an app and how much time people spent, those are very surface-level metrics,” said Bevilacqua. For XR, it’s more important to understand participation – and that means deeper analytical tools.

Learning About People

In other use cases, the people interacting with an experience are the subject that the XR analytics are most interested in. In these situations, Bevilacqua describes “the headset as a vehicle for data collection.” Examples include academic research, assessing skills and competency, and consumer research.

Competency assessment and consumer research might involve digital twins that the individual interacts with in VR. How efficiently can they perform a task? What do they think about a virtual preproduction model of a car? What products draw their eyes in a virtual supermarket?

“We focus more on non-consumer-focused use cases like focus groups,” said Bevilacqua. “We try to build off of the characteristics that make VR unique.”

At least part of the reason for this is that a lot of XR devices still don’t have the hardware required for XR analytics, like eye tracking capabilities.

Building Hardware for Eye Tracking

David Goldman is the Vice President of AR Optics at Lumus. The company is primarily focused on making displays but, as a components manufacturer, they have to make parts that work with other customer requirements – including eye tracking. The company even has a few patents on its own approach to it.

According to Goldman, traditional approaches to eye tracking involve cameras and infrared lights inside of the headset. The invisible light reflects off of the eye and is captured by the camera. Those lights and cameras add some cost but, more importantly, they take up “valuable real estate, from an aesthetic perspective.”

The patented Lumus system uses the waveguide itself as the light source, since waveguides already project light. This light reflects off of the eye, so all that is required is an additional inward-facing camera, which is a lot more affordable in terms of cost and space. However, high standards for emerging experiences play a role here too.

“When you’re a company trying to introduce a whole new product, you’re trying to shave pennies off of the dollar on everything,” said Goldman. “Looking at the bill of materials, it’s unlikely to make a first generation.”

Still, more and more devices coming to market do include this hardware – including consumer devices. Why? In part because the hardware enables a lot more than just XR analytics.

Enabling New Kinds of Interactions

Eye tracking enables advanced display technologies like foveated rendering, which is one of the big reasons that it’s increasingly being included in consumer VR devices. Foveated rendering is a technique that improves the graphic fidelity of a small area of the overall display based on where your eye is looking at the moment.
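To make the idea concrete, here is a minimal sketch of gaze-based level-of-detail selection. The radii and rate values are made up for illustration; real foveated renderers rely on GPU variable-rate shading rather than per-pixel logic like this.

```python
def shading_rate(pixel, gaze, fovea_radius=0.10, mid_radius=0.35):
    """Return a relative render resolution (1.0 = full detail) for a point
    on a normalized [0, 1] x [0, 1] screen, given the current gaze point."""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= fovea_radius:
        return 1.0    # full detail where the eye is actually looking
    if dist <= mid_radius:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution in the far periphery

print(shading_rate((0.52, 0.50), gaze=(0.5, 0.5)))  # near the fovea -> 1.0
print(shading_rate((0.95, 0.90), gaze=(0.5, 0.5)))  # far periphery -> 0.25
```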

AR devices currently don’t have a field-of-view large enough to benefit from foveated rendering, but Goldman said that Lumus will have a device with a field-of-view over 50 degrees before 2030.

Eye tracking also has promise as an advanced input system. Goldman cited the Apple Vision Pro, which uses a combination of eye tracking and hand-tracking to go completely controller-free. Mixed reality devices like the Apple Vision Pro and Meta Quest 3 also bring up the fact that eye tracking has different implications in AR than it does in VR.

“Effectively, you can know exactly where I’m looking and what I’m interested in, so it has its implications for advertisers,” said Goldman. “What’s less nefarious for me and more interesting as a user is contextual search.”

Power and Responsibility

As more advanced XR analytics tools come to more consumer-focused hardware, do we need to be concerned about these tools being turned on casual XR fans? It’s certainly something that we need to be watchful of.

“It’s certainly a sensitive issue,” said Goldman. “It’s certainly a concern for the consumer, so I think every company will have to address this up front.”

Bevilacqua explained that his company has adopted the XR Privacy Framework. Cognitive3D notifies individuals when certain kinds of data might be collected and gives them the option to opt out. However, Bevilacqua believes that the best option is to avoid certain kinds of data collection in the first place.

“It’s important to balance data collection with user privacy. … We have a pretty balanced view on what needs to be collected and what doesn’t,” said Bevilacqua. “For us, eye tracking is something we do not find acceptable in a consumer application.”

Bevilacqua also pointed out that platforms and marketplaces have their own internal guidelines that make it difficult for app developers to collect too much information on their own.

“There is acceptable use policy about what kinds of data exist and what can be used,” said Bevilacqua. “You can’t just go out and collect eye tracking data and use it for ads. That’s not something Meta is going to allow.”

All About Balance

We need XR analytics. They make for better experiences and can even improve the quality of goods and services that we enjoy and rely on in the physical world. Not to mention the benefits that the required hardware brings to consumer applications. While technologies like eye tracking can be scary if used irresponsibly, we seem to be in good hands so far.



Apple Vision Pro: A Catalyst for the Growth of the XR Industry

Sponsored content

Sponsored by VR Vision

The recent introduction of Apple’s Vision Pro has ignited a fresh wave of excitement in the extended reality landscape, with industry experts and enthusiasts alike anticipating a surge in the growth and evolution of the XR industry.

This immersive technology (coined “spatial computing” by Apple), which encompasses virtual reality, augmented reality, and mixed reality, is set to experience a significant boost from Apple’s entry into the field.

A New Era in Immersive Technology

The Vision Pro’s unveiling at Apple’s Worldwide Developer Conference (WWDC) generated a buzz in the XR world. It has triggered both commendations and criticisms from the global XR community, with its future potential and implications for the broader XR landscape hotly debated.

Apple’s Vision Pro is a spatial computer that seamlessly blends digital content with the physical world, marking a significant step forward in immersive technology.

Apple Vision Pro - headset

According to the company, it uses a “fully three-dimensional user interface controlled by the most natural and intuitive inputs possible – a user’s eyes, hands, and voice.” This marks a departure from traditional interaction methods, offering a more immersive experience for users.

A panel of global executives from the immersive tech industry weighed in on the device, discussing its potential use cases, and how it would impact the global XR community. The consensus was that the Vision Pro represented a significant leap forward in the development of XR technology, setting the stage for exciting advancements in the field.

The Potential of the Vision Pro

The Vision Pro’s introduction has been described as one of the “watershed moments” for the VR and AR industry. The device has enormous potential and is poised to breathe new life into the XR space, with two of the world’s largest tech giants, Apple and Meta Platforms (formerly Facebook), now vying for market share.

The Vision Pro’s announcement has spurred conversations and expectations that “spatial computing” will become an integral part of everyday life, much like how other Apple devices have seamlessly integrated into our daily personal and professional lives.

Apple has a remarkable track record of introducing technology that resonates with individuals on a personal level. The company’s knack for creating products that enhance individuals’ lives, work, and well-being has been a crucial factor in their widespread adoption.

Vision Pro: Design and Features

The Vision Pro comes with a clean, sleek design and high-quality features – a standard we’ve come to expect from Apple. The device is controlled using our hands rather than external controllers, making it more intuitive and user-friendly.

Apple has prioritized its use cases within its existing ecosystem of apps and content. This strategic move sets Vision Pro apart from its competitors, providing a unique selling proposition.

The device’s hardware is impressive, but its real strength lies in the software experience it offers. Vision Pro introduces a new dimension to personal computing by transforming how users interact with their favorite apps, enhancing productivity and entertainment experiences.

The Impact on the XR Market

The Vision Pro’s introduction has the potential to reshape the XR market. Apple’s entry into the XR space is expected to boost confidence, incite competition, and accelerate advancements in other headsets. This would lead to more people using mixed reality headsets in their day-to-day lives, accelerating use cases for enterprises and industries.

On the other hand, the device’s high price point suggests that it will initially find more success among corporate entities and developers. Companies could use the Vision Pro to create immersive experiences at events, while developers could use it to build innovative apps and content for the device.

At VR Vision, for example, we see enormous potential in the application of virtual reality training for enterprise applications, and the Vision Pro will only enable further innovation in that sector.

For training, it is much safer and more cost-effective to operate heavy machinery in the virtual world than in the real world. This has applicability across a wide array of industries and use cases, and it will be interesting to see just how impactful it truly becomes.

The Vision Pro’s Presentation

Apple’s presentation of the Vision Pro was impressive, ticking many boxes. It showcased significant advancements in hardware and software, demonstrating how the device could offer a hands-free, intuitive experience. The demonstration also highlighted how spatial computing and the new user experience could spur creative content development.

However, some critics felt that the presentation didn’t fully demonstrate the range of VR activities that Vision Pro could achieve. There was a focus on ‘looking and clicking’ functions, which could also be performed on a smartphone. More emphasis could have been placed on the device’s potential for workplace and communication applications.

The Target Audience and Use Cases

The Vision Pro’s high price point suggests that its target audience will initially be businesses and developers. The device could revolutionize workplace training and education, enhancing engagement with learning materials, and streamlining work processes.

Apple Vision Pro

For developers, the Vision Pro represents an opportunity to experiment and innovate. Apple’s established App Store and developer community provide a strong launchpad for the creation of apps and content for Vision Pro. These early adopters may not create polished work initially, but their experiments and ideas will likely flourish in the coming years.

The Role of Vision Pro in the XR Market

Apple’s history of developing proprietary technology and working internally suggests that the Vision Pro will likely follow a similar path. The company’s commitment to quality control, unique design processes, and product development control has given Apple devices their distinctive look and feel.

While it’s difficult to predict the future, interoperability between headsets will likely mirror the landscape of Android and Apple smartphones or Mac and Windows computers. The Vision Pro will likely stand out in the market for its unique feel, best-in-class visuals and technology, and intuitive user experiences, maintaining the overall cohesion between various Apple devices.

Enhancing App Development With Unity

The integration of Unity’s development platform with Vision Pro enables developers to leverage the device’s capabilities and create compelling AR experiences.

Unity’s robust toolset offers a wide range of features, including real-time rendering, physics simulation, and advanced animation systems, all optimized for the Vision Pro’s hardware.

This seamless integration allows developers to focus on unleashing their creativity and designing immersive experiences that blur the line between the physical and virtual worlds.

The Vision Pro holds immense potential for a wide range of industries. From gaming and entertainment to education, healthcare, and industrial training, the device opens up avenues for innovative applications. Imagine interactive virtual tours of historical sites, immersive educational experiences, or real-time collaborative design and engineering projects. The Vision Pro’s spatial computing capabilities pave the way for a future where digital content seamlessly blends with our physical reality, transforming the way we learn, work, and entertain ourselves.

Apple’s Vision Pro: A Boost for Meta

Apple’s entry into the XR market could be a boon for Meta. Despite the criticisms and challenges Meta has faced, its headsets have consistently offered the best value in their class, with excellent hardware and a great game library, all at an attractive price.

The introduction of the Vision Pro could force Meta to step up its game, enhancing its software offerings and improving its user experience. The competition from Apple could ultimately lead to better products from Meta, benefiting users and developers alike.

Conclusion

The introduction of the Apple Vision Pro represents a significant milestone in the XR industry. Its potential impact extends beyond its impressive hardware and software features, setting the stage for exciting advancements in the field.

With Apple now a major player in the XR space, the industry is poised for a surge in growth and evolution. The Vision Pro’s introduction could lead to more investment in R&D, a flourishing supply chain, and an influx of developers eager to create innovative experiences for the device.

Undoubtedly, the Vision Pro marks the beginning of a new era in immersive technology, and its impact on the XR industry will be felt for years to come.

Written by Lorne Fade, COO at VR Vision



The Intersections of Artificial Intelligence and Extended Reality

It seems like just yesterday it was AR this, VR that, metaverse, metaverse, metaverse. Now all anyone can talk about is artificial intelligence. Is that a bad sign for XR? Some people seem to think so. However, people in the XR industry understand that it’s not a competition.

In fact, artificial intelligence has a huge role to play in building and experiencing XR content – and it’s been part of high-level metaverse discussions for a very long time. I’ve never claimed to be a metaverse expert and I’m not about to claim to be an AI expert, so I’ve been talking to the people building these technologies to learn more about how they help each other.

The Types of Artificial Intelligence in Extended Realities

For the sake of this article, there are three main branches of artificial intelligence: computer vision, generative AI, and large language models. AI is more complicated than this, but this helps to get us started talking about how it relates to XR.

Computer Vision

In XR, computer vision helps apps recognize and understand elements in the environment. This allows them to place virtual elements in the environment and sometimes lets those elements react to it. Computer vision is also increasingly being used to streamline the creation of digital twins of physical items or locations.

Niantic is one of XR’s big world-builders using computer vision and scene understanding to realistically augment the world. 8th Wall – an acquisition that does its own projects but also serves as Niantic’s WebXR division – uses some AI of its own and is compatible with other AI tools, as teams showcased in a recent Innovation Lab hackathon.

“During the sky effects challenge in March, we saw some really interesting integrations of sky effects with generative AI because that was the shiny object at the time,” Caitlin Lacey, Niantic’s Senior Director of Product Marketing told ARPost in a recent interview. “We saw project after project take that spin and we never really saw that coming.”

The winner used generative AI to create the environment that replaced the sky through a recent tool developed by 8th Wall. While some see artificial intelligence (that “shiny object”) as taking the wind out of immersive tech’s sails, Lacey sees this as an evolution rather than a distraction.

“I don’t think it’s one or the other. I think they complement each other,” said Lacey. “I like to call them the peanut butter and jelly of the internet.”

Generative AI

Generative AI takes a prompt and turns it into some form of media, whether an image, a short video, or even a 3D asset. Generative AI is often used in VR experiences to create “skyboxes” – the flat image over the virtual landscape where players have their actual interactions. However, as AI gets stronger, it is increasingly used to create virtual assets and environments themselves.

Artificial Intelligence and Professional Content Creation

Talespin makes immersive XR experiences for training soft skills in the workplace. The company has been using artificial intelligence internally for a while now and recently rolled out a whole AI-powered authoring tool for their clients and customers.

A release shared with ARPost calls the platform “an orchestrator of several AI technologies behind the scenes.” That includes developing generative AI tools for character and world building, but it also includes work with other kinds of artificial intelligence that we’ll explore further in the article, like LLMs.

“One of the problems we’ve all had in the XR community is that there’s a very small contingent of people who have the interest and the know-how and the time to create these experiences, so this massive opportunity is funneled into a very narrow pipeline,” Talespin CEO Kyle Jackson told ARPost. “Internally, we’ve seen a 95-97% reduction in time to create [with AI tools].”

Talespin isn’t introducing these tools to put themselves out of business. On the contrary, Jackson said that his team is able to be even more involved in helping companies workshop their experiences because his team is spending less time building the experiences themselves. Jackson further said this is only one example of a shift happening to more and more jobs.

“What should we be doing to make ourselves more valuable as these things shift? … It’s really about metacognition,” said Jackson. “Our place flipped from needing to know the answer to needing to know the question.”

Artificial Intelligence and Individual Creators

DEVAR launched MyWebAR in 2021 as a no-code authoring tool for WebAR experiences. In the spring of 2023, that platform became more powerful with a neural network for AR object creation.

In creating a 3D asset from a prompt, the network determines the necessary polygon count and replicates the texture. The resulting 3D asset can exist in AR experiences and serve as a marker itself for second-layer experiences.

“A designer today is someone who can not just draw, but describe. Today, it’s the same in XR,” DEVAR founder and CEO Anna Belova told ARPost. “Our goal is to make this available to everyone … you just need to open your imagination.”

Blurring the Lines

“From strictly the making a world aspect, AI takes on a lot of the work,” Mirrorscape CEO Grant Anderson told ARPost. “Making all of these models and environments takes a lot of time and money, so AI is a magic bullet.”

Mirrorscape is looking to “bring your tabletop game to life with immersive 3D augmented reality.” Of course, much of the beauty of tabletop games comes from the fact that players are creating their own worlds and characters as they go along. While the roleplaying element has been reproduced by other platforms, Mirrorscape is bringing in the individual creativity through AI.

“We’re all about user-created content, and I think in the end AI is really going to revolutionize that,” said Anderson. “It’s going to blur the lines around what a game publisher is.”

Even for those who are professional builders but who might be independent or just starting out, artificial intelligence, whether to create assets or just for ideation, can help level the playing field. That was a theme of a recent Zapworks workshop “Can AI Unlock Your Creating Potential? Augmenting Reality With AI Tools.”

“AI is now giving individuals like me and all of you sort of superpowers to compete with collectives,” Zappar executive creative director Andre Assalino said during the workshop. “If I was a one-man band, if I was starting off with my own little design firm or whatever, if it’s just me freelancing, I now will be able to do so much more than I could five years ago.”

NeRFs

Neural Radiance Fields (NeRFs) weren’t included in the introduction because they can be seen as a combination of generative AI and computer vision. A NeRF starts out with a special kind of neural network called a multilayer perceptron (MLP). A “neural network” is any artificial intelligence modeled on the human brain, and an MLP is … well, look at it this way:

If you’ve ever taken an engineering course, or even a high school shop class, you’ve been introduced to drafting. Technical drawings represent a 3D structure as a series of 2D images, each showing a different angle of the structure. Over time, you can get pretty good at visualizing the complete structure from these flat images. An MLP can do the same thing.

The difference is the output. When a human does this, the output is a thought – a spatial understanding of the object in your mind’s eye. When an MLP does this, the output is a NeRF – a 3D rendering generated from the 2D images.
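To make the structure concrete, here is a toy sketch of the MLP at the heart of a NeRF: a small network mapping a 3D sample point to a color and a volume density. It is deliberately minimal and its weights are random; a real NeRF also takes the viewing direction, applies positional encodings, and trains the weights so that renders of the field match the input images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights; in a real NeRF these are learned from the input images.
W1, b1 = rng.normal(size=(3, 64)), np.zeros(64)
W2, b2 = rng.normal(size=(64, 4)), np.zeros(4)

def nerf_mlp(xyz):
    """Map a 3D point to an (r, g, b) color and a volume density."""
    h = np.tanh(xyz @ W1 + b1)            # hidden layer
    out = h @ W2 + b2
    rgb = 1.0 / (1.0 + np.exp(-out[:3]))  # sigmoid keeps color in [0, 1]
    density = np.log1p(np.exp(out[3]))    # softplus keeps density >= 0
    return rgb, density

# Rendering queries the field at many sample points along each camera ray.
rgb, density = nerf_mlp(np.array([0.1, -0.2, 0.5]))
print(rgb, density)
```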

Early on, this meant feeding countless images into the MLP. However, in the summer of 2022, Apple and the University of British Columbia developed a way to do it with one video. Their approach was specifically interested in generating 3D models of people from video clips for use in AR applications.

Whether a NeRF recreates a human or an object, it’s quickly becoming the fastest and easiest way to make digital twins. Of course, the one downside is that a NeRF can only create digital models of things that already exist in the physical world.

Digital Twins and Simulation

Digital twins can be built with or without artificial intelligence. However, some use cases for digital twins are powered by AI, including simulations for optimization and disaster readiness. For example, a digital twin of a real campus can be created and then modified on a computer to maximize production or minimize risk under different simulated scenarios.

“You can do things like scan in areas of a refinery, but then create optimized versions of that refinery … and have different simulations of things happening,” MeetKai co-founder and executive chairwoman Weili Dai told ARPost in a recent interview.
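
As a loose illustration of that what-if pattern, here is a toy sketch in Python. Every name and formula in it is invented for the example – real digital twin platforms simulate far richer physics – but the core loop is the same: copy the twin, tweak parameters, and score each scenario without touching the physical plant.

```python
# A toy "what-if" simulation on a digital twin. All values and formulas are
# illustrative stand-ins, not any vendor's model.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class RefineryTwin:
    pump_rate: float      # units per hour
    valve_open: float     # fraction open, 0.0 - 1.0
    failure_risk: float   # baseline probability of an incident

def score(twin: RefineryTwin) -> float:
    # Stand-in physics: throughput rises with pump rate and valve opening,
    # but risk grows quickly once the pump is pushed hard.
    throughput = twin.pump_rate * twin.valve_open
    risk = twin.failure_risk * (1 + (twin.pump_rate / 100) ** 2)
    return throughput - 1000 * risk   # heavily penalize risky setups

baseline = RefineryTwin(pump_rate=60, valve_open=0.8, failure_risk=0.01)

# Run scenarios on modified copies of the twin; the real refinery is untouched.
scenarios = [replace(baseline, pump_rate=r) for r in (40, 60, 80, 100)]
best = max(scenarios, key=score)
print(f"best simulated pump rate: {best.pump_rate} units/hour")
```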

A recent suite of authoring tools launched by the company (which started in AI before branching into XR solutions) includes AI-powered tools for creating virtual environments from the physical world. These can be left as exact digital twins, or they can be edited to streamline the production of more fantastical virtual worlds by providing a foundation built in reality.

Large Language Models

Large language models take in language prompts and return language responses. This is one of the AI interactions that runs largely under the hood so that, ideally, users don’t realize they’re interacting with AI at all. For example, large language models could be the future of NPC interactions and of the “non-human agents” that help us navigate vast virtual worlds.

“In these virtual world environments, people are often more comfortable talking to virtual agents,” Inworld AI CEO Ilya Gelfenbeyn told ARPost in a recent interview. “In many cases, they are acting in some service roles and they are preferable [to human agents].”

Inworld AI makes brains that can animate Ready Player Me avatars in virtual worlds. Creators get to decide what the artificial intelligence knows – or what information it can access from the web – and what its personality is like as it walks and talks its way through the virtual landscape.

“You basically are teaching an actor how it is supposed to behave,” Inworld CPO Kylan Gibbs told ARPost.
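
Here is a hedged sketch of what that looks like in code. This is not Inworld’s actual API – `call_llm` below is a hypothetical stand-in for whatever text-generation model or endpoint a developer has access to – but it shows how a persona prompt and a running transcript “teach the actor” its role.

```python
# An LLM-driven NPC sketch. `call_llm` is a hypothetical placeholder for a
# real text-generation model or API client; everything else is plain Python.
def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your model or API client here")

# The persona block is where a creator decides what the character knows
# and what its personality is like.
PERSONA = (
    "You are Mira, a shopkeeper NPC in a fantasy town. You know only your "
    "shop's wares and local rumors. Stay in character, and politely deflect "
    "questions outside that knowledge."
)

def npc_reply(history: list[str], player_line: str) -> str:
    # The running transcript keeps the conversation coherent across turns.
    transcript = "\n".join(history + [f"Player: {player_line}", "Mira:"])
    return call_llm(f"{PERSONA}\n\n{transcript}").strip()

# Example turn (commented out because call_llm needs a real model behind it):
# history: list[str] = []
# line = npc_reply(history, "Do you sell healing potions?")
# history += ["Player: Do you sell healing potions?", f"Mira: {line}"]
```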

Large language models are also used by developers to speed up back-end processes like generating code.

How XR Gives Back

So far, we’ve talked about ways in which artificial intelligence makes XR experiences better. However, the opposite is also true, with XR helping to strengthen AI for other uses and applications.

Evolving AI

We’ve already seen that some approaches to artificial intelligence are modeled after the human brain. We know that the human brain developed essentially through trial and error as it rose to meet the needs of our early ancestors. So, what if virtual brains had the same opportunity?

Martine Rothblatt, PhD, describes that very opportunity in the excellent book “Virtually Human: The Promise – and the Peril – of Digital Immortality”:

“[Academics] have even programmed elements of autonomy and empathy into computers. They even create artificial software worlds in which they attempt to mimic natural selection. In these artificial worlds, software structures compete for resources, undergo mutations, and evolve. Experimenters are hopeful that consciousness will evolve in their software as it did in biology, with vastly greater speed.”
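
The kind of experiment Rothblatt describes can be shrunk to a toy you can actually run. The sketch below is a bare-bones evolutionary loop: bit-string “organisms” compete on a fitness function, the fittest survive, and mutated copies fill the next generation. It evolves nothing like consciousness, but it is the same selection-and-mutation engine in miniature.

```python
# A minimal evolutionary loop: selection pressure plus random mutation.
import random

random.seed(42)
GENOME_LEN, POP_SIZE, GENERATIONS = 32, 50, 40

def fitness(genome: list[int]) -> int:
    return sum(genome)                 # toy goal: evolve a genome of all 1s

def mutate(genome: list[int], rate: float = 0.02) -> list[int]:
    return [1 - gene if random.random() < rate else gene for gene in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]    # the fittest half live on
    offspring = [mutate(random.choice(survivors)) for _ in survivors]
    population = survivors + offspring         # next generation

print("best fitness after evolution:", fitness(max(population, key=fitness)))
```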

Feeding AI

Like any emerging technology, people’s expectations of artificial intelligence can grow faster than its actual capabilities. AI learns by having data fed into it. Lots of data.

For some applications, there is a lot of extant data for artificial intelligence to learn from. But, sometimes, the answers that people want from AI don’t exist yet as data from the physical world.

“One sort of major issue of training AI is the lack of data,” Treble Technologies CEO Finnur Pind told ARPost in a recent interview.

Treble Technologies works on creating realistic sound in virtual environments. To train an artificial intelligence to work with sound, it needs audio files. Historically, these were painstakingly sampled by recording how different objects sound in different environments.

Usually, during the early design phases, an architect or automotive designer will approach Treble to predict what audio will sound like in a future space. However, Treble can also use its software to generate specific sounds in specific environments to train artificial intelligence without all of the time- and labor-intensive sampling. Pind calls this “synthetic data generation.”
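
As a generic illustration of what synthetic audio data can look like – to be clear, this is a toy, not Treble’s acoustics engine – the sketch below fabricates crude room impulse responses and convolves them with a dry signal, producing labeled “reverberant” training samples without a single microphone.

```python
# Toy synthetic data generation for audio AI: fake a room's reverb, apply it
# to a dry signal, and keep the (audio, room-parameters) pair for training.
import numpy as np

SR = 16_000                                   # sample rate in Hz
rng = np.random.default_rng(0)

def synthetic_impulse_response(rt60: float) -> np.ndarray:
    """Exponentially decaying noise: a crude stand-in for a room's reverb."""
    t = np.arange(int(SR * rt60)) / SR
    decay = np.exp(-6.91 * t / rt60)          # ln(1000): ~ -60 dB at t = rt60
    return rng.standard_normal(t.size) * decay

dry = rng.standard_normal(SR)                 # 1 second of dry source audio
for rt60 in (0.3, 0.8, 1.5):                  # small room ... large hall
    wet = np.convolve(dry, synthetic_impulse_response(rt60))
    print(f"rt60={rt60}s -> {wet.size} samples of labeled training audio")
```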

The AI-XR Relationship Is “and” Not “or”

Holding up artificial intelligence as the new technology on the block that somehow takes away from XR is an interesting narrative. However, experts are in agreement that these two emerging technologies reinforce each other – they don’t compete. XR helps AI grow in new and fantastic ways, while AI makes XR tools more powerful and more accessible. There’s room for both.

awe-usa-2023-day-three:-eyes-on-apple

AWE USA 2023 Day Three: Eyes on Apple

The third and final day of AWE USA 2023 took place on Friday, June 2. The first day of AWE is largely dominated by keynotes. A lot of air on the second day is taken up by the opening of the expo floor. By the third day, the keynotes are done, the expo floor starts to get packed away, and panel discussions and developer talks rule the day. And Apple ruled a lot of those talks.

Bracing for Impact From Apple

A big shift is expected this week, with Apple widely anticipated to announce its entrance into the XR market. The writing has been on the wall for a long time.

Rumors have probably been circulating for longer than many readers have even been watching XR. ARPost started speculating in 2018 on a 2019 release. Five years of radio silence later, we had reports that the product would be delayed indefinitely.

The rumor mill is back in operation with an expected launch this week (Apple’s WWDC23 starts today) – with many suggesting that Meta’s sudden announcement of the Quest 3 is a harbinger. Whether an Apple entrance is real this time or not, AWE is bracing itself.

Suspicion on Standards

Let’s take a step back and look at a conversation that happened on AWE USA 2023 Day Two, but is very pertinent to the emerging Apple narrative.

The “Building Open Standards for the Metaverse” panel moderated by Moor Insights and Strategy Senior Analyst Anshel Sag brought together XR Safety Initiative (XRSI) founder and CEO Kavya Pearlman, XRSI Advisor Elizabeth Rothman, and Khronos Group President Neil Trevett.

Apple’s tendency to operate outside of standards was discussed. Even prior to its entrance into the market, this has caused problems for XR app developers – Apple devices even have a different way of sensing depth than Android devices. XR glasses tend to come out first, or only, on Android, in part because of Android’s more open ecosystem.

“Apple currently holds so much power that they could say, ‘This is the way we’re going to go,’ and the Metaverse Standards Forum could stand up and say, ‘No,’” said Pearlman, expressing concern over the accessibility of “the next generation of the internet”.

Trevett expressed a different view, saying that standards should present the best option, not the only option. While standards are more useful the more groups use them, competition is helpful and shows diversity in the industry. And diversity in the industry is what sets Apple apart.

“If Apple does announce something, they’ll do a lot of education … it will progress how people use the tech whether they use open standards or not,” said Trevett. “If you don’t have a competitor on the proprietary end of the spectrum, that’s when you should start to worry because it means that no one cares enough about what you’re doing.”

Hope for New Displays

On Day Three, KGOn Tech LLC’s resident optics expert Karl Guttag presented an early morning developer session on “Optical Versus Passthrough Mixed Reality.” Guttag has been justifiably critical of Meta Quest Pro’s passthrough in particular. Even for optical XR, he expressed skepticism about a screen replacement, which is what the Apple headset is largely rumored to be.

Karl Guttag

“One of our biggest issues in the market is expectations vs. reality,” said Guttag. “What is hard in optical AR is easy in passthrough and vice versa. I see very little overlap in applications … there is also very little overlap in device requirements.”

A New Generation of Interaction

“The Quest 3 has finally been announced, which is great for everyone in the industry,” 3lbXR and 3lb Games CEO Robin Moulder said in her talk “Expand Your Reach: Ditch the Controllers and Jump into Mixed Reality.” “Next week is going to be a whole new level when Apple announces something – hopefully.”

Robin Moulder

Moulder presented the next round of headsets as the first of a generation that will hopefully be user-friendly enough to increase adoption and deployment, bringing more users and creators into the XR ecosystem.

“By the time we have the Apple headset and the new Quest 3, everybody is going to be freaking out about how great hand tracking is and moving into this new world of possibilities,” said Moulder.

More on AI

AI isn’t distracting anyone from XR and Apple isn’t distracting anyone from AI. Apple appearing as a conference theme doesn’t mean that anyone was done talking about AI. If you’re sick of reading about AI, at least read the first section below.

Lucid Realities: A Glimpse Into the Current State of Generative AI

After two full days of people talking about how AI is a magical world generator that’s going to take the task of content creation off of the shoulders of builders, Microsoft Research Engineer Jasmine Roberts set the record straight.

Jasmine Roberts

“We’ve passed through this techno-optimist state into dystopia and neither of those are good,” said Roberts. “When people think that [AI] can replace writers, it’s not really meant to do that. You still need human supervisors.”

AI not being able to do everything that a lot of people think it can isn’t the end of the world. A lot of the things that people want AI to do are already possible through other, less glamorous tools.

“A lot of what people want from generative AI, they can actually get from procedural generation,” said Roberts. “There are some situations where you need bespoke assets so generative AI wouldn’t really cut it.”

Roberts isn’t against AI – her presentation was simply illustrating that it doesn’t work the way that some industry outsiders are being led to believe. That isn’t the same as saying that it doesn’t work. In fact, she brought a demo of an upcoming AI-powered Clippy. (You remember Clippy, right?)

Augmented Ecologies

Roberts was talking about the limitations of AI. The “Augmented Ecologies” panel, moderated by AWE co-founder Tish Shute, saw Three Dog Labs founder Sean White, Morpheus XR CTO Anselm Hook, and Croquet founder and CTO David A. Smith talk about what happens when AI is the new dominant life form on planet Earth.

From left to right: Tish Shute, Sean White, Anselm Hook, and David Smith

“We’re kind of moving to a probabilistic model, it’s less deterministic, which is much more in line with ecological models,” said White.

This talk presented the scenario in which developers are no longer the ones running the show. AI takes on a life of its own, and that life is more capable than ours.

“In an ecology, we’re not necessarily at the center, we’re part of the system,” said Hook. “We’re not necessarily able to dominate the technologies that are out there anymore.”

This might scare you, but it doesn’t scare Smith. He described a future in which AI becomes our legacy, able to live in environments that humans never can, like the reaches of space.

“The metaverse and AI are going to redefine what it means to be human,” said Smith. “Ecosystems are not healthy if they are not evolving.”

“No Longer the Apex”

On the morning of Day Two, the Virtual World Society and the VR/AR Association hosted a very special breakfast. Invited were some of the most influential leaders in the immersive technology space. The goal was to discuss the health and future of the XR industry.

The findings will be presented in a report, but some of the concepts were also presented at “Spatial Computing for All” – a fireside chat with Virtual World Society founder Tom Furness and HTC China President Alvin Graylin, moderated by technology consultant Linda Ricci.

The major takeaway was that industry insiders aren’t particularly worried about the next few years. After that, though, the way we work might start to change – and that might change the way we think about ourselves and value our identities in a changing society.

AWE Is Changing Too

During the show wrap-up, Ori Inbar had some big news: “AWE is leveling up to LA.” This was the fourteenth AWE. Every AWE, except for one year when the entire conference was virtual because of the COVID-19 pandemic, has been in Santa Clara. But the conference has grown so much that it’s time to move.

“I think we realized this year that we were kind of busting at the seams,” said Inbar. “We need a lot more space.”

AWE 2024, which will take place from June 18-20, will be in Long Beach, with “super, super early bird tickets” available for the next few weeks.

Yes, There’s Still More

Most of the Auggie Awards and the winners of Inbar’s climate challenge were announced during a ceremony on the evening of Day Two. During the event wrap-up, the final three Auggies were awarded. We didn’t forget, we just didn’t have room for them in our coverage.

So, there is one final piece of AWE coverage just on the Auggies. Keep an eye out. Spoiler alert, Apple wasn’t nominated in any of the categories.

awe-usa-2023-day-two:-more-keynotes,-more-panels,-and-the-open-expo-floor

AWE USA 2023 Day Two: More Keynotes, More Panels, and the Open Expo Floor

The second day of AWE is the day that the expo floor opens. That is always thrilling, and we’ll get there, but first – more keynotes and conversations.

AWE Day Two Keynotes

Day One kickstarted the keynotes, but AWE Day Two saw exciting presentations and announcements from Magic Leap and Niantic. Both affirmed a theme from the day before: meaningful XR is already here.

Magic Leap: Let’s Get to Work

“The vision of AR that some legacy tech companies are promising is still years out – ours is not years or months or days out,” Magic Leap CEO Peggy Johnson said in her keynote. “The small team at Magic Leap has made something that many larger companies are still struggling to achieve.”

Peggy Johnson

Johnson also continued another theme from AWE Day One: AI and XR aren’t in competition – they help each other. Inbar’s opening talk included a line that quickly became a motto for almost the whole event: “XR is the interface for AI.”

“I honestly believe AR systems are going to become the endpoints for a lot of AI,” said Johnson. “The ability to provide contextual input and get contextual output will really be a game changer.”

Magic Leap’s big announcement wasn’t to do with AI, but it will still be thrilling to developers: an Unreal Engine plugin is coming in August.

“AR Everywhere” With Niantic

While enterprise companies and hardware manufacturers are still struggling with adoption to some degree, few companies have done as much for AR consumer adoption as Niantic.

Brian McClendon

In his AWE keynote, “Empowering AR Everywhere”, Niantic Senior Vice President of Engineering, Brian McClendon, laid out a number of major updates coming to the company – as well as coming to or through 8th Wall.

First, ARDK 3.0 will allow developers using Niantic tools to also use outside AR asset libraries. It will also enable a QR code-triggered “lobby system” for multi-user shared AR experiences. The updated ARDK will enter a beta phase later this month. A new maps SDK compatible with Unity is also coming to 8th Wall.

Further, 8th Wall’s “Metaversal Deployment” announced at AWE 2021 is now compatible with mixed reality via Quest 2, Quest Pro, “and probably all future MR headsets.”

Big Picture Panel Discussions

One of the things that really makes AWE special is its ability to bring together the industry’s big thinkers. A number of insightful panel discussions from Day Two explored some of the biggest topics in XR today.

XR’s Inflection Point

The panel discussion “How Immersive Storytelling Can Deepen Human Understanding of Critical Issues” brought together Unity CEO John Riccitiello, journalist Ashlan Cousteau, and TRIPP CEO and co-founder Nanea Reeves. The talk included further affirmations that, contrary to some media pieces, XR as an industry is thriving.

From left to right: John Riccitiello, Ashlan Cousteau, and Nanea Reeves

“I now cancel what I said seven years ago about this not being a good time to build a business in this space,” said Riccitiello. “We’re at a time right now where it makes a lot of sense to look forward with optimism around XR. … Companies are born around technology transitions.”

Reeves echoed the sentiment, but included some of the cautious caveats expressed by XR ethicist Kent Bye during a panel discussion yesterday.

“We’re at such an interesting point of technology and the evolution of it, especially with AI and XR,” said Reeves. “What’s the next level of storytelling and what should we be aware of as we bring AI into it?”

Building Open Standards for the Metaverse

The good news is that the metaverse isn’t dead. The bad news is that it arguably hasn’t been born yet either. One of the most important features of the metaverse is also one of its most elusive.

It was also the crux of a panel discussion bringing together XR Safety Initiative founder and CEO Kavya Pearlman, XRSI Advisor Elizabeth Rothman, and Khronos Group President Neil Trevett, moderated by Moor Insights and Strategy Senior Analyst Anshel Sag.

From left to right: Kavya Pearlman, Neil Trevett, Elizabeth Rothman, and Anshel Sag

“Whichever way you come to the metaverse, you need interoperability,” said Trevett. “It’s foundational.”

The panel also addressed the lasting and fleeting effects of the wave of attention that has seemingly passed over the metaverse.

“We go through these hype cycles and bubbles,” said Rothman. “There are always technological innovations that come out of them.”

AI, an overarching theme of the conference, also made an appearance. The panel raised one concern with the technology that had not been addressed elsewhere.

“This convergence has a way more visceral impact on children’s brains even than social media,” said Pearlman.

So far, the “solution” to this problem has been for content publishers to age-restrict experiences. However, this approach has crucial shortcomings. First, most approaches to age restrictions aren’t foolproof. Second, when they are, this measure excludes young users rather than protecting them.

“We run the risk of regulating children right out of the metaverse,” said Rothman. “We need to strike a balance.”

Hitting the AWE Floor

I first started covering AWE during the pandemic when the entire conference was virtual. AWE is a lot more fun in-person but, practically speaking, the demos are the only component that can’t really happen remotely.

Meeting Wol

I actually met Wol in the Niantic Lounge before the very first session on Day One. So, while this is where it makes sense to present this content, Wol was possibly my first impression of AWE. And it was a good one. But wait, who’s Wol?

Niantic Lounge

Wol is a collaboration between 8th Wall, Liquid City, and Inworld AI. He’s an artificially intelligent character virtually embodied as an owl. His only job is to educate people about the Redwood Forest, but he’s also passionate about mushrooms, fairies, and, well, you just have to meet him.

“Wol has a lot of personal knowledge about his own life, and he can talk to you about the forest through his own experience,” explained Liquid City Director Keiichi Matsuda. “Ultimately, Wol has a mind of its own and we can only provide parameters for it.”

I met Wol through the Quest Pro in passthrough AR, via a portal that appeared in the room and opened directly into the Redwoods – and, now that I think about it, this was the day before Niantic announced that 8th Wall supported Quest Pro MR. In any case, the whole experience was magical, and I can’t wait to get home and show it to the family.

Visiting Orlando via Santa Clara

Largely thanks to a group called the Orlando Economic Partnership, Orlando is quickly becoming a global epicenter of metaverse development. Just one of their many initiatives is an 800-square-mile virtual twin of the Orlando area. The digital twin has its own in-person viewing room in Orlando but it also exists in a more bite-size iteration that runs on a Quest 2.

“The idea was to showcase the entire region – all of its assets in terms of data points that we could present,” explained the OEP’s Director of Marketing and Communications Justin Braun. “It’s going to become a platform for the city to build on.”

The version I was able to see at AWE featured photorealistic 3D models of Orlando landmarks, complete with informational slides and quiz questions. The full version, which took 11 months to build, is a lot more fully featured. It just doesn’t fit in Braun’s backpack.

“At some point, this will be able to do things that are beneficial for the city and its utilities, like show power outages,” said the OEP’s Chief Information Officer David Adelson. “It’s community-driven.”

Gathering Around the Campfire

I opened by saying that demos can’t be done remotely. I remotely demoed Campfire recently, but that was their desktop view. Campfire also offers tabletop and room-scale 3D interactions that require the company’s custom-made headset and markers. I got to try these solutions out hands-on when I reconnected with CEO and co-founder Jay Wright on the AWE floor.

Campfire at AWE USA 2023

“The perception system is designed to do one thing very well, and that’s to make multi-user AR as high-fidelity as desktop,” said Wright. And they’ve done it.

Models and mockups that I viewed in mixed reality using Campfire’s hardware were beautifully rendered. The internet connectivity at AWE is notoriously spotty and, while the controller disconnected a few times, the display never skipped a beat.

Wright demonstrated the visor that switches Campfire from MR to VR on a virtually reconstructed art museum that I could view from above in a “dollhouse mode” or travel through in a 1:1 model. In addition to showcasing more hardware and software ease-of-use, it might have been the most graphically impressive showcase I’ve seen from XR hardware ever.

The Lenovo VRX

With Lenovo ThinkReality’s new headset announced the day before AWE started, this might be the record for the shortest time between a headset’s release and my putting it on – and it’s all thanks to ARPost’s longtime Lenovo contact Bill Adams.

“We think we have one of the best passthrough headsets and most comfortable headsets in the industry,” said Adams, who made a gentleman’s wager that I would (finally) be able to see my notes through the Lenovo VRX.

I couldn’t read my writing, but I could tell where the writing was on the page – which, honestly, is enough. Having tried the same experiment on the Quest Pro earlier that day, I can back up what Adams said about the headset’s passthrough quality.

As for comfort, ditto. The headset features a removable overhead strap, but it was so comfortable that I forgot that the strap was there anyway. Switching from VR to passthrough is a simple button press.

Catching Up With Snap

The average user can have a great AR experience with just a phone, and the average creator can make a really advanced experience without creating their own app, according to Snap Senior Product Communications Manager Cassie Bumgarner.

Snap at AWE 2023

“There’s a lot of chatter on the hardware front, but what we want to show is that there’s so much more left to unlock on the mobile front,” said Bumgarner.

A Snap Lens made with QReal uses AI to identify LEGO bricks in a tub. A quick scan, and the Lens recommends small models that can be made with the available pieces. Bumgarner and I still got the fun of digging out the pieces and assembling them, and then the app created a virtual LEGO set to match our creation – in this case, a bathtub to go with the duck we made.

Of course, Snap has hardware too. On display at AWE were the virtual try-on mirrors that debuted at the Snap Partner Summit in April.

One More Day of AWE

Two days down and there’s still so much to look forward to from AWE. The expo floor is still open tomorrow. There are no more keynotes, but that just means that there’s more time for panel discussions and insightful conversations. And don’t think we forgot about the Auggies. While most of the Auggies were awarded last evening, there are still three to be awarded.

awe-usa-2023-day-one:-xr,-ai,-metaverse,-and-more

AWE USA 2023 Day One: XR, AI, Metaverse, and More

AWE USA 2023 saw a blossoming industry defending itself from negative press and a perceived rivalry with other emerging technologies. Fortunately, Day One also brought big announcements, great discussions, and a little help from AI itself.

Ori Inbar’s Welcome Address

Historically, AWE has started with an address from founder Ori Inbar. This time, it started with an address from a hologram of Ori Inbar appearing on an ARHT display.

Ori Inbar hologram

The hologram waxed on for a few minutes about progress in the industry and XR’s incredible journey. Then the human Ori Inbar appeared and told the audience that everything that the hologram said was written by ChatGPT.

While (the real) Inbar quipped that he uses artificial intelligence to show him how not to talk, he addressed recent media claims that AI is taking attention and funding away from XR. He has a different view.

it’s ON !!! Ori Inbar just started his opening keynote at #AWE2023. Holo-Ori was here thanks to our friends from @arht_tech.@como pic.twitter.com/Do23hjIkST — AWE (@ARealityEvent) May 31, 2023

“We industry insiders know this is not exactly true … AI is a good thing for XR. AI accelerates XR,” said Inbar. “XR is the interface for AI … our interactions [with AI] will become a lot less about text and prompts and a lot more about spatial context.”

“Metaverse, Shmetaverse” Returns With a Very Special Guest

Inbar has always been bullish on XR. He has been skeptical of the metaverse.

At the end of his welcome address last year, Inbar praised himself for not saying “the M word” a single time. The year before that, he opened the conference with a joke game show called “Metaverse, Shmetaverse.” Attendees this year were curious to see Inbar share the stage with a special guest: Neal Stephenson.

Neal Stephenson

Stephenson’s 1992 book, Snow Crash, introduced the world to the word “metaverse” – though Stephenson said that he wasn’t the first one to imagine the concept. He also addressed the common concern that the term for shared virtual spaces came from a dystopian novel.

“The metaverse described in Snow Crash was my best guess about what spatial computing as a mass medium might look like,” said Stephenson. “The metaverse itself is neither dystopian nor utopian.”

Stephenson then commented that the last five years or so have seen the emergence of the core technologies necessary to create the metaverse, though it still suffers from a lack of compelling content. That’s something that his company, Lamina1, hopes to address through a blockchain-based system for rewarding creators.

“There have to be experiences in the metaverse that are worth having,” said Stephenson. “For me, there’s a kind of glaring and frustrating lack of support for the people who make those experiences.”

AWE 2023 Keynotes and Follow-Ups

Both Day One and Day Two of AWE start out with blocks of keynotes on the main stage. On Day One, following Inbar’s welcome address and conversation with Stephenson, we heard from Qualcomm and XREAL (formerly Nreal). Both talks kicked off themes that would be taken up in other sessions throughout the day.

Qualcomm

From the main stage, Qualcomm Vice President and General Manager of XR, Hugo Swart, presented “Accelerating the XR Ecosystem: The Future Is Open.” He commented on the challenge of developing AR headsets, but mentioned the half-dozen or so Qualcomm-enabled headsets released in the last year, including the Lenovo ThinkReality VRX announced Tuesday.

Hugo Swart

Swart was joined on the stage by OPPO Director of XR Technology, Yi Xu, who announced a new Qualcomm-powered MR headset that would become available as a developer edition in the second half of this year.

As exciting as those announcements were, it was a software announcement that really made a stir. It’s a new Snapdragon Spaces tool called “Dual Render Fusion.”

“We have been working very hard to reimagine smartphone XR when used with AR glasses,” said Swart. “The idea is that mobile developers designing apps for 2D expand those apps to world-scale apps without any knowledge of XR.”

Keeping the Conversation Going

Another talk, “XR’s Inflection Point,” presented by Qualcomm Director of Product Management Steve Lukas, provided a deeper dive into Dual Render Fusion. The tool allows an experience to use a mobile phone’s display and a headworn device’s display simultaneously. Existing app development tools hadn’t allowed this because (until now) it didn’t make sense.

Steve Lukas

“To increase XR’s adoption curve, we must first flatten its learning curve, and that’s what Qualcomm just did,” said Lukas. “We’re not ready to give up on mobile phones so why don’t we stop talking about how to replace them and start talking about how to leverage them?”

A panel discussion, “Creating a New Reality With Snapdragon Today,” moderated by Qualcomm Senior Director of Product Management XR Said Bakadir, brought together Xu, Lenovo General Manager of XR and Metaverse Vishal Shah, and DigiLens Vice President of Sales and Marketing Brian Hamilton. They largely addressed the need to rethink AR content and delivery.

From left to right: Vishal Shah, Brian Hamilton, Yi Xu, and Said Bakadir

“When I talk to the developers, they say, ‘Well there’s no hardware.’ When I talk to the hardware guys, they say, ‘There’s no content.’ And we’re kind of stuck in that space,” said Bakadir.

Hamilton and Shah both said, in their own words, that Qualcomm is creating “an all-in-one platform” and “an end-to-end solution” that solves the content/delivery dilemma that Bakadir opened with.

XREAL

In case you blinked and missed it, Nreal is now XREAL. According to a release shared with ARPost, the name change had to do with “disputes regarding the Nreal mark” (probably how similar it was to “Unreal”). But, “the disputes were solved amicably.”

Chi Xu

The only change is the name – the hardware and software are still the hardware and software that we know and love. So, when CEO Chi Xu took the stage to present “Unleashing the Potential of Consumer AR,” he focused simply on progress.

From one angle, that progress looks like a version of XREAL’s AR operating system for Steam Deck, which Xu said is “coming soon.” From another angle, it looks like the partnership with Sightful, which recently resulted in “Spacetop” – the world’s first AR laptop.

XREAL also announced Beam, a controller and compute box that can connect wirelessly or via a hard connection to XREAL glasses, built specifically for streaming media. Beam also allows comfort and usability settings for the virtual screen that aren’t supported by the company’s current console and app integrations. Xu called it “the best TV innovation since TV.”

AI and XR

A number of panels and talks also picked up on Inbar’s theme of AI and XR. And, as far as I saw, they all agreed with Inbar’s assessment that there is no actual competition between the two technologies.

The most in-depth discussion on the topic was “The Intersection of AI and XR,” a panel discussion between XR ethicist Kent Bye, Lamina1 CPO Tony Parisi, and HTC Global VP of Corporate Development Alvin Graylin, moderated by WXR Fund Managing Partner Amy LaMeyer.

From left to right: Amy LaMeyer, Tony Parisi, Alvin Graylin, Kent Bye

“There’s this myth that AI is here so now XR’s dead, but it’s the complete opposite,” said Graylin, who pointed out that most forms of tracking and input, as well as approaches to scene understanding, are driven by AI. “AI has been part of XR for a long time.”

While they all agreed that AI is a part of XR, the group disagreed on the extent to which AI could take over content creation.

“A lot of people think AI is the solution to all of their content creation and authoring needs in XR, but that’s not the whole equation,” said Parisi.

Graylin countered that AI will increasingly be able to replace human developers. Bye in particular was vocal that we should be reluctant and suspicious of handing over too much creative power to AI in the first place.

“The differentiating factor is going to be storytelling,” said Bye. “I’m seeing a lot of XR theater that has live actors doing things that AI could never do.”

Web3, WebXR, and the Metaverse

The conversation about the relationship between the metaverse and Web3 is still continuing. With both focusing on the ideas of openness and interoperability, WebXR has become a common ground between the two. WebXR is also the most accessible of these technologies from a hardware perspective.

“VR headsets will remain a niche tech like game consoles: some people will have them and use them and swear by them and won’t be able to live without them, but not everyone will have one,” Nokia Head of Trends and Innovation Scouting, Leslie Shannon, said in her talk “What Problem Does the Metaverse Solve?”

Leslie Shannon

“The majority of metaverse experiences are happening on mobile phones,” said Shannon. “Presence is more important than immersion.”

Wonderland Engine CEO Jonathan Hale asked “Will WebXR Replace Native XR?” with The Fitness Resort COO Lydia Berry. Berry commented that the availability of WebXR across devices helps developers make their content accessible as well as discoverable.

Lydia Berry and Jonathan Hale

“The adoption challenges around glasses are there. We’re still in the really early adoption phase,” said Berry. “We need as many headsets out there as possible.”

Hale also added that WebXR is being taken more seriously as a delivery method by hardware manufacturers who were previously mainly interested in pursuing native apps.

“More and more interest is coming from hardware manufacturers every day,” said Hale. “We just announced that we’re working with Qualcomm to bring Wonderland Engine to Snapdragon Spaces.”

Keep Coming Back

AWE Day One was a riot, but there’s a lot more where that came from. Day Two kicks off with keynotes from Magic Leap and Niantic; there are more talks, more panels, and more AI; and the expo floor opens up for demos. We’ll see you tomorrow.

immersive-technology-for-the-american-workforce-act:-legislation-that-aims-to-provide-equitable-access-to-xr-tech

Immersive Technology for the American Workforce Act: Legislation That Aims to Provide Equitable Access to XR Tech

The Immersive Technology for the American Workforce Act of 2023 was drafted by Rep. Lisa Blunt Rochester (D-DE) and Rep. Tim Walberg (R-MI) with the support of organizations like the XR Association (XRA), Talespin, Unity, Association for Career and Technical Education, Transfr, and HTC VIVE, among others.

“Emerging technologies, such as XR, can help meet people where they are and expand access to cutting-edge technology and training resources,” remarked XRA CEO Elizabeth Hyman in a press release shared with ARPost. “Rep. Lisa Blunt Rochester’s and Rep. Tim Walberg’s bill recognizes the importance of equitable access to skills training and workforce development programs and the key role immersive technology plays in delivering better outcomes.”

What Is the Immersive Technology for the American Workforce Act of 2023?

One advantage of incorporating immersive technologies into workforce training is that they are cost-effective and safer than traditional hands-on instruction. They can also extend training to underserved communities, as well as to workers with disabilities.

The Immersive Technology for the American Workforce Act aims to create a five-year program that provides support to various institutions, allowing them to utilize immersive technologies in their educational and training programs.

Furthermore, it aims to promote the development of inclusive technology while prioritizing underserved communities, such as rural areas and areas of substantial unemployment. It seeks to foster partnerships between private and public entities to address skills gaps, meet the needs of the workforce, and assist individuals who are facing barriers to employment.

“We’re excited to be able to work with Rep. Blunt Rochester, a member of Congress who cares deeply about ensuring underserved populations are able to tap into next-generation technology and skills training,” said XRA Senior Vice President of Public Policy Joan O’Hara.

Almost a quarter of Americans live in rural communities that face unique workforce challenges. Moreover, the US Bureau of Labor Statistics reported that, at the start of 2023, the country had 10.5 million unfilled jobs. The bill seeks to fill these gaps by giving Americans from underserved communities and various backgrounds access to effective, high-quality training programs.

“XR technologies can dramatically change the way America’s workforce is recruited, trained, skilled, and upskilled. Scalable solutions are necessary to meet the diverse needs of today’s undiscovered talent to meet the needs of our complex workforce,” said Transfr CEO Bharanidharan Rajakumar.

How Will the Legislation Impact the Future of Work?

The Immersive Technology for the American Workforce Act follows in the footsteps of “recent legislative successes”, such as the Access to Digital Therapeutics Act of 2023, which effectively extends “coverage for prescription digital therapeutics”. It aims to provide support, in the form of grants, to community colleges and career and technical education centers.

The grants will allow them to leverage XR technology for purposes such as workforce development and skills training. Furthermore, the Immersive Technology for the American Workforce Act will enable such organizations and facilities to use XR technology to enhance their training, which, in turn, can help American companies meet workforce needs more quickly.

a-demo-and-fresh-look-at-campfire

A Demo and Fresh Look at Campfire

For the last few years, Campfire 3D has been expanding the world of “holographic collaboration” with a custom headset and base station as well as software that works in headset-based mixed reality and virtual reality, mobile-based augmented reality, and now on desktop 3D.

The company is currently launching new hardware and a new product package, so we spoke with CEO and co-founder Jay Wright for a software demo and an explanation of the new release.

Gather Around the Campfire

“Our mission is to deliver what we call holographic collaboration – multiple people standing around a digital model of a physical thing, whether they’re all in the same room, or across the world,” said Wright, who called it the killer app for enterprise XR. “If this can be done successfully, we have a huge potential to reduce travel, shipping, and physical reworks.”

And Wright is no stranger to enterprise XR. He developed Vuforia at Qualcomm as its vice president. Qualcomm sold the project to PTC, and Wright followed as Vuforia’s president and general manager. While Wright left Vuforia in 2018, it remains PTC’s main enterprise augmented reality arm.

The following year, Wright co-founded Campfire with Roy Ashok, Alexander Turin, Steve Worden, and Yuhan Liu, with XR pioneer Avi Bar-Zeev as founding advisor. Bar-Zeev has worked in XR since 1992, co-founding Google Earth, consulting for Linden Labs, serving as a principal architect at Microsoft, advising Tilt Five and Croquet, and serving as president of the XR Guild.

In 2021, Campfire came out of stealth and started working with companies, offering software, a headset, and a console that generates the virtual model.

A Demo on Desktop

While I haven’t yet gotten my hands on the company’s headset, the team did set me up for a demo on desktop – a major new offering for the tool. Wright did mention that he will be at the Augmented World Expo in a few weeks, so hopefully I’ll be able to try the headset there.

Basic functionality with basic models includes rotating and zooming in on models, as well as leaving annotations for other viewers to consider. This can mean labeling items on the model, or taking screenshots, marking them up, and pinning them to avoid marking up the model directly.

As long as models are made up of components, they can be “exploded” to view and manipulate those components separately. This also allows users to see how systems are composed of parts through virtual assembly, disassembly, and reassembly. A “blue ghost” shows where selected components fit into a complete system for automatic guided instructions.

Selected components can also be reconfigured with different colors or textures on the fly. They can also be made invisible to make internal components easier to see without using the explode feature. A “slice” tool provides a transparency plane that can be moved through a model to create cross-sections. All of these tools work on all platforms.
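
For the curious, the geometry behind a slice tool like that is compact enough to sketch. The code below is a generic illustration – not Campfire’s implementation – that finds where a mesh’s triangle edges cross a cutting plane, which is exactly the outline a cross-section view draws.

```python
# Cross-sectioning a triangle mesh with a plane: classify vertices by signed
# distance, then interpolate along any edge whose endpoints straddle the plane.
import numpy as np

def signed_distances(points, plane_point, plane_normal):
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    return (points - np.asarray(plane_point, dtype=float)) @ n

def slice_segments(vertices, triangles, plane_point, plane_normal):
    """Return one line segment per triangle that crosses the plane."""
    d = signed_distances(vertices, plane_point, plane_normal)
    segments = []
    for tri in triangles:
        crossings = []
        for a, b in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            if d[a] * d[b] < 0:                  # edge straddles the plane
                t = d[a] / (d[a] - d[b])         # linear interpolation factor
                crossings.append(vertices[a] + t * (vertices[b] - vertices[a]))
        if len(crossings) == 2:
            segments.append(np.array(crossings))
    return segments

# One triangle poking through the z = 0 plane yields one cross-section segment.
verts = np.array([[0.0, 0.0, -1.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
tris = np.array([[0, 1, 2]])
print(slice_segments(verts, tris, plane_point=[0, 0, 0], plane_normal=[0, 0, 1]))
```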

“We spent a lot of time on ease-of-use,” said Wright. “The user interface is really similar whether it’s on a flat screen or in VR.”

Additions and Improvements

Today’s announcement includes a streamlined software package, expanded device accessibility, a larger base station option, and a new hardware and software package for teams.

A Cross-Platform Solution

The complete Campfire ecosystem consists of hardware and software. On the hardware side, the company has its own headset – which can be used for augmented reality or with a shaded visor for virtual reality – and two consoles for different-sized models. A phone can be an AR viewer but also serves as a controller for the headset via an adapter called “the pack.”

“We did this because everybody has used a phone and knows how to use it,” explained Wright.

One person must have a headset and console but additional participants can join on mobile or now on a desktop.

“Flat screens are still very important,” said Wright. “There are very few workflows in enterprise that involve XR and that don’t involve flat screens.”

That was one of the most consistent pieces of feedback that the company received from early users leading to this announcement. Of course, the different hardware that users join on does impact their experience – though all have access to basic collaboration tools.

“Once everybody is in Campfire, everybody has access to basic tools for pointing at things and communicating,” said Wright. “A huge amount of the power in holographic collaboration is just the ability to point things out in the 3D space.”

A Streamlined Software Offering

The apps were another common point of criticism. Until this announcement, the software side consisted of two separate end-user apps – one for authoring models and one for viewing models and collaborating. Now, one app can do both jobs.

Participants can also be invited to a Campfire session via a link, just like 2D remote collaboration tools like Google Docs. This is fitting, as Wright believes that Campfire’s software has even more in common with legacy remote collaboration solutions.

“To the extent that spreadsheets or word documents drove the PC, we think that holographic collaboration does that for XR,” said Wright.

More Ways to View

Campfire launched with a tabletop console, which is great for designing smaller products like shoes or modeling consumer packaged goods. Of course, virtual models of larger objects can be scaled down, but some users wanted larger models. That’s why Campfire now offers the “studio console,” which goes on the floor instead of on a table.

Right now, viewing Campfire in AR or VR requires the company’s custom headset. However, the company is working on optimizing the application for use with the growing number of passthrough headsets available on the market.

“We don’t see this class of device as something everyone has access to,” said Wright. “But people are going to purchase these devices and expect Campfire to work on them.”

Subscriptions Rolling Out Now

As of today, there are three ways to experience Campfire. First, the application does have a functionally limited free version. Enterprise plans start at $1,500 per month and currently require contacting the company directly as it scales its public rollout. And now there’s “Campsite.”

“Campsite” bundles five enterprise licenses; two headsets, packs, and tabletop consoles; and one studio console for $15,000 per year. Wright says that the whole Campsite can be set up in less than an hour.

A Future of Enterprise Collaboration

There are other companies doing parts of what Campfire is doing. And Wright’s argument that this technology is the future is hard to refute. While other companies are likely to step up, this is definitely a company to watch right now. After everything that they learned in the last two years, it’s exciting to think of what improvements this greater rollout will inspire.

wonderland-engine-is-here-to-make-webxr-development-faster-and-easier

Wonderland Engine Is Here to Make WebXR Development Faster and Easier

WebXR development is increasingly popular. Developers want to create content that users can enjoy without having to install apps or check the compatibility of their devices.

One of the companies working for the advancement of immersive technologies, Wonderland GmbH, based in Cologne, Germany, has announced a giant leap forward in this process: the release of Wonderland Engine 1.0.0, a WebXR development platform already vouched for by top content creators.

Wonderland Engine 1.0.0 – Bringing Native XR Performance to WebXR Development

What is special about the new engine launched by Wonderland? Its headline benefit is the ability to mimic native XR performance. Before its launch, Wonderland Engine 1.0.0 passed the test of content creators.

Vhite Rabbit XR and Paradowski Creative, two companies creating XR games, used the engine to develop content. The Escape Artist, an upcoming title by Paradowski Creative, was created with Wonderland Engine 1.0.0, and its developers say that it matches native games in terms of polish and quality.

“We’re excited to announce this foundational version of Wonderland Engine, as we seek to bridge the gap between native XR app development and WebXR,” said the CEO and founder of Wonderland, Jonathan Hale, in a press release shared with ARPost. “We see a bright future for the WebXR community, for its developers, hardware, support, and content.”

Top Features of Wonderland Engine 1.0.0

The developers who choose Wonderland GmbH’s WebXR development platform to create content will be able to use the following:

  • Full 8th Wall integration – complete integration of 8th Wall AR tracking features such as face tracking, image tracking, SLAM, and VPS;
  • Runtime API rewrite – better code completion, static checks for bugs before running the code, and complete isolation for integration with other libraries;
  • Translation tools – necessary for the localization of WebXR content;
  • Benchmarking framework – to check for content performance on various devices.

Developers can find the complete list of features and bug fixes on the official release page.

According to the company, Wonderland Engine users can launch their first running app into the browser in less than two minutes. With a bit of experience, users can build a multi-user environment that supports VR, AR, and 3D in 10 minutes, as demonstrated in this video.

The XR Development Platform Is Optimized for VR Browsers

To indicate its commitment to helping content creators, Wonderland GmbH is optimizing the tool specifically for the most popular VR browsers: Meta Quest Browser, Pico Browser, and Wolvic.

Wonderland Engine-based apps support any headset that has a browser available – and any headset released in the future will automatically be supported if it has one. Apps created with Wonderland Engine can also run on mobile devices through the browser as Progressive Web Apps (PWAs), which also allows them to run offline.

Apart from the two game development companies mentioned above, the company is also working with various content creators.

“It was crucial to bring the whole ecosystem with us to test and validate the changes we made. This resulted in a highly reliable base to build upon in upcoming versions,” Hale said. “By making it easier to build XR on the web we hope to attract developers and content creators to WebXR. We see WebXR truly being able to rival native apps and offer consumers a rich world of rapidly accessible content to enjoy.”

Meet the Wonderland Team at AWE USA 2023

The creators of Wonderland Engine 1.0.0 will present the WebXR development platform at AWE USA 2023 (use ARPost’s discount code 23ARPOSTD for 20% off your ticket), which is taking place in Santa Clara, CA between May 31 and June 2.

The company is one of the sponsors of the event and will be present at booth no. 605.
