How Virtual Reality Is Revolutionizing Police Training

Law enforcement officers face complex and challenging situations, including high-risk incidents involving armed perpetrators. Unfortunately, police officers in the US typically receive less than six months of training—which is where virtual reality comes in.

VR helps meet the need for more in-depth training by providing a safe, immersive environment. It also helps officers further hone their skills, allowing them to effectively manage a wider array of situations, including highly stressful and unpredictable scenarios.

In this article, we’ll explore virtual reality’s role in police training, its benefits, and some real-life applications.

Why VR Is an Effective Training Tool

VR has many police training applications, allowing officers to improve their interactions with their communities and helping them develop the necessary reactions in a more controlled environment. It provides law enforcement officers with immersive experiences close to real-life situations, which can improve their learning and performance compared to more traditional training methods. With virtual reality police training, users interact with a simulated environment that reacts accordingly, making them feel like they’re really there.

As a police training tool, VR can be used to enhance existing aspects of training, according to a study by Laura Giessing of Heidelberg University. It has the potential to help officers become better equipped to face critical incidents on duty by acquiring skills and tactics that can be readily applied when facing high-stress situations.

Benefits of VR in Police Training

Aside from helping law enforcement officers develop skills such as communication, de-escalation, and intervention, VR training can also help them build empathy. Developing empathy allows officers to become more effective on duty by better understanding what a particular subject is going through.

Using VR as a police training tool has several key benefits, including:

Officer Safety

Police officers face complex and potentially dangerous scenarios in their line of work. Using VR for police training allows them to immerse themselves in those scenarios without the risk of physical harm.

Access to Realistic Simulations

Virtual reality can simulate realistic scenarios that elicit the same reactions as their real-world counterparts. These simulations let officers expose themselves to such scenarios repeatedly, gaining as much experience as possible before facing similar situations in the field.

Customizable Scenarios

The great thing about using virtual reality in police training is that it’s a scalable and customizable solution. This means that training academies or organizations can create custom scenarios that align with changing needs and industry best practices.

Enhanced Decision-Making Capabilities

Exposure to realistic simulations lets officers hone their critical thinking, problem-solving, and communication skills. VR training can also be modified to simulate increasingly high-stress or high-risk situations, helping officers learn how to effectively handle and de-escalate such scenarios at a more manageable pace.

Focus on Evaluation and Debriefing

VR can also help officers learn how to best evaluate a scenario and execute more in-depth debriefing sessions. That’s because users can replay different scenarios, allowing them to analyze each segment in more detail.

Real-World Examples of Police VR Training

Many police departments and organizations in the US and abroad already use VR for police training. These include:

Sacramento Police Department

This department uses immersive video simulators to recreate real-world scenarios, providing its officers with cultural competency and implicit bias training. Officers are also educated about proper decision-making and peer intervention.

Los Alamos Police Department

In 2021, the Los Alamos Police Department started applying VR technology to train its officers in more effective de-escalation tactics.

Mexico City

Mexico City established the first virtual reality training center for officers in Latin America. One of the goals of the training center is to help officers enhance their reflexes in high-risk or stressful emergency scenarios to improve their performance.

Gwent Police

Gwent Police officers benefit from a VR training program that teaches them how to respond to and make better decisions in stressful situations. The program has 10 scenarios based on real-life problems that police officers frequently encounter.

Dutch Police

The Dutch Police developed a VR simulation game that trains officers to complete different scenarios. This VR training also provides bias training for Dutch Police officers, helping them become more knowledgeable and better prevent ethnic profiling.

Hands-On Review: YOGES Handle Attachments for Quest 2 Controllers

There are a lot of possible interactions in virtual reality. The standard Quest 2 controllers just don’t always cut it anymore. Fortunately, there’s a large market of accessories manufacturers making adapters for different games and use cases. Not least among them is YOGES.

YOGES at It Again

YOGES specializes in accessories for the Meta Quest 2 headset and Quest 2 controllers. We’ve already reviewed one of their head strap alternatives for the device and found it to be comfortable and competitively priced. When they invited us to try out their “handle attachments,” we were, of course, curious.

The adapters are designed for the Quest 2 controllers and are reported to work with games including Beat Saber, Gorilla Tag, Kayak VR: Mirage, Real VR Fishing, and others. For this review, I used the grips to play Playin Pickleball, Bait!, and Kizuna AI – Touch the Beat! (that’s a Beat Saber clone with super-short sabers).

Before we jump into the playthroughs, let’s look at what’s in the box.

Unboxing

The minimal YOGES packaging for the handle attachments packs one handle for each controller, one detachable lanyard for each controller, and a connector piece turning the whole set into one two-headed controller. There are also two extra velcro ties to hold the controllers into the adapters – just in case. A set of directions is included as well, but it’s a simple setup.

The standard Quest 2 controller sits in the adapters, which are each labeled “L” or “R”. Then, a velcro tab secures the controller into the adapter via the tracking ring – so, likely not compatible with the Quest Pro controllers. The bottom of each adapter is threaded: screw on a lanyard attachment, or screw one of the adapters into either end of the connector piece.

The lightweight adapters have a hollow core encased in durable-feeling molded foam. That hollow core keeps the weight and probably the cost down, but it also means that you can insert your Quest 2 controllers without removing the lanyards from them. That’s a handy feature because you might not want these adapters for everything that you do in VR.

The full rig measures in at almost exactly two feet. Each controller in a separate adapter with the lanyard attachment measures in at about ten inches – that’s some five-and-a-half inches longer than the Quest 2 controller by itself.

The adapters extend the Quest 2 controllers but don’t allow you to interact with them in any way. That is, you’ve still got to be holding the controller to press buttons and triggers. Fortunately, the lanyard on the end is long enough that you can put it around your wrist and still reach over the entire adapter to reach the controller.

Playtesting the Adapters for Quest 2 Controllers

I was worried that the added length was going to throw off my game. It seems to me that if the adapter adds a few inches, the Quest 2 thinks that my arm is a few inches longer than it is – right? This shouldn’t make much difference when saber beating or gorilla tagging, but I was all set for playing pickleball to be a nightmare.

Playin Pickleball

But then, it wasn’t. I don’t know if the Quest 2 is smarter than I gave it credit for or if my brain was a lot more ready to accept the extended controller as a part of my arm, but I had no trouble hitting the ball reliably into targets in a practice mode.

Playin Pickleball also might be the game that has seen the most flying Quest 2 controllers in my home – lanyards are a must. However, I didn’t use the lanyards to play with the YOGES adapter – the extra length and the molded foam made it significantly easier to hold onto a paddle.

Kizuna AI – Touch the Beat!

I had a bit more of a time getting used to the adapters when I played a round of Kizuna AI – Touch the Beat!. If you haven’t played the game, it’s very similar to Beat Saber but with smaller targets, smaller sabers, and different motion challenges.

Things took some more getting used to, possibly because the sabers are narrower than a pickleball paddle so my movements needed to be even more precise. I did also hit my overhead light at least once, though I’m not entirely sure that that was because of the adapter. Still, by the end of the first song, I had a pretty memorable streak going.

Bait!

From here, I really wanted to use the adapter as a sword handle in Battle Talent, but in Battle Talent you need to hold the trigger to hold the weapon, so that was a no-go. You also pump both arms and use the joysticks to run, so I couldn’t just leave a controller down and dedicate myself to two-handed weapons. I wondered about how the handle might work as a fishing rod in Bait!.

In Bait! you hold the rod and cast with one hand but use the trigger on the other controller to reel it in. I let the left-hand controller (sans adapter) hang off of my left wrist as I used the right controller (with adapter) to do a double-handed cast. It was a little awkward because Bait! was still tracking the left-hand controller as it flopped through the air, but the cast was beautiful.

Is It Worth the Price?

Depending on where, when, and how you buy the YOGES Handle Attachments, they run between $18.58 (the price on Amazon at the time of writing) and $33.98 (the price currently listed on the YOGES website). That’s fairly competitive for adapters of this kind – and most adapter sets don’t include the connector piece.

YOGES adapters for Quest 2 controllers with velcro strap

As always, whether or not that’s worth the price depends on the games that you play. For as many games as I found improved by the adapters, there are at least as many where they wouldn’t work. Maybe that’s not the case for you. Or maybe it is, but you feel really passionate about improving your VR fishing cast or your virtual pickleball game.

I will say that in all of the games that were compatible with these adapters for Quest 2 controllers (and Bait!), my game was improved – or at least felt improved.

Parting Thoughts

So far, I continue to be pleased with YOGES. The Quest 2 Controller Handle Attachments, like the headset strap, are lightweight, low-cost, comfortable adapters. While they may not be for everyone or for every use case, they certainly have their place in the VR accessories ecosystem.

Exploring the World of Live XR Theater

The last three years may feel as though they’ve gone by pretty quickly. A few short years ago, we were seeing an explosion of interest and production in XR theater and live virtual entertainment. The pandemic left a lot of theaters empty, creating a strong need among audiences and entertainers alike.

Now it’s 2023. Theaters are open again. But, that doesn’t mean that XR theater has gone anywhere. Far from being a temporary fix to string us through an isolated event, live VR entertainment is stronger than ever. It remains a way to explore new avenues of storytelling and even bring new audiences into traditional entertainment venues.

Understanding Immersive Theater

Before we dive in, a quick clarifying note may be required. While some readers will hopefully come from a theater background, most are likely more familiar with XR terminology, so one particular term might be confusing.

When ARPost describes an experience as “immersive,” we’re usually talking about a 3D virtual environment that is spatially explored either by physical movement in augmented or mixed reality, or through spatial navigation in virtual reality. However, XR does not have a monopoly on the word.

“Immersive theater” is a term from the live entertainment world that far predates XR and XR theater. In this form of immersive theater, participants converse with actors, manipulate props, and physically move through sets that might take up an entire building. While the pandemic played a part in the growth of XR theater, its roots are in immersive theater.

“Due to our familiarity with the genre of immersive theatre, and some of our team members had prior experience performing in and being audience members in VR theatre shows, the transition from in real life (IRL) to VR was very natural,” Ferryman Collective founding member Stephen Butchko told ARPost.

Ferryman Collective, one of the premier production companies in XR theater, was founded during the pandemic, but its founding members had already been performing immersive theater in live venues for years. In fact, one of Ferryman Collective’s first major productions, The Severance Theory: Welcome to Respite, began life as an in-person immersive theater production.

From Gaming to XR Theater

The Under Presents, released in 2019, might be the first major piece of XR theater. Tender Claws, the development studio behind the production, had been exploring innovative digital productions and engagements for four years already, but The Under Presents is where our story begins.

The experience, built as a game that sometimes featured live actors, introduced countless viewers to live XR theater. It also inspired other artists at a time when the theater community was in dire need of something new and different.

“Born out of the Pandemic”

“Ferryman Collective was born out of the pandemic and brought together by the magic of The Under Presents, or ‘TUP’, as we affectionately call it,” Ferryman Collective founding member Deirdre Lyons told ARPost. “The pandemic shut everything down in 2020 except for TUP, as people performed and participated in it from home.”

In 2019, Lyons was one of Tender Claws’ VR actors – a job that she still holds while also producing, directing, and acting in productions by Ferryman Collective. A number of members of Ferryman Collective met while working on TUP.

The live show was only supposed to run for three months, but its run was extended due to high popularity. The live component of the app and game was eventually closed, leaving actors free to work on other projects, with Tender Claws’ second major XR theater production, Tempest, coming out the following year.

Ferryman Collective’s first production, PARA, a horror story about a dubious AI startup, came out in the autumn of 2020. The show was written by Braden Roy, and was directed by Roy and Brian Tull, who had also met working on TUP. Roy also wrote Ferryman Collective’s second production, Krampusnacht, directed by Roy, Tull, and Lyons in the winter of 2020-2021.

XR Theater Meets Immersive Theater

Ferryman Collective learned a lot from PARA and Krampusnacht. The latter got the collective their first award nomination, with a run that was extended four times to keep up with interest. However, the collective’s breakout production was The Severance Theory: Welcome to Respite – an XR adaptation of a pre-pandemic live immersive theater production.

“Having experienced quiet moments of contemplation with other audience members within my experience as an actor on TUP, I knew that this medium had the potential for a profound connection,” said Lyons. “Having done some voiceover work on The Severance Theory: Welcome to Respite […] I felt this piece could be that kind of powerful experience in VR.”

Lyons reached out to the play’s creator, Lyndsie Scoggin, who had also been sidelined by the pandemic. Scoggin went from not owning a headset to writing and directing the XR theater adaptation, which took on a life of its own.

“The IRL version of [Welcome to Respite] was performed for one audience member who plays a seven-year-old kid named Alex,” Butchko told ARPost. “In the VR version, we are able to include up to nine additional audience members who are put into invisible avatars and play the alternate aspects of Alex’s personality, the Alter Egos.”

Bringing in Participants

Ferryman Collective’s approach to Welcome to Respite brings in more participants per show, but it also allows the participants to control the story as a group: each one gets a vote to determine the actions taken by the singular Alex over the course of the play.

Expanding the scale of XR theater audiences is one of the pioneering pursuits of “scrappy storyteller” Brandan Bradley. Bradley has been exploring XR since 2017 but really dove into it during the pandemic. During this time he has launched his own projects and XR theater productions and has also acted in productions by Ferryman Collective.

“The pandemic brought about this collision of my two loves: interactive media and fine arts,” Bradley told ARPost in a 2020 interview.

NON-PLAYER CHARACTER - a VR Musical - Brandan Bradley

Bradley’s current production, NPC, brings in a group decision dynamic similar to Welcome to Respite. Bradley plays a side character in a video game who sees the main character die and turns to the audience for guidance. The audience consists of four “on-stage” participants who interact with him directly and a larger “seated audience” that watches the action unfold.

Expanding the Audience

Splitting the audience like this does a number of things for Bradley. Traditional immersive theater experiences might only have the participating audience – and most XR theater still works that way. From a strictly box office perspective, bringing in the “seated audience” allows Bradley to sell significantly more tickets per performance.

There’s also an audience accommodation aspect. While the “seated audience” might be interested in seeing a story that is shaped by the audience, shaping the story themselves might not be their cup of tea. Further, the “seated audience” can join on more widely affordable and available devices – including a web browser.

“There is a large contingency of the audience that enjoys a more passive role – like a Twitch chat come to life,” Bradley told me over coffee at AWE. “My mom, who will never put on goggles, is willing to join on the keyboard.”

Bradley’s OnBoardXR – a sort of workshop and venue for XR entertainers to begin developing and testing live performances – uses a similar ticketing model. In a lobby, audience members choose different avatars to signal to the actors the degree to which they feel comfortable participating.

NPC and OnBoardXR take place in the browser and can be joined in a headset, on a desktop, or even on a mobile phone. Ferryman Collective performs in VRChat for similar options. This is a departure from Tender Claws’ VR-only productions.

“All of us would love to be The Under Presents […] but the price point is outrageous and the timetable is untenable for someone who just wants to keep producing […] we’re kind of ‘Off Broadway,’” said Bradley. “This is the balance that we’re all doing. There are things we would all love to do with more robust tools […] right now it’s more important to have more participants.”

Exploring Affordances

Anytime that anything is brought into virtual reality, there are benefits and barriers. Live theater is no different. Set and prop design, construction, and storage can be a lot easier – to the point that no XR production ever needs to be permanently ended. A show can be revived at any time because everything exists as files rather than physical objects that must be stored.

However, physicality and expression can be a trade-off. A character may be fantastically designed for VR, but controlling it and expressing through it isn’t always easy – even with special avatars with controller-activated expressions.

“Emotions within the scene must be conveyed through the actor’s voice and sometimes stylized gestures[…],” said Butchko. “Things that we cannot do easily or convincingly are eat, drink, and lay down. Those were all found in the IRL production of [Welcome to Respite], but could not be used in the VR version due to technical limitations.”

Further, if you’re still comparing XR theater with a typical play instead of immersive theater, there are a few more details that you might have missed. Some in-person immersive theater involves physical contact between actors and participants, or at least involves participants physically interacting with sets and props.

“Not all immersive shows have physical actor-audience contact but there’s still the physicality of the structure and props that can’t be replicated without building a physical space,” Tull told ARPost. “Smell and taste are noticed less, though the potpourri of an old mansion or a gin and tonic at a seedy speakeasy go a long way in completing the illusion.”

Tull further commented that, even when “physical actor-audience contact” is involved, “the visual immersion of virtual reality can almost replicate the intimacy of actual touch.” I certainly found this to be the case.

Exploring Emotion

As a participant in Ferryman Collective’s Gumball Dreams, an actor reached out and virtually put his hand on my chest. If an actor had physically done this in an IRL production, I dare say that this would have made me immensely uncomfortable in the worst way. But, in VR, this came across as intended – a moving intimate gesture between characters in a story.

Gumball Dreams has an amusing name and a brightly colored and stylized virtual world. However, the actual story is an incredibly moving exploration of mortality and consciousness. Similar themes exist in NPC, while Welcome to Respite explores the experience of psychological disorders. What makes XR theater so conducive to these heavy topics?

“At a media level, when you’re inviting the kind of immersion that VR affords, you want to do more than just comedy,” said Bradley. “There is an emotional intimacy that we experience in VR that we haven’t experienced anywhere else and don’t have words for and that’s the next degree of the storytelling experience.”

In this year’s AWE panel discussion on “XR Entertainment: The Next Generation of Movie Makers and Performers”, Ferryman Collective performer and producer Whitton Frank gave a description of XR theater that also explains its draw for audiences and entertainers alike.

“You are given a character and you are a part of the play […] you’re having emotional experiences with another human being which is why, I think, people get excited about this,” said Frank. “That is the way forward – to show people the world in a way that they haven’t seen it before.”

Find an XR Theater Experience

So, how do you know when and which XR theater experiences are available? It’s still a pretty niche field, but it’s close-knit. Start out by following groups like Tender Claws, OnBoardXR, and Ferryman Collective. Then (before or after the show), talk to the other audience members. Some will likely be new to it themselves, but others will be able to point you in the right direction.

Challenges Behind Applying Real-World Laws to XR Spaces and Ensuring User Safety

Immersive technologies bridging the gap between the physical and digital worlds can create new business opportunities. However, they also give rise to new challenges in regulation and in applying real-world laws to XR spaces. According to a World Economic Forum report, we have been relatively slow to innovate new legal frameworks for emerging technologies like AR and VR.

Common Challenges of Applying Laws to AR and VR

XR technologies like AR and VR are already considered beneficial and are used in industries like medicine and education. However, XR still harbors risks to human rights, according to an Electronic Frontier Foundation (EFF) article.

Issues like data harvesting and online harassment pose real threats to users, and self-regulation when it comes to data protection and ethical guidelines is insufficient in mitigating such risks. Some common challenges that crop up when applying real-world laws to AR and VR include intellectual property, virtual privacy and security, and product liability.

There’s also the need for a new framework tailored to fit emerging technologies, but legislative attempts at regulation may face several hurdles. It’s also worth noting that while regulation can help keep users safe, it may also potentially hamper the development of such technologies, according to Digikonn co-founder Chirag Prajapati.

Can Real-World Laws Be Applied to XR Spaces?

In an interview with IEEE Spectrum in 2018, Robyn Chatwood, an intellectual property and information technology partner at Dentons Australia, gave an example of an incident that occurred in a VR space where a user experienced sexual assault. Unfortunately, Chatwood remarked that there are no laws saying that sexual assault in VR is the same as in the real world. When asked when she thinks these issues will be addressed, Chatwood remarked that, in several years, another incident could draw more widespread attention to the problems in XR spaces. It’s also possible that, through increased adoption, society will begin to recognize the need to develop regulations for XR spaces.

On a more positive note, the trend toward regulations for XR spaces has been changing recently. For instance, Meta has rolled out a minimum distance between avatars in Horizon Worlds, its VR social media platform. This boundary prevents other avatars from getting into your avatar’s personal space. The system works by halting a user’s forward movement as they get closer to the boundary.
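Meta hasn’t published how this works under the hood, but the behavior described above maps onto a simple distance check applied at each movement step. Here is a minimal Python sketch under that assumption; the 1.2 m radius and all names are illustrative, not Meta’s actual values or API:

```python
import math

# Illustrative sketch only: Meta hasn't published Horizon Worlds' internals.
# Reported behavior: a roughly 4 ft (~1.2 m) personal boundary around avatars.
PERSONAL_SPACE_M = 1.2  # assumed radius

def distance(a, b):
    """Euclidean distance between two (x, z) ground-plane positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def step(pos, move, others, bubble=PERSONAL_SPACE_M):
    """Apply a movement step unless it would enter another avatar's bubble.

    pos and move are (x, z) tuples; others lists other avatars' positions.
    Halting is modeled by rejecting the step, matching the described behavior.
    """
    candidate = (pos[0] + move[0], pos[1] + move[1])
    if any(distance(candidate, other) < bubble for other in others):
        return pos  # forward movement halts at the boundary
    return candidate

# An avatar standing 1.0 m ahead: stepping 0.3 m closer is blocked.
print(step((0.0, 0.0), (0.3, 0.0), [(1.0, 0.0)]))  # -> (0.0, 0.0)
```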

There are also new laws being drafted to protect users in online spaces. In particular, the UK’s Online Safety Bill, which had its second reading in the House of Commons in April 2022, aims to protect users by ensuring that online platforms have safety measures in place against harmful and illegal content and covers four new criminal offenses.

In the paper, The Law and Ethics of Virtual Assault, author John Danaher proposes a broader definition of virtual sexual assault, which allows for what he calls the different “sub-types of virtual sexual assault.” Danaher also provides suggestions on when virtual acts should be criminalized and how virtual sexual assault can be criminalized. The paper also touches on topics like consent and criminal responsibility for such crimes.

There’s even a short film that brings to light pressing metaverse concerns. Privacy Lost aims to educate policymakers about the potential dangers, such as manipulation, that come with emerging technologies.

While many legal issues in the virtual world are resolved through criminal courts and tort systems, according to Gamma Law’s David B. Hoppe, these approaches lack the necessary nuance and context to resolve such legal disputes. Hoppe remarks that real-world laws may not have the specificity that will allow them to tackle new privacy issues in XR spaces and shares that there is a need for a more nuanced legal strategy and tailored legal documents to help protect users in XR spaces.

Issues with Existing Cyber Laws

The novelty of AR and VR technologies makes it challenging to implement legislation. However, for users to maximize the benefits of such technologies, their needs should be considered by developers, policymakers, and organizations that implement them. While cyber laws are in place, persistent issues still need to be tackled, such as challenges in executing sanctions for offenders and the lack of adequate responses.

The United Nations Office on Drugs and Crime (UNODC) also cites several obstacles to cybercrime investigations, such as user anonymity from technologies, attribution, which determines who or what is responsible for the crime, and traceback, which can be time-consuming. The UNODC also notes that the lack of coordinated national cybercrime laws and international standards for evidence can hamper cybercrime investigations.

Creating Safer XR Spaces for Users

Based on guidelines provided by the World Economic Forum, there are several key issues that legislators should consider. These include how laws and regulations apply to XR conduct governed by private platforms and how rules can apply when an XR user’s activities have direct, real-world effects.

The XR Association (XRA) has also provided guidelines to help create safe and inclusive immersive spaces. Its conduct policy tips to address abuse include creating tailored policies that align with a business’ product and community and including notifications of possible violations.

Moreover, the XRA has been proactive in rolling out measures for the responsible development and adoption of XR. For instance, it has held discussions on user privacy and safety in mixed reality spaces, zeroing in on how developers, policymakers, and organizations can better promote privacy, safety, and inclusion, as well as tackle issues unique to XR spaces. It also works with XRA member companies to create guidelines for age-appropriate use of XR technology, helping develop safer virtual spaces for younger users.

Other Key Players in XR Safety

Aside from the XRA, other organizations are also taking steps to create safer XR spaces. X Reality Safety Intelligence (XRSI), formerly known as X Reality Safety Initiative, is one of the world’s leading organizations focused on providing intelligence and advisory services to promote the safety and well-being of ecosystems for emerging technologies.

It has created a number of programs that help tackle critical issues and risks in the metaverse, focusing on aspects like diversity and inclusion, trustworthy journalism, and child safety. For instance, the organization has shown support for the Kids PRIVACY Act, proposed legislation that aims to implement more robust measures to protect younger users online.

XRSI has also published research and shared guidelines to create standards for XR spaces. It has partnered with Standards Australia to create the first-ever Metaverse Standards whitepaper, which serves as a guide for standards in the metaverse to protect users against risks unique to the metaverse. These are categorized as Human Risks, Regulatory Risks, Financial Risks, and Legal Risks, among other metaverse-unique risks.

The whitepaper is a collaborative effort that brings together cybersecurity experts, VR and AR pioneers, strategists, and AI and metaverse specialists. One of its authors, Dr. Catriona Wallace, is the founder of the social enterprise The Responsible Metaverse Alliance. Cybersecurity professional Kavya Pearlman, the founder and CEO of XRSI, is also one of its authors. Pearlman works with various organizations and governments, advising on policymaking and cybersecurity to help keep users safe in emerging technology ecosystems.

One such issue highlighted by the XRSI is the set of risks that come with XR data collection in three areas: medical XR and healthcare, learning and education, and employment and work. The report describes how emerging technologies create new privacy and safety concerns; risks such as a lack of inclusivity, a lack of equality in education, and a lack of experience in handling data collected in XR spaces are cropping up.

In light of these issues, the XRSI has created goals and guidelines to help address these risks. Some of the goals include establishing a standards-based workflow to manage XR-collected data and adopting a new approach to classifying such data.

The EU is also taking steps to ensure data protection in emerging technologies, with new EU laws aiming to complement the GDPR’s requirements for XR technologies and services. Moreover, the EU data protection law applies to most XR technologies, particularly for commercial applications. It’s possible that a user’s explicit consent may be required to make data processing operations legitimate.

According to the Information Technology & Innovation Foundation (ITIF), policymakers need to mitigate so-called regulatory uncertainty by making it clear how and when laws apply to AR and VR technologies. The same ITIF report stresses that they need to collaborate with stakeholder communities and industry leaders to create and implement comprehensive guidelines and clear standards for AR and VR use.

However, while creating safer XR spaces is of utmost importance, the ITIF also highlights the risks of over-regulation, which can stifle the development of new technologies. To mitigate this risk, policymakers can instead focus on developing regulations that help promote innovation in the field, such as creating best practices for law enforcement agencies to tackle cybercrime and focusing on funding for user safety research.

Moreover, the ITIF also provides some guidelines regarding privacy concerns from AR in public spaces, as well as what steps leaders and policymakers could take to mitigate the risks and challenges that come with the use of immersive technologies.

The EFF also argues that governments need to enact or update data protection legislation to protect users and their data.

There is still a long way to go when applying real-world laws to XR spaces. However, many organizations, policymakers, and stakeholders are already taking steps to help make such spaces safer for users.

Alien Invasion AR FPS Review

What better place to play a game about an alien invasion in your backyard than in your backyard? When a game studio offered to stage an alien invasion right here in my neck of the woods, I shelved my concerns about violent video games and picked up my mobile phone to see what Alien Invasion AR FPS is all about.

Resisting an Alien Invasion in Augmented Reality

Set in the not-too-distant future, Alien Invasion AR FPS by Stary tells the story of an insidious and subtle alien foe. The aliens, nicknamed “Jackers,” came in peace and even brought gifts. However, the gifts were sabotaged, and the aliens quickly showed their true colors, effectively taking over the planet.

Alien Invasion AR FPS on iPad

In Alien Invasion AR FPS, you play the part of a resistance fighter in this sort of Sci-Fi “Red Dawn” situation. Use limited resources and unlimited resourcefulness to take back your home from the Jackers. But, how does it all play out?

Narrative and Gameplay

Alien Invasion AR FPS unlocks level-by-level in an unfolding linear narrative starring you and your “commanding officer” in the resistance. The introductory video, as well as the mission brief at the beginning of each stage, features some compelling art but humdrum voice work.

Since you are a resistance fighter, most of the early missions involve tasks like planting explosives or setting up defensive positions. Each mission brief starts out by explaining how the success of the previous mission shifted the balance of the overarching conflict, which helps give a sense of purpose to gameplay that can otherwise feel repetitive.

As the game progresses, your victories unlock more resources for the resistance, including new weapons. Many of the early levels begin with a brief tutorial on how to use any new equipment that you have unlocked. You have unlimited ammunition, but health and grenades are limited and need to be sourced from throughout the levels.

The game currently consists of four levels of four stages each plus the intro video. I haven’t beaten the whole game yet, but the names of the levels and material provided by the game’s publisher suggest that the resistance does eventually succeed in driving the Jackers from Earth.

Playing Alien Invasion AR FPS

Alien Invasion AR FPS is a free app download for iOS 12 and newer, and for Android 8.0 and newer, and it’s surprisingly agile. The app is still in its early days – maybe one day it will have a marketplace for buying extra supplies, or maybe it will use the AR ad formats Niantic is exploring. But for now, it’s really just free.

From the technical perspective, the game plays out in a series of digital sets that you place in your physical environment. The game calls for a play area of almost 50 square feet, so it recommends playing outside. Even outside, I don’t think that I ever played in an area that big, but my backyard was big enough.

Once your mobile device recognizes that you’re in a large enough space, you tap the ground to place the virtual elements. Getting the angle exactly right is tricky, and if you don’t figure it out pretty well, those virtual elements can sit too high or too low, which ruins the effect and impacts playability.
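Stary hasn’t shared its placement code, but tap-to-place AR generally works by casting a ray from the tapped screen point and intersecting it with a detected ground plane; a shallow camera angle pushes the hit point far away or misses it entirely, which is one plausible reason a bad angle leaves elements too high or too low. A minimal Python sketch of that ray-plane intersection, with all names hypothetical:

```python
def intersect_ground(ray_origin, ray_dir, ground_y=0.0):
    """Intersect a camera ray with the horizontal plane y = ground_y.

    ray_origin and ray_dir are (x, y, z) tuples. Returns the hit point,
    or None when the ray is parallel to the plane or points away from it.
    (Generic tap-to-place math, not Stary's actual implementation.)
    """
    dy = ray_dir[1]
    if abs(dy) < 1e-6:  # ray parallel to the ground: no usable hit
        return None
    t = (ground_y - ray_origin[1]) / dy
    if t <= 0:          # intersection is behind the camera
        return None
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# A phone held 1.4 m up and tilted slightly downward anchors ~4 m out.
print(intersect_ground((0.0, 1.4, 0.0), (0.0, -0.33, 0.94)))
```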

Once the stage is set, you navigate through the space by physically moving through your environment. If the area isn’t large enough, you can pause the game, move to a new position, and resume the game. Typically, you perform some initial task, move to cover, and confirm that you’re in place. Then, the wave of Jackers comes for you.

Buttons on the screen manage your various healing kits, your weapons and firing, and additional equipment that you gradually unlock and use, like hand grenades.

Letdowns and Triumphs

Unfortunately, what the stage looks like doesn’t change based on your physical environment. My backyard has a shed and some stone retaining walls, so it would have been cool if the game had recognized these and incorporated them into the stage design – but I understand that that’s a huge ask for a free mobile app.

Ducking and moving from cover to cover is effective and feels right. You also have to explore each stage a little if you want to collect resources like health kits. And your health kits don’t replenish at the beginning of each stage, so at least taking a good look around before the first wave comes is highly recommended.

My general strategy was to hunker down wherever I started the level and fight in place. Although, at one point, the last Jacker in a stage refused to leave his cover, so I got up and charged through the map firing my SMG. There was definitely a moment of thinking “This is exactly the way that an AR FPS is supposed to feel.”

Speaking of “feel,” Alien Invasion AR FPS doesn’t have haptic support – the phone doesn’t vibrate when I fire a gun or get shot. This feels like a huge missed opportunity, but it surely isn’t something the developers never thought of, so I’m confident that it will come in an update at some point.

Compromises Paid Off Overall

We’ve already seen one area where the choice to make the AR FPS affordable and accessible might have meant going without some potentially more immersive features. There’s one more big thing about this app that I didn’t mention that likely fits in the same camp: it doesn’t require data or Wi-Fi. At least, not yet. The game’s roadmap includes multiplayer that probably will.

For me, this is a huge win – and it makes a lot of sense for a game that was designed to be played outdoors. As someone who’s seen too many Pokémon trainers throwing balls into their bathtubs because they didn’t have connections outside of their homes, an AR game that doesn’t require connectivity feels like a breath of fresh air.

Again, that’s with the understanding that other AR games can do things that this one can’t. As a technical showpiece for AR, this game might not blow picky critics out of the water. But, as an artistic showcase for AR, this game elevates an enjoyable and well-executed first-person shooter onto a new level of play.

But How Did it Make Me Feel?

I mentioned at the top of this piece that I’m historically not a fan of violence in video games – particularly XR video games. It was something that I struggled with as I approached Peaky Blinders: The King’s Ransom. In my playthrough, I found that the game managed graphic content in such a way that it was able to be a part of the story without overwhelming the player.

I feel similarly about AR use in Alien Invasion AR FPS. It also helps that in Alien Invasion I’m killing aliens instead of Englishmen – that sits better with me. But, the aliens aren’t rendered in such quality that I have to intimately consider their death – they don’t even bleed like the gang members and political agitators that I virtually shot down in London and Birmingham.

Returning to Alien Invasion’s use of AR as an artistic medium rather than strictly as a game development tool, there’s a lot to be said for the way that AR tells this story about, well, an alien invasion.

Early in the game, I load an anti-aircraft gun that shoots down an alien ship – and it happens over my backyard. As I watched the airship go down behind my laundry line, I imagined it crashing down the road from my house and blocking traffic. It was another one of those moments that felt like a win for the development studio: this is what an AR FPS can do.

It’s Free

Are there things that I would like to see in updates to Alien Invasion AR FPS? Yes. Are there things that I can complain about from the game? Not really. As a lightweight, connection-optional mobile-based AR FPS that you can download and play for free, I really can’t think of any reason not to recommend that you at least give the game a try.

Talespin Launches AI Lab for Product and Implementation Development

Artificial intelligence has been a part of Talespin since day one, but the company has been leaning more heavily into the technology in recent years, including through internal AI-assisted workflows and a public-facing AI development toolkit. Now, Talespin is announcing an AI lab “dedicated to responsible artificial intelligence (AI) innovation in the immersive learning space.”

“Immersive Learning Through the Application of AI”

AI isn’t the end of work – but it will change the kinds of work that we do. That’s the outlook that a number of experts take, including the team behind Talespin. They use AI to create virtual humans in simulations for teaching soft skills. In other words, they use AI to make humans more human – because those are the strengths that won’t be automated any time soon.

“What should we be doing to make ourselves more valuable as these things shift?” Talespin co-founder and CEO Kyle Jackson recently told ARPost. “It’s really about metacognition.”

Talespin has been using AI to create experiences internally since 2015, ramping up to the use of generative AI for experience creation in 2019. They recently made those AI creation tools publicly available in the CoPilot Designer 3.0 release earlier this year.

Now, a new division of the company – the Talespin AI Lab – is looking to accelerate immersive learning through AI by further developing avenues for continued platform innovation as well as offering consulting services for the use of generative AI. Within Talespin, the lab consists of over 30 team members and department heads who will work with outside developers.

“The launch of Talespin AI Lab will ensure we’re bringing our customers and the industry at large the most innovative and impactful AI solutions when it comes to immersive learning,” Jackson said in a release shared with ARPost.

Platform Innovation

CoPilot Designer 3.0 is hardly outdated, but interactive samples of Talespin’s upcoming AI-powered APIs for realistic characters and assisted content writing can already be requested through the lab, with even more generative AI tools coming to the platform this fall.

In interviews and in prepared material, Talespin representatives have stated that working with AI has more than halved the production time for immersive training experiences over the past four years. They expect that change to continue at an even more rapid pace going forward.

“Not long ago creating an XR learning module took 5 months. With the use of generative AI tools, that same content will be created in less than 30 minutes by the end of this year,” Jackson wrote in a blog post. “Delivering the most powerful learning modality with this type of speed is a development that allows organizations to combat the largest workforce shift in history.”

While the team certainly deserves credit for that, the company credits working with clients, customers, and partners as having accelerated their learnings with the technology.

Generative AI Services

That brings in the other major job of the AI Lab – generative AI consulting services. Through these services, the AI Lab will share Talespin’s learnings on using generative AI to achieve learning outcomes.

“These services include facilitating workshops during which Talespin walks clients through processes and lessons learned through research and partnership with the world’s leading learning companies,” according to an email to ARPost.

Generative AI consulting services might sound redundant but understanding that generative AI exists and knowing how to use it to solve a problem are different things. Even when Talespin’s clients have access to AI tools, they work with the team at Talespin to get the most out of those tools.

“Our place flipped from needing to know the answer to needing to know the question,” Jackson said in summing up the continued need for human experts in the AI world.

Building a More Intelligent Future in the AI Lab

AI is in a position similar to the one XR occupied in recent months, and blockchain shortly before that. Its potential is so exciting that we can forget its full realization is far from imminent.

As exciting as Talespin’s announcements are, Jackson’s blog post foresees adaptive learning and whole virtual worlds dreamed up in an instant. While these ambitions remain things of the future, initiatives like the AI Lab are bringing them ever closer.

The Multiverse: What It Is, Its Benefits, and Its Role in Helping Build the Metaverse

The metaverse continues to gain popularity worldwide. Though it depends on how you define the metaverse, it’s estimated to have over 400 million monthly active users, and the global metaverse market is projected to surpass USD 1.3 trillion by 2030.

While the metaverse is a relatively established concept, a more comprehensive idea is slowly gaining traction: the multiverse. With the rapid and constant change that the world is experiencing due to continuous technological innovation, how we communicate and access information has dramatically changed over the years.

For this article, we talked with Ronny Tome, the founder of blockchain company Ducatus Global, to help shed more light on the multiverse: what it is and the role it will play in bringing the metaverse to life.

What Is the Multiverse?

Before defining what the multiverse is, let’s do a quick review of what the metaverse is. As defined by Tome, a metaverse is a shared space that’s created when physical and virtual realities come together.

“It is an immersive environment that goes beyond the traditional concept of virtual reality as entertainment; the metaverse connects anyone – anywhere and anytime – with others in a virtual environment that offers possibilities for social interaction, community, business, and more. Today and arguably the future’s metaverse is believed to be the next evolution of the Internet,” he said.

Meanwhile, the multiverse “is the home of multiple metaverses,” as Tome puts it. In the multiverse, users can discover and experience different metaverses all at once, even if the diverse ecosystems are disparate. To expand on the concept, the multiverse is also a virtual space where users can interact with each other.

Multiverse vs. Metaverse: What Are Their Differences? 

Some of the key differences between the metaverse and the multiverse are as follows, according to LeewayHertz:

  • The metaverse is one shared virtual universe, while the multiverse houses multiple virtual universes.
  • The metaverse is interconnected, while the multiverse contains several disparate ecosystems.
  • In the metaverse, users can have ownership over their digital assets, whereas, in the multiverse, users won’t be able to own their investments in separate ecosystems.

However, while they may be two different things, Tome said in the interview that the metaverse and the multiverse will complement each other, meaning there’s no competition between these virtual universes.

Tome then introduced the idea of the GOLD Multiverse, Ducatus Global’s vision of the multiverse and part of the Ducatus Ecosystem, which allows users to utilize their rewards and tokens both in the metaverse and in real life. Think of it as a blockchain with multiple layers comprising different universes, where users can do things like shop for clothing in a virtual world and then visit a garden in real life, all connected to the metaverse.

Benefits of the GOLD Multiverse

So, how can organizations benefit from the multiverse?

According to Tome, the GOLD Multiverse opens up more opportunities for users.

“In the GOLD Multiverse, there are various metaverses, which we call worlds, that are dedicated to different values, interests, and persuasions. There’s a world of sustainability in Garden of Life, a world of heritage and tradition with the Queen Margherita Napoli, a world of health and fitness in META Gym, and more,” he said.

Tome goes on to say that, for instance, if someone is interested in bodybuilding, then they’d go to the META Gym. However, before they can get there, they’d have to go through what he calls a “gateway,” where they’ll be able to discover other worlds, which can potentially hold new opportunities for them.

For organizations planning on venturing into the multiverse, Tome suggests that it’s best to be clear about their objectives and that they should “have the tenacity to fulfill these objectives.”

For ordinary users, the GOLD Multiverse also holds several benefits, according to Tome, such as:

Accessibility

Users can support their preferred advocacies regardless of their location.

Connection

In the GOLD Multiverse, users (organizations and individuals) can freely interact with each other without any limitations of borders or influence.

Education

Users can enjoy limitless learning. They can still access new information, skills, and more with minimal resources.

Community

In the GOLD Multiverse, users can unite and bond over shared interests.

Financial Opportunities

“Each metaverse offers various chances to grow financially, either by receiving rewards or making use of the skills developed through learning experiences or by promoting their brands to users,” said Tome.

The Role of the Multiverse in Bringing the Metaverse to Life

“Our vision for GOLD Multiverse is shaped by the values of openness and inclusivity. We believe that by serving as a gateway, the GOLD Multiverse magnifies the potential of each metaverse. Potential to be seen more, experienced more, and supported by more people,” said Tome. “With a single entry through this gateway, you will immediately be made aware of the existence of the various worlds that you can discover.”

However, while the concept of the multiverse seems promising, some challenges remain. Tome notes that these challenges, such as glitches and downtime, are commonly faced by anyone going online.

“Nothing is certain. But if we are committed to our goals and objectives, the work will never stop. Technology is never constant, it’s always evolving, improving, striving to be better, and always looking for the next best thing,” Tome remarked.

Tome adds that the mission of the GOLD Multiverse is to help different projects proceed with the development of their respective metaverses.

“Our role extends beyond constructing or designing the virtual space, we also offer our expertise in making sure that objectives and goals are met,” he shared. When asked if there are any real-world rules that govern the multiverse, Tome said that, if these refer to government regulations, they’re not aware of any such practices.

“As a company though, we abide by strict professional standards in the way we do business and our work in general. We practice due diligence and vet the projects that we allow into our blockchain and the multiverse – this is our very own GOLD standard,” said Tome.

Highlighting the Top 3 XR Trends of 2023 [Insights From This Year’s AWE USA]

The 2023 edition of AWE USA not only lived up to its reputation but also reached new heights, reportedly marking its largest event to date. From cutting-edge hardware to new, groundbreaking technology and software solutions, this year had it all.

3 Trends That Will Shape the Future of XR

Let’s dive in and explore the main three trends that stood out and are bound to shape the narrative for the future of XR.

Main Focus on AR

There was a lot of discussion this year about artificial intelligence and how it will enable XR rather than replace it. Just like with the metaverse last year, AI became a new hot topic, but in terms of hardware, the spotlight was clearly on AR.

There were, of course, some notable VR-first devices presented: Lenovo announced their new ThinkReality VRX headset, which is now available for purchase ($1,299). I had a chance to give it a try and was impressed with its large sweet spot, visual clarity, and high degree of comfort. The headset includes a cooling system that takes the heat away from your face and makes the inside feel almost air-conditioned.

ThinkReality VRX

HTC presented their modular HTC Vive XR Elite ($1,099), for which they won a “Best Headworn Device” award. It can be worn either like a traditional headset with a head strap or like glasses with an external power source instead of the battery in the back. In detached form, the Vive XR Elite weighs only 270 grams.

These devices were more the exception than the rule, however, and pale in comparison to the number of AR devices showcased this year. Just on the main floor, we had Vuzix promoting their Ultralite turnkey AR solution, Sightful with its screenless Spacetop AR laptop, XREAL presenting its XREAL Air glasses, and Magic Leap returning with Magic Leap 2. Right next to those were CREAL, with its unique light field display, and Tilt Five. In the lobby, Zappar was demonstrating its $75 Cardboard-inspired device.

And that’s just the hardware; the list doesn’t include smartphone-based solutions like Snapchat’s SnapAR and Snap Lenses or Ffface.me digital clothing. Many software providers were experimenting with AR as well. Whether it was enterprise and training applications or entertainment like the laser-tag-inspired Laser Limbo, the focus on augmented reality was prevalent.

Laser-tag-inspired Laser Limbo

Subjectively, I found the XREAL and Tilt Five glasses to be the most promising choices in terms of their usefulness and affordability. Tilt Five ($359) offers six degrees of freedom and a wide 110° field of view, plus a whole range of tabletop applications and games. It also comes with a tracked controller.

Tilt Five

The XREAL Air ($488 with XReal Beam) might only have three degrees of freedom and a smaller FOV of 46°, but makes up for it with its versatility. It weighs only 79 grams and is compatible with phones, consoles, and laptops. Almost any device with a screen can be beamed into the glasses. For those looking to start experimenting with AR, both offer a good and inexpensive entry point.
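For a rough sense of what those field-of-view figures mean, the apparent width of a virtual screen grows with the tangent of half the FOV. A quick back-of-the-envelope calculation in Python, assuming the quoted numbers are horizontal FOV (vendors don’t always specify):

```python
import math

def apparent_width_m(fov_deg, distance_m):
    """Width of a virtual surface that fills the given FOV at a distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# At a 3 m virtual viewing distance:
print(round(apparent_width_m(46, 3.0), 2))   # XREAL Air's 46°: ~2.55 m wide
print(round(apparent_width_m(110, 3.0), 2))  # Tilt Five's 110°: ~8.57 m wide
```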

The Renaissance of Haptics

It was hard to ignore the sheer volume of haptic-related products at AWE. There was a surge of novel startups and original concepts along with many industry veterans returning to show off their latest progress.

I did not expect haptics to have such a strong showing and was positively taken aback. bHaptics was busy presenting its new TactGlove, and Contact CI came out with a new product called Maestro. The most established player in the space, HaptX, was there as well.

Among newer entrants, SenseGlove was celebrating their €3.25M Series A funding with a newly updated Nova 2 haptic glove. Weart demoed their TouchDIVER glove capable of not only feedback but also temperature variations, while OWO showed off their latest haptic vest that uses electrical impulses to simulate sensations. Fluid Reality stole the show with its electroosmotic device that uses an electric field to create feedback.

Fluid Reality

There were too many to list, but even this short rundown underscores how noticeable haptics were this year. Most of these products target industrial and business markets, the notable exceptions being the OWO vest ($499) and bHaptics (also $499). Both devices have their strengths and weaknesses, though I have to give extra points to OWO for taking a bold, unique approach and allowing users to configure the vest so that it can simulate discomfort as well as other unpleasant feedback. This can result in a significantly more visceral experience and a heightened feeling of presence that’s hard to replicate using other methods.

OWO Haptic Vest

Seeing all the new and creative ways to model and recreate tactile data left me impressed with what’s to come, but at the same time, underwhelmed with the more conventional approaches.

Full resistance feedback, which restricts your movement, felt detached and did not properly mirror what I was seeing inside the headset. That was the case for both SenseGlove Nova and the high-end HaptX.

Their feedback, while indeed powerful, felt very mechanical and arbitrary. There are two paradigms at play here: one tries to nail the fidelity while approximating the sensation, and the other tries to deliver the exact, realistic sensation at the cost of fidelity.

New Optics Solutions Are Coming

There were a number of booths dealing with optics and display solutions this year. It’s possible the latest push into AR helped supercharge this progress in optics. Many booths had some kind of developer kit or proof-of-concept ready. Visitors would come and literally peer into the future through these stationary prototypes.

One example was Ant Reality demonstrating their mixed waveguide solution called Crossfire. While the displays (ranging in field of view from 56° to 120°) were impressive, what made them unique was their ability to do both AR and VR. At the press of a button, the surroundings would go completely dark, turning the augmented overlay into an immersive experience. Magic Leap 2 is known for offering what is called segmented dimming, but in the case of the Crossfire, the glasses would become completely opaque despite the AWE show floor being exceptionally bright.

Ant Reality demonstrating their prototypes

Another breakthrough, never achieved before, was a light field display incorporated into an AR headset, courtesy of CREAL. Light field displays promise to solve a lot of issues, the most common one being correct focal depth. Harnessing the direction of light can produce outstanding results, but shrinking light field tech to fit into a glasses form factor still proves tricky. CREAL’s headset is an important, pioneering step in this field.

CREAL’s LFD headset

Another interesting innovation came from a company called Hypervision. Their claim to fame is an ultra-wide display capable of a 240° field of view that matches human vision. To make this happen, Hypervision used not one, not two, but four pancake lenses. Vertically, the display covers 95°, so it doesn’t quite match the human eye top to bottom, but horizontally there’s full peripheral vision. While the stitching between the screens was slightly perceptible, the ability to achieve human FOV in such a small form factor is a massive step forward.

Hypervision

Overall, this means that future generations of XR devices will have access to a wide variety of new, next-gen optics and display solutions, most of which are not yet known to the general public. Display tech doesn’t follow Moore’s Law, so it’s always difficult to make specific predictions, but there’s clearly no stagnation in the field, and some of the breakthroughs we saw this year are truly exciting.

Closing Thoughts

These are just some of the main trends and shifts we saw this year. There was a notable increase in 3D spatial display panels, such as Leia Lume Pad 2, Sony’s Spatial Display, Looking Glass, and a human-sized holographic box by ARHT.

This forms part of a larger trend of broadening the definition of spatial computing, which is sometimes expanded to include other real-world tools and technologies like visualizations, projection mapping, and 3D screens.

What also caught my eye was a noticeable reduction in locomotion solutions. Gone are the days of omnidirectional treadmills and big simulator seats. The only two exceptions were the unconventional EXIT SUIT, which suspends the wearer slightly above the ground, allowing them to run in the air, sit, fly, and perform a range of other motions (and for which the team won this year’s “AWEsome” award), and the Freeaim shoes, which act like rollers, pushing the wearer backward as they walk.

This was the last AWE hosted in Santa Clara. From next year on, the event is moving to the Long Beach Convention Center. This shift to a new, bigger venue highlights the constant growth of the XR space and that’s one trend that speaks for itself.

Guest Post


About the Guest Author(s)

Mat Pawluczuk

Mat Pawluczuk is an XR / VR writer and content creator.



How Eye Tracking Contributes to XR Analytics and Experience

Near-to-eye displays offer powerful new ways to understand what the wearer is doing – and maybe even thinking. Right now, the use of XR analytics like eye tracking is largely limited to enterprise, including use cases like education and assessment, though eye tracking also enables new input modes and display improvements.

To learn more about the present and potential future of this technology, ARPost spoke with hardware manufacturer Lumus and XR analytics specialists Cognitive3D.

Why Do We Need XR Analytics?

XR analytics can be broken down generally into learning about an experience and learning about the person in that experience.

Learning About Experiences

Learning about an experience is important for people building XR applications – both in terms of building applications that people will want to use, and in building applications that people will be able to use.

“The stakes are much higher in creating high-quality content,” Cognitive3D founder and CEO Tony Bevilacqua said in an interview with ARPost. “That means creating something that’s comfortable, that’s not going to make you sick, that’s going to be accessible and well-received.”

This kind of thing is important for anyone building anything, but it is crucial for people building in XR, according to Bevilacqua. When a gamer experiences a usability problem in a console game or mobile app, they’re likely to blame that specific game or app and move on to a different one. However, XR is still new enough that people aren’t always so understanding.

“A bad experience can create attrition not just for an app, but for the platform itself,” said Bevilacqua. “That headset might go back into the box and stay there.”

Of course, developers are also interested in “success metrics” for their experiences. This is an issue of particular importance for people building XR experiences as part of advertising and marketing campaigns where traditional metrics from web and mobile experiences aren’t as useful.

“We all kind of know that opening an app and how much time people spent, those are very surface-level metrics,” said Bevilacqua. For XR, it’s more important to understand participation – and that means deeper analytical tools.
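To make “deeper analytical tools” slightly more concrete, here is a minimal sketch of one such metric: aggregating gaze dwell time per object from fixed-rate samples. The event format, the sample rate, and the `dwell_times` helper are invented for illustration; they are not Cognitive3D’s actual SDK or event schema.

```python
from collections import defaultdict

# Hypothetical fixed-rate gaze samples: (timestamp in seconds, id of the
# object the user is looking at). Real XR analytics SDKs expose richer events.
SAMPLE_PERIOD = 0.05  # assumed seconds between gaze samples

samples = [
    (0.00, "shelf_cereal"),
    (0.05, "shelf_cereal"),
    (0.10, "poster_sale"),
    (0.15, "poster_sale"),
    (0.20, "shelf_cereal"),
]

def dwell_times(samples):
    """Total gaze dwell time per object, assuming a fixed sample rate."""
    totals = defaultdict(float)
    for _timestamp, obj in samples:
        totals[obj] += SAMPLE_PERIOD
    return dict(totals)

print(dwell_times(samples))  # {'shelf_cereal': ~0.15, 'poster_sale': ~0.10}
```

Dwell time is only one participation signal; production tools combine it with position, interaction, and comfort telemetry.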

Learning About People

In other use cases, the people interacting with an experience are the subject that the XR analytics are most interested in. In these situations, Bevilacqua describes “the headset as a vehicle for data collection.” Examples include academic research, assessing skills and competency, and consumer research.

Competency assessment and consumer research might involve digital twins that the individual interacts with in VR. How efficiently can they perform a task? What do they think about a virtual preproduction model of a car? What products draw their eyes in a virtual supermarket?

“We focus more on non-consumer-focused use cases like focus groups,” said Bevilacqua. “We try to build off of the characteristics that make VR unique.”

At least part of the reason for this is that a lot of XR devices still don’t have the hardware required for XR analytics, like eye tracking capabilities.

Building Hardware for Eye Tracking

David Goldman is the Vice President of AR Optics at Lumus. The company is primarily focused on making displays but, as a components manufacturer, they have to make parts that work with other customer requirements – including eye tracking. The company even has a few patents on its own approach to it.

According to Goldman, traditional approaches to eye tracking involve cameras and infrared lights inside of the headset. The invisible light reflects off of the eye and is captured by the camera. Those lights and cameras add some cost but, more importantly, they take up “valuable real estate, from an aesthetic perspective.”
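As a rough illustration of how the camera side works: in the common “dark pupil” technique, the pupil absorbs the infrared light and shows up as the darkest region of the frame, so the tracker can locate it and map its position to a gaze direction. The Python sketch below uses a synthetic frame and a plain centroid; it is a toy under those assumptions, not Lumus’s or any vendor’s actual pipeline, which would add corneal glints, eye models, and calibration.

```python
import numpy as np

# Synthetic 8-bit "IR eye camera" frame: bright iris/sclera, dark pupil disc.
h, w = 120, 160
frame = np.full((h, w), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:h, 0:w]
true_center = (70, 95)  # (row, col) of the toy pupil
frame[(yy - true_center[0]) ** 2 + (xx - true_center[1]) ** 2 < 15 ** 2] = 30

def estimate_pupil(frame, threshold=80):
    """Estimate the pupil center as the centroid of dark pixels."""
    rows, cols = np.nonzero(frame < threshold)  # pupil images dark under IR
    if rows.size == 0:
        return None  # no dark region found, e.g. during a blink
    return rows.mean(), cols.mean()

print(estimate_pupil(frame))  # ~ (70.0, 95.0)
```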

The patented Lumus system uses the waveguide itself as the light source because waveguides already require projecting light. This light reflects off of the eye, so all that is required is an additional inside camera, which is a lot more affordable in terms of cost and space. However, high standards for emerging experiences play a role here too.

“When you’re a company trying to introduce a whole new product, you’re trying to shave pennies off of the dollar on everything,” said Goldman. “Looking at the bill of materials, it’s unlikely to make a first generation.”

Still, more and more devices coming to market do include this hardware – including consumer devices. Why? In part because the hardware enables a lot more than just XR analytics.

Enabling New Kinds of Interactions

Eye tracking enables advanced display technologies like foveated rendering, which is one of the big reasons that it’s increasingly being included in consumer VR devices. Foveated rendering renders the small area of the display where your eye is currently looking at full fidelity while reducing detail in the periphery, where the eye can’t resolve it anyway.
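Conceptually, the technique reduces to choosing a shading rate per screen tile based on its distance from the gaze point. The Python sketch below illustrates that decision; the radii and rate tiers are invented for the example, and real engines implement this through GPU variable rate shading rather than per-tile Python.

```python
import math

def shading_rate(tile_center, gaze, fovea_radius=0.10, mid_radius=0.25):
    """Pick a resolution scale for a tile from its distance to the gaze point.

    Coordinates are normalized screen space in [0, 1]; the radii are
    illustrative values, not taken from any shipping headset.
    """
    dist = math.dist(tile_center, gaze)
    if dist < fovea_radius:
        return 1.0   # full resolution where the eye is actually pointed
    if dist < mid_radius:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution in the far periphery

gaze = (0.62, 0.40)  # latest gaze point reported by the eye tracker
for tile in [(0.60, 0.40), (0.80, 0.50), (0.10, 0.90)]:
    print(tile, "->", shading_rate(tile, gaze))
```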

AR devices currently don’t have a field-of-view large enough to benefit from foveated rendering, but Goldman said that Lumus will have a device with a field-of-view over 50 degrees before 2030.

Eye tracking also has promise as an advanced input system. Goldman cited the Apple Vision Pro, which uses a combination of eye tracking and hand-tracking to go completely controller-free. Mixed reality devices like the Apple Vision Pro and Meta Quest 3 also bring up the fact that eye tracking has different implications in AR than it does in VR.

“Effectively, you can know exactly where I’m looking and what I’m interested in, so it has its implications for advertisers,” said Goldman. “What’s less nefarious for me and more interesting as a user is contextual search.”

Power and Responsibility

As more advanced XR analytics tools come to more consumer-focused hardware, do we need to be concerned about these tools being turned on casual XR fans? It’s certainly something that we need to be watchful of.

“It’s certainly a sensitive issue,” said Goldman. “It’s certainly a concern for the consumer, so I think every company will have to address this up front.”

Bevilacqua explained that his company has adopted the XR Privacy Framework. Cognitive3D notifies individuals when certain kinds of data might be collected and gives them the option to opt out. However, Bevilacqua believes that the best option is to avoid certain kinds of data collection in the first place.

“It’s important to balance data collection with user privacy. … We have a pretty balanced view on what needs to be collected and what doesn’t,” said Bevilacqua. “For us, eye tracking is something we do not find acceptable in a consumer application.”
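One simple way to implement that stance is to gate sensitive event types behind an explicit opt-in, so the data is never collected in the first place. The sketch below is a hypothetical Python illustration of the idea, not the XR Privacy Framework or Cognitive3D’s actual code.

```python
from dataclasses import dataclass, field

@dataclass
class AnalyticsSession:
    """Collects telemetry, dropping sensitive events without explicit consent."""
    eye_tracking_consent: bool = False
    events: list = field(default_factory=list)

    SENSITIVE = {"gaze_sample", "pupil_diameter"}  # illustrative event types

    def record(self, event_type, payload):
        if event_type in self.SENSITIVE and not self.eye_tracking_consent:
            return  # never stored, so it can never leak or be repurposed
        self.events.append((event_type, payload))

session = AnalyticsSession(eye_tracking_consent=False)
session.record("scene_loaded", {"scene": "showroom"})
session.record("gaze_sample", {"x": 0.5, "y": 0.5})  # silently dropped
print(session.events)  # [('scene_loaded', {'scene': 'showroom'})]
```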

Bevilacqua also pointed out that platforms and marketplaces have their own internal guidelines that make it difficult for app developers to collect too much information on their own.

“There is acceptable use policy about what kinds of data exist and what can be used,” said Bevilacqua. “You can’t just go out and collect eye tracking data and use it for ads. That’s not something Meta is going to allow.”

All About Balance

We need XR analytics. They make for better experiences and can even improve the quality of goods and services that we enjoy and rely on in the physical world. Not to mention the benefits that the required hardware brings to consumer applications. While technologies like eye tracking can be scary if used irresponsibly, we seem to be in good hands so far.



Why Emerging Tech is Both the Cause and Solution of Tomorrow’s Labor Challenges

The post-pandemic workforce is experiencing several significant shifts, particularly in how organizations tackle labor challenges and approach talent acquisition. One of the key factors for this disruption is the emergence of new, game-changing technologies like AI and machine learning.

Today’s organizations are facing staffing needs and talent shortages due to the Great Resignation, prompting them to respond to an uncertain future by shifting how they approach the talent acquisition process.

For this article, we interviewed Nathan Robinson, CEO of the workforce learning platform Gemba, to discuss the future of work and the workplace. We’ll also shed more light on how new technologies and developments are shaping the future of talent acquisition.

Rethinking the Traditional Talent Acquisition Process

According to Robinson, today’s talent acquisition process vastly differs from what it was like a few years ago. With emerging technologies such as AI, VR, and quantum computing, many of the jobs considered in demand today didn’t even exist a decade ago. He adds that this trend will only become more pronounced as technological advancement continues to accelerate.

“As a result, corporations will no longer be able to rely on higher education to supply a steady stream of necessary talent. Instead, organizations will have to hire candidates based on their ability and willingness to learn and then provide the necessary training themselves,” he remarked.

He added that, up to a year ago, no one had ever heard of ChatGPT and no one even knew what “generative AI” meant. Today, you can find job listings for prompt engineers and large language model specialists. Robinson also shared that technological advancement isn’t linear, with each innovation advancing and accelerating the pace of development, which can potentially change how organizations approach the talent acquisition process.

“We can rightly assume that in five or ten years’ time, there will be a whole host of new positions that today we can’t reasonably predict, much less expect there to be a sufficient number of individuals already skilled or trained in that role,” Robinson told us. “That’s why we will almost certainly see a renewed focus on talent development, as opposed to acquisition, in the near future.”

How Emerging Technologies Are Changing How Organizations Look At and Acquire Talent

According to Robinson, some of the factors that have prompted this shift include the pandemic, the rise of remote and hybrid work, the Great Resignation, and Quiet Quitting. He noted that because of these shifts, the “goals and psychology of the modern worker have changed dramatically.”

“This is why now, more than ever before, organizations must be clear and intentional about the culture they cultivate, the quality of life they afford, and the opportunities for learning and growth they provide their employees,” Robinson said. “These types of ‘non-traditional’ considerations are beginning to outweigh the cut-and-dry, compensation-focused costs associated with attracting top talent in some senses.”

He also shared that this new talent acquisition process can impact organizations over time, prompting them to shift away from recruitment and instead focus more on internal employee development. According to a Gartner report, 46% of HR leaders see recruitment as their top priority.

However, Robinson thinks that, as new technologies offer better solutions to labor challenges, such as on-the-job training, this number will steadily decline as HR professionals gradually focus on developing existing talent.

Emerging Tech as Both the Cause and Solution of Future Labor Challenges

“Advanced technologies, such as AI, XR, and quantum computing, are the driving force behind the looming skills gap in that they are leading to the development of new types of roles for which we have very few trained professionals,” said Robinson.

A World Economic Forum report estimates that by 2027, machines will complete 43% of the tasks that used to be completed by humans – a significant shift from 34% in 2022. Moreover, it’s estimated that 1.1 billion jobs may be transformed by technology in the next ten years.

While emerging technologies are prompting labor challenges, they can also be seen as a solution. Robinson adds that these emerging technologies, particularly XR, can help organizations overcome the skills gap. According to him, such technologies can help organizations facilitate more efficient, cost-effective, and engaging training and development, thus allowing them to overcome such challenges.

To help potential employees overcome the upcoming skills disconnect, Robinson notes that the training should begin with management, using top-down managerial strategies and lean and agile development methodologies.

Overcoming Today’s Labor Challenges

“Today, talent acquisition is seen as a key differentiator between successful and unsuccessful companies. While I think that will continue to hold true, I also think it will soon take a backseat to employee training and development,” Robinson said. “The industry leader will no longer be whoever is able to poach the best talent. It will soon be whoever is able to train and develop their existing talent to keep pace with the changing technological and economic landscape.”

At the end of the day, according to Robinson, embracing the unknown future of work and the workplace is about being ready for anything.

“As the rate of technological advancement continues to accelerate, the gap between what we imagine the near future will be and what it actually looks like will only grow,” Robinson remarked. He suggests that instead of trying to predict every last development, it’s better to be agile and ready for the unpredictable. This means staying on top of new technologies and investing in tools that help organizations become more agile.



Celebrating the 2023 FIFA Women’s World Cup on Snapchat

The 2023 FIFA Women’s World Cup is on – and celebrations aren’t limited to the physical world. A number of innovative activations on Snapchat ensure that fans everywhere can feel like a part of the action.

Snap’s 2023 FIFA Women’s World Cup Lenses

While AR activations can be fun, they don’t often add much real value for people actually following the sport. Snap’s USWNT (the US Women’s National Team) Team Tracker Lens uses brand-new tech and classic Snapchat style to display real-time team and player information. Another lens can be used to preview Stories augmented with content from the U.S. Soccer App.

2023 FIFA Women’s World Cup Team Tracker

Curious about other teams to follow? A lens created with FIFA’s “Fancestry” quiz helps fans follow new teams based on a brief personality survey – with a unique digital jersey representing your “Fancestry.”

FIFA Fancestry Lens

Whatever team (or teams) you choose to support during the 2023 FIFA Women’s World Cup, you can show your colors with the Across the Globe Lens, which includes a different selfie lens for every team.

Global country fan Lenses

Or, show your support for women’s sports in general with the TOGETHXR Lens. And, of course, all of the teams have their own stickers and bitmoji fashions to further customize your communications.

TOGETHXR Lens

“Snapchat is honored to be a part of the 2023 World Cup,” Snap’s Strategic Partnerships Sports Lead Emma Wakely said in a release. “Through immersive content coverage, creator collaborations, and new, innovative AR experiences, Snapchatters will have an unparalleled opportunity to express their football fandom like never before.”

Snapchat’s Playbook

Snap’s strategy for the package is an interesting play. For the most part, the engagements are more stylized than those employed last year for the (Men’s) World Cup Snapchat celebrations. However, these interactions are more … interactive. Snap’s Live Garment Transfer Technology does make a comeback, though, for an AR jersey activation similar to the one we saw last year.

Perhaps the most in-depth lens in terms of interactivity is the USWNT Team Lens. This activation is most like the multi-player partnership through which Snap augmented the Super Bowl last year – but that app was only for fans physically at the Super Bowl. And what is AR for if not expanding experiences to people who can’t be at one given physical location?

Don’t forget, the Snap team isn’t the only one making lenses. To find all lenses, including those made through partnerships or by independent creators, tap the explore tab on the Snapchat camera screen and search “Women’s World Cup” or “2023 FIFA Women’s World Cup.”

Celebrate the Big Games

This has just been a look at Snapchat’s AR integrations around the 2023 FIFA Women’s World Cup. There are also special Stories, Cameo content, Spotlight Challenges, Snap Map features, and more. So pick your team and go crazy.



Looking Forward to AWE Asia 2023

If you get all of your AWE coverage from ARPost, you might be under the impression that the event is only in California – but it wouldn’t be much of a “World Expo” then, would it? In addition to frequent all-online events, AWE consists of three in-person events each year: AWE USA, AWE Europe, and AWE Asia.

AWE Asia, this year taking place in Singapore, is fast approaching, with the agenda now finalized. Attendees can look forward to hearing from over 60 speakers in over 60 sessions including keynotes, talks, and panels over the course of the two-day conference. Let’s take a look at some of the most exciting sessions.

AWE Asia Keynotes and Addresses

Day One starts off with an opening ceremony by AWE co-founder Ori Inbar, joined on stage by AWE Asia President Gavin Newton-Tanzer and Vice President Ryan Hu. This session is followed by back-to-back keynotes from HTC Global Vice President of Corporate Development Alvin Graylin and University of South Australia professor Dr. Mark Billinghurst.

Day Two also starts off with keynotes. First, “Metaverse as the Next Biggest Thing: Challenges, Roadmaps, and Standardization” by IEEE president Dr. Yu Yuan. This is followed by “ifland: A Case Study on Telco Collaboration in Building a Global Metaverse Platform” presented by SK Telecom Vice President Ikhwan Cho and Deutsche Telekom Senior Director of XR and the Metaverse Terry Schussler.

Day Two then closes with remarks and awards from Inbar, Newton-Tanzer, and AWE Asia COO and Content Director David Weeks.

The keynotes and addresses are great because they often feature some of a conference’s biggest announcements and most anticipated speakers. They’re also great because nothing is scheduled at the same time as a keynote. From here, we’ll have to start making some tough calls.

Day One Sessions

Following the AWE Asia welcome address and keynotes on Day One, the crowd is sure to split. Remain near the main stage to hear NVIDIA’s Vanessa Ching discuss “Developers, Platforms, and AI.” Venture off to a substage to hear Joe Millward and Kyle Jackson of Talespin talk about “Scaling XR Content for the Enterprise With Generative AI.”

Next up, Niantic Senior VP of Engineering Brian McClendon explains how “Niantic is Powering AR, Everywhere, All at Once.” Having seen this talk at AWE USA, I can tell you it’s worth seeing, but I can also point out that you could watch the recording online and stretch your day a little further.

Another tough decision follows. Will it be “How AI Will Enhance the Metaverse and Education” with Meta Head of Global Education Partnerships Leticia Jauregui and Zoe Immersive CEO and co-founder Emilie Joly? Or will it be “Beyond Loudness: Spatial Chat and the Future of Virtual World Communication” with Dolby Laboratories Developer Advocate Angelik Laboy?

Day One’s Marathon on the Main Stage

The afternoon of Day One has a lineup of promising presentations on the main stage. To start, Immersal Chief Marketing Officer Päivi Laakso-Kuivalainen and Graviton Interactive co-founder and Managing Director Declan Dwyer talk “Revolutionizing Fan Engagement: Augmented Reality in Stadiums Powered by Visual Positioning Systems and Spatial Computing.”

This is followed by Linux Foundation General Manager Royal O’Brien talking about “Inspiring Game Development Through Open Source.” Then, keep your seat to hear Trigger XR founder and CEO Jason Yim talk about retail, advertising, and e-commerce. A little later on the same stage, Mindverse.AI co-founder and COO Kisson Lin talks about the Web3 creator economy.

Day Two Main Stage Sessions

One can’t-miss session on Day Two comes from Dispelix APAC VP of Sales and Partnerships Andy Lin, presenting “PERFECTING COMFORT – Vision Behind Dispelix Waveguide Combiners for Near-to-Eye XR Displays.”

Some of the last regular sessions on the main stage before the AWE Asia closing address look promising as well.

First, Infocomm Assistant Director of Innovation Joanne Teh, Deloitte Center for the Edge Southeast Asia Leader Michelle Khoo, Serl.io co-founder and CEO Terence Loo, and SMRT Corporation Learning Technologies Lead Benjamin Chen have a panel discussion about “The Future of Immersive Experiences: Navigating the World of XR.”

Immediately following the panel discussion, Google’s Toshihiro Ohnuma takes the stage to discuss “Connecting Both Worlds – Google Maps and AR Core.”

In between those sessions, the substages look pretty promising.

Major Side-Stage Attractions

After Lin’s talk, head over to Substage 1 for a series of promising talks. These start with Maxar Technologies Business Development Manager Andrew Steele presenting “Experience the Digital Twin Built for Connecting Your XR Content With the Real World.” The world-scale digital twin won the Auggie for Best Use of AI at the awards ceremony in Santa Clara this spring.

Up next on the same stage, Anything World co-founder and Creative Director Sebastian Hofer explains “How AI Is Powering a Golden Age in Games Development.”

A quick break between sessions and then back to learn about “ThinkReality Solutions Powering the Enterprise Metaverse” with Lenovo Emerging Technologies Lead Martand Srivastava and Qualcomm’s Kai Ping Tee.

Lots to Take In

AWE Asia being two days instead of three certainly doesn’t solve the classic AWE problem: there’s just too much amazing content to take it all in. At least, not live anyway.

To attend AWE Asia yourself, get tickets here, and use our code AW323SEB25 for 30% off the standard ticket and PAR23VSEB for 35% off the VIP ticket.
