virtual reality

Licking this “lollipop” will let you taste virtual flavors

A demonstration of the lollipop user interface for simulating taste in virtual and augmented reality environments. Credit: Lu et al., 2024/PNAS

Virtual reality (VR) technology has long sought to incorporate the human senses into virtual and mixed-reality environments. In addition to sight and sound, researchers have been trying to add the sensations of touch, smell, and taste via various user interfaces. But taste has proved to be quite challenging. A team of Hong Kong scientists has now developed a handheld, lollipop-shaped user interface capable of re-creating several different flavors in a virtual environment, according to a new paper published in the Proceedings of the National Academy of Sciences (PNAS).

It’s well established that human taste consists of sweet, salty, sour, bitter, and umami—five basic flavors induced by chemical stimulation of the tongue and, to a lesser extent, of parts of the pharynx, larynx, and epiglottis. Re-creating those sensations in VR has resulted in a handful of attempts at a flavor user interface, relying on such mechanisms as chemical, thermal, and electrical stimulation, as well as iontophoresis.

The chemical approach usually involves applying flavoring chemicals directly onto the tongue, but this requires room for bulk storage of said chemicals, and there is a long delay time that is not ideal for VR applications. Thermal variations applied directly to the tongue can stimulate taste sensations but require a complicated system incorporating a cooling subsystem and temperature sensors, among other components.

The most mainstream method is electrical stimulation, in which the five basic flavors are simulated by varying the frequency, intensity, and direction of electrical signals on the tongue. But this method requires placing electrode patches on or near the tongue, which is awkward, and the method is prone to taste biases.

So Yiming Liu of City University of Hong Kong and co-authors opted to work with iontophoresis, in which stable taste feedback is achieved by using ions flowing through biologically safe hydrogels to transport flavor chemicals. This method is safe, requires low power consumption, allows for precise taste feedback, and offers a more natural human-machine interface. Liu et al. built on recent advances in this area by developing their portable lollipop-shaped user interface device, which also improves flavor quality and consistency.
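To make the idea of "precise taste feedback" a bit more concrete, here is a minimal sketch of how a driver for an iontophoretic interface like this might map a virtual flavor event to a hydrogel channel and a drive current. The channel layout, current ceiling, and linear intensity-to-current mapping are hypothetical illustrations, not details taken from the paper.

```python
# Minimal sketch (not from the paper): mapping a virtual flavor event to an
# iontophoresis channel and drive current. Channel names, the current limit,
# and the linear intensity-to-current mapping are all hypothetical.

MAX_SAFE_CURRENT_MA = 0.5  # hypothetical per-channel current ceiling, in mA

# One hydrogel channel per basic taste (hypothetical layout)
FLAVOR_CHANNELS = {"sweet": 0, "salty": 1, "sour": 2, "bitter": 3, "umami": 4}

def flavor_event_to_drive(flavor: str, intensity: float) -> tuple[int, float]:
    """Return (channel index, drive current in mA) for an intensity in [0, 1]."""
    if flavor not in FLAVOR_CHANNELS:
        raise ValueError(f"unknown flavor: {flavor}")
    intensity = max(0.0, min(1.0, intensity))  # clamp to the allowed range
    return FLAVOR_CHANNELS[flavor], intensity * MAX_SAFE_CURRENT_MA

# Example: a VR scene requests a mildly sour taste
channel, current_ma = flavor_event_to_drive("sour", 0.4)
print(channel, current_ma)  # -> 2 0.2
```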

Meta Quest 3S is a disappointing half-step to Carmack’s low-cost VR vision


Significant visual and comfort compromises make last year’s Quest 3 a better VR investment.

Look at all those dots. Credit: Kyle Orland

It’s been just over two years now since soon-to-depart CTO John Carmack told a Meta Connect audience about his vision for a super low-end VR headset that came in at $250 and 250 grams. “We’re not building that headset today, but I keep trying,” Carmack said at the time with some exasperation.

On the pricing half of the equation, the recently released Quest 3S headset is nearly on target for Carmack’s hopes and dreams. Meta’s new $299 headset is a significant drop from the $499 Quest 3 and the cheapest price point for a Meta VR headset since the company raised the price of the aging Quest 2 to $400 back in the summer of 2022. When you account for a few years of inflation in there, the Quest 3S is close to the $250 headset Carmack envisioned.

A new button on the underside of the Quest 3S lets you transition to pass-through mode at any time. Credit: Kyle Orland

Unfortunately, Meta must still seriously tackle the “250 grams” part of Carmack’s vision. The 514g Quest 3S feels at least as unwieldy on your face as the 515g Quest 3, and both are still quite far from the “super light comforts” Carmack envisioned. Add in all the compromises Meta made so the Quest 3S could hit that lower price point, and you have a cheap, half-measure headset that we can only really recommend to the most price-conscious of VR consumers.

Meta Quest 2 Plus

iFixit’s recent teardown of the Quest 3S shows that the new headset is more than just a spiritual successor to the cheap and popular Quest 2. In fact, iFixit found the Quest 3S optical stack uses the exact same parts as the Quest 2, right down to the LCD panels and Fresnel lenses.

In 2020, the 1832×1920 per-eye resolution offered by that visual stack represented a significant upgrade from what had come before, especially at such a low price point. Today, though, that dated display technology invites direct comparisons to the 2064×2208 per-eye display on last year’s Quest 3. With the displays sitting just inches from your eyes, that difference represents a very noticeable 20 percent drop in apparent clarity, down from 25 pixels per degree to a mere 20.
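For a rough sense of where that 20 percent figure comes from, here is the back-of-the-envelope arithmetic using the pixels-per-degree and per-eye resolution figures quoted above (a sketch for illustration only; apparent clarity also depends on the optics, not just raw pixel counts).

```python
# Back-of-the-envelope comparison using the figures quoted in this review.
quest3_ppd, quest3s_ppd = 25, 20                      # pixels per degree
quest3_res, quest3s_res = (2064, 2208), (1832, 1920)  # per-eye panel resolution

clarity_drop = 1 - quest3s_ppd / quest3_ppd
pixel_ratio = (quest3_res[0] * quest3_res[1]) / (quest3s_res[0] * quest3s_res[1])

print(f"Apparent clarity drop: {clarity_drop:.0%}")                   # -> 20%
print(f"Quest 3 pushes about {pixel_ratio:.2f}x the per-eye pixels")  # -> ~1.30x
```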

Going back to the 3S after a year spent in the Quest 3 is a bit like walking around in glasses that suddenly have a thin layer of Vaseline smeared on them. Everything looks quite a bit fuzzier, especially near the borders of the display, and edges of objects look distinctly more jagged than on the Quest 3. The difference is especially noticeable when trying to read small text in VR or make out fine details in the real world through the headset’s array of passthrough cameras.

It’s not quite a retreat to the days of the infamous “screen door effect” that plagued early Oculus-era headsets, but the distinct visual downgrade makes virtual reality experiences that much less convincing and engrossing on the 3S.

It’s the little things

The visual downgrades on the Quest 3S extend to the field of view, which narrows from 110 horizontal degrees on the Quest 3 to a mere 97 degrees on the 3S (the vertical field of view sees a smaller reduction from 97 degrees to 93 degrees). This difference isn’t as apparent as the drop in resolution between the two headsets, but it does lead to a few more “tunnel vision” moments at the margins. In a game like Beat Saber, for instance, I noticed many of my swings were rendered effectively invisible by the larger black void taking up much of my peripheral vision.

A comparative side view shows the reduced depth of the pancake lens housing on the Quest 3 (top) compared to the Quest 3S (bottom). Credit: Kyle Orland

Going back to the Fresnel-lens-based Quest 2 visual stack also means doing without the thinner pancake lenses introduced on the Quest 3. The result is an eyebox on the 3S that extends about an inch farther from your face than on the Quest 3. That might not sound like much, but having the lenses’ center of gravity farther from your face makes the headset feel a bit heavier and the fit a bit less secure as you move your head around in VR.

Then there are the compromises when it comes to fine-tuning the distance between the Quest 3S’ lenses. On the Quest 3, an adjustment wheel on the bottom of the headset lets you adjust this interpupillary distance (IPD) continuously, down to millimeter precision. On the Quest 3S, you instead manually shift the lenses into three preset grooves that are a full 5 millimeters apart. If your face’s actual IPD falls in the middle of one of those 5 mm windows, the result can be the kind of eye strain and trouble focusing that we complained about in our original Quest 2 review.
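As a rough illustration of what those fixed grooves mean in practice, the sketch below picks the nearest preset for a given IPD and reports the leftover mismatch. The 58/63/68 mm values are an assumption borrowed from the Quest 2-style three-position design, not confirmed Quest 3S specs.

```python
# Illustrative only: residual mismatch with three fixed lens positions spaced
# 5 mm apart. The 58/63/68 mm presets are assumed, not confirmed specs.
PRESETS_MM = (58, 63, 68)

def nearest_preset(user_ipd_mm: float) -> tuple[int, float]:
    """Return the closest preset and the leftover mismatch in millimeters."""
    best = min(PRESETS_MM, key=lambda p: abs(p - user_ipd_mm))
    return best, abs(best - user_ipd_mm)

print(nearest_preset(60.5))  # -> (58, 2.5): worst case, halfway between presets
print(nearest_preset(64.0))  # -> (63, 1.0): closer, but still a visible mismatch
```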

Meta has also done away with quite a few Quest 3 creature comforts in an apparent effort to keep the Quest 3S price down. The lack of an external depth sensor, for instance, can make things like pass-through video and hand tracking feel a bit more wonky than on the Quest 3. The Quest 3S is missing a standard headphone jack, too, for those still using wired headphones. And the new headset also lacks any automatic face detection, adding the small annoyance of physically tapping the power button to return from sleep mode when you put it back on.

Spend the extra money

From the front, the external cameras are the easiest way to tell the difference between the Quest 3S (left) and the Quest 3.

I’ve been comparing the Quest 3S to the Quest 3 because that’s the decision consumers considering a Meta headset will face today (if they can get over the need for a Meta account to use the headset in the first place). But Meta’s discontinuation of the aging Quest 2 means millions of current Quest 2 owners will soon be faced with the prospect of upgrading or abandoning Meta’s VR ecosystem for good, just as original Quest owners did last year.

For those current Quest 2 owners, the Quest 3S represents the cheapest way to maintain continued access to Meta’s library of VR games and apps. And that library continues to expand with everything from mind-bending indie games to quirky multiplayer arenas to single-player adventures like Batman: Arkham Shadow, which now comes free with every Quest 3 or 3S headset.

But the move from a Quest 2 to a Quest 3S is relatively small, considering the four-year gap between the similarly priced headsets. Yes, you’ll notice some significant improvements in the newer headset’s full-color pass-through cameras and its maximum frame rate (up from 90 Hz to 120 Hz). The 3S also offers a slightly more future-proofed Qualcomm Snapdragon XR2 Gen 2 processor (over the Quest 2’s original Snapdragon XR2) and slightly more precise Touch Plus controllers (which lack the annoying tracking rings found on the original Quest 2 controllers).

All told, though, the Quest 3S is far from the generational upgrade from the Quest 2 you might hope for. For that kind of apparent jump, you’re much better off shelling out a bit more money for the full-fledged Quest 3. The improvements in form factor, field of view, IPD adjustment, and especially resolution make the higher-end set well worth the extra money. That’s especially true if you can manage to track down the now-discontinued 128GB Quest 3, which is currently being closed out for just $430 (compared to $500 for the new 512GB version).

If you simply want the cheapest way to access Meta’s library of virtual reality games, the Quest 3S certainly fills that hole in the market. If you want a more robust VR experience that’s likely to hold up further into the future, though, the extra investment in a Quest 3 headset is probably worth it.

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.

Apple Vision Pro’s content drought improves with new 3D videos

Immersive Video —

It’s still not the weekly cadence we expected, but it’s something.

  • Boundless premieres tonight, taking Vision Pro users on a hot air balloon ride in Turkey.

  • Submerged will be Apple’s first fictional short film for Vision Pro.

  • Users will get a glimpse into the 2024 NBA All-Star Weekend.

  • This cryptic image teases The Weeknd’s Vision Pro “experience.”

  • The new series Elevated will tour places of interest around the world from above.

  • Apple is partnering with Red Bull for a surfing documentary.

  • Wild Life returns with an episode about elephants in a wildlife preserve.

Today, Apple announced a slate of more than a dozen upcoming Immersive Videos for its Vision Pro spatial computing headset. The first, titled Boundless, launches tonight at 9 pm ET. More will follow in the coming weeks and months.

The announcement follows a long, slow period for new Vision Pro-specific video content from Apple. The headset launched in early February with a handful of Immersive Video episodes ranging from five to 15 minutes each. Since then, only three new videos have been added.

On March 28, Apple released a highlight reel of Major League Soccer plays from the season that had ended months prior. A second episode of Prehistoric Planet, Apple’s Immersive Video dinosaur nature documentary, went live on April 19. Likewise, a new episode of the Adventure series titled “Parkour” landed on May 24.

The MLS video played more like a short ad for Apple’s MLS programming than anything else, but other Immersive Videos have impressed with their quality if not their creative ambition. They’re all short videos that put the viewer inside a moment in space and time with either animals or people doing their thing. The videos are high-resolution, and the 3D is generally well done. The production values are high, even if the narratives are light. They come across as tech demos, as much as anything, but they are impressive.

Tonight’s Boundless episode will allow viewers to see what it’s like to ride in a hot air balloon over sweeping vistas. Another episode titled “Arctic Surfing” will arrive this fall, Apple says. Sometime next month, Apple will publish the second episode of its real wildlife documentary, simply titled Wild Life. The episode will focus on elephants in Kenya’s Sheldrick Wildlife Trust. Another episode is in the works, too. “Later this year,” Apple writes in its newsroom post, “viewers will brave the deep with a bold group of divers in the Bahamas, who come face-to-face with apex predators and discover creatures much more complex than often portrayed.”

More on the way

In September, we’ll see the debut of a new Immersive Video series titled Elevated. Apple describes it as an “aerial travel series” in which viewers will fly over places of interest. The first episode will take viewers to Hawaii, while another planned for later this year will go to New England.

Apple is additionally partnering with Red Bull for a look at surfing called Red Bull: Big-Wave Surfing.

In addition to those documentary episodes, there will be three short films by year’s end. One will be a musical experience featuring The Weeknd, and another will take basketball fans inside the 2024 NBA All-Star Weekend. There will also be Submerged, the first narrative fictional Immersive Video on the platform. It’s an action short film depicting struggles on a submarine during World War II.

It’s good to see Apple finally making some movement here; the drought of content after the launch didn’t inspire confidence in the platform. Many people with mixed reality headsets use them a bunch for a few weeks but either fail to find ways to fit them into their daily habits or run out of compelling content and taper off before long. To keep people invested in visionOS, Apple needs to maintain a rapid cadence of new content that users can look forward to at least every week. Otherwise, some users will see their headsets sit on shelves, forgotten.

When I reviewed the Vision Pro, I assumed that the Immersive Video episodes would roll out weekly. That hasn’t proven to be the case, and it still doesn’t look like it will. Apple is going to have to invest more in content (and take more risks with that content, moving beyond short tech demo documentaries) to make the Vision Pro stick with customers.

Listing image by Apple

Apple’s Vision Pro goes on sale outside the US for the first time

Spatial computing —

Since February, the headset has only been available in the United States.

A Vision Pro on display at an Apple Store in Tokyo. Credit: Apple

Apple’s Vision Pro headset went on sale outside the United States for the first time today, in the first of two waves of expanded availability.

The $3,499 “spatial computing” device launched back in February in the US, but it hasn’t taken the tech world by storm. Part of that has been its regional launch, with some of the biggest markets still lacking access.

Apple announced that the product would be sold internationally during its keynote at the Worldwide Developers Conference earlier this month.

The first new markets to get Vision Pro shipments are China, Japan, and Singapore—those are the ones where it went on sale today.

A second wave will come on July 12, with the headset rolling out in Australia, Canada, France, Germany, and the United Kingdom.

When we first tested the Vision Pro in February, we wrote that it was a technically impressive device with a lot of untapped potential. It works very well as a personal entertainment device for frequent travelers, in particular. However, its applications for productivity and gaming still need to be expanded to justify the high price.

Of course, there have been conflicting rumors of late about just how expensive Apple plans to keep its mixed reality devices. One report claimed that the company put the brakes on a new version of the Vision Pro for now, opting instead to develop a cheaper alternative for a 2025 launch.

But another report in Bloomberg suggested that’s an overstatement. It simply noted that the Vision Pro 2 has been slightly delayed from its original target launch window and reported that the cheaper model will come first.

In any case, availability will have to expand and the price will ultimately have to come down if augmented reality is to become the major computing revolution that Apple CEO Tim Cook has predicted. This international rollout is the next step in testing whether there’s a market for that.

Meta’s new $199 Quest 2 price is a steal for the VR-curious

Bargain basement —

Move comes as support winds down for the original Quest headset.

For just $199, you could be having as much fun as this paid model.

Meta has announced it’s permanently lowering the price of its aging Quest 2 headset to $199 for a 128GB base model, representing the company’s lowest price yet for a full-featured untethered VR headset.

The Quest 2, which launched in 2020 at $299, famously defied tech product convention by increasing its MSRP to $399 amid inflation and supply chain issues in mid-2022. Actual prices for the headset at retail have fallen since then, though; Best Buy offered new units for $299 as of last October and for $250 by the 2023 post-Thanksgiving shopping season, for instance.

And the Quest 2 is far from the company’s state-of-the-art headset at this point. Meta launched the surprisingly expensive Quest Pro in late 2022 before dropping that headset’s price from $1,499 to $999 less than five months later. And last year’s launch of the Quest 3 at a $499 starting price brought some significant improvements in resolution, processing power, thickness, and full-color passthrough images over the Quest 2.

But for how long?

Those looking to get the Quest 2 at its new bargain MSRP should keep in mind that Meta may not be planning to support the aging headset for the long haul. Meta is currently winding down support for the original Quest headset, which launched in 2019 and no longer has access to important online features, security updates, and even new apps. The Quest 2 is just 18 months younger than the original Quest, and the new price might represent an effort to clear out remaining stock in favor of newer, more powerful Quest options.

The Quest 2 (left) has a 40 percent thicker profile than the pancake-optics-equipped Quest 3 (right). Credit: Meta

Then again, plenty of developers are still targeting apps and games at the comparatively large audience on the Quest 2, which sold an estimated 15 million units through mid-2022, roughly on par with the Xbox Series X|S over the same period. But there are some signs that Quest 2 software is selling more slowly than those hardware numbers might suggest, amid reports that many Quest purchasers are no longer active users. And Meta continues to lose massive amounts of money on its VR segment, while Sony is reportedly halting production of the PS5-tethered PSVR2 headset amid weaker-than-expected demand.

The Quest 2’s new price marks the first time Meta has offered a headset below the $250 half of the “$250 and 250 grams” target former Meta CTO John Carmack once envisioned for a “super cheap, super lightweight headset” that could bring in the mass market (the Quest 2 still weighs in at 503 grams). The new price is also stunningly cheap when you consider that, just six or seven years ago, VR-curious consumers could easily end up paying $1,500 or more (in 2024 dollars) for a high-end tethered headset and the “VR-ready” computer needed to power it.

If you’ve waited this long to see what virtual reality gaming is all about, this price drop is the perfect opportunity to indulge your curiosity for a relative pittance. Heck, it might be worth it even if your headset ends up, like mine, as a Beat Saber machine most of the time.

Apple’s first new 3D Vision Pro video since launch is only a few minutes long

Immersive Video —

Major League Soccer highlight reel is the first Immersive Video since launch.

  • All the available Immersive Video launch content fit on a small strip in the TV app. Credit: Samuel Axon

  • Initial videos were labeled as episodes in a series, but subsequent episodes haven’t come.

Tonight, Apple will debut some new Immersive Video content for the Vision Pro headset—the first sports content for the device. It doesn’t seem like much after two months of no new content, though.

Starting at 6 pm PT/9 pm ET, Vision Pro users will be able to watch a sports film captured for the platform’s Immersive Video format. The video will be a series of highlights from last year’s Major League Soccer (MLS) playoffs, and according to Six Colors, it will run just five minutes. It will be free for all Vision Pro users.

On February 2, Apple released what appeared to be the first episodes of three Immersive Video series: Adventure, Prehistoric Planet, and Wildlife. Each debuted alongside the Vision Pro’s launch with one episode labeled “Episode 1” of “Season 1.”

However, it’s been almost two months, and none of those series have received new episodes. The only other piece of Immersive Video content available is an Alicia Keys performance video that also debuted on February 2. Most of these videos were only a few minutes long.

That means that this short soccer video depicting sports moments from 2023 will be the only new piece of Immersive Video content Apple has put out since the device launched at the beginning of February.

When I reviewed the Vision Pro as an entertainment device, I lauded its capabilities for viewing 2D films and videos, but I also talked a bit about its 3D video capabilities. I said the first pieces of original 3D content from Apple seemed promising and that I looked forward to future episodes. Given that they were labeled just like Apple TV+ series in the TV app, I assumed they would arrive at a weekly cadence. Further episodes haven’t come.

Notably, Apple didn’t include a first-party app for playing 3D videos downloaded from the web with the Vision Pro, though an independent developer filled that gap with an app called Reality Player. There are a few 3D video streaming or downloading services in the visionOS App Store, but the selection is very anemic compared to what you have access to with other headsets.

Apple hasn’t been calling the Vision Pro a VR headset, opting instead for the term “spatial computing”—and that’s understandable because it does a lot more than most VR headsets.

But if you’re looking for new examples of the sorts of passive viewing content you can enjoy on other headsets, the Vision Pro is still far behind the competition two months in.

The device can display a wealth of 2D video content, but this drives home the initial impression that the Vision Pro is meant for viewing flat, 2D content as windows in 3D space. The situation isn’t quite as dire with apps and games, with a handful of new spatial apps in those categories rolling out in recent weeks.

Most apps behave just like iPad apps, presenting 2D viewports onto their content; you can position those viewports wherever you want in the room around you. Most video content is also 2D.

There are situations where that’s neat to have, but it’s surprising Apple hasn’t invested more in actual 3D content yet. In terms of new stuff, this short soccer video debuting tonight is all we have right now.

Listing image by Samuel Axon

Report: Sony stops producing PSVR2 amid “surplus” of unsold units

Too many too late? —

Pricy tethered headset falters after the modest success of original PSVR.

PSVR2 (left) next to the original PSVR. Credit: Kyle Orland / Ars Technica

It looks like Sony’s PlayStation VR2 is not living up to the company’s sales expectations just over a year after it first hit the market. Bloomberg reports that the PlayStation-maker has stopped producing new PSVR2 units as it tries to clear out a growing backlog of unsold inventory.

Bloomberg cites “people familiar with [Sony’s] plans” in reporting that PSVR2 sales have “slowed progressively” since its February 2023 launch. Sony has produced “well over 2 million” units of the headset, compared to what tracking firm IDC estimates as just 1.69 million unit shipments to retailers through the end of last year. The discrepancy has caused a “surplus of assembled devices… throughout Sony’s supply chain,” according to Bloomberg’s sources.

IDC estimates a quarterly low of 325,000 PSVR2 units shipped in the usually hot holiday season, compared to a full 1.3 million estimated holiday shipments for Meta’s then-new Quest 3 headset, which combined with other Quest products to account for over 3.7 million estimated sales for the full year.

The last of the tethered headsets?

The reported state of affairs for PSVR2 is a big change from the late 2010s when the original PlayStation VR became one of the bestselling early VR headsets simply by selling to the small, VR-curious slice of PS4 owners. At the time, the original PSVR was one of the cheapest “all-in” entry points for the nascent market of tethered VR headsets, in large part because it didn’t require a connection to an expensive, high-end gaming PC.

In the intervening years, though, the VR headset market has almost completely migrated to untethered headsets, which allow for freer movement and eliminate the need to purchase and stay near external hardware. The $550 PlayStation VR2 is also pricier than the $500 Meta Quest 3 headset, even before you add in the $500 asking price for a needed PS5. Sony’s new headset also isn’t backward compatible with games designed for the original PSVR, forcing potential upgraders to abandon most of their existing VR game libraries for the new platform.

Even before the PSVR2 launched, Sony was reportedly scaling back its ambitions for the headset (though the company denied those reports at the time and said it was “seeing enthusiasm from PlayStation fans”). And since its launch, PSVR2 has suffered from a lack of exclusive titles, featuring a lineup mostly composed of warmed-over ports long available on other headsets. An Inverse report from late last year shared a series of damning complaints from developers who have struggled to get their games to run well on the new hardware.

Put it all together, and PSVR2 seems like a too-little-too-late upgrade that has largely squandered the company’s early lead in the space. We wouldn’t be shocked if this spells the end of the line for Sony’s VR hardware plans and for mass-market tethered headsets in general.

How Virtual Reality Is Revolutionizing Police Training

Law enforcement officers face complex and challenging situations in which they must respond to high-risk incidents involving armed perpetrators. Unfortunately, police officers in the US receive less than six months of training—which is where virtual reality comes in.

Using VR helps address the need for more in-depth training by providing a safe and immersive training environment. It also helps officers further hone their skills, allowing them to effectively manage a more comprehensive array of situations, including highly stressful and unpredictable scenarios.

In this article, we’ll explore virtual reality’s role in police training, its benefits, and some real-life applications.

Why VR Is an Effective Training Tool

VR has many police training applications, allowing officers to improve their interactions with their communities and helping them develop the necessary reactions in a more controlled environment. It provides law enforcement officers with immersive experiences close to real-life situations, which can improve their learning and performance compared to more traditional training methods. With virtual reality police training, users interact with a simulated environment that reacts accordingly, making them feel like they’re really there.

As a police training tool, VR can be used to enhance existing aspects of training, according to a study by Laura Giessing of Heidelberg University. It has the potential to help officers become better equipped to face critical incidents on duty by acquiring skills and tactics that can be readily applied when facing high-stress situations.

Benefits of VR in Police Training

Aside from helping law enforcement officers further develop skills such as communication, de-escalation, and intervention, VR can also help them build empathy. Developing empathy allows officers to become more effective on duty by better understanding what a particular subject is going through.

Using VR as a police training tool has several key benefits, including:

Officer Safety

Police officers face complex and potentially dangerous scenarios in their line of work. Using VR for police training allows them to immerse themselves in those scenarios without the risk of physical harm.

Access to Realistic Simulations

Virtual reality can simulate realistic scenarios that elicit the same reactions as their real-world counterparts. These simulations give officers the opportunity for repeated exposure, letting them gain as much experience as possible before facing similar situations in the field.

Customizable Scenarios

The great thing about using virtual reality in police training is that it’s a scalable and customizable solution. This means that training academies or organizations can create custom scenarios that align with changing needs and industry best practices.

Enhanced Decision-Making Capabilities

Exposure to realistic simulations lets officers hone their critical thinking, problem-solving, and communication skills. VR training can also be modified to simulate increasingly high-stress or high-risk situations, helping officers learn how to effectively handle and de-escalate such scenarios at a more manageable pace.

Focus on Evaluation and Debriefing

VR can also help officers learn how to best evaluate a scenario and execute more in-depth debriefing sessions. That’s because users can replay different scenarios, allowing them to analyze each segment in more detail.

Real-World Examples of Police VR Training

Many police departments and organizations in the US and abroad already use VR for police training. These include:

Sacramento Police Department

This department uses immersive video simulators to recreate real-world scenarios, providing its officers with cultural competency and implicit bias training. Officers are also educated about proper decision-making and peer intervention.

Los Alamos Police Department

In 2021, the Los Alamos Police Department started applying VR technology to train its officers in more effective de-escalation tactics.

Mexico City

Mexico City established the first virtual reality training center for officers in Latin America. One of the goals of the training center is to help officers enhance their reflexes in high-risk or stressful emergency scenarios to improve their performance.

Gwent Police

Gwent Police officers benefit from a VR training program that teaches them how to respond to and make better decisions in stressful situations. The program has 10 scenarios based on real-life problems that police officers frequently encounter.

Dutch Police

The Dutch Police developed a VR simulation game that trains officers by having them complete different scenarios. This VR training also includes bias training, helping officers become more aware of ethnic profiling and better able to prevent it.

Hands-On Review: YOGES Handle Attachments for Quest 2 Controllers

There are a lot of possible interactions in virtual reality, and the standard Quest 2 controllers just don’t always cut it anymore. Fortunately, there’s a large market of accessory manufacturers making adapters for different games and use cases. Not least among them is YOGES.

YOGES at It Again

YOGES specializes in accessories for the Meta Quest 2 headset and Quest 2 controllers. We’ve already reviewed one of their head strap alternatives for the device and found it to be comfortable and competitively priced. When they invited us to try out their “handle attachments,” of course we were curious.

The adapters are designed for the Quest 2 controllers and are reported to work with games including Beat Saber, Gorilla Tag, Kayak VR: Mirage, Real VR Fishing, and others. For this writing, I used the grips to play Playin Pickleball, Bait!, and Kizuna AI – Touch the Beat! (that’s a Beat Saber clone with super-short sabers).

Before we jump into the playthroughs, let’s look at what’s in the box.

Unboxing

The minimal YOGES packaging for the handle attachments packs one handle for each controller, one detachable lanyard for each controller, and a connector piece turning the whole set into one two-headed controller. There are also two extra velcro ties to hold the controllers into the adapters – just in case. A set of directions is included as well, but it’s a simple setup.

The standard Quest 2 controller sits in the adapter, each of which is labeled “L” or “R”. Then, a velcro tab secures the controller into the adapter via the tracking ring – so the attachments are likely not compatible with the Quest Pro controllers. The bottom of each adapter is threaded: screw on a lanyard attachment, or screw one of the adapters into either end of the connector piece.

The lightweight adapters have a hollow core encased in durable-feeling molded foam. That hollow core keeps the weight (and probably the cost) down, but it also means that you can insert your Quest 2 controllers without removing the lanyards from them. That’s a handy feature because you might not want these adapters for everything that you do in VR.

The full rig measures in at almost exactly two feet. Each controller in a separate adapter with the lanyard attachment measures in at about ten inches – that’s some five-and-a-half inches longer than the Quest 2 controller by itself.

The adapters extend the Quest 2 controllers but don’t allow you to interact with them in any way. That is, you’ve still got to be holding the controller to press buttons and triggers. Fortunately, the lanyard on the end is long enough that you can put it around your wrist and still reach past the entire adapter to hold the controller.

Playtesting the Adapters for Quest 2 Controllers

I was worried that the added length was going to throw off my game. It seems to me that if the adapter adds a few inches, the Quest 2 thinks that my arm is a few inches longer than it is – right? This shouldn’t make much difference when saber beating or gorilla tagging, but I was all set for playing pickleball to be a nightmare.

Playin Pickleball

But then, it wasn’t. I don’t know if the Quest 2 is smarter than I gave it credit for or if my brain was a lot more ready to accept the extended controller as a part of my arm, but I had no trouble hitting the ball reliably into targets in a practice mode.

Playin Pickleball also might be the game that has seen the most flying Quest 2 controllers in my home – lanyards are a must. However, I didn’t use the lanyards to play with the YOGES adapter – the extra length and the molded foam made it significantly easier to hold onto the paddle.

Kizuna AI – Touch the Beat!

I had a bit more trouble getting used to the adapters when I played a round of Kizuna AI – Touch the Beat!. If you haven’t played the game, it’s very similar to Beat Saber but with smaller targets, smaller sabers, and different motion challenges.

Things took some more getting used to, possibly because the sabers are narrower than a pickleball paddle, so my movements needed to be even more precise. I did hit my overhead light at least once, though I’m not entirely sure that was because of the adapter. Still, by the end of the first song, I had a pretty memorable streak going.

Bait!

From here, I really wanted to use the adapter as a sword handle in Battle Talent, but in that game you need to hold the trigger to hold the weapon, so that was a no-go. You also pump both arms and use the joysticks to run, so I couldn’t just leave a controller down and dedicate myself to two-handed weapons. Instead, I wondered how the handle might work as a fishing rod in Bait!.

In Bait! you hold the rod and cast with one hand but use the trigger on the other controller to reel it in. I let the left-hand controller (sans adapter) hang off of my left wrist as I used the right controller (with adapter) to do a double-handed cast. It was a little awkward because Bait! was still tracking the left-hand controller as it flopped through the air, but the cast was beautiful.

Is it Worth the Price?

Depending on where, when, and how you buy the YOGES Handle Attachments, they run between $18.58 (the price on Amazon at the time of writing) and $33.98 (the price currently listed on the YOGES website). That’s fairly competitive for adapters of this kind – and most adapter sets don’t include the connector piece.

YOGES adapters for Quest 2 controllers with velcro strap.

As always, whether or not that’s worth the price depends on the games that you play. For as many games as I found improved by the adapters, there are at least as many where they wouldn’t work. Maybe that’s not the case for you. Or maybe it is, but you feel really passionate about improving your VR fishing cast or your virtual pickleball game.

I will say that on all of the games that were compatible with these adapters for Quest 2 controllers (and Bait!) my game was improved – or at least felt improved.

Parting Thoughts

So far, I continue to be pleased with YOGES. The Quest 2 Controller Handle Attachments, like the headset strap, are lightweight, low-cost, and comfortable. While they may not be for all people or all use cases, they certainly have their place in the VR accessories ecosystem.

Exploring the World of Live XR Theater

The last three years may feel as though they’ve gone by pretty quickly. A few short years ago, we were seeing an explosion of interest and production in XR theater and live virtual entertainment. The pandemic had left a lot of theaters empty, creating a strong need among audiences and entertainers alike for new venues.

Now it’s 2023. Theaters are open again. But that doesn’t mean that XR theater has gone anywhere. Far from being a temporary fix to get us through a period of isolation, live VR entertainment is stronger than ever. It remains a way to explore new avenues of storytelling and even bring new audiences into traditional entertainment venues.

Understanding Immersive Theater

Before we dive in, a quick clarifying note may be required. While some readers will hopefully come from a theater background, most are likely more familiar with XR terminology, so one particular term might be confusing.

When ARPost describes an experience as “immersive,” we’re usually talking about a 3D virtual environment that is spatially explored either by physical movement in augmented or mixed reality, or through spatial navigation in virtual reality. However, XR does not have a monopoly on the word.

“Immersive theater” is a term from the live entertainment world that far predates XR and XR theater. In this form of immersive theater, participants converse with actors, manipulate props, and physically move through sets that might take up an entire building. While the pandemic played a part in the growth of XR theater, its roots are in immersive theater.

“Due to our familiarity with the genre of immersive theatre, and some of our team members had prior experience performing in and being audience members in VR theatre shows, the transition from in real life (IRL) to VR was very natural,” Ferryman Collective founding member Stephen Butchko told ARPost.

Ferryman Collective, one of the premier production companies in XR theater, was founded during the pandemic, but its founding members had already been performing immersive theater in live venues for years. In fact, one of Ferryman Collective’s first major productions, The Severance Theory: Welcome to Respite, began life as an in-person immersive theater production.

From Gaming to XR Theater

The Under Presents, released in 2019, might be the first major piece of XR theater. Tender Claws, the development studio behind the production, had been exploring innovative digital productions and engagements for four years already, but The Under Presents is where our story begins.

The experience, built as a game that sometimes featured live actors, introduced countless viewers to live XR theater. It also inspired other artists at a time when the theater community was in dire need of something new and different.

“Born out of the Pandemic”

“Ferryman Collective was born out of the pandemic and brought together by the magic of The Under Presents, or ‘TUP’, as we affectionately call it,” Ferryman Collective founding member Deirdre Lyons told ARPost. “The pandemic shut everything down in 2020 except for TUP, as people performed and participated in it from home.”

In 2019, Lyons was one of Tender Claws’ VR actors – a job that she still holds while also producing, directing, and acting in productions by Ferryman Collective. A number of members of Ferryman Collective met while working on TUP.

The live show was only supposed to run for three months, but the run was extended due to its popularity. The live component of the app and game was eventually closed, leaving actors free to work on other projects, with Tender Claws’ second major XR theater production, Tempest, coming out the following year.

Ferryman Collective’s first production, PARA, a horror story about a dubious AI startup, came out in the autumn of 2020. The show was written by Braden Roy, and was directed by Roy and Brian Tull, who had also met working on TUP. Roy also wrote Ferryman Collective’s second production, Krampusnacht, directed by Roy, Tull, and Lyons in the winter of 2020-2021.

XR Theater Meets Immersive Theater

Ferryman Collective learned a lot from PARA and Krampusnacht. The latter got the collective their first award nomination, with a run that was extended four times to keep up with interest. However, the collective’s breakout production was The Severance Theory: Welcome to Respite – an XR adaptation of a pre-pandemic live immersive theater production.

“Having experienced quiet moments of contemplation with other audience members within my experience as an actor on TUP, I knew that this medium had the potential for a profound connection,” said Lyons. “Having done some voiceover work on The Severance Theory: Welcome to Respite […] I felt this piece could be that kind of powerful experience in VR.”

Lyons reached out to the play’s creator, Lyndsie Scoggin, who had also been sidelined by the pandemic. Scoggin went from not owning a headset to writing and directing the XR theater adaptation, which took on a life of its own.

“The IRL version of [Welcome to Respite] was performed for one audience member who plays a seven-year-old kid named Alex,” Butchko told ARPost. “In the VR version, we are able to include up to nine additional audience members who are put into invisible avatars and play the alternate aspects of Alex’s personality, the Alter Egos.”

Bringing in Participants

Ferryman Collective’s approach to Welcome to Respite brings in more participants per show, but it also allows the participants to control the story as a group, as each one gets a vote to determine the actions taken by the singular Alex over the course of the play.

Expanding the scale of XR theater audiences is one of the pioneering pursuits of “scrappy storyteller” Brandan Bradley. Bradley has been exploring XR since 2017 but really dove into it during the pandemic. During that time, he has launched his own projects and XR theater productions and has also acted in productions by Ferryman Collective.

“The pandemic brought about this collision of my two loves: interactive media and fine arts,” Bradley told ARPost in a 2020 interview.

NON-PLAYER CHARACTER - a VR Musical - Brandan Bradley

Bradley’s current production, NPC, brings in a group decision dynamic similar to that of Welcome to Respite. Bradley plays a side character in a video game who sees the main character die and turns to the audience for guidance. The audience consists of four “on-stage” participants who interact with him directly and a larger “seated audience” that watches the action unfold.

Expanding the Audience

Splitting the audience like this does a number of things for Bradley. Traditional immersive theater experiences might only have the participating audience – and most XR theater still works that way. From a strictly box office perspective, bringing in the “seated audience” allows Bradley to sell significantly more tickets per performance.

There’s also an audience accommodation aspect. While the “seated audience” might be interested in seeing a story that is shaped by the audience, shaping the story themselves might not be their cup of tea. Further, the “seated audience” can join on more widely affordable and available devices – including a web browser.

“There is a large contingency of the audience that enjoys a more passive role – like a Twitch chat come to life,” Bradley told me over coffee at AWE. “My mom, who will never put on goggles, is willing to join on the keyboard.”

Bradley’s OnBoardXR – a sort of workshop and venue for XR entertainers to begin developing and testing live performances – uses a similar ticketing model. In a lobby, audience members choose different avatars to signal to the actors the degree to which they feel comfortable participating.

NPC and OnBoardXR take place in-browser and can be joined in a headset, on a desktop, or even on a mobile phone. Ferryman Collective performs in VRChat for similar options. This is a departure from Tender Claws’ VR-only productions.

“All of us would love to be The Under Presents […] but the price point is outrageous and the timetable is untenable for someone who just wants to keep producing […] we’re kind of ‘Off Broadway,’” said Bradley. “This is the balance that we’re all doing. There are things we would all love to do with more robust tools […] right now it’s more important to have more participants.”

Exploring Affordances

Anytime anything is brought into virtual reality, there are benefits and barriers. Live theater is no different. Set and prop design, construction, and storage can be a lot easier – to the point that no XR production ever needs to be permanently ended. A show can be revived at any time because everything exists as files rather than physical objects that must be stored.

However, physicality and expression can be a trade-off. A character may be fantastically designed for VR, but controlling it and expressing through it isn’t always easy – even with special avatars with controller-activated expressions.

“Emotions within the scene must be conveyed through the actor’s voice and sometimes stylized gestures[…],” said Butchko. “Things that we cannot do easily or convincingly are eat, drink, and lay down. Those were all found in the IRL production of [Welcome to Respite], but could not be used in the VR version due to technical limitations.”

Further, if you’re still comparing XR theater with a typical play instead of immersive theater, there are a few more details that you might have missed. Some in-person immersive theater involves physical contact between actors and participants, or at least involves participants physically interacting with sets and props.

“Not all immersive shows have physical actor-audience contact but there’s still the physicality of the structure and props that can’t be replicated without building a physical space,” Tull told ARPost. “Smell and taste are noticed less, though the potpourri of an old mansion or a gin and tonic at a seedy speakeasy go a long way in completing the illusion.”

Tull further commented that, even when “physical actor-audience contact” is involved, “the visual immersion of virtual reality can almost replicate the intimacy of actual touch.” I certainly found this to be the case.

Exploring Emotion

As a participant in Ferryman Collective’s Gumball Dreams, an actor reached out and virtually put his hand on my chest. If an actor had physically done this in an IRL production, I dare say that this would have made me immensely uncomfortable in the worst way. But, in VR, this came across as intended – a moving intimate gesture between characters in a story.

Gumball Dreams has an amusing name and a brightly colored and stylized virtual world. However, the actual story is an incredibly moving exploration of mortality and consciousness. Similar themes exist in NPC, while Welcome to Respite explores the experience of psychological disorders. What makes XR theater so conducive to these heavy topics?

“At a media level, when you’re inviting the kind of immersion that VR affords, you want to do more than just comedy,” said Bradley. “There is an emotional intimacy that we experience in VR that we haven’t experienced anywhere else and don’t have words for and that’s the next degree of the storytelling experience.”

In this year’s AWE panel discussion on “XR Entertainment: The Next Generation of Movie Makers and Performers,” Ferryman Collective performer and producer Whitton Frank gave a description of XR theater that also explains the draw it has for audiences as well as entertainers.

“You are given a character and you are a part of the play […] you’re having emotional experiences with another human being which is why, I think, people get excited about this,” said Frank. “That is the way forward – to show people the world in a way that they haven’t seen it before.”

Find an XR Theater Experience

So, how do you know when and which XR theater experiences are available? It’s still a pretty niche field, but it’s close-knit. Start out by following groups like Tender Claws, OnBoardXR, and Ferryman Collective. Then (before or after the show), talk to the other audience members. Some will likely be new to it themselves, but others will be able to point you in the right direction.

Challenges Behind Applying Real-World Laws to XR Spaces and Ensuring User Safety

Immersive technologies that bridge the gap between the physical and digital worlds can create new business opportunities. However, they also give rise to new challenges in regulation and in applying real-world laws to XR spaces. According to a World Economic Forum report, we have been relatively slow to develop new legal frameworks for emerging technologies like AR and VR.

Common Challenges of Applying Laws to AR and VR

XR technologies like AR and VR are already considered beneficial and are used in industries like medicine and education. However, XR still harbors risks to human rights, according to an Electronic Frontier Foundation (EFF) article.

Issues like data harvesting and online harassment pose real threats to users, and self-regulation when it comes to data protection and ethical guidelines is insufficient in mitigating such risks. Some common challenges that crop up when applying real-world laws to AR and VR include intellectual property, virtual privacy and security, and product liability.

There’s also the need for a new framework tailored to fit emerging technologies, but legislative attempts at regulation may face several hurdles. It’s also worth noting that while regulation can help keep users safe, it may also potentially hamper the development of such technologies, according to Digikonn co-founder Chirag Prajapati.

Can Real-World Laws Be Applied to XR Spaces?

In a 2018 interview with IEEE Spectrum, Robyn Chatwood, an intellectual property and information technology partner at Dentons Australia, gave an example of an incident in a VR space in which a user experienced sexual assault. Unfortunately, Chatwood remarked, there are no laws saying that sexual assault in VR is the same as in the real world. When asked when she thinks these issues will be addressed, Chatwood said that, in several years, another incident could draw more widespread attention to the problems in XR spaces. It’s also possible that, through increased adoption, society will begin to recognize the need to develop regulations for XR spaces.

On a more positive note, the trend toward regulation of XR spaces has been changing recently. For instance, Meta has rolled out a minimum distance between avatars in Horizon Worlds, its VR social media platform. This boundary prevents other avatars from getting into your avatar’s personal space; the system works by halting a user’s forward movement as they get closer to that boundary.
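As a rough illustration of how such a personal-boundary system might work, here is a minimal sketch that rejects any movement step that would bring one avatar inside another's boundary. This is not Meta's actual implementation, and the 1.2 m distance is an assumed value for the example.

```python
import math

# Minimal sketch, not Meta's implementation: halt an avatar's forward motion
# if the next step would put it inside another avatar's personal boundary.
BOUNDARY_RADIUS_M = 1.2  # assumed minimum avatar-to-avatar distance

def step_position(pos, velocity, others, dt=1 / 72):
    """Advance pos by velocity * dt unless that would violate someone's boundary."""
    proposed = (pos[0] + velocity[0] * dt, pos[1] + velocity[1] * dt)
    for other in others:
        if math.dist(proposed, other) < BOUNDARY_RADIUS_M:
            return pos  # halt forward movement at the boundary
    return proposed

# Example: walking straight at another avatar standing 1.25 m away
me, target = (0.0, 0.0), (1.25, 0.0)
for _ in range(20):
    me = step_position(me, velocity=(1.0, 0.0), others=[target])
print(me)  # movement stops just short of the 1.2 m boundary
```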

There are also new laws being drafted to protect users in online spaces. In particular, the UK’s Online Safety Bill, which had its second reading in the House of Commons in April 2022, aims to protect users by ensuring that online platforms have safety measures in place against harmful and illegal content and covers four new criminal offenses.

In the paper, The Law and Ethics of Virtual Assault, author John Danaher proposes a broader definition of virtual sexual assault, which allows for what he calls the different “sub-types of virtual sexual assault.” Danaher also provides suggestions on when virtual acts should be criminalized and how virtual sexual assault can be criminalized. The paper also touches on topics like consent and criminal responsibility for such crimes.

There’s even a short film that brings to light pressing metaverse concerns. Privacy Lost aims to educate policymakers about the potential dangers, such as manipulation, that come with emerging technologies.

While many legal issues in the virtual world are resolved through criminal courts and tort systems, according to Gamma Law’s David B. Hoppe, these approaches lack the necessary nuance and context to resolve such disputes. Hoppe remarks that real-world laws may not have the specificity needed to tackle new privacy issues in XR spaces, and he argues that a more nuanced legal strategy and tailored legal documents are needed to help protect users in XR spaces.

Issues with Existing Cyber Laws

The novelty of AR and VR technologies makes legislation challenging to implement. Yet for users to get the most out of these technologies, their needs must be considered by developers, policymakers, and the organizations that deploy them. And while cyber laws are in place, persistent problems remain, such as the difficulty of enforcing sanctions against offenders and responses that fall short.

The United Nations Office on Drugs and Crime (UNODC) also cites several obstacles to cybercrime investigations: user anonymity enabled by technology; attribution, or determining who or what is responsible for a crime; and traceback, which can be time-consuming. The UNODC also notes that the lack of harmonized national cybercrime laws and of international standards for evidence can hamper investigations.

Creating Safer XR Spaces for Users

Guidelines from the World Economic Forum identify several key considerations for legislators, including how laws and regulations apply to XR conduct governed by private platforms and how rules might apply when an XR user’s activities have direct, real-world effects.

The XR Association (XRA) has also provided guidelines to help create safe and inclusive immersive spaces. Its conduct-policy tips for addressing abuse include creating tailored policies that align with a business’s product and community and notifying users of possible violations. Moreover, the XRA has been proactive in rolling out measures for the responsible development and adoption of XR. For instance, it has held discussions on user privacy and safety in mixed-reality spaces, zeroing in on how developers, policymakers, and organizations can better promote privacy, safety, and inclusion and tackle issues that are unique to XR. It also works with XRA member companies to create guidelines for age-appropriate use of XR technology, helping develop safer virtual spaces for younger users.

Other Key Players in XR Safety

Aside from the XRA, other organizations are also taking steps to create safer XR spaces. X Reality Safety Intelligence (XRSI), formerly known as the X Reality Safety Initiative, is one of the world’s leading organizations providing intelligence and advisory services to promote the safety and well-being of emerging-technology ecosystems.

It has created a number of programs that tackle critical issues and risks in the metaverse, focusing on areas like diversity and inclusion, trustworthy journalism, and child safety. For instance, the organization has backed the Kids PRIVACY Act, legislation that aims to implement more robust protections for younger users online.

XRSI has also published research and guidelines to establish standards for XR spaces. It partnered with Standards Australia to create the first Metaverse Standards whitepaper, a guide to standards that protect users against risks unique to the metaverse, which it categorizes as Human Risks, Regulatory Risks, Financial Risks, and Legal Risks, among others.

The whitepaper is a collaborative effort that brings together cybersecurity experts, VR and AR pioneers, strategists, and AI and metaverse specialists. Its authors include Dr. Catriona Wallace, founder of the social enterprise The Responsible Metaverse Alliance, and cybersecurity professional Kavya Pearlman, founder and CEO of XRSI. Pearlman works with various organizations and governments, advising on policymaking and cybersecurity to help keep users safe in emerging-technology ecosystems.

One issue XRSI highlights is the risk that comes with XR data collection in three areas: medical XR and healthcare, learning and education, and employment and work. Its report describes how emerging technologies create new privacy and safety concerns, and how risks such as a lack of inclusivity, unequal access to education, and inexperience in handling data collected in XR spaces are cropping up.

In light of these issues, XRSI has set out goals and guidelines to address these risks, including establishing a standards-based workflow for managing XR-collected data and adopting a new approach to classifying such data.

The EU is also taking steps to ensure data protection in emerging technologies, with new EU laws aiming to complement the GDPR’s requirements for XR technologies and services. EU data protection law already applies to most XR technologies, particularly commercial applications, and a user’s explicit consent may be required to make data processing operations legitimate.

According to the Information Technology & Innovation Foundation (ITIF), policymakers need to reduce regulatory uncertainty by making clear how and when laws apply to AR and VR technologies. The same ITIF report stresses that they should collaborate with stakeholder communities and industry leaders to create and implement comprehensive guidelines and clear standards for AR and VR use.

However, while creating safer XR spaces is of utmost importance, the ITIF also warns against over-regulation, which can stifle the development of new technologies. To mitigate this risk, policymakers can focus on regulation that supports innovation, such as creating best practices for law enforcement agencies tackling cybercrime and funding user-safety research.

The ITIF also offers guidelines on privacy concerns raised by AR in public spaces, and on the steps leaders and policymakers could take to mitigate the risks and challenges that come with immersive technologies.

The EFF, for its part, argues that governments need to enact or update data protection legislation to protect users and their data.

There is still a long way to go when applying real-world laws to XR spaces. However, many organizations, policymakers, and stakeholders are already taking steps to help make such spaces safer for users.

Challenges Behind Applying Real-World Laws to XR Spaces and Ensuring User Safety Read More »

talespin-launches-ai-lab-for-product-and-implementation-development

Talespin Launches AI Lab for Product and Implementation Development

Artificial intelligence has been a part of Talespin since day one, but the company has been leaning more heavily into the technology in recent years, including through internal AI-assisted workflows and a public-facing AI development toolkit. Now, Talespin is announcing an AI lab “dedicated to responsible artificial intelligence (AI) innovation in the immersive learning space.”

“Immersive Learning Through the Application of AI”

AI isn’t the end of work – but it will change the kinds of work that we do. That’s the outlook of a number of experts, including the team behind Talespin, which uses AI to create virtual humans in simulations for teaching soft skills. In other words, they use AI to make humans more human – because those are the strengths that won’t be automated any time soon.


“What should we be doing to make ourselves more valuable as these things shift?” Talespin co-founder and CEO Kyle Jackson recently told ARPost. “It’s really about metacognition.”

Talespin has been using AI to create experiences internally since 2015, ramping up to generative AI for experience creation in 2019. The company made those AI creation tools publicly available in the CoPilot Designer 3.0 release earlier this year.

Now, a new division of the company – the Talespin AI Lab – is looking to accelerate immersive learning through AI, both by developing avenues for continued platform innovation and by offering consulting services for the use of generative AI. Within Talespin, the lab consists of over 30 team members and department heads who will work with outside developers.

“The launch of Talespin AI Lab will ensure we’re bringing our customers and the industry at large the most innovative and impactful AI solutions when it comes to immersive learning,” Jackson said in a release shared with ARPost.

Platform Innovation

CoPilot Designer 3.0 is hardly outdated, but interactive samples of Talespin’s upcoming AI-powered APIs for realistic characters and assisted content writing can already be requested through the lab, with even more generative AI tools coming to the platform this fall.

In interviews and in prepared material, Talespin representatives have stated that working with AI has more than halved the production time for immersive training experiences over the past four years. They expect that change to continue at an even more rapid pace going forward.

“Not long ago creating an XR learning module took 5 months. With the use of generative AI tools, that same content will be created in less than 30 minutes by the end of this year,” Jackson wrote in a blog post. “Delivering the most powerful learning modality with this type of speed is a development that allows organizations to combat the largest workforce shift in history.”

While the team certainly deserves credit for that, the company also credits its work with clients, customers, and partners with accelerating its learning about the technology.

Generative AI Services

That brings us to the other major job of the AI Lab – generative AI consulting services. Through these services, the lab will share Talespin’s learnings on using generative AI to achieve learning outcomes.

“These services include facilitating workshops during which Talespin walks clients through processes and lessons learned through research and partnership with the world’s leading learning companies,” according to an email to ARPost.


Generative AI consulting services might sound redundant, but knowing that generative AI exists and knowing how to use it to solve a problem are different things. Even when Talespin’s clients have access to AI tools, they work with the team at Talespin to get the most out of those tools.

“Our place flipped from needing to know the answer to needing to know the question,” Jackson said in summing up the continued need for human experts in the AI world.

Building a More Intelligent Future in the AI Lab

AI is in a position similar to the one XR occupied in recent months, and blockchain shortly before that: its potential is so exciting that we can forget its full realization is far from imminent.

As exciting as Talespin’s announcements are, Jackson’s blog post foresees adaptive learning and whole virtual worlds dreamed up in an instant. While these ambitions remain things of the future, initiatives like the AI Lab are bringing them ever closer.

Talespin Launches AI Lab for Product and Implementation Development Read More »