augmented reality

Immersive Technology for the American Workforce Act: Legislation That Aims to Provide Equitable Access to XR Tech

The Immersive Technology for the American Workforce Act of 2023 was drafted by Rep. Lisa Blunt Rochester (D-DE) and Rep. Tim Walberg (R-MI) with the support of organizations like the XR Association (XRA), Talespin, Unity, Association for Career and Technical Education, Transfr, and HTC VIVE, among others.

“Emerging technologies, such as XR, can help meet people where they are and expand access to cutting-edge technology and training resources,” remarked XRA CEO Elizabeth Hyman in a press release shared with ARPost. “Rep. Lisa Blunt Rochester’s and Rep. Tim Walberg’s bill recognizes the importance of equitable access to skills training and workforce development programs and the key role immersive technology plays in delivering better outcomes.”

What Is the Immersive Technology for the American Workforce Act of 2023?

One advantage of incorporating immersive technologies into workforce training is that they are cost-effective and safer than traditional hands-on methods. They can also provide expanded training to underserved communities, as well as to workers with disabilities.

The Immersive Technology for the American Workforce Act aims to create a five-year program that provides support to various institutions, allowing them to utilize immersive technologies in their educational and training programs.

Furthermore, it aims to promote the development of inclusive technology while prioritizing underserved communities, such as rural areas and areas of substantial unemployment. It seeks to foster partnerships between private and public entities to address skills gaps, meet the needs of the workforce, and assist individuals who are facing barriers to employment.

“We’re excited to be able to work with Rep. Blunt Rochester, a member of Congress who cares deeply about ensuring underserved populations are able to tap into next-generation technology and skills training,” said XRA Senior Vice President of Public Policy Joan O’Hara.

Almost a quarter of Americans live in rural communities, where they face unique workforce challenges. Moreover, the US Bureau of Labor Statistics reported that, at the start of 2023, the country had 10.5 million unfilled jobs. The bill seeks to fill these gaps by giving Americans from underserved communities and diverse backgrounds access to effective, high-quality training programs.

“XR technologies can dramatically change the way America’s workforce is recruited, trained, skilled, and upskilled. Scalable solutions are necessary to meet the diverse needs of today’s undiscovered talent to meet the needs of our complex workforce,” said Transfr CEO Bharanidharan Rajakumar.

How Will the Legislation Impact the Future of Work?

The Immersive Technology for the American Workforce Act follows in the footsteps of “recent legislative successes”, such as the Access to Digital Therapeutics Act of 2023, which effectively extends “coverage for prescription digital therapeutics”. It aims to provide support, in the form of grants, to community colleges and career and technical education centers.

The grants will allow them to leverage XR technology for purposes such as workforce development and skills training. Furthermore, the Immersive Technology for the American Workforce Act will enable such organizations and facilities to use XR technology to enhance their training, which, in turn, can help American companies meet workforce needs more quickly.

A Very Interesting VR/AR Association Enterprise & Training Forum

The VR/AR Association held a VR Enterprise and Training Forum yesterday, May 24. The one-day event, hosted on the Hopin remote conference platform, brought together industry experts to discuss the business applications of a range of XR techniques and topics, including digital twins, virtual humans, and generative AI.

The VR/AR Association Gives Enterprise the Mic

The VR/AR Association hosted the event, though non-members were welcome to attend. In addition to keynotes, talks, and panel discussions, the event included opportunities for networking with other remote attendees.

“Our community is at the heart of what we do: we spark innovation and we start trends,” said VR/AR Association Enterprise Committee Co-Chair, Cindy Mallory, during a welcome session.

While there were some bona fide “technologists” on the panels, most speakers were people using the technology in industry themselves. While hearing from “the usual suspects” is nice, VR/AR Association fora are rare opportunities for industry professionals to hear from one another on how they approach problems and solutions in a rapidly changing workplace.

“I feel like there are no wrong answers,” VR/AR Association Training Committee Co-Chair Bobby Carlton said during the welcome session. “We’re all explorers asking where these tools fit in and how they apply.”

The Convergence

One of the reasons that the workplace is changing so rapidly has to do not only with the pace at which technologies are changing, but also with the pace at which they are becoming reliant on one another. This is a trend that a number of commentators have labeled “the convergence.”

“When we talk about the convergence, we’re talking about XR but we’re also talking about computer vision and AI,” CGS Inc President of Enterprise Learning and XR, Doug Stephen, said in the keynote that opened the event, “How Integrated XR Is Creating a Connected Workplace and Driving Digital Transformation.”

CGS Australia Head, Adam Shah, was also a speaker. Together the pair discussed how using XR with advanced IT strategies, AI, and other emerging technologies creates opportunities as well as confusion for enterprise. Both commented that companies can only seize the opportunities provided by these emerging technologies through ongoing education.

“When you put all of these technologies together, it becomes harder for companies to get started on this journey,” said Shah. “Learning is the goal at the end of the day, so we ask ‘What learning outcomes do you want to achieve?’ and we work backwards from there.”

The convergence isn’t only changing how business is done, it’s changing who’s doing what. That was much of the topic of the panel discussion “What Problem Are You Trying to Solve For Your Customer? How Can Generative AI and XR Help Solve It? Faster, Cheaper, Better!”

“Things are becoming more dialectical between producers and consumers, or that line is melting where consumers can create whatever they want,” said Virtual World Society Executive Director Angelina Dayton. “We exist as both creators and as consumers … We see that more and more now.”

“The Journey” of Emerging Technology

The figure of “the journey” was also used by Overlay founder and CEO, Christopher Morace, in his keynote “Asset Vision – Using AI Models and VR to get more out of Digital Twins.” Morace stressed that we have to talk about the journey because a number of the benefits that the average user wants from these emerging technologies still aren’t practical or possible.

“The interesting thing about our space is that we see this amazing future and all of these visionaries want to start at the end,” said Morace. “How do we take people along on this journey to get to where we all want to be while still making the most out of the technology that we have today?”

Morace specifically cited ads by Meta showing software that barely exists running on hardware that’s still a few years away (though other XR companies have been guilty of this as well). The good news is that extremely practical XR technologies do exist today, including for enterprise – we just need to accept that they’re on mobile devices and tablets right now.

Digital Twins and Virtual Humans

We might first think of digital twins of places or objects – and that’s how Morace was speaking of them. However, there are also digital twins of people. Claire Hedgespeth, Head of Production and Marketing at Avatar Dimension, addressed their opportunities and obstacles in her talk, “Business of Virtual Humans.”

“The biggest obstacle for most people is the cost. … Right now, 2D videos are deemed sufficient for most outlets but I do feel that we’re missing an opportunity,” said Hedgespeth. “The potential for using virtual humans is only as limited as your imagination.”

The language of digital twins was also used on a global scale by AR Mavericks founder and CEO, William Wallace, in his talk “Augmented Reality and the Built World.” Wallace presented a combination of AR, advanced networks, and virtual positioning coming together to create an application layer he calls “The Tagisphere.”

“We can figure out where a person is so we can match them to the assets that are near them,” said Wallace. “It’s like a 3D model that you can access on your desktop, but we can bring it into the real world.”

It may sound a lot like the metaverse to some, but that word is out of fashion at the moment.
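Wallace’s description – locating a person and matching them to the tagged assets around them – boils down to a proximity query over geotagged content. Below is a minimal, hypothetical Python sketch of that idea using the haversine distance; the asset list, field names, and radius are illustrative assumptions, not Fabric’s or AR Mavericks’ actual data model or API.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def nearby_assets(user_lat, user_lon, assets, radius_m=100):
    """Return the names of geotagged assets within radius_m of the user, nearest first."""
    hits = []
    for asset in assets:
        d = haversine_m(user_lat, user_lon, asset["lat"], asset["lon"])
        if d <= radius_m:
            hits.append((d, asset["name"]))
    return [name for _, name in sorted(hits)]

# Illustrative data only: a user standing near two tagged assets in a venue.
assets = [
    {"name": "north-gate-overlay", "lat": 40.7506, "lon": -73.9936},
    {"name": "center-court-stats", "lat": 40.7505, "lon": -73.9934},
]
print(nearby_assets(40.75055, -73.99345, assets, radius_m=50))
```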

And the Destination Is … The Metaverse?

“We rarely use the M-word. We’re really not using it at all right now,” Qualcomm’s XR Senior Director, Martin Herdina, said in his talk “Spaces Enabling the Next Generation of Enterprise MR Experiences.”

In his discussion of immersive technology, Herdina put extra emphasis on computing advancements like cloud computing over the usual topics of visual experience and form factor. He also presented modern AR as a stepping stone to a largely MR future for enterprise.

“We see MR being a total game changer,” said Herdina. “Companies who have developed AR, who have tested those waters and built experience in that space, they will be first in line to succeed.”

VR/AR Association Co-Chair, Mark Gröb, expressed similar sentiments regarding “the M-word” in his VRARA Enterprise Committee Summary, which closed out the event.

“Enterprise VR had a reality check,” said Gröb. “The metaverse really was a false start. The hype redirected to AI-generated tools may or may not be a bad thing.”

Gröb further commented that people in the business of immersive technology specifically may be better able to get back to business with some of that outside attention drawn toward other things.

“Now we’re focusing on the more important thing, which was XR training,” said Gröb. “All of the business cases that we talked about today, it’s about consistent training.”

Business as Usual in the VR/AR Association

There has been a lot of discussion recently regarding “the death of the metaverse” – something that, arguably, hadn’t yet been born in the first place. Whether it was always just gas, and the extent to which that gas has been entirely replaced by AI, remains to be seen.

While there were people talking about “the enterprise metaverse” – particularly referring to things like remote collaboration solutions – the metaverse is arguably more of a social technology anyway. While enterprise does enterprise, someone else will build the metaverse (or whatever we end up calling it) – and they’ll probably come from within the VR/AR Association as well.

A Geospatial Web Platform to Enhance In-Person Events? Absolutely, Says Fabric

Fabric aims to change how in-person events are held through the geospatial web. Starting with sports, the geospatial web and augmented reality platform can transform live events into immersive, augmented reality-assisted experiences to thrill sports fans.

Better Experiences With the Geospatial Web

The geospatial web is simply the use of geolocation technology within the greater realm of the Internet. For Fabric, it means syncing location, time, and content. By combining this technology with immersive AR tools, Fabric lets spectators elevate their experiences.

With geospatial technology, Fabric taps into an emerging trend in the sports industry’s experience economy. Using geospatial web technology, brands and sports teams can make in-person events more unique, social, and exciting.

While the concept of the geospatial web has already been around for some time, Fabric spent the past five years finding new ways to leverage this technology. The result is their main geospatial product, called “Space.” Space aims to prioritize human connections during in-person events as opposed to purely digital connections. It serves as a new medium of communication among fans, teams, and brands.

Fabric also offers a no-code platform that lets sports stakeholders, such as leagues and venues, display relevant content for any game or sporting experience.

Merging Sports and Augmented Reality for a Unique Experience

The company believes that sports are the top industry for live, in-person events, which is why it chose to start there. Fabric can facilitate peer-to-peer interactions within the same venue, plus help increase monetization and brand activation. The company markets Space as an “interactive jumbotron in every sports fan’s pocket.”

Space encourages sports spectators to disengage from artificial connections and seek real-time, location-based interactions with other people. And because it takes place at a single event, users know they already have a shared interest with other fans.

Within the app, AR assets called “Fabs” are powered by the geospatial web to encourage real-world interaction. These Fabs are designed to get people to interact more with each other in a unique and fun way, made possible by technology.

Enhancing Human Connections Through the Geospatial Web

Fabric is trying to bring back the experience of human connection, which is enhanced instead of hampered by technology. The company holds a different perspective than that of metaverse pioneers.

According to Fabric, the metaverse can offer unique, shared experiences via virtual reality. But ultimately, the user is, in fact, isolated from other people in the real world. The connection comes through VR via a headset. The “shared” experience is, in a way, manufactured artificially through VR technology.

This version of shared experiences has advantages and disadvantages, as other technologies do. The metaverse can also open opportunities for people and brands that would not otherwise be available.

Meanwhile, Fabric offers an alternative way to experience life with digital technology. Fabric Spaces allow people within the same geographical location—in this case, a sports stadium or arena—to have meaningful, offline and online connections. Add to that the shared real-world experience of attending a sports event, and you have a potentially unforgettable encounter.

Growing the Social Fabric

Fabric began as an idea that founder and president Sarah Kass had while noticing societal issues brought about by connectivity. She and co-founder and CEO Saul Garlick developed the geospatial web platform that became Fabric.

In an interview with Forbes magazine, Kass explained her reasoning for coming up with Fabric. She saw that mobile phones offer unprecedented connectivity but also distance people from others, so she sought to develop a product that could address this paradox.

“I began to frame the problem as ‘how do we grow the social fabric?’ What new infrastructure could propel the growth of social capital in the digital age? Or what new infrastructure would allow us to strengthen the social fabric in today’s time when we’re walking around with all these phones?” she stated in the interview.

As of press time, the Fabric team works with five professional sports teams and leagues. The goal is to provide fans with an elevated sports experience during games and other sporting events through the geospatial web and mixed reality.

A Demo and Fresh Look at Campfire

For the last few years, Campfire 3D has been expanding the world of “holographic collaboration” with a custom headset and base station as well as software that works in headset-based mixed reality and virtual reality, mobile-based augmented reality, and now on desktop 3D.

The company is currently launching new hardware and a new product package, so we spoke with CEO and co-founder Jay Wright for a software demo and an explanation of the new release.

Gather Around the Campfire

“Our mission is to deliver what we call holographic collaboration – multiple people standing around a digital model of a physical thing, whether they’re all in the same room, or across the world,” said Wright, who called it the killer app for enterprise XR. “If this can be done successfully, we have a huge potential to reduce travel, shipping, and physical reworks.”

And Wright is no stranger to enterprise XR. He developed Vuforia as its vice president at Qualcomm. Qualcomm sold the project to PTC, where Wright followed as Vuforia’s president and general manager. While Wright left Vuforia in 2018, it remains PTC’s main enterprise augmented reality arm.

The following year, Wright co-founded Campfire with Roy Ashok, Alexander Turin, Steve Worden, Yuhan Liu, and XR pioneer Avi Bar-Zeev as founding advisor. Bar-Zeev has worked in XR since 1992, including co-founding Google Earth and serving as a consultant for Linden Lab, a principal architect for Microsoft, an advisor for Tilt Five and Croquet, and president of the XR Guild.

In 2021, Campfire came out of stealth and started working with companies, offering software, a headset, and a console that generates the virtual model.

A Demo on Desktop

While I haven’t yet gotten my hands on the company’s headset, the team did set me up for a demo on desktop – a major new offering for the tool. Wright did mention that he will be at the Augmented World Expo in a few weeks, so hopefully I’ll be able to try the headset there.

Basic functionality includes rotating and zooming in on models and leaving annotations for other viewers to consider. Annotations can mean labeling items on the model, or taking a screenshot, marking it up, and pinning it to avoid marking up the model directly.

As long as models are made up of components, they can be “exploded” to view and manipulate those components separately. This also allows users to see how systems are composed of parts through virtual assembly, disassembly, and reassembly. A “blue ghost” shows where selected components fit into a complete system for automatic guided instructions.

Selected components can also be reconfigured with different colors or textures on the fly. They can also be made invisible to make internal components easier to see without using the explode feature. A “slice” tool provides a transparency plane that can be moved through a model to create cross-sections. All of these tools work on all platforms.

“We spent a lot of time on ease-of-use,” said Wright. “The user interface is really similar whether it’s on a flat screen or in VR.”

Additions and Improvements

Today’s announcement includes a streamlined software package, expanded device accessibility, a larger base station option, and a new hardware and software package for teams.

A Cross-Platform Solution

The complete Campfire ecosystem consists of hardware and software. On the hardware side, the company has its own headset – which can be used for augmented reality or with a shaded visor for virtual reality – and two consoles for different-sized models. A phone can be an AR viewer but also serves as a controller for the headset via an adapter called “the pack.”

“We did this because everybody has used a phone and knows how to use it,” explained Wright.

One person must have a headset and console, but additional participants can join on mobile or, now, on desktop.

“Flat screens are still very important,” said Wright. “There are very few workflows in enterprise that involve XR and that don’t involve flat screens.”

That was one of the most consistent pieces of feedback that the company received from early users in the lead-up to this announcement. Of course, the hardware that users join on does impact their experience – though all have access to basic collaboration tools.

“Once everybody is in Campfire, everybody has access to basic tools for pointing at things and communicating,” said Wright. “A huge amount of the power in holographic collaboration is just the ability to point things out in the 3D space.”

A Streamlined Software Offering

The apps were another common point of criticism. Until this announcement, the software side consisted of two separate end-user apps – one for authoring models and one for viewing models and collaborating. Now, one app can do both jobs.

Participants can also be invited to a Campfire session via a link, just like 2D remote collaboration tools like Google Docs. This is fitting, as Wright believes that Campfire’s software has even more in common with legacy remote collaboration solutions.

“To the extent that spreadsheets or word documents drove the PC, we think that holographic collaboration does that for XR,” said Wright.

More Ways to View

Campfire launched with a tabletop console, which was great for designing smaller products like shoes, or modeling consumer packaged goods. Of course, virtual models of larger objects can be scaled down, but some users wanted larger models. That’s why Campfire now offers the “studio console” which goes on the floor instead of on a table.

Right now, viewing Campfire in AR or VR requires the company’s custom headset. However, the company is working on optimizing the application for the growing number of passthrough headsets available on the market.

“We don’t see this class of device as something everyone has access to,” said Wright. “But people are going to purchase these devices and expect Campfire to work on them.”

Subscriptions Rolling Out Now

As of today, there are three ways to experience Campfire. First, the application does have a functionally-limited free model. Enterprise plans start at $1,500 per month and currently require contacting the company directly as they scale their public rollout. And now there’s “Campsite.”

“Campsite” bundles five enterprise licenses; two headsets, packs, and tabletop consoles; and one studio console for $15,000 per year. Wright says that the whole Campsite can be set up in less than an hour.

A Future of Enterprise Collaboration

There are other companies doing parts of what Campfire is doing. And Wright’s argument that this technology is the future is hard to refute. While other companies are likely to step up, this is definitely a company to watch right now. After everything that they learned in the last two years, it’s exciting to think of what improvements this greater rollout will inspire.

Spring Has Sprung for Niantic and 8th Wall

It’s already been a year since Niantic acquired 8th Wall. While acquisitions can be a scary thing in the tech world, both companies are growing and strengthening through their partnership.

Pillars of the Earth

Niantic and 8th Wall are both AR companies that might be bigger and more important than some realize. However, they approach AR architecture and accessibility from different perspectives. Their coming together was a game changer that’s hard to overstate.

Niantic Senior Director of AR Product Marketing, Caitlin Lacey, helps us understand what the companies are doing in their own products and projects as well as how they are helping each other grow and develop.

“I joined Niantic a year ago primarily to focus on Lightship, and one of the things that I was really excited about coming in was the acquisition,” said Lacey. “Having 8th Wall as part of the Niantic family has definitely made it better.”

Niantic

For some readers, Niantic is synonymous with Pokémon Go. If you Ctrl+K “Niantic”, Google Docs suggests the Pokémon Go website as a link option. Other readers will recognize this as a gross misrepresentation. Pokémon Go may have made Niantic a household name, but it only scratches the surface of what the historic and storied company actually does.

In addition to games (including the just-released AR real-world pet game Peridot), Niantic has probably the largest and most detailed working virtual map of the world ever. A few years ago, that was a neat trick. As devices become more powerful and AR gains traction, it’s becoming something much bigger.

Niantic games gather data for this virtual map of the world, but they also have a dedicated platform called Lightship that developers use to fill in the empty spots, add detail, and create their own experiences. Whether you’re building or playing, you’re using an app.

8th Wall

As with its parent company, readers have probably seen the 8th Wall logo on an AR experience but might not realize the magnitude of the operation. Also as with its parent company, users can experience 8th Wall both through experiences that they enjoy and through developer tools.

Over the years, 8th Wall has been building out its developer tools and experiences, making them easier to use and accessible on more devices. The company has tools for augmenting the world around a user, as well as for augmenting users themselves through lenses and filters.

8th Wall’s experiences and developer tools are web-based. With no app installation required, they’re well positioned to run on pretty much any connected device.

Web and Apps

Apps have a certain gravity, bringing both obstacles and opportunities. People know how apps work and they know what to expect. Apps can run larger and more in-depth experiences, but they only do one thing at a time. These two strengths are at odds when people expect an experience to do everything and do it well – an unrealistic expectation called “the metaverse.”

“It took a long time to train people how to use apps, but now they’re trained,” said Lacey. However, as she points out, “if you’re thinking about a future where all of these mobile technologies have AR capabilities”, opening and switching apps can become a hassle.

WebAR is getting better all the time, but it’s still limited in terms of the experiences it can run. For people out and about, this compounds: away from stable home networks, they rely on burdened public networks or potentially spotty data coverage.

“There are still limitations to experience and file size that the web just can’t handle,” said Lacey. “As computing power continues to grow and get stronger, we’ll see better experiences across platforms.”

In the meantime, both companies are working on leveraging their respective strengths in apps and WebAR, trying to achieve the best of both worlds in both worlds.

“On the Lightship side, there was tons of tech that was very app-based … we took that and asked, ‘What do you want, and how do we bring it to the web?’” said Lacey. “And then, on the other side, bringing things from the web to Lightship.”

Updates and Releases From Niantic and 8th Wall

In the last few weeks, some exciting changes have come for developers using both platforms – including some updates that look a lot like cross-pollination between the two.

Sky and World Effects

First, Sky Effects and World Tracking came to 8th Wall. These are two separate developer tools that allow an AR experience to augment the sky itself, or to help AR elements realistically appear in the physical world. However, when used together, a single experience can bridge the earth and heavens in new and immersive ways.

“With sky and world effects, an object drops from the sky, recognizes the environment, and can interact with that environment,” said Lacey. “We’re seeing that happen across the board and there’s more coming.”

To celebrate the launch, 8th Wall held the “Sky Effects Challenge” which invited developers to use the new technology in interesting and inventive ways. Creators turned the sky into a canvas, mapped the planets, and more.

“We are consistently amazed by what our community builds,” said Lacey.

A Cross-Device Scanning Framework

A new Scanning Framework for Lightship AR Developer’s Kit 2.5 allows users to virtually reconstruct physical spaces and objects without LiDAR. LiDAR is one of two common methods for capturing spatial data on mobile devices, but it’s only available on higher-end iOS devices. Opening the Scanning Framework to other methods greatly increases accessibility.

“We’ve continually heard the feedback, and we’re listening,” said Lacey. “We really want to be a consistent partner to developers in the AR space. We do believe that AR can help make the world more interesting and fun.”

Two New Games

8th Wall doesn’t do so much in the games category – again, games still work better as full apps for now. However, a big theme in this article is that the line between the two companies can be a little foggy these days – at least in terms of user experience. These apps likely benefited from 8th Wall technology and 8th Wall will likely benefit from what the apps learn and earn for Niantic.

Early this year, Niantic launched NBA All World. The app includes basketball mechanics and an NBA partnership, and grows to incorporate elements that make it more than just a game.

“Our version of an NBA basketball game starts with exciting one-on-one gameplay and expands from there to include the major elements of basketball culture, including music, fashion, sneakers, and more, all of which are integrated into real-world locations,” Niantic founder and CEO John Hanke said in a blog post.

If that wasn’t enough, by the time you read this, Peridot will be live. The highly anticipated game encourages players to nurture an AI-powered virtual pet, including feeding it, petting it, and playing with it. Players can also use Niantic’s social platform Campfire to meet with other players and breed new and unique Peridots (or Dots).

Spring Has Sprung for Niantic and 8th Wall

I’m not a huge basketball fan and Pokémon is a chapter of my life that closed a long time ago, but I’ve had my Dot Erin for a few days now. Erin mainly hangs out by my desk eating sandwiches, but was pretty excited to see the spring flowers in my backyard the other day.

Much More to Come

Lacey advised that a lot more updates to Niantic and 8th Wall will continue to reinforce both platforms for the benefit of developers and end-users alike. There are also some interesting artistic activations coming in the next few weeks. And, of course, we’re excited about Peridot becoming publicly available. There’s definitely a lot more to come from this power pair.

Wonderland Engine Is Here to Make WebXR Development Faster and Easier

WebXR development is increasingly popular. Developers want to create content that users can enjoy without having to install apps or check the compatibility of their devices.

One of the companies working for the advancement of immersive technologies, Wonderland GmbH, based in Cologne, Germany, has announced a giant leap forward in this process. The company recently released Wonderland Engine 1.0.0, a WebXR development platform already vouched for by top content creators.

Wonderland Engine 1.0.0 – Bringing Native XR Performance to WebXR Development

What is special about the new engine launched by Wonderland? Its first benefit is the ability to mimic native XR performance. Before its launch, Wonderland Engine 1.0.0 was put to the test by content creators.

Vhite Rabbit XR and Paradowski Creative, two companies creating XR games, used the engine to develop content. The Escape Artist, an upcoming title by Paradowski Creative, was created with Wonderland Engine 1.0.0, and its developers say that it matches native games in terms of polish and quality.

“We’re excited to announce this foundational version of Wonderland Engine, as we seek to bridge the gap between native XR app development and WebXR,” said the CEO and founder of Wonderland, Jonathan Hale, in a press release shared with ARPost. “We see a bright future for the WebXR community, for its developers, hardware, support, and content.”

Top Features of Wonderland Engine 1.0.0

The developers who choose Wonderland GmbH’s WebXR development platform to create content will be able to use the following:

  • Full 8th Wall integration – complete integration of 8th Wall AR tracking features such as face tracking, image tracking, SLAM, and VPS;
  • Runtime API rewrite – better code completion, static checks for bugs before running the code, and complete isolation for integration with other libraries;
  • Translation tools – necessary for the localization of WebXR content;
  • Benchmarking framework – to check for content performance on various devices.

Developers can find the complete list of features and bug fixes on the official release page.

According to the company, Wonderland Engine users can get their first app running in the browser in less than two minutes. With a bit of experience, users can build a multi-user environment that supports VR, AR, and 3D in 10 minutes, as demonstrated in this video.

The XR Development Platform Is Optimized for VR Browsers

To demonstrate its commitment to helping content creators, Wonderland GmbH is optimizing the tool specifically for the most popular VR browsers: Meta Quest Browser, Pico Browser, and Wolvic.

Wonderland Engine-based apps support any headset that has a browser available. Also, any headset released in the future will automatically be supported, if it has a browser. Apps created with Wonderland Engine can also run on mobile devices through the browser, as Progressive Web Apps (PWA), which also allows them to run offline.

Apart from the two game development companies mentioned above, the company is also working with various content creators.

“It was crucial to bring the whole ecosystem with us to test and validate the changes we made. This resulted in a highly reliable base to build upon in upcoming versions,” Hale said. “By making it easier to build XR on the web we hope to attract developers and content creators to WebXR. We see WebXR truly being able to rival native apps and offer consumers a rich world of rapidly accessible content to enjoy.”

Meet the Wonderland Team at AWE USA 2023

The creators of Wonderland Engine 1.0.0 will present the WebXR development platform at AWE USA 2023 (use ARPost’s discount code 23ARPOSTD for 20% off your ticket), which is taking place in Santa Clara, CA between May 31 and June 2.

The company is one of the event’s sponsors and will be present at booth no. 605.

Holo Interactive: Leading the Way in Shaping the Future of Mixed Reality Copresence

While still in its early stages, mixed reality copresence has shown vast potential for applications beyond gaming and entertainment. Advancements in virtual, augmented, and mixed reality technology pave the way for seamless connections between real and virtual worlds. We see the lines between real-world interactions and digital-world experiences blurring into virtual oblivion.

However, mixed reality copresence still has a long way to go before it becomes available for mainstream use. Barriers to adoption, such as affordability, availability, and tech limitations, among many others, must be addressed for this technology to truly impact how we live our lives. Holo Interactive, a reality computing lab and content studio, is at the forefront of finding solutions to overcome these barriers.

Botao Amber Hu, the founder and CEO of Holo Interactive, shares his insights on the state of the mixed reality industry and his company’s role in shaping its future.

From Small Steps to Giant Leaps in Making Big Realities

After inventing the award-winning, affordable stereoscopic AR headset HoloKit X, Hu gained popularity and esteem in the industry. Developing MOFA, an immersive multiplayer AR live-action RPG, further cemented Hu’s name as a trailblazer in mixed reality.

With a deep belief that mixed reality copresence holds the key to unlocking the true potential of head-worn AR, Hu established Holo Interactive to bridge the gaps that hinder the accessibility of mixed reality copresence in the mass consumer market.

Now working with a globally distributed team, Hu is rallying developers, engineers, and other industry professionals to embrace the motto “A dream you dream alone is only a dream. A dream you dream together is reality.”

The Holo Interactive team is leading the way as the premier lab for mixed reality copresence experiences that are accessible to all. Over the years, Holo Interactive has been developing applications and innovative products that could widen the adoption of head-worn AR. Recently, the company has also released the HoloKit Unity SDK to empower developers in creating copresence experiences.

Leading the Way in Mixed Reality Copresence

HoloKit X, immersive stereoscopic hardware designed as an iPhone accessory, creates a more realistic and engaging visual experience that allows users to interact with their environment and digital content more naturally.

It harnesses the powerful capabilities of iPhone and ARKit to deliver exceptional AR experiences to iPhone users. With its multi-modal control inputs and copresence functionality, it can provide face-to-face shared experiences with other users in real time, fostering a sense of presence and social interaction in AR environments.

Aside from creating AR hardware, Holo Interactive is shaping the future of mixed reality by giving developers access to tools that would enable them to create MR solutions. “The current market situation and our unique position within the ecosystem make now an ideal time to release an open-sourced SDK for HoloKit X,” Hu told ARPost in a written interview.

He explained that this strategic move enables them to establish their presence in the MR ecosystem, tap into the growing interest in AR/MR technologies, and empower developers to create copresence experiences.

By opening the HoloKit SDK to third-party users, Holo Interactive hopes to become the “Arduino for head-worn AR.” “We want to encourage more people to experiment with their work in mixed reality copresence and to open-source their creations, inspiring others within the community,” said Hu.

In addition, he hopes that lowering the barriers to entry for mixed reality copresence projects and embracing open-source practices will accelerate progress in the field of MR.

Breaking the Barriers to Widespread Adoption

Immersive technologies are already transforming our lives. VR is gaining widespread use across industries. AR has also come a long way since Pokémon Go first went viral. However, head-worn AR still faces challenges to widespread adoption.

According to Hu, “Head-worn AR has the potential to turn our world into a ‘software-defined reality’, allowing us to interact with the real world and others in novel ways, a concept known as co-presence. This exciting future, however, is not without hurdles.”

Asked about the barriers AR faces, Hu enumerates four obstacles: affordability of high-quality AR devices, efficiency of input methods, development of killer applications that drive adoption, and psychological barriers to social acceptance.

Holo Interactive is on a mission to break these barriers to adoption. Hu believes that addressing these challenges can help ensure that AR technology reaches its full potential, positively impacting our lives and the way we interact with the world around us.

Snap Partner Summit 2023 Details Changes Coming to Snapchat and Beyond

Snap’s annual Partner Summit is the company’s opportunity to showcase its working relationship with other brands. That includes the experiences that come out of those partnerships, as well as the hardware and software updates that drive them. The event covered a lot of ground but we’ll be looking specifically at AR-related updates and announcements.

Some of the announcements are already available for Snapchatters to explore, while others are coming soon. Even the parts of the summit that may seem boring to the average end user help to show where the platform is going in the coming months and years. And this year’s event was extra special because it was held in person for the first time since 2019.

Snap Map and Bitmoji Features

Bitmojis, the 3D avatars used by Snapchatters for their profiles as well as in games and messages, are constantly expanding, including through new virtual fashion partnerships, and this year is no different.

Digital fashions inspired by the Marvel Cinematic Universe will be available soon. The avatar system itself will also be updated to allow for “realistic dimensions, shading, and lighting,” according to Vice President of Product Jack Brody.

“Bitmoji style has changed quite a bit, and they continue to evolve,” said Brody.

Brody also announced that the Snap Map is getting more updates, including more 3D locations and tags to help users find popular locations from their Snapchat communities. Users who access the app with Verizon +Play will also be getting new options for games and puzzles in calls with Snapchat’s connected lenses.

Camera Kit Integrations

Snap’s Head of Global AR Product Strategy and Product Marketing Carolina Arguelles Navas took to the stage to talk about recent and upcoming partnerships, including some that affect apps and experiences outside of Snapchat itself through its Camera Kit offering.

For example, Snap lenses can now be used in Microsoft Teams and in the NFL app. The LA Rams’ SoFi Stadium even uses Snap Lenses on its Infinity Screen to show the audience overlaid with augmented reality effects.

Navas also discussed Snap’s ongoing partnership with Live Nation, bringing custom AR lenses to over a dozen concerts this year including Lollapalooza in Paris and The Governor’s Ball in New York. She also announced a new partnership with Disguise, a company that specializes in real-time interactive visuals for live events.

Snap is also partnering with individual artists. The first to be announced is KYGO, a DJ, with more artist partnerships coming throughout the year.

More Opportunities for Brands

Until now, Camera Kit has been the main way that other companies were able to use Snap’s technology. However, Jill Popelka announced a new division, Augmented Reality Enterprise Services (ARES), of which she is the head.

“We all know the shopping experience today, whether online or in-store, presents a lot of options,” said Popelka. “We’ve already seen how our AR advancements can benefit shoppers and partners.”

The “AR-as-a-Service” model currently consists of two main offerings. Shopping Suite brings together Snap’s virtual try-on and sizing recommendation solutions, while the Enterprise Manager helps companies keep track of their activations including through analytics.

Popelka also announced a new “Live Garment” feature that generates a wearable 3D garment from a 2D photo of a garment uploaded into a lens.

Commercial Hardware

Popelka also introduced two new hardware offerings from Snap for commercial partners – AR mirrors and AR-enabled vending machines.

AR mirrors are already making their way into clothing stores to make virtual try-on even easier for shoppers, including those who don’t have Snapchat. Some partners have even experimented with incorporating AR games that shoppers can play to unlock in-store rewards. Retailers are also using the opportunity specifically to engage with younger audiences.

Snap currently has its AR mirror in a Men’s Wearhouse store.

“[Men’s Wearhouse is] proud to launch digital partnerships and store innovations specifically geared toward how high school students want to shop and prepare for prom,” Tailored Brands President John Tighe said in a release shared with ARPost. “We are excited to offer these younger customers experiences in-store and online to make the shopping experience easier. Everyone deserves to look and feel their best on prom night.”

Snap also partnered with Coca-Cola to create a prototype of an AR vending machine controlled with hand gestures displayed on a screen.

It might be a while before you see either of these devices in a store near you, but keep an eye out all the same.

App Updates

The standard app is getting some AR updates too, mainly related to the company’s work with AI. When Snapchatters capture a photo or video, the app will recommend lenses that might match the scene. AI will also recommend lenses for reacting to Snapchat memories and produce a new generation of lenses available to users.

Keep Exploring Snapchat

There really was a lot in the Partner Summit that wasn’t detailed here. So, if you use Snapchat for more than just AR, keep checking into the app to see even more changes coming in the next few months.

The Most Important Upcoming AR/VR Events This Year

After the COVID-19 pandemic forced most event organizers to cancel their live AR/VR events or host them virtually, live events have been making a comeback over the past year. Even the immersive industry needs a real-life environment in which to discover gadgets and try AR/VR content.

We will tell you exactly where to go if you want to try the most exciting discoveries and devices. And if you work in this industry, you should not miss these AR/VR events either: they are a great opportunity to network with peers and even discover new business or employment opportunities.

The Most Relevant AR/VR Events in 2023 You Should Not Miss

Here are some of the most important AR/VR events that are taking place until the end of this year.

1. AWE USA 2023

When: May 31 – June 2

Where: Santa Clara, CA, US; and virtual

Late spring/early summer is a great time to be in California. This year, it is made even better by one of the top AR/VR events: AWE USA. This is one of the most comprehensive fairs for professionals, stakeholders, and tech enthusiasts focused on immersive technologies.

Attendees will be able to test various gadgets, discover content trends, and attend insightful keynote presentations. They will hear from a vast number of XR professionals, such as Peggy Johnson (CEO of Magic Leap), Hugo Swart (Vice President and General Manager of XR at Qualcomm), Chi Xu (CEO of Nreal), Elizabeth Hyman (CEO of the XR Association), John Riccitiello (CEO of Unity), Kavya Pearlman (Founder & CEO of the XR Safety Initiative), and many more.

For those who can’t attend in person, there’s also the option to access some parts of the event online through the awe.live platform.

As an AWE USA 2023 media partner, ARPost offers its readers a 20% discount code. Use the code 23ARPOSTD during registration.

2. XR Expo 2023

When: June 15-16

Where: Stuttgart, Germany

If you happen to be in Germany this summer for business, you can also put aside two days to attend XR Expo 2023. This B2B event will bring together industry users, technology providers, content service providers, and XR researchers, and will focus on professional applications of extended reality technologies in health, industry, architecture, and trade and craft.

Right now, registration for XR Expo 2023 is open, but the event’s program is still under development.

3. Enterprise Metaverse Summit

When: June 28-29

Where: London, UK; and virtual

Another AR/VR event taking place in Europe, Enterprise Metaverse Summit is hosted by the British weekly newspaper The Economist. The event audience includes senior executives focused on XR, automation, IoT, 5G, the future of work, AI, machine learning, and edge computing.

This year’s event will cover a vast number of topics, from how the metaverse is expected to replace or improve human experiences at work, and how VR is making surgeries safer, to how the metaverse can help businesses become more sustainable and reduce their carbon footprint, and the benefits of VR as an engine to catalyze diversity and inclusion initiatives.

Attendees will be able to hear from speakers such as Meta Reality Labs’ VP Christine Trodella, Walmart’s Head of 3D Creative Technology Cynthia Maller, MetaVRse co-founder Alan Smithson, and EndeavorXR founder and CEO Amy Peck.

For all those who cannot attend, the event organizers offer a free virtual pass for full access to all virtual sessions on the first day of the event.

4. XR:WA

When: July 20-23

Where: Perth, Western Australia

Australian winter is mild compared to the summer heat in many parts of the US, so it’s a great moment to attend one of the best AR/VR events of the year. XR:WA is for everyone, from those curious to try immersive experiences for the first time to seasoned professionals.

The event aims to discuss the impact and opportunities for XR across a number of industries including architecture, design, medicine, education, training, science, art, and entertainment.

Back for its fifth edition, the event includes talks, panels, various AR and VR experiences, workshops, a trade floor, and an exhibition.

5. Industrial IMMERSIVE Week

When: August 28-30

Where: Houston, TX

Industrial IMMERSIVE Week is one of the AR/VR events for professionals interested in XR, spatial computing, digital twins, AI, IIoT, 5G, 3D models and simulations, and connected workforce solutions.

Touted as the event “where industry goes for real-world use cases, best practices and emerging tech for your enterprise to launch, pilot and deploy metaverse programs,” Industrial IMMERSIVE Week is intended for professionals from diverse industries, including manufacturing, construction, oil and gas, engineering, mining, automotive, aerospace, and logistics.

From C-suite leaders, to department leaders and professionals, Industrial IMMERSIVE Week brings together seasoned specialists and decision-makers. Some of the most important Fortune 500 companies will be represented at the event, including Boeing, Canon, Shell, Amazon, and Hewlett Packard Enterprise.

6. Augmented Enterprise Summit

When: October 24-26

Where: Houston, TX; virtual

As its name suggests, this AR/VR event is aimed at enterprises. Augmented Enterprise Summit is dedicated to the business and industrial applications of XR and other metaverse-related emerging technologies.

This year, the event boasts an impressive line-up of speakers including NASA’s Simulation and Graphics Capabilities Manager Angelica Garcia, Coca-Cola’s Director of Technical Training and Development Michael Whatley, Airbus Head of Augmented Worker Product Andreas Oeder, and Amazon Web Services Principal Spatial Computing Solutions Architect Kurt Scheuringer.

Among the sponsors and exhibitors, you can find a number of XR companies such as Mytaverse, RealWear, Strivr, and Vuzix.

Aside from more than a thousand expected attendees, the 10th Augmented Enterprise Summit will also feature over 50 exhibitors and sponsors, and more than 50 educational sessions.

The event also offers a virtual pass, with access to a virtual event app, a digital exhibitor zone, exclusive post-event resources, and all sessions and on-demand recordings. As an Augmented Enterprise Summit media partner, we can offer our readers 15% off event tickets with the code arpost15.

DEVAR Launches Neural Network for AR Content Creation

The problem with today’s AR content creation platforms is that their output is usually suitable only for powerful phones. But many consumers use older or budget smartphones, and they deserve to enjoy AR experiences as well. With this idea in mind, DEVAR is preparing to launch the first neural network for AR content creation, producing augmented reality projects adapted to all devices.

The Latest Mission of DEVAR: Creating AR Content for Everyone

Big brands and gaming corporations hire experienced designers and developers to create their AR content. And they usually target consumers with new flagship phones. When someone with an older or budget model tries to install top AR apps and games, they find that they are either incompatible with their phone or they won’t run properly.

The Generative AR Platform, featuring a neural network for AR, will change that. Every content creator can generate 3D objects without having to figure out all the technical details. Moreover, these objects will be optimized for any kind of device, including older and entry-level models.

Key Advantages of DEVAR’s Neural Network for AR

The new platform for creating AR content has several advantages that will likely attract a large number of users. First of all, the platform includes DEVAR’s no-code platform MyWebAR. Launched in 2021, this service allows users to create AR content for web (WebAR) – which can be displayed directly in a web browser, without the need to install an app.

Also, the neural network for AR will calculate all the necessary parameters for creating the 3D objects, including the number of polygons, the presence of textures, and the correct topology.

An experienced designer would need several hours to determine these parameters. The Generative AR Platform performs the computations in seconds.

Finally, the 3D objects generated by the neural network for AR can be used in two ways: as AR content and as markers.
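To illustrate the kind of device-aware parameter selection described above, here is a hypothetical Python sketch of a polygon and texture budget heuristic. The tier names, numbers, and function are assumptions made for illustration only; they are not DEVAR’s actual logic or API.

```python
# Hypothetical illustration only -- not DEVAR's actual logic or API.
# Picks a rough polygon budget and texture size for a generated 3D asset
# based on the kind of device that will display it.

DEVICE_TIERS = {
    "entry":    {"max_polygons": 10_000,  "texture_px": 512},
    "midrange": {"max_polygons": 50_000,  "texture_px": 1024},
    "flagship": {"max_polygons": 150_000, "texture_px": 2048},
}

def asset_budget(device_tier: str, object_count: int) -> dict:
    """Split a per-scene polygon budget across the objects in a WebAR scene."""
    tier = DEVICE_TIERS.get(device_tier, DEVICE_TIERS["entry"])
    per_object = max(1_000, tier["max_polygons"] // max(object_count, 1))
    return {"polygons_per_object": per_object, "texture_px": tier["texture_px"]}

print(asset_budget("entry", object_count=5))
# {'polygons_per_object': 2000, 'texture_px': 512}
```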

Making AR Creation Simpler Will Increase Adoption Rate Among All Industries

AR is already in use in various fields, but more aspects of our life and work could benefit from it. The problem is that it is still new territory, and experienced designers are very expensive to hire.

DEVAR plans to solve this issue. “Professional studios usually have no problem with creating new characters for AR – they have 3D artists, designers, and animators on staff to do this,” said the company founder and CTO, Andrei Komissarov, in a press release.  “But according to the data of our no-code platform MyWebAR, 60% of users have no experience creating AR. One of the main issues that becomes a barrier to their entry into the industry is the creation of 3D assets.”

Apart from the neural network for AR, the Generative AR Platform offers users a large library containing thousands of 2D and 3D objects they can use to create their own content.

Meta Shows New Progress on Key Tech for Making AR Genuinely Useful

Meta has introduced the Segment Anything Model, which aims to set a new bar for computer-vision-based ‘object segmentation’—the ability for computers to understand the difference between individual objects in an image or video. Segmentation will be key for making AR genuinely useful by enabling a comprehensive understanding of the world around the user.

Object segmentation is the process of identifying and separating objects in an image or video. With the help of AI, this process can be automated, making it possible to identify and isolate objects in real-time. This technology will be critical for creating a more useful AR experience by giving the system an awareness of various objects in the world around the user.

The Challenge

Imagine, for instance, that you’re wearing a pair of AR glasses and you’d like to have two floating virtual monitors on the left and right of your real monitor. Unless you’re going to manually tell the system where your real monitor is, it must be able to understand what a monitor looks like so that when it sees your monitor it can place the virtual monitors accordingly.

But monitors come in all shapes, sizes, and colors. Sometimes reflections or occlusions make it even harder for a computer-vision system to recognize them.

Having a fast and reliable segmentation system that can identify each object in the room around you (like your monitor) will be key to unlocking tons of AR use-cases so the tech can be genuinely useful.

Computer-vision based object segmentation has been an ongoing area of research for many years now, but one of the key issues is that in order to help computers understand what they’re looking at, you need to train an AI model by giving it lots of images to learn from.

Such models can be quite effective at identifying the objects they were trained on, but they often struggle with objects they haven’t seen before. That means one of the biggest challenges for object segmentation is simply having a large enough set of images for the systems to learn from, and collecting those images and annotating them in a way that makes them useful for training is no small task.

SAM I Am

Meta recently published work on a new project called the Segment Anything Model (SAM). It’s both a segmentation model and a massive set of training images the company is releasing for others to build upon.

The project aims to reduce the need for task-specific modeling expertise. SAM is a general segmentation model that can identify any object in any image or video, even objects and image types it didn’t see during training.

SAM supports both automatic and interactive segmentation, identifying individual objects in a scene from simple user inputs. It can be ‘prompted’ with clicks, boxes, and other cues, giving users control over what the system attempts to identify at any given moment.
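For readers who want to experiment, Meta has released SAM as an open-source Python package, and a point-prompted query looks roughly like the sketch below. It assumes you have installed segment-anything (plus PyTorch and OpenCV) and downloaded the ViT-H checkpoint from the project’s repository; the image file name and click coordinates are placeholders.

```python
# Minimal sketch of point-prompted segmentation with Meta's segment-anything package.
# Assumes: pip install segment-anything, PyTorch and OpenCV installed, and the
# ViT-H checkpoint downloaded from the project's GitHub repository.
import numpy as np
import cv2
from segment_anything import sam_model_registry, SamPredictor

# Load the model weights and wrap them in a predictor.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# Hand the predictor an RGB image once; prompts can then be issued repeatedly
# against the same image embedding.
image = cv2.cvtColor(cv2.imread("office.jpg"), cv2.COLOR_BGR2RGB)  # placeholder image
predictor.set_image(image)

# A single foreground click (x, y) -- say, a point on the monitor -- is the prompt.
# Label 1 marks a foreground point; 0 would mark background.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[640, 360]]),  # placeholder coordinates
    point_labels=np.array([1]),
    multimask_output=True,                # return several candidate masks
)
best_mask = masks[np.argmax(scores)]      # boolean H x W mask of the clicked object
```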

It’s easy to see how this point-based prompting could work great if coupled with eye-tracking on an AR headset. In fact, that’s exactly one of the use-cases that Meta has demonstrated with the system.

Meta has also shown SAM being used on first-person video captured by its Project Aria glasses.

You can try SAM for yourself in your browser right now.

How SAM Knows So Much

Part of SAM’s impressive abilities come from its training data which contains a massive 10 million images and 1 billion identified object shapes.  It’s far more comprehensive than contemporary datasets, according to Meta, giving SAM much more experience in the learning process and enabling it to segment a broad range of objects.


Meta calls the SAM dataset SA-1B, and the company is releasing the entire set for other researchers to build upon.

Meta hopes this work on promptable segmentation, and the release of this massive training dataset, will accelerate research into image and video understanding. The company expects the SAM model can be used as a component in larger systems, enabling versatile applications in areas like AR, content creation, scientific domains, and general AI systems.

Meta Shows New Progress on Key Tech for Making AR Genuinely Useful Read More »

top-5-e-commerce-ar-and-vr-trends-to-follow-in-2023

Top 5 E-Commerce AR and VR Trends To Follow in 2023

AR and VR are two of the most promising technologies of the modern era. Both can potentially revolutionize how we interact with the world around us. However, these technologies have taken a long time to reach their full potential.

While AR and VR have been around for decades, it wasn’t until recently that they offered a quality experience without being too limited by technological constraints or not being portable enough for widespread use.

Nevertheless, they’ve been making waves in many industries. And now, the e-commerce industry is hopping on board. Statista reports that there will be 1.4 billion mobile AR user devices worldwide in 2023, projected to rise to 1.73 billion by 2024.

Number of mobile augmented reality (AR) active user devices worldwide from 2019 to 2024 - Statista
Source: Statista

What should we expect from AR and VR in 2023 and beyond? In this article, we will explore the potential of AR and VR for e-commerce and how they can enhance your shopping experience.

1. Increased Adoption of AR in E-Commerce

According to a recent survey, 38% of marketers reported using AR in 2022. It’s a significant increase from the 23% reported in 2017. And it’s understandable, given AR technology’s benefits to e-commerce customers.

For example, it allows them to feel like they’re physically interacting with products in a brick-and-mortar store while being online. AR can also help consumers visualize how products will look in their homes or on their bodies, improving the shopping experience and leading to more informed purchasing decisions and fewer returns.

Looking ahead to 2023, there are six exciting trends in the AR shopping space to keep an eye on. They are:

1. Social Media Apps and Camera Filters

Snapchat and Instagram are leading the charge here, incorporating AR directly into their platforms. Brands can use Snap AR’s Lens Studio or Meta Spark to create engaging filters and lenses that bring products to life. A case in point is Gucci.

Gucci AR Instagram filter
Screenshots taken on the official Gucci Instagram account

2. Virtual Try-On Technology

Virtual try-on technology lets shoppers see how products will look on them, as on the Sephora Snapchat page.

Sephora Snapchat Lens
Screenshots taken on the official Sephora Snapchat account

3. Virtual Showrooms

Virtual showrooms are similar to try-ons but involve the buyer flipping the camera around to place products in their own space. This approach is popular among furniture stores like EQ3.

virtual showroom EQ3
Screenshots taken on the official EQ3 website

4. Better AR Hardware Options

Better AR hardware options are emerging, driven by innovations in mobile technology such as LiDAR and ToF depth sensors. Companies like Google, Microsoft, Lenovo, and Vuzix are developing smart glasses to enhance the AR experience.

5. AR Mirrors

AR mirrors assist in-store shoppers who either don’t want to try on various alternatives or can’t for whatever reason.

6. Gamifying

Gamifying in-store shopping connects physical products with apps, creating a fun and interactive shopping experience.

2. VR-Enabled Online Shopping Experiences

VR creates an immersive visual environment, including 360-degree videos, photos, product demos, and complex experiences using devices such as the HTC Vive or Oculus Quest.

Unlike AR, VR is entirely simulated and disconnected from the physical world. VR can benefit businesses in various ways, such as:

  • virtual tours of showrooms and stores;
  • visualization of products;
  • greater user engagement;
  • increased consumer trust;
  • enhanced conversion rates;
  • better retention rates;
  • improved customer service.

However, it’s essential to be wary of “shiny toy syndrome”: ensure that VR experiences align with your business goals and customer needs before opting for them. E-commerce stores can use VR for the following purposes:

  • virtual stores with virtual clothing racks and the opportunity to meet friends and shop together online;
  • “try before you buy”;
  • in-store experiences;
  • live events;
  • interactive education.

3. Introducing AI Into AR and VR Solutions

Artificial intelligence can integrate with AR and VR technologies to revolutionize the shopping experience. AI-powered 3D representations of products in a user’s environment can increase conversions. Here is how AI can enhance virtual experiences:

1. Object Recognition

AR and VR experiences can adjust to the user’s movements and actions thanks to AI algorithms’ ability to detect and track objects in real time.

2. Computer Vision

Computer vision involves image recognition and tracking, enabling the system to respond to the user’s environment.

3. Natural Language Processing (NLP)

NLP lets people use voice commands to explore and interact with virtual worlds.

4. Predictive Analytics

As AI can predict user behavior, merchants can build personalized and proactive experiences.

5. Usage Analytics

AI can also help analyze usage data and client feedback. You can optimize your AR/VR services and boost buyer satisfaction based on the results.

6. Personalized Experiences

One way to employ customer insights is to tailor offers to shoppers’ tastes, which can boost satisfaction and sales.
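As a purely illustrative sketch, not tied to any particular vendor’s API, the snippet below shows one simple way such personalization could work: counting which categories a shopper has browsed and surfacing the best-matching catalog items for an AR try-on. The catalog and scoring rule are hypothetical.

```python
# Hypothetical sketch of taste-based personalization for AR try-ons.
# The catalog, categories, and scoring rule are illustrative assumptions.
from collections import Counter

CATALOG = [
    {"sku": "sun-01",  "name": "Aviator sunglasses", "category": "eyewear"},
    {"sku": "lip-02",  "name": "Matte lipstick",     "category": "beauty"},
    {"sku": "sofa-03", "name": "Two-seat sofa",      "category": "furniture"},
]

def recommend_for_ar(browsing_history: list[str], top_n: int = 2) -> list[dict]:
    """Rank catalog items by how often their category appears in the shopper's history."""
    counts = Counter(browsing_history)
    ranked = sorted(CATALOG, key=lambda item: counts[item["category"]], reverse=True)
    return ranked[:top_n]

# Suggest AR try-ons for the top-ranked items first.
print(recommend_for_ar(["eyewear", "eyewear", "beauty"]))
```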

4. Creating Digital Twins

The past year has seen an increase in AR and 3D technology use by fashion brands to boost sales and brand recognition in physical and virtual worlds. And in 2023, we can expect more brands to utilize AR innovatively.

This includes the ability to try digital versions of physical clothing on your avatar. Another example is unlocking special effects for physical apparel. Some brands create digital-only looks that users can capture on camera and share on social media.

This trend is made possible by avatar platforms and AR features such as image targets and body tracking, which let brands offer and sell virtual goods. And with NFC (Near Field Communication) tags and QR codes embedded in physical apparel, a single item (for example, a T-shirt) can be transformed into virtually unlimited designs.
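As a small illustration of the QR-code half of that idea, the sketch below uses the open-source Python qrcode library to generate a code that could be printed on a garment and point shoppers to a WebAR experience; the URL and file name are placeholders.

```python
# Minimal sketch: a QR code linking a physical item (e.g. a T-shirt) to a
# web-based AR experience. Requires: pip install "qrcode[pil]".
# The URL below is a placeholder, not a real product page.
import qrcode

img = qrcode.make("https://example.com/ar/tshirt-design-01")
img.save("tshirt_ar_qr.png")  # print this code on the garment's tag or fabric
```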

5. Security Concerns Over the Usage of AR and VR

Consumers are increasingly concerned about privacy, security, and safety in computing. The metaverse, new headsets, and more AR and VR content have made safety a greater focus. Devices can now gather more information through eye and hand tracking. AR also relies on spatial data to immerse users. That’s why customers remain skeptical about using such devices daily.

How can people safely enjoy digital realities? We need new frameworks, regulations, and social contracts prioritizing safety. All these require collaboration through working groups, policy and standard discussions, and new software solutions for moderation and cyber threats.

Final Word

To sum up, AR and VR can enhance the e-commerce industry by improving the customer experience, driving more engagement, and cutting costs. But there are many challenges to overcome before these technologies can become mainstream.

For example, some websites are incompatible with VR headsets or AR apps simply because they were not built with those devices in mind. And not everyone owns a headset or a smartphone capable of running these technologies.

That’s why e-commerce merchants should address compatibility early so they don’t lose potential clients to such issues. As these technologies improve, more online stores will use AR and VR to give shoppers immersive shopping experiences. The future of e-commerce is exciting, and augmented and virtual reality are sure to play a significant role in shaping it.

Guest Post


About the Guest Author(s)

Art Malkovich

Art Malkovich is CEO and co-founder of Onilab, an e-commerce development company. He has about 10 years of experience in team management and web development. He is passionate about keeping up with recent technologies and working on innovative projects like headless commerce solutions and PWAs in particular.

Top 5 E-Commerce AR and VR Trends To Follow in 2023 Read More »