Moth+Flame Launches AI-Powered VR Authoring Tool for Custom Enterprise VR Training Content Creation

A new VR authoring tool could accelerate enterprise VR adoption this year. Invesco, a global investment firm, recently launched a virtual reality training experience they custom built using Moth+Flame’s new VR authoring tool. Taking only 72 hours to develop, this custom VR training gives us a glimpse of the vast potential the new tool offers.

Moth+Flame VR Authoring Tool Addresses Challenges in VR Content Creation

In one of PwC’s most recent surveys, 51% of companies have built VR into at least one specific line of business or are in the process of integrating VR into their processes. Many companies also see VR as an effective way to develop and train people.

While the adoption of VR in enterprises is steadily increasing, enterprises face several challenges that hamper VR integration. Among these are the long development time and the level of expertise needed to create VR content.

Moth+Flame, a VR technology platform that specializes in enterprise-grade immersive learning solutions, addresses some of these challenges to accelerate the adoption of VR.

Ushering in a year of great leaps in tech development, Moth+Flame launched a new VR authoring tool that leverages generative artificial intelligence. The new tool allows users to build custom VR content more quickly and easily. It empowers metaverse content creators and enterprise users to fast-track the development and adoption of VR across industries.

Empowering Novice Users to Create Immersive VR Content

Built on an advanced AI-driven platform, the VR authoring tool generates immersive training content that feels hyper-realistic and engaging. Voice-activated features prompt users to speak to navigate the training scenario. The interactive and immersive experience enhances training as it reinforces learning objectives and improves knowledge retention.

With its user-friendly interface, the platform can be used for VR content creation even by novice users with little or no technical expertise. Drag-and-drop editors and 3D asset libraries help users create high-quality immersive experiences. Multi-user support allows collaboration within teams and among learning and development departments. All these features enable enterprises to revamp existing training or create new ones that are more engaging and effective.

Invesco Uses Moth+Flame VR Authoring Tool to Build VR Training

The Moth+Flame VR authoring tool is still in its beta phase, but it has already shown how it can revolutionize VR use at the enterprise level. Invesco, the first early-access user to bring this technology to workforce development, has shown the benefits the tool brings to enterprise training.

Last week, Invesco launched their new VR training that was built using the Moth+Flame authoring tool. While the average development time for a VR training program is around eight to ten weeks, Invesco was able to build theirs in just 72 hours. Moreover, they were able to create VR content tailored to their specific requirements.

The new VR training is designed to help the Invesco sales team handle customer complaints and concerns. Sales representatives use VR headsets to practice conversations with customers in realistic simulated scenarios. By adopting the Moth+Flame authoring tool, Invesco can generate other immersive training experiences across their enterprise.

“The biggest challenge for all education is the scale of content creation. So much enterprise training is limited to low-scoring e-learning products because of scale limitations,” said Kevin Cornish, CEO of Moth+Flame. “With this tool, enterprises will be able to scale their content creation across all use cases in virtual reality, the most effective training modality available.”

Paving the Way for Rapid Adoption of VR in Enterprises

Invesco, along with other global brands, is now implementing VR training to maximize efficiency in the workplace. With the new VR authoring tool from Moth+Flame, enterprises gain access to advanced technology that empowers them to create immersive VR training. They can easily create VR content and deploy it to their workforce on iOS devices or VR headsets. With solutions like this authoring tool, we can expect the rapid adoption of VR at the enterprise level across industries.

A Farewell to AltspaceVR

The social VR community was shocked and devastated to learn that AltspaceVR is shutting its doors. While it’s in the nature of emerging technology applications to come and go, often with few tears shed by their few remaining users, this platform seems to have been taken from us in the prime of its life, leaving many wondering where to go next.

The Sun Goin’ Down on AltspaceVR

On January 20, a blog post titled “AltspaceVR to Sunset the Platform on March 10, 2023” appeared at the top of the company’s homepage. An email was also sent to users.

“When AltspaceVR first launched, our vision was to create a place where people from around the world could connect and socialize in real time,” the team wrote. “It was a bold vision, and with the help of our passionate community, the platform became a place where users made lifelong memories, formed cherished friendships, found love — and even married in IRL (in real life).” 

Much of the post waxes nostalgic about the platform, which launched in 2015 and was purchased by Microsoft in 2017. The post also encouraged “the many creators and developers who are part of the AltspaceVR community to host final events and download their content.”

“The decision has not been an easy one as this is a platform many have come to love, providing a place for people to explore their identities, express themselves, and find community,” wrote the team. “It has been a privilege to help unlock passions among users.”

Why, Microsoft? Why?

The team writes that they are closing AltspaceVR to “shift our focus to support immersive experiences powered by Microsoft Mesh.” Microsoft Mesh is the company’s mixed reality platform that was announced from within AltspaceVR during Microsoft Ignite 2021. The event, featuring Alex Kipman and James Cameron, looked at the platform as a tool for humanity.

“A dream you dream alone is just a dream. A dream we dream together is called reality,” Kipman said at the time, quoting John Lennon. “We tend to think of reality and dreams as separate. But, are they?”

Since then, Microsoft has focused on Mesh as more of an enterprise tool than a collaborative dream maker. At Microsoft Ignite 2022, CEO Satya Nadella announced that Mesh would be integrating with Microsoft Teams. Nadella also presented integrations between Microsoft Teams (presumably including Mesh) and Meta’s Horizon Workrooms at Meta Connect 2022.

“We knew early on that we wanted this to be a great place to get work done and with Microsoft, we can make that happen,” Meta CEO Mark Zuckerberg said at the event.

What About Us?

Shutting down AltspaceVR to focus on an enterprise platform is a strange play seeing as virtually none of the people using it were using it for enterprise. The always-full events schedule included comedy shows, community events, support groups and spiritual meetings, and musical performances. Where will they go?

Some of the events that you may know and love from AltspaceVR are already happening across multiple immersive platforms. For example, the live VR comedy show Failed to Render began in AltspaceVR but also has a home in Horizon Worlds and VRChat. Similarly, the Polys Awards takes place in AltspaceVR but also in Mozilla Hubs, ENGAGE, and others.

But, what about those experiences that are only on AltspaceVR? Like BRCvr’s Burning Man experiences? An email from the organization said that they have plans for future events in another platform, but are respectfully waiting for the book to close forever on this chapter.

“Leaving AltspaceVR is difficult for all of us who have loved building this incredible community,” BRCvr co-founder Athena Demos said in the email. “Before we announce where we plan to continue our virtual Burning Man experiences, we want to honor the incredible friendships and creative partnerships we have forged on AltspaceVR.”

Kavya Pearlman, the CEO of XR Safety Initiative – which has been hosting events, such as Metaverse Safety Week, on the platform for years – said in a tweet that she was in talks with the group’s technical partner about moving to other platforms. Nothing official has been announced.

Yep , that’s quite a sad news .. just spoke to @ChickenWaffleVR about moving @MetaSafetyWeek to alternate platforms.

I wonder why @Microsoft bought @AltspaceVR in the first place , if it was only to sunset it later on.

So sorry for the team & everyone impacted.

— Kavya Pearlman (@KavyaPearlman) January 20, 2023

The King Is Dead. Long Live the King.

AltspaceVR had no enemies but it had many competitors. When the eulogy of a blog post went live on Twitter, representatives from a number of immersive platforms spoke at the wake.

We have some sad news, Altspacers. #AltspaceVR is shutting down on March 10th.

Though we hate saying goodbye, we also feel such pride and gratitude for all the magic that happened here. ✨

Thanks for joining us on this epic adventure. #socialvr https://t.co/peCwpaaBl3

— AltspaceVR (@AltspaceVR) January 20, 2023

There was no gloating at the grave, but tweets from Somnium Space, Spatial, and Mona invited displaced worldbuilders to take their events and spaces along. While this may be an option for events and assets, it’s not a meaningful option for users.

As mentioned in the blog post, users can download their data from the platform. However, there isn’t much that can be done with it. Photographs taken within the platform can be cherished but avatars can’t be transferred. It’s possible to download profile data from users that you met on the platform, but that’s not likely to help you connect with them on other platforms.

And, what are those platforms likely to be? Where will we meet next? A couple of solutions have already been named.

Where Will We Go?

VRChat is a proven and robust platform, but isn’t necessarily friendly towards new users, particularly when it comes to avatar creation. It has more tools and options than AltspaceVR but that can make it feel unwieldy. Somnium Space is in a similar category.

Spatial is a strong contender. The lightweight and easy-to-use platform runs in-browser, which is another strong bonus. Integration with Ready Player Me also means that creating an avatar is easy and bringing one with you if you already have one is even easier. Mona is in a similar category.

Mozilla Hubs has been putting in a lot of work, but that work now comes at a price. With the new $20/month subscription, many casual users won’t be signing on anymore.

ENGAGE is a solid free option, though it does require an app download. The same goes for Horizon – particularly if Meta makes good on its promise to make the platform available on hardware other than Meta headsets.

VIVE has also been building out its immersive offerings, but these also tend to be clunkier than many users will want. Virbela has also shown itself a capable, robust, and user-friendly platform, though a free app download is required.

The End of an Era

This was the first immersive world I ever entered. The fact that you don’t need a headset to enter the free and easy-to-use platform made it an unparalleled introduction to VR.

I have many memories from this platform, from both before and after I got my first headsets, including the first time that I really felt like I had made eye contact with someone across the country. After years in the industry, there were still times when I would put on a headset just to shoot hoops outside of my VR house. AltspaceVR will be missed.

GDC 2023 State of the Game Industry Report Includes Insights Into VR and AR

Games are the largest use case for consumer VR and AR. While VR and AR remain a comparatively small segment of the games industry, the industry is taking notice as VR hardware in particular improves. This presents possibilities for new kinds of games but also promises to breathe new life into established franchises.

The GDC’s State of the Game Industry Report for this year is by no means dedicated to VR and AR. However, it does hold insights into how this segment of the industry is growing and changing. This includes insights into larger emerging technology trends like Web3 and the metaverse.

VR and AR in the Larger Games Industry

This GDC survey, the 11th in an annual series, found that “the metaverse has become more than a buzzword.” That doesn’t mean that VR and AR are now the driving force in the games industry.

In terms of which platforms developers are building for, VR headsets landed in 10th place with 12% of respondents, while AR landed in 14th place with 4%. When asked which platforms developers are building their next project for, VR headsets remained in 10th place with 12%, but AR moved up to 11th place with 5%.

GDC State of the Game Industry 2023 - platforms developers are building for
Source: GDC State of the Game Industry 2023

PC leads the pack, with the intervening platforms consisting of the usual suspects – legacy gaming platforms and mobile. However, this may be changing in the near future.

When asked which platforms developers are most interested in, 23% of respondents said VR, pushing the platform to 6th place, ahead of Android, Mac, and Xbox One. Similarly, 12% responded with AR, placing it in 11th place ahead of PS4 and web browsers.

GDC State of the Game Industry 2023 - which platform most interests game developers
Source: GDC State of the Game Industry 2023

So, while we might not see a boom period for VR and AR games in the immediate future, it’s increasingly on the radar of game developers. This trend looks like it could be setting up growth in this aspect of the industry within the next few years.

That said, last year’s big metaverse hype may have led to increased expectations for the cycle we’re in now. Last year, 42% of respondents said that they were actively involved in VR and AR game development. Now that number is at 38%, closer to where it was in 2021.

Platform Wars Within VR

So, of the developers that are working in VR and AR gaming, what platforms are they working on?

When asked which platform their next game will release on, 36% responded with Quest, meaning Quest 2. An additional 10% responded with “Project Cambria” – the Quest Pro, which had not yet been released at the time of the survey. A further 10% responded with Rift, Meta’s now-discontinued line of tethered PC VR headsets.

GDC State of the Game Industry 2023 - VR and AR platforms developers are building games for
Source: GDC State of the Game Industry 2023

It is worth noting that the percentage of respondents working with Quest has gone up almost 10% since last year. That in itself would not be surprising, if not for the fact that the overall number of VR and AR game developers has gone down.

Interestingly, the runner-up is the as-yet-unreleased PlayStation VR 2 with 18%, followed by the HTC VIVE ecosystem at 15%. A further 12% responded with Apple’s ARKit, and another 9% responded with Android’s ARCore. There was also a potentially unexpected write-in entry.

“A handful of respondents shared that they were developing games for Pico, a platform that was not on the survey list,” the report offers. In some geographical markets, the Pico 4,  which was announced shortly before the Quest Pro, is a significant potential Quest Pro competitor. However, Pico Interactive does not currently offer consumer support in the US.

Gaming in the Metaverse?

“The concept of the metaverse continues to pick up steam in the game industry, as new and existing companies alike move to secure funding, spin up projects, and develop new technology,” reads the survey. However, like VR and AR gaming, this news comes with a grain of salt and some more sober attitudes since last year.

Nearly half of the respondents didn’t select any of the survey’s platform options. They instead said that “the metaverse concept will never deliver on its promise.” This occurred last year as well when around a third of respondents said that the metaverse will never materialize.

From a VR and AR perspective, it gets worse. More developers said that Fortnite would become the model metaverse platform than Horizon Worlds. This isn’t bad news because Horizon Worlds is better than Fortnite; it’s bad news because Horizon Worlds is VR and Fortnite isn’t. In fact, many of the more popular “metaverse” contenders are flat platforms.

GDC 2023 State of the Game Industry - Metaverse promise
Source: GDC State of the Game Industry 2023

And it gets worse. “Microsoft/Minecraft” came in a distant third place with 7% of respondents choosing them as the model metaverse. This presumably included AltspaceVR. As this article was being written, it was announced that AltspaceVR is coming to an end.

A Note on Blockchain

ARPost is not explicitly interested in blockchain, but as blockchain is a potential pillar of both the metaverse and the future of gaming, it seems worth sharing some of the survey’s findings in this field. And, if you aren’t explicitly interested in blockchain either, the survey results should please you.

When asked about their interest in blockchain integration in games, 23% of respondents said that they were “very interested” or “somewhat interested”, with 75% saying that they were not interested at all. The remaining 2% are using blockchain in games already, with blockchain being the principal monetization strategy of around 4% of games.

Interest in blockchain is down slightly from last year, but, according to the report, most respondents were against blockchain last year as well and simply haven’t changed their minds.

GDC State of the Game Industry 2023 - blockchain in game industry
Source: GDC State of the Game Industry 2023

“Many developers said there could be a valuable place for blockchain technology in video games in the future,” the report explains. “Others said that the risks outweigh the benefits and that existing technologies serve similar purposes that negate the need for blockchain.”

A Maturing Industry

If you thought that the gaming industry was moving a little too fast last year, you were right. Metaverse hype driven by hardware expectations and blockchain buzz may have led to a brief, hard burn in the industry. It now seems that a small correction has taken place but the VR and AR games industry is settling in for longer-term development.

For the full picture of the whole gaming industry, find the complete report here.

Assisted Reality: The Other AR

“AR” stands for “augmented reality,” right? Almost always. However, there is another “AR” – assisted reality. The term is almost exclusively used in industry applications, and it isn’t necessarily mutually exclusive of augmented reality. There are usually some subtle differences.

Isn’t Augmented Reality Tricky Enough?

“AR” can already be confusing, particularly given its proximity to “mixed reality.” When ARPost describes something as “mixed reality” it means that digital elements and physical objects and environments can interact with one another.

This includes hand tracking beyond simple menus. If you’re able to pick something up, for example, that counts as mixed reality. In augmented reality, you might be able to do something like position an object on a table, or see a character in your environment, but you can’t realistically interact with them and they can’t realistically interact with anything else.

So, What Is “Assisted Reality?”

Assisted reality involves having a hands-free, heads-up digital display that doesn’t interact with the environment or the environment’s occupants. It might recognize the environment to do things like generate heatmaps, or incorporate data from a digital twin, but the priority is information rather than interaction.

The camera on the outside of an assisted reality device might show the frontline worker’s view to a remote expert. It might also identify information on packaging like barcodes to instruct the frontline worker how to execute an action or where to bring a package. This kind of use case is sometimes called “data snacking” – it provides just enough information exactly when needed.
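To make the “data snacking” idea concrete, here is a minimal sketch of the kind of lookup such a device might perform, where a scanned barcode resolves to a one-line instruction for the wearer. Every name, barcode, and data structure below is hypothetical; this is not any assisted reality vendor’s actual SDK.

```typescript
// Hypothetical "data snacking" lookup: a scanned barcode resolves to a short,
// just-in-time instruction for the wearer. Illustrative names only.

interface WorkInstruction {
  summary: string;      // one line shown on the heads-up display
  destination?: string; // e.g., a bin or dock location
}

// Stand-in for a warehouse or logistics lookup service.
const instructionsByBarcode: Record<string, WorkInstruction> = {
  "0012345678905": { summary: "Fragile: lift with both hands", destination: "Dock 4" },
  "0098765432109": { summary: "Cold chain item", destination: "Cooler B" },
};

function onBarcodeScanned(barcode: string): string {
  const instruction = instructionsByBarcode[barcode];
  if (!instruction) {
    return "Unknown item: contact remote expert";
  }
  // Keep the overlay terse: just enough information, exactly when needed.
  return instruction.destination
    ? `${instruction.summary} -> ${instruction.destination}`
    : instruction.summary;
}

console.log(onBarcodeScanned("0012345678905")); // "Fragile: lift with both hands -> Dock 4"
```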

Sometimes, assisted reality isn’t even that interactive. It might be used to do things like support remote instruction by enabling video calls or displaying workflows.

Part of the objective of these devices is arguably to avoid interaction with digital elements and with the device itself. As these devices are used in enterprise, wearers often need their hands for completing tasks rather than for working an AR device or even gesturing with one.

These less technologically ambitious use cases also require a lot less compute power and a significantly smaller display. This means that they can occupy a much smaller form factor than augmented reality or mixed reality glasses. This makes them lighter, more durable, easier to integrate into personal protective equipment, and easier to power for a full shift.

Where It Gets Tricky

One of the most popular uses for augmented reality, both in industry and in current consumer applications, is virtual screens. In consumer applications, these are usually media viewers for doing things like watching videos or even playing games.

However, in enterprise applications, virtual screens might be used for expanding a virtual desktop by displaying email, text documents, and other productivity tools. This is arguably an assisted reality rather than an augmented reality use case because the digital elements are working over the physical environment rather than working with it or in it.

In fact, some people in augmented reality refer to these devices as “viewers” rather than “augmented reality glasses.” This isn’t necessarily fair, as while some devices are primarily used as “viewers,” they also have augmented reality applications and interactions – Nreal Air (review) being a prime example. Still, virtually all assisted reality devices are largely “viewers.”

Nreal Air - Hands-on Review - Jon
Jon wearing Nreal Air

Words, Words, Words

All of these terms can feel overwhelming, particularly when the lines between one definition and another aren’t always straight and clear. However, emerging technology has emerging use cases and naturally has an emerging vocabulary. Terms like “assisted reality” might not always be with us, but they can help us stay on the same page in these early days.

Virtex Stadium Holds First Major Events, Inches Toward Open Access

A number of the attractions of watching live sports carry over into esports. However, unless you’re watching an esports tournament in person, a lot of those attractions go away. Interactions with other fans are limited. The game view is limited. The game is flattened and there’s little environment ambiance. Virtex wants to fix that.

A History of Virtex

Virtex co-founders Tim Mcguinness and Christoph Ortlepp met at an esports event in 2019. Mcguinness presented the idea of “taking that whole experience that we were doing there in the physical world and bringing it into the virtual world,” Ortlepp said in a video call with ARPost. The two officially launched the company in 2020.

The following year saw the company’s first major hires (and its first coverage from ARPost). The company was focusing on integrating Echo VR and needed permission from Meta (then Facebook), which purchased the game’s developer Ready At Dawn in 2020.

“The first thing we had to do was get something that we could show to Meta,” said Ortlepp. “For us, Echo was a good community to start with.”

Virtex got the green light from Meta. It also got Jim Purbrick, who had previously been a technical director at Linden Lab and an engineering manager for Oculus.

“Moderation is an area where he had a big impact on us,” said Ortlepp. “We need live moderators to keep people safe… If now we have two or three hundred people in the platform, what if we have ten thousand people? Can we keep users safe and prevent a toxic environment?”

Meta’s support also meant that Virtex could finally launch its beta application. The beta is still technically closed – meaning that it isn’t on any app store, and you have to go through the Virtex website to access it. However, the closed beta isn’t limited. Testers have the opportunity to participate in “test sessions” – live streamed games every Thursday.

The platform held its first major tournament in December, with another about to kick off as this article was being written. Games are scheduled every week into the spring.

A Tour of the Stadium

Right now, the Virtex virtual world consists of a stadium entrance, a lounge area, and a commentator booth in addition to the stadium itself.

“The purpose [of the entrance and lounge] is really to set the stage for the user, to welcome them,” said Ortlepp.

Virtex Stadium Environment - Exterior

In the lounge, users can socialize, modify their avatars (through a Ready Player Me integration), and even watch a miniaturized version of the live match. The lounge itself is still being developed with plans for mini-games and walls of fame. Connected areas including a virtual store and bar area are also in the works.

In the stadium itself, users can see and interact with other spectators. They can watch a 3D reproduction of the live game in real time, or watch a Twitch stream of the game on a jumbo screen above the stadium floor.

“We feature the video because we didn’t want to take away from esports viewers what they’re currently used to,” said Ortlepp. Virtex wants to give spectators options to explore viewing in new ways, without leaving them in an entirely unfamiliar setting.

A teleport system allows faster movement to different areas of the stadium, including the stadium floor to watch from within the game or even follow players through the action. This is possible thanks to the unique solution that Virtex has developed for recreating the game within the virtual stadium.

Virtex Stadium streaming

The studio also adds special recording and hosting tools like camera bots for streaming games within the stadium to Twitch and YouTube. Aspects of the stadium’s appearance can even be changed to match whatever game is being played.

“We are the platform. Ideally, we don’t ever want to be the content creators,” said Ortlepp. “So we have certain user modes for the ones that are actually operating the tournaments.”

When Can We Expect an App?

Virtex Stadium is up and running. But, the team plans to spend at least the next few months in their “closed” beta phase. For one thing, they really want to have their moderation plan in place before making the app more discoverable. They’re also still collecting feedback on their production tools – and thinking of new ones.

Further, while the platform currently has a decent schedule, the team wants to work with more games and more gaming communities. That includes other VR titles as well as more traditional esports. Ideally, one day, something will be happening in Virtex no matter when a user signs in.

“Where do we take it from here? There are no standards – no one has done this before,” said Ortlepp. “The virtual home of esports is basically the vision. It’s something we don’t claim yet – we have to earn it.”

It’s Not Too Early to Check It Out

Everything about Virtex is exciting, from their plans for the virtual venue itself to their passion and concern for their community. Ortlepp said that the company is “careful about making dated timeline promises.” In a way, that’s a little frustrating, but it’s only because the company would rather hold off on something amazing than push something that falls short of their vision.

BRCvr Presents Re-Burn-23, a Burning Man-Inspired Event, Launches “Blind Burners World”

BRCvr, the award-winning virtual version of Black Rock City, will present Re-Burn-23 this month. The global virtual Burning Man-inspired event can be accessed via Zoom and AltspaceVR. It will run from January 27 to January 29, 2023, and will include exciting additions to its lineup, such as the metaverse environment “Blind Burners World.”

What Is Re-Burn-23?

Re-Burn-23 is a three-day immersive virtual event that will bring together artists, technologists, Burners, the Burner-curious, and other creatives in a Burning Man-inspired event.

It will feature new additions, such as Blind Burners World, as well as popular immersive worlds from previous virtual burn events. “We are taking the principle of ‘Communal Effort’ to a whole new level, bringing technologies and artists together in a robust collaboration that integrates 360 video from the 2022 burn with 3D digital assets, continuing our efforts to bring a rich Burner experience to the globe,” said Athena Demos, co-founder of BRCvr, in a press release shared with ARPost.

Re-Burn-23 is organized by Big Rock Creative (BRC), an award-winning company that crafts unique XR experiences. Offered as a gift to enthusiasts, this virtual event will be free of charge for interested individuals.

The event can be accessed by a global audience via Zoom, for mobile and desktop devices, and AltspaceVR, for VR headsets, Macs, and PCs. The wide accessibility via VR and 2D is an effort to bring about radical inclusion. Re-Burn-23 will be the final virtual burn event that will be hosted on the Microsoft AltspaceVR platform, since the platform is, unfortunately, shutting down on March 10 this year.

Blind Burners World

Re-Burn-23 will feature many new additions, and one of them is Blind Burners World.

Blind Burners World is the first-ever immersive metaverse environment that will showcase works by artists with visual impairment. In partnership with Blind Burners – a community of blind, partially sighted, and sighted artists and performers – Blind Burners World is designed to be navigable by sound. It will feature a gallery showcasing art and photography by artists with blindness and low vision, a Temple of Accessibility, and a Galactic Sound Garden. Each artwork will come with an audio description made and recorded by the artist.

Preview of Blind Burners World - BRCvr - Re-Burn-23

“When Burning Man communities moved online in 2020, Athena and the team at BRCvr were the first to embrace our call for accessibility in the metaverse,” said Blind Burners founder, Chris Hainsworth. “Without accessibility, cultural claims to radical inclusion are as shallow as a teacup.”

BRCvr Immersive Documentary Experience

At Re-Burn-23, participants will have the opportunity to watch a preview of the BRCvr Immersive Documentary Experience. It will be presented as mixed reality storytelling within an immersive documentary application. The preview will feature live-action content captured during the 2022 on-playa burn in Black Rock City and will allow participants to engage with and share this new experience.

Preview of BRCvr Immersive Documentary Experience - BRCvr - Re-Burn-23

Re-Burn-23 will feature other events including Sovereign Light 2023, Daisy Shaw’s World Tour of Re-Burn-23, a content creator meet and greet, artist talks, and even a “Bad Movies and Pizza Party” event, where participants can enjoy watching bad movies while eating pizza.

CES 2023 Highlights Featuring News and Innovations From Canon, MICLEDI, and NVIDIA

CES is considered the world’s tech event, showcasing groundbreaking technologies and innovations from some of the world’s biggest brands, developers, manufacturers, and suppliers of consumer technology. At CES 2023, attendees saw the unveiling of the latest developments from over 3,200 exhibitors, including technology companies Canon, MICLEDI, and NVIDIA.

Canon Immersive Movie Experience and Immersive Calling Experience

Canon USA has partnered with filmmaker and director M. Night Shyamalan (The Sixth Sense, The Village, and Signs) to create an immersive movie experience for CES 2023 attendees featuring his upcoming film Knock at the Cabin (which will be in theaters February 3). As part of the showcase, Canon unveiled Kokomo, immersive virtual reality software that gives users a lifelike calling experience.

Canon Kokomo - CES 2023
Kokomo

With Kokomo, users can now connect with their friends and family as if they’re there in person by using a compatible VR headset and smartphone. In a 3D call, Kokomo will emulate a photo-real environment and mirror the physical appearance of the user. CES 2023 participants were able to witness Kokomo in action at the Canon booth, where they were able to have a one-on-one Kokomo conversation with select characters from the movie Knock at the Cabin.

Aside from Kokomo, Canon also unveiled its Free Viewpoint Video System, which creates point-cloud-based 3D models for more immersive viewing experiences in larger areas like arenas and stadiums. At CES 2023, attendees were able to experience the Free Viewpoint System, which allowed them to watch an action scene from Knock at the Cabin from multiple viewpoints.

CES 2023 attendees also had the opportunity to see Canon’s mixed reality system MREAL in action, by experiencing a scene from Knock at the Cabin as if they were a character in the movie.

Canon MREAL X1 headset
MREAL X1

MICLEDI Demonstrates New Red µLEDs at CES 2023

MICLEDI Microdisplays, a technology company developing microLED displays for the augmented reality market, also showcased its advancements in microLED display tech for AR glasses at CES 2023.

At the event, the company demonstrated its new red microLEDs on AlInGaP starting material. This development is in line with MICLEDI’s aim to create high-performance, individual-color microLEDs that can be combined with the company’s full-color microLED display module.

Through MICLEDI’s innovations in microLED technology, users can begin to experience clearer and more precise digital images via AR glasses that are more portable and lightweight. The red AlInGaP microLEDs, along with MICLEDI’s three-panel full-color microLED display module, are poised to raise the standards of AR glasses in the coming years.

MICLEDI - Red GaN and Red AlInGaP microLED displays - CES 2023

“There is no one-size-fits-all solution for AR glasses,” said MICLEDI CEO, Sean Lord. “This achievement, with our previously announced blue, green, and red GaN µLEDs, opens the door to a broader offering of display module performance parameters which enables MICLEDI to serve customers developing AR glasses from medium to high resolution and medium to high brightness.”

Demonstration units of both Red GaN and Red AlInGaP were shown at the company’s booth at CES 2023.

NVIDIA Announces New Products and Innovations at CES 2023

NVIDIA announced new developments and NVIDIA Omniverse capabilities at CES 2023. The tech company, which is known for designing and building GPUs, unveiled its new GeForce RTX GPUs, which come with a host of new features that can be found in NVIDIA’s new studio laptops and GeForce RTX 4070 Ti graphics cards. This new series of portable laptops gives artists, creators, and gamers access to more powerful solutions and AI tools that will help them create 2D and 3D content faster.

NVIDIA also shared new developments to its Omniverse, including AI add-ons for Blender, access to new and free USD assets, and an update on the NVIDIA Canvas, which will be available for download in the future.

Aside from these updates, the company also released a major update to its Omniverse Enterprise, which enables users to access enhancements that will let them develop and operate more accurate virtual worlds. This major update is also set to expand the Omniverse’s capabilities through features such as new connectors, Omniverse Cloud, and Omniverse DeepSearch. More new partners are planning to use NVIDIA Omniverse to streamline their workflows and operations. These include Dentsu International, Zaha Hadid Architects, and Mercedes Benz.

NVIDIA Omniverse ACE - CES 2023
NVIDIA Omniverse ACE

Moreover, this January, NVIDIA opened its early-access program for NVIDIA Omniverse Avatar Cloud Engine (ACE), allowing developers and teams to build interactive avatars and virtual assistants at scale.

Demos of VITURE One XR Glasses and Mobile Dock

Aside from these established tech companies, VITURE, a new XR startup that received accolades from CES, TIME, and Fast Company for its flagship product, the VITURE One XR glasses, also prepared something interesting for CES 2023 attendees.

VITURE One XR glasses and Mobile Dock

The company made both their VITURE One XR glasses, compatible with Steam Deck, laptops, and PCs, and their Mobile Dock, which introduces co-op play and Nintendo Switch compatibility, available for testing.

New Year, New Goals: Hit Your Fitness Goals This Year With Immersive Workouts From FitXR’s New Studios

Towards the end of last year, FitXR launched new music collections with over two dozen new workout classes in Box, HIIT, and Dance. The company is all set to help users meet their fitness goals this year as well with new immersive workout classes.

The VR fitness app has recently launched two new studios, Sculpt and Combat, providing their users with low-impact strength and conditioning workouts for the first time, as well as more options for high-impact exercises. With each of the studios offering distinct training styles, FitXR users gain access to more rounded fitness experiences.

“The introduction of these studios reflects a first-ever moment for VR fitness, expanding what’s possible to bring to a workout,” said FitXR’s Director of Fitness, Kelly Cosentino, in a press release. “We at FitXR pride ourselves on demystifying VR fitness and expanding its reach to the masses, showcasing the true impact it can have to help improve overall health and wellness. We hope that these new studios encourage more people to get moving.”

Low-Impact Burn and Tone in Sculpt Studio

Sculpt studio brings us the first-ever low-impact strength and conditioning immersive workout classes that can be done anywhere at any time. It offers a new and exciting way to do strength training without using weights or straining the joints.

FitXR new studio Sculpt- immersive workout

For high-burn results at low impact, the movements created for this studio mimic those from pilates, barre, and isometric strength training. The exercises are designed to burn, tone, and sculpt while improving balance, strength, and endurance.

Each part of the routine targets a particular area to tone specific muscle groups for a more sculpted form. Done regularly, immersive workout exercises in Sculpt Studio help build strength in arms and legs, tone muscles, and boost mental endurance. Upbeat music makes the workouts more exciting as users pulse and hold along with the fast tempo.

Upon launch, Sculpt Studio started offering eight classes led by FitXR trainers Dillon, Elise, Garret, and Sarah. More supercharged classes will be added regularly to keep users motivated to stay fit.

Powerful Results From High-Impact VR Workouts in Combat Studio

Inspired by martial arts, the new Combat studio gets users in good fighting form. The fierce immersive workouts show us that virtually everything is possible in VR fitness.

FitXR new studio Combat - immersive workout

While it also includes boxing moves, exercises in this new studio are more dynamic than Box workouts. They combine signature moves from different forms of martial arts including karate, taekwondo, Muay Thai, and Brazilian Jiu-Jitsu. Users learn to do elbow strikes, hammer fists, double punches, high blocks, and other techniques. Because the exercise drills include different combos that simulate the unpredictability of real fights, users also develop faster reflexes, better coordination, and improved concentration.

FitXR trainers Dillon and Billy lead the immersive workout exercises in the four new classes. More classes will be launched regularly to keep users on their toes and in perfect fighting form. To make the workouts more exhilarating and stimulating, users are immersed in a modern urban virtual environment resonating with bold pop, hip-hop, and rock playlists.

Immersive Workouts Are Part of the Future of Fitness

With the two new launches, FitXR now has five innovative studios with distinct features—Box, Dance, HIIT, Sculpt, and Combat. Offering different workout styles for different fitness levels, they give users a vast array of exercises to choose from and find the ones that fit their lifestyle, interests, and needs.

Augmented and virtual reality will undoubtedly play a part in the future of fitness. The global success of FitXR shows that people are more than ready to adopt new technologies in their health and fitness regimens. Immersive workouts with personalized routines will make it easier for everyone to stay fit and healthy whatever their lifestyle is.

ONTOP Studios Wants to Bring Theaters Back to Life With XR Esports

Have you ever seen an empty theater and thought that the space had more potential? It doesn’t even have to be an abandoned theater, just a theater with more rooms than it regularly uses, or a theater that isn’t showing movies at all hours of the day. What if those theaters could be used for, say, XR esports? That’s the idea that Nuno Folhadela is exploring with ONTOP Studios.

Meet ONTOP Studios

ONTOP Studios makes AR experiences, filters, and games. Its mission is “turning the world into your playground” through augmented reality. The studio makes independent projects but also works with an impressive list of partners including Vodafone, Samsung, and Snapchat.

ONTOP’s most recent venture, ARcade Sports, involves turning empty theaters into XR esports playgrounds through its social AR games. The idea didn’t come about because Folhadela, the studio’s founder, has anything against movies.

“My background is in cinema, but I’ve been working in games,” Folhadela said in a video call with ARPost. “My interest is always to bring stories into the real world… Going to the cinema isn’t just about movies, it’s about an experience.”

So, why change that experience? The answer, as so many answers do these days, has to do with trends that were already underway before the pandemic caused them to explode.

“After the pandemic, movies really got hit hard. We realized that gaming is what the younger audience is going for,” said Folhadela.

From Movies to Games to Esports

Games are more interactive than movies, but they’re also more social, and both of these elements of storytelling are drawing younger people away from conventional forms of linear narratives, according to Folhadela. But, that’s not the end of the story. Games are more interactive and social than movies, and AR is a more immersive medium than 2D games.

“As a player, [AR] brings everything that we view on a screen into the real world,” said Folhadela. “All of these adventures that you have, are confined onto a flat screen. Now, you can bring all of your adventures with you.”

All of this talk about young audiences doesn’t mean that ONTOP is only interested in kids. Like VR arcades, ONTOP’s theater arenas appeal to visitors of all ages, including entire families.

ONTOP Studios - esports - ARcade Sports - game Morgana

“One man said that it felt like the first time playing with his kids – he was used to them sitting and playing Fortnite and him sitting and watching them,” said Folhadela.

Further, AR gaming can involve a lot of movement. This makes things more exciting for the players, but it also opens up a whole new level of attraction for spectators. At a time when streaming video game playthroughs is already popular, making gaming more human brings a lot of promise by making esports a lot more sporty.

“You really see the players running around so when you see a good player it’s seeing a good athlete. It’s bringing those worlds together,” said Folhadela. “It’s taking the ‘e’ out of ‘esports.’”

Buying Tickets and Paying Bills

So, how does ARcade Sports work? Folhadela describes the esports platform as a “b2b2c” (business to business to consumer) model. ONTOP Studios develops the content and maintains the companion app. Content is then licensed to property managers who promote the availability in their area. Content can even be modified to fit different areas or different business licensees.

“ARcade Sports is a platform, it’s not a game. We are always adding new games and new features,” said Folhadela.

A ticket to play in the XR-enabled esports facility includes a QR code. Scanning the QR code with the companion app lets players enter the same session. The app tracks the players’ performance in the game including their activity levels. Games are tiered based on difficulty, so beginners aren’t left out and veteran gamers don’t get bored.
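As a rough sketch of how a ticket flow like this could work behind the scenes, the QR code might encode a session identifier that the companion app uses to drop the player into the right match. The interfaces, payload format, and endpoint below are illustrative assumptions; ONTOP Studios has not published its API.

```typescript
// Hypothetical sketch of the ticket flow described above: the QR code carries
// a session ID and venue, and scanning it joins the player to that session.
// Names, payload shape, and endpoint are illustrative only.

interface SessionTicket {
  sessionId: string;
  venueId: string;
  tier: "beginner" | "intermediate" | "veteran"; // difficulty tier
}

function parseTicketQr(qrPayload: string): SessionTicket {
  // Assume the QR code encodes a small JSON payload.
  return JSON.parse(qrPayload) as SessionTicket;
}

async function joinSession(qrPayload: string, playerId: string): Promise<void> {
  const ticket = parseTicketQr(qrPayload);
  // Hypothetical endpoint; the real companion app's API is not public.
  await fetch(`https://example.invalid/sessions/${ticket.sessionId}/join`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ playerId, venueId: ticket.venueId, tier: ticket.tier }),
  });
}
```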

esports - ARcade Sports - Morgana game

Right now, ARcade Sports is only available at select locations in Portugal. That’s set to change.

“We launched the games locally to understand the mechanics … for many players, this was their first experience with AR,” said Folhadela. “Now that this is at the right moment, we are hoping to expand to the US this year.”

Coming Soon to a Theater Near You

During our call, Folhadela displayed a number of experimental social features that aren’t yet ready to be fully integrated into the platform. However, hopefully, by the time that ONTOP Studios brings its unique brand of XR esports to the US – ideally this summer – there will be even more to keep gamers entertained, whether they’re playing or watching.

Another CES 2023 Gem: Next-Gen Z-Lens Waveguide Technology by Lumus

Lumus has recently launched its Z-Lens AR architecture, which can help with the development of more compact AR glasses in the near future, thanks to efforts that reduced its micro-projector’s size by 50%.

Making its debut at the Consumer Electronics Show (CES) 2023, the new Z-Lens—which builds on the company’s Maximus 2D reflective waveguide technology—can be fitted with prescription lenses.

Lumus’ Waveguide Technology

According to the company, Lumus is currently the only brand that produces waveguides for outdoor use. Its luminance efficiency is 10 times better than that of Lumus’s competitors. Its design allows for a “true white” background and color uniformity. Moreover, the battery life of its micro-projector is 10 times better than that of other waveguide systems on the market.

The structure of the new Z-Lens gives manufacturers more options regarding where to position the aperture or the opening where the light passes through. Lumus CEO, Ari Grobman, expressed optimism that this flexibility can lead to the creation of less bulky and more “natural-looking” AR eyewear.

“In order for AR glasses to penetrate the consumer market in a meaningful way, they need to be impressive both functionally and aesthetically,” said Grobman in a press release shared with ARPost. “With Z-Lens, we’re aligning form and function, eliminating barriers of entry for the industry, and paving the way for widespread consumer adoption.”

Z-Lens 2D Image Expansion

In AR glasses, the lenses that use Z-Lens reflective waveguides will serve as the “screen” onto which a tiny projector would display the AR image. Lumus’s lenses consist of waveguides or a series of cascading partially reflective mirrors. These mirrors are responsible for 2D expansion, widening the projected image horizontally and vertically.

Lumus Z-Lens new waveguide technology

Maximus’ patented waveguides reflect the light from the projector two times before the light bounces into your eye. The mini-projector—which is hidden in the temple of the eyeglass frame—has two components. First is a microdisplay that produces the virtual image and second is a collimator, which beams the light waves to the waveguide. The mirrors then reflect the light out of the waveguide to the user’s eyes.
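The article stays away from the underlying optics, but a quick worked example may help illustrate why a thin lens can carry an image at all. Light launched into the waveguide is held there by total internal reflection until the partial mirrors couple it out toward the eye. The figure of roughly 1.5 for the refractive index of glass is a generic textbook value, not a Lumus specification.

```latex
% Light stays trapped inside the lens by total internal reflection until a
% partially reflective mirror couples it out toward the eye. Guiding works
% when the angle of incidence exceeds the critical angle:
\[
  \theta_c = \arcsin\!\left(\frac{n_{\text{air}}}{n_{\text{glass}}}\right)
           = \arcsin\!\left(\frac{1}{1.5}\right) \approx 41.8^{\circ}
\]
% Rays hitting the inner surface at more than about 41.8 degrees from the
% normal reflect back into the lens rather than escaping, which is what lets
% a thin piece of glass relay the projected image across to the wearer's eye.
```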

“Our introduction of Maximus 2D reflective waveguide technology two years ago was just the beginning,” said Grobman. “Z-Lens, with all of its improvements unlocks the future of augmented reality that consumers are eagerly waiting for.”

New Z-Lens Standout Features

Lumus’s second-generation Z-Lens boasts a lightweight projector with a 2K by 2K vibrant color resolution and 3K-nit/watt brightness. The latter feature allows users to enjoy AR viewing in daylight or outdoors. Other AR lenses on the market feature sunglass-type tinting on their products to ensure that users can view virtual images. The absence of dark tints allows others to see the user’s eyes as if they’re wearing regular eyeglasses.

The first prototypes of Z-Lens have a 50-degree field of view (FOV). However, the company’s goal is to reach at least 80 degrees FOV in the future.

Z-Lens waveguide technology - Lumus

Here are the other qualities of the Maximus successor:

  • Eliminates ambient light artifacts or small light glares on the optical display that typically occur in AR eyewear.
  • Offers dynamic focal lens integration, which eases vergence-accommodation conflict (VAC). VAC can make images blurry because virtual objects appear closer to the eyes than their actual distance from them.
  • Z-Lens architecture allows for direct bonding of optical elements for prescription glasses.
  • Provides more privacy through light leakage control. Third parties can’t view the displays seen by the wearer. Moreover, users don’t draw attention because Z-Lens doesn’t produce any “eye glow.”

“The Future Is Looking Up”

Waveguides already have practical applications in the military and medical professions, particularly among air force pilots and spinal surgeons. Lumus believes these wearable displays can someday overtake mobile phone screens and laptop monitors as hands-free communication tools.

“AR glasses are poised to transform our society,” Grobman said. “They feature better ergonomics than smartphones, novel interaction opportunities with various environments and businesses, and a much more seamless experience than handheld devices. The future, quite literally, is looking up.”

DigiLens Announces ARGO – Its First Mass Market Product

DigiLens has been making groundbreaking components for a while now. And, last spring, the company released a developers kit – the Design v1. The company has now announced its first made-to-ship product, the ARGO.

A Look at the ARGO

DigiLens is calling ARGO “the future of wearable computing” and “the first purpose-built stand-alone AR/XR device designed for enterprise and industrial-lite workers.” That is to say that the device features a 3D-compatible binocular display, inside-out tracking, and numerous other features that have not widely made their way into the enterprise world in a usable form factor.

ARGO AR glasses by DigiLens

“ARGO will open up the next generation of mobile computing and voice and be the first true AR device to be deployed at mass scale,” DigiLens CEO, Chris Pickett, said in a release shared with ARPost. “By helping people connect and collaborate in the real – not merely virtual – world, ARGO will deliver productivity gains across sectors and improve people’s lives.”

Naturally, ARGO is built around DigiLens crystal waveguide technology resulting in an outdoor-bright display with minimal eye glow and a compact footprint. The glasses also run on a Qualcomm Snapdragon XR2 chip.

Dual tracking cameras enable the device’s spatial computing, while a 48 MP camera allows for capturing records of the real world through photography and live or recorded video. Antennas on either temple of the glasses ensure uninterrupted connectivity through Wi-Fi and Bluetooth.

Voice commands can be picked up even in loud environments thanks to five microphones. The glasses also work via gaze control and a simple but durable wheel and push-button input in the frames themselves.

The DigiLens Operating System

The glasses aren’t just a hardware offering. They also come with “DigiOS” – a collection of optimized APIs built around open-source Android 12.

“You can have the best hardware in the world, hardware is still an adoption barrier, but software is where the magic happens,” DigiLens VP and GM of Product, Nima Shams, said in a phone interview with ARPost. “We almost wanted the system to be smarter than the user and present them with information.”

While not all of those aspirations made it into the current iteration of DigiOS, the operating system custom-tailored to a hands-free interface does have some tricks. These include adjusting the brightness of the display so that it can be visible to the user without entirely washing out their surroundings when they need situational awareness.
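As a minimal sketch of what that kind of brightness logic could look like, assuming a simple linear mapping from ambient light to display brightness: DigiOS is not public, so the function name, thresholds, and mapping below are purely illustrative.

```typescript
// Illustrative ambient-light-driven brightness logic, in the spirit of the
// DigiOS behavior described above. The thresholds and linear mapping are
// assumptions, not DigiLens values.

const MIN_NITS = 50;    // dim enough for low-light, indoor work
const MAX_NITS = 1200;  // bright enough to stay legible outdoors

// Map an ambient light reading (lux) to a target display brightness (nits),
// clamped so the overlay never fully washes out the wearer's surroundings.
function targetBrightness(ambientLux: number): number {
  const clampedLux = Math.min(Math.max(ambientLux, 0), 10_000);
  const fraction = clampedLux / 10_000; // 0 = dark room, 1 = direct sunlight
  return Math.round(MIN_NITS + fraction * (MAX_NITS - MIN_NITS));
}

console.log(targetBrightness(200));    // dim office -> ~73 nits
console.log(targetBrightness(10_000)); // full sun   -> 1200 nits
```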

“This is a big milestone for DigiLens at a very high level. We have always been a component manufacturer,” said Shams. “At the same time, we want to push the market and meet the market and it seems like the market is kind of open and waiting.”

A Brief Look Back

ARPost readers have been getting to know DigiLens for the last four years as a component manufacturer, specifically making display components. Last spring, the company released Design v1. The heavily modular developers kit was not widely available, though, according to Shams, the kit heavily influenced the ARGO.

“What we learned from Design v1 was that there wasn’t a projector module that we could use,” said Shams. “We designed our own light LED projector. … It was direct feedback from the Design v1.”

A lot of software cues in the ARGO also came from lessons learned with Design v1. The headset helped pave the way for DigiOS.

DigiLens ARGO AR glasses

“Design v1 was the first time that we built a Qualcomm XR2 system, and ARGO uses the same system,” said Shams.

Of course, the Design v1 was largely a technology showcase and a lot of its highly experimental features were never intended to make it into a mass-market product. For example, the ARGO is not the highly individualized modular device that the Design v1 is.

The Future of DigiLens

DigiLens still is, and will continue to be, a components company first and foremost. Their relationship with enterprise led the company to believe that it is singularly situated to deliver a product that industries need and haven’t yet had an answer for.

“I’ve seen some things from CES coming out of our peers that are very slim and very sexy but they’re viewers,” said Shams. “They don’t have inside-out tracking or binocular outdoor-bright displays.”

With all of this talk about mass adoption and the excitement of the company’s first marketed product, I had to ask Shams whether the company had aspirations for an eventual consumer model.

“Our official answer is ‘no,’” said Shams. “Companies like the Samsungs and the Apples of the world all believe that glasses will replace the smartphone and we want to make sure that DigiLens components are in those glasses.”

In fact, in the first week of January, DigiLens announced a partnership with OMNIVISION to “collaborate on developing new consumer AR/VR/XR product solutions.”

“Since XR involves multiple senses such as touch, vision, hearing, and smell, it has potential use cases in a huge variety of fields, such as healthcare, education, engineering, and more,” Devang Patel, OMNIVISION Marketing Director for the IoT and Emerging Segment said in a release. “That’s why our partnership with DigiLens is so exciting and important.” 

Something We Look Forward to Looking Through

The price and shipping date for ARGO aren’t yet public, but interested companies can reach out to DigiLens directly. We look forward to seeing use cases come out of the industry once the glasses have had time to find their way to the workers of the world.

New Waveguide Tech From VividQ and Dispelix Promises New Era in AR

Holograms have been largely deemed impossible. However, “possible” and “impossible” are constantly shifting landscapes in immersive technology. Dispelix and VividQ have reportedly achieved holographic displays through a new waveguide device. And the companies are bringing these displays to consumers.

A Little Background

“Hologram” is a term often used in technology because it’s one that people are familiar with from science fiction. However, science fiction is almost exclusively the realm in which holograms reside. Holograms are three-dimensional images. Not an image that appears three-dimensional, but an image that actually has height, width, and depth.

These days, people are increasingly familiar with augmented reality through “passthrough.” In this method, a VR headset records your surroundings and you view a live feed of that recording augmented with digital effects. The image is still flat. Through techno-wizardry, digital elements may appear to occupy different spaces or have different depths, but they don’t.

AR glasses typically use a combination of waveguide lenses and a tiny projector called a light engine. The light engine projects digital effects onto the waveguide, which the wearer looks through. This means lighter displays that don’t rely on camera resolution for a good user experience.

Most waveguide AR devices still reproduce a flat image. These devices, typically used for virtual screens or screen mirroring from a paired device, often include spatial controls like ray casting but are arguably not “true” augmented reality and are sometimes referred to as “viewers” rather than “AR glasses.”

Some high-end waveguide headsets – almost exclusively used in enterprise and defense – achieve more immersive AR, but the virtual elements are still on a single focal plane. This limits immersion and can contribute to the feelings of sickness felt by some XR users. These devices also have a much larger form factor.

These are the issues addressed by the new technology from Dispelix and VividQ. And their material specifically mentions addressing these issues for consumer use cases like gaming.

Bringing Variable-Depth 3D Content to AR

Working together, VividQ and Dispelix have developed a “waveguide combiner” that is able to “accurately display simultaneous variable-depth 3D content within a user’s environment” in a usable form factor. This reportedly increases user comfort as well as immersion.

“Variable-depth 3D content” means that users can place virtual objects in their environment and interact with them naturally. That is opposed to needing to work around the virtual object rather than with it because the virtual object is displayed on a fixed focal plane.
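As a rough illustration of why a single fixed focal plane is limiting, the angle through which the two eyes must converge depends on how far away an object appears. This is a standard geometric relation rather than anything taken from VividQ’s or Dispelix’s materials, and the 63 mm interpupillary distance is simply a typical value.

```latex
% Vergence angle for an object at apparent distance d, with interpupillary
% distance (IPD) p, from simple geometry:
\[
  \theta(d) = 2\arctan\!\left(\frac{p}{2d}\right)
\]
% With p = 63 mm: an object at 0.5 m calls for roughly 7.2 degrees of
% vergence, while one at 2 m needs only about 1.8 degrees. On a fixed focal
% plane, the eyes keep focusing at that one plane even as vergence changes
% with apparent depth; variable-depth content removes that mismatch.
```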

VividQ 3D waveguide

“A fundamental issue has always been the complexity of displaying 3D images placed in the real world with a decent field of view and with an eyebox that is large enough to accommodate a wide range of IPDs [interpupillary distances], all encased within a lightweight lens,” VividQ CEO, Darran Milne, said in a release shared with ARPost. “We’ve solved that problem.”

VividQ and Dispelix have not only developed this technology but have also formed a commercial partnership to bring it to market and bring it to mass production. The physical device is designed to work with VividQ’s software, compatible with major game engines including Unity and Unreal Engine.

“Wearable AR devices have huge potential all around the world. For applications such as gaming and professional use, where the user needs to be immersed for long periods of time, it is vital that content is true 3D and placed within the user’s environment,” Dispelix CEO and co-founder, Antti Sunnari, said in the release. “We are thrilled to be working with VividQ.”

When Waveguides Feel Like a Mirage

Both companies have been building toward this breakthrough for a long time. Virtually every time that ARPost has covered Dispelix, it has at least touched on a partnership with another company, which is typical for a components manufacturer. New product announcements are comparatively rare and are always the result of lots of hard work.

“The ability to display 3D images through a waveguide is a widely known barrier to [a compelling AR wearable device],” VividQ Head of Research, Alfred Newman, said in an email. “To realize the full capability, we needed to work with a partner capable of developing something that worked with our exact specifications.”

Of course, those who have been following immersive tech for a while will understand that a long time working hard to achieve a breakthrough means that that breakthrough reaching the public will require working hard for a long time. Devices using this groundbreaking technology might not reach shelves for a few more calendar pages. Again, Newman explains:

“We license the technology stack to device manufacturers and support them as they develop their products so the timeframe for launching devices is dependent on their product development. …Typically, new products take about two to three years to develop, manufacture, and launch, so we expect a similar time frame until consumers can pick a device off the shelf.”

Don’t Let the Perfect Be the Enemy of the Good

Waiting for the hardware to improve is a classic mass adoption trope, particularly in the consumer space. If you’re reading that you have to wait two to three years for impactful AR, you may have missed the message.

There are a lot of quality hardware and experience options in the AR space already – many of those already enabled by Dispelix and VividQ. If you want natural, immersive, real 3D waveguides, wait two or three years. If you want to experience AR today, you have options in already-available waveguide AR glasses or via passthrough on VR headsets.
