

HomeAR Geolocates Virtual Homes, New Metrics for Developers

We first met homeAR in March. The solution for homebuilders and their clients creates virtual models of homes that are visible on-site or in a “dollhouse mode” from anywhere. So, what’s next? Entire AR communities? Actually, yes. Other recent updates to the platform include an “Always-On” feature and more compatibility with other applications.

Welcome homeAR

HomeAR has been around for quite a while now, but it has only existed in its current iteration for the last few years, after CEO Richard Penny was inspired by his own experience of having a house built.

Always-On homeAR

The last time that we checked in with homeAR, prospective homeowners could see the AR model of their home on-site or wherever they happened to be. Either option made it easier for them to envision their future dwelling and to work with contractors to make sure that everything went according to plan (or to change the plan).

“When a person is using this and expecting it to behave like a house, we want to make it usable so people aren’t just interacting with a 3D widget, they’re interacting with a house,” Penny told ARPost at the time.

The application was good for exactly that. People having a house built could view their home virtually before ground was even broken on their property. But, not all houses are custom-built by a property owner working with a private contractor. What about people looking to move into a new housing complex or subdivision? That’s where some of the new features come in.

Your Next Home Hasn’t Been Built Yet

HomeAR has been rolling out a bunch of new features, but one of the most exciting is the Always-On feature. Builders can import their CAD models to the homeAR backend and then associate the model with a QR code on-site. Visitors to the site can then scan the QR code to launch the experience.
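For readers curious about the mechanics, here is a rough, purely illustrative sketch of what associating an imported model with a QR code and a geographic anchor might look like. Every name, field, and coordinate below is our assumption, not homeAR’s actual backend or API.

```python
# Hypothetical sketch: one QR code per lot, each resolving to a geolocated model.
from dataclasses import dataclass

@dataclass
class SiteAnchor:
    model_url: str      # the builder's imported CAD model, converted for AR viewing
    latitude: float     # where the finished home will stand
    longitude: float
    heading_deg: float  # how the model is oriented on the lot

qr_registry = {
    "LOT-07": SiteAnchor("https://example.com/models/lot07.glb", -36.8485, 174.7633, 90.0),
}

def resolve_qr(payload: str) -> SiteAnchor:
    """Scanning a code on-site looks up its anchor so the experience can launch there."""
    return qr_registry[payload]
```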


That experience consists of virtual houses pinned to their future locations in the physical world. Potentially replacing a single model home and an artist’s 2D rendering of the building site, this experience allows visitors to envision an entire unbuilt community in the physical environment around them.

“Being able to take buyers on a journey where they can experience not only an individual home, but a whole community, is hugely powerful for both parties,” Penny said in a release shared with ARPost. “This Always-On technology provides a glimpse of the future at the site of the build and is the perfect tool to help someone imagine what lies in store for them.”

This tool doesn’t only provide information to visitors, it also provides information to developers. Metrics gathered from users interacting with the virtual development help project managers understand how potential residents are exploring the site.

What’s Next?

This isn’t the end of homeAR. Some of the features that Penny told us to expect in the future still aren’t here – like spatially anchored notes within the virtual model homes, and recording video within the app. We aren’t sure when to expect these features, but it’s nice to see that the company isn’t standing still.



Redefining Immersive Virtual Experiences With Embodied Audio

EDGE Sound Research is pioneering “embodied audio,” a new technology that changes the way we experience virtual reality. When we think of “virtual reality,” the focus seems to be only on engaging our sense of sight. EDGE Sound Research’s embodied audio will revolutionize how we experience audio in VR worlds through its use of audible and tactile frequencies.

One of the things that sets this technology apart is that it stems from co-founder Ethan Castro’s own experience. Castro had hearing issues and, as a result, came to rely on feeling sound as much as hearing it. Even so, Castro loved music and went on to become a professional audio engineer and composer, researching how sound can be perceived by combining hearing and feeling. Eventually, he teamed up with co-founder Val Salomaki to start EDGE Sound Research.

Bringing Embodied Audio to Life

Embodied audio adds realism to sound. This groundbreaking technology combines the auditory and physical sensations of sound in an “optimized and singular embodiment.”

“This means a user can enjoy every frequency range they can hear (acoustic audio) and feel (haptic and tactile audio, also known as physical audio),” said Castro and Salomaki.

Castro and Salomaki go on to explain that they invented a new patent-pending technology for embodied audio, which they dubbed ResonX™. This new technology, which has been nominated for the CES Innovation Award, can transform any physical space or environment into an embodied audio experience, reproducing an expansive range of physical (7-5,000+ Hz) and acoustic (80-17,000 Hz) audio frequencies.
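To make the idea of sound you can both hear and feel a bit more concrete, here is a minimal sketch of a crossover split in Python. This is not EDGE Sound Research’s implementation – the physical and acoustic ranges quoted above overlap, so ResonX™ clearly does more than a naive low-pass/high-pass split – but it illustrates the basic notion of routing one signal to a tactile transducer and to a conventional speaker. The sample rate and crossover point are assumptions.

```python
# Illustrative only: split one mono signal into a band you feel and a band you hear.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48_000  # sample rate in Hz (assumed)

def split_embodied_audio(signal, crossover_hz=80.0, order=4):
    """Return (tactile, acoustic) components of a mono signal."""
    low = butter(order, crossover_hz, btype="lowpass", fs=FS, output="sos")
    high = butter(order, crossover_hz, btype="highpass", fs=FS, output="sos")
    tactile = sosfilt(low, signal)    # sent to a transducer mounted on the surface
    acoustic = sosfilt(high, signal)  # sent to an ordinary speaker
    return tactile, acoustic

# Quick check with a 40 Hz rumble plus a 1 kHz tone.
t = np.linspace(0, 1, FS, endpoint=False)
tone = np.sin(2 * np.pi * 40 * t) + np.sin(2 * np.pi * 1000 * t)
tactile, acoustic = split_embodied_audio(tone)
```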

Crafting New Experiences With the ResonX™ System

“The ResonX™ system is a combination of hardware and software. A user places the ResonX™ Core (hardware component) on the surface of a material and the ResonX™ software calibrates the surface of the material to resonate reliable hi-fidelity sound that the user can hear and feel,” said Castro and Salomaki.


For example, when someone uses the ResonX™ system at home, they can attach the ResonX™ Core to their couch, effectively turning it into an embodied audio surface. Then, when they sit on the couch to watch their favorite show, say a basketball game, they will feel as if they’re there in person. Users can hear every single sound, from the ball being dribbled down to subtler ones like the squeak of sneakers.

According to Castro and Salomaki, if a user wants to take their movie-viewing experience to the next level, here’s what they can do:

“An individual can attach the ResonX™ to flooring and then be fully immersed in walking around a new planet by hearing and feeling every moment to make the experience feel life-like.”

Aside from enriching users’ experiences in the metaverse, this new technology finally enables us to engage our other senses, thus adding a new dimension to how we experience music, games, live entertainment, and more.


“This opens the door to new possibilities in storytelling and connectivity around the world as an experience can now begin to blur what is real because of three senses simultaneously informing a user that a moment is happening. Not as an effect, but as an embodied reality,” shared the EDGE Sound Research co-founders.

Embracing Innovation in the VR Space

With ResonX™ and its ability to bring embodied audio to life, users can now have richer experiences in virtual worlds. Not only will they be engaging their sense of sight, but they’ll also get the opportunity to experience these virtual worlds through their senses of hearing and touch. Now, users have the chance to transform their physical environment into a cohesive sound system.

The good news is, users can enjoy the embodied audio experience in many public venues. According to Castro and Salomaki, they’ve already deployed the ResonX™ in various sports stadiums, bars, and art installations. Furthermore, if you want to bring home the ResonX™ experience, you can get in touch with EDGE Sound Research for a custom installation.

What will embodied audio look like in the future?

It’s likely going to become more widely accessible. “Over time, we will release a more widely available consumer version of the ResonX™ system that will make this ResonX™ technology more accessible to all,” said Castro and Salomaki.



Among Us VR Review on Quest 2

Among Us has been a hit game for a while now. Among Us VR is a more recent phenomenon. Get your tasklist ready, memorize the map, warm up your button-smacking hand, and trust no one as we pilot the Skeld II through a trial run.

A Brief History of Among Us VR

Game studio Innersloth released the original Among Us in 2018 as a free app game with some optional paid customization features. The immensely social game sees a team of travelers piloting a ship (the Skeld II) through space only to find that some among them are “Impostors” who sabotage the ship and kill crewmates.


Playing as an Impostor, players try to blend in with the crewmates while destroying the ship’s functions and/or murdering enough of the real team to take over. Playing as a crewmate, players need to keep the ship flying and stay alive long enough to determine which of them are Impostors.

Among Us VR is no simple port (though unofficial attempts in the form of amateur mods on social VR platforms have existed for some time). To make the game immersive, Innersloth partnered with XR game studio Schell Games, known for titles like Until You Fall and the I Expect You to Die series.

Announced at Meta Connect and launched the following month, Among Us VR is currently available for $10 on SteamVR and the Quest Store. For this review, I played on my Quest 2.

Among Us VR is meant for players 13 and up. Violence is cartoony but graphic and inescapable. The title is also necessarily social so the effort at protecting young players is nice, even though it doesn’t work at all ever. The gameplay is complex when executed correctly by mature players – and equally complex when operated in ways a mature player doesn’t expect.

Navigating Menus, the Tutorial, and Online Gameplay

When first booting up Among Us VR, players are prompted to enter a birth date. You can lock this in on your headset, or you can choose to require a birth-date entry on each boot. That means that to play the game you have to be at least old enough to know how to lie about your birthday. The title is intended for players 13 and up. It is played by children 6 and up.

The main menu is simple. Your standard settings options are there, as are your customization options. Change the color of your crewmate and trick it out with little hats. Some hats are free, and some hat packs are available for purchase. (Yes, I do have the tiny crewmate hat only available to people who pre-ordered the game.)


The two largest buttons that dominate the main menu are to play online and learn how to play. The learn-how-to-play option is an offline tutorial that takes you through several aspects of gameplay without other users running around murdering you.

Learning the Ropes

Because the tutorial is representative of so many aspects of gameplay and to respect the privacy of online players, all of the screenshots in this review were taken in the offline tutorial or provided by Schell Games.

The tutorial takes you through life as a crewmate, solving tasks, pushing buttons, reporting bodies, and getting murdered. Then, you experience the afterlife (dead crewmates can’t vote, communicate with the living, or report bodies, but they can still complete tasks). You also get to play as the Impostor, climbing through vents, sabotaging the ship, and killing crewmates.


Unfortunately, the tutorial is limited to two rooms on a fairly large map. It also doesn’t include all of the tasks that you’ll need to complete when playing a full game. However, it’s still a nice introduction.

The controls are smooth. All of the tasks could theoretically be hand-tracked, but movement is controlled with the controllers, so they’re a must-have. There’s also a button to bring up the ship map and do some other basic commands. The controller layout isn’t overly complex or challenging, and all major controls are spelled out in the tutorial and are changeable in settings.


Movement is smooth, and your view goes into a sort of tunnel vision while you’re moving to prevent motion sickness. If you’ve read my reviews before, you know I can get motion sickness pretty bad pretty quick, but I find Among Us VR to be pretty comfortable. Also, because everything is controller-based, you can play sitting down.

Taking It Online

There are two main options for playing Among Us VR online, one for smaller and shorter games, and one for longer and more populated matches. A shorter game might only have five players including one Impostor, while the longer games have more crewmates and more Impostors. Other than that, the gameplay is the same.

There’s no formal breakdown of how a game plays out in terms of round length or anything like that. But, there is a sort of structure. Here’s how it plays out, as I understand it:

The Impostors can murder one crewmate and sabotage one ship component per round. A round culminates in an “emergency meeting” called when a body is discovered. All of the players converge on the cafeteria to try to decide who the Impostors are, followed by a round of voting, during which the players vote out one player – who may or may not be the Impostor.


There are a few gameplay elements that make things a little trickier. For example, Impostors can still fix sabotages and report bodies. This helps them make it look like they’re really part of the team. Further, fixing sabotages usually requires standing still and facing a wall for a few seconds – a prime opportunity to get murdered by an Impostor.

Now, About My Crewmates…

The first time that I played Among Us VR, I was definitely the oldest person on deck by probably twenty years. I’m no autumn rooster, but I was definitely surrounded by spring chicks.

When this eventually became apparent, I became an immediate subject of suspicion. I felt a bit like Robin Williams in Hook when the Lost Boys rally against the only adult on their island in Neverland. I managed to survive the game, but only to see the Impostor take the ship. I wonder if this dynamic didn’t make things more interesting.

One crewmate shouted so loudly and so consistently during the first round of voting that he knew who the Impostor was that the rest of us assumed he was only trying to deflect suspicion from himself. We voted him off immediately, only to find at the end of the game that he had been telling the truth.


I’ve been writing about VR since this particular Impostor was eating dirt in daycare. But Among Us doesn’t care. That’s part of the beauty of the game. I chose not to trust my crewmate. Sure he was young, and sure he got a crewmate to change color in the lobby because “nobody likes purple” but – when push came to shove – I underestimated him and it cost us the ship.

If you would rather play Among Us VR with adults, I have a sneaking suspicion that younger players favor shorter matches. I’m sure that the time of day that you play makes a big difference too. But, we’ve already seen how well I understand children.

Fun for (Almost) Any Age

All things considered, Among Us VR is great fun at a great price. So what, there are grade schoolers online? The game is VR, but it’s also a game with simple mechanics built on a social framework. Maybe in an update developers should acknowledge the “age problem” and have separate lobbies for different ages. In the meantime, grow up and play your little video game.



OVER’s Map2Earn Beta Program to Make the Creation of 3D World Maps More Accessible

OVER’s Map2Earn Beta program makes creating 3D world maps more accessible to users with smartphones. Using the OVER app, they can now take photos of any physical location and generate an OVRMap. As a result, they can take part in OVER’s global mapping initiative, while also getting the opportunity to earn rewards.

Paving the Way for Richer AR Experiences

According to OVER, Map2Earn is the company’s “biggest project yet.” It introduces more accurate data collection and takes geolocalization capabilities to the next level.

The accuracy of GPS systems in outdoor spaces is limited to about 6 meters. With Map2Earn Beta, however, OVER has improved localization accuracy to 20 centimeters.

This broadens the horizon for newer and more immersive AR experiences and use cases. For instance, AR experiences can be superimposed on existing real-world structures, and assets can be accurately geolocated in indoor and multi-floor settings.
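A quick back-of-envelope calculation – ours, not OVER’s algorithm – shows why that jump from meters to centimeters matters when content is pinned to real structures.

```python
# How far off-target an anchored virtual object appears, given localization error.
import math

def apparent_offset_deg(position_error_m: float, viewing_distance_m: float) -> float:
    """Angular offset of an AR overlay caused by localization error."""
    return math.degrees(math.atan2(position_error_m, viewing_distance_m))

for error_m in (6.0, 0.2):  # roughly GPS accuracy vs the 20 cm quoted for Map2Earn
    offset = apparent_offset_deg(error_m, 20.0)
    print(f"{error_m} m of error at 20 m viewing distance: about {offset:.1f} degrees off target")
```

At a 20-meter viewing distance, 6 meters of error pushes an overlay roughly 17 degrees away from where it belongs, while 20 centimeters keeps it within about half a degree.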

“When we think about the future of AR, we imagine an augmented world with contextualized 3D experiences that seamlessly merge with the physical world,” said Diego Di Tommaso, COO and co-founder of OVER, in a press release shared with ARPost. “In order to achieve this, we need a system to precisely locate the observer in space – we need geolocalization accuracy. That’s why we built Map2Earn, a system to precisely relocate you in space using computer vision that goes far beyond what can be achieved with GPS only. Such a system will finally enable the creation of the AR use cases that, as of right now, we can only just dream about.”

Its Alpha testing phase, which involved more than 1,200 maps created by 300 early users, was a success. Now, the Beta version is accessible to all users via the OVER app (available for both iOS and Android).

How the Map2Earn Beta Program Works

The Map2Earn Beta program is backed by an intuitive UX that guides the users through the capture process. Users, or – as OVER calls them – the mappers, will film each OVRLand location to generate three assets:

  • The location’s 3D point cloud, which delivers a precise visual reference of the location the user wants to augment through AR;
  • Relocalization algorithms, which use the point cloud to locate the observer and create a more immersive AR experience;
  • A NeRF (neural radiance fields)-generated digital twin of the mapped area, which creates a simulation of the mapped location.

The user-creator will own the 3D structure of the locations that they’ve mapped using their smartphone. As of this writing, users can view the digital twin via a virtual drone fly-through. However, OVER will be working toward making these locations freely explorable.
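As a purely illustrative sketch – not OVER’s actual data model – the three assets above can be thought of as a single record tied to the mapper who captured it. Every field name here is hypothetical.

```python
# Hypothetical representation of one mapped OVRLand location and its three assets.
from dataclasses import dataclass

@dataclass
class OVRMapAssets:
    point_cloud: str          # the location's 3D point cloud
    relocalization_data: str  # what the relocalization algorithms use to place the observer
    nerf_digital_twin: str    # NeRF-generated digital twin of the mapped area
    mapper: str               # the user-creator who owns the 3D structure

city_square = OVRMapAssets(
    point_cloud="maps/city-square/points.ply",
    relocalization_data="maps/city-square/reloc.bin",
    nerf_digital_twin="maps/city-square/twin.glb",
    mapper="mapper_1234",
)
```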

A New Way to Explore the Metaverse

With the Map2Earn Beta program, OVER and its users can build up-to-date, Web3-based 3D maps of significant locations. Also, the scope of the program covers both indoor and outdoor spaces.

“OVER’s vision is to create a Web3-based, community-owned, up-to-date 3D map of the most important indoor and outdoor locations in the world – the likes of which we have not seen before,” said Di Tommaso. “OVRMaps have a fundamental importance for AR. They are the portal to the AR metaverse, without which there is no way to reliably and coherently augment the physical world.”

The Map2Earn production release is scheduled for late January, when users will be able to mint their 3D maps as NFTs. Then, they can trade these assets via the OVER marketplace, as well as other decentralized marketplaces, such as OpenSea. This is one of the ways through which users can earn.

In the future, OVER will be launching a direct incentivization program. Through this, users will be able to access the so-called open-to-buy orders and acquire the maps of their desired locations.

To access the Beta program, download the OVER app on Google Play or the App Store and follow the directions, which you can find under “Map2Earn”.



UC Berkeley Releases Report on Safety in Social VR

We’d all like to be safe in social VR experiences. Barring the human race one day waking up and unanimously deciding to be decent to one another, how might this future come about? One potential solution is robust, clear, accessible community guidelines from platforms. But, what might those look like?

VR researcher Rafi Lazerson recently published a paper with the University of California, Berkeley’s Center for Long-Term Cybersecurity, titled “A Secure and Equitable Metaverse: Designing Effective Community Guidelines for Social VR.” The paper breaks down what harms can look like in social VR environments, as well as what shape community guidelines for those environments should take to prevent and address those harms.

Learning From the Past?

The paper’s introduction presents a provocative question:

“Will social VR platforms proactively develop clear community guidelines at this early stage of user adoption, or will their process follow the slow, opaque, and reactive trajectories that were typical of 2D social media platforms?”


The paper draws on industry and academic research, academic literature, media reports, and the existing community guidelines of both 2D and VR social platforms. It takes a particularly close look at Meta’s guidelines for both its 2D and its social VR experiences. This is a handy example but it also comes with a message to Meta:

“Well-funded corporations have a disproportionate impact on the formation of the metaverse and on norms within social VR, and therefore have a responsibility to lead the industry in developing responsible policies and practices.”

Harms in Social VR

“Embodiment removes the sense of separation and distinction between the user and the avatar, contributing to interactions between the users that feel real and present,” wrote Lazerson. “To the user, any VR world, even the fantastical, can feel real and present due to avatar embodiment, world-immersion, and synchronous conduct-based interactions.”

This won’t present an entirely new idea to most readers, but it is central to this work in particular and to this whole body of work. It means that misconduct can be more difficult to identify because it might not be recorded in the way that most social media interactions are. It also means that the interactions are worth taking seriously even though they happen in a “game.”

“Experiences of harassment in VR have been described as comparable to in-person harassment,” wrote Lazerson. “As haptic gloves, suits, and other VR immersion hardware become a common part of VR use, experiences of harassment may feel increasingly indistinguishable from in-person harassment.”

What’s more, many of the forms of harassment that we know and hate from traditional social media – based on race, religion, gender, and other factors – are already being reported in VR, to such a degree that some users report hiding aspects of their identities in order to avoid it. The problem compounds as immersive tech is increasingly used for work and wellness.

“The inability of some users to present as themselves in or even enter into social VR without fear could have severe health and economic ramifications,” wrote Lazerson.

So, how do we preserve these environments as safe spaces for everyone?

Effective Community Guidelines

According to Lazerson, effective community safety practices consist of three main components:

  • Policy
    • External communication of expectations to users;
    • Internal communication of policies to moderators;
  • Product
    • User tools;
    • Moderator tools;
    • Educational tools;
    • Invisible safety tools (age-gating of select experiences, etc.).
  • Operations
    • The means by which policy is enforced through product.

It can be difficult for anyone other than a platform maintainer or moderator to see all of these pieces working together or to gauge how effective they are. However, one item on that list, outward-facing policy, is easy to see. So, how do Meta’s community guidelines work as a model for social VR experiences generally?

Are Meta’s Social VR Guidelines Sufficient?

A theme throughout this paper – and the realm of metaverse safety generally – is that immersive platforms can learn from conventional social media while recognizing that immersive content is different and accommodating those differences. According to Lazerson, one of Meta’s biggest problems may be that its immersive policies don’t come with Facebook’s existing safeguards.

“There is no single list of public-facing community guidelines for users to follow in Meta’s social VR. There are at least two, perhaps three: the Horizon Policy, the Conduct in VR Policy, and possibly the Facebook Community Standards,” Lazerson wrote. “There is a significant amount of ambiguity regarding where each of the aforementioned Meta community guidelines applies in VR.”

Lazerson isn’t only here to criticize. He closes the paper with recommendations to all immersive platforms. These include accessible, transparent, specific, and comprehensive guidelines using existing social media guidelines as a “baseline.” He also recommends platforms work with each other on policy “to ensure that no forms of harm are overlooked.”

For the complete recommendations, the full report is available online for free.

Be Good

It’s unfortunate that we need things like community guidelines – long documents detailing ways that people aren’t allowed to be mean to each other. Ideally, immersive worlds would play out like the world that we live in – in which most people are just impulsively decent to one another.

But, for whatever reason, social VR can be an uncomfortable space. It also has great beauty and can be a place where we share knowledge, experience, and just have fun. It’s up to each of us to help bring out the best in this medium. So read the guidelines, follow them, encourage others to follow them, and encourage your favorite platform to use them well.



Perkins Coie Releases 6th Annual Industry Report on Immersive Technology

The sixth annual Perkins Coie XR Report is out. The theme: “The Rise of Web3 Technologies to Accelerate XR.” While some people are skeptical of some of the goings-on in Web3, new ideas about connecting people are putting immersive technology into a context that highlights its value.

Is Web3 the Same as the Metaverse?

So far, ARPost has been largely silent on the idea of “Web3” because it brings in a lot of ideas that aren’t really central to immersive technology while not necessarily bringing in the immersive technology itself. For a quick and dirty shortcut, some people define the metaverse as “the next generation of the internet” and that’s exactly what Web3 means.

However, the metaverse movement is more focused on spatial computing, while the Web3 movement is more focused on the mechanics of publication and ownership. So, the two aren’t necessarily mutually exclusive, but they are also not exactly the same thing. The report’s executive statement addressed this issue in terms of how Web3 both benefits and potentially harms XR:

“Accelerated by . . . the emergence of NextGen technologies like Web3 and the metaverse, XR has hit the mainstream. Yet, new audiences, technologies, and products bring new challenges. When we add in the economic volatility that has at least temporarily affected many in the tech industry, the question becomes: What does the future have in store for XR?”

Overall, almost all of the respondents (significantly more than last year) expect to see growth in the immersive technology market, but the growth that they expect to see is more modest. Just under half of all respondents expect widespread adoption of the metaverse and Web3 in the next five years.

Insights on Investment

More people talking at a more reasonable volume was a trend in this year’s report. For example, 70% of respondents said that they would increase spending on XR for remote collaboration and training “to a moderate extent.” Last year, 51% said that they would increase this category of spending “to a large extent.”

Another 36% of respondents said that they expected the pace of investment to be “slightly higher” this year, while 32% expected that it would stay the same. This is significantly lower than the numbers for a similar question in last year’s survey.

Given otherwise growing positive sentiment, this might be less a reflection on immersive technology and more a reflection on the current state of the economy. Not only is the enterprise world increasingly returning to something like normal in terms of how internal communication happens, but rising costs in other sectors may lead to decreasing experimental budgets.

There are also remaining barriers to adoption of immersive technology.

Barriers to Adopting Immersive Technology

“Roughly half of respondents named user experience (e.g. bulky headwear and technical glitches) and content offering (e.g. lack of quality content) as barriers to mass adoption,” says the report. “Respondents expressed less concern on these fronts than they did in 2021, indicating that they perceive that the industry is making substantial progress.”

Respondents also said that improving data security, improving infrastructure, and improving affordability could all help to attract more consumers. Respondents also revisited one of the most interesting portions of last year’s report: the question of whether consumers or developers have a better understanding of compelling content when it comes to immersive technology.

This year, 43% of respondents agreed or strongly agreed that developers “do not yet understand what makes compelling content from a consumer standpoint” while 46% of respondents agreed or strongly agreed that consumers don’t understand where or how to find compelling content. Both of these numbers are down from last year.

“Respondents’ top recommendations [for improving immersive technology content] were the same as in 2021: Produce more interactive and immersive content (52%), as well as content that is compatible across platforms (49%).”

Who Benefits (Really)?

One of the biggest questions around emerging technology is who stands to benefit the most. At least right now, when we’re still early and the cost of buying in is relatively high, it stands to reason that “high-income individuals” are most likely to benefit. This was the top answer (58%) when respondents were polled about who benefits the most from XR and NextGen technology.

While a handful of tech companies are wooing consumers, a great deal of the energy in the space is directed toward enterprise. Desk workers are having meetings in VR, and deskless workers are benefiting from AR-enabled remote assistance. So, it may not be surprising that half of the respondents listed working professionals as among those most likely to benefit.

So, we have an understanding of who is benefiting. But, who is really benefiting? Many in the emerging technology space are concerned that the market may be perpetuating some classic workplace problems, like the exclusion of women and marginalized racial or ethnic groups. Fortunately, this doesn’t seem to be the case.

Just over half of the 150 respondents identified as being from minority/female-owned organizations. Of those respondents, nearly 80% agreed or strongly agreed that “funding for such founders is proportional to their white male peers.”

More People Talking

For much of XR’s history, a lot of the most meaningful sentiment has come from a fairly small number of people with particularly strong feelings.

One of the key takeaways of this report was that, while the immersive technology space still has its zealots, more people are warming up to the technology. This is potentially a huge sign of market maturity, even if it’s not the most exciting headline.



Metaverse Safety Week 2022

Last week saw the XR Safety Initiative’s third annual Metaverse Safety Week (formerly known as XR Safety Week). As ever, the week started on International Human Rights Day, December 10, and was followed by five days packed full of panels and talks, with each day having a different theme relating to safety in immersive environments.

We couldn’t attend all of the Metaverse Safety Week sessions live or watch the feeds fast enough to keep up with everything. But, this article presents some highlights of the event.

Kavya Pearlman’s Welcome to Metaverse Safety Week

Of course, the event started with an introduction by XRSI founder Kavya Pearlman. Pearlman addressed the audience through her AltspaceVR avatar in the virtual model of the Taj Mahal custom-made for the event by the design studio Chicken Waffle.

“On the seventy-fourth Human Rights Day, we’ve assembled world leaders, technology professionals, global regulators, policy experts, ethicists, ethics organizations, researchers, and under-represented voices,” said Pearlman. “Metaverse Safety Week is so important as a point of reflection for all of us. We’re bringing the world together to safeguard the metaverse.”

The “Pong” Version of the Metaverse

The first topic of discussion was Dr. Louis Rosenberg’s video on the metaverse for More Perfect Union. Following the video, Rosenberg appeared on stage to further address some of the key topics in metaverse safety.

“When thinking about human rights for the metaverse, I think it’s useful to just set the context a little bit into the future – think about the 2030s. I say that because the work we do today is really preparing us for the future,” said Rosenberg. “Today is really the ‘Pong’ version of the metaverse.”

A little later in the day, HTC’s China President and Global VP of Corporate Development, Alvin Wang Graylin, took the stage. Graylin’s presentation focused on lessons learned from the development of immersive technology so far. According to Graylin, today’s excitement is the result of related technologies developing together – a condition that creates opportunities for both good and bad.

“As an industry, we need to make sure to educate the world as well as watch out for bad actors and keep them from doing the wrong things,” said Graylin. “Instead of trying to create a closed world and take value off the table, we really need to work together to create open worlds that are interoperable.”

This same day saw the announcement of the XR 2030 Policy Fund from the Minderoo Foundation. The program “will award funding to researchers and civil society leaders who are advancing the next generation of digital media ecosystems to prioritize public interest values.”

Building a “Super World”

Day two opened with a discussion led by SuperWorld’s Hrish Lotlikar. Lotlikar told ARPost last summer that he wants the platform to be “the gateway” between the physical and the virtual worlds. At Metaverse Safety Week, he addressed how emerging technologies can work together to make individuals feel more attached to the world around them.

“The importance of decentralization is that it allows us all to become stakeholders in the environments that we’re playing in, that we’re working in, that we’re building together,” said Lotlikar. “At SuperWorld, our vision is to enhance society and build a better world – and we can do that with these technologies.”

That morning also saw the launch of the “Metaverse for All” certificate program. Launching next year, the program will provide “An informative and enlightening course to shed light on the many ways in which the metaverse will change the world.”

How the Metaverse Will Impact the Young

Day three was all about metaverse safety for children, and it kicked off with a talk by Representative Lori Trahan. Despite it being her first time in AltspaceVR, Trahan delivered a moving talk.


“Whenever we access the metaverse, data about ourselves can be stored, accessed, and used by people who will pay for our data to influence and manipulate us in real time. And, as we all know by now, it could even be shared with third parties and governments without our knowledge,” said Trahan. “That’s why we must pass comprehensive privacy legislation.”

Trahan pointed out that, while data is a common metaverse safety concern in the immersive technology community, it is an issue largely outside of the experiences themselves. We also need to be mindful of interactions within the immersive experiences as they occur.

“The high levels of harassment in social media and in video games have already become prevalent in the metaverse,” said Trahan. “In fact, the harms we know all too well from platforms like Instagram, TikTok, and Snapchat can often be seen in the metaverse where immersion and a sense of physical presence can make these negative experiences even more visceral.”

And the Young Said…

Later that same day, a “Youth Panel” took the stage. The panel consisted of five members of the International Child Art Foundation (ICAF) between the ages of 13 and 23, and was moderated by the Foundation’s founder Ashfaq M. Ishaq. The panel presented a rare opportunity to hear young people speak for themselves on metaverse safety issues like the efficacy of age restrictions.

One speaker pointed out that, in her experience, VR motion sickness went away as she got older – so not exposing children to VR too young might help to give them a positive first impression. Another pointed out that age restrictions might be less helpful than a system of content warnings for VR experiences similar to those for films and television programs.

Perhaps the most engaging takeaway was that the metaverse might be about games or work for most people reading this but, for a growing number of young people, it’s also about socialization. That can introduce potential metaverse safety issues without immediate solutions.

“Not forming relationships through the metaverse is no longer an option. We have to rely on these new connections that we’re making to be able to reach out,” said ICAF member Alaalitya Acharya. “When you’re interacting with somebody through an avatar, what are some of the signs that you need to look out for in the same way as when you meet someone face-to-face?”

So. Much. Content.

We tried to bring you just about as much as possible from Metaverse Safety Week. And there’s so much more that would have been worth presenting. But alas, five days of sessions averaging about eight hours each doesn’t really fit into an article. If you want to explore it yourself, you can find the recordings on XRSI’s YouTube channel.



Riga Metacity: A State-Supported Initiative Set to Become One of Europe’s Largest Metaverse Projects

With the increasing interest in the concept of “Metacity” across the globe, it is clear that the metaverse is shaping the cities of the future. The United States, China, Singapore, Japan, and other countries are developing Metacities where immersive technologies are integrated into how cities work and the way people live.

While Europe struggles to establish itself as a leader in emerging technologies, it is fast gaining ground in helping shape the future of virtual worlds.

In a letter of intent by European Commission President Ursula von der Leyen, the EU specifies a Europe fit for the digital age as one of its key initiatives for 2023. It notes that initiatives on virtual worlds, such as the metaverse, are among its priorities next year.

Aside from this, a more concrete action toward building Europe’s Metacities is already underway in Latvia.

Riga, Latvia, Chosen as the Testbed of Metacity Development in Europe

Based on a study by Cambridge Executive MBA students, the capital city of Latvia has the potential to become the next successful Metacity in Europe. The city was chosen based on its existing connectivity infrastructure, partners, innovators, and political will.

With 5G base stations across the region, Riga boasts a strong cellular network ranked 5th in Europe in terms of internet speed. It’s also well-connected within the Baltics, making it easily accessible to other cities and countries.

Aside from the solid infrastructure, the city also has a strong technical university that has become a hotbed for innovators. Along with local technology companies and a well-connected community, this makes Riga an ideal setting for metaverse deployment and adoption.

Riga Signs a Memorandum With Industry Partners at 5G Techritory Forum

Earlier this year, Riga launched a state-supported initiative to develop its own metaverse. It is set to undertake the immense challenge of creating a city-level testbed of Metaverse applications with practical use cases that benefit citizens and enterprises.

To put the Riga Metacity initiative into action, Riga signed a Memorandum of Understanding (MoU) at the 5G Techritory Forum. Twenty-two industry partners signed the MoU to mark their commitment to the metaverse project.

“I congratulate us that, with our signatures, we have now expressed our willingness and readiness to be not just talkers, but also doers,” said Neils Kalniņš, 5G Techritory Program Director. “Already this January, we will come together to discuss how we can create practical applications for the Metaverse. A safe and green future of the Metaverse will be Latvia’s contribution to the world, and I look forward to it.”


The key aim of this memorandum is to establish a central authority on XR that will coordinate the development of the Metacity. After that, all other activities and procedural steps will be decided and undertaken, including platform regulations, sourcing of funds, coordinating development, and much more.

Riga Metacity: Driving Forward the Future of XR in Europe

The Riga Metacity initiative is one of the first and largest Metaverse projects in Europe. It is expected to attract a sizeable share of a market estimated at $1,500 billion by 2030, which will greatly benefit the local community and government.

However, the benefits go far beyond the city. With this initiative, Riga drives forward extended reality research, technology, and applications across Europe. The regional initiative and its development are also expected to boost the overall competitiveness of the European Union, opening an opportunity to develop the capabilities needed to maintain high economic growth and drive the future of XR forward.



A Tour of the Mytaverse Enterprise Immersive World

 

There are a number of great virtual world platforms out there – and a smaller number of those are dedicated to enterprise use. An even smaller number offer the high graphical fidelity required for some applications. Mytaverse is an enterprise-only platform with special tools for digital twins and virtual assets, and I got the world tour.

What Is Mytaverse?

Mytaverse was founded during the COVID-19 pandemic and has seen its fair share of remote events and virtual work solutions since then. About this time last year, the company hosted a Pepsi-led event focused on innovation that brought over 300 representatives from Google, Adobe, IBM, Salesforce, Amazon, and others into the platform for the first time.


“Our company is growing rapidly and we’re getting a lot of traction,” Inside Sales Manager Robert Mc Illece told me during an in-world interview and platform tour.

The platform isn’t just about remote presence. The team acknowledges that the metaverse will never replace in-person interaction. More importantly, from the business angle, the metaverse has to replace in-person interaction a lot less frequently now than it did two years ago. Fortunately, the platform has a way with virtual models – which might have more lasting and varied staying power than virtual people.

The platform has special tools for displaying and viewing virtual models. These can be digital collectibles or digital twins. Companies can bring their own models, or work with the Mytaverse team to build models for use in their applications.


The digital twins aren’t just viewable, they can also be made configurable in real time. Over the course of the interview, I changed the exterior colors of model cars – and the interior colors of a model jet while standing inside it.

All of this is made even more impressive by the first major update to the Mytaverse platform, partially funded by a recent $7.6M seed round. The update brought Unreal Engine 5 to the platform (Did we mention that Unreal is a tech partner?), along with a tidy integration with the cross-platform avatar engine Ready Player Me.

Navigating Within the Platform

“We’ve been asked for a long time to incorporate human avatars. We used robot avatars for a long time,” said Mc Illece. “We were going to use Metahuman, but it’s too heavy.”

Those robot avatars are still available for users who don’t already have Ready Player Me, or who don’t want to take the time to make one on the spot. There is a wide selection of body types, and all have a screen-shaped head that can broadcast the webcam of the user – sort of like the robots from the Saga comic series.


It’s natural for Mytaverse team members to have a soft spot for the robot avatars, and it’s natural for them to want to team up with their partners at Unreal. However, Ready Player Me lends a sense of identity because of how universal the avatars are becoming. For example, I used the same avatar in this interview as I used in a recent tour of Spatial.

No matter which avatar system you use, emotes are provided ranging from the practical, like waving a hand, to a softkey for dancing a jig. The avatar is controlled by standard WASD controls on a computer or by touch on a mobile device. Need to get somewhere fast? Hop on a hoverboard, teleport short distances, or pull up to “beam” in between locations on campus.


The entire experience is browser-based and hardware-agnostic, made possible by cloud computing (AWS is also a partner). So far, “hardware-agnostic” doesn’t include headset support – though that’s on the schedule for the second half of next year. What’s the holdup? A lot of headsets don’t yet have the display quality to keep up with Mytaverse graphics.

“We don’t want to dumb down our graphics for a headset,” said Mc Illece.

Enterprise-Only

Writing reviews of platforms like Mytaverse is almost a little sad. I was really impressed by my tour of the platform. And, I hope to be invited back for some remote event or another. However, this is an enterprise platform, so a lot of readers might never get a look. But, that’s fair, as the platform brings a slew of top-shelf tools that enterprise users need more than casual users do.



Metaverse – Expectations vs Reality, Part 2: The Costs

 

In part 1, I addressed the technical reasons why Ready Player One – the experience customers and users expect – is still years away.

But the public’s unrealistic expectations of the metaverse also involve another area. As co-owner of a small VR company, I quite often find myself obliged to disappoint people who think the metaverse and immersive technologies are cheap.

In this second part, I will very briefly explain the great obstacle of costs.

I Would Like… But I Can’t Afford It

This is an annoying topic for those of us who work with and on immersive technologies. Expectations and reality diverge impressively on the issue of costs. As a business owner, I receive requests for very complex apps. What strikes me as strange is that prospects very seldom have even a rough idea of how much they are going to be charged for what they ask for.

They seem to be unaware that the metaverse is powered by cutting-edge technologies, which require expensive instrumentation and equipment both to produce and to use.

A virtual reality experience costs money. An augmented reality app costs money. A virtual tour costs money. A lot, sometimes more than a lot.

The work behind it is invisible to an outsider, and that makes sense. It’s okay that people do not know how time- and energy-consuming this all can be.

Today, I would like to give you an example of a production process, so that you can appreciate how much work lies behind a seemingly simple project.

Let’s take a virtual tour with a minimum of 10 photos (360° photos), 3 videos (360° videos), and 10 hotspots (interactions) per video, placed only in the videos (therefore 30 total interactions).

Here is the whole process, from the very beginning to the delivery:

  1. Inspection of the location where the shooting will take place.
  2. Draft of the shooting list (the script).
  3. Hardware and number of people in the crew to be selected based on points 1 and 2.
  4. Crew and equipment transfer.
  5. Possible overnight stay, plus food for the whole crew.
  6. Filming (1 day).
  7. Review of the footage in situ to check the quality and make sure everything is usable and does not need re-shooting.
  8. Footage download.
  9. Stitching (editing).
  10. Audio post-production.
  11. Hotspot programming.
  12. Hotspot content upload.
  13. Testing and debugging.
  14. Virtual tour upload on the client’s website or where required.

On top of all this, we must add the crew’s professional experience and the company’s markup. For personal reasons, I prefer not to bring you numbers here.
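To show how the steps above roll up into a quote without sharing any figures, here is a bare-bones sketch. The parameter names are hypothetical, and the rates are deliberately left as inputs rather than filled in.

```python
# Structure only, no prices: how the production steps combine into a final quote.
def virtual_tour_quote(crew_day_rate, crew_size, shoot_days,
                       post_production_hours, post_hourly_rate,
                       travel_and_lodging, markup_pct):
    """Roll the line items listed above into a single figure."""
    filming = crew_day_rate * crew_size * shoot_days   # inspection, scripting, shooting
    post = post_production_hours * post_hourly_rate    # stitching, audio, hotspots, testing, upload
    subtotal = filming + post + travel_and_lodging     # transfers, overnight stays, food
    return subtotal * (1 + markup_pct / 100.0)         # experience and company markup on top
```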

As mentioned above, all this is invisible to an outsider. We understand it very well. But the workload is there, nonetheless, and it must be taken into account.

And now consider that virtual and augmented reality tend to have even higher costs than a simple virtual tour.

The Metaverse We Currently Have Is Enough, For Now

Expectations and reality rarely coincide in the world of virtual reality, augmented reality, and virtual tours. The excessive hype generated by somewhat naïve enthusiasts, combined with a lack of accurate information, has in recent years created the illusion that Ready Player One is already here.

Nothing could be further from the truth. As is often said, the metaverse is only a concept for the moment, and not even a clear one at that. Much can be done with the technologies available to us now – much more than we could have done only five years ago. Suffice it to say that, at the time, VR headsets such as the Oculus Rift CV1 did not have the hand controllers that are used now, and they necessarily needed a powerful computer to run apps.

Now we have excellent standalone headsets (so, no computer needed) and motion sensors built directly into the device that track the movement of the hands. We have multiple companies healthily competing for the mass market of headsets. And we have companies that are building platforms. A debate on interoperability has finally started. All of this is very important and a clear advancement.

However, the obstacles remain significant, and so do the costs.

Although we understand that those who do not work in the space cannot have a real grasp of all these aspects, we believe it is always appropriate, when approaching a new sector, to be well-informed about what is possible and what is not – and about how close, or distant, expectations and reality actually are.



Answering the Cryptic Question: Does the Metaverse Need NFTs?

 

The metaverse and the NFT space have both had their vocal detractors, but could the secret to their mass adoption lie in their ability to work together? Developers of the first quantum-secure, hyper-realistic gaming metaverse, the Kryptiverse, believe so.

Walt Greene, the founder and CEO of QDEx Labs, which developed the Kryptiverse, helps us delve into what NFTs truly are and the role they play in the metaverse.

The Seeming ‘Uselessness’ of NFTs

In 2021, Steam explicitly banned apps built on blockchain that issue or allow the exchange of NFTs and cryptocurrencies. This year, Mojang Studios and Minecraft also took a firm stance against NFTs. Aside from tech and gaming companies veering away from non-fungible tokens, the NFT market has also been gravely affected by the crypto crash.

In fact, the monthly transaction volume for NFTs on OpenSea has fallen by 90% from January to August this year. These developments call into question the actual value, or lack thereof, that NFTs have and their relevance in the emerging metaverse market.

However, Greene attributes this perceived ‘uselessness’ to a limited understanding of NFTs. According to him, many people have no idea what they are getting, much less what the NFT technology has the potential to represent. He further explains that the centralized approach and business models associated with Minecraft and Steam do not necessarily mix well with the decentralized nature of a true NFT technology.

A Deeper Understanding of NFTs

To fully grasp the value that NFTs bring and the role they play in the metaverse, we need to have a more in-depth understanding of what an NFT actually is.

When we hear of NFTs, we categorize them as purely digital assets. By putting them in this box, we perceive them to have no real value beyond the digital space. However, NFTs are more than just digital assets. When you own one, you acquire a unique, transferable, and uncontestable token that gives you absolute rights to a digital asset. It is a verification of ownership of collector items, fractional real estate, art pieces, and more.

NFTs and the Metaverse

Now to answer the cryptic question: Does the metaverse need NFTs? Well, not really. NFTs are not required in the metaverse. However, these two are not mutually exclusive.

“When speaking in terms of where the metaverse is heading, and that is into decentralized ecosystems, NFT-like technologies with actual utility can and should play a critical role because of the uses for the technology from recording information to the sovereign ownership they perpetuate,” Greene says.

While the metaverse does not really need NFTs, these unique tokens can address the challenges the metaverse is facing. Greene sees NFTs playing two major roles: in-game use and ownership (DRM) via the identification of an asset, and microtransactions in the developer and creator support ecosystem. These will help develop the metaverse into a truly successful, highly individualized, and experiential digital ecosystem.

NFTs: Powering the Creation of a Quantum-Secure High-Definition Gaming Metaverse

NFTs have the potential to power the creation of a quantum-secure, high-definition gaming metaverse. They are already being used to build the Kryptiverse, a fully-integrative gaming metaverse.

According to Greene, while the QDEx Community’s TSSYRQ Network will be bringing the quantum secure aspect to the gaming party, the KRYPTI game will be bringing the metaverse together with the crypto industry for easy onboarding into DeFi via a hyper-real, captivating user interface and storyline. All the user has to do is learn how to play the game.


NFTs will bring significant utility to the KRYPTI gaming ecosystem. Early adopters in the Kryptiverse get the “keys to the kingdom” from the initial 1,777 Genesis Mint. These versatile NFTs have over 2,150 lines of code in their contracts carrying class info, numerous stats, variable aspects, and other data that are fully reconciled to the blockchain.

Delivering Value Beyond the World of Gaming and Entertainment

NFTs have the potential for use beyond the world of gaming and entertainment. Genesis NFTs, for one, provide utility and incentives within the quantum-secure QDEx App, a feature-rich, decentralized crypto and digital asset exchange marketplace being developed by QDEx Labs.

While the NFT market is highly volatile today, we still see the crucial role NFTs play in the metaverse. The QDEx Community, KRYPTI, and the Kryptiverse together represent the intersection of usability and mass adoption of the blockchain. This complex connection among communities, NFTs, and metaverse platforms is what will pave the way for growth in this space.
