Denny’s Celebrates Its 70th Anniversary With AR Food Menu That Enhances Dining Experience

While celebrating its 70th anniversary, Denny’s partnered with QReal to produce AR menus where food items seem to leap off the pages. You don’t need to install the restaurant chain’s app on your phone for the AR food menu to work. Just activate your phone’s camera and launch 8th Wall’s web-based AR platform from your phone’s browser to watch the images come alive.
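
Web-based AR of this kind runs entirely in the browser on top of standards like WebXR, which platforms such as 8th Wall build on. As a rough illustration only – not Denny’s or 8th Wall’s actual code, and with a hypothetical "view-in-ar" button id – here is a minimal sketch of how a browser-based AR session is typically started:

```typescript
// Minimal sketch of launching browser-based AR with the standard WebXR API.
// Illustrative only: 8th Wall ships its own SDK, and the "view-in-ar" button
// id below is a hypothetical example, not part of Denny's menu experience.
// (Assumes WebXR type definitions, e.g. @types/webxr, are available.)

async function startWebAR(): Promise<void> {
  // Feature-detect WebXR and immersive AR support before offering the option.
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.warn("Immersive AR is not supported in this browser.");
    return;
  }

  // Request an AR session; the browser prompts the user for camera access.
  const session = await navigator.xr.requestSession("immersive-ar", {
    optionalFeatures: ["hit-test"],
  });

  // From here, a rendering library such as three.js would drive the session,
  // drawing animated 3D dishes anchored to the real world on every frame.
  session.addEventListener("end", () => console.log("AR session ended"));
}

// Browsers require a user gesture (e.g. a tap) before granting camera access.
document.getElementById("view-in-ar")?.addEventListener("click", () => {
  void startWebAR();
});
```

In practice, a commercial SDK like 8th Wall’s wraps this kind of boilerplate and adds its own tracking and rendering layers, so creators generally work at a higher level.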

Denny’s AR Food Menu: What to Expect

With the new AR food menu, you’ll see flames surround the classic Moons Over My Hammy egg sandwich and hear the new Mac N’ Brisket Sizzlin’ Skillet sizzle as it emerges from a barbecue smoker. Also making an appearance is a 3D model of the original diner in 1953—then known as Danny’s Donuts—before becoming the beloved establishment it is today.

Denny's AR food menu

Denny’s AR food menu, only accessible when dining at physical outlets across America, is part of Denny’s “It’s Diner Time” brand platform. The campaign also involves the remodeling of its kitchens, the rollout of improved food offerings, and the unveiling of new staff uniforms.

AR Food Menu: Denny’s Latest Foray Into AR

When Denny’s shared its 2022 results in February, CEO Kelli Valade said that one of the company’s strategic priorities is “to lead with technology and innovation.” She also mentioned that “Denny’s is skewing towards younger generations with Millennials and Gen Z currently representing about 45% of our customer base.” So, augmented reality makes perfect sense.

However, this is not the first time the company has tapped into the world of AR. The last time it used this type of computer-generated content was in late 2016, when the diner chain launched its “Shrek the Halls” campaign for the Christmas and New Year holidays. Using the DreamWorks COLOR app, the restaurant’s customers saw characters from Shrek, The Penguins of Madagascar, Puss in Boots, and Turbo Fast arise from the kids’ menus as their phones scanned the pages.

QReal and the Appeal of the AR Food Menu

QReal (formerly Kabaq.io) specializes in creating lifelike 3D and AR content for e-commerce platforms and social media campaigns. It works with various industries, from real estate and automotive to fashion and beauty. However, its original passion was food: in 2016, it became the first company to make photorealistic AR models of cuisine with its KabaQ AR Food Menu app.

“The traditional way people interact with menus is being transformed utilizing [AR and life-like 3D models], leading to an enhanced experience, strong branding, and potentially higher order throughput,” said Mike Cadoux, QReal’s General Manager.

Researchers from several universities who studied QReal’s AR food models attest that such presentations can improve “decision comfort” or “craveability,” spread positive feedback about products, and increase the desire for “higher-value” types of food. And because QReal’s models require very little post-production, users can see their order in advance from different angles in the most realistic way possible.

How the AR Food Menu Will Transform the Restaurant Industry 

If we are to believe Cadoux’s forecast, “high-fidelity digital cuisine” will only increase in demand due to its strong potential to boost branding and sales.

Businesses predict that AR food menus will enable customers to order more wisely because AR renders an item’s size and quantity more accurately. Another benefit of such transparency is lower food waste.

Moreover, establishments can use AR to promote new products and enhance engagement with prospects and loyal clientele through behind-the-scenes tours, which can include how they prepare and cook food.

How Large Retail Brands Are Using Augmented Reality

Over 83 million people in the US alone used augmented reality on a monthly basis in 2020. By the end of 2023, it is projected that the number will grow by over 30%, to over 110 million people.

With the pandemic having accelerated the evolution of digital shopping, retail and e-commerce brands are looking for new ways to engage with their consumers and to bridge the gap that exists today between online and offline shopping experiences.

How Big Brands Leverage Augmented Reality

Immersive AR experiences are increasingly being leveraged in stores to create memorable and personalized relationships between the brand and its customers. Through augmented reality, retailers can not only engage otherwise passive customers but also provide the context they need to make a decision, significantly improving the likelihood of a purchase.

Lego, for instance, used augmented reality digital boxes in its stores: parents and kids could hold a physical box up in front of a screen and watch different scenes being built and coming to life. This helped families find the right set and also proved to be a fun way to engage with consumers.

Other retailers use augmented reality specifically to drive sales for products that typically need in-person context before a buying decision. Houzz’s AR-powered app lets consumers view their rooms through their phone camera and ‘drop in’ true-to-scale 3D furniture items superimposed on their physical surroundings, helping them make a more informed buying decision.

Converse’s AR app lets consumers try on shoes at home by simply pointing the camera at their feet. They can then evaluate multiple models in varying colors within minutes from the comfort of their home. The app is also integrated with the brand’s e-commerce platform, creating a seamless flow from discovery to intent to the final purchase.

The Future of Retail Is 3D

While all these examples use AR in slightly different ways, they all have one thing in common: the buyer is at the center of the experience and the camera has become the new home page. Replacing 2D images with interactive 3D products gives shoppers the visual context they need to be confident in their decisions.

The experience boosts consumers’ confidence, allowing them to make the right choice, because AR provides the level of real-life context missing from a flat, 2D product image online. It’s a win-win for customers and for retail brands, which see a big increase in conversion rates and a lower product return rate by leveraging augmented reality.

Consumers are coming to expect this experience. Augmented reality adoption is following a similar pattern to mobile phone adoption of the 2000s. And as the mobile-first Gen Z cohort continually gains more buying power beyond the $360 billion they already have in disposable income, we will see large retailers transforming their traditional online and in-person shopping experiences into more immersive, 3D retail experiences to reshape online browsing and buying behavior as we know it.

About the Guest Author(s)

Aluru Sravanth

A technology enthusiast and a student for life, Sravanth started Avataar in 2014, with a vision to uncover untapped potential from the confluence of self-learning AI and computer vision.

AR and VR Content Creation Platform Fectar Integrates Ultraleap Hand Tracking

For users of the Fectar AR and VR content creation platform, creating XR content with hand tracking has just become simpler and easier.

Launched in 2020, Fectar is “the multi-sided platform that makes the metaverse accessible for everyone, everywhere.” Focused on creating AR and VR spaces for education, training, onboarding, events, and more, and aimed at non-technical users, the company provides a cross-platform, no-code AR/VR building tool.

Last week, Fectar integrated the Ultraleap hand tracking feature within its AR and VR content creation platform, allowing users to build VR training experiences with hand tracking from the beginning.

AR and VR Content Creation With Integrated Ultraleap Hand Tracking

Ultraleap was founded in 2019 when Leap Motion was acquired by Ultrahaptics, and the two companies were rebranded under the new name. Ultraleap’s hand tracking and mid-air haptic technologies allow XR users to engage with the digital world naturally – with their hands, and without touchscreens, keypads, and controllers.

Thanks to the Ultraleap feature, Fectar’s users will now be able to create and share immersive VR experiences that use hands, rather than VR controllers. According to Ultraleap, this makes the interaction more intuitive, positively impacts the training outcomes, reduces the effort of adoption, and makes the experiences more accessible.
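
For context, here is a rough idea of what hand tracking exposes to developers under the hood. This is only a generic sketch using the standard WebXR Hand Input API – not Fectar’s no-code tooling or Ultraleap’s own SDK – with a hypothetical per-frame callback:

```typescript
// Generic sketch of reading hand poses via the WebXR Hand Input API.
// This is not Fectar's internals or Ultraleap's SDK — just an illustration of
// the kind of per-joint data hand tracking makes available to developers.
// (Assumes WebXR type definitions, e.g. @types/webxr, are installed.)

function onXRFrame(time: number, frame: XRFrame, refSpace: XRReferenceSpace): void {
  const session = frame.session;

  for (const inputSource of session.inputSources) {
    const hand = inputSource.hand; // empty when a controller is in use instead
    if (!hand) continue;

    // Look up the joint space for the index fingertip and query its pose.
    const tipSpace = hand.get("index-finger-tip");
    const pose = tipSpace ? frame.getJointPose?.(tipSpace, refSpace) : null;

    if (pose) {
      const { x, y, z } = pose.transform.position;
      // A training experience could compare this position against virtual
      // buttons or handles to detect presses, pinches, and grabs.
      console.log(`Index fingertip at (${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)})`);
    }
  }

  // Keep the loop going for the next frame.
  session.requestAnimationFrame((t, f) => onXRFrame(t, f, refSpace));
}
```

A no-code platform like Fectar abstracts this layer away, so its creators never have to write code like this themselves.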

Non-Technical People Can Develop Immersive Experiences 

The new addition to the AR and VR content creation platform is a strategic decision for Fectar. The company’s target clients are non-technical content creators. They don’t need to know how to code to create VR apps and tools, including training programs.

This is, in fact, one of the most frequent use cases of the Fectar AR and VR content creation platform. “We want our customers to be able to create world-class VR training experiences,” said Fectar CTO and founder, Rens Lensvelt, in a press release. “By introducing Ultraleap hand tracking to our platform we’re giving them an opportunity to level up their programs by adding an intuitive interaction method.”

VR Programs and Tools – the Future of Collaborative Work and Training

Virtual reality content has expanded beyond games and entertainment applications. VR is now part of education and training, medicine, business, banking, and virtually any other kind of work.

This is why an AR and VR content creation platform for non-technical users, like Fectar, is so successful. Companies worldwide want to create their own training and collaborative VR tools, without hiring developers.

“The combination of Ultraleap and Fectar provides people with the right tools they need to develop the best education or training programs – and makes it easy to do so. We already know that enterprise VR programs improve productivity by 32%,” said Matt Tullis, Ultraleap VP of XR. “By making that experience even more natural with hand tracking, Fectar customers can expect to see their VR training ROI increase even further.” 

Coach Partners With ZERO10 on AR Try-On Tech for Metaverse Fashion Week

The second edition of Metaverse Fashion Week (MVFW) is set to take place at the end of this month in Decentraland’s Luxury District, where global brands will feature their digital wearables. MVFW is a four-day-long event that combines fashion and AR try-on technology to offer a unique, immersive experience to attendees.

Metaverse Fashion Week, which will run from March 28 to 31 this year, will see the participation of luxury brand Coach for the first time, showcasing its products in the virtual show. The event brings together top designers and brands, making it an exciting opportunity for Coach to showcase its signature leather goods in the metaverse.

ZERO10’s AR Try-On Tech Highlights Coach’s Iconic Tabby Bag

In collaboration with ZERO10, Coach will introduce its iconic Tabby bag with a unique AR enhancement as part of its upcoming activation during MVFW. The feature will be accessible via the ZERO10 app, allowing users in Decentraland to try on the product virtually, providing a new and engaging way to experience the brand.

The AR enhancement effect, which makes use of cutting-edge technology, adds a unique touch to the virtual fashion event and provides visitors with a dynamic way to interact with Coach’s products. Using AR try-on, shoppers may virtually try on clothes, accessories, and even cosmetics before making a purchase. Buyers interested in a product can virtually see how they might look in it.

As a global digital fashion platform, ZERO10 offers AR try-on technology to brands and independent creators. Through its iOS app, users can try on digital clothing in real time using their phone camera, collect items in a virtual wardrobe, and create shareable content for social media.

The digital collections are collaborations with both emerging and established fashion brands, designers, musicians, and artists and are released in limited drops within the app. The app’s cloth simulation technology simulates fabric flow, while the body tracking technology lets users try on virtual outfits for unique social media photos and videos.

Blending Tradition and Innovation

This year’s Metaverse Fashion Week theme, “Future Heritage,” encourages both traditional and emerging fashion designers to engage and work together. As part of the upcoming event, brands will conduct interactive virtual experiences both on and off the runway.

Dolce & Gabbana plans to exhibit pieces from its Future Reward digital design competition. Tommy Hilfiger intends to launch new wearables on a daily basis, along with products powered by artificial intelligence. DKNY will have a pop-up art gallery and restaurant called DKNY.3. Adidas, like Coach, will make its MVFW debut this year. For owners of its “Into the Metaverse” non-fungible token (NFT) collection, the sports brand will debut its first set of digital wearables.

Coach will also participate in Brand New Vision (BNV), a Web3 fashion ecosystem that enables attendees to try on wearables from various global brands seamlessly and instantly. BNV has created specifically designed stations to showcase the digital clothing collections created in partnership with top brands such as Tommy Hilfiger, Carolina Herrera, Michael Kors, and Vivienne Tam. Moreover, a newly built “Fashion Plaza” will also exhibit emerging digital fashion possibilities.

MVFW Open Metaverses and Web3 Interoperability

Dr. Giovanna Graziosi Casimiro, Decentraland’s head of MVFW, remarked that they are honored to carry on the Metaverse Fashion Week tradition this year. “We are seeing the return of many luxury fashion houses, and also the emergence and elevation of digitally native fashion. We are excited to see the world’s greatest fashion minds engaging in digital fashion and exploring what it can mean for their brands, and for their communities,” she said.

This year’s MVFW will highlight the force of interoperability across open metaverses while expanding the boundaries of what digital fashion can be. MVFW23, organized by Decentraland and UNXD, is an immersive art and culture event, in association with the Spatial and OVER metaverses, that welcomes fashionistas from all over the globe to gather, mingle, and witness the most recent breakthroughs in digital fashion.

Fashion brands experimenting with virtual technologies like AR try-on is a testament to their commitment to staying at the forefront of the latest technology trends and providing their customers with unique and immersive experiences.

Get Ready to Battle in Space With Valo Motion’s Latest MR Game Release Astro Blade

Last year, Valo Motion launched ValoArena, a mixed reality playground, in the US. Now, the company is back with an exciting MR game release – Astro Blade.

The Finnish game company, known for designing and developing cutting-edge interactive mixed reality games, has recently released a new space-themed game which can now be played in ValoArenas across the country. This interstellar adventure in a galaxy far, far away is sure to give you and your friends a thrilling time. So, get ready to take part in an action-packed battle in space and become a virtual superhero.

Blast Into Space With Valo Motion’s MR Game Astro Blade

Step into the world of the company’s MR game Astro Blade, where players become virtual holograms fighting in the hangar of a futuristic spaceship. Players can arm themselves with laser swords or spears and protect themselves with shields.

The game is inspired by classic fighting games and space-themed classics like Star Wars lightsaber battles. But what sets it apart is the technology behind ValoArena that makes it possible to bring these classics to life in an entirely new way.

ValoArena’s mixed reality system allows players to fully immerse themselves in the game’s interstellar world, where they can battle against friends and foes alike. The technology accurately tracks players’ movements, making the experience incredibly lifelike and realistic. With stunning graphics, exciting sound effects, and interactive gameplay, Astro Blade is a unique experience.

A Social and Safe Game

Valo Motion’s MR game Astro Blade is designed for up to six people. It is aimed at 8 to 14-year-olds, but it can be enjoyed by kids and adults alike. Those who grew up wielding imaginary lightsabers as pretend Jedis will definitely love this game. The game company has also paid special attention to making the game safe for younger players, which makes it great for families with a wide range of ages.

The game is very social and interactive. The actions of other players directly affect what you should do next, making it an excellent addition to ValoArena’s game offerings for small groups. Valo Motion wants players to feel like superheroes in a fighting game and to be able to come up with their own special moves without limitations.

Where and How to Play Astro Blade

Astro Blade can be found in various locations worldwide where Valo Motion products are available. You can check out their interactive map to find the nearest ValoArena to your location.

According to the company, “Astro Blade is also a part of Valo Motion’s mission of empowering people to move more and be physically active, but also have a lot of fun while doing it. In Astro Blade, the players use their entire body to play the game, and an intense sword duel among friends is guaranteed to make them sweat!”

Astro Blade is designed to be an interactive and social game, so players can work together in teams or compete against each other. It’s a fun and exciting way to experience the latest in mixed reality gaming.

The Future of Gaming 

Astro Blade is a testament to the power of mixed reality and how it can bring classic gaming experiences to life in new and exciting ways. As the popularity of MR releases like Astro Blade continues to grow, we can expect to see them become an increasingly ubiquitous part of the gaming landscape.

For players, MR offers a unique and interactive experience that allows them to socialize and have fun in a way that traditional gaming simply cannot match. And for amusement and entertainment centers, investing in MR technology can provide a competitive advantage by offering a cutting-edge gaming experience that attracts customers and keeps them coming back for more.

Looking to the future, it’s clear that MR is set to play an even bigger role in gaming. With advancements in technology, we can expect to see even more immersive and interactive experiences that blur the line between the real and virtual worlds.

So, whether you’re a gaming enthusiast or an amusement center owner looking to stay ahead of the curve, it’s clear that MR has its place in the future of gaming. It’s an exciting time to be a part of the gaming industry.

Lark Optics is targeting your retinas for AR without nausea and other sickness

This story is syndicated from the premium edition of PreSeed Now, a newsletter that digs into the product, market, and founder story of UK-founded startups so you can understand how they fit into what’s happening in the wider world and startup ecosystem.

Whether you believe it’s the future of everything, or just a useful tool that will be part of the mix of tech we regularly use a few years from now, augmented reality is a rapidly developing field with one major drawback – like VR, it can leave you feeling sick.

For example, US soldiers who tried Microsoft’s HoloLens goggles last year suffered “‘mission-affecting physical impairments’ including headaches, eyestrain and nausea,” Bloomberg reported.

While the technology could “bring net economic benefits of $1.5 trillion by 2030” according to PwC, this sickness is a massive inhibitor to the growth of AR and VR.

One startup looking to tackle the problem is Cambridge-based Lark Optics, which has developed a way of bypassing the issues that cause these problems.

“In the real world, we perceive depth by our eyes rotating and focusing. Two different cues need to work in harmony. However, in all existing AR glasses, these cues fundamentally mismatch,” explains Lark Optics CEO Pawan Shrestha.

Having to focus on a ‘virtual screen’ in augmented reality glasses means users have to switch focus between the real world and the augmented one. This depth mismatch causes physical discomfort and conditions like nausea, dizziness, eyestrain, and headaches.

What Lark Optics does differently, Shrestha says, is it projects the augmented reality image onto the user’s retina. This means the AR is always in focus no matter what your eyes do to adjust to the real world around you.

So far the startup has developed a proof of concept and is now iterating to refine its demonstrator model. Shrestha says they conducted two successful user studies with their proof of concept; one in their own lab and another with an external partner he prefers not to name.

When the tech is ready, they want to use a fabless model for producing the components they design, which they will then sell to original equipment manufacturers who make AR headsets.

Given they’re addressing such a fundamental challenge to the mass adoption of AR, it’s unsurprising that other companies are tackling it in other ways (more on that below). But Shrestha says his startup’s approach is the most efficient in terms of processing power and battery power, and doesn’t affect the user’s field of vision.

Shrestha grew up in rural Nepal (“really rural… I was nearly nine years old before I saw electric lights”). He says his parents’ enthusiasm for his education eventually led him to New Zealand, where he obtained a master’s degree in Electronics Engineering from the University of Waikato.

Keen to develop technology he could commercialise, he says he developed an interferometer. While that venture didn’t work out, his work led him on to a PhD from the University of Cambridge, where he spotted the commercial potential of a new approach to AR displays.

“It was scientifically challenging, but it was also something that could touch the lives of many, many people,” he says.

Shrestha co-founded Lark Optics (which was previously known as AR-X Photonics) with his friend Xin Chang and with Daping Chu, who previously oversaw the PhD work of Shrestha and Chang. The trio have been working together for around a decade but only got started with Lark Optics in earnest last year.

Shrestha says this week they have been joined by a new recruit, Andreas Georgiou, who previously worked at Microsoft as a principal researcher in the field of optical engineering.

The Lark Optics team (L-R): Weijie Wu, Dr Pawan Kumar Shrestha, Professor Daping Chu, Dr Andreas Georgiou, Dr Xin Chang

Perhaps unsurprisingly, Shrestha says being based in Cambridge is a big benefit to them, with a community of experienced advisers around them, and access to relevant investors. He is particularly inspired by the progress made by Micro LED tech startup Porotech, which has raised a total of $26.1 million to date.

And Shrestha has warm words for the Royal Academy of Engineering’s Enterprise Fellowship, of which he is a part. This provides up to £75,000 in equity-free funding to cover salary and business costs, along with mentoring, training and coaching. This was what allowed him to get started on developing Lark Optics as a business.

Lark Optics itself raised a pre-seed round of £210,000 in October last year, Shrestha says, and will be raising a seed round in Q2 this year.

As mentioned above, others are tackling the problem of AR sickness in different ways. LetinAR uses a ‘pin mirror’ method, Kura Technologies has developed a ‘structured geometric waveguide eyepiece’, while VividQ “compute[s] holograms in real-time on low power devices and integrate[s] them with off-the-shelf display hardware.” 

Another company, SeeReal develops holography-based solutions to address depth issues in 3D displays.

But Shrestha says these rival technologies either require a very high level of data throughput, with a related computational and battery power overhead, or require very high resolution displays. And while some techniques decouple the AR display from the real world like Lark Optics does, Shrestha says they are “like looking through a chicken fence.

“We solved the problem without getting a significant penalty on processing power or battery power, or artefacts. So that’s why I think our approach is the best.”

Lark Optics’ ambition is to become established as the best optics for AR, VR, and mixed reality glasses.

“We want to realise the full potential of AR and VR. Now we have AR and VR you can wear for 20 minutes or 30 minutes. We want to make it feel as natural to look at real objects, VR, or AR, and allow people to use it for all-day, everyday use.”

Shrestha sees the biggest challenge to achieving this as recruiting the right people in what is quite a specialised field. But he’s optimistic that attracting just one or two high-level people will end up attracting more, and the endorsement of a good seed round raise in the coming months won’t hurt either.

AR, VR, and MR have been massively hyped in recent years, but there have been questions over how much of a future the technology has. Investor disquiet over Meta’s huge spending in the ‘metaverse’ space, and Microsoft’s job cuts in its HoloLens division as it struggles to turn it into a viable business, show that there’s no straight line from here to a future where this tech is widely used.

But that said, the current jitters of the public markets over stock prices and tech company spending isn’t an end for AR, VR, and MR at all. Apple’s first headset is on the horizon, which will no doubt spin up another wave of interest in the space (although the latest report says it’s been delayed two months, until June). 

If technology like Lark Optics’ can help prepare AR, VR, and MR for the mainstream, the startup could be well positioned to reap the rewards.

The article you just read is from the premium edition of PreSeed Now. This is a newsletter that digs into the product, market, and story of startups that were founded in the UK. The goal is to help you understand how these businesses fit into what’s happening in the wider world and startup ecosystem.

The 2023 Polys WebXR Awards Recap

The third Annual Polys WebXR Awards took place this weekend. The show was bigger than ever thanks to the first-ever in-person awards and a special event saying farewell to AltspaceVR. However, despite some new categories, the overall category list was shorter this year as a number of previous awards were combined.

A Very Special Polys

The Polys launched during the height of the pandemic. Fortunately, not being in person doesn’t greatly hinder an event that’s already dedicated to WebXR.

The event took place in a bespoke AltspaceVR world, with watch parties on YouTube as well as other remote platforms. However, this time, people were able to get together in person but they did it in a very “metaverse” way.

In-person hosts, producers, presenters, and an audience gathered at ZeroSpace, an XR stage and motion capture studio in Brooklyn. Their actions on the stage were volumetrically captured and displayed in The Polys’ AltspaceVR environment, similar to the launch of Microsoft Mesh. Polys Director Ginna Lambert said that this was the first award show to use the technology.

Further, while winners and honorees had previously received their Polys Awards as NFTs, the team worked with Looking Glass Factory so that this year’s Polys can be presented in a physical frame. This is as physical as The Polys can get, seeing as Linda Ricci designed the award to defy physics.

A Funeral for AltspaceVR

In lieu of a half-time show, Big Rock Creative CEO, co-founder, and producer Athena Demos held a eulogy for AltspaceVR. Virtual attendees lined the aisle to a pulpit adorned with flowers and candles in a ceremony that was heartfelt and a little macabre. Following mourners down the aisle was a coffin containing one of the iconic robot avatars that AltspaceVR used at launch.

“AltspaceVR will always hold a special place in our hearts,” said Demos. “While we say goodbye to the platform that brought us together, we will always remember the connections that we made here.”

While the WebXR team has used AltspaceVR to host The Polys Awards and numerous other town hall events and summits over the last three years, Demos and her team have been using it to bring Burning Man into virtual spaces. There is also a farewell party scheduled by Big Rock Creative to last until the moment that AltspaceVR servers shut down later this week.

The Polys Awards

Where last year’s Polys saw 15 awards categories (not counting personal honors of Lifetime Achievement, Ombudsperson of the Year, and the Community Award), this year’s show had eight categories. That includes some new categories reflecting the advancement of immersive technology even over the last few months.

“We in this community are ahead of a massive shift that we call the fourth industrial revolution,” said host Julie Smithson. “We’re here to celebrate the progress made in WebXR in the year of 2022.”

Entertainment Experience of the Year

When popular culture looks at “the metaverse” they typically equate it with irresponsible escapism – something that people use to avoid the challenges of life. XR producer and director Kiira Benzing pointed out that positive escapism – using XR to take a break from life rather than to neglect it – is one of the medium’s greatest strengths.

“With the immersive medium, you get the opportunity to step into an experience,” Benzing said in presenting the award for Entertainment Experience of the Year.

The award went to Project Flowerbed, an immersive gardening experience by the Meta WebXR team. The same project was nominated for Experience of the Year.

Innovator of the Year

Futurewei Technologies Senior Director for VR, Metaverse, Mobile, Apps, and Services Daniel Ljunggren presented the award for Innovator of the Year – previously “Innovation of the Year.” The award went to Sean Mann, CEO and co-founder of RP1, a “persistent, seamless, real-time platform with limitless scalability.”

“To be amongst this many pioneers and innovators in one space is amazing. I think we’re all winners,” said Mann. “I’m super excited to be a part of this.”

Developer of the Year

“Being on the frontier of the immersive web is a pioneering effort,” Yinch Yeap said in presenting this award. “It still feels like the Wild West.”

And, like in the Wild West, many of the biggest names are pseudonyms. This is certainly the case for this year’s winner, known only as “Jin.” Jin appeared as a similarly anonymized avatar to accept the award.

“I am a huge believer in WebXR,” said Jin. “I stand on the shoulders of giants. I am very humbled and I owe this to everyone building the immersive web.”

Game of the Year Award

“Game of the Year” is a broad category as most WebXR experiences are arguably “games” – and that’s what makes the award so important according to presenter Rik Cabanier, a software engineer at Meta. The award went to the mini golf game Above Par-Adowski by Paradowski Creative.

Accepting the award was Paradowski Creative Director of Emerging Technology James Kane, who called WebXR “the best expression of the metaverse there is.” Kane was also a nominee for Innovator of the Year.

“I want to thank our team,” said Kane. “And thanks to the Meta team for creating an amazing WebXR platform as well as for directly supporting us.”

AR Passthrough Experience of the Year

“Where, for the past years AR experiences were mainly relegated to phones, now passthrough devices are everywhere,” said presenter Lucas Rizzotto. This allows more passthrough experiences on devices available today, but it also allows more impactful development of experiences for future AR devices.

The award went to Spatial Fusion by PHORIA and Meta, an experience which sees players repairing a damaged spaceship. Ben Ferns, a consulting developer, was one of those accepting the award.

“Huge thanks to the entire team – it was a huge team effort,” said Ferns. “It’s just exciting to see the promise of WebXR and passthrough.”

WebXR Platform of the Year

In presenting the award for WebXR Platform of the Year, Prestidge Group founder and CEO Briar Prestidge pointed out that every WebXR platform has strengths and weaknesses – something that she learned a lot about while famously spending “48 hours in the metaverse” for a documentary.

The award went to Croquet, “the operating system of the metaverse,” which also took home the Startup Pitch Competition Auggie Award last year. The award was accepted by The Polys on behalf of the organization.

Education Experience of the Year

The “digital divide” describes the accessibility gap that results when the benefits of technology are only available to those who can afford the required hardware or programs. WebXR is vital to the future of education because it lowers the cost of access for immersive experiences, according to Silicon Harlem founder Clayton Banks in presenting this award.

Banks presented the award to Prehistoric Domain, an immersive tour that brings learners up close and personal with virtual representations of dinosaurs and other extinct species. Accepting the award was creator Benjamin Dupuy. Prehistoric Domain was also nominated for Experience of the Year.

“WebXR opens so many possibilities – it’s very exciting,” said Dupuy in accepting the award. “We are all pioneers of the immersive web here and I think we’re at the beginning of an era where the line between illusion and reality is very thin.”

Experience of the Year

Demos returned to the stage – this time in volumetric capture instead of in her AltspaceVR avatar – to present the award for Experience of the Year to Spatial Fusion.

This was the experience’s second win of the night. The experience was also a nominee for Entertainment Experience of the Year. Ferns returned to accept the award and pointed out that the code has been open-sourced.

“I’m really excited to see what other people do with this now that it’s freely accessible,” said Ferns. “It’s an exciting time for trying out all of these new UX opportunities.”

This Year’s Honorees

In addition to the nominated awards categories, there are three honors categories. The honoree in each category is named by the previous year’s recipient rather than by a panel of judges.

Community Honor

Last year’s community honoree Trevor Flowers named Evo Heyning for this year, specifically for her work with the XR Guild, the Open Metaverse Interoperability Group, and [email protected].

“Whether it’s exploring AR, exploring 3D objects and NERFs, exploring interoperability of avatars and [email protected] specifically, being a part of these experiences with [Sophia Moshasha], with Ben [Irwin], with Julie [Smithson], with everyone – it’s meant so much to me,” Heyning said in accepting the honor.

Ombudsperson of the Year

The Ombudsperson of the Year Honor is specifically set up to recognize people working on the social and human aspects of WebXR. Last year’s honoree, Avi Bar-Zeev said that he was “honored to hand off the title” to Brittan Heller, a lawyer who introduced the term “biometric psychography” to describe mental and emotional profiling through an XR user’s personal data.

“I’d like to thank Avi, Kent [Bye], and everyone at the XR Guild and the Virtual World Society, and everyone in the XR community,” said Heller. “I appreciate how everyone here is so involved in making the community so welcoming to everyone.”

Bye, referenced by Heller in her acceptance speech, is a leading XR ethicist, a strong speaker in the nascent field of biometric psychography, and the first-ever recipient of this award.

Lifetime Achievement Honor

Last year’s Lifetime Achievement Honoree Brandon Jones selected Patrick Cozzi for this year’s honor. Cozzi is the CEO of Cesium, but he was selected for this award because of his work co-creating glTF as a contributor to the Khronos Group.

“I’m really honored for glTF and the community,” said Cozzi. “It was a grassroots effort for years.”

Looking Forward to the Future

This was the last year that The Polys WebXR awards will be hosted in AltspaceVR, but the team is still looking forward to next year’s event. While they haven’t yet said what platform (or platforms) it will take place on, there’s a full year to figure that out. And a year is a long time in this industry. If you missed this year’s ceremony, you can find the recording here.

AWE 2023 Is Right Around the Corner

Augmented World Expo, AWE for short, returns to Santa Clara this year from May 31 to June 2, 2023. The agenda is still coming together but there’s already a lot to be excited about. Let’s take a look.

Morning Keynotes

Many XR companies save some of their biggest announcements for the AWE stage. Even when companies aren’t dropping new products, apps, and services, they use the time to inform and inspire listeners about this rapidly developing space.

Day One

The first day of AWE always starts with an opening keynote from event founder Ori Inbar. Inbar’s addresses are always insightful and digestible with good measures of his palpable enthusiasm and humor. During his opening keynote last year, Inbar spoke about how XR can help make both big dreams and small dreams become reality.

Next up is the Qualcomm keynote from Vice President and General Manager of XR Hugo Swart. At his keynote last year, Swart presented Snapdragon Spaces and introduced the first two companies to receive funding from Qualcomm Ventures’ metaverse fund.

Then, Nreal CEO Chi Xu takes the stage. Nreal hasn’t been a keynote presenter in the years that ARPost has covered AWE. But, the company is definitely going places. This year saw the commercial launch of Nreal Air (review) and we know that they have at least one more model waiting in the wings for the next big launch.

Day Two

Day two only has one proper keynote scheduled, this time with Magic Leap. Last year, the company’s Head of Product Management, Jade Meskill, took the stage to talk about the Magic Leap 2 and “augmented enterprise.” We don’t yet know what will come of this year’s keynote but it’s being given by the company’s CEO Peggy Johnson.

Following that is a “Fireside Chat” with Unity CEO John Riccitiello. That it’s a “fireside chat” and not a “keynote” arguably suggests that there won’t be any big product announcements but that doesn’t mean that this session shouldn’t be on your schedule.

Days two and three are lighter on heavy-hitting speakers to encourage attendees to check out the expo floor, which we’ll look at next. Don’t worry though, there are sessions to look forward to beyond just keynotes and we’ll look at some of those later.

The Expo Floor

It’s impossible to know exactly what will be going on on the expo floor, which is part of what makes it so exciting. A list of exhibitors (over 130 of them) and a map of the expo floor are posted on the AWE website, but what companies will be showcasing and how is a mystery until the floor opens on day two.

First off, a number of haptics pioneers will be there including Haptx, bHaptics, and SenseGlove. Any immersive technology is better when you experience it yourself instead of just seeing it on YouTube, but this is doubly true for haptics. But, unfortunately, many of these products are still hard for the average person to get their hands on. That makes the expo floor a great intro.

Mojo Vision will also be on the AWE expo floor. While this company isn’t likely to be putting their AR contact lenses onto the eyeballs of just anybody, they do have rigs that allow you to get a glimpse through what they’re building.

DigiLens, Vuzix, and Lenovo will also be on the AWE expo floor. These companies make components and enterprise hardware that’s usually a cut above available consumer models. Trying them out can be a glimpse into the future. I got my hands on some of their hardware at last year’s expo and left feeling enlightened.

Also, Tilt Five will be returning. Last year, their augmented game board was the life of the expo floor, drawing huge crowds – not just to interact with the product but to watch other people interact with the product.

Of course, that’s only a sliver of the total exhibitors. Personally, I’m hoping to reconnect with some of my friends from Avatour, Echo3D, FundamentalVR, Inworld AI, Leia Inc., Mytaverse, OVR Technology, VRdirect, and Zappar.

Expert Talks and Panel Discussions

Day One

On day one, right after the keynotes, many will likely stay in their seats to see Forbes columnist, author, and educator Charlie Fink talk with Magic Leap founder and former CEO Rony Abovitz about “How We Can Invigorate XR.” A few hours later on the same stage, Qualcomm Director of Product Management Steve Lukas will talk about “Building AR for Today.”

A little after that, one might head out of the Mission City Ballroom to Grand Ballroom C’s “Web3” track, where EndeavorXR founder and CEO Amy Peck will be debating “Pros & Cons of Web3” with XR Guild President Avi Bar-Zeev. It’s hard to find an XR organization that Peck isn’t or hasn’t been involved with, and Bar-Zeev co-created Google Earth and HoloLens.

From there, one might head back to the Mission City Ballroom for “Intersection of AI and the Metaverse: What’s Next?” a panel discussion with leading XR ethicist Kent Bye, HTC VIVE China President Alvin Graylin, WXR Fund Managing Partner Amy LaMeyer, and Creative Artist Agency’s Chief Metaverse Officer Joanna Popper.

But wait! Happening at the same time is “How XR Technology Is Changing the Fashion Landscape” with Beyond Creative Technologist David Robustelli, Ready Player Me co-founder Kaspar Tiri, and DressX co-founder Daria Shapovalova.

Depending on which of those last two talks you see, you might have time for “What Problem Does the Metaverse Solve?” with Nokia Head of Ecosystem and Trend Scouting Leslie Shannon.

If you miss the first fashion session, you can always catch “Redefining Fashion and Beauty’s Next Decade – From Virtual Beings and Gaming to Generative AI” with LVMH VP of Digital Innovation Nelly Mensah, 5th Column founder and CEO Akbar Hamid, and Journey founder and Chief Metaverse Officer Cathy Hackl.

Day Two

On the same day that the expo opens up, on the main stage, Paramount Pictures Futurist Ted Schilowitz presents “XR Excellence: Demonstration & Discussion” – billed as a collection of “what he thinks are the best experiences in VR and MR today, and what we can learn from those experiences” followed by Q&A.

But oh no! At the same time in Ballroom D, Khronos Group President Neil Trevett, XRSI founder and CEO Kavya Pearlman, and Moor Insights & Strategy Senior Analyst Anshel Sag are talking about building open standards for the metaverse!

Both of those events conflict with a “Meet the Makers” session featuring Julie Smithson and Karen Alexander of MetaVRse, Sophia Moshasha of the VR/AR Association, and Ben Erwin of The Polys Awards.

Later in the afternoon, Inworld AI’s Chief Creative Officer John Gaeta and Chief Product Officer Kylan Gibbs debut a new concept demo called “Origins” – a new kind of caper in which a human detective must navigate a world of generative AI bots.

The evening of AWE Day Two is also The Auggie Awards. We can’t tell you too much about the Auggie Awards because the finalists aren’t out. In fact, you still have until April 7 to submit nominees. Then, there’s a period of public voting until May 4. You can submit nominees and vote for your favorites here.

Day Three

On day three, in the “AI and Virtual Beings” track, producer, director, and strategist Rebecca Evans, Stanford University Graduate Research Fellow Eugy Han, Odeon Theatrical CEO Stephanie Riggs, and Dulce Dotcom advisor Dulce Baerga will discuss “Avatars, Environments & Self Expression – from Social VR to Cross-Reality Experiences.”

From there, you might head back to the Mission City Ballroom for a Fireside Chat with Tom Furness, the founder and chairman of the Virtual World Society – one of the oldest and noblest organizations in immersive tech.

AWE concludes on the afternoon of day three with Inbar’s closing statements and the Best In Show Awards on the main stage.

How to Attend AWE

Once again, all AWE recordings will become available on AWE.live. If you want to experience AWE in person, you still have time to get tickets. If you’re reading this before February 28, you still have time for Super Early Bird Tickets. You can also get 20% off of your ticket price by using discount code 23ARPOSTD at checkout.

And keep an eye on ARPost as AWE draws nearer. As a media partner of the event, we’ll be giving two free tickets to selected readers as part of an upcoming drawing. Watch our social media channels for details.

Qualcomm Partners with 7 Major Telecoms to Advance Smartphone-tethered AR Glasses

Qualcomm announced at Mobile World Congress (MWC) today it’s partnering with seven global telecommunication companies in preparation for the next generation of AR glasses, which are set to work directly with the user’s smartphone.

Partners include CMCC, Deutsche Telekom, KDDI Corporation, NTT QONOQ, T-Mobile, Telefonica, and Vodafone, which are said to currently be working with Qualcomm on new XR devices, experiences, and developer initiatives, including Qualcomm’s Snapdragon Spaces XR developer platform.

Qualcomm announced Snapdragon Spaces in late 2021, a software toolkit focused on performance and low-power devices that lets developers create head-worn AR experiences from the ground up or add head-worn AR to existing smartphone apps.

Qualcomm and Japan’s KDDI Corporation also announced a multi-year collaboration which it says will focus on the expansion of XR use cases and creation of a developer program in Japan.

Meanwhile, Qualcomm says OEMs are designing “a new wave of devices for operators and beyond” such as the newly unveiled Xiaomi Wireless AR Glass Discovery Edition, OPPO’s new Mixed Reality device and OnePlus 11 5G smartphone.

At least in Xiaomi’s case, its Wireless AR Glass headset streams data from compatible smartphones. By effectively offloading computation to the smartphone, the company’s 126g headset boasts wireless latency as low as 3ms between the smartphone and the glasses, and a full-link latency as low as 50ms, which is comparable to wired solutions.

“The Bear Who Touched the Northern Lights” Is a Charming AR Story Puzzle

When a polar bear sees the northern lights for the first time, he wants to reach out and touch them. How will he get there and who will he meet along the way? That’s up to you with this charming interactive AR story puzzle.

“The Bear Who Touched the Northern Lights” is a sort of choose-your-own-adventure AR story for children where the “chapters” are physical puzzle pieces. The artwork and story are by Julie Puech and Karl Kim.

The ways in which these pieces fit together (or don’t) help the AR story keep a logical narrative. However, pieces can be added, removed, or swapped out, resulting in multiple different possible tellings of the tale.

Of course, the adorable puzzle doesn’t tell the whole story. The puzzle pieces come to life with the help of a free AR mobile app for Apple and Android devices. The mobile app recognizes the pieces and animates their artwork, as well as cueing an audio narration by Kasey Miracle.

As a weary old XR veteran with a cold little heart, I sometimes find it helpful to recruit fresh eyes for product reviews – like when my younger brother provided his insights for my Nreal Air review. This time I recruited the help of my fiancée’s eight-year-old daughter.

What’s in the Box?

The puzzle comes with 15 AR story cards and an instructional booklet. The instructional booklet has information about the product, links to the app, and some advice for doing the puzzle for the first time – but don’t panic if you lose it. The puzzle information and a QR code to the app are both on the outside of the box and the first puzzle piece triggers an AR guide to using the app.

The free app, powered by Unity, opens with a quick warning about being aware of your surroundings while using AR and encourages you to supervise children when using the app. From there, the app only has a play button and a settings button. Settings include background dimming to make the animations stand out better, or an option to turn the animations off.

Do be aware that the app is 394 MB and does require a fairly modern device to run. Like any AR app, it requires the use of your camera while the app is running.

Following Directions

Some pieces have special icons on them. Cards with a blue “+” are optional chapters that don’t have to be included in the AR story. Cards with green and orange arrows can be swapped out for one another, changing how the story unfolds.

The play guide recommends that you remove the optional chapters and two of the interchangeable chapters the first time that the puzzle is constructed. This is presumably an introductory version of the puzzle to avoid throwing too much at first-time players.

As with any puzzle, it’s important to find a flat surface large enough for the puzzle when completed. The play guide recommends a space of two feet by three-and-a-half feet. The AR story puzzle is long and narrow in nature, particularly with all of the possible pieces in play, but has some curves in the overall shape so it isn’t just a straight line.

The AR instructions at the beginning of the puzzle remind you that you also need to have space to sit comfortably with the puzzle in front of you for about 20 minutes (give or take). After all, the play guide also recommends additional activities like asking the child to try to construct the story from the puzzle before watching the narration.

Putting the Pieces Together

The first time putting the puzzle together, we followed the play guide’s advice to remove extra pieces and one set of interchangeable chapters. The shapes of the pieces are similar enough to make it a little challenging for young hands to assemble without it being frustrating. They’re also different enough that the story can’t be constructed in an order that wouldn’t make sense.

It only took a few minutes to assemble the puzzle for the first time, and then we fired up the app. The AR instructions are short, cute, and very informative, telling us everything we needed to know without being boring. It takes the app a second or so to recognize the cards, so moving from one chapter to the next is neither seamlessly fast nor frustratingly slow.

The animations are cute and colorful, and the effects are simply but beautifully done. The default background dimming on the app is 35%, and it certainly worked. Turning it up can make the background disappear completely, which makes for optimum viewing quality, but also makes it harder to find the pieces in the camera. Pick what setting you like best.

At one point in the story, the bear starts receiving items for his journey. The Child got to choose which items he used when, but only one item was ever needed in the story, and selecting the wrong item isn’t penalized – you just pick again. We were split on this. It’s nice that we couldn’t pick wrong, but picking at all felt kind of unnecessary. (This made more sense later on.)

We reached the end of the AR story. Sort of. Immediately upon finishing the puzzle and the story the first time, The Child asked to do the puzzle again with the extra chapters.

Putting the Pieces Together Again

We added the two optional AR story pieces, swapped out both of the interchangeable pieces, and put the puzzle together again. Suddenly, the choices made a much bigger difference and a lot more sense.

The interchangeable pieces provide the bear with a different item and see him use it in a different way. The additional chapters introduce new characters, which the bear befriends by using the different items. This gave The Child a new appreciation for the AR story, but it gave me a new appreciation for the AR app.

Doing the puzzle the first time, one would be forgiven for assuming that the chapters are stand-alone pieces that don’t affect one another. Doing the puzzle again makes it clear that the app is telling a new story each time based on the pieces, their placement, and your choices throughout the story.

We’ve only done the puzzle those two times so far. I haven’t done the math to figure out how many different versions of the story are possible with different choices, pieces, and arrangements, but I know that there are a lot of versions of the story that we have yet to hear.

And that’s a good thing. As soon as we finished doing the puzzle the second time, The Child immediately asked if there were any more AR story puzzles like this one.

Where to Find the AR Story Puzzle

So far, The Bear Who Touched the Northern Lights is the only product by Red+Blue Stories (but we’re hopeful for more). The company is based in Canada but also ships to the US. Prices start at around US$34, but you can pay more for different shipping options. As of this writing, the AR story puzzle is not available on other online retailers like Amazon.

The AR instructions say that a child can use the product by themselves after the first go-around. That may be true, but if you’re letting your child construct this AR story puzzle without you, you’re missing out.

How XR Fan Engagement Brings Fans Closer to the Game

Over the years, ARPost has covered the physical nature of XR in athletics and sports a number of times – from how athletes use XR to improve their game, to how gamers can use VR sports to stay fit, to how thrilling and active a good AR team game can be for players and spectators alike. XR is also increasingly being used in another capacity: fan engagement.

Is AR the Future of Fan Engagement?

Athletes are usually sports fans, but are sports fans usually athletes? This article isn’t about how XR can make a sports viewer into a finely-tuned machine, or how a sports viewer can become a star in their own right through things like esports. After all, not all sports fans want to do those sorts of things.

However, it’s probably fair to say that all sports fans want to feel closer to the athletes and teams that they follow. That doesn’t mean getting onto the field, but it might mean getting out of the stands. Sports teams and property managers are increasingly using XR for sports fan engagement to let fans get closer to their passion, if not closer to the action.

In-Arena Opportunities for CBJ Fans

In January, the NHL’s Columbus Blue Jackets unveiled “The Fan Zone” at their home arena, Nationwide Arena, in partnership with MVP Interactive. Followers of ARPost might remember that MVP Interactive also made appearances in our 2021 article about how and why brand engagement is driving XR development.

“The Blue Jackets are one of the few sports organizations taking the lead to bring fans the latest in cutting-edge technology with first-ever immersive experiences to their arena,” MVP Interactive CEO James Giglio said in a release shared with ARPost. “Our team was honored to work with everyone at CBJ to bring technology forward with multi-generational experiences to their Fan Zone.”

Slapshot Challenge 3 - The Columbus Blue Jackets - fan engagement

The 4,000 square-foot space overlooks the team’s practice area and includes a number of XR experiences, as well as the eSports Lounge for CBJ Gaming, the team’s official esports arm. As exciting a development as esports is in the general gaming world, we’re most interested in the XR fan engagement activations.

“With the upgraded space and technology advancement of our new Fan Zone, we hope to provide a world-class experience for fans of all ages,” Blue Jackets Vice President of Marketing Ryan Chenault said in the release.

XR in the Fan Zone

In the “Slapshot Challenge” fan engagement activation, fans choose from three game modes, including “Shots on Goalie,” which pits their skills against a virtual goaltender. The fan shoots a real ball with a real stick while sensors track their movements to replicate an on-ice experience in a space reminiscent of the Cave VR system.

Slapshot Challenge - AR fan engagement - The Columbus Blue Jackets

The “Goalie Challenge” flips the scenario, both figuratively and physically. In full goalie gear, the fan now faces the screen where a virtual contender appears to launch physical balls their way. While the goalie in the slapshot challenge is entirely automated, the placement of balls fired off in the goalie challenge can be controlled by a friend via a computer interface.

“The Blue Jackets are dedicated to removing barriers to the game of hockey and investment in this space is a meaningful nod to this mission,” said Chenault. “By providing both stick-in-hand and controller-in-hand activations, we can give fans an opportunity to not only watch the game but experience it first-hand.” 

Slapshot Challenge 2 - The Columbus Blue Jackets - fan engagement

There are less intense fan engagement opportunities as well. A “Pose with a Pro MorphingStation” gives fans an opportunity to take a selfie next to a virtual replica of their favorite Blue Jackets players. A similar activation allows fans to pose in a virtual Blue Jackets jersey. All of these activations reward fans with videos and images optimized for social media.

Pose with a pro - The Columbus Blue Jackets - XR fan engagement

Implementation and Stats

On entering the Fan Zone, fans have the opportunity to check in by scanning a QR code and providing an email address to receive their videos and photographs. According to figures provided to ARPost following the launch of the activation, over 1,200 fans entered the Fan Zone on opening night and 375 provided emails to receive their digital mementos.

Further, the “average dwell time across experiences was 24.55 seconds.” That may not seem like a long time, but it is an average across all of the fan engagement experiences; the challenges likely engaged fans for significantly longer than the AR photo opportunities did.
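For context, those two figures work out to roughly a 31% email opt-in rate on opening night. A quick calculation using only the numbers reported to ARPost:

# Opening-night figures reported to ARPost for the CBJ Fan Zone.
fans_entered = 1200
emails_provided = 375
avg_dwell_seconds = 24.55

opt_in_rate = emails_provided / fans_entered
print(f"Email opt-in rate: {opt_in_rate:.1%}")       # 31.2%
print(f"Average dwell time: {avg_dwell_seconds} s")  # averaged across all experiences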

NIL in AR

The “Pose with a Pro” fan activation presented by the Blue Jackets shows that there is a lot of promise in sports fan engagement built around virtual replicas of fans’ favorite athletes. That isn’t limited to professional sports, however.

College sports are tremendously popular, but for most of their history, their athletes were to some degree barred from benefiting from that popularity. That’s because college athletes were largely prevented from profiting from their name, image, and likeness (NIL) by the NCAA – the organization that governs college sports.

However, in 2021, the NCAA began loosening NIL rules, opening up potentially lucrative opportunities for college athletes. AR publishing platform LDP Studio claims to be part of the first “NILAR” (name, image, and likeness in augmented reality) agreement. The signee? University of Tennessee senior tight end Jacob Warren, in a deal with the Craven Wings restaurant chain.

“We believe AR Hero will change the way college football fans experience the game by engaging more people with the players they know and love,” LDP Studio VP of business development Jessee Black said in a release shared with ARPost. “It’s a really cool and futuristic new concept for QR code use which increases engagement for businesses and brings fun to the fans.” 

NILAR Jacob Warren - fan engagement

AR Hero, the tool that runs the experience, invites users to trigger the fan engagement activation via a QR code. From there, fans can take photos and videos with an AR version of Warren, who cycles through different poses, giving fans plenty of photo opportunities.
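Mechanically, a QR-triggered activation like this is just an encoded link that opens a web-based AR experience in the phone’s browser. As a minimal sketch (the URL and file name below are hypothetical placeholders, not LDP Studio’s actual endpoints), generating such a code takes only a few lines with the third-party qrcode package:

# Minimal sketch of a QR-triggered web AR entry point.
# Requires the third-party package: pip install "qrcode[pil]"
# The URL below is a hypothetical placeholder, not LDP Studio's real endpoint.
import qrcode

experience_url = "https://example.com/ar-hero/jacob-warren"  # hypothetical
img = qrcode.make(experience_url)  # returns a PIL image of the QR code
img.save("ar_hero_qr.png")         # print or display this where fans will scan it

Scanning the printed code simply opens the experience URL, where the AR content itself is served.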

“With AR Hero, fans can feel like they are part of the action and experience the players they know and love in a whole new way,” said Black. “Businesses have the opportunity to create more engagement with fans through ‘NILAR’ as well.”

The First NILAR Agreement?

It’s easy to be skeptical of whether this fan engagement initiative is really the first NILAR agreement. It is very probably the first NILAR agreement in college sports, and it just might be the first of its kind anywhere, as LDP Studio claims.

Digital twins of celebrities aren’t brand new. However, the ownership of these twins has long been problematic: the owner of a digital twin is usually the studio that commissioned it, rather than the individual it was modeled on.

NILAR agreements with athletes, as well as with other individuals, have huge potential to give people more control over their own digital twins. That’s a big win for those individuals from an economic standpoint, but it’s also a good idea from an ethical perspective.

Getting Sports Fans Out of Their Seats

With good AR fan engagement, everybody wins. Fans get more interactive ways to engage with their favorite content and athletes. Athletes can have an AR proxy that’s available to fans while they’re busy training, on the field, or at home. Teams get new ways to bring fans deeper into the sports that they love (and, yes, collect some much-cherished user data).

The good news keeps getting better. XR fan engagement activations are becoming simpler to use, more interactive, and are even being created in ways that are more mindful of the humans that lend their digital duplicates to these activations.


apple-ar-glasses-put-on-hold-to-make-way-for-mr-glasses-–-vr-and-metaverse-expert-weighs-in

Apple AR Glasses Put on Hold to Make Way for MR Glasses – VR and Metaverse Expert Weighs In

Has Apple bitten off more than it can chew? It appears that the long-awaited AR glasses won’t be hitting the shelves any time soon. A recently published Bloomberg article says that the Apple AR glasses are facing technical challenges, so their release has been delayed indefinitely and the project’s scope pared back. The report also revealed that Apple may instead opt to release a more affordable mixed reality headset.

Emma Ridderstad, CEO and Co-founder of Warpin Reality, shares her insights on the delayed release of the Apple AR glasses and the development of Apple’s mixed reality headset, expected to be called Reality Pro. She also shares her thoughts on what these developments mean for the industry, for consumers, and for the future of AR/VR.

Apple AR Glasses Shelved to Make Way for an MR Headset

For a couple of years now, Apple has been developing AR glasses that resemble real eyeglasses. The design has already gone through several iterations but still, apparently, fails to meet expectations. While it is unclear where the real problem lies, it is clear that we won’t be seeing through the Apple AR glasses this year.

According to Bloomberg, what we may see soon is an MR headset that combines virtual and augmented reality elements. The report says Apple is shifting its focus toward developing a bulkier but less complicated MR headset with a projected price tag of $3,000. The company then plans to follow this with a more affordable version priced at around $1,500 – closer to the Meta Quest Pro’s price, though still higher.

A Wise Move by Apple

When asked whether the delay of the Apple AR glasses will affect businesses that have already adopted the technology, Ridderstad said she believes it will have little impact. Only a limited number of businesses currently use Apple’s AR technology, and those that have adopted it are not fully reliant on it.

According to Ridderstad, AR/VR technology is still in its infancy. As immersive as these headsets are, they aren’t very convenient. The use cases are still quite limited, and the high cost of both hardware and software can be restrictive. “VR headsets need to become useful to people. Right now, they solve business-to-business problems but they’re still mostly just fun for the end consumer,” Ridderstad explained. So, Apple’s shift from AR glasses to MR headsets makes sense given the broader need to make immersive technology more accessible and affordable.

Ridderstad also believes that Apple will remain a key player in the industry, despite the delays to its AR glasses. Consumers continue to trust Apple to produce well-researched, well-designed products. Considering the price, design, and content of today’s headsets, the market needs more affordable and functional options. “Since most people are just starting to see what these new technologies can do, we have to remind ourselves that this evolution is going to take time,” she said. “The real end consumer adoption will probably happen with Apple this time too.”

The True Value of XR Goes Beyond Gaming and Entertainment

XR technology has long been associated with gaming. But Ridderstad argues that the true value of XR lies in its potential in business, training, and education.

Her company, Warpin Reality, has developed a platform called Xelevate, which allows companies to launch customizable VR training courses for their employees. These courses range from safety drills to customer experience simulations and personality development workshops. Platforms like this have allowed construction companies to train their people on safety and equipment use and taught employees what to do during emergencies.

Ridderstad believes that VR/AR can optimize focus, learning, and training. She cites a PwC study which found that VR learners are more focused, learn more quickly, and are more emotionally engaged than e-learners. The technology could also create remote work opportunities for people who struggle with in-person demands, such as people with disabilities.

Diversity and Accessibility in Tech 

For years, the tech industry has been known to be a boys’ club. This still remains true in the metaverse. A McKinsey report found that in organizations shaping metaverse standards, 90% of leadership roles are held by men. Ridderstad warns, “The metaverse is not going to be an environment that people want to be in unless everyone feels welcome and comfortable. I think it is safe to say that unless women play their part in building the metaverse, and take their place among its architects, it won’t be.”

These technologies have the potential to revolutionize the future, so it’s important that they are designed for both men and women if they are to see a higher level of adoption.
