
Magic Leap Shakes up Leadership with New CEO

Magic Leap, maker of one of the best AR headsets on the market, is making a major change to its leadership with a new CEO who will face the challenge of carving out territory for its transparent AR technology against a growing wave of passthrough AR headsets.

After a meteoric rise and then near catastrophic collapse under its original founder Rony Abovitz, Magic Leap brought on Peggy Johnson to stabilize the company, manage its pivot to enterprise, and launch the Magic Leap 2. Three years later, Johnson is out and a new CEO is taking over.

Magic Leap has announced that Ross Rosenberg, an experienced tech executive who has worked in senior roles at a number of large-scale enterprise technology companies, will take up the position.

From the announcement, and its description of Rosenberg’s prior work, it seems clear that Magic Leap is hoping the new CEO will be able to guide it toward increased (or perhaps, initial) profitability.

But Rosenberg’s tenure will inevitably be about more than just streamlining operations and finding the right product-market fit; he’ll also need to both grow and defend the company’s turf as newer headsets focus on passthrough AR capabilities—the likes of Quest 3 and Vision Pro.

While neither headset is directly competing against Magic Leap’s enterprise-focused transparent AR headset, Rosenberg will surely be looking a few years down the road at which point passthrough AR headsets could begin to approach the size and real-life visual quality that is currently Magic Leap’s advantage.

The company hasn’t yet hinted at an upcoming Magic Leap 3 headset, though with the current Magic Leap 2 only being out for a little over a year at this point, that could well still be brewing.

At least from the outside, it looks like the company had an amicable split with the former CEO, Peggy Johnson, though it isn’t clear which side compelled the change.

“Having accomplished so much of what I set out to do at Magic Leap, I felt the time had come to transition leadership to a new CEO who can guide the company through its next period of growth,” Johnson said in the announcement. “I’m incredibly proud of the leadership team we’ve built at Magic Leap and want to sincerely thank all of the employees for their work in helping to successfully reorient the company to the enterprise market.”



Unicorn AR Startup Magic Leap is Killing Its First Headset Next Year

Magic Leap 1, the AR headset that helped the Plantation, Florida-based startup attract over three billion dollars in funding, will be completely defunct by late next year.

The company announced this week that Magic Leap 1’s cloud services are due to be shut off on December 31st, 2024. After that date, the headset will receive no further support.

The loss of cloud services indeed means the headset will essentially be ‘bricked’ on that date, as “core functionality will reach end-of-life and the Magic Leap 1 device and apps will cease to function,” the company says in a recent FAQ.

Image courtesy Magic Leap

On September 29th, 2023, a number of developer resources are being pulled too, including the forum dedicated to Magic Leap 1 as well as the headset’s dedicated Discord channel.

Here’s the full statement from Magic Leap, courtesy of a comment made by Reddit user ‘The Golden Leaper’.

Today we are announcing that Magic Leap 1 end of life date will be December 31, 2024. Magic Leap 1 is no longer available for purchase, but will continue to be supported through December 31, 2024 as follows:

Magic Leap 1 Developer Forum: On September 29th, 2023, Magic Leap 1 Developer Forum (https://ml1-developer.magicleap.com/) will no longer be available. Please refer to the Magic Leap 2 Developer Forum for current information and updates on the Magic Leap platform.

Magic Leap Discord channel: On September 29th, 2023, Magic Leap Discord channel will no longer be available. Please refer to the Magic Leap 2 Developer Forum for current information and updates on the Magic Leap platform.

OS Updates: Magic Leap will only address outages that impact core functionality (as determined by Magic Leap) until December 31, 2024.

Customer Care (http://magicleap.com/contactus) will continue to offer Magic Leap 1 product troubleshooting assistance through December 31, 2024.

Warranties: Magic Leap will continue to honor valid warranty claims under the Magic Leap 1 Warranty Policy available here (https://www.magicleap.com/ml1/warranty-policies).

Though Magic Leap stopped selling ML 1 in mid-2022, brand new units, including the headset’s hip-worn compute unit and single controller, were being liquidated for the barn-burner price of $550 via Amazon subsidiary Woot. It’s uncertain how many developers and enterprise users will be affected by the shutdown; however, they have a little over a year to figure out a replacement strategy.

Launched in 2018, Magic Leap straddled an uneasy rift between enterprise and prosumers with ML 1 (known then as ‘ML One’), which earned a lukewarm reception, mostly thanks to its $2,300 price tag and relatively narrow differentiation from Microsoft HoloLens. Despite Magic Leap’s best efforts, it simply wasn’t the consumer device the company had wanted to make from the outset.

A leadership shuffle in mid-2020 saw co-founder and CEO Rony Abovitz step down, tapping former Microsoft exec Peggy Johnson to take the reins and immediately pivot to target enterprise with its most recent AR headset, Magic Leap 2.



XR Industry Giants Team up to Save Key Developer Tool

Microsoft, Qualcomm and Magic Leap announced a partnership to “guide the evolution” of the Mixed Reality Toolkit (MRTK), a cross-platform AR/VR development framework which has now gone open-source.

MRTK was a Microsoft-driven project that provided a set of components and features used to accelerate cross-platform XR app development in the Unity game engine. The developing team behind MRTK was unfortunately disbanded, as Microsoft cut both MRTK and the AltspaceVR teams earlier this year in a wide-reaching round of layoffs.

Still, now that MRTK is open source, Microsoft is joining XR industry cohorts Qualcomm and Magic Leap to form an independent organization within GitHub that aspires to transform the software into a “true multi-platform toolkit that enables greater third-party collaboration.”

“With Magic Leap joining Microsoft as equal stakeholders on the MRTK steering committee, we hope to enrich the current ecosystem and help our developer community create richer, more immersive experiences that span platforms,” Magic Leap says in a blogpost. “Additionally, our support for MRTK3 will allow for simple porting of MRTK3 apps from other OpenXR devices to our platform.”

MRTK3 already supports a wide range of platforms, either full or experimentally, including OpenXR devices like Microsoft HoloLens 2, Meta Quest, Windows Mixed Reality, SteamVR, Oculus Rift (on OpenXR), Lenovo ThinkReality A3, as well as Windows Traditional desktop. The committee says more devices are “coming soon,” one of which will likely be the Magic Leap 2 AR headset.

Meanwhile, Microsoft announced MRTK3 is on track to reach general availability to developers on the second week of September 2023. To learn more, check out Microsoft’s MRTK3 hub, which includes support info, tutorials, and more.



Unveiling the Future of Driving: Mercedes-Benz Vision One-Eleven Concept Car Uses Magic Leap 2

The German luxury automaker Mercedes-Benz recently introduced its Vision One-Eleven concept car. On top of embracing sustainability with its electric motor and a dynamic redesign, the Vision One-Eleven uses Magic Leap 2 AR glasses for a more immersive in-car experience.

This approach reflects Mercedes-Benz’s commitment to creating better cars that provide the best possible driving experience to consumers while accommodating concerns about sustainable driving and introducing new tech. By partnering with Mercedes-Benz, Magic Leap also takes another step towards making AR experiences a part of everyday life.

Vision One-Eleven: A New Twist on an Old Classic

The Vision One-Eleven is a revisited concept car built on another beloved Mercedes-Benz classic, the C 111. The C 111 concept car incorporated iconic gullwing doors for a truly one-of-a-kind design in its day. Combined with its modern interiors, it proved to be an appealing concept car that influenced modern luxury vehicles.

Vision One-Eleven concept car

With the Vision One-Eleven, Mercedes-Benz further improves on the characteristics that set the C 111 apart, blending luxury interiors with intelligent design for a truly futuristic car. A sports vehicle with a lounge interior and a sleek body, the Vision One-Eleven is an exciting peek at what the cars of the future may look like—from looks all the way to its electric motor.

The Capabilities of AR Glasses on the Road

Aside from visual and engineering overhauls, Vision One-Eleven also incorporates another rapidly growing technology: augmented reality. Since the adoption of full AR experiences has been slow in the larger market, XR companies like Magic Leap pivoted to a slower but steadier approach by bringing tech like the Magic Leap 2 into specific industries.

Drivers often have to manage a large amount of information to navigate and keep safe on the road. With the integration of technology such as built-in navigation or car sensors, drivers can rely on various tools that can help improve their driving efficiency.

This isn’t just progress for the sake of progress either: the introduction of AR technologies to drivers has plenty of benefits, from reducing the cognitive load to helping them navigate hazardous driving conditions.

While these applications have yet to be fully adopted by the market, the partnership between Mercedes-Benz and Magic Leap shows that this is an avenue both AR companies and car manufacturers can benefit from.

An Augmented and Seamless Driving Experience

Specific details about how Magic Leap 2 will integrate with Vision One-Eleven’s driving systems have yet to be released. Still, the goal is to create a configurable, immersive AR interface between the driver and their vehicle. This interface can display information about driving conditions on-demand, from the selected drive mode to information about the driver’s destination and current location.

Vision One-Eleven concept car and Magic Leap AR glasses

With Magic Leap 2, this system transforms the conventional dashboards of cars into a dynamic cockpit where drivers can fully use their field of vision to navigate the roads better. This drastically helps improve both the driving experience and car safety for car owners, passengers, and passersby—while also implementing an intelligent driving model that may potentially reinvent the way people drive.

A Partnership Built On Innovation

The Vision One-Eleven isn’t the first collaboration between Magic Leap and Mercedes-Benz: the two companies worked together in 2019 for the Mercedes Immersive Roadshow. While Magic Leap’s role in that collaboration was to enrich the viewing experience by augmenting the visual aesthetic of the exhibit, their new collaboration on the Vision One-Eleven shows Mercedes-Benz’s confidence in the potential of AR experiences.

Given the increasing entry rate of other competitors into the AR market, Mercedes-Benz and Magic Leap have secured themselves a lead over the competition when introducing AR into the driving experience. Whether they can hold on to this head start is something else altogether—but for now, the Vision One-Eleven holds the spotlight as a blend of technology and good car design.

What’s Next?

The Mercedes-Benz Vision One-Eleven, like most concept cars, is unlikely to be produced in its current form. However, its design, technology, and engineering innovations will undoubtedly be integrated into future Mercedes-Benz production vehicles. And it’s pretty certain that XR technology will find its place in those vehicles.

According to Mercedes-Benz, “The spatial user interface is a beacon for a Mercedes-Benz user experience that is unencumbered by technology. It is part of a wider vision that looks towards extended reality, whereby technology and hardware cease to be the focal point; instead becoming fully integrated and seamless facilitators of user needs and wishes.”

As for Magic Leap 2, the company shows no signs of slowing down with potential partnerships with established brands. Some of its latest potential forays include a partnership with Audi, as well as early talks with tech giant Meta, perhaps looking to expand towards more consumers in the AR space.

As for the future of AR driving? It’s difficult to tell, but one thing’s certain: everyone will be in for an interesting ride.



AWE USA 2023 Day Two: More Keynotes, More Panels, and the Open Expo Floor

The second day of AWE is the day that the expo floor opens. That is always thrilling, and we’ll get there, but first – more keynotes and conversations.

AWE Day Two Keynotes

Day One kickstarted the keynotes, but AWE Day Two saw exciting presentations and announcements from Magic Leap and Niantic. Both affirmed a theme from the day before: meaningful XR is already here.

Magic Leap: Let’s Get to Work

“The vision of AR that some legacy tech companies are promising is still years out,” Magic Leap CEO Peggy Johnson said in her keynote, suggesting that her company’s technology, by contrast, is not years or months or days out. “The small team at Magic Leap has made something that many larger companies are still struggling to achieve.”

Peggy Johnson, Magic Leap's CEO AWE Day 2
Peggy Johnson

Johnson also continued another theme from AWE Day One: AI and XR aren’t in competition – they help each other. Inbar’s opening talk included a line that quickly became a motto for almost the whole event: “XR is the interface for AI.”

“I honestly believe AR systems are going to become the endpoints for a lot of AI,” said Johnson. “The ability to provide contextual input and get contextual output will really be a game changer.”

Magic Leap’s big announcement wasn’t to do with AI, but it will still be thrilling to developers: an Unreal Engine plugin is coming in August.

“AR Everywhere” With Niantic

While enterprise companies and hardware manufacturers are still struggling with adoption to some degree, few companies have done as much for AR consumer adoption as Niantic.

Brian McClendon Niantic Labs AWE Day 2
Brian McClendon

In his AWE keynote, “Empowering AR Everywhere”, Niantic Senior Vice President of Engineering, Brian McClendon, laid out a number of major updates coming to the company – as well as coming to or through 8th Wall.

First, ARDK 3.0 will allow developers using Niantic tools to also use outside AR asset libraries. It will also enable a QR code-triggered “lobby system” for multi-user shared AR experiences. The updated ARDK will enter a beta phase later this month. A new maps SDK compatible with Unity is also coming to 8th Wall.

Further, 8th Wall’s “Metaversal Deployment” announced at AWE 2021 is now compatible with mixed reality via Quest 2, Quest Pro, “and probably all future MR headsets.”

Big Picture Panel Discussions

One of the things that really makes AWE special is its ability to bring together the industry’s big thinkers. A number of insightful panel discussions from Day Two explored some of the biggest topics in XR today.

XR’s Inflection Point

The panel discussion “How Immersive Storytelling Can Deepen Human Understanding of Critical Issues” brought together Unity CEO John Riccitiello, journalist Ashlan Cousteau, and TRIPP CEO and co-founder Nanea Reeves. The talk included further affirmations that, contrary to some media pieces, XR as an industry is thriving.

John Riccitiello, Ashlan Cousteau, Nanea Reeves - AWE Day 2
From left to right: John Riccitiello, Ashlan Cousteau, and Nanea Reeves

“I now cancel what I said seven years ago about this not being a good time to build a business in this space,” said Riccitiello. “We’re at a time right now where it makes a lot of sense to look forward with optimism around XR. … Companies are born around technology transitions.”

Reeves echoed the sentiment, but included some of the cautious caveats expressed by XR ethicist Kent Bye during a panel discussion yesterday.

“We’re at such an interesting point of technology and the evolution of it, especially with AI and XR,” said Reeves. “What’s the next level of storytelling and what should we be aware of as we bring AI into it?”

Building Open Standards for the Metaverse

The good news is that the metaverse isn’t dead. The bad news is that it arguably hasn’t been born yet either. One of the most important features of the metaverse is also one of its most elusive.

It was also the crux of a panel discussion bringing together XR Safety Initiative founder and CEO Kavya Pearlman, XRSI Advisor Elizabeth Rothman, and Khronos Group President Neil Trevett, moderated by Moor Insights and Strategy Senior Analyst Anshel Sag.

Kavya Pearlman, Neil Trevett, Elizabeth Rothman, and Anshel Sag - AWE 2023 Day 2
From left to right: Kavya Pearlman, Neil Trevett, Elizabeth Rothman, and Anshel Sag

“Whichever way you come to the metaverse, you need interoperability,” said Trevett. “It’s foundational.”

The panel also addressed the lasting and fleeting effects of the wave of attention that has seemingly passed over the metaverse.

“We go through these hype cycles and bubbles,” said Rothman. “There are always technological innovations that come out of them.”

The panel also addressed AI, an overarching theme of the conference. However, the panel brought up one concern with the technology that had not been addressed elsewhere.

“This convergence has a way more visceral impact on children’s brains even than social media,” said Pearlman.

So far, the “solution” to this problem has been for content publishers to age-restrict experiences. However, this approach has crucial shortcomings. First, most approaches to age restrictions aren’t foolproof. Second, when they are, this measure excludes young users rather than protecting them.

“We run the risk of regulating children right out of the metaverse,” said Rothman. “We need to strike a balance.”

Hitting the AWE Floor

I first started covering AWE during the pandemic when the entire conference was virtual. AWE is a lot more fun in-person but, practically speaking, the demos are the only component that can’t really happen remotely.

Meeting Wol

I actually met Wol in the Niantic Lounge before the very first session on Day One. While this is where he fits best in this write-up, Wol was possibly my first impression of AWE. And it was a good one. But wait, who’s Wol?

Niantic Lounge AWE 2023
Niantic Lounge

Wol is a collaboration between 8th Wall, Liquid City, and InWorld AI. He’s an artificially intelligent character virtually embodied as an owl. His only job is to educate people about the Redwood Forest but he’s also passionate about mushrooms, fairies, and, well, you just have to meet him.

“Wol has a lot of personal knowledge about his own life, and he can talk to you about the forest through his own experience,” explained Liquid City Director Keiichi Matsuda. “Ultimately, Wol has a mind of its own and we can only provide parameters for it.”

Wol

I met Wol through the Quest Pro in passthrough AR via a portal that appeared in the room directly into the Redwoods – and, now that I think about it, this was the day before Niantic announced that 8th Wall supported Quest Pro MR. In any case, the whole experience was magical, and I can’t wait to get home and show it to the family.

Visiting Orlando via Santa Clara

Largely thanks to a group called the Orlando Economic Partnership, Orlando is quickly becoming a global epicenter of metaverse development. Just one of their many initiatives is an 800-square-mile virtual twin of the Orlando area. The digital twin has its own in-person viewing room in Orlando but it also exists in a more bite-size iteration that runs on a Quest 2.

“The idea was to showcase the entire region – all of its assets in terms of data points that we could present,” explained the OEP’s Director of Marketing and Communications Justin Braun. “It’s going to become a platform for the city to build on.”

The version I was able to see at AWE featured photorealistic 3D models of Orlando landmarks, complete with informational slides and quiz questions. The full version, which took 11 months to build, is a lot more fully featured. It just doesn’t fit in Braun’s backpack.

“At some point, this will be able to do things that are beneficial for the city and its utilities, like show power outages,” said the OEP’s Chief Information Officer David Adelson. “It’s community-driven.”

Gathering Around the Campfire

I opened by saying that demos can’t be done remotely. I remotely demoed Campfire recently, but that was their desktop view. Campfire also offers tabletop and room-scale 3D interactions that require the company’s custom-made headset and markers. I got to try these solutions out hands-on when I reconnected with CEO and co-founder Jay Wright on the AWE floor.

campfire at AWE 2023 Day 2
Campfire at AWE USA 2023

“The perception system is designed to do one thing very well, and that’s to make multi-user AR as high-fidelity as desktop,” said Wright. And they’ve done it.

Models and mockups that I viewed in mixed reality using Campfire’s hardware were beautifully rendered. The internet connectivity at AWE is notoriously spotty and, while the controller disconnected a few times, the display never skipped a beat.

Wright demonstrated the visor that switches Campfire from MR to VR on a virtually reconstructed art museum that I could view from above in a “dollhouse mode” or travel through in a 1:1 model. In addition to showcasing more hardware and software ease-of-use, it might have been the most graphically impressive showcase I’ve seen from XR hardware ever.

The Lenovo VRX

With Lenovo ThinkReality’s new headset announced the day before AWE started, this might be the record for the shortest passage of time between a headset releasing and my putting it on – and it’s all thanks to ARPost’s longtime Lenovo contact Bill Adams.

“We think we have one of the best passthrough headsets and most comfortable headsets in the industry,” said Adams, who made a gentleman’s wager that I would (finally) be able to see my notes through the Lenovo VRX.

I couldn’t read my writing, but I could tell where the writing was on the page – which, honestly, is enough. Having tried the same experiment on the Quest Pro earlier that day, I can back up what Adams said about the headset’s passthrough quality.

As for comfort, ditto. The headset features a removable overhead strap, but it was so comfortable that I forgot that the strap was there anyway. Switching from VR to passthrough is a simple button press.

Catching Up With Snap

The average user can have a great AR experience with just a phone, and the average creator can make a really advanced experience without creating their own app, according to Snap Senior Product Communications Manager Cassie Bumgarner.

Snap AR at AWE 2023
Snap at AWE 2023

“There’s a lot of chatter on the hardware front, but what we want to show is that there’s so much more left to unlock on the mobile front,” said Bumgarner.

A Snap Lens made with QReal uses AI to identify LEGO bricks in a tub. A quick scan, and the lens recommends small models that can be made with the available pieces. Bumgarner and I still get the fun of digging out the pieces and assembling them, and then the app creates a virtual LEGO set to match our creation—in this case, a bathtub to go with the duck we made.

Snap bricks AWE 2023 Day 2

Of course, Snap has hardware too. On display at AWE, the company showed off the virtual try-on mirrors debuted at the Snap Partner Summit that took place in April.

One More Day of AWE

Two days down and there’s still so much to look forward to from AWE. The expo floor is still open tomorrow. There are no more keynotes, but that just means that there’s more time for panel discussions and insightful conversations. And don’t think we forgot about the Auggies. While most of the Auggies were awarded last evening, there are still three to be awarded.



Report: Meta in Talks with Magic Leap for Multiyear AR Headset Tech Deal

A report from the Financial Times maintains Meta is currently in talks with AR headset creator Magic Leap to strike a multiyear deal, which could include intellectual property licensing and contract manufacturing of AR headsets in North America.

The AR unicorn is said to possess valuable IP regarding custom components, including its optics, waveguides, and software.

It’s said a potential deal may also allow Meta to lessen its reliance on China for component manufacturing. In 2019, Magic Leap partnered with manufacturing solutions company Jabil to build a plant in Guadalajara, Mexico, which the report maintains can assemble headsets in “the tens of thousands a year.”

Magic Leap 2 | Photo by Road to VR

Citing people with knowledge of the talks, the report maintains, however, that a specific joint Meta-Magic Leap headset isn’t expected.

While neither company commented directly on a potential partnership, Magic Leap said this to the Financial Times:

“Given the complexities of developing true augmented reality technologies and the intricacies involved with manufacturing these optics, as well as the issues many companies experience with overseas supply chain dependencies, we have entered into several non-exclusive IP licensing and manufacturing partnerships with companies looking to enter the AR market or expand their current position.”

Since it exited stealth in 2014, Magic Leap has released two AR headsets, Magic Leap 1 and Magic Leap 2, which have been compared in functionality to Microsoft’s HoloLens AR headsets.

The company has raised over $4 billion, with minority investors including Google, Alibaba, Qualcomm, AT&T, and Axel Springer. Its majority stakeholder is Saudi Arabia’s state-owned sovereign wealth fund.

In addition to the Quest Pro mixed reality headset, Meta has confirmed it’s currently working on its next iteration of Quest, likely Quest 3, as well as its own AR glasses. Meta started real-world testing of Project Aria in 2020, a platform for training its AR perception systems and assessing public perception of the technology.



Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cut down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution of XR headsets and field-of-view increases, the power needed to render complex scenes grows quickly.
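As a rough illustration of the idea, the gaze-dependent resolution falloff described above can be sketched as a function that picks a shading tier from angular distance to the gaze point. The 5° and 20° thresholds and the tier values here are illustrative assumptions, not figures from any particular headset:

```python
import math

def shading_tier(gaze_dir, pixel_dir):
    """Pick a render-resolution tier from the angle between the gaze
    direction and the direction toward a region of the display.

    Both arguments are unit 3D vectors. Thresholds are illustrative.
    """
    dot = max(-1.0, min(1.0, sum(g * p for g, p in zip(gaze_dir, pixel_dir))))
    eccentricity = math.degrees(math.acos(dot))  # degrees from gaze center
    if eccentricity < 5.0:      # roughly the foveal region
        return 1.0              # full resolution
    elif eccentricity < 20.0:   # near periphery
        return 0.5              # half resolution
    else:                       # far periphery
        return 0.25             # quarter resolution
```

A real renderer would apply this per tile or via variable-rate shading hardware rather than per pixel, but the falloff logic is the same.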

Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times quickly and with high precision in order to pull off foveated rendering. While it’s difficult to pull this off without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.

In more advanced headsets, this process can be invisible and automatic—IPD can be measured invisibly, and the headset can have a motorized IPD adjustment that automatically moves the lenses into the correct position without the user needing to be aware of any of it, like on the Varjo Aero, for example.

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well, but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to go a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this with the same finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double images of objects far behind your finger. When you then focus on those objects behind your finger, you’ll see a double image of your finger.

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
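The geometry behind that coupling is simple: both vergence and accommodation are functions of the same fixation distance, so each determines the other. A small sketch, assuming an average 63 mm IPD:

```python
import math

IPD_M = 0.063  # an average interpupillary distance, in meters (an assumption)

def vergence_deg(distance_m):
    """Angle between the two eyes' lines of sight when fixating a point
    distance_m away, straight ahead: two right triangles of half-IPD width."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_diopters(distance_m):
    """Focal demand on the eye's lens, in diopters (1 / meters)."""
    return 1.0 / distance_m
```

Fixating at 0.5 m demands roughly 7.2° of vergence and 2.0 diopters of accommodation; at 5 m both drop by about a factor of ten. Because both values trace back to the same distance, the brain expects them to move in lockstep.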

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes into conflict with vergence in such headsets, which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There are a number of approaches to varifocal displays; perhaps the simplest is an optical system where the display is physically moved back and forth relative to the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
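One way to sketch that ray intersection: since gaze rays from a real (noisy) tracker rarely intersect exactly, take the midpoint of their closest approach. This is an illustrative implementation, not any vendor’s actual algorithm:

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fixation_point(o1, d1, o2, d2):
    """Approximate the 3D fixation point as the midpoint of closest approach
    between two gaze rays (origin o*, direction d* per eye). Returns None for
    near-parallel rays, i.e. gaze at effectively infinite distance."""
    w = [p - q for p, q in zip(o1, o2)]
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w), _dot(d2, w)
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None
    # Ray parameters minimizing the distance between the two rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = [o + t1 * x for o, x in zip(o1, d1)]
    p2 = [o + t2 * x for o, x in zip(o2, d2)]
    return [(x + y) / 2 for x, y in zip(p1, p2)]
```

The distance from the midpoint between the eyes to this fixation point is then the focal depth the actuated display would be driven to match.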

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field-of-view. Doing so is not only costly, but also runs into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.
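Some back-of-envelope arithmetic shows the scale of the savings, assuming the often-cited figure of roughly 60 pixels per degree for ‘retinal’ resolution; the field-of-view and panel sizes below are illustrative, not any shipping headset’s specs:

```python
PPD_RETINAL = 60  # an often-cited "retinal resolution" figure, pixels per degree

def pixels_per_eye(h_fov_deg, v_fov_deg, ppd=PPD_RETINAL):
    """Pixel count needed to cover the given field of view at a given density."""
    return (h_fov_deg * ppd) * (v_fov_deg * ppd)

brute_force = pixels_per_eye(100, 100)           # whole 100-degree FoV at 60 ppd
fovea = pixels_per_eye(20, 20)                   # small steered high-density display
periphery = pixels_per_eye(100, 100, ppd=15)     # low-density background display
```

Covering a 100° field of view at 60 pixels per degree would take 36 million pixels per eye, while the foveated combination above needs under 4 million, roughly a tenfold reduction in pixels to drive.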

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.




Magic Leap 2 Now Supports OpenXR, Strengthening Industry Against Potential Apple Upheaval

Though delayed from its commitment last year, Magic Leap today announced that ML2 now fully supports OpenXR. The timing might have something to do with Apple’s looming entrance into the XR space.

Magic Leap had planned to deliver OpenXR support for its ML2 headset last year, but it was seemingly delayed until now. Today the company announced that Magic Leap 2 is conformant with OpenXR.

OpenXR is an open standard that aims to standardize the development of VR and AR applications, making hardware and software more interoperable. The standard has been in development since 2017 and is backed by virtually every major hardware, platform, and engine company in the XR industry.

“The adoption of OpenXR as a common AR ecosystem standard ensures the continual growth and maturation of AR,” Magic Leap said in its announcement. “Magic Leap will continue to advance this vision as Vice Chair of the OpenXR Working Group. In this role, Magic Leap provides technical expertise and collaborates with other members to address the needs of developers and end-users, the scope of the standard, and best practices for implementation.”

It’s true that Magic Leap has been part of the OpenXR Working Group—a consortium responsible for developing the standard—for a long time, but we can’t help but feel like Apple’s heavily rumored entrance into the XR space lit a bit of a fire under the company to get the work across the finish line.

In doing so, Magic Leap has strengthened itself—and the existing XR industry—against what could be a standards upheaval by Apple.

Apple is well known for ignoring certain widely adopted computing standards in favor of its own proprietary technologies, in some cases causing a technical divide between platforms. You may well have experienced this yourself if you’ve ever found yourself in a conversation about ‘blue bubbles and green bubbles’ when it comes to texting.

With an industry as young as XR—and with Apple being so secretive about its R&D in the space—there’s a good chance the company will have its own way of doing things, especially when it comes to how developers and their applications are allowed to interact with the headset.

If Apple doesn’t want to support OpenXR, this is likely the biggest risk for the industry; if developers have to change their development processes for Apple’s headset, that would create a divide between Apple and the rest of the industry, making applications less portable between platforms.

And while OpenXR-supporting incumbents have the upper hand for the time being (because they have all the existing XR developers and content on their side), one would be foolish to forget the army of experienced iOS developers that are used to doing things the ‘Apple way’. If those developers start their XR journey with Apple’s tools, it will be less likely that their applications will come to OpenXR headsets.

On the other hand, it’s possible that Apple will embrace OpenXR because it sees the value that has already come from years of ironing out the standard—and the content that already supports it. Apple could even be secretly part of the OpenXR Working Group, as companies aren’t forced to make their involvement known.

In the end it’s very likely that Apple will have its own way of doing things in XR, but whether that manifests more in the content running on the headset or down at the technical level remains to be seen.



Magic Leap 2 Gains Certification so Doctors Can Use AR During Surgery

Magic Leap, the storied unicorn developing enterprise AR headsets, announced at CES 2023 that its flagship device Magic Leap 2 earned a certification that clears it for use in the operating room.

The company first intimated that it had pursued IEC 60601-1 certification at SPIE’s XR conference in January 2022; however, the news largely went unreported since the information was presented in a single slide at the conference.

At AMD’s CES 2023 keynote, Magic Leap CEO Peggy Johnson confirmed that the company has indeed obtained IEC 60601-1 certification for its flagship AR headset.

As explained by TÜV Rheinland, the IEC 60601-1 certification specifies a device that is “intended to diagnose, treat, or monitor a patient under medical supervision and, which makes physical or electrical contact with the patient and/or transfers energy to or from the patient and/or detects such an energy transfer to or from the patient.”

Magic Leap says this certification allows Magic Leap 2 to be used both in an operating room as well as in other clinical settings, allowing medical professionals such as surgeons to focus on the patient and not have to refer to 2D screens.

By and large, this gives software developers a significant inroad toward gaining FDA certification for apps that could be used during surgery, and not just for pre-surgical training.

One such Magic Leap partner, SentiAR, is currently under review by the FDA for its app which connects physicians to live clinical data and images, allowing them to do operations such as navigating a catheter through blood vessels of the heart using a 3D map of a patient’s heart and the location of the catheter in real time.

Founded in 2010, the Plantation, Florida-based company initially exited the gate with consumer ambitions for its first AR headset, Magic Leap 1 (previously styled ‘One’). After awkwardly straddling the segment with its $2,300 AR headset, the company made a decisive pivot in mid-2020 when co-founder Rony Abovitz announced he would be stepping down as CEO, positioning the company to reprioritize its future devices away from consumers. It has since released Magic Leap 2, which is largely targeted at enterprise.

The well-funded company, which has amassed $4 billion in funds to date, has recently taken on $450 million from Saudi Arabia’s sovereign wealth fund, giving the country a majority share in the US-based augmented reality company.



Saudi Arabia Gains Majority Stake in Magic Leap in $450M Deal

Saudi Arabia has taken a majority share of the US-based augmented reality company Magic Leap, The Telegraph reports, widening its stake via the country’s state-owned sovereign wealth fund with a deal amounting to $450 million.

Citing delayed accounts obtained from the company’s European division, the report says Magic Leap raised $150 million in preferred convertible stock and $300 million in debt from Saudi Arabia’s Public Investment Fund (PIF) over the course of 2022. The investment puts the country’s ownership of Magic Leap over 50 percent, giving it overall majority control.

The Telegraph reports that, as of November 2022, Saudi Arabia’s PIF is “entitled to appoint four of the eight directors of the board of directors of Magic Leap.”

The wealth fund, which is controlled by Crown Prince Mohammed bin Salman, invests in projects considered to be strategically significant to diversifying its national economy.

Through PIF, Saudi Arabia owns minority stakes in Uber, Capcom, Nexon, Live Nation, Boeing, Meta, Alphabet, Citigroup, Disney, and Bank of America to name a few. It also owns Premier League football team Newcastle United and LIV Golf, a challenger to the PGA Tour.

Photo by Road to VR

Founded in 2010 by Rony Abovitz, the Plantation, Florida-based company kicked off its consumer ambitions with a long and ambitious tease of its first AR headset, Magic Leap 1 (previously styled ‘One’), starting its marketing campaign as it emerged from stealth in 2014.

Released nearly four years later, the developer-focused ‘Creator Edition’ headset was initially priced at an eye-watering $2,300, which not only deflated some of the potent hype behind the unicorn startup, but also cemented a long and bumpy road ahead if Magic Leap wanted to eventually offer its tech at a consumer price point.

Having awkwardly straddled the prosumer segment with limited success, in mid-2020 Abovitz announced he would be stepping down as CEO, signaling a pivot that would refocus the company’s efforts on servicing enterprise instead of consumers. Shortly afterward, Microsoft’s Executive VP of Business Development Peggy Johnson took the reins as CEO of Magic Leap.

The company has since released its follow-up headset, Magic Leap 2, to enterprise partners and through third-party vendors, putting the device in direct competition with Microsoft’s HoloLens 2.

To date, Magic Leap has raised $4 billion, with minority investors including Google, Alibaba, Qualcomm, AT&T, and Axel Springer.



How Different XR Companies Approach Cloud Services

 

XR hardware is on the move, but software is important too. The bigger your XR needs are, the larger your software needs are, so more and more XR providers are offering cloud services in addition to their hardware and platform offerings. But what is the cloud anyway?

Generally, “the cloud” refers to remote servers that do work off of the device. This allows devices to become smaller while running more robust software. For example, some of the cloud services that we’ll look at are cloud storage solutions; cloud storage is increasingly important because 3D assets can take up a lot of space. Other services run computations in the cloud.

Other solutions make up “local clouds.” These are networks of devices managed from a central portal all on location. This kind of solution is usually used by organizations managing a large number of devices from one central computer.

Varjo’s Reality Cloud

“Cloud” takes on yet another meaning for Varjo. For Varjo clients, a lot of the management and IT solutions that make up cloud services for other developers are handled through software subscriptions bundled with almost all Varjo hardware. Varjo’s “Reality Cloud” allows users to join XR meetings including remotely present coworkers and virtual assets.

Varjo Reality Cloud - XR cloud services

“Varjo Reality Cloud is our platform that will allow the ultimate science fiction dream – photo-realistic teleportation – to come true,” CTO Urho Konttori said in a launch event last summer. “What this means, in practice, is true virtual teleportation – sharing your reality, your environment, with other people in real time so that others can experience your world.”

At the beginning of this year, Varjo announced that XR content will soon stream through Reality Cloud services as well. Just like streaming other forms of media, XR streaming aims to provide more content to smaller devices by hosting that content remotely and serving it to users on demand.

“These scalability opportunities that the cloud provides are significantly meaningful when we talk about XR deployment in the corporate world,” Konttori told ARPost in January. “We are now at the level that we are super happy with the latency and deployments.”

In a recent funding announcement, Varjo revealed the latest development in its cloud services: Patrick Wyatt, a C-suite veteran, has been appointed the company’s new CPO and “will be the primary lead for Varjo’s software and cloud development initiatives.” As this article was being written, Varjo further expanded its cloud with Unreal and Unity engine integrations.

CloudXR From NVIDIA

XR streaming is already a reality on other cloud platforms. NVIDIA offers CloudXR that streams XR content to Android and Windows devices. (Remember that Android isn’t a hardware manufacturer, but an operating system. While almost all non-Apple mobile devices run Android, it is also the backbone of many XR headsets.)

NVIDIA CloudXR - XR cloud services

According to NVIDIA, “CloudXR lets you leverage NVIDIA RTX-powered servers with GPU virtualization software to stream stunning augmented and virtual reality experiences from any OpenVR application. This means you can run the most complex VR and AR experiences from a remote server across 5G and Wi-Fi networks to any device, while embracing the freedom to move—no wires, no limits.”

This can be a “pure” cloud application, but it can also be an “edge” application that does some lifting on the device and some remotely. While NVIDIA promotes their cloud services for use cases like location-based experiences and virtual production, edge computing is being embraced by enterprises who may want to keep sensitive content offline.

RealWear’s New Cloud Services

Enterprise XR hardware manufacturer RealWear recently launched its own cloud. This is a “local cloud” of the kind discussed above. The solution allows IT specialists to “easily control and manage their entire RealWear device fleet from one easy-to-use interface.” That includes content, but it also includes managing updates.

If you own one headset, you know that installing software and updates can be a chore. Now, imagine owning a dozen headsets, or even a hundred or more. Putting on each headset individually to add content and install updates quickly becomes unscalable. The RealWear Cloud also allows real-time tech support, which wouldn’t be possible otherwise.

RealWear Cloud

The RealWear Cloud also allows data analysis across headsets. This is vital in enterprise applications which may be tracking items as they move through a supply chain or tracking employees as they move through tasks or training modules. Handling this data for an individual on an individual headset is possible but, again, becomes unbearable at scale sans cloud.

Cloud Storage in Lens Studio

As for cloud storage, Snapchat recently announced a solution in a Lens Studio update that gives creators up to 25MB of remote storage. While the file size is still capped per asset (you can’t have one 25MB asset), it drastically increases the abilities of Lens Creators working with large or complex models.

Snap Lens Cloud

“Prior to the launch of Remote Assets, if a project was over the Lens size limit, you only had two options: either remove the asset if it wasn’t critical to the experience or resize the image to lower its RAM usage and re-submit,” reads the release. “Now you can utilize our Lens Cloud service to host assets of larger sizes outside of the Lens, and then load them in at run time.”

This is significant because Snap Lenses run on mobile devices that not only have limited space but also share that computing power with a slew of non-XR applications. At least, until Snapchat makes a consumer version of Spectacles.

“At first, we were just building for the phone and porting to the glasses,” Lens Creator Alex Bradt told me when I got to demo Snap’s Spectacles at AWE. “Now we’re like, ‘what can we actually do with these that will solve problems for people that they didn’t know they had?’”

Parents and Partners

Not all XR companies offer their own cloud services. For example, Magic Leap has had a partnership with Google Cloud for the past year now. Likewise, AutoDesk offers its XR cloud services through a partnership with Amazon.

Similarly, ThinkReality cloud services are offered through parent company Lenovo. A similar relationship exists between Azure and Microsoft’s MR hardware.

Partnerships like these help each company get the most out of their existing offerings without needing to build services from the ground up. As enterprises explore entering XR, these offerings also help them integrate into cloud services offered by suppliers that they may already be working with, like Microsoft, Google, Amazon, or Lenovo.

Your Forecast: Cloudy

Right now, a lot of cloud services serve industry, where they are doing very impactful things. That doesn’t mean that people with just one headset (or a phone) shouldn’t take note. Developments in XR cloud services, for enterprise or consumer applications, are making smoother, faster, lighter-weight, and more robust XR applications possible for everyone.
