VR displays

‘Digital Lens’ Plugin for Eye-tracking Headsets Improves Visual Clarity & Pupil Swim

Imaging company Almalence has released a trial plugin for its Digital Lens technology which makes use of eye-tracking to purportedly increase the resolving power and clarity of XR headsets.

Almalence argues that the lenses on most XR headsets today aren’t being used to their fullest potential. By taking advantage of eye-tracking and smarter calibration, the company says its image pre-processing technology can actually increase the resolving power of a headset, including expanding the ‘sweet spot’ (the part of the lens with the highest visual fidelity).

The company has released a trial version of its technology through a plugin that works with the Pico Neo 3 Pro Eye, HP Reverb G2 Omnicept, and HTC Vive Pro Eye. The plugin works with OpenXR-compatible content, and even allows users to switch back and forth between each headset’s built-in image processing and the Almalence Digital Lens processing.

Based on through-the-lens demonstrations by the company, the technology does appear to increase the resolving power of the headsets. The company focuses on more advanced pre-processing to account for artifacts introduced by the lens, like chromatic aberration and image distortion. In essence, the software increases the sharpness of the image by making the light passing through the lens land more precisely where it’s supposed to.
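As a rough sketch of the general idea (not Almalence’s actual algorithm, and with made-up coefficients): each color channel gets its own inverse radial pre-distortion, so that after the lens bends each wavelength slightly differently, all three channels land where they were supposed to.

```python
import numpy as np

def predistort(image, k_per_channel):
    """Radially pre-warp each color channel of an HxWx3 image.
    `k_per_channel` holds one hypothetical inverse-distortion
    coefficient per RGB channel; they differ per channel because
    chromatic aberration bends each wavelength differently."""
    h, w, _ = image.shape
    cx, cy = w / 2, h / 2
    ys, xs = np.mgrid[0:h, 0:w]
    # normalized coordinates relative to the lens center
    nx, ny = (xs - cx) / cx, (ys - cy) / cy
    r2 = nx**2 + ny**2
    out = np.zeros_like(image)
    for c, k in enumerate(k_per_channel):
        scale = 1 + k * r2  # one-term radial distortion model
        sx = np.clip(nx * scale * cx + cx, 0, w - 1).astype(int)
        sy = np.clip(ny * scale * cy + cy, 0, h - 1).astype(int)
        out[..., c] = image[sy, sx, c]  # nearest-neighbor resample
    return out
```

With all coefficients at zero the function is an identity; real systems would use calibrated per-lens coefficients and higher-quality resampling.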

Almalence has shared heat maps comparing the changes in visual quality with and without its image technology, along with a broader explanation of how it works.

Another big advantage over the status quo, Almalence says, is that the Digital Lens tech uses eye-tracking to perform these corrections in real-time, meaning that as you move your eyes around the scene (and off-axis from the center of the lens), the corrections are updated to account for the new angles. This can expand the ‘sweet spot’ of the lens and reduce ‘pupil swim’ by adjusting for the position of the pupil relative to the center of the lens. This video demonstrates the pupil swim correction:
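A minimal sketch of how an eye-tracked correction could be driven (the calibration values here are hypothetical; the real system is certainly more sophisticated): measure the lens’ distortion at a few pupil offsets during calibration, then interpolate per frame at the pupil position reported by the eye tracker.

```python
import numpy as np

# Hypothetical calibration table: distortion coefficient measured at a
# few horizontal pupil offsets from the lens center (mm), captured once.
PUPIL_OFFSETS_MM = np.array([-4.0, 0.0, 4.0])
K_AT_OFFSET = np.array([0.26, 0.22, 0.27])  # made-up values

def correction_coefficient(pupil_offset_mm):
    # Interpolate the calibration table at the tracked pupil position,
    # so the pre-distortion follows the eye frame by frame.
    return float(np.interp(pupil_offset_mm, PUPIL_OFFSETS_MM, K_AT_OFFSET))

print(correction_coefficient(0.0))           # 0.22 (pupil on-axis)
print(round(correction_coefficient(2.0), 3)) # 0.245 (partway off-axis)
```

A shipping implementation would interpolate a 2D (or 3D, with eye relief) grid of full distortion maps rather than a single scalar, but the per-frame lookup structure is the same.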

The plugin, which anyone can use until January 2024, aims to demonstrate the company’s claims. Ultimately it appears the company wants to license its technology to headset makers to improve image quality out of the box.

Hands-on: CREAL’s Light-field Display Brings a New Layer of Immersion to AR

More than four years after I first caught wind of their tech, CREAL’s light-field display continues to be one of the most interesting and promising solutions for bringing light-fields to immersive headsets. At AWE 2023 I got to check out the company’s latest tech and saw first hand what light-fields mean for immersion in AR headsets.

More Than One Way to Focus

So first, a quick recap. A light-field is a fundamentally different way of showing light to your eyes compared to the typical displays used in most headsets today. The key difference is about how your eyes can focus on the virtual scene.

Your eyes have two focus mechanisms. The one most people are familiar with is vergence (the basis of stereoscopy), where both eyes rotate to point at the same object, bringing its two overlapping views into alignment. This is also what makes things look ‘3D’ to us.

But each individual eye is also capable of focusing in a different way, by bending the lens of the eye to focus on objects at different distances—the same way that a camera with a single lens focuses. This is called accommodation.

Vergence-Accommodation Conflict

Most XR headsets today support vergence (stereoscopic focus), but not accommodation (single-eye focus). You may have heard this called the Vergence-Accommodation Conflict, known in the industry as ‘VAC’ because it’s such a pervasive challenge for immersive displays.

The reason for the ‘conflict’ is that normally the vergence and accommodation of your eyes work in tandem to achieve optimal focus on the thing you want to look at. But in a headset that supports vergence but not accommodation, your eyes must decouple these normally synchronized mechanisms and operate them independently.

It might not be something you ‘feel’ but it’s the reason why in a headset it’s hard to focus on things very near to you—especially objects in your hands that you want to inspect up close.
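To put rough numbers on the conflict (assuming a typical 63 mm interpupillary distance and a headset focal plane fixed at 2 m, both illustrative assumptions): vergence demand is set by the virtual object’s distance, while accommodation is stuck at the display’s focal plane, and the mismatch in diopters grows quickly as objects get close.

```python
import math

IPD_M = 0.063        # typical interpupillary distance (assumption)
FOCAL_PLANE_M = 2.0  # fixed focal distance of a typical headset (assumption)

def vergence_angle_deg(distance_m):
    # Angle between the two eyes' lines of sight when fixating a point
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_conflict_diopters(vergence_distance_m):
    # Vergence targets the virtual object, while accommodation is pulled
    # to the fixed focal plane; the mismatch, in diopters:
    return abs(1 / vergence_distance_m - 1 / FOCAL_PLANE_M)

# An object held 30 cm away: a large mismatch, hence the discomfort up close
print(round(vergence_angle_deg(0.3), 1))               # ~12.0 degrees
print(round(accommodation_conflict_diopters(0.3), 2))  # ~2.83 diopters
```

Mismatches beyond roughly a quarter to half a diopter are commonly cited as noticeable, which is why near-field objects are where the conflict bites hardest.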

The conflict between vergence and accommodation is not only uncomfortable for your eyes; surprisingly, it can also rob the scene of immersion.

Creal’s Solution

And this is where we get back to Creal, a company that wants to solve the Vergence-Accommodation Conflict with a light-field display. Light-field displays structure light in the same way that we see it in the real world, allowing both of the focus functions of the eyes—vergence and accommodation—to work in tandem as they normally do.

At AWE 2023 this week, I got to check out the company’s latest light-field display tech, and came away with an added sense of immersion that I haven’t felt in any other AR headset to date.

I’ve seen Creal’s static bench-top demos before, which show static floating imagery through the lens to a single eye, demonstrating that you can indeed focus (accommodate) at different depths. But you won’t really see the magic until you view a light-field with both eyes and head-tracking, which is exactly what I got to do this week at AWE.

Photo by Road to VR

On an admittedly bulky proof-of-concept AR headset, I got to see the company’s light-field display in its natural habitat—floating immersively in front of me. What really impressed me was when I held my hand out and a little virtual turtle came floating over to the palm of my hand. Even though it was semi-transparent, and not exceptionally high resolution or accurately colored, it felt… weirdly real.

I’ve seen all kinds of immersive XR experiences over the years, and holding something in your hand sounds like a banal demo at this point. But there was just something about the way this little turtle looked—thanks to the fact that my eyes could focus on it in the same way they would in the real world—that made it feel more real than I’ve ever really felt in other headsets. Like it was really there in my hand.

Photo by Road to VR

The trick is that, thanks to the light-field, when I focused my eyes on the turtle in my hand, both the turtle (virtual) and my hand (real) were each in proper focus—something that isn’t possible with conventional displays—making both my hand and the turtle feel more like they were inhabiting the same space right in front of me.

It’s frustratingly impossible to explain exactly how it appeared via text alone; this video from Creal shot through-the-lens gives some idea of what I saw, but can’t quite show how it adds immersion over other AR headsets:

It’s a subtle thing, and the added immersion probably only meaningfully impacts objects within arm’s reach or closer—but then again, that distance is where things have the potential to feel most real to us, because they’re in our carefully watched personal space.

Digital Prescriptions

Beyond adding a new layer of visual immersion, light-field displays stand to solve another key problem: vision correction. Most XR headsets today don’t support any kind of prescription vision correction, which for a large share of the population—maybe even more than half—means they must either wear their corrective lenses while using these devices, buy some kind of clip-on lens, or just suffer through a blurry image.

But the nature of light-fields means a ‘digital prescription’ can be applied to the virtual content that exactly matches the user’s corrective prescription. And because it’s digital, this can be done on-the-fly, meaning the same headset could change its corrective setting from one user to the next. Doing so means the focus of the virtual image can match the real-world image for users with and without glasses.
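A simplified thin-lens sketch of the idea (my assumption of how such a correction could work, not Creal’s actual pipeline): shift each virtual object’s dioptric depth by the user’s spherical prescription, so an uncorrected eye focuses where a corrected eye would.

```python
def digital_focus_distance_m(true_distance_m, prescription_diopters):
    # Dioptric depth (1/distance) the light-field should emit so an
    # uncorrected eye with the given spherical prescription sees the
    # object in focus. Simplified: ignores astigmatism and eye relief.
    display_diopters = 1 / true_distance_m - prescription_diopters
    if display_diopters <= 0:
        return float("inf")  # at or beyond optical infinity
    return 1 / display_diopters

# A -2.0 D myope viewing content meant to sit at 1 m:
print(round(digital_focus_distance_m(1.0, -2.0), 3))  # 0.333 m
# No prescription: depth is unchanged
print(digital_focus_distance_m(1.0, 0.0))             # 1.0
```

The myope’s content gets pulled optically closer (inside their uncorrected clear range), which a conventional fixed-focus display simply cannot do per-user.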


Samsung Acquires eMagin Microdisplay Maker, Citing ‘significant potential growth’ of XR Devices

eMagin, the US-based developer and manufacturer of OLED microdisplays for AR/VR headsets, announced a merger agreement with Samsung Display, a subsidiary of the Korean tech giant. Samsung says it anticipates “significant potential growth” of XR devices.

The company announced in a press statement that Samsung will acquire all outstanding shares of eMagin common stock on a fully diluted basis for $2.08 per share in cash, totaling approximately $218 million.

Founded in 2001, eMagin has created head-mounted displays to showcase its OLED technology since the release of the Z800, which launched in mid-2005. Since then, the company has focused on creating VR headset prototypes to further showcase its high-density OLED microdisplays, while also providing displays for integration into aircraft helmets, heads-up display systems, AR/VR headsets, thermal scopes, night vision goggles, and future weapon systems.

President & CEO of Samsung Display, Joo Sun Choi, says the acquisition comes along with expectations that XR devices will have “significant potential of growth in the future.”

“This agreement is a validation of our technical achievements to date including our proprietary direct patterning (dPd) technology, provides a significant premium for our shareholders, and represents a win for our customers and employees,” said Andrew G. Sculley, eMagin’s CEO. “By teaming with Samsung Display, we will be able to achieve the full potential of our next-generation microdisplay technology with a partner that can provide the resources and expertise we will need to scale production. Moreover, our customers will benefit from resulting improvements to our production capabilities in terms of yield, efficiency, and quality control.”

The merger will very likely allow Samsung to exclusively manufacture micro-OLED displays using eMagin’s direct patterning display (dPd) technology, which boasts higher efficiency and brightness because its displays use directly patterned RGB emitters, unlike traditional displays, which typically use a white OLED with an RGB color filter.
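A back-of-the-envelope illustration of why direct RGB emission matters (idealized numbers, not eMagin’s or Samsung’s specs): a color filter admits roughly one of the three primary bands per subpixel, discarding about two-thirds of a white OLED’s light, while a directly patterned emitter loses none of it to filtering.

```python
# Idealized, illustrative numbers only -- not actual display specs.
WHITE_OLED_NITS = 900  # assumed subpixel-level white emission

def filtered_brightness(nits):
    # A white OLED behind an RGB color filter: each subpixel's filter
    # passes roughly one of three primary bands, blocking the rest.
    return nits / 3

def direct_rgb_brightness(nits):
    # Direct-patterned RGB emitter: no filter, so (ideally) all
    # emitted light in each subpixel reaches the eye.
    return nits

print(filtered_brightness(WHITE_OLED_NITS))    # 300.0
print(direct_rgb_brightness(WHITE_OLED_NITS))  # 900
```

Real filters and emitters have their own losses, but the roughly 3x headroom is the core of the efficiency argument for direct patterning.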

The transaction is expected to close in the second half of 2023, after which eMagin will continue to maintain its operations and facilities in Hopewell Junction, NY. The merger agreement has received unanimous approval from eMagin’s Board of Directors, and stockholders holding around 98% of eMagin’s total voting power have committed to voting in favor of the transaction.

Smart Contact Lens Company Mojo Vision Raises $22M, Pivots to Micro-LED Displays for XR & More

Mojo Vision, a company once noted for its work on smart contact lenses, has raised $22.4 million in a new Series A investment round which it will use in a pivot to develop and commercialize micro-LED display technology for consumer, enterprise, and government applications.

The funding round is led by existing investors NEA and Khosla Ventures, with participation from other investors including Dolby Family Ventures, Liberty Global Ventures, Fusion Fund, Drew Perkins, Open Field Capital, and Edge.

The new Series A comes months after the company was forced to put its smart contact lenses on hold, which also included a 75% downsizing in the company’s workforce.

Prior to the pivot, the company had amassed $205 million in outside investment, with its most recent round, in January 2022, bringing in $45 million.

Its new focus is on displays for applications that demand high-performance micro-LEDs, including AR/VR, automotive, light-field, and large-format displays. Mojo’s prototype smart contacts made use of its own in-house displays, which at the time included a monochrome display capable of over 14,000 pixels per inch (ppi).

Now the company is developing its own High Performance Quantum Dot (HPQD) technology to make a “very small, very bright, very efficient RGB pixel,” the company says in a press statement.

The company is boasting a number of advances in its proprietary technology, including dynamic displays with up to 28,000ppi, efficient blue micro-LED devices at sub-μm scale, high efficiency quantum dot ink for red and green, high brightness at 1M+ nits, and a display system that incorporates an optimized CMOS backplane, wafer-to-wafer bonding, and custom micro-lens optics.
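For a sense of scale, 28,000 ppi implies a sub-micron pixel pitch; it’s a simple unit conversion (25.4 mm per inch):

```python
def pixel_pitch_um(ppi):
    # 1 inch = 25.4 mm = 25,400 micrometers
    return 25_400 / ppi

print(round(pixel_pitch_um(28_000), 2))  # 0.91 -- µm, the claimed 28,000 ppi
print(round(pixel_pitch_um(14_000), 2))  # 1.81 -- µm, the earlier monochrome display
```

At under a micron per pixel, each pixel is approaching the wavelength of visible light, which is part of why sub-µm blue micro-LED devices are called out as an advance.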

Mojo Vision’s new CEO, Dr. Nikhil Balram, is said to bring semiconductor and display technology expertise to the company:

“The market opportunity in the display industry is big – over $100 billion. Sometimes in order to do something very big, you have to start very small. That is exactly what we are doing at Mojo,” said Balram. “We started by developing the world’s smallest, densest dynamic micro-LED display, and now we are applying that innovation to power the next generation of displays. Mojo is combining breakthrough technology, leading display and semiconductor expertise, and an advanced manufacturing process to commercialize micro-LEDs for the most demanding hardware applications.”

“This round of funding will enable us to deliver our breakthrough monolithic micro-LED technology to customers and help bring high-performance micro-LEDs to market,” concluded Balram.
