Breakthrough in AR optics means 3D holographic gaming now a reality

January 17, 2023
4 min read

VividQ™, the pioneer of holographic display technology for AR gaming, and Dispelix, the world’s leading waveguide designer and manufacturer, have today announced achieving what leading industry figures described as “quasi-impossible”* only two years ago. The two companies have designed and manufactured a ‘waveguide combiner’ that can accurately display simultaneous variable-depth 3D content within a user’s environment. For the first time ever, users will be able to enjoy immersive AR gaming experiences where digital content can be placed in their physical world and they can interact with it naturally and comfortably.

The two companies have also announced the formation of a commercial partnership to develop the new 3D waveguide technology towards mass-production readiness. This will give headset manufacturers the ability to kick-start their AR product roadmaps now.

Early augmented reality experiences seen so far through headsets such as Magic Leap, Microsoft HoloLens and Vuzix produce 2D stereoscopic images at fixed focal distances, or at one focal distance at a time. This often leads to eye fatigue and nausea for users and does not offer the necessary immersive three-dimensional experiences: objects cannot be interacted with naturally at arm's length, for example, and they are not placed exactly within the real world.

In order to deliver the types of immersive experiences necessary for AR to reach mass-market adoption, consumers need a sufficient field of view and the ability to focus on 3D images at the full range of natural distances – anywhere from 10cm to optical infinity, simultaneously – in the same way they do naturally with physical objects. A waveguide combiner is the industry’s favoured method of displaying AR images in a compact form factor. This next-generation waveguide and accompanying software are optimised for 3D applications like gaming, which means that consumer brands around the world can unlock the market’s full potential.

“There has been significant investment and research into the technology that can create the types of AR experiences we’ve dreamt of, but they fall short because they can’t live up to even basic user expectations,” said VividQ CEO, Darran Milne. “In an industry that has already seen its fair share of hype, it can be easy to dismiss any new invention as yet more of the same, but a fundamental issue has always been the complexity of displaying 3D images placed in the real world with a decent field of view and with an eyebox that is large enough to accommodate a wide range of IPDs (interpupillary distance, or the space between the user’s pupils), all encased within a lightweight lens. We’ve solved that problem, designed something that can be manufactured, tested and proven it, and established the manufacturing partnership necessary to mass produce them. It is a breakthrough because without 3D holography, you can’t deliver AR.”

“To put it simply, while others have been developing a 2D screen to wear on your face, we’ve developed the window through which you’ll experience real and digital worlds in one place.”
Above: A concept image showing how consumers can interact at arm's length with objects placed accurately in the real world

VividQ’s patent-pending 3D waveguide combiner is designed to work in harmony with the company’s software, both of which can be licensed by wearable manufacturers in order to build out a wearable product roadmap. VividQ’s holographic display software works with standard games engines like Unity and Unreal Engine, making it very easy for games developers to create new experiences. The 3D waveguide can be manufactured and supplied at scale through VividQ’s manufacturing partner Dispelix, a leader in see-through waveguides for wearables.

Antti Sunnari, CEO & Co-Founder of Dispelix, adds: “Wearable AR devices have huge potential all around the world. For applications such as gaming and professional use, where the user needs to be immersed for long periods of time, it is vital that content is true 3D and placed within the user’s environment. This also overcomes the issues of nausea and fatigue. We are very pleased to be working with VividQ as a waveguide design and manufacturing partner on this breakthrough 3D waveguide.”

At its Cambridge (UK) HQ, VividQ demonstrates its software and 3D waveguide technology to leading device manufacturers and consumer tech brands, with which it is working closely to deliver next-generation AR wearables.

— ENDS —

Notes to Editors

In their January 2021 Nanophotonics paper, widely respected photonics researchers Bernard Kress and Ishan Chatterjee (both currently at Google and previously Microsoft) said: “When a pupil replication scheme is used in a waveguide combiner, no matter the coupler, the input pupil needs to be formed over a collimated field (image at infinity/far field). If the focus is set to the near field instead of the far field in the display engine, each waveguide exit pupil will produce an image at a slightly different distance, thereby producing a mixed visual experience, overlapping the same image with different focal depths. It is quasi-impossible to compensate for such focus shift over the exit pupils because of both spectral spread and field spread over the exit pupils, as discussed previously.”


What is a waveguide and why is it important?

Waveguides (also known as 'combiners' or 'waveguide combiners') give AR headsets a lightweight, conventional-looking front end (i.e. they look like normal glass lenses) and are necessary for widespread adoption. Beyond the form-factor advantages, the waveguides on the market today perform a process called pupil replication. This means they can take the image from a small display panel and effectively make it larger by creating, in front of the viewer's eye, a grid of copies of the small exit pupil (the 'eyebox'), a bit like a periscope, but one that creates multiple views instead of a single view. This is essential to make an AR wearable ergonomic and easy to use.

Small eyeboxes are notoriously difficult to line up with the user's pupil, and the eye can easily "fall off" the image if the two are not aligned correctly. This requires headsets to be precisely fitted to the user, since even normal variations in interpupillary distance (IPD) may mean the eye does not line up exactly with the eyebox and the virtual image cannot be seen.

Since there is a fundamental trade-off in a display between image size (which we call the 'eyebox' or 'exit pupil') and field of view (FoV), replication allows the optical designer to make the eyebox very small, relying on the replication process to present a large effective image to the viewer while also maximising the FoV.
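
To make that eyebox/FoV trade-off concrete, the short sketch below works through a simplified one-dimensional étendue argument: for a fixed display engine, the product of the exit-pupil (eyebox) width and the sine of the half field of view is roughly constant, so widening the FoV shrinks the unreplicated eyebox. The budget figure and the helper name `eyebox_width_mm` are illustrative assumptions, not VividQ or Dispelix specifications.

```python
import math

# Illustrative only: a simplified 1-D etendue argument.
# For a fixed display engine, eyebox_width * sin(half_FoV) is roughly
# conserved, so a wider field of view forces a smaller (unreplicated) eyebox.
# The budget value below is a made-up example, not a real device spec.

ETENDUE_BUDGET_MM = 0.5  # assumed: eyebox width (mm) * sin(half FoV)

def eyebox_width_mm(fov_deg: float) -> float:
    """Eyebox width achievable for a given full field of view (degrees)."""
    half_fov_rad = math.radians(fov_deg / 2.0)
    return ETENDUE_BUDGET_MM / math.sin(half_fov_rad)

for fov in (20, 30, 40, 50):
    print(f"FoV {fov} deg -> eyebox ~{eyebox_width_mm(fov):.1f} mm before replication")
```

Pupil replication then tiles copies of that small exit pupil across the lens, which is how a waveguide recovers a comfortably large effective eyebox without sacrificing field of view.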

What is the difference between a 2D waveguide and a 3D waveguide?

A 2D image is effectively the same as an image placed at some large distance: when you look into the distance there is no difference in focus between objects, so everything further than about 10m is effectively 2D to your eyes. The reason is that the light rays from an image at a large distance are almost parallel, while the rays from nearby objects diverge.
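
The "almost parallel" point can be quantified with vergence, the reciprocal of the viewing distance in metres (measured in dioptres): rays from beyond a few metres carry almost zero vergence, while rays from arm's length or closer diverge strongly. The few lines below are a generic illustration of that arithmetic, not anything specific to this announcement.

```python
# Vergence (in dioptres, D) is 1 / distance (in metres).
# Near-zero vergence means near-parallel rays, i.e. an effectively "2D" source.

distances_m = [0.10, 0.25, 0.50, 1.0, 10.0, float("inf")]  # 10 cm to optical infinity

for d in distances_m:
    vergence = 0.0 if d == float("inf") else 1.0 / d
    print(f"object at {d} m -> vergence {vergence:.1f} D")
```

An object at 10m contributes only 0.1D, which the eye cannot distinguish from infinity, whereas an object at 10cm contributes 10D, so a display that can only emit parallel rays cannot render it at the correct focus.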

Waveguides assume that the incoming light rays are parallel (hence a 2D image), because they require that the light bouncing around within the structure all follows paths of the same length. If you were to put in diverging rays (a 3D image), the light paths would all be different, depending on where on the input 3D image each ray originated. This is a big problem: the extracted light has all travelled different distances, and the effect, as shown in the above image on the right, is seeing multiple partially overlapping copies of the input image at random distances, which makes it essentially useless for any application. This new 3D waveguide combiner, by contrast, is able to adapt to the diverging rays and display images correctly, as shown in the above image on the left.

Currently, to get a big eyebox and FoV (as required for a decent AR experience), you are restricted to using only 2D display types. However, we know this leads to a host of unfortunate effects on the user, such as nausea and eye fatigue, and does not provide a fully immersive experience, as 2D displays do not allow for accurate placement of AR content in the environment.

The 3D waveguide from VividQ is composed of two elements: first, a modification of the standard pupil-replicating waveguide design described above; and second, an algorithm that computes a hologram that corrects for distortion due to the waveguide. The hardware and software components work in harmony with each other, so the VividQ waveguide cannot be used with anybody else's software or system.
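
VividQ has not published the details of its correction algorithm, so the sketch below is only a generic computer-generated holography outline under assumed parameters: it back-propagates a target image to the hologram plane with the angular-spectrum method and multiplies in a placeholder phase term standing in for the waveguide-distortion compensation described above. Every value, and the correction term itself, is a hypothetical illustration rather than VividQ's implementation.

```python
import numpy as np

# Generic CGH sketch (not VividQ's algorithm): back-propagate a target image
# to the hologram plane, then apply a placeholder phase term standing in for
# compensation of waveguide-induced distortion. All parameters are assumed.

WAVELENGTH = 520e-9   # assumed green source, metres
PIXEL_PITCH = 8e-6    # assumed SLM pixel pitch, metres
DEPTH = 0.3           # assumed virtual image distance, metres
N = 512               # grid size

def angular_spectrum(field: np.ndarray, distance: float) -> np.ndarray:
    """Propagate a complex field by `distance` using the angular spectrum method."""
    fx = np.fft.fftfreq(N, d=PIXEL_PITCH)
    fxx, fyy = np.meshgrid(fx, fx)
    kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1.0 / WAVELENGTH**2 - fxx**2 - fyy**2))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * distance))

# Target: a bright square intended to appear 0.3 m from the viewer.
target = np.zeros((N, N), dtype=complex)
target[192:320, 192:320] = 1.0

# Back-propagate the target to the hologram plane (negative distance).
hologram_field = angular_spectrum(target, -DEPTH)

# Hypothetical "waveguide correction": a smooth phase map that a real system
# would derive from a measured or modelled distortion of the waveguide.
yy, xx = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N))
correction = np.exp(1j * 0.5 * (xx**2 + yy**2))  # made-up example term

# Phase-only hologram to drive a spatial light modulator.
phase_hologram = np.angle(hologram_field * correction)
print(phase_hologram.shape, phase_hologram.min(), phase_hologram.max())
```

In a real pipeline the correction would be the piece specific to the waveguide design, which is why the hardware and software components have to be developed together.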

About VividQ

Based in Cambridge and London, VividQ is the pioneer of holographic display technology, which is pivotal in augmented reality. The company was founded in 2017 and has raised over $20m in VC funding to date. The company licenses its technology to consumer device manufacturers, who can then build consumer-ready AR wearables that usher in a new generation of gaming experiences.

About Dispelix

Dispelix Oy is an advanced waveguide designer and manufacturer delivering next-generation visual solutions for consumer as well as enterprise AR and MR wearables. The company's patented single-layer DPX waveguides are the thinnest on the market and do not compromise on full color, image quality, clear eye contact, or field of view. They bring unmatched image quality, performance, and visual fidelity, combined with the mass manufacturability to scale for even the largest vendors. Led by the world's most sought-after experts in optics, photonics, and manufacturing, Dispelix is headquartered in the technology hub of Espoo, Finland, with field offices throughout the United States and China. Learn more at www.dispelix.com.