
AR/VR Needs Disruptive, Smart Imaging Systems

Emilie Viasnoff

Apr 21, 2022 / 5 min read

Vision is so important to humans that almost half of your brain’s capacity is dedicated to visual perception. It’s no surprise that hyperscalers such as Meta, Microsoft, and Apple have bet on augmented/virtual reality (AR/VR) – starting with augmented vision – to become the new machine-human interface. Since our eyes are the second most complex organ after our brains, it makes sense for AR/VR to replace personal computers and smartphones. However, many technical hurdles remain in electronics (the brain) and optics (the eyes). In my first blog post on trends in imaging design, I discussed how digital twins will foster mass customization and data optimization through end-to-end simulation of imaging systems, from manufacturing and testing to user experience virtualization. In this blog post, I will focus on how AR/VR is driving innovation in imaging design. Anyone who thinks digital twins involve only mechanical, thermal, or electrical components is missing the point: AR/VR systems also need disruptive, smart optical imaging systems.


What Are Imaging Systems and How Are They Used in AR/VR Systems?

Imaging systems are used for observation, image capture, or image display in various applications, including space exploration, machine vision for Industry 5.0, defense and security, entertainment, assisted driving, computers, and smartphones. Imaging systems typically consist of a camera and a display, along with an imaging lens. They can be self-contained units or modular component assemblies.

Digitalization entered our lives in the mid-20th century. The first machine-human interfaces (MHIs) were massive computers that only experts could use and interact with. In the late 1970s, the first personal computers entered our homes, before smartphones entered our pockets. The next revolution of MHI is predicted to be AR/VR systems — heads-up and hands-free. No one can yet predict the exact shape of future MHIs, but enormous challenges lie ahead to make them seamless and able to run at the edge rather than depending on the cloud.

A perfect immersive environment is still a dream for virtual reality. Seamless superposition everywhere is the challenge for augmented reality. And don’t forget multiple constraints on price, overall performance (image quality, power efficiency), and ergonomics. One day, AR/VR systems could be as lightweight, seamless, and efficient as a contact lens—not defocusing your attention, but reinforcing it by providing tailored information at the exact right place in your field of view.

Just as the brain and the eyes are our two key organs for perception, computing chips and imaging systems are at the heart of future AR/VR MHIs. Following Moore’s Law, the semiconductor ecosystem has for decades put tremendous effort into miniaturizing computing chips. As a result, today’s smartphones have the computational power of a 1970s supercomputer. The optics-electronics community must deploy the same level of effort to enable miniaturized, digital, and smart imaging systems for AR/VR.

Imaging Systems Go Miniaturized

For $1 today, you can buy a set of eight freeform plastic lenses that will fit into less than one-thousandth of an inch [1][2]. This complex set of lenses is designed by optimizing more than a thousand parameters. The CMOS image sensor and the computer vision chip, which sit just behind the lens set, have also been miniaturized and improved to the point where every smartphone has no fewer than four cameras, and they take better images than the first digital cameras of twenty years ago, which had just over one million pixels and bulky, complex lenses.

Displays are also getting so small that our next MHI could be not only on our eyes but also in our eyes. Some companies are developing disruptively small systems that fit on a contact lens and are, at first glance, invisible. Such a bionic eye consists of a display smaller than a grain of sand that directs its emission to the most sensitive part of the eye—the fovea—where photoreceptors are most numerous and densely packed. Later versions will include processors, eye-tracking sensors, and a communication chip, with a tiny battery powering the entire device.

[Image: Man looking at AR/VR screen]

Imaging Systems Go Digital

Similar to electronics several decades ago, today’s optics are primarily analog. Optical systems consist of complex bulky surfaces, stacks of thin films, or continuous shapes. Only recently has an alternative to this traditional approach emerged. Flat optics creates nanostructured devices (with features smaller than a wavelength) able to replicate optical functions such as focusing, color filtering, and reflecting. The main advantages of this technology are its compactness, versatility, and compatibility with semiconductor processes, unleashing new possibilities for optics-electronics co-integration. Many hurdles still need to be overcome to bring this technology to maturity. Leaders in materials and semiconductor processes, such as Applied Materials, are using their expertise to help drive flat optics into various volume markets.
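
To make the idea concrete, here is a minimal NumPy sketch of the kind of calculation behind a flat lens: it computes the standard hyperbolic phase profile a metalens must impose to focus a normally incident plane wave, sampled at sub-wavelength nanostructure positions. The wavelength, focal length, aperture, and pitch values are illustrative assumptions, not figures from this post.

```python
import numpy as np

# Hypothetical design values (not from the post): a small metalens for
# near-infrared light, sampled at sub-wavelength nanostructure positions.
lam = 0.94e-6      # design wavelength: 940 nm
f = 2.0e-3         # focal length: 2 mm
aperture = 0.5e-3  # lens diameter: 0.5 mm
pitch = 0.5e-6     # nanostructure spacing (smaller than the wavelength)

# Grid of nanostructure positions across the flat lens.
n = int(aperture / pitch)
coords = (np.arange(n) - n / 2) * pitch
x, y = np.meshgrid(coords, coords)
r = np.hypot(x, y)

# Standard hyperbolic phase profile that focuses a plane wave to a point
# at distance f, wrapped into [0, 2*pi).
phase = (-2 * np.pi / lam) * (np.sqrt(r**2 + f**2) - f)
phase_wrapped = np.mod(phase, 2 * np.pi)

# In a real design flow, each wrapped-phase value is then mapped to a
# nanopillar geometry from a pre-characterized library during layout.
print(f"{n * n:,} nanostructures, "
      f"unwrapped phase span {phase.min():.0f} to {phase.max():.0f} rad")
```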


Every optical element will transition from analog to digital. Fully digital imaging systems would collect or display images in a more agile and specific way. Foveated cameras or displays are an example of how pixelated imaging systems could be used in a more customized and efficient manner, bringing detailed visual information precisely to where it has to be processed and leaving other areas with less resolution. In a sense, digital imaging systems will mimic how our eyes function.
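
As a rough illustration of foveation, here is a small, hypothetical NumPy sketch that keeps full resolution inside a circular region around a gaze point and averages everything else into coarse blocks. Real foveated cameras and displays do this optically or at the readout level rather than in software, but the resolution-versus-resources trade-off is the same.

```python
import numpy as np

def foveate(image, gaze_xy, fovea_radius=64, block=8):
    """Keep full resolution inside a circle around the gaze point; elsewhere
    replace each block x block tile by its mean (a crude low-resolution stand-in)."""
    h, w = image.shape[:2]
    out = image.astype(float).copy()

    # Coarse "peripheral" version: tile-average, then expand back to full size.
    hc, wc = h - h % block, w - w % block
    tiles = out[:hc, :wc].reshape(hc // block, block, wc // block, block, -1)
    coarse = tiles.mean(axis=(1, 3))
    coarse_full = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    out[:hc, :wc] = coarse_full.reshape(out[:hc, :wc].shape)

    # Restore full detail inside the foveal circle around the gaze point.
    yy, xx = np.mgrid[:h, :w]
    fovea = (xx - gaze_xy[0]) ** 2 + (yy - gaze_xy[1]) ** 2 <= fovea_radius ** 2
    out[fovea] = image[fovea]
    return out

# Example: a random 480x640 RGB frame with the gaze at the image center.
frame = np.random.rand(480, 640, 3)
reduced = foveate(frame, gaze_xy=(320, 240))
```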

Designing for the Future: Imaging Systems Go Smart

Half of our brain is devoted to processing visual information. Imaging systems likewise need processing built in: they have to become smart, which means optics and electronics must be co-designed, co-optimized, and maybe even co-manufactured.

An imaging system is composed of a sensor that collects the light, a converter that transfers the raw sensor signal to a processing unit, and a computer vision unit that post-processes the digital signal to make a decision. Today, each component is designed separately by different teams and manufactured with different technologies.
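
A toy Python sketch of that conventional split, with invented stand-in functions for blocks that, in reality, are designed by different teams on different technologies and simply chained together:

```python
import numpy as np

# Stand-in functions (invented for illustration) for the three separately
# designed blocks: sensor -> converter -> computer vision unit.

def sensor(irradiance, full_well=10_000):
    """Photodiode array: convert normalized irradiance to collected electrons,
    including photon shot noise."""
    electrons = np.random.poisson(irradiance * full_well)
    return np.clip(electrons, 0, full_well)

def converter(electrons, bits=10, full_well=10_000):
    """ADC: quantize the analog sensor signal into N-bit digital codes."""
    return np.round(electrons / full_well * (2**bits - 1)).astype(np.uint16)

def vision_unit(codes, threshold=700):
    """Computer vision block: post-process the digital image and decide."""
    bright_fraction = np.mean(codes > threshold)
    return "object detected" if bright_fraction > 0.05 else "no object"

scene = np.random.rand(480, 640)              # normalized irradiance map
print(vision_unit(converter(sensor(scene))))  # conventional, strictly serial chain
```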

Near- or in-sensor computing is a major trend towards more innovative imaging systems with optimized energy consumption, low latency, and improved security. For example, CMOS image sensors could consist of a planar system-on-chip integrating a photodiode array, pulse-modulated circuits, and a simple analog-to-digital converter (ADC), placing a front-end processing unit near or inside the sensor layer. Other approaches jointly optimize the optics, the sensor, and the algorithms, then use them in tandem as the solution to a specific imaging problem: recognizing an object to make a decision is different from taking souvenir pictures, and a video surveillance camera should be designed differently from an eye-tracking one. Display engines could also become so smart and fully optimized that we might not even need to build intermediate images to render truly immersive content.
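
To illustrate what “jointly optimized” could mean in practice, here is a deliberately toy Python sketch (the task, scene, and parameter ranges are all invented for illustration): instead of freezing the optics first and then tuning the algorithm, a single end-to-end task metric is evaluated over both an optical parameter (a defocus blur width) and an algorithmic parameter (a detection threshold), and the best pair is kept.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def make_scene(has_object):
    """Toy 64x64 scene: noisy background, plus a bright patch if an object is present."""
    img = rng.normal(0.2, 0.05, (64, 64))
    if has_object:
        img[24:40, 24:40] += 0.5
    return img

def task_accuracy(blur_sigma, threshold, n_trials=200):
    """End-to-end task metric: detect the object through blurred 'optics'."""
    correct = 0
    for _ in range(n_trials):
        truth = rng.random() < 0.5
        measured = gaussian_filter(make_scene(truth), blur_sigma)  # the "optics"
        detected = measured.max() > threshold                      # the "algorithm"
        correct += detected == truth
    return correct / n_trials

# Joint search over one optical and one algorithmic parameter.
blurs = np.linspace(0.5, 6.0, 8)
thresholds = np.linspace(0.3, 0.9, 13)
scores = np.array([[task_accuracy(b, t) for t in thresholds] for b in blurs])
bi, ti = np.unravel_index(scores.argmax(), scores.shape)
print(f"best pair: blur={blurs[bi]:.1f}, threshold={thresholds[ti]:.2f}, "
      f"accuracy={scores[bi, ti]:.2f}")
```

In a real co-design flow, the search space would cover lens prescriptions, pixel architectures, and network weights, and gradient-based methods would replace the grid search, but the principle is the same: one task metric drives every component.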

The next generation of imaging systems promises to extend AR/VR technology so that it can replace personal computers and smartphones. It is an exciting time for the optics industry, and especially for the catalyst ecosystem forming around the design of smart imaging systems.


Sources:

  • [1] A compact camera module costs $6 for a smartphone: Technical innovations and market demand are shaping the Compact Camera Module industry – System Plus Consulting
  • [2] The lens set market is one-seventh of the global compact camera module market: Status of the Camera Module Industry 2019 – Focus on Wafer Level Optics – Yole Développement (systemplus.fr)
