Ams to Demo Under-Display 3D Sensor for Smartphones in 2H 2020

Image Sensors World

SeekingAlpha: The ams Q1 earnings report outlines the direction of the company's R&D and production efforts. The company plans to demo its behind-OLED-display (BOLED) 3D sensing technology in the second half of this year.


"In display management, we are seeing further adoption of our behind-OLED light sensing in high-volume Android smartphone and mobile device platforms, which includes several recent releases. This unmatched technology moves light and proximity sensing invisibly behind the OLED display so OEMs can remove bezel-placed elements from the device front and maximize the screen-to-body ratio.

Our strong market success continues to be driven by leading Android OEMs while we move along a multigeneration road map for this technology. All in all, we are shipping significant volumes of our wide range of advanced display management solutions across the leading consumer OEMs.

We continued our strong R&D investments for further innovation in optical sensing. Leveraging our unmatched behind-OLED capabilities into 3D sensing, we are progressing with our developments to move front-facing 3D sensing invisibly behind the display. Based on active stereo vision technology, we continue to expect to demonstrate a behind-OLED 3D solution in the second half. Here, we work to combine ams VCSEL illumination, near-infrared sensing, software and algorithms from our portfolio to create a high-performance 3D offering. We address the market trend to reduce visible components on the device front and expect significant market interest in 3D behind-OLED technology in mobile devices. Generally, both active stereo vision and structured light technology are able to support behind-OLED 3D sensing. So we plan to explore all paths for innovation in front-facing 3D.
"

Then, in Q&A part of the earnings call:

"...The question about structured light behind the OLED. Based on our technical information, it appears that it's possible, it's doable, and that's why we are keen to explore all the ways related to structured light, but also for active stereo vision to move this technology behind the OLED screen, because I strongly believe that will be the future, that you don't see any sensors on the display in a smartphone anymore. And for that reason, we see a future for both structured light and active stereo vision behind OLED.

Q: Just on the timing of structured light, is it the same timing that you have for active stereo vision, a product that could be showcased over the next 6 months or it will be after active stereo vision on your road map?

A: It will be later than active stereo vision, that's correct. So active stereo vision will happen this - in the second half of this year. Structured light will be a bit later. But we are working, as always, on road maps for the next quarters and years ahead so that we are always able to be the first to offer new technology to our customer base.

Q: The 3D behind-OLED, you added now structured light. When are you expecting industry adoption? As you mentioned, you will have it in the second half. And potentially what content could we look at for both ASV and structured light?

A: So 3D behind-OLED, that's, as I mentioned before, I think, a very attractive solution for our customer base, and we expect a very nice adoption from the end customers. Obviously, the industry adoption will be not this year, will be next year, probably end of next year. Our customer base has to start designs after we demonstrate with our demonstrator, which we will release during the second half of this year. So we have to be a bit sensitive, but I think the discussions with customers will start at the end of the year and continue into next year. And then it depends on the road map of customers when they bring this to market. But it's certainly, and that's why we're investing in this technology, a very positive and unique selling point for our customer base related to their end customers.
"


An ams application note for its TCS3701 BOLED Color and Proximity Sensor explains how it works:

"The IR transmissivity of an OLED display is inherently low, primarily due to the required layers of metallization that are present. In order to get the most reflected IR energy through the display to the photodiode, any non-essential, IR-blocking materials on the bottom side of the display (protective barriers, copper, glue, etc.), must be removed to essentially create an aperture on the bottom side of the display. The design and the shape of this aperture should be optimized to maximize proximity signal and to minimize crosstalk. An optimized display aperture is one that is aligned with the device’s package (or rubber boot) aperture and is the same shape but slightly oversized. The ideal aperture design for the sensor will consider the sensor’s Field-Of-View (FOV), boot thickness, air gap, mechanical tolerances, assembly tolerances, etc. The same guidelines above apply to the emitter Field-Of-Illumination (FOI)."
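The aperture-sizing guidance above reduces to a simple geometric calculation: the display-side aperture must match the package (boot) aperture shape, widened by the sensor's field-of-view cone over the stack height, plus tolerance padding. The sketch below illustrates this; the function name and all numeric values are illustrative assumptions, not figures from the app note.

```python
import math

def min_display_aperture_mm(package_aperture_mm, fov_deg,
                            stack_height_mm, tolerance_mm):
    """Estimate the minimum display-side aperture diameter.

    Per the app-note guidance, the display aperture should be the same
    shape as the package (boot) aperture but slightly oversized: the
    sensor's view cone widens by tan(FOV/2) per millimeter of boot
    thickness and air gap, and mechanical/assembly tolerances pad the
    result on each side. (Illustrative model, not an ams formula.)
    """
    half_angle = math.radians(fov_deg / 2.0)
    spread = 2.0 * stack_height_mm * math.tan(half_angle)
    return package_aperture_mm + spread + 2.0 * tolerance_mm

# Example with assumed values: a 1.0 mm boot aperture, 60 degree FOV,
# 0.5 mm combined boot thickness and air gap, 0.1 mm tolerance.
print(round(min_display_aperture_mm(1.0, 60.0, 0.5, 0.1), 2))
```

The same calculation would apply to the emitter side, substituting the field of illumination (FOI) for the FOV.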

