Thesis on Parasitic Light Sensitivity in Global Shutter Pixels

Toulouse University publishes a PhD thesis "Developing a method for modeling, characterizing and mitigating parasitic light sensitivity in global shutter CMOS image sensors" by Federico Pace.

"Though being treated as a figure of merit, there is no standard metric for measuring Parasitic Light Sensitivity in Global Shutter CMOS Image Sensors. Some measurement techniques have been presented in literature [Mey+11], though they may not apply for a general characterization of each pixel in the array. Chapter 4 presents a development of a standard metric for measuring Parasitic Light Sensitivity in Global Shutter CMOS Image Sensors that can be applied to the large variety of Global Shutter CMOS Image Sensors on the market.

The metric relies on Quantum Efficiency (QE) measurements, which are widely known in the image sensor community and well standardized. The metric allows per-pixel characterization at different wavelengths and at different impinging angles, thus allowing a more complete characterization of the Parasitic Light Sensitivity in Global Shutter CMOS Image Sensors."
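
For context, parasitic light sensitivity is commonly reported as the ratio of the storage node's response to the photodiode's response. Below is a minimal sketch of how such a per-pixel QE-ratio metric could be computed; the function name, the dB convention, and the numbers are illustrative assumptions, not the thesis's exact definition.

```python
import numpy as np

def parasitic_light_sensitivity(qe_storage, qe_photodiode, eps=1e-12):
    """Per-pixel PLS as the ratio of storage-node QE to photodiode QE,
    evaluated at one wavelength and one angle of incidence."""
    pls = qe_storage / np.maximum(qe_photodiode, eps)
    # global shutter efficiency expressed in dB (20*log10 convention)
    shutter_efficiency_db = -20.0 * np.log10(np.maximum(pls, eps))
    return pls, shutter_efficiency_db

# Example: 60% QE on the photodiode and 0.006% residual QE on the shielded
# storage node gives PLS = 1e-4, i.e. a 1/10,000 shutter ratio (80 dB).
pls, se_db = parasitic_light_sensitivity(np.full((4, 4), 6e-5),
                                         np.full((4, 4), 0.60))
print(pls[0, 0], se_db[0, 0])
```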

LiDAR with Entangled Photons

EPFL and Glasgow University publish an Optics Express paper "Light detection and ranging with entangled photons" by Jiuxuan Zhao, Ashley Lyons, Arin Can Ulku, Hugo Defienne, Daniele Faccio, and Edoardo Charbon.

"Single-photon light detection and ranging (LiDAR) is a key technology for depth imaging through complex environments. Despite recent advances, an open challenge is the ability to isolate the LiDAR signal from other spurious sources including background light and jamming signals. Here we show that a time-resolved coincidence scheme can address these challenges by exploiting spatio-temporal correlations between entangled photon pairs. We demonstrate that a photon-pair-based LiDAR can distill desired depth information in the presence of both synchronous and asynchronous spurious signals without prior knowledge of the scene and the target object. This result enables the development of robust and secure quantum LiDAR systems and paves the way to time-resolved quantum imaging applications."

Polarization Event Camera

AIT Austrian Institute of Technology, ETH Zurich, Western Sydney University, and University of Illinois at Urbana-Champaign publish a pre-print paper "Bio-inspired Polarization Event Camera" by Germain Haessig, Damien Joubert, Justin Haque, Yingkai Chen, Moritz Milde, Tobi Delbruck, and Viktor Gruev.

"The stomatopod (mantis shrimp) visual system has recently provided a blueprint for the design of paradigm-shifting polarization and multispectral imaging sensors, enabling solutions to challenging medical and remote sensing problems. However, these bioinspired sensors lack the high dynamic range (HDR) and asynchronous polarization vision capabilities of the stomatopod visual system, limiting temporal resolution to ~12 ms and dynamic range to ~ 72 dB. Here we present a novel stomatopod-inspired polarization camera which mimics the sustained and transient biological visual pathways to save power and sample data beyond the maximum Nyquist frame rate. This bio-inspired sensor simultaneously captures both synchronous intensity frames and asynchronous polarization brightness change information with sub-millisecond latencies over a million-fold range of illumination. Our PDAVIS camera is comprised of 346x260 pixels, organized in 2-by-2 macropixels, which filter the incoming light with four linear polarization filters offset by 45 degrees. Polarization information is reconstructed using both low cost and latency event-based algorithms and more accurate but slower deep neural networks. Our sensor is used to image HDR polarization scenes which vary at high speeds and to observe dynamical properties of single collagen fibers in bovine tendon under rapid cyclical loads."

SWIR Startup Trieye Collaborates with Automotive Tier 1 Supplier Hitachi Astemo

PRNewswire: TriEye announces a collaboration with Hitachi Astemo, a Tier 1 automotive supplier of world-class products. TriEye's SEDAR (Spectrum Enhanced Detection And Ranging) has also received significant recognition, having been named a CES 2022 Innovation Award Honoree in the Vehicle Intelligence category.

"We believe that TriEye's SEDAR can provide autonomous vehicles with ranging and accurate detection capabilities that are needed to increase the safety and operability under all visibility conditions," says John Nunneley, SVP Design Engineering, Hitachi Astemo Americas, Inc.

SeeDevice Focuses on SWIR Sensing and Joins John Deere’s 2022 Startup Collaborator Program

GlobeNewswire: Deere & Company announces the companies that will be part of the 2022 cohort of its Startup Collaborator program, including SeeDevice. The program was launched in 2019 to enhance and deepen the company's interaction with startups whose technology could add value for John Deere customers.

SeeDevice is said to be a pioneer in CMOS-based SWIR image sensor technology, the first of its kind, based on quantum tunneling and plasmonic phenomena in a standard logic CMOS process. A fabless quantum image sensor licensing company, SeeDevice will collaborate with John Deere to apply its Quantum Photo-Detection (QPD) CMOS SWIR image sensor technology to agricultural and industrial applications and solutions. SeeDevice's technology achieves broad-spectrum detection from a single CMOS pixel, covering visible and near-infrared (NIR, ~400-1,100 nm) wavelengths up to short-wave infrared (SWIR, ~1,600 nm), and is manufactured on a standard logic CMOS process.

"We're very honored to be invited to Deere's Start-up Collaborator program. The feasibility of a single-sensor solution from visible to SWIR wavelengths opens the doors to new industrial use-cases previously not possible due to the limitations of performance, cost, power, and size. To our knowledge, it is the first in the industry to achieve this level of performance, so we're excited to be working with John Deere to enhance next-generation image sensing devices with quantum sensing," said Thomas Kim, CEO and Founder of SeeDevice. 

SeeDevice has redesigned its website emphasizing the SWIR sensitivity of its image sensors:

Omnivision Unveils its New Logo

Omnivision publishes short videos explaining its new logo:

UV Sensors in SOI Process

Tower publishes an MDPI paper "Embedded UV Sensors in CMOS SOI Technology" by Michael Yampolsky, Evgeny Pikhay, and Yakov Roizin.

"We report on ultraviolet (UV) sensors employing high voltage PIN lateral photodiode strings integrated into the production RF SOI (silicon on isolator) CMOS platform. The sensors were optimized for applications that require measurements of short wavelength ultraviolet (UVC) radiation under strong visible and near-infrared lights, such as UV used for sterilization purposes, e.g., COVID-19 disinfection. Responsivity above 0.1 A/W in the UVC range was achieved, and improved blindness to visible and infrared (IR) light demonstrated by implementing back-end dielectric layers transparent to the UV, in combination with differential sensing circuits with polysilicon UV filters. Degradation of the developed sensors under short wavelength UV was investigated and design and operation regimes allowing decreased degradation were discussed. Compared with other embedded solutions, the current design is implemented in a mass-production CMOS SOI technology, without additional masks, and has high sensitivity in UVC."

Nanostructure Modifiers for Pixel Spectral Response

University of California – Davis and W&WSens publish an Arxiv.org paper "Reconstruction-based spectroscopy using CMOS image sensors with random photon-trapping nanostructure per sensor" by Ahasan Ahamed, Cesar Bartolo-Perez, Ahmed Sulaiman Mayet, Soroush Ghandiparsi, Lisa McPhillips, Shih-Yuan Wang, M. Saif Islam.

"Emerging applications in biomedical and communication fields have boosted the research in the miniaturization of spectrometers. Recently, reconstruction-based spectrometers have gained popularity for their compact size, easy maneuverability, and versatile utilities. These devices exploit the superior computational capabilities of recent computers to reconstruct hyperspectral images using detectors with distinct responsivity to different wavelengths. In this paper, we propose a CMOS compatible reconstruction-based on-chip spectrometer pixels capable of spectrally resolving the visible spectrum with 1 nm spectral resolution maintaining high accuracy (>95 %) and low footprint (8 um x 8 um), all without the use of any additional filters. A single spectrometer pixel is formed by an array of silicon photodiodes, each having a distinct absorption spectrum due to their integrated nanostructures, this allows us to computationally reconstruct the hyperspectral image. To achieve distinct responsivity, we utilize random photon-trapping nanostructures per photodiode with different dimensions and shapes that modify the coupling of light at different wavelengths. This also reduces the spectrometer pixel footprint (comparable to conventional camera pixels), thus improving spatial resolution. Moreover, deep trench isolation (DTI) reduces the crosstalk between adjacent photodiodes. This miniaturized spectrometer can be utilized for real-time in-situ biomedical applications such as Fluorescence Lifetime Imaging Microscopy (FLIM), pulse oximetry, disease diagnostics, and surgical guidance."

Image Sensor Facts for Kids

Kiddle, an encyclopedia for kids, publishes a page about image sensors:

Recent Videos: EnliTech, IPVM, Scantinel, Infiray, Guide Sensmart, Omron, Ibeo

EnliTech presents its CIS wafer testing solutions:

IPVM publishes "Intro to Surveillance Cameras:"

Scantinel presents its FMCW LiDAR:

Infiray presents a bright future for thermal cameras in ADAS applications:

Guide Sensmart presents the world's first smartphone thermal camera with AF:

Omron publishes a webinar about its QVGA ToF sensor capable of operating under 100 klux ambient light:

Ibeo publishes a webinar about its SPAD-based automotive "Digital LiDAR:"

Bankrupt HiDM is Acquired by Rongxin Semiconductor

JW Insights reports that Rongxin Semiconductor has acquired the bankrupt HiDM (Huaian Imaging Device Manufacturing Corporation) of Huaian, Jiangsu province, through an auction, paying RMB 1.666 billion ($262.1 million) for HiDM's assets. Rongxin Semiconductor was founded in April 2021 in Ningbo.

As a privately funded player, Rongxin's entry into wafer manufacturing by rescuing HiDM represents a new way of resolving the failed mega semiconductor projects of the last several years. It is also regarded as a new force in expanding China's foundry capacity.

Rongxin mainly focuses on 12-inch production lines for CIS and other chips at the 90-55 nm nodes. The company's WLCSP TSV line focuses on advanced packaging and testing of CIS products.

Rongxin has formed strategic cooperation partnerships with several companies, including OmniVision. It is currently completing fab construction and hiring personnel: it needs a total of about 1,500 employees, including 70 in management, 650 in technical roles, and 780 in production.

EI 2022 Course on Signal Processing for Photon-Limited Imaging

Stanley Chan from Purdue University publishes the slides of his 2022 Electronic Imaging short course "Signal Processing for Photon-Limited Imaging." A few of the 81 slides:

Actlight DPD Presentation

Actlight CEO Serguei Okhonin presented "Dynamic Photodiodes: Unique Light-Sensing Technology with Tunable Sensitivity" at the Photonics Spectra Conference held on-line last week. Conference registration is free of charge. A few slides from the presentation:

"Electrostatic Doping" for In-Sensor Computing

Harvard University, KIST, Pusan University, and Samsung Advanced Institute of Technology publish a pre-print paper "In-sensor optoelectronic computing using electrostatically doped silicon" by Houk Jang, Henry Hinton, Woo-Bin Jung, Min-Hyun Lee, Changhyun Kim, Min Park, Seoung-Ki Lee, Seongjun Park, and Donhee Ham.

"Complementary metal-oxide-semiconductor (CMOS) image sensors are a visual outpost of many machines that interact with the world. While they presently separate image capture in front-end silicon photodiode arrays from image processing in digital back-ends, efforts to process images within the photodiode array itself are rapidly emerging, in hopes of minimizing the data transfer between sensing and computing, and the associated overhead in energy and bandwidth. Electrical modulation, or programming, of photocurrents is requisite for such in-sensor computing, which was indeed demonstrated with electrostatically doped, but non-silicon, photodiodes. CMOS image sensors are currently incapable of in-sensor computing, as their chemically doped photodiodes cannot produce electrically tunable photocurrents. Here we report in-sensor computing with an array of electrostatically doped silicon p-i-n photodiodes, which is amenable to seamless integration with the rest of the CMOS image sensor electronics. This silicon-based approach could more rapidly bring in-sensor computing to the real world due to its compatibility with the mainstream CMOS electronics industry. Our wafer-scale production of thousands of silicon photodiodes using standard fabrication emphasizes this compatibility. We then demonstrate in-sensor processing of optical images using a variety of convolutional filters electrically programmed into a 3 × 3 network of these photodiodes."

Quanta Image Sensor Presentation

Eric Fossum presented a keynote "Quanta Image Sensors: Every Photon Counts, Even in a Smartphone" at the Photonics Spectra Conference last week (free registration). A few slides:

TrendForce: 4-camera Modules in Smartphones Become Less Popular

TrendForce analysis shows that 4-camera modules in smartphones are becoming less popular:

"The trend towards multiple cameras started to shift in 2H21 after a few years of positive growth. The previous spike in the penetration rate of four camera modules was primarily incited by mid-range smart phone models in 2H20 when mobile phone brands sought to market their products through promoting more and more cameras. However, as consumers realized that the macro and depth camera usually featured on the third and fourth cameras were used less frequently and improvements in overall photo quality limited, the demand for four camera modules gradually subsided and mobile phone brands returned to fulfilling the actual needs of consumers.

Overall, TrendForce believes that the number of camera modules mounted on smartphones will no longer be the main focus of mobile phone brands, as focus will return to the real needs of consumers. Therefore, triple camera modules will remain the mainstream design for the next 2~3 years.

Although camera shipment growth has slowed, camera resolution continues to improve. Taking primary cameras as an example, the current mainstream design is 13-48 million pixels, accounting for more than 50% of cameras in 2021. In second place are products featuring 49-64 million pixels which accounted for more than 20% of cameras last year with penetration rate expected to increase to 23% in 2022. The third highest portion is 12 million pixel products, currently dominated by the iPhone and Samsung's flagship series."

Samsung Hyper-Spectral Sensor for Mobile Applications

De Gruyter Nanophotonics publishes a paper "Compact meta-spectral image sensor for mobile applications" by Jaesoong Lee, Yeonsang Park, Hyochul Kim, Young-Zoon Yoon, Woong Ko, Kideock Bae, Jeong-Yub Lee, Hyuck Choo, and Young-Geun Roh from Samsung Advanced Institute of Technology and Chungnam National University.

"We have demonstrated a compact and efficient metasurface-based spectral imager for use in the near-infrared range. The spectral imager was created by fabricating dielectric multilayer filters directly on top of the CMOS image sensor. The transmission wavelength for each spectral channel was selected by embedding a Si nanopost array of appropriate dimensions within the multilayers on the corresponding pixels, and this greatly simplified the fabrication process by avoiding the variation of the multilayer-film thicknesses. The meta-spectral imager shows high efficiency and excellent spectral resolution up to 2.0 nm in the near-infrared region. Using the spectral imager, we were able to measure the broad spectra of LED emission and obtain hyperspectral images from wavelength-mixed images. This approach provides ease of fabrication, miniaturization, low crosstalk, high spectral resolution, and high transmission. Our findings can potentially be used in integrating a compact spectral imager in smartphones for diverse applications."

One More Terabee iToF Webinar

Terabee publishes its "An introduction to Time-of-Flight sensing" webinar:

Emberion Raises €6M

Emberion, a developer of SWIR image sensors using nanomaterials, has raised €6M in funding from Nidoco AB, Tesi (Finnish Industry Investment Ltd) and Verso Capital.

“We are disrupting multiple imaging markets by extending the wavelength range at a significantly more affordable cost. Our revolutionary sensor is designed to meet the needs of even the most challenging machine vision applications, such as plastic sorting. We look forward to helping customers access new information at infrared wavelengths, thereby critically enhancing their applications beyond today’s capabilities,” said Jyrki Rosenberg, CEO, Emberion.

“We have created a new generation of image sensors using nanomaterials. Our high-performance industrial cameras can increase efficiency and reduce the loss of resources in many industrial processes. We innovate at all levels of camera design: nanomaterials, integrated circuit design, electronics, photonics and software. We are now stepping forward to expand our capacity to manufacture,” commented Tapani Ryhänen, CTO.

“We are appreciative of the high interest and trust towards our technology from investors and customers. With this funding, our next step is to increase our production capacity to be able to serve our customers’ needs. We will also intensify our efforts to further develop mid-wave infrared (MWIR) and broadband solutions to expand our offerings and to enhance the capabilities of our current VIS-SWIR product line,” added Rosenberg.

IDTechEx on SWIR Sensor Technologies

The Photonics Spectra conference held on-line this week (with free registration) features IDTechEx analyst Matthew Dyson's presentation "Emerging Short-Wavelength Infrared Sensors." A few slides from the presentation:

SWIR Multi-Spectral Sensor

Phys.org: Nature publishes Eindhoven University of Technology's paper "Integrated near-infrared spectral sensing" by Kaylee D. Hakkel, Maurangelo Petruzzella, Fang Ou, Anne van Klinken, Francesco Pagliano, Tianran Liu, Rene P. J. van Veldhoven, and Andrea Fiore.

"Spectral sensing is increasingly used in applications ranging from industrial process monitoring to agriculture. Sensing is usually performed by measuring reflected or transmitted light with a spectrometer and processing the resulting spectra. However, realizing compact and mass-manufacturable spectrometers is a major challenge, particularly in the infrared spectral region where chemical information is most prominent. Here we propose a different approach to spectral sensing which dramatically simplifies the requirements on the hardware and allows the monolithic integration of the sensors. We use an array of resonant-cavity-enhanced photodetectors, each featuring a distinct spectral response in the 850-1700 nm wavelength range. We show that prediction models can be built directly using the responses of the photodetectors, despite the presence of multiple broad peaks, releasing the need for spectral reconstruction. The large etendue and responsivity allow us to demonstrate the application of an integrated near-infrared spectral sensor in relevant problems, namely milk and plastic sensing. Our results open the way to spectral sensors with minimal size, cost and complexity for industrial and consumer applications."

Infiray Presents its Visible and Thermal Camera Fusion for Automotive Applications

Infiray presents its "new breakthrough on Automotive Night Vision:"

Pixart Introduces Low-Power Intelligent Object Detecting Sensor

It appears that integrating a vision processor with an image sensor is becoming a trend. First Himax and Sony did it, and now Pixart.

Pixart presents its Low-Power Intelligent Object Detection (LIOD) product line and the working principle of its PAG7681LS sensor.

Correction: I received the following email from Pixart that corrects my post:

"Seem like you may misunderstand on the offering of our Low-power Intelligent Object Detection (LIOD) product. I would like to clarify that the PixArt supplying LIOD as a “processor” in the form of stand-alone chip that could be externally assembled with our image sensor to work as a complete intelligent vision system. The PAG7681LS is not a "Sensor" nor "Sensor Module" that integrates a vision processor with image sensor in the same board package. However, we do offer custom design service for any system module based on the custom requirements.

The PCB shown in the PAG7681LS product landing page in our website is the 3-in1 evaluation board for the solution of PixArt's Smart Forehead Temperature Detection that includes the three parts: PAF9701C1 (Far Infrared Sensor), PAG7920LT (Global Shutter Image Sensor), and PAG7681LS (Low-Power Intelligent Object Detection). In our current product line card, we supply the PAF9701C1, PAG7920LT and PAG7681LS as 3 stand-alone parts."


SiPM in 55nm Globalfoundries BCD Process

EPFL, Globalfoundries, and KIST publish an Arxiv.org paper "On Analog Silicon Photomultipliers in Standard 55-nm BCD Technology for LiDAR Applications" by Jiuxuan Zhao, Tommaso Milanese, Francesco Gramuglia, Pouyan Keshavarzian, Shyue Seng Tan, Michelle Tng, Louis Lim, Vinit Dhulla, Elgin Quek, Myung-Jae Lee, and Edoardo Charbon.

"We present an analog silicon photomultiplier (SiPM) based on a standard 55 nm Bipolar-CMOS-DMOS (BCD) technology. The SiPM is composed of 16×16 single-photon avalanche diodes (SPADs) and measures 0.29×0.32 mm2. Each SPAD cell is passively quenched by a monolithically integrated 3.3 V thick oxide transistor. The measured gain is 3.4× 10^5 at 5 V excess bias voltage. The single-photon timing resolution (SPTR) is 185 ps and the multiple-photon timing resolution (MPTR) is 120 ps at 3.3 V excess bias voltage. We integrate the SiPM into a co-axial light detection and ranging (LiDAR) system with a time-correlated single-photon counting (TCSPC) module in FPGA. The depth measurement up to 25 m achieves an accuracy of 2 cm and precision of 2 mm under the room ambient light condition. With co-axial scanning, the intensity and depth images of complex scenes with resolutions of 128×256 and 256×512 are demonstrated. The presented SiPM enables the development of cost-effective LiDAR system-on-chip (SoC) in the advanced technology."

Sony Drone Features 10 Image Sensors

Sony publishes an extended "Airpeak S1 developer interview: 2nd Sensing Edition." A few quotes:

"The Airpeak S1 is equipped with many sensors to provide maneuverability and safe flight. The most important of these is a 5-way stereo camera that captures information about the aircraft and its surroundings from images. In the "Airpeak S1", the vision sensing processor uses these 5 x 2 = 10 image sensors, and the IMU (Inertial Measurement Unit), which detects the angular velocity and acceleration of the aircraft, for a total of 11 sensors. 

Above all, I wanted to meet the user's expectation that the drone fly safely. Safety is ensured by detecting obstacles in the surroundings and by performing highly accurate self-position estimation even under the sudden movements and strong vibrations of the highly mobile Airpeak S1, so that stable flight can be provided even in places where GNSS reception is difficult.

With that in mind, if there were only two stereo cameras, there is a risk that the necessary information could not be obtained in some locations and environments. Since vision sensing with a stereo camera requires a textured object within the measurement range, increasing the number of observable directions makes it possible to detect surrounding obstacles and makes it easier to satisfy the conditions for reliable self-position estimation. However, installing five stereo cameras is not easy. In addition to the mounting location, cost, and weight issues mentioned earlier, we need a high-performance processor with many input channels that can process a large amount of video in real time."
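
The texture requirement comes from the stereo matching step: depth follows the textbook pinhole relation Z = f·B/d, and without texture there is no reliable disparity d. A minimal illustration with made-up numbers (not Sony's pipeline):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation Z = f * B / d; zero disparity means no
    match (or a point at infinity)."""
    return focal_px * baseline_m / disparity_px if disparity_px > 0 else float("inf")

# e.g. f = 700 px, B = 8 cm, d = 14 px  ->  Z = 4 m (illustrative numbers)
print(stereo_depth_m(700, 0.08, 14))
```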


ST Presents 4.6um Pixel iToF Sensor

At ESSCIRC 2021, ST presents its 4.6um iToF sensor, which appears to be quite close to production judging by the large team size: "4.6μm Low Power Indirect Time-of-Flight Pixel Achieving 88.5% Demodulation Contrast at 200MHz for 0.54MPix Depth Camera" by Cédric Tubert, Pascal Mellot, Yann Desprez, Celine Mas, Arnaud Authié, Laurent Simony, Grégory Bochet, Stephane Drouard, Jeremie Teyssier, Damien Miclo, Jean-Raphael Bezal, Thibault Augey, Franck Hingant, Thomas Bouchet, Blandine Roig, Aurélien Mazard, Raoul Vergara, Gabriel Mugny, Arnaud Tournier, Frédéric Lalanne, François Roy, Boris Rodrigues Goncalves, Matteo Vignetti, Pascal Fonteneau, Vincent Farys, François Agut, Joao Miguel Melo Santos, David Hadden, Kevin Channon, Christopher Townsend, Bruce Rae, and Sara Pellegrini.
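
For reference, a 4-phase iToF pixel recovers depth from the phase of the returned modulation; at 200 MHz the unambiguous range is c/(2·f_mod) ≈ 0.75 m, extended in practice with multiple modulation frequencies, and a higher demodulation contrast directly lowers the depth noise. A textbook-form sketch using one common sign convention, not ST's pipeline:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def itof_depth(c0, c90, c180, c270, f_mod=200e6):
    """c0..c270: correlation samples at 0/90/180/270 degree demodulation phases."""
    c0, c90, c180, c270 = (np.asarray(x, dtype=float) for x in (c0, c90, c180, c270))
    phase = np.arctan2(c270 - c90, c0 - c180) % (2 * np.pi)
    depth = C * phase / (4 * np.pi * f_mod)        # metres, modulo ~0.75 m at 200 MHz
    amp = 0.5 * np.hypot(c0 - c180, c270 - c90)    # modulation amplitude
    offset = 0.25 * (c0 + c90 + c180 + c270)       # mean level (incl. ambient)
    contrast = amp / np.maximum(offset, 1e-9)      # demodulation contrast
    return depth, contrast
```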

PMD-Infineon Advances in FSI iToF Sensors

The ESSCIRC 2021 paper and presentation "Advancements in Indirect Time of Flight Image Sensors in Front Side Illuminated CMOS" by M. Dielacher, M. Flatscher, R. Gabl, R. Gaggl, D. Offenberg, and J. Prima are available on-line. A few interesting slides:

ORRAM Neuromorphic Vision Sensor

Hong Kong Polytechnic University publishes a video explaining its optoelectronic resistive random-access memory (ORRAM) based vision sensor with image recognition capabilities:

LiDAR News: Voyant, Innoviz, Quanergy

PRNewswire: Lidar-on-a-chip startup Voyant Photonics raises $15.4M in a Series A round. Voyant's LiDAR system, containing thousands of optical components fabricated on a single semiconductor chip, enables its customers to integrate a more effective and far more scalable LiDAR system than has been possible to date.


Innoviz CEO Omer Keilaf presents the requirements for automotive LiDAR that the company has learned over its six years of existence:


PRNewswire: The new Innoviz360 HD LiDAR is claimed to be a breakthrough that leapfrogs traditional, standard-resolution spinner solutions, which are performance-limited, expensive, big, and unreliable. Unlike traditional spinners limited to 128 scanning lines, the new lightweight Innoviz360 allows multiple scanning software configurations with up to 1,280 scanning lines (10x) in a more cost-effective and durable package than traditional 360-degree LiDARs.


BusinessWire: Quanergy reports achieving 200 m range with its S3 Series OPA LiDAR. The demo included the detection and tracking of a 10%-reflectivity target mounted on a vehicle stationed 200 meters away, as well as a person in dark clothes walking the entire distance to the vehicle.

Ge/MoS2 Broadband Detectors

Science publishes a paper "Visible and infrared dual-band imaging via Ge/MoS2 van der Waals heterostructure" by Aujin Hwang, Minseong Park, Youngseo Park, Yeongseok Shim, Sukhyeong Youn, Chan-Ho Lee, Han Beom Jeong, Hu Young Jeong, Jiwon Chang, Kyusang Lee, Geonwook Yoo, Junseok Heo from Ajou University, Yonsei University, Soongsil University, KAIST, UNIST (Korea), and University of Virginia (USA).

"In this paper, we demonstrate a Ge/MoS2 van der Waals heterojunction photodetector for VIS- and IR-selective detection capability under near-photovoltaic and photoconductive modes. The simplified single-polarity bias operation using single pixel could considerably reduce structural complexity and minimize peripheral circuitry for multispectral selective detection. The proposed multispectral photodetector provides a potential pathway for the integration of VIS/NIR vision for application in self-driving, surveillance, computer vision, and biomedical imaging."
