In a recent preprint (https://arxiv.org/pdf/2209.11772.pdf) Gyongy et al. describe a new 64x32 SPAD-based direct time-of-flight sensor with in-pixel histogramming and processing capability.
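As a quick refresher on what in-pixel histogramming does, here is a minimal single-pixel sketch. This is not the authors' design: the bin count, time range, and simple peak pick below are placeholder choices made only to illustrate the idea of accumulating SPAD timestamps into a histogram and reading depth off the peak.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def histogram_depth(timestamps_s, n_bins=16, t_max_s=100e-9):
    """Toy dToF depth estimate for one pixel from SPAD photon timestamps.

    n_bins and t_max_s are placeholder choices; real in-pixel histograms
    are tightly constrained by the memory available per pixel.
    """
    # The in-pixel memory effectively stores this histogram of arrival times.
    hist, edges = np.histogram(timestamps_s, bins=n_bins, range=(0.0, t_max_s))

    # Simple peak pick: the fullest bin is taken as the laser return.
    peak = int(np.argmax(hist))
    t_peak = 0.5 * (edges[peak] + edges[peak + 1])

    # Convert round-trip time to distance.
    return 0.5 * C * t_peak

# Example: a target near 3 m (about 20 ns round trip) plus uniform background.
rng = np.random.default_rng(0)
returns = rng.normal(20e-9, 0.5e-9, size=200)
background = rng.uniform(0.0, 100e-9, size=400)
print(f"{histogram_depth(np.concatenate([returns, background])):.2f} m")
```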
The June 2022 issue of IEEE Trans. Electron Devices has an invited paper titled "Direct Time-of-Flight Single-Photon Imaging" by Istvan Gyongy et al. from the University of Edinburgh and STMicroelectronics.
This is a comprehensive tutorial-style article on single-photon 3D imaging that includes a description of the image formation model starting from first principles, as well as practical system design considerations such as photon budget and power requirements.
Abstract: This article provides a tutorial introduction to the direct Time-of-Flight (dToF) signal chain and typical artifacts introduced due to detector and processing electronic limitations. We outline the memory requirements of embedded histograms related to desired precision and detectability, which are often the limiting factor in the array resolution. A survey of integrated CMOS dToF arrays is provided highlighting future prospects to further scaling through process optimization or smart embedded processing.
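To make the memory point concrete, here is a back-of-the-envelope calculation. The range, resolution, and counter width are my own placeholder assumptions, not numbers from the paper; only the 64x32 array size comes from the preprint mentioned above.

```python
C = 3e8  # speed of light (m/s)

# Placeholder assumptions for illustration only.
max_range_m = 10.0          # unambiguous range to cover
depth_resolution_m = 0.01   # desired depth bin size (1 cm)
counter_bits = 10           # width of each bin counter
pixels = 64 * 32            # array size of the sensor discussed above

n_bins = round(max_range_m / depth_resolution_m)  # one bin per resolution step
bin_width_s = 2 * depth_resolution_m / C          # corresponding TDC bin (~67 ps)
bits_per_pixel = n_bins * counter_bits
total_mbit = pixels * bits_per_pixel / 1e6

print(f"{n_bins} bins/pixel, {bits_per_pixel} bits/pixel, "
      f"~{total_mbit:.1f} Mbit of histogram memory for the array")
```

Even these modest placeholder targets already imply tens of megabits of on-chip histogram memory, which illustrates the paper's point that embedded histogram storage is often what limits array resolution.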
C. Bamji et al. from Microsoft published a paper titled "A Review of Indirect Time-of-Flight Technologies" in IEEE Trans. Electron Devices (June 2022).
Abstract: Indirect time-of-flight (iToF) cameras operate by illuminating a scene with modulated light and inferring depth at each pixel by combining the back-reflected light with different gating signals. This article focuses on amplitude-modulated continuous-wave (AMCW) time-of-flight (ToF), which, because of its robustness and stability properties, is the most common form of iToF. The figures of merit that drive iToF performance are explained and plotted, and system parameters that drive a camera’s final performance are summarized. Different iToF pixel and chip architectures are compared and the basic phasor methods for extracting depth from the pixel output values are explained. The evolution of pixel size is discussed, showing performance improvement over time. Depth pipelines, which play a key role in filtering and enhancing data, have also greatly improved over time with sophisticated denoising methods now available. Key remaining challenges, such as ambient light resilience and multipath invariance, are explained, and state-of-the-art mitigation techniques are referenced. Finally, applications, use cases, and benefits of iToF are listed.
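For readers who want the "basic phasor method" spelled out, here is a minimal sketch of the standard four-phase (0/90/180/270 degree) depth calculation. The variable names and the 20 MHz modulation frequency are illustrative choices, not parameters from the paper.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def amcw_depth(a0, a90, a180, a270, f_mod=20e6):
    """Four-phase AMCW iToF depth from per-pixel correlation samples.

    a0..a270: pixel outputs for gating signals shifted by 0/90/180/270 degrees.
    f_mod: modulation frequency (illustrative value).
    """
    # Differential pairs cancel the common-mode (ambient) component.
    i = a0 - a180
    q = a90 - a270
    phase = np.arctan2(q, i) % (2 * np.pi)   # phase delay of the return
    amplitude = 0.5 * np.hypot(i, q)         # modulation amplitude (confidence)
    depth = C * phase / (4 * np.pi * f_mod)  # phase -> round-trip time -> distance
    return depth, amplitude

# Example: samples consistent with a target at about 1.9 m for 20 MHz modulation.
print(amcw_depth(80.0, 120.0, 80.0, 40.0))
```

The unambiguous range of this scheme is C / (2 f_mod), i.e. 7.5 m at 20 MHz, which is why practical iToF cameras typically combine measurements at several modulation frequencies and unwrap the phase.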
Analog Devices has released an industrial-grade megapixel ToF module, the ADTF3175, and a VGA-resolution sensor, the ADSD3030, which aims to bring high-accuracy ToF technology to a compact VGA footprint.
In a blog post titled "Comparing Depth Cameras: iToF Versus Active Stereo", Refael Whyte of Chronoptics compares depth reconstructions from their indirect time-of-flight (iToF) "KEA" camera with active stereo from an Intel RealSense D435 sensor.
Depth data can also be overlaid on RGB to produce colored point-cloud visualizations, and the KEA camera gives much cleaner-looking results in this comparison.
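For anyone curious how such colored point clouds are generated in general, here is a generic pinhole-camera sketch. This is not Chronoptics' pipeline, and the intrinsics in the example are made up; the idea is simply that each depth pixel is back-projected using the camera intrinsics and paired with its registered RGB value.

```python
import numpy as np

def depth_to_colored_points(depth_m, rgb, fx, fy, cx, cy):
    """Back-project a depth map into an (N, 6) array of XYZ + RGB points.

    Assumes depth and RGB are already registered to the same camera
    (real systems need an extrinsic calibration step between the two sensors).
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    valid = z > 0  # drop pixels with no depth return
    xyz = np.stack([x[valid], y[valid], z[valid]], axis=-1)
    colors = rgb[valid].astype(np.float32) / 255.0
    return np.concatenate([xyz, colors], axis=-1)

# Toy example with made-up intrinsics and a flat 1 m scene.
depth = np.full((480, 640), 1.0)
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
cloud = depth_to_colored_points(depth, rgb, fx=580.0, fy=580.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (307200, 6)
```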
They show some limitations too. In one scene the floor has very low reflectivity in the IR, so the KEA camera struggles to collect enough photons there.
[PS: I wish all companies showed "failure cases" as part of their promotional materials!]
Full article here: https://medium.com/chronoptics-time-of-flight/comparing-depth-cameras-itof-versus-active-stereo-e163811f3ac8
A team from Stanford University's Laboratory for Integrated Nano-Quantum Systems (LINQS) and ArbabianLab presents a new method that can potentially convert any conventional CMOS image sensor into an amplitude-modulated continuous-wave time-of-flight camera. The paper, titled "Longitudinal piezoelectric resonant photoelastic modulator for efficient intensity modulation at megahertz frequencies," appeared in Nature Communications.
Abstract: Intensity modulators are an essential component in optics for controlling free-space beams. Many applications require the intensity of a free-space beam to be modulated at a single frequency, including wide-field lock-in detection for sensitive measurements, mode-locking in lasers, and phase-shift time-of-flight imaging (LiDAR). Here, we report a new type of single-frequency intensity modulator that we refer to as a longitudinal piezoelectric resonant photoelastic modulator. The modulator consists of a thin lithium niobate wafer coated with transparent surface electrodes. One of the fundamental acoustic modes of the modulator is excited through the surface electrodes, confining an acoustic standing wave to the electrode region. The modulator is placed between optical polarizers; light propagating through the modulator and polarizers is intensity modulated with a wide acceptance angle and record-breaking modulation efficiency in the megahertz frequency regime. As an illustration of the potential of our approach, we show that the proposed modulator can be integrated with a standard image sensor to effectively convert it into a time-of-flight imaging system.
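For readers unfamiliar with photoelastic modulators, the textbook picture is that the acoustic wave induces a time-varying birefringent retardance, which crossed polarizers convert into an intensity modulation. The relation below is the generic one for a sinusoidally driven retarder at 45 degrees between crossed polarizers; it glosses over the device-specific details of the longitudinal resonant geometry reported in the paper.

```latex
% Generic crossed-polarizer transmission for a sinusoidally driven
% photoelastic retarder (illustrative; not the paper's exact device model).
\[
  \delta(t) = \delta_0 \sin(\Omega t), \qquad
  T(t) = \sin^{2}\!\left(\tfrac{1}{2}\,\delta(t)\right)
\]
% The acoustic drive at frequency \Omega thus appears as an intensity
% modulation of the transmitted beam; the exact spectral content
% (fundamental versus harmonics) depends on the peak retardance \delta_0
% and any static bias retardance in the optical path.
```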