Assorted Videos: Elmos, Mizoram University, PMD, Sense Photonics

Elmos publishes a gesture recognition demo based on its ToF sensor:


Mizoram University, India, publishes a 1-hour introduction to CMOS sensor technology by Bhaskar Choubey from Universität Siegen and the Fraunhofer Institute, Germany:


PMD video talks about the company's long-range iToF development:


Another PMD video - an interview with the company's CEO Bernd Buxbaum talking about the ability to maintain the same performance while shrinking the pixel size:


Sense Photonics presents its automotive flash LiDAR platform:

Sony Presents 2.9um Pixel with 88dB DR in a Single Exposure

Sony announces the upcoming release of IMX585, a 1/1.2-type 4K CMOS sensor for security cameras, which delivers approximately 8 times the DR of the company's conventional model in a single exposure.

The new product belongs to the new “STARVIS 2” series featuring high sensitivity and 88dB HDR in a single exposure. It also increases the NIR sensitivity by approximately 1.7x compared to the conventional model. The new sensor can also be used in multi-exposure mode, delivering an HDR of 106 dB.

Sony also plans to launch IMX662, a 1/3-type 2K resolution image sensor employing “STARVIS 2” to deliver an 88 dB DR in a single exposure. Samples are scheduled to be made available for shipment within this year.

Generally, in order to provide HDR imaging, multi-exposure image capture is required, and the multiple images recorded at differing exposure times are composited into a single shot. This results in issues with artifacts, which can cause false recognition when using AI, especially when capturing moving subjects. The new product adopts Sony’s new proprietary “STARVIS 2” technology, which delivers both high sensitivity and HDR imaging.
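
For reference, here is a minimal sketch converting the quoted dB figures into linear ratios, assuming the usual 20*log10 definition of image-sensor dynamic range:

    def dr_db_to_ratio(dr_db):
        # Image-sensor dynamic range in dB is 20*log10(max_signal / noise_floor)
        return 10 ** (dr_db / 20)

    print(f"88 dB single exposure -> {dr_db_to_ratio(88):,.0f}:1")   # ~25,000:1
    print(f"106 dB multi-exposure -> {dr_db_to_ratio(106):,.0f}:1")  # ~200,000:1

Note that the "approximately 8 times" improvement corresponds to about +18 dB, which would put the conventional model's single-exposure figure near 70 dB.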

2021 International Image Sensor Workshop Unveils Draft Agenda

The International Image Sensor Workshop (IISW 2021), to be held online on September 20-23, has published its draft agenda. There are 52 Regular papers, 23 Posters (Flash presentations), and 2 Invited papers. The registration is open here.

Quantum 3D Imaging Promises All-in-Focus, Low Noise, High Resolution, Scanning-Free Depth Images

Arxiv.org publishes the paper "Towards quantum 3D imaging devices: the Qu3D project" by Cristoforo Abbattista, Leonardo Amoruso, Samuel Burri, Edoardo Charbon, Francesco Di Lena, Augusto Garuccio, Davide Giannella, Zdenek Hradil, Michele Iacobellis, Gianlorenzo Massaro, Paul Mos, Libor Motka, Martin Paur, Francesco V. Pepe, Michal Peterek, Isabella Petrelli, Jaroslav Rehacek, Francesca Santoro, Francesco Scattarella, Arin Ulku, Sergii Vasiukov, Michael Wayne, Milena D'Angelo, Claudio Bruschini, Maria Ieronymaki, and Bohumil Stoklasa, from Planetek Hellas (Greece), EPFL (Switzerland), INFN (Italy), Università degli Studi di Bari (Italy), and Palacký University (Czech Republic).

"We review the advancement of the research toward the design and implementation of quantum plenoptic cameras, radically novel 3D imaging devices that exploit both momentum-position entanglement and photon-number correlations to provide the typical refocusing and ultra-fast, scanning-free, 3D imaging capability of plenoptic devices, along with dramatically enhanced performances, unattainable in standard plenoptic cameras: diffraction-limited resolution, large depth of focus, and ultra-low noise. To further increase the volumetric resolution beyond the Rayleigh diffraction limit, and achieve the quantum limit, we are also developing dedicated protocols based on quantum Fisher information. However, for the quantum advantages of the proposed devices to be effective and appealing to end-users, two main challenges need to be tackled. First, due to the large number of frames required for correlation measurements to provide an acceptable SNR, quantum plenoptic imaging would require, if implemented with commercially available high-resolution cameras, acquisition times ranging from tens of seconds to a few minutes. Second, the elaboration of this large amount of data, in order to retrieve 3D images or refocusing 2D images, requires high-performance and time-consuming computation. To address these challenges, we are developing high-resolution SPAD arrays and high-performance low-level programming of ultra-fast electronics, combined with compressive sensing and quantum tomography algorithms, with the aim to reduce both the acquisition and the elaboration time by two orders of magnitude. Routes toward exploitation of the QPI devices will also be discussed."

Radiation Detection with iPhone 6s Camera

Nature publishes a paper "The suitability of smartphone camera sensors for detecting radiation" by Yehia H. Johary, Jamie Trapp, Ali Aamry, Hussin Aamri, N. Tamam & A. Sulieman from Queensland University of Technology (Australia), King Saud Medical City (Saudi Arabia), Princess Nourah Bint Abdulrahman University (Saudi Arabia), and Prince Sattam Bin Abdulaziz University (Saudi Arabia).

"The advanced image sensors installed on now-ubiquitous smartphones can be used to detect ionising radiation in addition to visible light. Radiation incidents on a smartphone camera’s Complementary Metal Oxide Semiconductor (CMOS) sensor creates a signal which can be isolated from a visible light signal to turn the smartphone into a radiation detector. This work aims to report a detailed investigation of a well-reviewed smartphone application for radiation dosimetry that is available for popular smartphone devices under a calibration protocol that is typically used for the commercial calibration of radiation detectors. The iPhone 6s smartphone, which has a CMOS camera sensor, was used in this study. Black tape was utilized to block visible light. The Radioactivity counter app developed by Rolf-Dieter Klein and available on Apple’s App Store was installed on the device and tested using a calibrated radioactive source, calibration concrete pads with a range of known concentrations of radioactive elements, and in direct sunlight. The smartphone CMOS sensor is sensitive to radiation doses as low as 10 µGy/h, with a linear dose response and an angular dependence. The RadioactivityCounter app is limited in that it requires 4–10 min to offer a stable measurement. The precision of the measurement is also affected by heat and a smartphone’s battery level. Although the smartphone is not as accurate as a conventional detector, it is useful enough to detect radiation before the radiation reaches hazardous levels. It can also be used for personal dose assessments and as an alarm for the presence of high radiation levels."

Yole Updates about 2020-21 CIS Market and Market Shares

Yole Développement publishes an article "COVID-19 & Huawei ban winds of change on the CIS industry – Quarterly Market Monitor":

2020 saw CIS revenues reach $20.7B with an annual growth of 7.3%. As with other semiconductor products, Yole’s analysts noted that long production cycles and the activity of markets such as consumer, automotive, security, and industrial led to challenges in procurement of CIS at the end of 2020. In 2021, Yole’s imaging team expects a more stable situation. Q1-2021 has been very good due to some production overflow identified from Q4-2020, leading to a 7% better quarter compared to Q1-2020.
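
A quick sanity check of the quoted figures, using only the numbers in the article:

    rev_2020 = 20.7           # $B CIS revenue in 2020, per Yole
    growth_2020 = 0.073       # 7.3% annual growth
    print(rev_2020 / (1 + growth_2020))   # implied 2019 revenue, roughly $19.3B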

2020 was a very unusual year for the CIS industry. Everybody had the COVID-19 situation in mind, and, indeed, it created a temporary disruption in the supply chain which had to be made up towards year-end. Another disruptive aspect was the Huawei ban and its effect on the market between Q3 2020 and Q4 2020, especially for Sony. However, it did not lead to a CIS market collapse thanks to the increasing number of cameras per mobile and a stable average selling price overall.

All these events combined allowed for the CIS industry to maintain significant growth in 2020.


“Q1 is seasonally a typically lower revenue quarter, though there was fear of shortages in the overall semiconductor industry,” explains Chenmeijing Liang, Technology & Market Analyst within the Photonics, Sensing & Display Division at Yole. “At Yole, we believe that the effect here is more linked to supply chain issues rather than real capacity issues for CIS.”


“Sony was hit the hardest by these crises, as it was highly exposed to the mobile market and subsequent international trade tensions; the toll on Sony’s revenue in Q4 2020 is notable,” asserts Pierre Cambou, Principal analyst in the Photonics and Sensing Division at Yole.

Sony, of course, remains the market leader, and though they did lose some market share in Q4-2020, they regained traction in Q1-2021. However, they are being challenged by increased competition. In comparison, their nearest competitor Samsung was more protected from this market shake-up due to its vertical integration. Their recently released line of 0.7µm pixel sensors targeting the mobile market helped them seize some of the new opportunities, with some OEMs, such as Xiaomi, benefiting from Huawei’s disappearance.

Some CIS fabless players, like ON Semiconductor, may not have the ability to secure capacity from TSMC, as Sony did. They did OK during 2020 but could have done better by surfing the automotive and logistics camera market growth.

But other fabless players, probably Chinese, like Smartsens Technology and Omnivision, diversified their sourcing long ago and grew far more.

PhotonicSens Promises 3D Image from Single Lens Camera Module

PRNewswire: Photonicsens presents its 3D single-lens camera design with Qualcomm:

"photonicSENS' single lens 3D depth sensing solution will be a game changer for smartphones," says Ann Whyte, President of photonicSENS, "The  3D depth camera reference designs of this collaboration are based on our single lens apiCAM technology that with a single device delivers simultaneously an RGB image and depth map to offer smartphone manufacturers the means to differentiate with enhanced photographic features, a 1.4Mpx depthmap, the lowest component count, lowest cost and the lowest power dissipation, as well as the best performance in any environment.   Snapdragon 888 is a clear leader, and we are excited to be working with Qualcomm Technologies to release our cutting-edge 3D sensing solution to market."

Axcelis Ships 8MeV Ion Implanter to "Leading CIS Manufacturer in China"

PRNewswire:  Axcelis has shipped an 8MeV Purion VXE high energy ion implantation system to a "leading CMOS image sensor manufacturer located in China." This is the first Purion VXE shipped to this chipmaker.

EVP of Product Development, Bill Bintz, commented, "The Purion VXE was designed to address the specific needs of customers developing and manufacturing the most advanced CMOS image sensors. To optimize both performance and yield, these emerging image sensor devices require ultra-high energy implants with extremely precise and deep implant profiles, concurrent with ultra-low metal contamination levels. Building off of Axcelis' market leading LINAC technology, the Purion VXE uniquely addresses these customer needs."

IDTechEx on Emerging Sensor Technologies

IDTechEx CEO Raghu Das and analyst Matthew Dyson present their view on "Emerging Image Sensor Technologies 2021-2031."


Infineon and PMD Partner with ArcSoft for Under-Display ToF Turnkey Solution

BusinessWire: Infineon, PMD and ArcSoft are developing a turnkey solution that allows a ToF camera to work under the display of commercial smartphones. It will provide reliable IR images and 3D data for security-relevant applications like face authentication and mobile payment. The market for ToF solutions in smartphones is estimated to reach above 600M sensor units in 2025 with a CAGR of around 32% from 2021 onward, according to Strategy Analytics.
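
A back-of-the-envelope check of the Strategy Analytics figures, assuming the ~32% CAGR compounds over the four years from 2021 to 2025:

    units_2025 = 600e6
    cagr = 0.32
    print(units_2025 / (1 + cagr) ** 4 / 1e6)   # implied 2021 base of roughly 200M units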

“Time-of-Flight technology offers tremendous value for smartphones and in our daily lives by making electronic devices aware of the context in which we use them,” says Andreas Urschitz, Division President Power & Sensor Systems at Infineon. “In addition to our continuous technological achievements in terms of smaller size, reduced power consumption, and better 3D performance, our AI-enabled and secure under-display solution will provide a display design beautification for smartphone manufacturers.”

“To build powerful Time-of-Flight cameras, you need to have a deep understanding of the 3D data and how applications make use of it. That is why we are working closely with middleware partners and OEMs to provide them best in class ToF-algorithms, software, and high-quality 3D data to build their application on. The solution, that we are jointly developing with ArcSoft, allows our ToF cameras to see through displays while still meeting the requirements for secure face authentication in mobile phone unlock and mobile payment,” adds Bernd Buxbaum, CEO at PMD.

"The implementation of 3D ToF in mobile devices promises to spark the next wave of killer consumer applications, which is exactly why ArcSoft is excited to work with Infineon and pmdtechnologies," says Sean Bi, COO of ArcSoft. "By deeply integrating ToF cameras with ArcSoft's computer vision algorithms, under-display ToF can bring reliable facial recognition solutions and a superior full-screen experience to consumers. Relying on under-display ToF, ArcSoft will also enable more applications such as AR related, which mobile manufacturers value when deployed in support of new and exciting mobile apps.

Intevac Night Vision Sensor Development Attracts $23M Funding

BusinessWire: Intevac has received two additional Phase 1 development program awards in addition to the ManTech development award received from the Night Vision and Electronics Sensors Directorate during Q1 of 2021.

The ManTech award continues our work on the current CMOS camera developed in support of IVAS, targeting reduced power and cost, and improved performance. In the new awards announced today, the Enhanced Performance CIS award is aimed at further improving low-light performance for our next-generation CMOS camera, advancing from the current high-starlight operating capability to overcast starlight. The second of the two new awards, the Enhanced Performance EBAPS award, is aimed at significantly improved low-light performance utilizing Intevac’s ISIE19 EBAPS technology. This Enhanced Performance EBAPS award is designed to provide ISIE19 low-light performance down to overcast-starlight capability in a greatly reduced form factor required for this application.

If selected for Phase 2 development work on all three of these IVAS-supporting programs, funded development revenues for Intevac Photonics would total approximately $23M over a 36-month period.

Intevac’s digital night-vision sensors, based on its patented Electron Bombarded Active Pixel Sensor (EBAPS) technology, provide state-of-the-art capability to the most advanced avionic fighting platforms in the U.S. Department of Defense inventory.

Nissan Gives Away Free Licenses for its Thermal Imaging Patents for Use in Anti-COVID Applications

I missed this news first announced in December 2020 and then again in April 2021: Nissan is providing licenses free of charge for thermal imaging sensor technology developed by the company.

Nissan is licensing the low-cost technology under the terms of the IP Open Access Declaration Against COVID-19, which the company joined in May. By signing the declaration, Nissan agreed not to seek compensation nor assert any patent, utility model, design or copyright claim against any activities aimed at combatting the pandemic.

The licenses are for multiple products being developed by Chino Corp. and Seiko NPC Corp. Chino is using Nissan’s technology to develop, manufacture and sell non-contact body surface temperature measuring devices that can quickly detect high body surface temperatures.

Seiko NPC has developed sensors under a sublicense of the technology from IHI Aerospace Co., Ltd. These sensors are being used in non-contact body surface temperature measuring devices for multiple companies.

Nissan’s contactless temperature-measuring sensor detects infrared rays from an object or area. It can display images, such as temperature distributions, with a resolution of about 2000 pixels and can be manufactured at significantly lower cost than sensors made using conventional technologies.

Jabil Develops 360-deg ToF Camera Based on ADI Reference Design

BusinessWire: Jabil announces that its optical design center in Jena, Germany, is currently developing a novel omnidirectional sensor for robotic and industrial platforms. By combining a custom optical assembly with an innovative active illumination approach, a new 3D ToF depth sensor with an industry-leading 360° x 60° FOV is being developed (the data sheet states a 270° x 60° FOV). The ground-breaking, solid-state design is one of several sensing systems Jabil’s optical business unit (Jabil Optics) is designing to support lower-cost autonomous mobile robotics and collaborative robotics platforms.

“A mission of Analog Devices is to enable the autonomous mobile robot revolution by providing high performance and highly differentiated signal chains that bridge the gap between the analog and digital worlds,” said Donnacha O’Riordan, director of ADI. “The Jabil omnidirectional sensor is one of the most innovative implementations of the ADI depth-sensing technology we have encountered. Jabil’s wide field-of-view, depth-sensing approach is opening up new possibilities for human interaction with robots.”


International Image Sensor Workshop Registration Opens

2021 International Image Sensor Workshop (IISW 2021) registration is open now. The Workshop is an on-line virtual event this year, to be held on September 20-23. The details are explained in the FAQ section at the bottom of the registration page:

GPixel Expands its Line Scan Sensors Family

Gpixel expands its GL product family with GL3504, a C-mount line scan image sensor targeting industrial inspection, logistics barcode scanning, and printing inspection.

GL3504 has two photosensitive pixel arrays: a 2048 x 4 resolution array with 7 μm x 7 μm square pixel size and a 4096 x 2 resolution array with 3.5 μm x 3.5 μm pixel size. Both monochromatic and color variants are offered. The color filter array on the 3.5 μm pixel lines is Bayer type; the 7 μm pixel lines are RGB true color type.
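
A quick check of the quoted pixel counts and pitches shows that both arrays span the same optical line length, so they can share one lens image circle:

    print(2048 * 7.0)   # 14336.0 um, a 14.34 mm line for the 7 um rows
    print(4096 * 3.5)   # 14336.0 um, the identical line length for the 3.5 um rows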

GL3504 engineering samples can be ordered today for delivery in July, 2021.

ESPROS about Human Eye as a LiDAR

Espros publishes its CEO Beat De Coi's presentation at Autosens Detroit 2021 "The Human eye as an example for LiDAR."

"The performance of the human eye is awesome. It has a fantastic resolution, hence small objects can bee seen at long distances. It works very well in a huge brightness dynamic range and it is able to estimate distance. This in a system of two eyes and a dedicated computer system - the human vision system (HVS). There are many aspects of the HVS which outperforms any LiDAR system. However, the perfomance is based on a very clever designed system. Why not to use the human eye and the human vision system as an example for future LiDAR systems?"

Megapixel ToF Imager with 35um Depth Resolution

IEEE Transactions on Pattern Analysis and Machine Intelligence publishes a paper "Exploiting Wavelength Diversity for High Resolution Time-of-Flight 3D Imaging" by Fengqiang Li, Florian Willomitzer, Muralidhar Madabhushi Balaji, Prasanna Rangarajan, and Oliver Cossairt from Northwestern University at Evanston, IL, and Southern Methodist University, Dallas, TX. The paper has also been published on Arxiv.org and in the IEEE Computer Society Digital Library.

"The poor lateral and depth resolution of state-of-the-art 3D sensors based on the time-of-flight (ToF) principle has limited widespread adoption to a few niche applications. In this work, we introduce a novel sensor concept that provides ToF-based 3D measurements of real world objects and surfaces with depth precision up to 35 μm and point cloud densities commensurate with the native sensor resolution of standard CMOS/CCD detectors (up to several megapixels). Such capabilities are realized by combining the best attributes of continuous wave ToF sensing, multi-wavelength interferometry, and heterodyne interferometry into a single approach. We describe multiple embodiments of the approach, each featuring a different sensing modality and associated tradeoffs."

3rd International Workshop on Event-Based Vision – Day 1

The Day 1 live feed of the Third International Workshop on Event-Based Vision is complete and available on YouTube:

Exotic Photodetector News

The papers below promise to revolutionize future image sensor technology in many different ways. Whether you believe it or not is up to you.

OSA publishes Tianjin University, China, paper "Low operating voltage monolithic stacked perovskite photodetectors for imaging applications" by Hongliang Zhao, Tengteng Li, Qingyan Li, Chengqi Ma, Jie Li, Chenglong Zheng, Yating Zhang, and Jianquan Yao.

"The monolithic stacked design is expected to solve the challenges of wiring difficulties, complex fabrication processes, and low resolution. However, a photodetector array with low operating voltage that is suitable for imaging applications has not been proposed. Here, a perovskite photodetector array with a monolithic stacked structure is proposed. The CH3NH3PbI3 photodetector has a low power consumption off-state (0 V) and on-state (−2 V) voltage, and the highest responsivity and specific detectivity of 0.39 A/W and 4.53e12 Jones at 775 nm, respectively. The rise time and decay time are 111 µs and 250 µs respectively. In addition, the imaging application shows high contrast, which provides a simple and effective way to prepare high performance perovskite imaging devices."


Science Magazine publishes a North Carolina State University and KAIST paper "Mantis shrimp–inspired organic photodetector for simultaneous hyperspectral and polarimetric imaging" by Ali Altaqui, Pratik Sen, Harry Schrickx, Jeromy Rech, Jin-Woo Lee, Michael Escuti, Wei You, Bumjoon J. Kim, Robert Kolbas, Brendan T. O’Connor, and Michael Kudenov.

"Combining hyperspectral and polarimetric imaging provides a powerful sensing modality with broad applications from astronomy to biology. Existing methods rely on temporal data acquisition or snapshot imaging of spatially separated detectors. These approaches incur fundamental artifacts that degrade imaging performance. To overcome these limitations, we present a stomatopod-inspired sensor capable of snapshot hyperspectral and polarization sensing in a single pixel. The design consists of stacking polarization-sensitive organic photovoltaics (P-OPVs) and polymer retarders. Multiple spectral and polarization channels are obtained by exploiting the P-OPVs’ anisotropic response and the retarders’ dispersion. We show that the design can sense 15 spectral channels over a 350-nanometer bandwidth. A detector is also experimentally demonstrated, which simultaneously registers four spectral channels and three polarization channels. The sensor showcases the myriad degrees of freedom offered by organic semiconductors that are not available in inorganics and heralds a fundamentally unexplored route for simultaneous spectral and polarimetric imaging."


Sandia Labs publishes a research paper "Design of High-Performance Photon-Number-Resolving Photodetectors Based on Coherently Interacting Nanoscale Elements" by Steve M. Young, Mohan Sarovar, and François Léonard.

"In summary, we employed a fundamental approach based on quantum master equations to identify the challenges in high performance photon number resolving photodetectors. A number of obstructions arise when attempting to achieve PNR (Photon Number Resolving) while simultaneously optimizing important metrics. Using our approach we are able to understand the reasons for these obstructions and formulate designs that circumvent them. As a result, we designed a novel detector architecture based on coherently and collectively interacting absorbing elements, energy transfer, and a continuous monitoring process, that is able to achieve PNR as well as excellent performance in terms of efficiency, dark counts, bandwidth, and count rate. The needed physical properties of this architecture suggest that molecular and nanoscale systems are prime candidates to realize new generations of photodetectors."


Applied Physics Letters publishes a paper "Monolithic infrared silicon photonics: The rise of (Si)GeSn semiconductors" by O. Moutanabbir,  S. Assali,  X. Gong,  E. O'Reilly,  C. A. Broderick,  B. Marzban,  J. Witzens,  W. Du, S-Q. Yu,  A. Chelnokov,  D. Buca, and  D. Nam from École Polytechnique de Montréal, National University of Singapore, University College Cork,  RWTH Aachen University, Wilkes University, University of Arkansas, University Grenoble Alpes, Peter Gruenberg Institute, and Nanyang Technological University.

"(Si)GeSn semiconductors are finally coming of age after a long gestation period. The demonstration of device-quality epi-layers and quantum-engineered heterostructures has meant that tunable all-group IV Si-integrated infrared photonics is now a real possibility. Notwithstanding the recent exciting developments in (Si)GeSn materials and devices, this family of semiconductors is still facing serious limitations that need to be addressed to enable reliable and scalable applications. The main outstanding challenges include the difficulty to grow high-crystalline quality layers and heterostructures at the desired content and lattice strain, preserve the material integrity during growth and throughout device processing steps, and control doping and defect density. Other challenges are related to the lack of optimized device designs and predictive theoretical models to evaluate and simulate the fundamental properties and performance of (Si)GeSn layers and heterostructures. This Perspective highlights key strategies to circumvent these hurdles and hopefully bring this material system to maturity to create far-reaching opportunities for Si-compatible infrared photodetectors, sensors, and emitters for applications in free-space communication, infrared harvesting, biological and chemical sensing, and thermal imaging."

Infineon Posts Chart Explaining ToF Camera Design Tasks

Infineon publishes a chart explaining 3D camera design tasks beyond a ToF sensor:

NHK Presentation on 8K Organic Sensor Camera

NHK presented its organic image sensor-based camera for 8K TV broadcast (2019 presentation):



For those of you with IEEE SSCS account access, there is also a Panasonic webinar by Kazuko Nishimura on organic image sensors.

Canon Presents X-Ray Sensor With Auto-Exposure Control

Canon announces the new "Built-in AEC assistance" technology for digital radiography. With this technology, the device's X-ray image sensor uses real-time detection of the pixel value corresponding to emitted X-rays, notifying the X-ray generator when the pixel value reaches a preset value.

In clinical environments, X-ray imaging is conducted with various precautions in accordance with the ALARA (As Low As Reasonably Achievable) principle of radiation safety, which states that the use of radiation must, among other factors, "take into account benefits to the public health and safety, and other societal and socioeconomic considerations." The AEC technology enables operators to specify a pixel value and automatically sends a notification to the X-ray generator when that value is reached, eliminating the need for a dedicated external attachment and enabling the X-ray generator to stop emissions automatically.
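
Functionally, the built-in AEC assistance amounts to a monitoring loop on the detector side. A hedged sketch of that control flow follows; the function names and the notification interface are assumptions made for illustration, not Canon's actual API:

    def aec_monitor(read_roi_value, notify_generator, preset_value):
        # Poll the sensor's live pixel value and signal the generator once the preset dose is reached
        while True:
            value = read_roi_value()     # real-time pixel value corresponding to emitted X-rays
            if value >= preset_value:
                notify_generator()       # the X-ray generator stops the exposure
                return value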

Samsung Presents AI-based Bad Pixel Detection

Springer publishes a Samsung paper presented at the 2020 International Conference on Computer Vision and Image Processing, "A Pre-processing Assisted Neural Network for Dynamic Bad Pixel Detection in Bayer Images" by Girish Kalyanasundaram, Puneet Pandey, and Manjit Hota.

"CMOS image sensor cameras are integral part of modern hand held devices. Traditionally, CMOS image sensors are affected by many types of noises which reduce the quality of image generated. These spatially and temporally varying noises alter the pixel intensities, leading to corrupted pixels, also known as “bad” pixels. The proposed method uses a simple neural network approach to detect such bad pixels on a Bayer sensor image so that it can be corrected and overall image quality can be improved. The results show that we are able to achieve a defect miss rate of less than 0.045% with the proposed method."

ISSCC 2021 On-Line: Samsung ISOCELL Vizion 33D Paper

Samsung ISOCELL Vizion presentation at ISSCC 2021 details the company's iToF approach: "7.1 - A 4-tap 3.5μm 1.2Mpixel Indirect Time-of-Flight CMOS Image Sensor with Peak Current Mitigation and Multi-User Interference Cancellation" by Min-Sun Keel, Daeyun Kim, Yeomyung Kim, Myunghan Bae, Myoungoh Ki, Bumsik Chung, Sooho Son, Hoyong Lee, Heeyoung Jo, Seung-Chul Shin, Sunjoo Hong, Jaeil An, Yonghun Kwon, Sungyoung Seo, Sunghyuck Cho, Youngchan Kim, Young-Gu Jin, Youngsun Oh, Yitae Kim, JungChak Ahn, Kyoungmin Koh, and Yongin Park.
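
For context, 4-tap iToF pixels of this kind typically recover depth from four phase-shifted charge samples per pixel. A hedged sketch of the standard 4-phase continuous-wave demodulation math (not Samsung's specific pipeline), with a 100 MHz modulation frequency assumed purely for illustration:

    import math

    def itof_depth(q0, q90, q180, q270, f_mod_hz=100e6):
        # Standard 4-phase continuous-wave iToF depth estimate from the four tap charges
        c = 299_792_458.0
        phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
        return c * phase / (4 * math.pi * f_mod_hz)

    print(f"{itof_depth(120, 180, 80, 20):.3f} m")   # ~0.32 m for these made-up tap values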

I'm not sure what 33D means in the presentation:

ISSCC 2021 On-Line: Sony Sensor with Integrated AI Processor

Sony ISSCC presentation "9.6 - A 1/2.3inch 12.3Mpixel with On-Chip 4.97TOPS/W CNN Processor Back-Illuminated Stacked CMOS Image Sensor" by Ryoji Eki, Satoshi Yamada, Hiroyuki Ozawa, Hitoshi Kai, Kazuyuki Okuike, Hareesh Gowtham, Hidetomo Nakanishi, Edan Almog, Yoel Livne, Gadi Yuval, Eli Zyss, and Takashi Izawa explains the trade-offs of such a product:

GPixel Presents its Pipelined Readout Scheme

IEICE Electronics Express publishes GPixel paper "A pipeline row operation method of CMOS image sensors" by Yang Li, Chao Fu, Tao Jiang, Yang Liu, Cheng Ma, Jan Bogaerts, Xinyang Wang.

"In this paper, we present a pixel array operation method of CMOS image sensor that enables pipeline processing of pixel operations. The sensor frame rate constraint from the delay of pixel array control lines is much relieved by manipulating control phases of adjacent pixel rows simultaneously. An analog frontend readout circuit is proposed to support the row pipeline operation pixel readout. A prototype image sensor was designed with its performance characterized and analyzed."

LiDAR Technology Map

EETimes reporter Junko Yoshida publishes an article "Lidar Sweepstakes Draws 15 RFQs, But No Frontrunner."

"More than a dozen RFQs for lidars are reportedly flying around. Evidently, lidars are beginning to penetrate the ADAS market. Automotive industry observers, however, caution not to expect a single lidar supplier to win this sweepstakes.

“When we recently interviewed lidar companies, they told us that every OEM and Tier One has different requirements — demanding a specific field of view, distance and position to integrate lidars in a vehicle,” Pierrick Boulay, solid state lighting and lighting systems analyst at Yole Développement, told EE Times.

Yole put FMCW in the R&D bin, calling it “TBD” — to be determined. Boulay said “We do not expect to see FMCW lidars before 2025.” Last year, Waymo talked about FMCW for its future home-grown lidars. Mobileye, most prominently, discussed FMCW early this year, describing it as its choice for lidars under development for fully autonomous vehicles."

XFAB Presents Photodiode Improvements, Reports 20% Smartphone ALS Market Share

X-FAB Foundries unveils a new photodiode-specific process core module. Whereas previously the XS018 process had been mainly focused on the fabrication of multi-pixel CMOS image sensors, this new module is dedicated to photodiode fabrication. 

Through this module, X-FAB offers six different photodiode options. These cover wavelengths from UV to NIR. The new photodiodes are capable of delivering best-in-class UV sensitivity, attaining 40% QE in the UVA band, 50-60% QE in the UVB band, and over 60% QE in the UVC band. At 850 nm the photodiodes have 17% greater QE than legacy devices based on the original XH018 process, and at 905 nm there is a 5% QE increase witnessed. With a QE of approximately 90%, the human eye response option is highly suited to ambient light sensing applications. 

A unique feature is that the photodiode responsivity can be set by specifying the size of the metal aperture. The output current of the photodiode is thereby scalable between full current and no current, so that any differences caused by filtering can be compensated for. This in turn simplifies the accompanying amplification circuitry for photodiode arrays. Other enhancements include a 10% increase in the fill factor compared to devices based on the earlier XH018 generation.

“Through ongoing investment, X-FAB has built up strong optoelectronic credentials. Among the proof points of this is the fact that over 20% of mobile phone handsets manufactured worldwide feature an ambient light sensor that was produced by us,” states Luigi Di Capua, VP of Product Marketing at X-FAB. “Thanks to the advances we have announced in relation to our photodiode offering, we will now be better positioned to address client demands for proximity sensing, spectral analysis and optical distance/triangulation measurement solutions.”

The six photodiodes are available now.

Rudolf Schwarte, Father of Photonic Mixer Device, Died at the Age of 82

The University of Siegen, Germany, and the Center for Sensor Systems (ZESS) mourn the death of Prof. Dr. Rudolf Schwarte on March 7, 2021, at the age of 82.

In 1981 he accepted the call as a professor at the Institute for Telecommunications at the University of Siegen, where he taught and researched as head of the institute until 2007. In 1988 he founded the “Interdisciplinary Center for Sensor Systems NRW (ZESS)”, a focal point of today’s sensor research in Siegen, and was also its chairman until 1998.

He is considered the father of “fast” 3D vision and is the inventor of the Photonic Mixer Device (PMD) technology, which forms the technological basis of various spin-offs from ZESS (S-Tec and today’s successful company PMD-Technologies). He received several patents for this development and was nominated for the Research and Innovation Prize of the Federal Republic of Germany. With more than 50 patents, Prof. Schwarte was one of the world’s leading researchers in his discipline. In 1995 and 1997, the state government of North Rhine-Westphalia awarded him the innovation prize.

In 2005 he was awarded the Cross of Merit 1st Class of the Order of Merit of the Federal Republic of Germany by the Secretary of State for Science, Hartmut Krebs.

Call for Nominations for the 2021 Walter Kosonocky Award

International Image Sensor Society Call for Nominations for the 2021 Walter Kosonocky Award for Significant Advancement in Solid-State Image Sensors:

The Walter Kosonocky Award is presented biennially for THE BEST PAPER presented in any venue during the prior two years representing significant advancement in solid-state image sensors. The award commemorates the many important contributions made by the late Dr. Walter Kosonocky to the field of solid-state image sensors. Personal tributes to Dr. Kosonocky appeared in the IEEE Transactions on Electron Devices in 1997.

Founded in 1997 by his colleagues in industry, government and academia, the award is also funded by proceeds from the International Image Sensor Workshop. (See International Image Sensor Society’s website for detail and past recipients)

The award is selected from nominated papers by the Walter Kosonocky Award Committee, announced and presented at the International Image Sensor Workshop (IISW), and sponsored by the International Image Sensor Society (IISS). The winner is presented with a certificate, complimentary registration to the IISW, and an honorarium.

Please send us an email nomination for this year's award, with a pdf file of the nominated paper (that you judge is the best paper published/presented in calendar years 2019 and 2020) as well as a brief description (less than 100 words) of your reason for nominating the paper. Nomination of a paper from your company/institute is also welcome.

The deadline for receiving nominations is April 26th, 2021.

Your nominations should be sent to Rihito Kuroda (2021nominations@imagesensors.org), Chair of the IISS Award Committee.
