Canon’s enforcement of its intellectual property rights leads to the removal of toner packs, including toner packs sold under the “V4Ink” brand, from Tmall
Review paper on IR avalanche photodiodes
Image Sensors World Go to the original article...
A team from the Military University of Technology (Poland) and the Shanghai Institute of Technical Physics (China) has published a review article titled "Infrared avalanche photodiodes from bulk to 2D materials" in the journal Light: Science & Applications.
Open access paper: https://www.nature.com/articles/s41377-023-01259-3
Abstract: Avalanche photodiodes (APDs) have drawn huge interest in recent years and have been extensively used in a range of fields, including the most important one, optical communication systems, due to their fast time responses and high sensitivities. This article shows the evolution and recent development of A(III)B(V), A(II)B(VI), and potential alternatives to the former: "third wave" superlattice (SL) and two-dimensional (2D) material infrared (IR) APDs. In the beginning, the APDs' fundamental operating principle is demonstrated together with progress in device architecture. It is shown that APD evolution has moved device performance towards higher bandwidths, lower noise, and higher gain-bandwidth products. The material properties needed to reach both high gain and low excess noise for devices operating in different wavelength ranges are also considered, indicating future progress and research directions. Particular attention is paid to advances in A(III)B(V) APDs, such as AlInAsSb, which may be used in future optical communications, type-II superlattices (T2SLs, "Ga-based" and "Ga-free"), and 2D material-based IR APDs. The latter, atomically thin 2D materials, exhibit huge potential in APDs and could be considered an alternative to the well-known, sophisticated, and mature A(III)B(V) APD technologies, including single-photon detection mode. That is related to the fact that the performance of conventional bulk-material APDs is restricted by reasonably high dark currents. One approach to resolving that problem is implementing low-dimensional materials and structures as the APDs' active regions. The Schottky barrier and atomic-level thicknesses lead to significant suppression of the 2D APD dark current.
What is more, APDs can operate within the visible (VIS), near-infrared (NIR), and mid-wavelength infrared (MWIR) ranges, with a responsivity of ~80 A/W, external quantum efficiency of ~24.8%, and gain of ~10^5 for MWIR [wavelength λ = 4 μm, temperature T = 10–180 K, Black Phosphorus (BP)/InSe APD]. It is believed that 2D APDs could prove themselves an alternative providing a viable method for device fabrication with simultaneously high sensitivity and low excess noise.
Fig. 1: Bulk to low-dimensional material, tactics to fabricate APDs and possible applications: FOC, FSO, LIDAR and QKDs.
Fig. 2: The APD’s operating principle. a Electron and hole multiplication mechanisms; schematic of the multiplication mechanism for b k = 0 (αh = 0) and c k = 1 (αe = αh), where k = αh/αe and αe, αh represent the electron and hole ionization coefficients. d αe, αh ionization coefficients versus electric field for selected semiconductors used for APD fabrication
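The field dependence of the ionization coefficients in panel d is commonly parameterized by the Chynoweth law, α(E) = a·exp(−b/E). A minimal sketch of that parameterization (the a, b values below are illustrative placeholders, not fits from the paper):

```python
import numpy as np

def chynoweth_alpha(E, a, b):
    """Chynoweth-law impact ionization coefficient alpha(E) = a * exp(-b / E).

    E : electric field (V/cm)
    a : prefactor (1/cm)
    b : critical-field parameter (V/cm)
    """
    return a * np.exp(-b / np.asarray(E, dtype=float))

# Illustrative (not material-accurate) parameters: alpha rises steeply with field.
E = np.array([2e5, 3e5, 5e5])  # V/cm
alpha = chynoweth_alpha(E, a=1e6, b=2e6)
print(alpha)
```

The steep exponential dependence on 1/E is what makes the multiplication gain so sensitive to bias voltage in real APDs.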
Fig. 3: APDs. a p–n device, b SAM device, and c SAGCM device with electric field distribution. F(M) dependence on M for the selected k = αh/αe in APDs when: d electrons and e holes dominate in the avalanche mechanism. The multiplication path length probability distribution functions in the: f local and g non-local field “dead space” models
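The F(M)-versus-M curves referenced in panels d–e follow McIntyre's local-field model: for electron-initiated multiplication, F(M) = kM + (1 − k)(2 − 1/M) with k = αh/αe. A quick sketch of that formula (function name mine):

```python
import numpy as np

def excess_noise_factor(M, k):
    """McIntyre local-field excess noise factor for electron-initiated
    multiplication: F(M) = k*M + (1 - k)*(2 - 1/M), where k = alpha_h/alpha_e."""
    M = np.asarray(M, dtype=float)
    return k * M + (1.0 - k) * (2.0 - 1.0 / M)

# k = 0 gives the low-noise limit F -> 2 at large M; k = 1 gives F = M.
for k in (0.0, 0.1, 1.0):
    print(f"k = {k:.1f}: F(10) = {excess_noise_factor(10.0, k):.2f}")
```

This is why the review emphasizes materials with strongly asymmetric ionization coefficients (small k): the excess noise penalty at high gain is dramatically reduced.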
Fig. 4: InGaAs/InP SAM-APD. a device structure, b energy band profile, and electric field under normal reverse bias condition. AlxIn1–xAsySb1–y based SACM APD: c detector’s design with the E distribution within the detector, d measured and theoretically simulated gain, dark current, photocurrent versus reverse voltage for 90 μm diameter device at room temperature39. InAs planar avalanche photodiode: e a schematic design diagram, f comparison of the gain reached by 1550 nm wavelength laser132,133. The M normalized dark current for 100 μm radius planar APD was presented for 200 K
Fig. 5: F(M) versus M for: a Si, AlInAs, GaAs, Ge, InP [the solid lines present the F(M) for k within the range 0–1 (increment 0.1) calculated by the local field model24; typical F(M) are shown by shaded regions37] and b selected materials: 3.5 μm thick intrinsic InAs APDs (50 μm and 100 μm radius), 4.2 μm cut-off wavelength HgCdTe and 2.2 μm InAlAs APDs
Fig. 6: Gain and k versus Hg1–xCdxTe bandgap energy. a the crossover between e-APD and h-APD. The crossover at Eg ≈ 0.65 eV corresponds to λc = 1.9 μm at 300 K46. Hole-initiated avalanche HgCdTe photodiode: b detector profile, c energy band structure, d hole-initiated multiplication process energy band structure. The multiplication layer bandgap energy is adjusted to the resonance condition, where the bandgap equals the energy difference between the split-off valence band and the top of the heavy-hole valence band. Electron-initiated avalanche HgCdTe photodiode: e diagram of the electron-initiated avalanche process for the HgCdTe-based high-density vertically integrated photodiode (HDVIP) structure (n-type central region and p-type material around), f electron avalanche mechanism, and g relative spectral response for a 5.1 μm cut-off wavelength HgCdTe HDVIP at T = 80 K
Fig. 7: HgCdTe APDs performance. a the experimental gain versus bias for selected cut-off wavelengths for DRS electron-initiated APDs at 77 K together with extra measured data points taken at ∼77 K51 and LETI e-APDs at 80 K59, b constant F(M) ~ 1 versus M at 80 K for 4.3 μm cut-off wavelength APD135
Fig. 9: Low-dimensional solid avalanche photodetectors. a graphite/InSe Schottky avalanche detector - injection, ionization, collection electron transport mechanisms, b e-ph scattering dimensionality reduction affects the electron acceleration process, and gain versus electric field in 2D (red line) and 3D (blue line), c breakdown voltage (Vbd) and gain as a function of temperature—exhibits a negative temperature coefficient81. Nanoscale vertical InSe/BP heterostructure ballistic avalanche photodetector: d schematic of the graphene/BP/metal avalanche device83, e ballistic avalanche photodetector operating principle, f quasi-periodic current oscillations, g schematic of the graphene/InSe/BP device83, h Ids–Vds characteristics for selected temperatures (40 − 180 K), i avalanche breakdown threshold voltage (Vth) and gain versus temperature—showing a negative temperature coefficient. Pristine PN junction avalanche photodetector: j device structure, k as the number of layers increases, a positive/negative SCM signal denotes hole/electron carriers, l APD’s low temperature (~100 K) dark and photocurrent I–V curves
Job Postings – Week of 8 Oct 2023
Alphacore
Sr Design Engineer | Tempe, Arizona, USA

Tesla Motors
Image Scientist, Camera Technology | Palo Alto, California, USA
Supplier Industrialization Engineer, Camera Top Level Assembly | Palo Alto, California, USA
Sr. Process Engineer, Vision Automation | Austin, Texas, USA
Internship, Machine Vision, Cell Manufacturing (Spring/Summer 2024) | Palo Alto, California, USA

TSMC
CMOS Image Sensor Analog Design Engineer | Taiwan
CIS Technology Program-Process Integration Engineer | Taiwan

ams OSRAM
Senior Engineering Manager | Valencia, Spain
Digital Verification Engineer | Garching, Germany

GlobalFoundries
MTS Silicon Photonics Integration Engineer | Malta, New York, USA
Si Photonic Pipeline Engineer | Malta, New York, USA

Sony
Imaging Field Applications Engineer | Weybridge, Surrey, UK
Digital Physical Implementation Engineer | Trento, Trentino, Italy
Conference List – January 2024
IS&T Electronic Imaging - 21-25 Jan 2024 - Burlingame, CA, USA - Website
IEEE Applied Sensing Conference - 22-24 Jan 2024 - Goa, India - Website
SPIE Photonics West - 27 Jan-1 Feb 2024 - San Francisco, CA, USA - Website
Conference List – December 2023
IEEE International Electron Devices Meeting - 9-13 Dec 2023 - San Francisco, CA, USA - Website
Foveon Documentation
Now for a third company with unique technology out of the sensor business - Foveon. For those of you unfamiliar with Foveon, they started out to build three-chip, prism-based color cameras with custom CMOS sensors. When this market was discovered to be way too small, they acquired Dick Merrill and his inventions from National Semiconductor and developed sensors with stacked RGB photodiodes in which the silicon itself provided color separation. These were intended for DSLRs, then point-and-shoot cameras, and then mobile phones. That didn't happen, so eventually their only photographic customer, Sigma Photo, bought the assets for about 10 cents on the dollar and moved the work to Japan. Full disclosure - I sold Foveon sensors in the non-photo markets for about 10 years. Aside from Sigma, the only enduring legacy is the F13 on the ESA ExoMars rover (now scheduled for an October 2028 launch).
Review paper on long range single-photon LiDAR
Hadfield et al. recently published a review paper titled "Single-photon detection for long-range imaging and sensing" in Optica:
Abstract: Single-photon detectors with picosecond timing resolution have advanced rapidly in the past decade. This has spurred progress in time-correlated single-photon counting applications, from quantum optics to life sciences and remote sensing. A variety of advanced optoelectronic device architectures offer not only high-performance single-pixel devices but also the ability to scale up to detector arrays and extend single-photon sensitivity into the short-wave infrared and beyond. The advent of single-photon focal plane arrays is poised to revolutionize infrared imaging and sensing. In this mini-review, we set out performance metrics for single-photon detection, assess the requirements of single-photon light detection and ranging, and survey the state of the art and prospects for new developments across semiconductor and superconducting single-photon detection technologies. Our goal is to capture a snapshot of a rapidly developing landscape of photonic technology and forecast future trends and opportunities.
Fig. 1. Examples of imaging LIDAR configurations. (a) Flash LIDAR configuration using an array sensor and full-field illumination (a bistatic system is shown, with source and sensor separated). (b) Scanning LIDAR approach where the source is scanned and an individual sensor is used. (In this illustration, a bistatic configuration is shown; however, a monostatic scanning configuration is often used with a common transmit and receive axis).
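As a back-of-the-envelope aside on the ranging arithmetic behind both configurations (my own sketch, not from the paper): the one-way range follows from the round-trip photon time of flight, and the detector's timing jitter bounds the achievable depth resolution.

```python
# Direct time-of-flight ranging arithmetic for single-photon LIDAR.
# Function names are mine, not from the review.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(t_round_trip_s: float) -> float:
    """One-way target range (m) from a round-trip time of flight (s)."""
    return C * t_round_trip_s / 2.0

def depth_resolution(timing_jitter_s: float) -> float:
    """Depth resolution limit (m) set by detector timing jitter (s)."""
    return C * timing_jitter_s / 2.0

# A photon returning after 4 us corresponds to a ~600 m target,
# and 100 ps of jitter limits depth resolution to ~15 mm.
print(tof_to_range(4e-6), depth_resolution(100e-12))
```

The picosecond timing resolution highlighted in the abstract is what makes millimeter-scale depth precision possible at kilometer ranges.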
Fig. 2. Single-photon LIDAR depth profiles taken at a range of greater than 600 m using a 100-channel Si SPAD detector system in scanning configuration. The operational wavelength is 532 nm. (a) Visible-band photograph of scene. (b) Reconstructed depth image of the city scene. (c) Detailed depth profile of the subsection of the scene within the red rectangle in (a). Further details in Z. Li et al. [60]. Figure reproduced with permission of Optica Publishing Group.
Fig. 3. Example of data fusion of a 3D image from a CMOS SPAD detector array and passive imagery of a scene at 150 m range. (a) Retrieved depth information from a SPAD detector array. (b) Intensity information from the SPAD overlaid on top of the retrieved depth information. (c) Intensity information from a color camera overlaid on top of the retrieved depth information [65]. Figure reproduced with permission of Springer Nature publishing.
Fig. 4. Solar irradiance versus wavelength at sea level (red) and in the upper atmosphere (blue). MODTRAN simulation [86]. The following spectral bands beyond the visible wavelength range are denoted by the shaded regions: near infrared (NIR), yellow; short-wave infrared (SWIR), cyan; mid-wave infrared (MWIR), red.
Fig. 5. Example of scanning SWIR single-photon LIDAR imaging. (a) Visible-band image of a residential building taken with an f=200mm camera lens. (b) Depth intensity plot of the building imaged with 32×32 scan points over a range of 8.8 km. (c) Depth plot of the building imaged with 32×32 scan points over a range of 8.8 km; side view of the target [89]. Figure reproduced with permission of Optica Publishing Group.
Fig. 6. Reconstruction results of a mountain scene over a range of 201.5 km using SWIR single-photon LIDAR [91]. (a) Visible-band imaged photograph. (b) Reconstructed depth result using algorithm by Lindell et al. [92] for data with signal-to-background ratio ∼0.04 and mean signal photon per pixel ∼3.58. (c) 3D profile of the reconstructed result. Figure reproduced with permission of Optica Publishing Group.
Fig. 7. Analysis of a scene with an actor holding a wooden plank across his chest and standing 1 m behind camouflage netting at a range of 230 m in daylight conditions. (a) Photograph of the scene, showing the actor holding a wooden plank behind the camouflage. (b), (c) Intensity and depth profiles of the target scene using all the collected single-photon LIDAR data. (d), (e) Intensity and depth profiles after time gating to exclude all data except those with a 0.6 m range around the target location. The pixel format used in the depth and intensity profiles is 80×160 [95]. Figure reproduced with permission of SPIE publishing.
Fig. 8. Schematic diagram of a SWIR single-photon 3D flash imaging experiment. The scene consists of two people walking behind a camouflage net at a stand-off distance of 320 m from the LIDAR system. An RGB camera was positioned a few meters from the 3D scene and used to acquire a reference video. The proposed algorithm is able to provide real-time 3D reconstructions using a graphics processing unit (GPU). As the LIDAR presents only 32×32 pixels, the point cloud was estimated in a higher resolution of 96×96 pixels. The acquired movie is shown in [101]. Figure reproduced with permission of Springer Nature publishing.
Fig. 9. Single-photon detector technologies for infrared single-photon LIDAR, with spectral coverage for each detector type indicated. (a) Schematic diagram cross section of a Si-based SPAD detector. The design is a homojunction. (b) Schematic diagram cross section of a Ge-on-Si structure, illustrating optical absorption in the Ge layer, and multiplication in the intrinsic Si layer. (c) Schematic diagram cross section of an InGaAs/InP SPAD detector; the absorption is in the narrow-gap InGaAs and the multiplication in the wider gap InP layer. In both (b) and (c), the charge sheet is used to alter the relative electric fields in the absorption and multiplication layers. (d) Schematic illustration of SNSPD architecture for near-unity efficiency at 1550 nm wavelength and optical micrograph of chip with single-pixel detector [109]; (d) reproduced with permission of Optica Publishing Group.
Link to paper (open access): https://opg.optica.org/optica/abstract.cfm?URI=optica-10-9-1124
MDPI IISW2023 Special Issue – paper on random telegraph noise
The first article in the Sensors special issue for IISW2023 is now available:
https://www.mdpi.com/1424-8220/23/18/7959
Chao et al. from TSMC in a paper titled "Random Telegraph Noise Degradation Caused by Hot Carrier Injection in a 0.8 μm-Pitch 8.3Mpixel Stacked CMOS Image Sensor" write:
In this work, the degradation of the random telegraph noise (RTN) and the threshold voltage (Vt) shift of an 8.3Mpixel stacked CMOS image sensor (CIS) under hot carrier injection (HCI) stress are investigated. We report for the first time the significant statistical differences between these two device aging phenomena. The Vt shift is relatively uniform among all the devices and gradually evolves over time. By contrast, the RTN degradation is evidently abrupt and random in nature and only happens to a small percentage of devices. The generation of new RTN traps by HCI during times of stress is demonstrated both statistically and on the individual device level. An improved method is developed to identify RTN devices with degenerate amplitude histograms.
Figure 1. Simplified test chip architecture. The device under stress is the source follower (SF) NMOS in the 4 × 2-shared pixels on the top layer. The PD0–7 are the photodiodes, and the TG0–7 are the transfer gates in each 4 × 2-shared pixel. The total number of SF is 628 × 1648 = 1.03 M.
Figure 2. (a) The measured IB of a SF device vs. VD with VG stepping from 1.3 V to 2.8 V; (b) The same data as in (a) but plotted against VDS−VDsat≈VD−VG+Vt with Vt as a fitting parameter; (c) The same data as in (b) plotted against 1/(VDS−VDsat) with P=(P1,P2) as two fitting parameters according to Equation (1).
Figure 3. The bias configuration of the SF under test. The red and blue solid circles symbolize electrons and holes, respectively.
Figure 4. The histograms of the measured VGS of the SF for stress time (t) from 0 to 100 min.
Figure 5. (a) The histograms of the threshold voltage shift (ΔVt) after 10-, 20-, 50-, and 100-min stress; (b) The inverse cumulative distribution function (ICDF) curves of ΔVt; (c) the constant ICDF contours against stress time (t).
Figure 6. (a) The histograms of the random noise changes (ΔRN) after 10, 20, 50, 100 min stress; (b) The inverse cumulative distribution function (ICDF) curves; (c) the constant ICDF contours as functions of stress time (t).
Figure 8. The correlation of the random noises (RN) before HCI stress (t = 0) vs. after (a) 10 min, (b) 20 min, and (c) 100 min stress, respectively. The RN increases are noticeably nonuniform. The RN along the x = y red dashed line remains relatively unchanged. The devices on the lower-right branches show a significant increase in RN. The population of the lower branch increases as stress time increases. Random colors are assigned to the data points to separate the dots from each other.
Figure 9. The 2D histograms of the correlation of the Vt shift and RN degradation show dramatically different statistical behaviors. (a) The Vt change after 100-min stress versus that after 10-min stress. (b) The RN after 100 min stress versus that before the stress.
Figure 10. Generation of RTN traps during HCI stress. The 5000-frame waveforms before (t = 0) and after the HCI stress (t = 20, 100 min) with the corresponding histograms are shown for three selected examples. (a) Device (296, 137) shows one trap before stress and remains unchanged after stress. (b) Device (202, 1338) shows no trap before stress and one trap generated after 20 min of stress. (c) Device (400, 816) shows no trap before stress; however, one trap is generated after 100 min of stress. The RN unit is mV-rms.
Figure 11. Degeneration of the RTN discrete levels. During HCI stress, the non-RTN noise may increase significantly such that the discrete RTN levels become indistinguishable. (a) Device (141, 1393) shows such degeneration after 100 min of stress. (b) Device (481, 405) shows degeneration after 20 min of stress. (c) Device (519, 1638) shows asymmetric side peaks and asymmetric degeneration after 20 min and 100 min of stress. The RN unit is mV-rms.
Figure 12. For devices showing a single histogram peak, if the histogram is significantly different from the Gaussian distribution, they are counted as RTN-like devices. The ratio R expressed in Equation (2) is defined as the red area versus the total area under the black histogram. The R values in examples (a) and (b) are 36% and 28%, respectively. The RN unit is mV-rms.
Figure 13. Devices with amplitude distributions close to Gaussian are considered as non-RTN devices. The deviation ratio R is 7% for device (587, 492) in (a) and 9% for device (124, 1349) in (b). The RN unit is mV-rms.
Figure 14. The RN distribution of the RTN and non-RTN devices, sorted by the improved algorithm: (a) before HCI stress, (b) after 20 min stress, and (c) after 100 min stress. The RTN devices clearly contribute to and dominate the long tails of the RN histograms. The number of RTN devices (Nx) (with the R-threshold set to 15%) increases systematically as the stress time increases.
Figure 15. The count of RTN devices increases consistently as stress time increases. N2 is the number of devices showing two or more peaks in amplitude histograms. Nx is N2 plus the number of RTN-like devices determined by setting the R-threshold to 10%, 15%, and 20%, respectively.
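The classification in Figures 12–15 hinges on the deviation ratio R, but Equation (2) itself is not reproduced in this post. The sketch below is one plausible reading of such a metric, comparing each device's amplitude histogram against its best-fit Gaussian; the function name and the exact definition are my assumptions, not the authors'.

```python
import numpy as np

def deviation_ratio(samples, bins=64):
    """Area fraction by which an amplitude histogram deviates from its
    best-fit Gaussian. An assumed stand-in for the paper's R (Eq. (2)),
    not the authors' exact definition."""
    counts, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mu, sigma = samples.mean(), samples.std()
    gauss = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    width = edges[1] - edges[0]
    # L1 deviation area over total histogram area (~1 for density=True).
    return float(np.sum(np.abs(counts - gauss)) * width / (np.sum(counts) * width))

rng = np.random.default_rng(0)
r_gauss = deviation_ratio(rng.normal(0.0, 1.0, 50_000))        # near-Gaussian device
two_level = np.concatenate([rng.normal(0.0, 1.0, 25_000),
                            rng.normal(5.0, 1.0, 25_000)])     # RTN-like two-level signal
r_rtn = deviation_ratio(two_level)
print(r_gauss, r_rtn)
```

As in the paper's Figures 12–13, a near-Gaussian device yields a small ratio while a two-level (RTN-like) signal deviates strongly, so a simple threshold on R separates the two populations.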
Figure 16. (a) The Vt shift and (b) the RN degradation trends against the effective stress defined in Equation (3), where the effectiveness factors are treated as empirical fitting parameters such that all the constant-ICDF points for different voltages fall onto a family of continuous and smooth curves. The fitting results are listed in Table 1.
Image Sensor Industry List
ISW is building a new comprehensive Image Sensor Industry List. Click for more details.
/Data Error/
I apologize to those of you who tried to read the Pixim data sheets. I forgot to set the share flag on the folder. The files are available for viewing by anyone now.
Job Postings – Week of 1 Oct 2023
Teledyne
CMOS Sensor Product Support | Waterloo, Ontario, Canada
Senior Manager, Digital & Analog Mixed Signal IC Design | Camarillo, California, USA
MBE Growth Production Engineer (US Citizen) | Camarillo, California, USA
ASIC Design Engineer (US Citizen or equivalent) | Goleta, California, USA
Director - Market Development | Grenoble, France
Senior Scientist (US Citizen) | Acton, Massachusetts, USA

Lumotive
Optoelectronics Systems Engineer | Redmond, Washington, USA
Optoelectronics Systems Engineer | Vancouver, British Columbia, Canada

University of Arizona
Optical Sciences Postdoctoral Research Associate I | Tucson, Arizona, USA
Physics Postdoctoral Research Associate I | Tucson, Arizona, USA
Optical Sciences Postdoctoral Research Associate | Tucson, Arizona, USA

Sandia National Laboratories
Nanophotonics - Postdoctoral Appointee | Albuquerque, New Mexico, USA

Santa Clara University
Assistant Professor, Electrical and Computer Engineering (Tenure-track) | Santa Clara, California, USA
Conference List – November 2023
IEEE Nuclear Science Symposium and Medical Imaging Conference - 4-11 Nov 2023 - Vancouver, British Columbia, Canada - Website
Semi MEMS and Sensors Executive Conference - 6-8 Nov 2023 - Phoenix, Arizona, USA - Website
Coordinating Panel for Advanced Detectors Workshop - 7-10 Nov 2023 - Menlo Park, California, USA - Website
Compamed - 13-16 Nov 2023 - Dusseldorf, Germany - Website
Fraunhofer IMS 10th CMOS Imaging Workshop - 21-22 Nov 2023 - Duisburg, Germany - Website
RSNA 109th Scientific Assembly and Annual Meeting - 26-30 Nov 2023 - Chicago, Illinois, USA - Website
Pixim Documentation
Since someone just pointed out that the Pixim weblink in the ISW vendor list was dead, I decided to put up Pixim datasheets next. These are different from most because Pixim sold a chipset that included the sensor and a matching processor to turn the chip outputs (basically timing data) into digital image data. Pixim also made several demonstration cameras. Sony bought Pixim in 2012 and stopped making Pixim-type chips a couple of years later because they felt they had a better way to do HDR imaging. The debate continues.
Omnivision announces new sensor for security and surveillance applications
OMNIVISION Announces New Low-power, Enhanced-performance 2MP Image Sensor for Security Surveillance Cameras
The OS02N features a 2.5-micron enhanced-performance FSI pixel with on-sensor DPC for higher sensitivity, performance and reliability while remaining cost-effective
SANTA CLARA, Calif. – September 27, 2023 – OMNIVISION, a leading global developer of semiconductor solutions, including advanced digital imaging, analog, and touch & display technology, today announced the new OS02N, a 2-megapixel (MP) frontside illumination (FSI) image sensor with an optimized defective pixel correction (DPC) algorithm for higher sensitivity, improved performance, and increased reliability in IP and HD analog security cameras, including professional surveillance and outdoor home security cameras. The OS02N's low-power capability supports always-on operation.
“Customers need high-performing security cameras that produce sharp, high-resolution images with low power consumption for extended battery life. The OS02N meets these requirements and is also a cost-effective solution,” said Cheney Zhang, senior marketing manager, OMNIVISION. “The OS02N uses FSI technology, which has a large pixel size for better quantum efficiency and excellent signal-to-noise ratio, resulting in high sensitivity in low-light conditions and dramatically improved image quality and performance. It has a 1/3.27-inch optical format and is designed to be pin-to-pin compatible with our OS04L and OS04D image sensors.”
The OS02N features a 2.5-micron pixel based on OMNIVISION’s OmniPixel®3-HS technology. This enhanced-performance, cost-effective solution uses FSI technology for true-to-life color reproduction in both bright and dark conditions. The optimized DPC algorithm improves sensor quality and reliability above and beyond standard devices by providing real-time correction of defective pixels that can develop over the sensor’s life cycle, especially in harsh operating conditions. The OS02N delivers 1920x1080 resolution at 30 frames per second (FPS).
The OS02N supports MIPI and DVP interfaces. It is sampling now and will be in mass production in Q1 2024. For more information, contact your OMNIVISION sales representative: www.ovt.com/contact-sales.
Sheba Microsystems MEMS-based lens athermalization solution
Image Sensors World Go to the original article...
Sheba Microsystems Launches Revolutionary MEMS Autofocus Actuator for Active Athermalization in Embedded Vision Cameras
Breakthrough µPistons™ technology uniquely solves the embedded vision camera industry’s decades-long problem of lens thermal expansion. The novel product unlocks unparalleled resolution and consistent high-quality imaging performance for automotive, action, drone, mobile robotics, security and surveillance, and machine vision cameras.
TORONTO--(BUSINESS WIRE)--Sheba Microsystems Inc., a global leader in MEMS technologies, today announced the launch of its revolutionary new product, the MEMS Autofocus Actuator for Active Athermalization in Embedded Vision Cameras used in automotive, action, drones, machine vision, security and surveillance, and mobile robotics.
The first-of-its-kind solution tackles the long-standing industry problem of embedded vision cameras’ inability to maintain image quality and focus stability during temperature fluctuations as optics undergo thermal expansion.
While smartphones use autofocus actuators, including electromagnetic actuators such as voice coil motors (VCMs), these actuators are unreliable for achieving active athermalization in embedded vision cameras due to extreme environmental conditions. Embedded vision camera optics are also 30 times larger than smartphone optics. Other autofocus systems on the market, such as tunable lenses, lack thermal stability and compromise optical quality.
“MEMS actuators are fast, precise, and small in size, and are actually uniquely suited to solve thermal expansion issues, because they are thermally stable and maintain consistent performance regardless of temperature changes,” said CEO and co-founder Faez Ba-Tis, PhD. “Because of these known advantages, there have been previous industry attempts at incorporating MEMS actuators into cameras, but because they failed drop tests they were quickly abandoned. Sheba’s new design solves for all of these previous blockers, which opens up limitless possibilities for embedded vision camera innovation.”
Sheba’s proprietary technology compensates for thermal expansion by uniquely moving the lightweight sensor, instead of moving the lenses. The silicon-based MEMS actuator platform actuates the image sensor along the optical axis to compensate for thermal expansion in the optics. The weight of the image sensor represents only 2-3 % of the optical lens weight, which makes it easier to handle, enabling ultra-fast and precise autofocus performance even when temperatures fluctuate.
Sheba’s novel piston-tube electrode configuration takes advantage of a larger capacitive area, allowing for substantial stroke and increased force. In contrast to traditional MEMS comb-drive electrode configuration, Sheba’s µPistons™ design makes the MEMS actuators uniquely resilient against severe shocks, since the electrodes are well-supported and interconnected with each other.
Sheba’s new MEMS actuator has successfully passed a full suite of reliability tests, including thermal shock, thermal cycling, vibration, mechanical shock, drop, tumble, and microdrop tests. It is also highly rugged, which helps maintain image focus during high shocks in action cameras or machine vision environments.
“Digital camera technologies are increasingly used in almost every aspect of our lives,” said Ba-Tis. “From sharing photos of our travels in social media, to experiencing new artificial intelligence innovations powered by machine vision, and accelerating the deployment of autonomous vehicles in our communities, high quality images are imperative to not only capture our most memorable events, but to also keep us safe. In situations where split-second decisions are critical, image quality becomes paramount.”
Sheba’s MEMS actuator offers lens design flexibility and is suitable for near and far-field imaging. It is easily integrated into existing systems and scaled up on mass production tools for automotive, action, drone, mobile robotics, security and surveillance, and machine vision cameras.
Sheba is offering evaluation kits to interested customers, so they can test and evaluate the new product in their own labs to ensure the reliability of the technology. The kit includes camera samples, a daughter board with the MEMS driver, interposer, and camera test jig to perform mechanical reliability tests, software, and user manual.
To learn more about Sheba Microsystems or to order an evaluation kit for your organization, visit www.shebamicrosystems.ca.
EETimes article on PixArt Imaging’s "smart pixel" sensor
Image Sensors World Go to the original article...
Link: https://www.eetimes.com/smart-pixel-optical-sensing-exerting-ai-in-pixels-level/
Smart Pixel Optical Sensing – Exerting AI in Pixels Level
The PAC9001LU Smart Pixel Optical Sensing Chip is a computer vision ASIC that serves as an always-on motion sensor by building a novel AI-driven pixel architecture into the sensor array design. Based on a CMOS image sensor rolling-shutter design with an array of 36 x 16 pixels, it supports a high frame rate of up to 1000 Hz to facilitate image capture of fast-moving objects. The in-pixel AI design integrates a frame-comparison circuit with AI-powered algorithms to compute differences in pixel luminosity within a configurable image area. It directly provides analog frame differences and event information in Pixel Differences Mode, and supports a Smart Motion Detection Mode that eliminates complex image signal processing in the host processor. Partial-array sensing, such as a configurable ROI region, provides flexible custom scene capture for AIoT edge applications.
The PAC9001LU chip comes in a W2.5 x L2.6 x H0.43 mm3 CSP package (excluding solder balls). A recommended matching lens set, the LST0-2621, is also available to form a complete module when assembled with the PAC9001LU chip; the assembled module measures W3.79 x L3.63 x H1.67 mm3 (height including guide pin).
- Low power consumption in Smart Motion Detection Mode, combined with the intelligent event information the chip provides, is its most valuable building block for enabling AI applications. By comparison, PIR or conventional CMOS image sensor (CIS) solutions require higher system-level power for further data processing.
- The high report rate, up to 1000 Hz, enables detection of fast-moving objects, outperforming PIR and conventional CIS.
- The PAC9001LU is more robust, with reliable performance: it produces fewer false motion alarms and has higher immunity to temperature interference. External environmental factors, such as bright, hot sunlight outdoors or indoor thermal noise from heated devices, do not affect its sensing performance. Built-in algorithms also eliminate interference such as background noise.
- The small form factor of the complete PAC9001LU sensor module, including the lens set, fits nicely into slim-bezel industrial designs.
- Traditional PIR sensors usually cannot be shielded by a plastic or glass front cover, which would block the thermal IR radiation they detect. The PAC9001LU has no such restriction: it maintains motion-sensing quality behind a front cover of any material, so even placing the motion-sensing device indoors looking out through a glass window is possible. The cover also makes the PAC9001LU less prone to external damage.
- The PAC9001LU supports sensing in low-light or even no-light conditions, making it well suited to dark environments such as basements.
- The PAC9001LU performs high-speed motion detection on-chip, eliminating the need for external controller processing.
- In addition to motion sensing, the PAC9001LU can provide the coordinates of a targeted moving object, synchronized with each pixel-differences image frame.
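The chip’s frame-comparison circuit is proprietary, but conceptually the pixel-differences approach resembles the following sketch: subtract consecutive frames within a configurable ROI, threshold the per-pixel luminosity change, and report the centroid of the changed pixels as the moving object’s coordinates (the array size, ROI, and threshold below are illustrative assumptions, not PixArt parameters):

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, roi, threshold):
    """Frame-differencing motion detection over a configurable ROI.

    prev_frame, curr_frame: 2D arrays of pixel luminosity (e.g. 16x36).
    roi: (row_start, row_end, col_start, col_end) region of interest.
    threshold: minimum per-pixel luminosity change counted as an event.
    Returns (motion_detected, diff_map, centroid).
    """
    r0, r1, c0, c1 = roi
    # Per-pixel luminosity differences, restricted to the ROI
    diff = np.abs(curr_frame[r0:r1, c0:c1].astype(int) -
                  prev_frame[r0:r1, c0:c1].astype(int))
    events = diff > threshold
    if not events.any():
        return False, diff, None
    # Coordinates of the moving object: centroid of changed pixels
    rows, cols = np.nonzero(events)
    centroid = (r0 + rows.mean(), c0 + cols.mean())
    return True, diff, centroid
```

Running this comparison in-pixel, rather than shipping full frames to a host processor, is what allows the always-on mode to stay within a PIR-class power budget.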
Job Postings – Week of 24 Sep 2023
Image Sensors World Go to the original article...
To start the revised posting scheme, here are recently posted jobs from Apple and onsemi:
Apple
- Image Sensor Validation Engineer - Cupertino, California, USA
- Sensor Process Engineer - Camera Hardware - Cupertino, California, USA
- Image Sensor Validation Engineer - Grenoble, Isere, France
- Sensor Process Engineer - Kanagawa, Kanagawa-ken, Japan
- Sensor Process Engineer - Cupertino, California, USA
- Pixel Development Engineer - Pasadena, California, USA
- Technical Program Manager (TPM), Image Sensor - Tokyo, Tokyo-to, Japan
onsemi
- Strategic Platform Architect – Image Sensors - San Jose, California, USA
- Technical Project Manager – Image and Depth Sensors - Haifa, Israel
- Sr Director Bracknell Design Center - Bracknell, Berkshire, UK
- Summer 2024 Analog/Digital Verification Intern - San Jose, California, USA
- Process Design Kit Development Staff Engineer - Scottsdale, Arizona, USA
Job Posting Update
Image Sensors World Go to the original article...
In order to eliminate the delays involved in reproducing job listings in detail, the method of reporting is changing. Starting today, this Jobs Update will be a report on job listings rather than the listings themselves. Even with the apparent slowdown in the job market, the number of listings is still quite large, and the backlog of listings that might be of interest is much larger.
To manage this situation, we will proceed as follows: for the next few weeks, the posting will focus each week on a small number of employers with multiple openings. Relevant job titles will be posted with their locations and original listing links. The job descriptions will not be included simply because they are all so long - click the links to see the details. Please note that we will always attempt to link to the original listing on the employer website, not to job boards.
Weekly job lists in ISW will continue to have individual links for four weeks but since many jobs remain unfilled longer than that, the older postings will be held in an archive for a year. The link to the archive is positioned below the four week links.
Initially, some of the postings may be several months old but, at some point, our listings will catch up with the backlog and each weekly posting will include only recent additions.
Conference List – October 2023
Image Sensors World Go to the original article...
Optica Laser Congress and Exhibition - 8-10 Oct 2023 - Tacoma, Washington, USA - Website
IEEE International Conference on Image Processing - 8-11 Oct 2023 - Kuala Lumpur, Malaysia - Website
244th Electrochemical Society Meeting - 8-12 Oct 2023 - Gothenburg, Sweden - Website
SPIE/COS Photonics Asia - 14-16 Oct 2023 - Beijing, China - Website
ASNT Annual Conference - 22-26 Oct 2023 - Houston, Texas, USA - Website
SPIE Photonex - 24-26 Oct 2023 - Glasgow, Scotland, UK - Website
BioPhotonics Conference - 24-26 Oct 2023 - Online - Website
OPTO Taiwan - 25-27 Oct 2023 - Taipei, Taiwan - Website
Return to Conference List Index
Conference List – September 2023
Image Sensors World Go to the original article...
SPIE Photonics Industry Summit - 27 Sep 2023 - Washington, DC, USA - Website
2023 International Conference on IC Design and Technology - 25-27 Sep 2023 - Tokyo, Japan - Website
Return to Conference List Index
Conference List Index
Image Sensors World Go to the original article...
The Conference Lists are sorted by month. Here is an index of the currently active months. Click the month to see the list.
2023:
September - October - November - December
2024:
January
2025:
Conference List Announcement
Image Sensors World Go to the original article...
Image Sensors World is pleased to introduce its compiled list of conferences and exhibitions that include image sensor topics and products. The range includes everything from device physics and designs to applications and test. Included are on-site meetings, on-line webinars and various sorts of hybrid and archived events.
Because many meetings keep the presented materials available on-line for extended periods, each event will remain in the listings for one year or until the next session of the event occurs, whichever is shorter. New events will be listed as soon as they are announced and grouped by the month the event is scheduled to begin.
The ISW list will be broader than the individual posts announcing events that are focused on image sensors, such as the IISW or some of the IEEE meetings. Those posts will continue, to ensure that ISW readers are aware of them in time to submit papers or make travel plans.
Building the list will take some time so the postings will be added in chronological order with those scheduled soonest first. Listings will be brief, giving the sponsor, event name, location, dates and a link to the event website. Note that often the links change as the date of the event approaches. Occasionally, additional information will be supplied that affects who can attend - security clearance or membership requirements, for example.
Most events require payment for attendance to technical sessions and some charge for exhibits. These listings will not include the costs because those are often contingent on time, membership, discounts and other factors. See the event websites for registration details.
Finally, if you are reading the listings looking for opportunities to exhibit your products, those with associated exhibitions will usually have exhibitor information sections on their websites. Please consult them as early as possible. Many popular shows are sold out long before the event begins and some require reservations years in advance.
Feel free to post comments on the shows - your experiences or plans - and let us know if you are aware of an event ISW should list.
The index to the monthly lists is here.
Sony announces IMX735 17.42MP Automotive CIS
Image Sensors World Go to the original article...
Press release: https://www.sony-semicon.com/en/news/2023/2023091201.html
Sony Semiconductor Solutions to Release CMOS Image Sensor for Automotive Cameras with Industry-Leading 17.42-Effective Megapixels
Delivering sophisticated sensing and recognition performance and contributing to safe, secure automated driving
Atsugi, Japan — Sony Semiconductor Solutions Corporation (SSS) today announced the upcoming release of the IMX735, a new CMOS image sensor for automotive cameras with the industry’s highest pixel count, at 17.42 effective megapixels. The new sensor product will support the development of automotive camera systems capable of sophisticated sensing and recognition performance, thereby contributing to safe, secure automated driving.
For automated systems to deliver automated driving, they must offer sophisticated, high-precision sensing and recognition performance, encompassing all 360 degrees of the environment around the vehicle. Accordingly, there is considerable demand for image sensors that can help achieve this level of performance and support the development of more advanced automotive camera systems.
The new sensor product achieves the industry’s highest pixel count of 17.42 effective megapixels, enabling high-definition capture of far-off objects. Moreover, automated driving systems often use automotive cameras in combination with LiDAR and other sensing systems. While typical CMOS image sensors read out pixel signals one vertical line at a time, this product outputs signals horizontally, one row at a time. This means that automotive cameras employing this sensor can more easily synchronize with mechanical scanning LiDAR, since their laser beams also scan horizontally. This better synchronization improves the sensing and recognition capabilities of the automated driving system as a whole.
Furthermore, the new sensors’ improved saturation illuminance, made possible by a proprietary pixel structure, and unique exposure method yield a wide dynamic range of 106 dB even when simultaneously employing high dynamic range (HDR) imaging and LED flicker mitigation. The dynamic range is even higher, at 130 dB, when using dynamic range priority mode. This creative design helps suppress highlight blowouts even in backlit conditions, enabling more precise object capture in road environments with significant differences in brightness, such as tunnel entrances and exits.
Main Features
■Long-distance recognition delivered by industry-leading 17.42 megapixels
Thanks to the industry’s highest pixel count of 17.42 effective megapixels, the new sensor is capable of high definition capture, extending the object recognition range to greater distances and thereby allowing better detection of road conditions, vehicles, pedestrians and other objects. Early detection of far-away objects while driving helps make automated driving systems safer.
■Horizontal pixel signal output for easier synchronization with mechanical-scanning LiDAR
When reading signals from pixels, CMOS image sensors generally do so in a vertical direction one line at a time. This product, on the other hand, employs a readout method that outputs signals horizontally one row at a time, making it easier to synchronize with mechanical-scanning LiDAR, which also uses a horizontal scanning method. This means that the information output from automotive cameras equipped with this product can be integrated with LiDAR information downstream on the system. This will improve the sensing and recognition capabilities of the automated driving system as a whole.
■Wide dynamic range even during simultaneous use of HDR and LED flicker mitigation
In automobile driving, objects must be precisely detected and recognized even in road environments with significant differences in brightness, such as tunnel entrances and exits. Automotive cameras are also required to suppress LED flicker, even while in HDR mode, to deal with the increasing prevalence of LED signals and other traffic devices. The proprietary pixel structure and unique exposure method of this product improves saturation illuminance, yielding a wide dynamic range of 106 dB even when simultaneously employing HDR and LED flicker mitigation (when using dynamic range priority mode, the range is even wider, at 130 dB). This design also helps reduce motion artifacts generated when capturing moving subjects.
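To put the quoted figures in perspective, dynamic range in decibels relates the largest to the smallest resolvable signal as DR = 20·log10(S_max/S_min). The quick check below shows what signal ratios the 106 dB and 130 dB figures correspond to (the ratios are derived from the press release numbers, not from Sony’s internal data):

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Dynamic range in dB: 20*log10 of the signal ratio."""
    return 20 * math.log10(max_signal / min_signal)

# 106 dB corresponds to a max/min signal ratio of roughly 200,000:1;
# 130 dB corresponds to roughly 3,160,000:1.
ratio_106 = 10 ** (106 / 20)
ratio_130 = 10 ** (130 / 20)
```

A tunnel exit on a sunny day can easily span five to six orders of magnitude in scene luminance, which is why the 106 dB (about 10^5.3:1) figure matters for this use case.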
■Compliant with standards required for automotive applications
The product will be qualified for AEC-Q100 Grade 2 automotive electronic component reliability tests by the time of mass production. SSS has also introduced a development process compliant with the ISO 26262 road vehicle functional safety standard, at automotive safety integrity level ASIL-B(D). This contributes to improved automotive camera system reliability.
■Cybersecurity required for automotive applications (optional)
The product can support cybersecurity features such as camera authentication via a public-key algorithm to confirm CMOS image sensor authenticity, image authentication to detect any tampering with acquired images, and communication authentication to detect any tampering with control communications.
Texas Instruments Documentation
Image Sensors World Go to the original article...
Texas Instruments made (mostly) area CCDs using the virtual-phase architecture invented by Jerry Hynecek in their Central Research Lab. Some later devices, designated "Impactron," incorporated a high-voltage shift register that provided electron multiplication. TI made these devices until 2011, when their fab in Aizu-Wakamatsu, Japan, was heavily damaged in an earthquake. The CCD line was never restarted.
You may notice that the archive includes a data sheet for the TIVICON silicon vidicon camera tube. Truly, TI made an imaging vacuum tube before it made solid-state sensors. It was built for an Air Force forward-looking infrared (FLIR) system that flew over the jungles of Vietnam making thermal images of people among the trees. The silicon vidicon looked at a spinning line of infrared LEDs (another TI product) to produce a windshield-wiper-shaped image that was displayed on a video monitor. I ran the lab that tested these tubes and I wrote the data sheet included in the archive to start TI on commercial sales of the tubes. I was gone before TI introduced CCDs but my boss, Frank Skaggs, moved to that program.
Link to the TI folder
Return to the Documentation List
Image Sensor Documentation Archive
Image Sensors World Go to the original article...
We are pleased to introduce a new feature of the ISW blog, an archive of data sheets and other documents related to image sensor products. A recent survey of blog readers indicated that many of you would like access to a location where both historical and current data sheets, application notes, test data and other documents describing image sensors released to the market are available for download. The new archive will provide this.
Image sensors have been made by over 200 companies since manufacturing began in the early 1970s and new ones appear almost every day now. I will attempt to both fill in the past and keep the archive current but catching up will take a while. Please comment if you have priorities you would like me to consider. Also comment if you have recollections that might interest the newer members of the image sensor community.
All of the documents will be stored on a Google Drive account with open access. This means that documents that are export-controlled, mostly for thermal sensors, will not be included. To get to the Drive folders, you need only to click a company link on the blog post that comes up when you select "Image Sensor Documentation" from the list on the left of the blog front page. Download whatever you like.
If you have anything you would like to contribute, send me an e-mail and we can make an arrangement. While we are interested in more than just data sheets, we can't post any company confidential information even if the company is out of the image sensor business or no longer exists.
As I add new companies to the archive, I will post announcements with a link to the company folder and a little background on the company. Again, feel free to comment if you have something interesting to add. Note that the companies and their documents will be identified by the company name in use when each sensor was introduced. Thus, companies like OnSemi may have products under other names going all the way back to Photobit or IMEC or Kodak.
The first company up is Texas Instruments, active 1978-2011, making virtual-phase and electron-multiplying sensors invented by Jerry Hynecek. The post with the link to the TI documents is here.
Image Sensor Documentation List
Image Sensors World Go to the original article...
The ISW Blog documentation folders contain documents for image sensors from these companies:
Foveon Announcement Document Archive
Pixim Announcement Document Archive
SITe Announcement Document Archive
Tektronix Announcement Document Archive
Texas Instruments Announcement Document Archive
Photonis is now ExoSens
Image Sensors World Go to the original article...
PHOTONIS GROUP BECOMES EXOSENS
PRESS RELEASE
MÉRIGNAC – SEPTEMBER 20th 2023
PHOTONIS GROUP, a global leader in highly differentiated technology for detection, imaging and light, held by Groupe HLD since 2021, is transforming deeply by developing adjacent technologies and expanding into particle detection markets. Following that strategy, the group has acquired four companies (Xenics, Proxivision, Telops and Elmul) since December 2022. The worldwide leader in image intensifier tubes, the company has diversified its technology and product portfolios with the ambition of becoming the worldwide leader in detection and imaging technologies. To reflect that strategy, PHOTONIS GROUP becomes EXOSENS.
Offering electro-optic devices covering the full optical spectrum from UV to LWIR, in addition to electron, ion, neutron and gamma detectors, EXOSENS addresses four markets: life science, industrial control, nuclear energy and defense. The company benefits from positive dynamics in each of these four verticals, such as growing demand for enhanced diagnostics, factory automation, small modular reactor deployment and increasing defense budgets.
Jean-Hubert Vial, partner at Groupe HLD, said: “It’s an important step for Photonis Group. By becoming EXOSENS, the company clearly anchors its position as a high-end technology provider serving high-growth commercial and defense markets for more sustainability and safety.”
Jérôme Cerisier, CEO of the new group, said: “EXOSENS means ‘to detect, to see and to give meaning to what is beyond.’ It perfectly reflects what we are doing: we reveal the invisible, we sense the world to make it safer. With EXOSENS, we aim to share our common values throughout the whole organization, to integrate new companies and colleagues, and to always offer high-performance products to meet customer satisfaction.”
Operationally, legal entities will keep their existing names. The four product brands Photonis (for intensified products, nuclear and mass spectrometry detectors), Xenics (for infrared sensors and cameras), Elmul (for electron detectors) and Telops (for hyperspectral and cooled infrared camera) will continue to be deployed and promoted in their markets.
ABOUT EXOSENS:
Accompanied by Groupe HLD since 2021, EXOSENS is a high-tech company with more than 85 years of experience in the innovation, development, manufacture and sale of technologies in the field of particle and photon detection and imaging. Today, it offers its customers detectors and detection solutions: its travelling wave tubes, advanced cameras, neutron & gamma detectors, instrument detectors and light intensifier tubes allow EXOSENS to respond to complex issues in extremely demanding environments by offering tailor-made solutions to its customers. Thanks to its sustained and permanent investment, EXOSENS is internationally recognized as a major innovator in optoelectronics, with production and R&D carried out at 9 sites in Europe and North America, and over 1,500 employees.