Archives for August 2022

Gpixel announces new global shutter GSPRINT 4502 sensor

Image Sensors World        Go to the original article...

Gpixel press release on August 17, 2022:

Gpixel expands high-speed GSPRINT image sensor series with a 2/3” 2.5 MP 3460 fps global shutter GSPRINT4502

Gpixel announces a high-speed global shutter image sensor, GSPRINT4502, a new member of the GSPRINT series taking high speed imaging to another level.

GSPRINT4502 is a 2.5 Megapixel (2048 x 1216), 2/3” (φ10.7 mm), high-speed image sensor designed with the latest 4.5 µm charge-domain global shutter pixel. It achieves more than 30 ke- charge capacity and less than 4 e- rms read noise, with a dynamic range of 68 dB that can be expanded using a multi-slope HDR feature. Utilizing an advanced 65 nm CIS process with light pipe and micro lens technology, the sensor achieves >65% quantum efficiency and < -92 dB parasitic light sensitivity.

GSPRINT4502 can achieve extremely high frame rates up to 3460 fps in 8-bit mode, 1780 fps in 10-bit mode or 850 fps in 12-bit mode, all at full resolution. With 2×2 on-chip charge binning, full well capacity can be further increased to 120 ke- and frame rate to 10,200 fps. GSPRINT4502 supports vertical and horizontal regions of interest for higher frame rates. GSPRINT4502 is perfect for high-speed applications including 3D laser profiling, industrial inspection, high speed video and motion analysis.

Data output from GSPRINT4502 is through 64 pairs of sub-LVDS channels running at 1.2 Gbps each. Flexible output channel multiplexing modes make it possible to reduce the frame and data rate, making the sensor compatible with all available camera interface options. GSPRINT4502 is packaged in a 255-pin uPGA ceramic package and will be offered in sealed and removable glass lid versions.
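As a back-of-envelope sanity check (using only the figures quoted in the release), the pixel throughput in each full-resolution mode fits within the 64 × 1.2 Gbps sub-LVDS interface:

```python
# Back-of-envelope check, using only figures quoted in the press release:
# does each full-resolution mode fit within the sub-LVDS interface bandwidth?
width, height = 2048, 1216               # full resolution
modes = {8: 3460, 10: 1780, 12: 850}     # bit depth -> max fps at full resolution
interface_gbps = 64 * 1.2                # 64 sub-LVDS pairs at 1.2 Gbps each

for bits, fps in modes.items():
    pixel_gbps = width * height * fps * bits / 1e9
    assert pixel_gbps < interface_gbps   # every mode fits (8-bit: ~68.9 of 76.8 Gbps)
    print(f"{bits:2d}-bit @ {fps} fps: {pixel_gbps:5.1f} Gbps of {interface_gbps:.1f} Gbps")
```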

“The market reaction to the GSPRINT high-speed image sensor family provides evidence that a growing number of applications require higher frame rates,” said Wim Wuyts, Chief Commercial Officer of Gpixel. “We are excited to continue to expand the portfolio to bring these high frame rates to more applications.”

GSPRINT4502 engineering samples can be ordered today for delivery in October, 2022. 

About the GSPRINT sensor family

The GSPRINT series is Gpixel’s high-speed global shutter product family, including the 21 MP GSPRINT4521, 10 MP GSPRINT4510 and 2.5 MP GSPRINT4502. The GSPRINT technology will be used to expand the sizes and resolutions available in the family in the future. To learn more about the GSPRINT series, please contact us at:

About Gpixel

Gpixel provides high-end customized and off-the-shelf CMOS image sensors for industrial, professional, medical, and scientific applications. Gpixel’s standard products include the GMAX and GSPRINT global shutter, fast frame rate sensors, the GSENSE and GLUX high-end scientific CMOS image sensor series, the GL series of line scan imagers, the GLT series of TDI line scan imagers and the GTOF series of iTOF imagers. Gpixel’s broad portfolio of products utilizes the latest technologies to meet the ever-growing demands of the professional imaging market.

Nikon D1 retro review

Cameralabs

In 1999, Nikon launched the D1, the first DSLR the company designed entirely by itself. 23 years later, I take this 2.7 Megapixel classic out for a retro review!…

Sigma 24mm f1.4 DG DN Art review

Cameralabs

The 24mm f1.4 DG DN Art from Sigma is a wide-angle prime lens designed for mirrorless full-frame cameras with L-mount or Sony E-mount. Is it a good alternative to Sony's FE 24mm f1.4 GM lens? Find out in my in-depth review!…

2023 International Image Sensors Workshop – Call for Papers

Image Sensors World

The 2023 International Image Sensor Workshop (IISW) will be held in Scotland from 22-25 May 2023. The first call for papers is now available at this link: 2023 IISW CFP.



2023 International Image Sensor Workshop

Crieff Hydro Hotel, Scotland, UK

22-25 May, 2023

The 2023 International Image Sensor Workshop (IISW) provides a biennial opportunity to present innovative work in the area of solid-state image sensors and share new results with the image sensor community. Now in its 35th year, the workshop will return to an in-person format. The event is intended for image sensor technologists; in order to encourage attendee interaction and a shared experience, attendance is limited, with strong acceptance preference given to workshop presenters. As is the tradition, the 2023 workshop will emphasize an open exchange of information among participants in an informal, secluded setting near the Scottish town of Crieff. The scope of the workshop includes all aspects of electronic image sensor design and development. In addition to regular oral and poster papers, the workshop will include invited talks and the announcement of International Image Sensor Society (IISS) Award winners.

Papers on the following topics are solicited:

Image Sensor Design and Performance
CMOS imagers, CCD imagers, SPAD sensors
New and disruptive architectures
Global shutter image sensors
Low noise readout circuitry, ADC designs
Single photon sensitivity sensors
High frame rate image sensors
High dynamic range sensors
Low voltage and low power imagers
High image quality; Low noise; High sensitivity
Improved color reproduction
Non-standard color patterns with special digital processing
Imaging system-on-a-chip, On-chip image processing

Pixels and Image Sensor Device Physics
New devices and pixel structures
Advanced materials
Ultra miniaturized pixels development, testing, and characterization
New device physics and phenomena
Electron multiplication pixels and imagers
Techniques for increasing QE, well capacity, reducing crosstalk, and improving angular response
Front side illuminated, back side illuminated, and stacked pixels and pixel arrays
Pixel simulation: Optical and electrical simulation, 2D and 3D, CAD for design and simulation, improved models

Application Specific Imagers
Image sensors and pixels for range sensing: LIDAR, TOF, RGBZ, structured light, stereo imaging, etc.
Image sensors with enhanced spectral sensitivity (NIR, UV, IR)
Sensors for DSC, DSLR, mobile, digital video cameras and mirror-less cameras
Array imagers and sensors for multi-aperture imaging, computational imaging, and machine learning
Sensors for medical applications, microbiology, genome sequencing
High energy photon and particle sensors (X-ray, radiation)
Line arrays, TDI, Very large format imagers
Multi and hyperspectral imagers
Polarization sensitive imagers

Image sensor manufacturing and testing
New manufacturing techniques
Backside thinning
New characterization methods
Defects & leakage current

On-chip optics and imaging process technology
Advanced optical path, Color filters, Microlens, Light guides
Nanotechnologies for Imaging
Wafer level cameras
Packaging and testing: Reliability, Yield, Cost
Stacked imagers, 3D integration
Radiation damage and radiation hard imager


General Workshop Co-Chairs
Robert Henderson – The University of Edinburgh
Guy Meynants – Photolitics and KU Leuven

Technical Program Chair
Neale Dutton – ST Microelectronics

Technical Program Committee
Jan Bogaerts - GPixel, Belgium
Calvin Yi-Ping Chao - TSMC, Taiwan
Edoardo Charbon - EPFL, Switzerland
Bart Dierickx - Caeleste, Belgium
Amos Fenigstein - TowerJazz, Israel
Manylun Ha -  DB Hitek, South Korea
Vladimir Korobov - ON Semiconductor, USA
Bumsuk Kim - Samsung, South Korea
Alex Krymski - Alexima, USA
Jiaju Ma - Gigajot, USA
Pierre Magnan - ISAE, France
Robert Daniel McGrath - Goodix Technology, USA
Preethi Padmanabhan - AMS-Osram, Austria
Francois Roy - STMicroelectronics, France
Andreas Suess - Omnivision Technologies, USA

IISS Board of Directors
Boyd Fowler – OmniVision
Michael Guidash – R.M. Guidash Consulting
Robert Henderson – The University of Edinburgh
Shoji Kawahito – Shizuoka University and Brookman Technology
Vladimir Koifman – Analog Value
Rihito Kuroda – Tohoku University
Guy Meynants – Photolitics
Junichi Nakamura – Brillnics
Yusuke Oike – Sony (Japan)
Johannes Solhusvik – Sony (Norway)
Daniel Van Blerkom – Forza Silicon-Ametek
Yibing Michelle Wang – Samsung Semiconductor

IISS Governance Advisory Committee:
Eric Fossum - Thayer School of Engineering at Dartmouth, USA
Nobukazu Teranishi - University of Hyogo, Japan
Albert Theuwissen - Harvest Imaging, Belgium / Delft University of Technology, The Netherlands

Tamron 50-400mm f4.5-6.3 Di III VC review

Cameralabs

The Tamron 50-400mm f4.5-6.3 is a zoom lens for Sony’s Alpha mirrorless cameras sporting a huge 8x range from standard to long telephoto. Find out how it compares in my full review!…

Surprises of Single Photon Imaging

Image Sensors World

[This is an invited blog post by Prof. Andreas Velten from University of Wisconsin-Madison.]

When we started working on single photon imaging we were anticipating having to do away with many established concepts in computational imaging and photography. Concepts like exposure time, well depth, motion blur, and many others don’t make sense for single photon sensors. Despite this expectation we still encountered several unexpected surprises.

Our first surprise was that SPAD cameras, which are typically touted for low-light applications, have an exceptionally large dynamic range and therefore outperform conventional sensors not only in dark but also in very bright scenes. Due to their hold-off time, SPADs reject a growing fraction of photons at higher flux levels, resulting in a nonlinear response curve. The classical light flux is usually estimated by counting photons over a certain time interval. One can instead measure the time between photons, or the time a sensor pixel waits for a photon in the active state. This further increases dynamic range, so that the saturation flux level is above the safe operating range of the detector pixel and far above eye-safety levels. The camera does not saturate. [1][2][3]
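The hold-off behavior can be sketched with a toy non-paralyzable dead-time model (a standard textbook approximation, not the exact derivation in the cited papers): the detected count rate saturates at 1/τ, but the response curve remains invertible, so the incident flux can still be recovered from the measured rate.

```python
# Toy non-paralyzable dead-time model (a textbook approximation, not the
# exact model from the cited papers): with hold-off time tau, the detected
# count rate saturates at 1/tau, yet stays invertible, so the incident
# flux can be recovered from the measured rate.
def detected_rate(flux, tau):
    """Counts per second registered for an incident flux (photons/s)."""
    return flux / (1.0 + flux * tau)

def recovered_flux(rate, tau):
    """Invert the response curve to estimate the incident flux."""
    return rate / (1.0 - rate * tau)

tau = 100e-9  # 100 ns hold-off (illustrative value)
for flux in (1e5, 1e7, 1e9):
    r = detected_rate(flux, tau)
    print(f"flux {flux:.0e} ph/s -> {r:.3e} cps -> recovered {recovered_flux(r, tau):.3e} ph/s")
```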

The second surprise was that single photon cameras, without further computational improvements, are of limited use in low-light imaging situations. In most imaging applications, motion of the scene or camera demands short exposure times, well below 1 second, to avoid motion blur. Light levels low enough to present a challenge to current CMOS sensors result in low photon counts even for a perfect camera. The image looks noisy not because of a problem introduced by the sensor, but because of Poisson noise due to light quantization. The low-light capabilities of SPADs only come to bear when long exposure times are used or when motion can be compensated for. Luckily, motion compensation strategies inspired by burst photography and event cameras work exceptionally well for SPADs, due to the absence of readout noise and of inherent motion blur. [4][5][6]
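The shot-noise floor is easy to demonstrate numerically: for Poisson arrivals, SNR = sqrt(N) regardless of sensor quality (the photon counts below are illustrative choices, not figures from the post):

```python
import numpy as np

# Shot-noise floor for an ideal, noiseless sensor: photon arrivals are
# Poisson, so SNR = sqrt(N) no matter how good the camera is.
# (Illustrative photon counts, not figures from the post.)
rng = np.random.default_rng(0)
for mean_photons in (4, 25, 100):
    counts = rng.poisson(mean_photons, size=200_000)  # one short exposure per pixel
    snr = counts.mean() / counts.std()
    print(f"{mean_photons:3d} photons/pixel: SNR ~ {snr:.2f} (sqrt(N) = {mean_photons ** 0.5:.1f})")
```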

Finally, we assumed early on that single photon sensors have an inherent disadvantage due to larger energy consumption. They either need internal amplification, like the SPAD, or high frame rates, like QIS and qCMOS, both of which result in higher power consumption. We learned that the internal amplification process in SPADs makes up a small and decreasing portion of the overall energy consumption of a SPAD. The lion's share is spent transferring and storing the large data volumes that result from individually processing every single photon. To address the power consumption of SPAD cameras, we therefore need to find better ways to compress photon data close to the pixel and to be more selective about which photons to process and which to ignore. Even the operation of a conventional CMOS camera can be thought of as a type of compression: photons are accumulated over an exposure time and only the total is read out after each frame. The challenge for SPAD cameras is to use their access to every single photon and combine it with more sophisticated data compression implemented close to the pixel. [7]

As we transition imaging to widely available high resolution single photon cameras, we are likely in for more surprises. Light is made up of photons. Light detection is a Poisson process. Light and light intensity are derived quantities that are based on ensemble averages over a large number of photons. It is reasonable to assume that detection and processing methods that are based on the classical concept of flux are sub-optimal. The full potential of single photon capture and processing is therefore not yet known. I am hoping for more positive surprises.


[1] Ingle, A., Velten, A., & Gupta, M. (2019). High flux passive imaging with single-photon sensors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 6760-6769). [Project Page]

[2] Ingle, A., Seets, T., Buttafava, M., Gupta, S., Tosi, A., Gupta, M., & Velten, A. (2021). Passive inter-photon imaging. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 8585-8595). [Project Page]

[3] Liu, Y., Gutierrez-Barragan, F., Ingle, A., Gupta, M., & Velten, A. (2022). Single-photon camera guided extreme dynamic range imaging. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (pp. 1575-1585). [Project Page]

[4] Seets, T., Ingle, A., Laurenzis, M., & Velten, A. (2021). Motion adaptive deblurring with single-photon cameras. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (pp. 1945-1954). [Interactive Visualization]

[5] Ma, S., Gupta, S., Ulku, A. C., Bruschini, C., Charbon, E., & Gupta, M. (2020). Quanta burst photography. ACM Transactions on Graphics (TOG), 39(4), 79-1. [Project Page]

[6] Laurenzis, M., Seets, T., Bacher, E., Ingle, A., & Velten, A. (2022). Comparison of super-resolution and noise reduction for passive single-photon imaging. Journal of Electronic Imaging, 31(3), 033042.

[7] Gutierrez-Barragan, F., Ingle, A., Seets, T., Gupta, M., & Velten, A. (2022). Compressive Single-Photon 3D Cameras. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 17854-17864). [Project Page]


About the author:

Andreas Velten is an Assistant Professor in the Department of Biostatistics and Medical Informatics and the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison, where he directs the Computational Optics Group. He obtained his PhD in Physics with Prof. Jean-Claude Diels at the University of New Mexico in Albuquerque and was a postdoctoral associate in the Camera Culture Group at the MIT Media Lab. He was included in the MIT TR35 list of the world's top innovators under the age of 35, and is a senior member of NAI, OSA, and SPIE as well as a member of Sigma Xi. He is a co-founder of OnLume, a company that develops surgical imaging systems, and of Ubicept, a company developing single photon imaging solutions.

amsOSRAM announces new sensor Mira220

Image Sensors World

  • New Mira220 image sensor’s high quantum efficiency enables operation with low-power emitter and in dim lighting conditions
  • Stacked chip design uses ams OSRAM back side illumination technology to shrink package footprint to just 5.3mm x 5.3mm, giving greater design flexibility to manufacturers of smart glasses and other space-constrained products
  • Low-power operation and ultra-small size make the Mira220 ideal for active stereo vision or structured lighting 3D systems in drones, robots and smart door locks, as well as mobile and wearable devices

Press Release:

Premstaetten, Austria (14th July 2022) -- ams OSRAM (SIX: AMS), a global leader in optical solutions, has launched a 2.2Mpixel global shutter visible and near infrared (NIR) image sensor which offers the low-power characteristics and small size required in the latest 2D and 3D sensing systems for virtual reality (VR) headsets, smart glasses, drones and other consumer and industrial applications.

The new Mira220 is the latest product in the Mira family of pipelined high-sensitivity global shutter image sensors. ams OSRAM uses back side illumination (BSI) technology in the Mira220 to implement a stacked chip design, with the sensor layer on top of the digital/readout layer. This allows it to produce the Mira220 in a chip-scale package with a footprint of just 5.3mm x 5.3mm, giving manufacturers greater freedom to optimize the design of space-constrained products such as smart glasses and VR headsets.

The sensor combines excellent optical performance with very low-power operation. The Mira220 offers a high signal-to-noise-ratio as well as high quantum efficiency of up to 38% as per internal tests at the 940nm NIR wavelength used in many 2D or 3D sensing systems. 3D sensing technologies such as structured light or active stereo vision, which require an NIR image sensor, enable functions such as eye and hand tracking, object detection and depth mapping. The Mira220 will support 2D or 3D sensing implementations in augmented reality and virtual reality products, in industrial applications such as drones, robots and automated vehicles, as well as in consumer devices such as smart door locks.

The Mira220’s high quantum efficiency allows device manufacturers to reduce the output power of the NIR illuminators used alongside the image sensor in 2D and 3D sensing systems, reducing total power consumption. The Mira220 itself draws very little power: only 4mW in sleep mode and 40mW in idle mode, and at full resolution and 90fps it consumes 350mW. By keeping system power consumption low, the Mira220 enables wearable and portable device manufacturers to save space by specifying a smaller battery, or to extend run-time between charges.
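The quoted operating points translate directly into an energy budget. A quick sketch (the duty-cycled usage pattern below is a hypothetical example, not from the release):

```python
# Energy per frame at the operating points quoted in the release
# (4 mW sleep, 40 mW idle, 350 mW at full resolution and 90 fps).
active_mw, fps, sleep_mw = 350, 90, 4
energy_per_frame_mj = active_mw / fps          # ~3.9 mJ per captured frame

# Hypothetical duty-cycled usage (my assumption, not from the release):
# capture bursts averaging 10 fps, sleep the rest of the time.
duty = 10 / fps
avg_mw = duty * active_mw + (1 - duty) * sleep_mw
print(f"{energy_per_frame_mj:.2f} mJ/frame; duty-cycled average {avg_mw:.1f} mW")
```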

“Growing demand in emerging markets for VR and augmented reality equipment depends on manufacturers’ ability to make products such as smart glasses smaller, lighter, less obtrusive and more comfortable to wear. This is where the Mira220 brings new value to the market, providing not only a reduction in the size of the sensor itself, but also giving manufacturers the option to shrink the battery, thanks to the sensor’s very low power consumption and high sensitivity at 940nm,” said Brian Lenkowski, strategic marketing director for CMOS image sensors at ams OSRAM.

Superior pixel technology

The Mira220’s advanced back-side illumination (BSI) technology gives the sensor very high sensitivity and quantum efficiency with a pixel size of 2.79μm. Effective resolution is 1600px x 1400px and maximum bit depth is 12 bits. The sensor is supplied in a 1/2.7” optical format.

The sensor supports on-chip operations including external triggering, windowing, and horizontal or vertical mirroring. The MIPI CSI-2 interface allows for easy interfacing with a processor or FPGA. On-chip registers can be accessed via an I2C interface for easy configuration of the sensor.

Digital correlated double sampling (CDS) and row noise correction result in excellent noise performance.

ams OSRAM will continue to innovate and extend the Mira family of solutions, offering customers a choice of resolution and size options to fit various application requirements.

The Mira220 NIR image sensor is available for sampling. More information about Mira220.

Mira220 image sensor achieves high quantum efficiency at 940nm to allow for lower power illumination in 2D and 3D sensing systems
Image: ams

The miniature Mira220 gives extra design flexibility in space-constrained applications such as smart glasses and VR headsets
Image: OSRAM

Sigma 20mm f1.4 DG DN Art review

Cameralabs

The 20mm f1.4 DG DN Art from Sigma is an ultra-wide prime lens designed for mirrorless full-frame cameras with L-mount or Sony E-mount. Is it a worthy alternative to Sony's FE 20mm f1.8 G lens? Find out in my in-depth review!…

Gigajot article in Nature Scientific Reports

Image Sensors World

Jiaju Ma et al. of Gigajot Technology, Inc. have published a new article titled "Ultra‑high‑resolution quanta image sensor with reliable photon‑number‑resolving and high dynamic range capabilities" in Nature Scientific Reports.


Superior low‑light and high dynamic range (HDR) imaging performance with ultra‑high pixel resolution are widely sought after in the imaging world. The quanta image sensor (QIS) concept was proposed in 2005 as the next paradigm in solid‑state image sensors after charge coupled devices (CCD) and complementary metal oxide semiconductor (CMOS) active pixel sensors. This next‑generation image sensor would contain hundreds of millions to billions of small pixels with photon‑number‑resolving and HDR capabilities, providing superior imaging performance over CCD and conventional CMOS sensors. In this article, we present a 163 megapixel QIS that enables both reliable photon‑number‑resolving and high dynamic range imaging in a single device. This is the highest pixel resolution ever reported among low‑noise image sensors with photon‑number‑resolving capability. This QIS was fabricated with a standard, state‑of‑the‑art CMOS process with 2‑layer wafer stacking and backside illumination. Reliable photon‑number‑resolving is demonstrated with an average read noise of 0.35 e‑ rms at room temperature operation, enabling industry leading low‑light imaging performance. Additionally, a dynamic range of 95 dB is realized due to the extremely low noise floor and an extended full‑well capacity of 20k e‑. The design, operating principles, experimental results, and imaging performance of this QIS device are discussed.
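The 95 dB figure follows directly from the noise floor and full-well capacity quoted in the abstract, via the standard definition DR = 20·log10(FWC / read noise):

```python
import math

# Dynamic range from the abstract's figures: 20k e- extended full-well
# capacity and 0.35 e- rms read noise at room temperature.
full_well_e = 20_000
read_noise_e = 0.35
dr_db = 20 * math.log10(full_well_e / read_noise_e)
print(f"dynamic range = {dr_db:.1f} dB")  # ~95 dB, matching the paper
```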

Ma, J., Zhang, D., Robledo, D. et al. Ultra-high-resolution quanta image sensor with reliable photon-number-resolving and high dynamic range capabilities. Sci Rep 12, 13869 (2022).

This is an open access article:

Four Nikon products, including the Nikon Z 9 mirrorless camera, honored with EISA Awards

Nikon | Imaging Products

New understanding of color perception theory

Image Sensors World

From "Math error: A new study overturns 100-year-old understanding of color perception", a news article about a recent paper that casts doubt on the traditional understanding of how human color perception works:

A new study corrects an important error in the 3D mathematical space developed by the Nobel Prize-winning physicist Erwin Schrödinger and others, and used by scientists and industry for more than 100 years to describe how your eye distinguishes one color from another. The research has the potential to boost scientific data visualizations, improve TVs and recalibrate the textile and paint industries.

The full paper appears in the Proceedings of the National Academy of Sciences vol. 119 no. 18 (2022). It is titled "The non-Riemannian nature of perceptual color space" authored by Dr. Roxana Bujack and colleagues at Los Alamos National Lab.

The scientific community generally agrees on the theory, introduced by Riemann and furthered by Helmholtz and Schrödinger, that perceived color space is not Euclidean but rather, a three-dimensional Riemannian space. We show that the principle of diminishing returns applies to human color perception. This means that large color differences cannot be derived by adding a series of small steps, and therefore, perceptual color space cannot be described by a Riemannian geometry. This finding is inconsistent with the current approaches to modeling perceptual color space. Therefore, the assumed shape of color space requires a paradigm shift. Consequences of this apply to color metrics that are currently used in image and video processing, color mapping, and the paint and textile industries. These metrics are valid only for small differences. Rethinking them outside of a Riemannian setting could provide a path to extending them to large differences. This finding further hints at the existence of a second-order Weber–Fechner law describing perceived differences.


The key observation that this paper rests on is the concept of "diminishing returns". Statistical analysis of the experimental data collected in this paper suggests that the perceived differences between pairs of colors A, B, and C that lie along a single shortest path (geodesic) do not satisfy the additive equality a Riemannian metric would require.
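A toy construction (my own illustration, not the paper's data) shows how diminishing returns breaks additivity: if perceived difference is a strictly concave function of the underlying arc length, then for a point B on the geodesic between A and C, d(A, C) < d(A, B) + d(B, C).

```python
import math

# Toy illustration of "diminishing returns" (my own construction, not the
# paper's data): a strictly concave perception function makes the large
# difference fall short of the sum of the small steps along a geodesic.
def perceived(arc_length):
    return math.sqrt(arc_length)  # any strictly concave f with f(0) = 0 works

d_ab, d_bc = 4.0, 9.0                  # arc lengths A->B and B->C
whole = perceived(d_ab + d_bc)         # perceived difference A->C
parts = perceived(d_ab) + perceived(d_bc)
print(f"d(A,C) = {whole:.2f} < {parts:.2f} = d(A,B) + d(B,C)")
```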

A commentary by Dr. David Brainard (U. Penn.) about this paper was published in PNAS and is available here:

Some of the caveats noted in this commentary piece:

First, the authors make a first-principles assumption that the achromatic locus is a geodesic and use this in their choice of stimuli. This assumption is intuitively appealing in that it would be surprising that the shortest path in color space between two achromatic stimuli would involve a detour through a chromatic stimulus and back. However, the achromatic locus as a geodesic was not empirically established, and more work could be considered to shore up this aspect of the argument. Second, the data were collected using online methods and combined across subjects prior to the analysis. This raises the question of whether the aggregate performance analyzed could be non-Riemannian even when the performance of each individual subject was itself Riemannian. Although it is not immediately obvious whether this could occur, it might be further considered as a possibility.

LANL press release:

PNAS paper:

Voigtlander 35mm f2 APO-Lanthar review

Cameralabs

The Voigtländer 35mm f2 APO-Lanthar is a mild wide-angle prime lens designed for mirrorless full-frame cameras from Sony and Nikon. Find out whether it's worth considering although it is manual focus only in my full review!…

Direct ToF Single-Photon Imaging (IEEE TED June 2022)

Image Sensors World

The June 2022 issue of IEEE Transactions on Electron Devices has an invited paper titled "Direct Time-of-Flight Single-Photon Imaging" by Istvan Gyongy et al. from the University of Edinburgh and STMicroelectronics.

This is a comprehensive tutorial-style article on single-photon 3D imaging which includes a description of the image formation model starting from first principles and practical system design considerations such as photon budget and power requirements.

Abstract: This article provides a tutorial introduction to the direct Time-of-Flight (dToF) signal chain and typical artifacts introduced due to detector and processing electronic limitations. We outline the memory requirements of embedded histograms related to desired precision and detectability, which are often the limiting factor in the array resolution. A survey of integrated CMOS dToF arrays is provided highlighting future prospects to further scaling through process optimization or smart embedded processing.
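The memory pressure of embedded histograms is easy to see with a back-of-envelope calculation (the range, bin width, array size, and bit depth below are illustrative choices, not the paper's figures):

```python
# Back-of-envelope histogram memory for a dToF array, illustrating why
# per-pixel histograms often limit array resolution (illustrative numbers,
# not the paper's exact figures).
C = 3e8                      # speed of light, m/s
max_range_m = 100.0
bin_width_s = 100e-12        # 100 ps TDC bin -> c*t/2 = 15 mm of depth per bin
depth_per_bin_m = C * bin_width_s / 2
n_bins = int(max_range_m / depth_per_bin_m)       # 6666 bins per pixel
bits_per_bin = 8
pixels = 320 * 240
total_mb = n_bins * bits_per_bin * pixels / 8 / 1e6
print(f"{n_bins} bins/pixel -> {total_mb:.0f} MB for a 320x240 array")
```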

Canon RF 100mm f2.8L Macro IS USM review

Cameralabs

The Canon RF 100mm f2.8L IS USM is a high-end macro lens for Canon’s EOS R mirrorless system that’s capable of 1.4x magnification on a full-frame body. In my review I’ll compare it against the highly-regarded EF version.…

How to backup photos

Cameralabs

In this article I’ll show you how I backup my photos, videos and other precious files, using a combination of portable drives and cloud storage.…

CFP: International Workshop on Image Sensors and Imaging Systems 2022

Image Sensors World

The 5th International Workshop on Image Sensors and Imaging Systems (IWISS2022) will be held in December 2022 in Japan. This workshop is co-sponsored by IISS.

-Frontiers in image sensors based on conceptual breakthroughs inspired by applications-

Date: December 12 (Mon) and 13 (Tue), 2022

Venue: Sanaru Hall, Hamamatsu Campus, Shizuoka University 

Access: see

Address: 3-5-1 Johoku, Nakaku, Hamamatsu, 432-8561 JAPAN

Official language: English


In this workshop, people from various research fields, such as image sensing, imaging systems, optics, photonics, computer vision, and computational photography/imaging, come together to discuss the future and frontiers of image sensor technologies, in order to explore the continuous progress and diversity in image sensor engineering and in state-of-the-art and emerging imaging systems technologies. The workshop is composed of invited talks and a poster session. We are accepting approximately 20 poster papers; submission starts in August, with a deadline of October 14 (Fri), 2022. A Poster Presentation Award will be given to a selected outstanding paper. We encourage everyone to submit their latest original work. Every participant is required to register online by December 5 (Mon), 2022. On-site registration is NOT accepted. Since the workshop is operated by a limited number of volunteers, we can offer only minimal service; therefore, no invitation letters for visa applications to enter Japan can be issued.

Latest Information: Call for Paper, Advance Program

Poster Session
Submit a paper:
Submission deadline: Oct. 14(Fri), 2022 (Only title, authors, and short abstract are required)
Please use the above English page. DO NOT follow the Japanese instructions at the bottom of the page.
Notification of acceptance: by Oct. 21 (Fri)

Manuscript submission deadline: Nov. 21 (Mon), 2022 (2-page English proceeding is required)
One excellent poster will be awarded.

Plenary and Invited Speakers


[Plenary Talk]
“Deep sensing – Jointly optimize imaging and processing” by Hajime Nagahara (Osaka University, Japan)

[Invited Talks]
- Image Sensors
“InGaAs/InP and Ge-on-Si SPADs for SWIR applications” by Alberto Tosi (Politecnico di Milano, Italy)
“CMOS SPAD-Based LiDAR Sensors with Zoom Histogramming TDC Architectures” by Seong-Jin Kim et al. (UNIST, Korea)
"TBD" by Min-Sun Keel (Samsung Electronics, Korea)
“Modeling and verification of a photon-counting LiDAR” by Sheng-Di Lin (National Yang Ming Chiao Tung Univ., Taiwan)
- Computational Photography/Imaging and Applications
“Computational lensless imaging by coded optics” by Tomoya Nakamura (Osaka Univ., Japan)
“TBD” by Miguel H. Conde (Siegen Univ.)
“TBD” by TBD (Toronto Univ.)

- Optics and Photonics
“Optical system integrated time-of-flight and optical coherence tomography for high-dynamic range distance measurement” by Yoshio Hayasaki et al. (Utsunomiya Univ., Japan)
“High-speed/ultrafast holographic imaging using an image sensor” by Yasuhiro Awatsuji et al. (Kyoto Institute of Technology, Japan)
“Near-infrared sensitivity improvement by plasmonic diffraction technology” by Nobukazu Teranishi et al. (Shizuoka Univ, Japan)

- Image sensor technologies: fabrication process, circuitry, architectures
- Imaging systems and image sensor applications
- Optics and photonics: nanophotonics, plasmonics, microscopy, spectroscopy
- Computational photography/ imaging
- Applications and related topics on image sensors and imaging systems: e.g., multi-spectral imaging, ultrafast imaging, biomedical imaging, IoT, VR/AR, deep learning, ...

Online Registration for Audience
Registration is necessary due to the limited number of available seats.
Registration deadline is Dec. 5 (Mon).
Register and pay online from the following website: <to appear>

Registration Fee
Regular and student: approximately 2,000 yen (~15 USD)
Note: This price is for purchasing the online proceeding of IWISS2022 through the ITE. If you cannot join the workshop due to any reason, no refund will be provided.

Collaboration with MDPI Sensors Special Issue
Special Issue on "Recent Advances in CMOS Image Sensor"
Special issue editor: Dr. De Xing Lioe
Paper submission deadline: Feb. 25 (Sat), 2023
The poster presenters are encouraged to submit a paper to this special issue!
Note-1: Those who do not give a presentation in the IWISS2022 poster session are also welcome to submit a paper!
Note-2: Sensors is an open access journal; article processing charges (APC) apply to accepted papers.
Note-3: For poster presenters of IWISS2022, please satisfy the following conditions.

Extended papers submitted to the special issue should contain more than 50% new data and/or extended content, to make them real and complete journal papers. It is preferable for the title and abstract to differ from those of the conference paper so that the two can be distinguished in databases. Authors are asked to disclose the conference paper in their cover letter and include a statement on what has been changed compared to the original conference paper.

Sponsored by Technical Group on Information Sensing Technologies (IST),
the Institute of Image Information and Television Engineers (ITE)
Co-sponsored by International Image Sensor Society (IISS), Group of
Information Photonics (IPG) +CMOS Working Group, the Optical Society of
Japan, and innovative Photonics Evolution Research Center (iPERC)
[General Chair] Keiichiro Kagawa (Shizuoka Univ., Japan)
[Technical Program Committee (alphabetical order)]
Chih-Cheng Hsieh (National Tsing Hua Univ., Taiwan)
Keiichiro Kagawa (Shizuoka Univ., Japan)
Takashi Komuro (Saitama Univ., Japan)
De Xing Lioe (Shizuoka Univ., Japan)
Hajime Nagahara (Osaka Univ., Japan)
Atushi Ono (Shizuoka Univ., Japan)
Min-Woong Seo (Samsung Electronics, Korea)
Hiroyuki Suzuki (Gunma Univ., Japan)
Hisayuki Taruki (Toshiba Electronic Devices & Storage Corporation, Japan)
Franco Zappa (Politecnico di Milano, Italy)

Contact for any question about IWISS2022
(Keiichiro Kagawa, Shizuoka Univ., Japan)

Nikon 500mm f5.6E PF VR review

Cameralabs

Nikon's AF-S 500mm f5.6E PF VR was introduced in 2018. Does it still hold up against Nikon's newest long telephoto lenses for their Z system? Find out in my in-depth review!…
