Image Sensors Europe Interviews

Image Sensors World

Image Sensors Europe, to be held in mid-March 2018 in London, UK, has published a number of interviews ahead of the conference.

Ian Riches, Global Automotive Practice at Strategy Analytics:

Q: What do you see as the most significant changes coming up in vision systems development and their applications within automotive in the next 12-24 months?

A:
a) Much more use of machine vision in the currently largely “dumb” applications of park assist and surround view.
b) Camera resolutions markedly increasing
c) A lot more in-cabin sensing

Albert Theuwissen, Founder of Harvest Imaging:

There are several reasons why I am convinced that monolithic CMOS imagers are superior to hybrid imagers:

  • Hybrid imagers are always based on 3T pixel structures, while monolithic imagers can make use of 4T pixels, whose transfer gate enables correlated double sampling. This results in lower noise for the latter.
  • The dark current, dark current non-uniformities, and isolated hot-pixel count are always better for monolithic silicon.
  • Monolithic silicon has improved considerably in its response to near-IR light, to the point that it shows better quantum efficiency (QE) in the near-IR than most hybrid imagers.
  • Monolithic silicon has a better signal-to-noise ratio than hybrid imagers.

These statements are valid within the wavelength range of monolithic silicon (visible spectrum up to 1.1 µm). Outside this range, the story can be completely different.
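To illustrate the noise advantage of the 4T pixel mentioned above, here is a minimal Python sketch (not from the interview; all noise figures are illustrative) of correlated double sampling: the reset (kTC) noise is sampled before charge transfer and subtracted from the signal sample, so only the much smaller signal-chain read noise remains. A 3T pixel cannot re-sample its reset level after integration, so the kTC noise adds to every read.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
signal = 1000.0        # photo-charge in electrons (illustrative)
sigma_reset = 30.0     # kTC reset noise, e- rms (illustrative)
sigma_read = 3.0       # signal-chain read noise, e- rms (illustrative)

# 3T pixel: the reset level cannot be re-sampled after integration,
# so the kTC noise adds directly to each read.
reads_3t = signal + rng.normal(0, sigma_reset, n) + rng.normal(0, sigma_read, n)

# 4T pixel with CDS: sample the reset level, transfer the charge, sample
# again; the same kTC noise appears in both samples and cancels in the
# difference, leaving only sqrt(2) * sigma_read.
ktc = rng.normal(0, sigma_reset, n)
reset_sample = ktc + rng.normal(0, sigma_read, n)
signal_sample = ktc + signal + rng.normal(0, sigma_read, n)
reads_4t = signal_sample - reset_sample

print(f"3T read noise:     {reads_3t.std():.1f} e- rms")
print(f"4T/CDS read noise: {reads_4t.std():.1f} e- rms")
```

The simulated 3T noise lands near the 30 e- kTC floor, while the CDS output is close to sqrt(2) × 3 ≈ 4.2 e-, roughly the order-of-magnitude gap that makes true CDS attractive.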

Q: What would you say are the 3 biggest game changers that will soon hit the image sensors industry, and how can we prepare for them?

A:
  1. The rise of stacked imager technology. More companies outside the image-sensor industry are following this trend, for example companies that have, or will have, signal-processing chips that can be stacked with imagers. At the same time, stacking technology is quickly moving to the pixel level, which will result in ultra-fast devices with a huge amount of parallel processing capability on the (stacked) chip.
  2. The use of near-IR information will not only add more features to cameras but also introduce new applications, e.g. face recognition in mobile phones, distance measurement, etc.
  3. Imagers are no longer used mainly for making beautiful images; more and more applications are being created that use image sensors for entirely different functions. Examples are time-of-flight applications, autofocus pixels, use in autonomous driving cars, etc.
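The time-of-flight applications mentioned above rest on a simple relation: a light pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A short Python sketch (illustrative, not from the article):

```python
# Direct time-of-flight ranging: distance = c * round_trip_time / 2
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the target from a measured round-trip pulse time."""
    return C * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m:
print(f"{tof_distance_m(10e-9):.2f} m")
```

The same relation shows why ToF sensors need picosecond-class timing: resolving 1.5 mm of depth requires measuring the round trip to about 10 ps.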

