Fly Vision vs Human Vision


BBC publishes an article "Why is it so hard to swat a fly?" comparing human vision with fly vision:

"...have a look at a clock with a ticking hand. As a human, you see the clock ticking at a particular speed. But for a turtle it would appear to be ticking at twice that speed. For most fly species, each tick would drag by about four times more slowly. In effect, the speed of time differs depending on your species.
This happens because animals see the world around them like a continuous video. But in reality, they piece together images sent from the eyes to the brain in distinct flashes a set number of times per second. Humans average 60 flashes per second, turtles 15, and flies 250.
"

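The "speed of time" comparison follows directly from the flicker rates quoted above. A minimal sketch in Python, using only the numbers from the BBC quote:

# Flicker rates quoted in the BBC article (flashes per second)
human_rate = 60
fly_rate = 250

# A fly samples the scene fly_rate / human_rate times more often, so one
# clock tick spans that many more "subjective frames" for the fly.
slowdown = fly_rate / human_rate
print(f"A fly sees ~{slowdown:.1f}x more frames per tick, "
      f"so each tick drags roughly four times more slowly")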

Intel Project Alloy Cancelled


SlashGear: Intel's work on the Project Alloy "Merged Reality" headset, featuring a RealSense 3D camera, has been stopped. In a statement to RoadToVR, Intel says:

"Intel has made the decision to wind down its Project Alloy reference design, however we will continue to invest in the development of technologies to power next-generation AR/VR experiences. This includes: Movidius for visual processing, Intel RealSense depth sensing and six degrees of freedom (6DoF) solutions, and other enabling technologies..."


TechInsights Unveils iPhone 8 Plus Camera Surprises


TechInsights was quick to unveil a few findings from its iPhone 8 Plus reverse engineering:

The dual rear camera uses a 1.22um pixel in the 12MP wide-angle sensor and a 1.0um pixel in the 12MP telephoto sensor. The 7MP front camera has a 1.0um pixel.

"The dual camera module size is 21.0 mm x 10.6 mm x 6.3 mm thick. Based on our initial X-rays it appears the wide-angle camera uses optical image stabilization (OIS), while the telephoto camera does not (the same configuration as iPhone 7 Plus).

The wide-angle Sony CIS has a die size of 6.29 mm x 5.21 mm (32.8 mm2). This compares to a 32.3 mm2 die size for iPhone 7’s wide-angle CIS.

We do note a new Phase Pixel pattern, but the big news is the absence of surface artifacts corresponding to the through silicon via (TSV) arrays we’ve seen for a few years. A superficial review of the die photo would suggest it’s a regular back-illuminated (BSI) chip. However, we’ve confirmed it’s a stacked (Exmor RS) chip which means hybrid bonding is in use for the first time in an Apple camera!
"

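As a rough cross-check of the numbers above, the active pixel array accounts for only part of the 32.8 mm2 die. A small sketch, assuming a 4:3 12MP array of 4032 x 3024 pixels (the exact resolution is an assumption, not something stated in the TechInsights excerpt):

# Assumed 12MP 4:3 resolution -- not stated in the TechInsights excerpt
cols, rows = 4032, 3024
pitch_um = 1.22                 # wide-angle pixel pitch reported above

array_w_mm = cols * pitch_um / 1000.0
array_h_mm = rows * pitch_um / 1000.0
array_area = array_w_mm * array_h_mm
die_area = 6.29 * 5.21          # reported die size, mm^2

print(f"pixel array: {array_w_mm:.2f} x {array_h_mm:.2f} mm = {array_area:.1f} mm^2")
print(f"die: {die_area:.1f} mm^2, array fraction: {array_area / die_area:.0%}")

The remainder of the die area goes to peripheral readout circuitry and pads.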

Cameras with Black Silicon Sensors Reach the Market


It came to my attention that a number of Japanese camera companies have started selling cameras with SiOnyx Black Silicon sensors. One of these companies is Bitran, with its CS-64NIR cooled camera based on the XQE-0920 sensor. The company publishes a presentation with application examples for the new camera, which is sensitive up to 1200nm.


Another company is ACH2 Technologies, selling the ACH100-NIR camera, which is said to be sensitive up to 1400nm:


Yet another company is Artray, with two cameras: the 1.3MP ARTCAM-130XQE-WOM and the 0.92MP ARTCAM-092XQE-WOM.



It's very nice to see a new, radically different technology finally reaching the market.
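
For context on why extended NIR response in a silicon sensor is noteworthy: the photon energies at these wavelengths sit at or below the ~1.12 eV bandgap of crystalline silicon, which normally cuts absorption off near 1100nm. A quick check using the standard E ≈ 1240/λ relation (textbook physics, not vendor data):

# Photon energy in eV is roughly 1240 / wavelength_in_nm
for wavelength_nm in (1100, 1200, 1400):
    energy_ev = 1240.0 / wavelength_nm
    print(f"{wavelength_nm} nm -> {energy_ev:.2f} eV photon energy")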


Tractica Forecasts Rise of Enterprise AR


Tractica posts an "Augmented Reality: The Rise of Enterprise Use Cases" article on its website. A few interesting statements:

"Smart glasses that replace or complement the desktop likely face a long-haul journey. There are a number of technical issues to overcome, including FOV, weight, ergonomics and comfort, and extended AR use.

The momentum for smart AR glasses has shifted toward mixed reality (MR) headsets, which offer a much more compelling user experience, using 3D depth sensing and positional tracking to immerse the user into a holographic world. Microsoft HoloLens is the first truly capable MR headset and is seeing rapid momentum in terms of trials and pilots. There are still questions about whether or not Microsoft has oversold the capabilities of the device, and if the enterprise market can scale to make it a commercially viable product. Tractica expects that Microsoft is likely to be committed to the enterprise market at least through the end of 2018 before it readies the HoloLens for consumer launch. If the pilots do not convert into meaningful volumes, Microsoft could find itself in an awkward place like Google did with Glass, eventually pulling the plug.

Tractica estimates that the monthly active users (MAUs) for smartphone/tablet enterprise AR will be 49 million by the end of 2022. In contrast, the installed base of enterprise smart glasses users at the end of 2022 will be approximately 19 to 21 million.
"


Espros Presents its Pulsed ToF Solution


Espros CEO Beat De Coi presents the first results of his company's pulsed ToF (pToF) chip at AutoSens 2017 in Brussels, Belgium. Reported performance includes a QE of 70% at 905nm, a sensitivity trigger level as low as 20 e- for object detection, 250MHz CCD sampling, and interpolation algorithms to reach centimeter accuracy. The sensors are said to operate in full sunlight without disturbance and to function in all weather conditions. The presentation is available for download from the Espros site.
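
A pulsed ToF sensor converts echo delay into distance via d = c·t/2, so the 250MHz sampling rate fixes the raw range bin that the interpolation then refines. A back-of-the-envelope sketch in Python (the interpolation factor below is illustrative, not an Espros specification):

C = 299_792_458.0          # speed of light, m/s
f_sample = 250e6           # CCD sampling rate quoted in the talk, Hz

t_bin = 1.0 / f_sample                 # 4 ns per raw sample
raw_bin_m = C * t_bin / 2.0            # round trip folded into one-way distance
print(f"raw range bin: {raw_bin_m:.2f} m")          # ~0.60 m

# Sub-bin interpolation is what brings this to centimeter level, e.g.
# resolving the pulse position to 1/60 of a bin (illustrative factor):
print(f"with 1/60-bin interpolation: {raw_bin_m / 60 * 100:.1f} cm")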

Beat De Coi says: «This new generation of pulsed time-of-flight sensors will show a performance that will boost the autonomous driving effort. I have been working on time-of-flight technology for 30 years and I am extremely proud that we reached this level with conventional silicon.»

A few slides from the presentation explain the operation of the new Espros chip:


Auger Excitation Shows APD-like Gains


A group of UCSD researchers publishes an open-access Applied Physics Letters paper, "An amorphous silicon photodiode with 2 THz gain‐bandwidth product based on cycling excitation process," by Lujiang Yan, Yugang Yu, Alex Ce Zhang, David Hall, Iftikhar Ahmad Niaz, Mohammad Abu Raihan Miah, Yu-Hsin Liu, and Yu-Hwa Lo. The paper proposes an APD-magnitude gain mechanism based on a 30nm-thick amorphous Si film deposited on top of the bulk silicon:


"APDs have relatively high excess noise, a limited gain-bandwidth product, and high operation voltage, presenting a need for alternative signal amplification mechanisms of superior properties. As an amplification mechanism, the cycling excitation process (CEP) was recently reported in a silicon p-n junction with subtle control and balance of the impurity levels and profiles. Realizing that CEP effect depends on Auger excitation involving localized states, we made the counter intuitive hypothesis that disordered materials, such as amorphous silicon, with their abundant localized states, can produce strong CEP effects with high gain and speed at low noise, despite their extremely low mobility and large number of defects. Here, we demonstrate an amorphous silicon low noise photodiode with gain-bandwidth product of over 2 THz, based on a very simple structure."

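Since a gain-bandwidth product is simply gain x bandwidth, the 2 THz figure can be read as a trade-off curve. A small sketch with illustrative gain values (these are not operating points from the paper):

GBP_HZ = 2e12              # reported gain-bandwidth product, Hz

for gain in (100, 1_000, 10_000):      # illustrative gains
    bandwidth_ghz = GBP_HZ / gain / 1e9
    print(f"gain {gain:>6}: available bandwidth ~{bandwidth_ghz:.1f} GHz")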

Yole on iPhone X 3D Innovations


Yole Développement publishes its analysis of the iPhone X 3D camera design and its implications, "Apple iPhone X: unlocking the next decade with a revolution":


"The infrared camera, proximity ToF detector and flood illuminator seem to be treated as a single block unit. This is supplied by STMicroelectronics, along with Himax for the illuminator subsystem, and Philips Photonics and Finisar for the infrared-light vertical-cavity surface-emitting laser (VCSEL). Then, on the right hand of the speaker, the regular front-facing camera is probably supplied by Cowell, and the sensor chip by Sony. On the far right, the “dot pattern projector” is from ams subsidiary Heptagon... It combines a VCSEL, probably from Lumentum or Princeton Optronics, a wafer level lens and a diffractive optical element (DOE) able to project 30,000 dots of infrared light.

The next step forward should be full ToF array cameras. According to the roadmap Yole has published this should happen before 2020.
"

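A structured-light system of this kind (an IR dot projector plus an IR camera) recovers depth from the lateral shift of each projected dot, which is essentially projector-to-camera triangulation: Z = f·B / disparity. A minimal sketch with made-up focal length and baseline, since Apple has not disclosed these parameters:

# Illustrative parameters only -- Apple has not published the real ones.
focal_px = 600.0        # IR camera focal length, in pixels (assumed)
baseline_m = 0.025      # projector-to-camera baseline, in meters (assumed)

def depth_from_disparity(disparity_px):
    """Triangulated depth of one projected dot."""
    return focal_px * baseline_m / disparity_px

for d_px in (10.0, 30.0, 60.0):         # observed dot shifts, in pixels
    print(f"disparity {d_px:5.1f} px -> depth {depth_from_disparity(d_px):.2f} m")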

Luminar on Automotive LiDAR Progress


OSA publishes a digest of Luminar CTO Jason Eichenholz's talk at the 2017 Frontiers in Optics meeting. A few quotes:

"Surprisingly, however, despite this safety imperative, Eichenholz pointed out that the lidar system used (for example) in Uber’s 2017 self-driving demo has essentially the same technical specifications as the system of the winning vehicle in DARPA’s 2007 autonomous-vehicle grand challenge. “In ten years,” he said, “you have not seen a dramatic improvement in lidar systems to enable fully autonomous driving. There’s been so much progress in computation, so much in machine vision … and yet the technology for the main set of eyes for these cars hasn’t evolved.”

On the requirements side, the array of demands is sobering. They include, of course, a bevy of specific requirements: a 200-m range, to give the vehicle passenger a minimum of seven seconds of reaction time in case of an emergency; laser eye safety; the ability to capture millions of points per second and maintain a 10-fps frame rate; and the ability to handle fog and other unclear conditions.

But Eichenholz also stressed that an autonomous vehicle on the road operates in a “target-rich” environment, with hundreds of other autonomous vehicles shooting out their own laser signals. That environment, he said, creates huge challenges of background noise and interference. And he noted some of the same issues with supply chain, cost control, and zero error tolerance.

Eichenholz outlined some of the approaches and technical steps that Luminar has adopted in its path to meet those many requirements in autonomous-vehicle lidar. One step, he said, was the choice of a 1550-nm, InGaAs laser, which allows both eye safety and a good photon budget. Another was the use of an InGaAs linear avalanche photodiode detector rather than single-photon counting, and scanning the laser signal for field coverage rather than using a detector array. The latter two decisions, he said, substantially reduce problems of background noise and interference. “This is a huge part of our architecture.”"
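
The 200-m/seven-second requirement and the point-rate figures are easy to sanity-check from the numbers in the digest:

RANGE_M = 200.0
REACTION_S = 7.0
speed_ms = RANGE_M / REACTION_S
print(f"200 m at 7 s of reaction time -> {speed_ms:.1f} m/s "
      f"({speed_ms * 3.6:.0f} km/h) closing speed")       # ~29 m/s, ~103 km/h

points_per_s = 2_000_000    # "millions of points per second" (illustrative value)
frame_rate = 10             # required frame rate, fps
print(f"{points_per_s // frame_rate:,} points per frame at {frame_rate} fps")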


Wired UK publishes a video interview with Luminar CEO Austin Russell:


Functional Safety in Automotive Image Sensors


ON Semi publishes a webinar on Evaluating Functional Safety in Automotive Image Sensors:


Exvision High-Speed Image Sensor-Based Gesture Control


Exvision, a spin-off from the University of Tokyo's Ishikawa-Watanabe Laboratory, demos gesture control from far away, based on a high-speed image sensor (currently a 120fps Sony IMX208):




SensL Demos 100m LiDAR Range


SensL publishes a demo video of a 100m LiDAR based on its 1 x 16 photomultiplier imager scanned over a 5 x 80 deg field of view:


3D Camera Use Cases


Occipital publishes a few videos on 3D camera use cases:




OmniVision Announces Automotive Reference Design


PRNewswire: OmniVision announces an automotive reference design system (ARDS) that allows automotive imaging-system and software developers to mix and match image sensors, ISPs and long-distance serializer modules.

The imaging-system industry is anticipating significant growth in ADAS, including surround-view and rear-view camera systems. NCAP mandates that all new vehicles in the U.S. be equipped with rear-view cameras by 2018. Surround-view systems (SVS) are also expected to become an even more popular feature in the luxury-vehicle segment within the same timeframe. SVSs typically require at least four cameras to provide a 360-degree view.

OmniVision's ARDS demo kits feature OmniVision's 1080p60 OV2775 image sensor, an optional OV495 ISP, and a serializer camera module. The OV2775 is built on a 2.8um OmniBSI-2 Deep Well pixel with a 16-bit linear output from a single exposure.
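
A 16-bit linear output from a single exposure implies the intra-scene dynamic range that can be encoded without multi-exposure stitching. A quick calculation (the dB figure follows from the bit depth alone, not from an OmniVision datasheet number):

import math

bits = 16
dr_db = 20 * math.log10(2 ** bits)
print(f"{bits}-bit linear range -> up to {dr_db:.0f} dB dynamic range")   # ~96 dB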


Samsung to Start Mass Production of 1000fps 3-Layer Sensor


ETNews reports that Samsung follows in Sony's footsteps to develop its own 1000fps image sensor for smartphones:

"Samsung Electronics is going to start mass-producing ‘3-layered image sensor’ in November. This image sensor is made into a layered structure by connecting a system semiconductor (logic chip) that is in charge of calculations and DRAM chip that can temporarily store data through TSV (Through Silicon Via) technology. Samsung Electronics currently ordered special equipment for mass-production and is going to start mass-producing ‘3-layered image sensor’ after doing pilot operation in next month.

SONY established a batch process system that attaches a sensor, a DRAM chip, and a logic chip in a unit of a wafer. On the other hand, it is understood that Samsung Electronics is using a method that makes 2-layered structure with a sensor and a logic chip and attaches DRAM through TC (Thermal Compression) bonding method after flipping over a wafer. From productivity and production cost, SONY has an upper hand. It seems that a reason why Samsung Electronics decided to use its way is because it wanted to avoid using other patents.
"

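The reason a DRAM layer is needed for 1000fps bursts is bandwidth: during a burst the readout rate far exceeds what can be streamed off-chip in real time, so frames are parked in the stacked DRAM. A rough sizing sketch under assumed resolution, bit depth and burst length (none of these figures are disclosed by Samsung):

# Assumed burst parameters -- not disclosed by Samsung, purely illustrative.
width, height = 1920, 1080     # burst resolution (assumption)
bits_per_px = 10               # raw bit depth (assumption)
fps = 1000
burst_s = 0.05                 # length of the slow-motion burst (assumption)

frame_bits = width * height * bits_per_px
rate_gbps = frame_bits * fps / 1e9
buffer_mb = frame_bits * fps * burst_s / 8 / 1e6
print(f"readout during burst: ~{rate_gbps:.0f} Gbit/s")
print(f"a {burst_s * 1000:.0f} ms burst needs ~{buffer_mb:.0f} MB of on-chip DRAM")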

Turkish Startup Demos CMOS Night Vision


Ankara, Turkey-based PiKSELiM demos the low-light sensitivity of its 640x512 CMOS sensor operating in global shutter mode at 10fps, using f/0.95 C-mount security camera optics:


Magic Leap Valuation to Grow to $6B


Bloomberg reports that AR headset startup Magic Leap is in the process of raising a new financing round of more than $500M at a valuation close to $6B. The company has already raised more than $1.3B in previous rounds, valuing it at $4.5B.

"According to people familiar with the company’s plans, the headset device will cost between $1,500 and $2,000, although that could change. Magic Leap hopes to ship its first device to a small group of users within six months, according to three people familiar with its plans."


Haitong Securities Forecasts $992.5B Market for Smartphones with 3D Sensing in 2020


InstantFlashNews quotes a number of Chinese-language sources saying that Haitong Securities analysts forecast global sales of smartphones equipped with 3D sensors to reach $992.5B in 2020. Sales of smartphones with a front structured-light camera are expected to account for $667.8B, while sales of smartphones with a rear ToF camera will take $324.7B.

Haitong Securities estimates the iPhone X 3D structured-light component cost at ~$15, with the 3D image sensor at ~$3, the TX component at ~$7, the RX at ~$3, and the system module at about $2.
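
The quoted breakdowns are at least internally consistent, as a quick check shows:

# Market split (in $B) and iPhone X structured-light component costs (in $)
front_sl_b, rear_tof_b = 667.8, 324.7
parts = {"3D image sensor": 3, "TX": 7, "RX": 3, "system module": 2}

print(f"front + rear = ${front_sl_b + rear_tof_b:.1f}B (quoted total: $992.5B)")
print(f"component sum = ${sum(parts.values())} (quoted cost: ~$15)")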


Image Sensors in AR/VR Devices


Citibank publishes a nice market report on Augmented and Virtual Reality, dated October 2016. The report emphasizes the large image sensing content in almost all AR/VR devices:


Espros Keeps Improving its ToF Sensors


Espros' September 2017 newsletter updates on the company's progress with its ToF solutions:

"A real breakthrough was achieved in the field of camera calibration. Our initial goal was to simply find the optimum procedure to calibrate a DME660 camera. The result however is a revolutionary finding, that not only includes the compensation algorithm but also a simple desktop hardware for distance calibration.

No need any more for large target screens and moving stages! Simply put your camera in a shoebox sized flat field setup and calibrate the full distance range with help of the on-chip DLL stage. Done!
"

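The idea behind the "shoebox" setup is that the on-chip DLL can electronically sweep the emitter-to-sampling delay, so a flat target at one fixed distance can emulate targets across the whole range. A hedged sketch of how a per-pixel distance-error (DRNU) table might then be built and applied; the function names and data layout are illustrative and not the Espros API:

import numpy as np

C = 299_792_458.0   # speed of light, m/s

def build_drnu_lut(grab_frame, dll_delays_ns, flat_field_dist_m):
    """Build a per-pixel distance-error table from a DLL sweep.

    grab_frame(delay_ns) -> HxW array of measured distances; a hypothetical
    stand-in for capturing a DME660 frame at one DLL setting. Each DLL step
    adds an electronic round-trip delay, emulating a target c * delay / 2
    further away than the physical flat field. dll_delays_ns must be sorted
    in ascending order.
    """
    emulated = flat_field_dist_m + C * np.asarray(dll_delays_ns) * 1e-9 / 2.0
    errors = np.stack([grab_frame(d) - e
                       for d, e in zip(dll_delays_ns, emulated)])
    return emulated, errors            # errors shape: (steps, H, W)

def correct_pixel(measured_m, emulated, errors, row, col):
    """Correct one pixel reading by interpolating its error curve."""
    return measured_m - np.interp(measured_m, emulated, errors[:, row, col])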

"You won't recognize our epc660 flagship QVGA imager in version 007! Improved ADC performance, 28% higher sensitivity, as well as low distance response non-uniformity (DRNU) of a few centimeters only (uncalibrated). We took 3 rounds (versions 004-006) in the fab transfer process and did not let go before we got it right."

The company also presents preliminary data on its ToFCam 635 module:


iPhone X 3D Camera Cost Estimated at 6% of BOM


GSMArena and MyFixGuide quote the Chinese site ICHunt.com estimating the Apple iPhone X 3D camera components cost at $25 out of a total BOM of $412.75:
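
The 6% headline figure follows directly from the quoted numbers:

camera_cost, total_bom = 25.0, 412.75
print(f"3D camera share of BOM: {camera_cost / total_bom:.1%}")    # ~6.1%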


Digitimes on iPhone X Influence on the Industry


Digitimes believes that the iPhone X's "new features... such as 3D sensing are likely to become new standards for next-generation smartphones launched by Android-based smartphone vendors. The demand for 3D sensor modules is likely to experience an explosive growth in 2018-2019. Major players in the Android camp, including Samsung and Huawei, certainly will jump onto the bandwagon."

Meanwhile, smaller Android phone makers have jumped on this bandwagon even faster. The Doogee Mix 2 already offers face authentication based on its front 3D stereo camera:


IR Sensor Consumes No Power Until a Specific Wake-up Scene is Detected


IEEE Spectrum, DARPA: Researchers from Northeastern University, Boston, MA, publish a Nature Photonics paper, "Zero-power infrared digitizers based on plasmonically enhanced micromechanical photoswitches," by Zhenyun Qian, Sungho Kang, Vageeswar Rajaram, Cristian Cassella, Nicol McGruer & Matteo Rinaldi.

"It consists of a tiny, micromechanical switch that controls the connection to a battery. Only when the switch is activated by the infrared radiation does it move to close the gap between itself and its battery, triggering the wake-up signal.

“The switch contacts are supported by beams made out of a two-material stack. When the temperature of this structure increases, one material expands more than the other, and therefore the beams bend,” Rinaldi explains. That bending allows the switch to make contact with the battery and spit out a signal.
"


“What is really interesting about the Northeastern IR sensor technology is that, unlike conventional sensors, it consumes zero stand-by power when the IR wavelengths to be detected are not present,” said Troy Olsson, manager of the N-ZERO Program in DARPA’s Microsystems Technology Office. “When those IR wavelengths are present and impinge on the Northeastern team’s IR sensor, the energy from the IR source heats the sensing elements which, in turn, causes physical movement of key sensor components. These motions result in the mechanical closing of otherwise open circuit elements, thereby leading to signals that the target IR signature has been detected.”

“The technology features multiple sensing elements—each tuned to absorb a specific IR wavelength,” Olsson noted. “Together, these combine into complex logic circuits capable of analyzing IR spectrums, which opens the way for these sensors to not only detect IR energy in the environment but to specify if that energy derives from a fire, vehicle, person or some other IR source.”
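
The "complex logic circuits" described above amount to combining several wavelength-tuned mechanical switches so that only a matching spectral signature closes the wake-up path. A toy illustration of that idea; the wavelength bands and signatures below are invented for the example, not taken from the paper:

# Each plasmonically tuned switch closes (True) only when enough IR power
# falls within its absorption band. Bands and signatures are illustrative.
SIGNATURES = {
    "vehicle": {"4.2um": True, "9.5um": False},
    "person":  {"4.2um": False, "9.5um": True},
}

def wake_up(switch_states):
    """Return the detected source, or None (the sensor keeps drawing no power)."""
    for source, pattern in SIGNATURES.items():
        if all(switch_states.get(band) == closed
               for band, closed in pattern.items()):
            return source
    return None

print(wake_up({"4.2um": True, "9.5um": False}))     # -> "vehicle"
print(wake_up({"4.2um": False, "9.5um": False}))    # -> None (stays asleep)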


40 Years in Imaging


Albert Theuwissen writes about his early CCD projects in the 1970s and what was considered the cutting edge in imaging at that time.


Yole Thoughts on iPhone X 3D Camera


EETimes' Junko Yoshida interviews Pierre Cambou, activity leader for imaging and sensors at Yole Développement:

"Cambou acknowledged that he was surprised to see the solution “way more complex than initially envisioned.” Building blocks inside the iPhone X, designed to enable Apple’s TrueDepth camera, include a structured light transmitter, a structure light receiver on the front camera and a time-of- flight/proximity sensor. Cambou said, “Apple managed to have so many technologies, and players behind those technologies, to work together for a very impressive result.”

Cambou said, “Well done indeed, if they were able to do such complex assembly.”

The Yole analyst suspects that STMicroelectronics is supplying the infrared camera and the proximity sensor. Apple might have sourced the front camera and the dot projector from AMS, he added.

While admitting that Apple isn’t, after all, using “ST’s SPAD imager as I dreamed” in the iPhone X, Cambou conceded, “Apple combined admirably all the available technologies.”"


Automotive LiDAR Market Overview


Semiconductor Engineering publishes an article, "LiDAR Market Continues To Percolate." A few quotes:

"It’s too early to tell how market share for automotive LiDAR is shaping up, as the bigger vendors are still working to make sensors cost-efficient for use in advanced driver-assistance systems and automated driving.

Market research firms are issuing hockey-stick analyses on the LiDAR market’s potential growth. Grand View Research forecasts the worldwide automotive LiDAR market will be worth $223.2 million by 2024.

BIS Research estimates the automotive LiDAR market was worth $65 million last year. It will show double-digit compound annual growth over the next decade, according to the firm.
"

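Treating the two firms' figures as rough endpoints purely for illustration (they come from different studies, and "last year" is taken to mean 2016), the implied growth rate is indeed double-digit:

start_value, start_year = 65e6, 2016      # BIS Research estimate
end_value, end_year = 223.2e6, 2024       # Grand View Research forecast

cagr = (end_value / start_value) ** (1 / (end_year - start_year)) - 1
print(f"implied CAGR {start_year}-{end_year}: {cagr:.1%}")    # ~16.7%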

PhaseOne Trichromatic MF Sensor


Working closely with Sony, Phase One introduces the IQ3 101MP Trichromatic medium format digital back. The company says that with "a new CMOS sensor and Bayer Filter color technology, available only through Phase One, we have given the photographer 101-megapixels of creative possibility in never-before possible color definition." It is said to be capable of replicating, closer than ever, the color definition that the human eye sees.

"Designed around the concept of mimicking the dynamic color response of the human eye, we have physically customized the Color Bayer Filter on the 101-megapixel sensor to tailor the color response. This allows the Digital Back to capture color in a new way, unlike anything else.

The Phase One Trichromatic Philosophy is a promise that where color and quality can expand, while others may be satisfied with what they have, Phase One will always strive for perfection.
"

Not much more info has been released about the new image sensor:


PMD and SensibleVision Present 3D Face Authentication Solution for Smartphones


PMD and SensibleVision announced a technology partnership to create a modern, mobile 3D facial recognition platform.

“With our leading 3D facial authentication solution, all handset makers can now transform the way people access and interact with their devices – and keep pace with or even move ahead of Apple,” says George Brostoff, co-founder and CEO of SensibleVision. “The quality of the data from the pmd ToF technology is amazing. Combining our 3D recognition with the 3D sensors allows perfect operation in the brightest sunlight and the darkest rooms with amazing speed and accuracy.”

“The combination of all the partners’ skills leads to an astonishingly small, robust, fast and effective 3D authentication solution for mobile devices. As Apple seems to predefine the future of innovative authentication solutions, we’re thrilled to enable OEMs with pmd depth sensors and SensibleVision’s 3DVerify solution for rapid and efficient integration into their devices,” says Bernd Buxbaum, founding CEO of pmdtechnologies.


More Details on iPhone X 3D Camera


From the Apple iPhone X official video:



Apple iPhone X Official Details


Mashable: Apple officially unveils its iPhone X featuring a "TrueDepth camera system" based on structured light, and Face ID unlock. A double tap on the side button is necessary to activate the Face ID system. The chance of unlocking with a wrong face is said to be 1 in 1,000,000:

