Archives for June 2024

Canon develops high-performance materials for perovskite solar cells to substantially improve durability and mass-production stability

Newsroom | Canon Global        Go to the original article...

Go to the original article...

IISS updates its papers database

Image Sensors World        Go to the original article...

The International Image Sensor Society has a new and updated papers repository thanks to a multi-month overhaul effort.

  • 853 IISW workshop papers from the period 2007-2023 have been updated with DOIs (Digital Object Identifiers). Check out any of these papers in the IISS Online Library.
  • Each paper has a landing page containing metadata such as title, authors, year, keywords, references, and, of course, a link to the PDF.
  • As an extra service, we have also identified DOIs (where they exist) for the papers referenced in workshop papers. This makes it convenient to access a referenced paper by clicking its DOI directly from the landing page.
  • DOIs for pre-2007 workshop papers will be added later.

IISS website: https://imagesensors.org/

IISS Online Library: https://imagesensors.org/past-workshops-library/ 

Go to the original article...

Job Postings – Week of 16 June 2024

Image Sensors World        Go to the original article...

Meta

Sensor Architect, Reality Labs

Sunnyvale, California, USA

Link

Jenoptik

Imaging Engineer

Camberley, England, UK

Link

Omnivision

Automotive OEM Business Development Manager

Farmington Hills, Michigan, USA

Link

IMEC

R&D Project Leader 3D & Si Photonics

Leuven, Belgium

Link

Rivian

Sr. Staff Camera Validation and Integration Engineer

Palo Alto, California, USA

Link

CERN

Applied Physicist

Geneva, Switzerland

Link

Apple

Camera Image Sensor Analog Design Engineer

Austin, Texas, USA

Link

Göttingen University

PhD position in pixel detector development

Göttingen, Germany

Link

Federal University of Rio de Janeiro

Faculty position in Experimental Neutrino Physics

Rio de Janeiro, Brazil

Link


Go to the original article...

Paper on event cameras for automotive vision in Nature

Image Sensors World        Go to the original article...

In a recent open access Nature article titled "Low-latency automotive vision with event cameras", Daniel Gehrig and Davide Scaramuzza write:

The computer vision algorithms used currently in advanced driver assistance systems rely on image-based RGB cameras, leading to a critical bandwidth–latency trade-off for delivering safe driving experiences. To address this, event cameras have emerged as alternative vision sensors. Event cameras measure the changes in intensity asynchronously, offering high temporal resolution and sparsity, markedly reducing bandwidth and latency requirements. Despite these advantages, event-camera-based algorithms are either highly efficient but lag behind image-based ones in terms of accuracy or sacrifice the sparsity and efficiency of events to achieve comparable results. To overcome this, here we propose a hybrid event- and frame-based object detector that preserves the advantages of each modality and thus does not suffer from this trade-off. Our method exploits the high temporal resolution and sparsity of events and the rich but low temporal resolution information in standard images to generate efficient, high-rate object detections, reducing perceptual and computational latency. We show that the use of a 20 frames per second (fps) RGB camera plus an event camera can achieve the same latency as a 5,000-fps camera with the bandwidth of a 45-fps camera without compromising accuracy. Our approach paves the way for efficient and robust perception in edge-case scenarios by uncovering the potential of event cameras.
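The latency and bandwidth figures quoted in the abstract can be put in perspective with a quick back-of-envelope calculation. The resolution and bytes-per-pixel values below are illustrative assumptions, not figures from the paper:

```python
# Back-of-envelope trade-off for frame cameras: higher frame rates shrink
# the worst-case "blind time" between frames (latency) but inflate the
# raw data rate (bandwidth). Resolution/bit depth are illustrative only.

def frame_latency_ms(fps: float) -> float:
    """Worst-case perceptual latency of a frame camera: one frame interval."""
    return 1000.0 / fps

def frame_bandwidth_mb_s(fps: float, width: int = 1280, height: int = 720,
                         bytes_per_pixel: int = 3) -> float:
    """Raw data rate of an uncompressed frame stream, in MB/s."""
    return fps * width * height * bytes_per_pixel / 1e6

# A 5,000-fps camera reacts within 0.2 ms but produces an enormous stream;
# a 20-fps camera is frugal but blind for 50 ms between frames. The paper's
# hybrid setup claims 5,000-fps latency at roughly 45-fps bandwidth.
for fps in (20, 45, 5000):
    print(f"{fps:>5} fps: latency {frame_latency_ms(fps):8.2f} ms, "
          f"bandwidth {frame_bandwidth_mb_s(fps):10.1f} MB/s")
```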

Also covered in an Ars Technica article: "New camera design can ID threats faster, using less memory" https://arstechnica.com/science/2024/06/new-camera-design-can-id-threats-faster-using-less-memory/

 


Figure caption: a, Unlike frame-based sensors, event cameras do not suffer from the bandwidth–latency trade-off: high-speed cameras (top left) capture low-latency but high-bandwidth data, whereas low-speed cameras (bottom right) capture low-bandwidth but high-latency data. Instead, our 20 fps camera plus event camera hybrid setup (bottom left, red and blue dots in the yellow rectangle indicate event camera measurements) can capture low-latency and low-bandwidth data. This is equivalent in latency to a 5,000-fps camera and in bandwidth to a 45-fps camera. b, Application scenario. We leverage this setup for low-latency, low-bandwidth traffic participant detection (bottom row, green rectangles are detections) that enhances the safety of downstream systems compared with standard cameras (top and middle rows). c, 3D visualization of detections. To do so, our method uses events (red and blue dots) in the blind time between images to detect objects (green rectangle), before they become visible in the next image (red rectangle).

Figure caption: Our method processes dense images and asynchronous events (blue and red dots, top timeline) to produce high-rate object detections (green rectangles, bottom timeline). It shares features from a dense CNN running on low-rate images (blue arrows) to boost the performance of an asynchronous GNN running on events. The GNN processes each new event efficiently, reusing CNN features and sparsely updating GNN activations from previous steps.


 

Figure caption: a,b, Comparison of asynchronous, dense feedforward and dense recurrent methods, in terms of task performance (mAP) and computational complexity (MFLOPS per inserted event) on the purely event-based Gen1 detection dataset41 (a) and N-Caltech101 (ref. 42) (b). c, Results of DSEC-Detection. All methods on this benchmark use images and events and are tasked to predict labels 50 ms after the first image, using events. Methods with dagger symbol use directed voxel grid pooling. For a full table of results, see Extended Data Table 1.

Figure caption: a, Detection performance in terms of mAP for our method (cyan), baseline method Events + YOLOX (ref. 34) (blue) and image-based method YOLOX (ref. 34) with constant and linear extrapolation (yellow and brown). Grey lines correspond to inter-frame intervals of automotive cameras. b, Bandwidth requirements of these cameras, and our hybrid event + image camera setup. The red lines correspond to the median, and the box contains data between the first and third quartiles. The distance from the box edges to the whiskers measures 1.5 times the interquartile range. c, Bandwidth and performance comparison. For each frame rate (and resulting bandwidth), the worst-case (blue) and average (red) mAP is plotted. For frame-based methods, these lie on the grey line. The performance using the hybrid event + image camera setup is plotted as a red star (mean) and blue star (worst case). The black star points in the direction of the ideal performance–bandwidth trade-off.

Figure caption: The first column shows detections for the first image I0. The second column shows detections between images I0 and I1 using events. The third column shows detections for the second image I1. Detections of cars are shown by green rectangles, and of pedestrians by blue rectangles.


Go to the original article...

PIXEL2024 workshop

Image Sensors World        Go to the original article...

The Eleventh International Workshop on Semiconductor Pixel Detectors for Particles and Imaging (Pixel2024) will take place 18-22 November 2024 at the Collège Doctoral Européen, University of Strasbourg, France.


The workshop will cover various topics related to pixel detector technology. Development and applications will be discussed for charged particle tracking in high energy physics, nuclear physics, astrophysics, astronomy, biology, medical imaging and photon science. The conference program will also include reports on radiation effects, timing with pixel sensors, monolithic sensors, sensing materials, front and back end electronics, as well as interconnection and integration technologies toward detector systems.
All sessions are plenary, and the program also includes a poster session. Contributions will be selected from submitted abstracts.


Key deadlines:

  •  abstract submission: 5 July 2024
  •  early bird registration: 1 September 2024
  •  late registration: 30 September 2024

Abstract submission link: https://indico.in2p3.fr/event/32425/abstracts/ 



Go to the original article...

Himax invests in Obsidian thermal imagers

Image Sensors World        Go to the original article...

From GlobeNewswire: https://www.globenewswire.com/news-release/2024/05/29/2889639/8267/en/Himax-Announces-Strategic-Investment-in-Obsidian-Sensors-to-Revolutionize-Next-Gen-Thermal-Imagers.html

Himax Announces Strategic Investment in Obsidian Sensors to Revolutionize Next-Gen Thermal Imagers

TAINAN, Taiwan and SAN DIEGO, May 29, 2024 (GLOBE NEWSWIRE) -- Himax Technologies, Inc. (Nasdaq: HIMX) (“Himax” or “Company”), a leading supplier and fabless manufacturer of display drivers and other semiconductor products, today announced its strategic investment in Obsidian Sensors, Inc. ("Obsidian"), a San Diego-based thermal imaging sensor solution manufacturer. Himax's strategic investment in Obsidian Sensors, as the lead investor in Obsidian’s convertible note financing, was motivated by the potential of their proprietary and revolutionary high-resolution thermal sensors to dominate the market through low-cost, high-volume production capabilities. The investment amount was not disclosed. In addition to an ongoing engineering collaboration where Obsidian leverages Himax's IC design resources and know-how, the two companies also aim to combine the advantages of Himax’s WiseEye ultralow power AI processors with Obsidian’s high-resolution thermal imaging to create an advanced thermal vision solution. This would complement Himax's existing AI capabilities and ecosystem support, improving detection in challenging environments and boosting accuracy and reliability, thereby opening doors to a wide array of applications, including industrial, automotive safety and autonomy, and security systems. Obsidian’s proprietary thermal imaging camera solutions have already garnered attention in the industry, with notable existing investors including Qualcomm Ventures, Hyundai, Hyundai Mobis, SK Walden and Innolux.

Thermal imaging sensors offer unparalleled versatility, capable of detecting heat differences in total darkness, measuring temperature, and identifying distant objects. They are particularly well suited for a wide range of surveillance applications, especially in challenging and life-saving scenarios. Compared to prevailing thermal sensor solutions, which typically suffer from low resolution, high cost, and limited production volumes, Obsidian is revolutionizing the thermal imaging industry by producing high resolution thermal sensors with its proprietary Large Area MEMS Platform (“LAMP”), offering low-cost production at high volumes. With large glass substrates capable of producing sensors with superior resolution, VGA or higher, at volumes exceeding 100 million units per year, Obsidian is poised to drive the mass market adoption of this unrivaled technology across industries, including automotive, security, surveillance, drones, and more.

With accelerating interest in both the consumer and defense sectors, Obsidian’s groundbreaking thermal imaging sensor solutions are gaining traction in automotive applications and poised to play a pivotal role. The novel ADAS (Advanced Driver Assistance Systems) and AEB (Automatic Emergency Braking) system, integrated with Obsidian’s thermal sensors, significantly enable higher-resolution and clear vision in low-light and adverse weather conditions such as fog, smoke, rain, and snow, ensuring much better driving safety and security. This aligns perfectly with measures announced by the NHTSA (National Highway Traffic Safety Administration) on April 29, 2024, which issued its final rule mandating the implementation of AEB, including PAEB (Pedestrian AEB) that is effective at night, as a standard feature on all new cars beginning in 2029, recognizing pedestrian safety features as essential components rather than just luxury add-ons. This safety standard is expected to significantly reduce rear-end and pedestrian crashes. Traffic safety authorities in other countries are also following suit with similar regulations underscoring the trend and significant potential demand for thermal imaging sensors from Obsidian Sensors in the years to come.

 

A dangerous nighttime driving situation can be averted with a thermal camera
 

“We are pleased to begin our strategic partnership with Himax through this funding round and look forward to a fruitful collaboration to potentially merge our market leading thermal imaging sensor and camera technologies with Himax’s advanced ultralow power WiseEyeTM endpoint AI, leveraging each other's domain expertise. Furthermore, progress has been made in the engineering projects for mixed signal integrated circuits, leveraging Himax’s decades of experience in image processing. Given our disruptive cost and scale advantage, this partnership will enable us to better cater to the needs of the rapidly growing thermal imaging market,” said John Hong, CEO of Obsidian Sensors.

“We see great potential in Obsidian Sensors' revolutionary high-resolution thermal imaging sensor. Himax’s strategic investment in Obsidian further enhances our portfolio and expands our technology reach to cover thermal sensing which represents a great complement to our WiseEye technology, a world leading ultralow power image sensing AI total solution. Further, we see tremendous potential of Obsidian’s technology in the automotive sector where Himax already holds a dominant position in display semiconductors. We also anticipate additional synergies through expansion of our partnership with our combined strength and respective expertise driving future success,” said Mr. Jordan Wu, President and Chief Executive Officer of Himax.

Go to the original article...

Canon developing new RF-S7.8mm F4 STM DUAL lens for EOS R7 camera for recording spatial video for Apple Vision Pro

Newsroom | Canon Global        Go to the original article...

Go to the original article...

IEEE SENSORS 2024 Update from Dan McGrath

Image Sensors World        Go to the original article...

 

IEEE SENSORS 2024 Image Sensor Update

This is a follow-up to my earlier Image Sensors World post on how the image sensor program initiative for IEEE SENSORS 2024 is coming together. Two activities targeted at the image sensor community have been organized:

  • A full-day workshop on Sunday, 20 October, organized by Sozo Yokogawa of SONY and Erez Tadmor of onSemi, titled “From Imaging to Sensing: Latest and Future Trends of CMOS Image Sensors”. It includes speakers from Omnivision, onSemi, Samsung, Canon, SONY, Artilux, TechInsights and Shizuoka University.

  • A focus session on Monday afternoon, 21 October, organized by S-G Wuu of Brillnics, DN Yang of TSMC and John McCarten of L3Harris, on stacking in image sensors. It will lead with an invited speaker, and there is an opportunity for submitted presentations on any aspect of stacking. Those interested should submit an abstract to me at dmcgrath@ieee.org before 30 June. The selection process will be handled separately from the regular conference process.

This initiative is to encourage the image sensor community to give SENSORS the chance to prove itself a vibrant, interesting and welcoming home for the exchange of technical advances. It is part of the IEEE Sensors Council’s initiative to increase industrial participation across the council’s activities. Other events planned at SENSORS 2024 as part of this initiative are a session on standards and a full-day in-conference workshop on the human-machine interface. There will also be the opportunity for networking between industry and students.

Consider joining the Sensors Council – it is free if you are an IEEE member. Consider the mutual benefit of being in an organization and participating in a conference that shares more than just the name “sensors”. Our image sensor community is a leader in tackling the problems of capturing what goes on in the physical world, but there are also things that can be learned by our community from the cutting-edge work related to other sensors.

The general conference submission deadline is currently 11 June, but there is a proposal to extend it to 25 June. Check the website.

Looking forward to seeing you in Kobe.

Dan McGrath

TechInsights Inc.

Industrial Co-Chair, IEEE SENSORS 2024

AdCom member, IEEE Solid State Circuits Society & IEEE Sensor Council

dmcgrath@ieee.org

Go to the original article...

Conference List – September 2024

Image Sensors World        Go to the original article...

IEEE International Conference on Multisensor Fusion and Integration - 4-6 Sep 2024 - Pilsen, Czechia - Website

IEEE Sensors in Spotlight 2024 - 5 Sep 2024 - Boston, Massachusetts, USA - Website

Semi MEMS and Sensors Executive Conference - 7-9 Sep 2024 - Quebec, QC, Canada - Website

Sensor China Expo & Conference 2024 - 11-13 Sep 2024 - Shanghai, China - Website

SPIE Sensors + Imaging 2024 - 16-19 Sep 2024 - Edinburgh, Scotland, UK - Website

SPIE Photonics Industry Summit - 25 Sep 2024 - Washington, DC, USA - Website

21st International Conference on IC Design and Technology - 25-27 Sep 2024 - Singapore - Website

10th International Conference on Sensors and Electronic Instrumentation Advances - 25-27 Sep 2024 - Ibiza, Spain - Website

If you know about additional local conferences, please add them as comments.

Return to Conference List index

Go to the original article...

Canon RF 35mm f1.4L VCM review so far

Cameralabs        Go to the original article...

The Canon RF 35mm f1.4L VCM is a high-end wide-angle prime lens for EOS R cameras. Canon describes it as a hybrid lens, optimised for photo and video. Find out everything I know in my review so far!…

Go to the original article...

Canon enters recycling system business with innovative technology, promoting circular economy with high-speed, accurate plastic sorting equipment capable of measuring even black plastic waste

Newsroom | Canon Global        Go to the original article...

Go to the original article...

ID Quantique webinar: single photon detectors for quantum tech

Image Sensors World        Go to the original article...



In this webinar replay, we first explore the role of single-photon detectors in advancing quantum technologies, with a focus on superconducting nanowire single-photon detectors (SNSPDs) and the benefits they offer for quantum computing and high-speed quantum communication.

We then discuss the evolving needs of the field and describe IDQ’s user-focused detector solutions, including our innovative photon-number-resolving (PNR) SNSPDs and our new rack-mountable SNSPD system. We show real-world experiments that have already benefited from the outstanding performance of our detectors, including an enhanced heralded single-photon source and a high key-rate QKD implementation.

Finally, we conclude with our vision on the future of single-photon detection for quantum information and networking, and the exciting possibilities this can unlock.

Go to the original article...


Canon’s enforcement of its intellectual property right leads to the removal of toner cartridges, including the “TONER image technology” cartridge brand, from Coupang

Newsroom | Canon Global        Go to the original article...

Go to the original article...

ISSW 2024 this week in Trento, Italy

Image Sensors World        Go to the original article...

The 2024 International SPAD Sensor Workshop is happening this week in Trento, Italy. Full program is available here: https://issw2024.fbk.eu/program

Talks:



Posters:

Go to the original article...

DB HiTek global shutter and SPAD

Image Sensors World        Go to the original article...

From PR Newswire: https://www.prnewswire.com/news-releases/db-hitek-advances-global-shutter-and-spad-302157652.html

DB HiTek Advances Global Shutter and SPAD


SEOUL, South Korea, June 3, 2024 /PRNewswire/ -- DB HiTek, a leading foundry specialist in South Korea, is enhancing its global shutter and single-photon avalanche diode (SPAD) process technologies, which are highly utilized in the automotive, industrial, robotics, and medical fields, to expand its specialized image sensor business.

A global shutter sensor exposes all pixels simultaneously, so it captures images of fast-moving objects without distortion. The demand for global shutters is rapidly increasing in various fields, including machine vision, automotive, drones, robotics, and medical devices, with an expected annual average market growth rate of 16% from 2022 to 2029.
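As a rough illustration of the distortion a global shutter avoids: a rolling shutter exposes rows sequentially, so a moving subject shifts between the first and last row's exposure. A minimal sketch with illustrative numbers (not DB HiTek specifications):

```python
# Rolling-shutter skew: rows are read out one after another, so an object
# moving horizontally is displaced by (total readout time) x (speed).
# A global shutter exposes every row at the same instant, so skew is zero.

def rolling_shutter_skew_px(rows: int, line_time_us: float,
                            speed_px_per_s: float) -> float:
    """Horizontal skew (pixels) accumulated between the first and last row."""
    readout_s = rows * line_time_us * 1e-6
    return speed_px_per_s * readout_s

# Illustrative: 1080 rows read at 10 us/line, subject moving 2000 px/s.
skew = rolling_shutter_skew_px(1080, 10.0, 2000.0)
print(f"rolling-shutter skew = {skew:.1f} px")  # global shutter: 0 px
```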

DB HiTek's 7 Tr charge domain global shutter achieves PLS ≥35,000 at 5.6 μm pixels using light shield and light guide technologies and supports various sizes down to a minimum of 2.8 μm pixels (PLS ≥10,000).

Parasitic light sensitivity (PLS) measures how well the in-pixel storage node is shielded from stray light. A PLS of 10,000 or higher corresponds to a shutter efficiency of 99.99%: less than one part in 10,000 of incident light leaks into the storage node.
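The PLS-to-shutter-efficiency relationship quoted in the press release can be checked with a few lines of Python:

```python
# Shutter efficiency implied by a given PLS ratio: a PLS of N means
# roughly 1/N of incident light leaks into the storage node, so the
# shutter blocks a fraction (1 - 1/N) of parasitic light.

def shutter_efficiency_pct(pls: float) -> float:
    """Percentage of parasitic light rejected for a given PLS ratio."""
    return 100.0 * (1.0 - 1.0 / pls)

print(shutter_efficiency_pct(10_000))  # 99.99
print(shutter_efficiency_pct(35_000))  # ~99.997 (the 5.6 um pixel spec)
```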

DB HiTek's 6 Tr charge domain global shutter has secured PLS ≥10,000 and a memory dark current ≤20 e-/s at 60 °C in the 2.8 μm pixel. This process is expected to be completed and provided to customers by the end of this year.

SPAD is an ultra-high-sensitivity 3D image sensor that detects weak light signals at the particle level. It has high precision and allows for long-distance measurement, making it a key component in implementing future advanced technologies such as autonomous vehicles, AR/VR devices, robotics, and smartphones.

DB HiTek's second-generation SPAD process, utilizing a backside scattering technology (BST) and backside deep trench isolation (BDTI) in a BSI structure, achieves an advanced technological level with a photon detection probability of 15.8% at a wavelength of 940 nm. In addition, it ensures improved quality by securing a dark count rate (DCR) of 0.69 cps/μm², corresponding to the dark current level of a typical CIS.

Building on the upgraded global shutter and second-generation SPAD process, DB HiTek plans to actively support fabless customers in expanding their specialized image sensor business.

A DB HiTek official said, "Currently, our company is collaborating with leading global companies in the United States, Europe, China, Japan, and other regions to develop products," adding, "We plan to enhance customer support by providing services such as customized processes, TDK for pixel development simulations, as well as multi-layer mask (MLM)."

Meanwhile, DB HiTek has recently expanded its X-ray CIS business by successfully developing products in collaboration with a leading medical sensor specialist in Europe. The advanced quality and yield characteristics have reportedly drawn a positive response from the customer, and the company plans to expand this business from the medical field into the manufacturing sector.

Go to the original article...

Job Postings – Week of 2 June 2024

Image Sensors World        Go to the original article...

Vantage MedTech

FPGA Engineer (Video)

Moonachie, New Jersey, USA

Link

Brookhaven National Laboratory

Deputy Director – Instrumentation Division

Upton, New York, USA

Link

onsemi

Process Integration/Device Technology Development Engineer

Gresham, Oregon, USA

Link

Boeing

Senior Engineer, Infrared, Optical and Opto-Mechanical Sensor Products

Huntington Beach, California, USA

Link

onsemi

2024 New College Graduate (NCG)

Seremban, Negeri Sembilan, Malaysia

Link

Siegen University

Postdoc position for pixel detectors

Siegen, Germany

Link

onsemi

Image Test Algorithm Developer

Nampa, Idaho, USA

Link

University of Edinburgh

PhD Studentship - Adaptive Sensor Fusion for Optimised 3D Sensing

Edinburgh, Scotland, UK

Link

European Spallation Source

Entry Level Detector Scientist - Beam Monitors

Lund, Sweden

Link

Go to the original article...
