Post Syndicated from Michael Dumiak original https://spectrum.ieee.org/tech-talk/sensors/imagers/super-fast-cameras-capture-light-pulse-midflight
The footage is something between the ghostlike and commonplace: a tiny laser pulse tracing through a box pattern, as if moving through a neon storefront sign. But this pulse is more than meets the eye, more than a pattern for the eye to follow, and there’s no good metaphor for it. It is quite simply and starkly a complete burst of light—a pulse train with both a front and a back—captured in a still image, mid-flight.
Electrical engineers and optics experts at the Advanced Quantum Architecture (AQUA) Laboratory in Neuchâtel, Switzerland, made this footage last year, mid-pandemic, using a single-photon avalanche diode (SPAD) camera. Its solid-state photodetectors are capable of very high-precision measurements of time and therefore distance, even as its single pixels are struck by individual light particles, or photons.
Edoardo Charbon and his colleagues at the Swiss Ecole Polytechnique Fédérale de Lausanne, working with camera maker Canon, developed a megapixel SPAD array in late 2019 for a small camera they called Mega X. It resolves one million pixels and processes this information very quickly.
The Charbon-led AQUA group—at that point comprising nanotechnology prizewinner Kazuhiro Morimoto, Ming-Lo Wu, and Andrei Ardelean—synchronized Mega X to a femtosecond laser. They fire the laser through an aerosol of water vapor, made using dry ice procured at a party shop.
The photons hit the water droplets, and some of them scatter toward the camera. With its megapixel sensor, the camera can capture 24,000 frames per second with exposure times as short as 3.8 nanoseconds.
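To get a feel for these numbers, here is a back-of-envelope sketch (not from the article, just arithmetic on the figures it quotes) of how far a light pulse travels during one exposure, and how much time separates successive frames:

```python
# Back-of-envelope arithmetic on the Mega X figures quoted above.
C = 299_792_458  # speed of light in vacuum, m/s

exposure_s = 3.8e-9   # 3.8 ns exposure time (from the article)
frame_rate = 24_000   # 24,000 frames per second (from the article)

# Distance a pulse covers during a single exposure:
pulse_travel_m = C * exposure_s

# Time between the start of one frame and the next:
frame_period_us = 1e6 / frame_rate

print(f"Light travels ~{pulse_travel_m:.2f} m per 3.8 ns exposure")
print(f"Each frame period is ~{frame_period_us:.1f} microseconds")
```

So within a single exposure the pulse moves roughly a meter, which is why a pulse train can appear frozen mid-flight across a tabletop scene.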
The sort of photodetectors used in Mega X have been in development for several decades. Indeed, SPAD imagers can be found in smartphone cameras, industrial robots, and lab spectroscopy. The kind of footage caught in the Neuchâtel lab is called light-in-flight. MIT’s Ramesh Raskar in 2010 and 2011 used a streak camera—a kind of camera used in chemistry applications—to produce 2D images and build films of light in flight in what he called femtophotography.
The development of light-in-flight imaging goes back at least to the late 1970s, when Nils Abramson at Stockholm’s Royal Institute of Technology used holography to record light wavefronts. Genevieve Gariepy, Daniele Faccio and other researchers then in Edinburgh in 2015 used a SPAD to show footage of light in flight. The first light-in-flight images took many hours to construct, but the Mega X camera can do it in a few minutes.
Martin Zurek, a freelance photographer and consultant in Bavaria (he does 3D laser scanning and drone measurements for architectural and landscape projects), recalls many long hours working on femtosecond spectroscopy for his physics doctorate in Munich in the late 1990s. When Zurek watches the AQUA light footage, he’s impressed by the resolution in location and time, which opens up dimensions for the image. “The most interesting thing is how short you can make light pulses,” he says. “You’re observing something fundamental, a fundamental physical property or process. That should astound us every day.”
A potential use for megapixel SPAD cameras would be for faster and more detailed light-ranging 3D scans of landscapes and large objects in mapping, facsimile production, and image recording. For example, megapixel SPAD technology could greatly enhance LiDAR imaging and modeling of the kind done by the Factum Foundation in their studies of the tomb of the Egyptian Pharaoh Seti I.
Faccio, now at the University of Glasgow, is using SPAD imaging to obtain better time-resolved fluorescence microscopy images of cell metabolism. As with Raskar’s MIT group, Faccio hopes to apply the technology to human body imaging.
The AQUA researchers were able to observe an astrophysical effect in their footage called superluminal motion, which is an illusion akin to the Doppler effect. Only in this case light appears to the observer to speed up, which it can’t really do, already traveling as it does at the speed of light.
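The illusion can be quantified with the standard textbook formula for apparent superluminal motion (this is the general astrophysical relation, not the AQUA team’s own analysis): a source moving at speed v at angle θ to the line of sight appears to cross the field of view at v·sin(θ) / (1 − (v/c)·cos(θ)), which exceeds c when v is near c and θ is small.

```python
import math

C = 299_792_458  # speed of light in vacuum, m/s

def apparent_speed(v: float, theta_deg: float) -> float:
    """Apparent transverse speed of a source moving at speed v,
    at angle theta to the line of sight (standard superluminal-
    motion formula): v*sin(theta) / (1 - (v/c)*cos(theta))."""
    theta = math.radians(theta_deg)
    return v * math.sin(theta) / (1 - (v / C) * math.cos(theta))

# A pulse moving at nearly c, angled 10 degrees from the line of
# sight, *appears* to sweep across the scene faster than light:
ratio = apparent_speed(0.9999 * C, 10.0) / C
print(f"apparent speed = {ratio:.1f} c")
```

The effect is purely geometric: light scattered from the nearer end of the pulse’s path reaches the camera sooner, compressing the apparent timeline.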
Charbon’s thoughts are more earthbound. “This is just like a conventional camera, except that in every single pixel you can see every single photon,” he says. “That blew me away. It’s why I got into this research in the first place.”