Tag Archives: sensors

Ultra-quick Cameras Reveal How to Catch a Light Pulse Mid-Flight

Post Syndicated from Michael Dumiak original https://spectrum.ieee.org/tech-talk/sensors/imagers/super-fast-cameras-capture-light-pulse-midflight

The footage is something between the ghostlike and commonplace: a tiny laser pulse tracing through a box pattern, as if moving through a neon storefront sign. But this pulse is more than meets the eye, more than a pattern for the eye to follow, and there’s no good metaphor for it. It is quite simply and starkly a complete burst of light—a pulse train with both a front and a back—captured in a still image, mid-flight.

Electrical engineers and optics experts at the Advanced Quantum Architecture (AQUA) Laboratory in Neuchâtel, Switzerland, made this footage last year, mid-pandemic, using a single-photon avalanche diode camera, or SPAD. Its solid-state photodetectors are capable of very high-precision measurements of time and therefore distance, even as its single pixels are struck by individual light particles, or photons.

Edoardo Charbon and his colleagues at the Swiss Ecole Polytechnique Fédérale de Lausanne, working with camera maker Canon, were able in late 2019 to develop a SPAD array at megapixel size in a small camera they called Mega X. It can resolve one million pixels and process this information very quickly. 

The Charbon-led AQUA group—at that point comprising nanotechnology prizewinner Kazuhiro Morimoto, Ming-Lo Wu and Andrei Ardelean—synchronized Mega X to a femtosecond laser. They fired the laser through an aerosol of water vapor, made using dry ice procured at a party shop.

The photons hit the water droplets, and some of them scattered toward the camera. With a sensor able to realize megapixel resolution, the camera can capture 24,000 frames per second with exposure times as short as 3.8 nanoseconds.
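
For a rough sense of scale (a back-of-the-envelope sketch of our own, not a calculation from the AQUA team), the snippet below works out how far light travels during one of those 3.8-nanosecond exposures and between consecutive frames:

```python
# Back-of-the-envelope scales for light-in-flight imaging (illustrative only).
C = 299_792_458.0  # speed of light in vacuum, m/s

exposure_s = 3.8e-9      # shortest exposure time quoted for Mega X, 3.8 ns
frame_rate_hz = 24_000   # frames per second quoted for Mega X

path_per_exposure_m = C * exposure_s      # distance a pulse covers within one exposure
path_between_frames_m = C / frame_rate_hz

print(f"Light path during one 3.8 ns exposure: {path_per_exposure_m:.2f} m")
print(f"Light path between consecutive frames: {path_between_frames_m/1000:.1f} km")
```

The roughly one-meter path per exposure is what makes it possible to freeze a pulse within a tabletop scene; between frames the light covers kilometers, which is presumably why the camera is synchronized to the repeating femtosecond laser rather than following a single pulse.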

The sort of photodetectors used in Mega X have been in development for several decades. Indeed, SPAD imagers can be found in smartphone cameras, industrial robots, and lab spectroscopy. The kind of footage caught in the Neuchâtel lab is called light-in-flight. MIT’s Ramesh Raskar in 2010 and 2011 used a streak camera—an instrument more commonly found in chemistry applications—to produce 2D images and build films of light in flight in what he called femtophotography.

The development of light-in-flight imaging goes back at least to the late 1970s, when Nils Abramson at Stockholm’s Royal Institute of Technology used holography to record light wavefronts. Genevieve Gariepy, Daniele Faccio and other researchers then in Edinburgh in 2015 used a SPAD to show footage of light in flight. The first light-in-flight images took many hours to construct, but the Mega X camera can do it in a few minutes. 

Martin Zurek, a freelance photographer and consultant in Bavaria (he does 3D laser scanning and drone measurements for architectural and landscape projects) recalls many long hours working on femtosecond spectroscopy for his physics doctorate in Munich in the late 1990s. When Zurek watches the AQUA light footage, he’s impressed by the resolution in location and time, which opens up dimensions for the image. “The most interesting thing is how short you can make light pulses,” he says. “You’re observing something fundamental, a fundamental physical property or process. That should astound us every day.”

A potential use for megapixel SPAD cameras would be for faster and more detailed light-ranging 3D scans of landscapes and large objects in mapping, facsimile production, and image recording. For example, megapixel SPAD technology could greatly enhance LiDAR imaging and modeling of the kind done by the Factum Foundation in their studies of the tomb of the Egyptian Pharaoh Seti I.

Faccio, now at the University of Glasgow, is using SPAD imaging to obtain better time-resolved fluorescence microscopy images of cell metabolism. As with Raskar’s MIT group, Faccio hopes to apply the technology to human body imaging.

The AQUA researchers were able to observe an astrophysical effect in their footage called superluminal motion, which is an illusion akin to the Doppler effect. Only in this case light appears to the observer to speed up—which it can’t really do, already traveling as it does at the speed of light.

Charbon’s thoughts are more earthbound. “This is just like a conventional camera, except that in every single pixel you can see every single photon,” he says. “That blew me away. It’s why I got into this research in the first place.”

Atomically Precise Sensors Could Detect Another Earth

Post Syndicated from Jason Wright original https://spectrum.ieee.org/sensors/imagers/atomically-precise-sensors-could-detect-another-earth

Gazing into the dark and seemingly endless night sky above, people have long wondered: Is there another world like ours out there, somewhere? Thanks to new sensors that we and other astronomers are developing, our generation may be the first to get an affirmative answer—and the earliest hints of another Earth could come as soon as this year.

Astronomers have discovered thousands of exoplanets so far, and almost two dozen of them are roughly the size of our own planet and orbiting within the so-called habitable zone of their stars, where water might exist on the surface in liquid form. But none of those planets has been confirmed to be rocky, like Earth, and to circle a star like the sun. Still, there is every reason to expect that astronomers will yet detect such a planet in a nearby portion of the galaxy.

So why haven’t they found it yet? We can sum up the difficulty in three words: resolution, contrast, and mass. Imagine trying to spot Earth from hundreds of light-years away. You would need a giant telescope to resolve such a tiny blue dot sitting a mere 150 million kilometers (0.000016 light-year) from the sun. And Earth, at less than a billionth the brightness of the sun, would be hopelessly lost in the glare.

For the same reasons, current observatories—and even the next generation of space telescopes now being built—have no chance of snapping a photo of our exoplanetary twin. Even when such imaging does become possible, it will allow us to measure an exoplanet’s size and orbital period, but not its mass, without which we cannot determine whether it is rocky like Earth or largely gaseous like Jupiter.

For all these reasons, astronomers will detect another Earth first by exploiting a tool that’s been used to detect exoplanets since the mid-1990s: spectroscopy. The tug of an orbiting exoplanet makes its star “wobble” ever so slightly, inducing a correspondingly slight and slow shift in the spectrum of the star’s light. For a doppelgänger of Earth, that fluctuation is so subtle that we have to look for a displacement measuring just a few atoms across in the patterns of the rainbow of starlight falling on a silicon detector in the heart of a superstable vacuum chamber.

Our group at Pennsylvania State University, part of a team funded by NASA and the U.S. National Science Foundation (NSF), has constructed an instrument called NEID (short for NN-explore Exoplanet Investigations with Doppler spectroscopy), for deployment in Arizona to search the skies of the Northern Hemisphere. (The name NEID—pronounced “NOO-id”—derives from an American Indian word meaning “to see.”) A Yale University team has built a second instrument, called EXPRES (for EXtreme PREcision Spectrometer), which will also operate in Arizona. Meanwhile, a third team led by the University of Geneva and the European Southern Observatory is commissioning an instrument called ESPRESSO (Echelle SPectrograph for Rocky Exoplanet and Stable Spectroscopic Observations), in Chile, to hunt in the southern skies.

All three groups are integrating Nobel Prize–winning technologies in novel ways to create cutting-edge digital spectrographs of femtometer-scale precision. Their shared goal is to achieve the most precise measurements of stellar motions ever attempted. The race to discover the elusive “Earth 2.0” is afoot.

Using spectroscopy to pick up an exoplanet-induced wobble is an old idea, and the basic principle is easy to understand. As Earth circles the sun in a huge orbit, it pulls the sun, which is more than 300,000 times as massive, into a far smaller “counter orbit,” in which the sun moves at a mere 10 centimeters per second, about the speed of a box turtle.

Even such slow motions cause the wavelengths of sunlight to shift—by less than one part per billion—toward shorter, bluer wavelengths in the direction of its motion and toward longer, redder wavelengths in the other direction. Exoplanets induce similar Doppler shifts in the light from their host stars. The heavier the planet, the bigger the Doppler shift, making the planet easier to detect.
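
The arithmetic behind that part-per-billion figure is straightforward. The sketch below (illustrative numbers, not NEID data) evaluates the non-relativistic Doppler relation Δλ/λ = v/c for a 10-centimeter-per-second stellar wobble:

```python
# Size of the Doppler signal from an Earth-like planet (illustrative arithmetic).
C = 299_792_458.0   # speed of light, m/s
v_star = 0.10       # reflex velocity of a Sun-like star tugged by an Earth twin, m/s

fractional_shift = v_star / C            # delta_lambda / lambda = v/c for v << c
wavelength_nm = 550.0                    # a representative visible wavelength
shift_nm = wavelength_nm * fractional_shift

print(f"Fractional shift: {fractional_shift:.2e}")     # ~3.3e-10, under one part per billion
print(f"Shift of a 550 nm line: {shift_nm:.2e} nm")    # roughly 1.8e-7 nm
```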

In 1992, radio astronomers observing odd objects called pulsars used the timing of the radio signals they emit to infer that they hosted exoplanets. In 1995, Michel Mayor and Didier Queloz of the University of Geneva were using the Doppler wobbles in the light from ordinary stars to search for orbiting companions and found a giant planet orbiting the star 51 Pegasi every four days.

This unexpected discovery, for which Mayor and Queloz won the 2019 Nobel Prize in Physics, opened the floodgates. The “Doppler wobble” method has since been used to detect more than 800 exoplanets, virtually all of them heavier than Earth.

So far, almost all of the known exoplanets that are the size and mass of Earth or smaller were discovered using a different technique. As these planets orbit, they fortuitously pass between their star and our telescopes, causing the apparent brightness of the star to dim very slightly for a time. Such transits, as astronomers call them, give away the planet’s diameter—and together with its wobble they will reveal the planet’s mass.
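
On the transit side, the fraction of starlight blocked scales with the square of the planet-to-star radius ratio. A quick illustration using textbook values (our own arithmetic, not from the article):

```python
# Transit depth for an Earth-size planet crossing a Sun-like star (rough illustration).
R_EARTH_KM = 6_371.0
R_SUN_KM = 695_700.0

depth = (R_EARTH_KM / R_SUN_KM) ** 2   # fraction of the star's light blocked
print(f"Transit depth: {depth:.1e} (~{depth*1e6:.0f} parts per million)")
# The star dims by less than 0.01 percent, and the depth encodes the planet's
# diameter relative to the star's, which is how a transit gives away the diameter.
```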

Since the first transiting exoplanet was discovered in 1999, the Kepler space telescope and Transiting Exoplanet Survey Satellite (TESS) have found thousands more, and that number continues to grow. Most of these bodies are unlike any in our solar system. Some are giants, like Jupiter, orbiting so close to their host stars that they make the day side of Mercury look chilly. Others are gassy planets like Neptune or Uranus but much smaller, or rocky planets like Earth but much larger.

Through statistical analysis of the bounteous results, astronomers infer that our own Milky Way galaxy must be home to billions of rocky planets like Earth that orbit inside the habitable zones of their host stars. A study published recently by scientists who worked on the Kepler mission estimated that at least 7 percent—and probably more like half—of all sunlike stars in the Milky Way host potentially habitable worlds. That is why we are so confident that, with sufficiently sensitive instruments, it should be possible to detect other Earths.

But to do so, astronomers will have to measure the mass, size, and orbital period of the planet, which means measuring both its Doppler wobble and its transit. The trove of data from Kepler and TESS—much of it yet to be analyzed—has the transit angle covered. Now it’s up to us and other astronomers to deliver higher-precision measurements of Doppler wobbles.

NEID, our contestant in the race to find Earth 2.0, improves the precision of Doppler velocimetry by making advances on two fronts simultaneously: greater stability and better calibration. Our design builds on work done in the 2000s by European astronomers who built the High Accuracy Radial Velocity Planet Searcher (HARPS), which controlled temperature, vibration, and pressure so well that it faithfully tracked the subtle Doppler shifts of a star’s light over years without sacrificing precision.

In 2010, we and other experts in this field gathered at Penn State to compare notes and discuss our aspirations for the future. HARPS had been producing blockbuster discoveries since 2002; U.S. efforts seemed to be lagging behind. The most recent decadal review by the U.S. National Academy of Sciences had recommended an “aggressive program” to regain the lead.

Astronomers at the workshop drafted a letter to NASA and the NSF that led the two agencies to commission a new spectrograph, to be installed at the 3.5-meter WIYN telescope at Kitt Peak National Observatory, in Arizona. (“WIYN” is an acronym derived from the names of the four founding institutions.) NEID would be built at Penn State by an international consortium.

Like HARPS before it, NEID combines new technologies with novel methods to achieve unprecedented stability during long-term observations. Any tiny change in the instrument—whether in temperature or pressure, in the way starlight illuminates its optics, or even in the nearly imperceptible expansion of the instrument’s aluminum structure as it ages—can cause the dispersed starlight to creep across the detector. That could look indistinguishable from a Doppler shift and corrupt our observations. Because the wobbles we aim to measure show up as shifts of mere femtometers on the sensor—comparable to the size of the atoms that make up the detector—we have to hold everything incredibly steady.

That starts with exquisite thermal control of the instrument, which is as big as a car. It has required us to design cutting-edge digital detectors and sophisticated laser systems, tied to atomic clocks. These systems act as the ruler by which to measure the amplitudes of the stellar wobbles.

Yet we know we cannot build a perfectly stable instrument. So we rely on calibration to take us the rest of the way. Previous instruments used specialized lamps containing thorium or uranium, which give off light at hundreds of distinct, well-known wavelengths—yardsticks for calibrating the spectral detectors. But these lamps change with age.

NEID instead relies on a laser frequency comb: a Nobel Prize–winning technology that generates laser light at hundreds of evenly spaced frequency peaks. The precision of the comb, limited only by our best atomic clocks, is better than a few parts per quadrillion. By using the comb to regularly calibrate the instrument, we can account for any residual instabilities.
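
Conceptually, every “tooth” of the comb sits at an optical frequency f_n = f_ceo + n·f_rep, with both the offset frequency and the repetition rate locked to an atomic clock. The sketch below lists a few teeth for assumed comb parameters (the values are illustrative; NEID’s actual comb settings are not given here):

```python
# Sketch of the wavelength "ruler" a laser frequency comb provides (illustrative values).
C = 299_792_458.0   # speed of light, m/s

f_rep = 25e9        # assumed repetition rate: one comb tooth every 25 GHz
f_ceo = 5e9         # assumed carrier-envelope offset frequency

# A few comb teeth near 550 nm (~545 THz) and the wavelengths they pin down:
for n in range(21_790, 21_795):
    f_n = f_ceo + n * f_rep
    print(f"n={n}: {f_n/1e12:.4f} THz  ->  {C/f_n*1e9:.6f} nm")
```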

Somewhere out there, another Earth is orbiting its star, which wobbles slowly in sympathy around their shared center of gravity. Over the course of a year or so, the star’s spectrum shifts ever so slightly toward the red, then toward the blue. As that starlight reaches Kitt Peak, the telescope brings it to a sharp focus at the tip of a glass optical fiber, which guides the light down into the bowels of the observatory through a conduit into a custom-designed room housing the NEID instrument.

The focus spot must strike the fiber in exactly the same way with every observation; any variations can translate into slightly different illumination patterns on the detector, mimicking stellar Doppler shifts. The quarter-century-old WIYN telescope was never designed for such work, so to ensure consistency our colleagues at the University of Wisconsin–Madison built an interface that monitors the focus and position of the stellar image hundreds of times a second and rapidly adjusts to hold the focal point steady.

Down in the insulated ground-floor room, a powerful heating-and-cooling system keeps a vacuum chamber bathed in air at a constant temperature of 20 °C. NEID’s spectrograph components, held at less than one-millionth of the standard atmospheric pressure inside the vacuum chamber, are protected from even tiny changes in pressure by a combination of pumps and “getters”—including 2 liters of cryogenic charcoal—to which stray gas molecules stick.

A network of more than 75 high-precision thermometers actively controls 30 heaters placed around the instrument’s outer surface to compensate for any unexpected thermal fluctuations. With no moving parts or motors that might generate heat and disrupt this delicate thermal balance, the system is able to hold NEID’s core optical components to exactly 26.850 °C, plus or minus 0.001 °C.
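
The heaters work as a conventional feedback loop: each local temperature reading nudges the nearby heater’s power up or down around the setpoint. A minimal sketch of that idea follows (purely illustrative; the article does not describe NEID’s actual control law or gains):

```python
# Minimal proportional-control sketch of heater trimming around a 26.850 C setpoint.
SETPOINT_C = 26.850
GAIN_W_PER_C = 5.0     # assumed proportional gain
NOMINAL_W = 1.0        # assumed steady heater power

def heater_power(measured_c: float) -> float:
    """More power when the local reading is below the setpoint, less when above."""
    error = SETPOINT_C - measured_c
    return max(0.0, NOMINAL_W + GAIN_W_PER_C * error)

for reading in (26.8493, 26.8500, 26.8509):   # example thermometer readings
    print(f"{reading:.4f} C -> {heater_power(reading):.3f} W")
```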

As the starlight emerges from the optical fiber, it strikes a parabolic collimating mirror and then a reflective diffraction grating that is 800 millimeters long. The grating splits the light into more than 100 thin colored lines. A large glass prism and a system of four lenses then spread the lines across a silicon-based, 80-megapixel charge-coupled device. Within that CCD sensor, photons from the distant star are converted into electrons that are counted, one at a time, by supersensitive amplifiers.

Heat from the amplifiers makes the detector itself expand and contract in ways that are nearly impossible to calibrate. We had to devise some way to keep that operating heat as constant and manageable as possible.

Electrons within the CCD accumulate in pixels, which are actually micrometer-size wells of electric potential, created by voltages applied to minuscule electrodes on one side of the silicon substrate. The usual approach is to hold these voltages constant while the shutter is open and the instrument collects stellar photons. At the end of the observation, manipulation of the voltages shuffles collected electrons over to the readout amplifiers. But such a technique generates enough heat—a few hundredths of a watt—to cripple a system like NEID.

Our team devised an alternative operating method that prevents this problem. We manipulate the CCD voltages during the collection of stellar photons, jiggling the pixels slightly without sacrificing their integrity. The result is a constant heat load from the detector rather than transient pulses.

Minor imperfections in the CCD detector present us with a separate engineering challenge. State-of-the-art commercial CCDs have high pixel counts, low noise, and impeccable light sensitivity. These sensors nevertheless contain tiny variations in the sizes and locations of the millions of individual pixels and in how efficiently electrons move across the detector array. Those subtle flaws induce smearing effects that can be substantially larger than the femtometer-size Doppler shifts we aim to detect. The atomic ruler provided by the laser frequency comb allows us to address this issue by measuring the imperfections of our sensor—and appropriately calibrating its output—at an unprecedented level of precision.

We also have to correct for the motion of the telescope itself as it whirls around Earth’s axis and around the sun at tens of kilometers per second—motion that creates apparent Doppler shifts hundreds of thousands of times as great as the ~10 cm/s wobbles caused by an exoplanet. Fortunately, NASA’s Jet Propulsion Laboratory has for decades been measuring Earth’s velocity through space to much better precision than we can measure it with our spectrograph. Our software combines that data with sensitive measurements of Earth’s rotation to correct for the motion of the telescope.
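
As a sketch of what that correction looks like in practice, the snippet below uses the astropy library (an assumption on our part; the article does not name NEID’s software stack) to compute the barycentric velocity correction for 51 Pegasi as seen from Kitt Peak at an arbitrary example epoch:

```python
# Hedged sketch: barycentric radial-velocity correction with astropy
# (astropy is assumed here; it is not necessarily what the NEID pipeline uses).
import astropy.units as u
from astropy.coordinates import SkyCoord, EarthLocation
from astropy.time import Time

# Approximate location of the WIYN telescope on Kitt Peak (illustrative values)
kitt_peak = EarthLocation.from_geodetic(lon=-111.60 * u.deg, lat=31.96 * u.deg,
                                        height=2096 * u.m)

star = SkyCoord(ra="22h57m28s", dec="+20d46m08s", frame="icrs")  # 51 Pegasi (approx.)
obstime = Time("2020-01-15T03:00:00", scale="utc")               # arbitrary example epoch

# Apparent Doppler shift contributed by Earth's orbital and rotational motion;
# this is removed before looking for the ~10 cm/s planetary signal.
v_corr = star.radial_velocity_correction(kind="barycentric",
                                         obstime=obstime, location=kitt_peak)
print(v_corr.to(u.km / u.s))   # tens of km/s, versus the ~10 cm/s planet signal
```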

In January 2020, NEID achieved its “first light” with observations of 51 Pegasi. Since then, we have refined the calibration and tuning of the instrument, which in recent months has approached our design target of measuring Doppler wobbles as slow as 33 cm/s—a big step toward the ultimate goal of 10-cm/s sensitivity. With commissioning underway, the instrument is now regularly measuring some of the nearest, brightest sunlike stars for planets.

Meanwhile, in Chile’s Atacama Desert, the most sophisticated instrument of this kind yet built, ESPRESSO, has been analyzing Doppler shifts in starlight gathered and combined from the four 8.2-meter telescopes of the Very Large Telescope. ESPRESSO is a technological marvel that achieves unprecedentedly precise observations of very faint and distant stars by using two arms that house separate optics and CCD detectors optimized for the red and blue halves of the visible spectrum.

ESPRESSO has already demonstrated measurements that are precise to about 25 cm/s, with still better sensitivity expected as the multiyear commissioning process continues. Astronomers have used the instrument to confirm the existence of a supersize rocky planet in a tight orbit around Proxima Centauri, the star nearest the sun, and to observe the extremely hot planet WASP-76b, where molten iron presumably rains from the skies.

At the 4.3-meter Lowell Discovery Telescope near Flagstaff, Ariz., EXPRES has recently demonstrated its own ability to measure stellar motions to much better than 50 cm/s. Several other Doppler velocimeters, to be installed on other telescopes, are either in the final stages of construction or are coming on line to join the effort. The variety of designs will help astronomers observe stars of differing kinds and magnitudes.

Since that 2010 meeting, our community of instrument builders has quickly learned which design solutions work well and which have turned out to be more trouble than they are worth. Perhaps when we convene at the next such meeting in California, one group will have already found Earth 2.0. There’ll be no telling, though, whether it harbors alien life—and if so, whether the aliens have also spied us.

This article appears in the March 2021 print issue as “How We’ll Find Another Earth.”

About the Author

Jason T. Wright is a professor of astronomy and astrophysics at Pennsylvania State University. Cullen Blake is an associate professor in the department of physics and astronomy at the University of Pennsylvania.

Ultrasonic Holograms: Who Knew Acoustics Could Go 3D?

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/sensors/imagers/acoustic-holograms-and-3d-ultrasound

Although its origins trace back to 1985, acoustic holography as a field has been hampered by rampant noise created by widespread reflection and scattering. (Lasers tamp this problem down somewhat for conventional optical holograms; ultrasound as yet offers no such technological quick fix.) 

But in a study published last month in IEEE Sensors Journal, a group of researchers reports an improved method for creating acoustic holograms. While the advance won’t lead to treatment with acoustic holograms in the immediate future, the improved technique yields higher sensitivity and a better focusing effect than previous acoustic hologram methods.

There are a number of intriguing possibilities that come with manipulation using sound, including medical applications. Ultrasound can penetrate human tissues and is already used for medical imaging. But more precise manipulation and imaging of human tissues using 3D holographic ultrasound could lead to completely new therapies—including targeted neuromodulation using sound.

The nature of sound itself poses the first hurdle to be overcome. “The medical application of acoustic holograms is limited owing to the sound reflection and scattering at the acoustic holographic surface and its internal attenuation,” explains Chunlong Fei, an associate professor at Xidian University who is involved in the study.

To address these issues, his team created their acoustic hologram via a “lens” consisting of a disc with a hole at its center. They placed a 1 MHz ultrasound transducer in water and used the outer part of the transducer surface to create the hologram. Because of the hole in the center of the lens, the center of the transducer can generate and receive sound waves with less reflection and scattering.

Next, the researchers compared their new disc approach to more conventional acoustic hologram techniques. They performed this A vs. B comparison via ultrasound holographic images of several thin needles, 1.25 millimeters in diameter or less.

“The most notable feature of the hole-hologram we proposed is that it has high sensitivity and maintains good focusing effect [thanks to the] holographic lens,” says Fei. He notes that these features will lead to less scattering and propagation loss than what occurs with traditional acoustic holograms.

Fei envisions several different ways in which this approach could one day be applied medically, for example by complementing existing medical imaging probes to achieve better resolution, or for applications such as nerve regulation or non-invasive brain stimulation. However, the current setup, which operates in water, would need to be modified to suit a medical setting, and several next steps remain related to characterizing and manipulating the hologram, says Fei.

The varied design improvements Fei’s team hopes to develop match the equally eclectic possible applications of ultrasonic hologram technology. In the future, Fei says they hope acoustic holographic devices might achieve super-resolution imaging, particle trapping, selective biological tissue heating—and even find new applications in personalized medicine.

3D Fingerprint Sensors Get Under Your Skin

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/sensors/imagers/3d-fingerprint-sensor-id-subcutaneous


Many people already use fingerprint recognition technology to access their phones, but one group of researchers wants to take this skin-deep concept further. In a study published January 18 in IEEE Sensors Journal, they described a technique that maps out not only the unique pattern of a person’s fingerprint, but the blood vessels underneath. The approach adds another layer of security to identity authentication.

Typically, a fingerprint scan only accounts for 2D information that captures a person’s unique fingerprint pattern of ridges and valleys, but this 2D information could easily be replicated.

“Compared with the existing 2D fingerprint recognition technologies, 3D recognition that captures the finger vessel pattern within a user’s finger will be ideal for preventing spoofing attacks and be much more secure,” explains Xiaoning Jing, a distinguished professor at North Carolina State University and co-author of the study.

To develop a more secure approach using 3D recognition, Jing and his colleagues created a device that relies on ultrasound pulses. When a finger is placed upon the system, triggering a pressure sensor, a high-frequency pulsed ultrasonic wave is emitted. The amplitudes of reflecting soundwaves can then be used to determine both the fingerprint and blood vessel patterns of the person.

In an experiment, the device was tested using an artificial finger created from polydimethylsiloxane, which has an acoustic impedance similar to human tissues. Bovine blood was added to vessels constructed in the artificial finger. Through this setup, the researchers were able to obtain electronic images of both the fingerprint and blood vessel patterns with resolutions of 500 × 500 dots per inch, which they say is sufficient for commercial applications.

Intriguingly, while the blood vessel features beneath the ridges of the artificial finger could be determined, this was not the case for 40% of the blood vessels that lay underneath the valleys of the fingerprint. Jing explains that this is because the high-frequency acoustic waves cannot propagate through the tiny spaces confined within the valleys of the fingerprint. Nevertheless, he notes that enough of the blood vessels throughout the finger can be distinguished to make the approach worthwhile, and that data interpolation or other advanced processing techniques could be used to reconstruct the undetected portion of vessels.
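
The gap-filling idea is, in its simplest form, ordinary interpolation across the missing samples. Here is a minimal sketch (our own illustration; the paper’s actual processing is not detailed in the article) that linearly interpolates vessel-depth readings lost under fingerprint valleys:

```python
# Minimal sketch of interpolating vessel-depth readings across fingerprint valleys.
import numpy as np

# Simulated depths (mm below the surface) along one scan line; NaN marks positions
# under valleys where no usable echo was received.
depth_mm = np.array([1.20, 1.22, np.nan, np.nan, 1.28, 1.30, np.nan, 1.33, 1.35])
x = np.arange(depth_mm.size)

valid = ~np.isnan(depth_mm)
filled = depth_mm.copy()
filled[~valid] = np.interp(x[~valid], x[valid], depth_mm[valid])

print(np.round(filled, 3))   # the gaps are replaced by linearly interpolated values
```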

Chang Peng, a postdoctoral research scholar at North Carolina State University and co-author of the study, sees this approach as widely applicable. “We envision this 3D fingerprint recognition approach can be adopted as a highly secure bio-recognition technique for broad applications including consumer electronics, law enforcement, banking and finance, as well as smart homes,” he says, noting that this group is seeking a patent and looking for industry partners to help commercialize the technology.

Notably, the current setup, which uses a single ultrasound transducer, takes an hour or more to acquire an image. To improve upon this, the researchers are planning to explore how an array of ultrasonic fingerprint sensors will perform compared to the single sensor that was used in this study. Then they aim to test the device with real human fingers, comparing the security and robustness of their technique to commercialized optical and capacitive techniques for real fingerprint recognition.

Light-Driven Sonar Could Survey the Oceans From the Air

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/tech-talk/sensors/remote-sensing/lightdriven-sonar-could-survey-the-oceans-from-the-air

Sonar, which measures the time it takes for sound waves to bounce off objects and travel back to a receiver, is the best way to visualize underwater terrain or inspect marine-based structures. Sonar systems, though, have to be deployed on ships or buoys, making them slow and limiting the area they can cover.

However, engineers at Stanford University have developed a new hybrid technique combining light and sound. Aircraft, they suggest, could use this combined laser/sonar technology to sweep the ocean surface for high-resolution images of submerged objects. The proof-of-concept airborne sonar system, presented recently in the journal IEEE Access, could make it easier and faster to find sunken wrecks, investigate marine habitats, and spot enemy submarines.

“Our system could be on a drone, airplane or helicopter,” says Amin Arbabian, an electrical engineering professor at Stanford University. “It could be deployed rapidly…and cover larger areas.”

Airborne radar and lidar are used to map the Earth’s surface at high resolution. Both can penetrate clouds and forest cover, making them especially useful in the air and on the ground. But peering into water from the air is a different challenge. Sound, radio, and light waves all quickly lose their energy when traveling from air into water and back. This attenuation is even worse in turbid water, Arbabian says.

So he and his students combined the two modalities—laser and sonar. Their system relies on the well-known photoacoustic effect, which turns pulses of light into sound. “When you shine a pulse of light on an object it heats up and expands and that leads to a sound wave because it moves molecules of air around the object,” he says.

The group’s new photoacoustic sonar system begins by shooting laser pulses at the water surface. Water absorbs most of the energy, creating ultrasound waves that move through it much like conventional sonar. These waves bounce off objects, and some of the reflected waves go back out from the water into the air.

At this point, the acoustic echoes lose a tremendous amount of energy as they cross that water-air barrier and then travel through the air. Here is where another critical part of the team’s design comes in.

To detect the weak acoustic waves in air, the team uses an ultra-sensitive microelectromechanical device with the mouthful name of an air-coupled capacitive micromachined ultrasonic transducer (CMUT). These devices are simple capacitors with a thin plate that vibrates when hit by ultrasound waves, causing a detectable change in capacitance. They are known to be efficient at detecting sound waves in air, and Arbabian has been investigating the use of CMUT sensors for remote ultrasound imaging. Special software processes the detected ultrasound signals to reconstruct a high-resolution 3D image of the underwater object.

The researchers tested the system by imaging metal bars of different heights and diameters placed in a large, 25-centimeter-deep fish tank filled with clear water. The CMUT detector was 10 centimeters above the water surface.
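
For a sense of the timing involved, the rough arithmetic below (textbook sound speeds, our own illustration) traces an echo through the tank geometry just described:

```python
# Rough echo-timing arithmetic for the tabletop setup described above
# (illustrative; sound speeds are textbook values, not from the paper).
V_WATER = 1480.0   # speed of sound in water, m/s
V_AIR = 343.0      # speed of sound in air, m/s

water_depth = 0.25      # object near the bottom of the 25-cm tank
detector_height = 0.10  # CMUT detector 10 cm above the water surface

# The laser pulse reaches the surface essentially instantly; the acoustic wave
# then travels down to the object, back up to the surface, and on to the detector.
t_down = water_depth / V_WATER
t_up = water_depth / V_WATER
t_air = detector_height / V_AIR

total_us = (t_down + t_up + t_air) * 1e6
print(f"Round-trip echo time: {total_us:.0f} microseconds")   # a few hundred microseconds
```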

The system should work in murky water, Arbabian says, although they haven’t tested that yet. Next up, they plan to image objects placed in a swimming pool, for which they will have to use more powerful laser sources that work for deeper water. They also want to improve the system so it works with waves, which distort signals and make the detection and image reconstruction much harder. “This proof of concept is to show that you can see through the air-water interface,” Arbabian says. “That’s the hardest part of this problem. Once we can prove it works it can scale up to greater depths and larger objects.”

Accelerating the Journey to Autonomous for Carmakers

Post Syndicated from ARM Automotive original https://spectrum.ieee.org/sensors/automotive-sensors/accelerating-the-journey-to-autonomous-for-carmakers

A child runs into the street, stands frozen and terrified as she sees a car speeding toward her. What’s safer? Relying on a car with autonomous capabilities or a human driver to brake in time?

In that situation, it takes the average driver 1.5 seconds to react and hit the brakes; it takes a car equipped with vision systems, RADAR and LIDAR just 0.5 seconds. That 3x faster response time can mean the difference between life and death, and it’s one of many reasons the automotive industry is accelerating its journey toward autonomy. Whether it’s safety, efficiency, comfort, navigation or the more efficient manufacture of an autonomous car, automotive companies are increasingly embracing autonomous technologies.
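
How much road that one-second difference buys depends on speed. A simple worked example (the 50 km/h figure is an assumption of ours; only the reaction times come from the scenario above):

```python
# Distance covered during driver vs. automated-system reaction time (simple arithmetic).
speed_kmh = 50.0             # assumed example speed
speed_ms = speed_kmh / 3.6

human_reaction_s = 1.5
automated_reaction_s = 0.5

human_m = speed_ms * human_reaction_s          # ~20.8 m travelled before braking begins
automated_m = speed_ms * automated_reaction_s  # ~6.9 m
print(f"At {speed_kmh:.0f} km/h: human {human_m:.1f} m, automated {automated_m:.1f} m "
      f"({human_m - automated_m:.1f} m saved before the brakes engage)")
```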

But this wave of interest in autonomous technologies does not come without its challenges. Disparate systems and a dizzying range of software and hardware choices can make charting a path to autonomy challenging. For example, many current solutions for automated driving are based on bulky, high-power, costly standalone chips. Others require proprietary solutions, which severely constrain design flexibility. To make automated driving suitable for widespread commercial use, however, Tier 1 suppliers and OEMs are looking for a more power-efficient and cost-effective solution suitable for mass production.

To embrace the vision, we need to understand that autonomous workloads are fundamentally different from traditional workloads and confront the challenges that complexity presents. The incorporation of AI and machine learning means the workload is far greater in magnitude than that of, for example, contemporary ADAS (advanced driver-assistance systems), which operate within a narrow scope.

Grappling with complexity

Autonomous workloads are also more complex than traditional ones because of the overlapping and interconnected nature of systems in industrial applications. There may be a machine vision system with its own set of data and algorithms that needs to integrate with robot vehicles, which are alerted by the vision system that a task has been completed and pickup is required.

Integration may also be required for other points along the supply chain. For example, vehicles rolling down the manufacturing line will benefit from just-in-time parts arrivals thanks to the more integrated, near real-time connection with the supply chain.

These technologies also need to support features that help achieve both ASIL D and ASIL B safety requirements under ISO 26262.

New thinking, new technology

All of this requires a fundamental reconsideration of design, validation and configuration as well as embracing emerging technologies and methodologies.

As we evolve toward increasingly autonomous deployment, we need new hardware and software systems that will replace some of the commercial offerings being used today – and a fundamental shift from homogeneity to heterogeneous systems. These new systems must not only deliver flexibility but also help reduce power, size, cost and heat while retaining the performance needed to run autonomous workloads. And such systems will benefit from a flourishing ecosystem coalescing around these new design approaches that offer customers choice and flexibility to unleash their innovations.

Our mission at Arm is to deliver the underlying technologies and nurture an ecosystem to enable industries such as automotive to accelerate their journey to autonomy and reap the rewards of delivering new, transformative products to their customers.

In this market deep dive we outline the technology challenges and the hardware and software solutions that can propel the automotive industry forward in its journey to a safer, more efficient and more prosperous future.

IoT Makes Fire Detection Systems Smarter

Post Syndicated from Brian Horowitz original https://spectrum.ieee.org/tech-talk/sensors/remote-sensing/how-iot-makes-fire-detection-systems-smarter

For years, first responders relied on paper maps to reach a fire in an apartment building or office. Incomplete information would delay firefighters from arriving at an emergency, and false alarms would set them on the wrong path altogether. Dispatchers in 911 centers would receive erroneous information on a problem with a smoke detector rather than a sprinkler switch.

“It gets to the point where you don’t even trust the data,” said Dick Bauer, fire chief for the Killingworth Volunteer Fire Company in Killingworth, Connecticut.

Now cloud computing, mobile apps, edge computing and IoT gateways will enable fire safety personnel to gain visibility into how to reach an emergency.

Remote monitoring and diagnostic capabilities of an IoT system help firefighters know where to position personnel and trucks in advance, according to Bauer. An IoT system tells fire personnel the locations of a smoke detector going off, a heat detector sending signals or a water flow switch being activated.

“You can see a map of the building with the actual location identified where the fire really is, and you can actually watch it spread if you have enough sensors,” said Bill Curtis, analyst in residence, IoT, at Moor Insights & Strategy.

IoT will make systems in commercial buildings work together like Amazon’s Alexa controls lights, thermostats and audio/video (AV) equipment in a home, Curtis said. An IoT system could shut down an HVAC system or put elevators in fire mode if smoke is blowing around a building, he suggested. A mobile app populated with sensor data can provide visibility into emergency systems and how to control specific locations in a building. It provides a holistic view of sensors, controls and fire panels.
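
A rule of the kind Curtis describes might look like the sketch below; every class and field name here is hypothetical, since no specific building-automation API is being referenced:

```python
# Hedged sketch of a cross-system rule: when smoke is confirmed, shut down the
# affected HVAC zone and put elevators in fire mode. All names are hypothetical.

class HVAC:
    def shut_down(self, zone: int) -> None:
        print(f"HVAC zone {zone}: shut down to stop smoke circulation")

class Elevators:
    def set_fire_mode(self, recall_floor: int) -> None:
        print(f"Elevators: fire mode, recalled to floor {recall_floor}")

def on_sensor_event(event: dict, hvac: HVAC, elevators: Elevators) -> None:
    """React to a smoke-detector event forwarded by the IoT gateway."""
    if event.get("type") == "smoke" and event.get("confirmed"):
        hvac.shut_down(zone=event["floor"])
        elevators.set_fire_mode(recall_floor=1)

on_sensor_event({"type": "smoke", "confirmed": True, "floor": 7}, HVAC(), Elevators())
```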

Firefighters speeding to the scene will know what floor the fire is on and which sensors the emergency triggered. They’ll also learn how many people are in the building, and which entrance to use when they get there, Curtis explained.

“The more sensors and different types of sensors means earlier detection and greater resolution as well as greater precision on exactly where the fire is and how it is moving,” he said.

How IoT Fire Detection Works

Companies such as BehrTech and Honeywell offer IoT connectivity systems that provide situational awareness when fighting fires. BehrTech’s MyThings wireless IoT platform provides disaster warnings to guard against forest fires. It lets emergency personnel monitor the weather as well as atmospheric and seismic data.

On 20 Oct., Honeywell introduced a cloud platform for its Connected Life Safety Services (CLSS) that allows first responders to access data on a fire system before they get to an emergency. It’s now possible to evaluate the condition of devices and get essential data about an emergency in real time using a mobile app.

The CLSS cloud platform connects to an IoT gateway at a central station, which collects data from sensors around a building. CLSS transmits data on the building location that generated the alarm to fire departments. It also provides a history of detector signals over the previous 24 hours and indicates whether the smoke detector had previously triggered a false alarm, says Sameer Agrawal, general manager of software and services at Honeywell.

Agrawal said smart fire IoT platforms like CLSS indicate precisely where an emergency is occurring and will enable firefighters to take the right equipment to the correct location.

“When the dispatch sends a fire truck, the Computer Aided Dispatch (CAD) system will provide an access code that the officer in the truck can punch into an application; that will bring up a 2D model of the building and place the exact location of the alarm,” Agrawal said. “You’re able to track your crews that way, so this really is the kind of information that’s going to make their jobs so much safer and more efficient and take all the guesswork out of it.”

IoT Fire Safety Systems in the Future

Curtis suggests that, as more emergency systems become interconnected in the future, building managers and workers should get access to these dashboards in addition to firefighters.

“Why not show the building occupants where the fire is so they can avoid it?” Curtis says.

In addition, smart fire detection systems will use artificial intelligence (AI) to detect false alarms and provide contextual information on how to prevent them—and prevent people from being thrown out of their hotel beds unnecessarily at 3 a.m., Agrawal said.

“When next-generation AI comes into play,” Agrawal says, “we start understanding more information about, you know, why was it a false alarm or what could have been done differently.”

An AI-equipped detection system will present a score to a facility manager indicating whether there’s a need to call the fire department. Information on the cause of an event and how first responders responded to past emergencies will help the software come up with the score.

What’s more, the algorithms will help detect anomalies in the data from multiple sensors. These anomalies can include a sensor malfunction, a security breach, or a reading that’s “unreasonable,” says Curtis.

Get More From Your Automotive Electronics Test Investment

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/get_more_from_your_automotive_electronics_test_investment

Keysight Automotive

Explore the powerful software behind Keysight’s high-precision hardware and discover how to meet emerging automotive electronics test requirements, while driving higher ROI from your existing hardware. Let Keysight help you cross the finish line ahead of your competition. Jump-start your automotive innovation today with a complimentary software trial.

Cops Tap Smart Streetlights Sparking Controversy and Legislation

Post Syndicated from Tekla S. Perry original https://spectrum.ieee.org/view-from-the-valley/sensors/remote-sensing/cops-smart-street-lights

When San Diego started installing its smart streetlights in 2017, city managers envisioned that the data they gathered would help improve city operations—like selecting streets for bicycle lanes, identifying dangerous intersections for special attention, and figuring out where the city needed more parking. They thought they might spark some tech startups to develop apps that would guide the visually impaired, point drivers to parking places, and send joggers down the quietest routes. And they also envisioned cost savings from reduced energy thanks to the vastly higher efficiencies of the LED lights in comparison with the sodium-vapor lights they replaced.

Instead a $30 million project that had looked like it would put San Diego on the map as one of the “smartest” cities in the U.S. has mired it in battles over the way these systems are being used by law enforcement. Meanwhile, the independent apps that had seemed so promising two years ago have failed to materialize, and even the idea that the technology would pay for itself as energy costs came down didn’t work out as hoped.

San Diego got its smart “CityIQ” streetlights from GE Current, a company originally started as a subsidiary of General Electric but acquired last year by private-equity firm American Industrial Partners. Some 3300 of them have been installed to date and 1000 more have been received but not installed. As part of the deal, the city contracted with Current to run the cloud-based analytics of sensor data on its CityIQ platform. As part of that contract, the cloud operator, rather than the city, owns any algorithms derived from the data. In an additional shuffle, American Industrial Partners sold off the CityIQ platform in May to Ubicuia, a Florida manufacturer of streetlight sensors and software, but kept the LED-lighting side of the operation.

San Diego was the first city to fully embrace the CityIQ technology, though Atlanta and Portland did run pilot tests. San Diego financed the smart lights—and 14,000 other basic LED lights—with a plan that spread the payments out over 13 years, in such a way that the energy savings from the LED conversion would cover the cost and then some.

The CityIQ streetlights are packed with technology. Inside is an Intel Atom processor, half a terabyte of memory, Bluetooth and Wi-Fi radios, two 1080p video cameras, two acoustical sensors, and environmental sensors that monitor temperature, pressure, humidity, vibration, and magnetic fields. Much of the data is processed on the node—a textbook example of “edge processing.” That typically includes the processing of the digital video: machine-vision algorithms running on the streetlight itself count cars or bicycles, say, or extract the average speed of vehicles, and then transmit that information to the cloud. This data is managed under contract, initially by GE Current, and the data manager owns any analytics or algorithms derived from processed data.
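
The edge-processing pattern described here boils down to summarizing on the node and shipping only the summary. The sketch below illustrates the idea; the function, message fields, and node identifier are all hypothetical rather than part of the CityIQ platform:

```python
# Sketch of edge processing on a streetlight node: raw video stays on the node;
# only aggregate counts and average speeds are sent to the cloud (illustrative only).
import json
import statistics
import time

def summarize_detections(detections: list[dict], window_s: int = 60) -> str:
    """Reduce per-vehicle detections from the on-node vision model to one small message."""
    speeds = [d["speed_kmh"] for d in detections if d["kind"] == "car"]
    summary = {
        "node_id": "streetlight-0042",          # hypothetical identifier
        "window_s": window_s,
        "car_count": sum(1 for d in detections if d["kind"] == "car"),
        "bicycle_count": sum(1 for d in detections if d["kind"] == "bicycle"),
        "avg_speed_kmh": round(statistics.mean(speeds), 1) if speeds else None,
        "timestamp": int(time.time()),
    }
    return json.dumps(summary)    # this small payload, not the video, goes to the cloud

print(summarize_detections([
    {"kind": "car", "speed_kmh": 41.0},
    {"kind": "car", "speed_kmh": 37.5},
    {"kind": "bicycle", "speed_kmh": 18.0},
]))
```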

Initially, at least, the data was expected to be used exclusively for civic analysis and planning and public convenience.

“There was an expectation when we launched this that it was not a law-enforcement system,” says Erik Caldwell, deputy chief operating officer for the City of San Diego. “That’s true; that’s not the primary purpose,” he adds. “The primary purpose is to gather data to better run the city.”

But in August 2018, everything changed. That’s when, while investigating a murder in San Diego’s Gaslamp Quarter, a police officer looked up and saw one of the new smart streetlights. He realized the streetlight’s video cameras had a perfect view of the crime scene—one unavailable from the various security cameras in the area.

“We had never seen a video from any of these cameras before,” says Jeffrey Jordon, a captain with the San Diego Police Department. “But we realized the camera was exactly where the crime scene was.”

The police department reached out to San Diego’s environmental services department, the organization responsible for the lights, and asked if video were available. It turned out that the video was still stored on the light—it is deleted after five days—and Current was able to pull it up from the light to its cloud servers, and then forward it to the police department. 

“It was clear at that point that some of the video could help solve crimes, so we had an obligation to turn that over when there was a major crime,” Caldwell says.

“The data sits at the sensors unless we have a specific request, when a violent crime or fatal accident occurs. It is very surgical. We only use it in a reactive way,” Jordon states, not as surveillance.

In this case, it turned out, the video exonerated the person who had been arrested for the murder.

At this point, Caldwell admits that he and his colleagues in the city administration made a mistake.

“We could have done a better job of communicating with the public that there had been a change in the use of the data,” he concedes. “One could have argued that we should have thought this through from the beginning.”

Informal discussions between city managers and police department officials started a month or two later, Caldwell recalls. A lot of things had to be sorted out. There needed to be policies and procedures around when streetlight data would be used by the police. And the process of getting the data to the police had to be streamlined. As it was, it was very cumbersome, involving looking up a streetlight number on a map, contacting the environmental services department with that number, which would in turn contact Current, who would then extract and forward the video. It was an essentially manual process with so many people in the loop that it insufficiently safeguarded the chain of custody required for any evidence to hold up in court.

Come October, the police department had a draft policy in place, limiting the use of the data to serious crimes, though what that set of crimes encompassed wasn’t fully defined.

By mid-February of 2019, the data retrieval and chain-of-custody issues were resolved. The police department had started using Genetec Clearance, a digital-evidence management system, Jordon explains. San Diego Police now have a direct visual interface to the smart-streetlight network, showing where the lights are and if they are operational at any time. When a crime occurs, police officers can use that tool to directly extract the video from a particular streetlight. Data not extracted is still erased after five days.

It’s hard to say exactly when the use of the streetlight video started bubbling up into the public consciousness. Between March and September 2019 the city and the police department held more than a dozen community meetings, explaining the capabilities of the streetlights and collecting feedback on current and proposed future uses for the data. In June 2019 the police department released information on its use of video data to solve crimes—99 at that point, over the 10-month period beginning in August 2018.

In August of 2019, Genevieve Jones-Wright, then legal director of the Partnership for the Advancement of New Americans, said she and other leaders of community organizations heard about a tour the city was offering as part of this outreach effort. Representatives of several organizations made a plan to take the tour and attend future community meetings, soon forming a coalition called TRUST SD, for Transparent and Responsible Use of Surveillance Technology San Diego. The group, with Jones-Wright as its spokesperson, pushes for clear and transparent policies around the program.

In late September 2019, TRUST SD had some 20 community organizations on board (the group now encompasses 30 organizations). That month, Jones-Wright wrote to the City Council on behalf of the group, asking for an immediate moratorium on the use, installation, and acquisition of smart streetlights until safeguards were put in place to mitigate impacts on civil liberties; a city ordinance that protects privacy and ensures oversight; and public records identifying how streetlight data had been, and was being, used.

In March 2019, the police department adopted a formal policy around the use of streetlight data. It stated that video and audio may be accessed exclusively for law-enforcement purposes with the police department as custodian of the records; the city’s sustainability department (home of the streetlight program) does not have access to that crime-related data. The policy also pledges that streetlight data will not be used to discriminate against any group or to invade the privacy of individuals.

Jones-Wright and her coalition argued that the lack of specific penalties for misuse of the data was unacceptable. Since September 2019 they have been pushing for a change to the city’s municipal code, that is, to codify the policy into a law enforceable by fines and other measures if violated.

But the streetlight controversy didn’t really explode until early in 2020, when another release of data from the police department indicated that the number of times video from the street lights had been used was up to 175 in the first year and a half of police department use, all in investigations of “serious” crimes. The list included murders, sexual assaults, and kidnappings—but it also included vandalism and illegal dumping, which caused activists to question the city’s definition of “serious.”

Says Jones-Wright, “When you have a system with no oversight, we can’t tell if you are operating in the confines of rules you created for yourself. When you see vandalism and illegal dumping on the list—these are not serious crimes—so you have to ask why they tapped into the surveillance footage for that, and could the reason be the class of the victim. We are concerned with the hierarchy of justice here and creating tiers of victimhood.”

The police department’s Jordon points out that the dumping incident involved a truckload of concrete that blocked vehicles from entering and exiting a parking garage used by FBI employees, and therefore qualified as a serious situation.

Local media outlets reported extensively on the controversy. The city attorney submitted a policy to the council, and the council held a vote on whether to ratify the proposed policy in January. It was rejected, not because of any specific objections to its content, but because some lawmakers feared that a policy would no longer be enough to satisfy the public’s concerns. What was needed, they said, was an actual ordinance, with penalties for those who violated it, one that would go beyond streetlights to cover all use of surveillance technology by the city.

San Diegans may yet get their ordinance. In mid-July 2020, a committee of the City Council approved two proposed ordinances related to streetlight data. One would set up a process to govern all surveillance technologies in the city, including oversight, auditing, and reporting; data collection and sharing; and access, protection, and retention of data. The other would create a privacy advisory commission of technical experts and community members to review proposals for surveillance technology use. These ordinances still need to go to the full City Council for a vote.

The police department, Jordon said, would welcome clarity. “People are of course concerned right now about things taking place in Hong Kong and China [with the use of cameras] that give them pause. From our standpoint, we have tried to be great stewards of the technology,” he says.

“There is no reason under the sun that the City of San Diego and our law enforcement agencies should continue to operate without rules that govern the use of surveillance technology—where there is no transparency, oversight or accountability—no matter what the benefits are,” says Jones-Wright. “So, I am very happy that we are one step closer to having the ordinances that TRUST SD helped to write on the books in San Diego.”

While the city and community organizations figure out how to regulate the streetlights, their use continues. To date, San Diego police have tapped streetlight video data nearly 400 times, including this past June, during investigations of incidents of felony vandalism and looting during Black Lives Matter protests.

Meanwhile, some of the promised upside of the technology hasn’t worked out as expected.

Software developers who had initially expressed interest in using streetlight data to create consumer-oriented tools for ordinary citizens have yet to get an app on the market.

Says Caldwell, “When we initially launched the program, there was the hope that San Diego’s innovation economy community would find all sorts of interesting use cases for the data, and develop applications, and create a technology ecosystem around mobility and other solutions. That hasn’t borne out yet; we have had a lot of conversations with companies looking at it, but it hasn’t turned into a smashing success yet.”

And the planned energy savings, intended to generate cash to pay for the expensive fixtures, were chewed up by increases in electric rates.

For better or worse, San Diego did pave the way for other municipalities looking into smart city technology. Take Carlsbad, a smaller city just north of San Diego. It is in the process of acquiring its own sensor-laden streetlights; these, however, will not incorporate cameras. David Graham, who managed the streetlight acquisition program as deputy chief operating officer for San Diego and is now chief innovation officer for the City of Carlsbad, did not respond to Spectrum’s requests for comment but indicated in an interview with the Voice of San Diego that the cameras are not necessary to count cars and pedestrians and that other methods will be used for that function. And Carlsbad’s City Council has indicated that it intends to be proactive in establishing clear policies around the use of streetlight data.

“The policies and laws should have been in place before the street lamps were in the ground, instead of legislation catching up to technological capacity,” says Morgan Currie, a lecturer in data and society at the University of Edinburgh.

“It is a clear case of function creep. The lights weren’t designed as surveillance tools, rather, it’s a classic example of how data collection systems are easily retooled as surveillance systems, of how the capacities of the smart city to do good things can also increase state and police control.”

Toshiba’s Light Sensor Paves the Way for Cheap Lidar

Post Syndicated from John Boyd original https://spectrum.ieee.org/cars-that-think/sensors/automotive-sensors/toshibas-light-sensor-highresolution-lidar

The introduction of fully autonomous cars has slowed to a crawl. Nevertheless, technologies such as rearview cameras and automatic self-parking systems are helping the auto industry make incremental progress towards Level 4 autonomy while boosting driver-assist features along the way.

To that end, Toshiba has developed a compact, highly efficient silicon photo-multiplier (SiPM) that enables non-coaxial Lidar to employ off-the-shelf camera lenses to lower costs and help bring about solid-state, high-resolution Lidar.

Automotive Lidar (light detection and ranging) typically uses spinning lasers to scan a vehicle’s environment 360 degrees by bouncing laser pulses off surrounding objects and measuring the return time for the reflected light to calculate their distances and shapes. The resulting point-cloud map can be used in combination with still images, radar data and GPS to create a virtual 3D map of the area the vehicle is traveling through.
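
As a rough illustration of the time-of-flight arithmetic described above, the sketch below turns a measured round-trip time and beam direction into a range and a 3D point; the helper names and the example pulse time are illustrative, not taken from any particular Lidar product.

import math
C = 299_792_458.0  # speed of light, m/s
def range_from_round_trip(t_seconds):
    # The pulse travels to the object and back, so halve the total path length.
    return C * t_seconds / 2.0
def point_from_measurement(t_seconds, azimuth_deg, elevation_deg):
    # Convert one (return time, beam direction) measurement into an x, y, z point.
    r = range_from_round_trip(t_seconds)
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az), r * math.cos(el) * math.sin(az), r * math.sin(el))
print(round(range_from_round_trip(1.33e-6), 1))  # a ~1.33-microsecond echo means an object ~199 m away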

However, high-end Lidar systems can be expensive, costing $80,000 or more, though cheaper versions are also available. The current leader in the field is Velodyne, whose lasers mechanically rotate in a tower mounted atop a vehicle’s roof.

Solid-state Lidar systems have been announced in the past several years but have yet to challenge the mechanical variety. Now, Toshiba hopes to advance their cause with its SiPM: a solid-state light sensor employing single-photon avalanche diode (SPAD) technology. The Toshiba SiPM contains multiple SPADs, each controlled by an active quenching circuit (AQC). When a SPAD detects a photon, its cathode voltage drops; the AQC then quenches the avalanche and resets the voltage to its initial value.

“Typical SiPM recovery time is 10 to 20 nanoseconds,” says Tuan Thanh Ta, Toshiba’s project leader for the technology. “We’ve made it 2 to 4 times faster by using this forced or active quenching method.”
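
For a rough sense of what faster quenching buys, the back-of-envelope sketch below converts recovery (dead) time into a maximum count rate per cell. The only inputs are the figures quoted above; the simple one-detection-per-dead-time model is an assumption for illustration, not Toshiba's own characterization.

def max_count_rate_hz(dead_time_s):
    # Crude upper bound: at most one detection per dead-time interval.
    return 1.0 / dead_time_s
typical = 15e-9           # midpoint of the 10-20 ns recovery time quoted above
improved = typical / 3.0  # "2 to 4 times faster", taking 3x as a midpoint
print(f"typical:  {max_count_rate_hz(typical) / 1e6:.0f} Mcounts/s per cell")   # ~67
print(f"improved: {max_count_rate_hz(improved) / 1e6:.0f} Mcounts/s per cell")  # ~200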

The increased efficiency means Toshiba has been able to use far fewer light sensing cells—down from 48 to just 2—to produce a device measuring 25 μm x 90 μm, much smaller, the company says, than standard devices measuring 100 μm x 100 μm. The small size of these sensors has allowed Toshiba to create a dense two-dimensional array for high sensitivity, a requisite for long-range scanning. 

But such high-resolution data would require impractically large multichannel readout circuitry comprising separate analog-to-digital converters (ADCs) for long-distance scanning and time-to-digital converters (TDCs) for short distances. Toshiba has overcome this problem by realizing both ADC and TDC functions in a single circuit. The result is an 80 percent reduction in size (down to 50 μm by 60 μm) compared with conventional dual data converter chips.

“Our field trials using the SiPM with a prototype Lidar demonstrated the system’s effectiveness up to a distance of 200 meters while maintaining high resolution—which is necessary for Level 4 autonomous driving,” says Ta, who hails from Vietnam. “This is roughly quadruple the capability of solid-state Lidar currently on the market.”

“Factors like detection range are important, especially in high-speed environments like highways,” says Michael Milford, an interdisciplinary researcher and deputy director of the Queensland University of Technology (QUT) Center for Robotics in Australia. “So I can see that these [Toshiba trial results] are important properties when it comes to commercial relevance.” 

And as Toshiba’s SiPM employs a two-dimensional array—unlike the one-dimensional photon receivers used in coaxial Lidar systems—Ta points out that its 2D aspect ratio corresponds to that of the light sensors used in commercial cameras. Consequently, off-the-shelf standard, telephoto and wide-angle lenses can be used for specific applications, helping to further reduce costs. 

By comparison, coaxial Lidar systems use the same optical path for transmitting and receiving light, and so require a costly customized lens to send out the beam, collect the returning light, and direct it onto the 1D photon receivers, Ta explains.

But as Milford points out, “If Level 4+ on-road autonomous driving becomes a reality, it’s unlikely we’ll have a wide range of solutions and sensor configurations. So this flexibility [of Toshiba’s sensor] is perhaps more relevant for other domains like drones, and off-road autonomous vehicles, where there is more variability, and use of existing hardware is more important.”

Meanwhile, Toshiba is working to improve the performance quality of its SiPM. “We aim to introduce practical applications in fiscal 2022,” says Akihide Sai, a Senior Research Scientist at Toshiba overseeing the SiPM project. Though he declines to answer whether Toshiba is working with automakers or will produce its own solid-state Lidar system, he says Toshiba will make the SiPM device available to other companies. 

He adds, “We also see it being used in applications such as automated navigation of drones and in robots used for infrastructure monitoring, as well as in applications used in factory automation.”

But QUT’s Milford makes this telling point. “Many of the advances in autonomous vehicle technology and sensing, especially with regards to cost, bulk, and power usage, will become most relevant only when the overall challenge of Level 4+ driving is solved. They aren’t themselves the critical missing pieces for enabling this to happen.”

Can Sensors That Detect Coronavirus in the Air Help Economies Reopen Safely?

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/sensors/chemical-sensors/devices-monitor-coronavirus-in-the-air

As businesses scramble to find ways to make workers and customers feel safe about entering enclosed spaces during a pandemic, several companies have proposed a solution: COVID-19 air monitoring devices. 

These devices suck in large quantities of air and trap aerosolized virus particles and anything else that’s present. The contents are then tested for the presence of the novel coronavirus, also known as SARS-CoV-2, which causes COVID-19. 

Several companies in the air quality and diagnostics sectors have quickly developed this sort of tech, with various iterations available on the market. These companies say the devices can be used almost anywhere: office buildings, airplanes, hospitals, schools, and nursing homes. 

But the devices don’t deliver results in real time—they don’t beep to alert people nearby that the virus has been detected. Instead, the collected samples must be sent to a lab to be analyzed, typically with a method called PCR, or polymerase chain reaction. 

This process takes hours. Add to that the logistics of physically transporting the samples to a lab and it could be a day or more before results are available. Still, there’s value in day-old air quality information, say developers of this type of technology. 

“It’s not solving everything about COVID-19,” says Milan Patel, CEO of PathogenDx, a DNA-based testing company. But it does enable businesses to spot the presence of the virus without relying on people to self-report, and brings peace of mind to everyone involved, he says. “If you’re going into a building, wouldn’t it be great to know that they’re doing environmental monitoring?” Patel says.

PathogenDx this year developed an airborne SARS-CoV-2 detection system by combining its DNA testing capability with an air sampler from Bertin Instruments. The gooseneck-shaped instrument uses a cyclonic vortex that draws in a high volume of air and traps any particles inside in a liquid. Once the sample is collected, it must be sent to a PathogenDx lab, where it goes through a two-step PCR process. This amplifies the virus’s genetic code so that it can be detected. Adding a second step to the process improves the test’s sensitivity, Patel says. (PCR is also the gold standard for general human testing of COVID-19.)

Patel says he envisions the device proving particularly useful on airplanes, in large office buildings and health care facilities. On an airplane, for example, if the device picks up the presence of the virus during a flight, the airline can let passengers on that plane know that they were potentially exposed, he says. Or if the test comes back negative for the flight, the airline can “know that they didn’t just infect 267 passengers,” says Patel. 

In large office buildings, daily air sampling can give building managers a tool for early detection of the virus. As soon as the tests start coming back positive, the office managers could ask employees to work from home for a couple of weeks. Hospitals could use the device to track trends, identify trouble spots, and alert patients and staff of exposures.

Considering that many carriers of the virus don’t know they have it, or may be reluctant to report positive test results to every business they’ve visited, air monitoring could alert people to potential exposures in a way that contact tracing can’t. 

Other companies globally are putting forth their iterations on SARS-CoV-2 air monitoring. Sartorius in Göttingen, Germany, says its device was used to analyze the air in two hospitals in Wuhan, China. (Results: the concentration of the virus in isolation wards and ventilated patient rooms was very low, but it was higher in the toilet areas used by the patients.)

Assured Bio Labs in Oak Ridge, Tennessee markets its air monitoring device as a way to help the American workforce get back to business. InnovaPrep in Missouri offers an air sampling kit called the Bobcat, and Eurofins Scientific in Luxembourg touts labs worldwide that can analyze such samples.

But none of the commercially available tests can offer real-time results. That’s something that Jing Wang and Guangyu Qiu at the Swiss Federal Institute of Technology (ETH Zurich) and Swiss Federal Laboratories for Materials Science and Technology, or Empa, are working on.

They’ve come up with a plasmonic photothermal biosensor that can detect the presence of SARS-CoV-2 without the need for PCR. Qiu, a sensor engineer and postdoc at ETH Zurich and Empa, says that with some more work, the device could provide results within 15 minutes to an hour. “We’re trying to simplify it to a lab on a chip,” says Qiu. 

The device combines an optical sensor and a photothermal component that harness localized surface plasmon resonance sensing transduction and the plasmonic photothermal effect. But before the device can be tested in the real world, the researchers must find a way, on-board the device, to separate the virus’s genetic material from its membrane. Qiu says he hopes to resolve this and have a prototype ready to test by the end of the year.

New Electronic Nose Sniffs Out Perfectly Ripe Peaches for Harvest

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/sensors/chemical-sensors/new-electronic-nose-sniffs-out-perfectly-ripe-peaches-for-harvest

Have you ever tried to guess the ripeness of a peach by its smell? Farmers with a well-trained nose may be able to detect the unique combination of alcohols, esters, ketones, and aldehydes, but even an expert may struggle to know when the fruit is perfect for the picking. To help with harvesting, scientists have been developing electronic noses for sniffing out the ripest and most succulent peaches. In a recent study, one such e-nose exceeded 98 percent accuracy.

Sergio Luiz Stevan Jr. and colleagues at Federal University of Technology – Paraná and State University of Ponta Grossa, in Brazil, developed the new e-nose system. Stevan notes that even within a single, large orchard, fruit on one tree may ripen at different times than fruit on another tree, thanks to microclimates of varying ventilation, rain, soil, and other factors. Farmers can inspect the fruit and make their best guess at the prime time to harvest, but risk losing money if they choose incorrectly.

Fortunately, peaches emit vaporous molecules, called volatile organic compounds, or VOCs. “We know that volatile organic compounds vary in quantity and type, depending on the different phases of fruit growth,” explains Stevan. “Thus, the electronic noses are an [option], since they allow the online monitoring of the VOCs generated by the culture.”

The e-nose system created by his team has a set of gas sensors sensitive to particular VOCs. The measurements are digitized and pre-processed in a microcontroller. Next, a pattern recognition algorithm is used to classify each unique combination of VOC molecules associated with three stages of peach ripening (immature, ripe, over-ripe). The data is stored internally on an SD memory card and transmitted via Bluetooth or USB to a computer for analysis.
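
The team's own pattern recognition algorithm isn't reproduced here, but the sketch below shows the general shape of such a pipeline: a vector of gas-sensor readings is normalized and assigned to one of the three ripeness stages with a nearest-centroid rule. The sensor ranges and centroids are hypothetical placeholders, not the study's data.

import numpy as np
STAGES = ["immature", "ripe", "over-ripe"]
# Hypothetical calibration ranges for four VOC-sensitive gas sensor channels.
SENSOR_MIN = np.array([0.10, 0.05, 0.02, 0.08])
SENSOR_MAX = np.array([1.20, 0.90, 0.60, 1.50])
# Hypothetical class centroids in normalized sensor space, one row per stage.
CENTROIDS = np.array([
    [0.15, 0.10, 0.05, 0.12],   # immature: low VOC emission overall
    [0.55, 0.60, 0.40, 0.50],   # ripe: esters and aldehydes pick up
    [0.85, 0.80, 0.75, 0.90],   # over-ripe: strong alcohol and ketone signal
])
def classify(raw_reading):
    # Normalize the raw sensor vector, then return the nearest ripeness stage.
    x = (np.asarray(raw_reading) - SENSOR_MIN) / (SENSOR_MAX - SENSOR_MIN)
    return STAGES[int(np.argmin(np.linalg.norm(CENTROIDS - x, axis=1)))]
print(classify([0.70, 0.55, 0.25, 0.80]))  # -> "ripe" for this made-up reading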

The system is also equipped with a ventilation mechanism that draws in air from the surrounding environment at a constant rate. The passing air is subjected to a set level of humidity and temperature to ensure consistent measurements. The idea, Stevan says, is to deploy several of these “noses” across an orchard to create a sensing network. 

He notes several advantages of this system over existing ripeness-sensing approaches, including that it is online, conducts real-time continuous analyses in an open environment, and does not require direct handling of the fruit. “It is different from the other [approaches] present in the literature, which are generally carried out in the laboratory or warehouses, post-harvest or during storage,” he says. The e-nose system is described in a study published June 4 in IEEE Sensors Journal.

While the study shows that the e-nose system already has a high rate of accuracy at more than 98 percent, the researchers are continuing to work on its components, focusing in particular on improving the tool’s flow analysis. They have filed for a patent and are exploring the prospect of commercialization.

For those who prefer their fruits and grains in drinkable form, there is additional good news. Stevan says in the past his team has developed a similar e-nose for beer, to analyze both alcohol content and aromas. Now they are working on an e-nose for wine, as well as a variety of other fruits.

Monitoring bees with a Raspberry Pi and BeeMonitor

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/monitoring-bees-with-a-raspberry-pi-and-beemonitor/

Keeping an eye on bee life cycles is a brilliant example of how Raspberry Pi sensors help us understand the world around us, says Rosie Hattersley

The setup featuring an Arduino, RF receiver, USB cable and Raspberry Pi

Getting to design and build things for a living sounds like a dream job, especially if it also involves Raspberry Pi and wildlife. Glyn Hudson has always enjoyed making things and set up a company manufacturing open-source energy monitoring tools shortly after graduating from university. With access to several hives at his keen apiarist parents’ garden in Snowdonia, Glyn set up BeeMonitor using some of the tools he used at work to track the beehives’ inhabitants.

Glyn checking the original BeeMonitor setup

“The aim of the project was to put together a system to monitor the health of a bee colony by monitoring the temperature and humidity inside and outside the hive over multiple years,” explains Glyn. “Bees need all the help and love they can get at the moment and without them pollinating our plants, we’d struggle to grow crops. They maintain a 34°C core brood temperature (±0.5°C) even when the ambient temperature drops below freezing. Maintaining this temperature when a brood is present is a key indicator of colony health.”

Wi-Fi not spot

BeeMonitor has been tracking the hives’ population since 2012 and is one of the earliest examples of a Raspberry Pi project. Glyn built most of the parts for BeeMonitor himself. Open-source software developed for the OpenEnergyMonitor project provides a data-logging and graphing platform that can be viewed online.

BeeMonitor complete with solar panel to power it. The Snowdonia bees produce 12 to 15 kg of honey per year

The hives were too far from the house for Wi-Fi to reach, so Glyn used a low-power RF sensor connected to an Arduino, which was placed inside the hive to take readings. These were received by a Raspberry Pi connected to the internet.
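
On the receiving end, the pattern is simple: the Raspberry Pi listens to the RF receiver and forwards each reading onward. The sketch below assumes the receiver appears as a USB serial device and that each packet is a comma-separated line; the port name, packet format, and logging endpoint are hypothetical, not the actual BeeMonitor configuration.

import time
import requests           # pip install requests
import serial             # pyserial: pip install pyserial
PORT = "/dev/ttyUSB0"                 # hypothetical serial port for the RF receiver
ENDPOINT = "http://example.org/log"   # hypothetical logging endpoint
def parse_packet(line):
    # Assumed packet format: "hive1,34.2,87" -> node id, temperature (C), humidity (%).
    node, temp, humidity = line.strip().split(",")
    return {"node": node, "temperature": float(temp), "humidity": float(humidity)}
with serial.Serial(PORT, 9600, timeout=5) as rf:
    while True:
        raw = rf.readline().decode("ascii", errors="ignore")
        if not raw.strip():
            continue
        try:
            reading = parse_packet(raw)
        except ValueError:
            continue   # skip malformed packets
        reading["timestamp"] = time.time()
        requests.post(ENDPOINT, json=reading, timeout=10)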

Diagram showing what information BeeMonitor is trying to establish

At first, there was both a DS18B20 temperature sensor and a DHT22 humidity sensor inside the beehive, along with the Arduino (setup info can be found here). Data from these was saved to an SD card, the obvious drawback being that this didn’t display real-time data readings. In his initial setup, Glyn also had to extract and analyse the CSV data himself. “This was very time-consuming but did result in some interesting data,” he says.
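
That after-the-fact analysis can be done in a few lines of Python; the file name and the timestamp,temperature column layout below are assumptions for illustration.

import csv
from collections import defaultdict
daily_temps = defaultdict(list)
with open("hive_log.csv") as f:                   # assumed log copied off the SD card
    for timestamp, temperature in csv.reader(f):  # assumed rows like "2012-06-01 14:00:00,33.8"
        day = timestamp.split(" ")[0]
        daily_temps[day].append(float(temperature))
for day, temps in sorted(daily_temps.items()):
    print(f"{day}: min {min(temps):.1f} C, max {max(temps):.1f} C")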

Sensor-y overload

Almost as soon as BeeMonitor was running successfully, Glyn realised he wanted to make the data live on the internet. This would enable him to view live beehive data from anywhere and also allow other people to engage with the data.

“This is when Raspberry Pi came into its own,” he says. He also decided to drop the DHT22 humidity sensor. “It used a lot of power and the bees didn’t like it – they kept covering the sensor in wax! Oddly, the bees don’t seem to mind the DS18B20 temperature sensor, presumably since it’s a round metal object compared to the plastic grille of the DHT22,” notes Glyn.

Unlike the humidity sensor, the bees don’t seem to mind the temperature probe

The system has been running for eight years with minimal intervention and is powered by an old car battery and a small solar PV panel. Running costs are negligible: “Raspberry Pi is perfect for getting projects like this up and running quickly and reliably using very little power,” says Glyn. He chose it because of the community behind the hardware. “That was one of Raspberry Pi’s greatest assets and what attracted me to the platform, as well as the competitive price point!” The whole setup cost him about £50.

Glyn tells us we could set up a basic monitor using Raspberry Pi, a DS18B20 temperature sensor, a battery pack, and a solar panel.
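
For anyone trying that basic setup, a minimal reading loop might look like the sketch below. It assumes the Pi's 1-Wire interface is enabled and that the DS18B20 shows up under /sys/bus/w1/devices; the one-minute polling interval is arbitrary.

import glob
import time
def read_ds18b20_celsius():
    # Read the first DS18B20 found on the Pi's 1-Wire bus.
    device_file = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
    with open(device_file) as f:
        lines = f.readlines()
    # The second line of w1_slave ends with "t=<temperature in millidegrees C>".
    return int(lines[1].rsplit("t=", 1)[1]) / 1000.0
while True:
    print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {read_ds18b20_celsius():.1f} C")
    time.sleep(60)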

The post Monitoring bees with a Raspberry Pi and BeeMonitor appeared first on Raspberry Pi.

Demystifying the Standards for Automotive Ethernet

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/demystifying_the_standards_for_automotive_ethernet

As Ethernet celebrates its 40th anniversary, this webinar will discuss its deployment for automotive applications and the standards related to its implementation. It will explain the role of the Institute of Electrical and Electronics Engineers (IEEE), OPEN Alliance, and the Ethernet Alliance in governing the implementation.  

Key learnings:

  • Understand the role of IEEE, OPEN Alliance, and the Ethernet Alliance in the development of automotive Ethernet 
  • Determine the relevant automotive Ethernet specifications for PHY developers, original equipment manufacturers, and Tier 1 companies  
  • Review the role of automotive Ethernet relative to other emerging high-speed digital automotive standards 

Please contact [email protected] to request a PDH certificate code.

Sony Builds AI Into a CMOS Image Sensor

Post Syndicated from Tekla S. Perry original https://spectrum.ieee.org/view-from-the-valley/sensors/imagers/sony-builds-ai-into-a-cmos-image-sensor

Sony today announced that it has developed and is distributing smart image sensors. These devices use machine learning to process captured images on the sensor itself. They can then select only relevant images, or parts of images, to send on to cloud-based systems or local hubs.

This technology, says Mark Hanson, vice president of technology and business innovation for Sony Corp. of America, means practically zero latency between the image capture and its processing; low power consumption enabling IoT devices to run for months on a single battery; enhanced privacy; and far lower costs than smart cameras that use traditional image sensors and separate processors.

Sony’s San Jose laboratory developed prototype products using these sensors to demonstrate to future customers. The chips themselves were designed at Sony’s Atsugi, Japan, technology center. Hanson says that while other organizations have similar technology in development, Sony is the first to ship devices to customers.

Sony builds these chips by thinning and then bonding two wafers—one containing chips with light-sensing pixels and one containing signal processing circuitry and memory. This type of design is only possible because Sony is using a back-illuminated image sensor. In standard CMOS image sensors, the electronic traces that gather signals from the photodetectors are laid on top of the detectors. This makes them easy to manufacture, but sacrifices efficiency, because the traces block some of the incoming light. Back-illuminated devices put the readout circuitry and the interconnects under the photodetectors, adding to the cost of manufacture.

“We originally went to backside illumination so we could get more pixels on our device,” says Hanson. “That was the catalyst to enable us to add circuitry; then the question was what were the applications you could get by doing that.”

Hanson indicated that the initial applications for the technology will be in security, particularly in large retail situations requiring many cameras to cover a store. In this case, the amount of data being collected quickly becomes overwhelming, so processing the images at the edge would simplify that and cost much less, he says. The sensors could, for example, be programmed to spot people, carve out the section of the image containing the person, and only send that on for further processing. Or, he indicated, they could simply send metadata instead of the image itself, say, the number of people entering a building. The smart sensors can also track objects from frame to frame as a video is captured, for example, packages in a grocery store moving from cart to self-checkout register to bag.
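
The edge-processing pattern Hanson describes (detect on the sensor, then ship either a cropped region or just metadata) looks roughly like the sketch below. The detect_people() helper stands in for whatever model runs on the sensor's logic die; it, the confidence threshold, and the array-style frame are hypothetical, not Sony's API.

from dataclasses import dataclass
@dataclass
class Detection:
    x: int
    y: int
    w: int
    h: int
    confidence: float
def detect_people(frame):
    # Placeholder for the on-sensor person detector (hypothetical).
    raise NotImplementedError
def process_frame(frame, send_crops=False):
    # Return either cropped person regions or a simple count; never the full frame.
    detections = [d for d in detect_people(frame) if d.confidence > 0.5]
    if send_crops:
        return [frame[d.y:d.y + d.h, d.x:d.x + d.w] for d in detections]
    return {"people_count": len(detections)}   # metadata only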

Remote surveillance, with the devices running on battery power, is another application that’s getting a lot of interest, he says, along with manufacturing companies that mix robots and people looking to use image sensors to improve safety. Consumer gadgets that use the technology will come later, but he expected developers to begin experimenting with samples.

Learn the Latest Standards and Test Requirements for Automotive Radar

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/learn_the_latest_standards_and_test_requirements_for_automotive_radar

Automotive radar is a key enabler for autonomous driving technology, and it is one of the key areas for new modulation technologies. Unlike wireless communication technologies, automotive radar has no regulations or definitions for modulation type, signal pattern, and duty cycle.  

This webinar addresses which standards apply to radar measurements and how to measure different radar signals using Keysight’s automotive radar test solution.

Key learnings:

  • How to set the right parameters for different chirp rates and duty cycle radar signals for power measurements.  
  • Learn how to test the Rx interference of your automotive radar.  
  • Understand how to use Keysight’s automotive test solution to measure radar signals. 

Please contact [email protected] to request a PDH certificate code.

Volvo and Lidar-maker Luminar to Deliver Hands-free Driving by 2022

Post Syndicated from Lawrence Ulrich original https://spectrum.ieee.org/cars-that-think/sensors/automotive-sensors/volvo-and-lidarmaker-luminar-to-deliver-handsfree-driving-by-2022

The race to bring self-driving cars to showrooms may have a new leader: Volvo Cars said it will partner with Silicon Valley-based lidar manufacturer Luminar to deliver genuine hands-free, eyes-off-the-road highway driving beginning in 2022. That so-called “Level 3” capability would be an industry first, as companies from Tesla to Uber thus far have failed to meet lofty promises to make autonomous driving a reality. 

Sweden’s Volvo, owned by Geely Holdings of China, said that models based on its upcoming SPA2 platform (for “Scalable Product Architecture”) will be hardware-enabled for Luminar’s roof-mounted lidar system. That includes the upcoming Polestar 3 from Volvo’s new electric-car division, and a range of Volvo-branded cars and SUVs. Henrik Green, chief technology officer for Volvo, said the optional “Highway Pilot” system would allow full autonomous driving, but only “when the car determines it is safe to do so.” 

“At that point, your Volvo takes responsibility for the driving and you can relax, take your eyes off the road and your hands off the wheel,” Green said. “Over time, updates over-the-air will expand the areas in which the car can drive itself. For us, a safe introduction of autonomy is a gradual introduction.” 

Luminar’s lidar system scans a car’s surroundings in real time, firing millions of pulses of light to create a virtual 3D map without relying on GPS or a network connection. Most experts agree that lidar is a critical linchpin of any truly autonomous car. A high-profile skeptic is Elon Musk, who has no plans to employ lidar in his Teslas, and scoffs at the technology as redundant and unnecessary.

Austin Russell, founder and chief executive of Luminar, disagrees. 

“If cameras could do everything you can do with lidar, great. But if you really want to get in the autonomous game, this is a clear requirement.” 

The 25-year-old Russell said his company’s high-performance lidar sensors, operating at 1550 nanometers and 64 lines of resolution, can spot even small and low-reflective objects—black cars, animals, a child darting across a road—at a range beyond 200 meters, and up to 500 meters for larger, brighter objects. The high-signal, low-noise system can deliver camera-level vision at 200 meters even in rain, fog or snow, he said. That range, Russell said, is critical to give cars the data and time to solve edge cases and make confident decisions, even when hurtling at freeway speeds.  
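
To make the range claim concrete, the short calculation below works out both the pulse's round-trip time at 200 meters and how many seconds that range buys at highway speed; the speeds are illustrative assumptions, not Luminar figures.

C = 299_792_458.0            # speed of light, m/s
detection_range_m = 200.0
round_trip_s = 2 * detection_range_m / C
print(f"pulse round trip: {round_trip_s * 1e6:.2f} microseconds")   # ~1.33 us
for kmh in (100, 130):       # illustrative highway speeds
    closing_speed = kmh / 3.6
    print(f"at {kmh} km/h, an object 200 m ahead is {detection_range_m / closing_speed:.1f} s away")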

“Right now, you don’t have reliable interpretation,” with camera, GPS or radar systems, Russell said. “You can guess what’s ahead of you 99 percent of the time, but the problem is that last one percent. And a 99-percent pedestrian detection rate doesn’t cut it, not remotely close.” 

In a Zoom interview from Palo Alto, Russell held up two prototype versions, a roof-mounted unit roughly the size of a TV cable box, and a smaller unit that would mount on bumpers. The elegant systems incorporate a single laser and receiver, rather than the bulky, expensive, stacked arrays of lower-performance systems. Luminar built all its components and software in-house, Russell said, and is already on its eighth generation of ASIC chip controllers. 

The roof-mounted unit can deliver 120 degrees of forward vision, Russell said, plenty for Volvo’s application that combines lidar with cameras, radar, backup GPS and electrically controlled steering and braking. For future robotaxi applications with no human pilot aboard, cars would combine three to five lidar sensors to deliver full 360-degree vision. The unit will also integrate with Volvo’s Advanced Driver Assistance Systems (ADAS), improving such features as automated emergency braking, which global automakers have agreed to make standard in virtually all vehicles by 2022. 

The range and performance, Russell said, is key to solving the conundrums that have frustrated autonomous developers. 

The industry has hit a wall, unable to make the technical leap to Level 3, with cars so secure in their abilities that owners could perform tasks such as texting, reading or even napping. Last week, Audi officially abandoned its longstanding claim that its new A8 sedan, with its Traffic Jam Pilot system, would do just that. Musk is luring consumers to pay $7,000 up front for “Full Self-Driving” features that have yet to materialize, and that he claims would allow them to use their Teslas as money-making robotaxis. 

Current Level 2 systems, including Tesla’s Autopilot and Cadillac’s SuperCruise, often disengage when they can’t safely process their surroundings. Those disengagements, or even their possibility, demand that drivers continue to pay attention to the road at all times. And when systems do work effectively, drivers can fall out of the loop and be unable to quickly retake control. Russell acknowledged that those limitations leave current systems feeling like a parlor trick: If a driver must still pay full attention to the road, why not just keep driving the old-fashioned way? 

“Assuming perfect use, it’s fine as a novelty or convenience,” Russell said. “But it’s easy to get lulled into a sense of safety. Even when you’re really trying to maintain attention, to jump back into the game is very difficult.” 

Ideally, Russell said, a true Level 3 system would operate for years and hundreds or thousands of miles and never disengage without generous advance warning. 

“From our perspective, spontaneous handoffs that require human intervention are not safe,” he said. “It can’t be, ‘Take over in 500 milliseconds or I’m going to run into a wall.’” 

One revolution of Level 3, he said, is that drivers would actually begin to recover valuable time for productivity, rest or relaxation. The other is safety, and the elusive dream of sharply reducing roadway crashes that kill 1.35 million people a year around the world. 

Volvo cautioned that its Highway Pilot, whose price is up in the air, would initially roll out in small volumes, and steadily expand across its lineup. Volvo’s announcement included an agreement to possibly increase its minority stake in Luminar. But Luminar said it is also working with 12 of the world’s top 15 automakers in various stages of lidar development. And Russell said that, whichever manufacturer makes them, lidar-based safety and self-driving systems will eventually become an industry standard. 

“Driving adoption throughout the larger industry is the right move,” he said. “That’s how you save the most lives and create the most value.” 

Drones Use Radio Waves to Recharge Sensors While in Flight

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/sensors/remote-sensing/uavs-prove-usefuldelivering-remote-power-charging-services

Remote sensors play a valuable role in collecting data—but recharging these devices while they are scattered over vast and isolated areas can be tedious. A new system is designed to make the charging process easier by using unmanned aerial vehicles (UAVs) to deliver power via radio waves during a flyby. A specialized antenna on the sensor harvests the signals and converts them into electricity. The design is described in a study published 23 March in IEEE Sensors Letters.
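
For a rough feel for what such over-the-air charging can deliver, the sketch below applies the standard Friis free-space equation to estimate the power arriving at the sensor's harvesting antenna. All of the transmit power, antenna gains, frequency, and distance values are illustrative assumptions, not numbers from the study.

import math
def friis_received_power_w(p_tx_w, g_tx, g_rx, freq_hz, distance_m):
    # Friis free-space link budget: Pr = Pt * Gt * Gr * (wavelength / (4 * pi * d))^2
    wavelength = 299_792_458.0 / freq_hz
    return p_tx_w * g_tx * g_rx * (wavelength / (4 * math.pi * distance_m)) ** 2
# Illustrative numbers: a 1 W transmitter on the UAV at 915 MHz, modest antenna
# gains, hovering 2 m above the sensor.
p_rx = friis_received_power_w(p_tx_w=1.0, g_tx=4.0, g_rx=2.0, freq_hz=915e6, distance_m=2.0)
print(f"received power: {p_rx * 1e3:.2f} mW, before rectifier losses")   # ~1.4 mW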