All posts by Megan Scudellari

Printing Electronics Directly on Delicate Surfaces—Like the Back of Your Hand

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/devices/printing-electronics-directly-on-delicate-surfaces

Nick Williams didn’t ask permission. The graduate student just stuck his pinky finger under the printer and watched it paint two silver lines on his skin. When printing was complete, Williams put a small LED light at one end of the lines and applied voltage to the other. He smiled as the light glowed.

Williams showed the electronically active tattoo to his advisor, Duke University electrical engineer Aaron Franklin. Since Williams barely felt the printing, and the silver washed off with soap and water, they tried it again.

Flexible Wearable Reverses Baldness With Gentle Electric Pulses

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/devices/flexible-wearable-reverses-baldness-with-gentle-electric-pulses

Why waste the energy used to tilt one’s head or digest food? University of Wisconsin-Madison engineer Xudong Wang is an expert at harvesting the body’s mechanical energy to power devices, such as an electric bandage that accelerates healing and a stomach implant that subdues hunger.

Now, Wang’s team is back with a self-powered wearable to tackle an age-old nemesis: hair loss.

Wang’s lab has created a motion-activated, flexible wearable that promotes hair regeneration via gentle electrical stimulation. They describe their work in a study published this month in the journal ACS Nano. In rodents, the device stimulated hair growth better than conventional topical medications.

The device can be discreetly hidden under a baseball cap, says Wang. He hopes to begin a clinical trial with humans within six months.

Facebook Closer to Augmented Reality Glasses With Brain Implant That Decodes Dialogue From Neural Activity

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/devices/brain-implant-decodes-dialogue-from-neural-activity

The project is the first to decode question-and-answer speech from brain signals in real time

In 2017, Facebook’s Mark Chevillet gave himself two years to prove whether it was feasible to build a non-invasive technology able to read out 100 words per minute from brain activity.

It’s been two years, and the verdict is in: “The promise is there,” Chevillet told IEEE Spectrum. “We do think it will be possible.”

As research director of Facebook Reality Labs’ brain-computer interface program, Chevillet plans to push ahead with the project—and the company’s ultimate goal to develop augmented reality (AR) glasses that can be controlled without having to speak aloud.

Chevillet’s optimism is fueled in large part by a first in the field of brain-computer interfaces that hit the presses this morning: In the journal Nature Communications, a team at the University of California, San Francisco, in collaboration with Facebook Reality Labs, has built a brain-computer interface that accurately decodes dialogue—words and phrases both heard and spoken by the person wearing the device—from brain signals in real time.
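The study’s central trick lends itself to a short sketch: the system first decodes which question the participant heard, then uses that question as context to re-weight the answer decoder’s output. Below is a minimal illustration of that kind of Bayesian context integration; the vocabulary, probabilities, and function names are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Toy context-integration decoder. The idea: the decoded question supplies
# a prior over plausible answers, which is combined with the neural
# evidence via Bayes' rule. All values below are invented.

ANSWERS = ["cold", "hot", "fine", "bright", "dark"]

# P(answer | question): which answers plausibly follow each question.
CONTEXT_PRIORS = {
    "how is your room currently?": np.array([0.3, 0.3, 0.0, 0.2, 0.2]),
    "how do you feel right now?":  np.array([0.2, 0.2, 0.6, 0.0, 0.0]),
}

def decode_answer(question: str, likelihoods: np.ndarray) -> str:
    """Combine neural evidence P(signal | answer) with the question prior."""
    posterior = likelihoods * CONTEXT_PRIORS[question]  # unnormalized Bayes rule
    posterior /= posterior.sum()
    return ANSWERS[int(np.argmax(posterior))]

# The neural decoder alone slightly favors "hot", but the question context
# makes "fine" the most probable answer overall.
likelihoods = np.array([0.15, 0.30, 0.25, 0.15, 0.15])
print(decode_answer("how do you feel right now?", likelihoods))  # -> fine
```

Conditioning on the question narrows the space of plausible answers, which is why decoding the dialogue as a whole can outperform decoding answers in isolation.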

With This AI, 60 Percent of Patients Who Had Surgery Could Have Avoided It

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/more-than-60-of-patients-could-have-avoided-surgery-if-this-ai-had-been-their-doctor

Machine learning algorithms that combine clinical and molecular data are the “wave of the future,” experts say

A man walks into a doctor’s office for a CT scan of his gallbladder. The gallbladder is fine, but the doctor notices a saclike pocket of fluid on the man’s pancreas. It’s a cyst that may lead to cancer, the doctor tells him, so to be safe it should be cut out.

It’ll take three months to recover from the surgery, the doctor adds—plus, there’s a 50 percent chance of surgical complications, and a 5 percent chance the man will die on the table.

An estimated 800,000 patients in the United States are incidentally diagnosed with pancreatic cysts each year, and doctors have no good way of telling which cysts harbor a deadly form of cancer and which are benign. This ambiguity results in thousands of unnecessary surgeries: One study found that up to 78 percent of cysts referred for surgery turned out not to be cancerous.

Now there’s a machine learning algorithm that could help. In a study published today in the journal Science Translational Medicine, surgeons and computer scientists at Johns Hopkins University present a test called CompCyst (for comprehensive cyst analysis) that is significantly better than today’s standard of care—doctor observations and medical imaging—at predicting whether patients should be sent home, monitored, or referred for surgery.
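CompCyst itself rests on the team’s own statistical method and a curated panel of clinical, imaging, and molecular markers, but the general pattern is easy to sketch: a multiclass model over combined clinical and molecular features that outputs one of three management recommendations. Everything below (the feature set, the random training data, the choice of logistic regression) is illustrative, not the published pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative only: a multiclass model over combined clinical and
# molecular features. The training data here is randomly generated.
rng = np.random.default_rng(0)

X = np.hstack([
    rng.normal([65, 25], [10, 8], size=(300, 2)),  # clinical: age, cyst size (mm)
    rng.integers(0, 2, size=(300, 4)),             # molecular: mutation flags
])
y = rng.integers(0, 3, size=300)  # 0 = discharge, 1 = monitor, 2 = operate

clf = LogisticRegression(max_iter=1000).fit(X, y)

new_patient = [[58, 31, 1, 0, 1, 0]]  # hypothetical patient
print(["discharge", "monitor", "operate"][int(clf.predict(new_patient)[0])])
```

The payoff of the three-way output is the middle option: “monitor” gives the model a safe recommendation between sending a patient home and sending them to surgery.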

Smart Speaker Listens for Audible Signs of Cardiac Arrest

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/smart-speaker-listens-for-cardiac-arrest

This AI system detects unique gasping sounds that occur when the heart stops beating

When a person’s heart malfunctions and suddenly stops beating, death can occur within minutes—unless someone intervenes. A bystander administering CPR right away can triple a person’s chances of surviving a cardiac arrest.

Last July, we described a smart watch designed to detect cardiac arrest and summon help. Now, a team at the University of Washington has developed a totally contactless AI system that listens for the telltale sound of agonal breathing—a unique guttural gasping sound made by about 50 percent of cardiac arrest patients.

The smart speaker system, described today in the journal npj Digital Medicine, detected agonal breathing events 97 percent of the time with almost no false alarms in a proof-of-concept study.

The team imagines using the tool—which can run on Amazon’s Alexa or Google Home, among other devices—to passively monitor bedrooms for the sound of agonal breathing and, if detected, set off an alarm.
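That monitoring loop can be sketched in a few lines: classify short windows of audio, and require several consecutive positive windows before alarming, one simple way to suppress one-off false positives (the team’s actual filtering differs in detail). The classifier below is a meaningless stand-in; the UW team trained theirs on real agonal-breathing audio from 911 calls, and every window size and threshold here is invented.

```python
import numpy as np

# Illustrative sketch of a passive bedroom monitor. The classifier is a
# placeholder; a real system would run a trained neural model.
SAMPLE_RATE = 16_000
WINDOW_SEC = 2.5
CONSECUTIVE_REQUIRED = 2

def classify_window(audio: np.ndarray) -> float:
    """Placeholder for a trained model returning P(agonal breathing).
    A real system would compute a spectrogram and run a classifier."""
    return float(np.clip(np.abs(audio).mean() * 10, 0.0, 1.0))

def raise_alarm() -> None:
    print("Possible cardiac arrest: alerting emergency contacts.")

def monitor(stream, threshold: float = 0.9) -> None:
    """Alarm only after several consecutive positive windows, which
    suppresses one-off false positives from snoring, traffic, etc."""
    hits = 0
    for window in stream:  # each window holds WINDOW_SEC of samples
        if classify_window(window) >= threshold:
            hits += 1
            if hits >= CONSECUTIVE_REQUIRED:
                raise_alarm()
                hits = 0
        else:
            hits = 0

# Demo on synthetic audio: two loud windows in a row trigger the alarm.
quiet = np.zeros(int(SAMPLE_RATE * WINDOW_SEC))
loud = np.ones(int(SAMPLE_RATE * WINDOW_SEC))
monitor([quiet, loud, loud, quiet])
```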

Scanning Your Eyes for Alzheimer’s

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/reliable-portable-tech-for-alzheimers-diagnosis

Gates/Bezos-funded charity champions research into methods for early detection of Alzheimer’s disease

The Alzheimer’s Drug Discovery Foundation (ADDF), a public charity backed by Bill Gates, Jeff and MacKenzie Bezos, and former Estée Lauder CEO Leonard Lauder, just announced the first award recipients of its $50 million Diagnostics Accelerator research program.

The four recipients, chosen from a pool of 300 applicants across 30 countries, are developing reliable, cost-effective ways to diagnose Alzheimer’s disease, including one approach that will use machine learning to detect early signs of the disease through an eye scan.

DARPA Funds Ambitious Brain-Machine Interface Program

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/bionics/darpa-funds-ambitious-neurotech-program

The N3 program aims to develop wearable devices that let soldiers communicate directly with machines

DARPA’s Next-Generation Nonsurgical Neurotechnology (N3) program has awarded funding to six groups attempting to build brain-machine interfaces that match the performance of implanted electrodes but with no surgery whatsoever.

By simply popping on a helmet or headset, soldiers could conceivably command control centers without touching a keyboard; fly drones intuitively with a thought; even feel intrusions into a secure network. While the tech sounds futuristic, DARPA wants to get it done in four years.

“It’s an aggressive timeline,” says Krishnan Thyagarajan, a research scientist at PARC and principal investigator of one of the N3-funded projects. “But I think the idea of any such program is to really challenge the community to push the limits and accelerate things which are already brewing. Yes, it’s challenging, but it’s not impossible.”

The N3 program fits right into DARPA’s high-risk, high-reward biomedical tech portfolio, including programs in electric medicine, brain implants and electrical brain training. And the U.S. defense R&D agency is throwing big money at the program: Though a DARPA spokesperson declined to comment on the amount of funding, two of the winning teams are reporting eye-popping grants of $19.48 million and $18 million.

Plenty of noninvasive neurotechnologies already exist, but not at the resolution necessary to yield high-performance wearable devices for national security applications, says N3 program manager Al Emondi of DARPA’s Biological Technologies Office.

Following a call for applications back in March, a review panel narrowed the pool to six teams across industry and academia, Emondi told IEEE Spectrum. The teams are experimenting with different combinations of magnetic fields, electric fields, acoustic fields (ultrasound) and light. “You can combine all these approaches in different, unique and novel ways,” says Emondi. What the program hopes to discover, he adds, is which combinations can record brain activity and communicate back to the brain with the greatest speed and resolution.

Specifically, the program is seeking technologies that can read and write to brain cells in just 50 milliseconds round-trip, and can interact with at least 16 locations in the brain at a resolution of 1 cubic millimeter (a space that encompasses thousands of neurons).
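Restated as a simple spec check, just to make the targets concrete (the two candidate devices below are invented; only the three thresholds come from the program description):

```python
from dataclasses import dataclass

@dataclass
class InterfaceSpec:
    round_trip_ms: float   # read + write latency, in milliseconds
    sites: int             # simultaneously addressable brain locations
    resolution_mm3: float  # volume each site resolves, in cubic millimeters

    def meets_n3_targets(self) -> bool:
        # The targets stated above: 50 ms round trip, >= 16 sites, 1 mm^3.
        return (self.round_trip_ms <= 50
                and self.sites >= 16
                and self.resolution_mm3 <= 1.0)

print(InterfaceSpec(40, 16, 1.0).meets_n3_targets())  # True: hits all targets
print(InterfaceSpec(40, 16, 2.0).meets_n3_targets())  # False: too coarse
```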

The four-year N3 program will consist of three phases, says Emondi. In the current phase 1, teams have one year to demonstrate the ability to read (record) and write to (stimulate) brain tissue through the skull. Teams that succeed will move to phase 2. Over the ensuing 18 months, those groups will have to develop working devices and test them on living animals. Any group left standing will proceed to phase 3—testing their device on humans.

Four of the teams are developing totally noninvasive technologies. A team from Carnegie Mellon University, for example, is planning to use ultrasound waves to guide light into and out of the brain to detect neural activity. They plan to use interfering electrical fields to write to specific neurons.

The three other teams pursuing noninvasive techniques are Johns Hopkins University’s Applied Physics Laboratory, Thyagarajan’s team at PARC, and a team from Teledyne Technologies, a California-based industrial company.

The two remaining teams are developing what DARPA calls “minutely invasive” technologies which, as we described in September, require no incisions or surgery but may involve technology that is swallowed, sniffed, injected or absorbed into the human body in some way.

Rice University, for example, is developing a system that requires exposing neurons to a viral vector to deliver instructions for synthetic proteins that indicate when a neuron is active. Ohio-based technology company Battelle is developing a brain-machine interface that relies on magnetoelectric nanoparticles injected into the brain.

“This is uncharted territory for DARPA, and the next step in brain-machine interfaces,” says Emondi. “If we’re successful in some of these technologies…that’s a whole new ecosystem that doesn’t exist right now.”

KidneyX Prize Winners Rethink Dialysis

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/devices/kidneyx-prize-winners-redesign-dialysis

Artificial kidneys and a wearable that prevents blood clots were among the winning designs

In the 1960s, a pacemaker was the size of a microwave and a dialysis machine was the size of a refrigerator. Today, a pacemaker is the size of a vitamin and a dialysis machine is, well, still the size of a refrigerator. They do have nice LED displays, though.

“Kidney disease is underserved,” says Shuvo Roy, a bioengineer at the University of California, San Francisco. Although failed kidneys kill more people each year than breast or prostate cancer, “the field has not seen much innovation in the last 50 years,” says Roy.

The U.S. Department of Health and Human Services and the American Society of Nephrology want to change that. The organizations have teamed up to bring money and attention to the disease with the KidneyX: Redesign Dialysis competition. Started in 2018, the competition challenges innovators in any field to propose tools and technologies that could enable the design of new artificial kidney devices.

Brain Implant Can Say What You’re Thinking

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/devices/implant-translates-brain-activity-into-spoken-sentences

A brain-computer interface that records signals from the motor cortex can synthesize speech directly from a user’s brain activity

Two years ago, a 64-year-old man paralyzed by a spinal cord injury set a record when he used a brain-computer interface (BCI) to type at a speed of eight words per minute.

Today, in the journal Nature, scientists at the University of California, San Francisco, present a new type of BCI, powered by neural networks, that might enable individuals with paralysis or stroke to communicate at the speed of natural speech—an average of 150 words per minute.
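The paper describes a two-stage decoder: recurrent networks first map cortical activity to estimated movements of the vocal tract, and a second stage maps those movements to acoustic features that a vocoder renders as audible speech. Here is a minimal PyTorch sketch of that shape; the layer sizes and feature counts are placeholders, not the published architecture.

```python
import torch
from torch import nn

class NeuralToSpeech(nn.Module):
    """Two-stage decoder sketch: neural activity -> articulatory
    kinematics -> acoustic features. Dimensions are placeholders."""

    def __init__(self, n_electrodes=256, n_articulatory=33, n_acoustic=32):
        super().__init__()
        # Stage 1: cortical recordings -> vocal-tract movements
        # (lips, jaw, tongue, larynx trajectories).
        self.stage1 = nn.LSTM(n_electrodes, 128, batch_first=True,
                              bidirectional=True)
        self.to_artic = nn.Linear(256, n_articulatory)  # 256 = 2 * 128
        # Stage 2: movements -> acoustic features for a vocoder.
        self.stage2 = nn.LSTM(n_articulatory, 128, batch_first=True,
                              bidirectional=True)
        self.to_acoustic = nn.Linear(256, n_acoustic)

    def forward(self, ecog):                  # ecog: (batch, time, electrodes)
        h, _ = self.stage1(ecog)
        artic = self.to_artic(h)              # articulatory intermediate
        h, _ = self.stage2(artic)
        return self.to_acoustic(h)            # features a vocoder turns to audio

model = NeuralToSpeech()
fake_ecog = torch.randn(1, 100, 256)          # 1 trial, 100 time steps
print(model(fake_ecog).shape)                 # torch.Size([1, 100, 32])
```

The articulatory intermediate is the design’s key choice: the motor cortex encodes movements rather than sounds, so decoding kinematics first gives the network a representation closer to what the recorded neurons actually control.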