Tag Archives: Biomedical/Diagnostics

Fujifilm SonoSite Wants to Bring AI to Ultrasound

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/fujifilm-sonosite-wants-to-bring-ai-to-ultrasound

Have you ever needed an IV and had to undergo multiple pricks before the nurse could find a vein? Technology to avoid that painful trial and error is in the works. Fujifilm’s ultrasound diagnostics arm SonoSite announced yesterday that it had partnered with a startup company to develop artificial intelligence that can interpret ultrasound images on a mobile phone.

The companies say the first target for their AI-enabled ultrasound will be finding veins for IV (intravenous) needle insertion. The technology would enable technicians to hold a simple ultrasound wand over the skin while software on a connected mobile device locates the vein for them.

For this project, Fujifilm SonoSite tapped the Allen Institute for Artificial Intelligence (AI2), which has an incubator for AI startup companies. “Not only do we have to come up with a very accurate model to analyze the ultrasound videos, but on top of that, we have to make sure the model is working effectively on the limited resources of an Android tablet or phone,” says Vu Ha, technical director of the AI2 Incubator.

In an interview with IEEE Spectrum, Ha did not disclose the name of the startup that will be taking on the task, saying the fledgling company is still in “stealth mode.”

Ha says the AI2 startup will take on the project in two stages: First, it will train a model on ultrasound images without any resource constraints, with the goal of making it as accurate as possible. Then, the startup will run a sequence of experiments to simplify the model, reducing the number of hidden layers and trimming and compressing the network until it is small enough to run on a mobile phone.

The trick will be to shrink the model without sacrificing too much accuracy, Ha says.
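Neither company has published details of the compression pipeline, but the recipe Ha describes (train an unconstrained model first, then prune and compress it for mobile hardware) is a standard one. Here is a minimal sketch of that second stage in PyTorch; the VeinNet architecture is a hypothetical stand-in for the real ultrasound model, which has not been disclosed.

# Minimal sketch of the train-big-then-shrink recipe Ha describes.
# The network and layer sizes are hypothetical stand-ins; the actual
# model and compression pipeline have not been disclosed.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

class VeinNet(nn.Module):
    """Toy stand-in for an ultrasound vein-detection model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # vein / no vein

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = VeinNet()  # in practice: the fully trained, unconstrained model

# Stage 2a: prune half of the smallest-magnitude conv weights.
for module in model.features:
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the pruning permanent

# Stage 2b: dynamically quantize the linear layer to 8-bit weights,
# the kind of step that shrinks the model for mobile inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Sanity check on a dummy single-channel 128x128 ultrasound frame.
frame = torch.randn(1, 1, 128, 128)
print(quantized(frame))

In a real pipeline, each pruning round would be followed by fine-tuning and a fresh accuracy measurement; that loop is where the trade-off Ha describes between model size and accuracy gets negotiated.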

If successful, the device could help clinicians reduce the number of unsuccessful attempts at finding a vein, and enable less trained technicians to start IVs as well. Hospitals that do a large volume of IVs often have highly trained staff capable of eyeballing ultrasound videos and using those images to help them find small blood vessels. But the number of these highly trained clinicians is very small, says Ha.

“My hope is that with this technology, a less trained person will be able to find veins more reliably” using ultrasound, he says. That could broaden the availability of portable ultrasound to rural and resource-poor areas. 

SonoSite and AI2 are home to two of the many groups of researchers putting AI to work on medical imaging and diagnostics. The U.S. Food and Drug Administration (FDA) has approved for commercial use a deep learning algorithm that analyzes MRI images of the heart, an AI system that looks for signs of diabetic retinopathy in images of the retina, an algorithm that analyzes X-ray images for signs of wrist fracture, and software that looks for indicators of stroke in CT images of the brain, to name a few.

Notably, the FDA in 2017 also approved for commercial use smartphone-based ultrasound technology made by Butterfly. The device, which costs less than $2000, can be used to take sonograms for 13 different clinical applications, including blood vessels. Butterfly has announced publicly that it is developing deep learning–based AI that will assist clinicians with image interpretation. But the company has not yet commercially launched the technology. 

At least four other portable or mobile device–based ultrasound technologies have been approved by the FDA, including Fujifilm SonoSite’s own and the Lumify from Philips.

But the adoption of these devices has been relatively slow. As Eric Topol, director of the Scripps Research Translational Institute, told Spectrum recently, the smartphone ultrasound is a “brilliant engineering advance” that’s “hardly used at all” in the health care system. Complex challenges such as reimbursement, training, and the old habits of clinicians often hinder the uptake of new gadgets, despite engineers’ best efforts. 

With This AI, 60 Percent of Patients Who Had Surgery Could Have Avoided It

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/more-than-60-of-patients-could-have-avoided-surgery-if-this-ai-had-been-their-doctor

Machine learning algorithms that combine clinical and molecular data are the “wave of the future,” experts say

A man walks into a doctor’s office for a CT scan of his gallbladder. The gallbladder is fine, but the doctor notices a saclike pocket of fluid on the man’s pancreas. It’s a cyst that may lead to cancer, the doctor tells him, so it will need to be cut out to be safe.

It’ll take three months to recover from the surgery, the doctor adds—plus, there’s a 50 percent chance of surgical complications, and a 5 percent chance the man will die on the table.

An estimated 800,000 patients in the United States are incidentally diagnosed with pancreatic cysts each year, and doctors have no good way of telling which cysts harbor a deadly form of cancer and which are benign. This ambiguity results in thousands of unnecessary surgeries: One study found that up to 78 percent of cysts for which a patient was referred to surgery turned out not to be cancerous.

Now there’s a machine learning algorithm that could help. As described today in the journal Science Translational Medicine, surgeons and computer scientists at Johns Hopkins University have built a test called CompCyst (for comprehensive cyst analysis) that is significantly better than today’s standard of care (doctor observations and medical imaging) at predicting whether patients should be sent home, monitored, or sent to surgery.
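The paper describes CompCyst’s actual methods; purely as an illustration of what a three-way triage model over combined clinical and molecular inputs looks like, here is a sketch in Python. The features, labels, and training data are invented for the example and are not taken from the study.

# Illustrative sketch of a three-way triage classifier in the spirit
# of CompCyst. Features, data, and model choice are hypothetical; the
# study's actual methods are described in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical inputs: clinical measurements plus molecular markers.
X = np.column_stack([
    rng.normal(60, 12, n),        # patient age
    rng.normal(25, 10, n),        # cyst diameter, mm
    rng.integers(0, 2, (n, 4)),   # presence of four mutation markers
])
# Labels: 0 = send home, 1 = monitor, 2 = operate.
y = rng.integers(0, 3, n)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Per-class probabilities, rather than a single verdict, let
# clinicians weigh borderline cases.
print(clf.predict_proba(X_test[:3]))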

Smart Speaker Listens for Audible Signs of Cardiac Arrest

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/smart-speaker-listens-for-cardiac-arrest

This AI system detects unique gasping sounds that occur when the heart stops beating

When a person’s heart malfunctions and suddenly stops beating, death can occur within minutes—unless someone intervenes. A bystander administering CPR right away can triple a person’s chances of surviving a cardiac arrest.

Last July, we described a smart watch designed to detect cardiac arrest and summon help. Now, a team at the University of Washington has developed a totally contactless AI system that listens to detect the telltale sound of agonal breathing—a unique guttural gasping sound made by 50 percent of cardiac arrest patients.

The smart speaker system, described today in the journal npj Digital Medicine, detected agonal breathing events 97 percent of the time with almost no false alarms in a proof-of-concept study.

The team imagines using the tool—which can run on Amazon’s Alexa or Google Home, among other devices—to passively monitor bedrooms for the sound of agonal breathing and, if detected, set off an alarm.
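The paper describes the team’s actual classifier, which was trained on real emergency-call audio. As a rough sketch of the loop such a monitor runs (chunk the audio stream, extract features, classify, alarm on repeated positives), consider the following Python outline; the placeholder model here is not the team’s.

# Minimal sketch of a passive agonal-breathing monitor: chunk the
# audio stream, extract spectrogram features, classify, and alarm on
# repeated positives. The classifier is a hypothetical placeholder.
import numpy as np
from scipy.signal import spectrogram

SAMPLE_RATE = 16_000
CHUNK_SECONDS = 2.5

def features(chunk):
    """Log-spectrogram features for one audio chunk."""
    _, _, spec = spectrogram(chunk, fs=SAMPLE_RATE, nperseg=512)
    return np.log(spec + 1e-10).mean(axis=1)  # average over time

def is_agonal(chunk, model):
    return model.predict(features(chunk)[None, :])[0] == 1

def monitor(audio_stream, model, needed=2):
    """Alarm after `needed` consecutive positive chunks, which
    suppresses one-off false positives."""
    consecutive = 0
    for chunk in audio_stream:
        consecutive = consecutive + 1 if is_agonal(chunk, model) else 0
        if consecutive >= needed:
            print("ALARM: possible cardiac arrest, summoning help")
            consecutive = 0

class DummyModel:
    """Stand-in classifier; always predicts 'no agonal breathing'."""
    def predict(self, X):
        return np.zeros(len(X), dtype=int)

# Dry run on a simulated silent audio stream.
stream = (np.zeros(int(SAMPLE_RATE * CHUNK_SECONDS)) for _ in range(4))
monitor(stream, DummyModel())

Requiring consecutive positives before alarming is one plausible way to get the near-zero false-alarm rate the study reports, though the team’s actual thresholding scheme is detailed in the paper.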

Laser Destroys Cancer Cells Circulating in the Blood

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/laser-destroys-cancer-cells-circulating-in-the-blood

The first study of a new treatment in humans demonstrates a noninvasive, harmless cancer killer

Tumor cells that spread cancer via the bloodstream face a new foe: a laser beam, shined from outside the skin, that finds and kills these metastatic little demons on the spot.

In a study published today in Science Translational Medicine, researchers revealed that their system accurately detected these cells in 27 out of 28 people with cancer, with a sensitivity that is about 1,000 times better than current technology. That’s an achievement in itself, but the research team was also able to kill a high percentage of the cancer-spreading cells, in real time, as they raced through the veins of the participants.

If developed further, the tool could give doctors a harmless, noninvasive, and thorough way to hunt and destroy such cells before those cells can form new tumors in the body. “This technology has the potential to significantly inhibit metastasis progression,” says Vladimir Zharov, director of the nanomedicine center at the University of Arkansas for Medical Sciences, who led the research. 

Scanning Your Eyes for Alzheimer’s

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/reliable-portable-tech-for-alzheimers-diagnosis

Gates/Bezos-funded charity champions research into methods for early detection of Alzheimer’s disease

The Alzheimer’s Drug Discovery Foundation (ADDF), a public charity co-founded by Bill Gates, Jeff and MacKenzie Bezos, and former Estée Lauder CEO Leonard Lauder, just announced the first award recipients of its $50 million Diagnostics Accelerator research program.

The four recipients, chosen from a pool of 300 applicants across 30 countries, are developing reliable, cost-effective ways to diagnose Alzheimer’s disease, including one approach that will use machine learning to detect early signs of the disease through an eye scan.