Tag Archives: Biomedical/Diagnostics

New AI System Predicts Seizures With Near-Perfect Accuracy

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/this-new-ai-system-can-predict-seizures-with-nearperfect-accuracy

For the roughly 50 million people worldwide with epilepsy, the exchange of electrical signals between cells in their brain can sometimes go haywire and cause a seizure—often with little to no warning. Two researchers at the University of Louisiana at Lafayette have developed a new AI-powered model that can predict the occurrence of seizures up to one hour before onset with 99.6 percent accuracy.

“Due to unexpected seizure times, epilepsy has a strong psychological and social effect on patients,” explains Hisham Daoud, a researcher who co-developed the new model.

Detecting seizures ahead of time could greatly improve the quality of life for patients with epilepsy and provide them with enough time to take action, he says. Notably, seizures are controllable with medication in up to 70 percent of these patients.

Can Big Data Help Prevent Alzheimer’s Disease?

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/can-big-data-solve-alzheimers

During the Framingham Heart Study, a long-term research study initiated in 1948 that collected health data from thousands of people, researchers discovered that high cholesterol and elevated blood pressure increase one’s risk of heart disease. Thanks to that insight, at-risk individuals can reduce their chances of developing the condition by taking drugs to lower cholesterol and blood pressure.

Could the same be done for Alzheimer’s disease, a notoriously opaque and complex progressive brain disease?

Online Tool Spotted Vaping Illness Before Gov’t Alerts on e-Cigarette Lung Disease

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/online-tool-spotted-vaping-illness-before-government-alerts

Web-scouring algorithms are aiding the surveillance of a deadly vaping-related lung disease. The online tool, called HealthMap, first spotted the disease on 25 July, according to its curators. That’s nearly a month before U.S. federal officials announced an investigation into the e-cigarette–related illness.  

Since then, HealthMap’s case counts have lined up closely with those of the feds at the U.S. Centers for Disease Control and Prevention (CDC). In its most recent update, based on data collected through 8 October, the agency reported 1,299 confirmed and probable cases of the lung illness; HealthMap counted 1,305 through the same date. 

The accuracy of HealthMap suggests that such web-based tools are a viable addition to traditional surveillance methods. “We see it not as a replacement [to traditional warning systems], but as a supplement,” says Yulin Hswen, a research fellow at Boston Children’s Hospital, Harvard Medical School. “It gives you a more comprehensive picture of everything that’s going on, and in real time,” she says.

App Detects Eye Disease in Personal Photos

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/app-detects-eye-disease-in-personal-photos

A smartphone app that monitors personal photos can spot eye diseases more than a year before doctors do, according to a new report published today in the journal Science Advances.

Using machine learning, the app searches casual portraits for signs of leukocoria: the appearance of a white reflection in the pupil of the eye. Leukocoria, or “white eye,” looks similar to red eye—that creepy red reflection in the eye that often appears with flash photography. But a red reflection is actually a sign of a healthy eye. A white reflection can be a sign of a problem.   

White eye can indicate retinoblastoma, a type of childhood cancer of the retina, or a handful of other eye disorders, including retinopathy of prematurity, cataracts, and Coats disease. Catching these disorders early can save an eye, or a life. 
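The distinction the app exploits can be illustrated with a toy color heuristic: a red reflection is dominated by the red channel, while leukocoria reflects brightly across all channels. The sketch below (with made-up thresholds) is only an illustration; the actual app relies on a trained machine-learning model, not a hand-written rule.

```python
import numpy as np

def classify_pupil(patch):
    """Toy heuristic for white eye vs. red eye in a pupil image patch.

    patch: H x W x 3 array of RGB values in [0, 1].
    Thresholds are illustrative assumptions, not the app's model.
    """
    r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
    brightness = (r + g + b) / 3.0
    # Leukocoria: bright overall, and green/blue are not far below red
    if brightness > 0.6 and min(g, b) > 0.5 * r:
        return "leukocoria-like"
    # Red eye: red channel strongly dominates
    if r > 2.0 * max(g, b):
        return "red-eye-like"
    return "normal"

white = np.full((8, 8, 3), 0.9)            # bright, balanced reflection
red = np.stack([np.full((8, 8), 0.8),      # strong red channel only
                np.full((8, 8), 0.1),
                np.full((8, 8), 0.1)], axis=-1)
print(classify_pupil(white))  # → leukocoria-like
print(classify_pupil(red))    # → red-eye-like
```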

“With retinoblastoma, every month counts,” says Bryan Shaw, an associate professor at Baylor University in Waco, Texas. “Tumors grow rapidly and when you start seeing the white eye, you have about six months to a year before the tumor starts to break up and metastasize down the optic nerve to the brain and kills you.”

Fujifilm SonoSite Wants to Bring AI to Ultrasound

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/fujifilm-sonosite-wants-to-bring-ai-to-ultrasound

Have you ever needed an IV and had to undergo multiple pricks before the nurse could find a vein? Technology to avoid that painful trial and error is in the works. Fujifilm’s ultrasound diagnostics arm SonoSite announced yesterday that it had partnered with a startup company to develop artificial intelligence that can interpret ultrasound images on a mobile phone.

The companies say the first target for their AI-enabled ultrasound will be finding veins for IV (intravenous) needle insertion. The technology would enable technicians to hold a simple ultrasound wand over the skin while software on a connected mobile device locates the vein for them.

For this project, Fujifilm SonoSite tapped the Allen Institute for Artificial Intelligence (AI2), which has an incubator for AI startup companies. “Not only do we have to come up with a very accurate model to analyze the ultrasound videos, but on top of that, we have to make sure the model is working effectively on the limited resources of an Android tablet or phone,” says Vu Ha, technical director of the AI2 Incubator.

In an interview with IEEE Spectrum, Ha did not disclose the name of the startup that will be taking on the task, saying the fledgling company is still in “stealth mode.”

Ha says the AI2 startup will take on the project in two stages: First, it’ll train a model on ultrasound images without any resource constraints, with the purpose of making it as accurate as possible. Then, the startup will go through a sequence of experiments to simplify the model by reducing the number of hidden layers in the network, and by trimming and compressing the network until it is simple enough to operate on a mobile phone. 

The trick will be to shrink the model without sacrificing too much accuracy, Ha says.
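The kind of trimming Ha describes can be illustrated with magnitude pruning, one common compression technique in which the smallest-magnitude weights are zeroed out. The sketch below is an assumption about the general approach, not the startup’s actual pipeline, which would also fine-tune the network between pruning rounds.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of a weight matrix.

    A crude stand-in for the pruning step described above; the
    resulting sparse layer needs less compute and storage on-device.
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))          # one dense layer's weights
w_small = magnitude_prune(w, 0.9)      # keep only the largest 10 percent
print(np.mean(w_small == 0.0))         # roughly 0.9 of weights zeroed
```

In practice the trade-off Ha mentions shows up here directly: the higher the sparsity, the smaller and faster the model, and the more accuracy is at risk.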

If successful, the device could help clinicians reduce the number of unsuccessful attempts at finding a vein, and enable less trained technicians to start IVs as well. Hospitals that do a large volume of IVs often have highly trained staff capable of eyeballing ultrasound videos and using those images to help them to find small blood vessels. But the number of these highly trained clinicians is very small, says Ha.

“My hope is that with this technology, a less trained person will be able to find veins more reliably” using ultrasound, he says. That could broaden the availability of portable ultrasound to rural and resource-poor areas. 

SonoSite and AI2 are home to two of the many research groups putting AI to work on medical imaging and diagnostics. The U.S. Food and Drug Administration (FDA) has approved for commercial use a deep learning algorithm to analyze MRI images of the heart, an AI system that looks for signs of diabetic retinopathy in images of the retina, an algorithm that analyzes X-ray images for signs of wrist fracture, and software that looks for indicators of stroke in CT images of the brain, to name a few.

Notably, the FDA in 2017 also approved for commercial use smartphone-based ultrasound technology made by Butterfly. The device, which costs less than $2000, can be used to take sonograms for 13 different clinical applications, including blood vessels. Butterfly has announced publicly that it is developing deep learning–based AI that will assist clinicians with image interpretation. But the company has not yet commercially launched the technology. 

At least four other portable or mobile device–based ultrasound technologies have been approved by the FDA, including that of Fujifilm SonoSite, and the Lumify from Philips.

But the adoption of these devices has been relatively slow. As Eric Topol, director of the Scripps Research Translational Institute, told Spectrum recently, the smartphone ultrasound is a “brilliant engineering advance” that’s “hardly used at all” in the health care system. Complex challenges such as reimbursement, training, and the old habits of clinicians often hinder the uptake of new gadgets, despite engineers’ best efforts. 

With This AI, 60 Percent of Patients Who Had Surgery Could Have Avoided It

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/more-than-60-of-patients-could-have-avoided-surgery-if-this-ai-had-been-their-doctor

Machine learning algorithms that combine clinical and molecular data are the “wave of the future,” experts say

A man walks into a doctor’s office for a CT scan of his gallbladder. The gallbladder is fine but the doctor notices a saclike pocket of fluid on the man’s pancreas. It’s a cyst that may lead to cancer, the doctor tells him, so I’ll need to cut it out to be safe.

It’ll take three months to recover from the surgery, the doctor adds—plus, there’s a 50 percent chance of surgical complications, and a 5 percent chance the man will die on the table.

An estimated 800,000 patients in the United States are incidentally diagnosed with pancreatic cysts each year, and doctors have no good way of telling which cysts harbor a deadly form of cancer and which are benign. This ambiguity results in thousands of unnecessary surgeries: One study found that up to 78 percent of cysts for which a patient was referred to surgery ended up being not cancerous.

Now there’s a machine learning algorithm that could help. In a study published today in the journal Science Translational Medicine, surgeons and computer scientists at Johns Hopkins University describe a test called CompCyst (for comprehensive cyst analysis) that is significantly better than today’s standard of care—a.k.a. doctor observations and medical imaging—at predicting whether patients should be sent home, monitored, or undergo surgery.
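To make the three-way decision concrete, here is a schematic triage in Python: a score combining clinical and molecular features is mapped onto discharge, monitor, or surgery. The feature names, weights, and thresholds are illustrative assumptions, not the published CompCyst model.

```python
import numpy as np

LABELS = ["discharge", "monitor", "surgery"]

def triage(features, weights, thresholds=(0.33, 0.66)):
    """Map a combined clinical/molecular risk score onto three actions.

    Weights and cutoffs here are hypothetical; the real test learns
    them from hundreds of patients with known outcomes.
    """
    score = 1.0 / (1.0 + np.exp(-np.dot(weights, features)))  # logistic risk score
    if score < thresholds[0]:
        return LABELS[0]
    if score < thresholds[1]:
        return LABELS[1]
    return LABELS[2]

# Hypothetical features: cyst size, mutation flag, protein marker level
weights = np.array([0.8, 1.2, 1.5])
low_risk = np.array([-1.0, 0.0, -1.0])
high_risk = np.array([1.0, 1.0, 1.0])
print(triage(low_risk, weights))   # → discharge
print(triage(high_risk, weights))  # → surgery
```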

Smart Speaker Listens for Audible Signs of Cardiac Arrest

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/smart-speaker-listens-for-cardiac-arrest

This AI system detects unique gasping sounds that occur when the heart stops beating

When a person’s heart malfunctions and suddenly stops beating, death can occur within minutes—unless someone intervenes. A bystander administering CPR right away can triple a person’s chances of surviving a cardiac arrest.

Last July, we described a smart watch designed to detect cardiac arrest and summon help. Now, a team at the University of Washington has developed a totally contactless AI system that listens to detect the telltale sound of agonal breathing—a unique guttural gasping sound made by 50 percent of cardiac arrest patients.

The smart speaker system, described today in the journal npj Digital Medicine, detected agonal breathing events 97 percent of the time with almost no false alarms in a proof-of-concept study.

The team imagines using the tool—which can run on Amazon’s Alexa or Google Home, among other devices—to passively monitor bedrooms for the sound of agonal breathing and, if detected, set off an alarm.
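The alarm logic of such a passive monitor can be sketched simply: a classifier scores each short audio window, and the alarm fires only after several consecutive high-confidence detections, which helps keep false alarms rare. The threshold and window count below are assumptions for illustration, not the published configuration.

```python
def alarm_trigger(scores, threshold=0.9, consecutive=2):
    """Return the index of the audio window that triggers the alarm.

    scores: per-window classifier confidences in [0, 1] that the
    window contains agonal breathing. Requiring `consecutive`
    high-confidence windows in a row suppresses one-off false positives.
    """
    streak = 0
    for i, s in enumerate(scores):
        streak = streak + 1 if s >= threshold else 0
        if streak >= consecutive:
            return i
    return None  # no alarm

print(alarm_trigger([0.2, 0.95, 0.3, 0.97, 0.99]))  # → 4
print(alarm_trigger([0.5, 0.5, 0.5]))               # → None
```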

Laser Destroys Cancer Cells Circulating in the Blood

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/laser-destroys-cancer-cells-circulating-in-the-blood

The first study of a new treatment in humans demonstrates a noninvasive, harmless cancer killer

Tumor cells that spread cancer via the bloodstream face a new foe: a laser beam, shined from outside the skin, that finds and kills these metastatic little demons on the spot.

In a study published today in Science Translational Medicine, researchers revealed that their system accurately detected these cells in 27 out of 28 people with cancer, with a sensitivity that is about 1,000 times better than current technology. That’s an achievement in itself, but the research team was also able to kill a high percentage of the cancer-spreading cells, in real time, as they raced through the veins of the participants.

If developed further, the tool could give doctors a harmless, noninvasive, and thorough way to hunt and destroy such cells before those cells can form new tumors in the body. “This technology has the potential to significantly inhibit metastasis progression,” says Vladimir Zharov, director of the nanomedicine center at the University of Arkansas for Medical Sciences, who led the research. 

Scanning Your Eyes for Alzheimer’s

Post Syndicated from Megan Scudellari original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/reliable-portable-tech-for-alzheimers-diagnosis

Gates/Bezos-funded charity champions research into methods for early detection of Alzheimer’s disease

The Alzheimer’s Drug Discovery Foundation (ADDF), a public charity co-founded by Bill Gates, Jeff and MacKenzie Bezos, and former Estée Lauder CEO Leonard Lauder, just announced the first award recipients of its $50 million Diagnostics Accelerator research program.

The four recipients, chosen from a pool of 300 applicants across 30 countries, are developing reliable, cost-effective ways to diagnose Alzheimer’s disease, including one that will use machine learning to detect early signs of the disease through an eye scan.