All posts by Emily Waltz

What the Media Missed About Elon Musk’s $150 Million Augmented Brain Project

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/devices/elon-musks-150-million-augmented-brain-project-what-the-media-missed

Elon Musk’s company Neuralink last week announced a plan to sync our brains with artificial intelligence. Here’s what news outlets overlooked.

Elon Musk is known for trumpeting bold and sometimes brash plans. So it was no surprise last week when the Tesla founder made an announcement—in front of a live audience and streamed online, with a video trailer and thematic music—that his new company Neuralink plans to sync our brains with artificial intelligence. (Don’t worry, he assured the audience, “this is not a mandatory thing.”)

What was surprising was the breathless coverage in most media, which lacked context or appreciation for the two decades of research on which Neuralink’s work stands. 

Mind Meld: Tetris Players Electronically Connect Their Brains

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/bionics/video-game-players-electronically-connect-their-brains

Humans collaborate using brain-to-brain communication to play video game

Have you ever felt a strong emotion, such as elation from taking in a scenic view, and wanted to share it with the people around you? Not “share” as in tell them about it or post it on social media, but actually share it—like beam the feeling from your brain into theirs?  

Researchers at the University of Washington in Seattle say they would like to give humans that kind of brain-to-brain interaction, and have demonstrated a baby step toward that goal. In a set of experiments described in the journal Scientific Reports, the researchers enabled small groups of people to communicate collaboratively using only their minds.

In the experiments, participants played a Tetris-like video game. They worked in groups of three to decide whether to rotate a digital shape as it fell toward rows of blocks at the bottom of the screen. 

The participants could not see, hear, or communicate with each other in any way other than through thinking. Their thoughts—electrical signals in the brain—were read using electroencephalography (EEG) and delivered using transcranial magnetic stimulation (TMS).

The messages sent between participants’ brains were limited to “yes” and “no.” But the researchers who developed the system hope to expand upon it to enable the sharing of more complex information or even emotions. “Imagine if you could make a person feel something,” says Andrea Stocco, an assistant professor at the University of Washington, who collaborated on the experiments.

We already try to elicit emotions from each other—with varying degrees of success—using touch, words, pictures, and drugs. And brain stimulation techniques such as TMS have for more than a decade been used to treat psychiatric disorders such as depression. Sharing emotions using brain-to-brain interaction is an extension of these existing practices, Stocco says. “It might sound creepy, but it’s important,” he says.

Stocco’s experiments with the technology, called brain-to-brain interface, or BBI, are the first demonstration of BBI between more than two people, he and his colleagues say. 

To be accurate, the technology should probably be called brain-to-computer-to-computer-to-brain interface. The brain signals of one person are recorded with EEG and decoded for their meaning (computer number one). The message is then re-coded and sent to a TMS device (computer number two), which noninvasively stimulates the brain of the recipient.

In Stocco’s experiments, this chain of communication has to happen in the amount of time it takes a Tetris-style block to drop (about 30 seconds—it’s slow Tetris). Participants work in groups of three: two senders and one receiver. The senders watch the video game and each decide whether to rotate the block or not. Then they send their yes or no decision to the receiver, who sits in a different room, and is charged with taking action to rotate the block or not, based on the senders’ messages. (Receivers can only see half the game—the piece that is falling—and not the rows of blocks into which the piece is supposed to fit, so they depend on the senders’ advice.) 

If senders want to rotate the block, they focus their attention on an area of their screen that says “yes,” with an LED flashing beneath it at 17 hertz. If they do not want to rotate it, they focus their attention on an area of the screen that says “no,” with an LED flashing beneath it at 15 hertz. 

The difference in brain activity caused by looking at these two different rates of flashing light is fairly easy to detect with EEG, says Stocco. A computer then evaluates the brainwave patterns, determines whether they correspond to yes or no, and sends that information to the third person in the group (the receiver). 
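The core of that discrimination step is a classic steady-state visually evoked potential (SSVEP) classifier: staring at a light flickering at 17 hertz boosts the EEG power at 17 hertz, so the computer just compares spectral power at the two flicker frequencies. The sketch below illustrates the idea on synthetic data; it is a minimal illustration, not the Washington team's actual pipeline, and the function name, sampling rate, and signal model are invented for the example.

```python
import numpy as np

def classify_ssvep(epoch, fs, f_yes=17.0, f_no=15.0):
    """Classify an EEG epoch as "yes" or "no" by comparing spectral
    power at the two flicker frequencies (the SSVEP idea)."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    power = np.abs(np.fft.rfft(epoch)) ** 2
    p_yes = power[np.argmin(np.abs(freqs - f_yes))]  # power at 17 Hz bin
    p_no = power[np.argmin(np.abs(freqs - f_no))]    # power at 15 Hz bin
    return "yes" if p_yes > p_no else "no"

# Synthetic 2-second epoch sampled at 256 Hz: a 17 Hz component buried
# in noise, as if the sender were staring at the "yes" LED.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
epoch = 0.5 * np.sin(2 * np.pi * 17 * t) + rng.normal(0, 1, t.size)

print(classify_ssvep(epoch, fs))  # prints "yes"
```

Because the two frequencies sit only 2 hertz apart, the epoch has to be long enough (here, 2 seconds gives 0.5-hertz frequency resolution) for the FFT to separate them cleanly.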

The receiver wears a TMS device that noninvasively induces gentle electrical stimulation in the brain. If the message from a sender is “yes,” the TMS device stimulates the receiver’s brain in a way that produces some kind of visual cue, such as a flash of color. If the message is “no,” the receiver gets no visual cue. Messages to the receiver from the two senders arrive one after the other, in the same order each time.

To make things interesting, the researchers asked one of the two senders to frequently transmit the wrong answer. All the receivers noticed the pattern of bad advice, and chose to listen to the more accurate sender. 

Compared with the complexity of human thought, this binary form of brain-to-brain communication is just one step toward something that might be useful outside the lab. No emotions were shared between participants, aside from perhaps a little nostalgia among the participants who grew up with the real Tetris. 

To reach a higher level of sophistication in brain-to-brain communication, researchers will likely need equipment that can read brain activity with more spatial resolution than EEG, and can stimulate with more specificity than TMS. For this, some BBI researchers are turning to fMRI and ultrasound, Stocco says.

BBI work is slow going. Stocco and his University of Washington colleague Rajesh Rao first demonstrated a form of BBI in 2013. Other groups followed shortly after. Now, six years later, researchers working on the technology are only inches from where they started. “There are maybe four or five groups working on this globally, so we get maybe one paper a year,” says Stocco. 

Why So Many Medical Advances Never Make it to Mainstream Medicine

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/devices/qa-the-future-of-digital-medicine

Eric Topol on why promising technologies like smartphone ultrasound struggle to achieve widespread adoption

For more than a decade, engineers have been innovating their way through the nascent field of digital medicine: creating pills with sensors on them, disease-detecting facial recognition software, tiny robots that swim through the body to perform tasks, smartphone-based imaging and diagnostics, and sensors that track our vitals. But for all that creativity, only a small portion of these inventions get widely adopted in health care. In an essay published today in Science Translational Medicine, Eric Topol, a cardiologist, geneticist, digital medicine researcher, and director of the Scripps Research Translational Institute, discusses the state of digital medicine, ten years in. IEEE Spectrum caught up with Topol to ask a few questions for our readers.

Laser Destroys Cancer Cells Circulating in the Blood

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/laser-destroys-cancer-cells-circulating-in-the-blood

The first study of a new treatment in humans demonstrates a noninvasive, harmless cancer killer

Tumor cells that spread cancer via the bloodstream face a new foe: a laser beam, shined from outside the skin, that finds and kills these metastatic little demons on the spot.

In a study published today in Science Translational Medicine, researchers revealed that their system accurately detected these cells in 27 out of 28 people with cancer, with a sensitivity that is about 1,000 times better than current technology. That’s an achievement in itself, but the research team was also able to kill a high percentage of the cancer-spreading cells, in real time, as they raced through the veins of the participants.

If developed further, the tool could give doctors a harmless, noninvasive, and thorough way to hunt and destroy such cells before those cells can form new tumors in the body. “This technology has the potential to significantly inhibit metastasis progression,” says Vladimir Zharov, director of the nanomedicine center at the University of Arkansas for Medical Sciences, who led the research. 

Tiny Robots Carry Stem Cells Through a Mouse

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/devices/magnetically-navigated-microbots-deliver-stem-cells-in-mice

Using this technique, microrobots could deliver stem cells to hard-to-reach places

Engineers have built microrobots to perform all sorts of tasks in the body, and can now add to that list another key skill: delivering stem cells. In a paper published today in Science Robotics, researchers describe propelling a magnetically controlled, stem-cell-carrying bot through a live mouse. 

Under a rotating magnetic field, the microrobots moved with rolling and corkscrew-style locomotion. The researchers, led by Hongsoo Choi and his team at the Daegu Gyeongbuk Institute of Science & Technology (DGIST), in South Korea, also demonstrated their bot’s moves in slices of mouse brain, in blood vessels isolated from rat brains, and in a multi-organ-on-a-chip. 

The invention provides an alternative way to deliver stem cells, which are increasingly important in medicine. Such cells can be coaxed into becoming nearly any kind of cell, making them great candidates for treating neurodegenerative disorders such as Alzheimer’s.

The Mood Ring of Algorithms Could Zap Your Brain to Help You Feel Better

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/devices/algorithms-play-doctor-in-brain-stimulation

These algorithms spot mood changes before you do, and could someday tell a stimulator to zap your brain to treat disorders

A man with depression is driving to work when his mood darkens, and the familiar inklings of anxiety begin to creep in. His brain cells start to fire in a pattern that has, in the past, led him down a dark, dangerous mental road. But this man has a set of electrodes implanted in his brain, and wireless software nearby that’s closely monitoring his neural activity. Algorithms recognize the shift in his brain waves and order up a therapeutic dose of electrical stimulation, zapping the faulty circuits. The man arrives at work calm and stable.

The technology in this scenario doesn’t exist yet, but it’s the vision of Maryam Shanechi, an electrical engineer at the University of Southern California’s Viterbi School of Engineering, and Edward Chang, a neurosurgeon at the University of California, San Francisco. Shanechi presented their progress this week in Nashville, Tennessee, at a neurotechnology meeting held by DARPA, the research arm of the U.S. Department of Defense. 

So far, Shanechi and her team have successfully developed algorithms that decoded the brain activity associated with mood changes in seven people. Now, they’re figuring out how to stimulate the brain to effect those mood changes, she reported at the meeting.

When the two pieces of the technology come together, they would form a closed-loop system that puts stimulation therapy decisions in the hands of an algorithm. “We are developing a precise, personalized therapy that takes readings of brain activity, and, based on that, makes decisions on stimulation parameters,” Shanechi said in her presentation in Nashville on Wednesday. 
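Conceptually, one cycle of such a closed loop reads neural features, decodes a mood estimate, and chooses stimulation parameters only when the estimate crosses a threshold. The sketch below is purely illustrative and hypothetical; the linear decoder, threshold logic, variable names, and numbers are all invented for the example and are not Shanechi and Chang's actual algorithms.

```python
import numpy as np

def decode_mood(features, weights):
    """Toy linear decoder: map a neural feature vector to a scalar
    mood estimate (a stand-in for a per-patient trained model)."""
    return float(np.dot(weights, features))

def closed_loop_step(features, weights, threshold=-1.0, max_amp=3.0):
    """One cycle of a hypothetical closed loop: decode mood, and if it
    falls below the threshold, return a stimulation amplitude scaled to
    the shortfall (capped at max_amp); otherwise deliver nothing."""
    mood = decode_mood(features, weights)
    if mood >= threshold:
        return mood, 0.0                     # mood OK: no stimulation
    amp = min(max_amp, threshold - mood)     # stimulate proportionally
    return mood, amp

weights = np.array([0.8, -0.5, 0.3])   # invented decoder weights
calm = np.array([1.0, 0.2, 0.5])       # features decoding above threshold
low = np.array([-2.0, 1.5, 0.1])       # features decoding below threshold

print(closed_loop_step(calm, weights))  # amplitude 0.0: no stimulation
print(closed_loop_step(low, weights))   # positive amplitude: stimulate
```

The hard parts, of course, are the two boxes this sketch waves away: learning a decoder that tracks mood reliably, and knowing how a given stimulation dose will move it.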

Korea’s New 5G Futuristic Hospital

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/devices/koreas-new-futuristic-hospital

Hologram visitors, indoor navigation, facial recognition security, and voice-controlled rooms are coming to a hospital in South Korea


When Yonsei University Health System opens its newest hospital next year, in Yongin, about 25 miles outside of Seoul, it will be decked out with some of tech’s hottest gadgets.

Very sick patients in isolation rooms will be able to visit with holograms of their loved ones. Visitors will find their way around the hospital using an augmented reality (AR)-based indoor navigation system. Authorized medical workers will use facial recognition to enter secure areas. And patients will be able to call a nurse and control their bed, lights, and TV with an Alexa-style voice assistant.

That’s the vision, at least. Yonsei and the Korean telecommunications company SK Telecom last week jointly announced that they had signed a memorandum of understanding to build technology for the futuristic hospital, scheduled to open in February 2020. SK Telecom will support the technology with a 5G network, and is considering securing it with quantum cryptography, according to the announcement.