Scientists Can Now Take Virtual Walks Through Human Cells

Post Syndicated from Emily Waltz original https://spectrum.ieee.org/the-human-os/biomedical/devices/scientists-can-now-take-virtual-walks-through-human-cells

This is no computer-generated ride. The images come from live cells imaged with a super-resolution microscope. Software then converts the two-dimensional data from the microscope into a three-dimensional immersive visualization. This fresh, almost personal look at biological structures may allow researchers to better understand the inner workings of cells and to search for causes of disease.

“We’re trying to find interesting ways to ‘just look at the thing,’” says Steven Lee, a biophysical chemist at the University of Cambridge who co-authored the paper. Lee and his colleagues at Cambridge created the software through a partnership with 3D image analysis company Lume VR.

“Recognizing patterns in nature is the fundamental keystone in science,” says Hermes Bloomfield-Gadêlha, head of the Polymaths Lab at the University of Bristol, who was not involved in the project. “This is an exciting new advance that will allow us to see patterns and interact with structures of the hidden and strange molecular universe.”

Lee graciously took the time to lead me on a guided tour of a brain cell, or neuron. We began with a bird’s-eye view of the neuron. Its axons, which are slender, tube-like projections, carry information from one brain cell to another in the form of electrical impulses. “The little tubes you see are the ways in which your neurons communicate,” Lee said as we swooped in closer. “Thoughts and ideas and feelings run down the middle of these tubes.”

We drew closer to the axon, which revealed that the tube-like structure is formed by a series of rings made of a protein called spectrin. “The space between these rings is something on the scale of 100 nanometers. So it’s very small, you know—a hundred billionths of a meter,” he narrated.

Of particular interest were the molecules hovering outside the spectrin ring scaffold. “Are they floating through the axon or are they associated with the wall of the axon?” he asked rhetorically. We can now try to answer such questions, “which would be extremely difficult to do in another way,” Lee said.

The software, called vLUME, allows scientists to cut out and manipulate views of subregions of interest. We looked at a cut-out section of the axon turned on its end. We peered inside, which was like looking down the barrel of a ribbed drinking straw. (And a mind-boggling experience. We were, after all, standing in the very structure that was allowing us to think about that structure.)

We looked at four subsections of the axon, which floated in virtual boxes, and then zoomed out to see them side by side. “We can do some sort of quantitative measure to see how they are different,” Lee said. Comparing the structure of a healthy region with that of a diseased region could help researchers understand the causes of disease.

We pulled out further to see the whole picture—the axon, the four boxes of subsections, some analysis, and some hand-written notes. It was beautiful, and one could see how a scientist could use it to not only study a structure, but also explain it to the world. 

The technology would not be possible without super-resolution microscopy, whose developers were awarded the Nobel Prize in Chemistry in 2014. This optical microscopy technique bypasses the diffraction limit—a physical barrier that restricts optical resolution to roughly 250 nanometers and was previously thought to be impenetrable.
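That roughly 250-nanometer figure follows from the classic Abbe diffraction limit, which states that the smallest resolvable distance is the light's wavelength divided by twice the objective's numerical aperture. A minimal sketch (the wavelength and aperture values are illustrative, not taken from the article):

```python
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Abbe diffraction limit: smallest resolvable distance d = lambda / (2 * NA)."""
    return wavelength_nm / (2 * numerical_aperture)

# Green light (~550 nm) through an objective with NA ~ 1.1 lands right at
# the ~250 nm resolution floor of conventional optical microscopy.
print(round(abbe_limit_nm(550, 1.1)))  # 250
```

Super-resolution methods sidestep this limit not by changing the optics but by localizing individual blinking molecules one at a time, achieving the 5- to 10-nanometer precision described below.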

Super-resolution microscopy allows researchers to image biological structures with resolutions of about five to ten nanometers. But the images are typically two dimensional. Since life happens in 3D, scientists have been working on ways to infer 3D information from those 2D pictures. But it has proven difficult to interact with the 3D data in an immersive and native way.

vLUME takes the data behind a super-resolved image: millions of individual points, each representing the 3D position of a single fluorescent molecule, or fluorophore. The software renders those points as a point cloud, a set of data points in space, which the user can then explore and segment. Clustering algorithms analyze the complex datasets to find patterns in biological structures.
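To make the point-cloud idea concrete, here is a toy sketch of the underlying data structure and a simple distance-threshold clustering pass. This is not vLUME's actual algorithm (the article doesn't specify one); it is a simplified stand-in for the density-based clustering methods commonly applied to such localization data, with made-up coordinates:

```python
import math
from collections import deque

def cluster_points(points, radius):
    """Assign a cluster label to each 3D localization: points within `radius`
    of one another (directly or transitively) share a label."""
    labels = [None] * len(points)
    next_label = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        labels[i] = next_label
        queue = deque([i])
        while queue:  # breadth-first flood fill through nearby points
            j = queue.popleft()
            for k in range(len(points)):
                if labels[k] is None and math.dist(points[j], points[k]) <= radius:
                    labels[k] = next_label
                    queue.append(k)
        next_label += 1
    return labels

# Two tight blobs of fluorophore positions (in nm), about 100 nm apart --
# roughly the spectrin-ring spacing mentioned earlier in the tour.
cloud = [(0, 0, 0), (5, 2, 1), (3, 4, 0),
         (100, 0, 0), (104, 3, 2), (98, 1, 1)]
print(cluster_points(cloud, radius=20))  # [0, 0, 0, 1, 1, 1]
```

Grouping the raw localizations into clusters like this is what lets software turn a fog of millions of points into discrete structures, such as rings, that a researcher can then measure and compare.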

The software will be made free for academic use. All the user needs is a VR headset. The software in its current iteration can only take researchers through a static picture of a cell. Lee and his colleagues hope to upgrade that to real-time, moving images of live cells in the future, but presently, the lag time is anywhere from ten minutes to an hour.  

Researchers are already using the tool to understand how immune cells determine which cells in the body have been infected by a pathogen, and how proteins mis-fold in disease. “It’s allowed us to be able to rule out hypotheses very quickly by showing people data in an intuitive way,” Lee says. 

But the ability to ‘just look at the thing’ may be only the beginning. Says Bloomfield-Gadêlha: “Future multidisciplinary interactions with, for example, mathematical modeling and simulation, and even robotics, could grant us predictive power from these 3D molecular realities and an unimaginable understanding of the molecular multiverse in nature.”