Nick Williams didn’t ask permission. The graduate student just stuck his pinky finger under the printer and watched it paint two silver lines on his skin. When printing was complete, Williams put a small LED light at one end of the lines and applied voltage to the other. He smiled as the light glowed.
Williams showed the electronically active tattoo to his advisor, Duke University electrical engineer Aaron Franklin. Since Williams barely felt the printing, and the silver washed off with soap and water, they tried it again.
Cardiovascular disease is the leading cause of mortality worldwide, accounting for nearly 18 million deaths each year, according to the World Health Organization. In recent years, scientists have looked to regenerative therapies – including those that use 3D-printed tissue – to repair damage done to the heart and restore cardiac function.
Thanks to advancements in 3D-printing technology, engineers have applied cutting-edge bioprinting techniques to create scaffolds and cardiac tissue that, once implanted, can quickly integrate with native tissues in the body. While 3D bioprinting can create 3D structures made of living cells, the final product is static – it cannot grow or change in response to changes in its environment.
Conversely, in 4D bioprinting, time is the fourth dimension. Engineers apply 4D printing strategies to create constructs using biocompatible, responsive materials or cells that can grow or even change functionalities over time and in response to their environment. This technology could be a game-changer for human health, particularly in pediatrics, where 4D-printed constructs could grow and change as children age, eliminating the need for future surgeries to replace tissues or scaffolds that fail to do the same.
But, 4D bioprinting technology is still young. One of the critical challenges impacting the field is the lack of advanced 4D-printable bioinks – material used to produce engineered live tissue using printing technology – that not only meet the requirements of 3D bioprinting, but also feature smart, dynamic capabilities to regulate cell behaviors and respond to changes in the environment wherever they’re implanted in the body.
Recognizing this, researchers at George Washington University (GWU) and the University of Maryland’s A. James Clark School of Engineering are working together to shed new light on this burgeoning field. GWU Department of Mechanical and Aerospace Engineering Associate Professor Lijie Grace Zhang and UMD Fischell Department of Bioengineering Professor and Chair John Fisher were recently awarded a joint $550,000 grant from the National Science Foundation to investigate 4D bioprinting of smart constructs for cardiovascular study.
Their main goal is to design novel and reprogrammable smart bioinks that can create dynamic 4D-bioprinted constructs to repair and control the muscle cells that make up the heart and pump blood throughout the body. The muscle cells they’re working with – human induced pluripotent stem cell (iPSC) derived cardiomyocytes – represent a promising stem cell source for cardiovascular regeneration.
In this study, the bioinks, and the 4D structures they’re used to create, are considered “reprogrammable” because they can be precisely controlled by external stimuli – in this case, by light – to contract and elongate on command in the same way that native heart muscle cells do with each and every heartbeat.
The research duo will use long-wavelength near-infrared (NIR) light to serve as the stimulus that prompts the 4D bioprinted structures into action. Unlike ultraviolet or visible light, long-wavelength NIR light could efficiently penetrate the bioprinted structures without causing harm to surrounding cells.
“4D bioprinting is at the frontier of the field of bioprinting,” Zhang said. “This collaborative research will expand our fundamental understanding of iPSC cardiomyocyte development in a dynamic microenvironment for cardiac applications. We are looking forward to a fruitful collaboration between our labs in the coming years.”
“We are thrilled to work with Dr. Zhang and her lab to continue to develop novel bioinks for 3D- and 4D- printing,” Fisher said. “We are confident that the collaborative research team will continue to bring to light untapped printing strategies, particularly in regards to stem cell biology.”
Moving forward, Zhang and Fisher hope to apply their 4D bioprinting technique to further study of the fundamental interactions between 4D structures and cardiomyocyte behaviors.
“The very concept of 4D bioprinting is so new that it opens up a realm of possibilities in tissue engineering that few had ever imagined,” Fisher said. “While scientists and engineers have a lot of ground to cover, 4D bioprinted tissue could one day change how we treat pediatric heart disease, or even pave the way to alternatives to donor organs.”
At GWU, Zhang leads the Bioengineering Laboratory for Nanomedicine and Tissue Engineering. At UMD, Fisher leads the Center for Engineering Complex Tissues, a joint research collaboration between UMD, Rice University, and the Wake Forest Institute for Regenerative Medicine. Fisher is also the principal investigator of the Tissue Engineering and Biomaterials Lab, housed within the UMD Fischell Department of Bioengineering.
This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.
Technology has become more physically and psychologically intimate, which has created a demand for new technologies that can infer emotional states from humans. The term “affective computing” was coined in 1995 by Professor Rosalind Picard, founder and director of the affective computing research group at the MIT Media Lab. She recognized the extent to which emotions governed our lives and decided to drive forward the concept of “engineering emotion.”
What is affective computing?
Affective computing systems are being developed to recognize, interpret, and process human experiences and emotions. They all rely on extensive human behavioral data, captured by various kinds of hardware and processed by an array of sophisticated machine learning software applications.
AI-based software lies at the heart of each system’s ability to interpret and act on users’ emotional cues. These systems identify and link nuances in behavioral data with the associated emotion.
The most obvious types of hardware for collecting behavior data are cameras and other scanning devices that monitor facial expressions, eye movements, gestures, and postures. This data can be processed to identify subtle micro expressions that a human assessment might struggle to identify consistently.
What’s more, high-end audio equipment records variances and textures in users’ voices. Some insurance companies are experimenting with call voice analytics that can detect if someone is lying to their claim handlers. The team working on IBM’s question-answering computer system Watson has developed a “tone analyzer” that uses linguistic analysis to detect three types of tones from text: language style, emotion, and social tendencies.
Virtual reality gear—head-mounted displays, for example—is being developed to create increasingly realistic simulated experiences. The technology enables a game to adapt based on the user’s emotions, creating a more personal and exciting simulated experience.
How is affective computing used by companies?
Many companies are looking to affective computing to capture mass data about consumer reactions to their advertising campaigns. One of the most notable innovators in the retail space, Realeyes, works with big-name brands such as Coca-Cola, Expedia, Mars, AT&T, and LG, who deploy the technology to help them measure, optimize, and compare the effectiveness of their content.
The Realeyes software measures viewers’ emotions and attention levels using webcams. It can show a brand’s content to panels of consenting consumers all around the world and measure how audiences respond to a campaign by monitoring their attention levels and logging moments of maximum engagement. Marketers are provided with an overall score based on attention and emotional engagement, which enables them to compare multiple assets or benchmark them against previous campaigns.
Microsoft’s Human Understanding and Empathy team is working on various projects with the aim of implementing affective computing into their products. This includes developing a multimodal emotion-sensing platform that combines computer vision analysis of facial expression and body pose with audio processing that detects speech and sentiment. Together, they enable the system to generate computational models of conversation that better reflect emotions.
What are the ethical issues in affective computing?
There’s inevitable fear and uncertainty surrounding developments in AI technology, and affective computing is no different. Marketing companies may struggle to gather large amounts of personal data from audiences, because they’ll have to make sure that all participants have consented. That requirement makes it difficult to measure reactions to everyday advertising at scale.
Yet it’s easy to imagine that, one day soon, TV sets will have cameras and microphones that can pick up reactions to shows and commercials, and that those reactions will be monitored by the media industry. This possibility creates huge privacy and data protection issues, and it’s the biggest obstacle for companies.
Affective computing pioneer Picard is fervently against the use of affective computing for unethical purposes. In a recent interview, she said that she has lost out on a significant amount of money for not selling products to companies that planned to collect data without people’s consent.
Picard’s hopes for affective computing lie within its ability to help people communicate better. For instance, affective computing can help people with autism communicate the emotions they struggle to vocalize. Some years back, her research group made a glove that had sensors on the palm to monitor emotional responses. The glove could recognize when the person was experiencing moments of frustration, which could potentially help prevent emotional distress. This same device could be used to monitor stress during a child’s school day.
Within the Black Dog Institute, an Australian mental-health clinical services group, and ReachOut, a similar group that aims to help young people, there is considerable interest in harnessing affective computing for a range of social benefits. Slawomir Nasuto, professor of cybernetics at the University of Reading in the UK, believes these technologies could be easily integrated with public sector infrastructures.
Nasuto envisions the introduction of computerized tutoring within schools. The technology could be used to recognize the mental state of the students, including stress or attention, that signify whether the learners are struggling, interested, or bored. On the basis of that input, the system could adjust the difficulty of the problem, style of explanation, or pace of delivery to keep the students engaged.
Nasuto has also explored how a similar system could be deployed in hospitals’ intensive-care wards. Patients undergoing operations (particularly after a traumatic event) are highly stressed and face potential cognitive impairment post-operation. He says that “non-pharmacological interventions such as music may help to reduce the levels of anxiety that the patient is suffering. That, in turn, may enable the clinicians to lower the doses of medications that the patient is receiving.”
Affective computing has faced, and will continue to face, aversion from those who question the intentions behind its use. However, if used safely and ethically, affective computing may become a significant part of our everyday lives.
As Nasuto says: “Affective computing is a tool–and any tool can be used for either good or nefarious purposes. What is specific here is the pervasiveness of affective computing via online connectivity and the emergence of cheaper and more networked sensing technologies. Together, they will open the way for the collection of unprecedented volumes of data on the human state.”
Richard Johnson is a partner and European patent attorney at the IP firm Mewburn Ellis, where he works with the firm’s electronics, computing, physics, and engineering patent teams. He has a particular interest in the patentability of software and business-related inventions. Johnson advises clients in the United Kingdom and abroad on the development and management of patent portfolios.
A smartphone app that monitors personal photos can spot eye diseases more than a year before doctors do, according to a new report published today in the journal Science Advances.
Using machine learning, the app searches casual portraits for signs of leukocoria: the appearance of a white reflection in the pupil of the eye. Leukocoria, or “white eye,” looks similar to red eye—that creepy red reflection in the eye that often appears with flash photography. But a red reflection is actually a sign of a healthy eye. A white reflection can be a sign of a problem.
White eye can indicate retinoblastoma, a type of childhood cancer of the retina, or a handful of other eye disorders, including retinopathy of prematurity, cataracts, or Coats Disease. Catching these disorders early can save an eye, or a life.
“With retinoblastoma, every month counts,” says Bryan Shaw, an associate professor at Baylor University in Waco, Texas. “Tumors grow rapidly and when you start seeing the white eye, you have about six months to a year before the tumor starts to break up and metastasize down the optic nerve to the brain and kills you.”
Why waste the energy used to tilt one’s head or digest food? University of Wisconsin-Madison engineer Xudong Wang is an expert at harvesting the body’s mechanical energy to power devices, such as an electric bandage that accelerates healing and a stomach implant that subdues hunger.
Now, Wang’s team is back with a self-powered wearable to tackle an age-old nemesis: hair loss.
Wang’s lab has created a motion-activated, flexible wearable that promotes hair regeneration via gentle electrical stimulation. They describe their work in a study published this month in the journal ACS Nano. In rodents, the device stimulated hair growth better than conventional topical medications.
The device can be discreetly hidden under a baseball cap, says Wang. He hopes to begin a clinical trial with humans within six months.
When Benjamin Hansen was playing baseball in high school, around 2006, technologies to monitor athletes’ bodies and performance weren’t yet commonplace. Yet Hansen wanted to collect data any way he could. “I would sit on the bench with a calculator and a stopwatch, timing the pitchers,” he says. He clicked the stopwatch when the pitcher released the baseball and again when the ball popped into the catcher’s mitt, then factored in the pitcher’s height in calculating the pitch velocity.
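Hansen’s bench method is simple kinematics: average velocity is distance divided by flight time. A minimal sketch of that arithmetic is below; the 6-foot release-point correction is an assumption for illustration, since the article doesn’t say exactly how he factored in the pitcher’s height.

```python
# Back-of-the-envelope pitch velocity from a stopwatch reading.
# The extension_ft correction is hypothetical; Hansen's exact
# height adjustment is not described in the article.

MOUND_TO_PLATE_FT = 60.5              # regulation mound-to-plate distance
FT_PER_SEC_TO_MPH = 3600.0 / 5280.0   # 1 ft/s is about 0.68 mph

def pitch_velocity_mph(elapsed_s: float, extension_ft: float = 6.0) -> float:
    """Average velocity over the flight of the pitch, in mph."""
    # The ball is released in front of the rubber, so subtract the
    # pitcher's stride/extension from the nominal distance.
    distance_ft = MOUND_TO_PLATE_FT - extension_ft
    return distance_ft / elapsed_s * FT_PER_SEC_TO_MPH

# A 0.42-second flight over ~54.5 ft averages roughly 88 mph.
print(round(pitch_velocity_mph(0.42), 1))
```

This gives only an average over the whole flight; radar guns report peak velocity at release, which is several mph higher.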
Hansen’s coach, however, was not impressed. “My coach should have embraced it,” he says, wistfully. “But instead he made me run laps.”
Hansen kept playing baseball through college, pitching for his team at the Milwaukee School of Engineering. But he was plagued by injuries. He well remembers a practice game in which he logged 15 straight outs—then felt a sharp pain in his elbow. He had partially torn his ulnar collateral ligament (UCL) and had to sit out the rest of the season. “I always asked the question: Why is this happening?” he says.
Today, Hansen is the vice president of biomechanics and innovation for Motus Global, in St. Petersburg, Fla., a startup that produces wearable sports technology. For IEEE Spectrum’s October issue, he describes Motus’s product for baseball pitchers, a compression sleeve with sensors to measure workload and muscle fatigue. From Little League to Major League Baseball, pitchers are using Motus gear to understand their bodies, improve performance, and prevent injuries.
Traditional wisdom holds that pitcher injuries result from faulty form. But data from Motus’s wearable indicates that it’s the accumulated workload on a player’s muscles and ligaments that causes injuries like UCL tears, which have become far too common in baseball. By displaying measurements of fatigue and suggesting training regimens, rehab workouts, and in-game strategies, the wearable can help prevent players from pushing themselves past their limits. It’s a goal that even Hansen’s old coach would probably endorse.
This article appears in the October 2019 print issue as “Throwing Data Around.”
When St. Louis Cardinals pitcher Jordan Hicks takes to the mound, batters tremble. He has the same form as any other pitcher in Major League Baseball (MLB) as he throws the ball, but his results are extraordinary. In the windup, he steps back on one leg, lifting his other leg to raise his center of gravity. Then he strides forward, his hips swiveling, while his throwing arm extends back nearly parallel to the ground. As his arm comes forward, the muscles and ligaments in his elbow absorb over 100 newton meters of torque.
Hicks’s hand snaps forward, faster than the eye can see. If his arm were an engine, you’d see 1,200 rpm on the tachometer. Hicks’s fastballs have clocked in at 105 miles per hour (169 kilometers per hour), giving the batter less than a tenth of a second to recognize the pitch and react to it, trying to put bat on ball. But this repeatable miracle of biomechanics and talent does have a cost. In late June 2019, Hicks left a game with elbow pain—and it was soon revealed that he had torn a crucial ligament. He required what’s known as Tommy John surgery to repair the ligament and sat out the rest of the 2019 baseball season.
Baseball is at a crossroads. Pitchers are throwing at higher velocities than ever before, causing a surge of injuries to the ulnar collateral ligament (UCL) in the elbow, which connects the bones of the upper and lower arm. The number of Tommy John surgeries has skyrocketed over the past decade, and coaches, trainers, and players are desperately searching for technology to combat this epidemic. One ongoing MLB study uses motion capture to chart the pitching form of every new pitcher drafted into the league. It then combines that information with data from MRIs and other physical exams in an effort to identify players at risk of injury. While this campaign is laudable, a single assessment of pitching mechanics can’t predict injury over the years of athletic labor to come.
There is a better way. Technologies now exist that allow for constant and long-term recording of a pitcher’s arm movements, enabling players to track and understand the stresses on their bodies.
At our sports technology company, Motus Global, we use consumer-grade sensors—the kind that have been perfected for smartphones—to gather biometric data related to an athlete’s ultimate workload. Our analytics use software models of muscle fatigue to help pitchers improve performance while decreasing risk of injury. By using affordable technologies, we put data not just within the reach of the 30 MLB teams but also their 160 minor league affiliates, hundreds of college teams, and thousands of youth-level teams around the country.
In 2015, we introduced our first wearable for baseball: Motus Throw, a compression sleeve with a sensor that tracks the motion of a pitcher’s arm. A companion iOS app and Web-based dashboard present analytics to players and coaches. In 2015, MLB approved our technology for use on the field during ball games. By 2019, about a dozen MLB teams were using our technology for training and rehab programs.
In the 2000s, professional baseball was overhauled by an approach that’s often called moneyball, in which teams used obscure performance statistics to better understand players’ true value. You could call our approach bioball—by bringing biological data into the mix, we think teams can take their performance to the next level.
To track a pitcher’s arm movement precisely, the Motus sensor uses a three-axis gyroscope and a three-axis accelerometer, taking measurements 1,000 times per second. While the system is always sampling, it records the information permanently only when it detects the movement signature of a pitch. Then it files away the stream of data beginning 4 seconds before the pitch and ending 1 second after. The unobtrusive sensor, which weighs 6.9 grams and measures 9 millimeters thick, causes no discernible changes in pitchers’ movement patterns.
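The capture scheme described above — always sampling, but permanently recording only a window from 4 seconds before a detected pitch to 1 second after — can be sketched with a ring buffer. The class and detection flag here are illustrative; Motus’s actual firmware is not public.

```python
from collections import deque

SAMPLE_HZ = 1000          # article: 1,000 measurements per second
PRE_S, POST_S = 4, 1      # keep 4 s before the pitch, 1 s after

class PitchRecorder:
    """Always sampling; persists a window only around a detected pitch."""
    def __init__(self):
        # Ring buffer holding the most recent PRE_S seconds of samples.
        self.buffer = deque(maxlen=PRE_S * SAMPLE_HZ)
        self.post_remaining = 0
        self.recordings = []
        self.current = None

    def feed(self, sample, is_pitch_signature: bool):
        if self.post_remaining > 0:
            # Still filling the 1-second post-pitch tail.
            self.current.append(sample)
            self.post_remaining -= 1
            if self.post_remaining == 0:
                self.recordings.append(self.current)
                self.current = None
        elif is_pitch_signature:
            # Pitch detected: snapshot the 4-second pre-roll from the
            # ring buffer, then keep recording for 1 more second.
            self.current = list(self.buffer) + [sample]
            self.post_remaining = POST_S * SAMPLE_HZ - 1
        self.buffer.append(sample)

rec = PitchRecorder()
for t in range(10 * SAMPLE_HZ):                  # 10 s of streaming samples
    rec.feed(t, is_pitch_signature=(t == 6000))  # pitch detected at t = 6 s
print(len(rec.recordings), len(rec.recordings[0]))  # one 5,000-sample window
```

The ring buffer means the sensor never needs to store more than about 5 seconds of data per pitch, which keeps memory use tiny on an embedded device.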
The Motus Throw system for baseball pitchers uses a tiny sensor [in blue] containing an accelerometer and a gyroscope to track arm movement, taking measurements 1,000 times per second. Photo: Motus
It saves and processes the data only when it detects the movement signature of a pitch. The sensor sits inside a compression sleeve. Photo: Motus
The pitcher can wear the sleeve during practice sessions and games to record data from every throw. Photo: Motus
The lightweight sleeve and sensor don’t interfere with a pitcher’s normal movement patterns. Photo: Motus
Our tiny sensor is incredibly accurate: Its results are comparable to the gold standard of motion-capture video, which requires high-speed cameras and specialized labs that can cost well into six figures. For motion-capture video recording, the player wears a spandex suit that’s dotted with position markers, which the cameras track to create a model of how the player’s body moves.
Motus began as a small motion-capture studio with a lab at the famed sports training facility IMG Academy, in Bradenton, Fla. But the problem with the motion-capture approach quickly became apparent: Throwing a ball in a lab while wearing a spandex suit doesn’t accurately simulate game conditions. An athlete’s biomechanical form, and thus the forces at work in the body, can be very different in such an artificial situation.
We still have a motion-capture setup in our performance lab: We currently use 16 Motion Analysis Raptor cameras for product testing and some other tasks for clients. But we put full faith in our sensor to do the same job as these fancy cameras. Independent studies by the American Sports Medicine Institute (ASMI), Driveline Baseball, and the MLB’s sport science committee have shown that Motus sensors are up to 95 percent as accurate and reliable as camera-based techniques for measuring key parameters such as elbow torque. This presents a clear advantage for in-game use, because motion-capture video isn’t easily adaptable for precisely tracking players on the ballfield.
Our Motus Throw devices have collected data from more than 10 million throws, at all levels of competition. Beginning in 2016, Motus embarked on a three-year study in cooperation with the National Collegiate Athletic Association (NCAA), collecting data from dozens of teams about pitchers’ throwing workloads and their injuries. We’re now preparing to mine that data.
But the device is helpful for more than just broad epidemiological studies. Take what happened at a small state college in Texas, where a baseball coach named Bryan Conger went all in with the Motus Throw in the 2017 baseball season, using the technology to manage his pitching staff. During workouts, Conger could check the analytics on his phone, watching how a pitcher’s workload metrics changed with every pitch. He used all the data to create individualized training programs for each pitcher and to determine which pitchers were ready on game day.
Conger says the Motus analytics enabled him to keep his top pitchers in the game for more innings than he would have otherwise, confident that their fatigue levels were within bounds. He also used his top pitchers more often, finding that some could pitch on successive days without going beyond their workload limits. Most important, his pitchers suffered no major arm injuries all season. Conger’s team, from Tarleton State University, made it to the NCAA playoffs that year. Shortly thereafter, Conger was hired away by an MLB team, the Texas Rangers, where he’s now working as a pitching instructor.
When teams first got their hands on the Motus Throw device in 2015, they were most excited about its ability to measure a force called elbow valgus torque. If you look at your elbow, you’ll see that it can be moved in three ways: You can curl your arm inward in the motion of a bicep curl, twist your arm as if turning a doorknob, and stretch your arm outward as if throwing a baseball. That last motion causes valgus torque, the force that stresses the UCL.
When coauthor Hansen worked as a biomechanical specialist for the MLB’s Milwaukee Brewers, the team was focused on one-time assessments of pitchers’ valgus torque and using that data to predict future injury. But we now know that such snapshot measurements aren’t enough for accurate predictions, and that valgus torque is only part of the equation.
After our latest software update, the Motus Sleeve system no longer shows a measure of valgus torque in its analytics dashboard. Instead, it uses that measurement to calculate accumulated workload for the muscles of the forearm. We believe that the fatigue of these muscles is the most critical factor for a pitcher’s stamina and arm health.
Muscle fatigue, which we define as a decline in a muscle’s ability to generate force, is the single most significant predictor of pitcher injury. A study of youth pitchers by ASMI found that pitchers who threw while fatigued were 36 times as likely to require surgery. And while the concept of fatigue may seem simple, the physiology involved is quite complex.
We use two complementary measurements of fatigue to calculate workload. The first is acute fatigue, which is commonly seen in baseball games: A pitcher starts off fresh but is pulled from the game as performance declines. This acute fatigue comes about as muscles use up available energy. The other metric we track, which we call “chronic fitness,” increases as pitchers build resistance to acute fatigue through training—and simply by throwing more.
It may seem obvious that working out more increases a pitcher’s fitness. But in baseball today, there’s no objective way to capture this commonsense idea. Baseball’s standard ways of managing a pitcher’s workload are to keep count of how many innings they’ve pitched and how many pitches they’ve thrown. These catchall metrics aren’t based on a specific pitcher’s physiology and fitness, and they lead to subjective decisions about when to take pitchers out of a game and whether a pitcher has been “overused.”
To codify the relationship between intense physical stress and resilience built up over time, Motus partnered with Tim Gabbett, an Australian physiologist who initially worked with rugby players. Gabbett pioneered a measurement he calls acute chronic ratio (ACR), which we use in our Motus Throw system.
Measuring this ratio begins by determining a pitcher’s acute workload (the 9-day average of total valgus load) and a chronic workload (28-day average of valgus load). The ratio of these two averages provides a valuable measure of fatigue. Early in the season, when chronic loads are still low, a pitcher must sustain an ACR greater than 1.0 to build fitness. But it mustn’t go too high: In a recent study of high school pitchers conducted with the Motus Throw device, researcher Sameer Mehta found that throwing with an ACR over 1.3 multiplies an athlete’s risk of injury by 25.
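The ACR arithmetic described above is simple enough to state directly: a 9-day rolling average of daily valgus load divided by a 28-day rolling average. The sketch below follows those definitions; the daily load values are invented for illustration.

```python
# Acute:chronic ratio (ACR) per the article's definitions:
# acute = 9-day average of daily valgus load,
# chronic = 28-day average of daily valgus load.

def acr(daily_loads):
    """ACR from a list of daily workloads, most recent day last."""
    acute = sum(daily_loads[-9:]) / 9
    chronic = sum(daily_loads[-28:]) / 28
    return acute / chronic

# Four weeks of steady throwing, then a sudden ramp-up over the last 9 days.
steady = [100] * 19
ramp = [180] * 9
print(round(acr(steady + ramp), 2))  # above the 1.3 danger threshold
```

A pitcher with a perfectly steady routine sits at an ACR of exactly 1.0; the ratio only climbs when recent work outpaces what the body has been conditioned to handle.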
“When I consult with organizations on concepts of acute and chronic workloads, I’m most often met with a reaction similar to what our fathers and grandfathers knew all along,” says Gabbett. “If you work hard at something, you get better at it. If you train appropriately, you prepare for demands of the sport.” Now we can quantify this age-old wisdom with solid data.
Since its initial launch in 2015, Motus has uncovered meaningful workload measures that are predictive indicators for elbow and shoulder injuries. The next step has been to turn these findings into software tools to help pitchers train in the sweet spot, building endurance without causing excess muscle fatigue.
For example, when a minor league pitcher who had been using the Motus Throw tore his UCL, we were able to look at the data to see what had gone wrong with his training regimen. We saw clear indicators of excess fatigue (his chart showed ACRs of greater than 1.3 and often greater than 2.0) due to both intense single-day workouts and the pitcher’s abrupt transition from throwing every other day to throwing six days per week. Through software simulation, we’ve shown that a different training regimen would have kept the ACRs lower and would likely have prevented the injury. We want Motus users to take heed of the red flags in the dashboard’s reports to keep their workloads manageable and healthy.
Christopher Camp, a physician at Mayo Clinic Sports Medicine, says baseball teams need to “embrace workload and fatigue measures” during both rehabilitation and preseason throwing. He uses the Motus Throw with patients and says he appreciates that the app now prescribes workload plans to pitchers that set an effort limit for each day. The app helps pitchers safely build chronic workload over months, he says, by making sure the ACR is moderately—not dangerously—elevated.
In pitching, avoiding injury and enhancing performance go hand in hand: When muscle fatigue sets in, injury risk increases and control and performance begin to decline. Zach Dechant, director of strength and conditioning at Texas Christian University, in Fort Worth, says workload data can be used to maximize a pitcher’s in-game stamina. He uses the Motus system to help his pitchers gradually build chronic fitness. “We use Motus [measurements of] workloads to prime our pitchers’ arms for success and to allow them to pitch deeper into games,” he says.
For decades, baseball coaches have relied on pitch counts and the subjective concept of overuse to manage their pitchers. Today, as we gain more insights into the relationship among workload, fatigue, and arm health, it’s clear that data-driven systems can do better to protect players from injury.
Sports physiologists have come to recognize that the forearm muscles protect the UCL from rupture. But to do so, they must have sufficient energy stores to contract effectively, thus keeping the UCL from taking too much strain. To drive each contraction, cells in the muscles convert glycogen, one of the body’s primary stores of energy, into the chemical adenosine triphosphate (ATP). Each successive pitch depletes the muscles’ glycogen stores, causing an energy debt. As a pitcher reaches the end of an inning, the muscles reach peak levels of fatigue. In between innings, the pitcher rests, and glycogen stores slowly recover over a period of minutes. This cycle continues until the energy debt is too great to recover from.
But how should this gradual decline be measured? Mike Sonne, a biomechanics expert who works with Canadian baseball teams, came up with the concept of “fatigue units” based on his study of this accumulative energy debt, which he modeled on the scale of milliseconds. Sonne developed his models using publicly available data from Pitchf/x and MLB’s Statcast system, both of which provide information about pitch velocity, release point, spin, and more. Now, using the Motus Throw to gather far more granular and precise data, Sonne anticipates further progress in understanding the biochemistry of pitching.
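The deplete-and-recover cycle described above can be caricatured as a toy energy-debt simulation. The constants here (cost per pitch, recovery rate, capacity) are invented for illustration only; Sonne’s actual fatigue-unit model, built from millisecond-scale muscle data, is far more detailed.

```python
# A toy energy-debt model of the glycogen deplete-and-recover cycle.
# All constants are hypothetical; they are not Sonne's fatigue units.

CAPACITY = 100.0  # full glycogen store, arbitrary units

def throw_inning(store, pitches, cost_per_pitch=2.0):
    """Each pitch depletes the energy store; it cannot go below zero."""
    return max(store - cost_per_pitch * pitches, 0.0)

def rest(store, minutes, recovery_per_min=1.0):
    """Partial recovery between innings, capped at full capacity."""
    return min(store + recovery_per_min * minutes, CAPACITY)

store = CAPACITY
for inning in range(1, 7):
    store = throw_inning(store, pitches=15)  # ~15 pitches per inning
    store = rest(store, minutes=10)          # rest while the team bats
    print(f"after inning {inning}: {store:.0f}")
```

Because recovery between innings is slower than depletion during them, the debt compounds: the store drops each inning until the pitcher is running near empty, which is the point at which fatigue (and injury risk) peaks.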
In the summer of 2019, Motus released an app update to provide information about fatigue units. With this data, pitchers can identify incidents of excess overload within a matter of hours—feedback that they can use to learn safe limits for their bodies and to determine when extra rest and recovery are needed.
The current MLB rules allow players to wear the Motus Throw during games so that the device can collect data. But pitchers and coaches are not permitted to download that data or access it during a game; only afterward can they look at the analytics. We believe those rules will soon be revised—and we hope to lead the charge for in-game analytics. If a coach could track a player’s fatigue units during a game, maybe he could visit the mound at a moment when the pitcher really needs a few minutes to recover his strength. A coach would also have an objective measure of when a pitcher has reached his limit and should be taken out.
If coaches switch to a “bioball” management strategy based on biochemistry and physiology, the change might do away with endless postgame arguments over whether a player was left in for too long, causing an injury or the loss of a game. Such an evolution might be bad for talk radio. But if it’s good for players’ performance and bodies, we think baseball will consider it a win.
This article appears in the October 2019 print issue as “The New Science of Bioball.”
Have you ever needed an IV and had to undergo multiple pricks before the nurse could find a vein? Technology to avoid that painful trial and error is in the works. Fujifilm’s ultrasound diagnostics arm SonoSite announced yesterday that it had partnered with a startup company to develop artificial intelligence that can interpret ultrasound images on a mobile phone.
The companies say the first target for their AI-enabled ultrasound will be finding veins for IV (intravenous) needle insertion. The technology would enable technicians to hold a simple ultrasound wand over the skin while software on a connected mobile device locates the vein for them.
For this project, Fujifilm SonoSite tapped the Allen Institute for Artificial Intelligence (AI2), which has an incubator for AI startup companies. “Not only do we have to come up with a very accurate model to analyze the ultrasound videos, but on top of that, we have to make sure the model is working effectively on the limited resources of an Android tablet or phone,” says Vu Ha, technical director of the AI2 Incubator.
In an interview with IEEE Spectrum, Ha did not disclose the name of the startup that will be taking on the task, saying the fledgling company is still in “stealth mode.”
Ha says the AI2 startup will take on the project in two stages: First, it’ll train a model on ultrasound images without any resource constraints, with the purpose of making it as accurate as possible. Then, the startup will go through a sequence of experiments to simplify the model by reducing the number of hidden layers in the network, and by trimming and compressing the network until it is simple enough to operate on a mobile phone.
The trick will be to shrink the model without sacrificing too much accuracy, Ha says.
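One common way to shrink a trained network, in the spirit of the trimming step Ha describes, is magnitude pruning: zeroing out the weights with the smallest absolute values. This NumPy sketch is a generic illustration of the idea, not the (undisclosed) startup’s actual pipeline:

```python
import numpy as np

def prune_weights(weights, sparsity=0.9):
    """Zero out the fraction `sparsity` of weights smallest in magnitude."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(weights) > threshold, weights, 0.0)

# Illustrative weight matrix; a real model has many such layers.
rng = np.random.default_rng(0)
w = rng.normal(size=(128, 128))
w_pruned = prune_weights(w, sparsity=0.9)
kept = np.count_nonzero(w_pruned) / w.size  # roughly 10% of weights survive
```

In practice, pruning is usually paired with sparse storage, quantization, and a brief retraining pass so that the smaller model recovers most of the accuracy lost, which is the shrink-without-sacrificing-accuracy tradeoff Ha points to.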
If successful, the device could help clinicians reduce the number of unsuccessful attempts at finding a vein, and enable less trained technicians to start IVs as well. Hospitals that do a large volume of IVs often have highly trained staff capable of eyeballing ultrasound videos and using those images to find small blood vessels. But the number of these highly trained clinicians is very small, says Ha.
“My hope is that with this technology, a less trained person will be able to find veins more reliably” using ultrasound, he says. That could broaden the availability of portable ultrasound to rural and resource-poor areas.
But the adoption of these devices has been relatively slow. As Eric Topol, director of the Scripps Research Translational Institute, told Spectrum recently, the smartphone ultrasound is a “brilliant engineering advance” that’s “hardly used at all” in the health care system. Complex challenges such as reimbursement, training, and the old habits of clinicians often hinder the uptake of new gadgets, despite engineers’ best efforts.
For years, scientists have explored ways to alter the cells of microorganisms in efforts to improve how a wide range of products are made – including medicines, fuels, and even beer. By tapping into the world of metabolic engineering, researchers have also developed techniques to create “smart” bacteria capable of carrying out a multitude of functions that impact processes involved in drug delivery, digestion, and even water decontamination.
But altering the genetic and regulatory processes that take place within cells presents challenges.
To start, cells are already programmed to carry out their normal, everyday processes with maximum efficiency; any alterations that engineers make to increase a cell’s production of a certain substance can, in turn, upset these processes and overburden the cell.
To address this problem, William E. Bentley, an A. James Clark School of Engineering professor and director of the Robert E. Fischell Institute for Biomedical Devices, is working with a team of researchers to focus on engineering microbial consortia, wherein cell subpopulations are engineered to work together to carry out a desired function. This strategy – which others in the field have also explored – allows engineers to design specialized cells and divvy up the target workload among a group of cells.
The tradeoff is that directing a cell consortium to carry out specific tasks requires engineers to somehow regulate how many of each cell subpopulation are present. Until now, there’s been little research focused on developing devices or systems to automatically regulate the compositions of cellular subpopulations within a consortium. Generally, studies of cell consortia have required engineers to use painstaking manual methods or expensive external controller systems to strike that balance.
Bentley and his team are focused on reengineering cells so that they’re able to coordinate their subpopulation densities autonomously. Their technique was highlighted in a Nature Communications paper published on Sept. 11.
“The key concept is that groups of cells can be engineered to self-regulate their composition, and no outside input is needed,” Bentley said. “For example, there’s no way to ensure that the bacteria engineered for use in the gastrointestinal tract will actually be retained or behave as we expect. And you can’t use convenient means such as magnetic or electrical fields to regulate bacteria in the gut, so why not incorporate the self-regulation property into the bacteria themselves?”
Like others in the field, Bentley and members of his Biomolecular and Metabolic Engineering Lab previously investigated “quorum sensing,” or QS—a bacterial form of cell-to-cell communication—to engineer communication circuits between bacterial strains to coordinate their behaviors.
To create an autonomous system, Bentley and his team rewired the bacterial QS systems in two strains of E. coli so that the growth rate of communicating cell subpopulations within the consortia would be dictated by signaling between the cells. It’s a sort of feedback loop in which cells are able to sense and react to intercellular signaling molecules called autoinducers, which enable bacteria to work together of their own accord.
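The feedback-loop idea can be illustrated with a toy simulation: two subpopulations whose growth rates respond to each other’s signal levels, so whichever strain is in the minority grows faster and the composition self-balances. All parameters and the control rule below are illustrative assumptions, not the actual QS circuit from the paper:

```python
# Toy model: each strain senses the other's autoinducer (taken here as
# proportional to that strain's density) and adjusts its growth rate,
# while both share a limited resource pool. Illustrative only.
def simulate_consortium(a0, b0, steps=2000, dt=0.01,
                        mu=1.0, k=2.0, capacity=10.0):
    a, b = a0, b0
    for _ in range(steps):
        total = a + b
        crowding = max(0.0, 1.0 - total / capacity)  # shared resources
        # Minority strain sees more of the other's signal, grows faster.
        rate_a = mu * crowding * max(0.0, 1.0 + k * (b - a) / total)
        rate_b = mu * crowding * max(0.0, 1.0 + k * (a - b) / total)
        a += dt * rate_a * a  # forward-Euler update of each subpopulation
        b += dt * rate_b * b
    return a, b

a, b = simulate_consortium(a0=0.1, b0=0.9)
ratio = a / (a + b)  # starts at 0.1, driven toward a balanced composition
```

Even starting at a heavily skewed 1:9 mix, the cross-signaling term pushes the consortium toward an even split with no outside input, which is the self-regulation property Bentley describes.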
The breakthrough could be key to a host of new functions for “smart bacteria” developed through genetic engineering, ranging from drug delivery to water decontamination to new fermentation processes for the latest craft beverage.
“Increasingly, consortia of microbes will be tasked with converting raw materials into valuable products,” Bentley said. “The raw materials may be wastes or byproducts of industrial processes. The synthetic capabilities of consortia may far surpass those of pure monocultures, so methodologies that help to align consortia will be needed.”
University of Maryland Fischell Department of Bioengineering (BIOE) and Institute for Bioscience and Biotechnology Research (IBBR) researcher Kristina Stephens served as first author on the Nature Communications paper titled, “Bacterial co-culture with cell signaling translator and growth controller modules for autonomously regulated culture composition.” Maria Pozo (BIOE), Chen-Yu Tsao (BIOE, IBBR), and Pricila Hauk (BIOE, IBBR) also contributed to the paper.
This work is supported in part by funding from the National Science Foundation, the Defense Threat Reduction Agency (U.S. Department of Defense), and the National Institutes of Health (NIH).