Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
The tiny biological machines that farms rely on to pollinate the flowers of fruiting plants have been having a tough time of it lately. While folks around the world are working on different artificial pollination systems, there’s really no replacing the productivity, efficiency, and genius of bees, and protecting them is incredibly important. That said, there’s no reason not to also work on alternate methods of pollination, and researchers at the Japan Advanced Institute of Science and Technology (JAIST) have come up with something brilliant: pollen-infused soap bubbles blown out of a bubble maker mounted to a drone. And it apparently works really well.
Beyond the technical and social issues with drone delivery, there are real questions about whether it would actually be an efficient and cost-effective way of moving stuff around urban environments. A significant problem with delivery drones right now is that they’re generally not much use if you want to send something relatively heavy very far away, especially if you want them to also be able to make pinpoint deliveries throughout cities safely. The problem is that drones run on batteries, which substantially limit their range, especially once you load them up with cargo.
One approach to offsetting the low range of delivery drones is to fly them from vehicles that can serve as base stations. This idea has been tested by companies like Mercedes-Benz and Matternet, and also by UPS and Workhorse, among others. Now here’s another idea: Instead of deploying a fleet of private vans, you could rely on a vast network of vehicles that’s already on the road: public buses. In a paper presented at ICRA this month, researchers from Stanford’s Intelligent Systems Laboratory and Autonomous Systems Lab have explored how a transit-based delivery drone system might work, and it turns out that it might work really well—in cities like San Francisco and Washington, D.C., hitchhiking on buses could potentially help drones more than quadruple their package delivery range.
Eighteen months ago, we traveled to Rwanda to see how Zipline had made fast, dependable drone delivery a critical part of medical supply infrastructure on a national scale. But outside of Africa, Zipline’s long-distance delivery drones have had to contend with complex and crowded airspace, decades of stale regulation, and a healthcare system that’s at least (sort of) functional, if not particularly agile.
Along with several other drone delivery companies, Zipline has been working with the U.S. Federal Aviation Administration (FAA) on small scale pilot projects over the past year or so to prove out the drone delivery concept, but progress has been slow. Now, though, COVID-19 has put enough additional stress on the U.S. healthcare system that the FAA has granted an emergency waiver to the Part 107 drone rules to allow North Carolina–based Novant Health to partner with Zipline on a beyond line-of-sight autonomous drone delivery service through controlled airspace—the first of its kind in the United States.
Urban delivery drones have their work cut out for them. We’re not really sure whether the whole business model for urban delivery by drone will ultimately make sense, but for it to have a chance at working, drones will certainly benefit from being as energy efficient and time efficient as possible. The biggest waste of time and energy happens during pickup and delivery, when the drone is hovering rather than moving. Hovering is necessary because these drones will often be delivering things (like food and beverages) that need to stay mostly upright, which rules out dropping them by parachute.
If you were imagining the ideal drone delivery system, though, it would involve drones flying about at full speed, somehow picking up and delivering items safely without ever having to slow down. The pinpoint accuracy required by something like this isn’t something that we’re likely to see on a drone anytime soon, but what if we instead did all of that hard work on the ground, using a high speed vision system and robot arm to attach a package to a drone as it flies past? The Ishikawa Group Laboratory at the University of Tokyo has already made it happen.
Named after a dragon from the “Game of Thrones” fantasy series, the Rhaegal-A won’t be making its mark by burninating the countryside. Instead the hybrid-electric cargo drone capable of taking off and landing like a helicopter is in the spotlight today during a U.S. Air Force conference about “flying car” technologies.
A year ago, we visited Rwanda to see how Zipline’s autonomous, fixed-wing delivery drones were providing blood to hospitals and clinics across the country. We were impressed with both Zipline’s system design (involving dramatic catapult launches, parachute drops, and mid-air drone catching) and their model of operations, which minimizes waste while making critical supplies available in minutes almost anywhere in the country.
Since then, Zipline has expanded into Ghana and has plans to start flying in India as well, but the COVID-19 pandemic is changing everything. Africa is preparing for the worst, while in the United States, Zipline is working with the Federal Aviation Administration to try to expedite safety and regulatory approvals for an emergency humanitarian mission, with the goal of launching a medical supply delivery network that could help people maintain social distancing or quarantine when necessary by delivering urgent medication nearly to their doorsteps.
In addition to its existing role delivering blood products and medication, Zipline is acting as a centralized distribution network for COVID-19 supplies in Ghana and Rwanda. Things like personal protective equipment (PPE) will be delivered as needed by drone, ensuring that demand is met across the entire healthcare network. This has been a problem in the United States—getting existing supplies where they’re needed takes a lot of organization and coordination, which the U.S. government has found to be a challenge.
Zipline says that their drones are able to reduce human involvement in the supply chain (a vector for infection), while reducing hospital overcrowding by making it more practical for non-urgent patients to receive care in local clinics closer to home. COVID-19 is also having indirect effects on healthcare, with social distancing and community lockdowns straining blood supplies. With its centralized distribution model, Zipline has helped Rwanda to essentially eliminate wasted (expired) blood products. “We probably waste more blood [in the United States] than is used in all of Rwanda,” Zipline CEO Keller Rinaudo told us. But it’s going to take more than blood supply to fight COVID-19, and it may hit Africa particularly hard.
“Things are earlier in Africa, you don’t see infections at the scale that we’re seeing in the U.S.,” says Rinaudo. “I also think Africa is responding much faster. Part of that is the benefit of seeing what’s happening in countries that didn’t take it seriously in the first few months where community spreading gets completely out of control. But it’s quite possible that COVID is going to be much more severe in countries that are less capable of locking down, where you have densely populated areas with people who can’t just stay in their house for 45 days.”
In an attempt to prepare for things getting worse, Rinaudo says that Zipline is stocking as many COVID-related products as possible, and they’re also looking at whether they’ll be able to deliver to neighborhood drop-off points, or perhaps directly to homes. “That’s something that Zipline has been on track to do for quite some time, and we’re considering ways of accelerating that. When everyone’s staying at home, that’s the ideal time for robots to be making deliveries in a contactless way.” This kind of system, Rinaudo points out, would also benefit people with non-COVID healthcare needs, who need to do their best to avoid hospitals. If a combination of telemedicine and home or neighborhood delivery of medical supplies means they can stay home, it would be a benefit for everyone. “This is a transformation of the healthcare system that’s already happening and needs to happen anyway. COVID is just accelerating it.”
For the past year, Zipline, working closely with the FAA, has been planning a localized commercial trial of a medical drone delivery service scheduled to begin in North Carolina this fall. While COVID is more urgent, the work that’s already been done towards this trial puts Zipline in a good position to move quickly, says Rinaudo.
“All of the work that we did with the IPP [UAS Integration Pilot Program] is even more important, given this crisis. It means that we’ve already been working with the FAA in detail, and that’s made it possible for us to have a foundation to build on to help with the COVID-19 response.” Assuming that Zipline and the FAA can find a regulatory path forward, the company could begin setting up distribution centers that can support hospital networks for both interfacility delivery as well as contactless delivery to (eventually) neighborhood points and perhaps even homes. “It’s exactly the use case and value proposition that I was describing for Africa,” Rinaudo says.
Leveraging rapid deployment experience that it has from work with the U.S. Department of Defense, Zipline would launch one distribution center within just a few months of a go-ahead from the FAA. This single distribution center could cover an area representing up to 10 million people. “We definitely want to move quickly here,” Rinaudo tells us. Within 18 months, Zipline could theoretically cover the entire United States, although he admits “that would be an insanely fast roll-out.”
The question, at this point, is how fast the FAA can take action to make innovative projects like this happen. Zipline, as far as we can tell, is ready to go. We also asked Rinaudo whether he thought that hospitals specifically, and the medical system in general, have the bandwidth to adopt a system like Zipline’s in the middle of a pandemic that’s already stretching people and resources to the limit.
“In the U.S. there’s this sense that this technology is impossible, whereas it’s already operating at multi-national scale, serving thousands of hospitals and health facilities, and it’s completely boring to the people who are benefiting from it,” Rinaudo says. “People in the U.S. have really not caught on that this is something that’s reliable and can dramatically improve our response to crises like this.”
A version of this article was originally published on Medium. The views expressed here are solely those of the authors and do not represent positions of IEEE Spectrum or the IEEE.
We here at Skydio have been developing and deploying machine learning systems for years due to their ability to scale and improve with data. However, to date our learning systems have only been used for interpreting information about the world; in this post, we present our first machine learning system for actually acting in the world.
Using a novel learning algorithm, the Skydio autonomy engine, and only 3 hours of “off-policy” logged data, we trained a deep neural network pilot that is capable of filming and tracking a subject while avoiding obstacles.
We’ve all seen drone displays—massive swarms of tiny drones, each carrying a light, that swarm together in carefully choreographed patterns to form giant (albeit very low resolution) 3D shapes in the sky at night. It’s cool, but it’s not particularly novel anymore, and without thousands of drones, the amount of detail that you can expect out of the display is not all that great.
CollMot Entertainment, a Hungarian company that puts on traditional drone shows, has been working on something a little bit different. Instead of using drones as pixels, they’ve developed a system that uses drones to generate an enormous screen in the sky, and then laser projectors draw on that screen to create “the largest 3D display you have ever seen.”
Drones of all sorts are getting smaller and cheaper, and that’s great—it makes them more accessible to everyone, and opens up new use cases for which big expensive drones would be, you know, too big and expensive. The problem with very small drones, particularly those with fixed-wing designs, is that they tend to be inefficient fliers, and are very susceptible to wind gusts as well as air turbulence caused by objects that they might be flying close to. Unfortunately, designing for resilience and designing for efficiency are two different things: Efficient wings are long and thin, and resilient wings are short and fat. You can’t really do both at the same time, but that’s okay, because if you tried to make long and thin wings for micro aerial vehicles (MAVs) they’d likely just snap off. So stubby wings it is!
In a paper published this week in Science Robotics, researchers from Brown University and EPFL are presenting a new wing design that’s able to deliver both highly efficient flight and robustness to turbulence at the same time. A prototype 100-gram MAV using this wing design can fly for nearly 3 hours, which is four times longer than similar drones with conventional wings. How did they come up with a wing design that offered such a massive improvement? Well, they didn’t—they stole it, from birds.
Birds have been doing their flying thing with flexible and feathery wings for about a hundred million years, give or take. And about a hundred years ago, give or take, humans decided that you know what, birds may be the experts but we’re just going to go off in our own direction with mostly rigid wings and propellers and stuff, because it’s easier or whatever. A century later, we’re still doing the rigid wings with discrete flappy bits, while birds (one has to assume) continue to judge us for our poor choices.
In a paper published today in Science Robotics, researchers at Stanford University have presented some new work on understanding exactly how birds maintain control by morphing the shape of their wings. They put together a flying robot called PigeonBot with a pair of “biohybrid morphing wings” to test out new control principles, and instead of trying to develop some kind of fancy new artificial feather system, they did something that makes a lot more sense: they cheated, by just using real feathers instead.
Let me begin this review by saying that the Skydio 2 is one of the most impressive robots that I have ever seen. Over the last decade, I’ve spent enough time around robots to have a very good sense of what kinds of things are particularly challenging for them, and to set my expectations accordingly. Those expectations include things like “unstructured environments are basically impossible” and “full autonomy is impractically expensive” and “robot videos rarely reflect reality.”
Skydio’s newest drone is an exception to all of this. It’s able to fly autonomously at speed through complex environments in challenging real-world conditions in a way that’s completely effortless and stress-free for the end user, allowing you to capture the kind of video that would be otherwise impossible, even (I’m guessing) for professional drone pilots. When you see this technology in action, it’s (almost) indistinguishable from magic.
It’s a beautiful morning on the waters of Alaska’s Peril Strait—clear, calm, silent, and just a little cool. A small but seaworthy research vessel glides through gentle swells. Suddenly, in the distance, a humpback whale the size of a school bus explodes out of the water. Enormous bursts of air and water jet out of its blowholes like a fire hose, the noise echoing between the banks.
“Blow at eleven o’clock!” cries the lookout, and the small boat swarms with activity. A crew member wearing a helmet and cut-proof gloves raises a large quadcopter drone over his head, as if offering it to the sun, which glints off the half dozen plastic petri dishes velcroed to the drone.
Farther back in the boat, the drone pilot calls, “Starting engines in 3, 2, 1! Takeoff in 3, 2, 1!” The drone’s engines buzz as it zooms 20 meters into the air and then darts off toward where the whale just dipped below the water’s surface. With luck, the whale will spout again nearby, and the drone will be there when it does.
The drone is a modified DJI Inspire 2. About the size of a toaster oven, it’s generally sold to photographers, cinematographers, and well-heeled hobbyists, but this particular drone is on a serious mission: to monitor the health of whales, the ocean, and by extension, the planet. The petri dishes it carries collect the exhaled breath condensate of a whale—a.k.a. snot—which holds valuable information about the creature’s health, diet, and other qualities. Hence the drone’s name: the Parley SnotBot.
The flyer comes standard with a forward-facing camera for navigation, collision-avoidance detectors, ultrasonic and barometric sensors to track altitude, and a GPS locator. With the addition of a high-definition video camera on a stabilized gimbal that can be directed independently, it can stream 1080p video live while simultaneously storing the video on a microSD card as well as high-resolution images on a 1-terabyte solid-state drive. Given that both cameras run during the entire 26 minutes of a typical flight, that’s a lot of data. More on what we are doing with that data later, but first, a bit of SnotBot history.
Petri dishes aboard SnotBot collect whale exhalate for later analysis. Photo: Christian Miller/Ocean Alliance
Intel Labs Research Scientist Bryn Keller, holding the drone, and Iain Kerr, the chief executive officer of Ocean Alliance, retrieve petri dishes that were sprayed with whale exhalation. Photo: Christian Miller/Ocean Alliance
Bryn Keller checks SnotBot before a launch. Photo: Christian Miller/Ocean Alliance
Iain Kerr, sitting under the awning, and other researchers send SnotBot on a hunt for whales in the Gulf of California. Photo: Christian Miller/Ocean Alliance
Iain Kerr was one of the early pioneers in using drones as a platform to collect and analyze whale exhalation. He’s the CEO of Ocean Alliance, in Gloucester, Mass., a group dedicated to protecting whales and the world’s oceans. Whale biologists know that whale snot contains an enormous amount of biological information, including DNA, hormones, and microorganisms. Scientists can use that information to determine a whale’s health, sex, and pregnancy status, and details about its genetics and microbiome. The traditional and most often used technique for collecting that kind of information is to zoom past a surfacing whale in a boat and shoot it with a specially designed crossbow to capture a small core sample of skin and blubber. The process is stressful for both researchers and whales.
Researchers had demonstrated that whale snot can be a viable replacement for blubber samples, but collection involved reaching out over whales using long, awkward poles—difficult, to say the least. The development of small but powerful commercial drones inspired Kerr to launch an exploratory research project in 2015 to go after whale snot with drones. He received the first U.S. National Oceanic and Atmospheric Administration (NOAA) research permit for collecting whale snot in U.S. waters. Since then, there have been dozens of SnotBot missions around the world, in the waters off Alaska, Gabon, Mexico, and other places where whales like to congregate, and the idea has spread to other teams around the globe.
The SnotBot design continues to evolve. The earliest versions tried to capture snot by trailing gauzy cloth below the drone. The hanging cloth turned out to be difficult to work with, however, and the material itself interfered with some of the lab tests, so the researchers scrapped that method. The developers didn’t consider using petri dishes at first, because they assumed that if the drone flew directly into a whale’s spout, the rotor wash would interfere with collection. Eventually, though, they tried the petri dishes and were happy to discover that the rotors’ downdraft improved rather than hindered collection.
For each mission, the collection goals have been slightly different, and the team tweaks the design of the craft accordingly. On one mission, the focus might be to survey an area, getting samples from as many whales as possible. The next mission might be a “focal follow,” in which the team tracks one whale over a period of hours or days, taking multiple samples so that they can understand things like how a whale’s hormone levels change throughout the day, either from natural processes or as a response to environmental factors.
Collecting and analyzing snot is certainly an important way to assess whale health, but the SnotBot team suspected that the drone could do more. In early 2017, staffers from Parley for the Oceans, a nonprofit environmental group that was working with Ocean Alliance on the SnotBot project, contacted one of us (Willke) to find out just how much more.
Willke is a machine-learning and artificial-intelligence researcher who leads Intel’s Brain-Inspired Computing Lab, in Hillsboro, Ore. He immediately saw ways of expanding the information gathered by SnotBot. Willke enlisted two researchers in his lab—coauthor Keller and Javier Turek—and the three of us got to work on enhancing SnotBot’s mission.
The quadcopters used in the SnotBot project carry high-quality cameras with advanced auto-stabilization features. The drone pilot relies on the high-definition video being streamed back to the boat to fly the aircraft and collect the snot. We knew that these same video streams could simultaneously feed into a computer on the boat and be processed in real time. Could that information help assess whale health?
Working with Ocean Alliance scientists, we first came up with a tool that analyzes a photo of a whale’s tail flukes and, using a database of whale photographs collected by the Alaska Whale Foundation, identifies individual whales by the shape of the fluke and its black and white patterns. Identifying each whale allows researchers to correlate snot samples over time.
Such identification can also help whale biologists cope with tricky regulatory issues. For example, there are at least two breeding populations of humpback whales that migrate to Alaska. Most come from Hawaii, but a smaller group comes from Mexico. The Mexican population is under greater stress at the moment, and so NOAA requests that researchers focus on the healthier, Hawaiian whales and leave the Mexican whales alone as much as possible. However, both populations are exactly the same species and thus indistinguishable from each other as a group. The ability to recognize individual whales allows researchers to determine whether a whale has previously been spotted in Mexico or Hawaii, so that they can act appropriately to comply with the regulation.
We also developed software that analyzes the shape of a whale from an overhead shot, taken about 25 meters directly above the whale. Since a skinny whale is often a sick one or one that hasn’t been getting enough to eat, even that simple metric can be a powerful indicator of well-being.
The biggest challenge in developing these tools was what’s called data starvation—there just wasn’t enough data. A standard deep-learning algorithm would look at a huge set of images and then figure out and extract the key distinguishing features of a whale. In the case of the fluke-ID tool, there were only a few pictures of each whale in the catalog, and these were often too low quality to be useful. For overhead health monitoring, there were likewise too few photos or videos of whales shot with the right camera, from the right angle, under the right conditions.
To address these problems, our team turned to classic computer-vision techniques to extract what we considered the most useful data. For example, we used edge-detection algorithms to find and measure the trailing edge of a fluke, then obtained the grayscale values of all the pixels in a line extending from the center notch of the fluke to the outer tips. We trained a small but effective neural network on this data alone. If more data had been available, a deep-learning approach would have worked better than our approach did, but we had to work with the limited data we had.
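As a rough sketch of that sampling step, the following Python snippet extracts a fixed-length grayscale profile along the line from the fluke’s center notch to an outer tip. The function names and the nearest-pixel sampling are illustrative assumptions of ours, not the production code:

```python
# Hypothetical sketch: sample grayscale values along the notch-to-tip line
# of a fluke image, producing a fixed-length feature vector suitable for
# a small classifier. Names and sampling strategy are our own illustration.

def line_profile(image, start, end, n_samples=64):
    """Sample n_samples grayscale values along the segment start -> end.

    image : 2D list of grayscale pixel values (rows of pixels)
    start : (row, col) of the fluke's center notch
    end   : (row, col) of one outer tip
    """
    (r0, c0), (r1, c1) = start, end
    profile = []
    for i in range(n_samples):
        t = i / (n_samples - 1)           # parameter along the segment, 0..1
        r = round(r0 + t * (r1 - r0))     # nearest-pixel sampling
        c = round(c0 + t * (c1 - c0))
        profile.append(image[r][c])
    return profile

# Tiny synthetic "image": a bright diagonal stripe on a dark background.
img = [[255 if r == c else 0 for c in range(8)] for r in range(8)]
feature = line_profile(img, (0, 0), (7, 7), n_samples=8)  # [255, 255, ...]
```

Because every fluke yields the same number of samples regardless of image size, profiles from different photos can be compared directly or fed to a small neural network.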
The latest model of SnotBot flies into action, with custom mounting points for petri dishes and its new paint scheme, designed to camouflage it against a cloud-studded sky. Photo: Christian Miller/Ocean Alliance
SnotBot passes through whale exhalation. Photo: Christian Miller/Ocean Alliance
The shape of a whale’s flukes can help software recognize individual whales. Photo: Christian Miller/Ocean Alliance
A humpback whale dives below the surface of the ocean. Photo: Kate Westaway/Getty Images
New discoveries in whale biology have already come from our tools. Besides the ability to distinguish between the Mexican and Hawaiian whale populations, researchers have discovered they can identify whales from their calls, even when the calls were recorded many years previously.
That latter discovery came during the summer of 2017, when we joined Fred Sharpe, an Alaska Whale Foundation researcher and founding board member, to study teams of whales that worked together to feed. While observing a small group of humpback whales, the boat’s underwater microphone picked up a whale feeding call. Sharpe thought it sounded familiar, and so he consulted his database of whale vocalizations. He found a similar call from a whale called Trumpeter that he had recorded some 20 years earlier. But was it really the same whale? There was no way to know for sure from the whale call.
Then a whale surfaced briefly and dove again, letting us capture an image of its flukes. Our software found a match: The flukes indeed belonged to Trumpeter. That told the researchers that adult whale feeding calls likely remain stable for decades, maybe even for life. This insight gave researchers another tool for identifying whales in the wild and improving our understanding of vocal signatures in humpback whales.
Meanwhile, whale-ID tools are getting better all the time. The original SnotBot algorithm that we developed for whale identification has been essentially supplanted by more capable services. One new algorithm relies on the curvature of the trailing edge of the fluke for identification.
SnotBot’s real contribution, it turns out, is in health monitoring. Our shape-analysis tool has been evolving and, in combination with the spray samples, is giving researchers a comprehensive picture of an individual whale’s health. We call this tool Morphometer. We recently teamed up with Kelly Cates, a Ph.D. candidate in marine biology at the University of Alaska Fairbanks, and Fredrik Christiansen, an assistant professor and whale expert at the Aarhus Institute of Advanced Studies, in Denmark, to make the technology more powerful and also easier to use.
Here’s how it works. Researchers who make measurements and assessments of baleen whales—the type of whales that filter-feed—have typically used a technique developed by Christiansen in 2016. (So far the effort has involved humpback and southern right whales, but the process could work for any kind of baleen whale.) The researchers start with photographic prints or images on a computer and hand-measure the body widths of whales in the images at intervals of 5 percent of the overall length from the snout to the notch of the tail flukes. They then feed this set of measurements to software that calculates an estimate of the whale’s volume. From the relationship between the body length and volume, they can determine if an individual whale is relatively fatter or thinner compared with population norms, taking into account the significant but normal changes in girth that occur as whales accumulate energy reserves during the feeding season and then use those energy stores for migration and during the breeding season.
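As a rough illustration of the volume step, here’s a minimal Python sketch that treats the body between adjacent width stations as an elliptical cylinder and sums the segments. The function names and the `height_ratio` parameter are our assumptions; the published method fits a more careful body-shape model:

```python
import math

def whale_volume(length_m, widths_m, height_ratio=1.0):
    """Estimate body volume from widths measured at evenly spaced stations
    along the body (e.g. every 5 percent of length), treating each segment
    as an elliptical cylinder. A simplified sketch, not the published model.

    length_m     : snout-to-fluke-notch length in meters
    widths_m     : body widths (meters) at evenly spaced stations
    height_ratio : assumed dorso-ventral height as a fraction of width
    """
    seg = length_m / (len(widths_m) - 1)       # length of each segment
    volume = 0.0
    for w0, w1 in zip(widths_m, widths_m[1:]):
        w = 0.5 * (w0 + w1)                    # mean width of the segment
        area = math.pi * (w / 2) * (w * height_ratio / 2)  # ellipse area
        volume += area * seg
    return volume
```

With volume in hand, comparing it against the volume expected for the whale’s length gives the fatter/thinner judgment described above.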
Morphometer also uses photos, but it measures the whale’s width continuously at the highest resolution possible given the quality of the photo, yielding hundreds of width measurements for each animal, instead of only the small number of measurements that are feasible for human researchers. The result is thus much more accurate. It also processes the data much faster than a human could, allowing biologists to focus on biology rather than doing tedious measurements by hand.
To improve Morphometer, we trained a deep-learning system on images of humpback and southern right whales in all sorts of different weather, water, and lighting conditions to allow it to understand exactly which pixels in an image belong to a whale. Once a whale has been singled out, the system identifies the head and tail and then measures the whale’s length and width at each pixel point along the outline of its body. Our software tracks the altitude from which the drone photographed the whale and combines that data with camera specifications entered by the drone operator, allowing the system to automatically convert the measurements from pixels to meters.
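The pixels-to-meters conversion follows from simple pinhole-camera geometry: the ground footprint of the image divided by its width in pixels. A hedged sketch (the camera numbers below are made-up examples, not a specific drone’s specs):

```python
def meters_per_pixel(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance (meters per pixel) for a straight-down photo,
    from pinhole-camera geometry: ground footprint / image width in pixels."""
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

# Made-up example: a shot from 25 m altitude with a 10 mm lens,
# a 10 mm-wide sensor, and a 1000-pixel-wide image.
gsd = meters_per_pixel(25.0, 10.0, 10.0, 1000)   # 0.025 m per pixel
whale_length_m = 480 * gsd                        # a whale spanning 480 pixels
```

Any measurement made in pixels, width or length, scales by the same factor, which is why logging the drone’s altitude alongside each photo matters.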
Morphometer compares this whale with others of its body type, displaying the result as an image of the subject whale superimposed on a whale-shape color-coded diagram with zones indicating the average measurements of similar whales. It’s immediately obvious if the whale is normal size, underweight, or larger than average, as would be the case with pregnant females [see illustration, “Measuring Up”].
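Conceptually, the color-coded zones amount to comparing each width measurement against population statistics for similar whales. A minimal sketch, with thresholds, labels, and function names of our own choosing:

```python
def classify_condition(widths, pop_mean, pop_std, z=1.0):
    """Label each width station relative to population norms.

    widths   : this whale's widths (meters) at fixed body stations
    pop_mean : population mean width at each station
    pop_std  : population standard deviation at each station
    z        : how many standard deviations count as "normal"

    Returns 'underweight', 'normal', or 'above average' per station.
    The z = 1.0 threshold is an illustrative assumption.
    """
    labels = []
    for w, m, s in zip(widths, pop_mean, pop_std):
        if w < m - z * s:
            labels.append('underweight')
        elif w > m + z * s:
            labels.append('above average')
        else:
            labels.append('normal')
    return labels
```

A run of “above average” stations through the mid-body, for instance, is the kind of pattern that would flag a likely pregnant female.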
For our early prototype, we input parameters for a “normal” body shape based on age, sex, and other factors. But now Morphometer is in the process of figuring out “normal” for itself by processing large numbers of whale images. Whale researchers who use their own drones to collect whale photos have been sending us their images. Eventually, we envision setting up a collaborative website that would allow images and morphometry models to be shared among researchers. We also plan to adapt Morphometer to analyze videos of whales, automatically extracting the frames or clips in which the whale’s position and visibility are the best.
To help researchers gain a more complete picture, we’re building statistical models of various whale populations, which we will compare to models derived from human-estimated measurements. Then we’ll take new photos of whales whose age and gender are known, and see whether the software correctly classifies them and gives appropriate indications of health; we’ll have whale biologists verify the results.
Once this model is working reliably, we expect to be able to say how a given whale’s size compares with those of its peers of the same gender, in the same region, at the same time of year. We’ll also be able to identify historical trends—for example, this whale is no skinnier than average compared with last year, but it is much skinnier than whales in its class were a few decades ago, assuming comparison data exists. If, in addition, we have snot from the same whale, we can create a more complete profile of the whale, in the same way your credit card company can tell a lot about you by integrating your personal data with the averages and variances in the general population.
So far, SnotBot has told us a lot about the health of individual whales. Soon, researchers will start using this data to monitor the health of oceans. Whales are known as “apex predators,” meaning they are at the top of the food chain. Humpback whales in particular are generalist foragers and have wide-ranging migration patterns, which make them an excellent early-warning system for environmental threats to the ocean as a whole.
This is where SnotBot can really make a difference. We all depend on the oceans for our survival. Besides the vast amount of food they produce, we depend on them for the air we breathe: Most of the oxygen in the atmosphere comes from marine organisms such as phytoplankton and algae.
Lately, a drop in ocean productivity associated with a North Pacific warm-water anomaly, or “blob,” has resulted in a reduction in births and more reports of skinny whales, and that should worry us. If conditions are bad for whales, they’re also bad for humans. Thanks to Project SnotBot, we’ll be able to find out—accurately, efficiently, and at a reasonable cost—just how the health and numbers of whales in our oceans are trending. With that information, we hope, we will be able to spur society to take steps to protect the oceans before it’s too late.
The whale images in this article were obtained under National Marine Fisheries Service permits 18636-01 and 19703.
This article appears in the December 2019 print issue as “SnotBot: A Whale of a Deep-Learning Project.”
About the Authors
Bryn Keller is a deep-learning research scientist in Intel’s Brain-Inspired Computing Lab. Senior principal engineer Ted Willke is the lab’s director.
As useful as drones are up in the air, the process of getting them there tends to be annoying at best and dangerous at worst. Consider what it takes to launch something as simple as a DJI Mavic or a Parrot Anafi—you need to find a flat spot free of debris or obstructions, unfold the thing and let it boot up and calibrate and whatnot, stand somewhere safe(ish), and then get it airborne and high enough, quickly enough, to avoid hitting any people or things that you care about.
I’m obviously being a little bit dramatic here, but ground launching drones is certainly both time consuming and risky, and there are occasions where getting a drone into the air as quickly and as safely as possible is a priority. At IROS in Macau earlier this month, researchers from Caltech and NASA’s Jet Propulsion Laboratory (JPL) presented a prototype for a ballistically launched drone—a football-shaped foldable quadrotor that gets fired out of a cannon, unfolds itself, and then flies off.
We usually don’t toss around the word “disrupting” in a technology context without some serious eye roll. But Zipline really has been disrupting medical supply delivery in Africa by using drones to bypass busy roads and hilly terrain to deliver medical supplies to hospitals and clinics in minutes rather than hours. We visited Zipline in Rwanda last year, and the system it has for delivering blood, blood products, and medication is versatile, reliable, and even (in some cases) more affordable than any other delivery method available.
It’s not at all surprising that the unique capabilities Zipline offers have caught the attention of the U.S. military, which (at least in terms of personnel ratios) is primarily a massive logistics and support organization and secondarily a fighting force. For the past year or so, the Defense Department’s Defense Innovation Unit (DIU) has been working with Zipline to evaluate how its technology could be used to help the U.S. Marine Corps. In July, Zipline deployed to Australia to participate in a joint military exercise to demonstrate “how its instant drone delivery capability could help save lives in austere and tactical emergency environments, which include live-fire artillery.”
World’s first small-scale topographic and bathymetric scanning LiDAR
ASTRALiTe’s edge™ is the world’s first small-scale topographic and bathymetric scanning LiDAR that can detect small underwater objects, measure shallow water depth, and survey critical underwater infrastructure from a small UAV platform.
The edge™ can see beneath the water surface at depths from 0 to 5 meters and is completely self-contained, with its own Inertial Navigation System with GNSS, battery, and onboard computer. It weighs about 5 kg and is designed for deployment on UAV systems for faster, safer, and more accurate bathymetric surveys. This patented 2-in-1 topographic and bathymetric LiDAR offers centimeter-level depth resolution. There are numerous possible applications for this LiDAR, such as coastal mapping and surveying, infrastructure inspection, or even military logistics.
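To see why measuring through water differs from ordinary topographic lidar, here is a toy version of the timing arithmetic, assuming a nadir-pointing pulse and ignoring waves and off-nadir refraction, which real systems such as the edge™ must handle:

```python
# Toy depth calculation for a nadir-pointing bathymetric lidar pulse.
# This only shows the basic timing arithmetic, not any vendor's algorithm.

C_AIR = 299_792_458.0      # speed of light in air (m/s), approximately
N_WATER = 1.33             # refractive index of water

def water_depth(surface_return_s: float, bottom_return_s: float) -> float:
    """Depth from the round-trip time gap between surface and bottom echoes."""
    dt = bottom_return_s - surface_return_s   # extra round-trip time spent in water
    v_water = C_AIR / N_WATER                 # light travels slower in water
    return v_water * dt / 2.0                 # divide by 2: down and back up

# A roughly 2 m water column delays the bottom echo by about 17.7 nanoseconds
print(water_depth(0.0, 17.7e-9))
```

The nanosecond-scale gap between the two returns is why centimeter-level depth resolution demands very precise timing, and why accurate geo-referencing of the sensor itself matters so much.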
Importance of geo-referencing and motion stabilization
“We needed a motion and navigation solution for our LiDAR. Our requirements included high accuracy along with low size, weight, and power,” explains Andy Gisler, Director of Lidar Systems with ASTRALiTe. In addition, the system needed to be able to apply Post-Processing Kinematic (PPK) corrections to the LiDAR data to provide higher-accuracy results to ASTRALiTe’s customers.
The LiDAR produces a comprehensive point cloud that must be motion-compensated and geo-referenced to be usable. Two methods can reach the centimeter-level accuracy surveyors require. The first is Real-Time Kinematic (RTK) positioning, which applies corrections obtained in real time from a base station or a base-station network over a radio or GSM link. The second, PPK, is applied after the mission using post-processing software. The software applies the same corrections as RTK, but it also re-computes all the inertial data and raw GNSS observables with a forward-backward-merge algorithm, correcting the entire trajectory, filling in any gaps in position, and greatly improving overall accuracy.
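The forward-backward idea can be illustrated with a toy one-dimensional smoother. This is a generic sketch of the principle only, not SBG Systems' actual algorithm:

```python
# Toy illustration of forward-backward smoothing, the idea behind PPK's
# accuracy gain over real-time (causal-only) filtering.

def forward_filter(meas, alpha=0.3):
    """Causal exponential filter: each estimate uses only past measurements,
    which is all a real-time (RTK-style) system ever has."""
    est, out = meas[0], []
    for m in meas:
        est = alpha * m + (1 - alpha) * est
        out.append(est)
    return out

def smooth(meas, alpha=0.3):
    """Run the same filter forward and backward in time, then average.
    Each point now benefits from measurements on both sides of it."""
    fwd = forward_filter(meas, alpha)
    bwd = forward_filter(meas[::-1], alpha)[::-1]
    return [(f + b) / 2 for f, b in zip(fwd, bwd)]

# Noisy position samples around a true value of 10.0
noisy = [10.4, 9.7, 10.2, 9.8, 10.3, 9.9, 10.1, 9.6]
smoothed = smooth(noisy)
print(smoothed)
```

Because the backward pass sees "future" measurements that a real-time filter cannot, the merged trajectory is less noisy than either pass alone, which is exactly why post-processing the same raw data yields better accuracy than RTK.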
This PPK software gives access to offline RTK corrections from more than 7,000 base stations located in 164 countries and is designed to help UAV integrators get the most out of their GNSS or INS/GNSS solution.
About SBG Systems INS/GNSS
SBG Systems is an international company that develops inertial measurement units with embedded GNSS, ranging from miniature to high-accuracy models. Combining cutting-edge calibration techniques with advanced embedded algorithms, SBG Systems manufactures inertial solutions for industrial and research projects such as unmanned vehicle control (land, marine, and aerial), antenna tracking, camera stabilization, and surveying applications.
The word “autonomy” in the context of drones (or really any other robot) can mean a whole bunch of different things. Skydio’s newest drone, which you can read lots more about here, is probably the most autonomous drone that we’ve ever seen, in the sense that it can fly itself while tracking subjects and avoiding obstacles. But as soon as the Skydio 2 lands, it’s completely helpless, dependent on a human to pick it up, pack it into a case, and take it back home to recharge.
For consumer applications, this is not a big deal. But for industry, a big part of the appeal of autonomy is being able to deliver results with a minimum of human involvement, since humans are expensive and almost always busy doing other things.
Today, Skydio is announcing the Skydio 2 Dock, a (mostly) self-contained home base that a Skydio 2 drone can snuggle up inside to relax and recharge in between autonomous missions, meaning that you can set it up almost anywhere and get true long-term full autonomy from your drone.
When detectives and other forensics specialists arrive at a crime scene, there is a pressing need to survey the area quickly. Environmental disturbances such as wind or an incoming tide could ruin valuable evidence, and even the investigators themselves are at risk of contaminating the crime scene. Could a fleet of evidence-surveying drones be of help?
Pompílio Araújo, a criminal expert for the Federal Police of Brazil, is responsible for recording crime scenes exactly as found. In his other role as a researcher at the Intelligent Vision Research Lab at Federal University of Bahia, he is trying to make his first job easier by developing drones that can—very quickly—home in on a piece of evidence and record it from multiple angles.
When Skydio announced the R1 in early 2018, it was one of the most incredible drones we’d ever seen. It’s been a year and a half, and in the fast-paced world of drones, the Skydio R1 is somehow still, by a huge margin, the most intelligent and capable drone in existence, offering a level of autonomy that would be impressive even if it were a one-off research project—which it wasn’t, because you could buy one for US $2,500.
The R1, though, was really not intended to be a consumer drone in the sense that it wasn’t a direct competitor to the likes of DJI, which has overwhelmingly dominated the consumer drone space since the early days of consumer drones. Rather, the R1 was meant to demonstrate exactly what Skydio was capable of, offering the chosen few who could justify paying for one a magical experience that couldn’t be found anywhere else.
Today, Skydio is announcing its second drone: the Skydio 2. The Skydio 2 takes everything that made the R1 so amazing, and squeezes it into something smaller, smarter, and at $999, alarmingly close to affordable.