Tag Archives: Robotics/Drones

Nothing Can Keep This Drone Down

It uses a beetle-inspired set of wings to self-right itself

Post Syndicated from IEEE Spectrum original https://spectrum.ieee.org/tech-talk/robotics/drones/nothing-can-keep-this-drone-down


When life knocks you down, you’ve got to get back up. Ladybugs take this advice seriously in the most literal sense. If caught on their backs, the insects are able to use their tough exterior wings, called elytra (of late made famous in the game Minecraft), to right themselves in just a fraction of a second.

Inspired by this approach, researchers have created self-righting drones with artificial elytra. Simulations and experiments show that the artificial elytra can not only help salvage fixed-wing drones from compromising positions, but also improve the aerodynamics of the vehicles during flight. The results are described in a study published July 9 in IEEE Robotics and Automation Letters.

Charalampos Vourtsis, a doctoral assistant at the Laboratory of Intelligent Systems at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, co-created the new design. He notes that beetles, including ladybugs, have existed for tens of millions of years. “Over that time, they have developed several survival mechanisms that we found to be a source of inspiration for applications in modern robotics,” he says.

His team was particularly intrigued by beetles’ elytra, which in ladybugs form the famous black-spotted, red exterior wings. Underneath the elytra are the hind wings, the semi-transparent appendages that are actually used for flight.

When stuck on their backs, ladybugs use their elytra to stabilize themselves, and then thrust their legs or hind wings in order to pitch over and self-right. Vourtsis’ team designed Micro Aerial Vehicles (MAVs) that use a similar technique, but with actuators to provide the self-righting force. “Similar to the insect, the artificial elytra feature degrees of freedom that allow them to reorient the vehicle if it flips over or lands upside down,” explains Vourtsis.

The researchers created and tested artificial elytra of different lengths (11, 14 and 17 centimeters) and torques to determine the most effective combination for self-righting a fixed-wing drone. While torque had little impact on performance, the length of elytra was found to be influential.

On a flat, hard surface, the shorter elytra lengths yielded mixed results. However, the longer length was associated with a perfect success rate. The longer elytra were then tested on different inclines of 10°, 20° and 30°, and at different orientations. The drones used the elytra to self-right themselves in all scenarios, except for one position at the steepest incline.  

The design was also tested on seven different terrains: pavement, coarse sand, fine sand, rocks, shells, wood chips and grass. The drones were able to self-right with a perfect success rate across all terrains, with the exception of grass and fine sand. Vourtsis notes that the current design was made from widely available materials and a simple scale model of the beetle’s elytra—but further optimization may help the drones self-right on these more difficult terrains.

As an added bonus, the elytra were found to add non-negligible lift during flight, which offsets their weight.  

Vourtsis says his team hopes to benefit from other design features of the beetles’ elytra. “We are currently investigating elytra for protecting folding wings when the drone moves on the ground among bushes, stones, and other obstacles, just like beetles do,” explains Vourtsis. “That would enable drones to fly long distances with large, unfolded wings, and safely land and locomote in a compact format in narrow spaces.”

Parrot Announces A Bug-Inspired 4G Drone

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/parrot-announces-anafi-ai-a-buginspired-4g-drone

Parrot released the Anafi drone almost exactly four years ago. I’m still a fan of the little consumer drone—the design is elegant, it’s exceptionally portable, the camera is great, and it’s easy to fly. But the biggest problem with the Anafi (especially four years later) is that it’s very much not the cleverest of drones, without any of the onboard obstacle avoidance that has now become standard. Today, Parrot is announcing the Anafi AI, a butt-forward redesign of the Anafi for pros that adds obstacle avoidance, an enormous camera, and 4G connectivity that allows the drone to be flown anywhere (and behind any object) you can get a reliable 4G signal.

To Fly a Drone in the U.S., You Now Must Pass FAA’s TRUST Test

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/all-recreational-drone-pilots-must-now-past-the-faas-trust-test

For years, the U.S. Federal Aviation Administration (FAA) has been shuffling its way towards some semblance of regulation of the enormous number of drones now in the hands of recreational pilots in the United States. The fact that anyone can run out and buy a cheap drone at a nearby store, charge the battery, and launch the thing has got to be stupendously annoying for the FAA. One of its jobs, after all, is to impress upon people that doing something like that is not always sensible.

Perhaps coming to terms with its unfortunate (albeit quite necessary) role as a bit of a buzzkill, the FAA has been desperately trying to find ways of getting recreational drone pilots to at least read the rules they’re supposed to be following, without resorting to a burdensome new regulatory infrastructure. Its strategy seems to be something like, “we’re going to require drone pilots to do a couple of things, but those things will be so painless that nobody can possibly object.” The first of those things is registering your drone if it weighs more than 0.55 pound (250 grams), and the second, just announced this week, is the TRUST testing requirement for all recreational drone pilots.

Review: DJI’s New FPV Drone is Effortless, Exhilarating Fun

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/review-djis-new-fpv-drone-is-effortless-exhilarating-fun

In my experience, there are three types of consumer drone pilots. You’ve got people for whom drones are a tool for taking pictures and video, where flying the drone is more or less just a necessary component of that. You’ve also got people who want a drone that can take pictures or video of them, without their having to be bothered flying the drone at all. Then you have people for whom flying the drone itself is the appealing part—people who like flying fast and creatively because it’s challenging and exciting and fun. And that typically means flying in First Person View, or FPV, where it feels like you’re a tiny little human sitting in a virtual cockpit inside your drone.

For that last group of folks, the barrier to entry is high. Or rather, the barriers are high, because there are several. Not only is the equipment expensive, but you often have to assemble your own system of drone, FPV goggles, transmitter, and receiver. And on top of that, it takes a lot of skill to fly an FPV drone well, and all of the inevitable crashes just add to the expense.

Today, DJI is announcing a new consumer first-person view drone system that includes everything you need to get started. You get an expertly designed and fully integrated high-speed FPV drone, a pair of FPV goggles with exceptional image quality and latency that’s some of the best we’ve ever seen, plus a physical controller to make it all work. Most importantly, though, there’s on-board obstacle avoidance plus piloting assistance that means even a complete novice can be zipping around with safety and confidence on day one.

Search-and-Rescue Drone Locates Victims By Homing in on Their Phones

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/robotics/drones/searchandrescue-drone-locates-victims-by-homing-in-on-their-phones


When a natural disaster strikes, first responders must move quickly to search for survivors. To support the search-and-rescue efforts, one group of innovators in Europe has succeeded in harnessing the power of drones, AI, and smartphones, all in one novel combination.

Their idea is to use a single drone as a moving cellular base station, which can do large sweeps over disaster areas and locate survivors using signals from their phones. AI helps the drone methodically survey the area and even estimate the trajectory of survivors who are moving.

The team built its platform, called Search-And-Rescue DrOne based solution (SARDO), using off-the-shelf hardware and tested it in field experiments and simulations. They describe the results in a study published 13 January in IEEE Transactions on Mobile Computing.

“We built SARDO to provide first responders with an all-in-one victims localization system capable of working in the aftermath of a disaster without existing network infrastructure support,” explains Antonio Albanese, a Research Associate at NEC Laboratories Europe GmbH, which is headquartered in Heidelberg, Germany.

The point is that a natural disaster may knock out cell towers along with other infrastructure. SARDO, which is equipped with a lightweight cellular base station, is a mobile solution that can be deployed regardless of what infrastructure remains after a natural disaster.

To detect and map out the locations of victims, SARDO performs time-of-flight measurements (using the timing of signals emitted by the users’ phones to estimate distance). 

A machine learning algorithm is then applied to the time-of-flight measurements to calculate the positions of victims. The algorithm compensates for when signals are blocked by rubble.
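The study’s exact estimator isn’t reproduced in the article, but the underlying geometry is classic multilateration: each time-of-flight measurement pins the phone to a circle around the drone’s position at that moment, and ranges taken from several waypoints can be combined by least squares. Here is a minimal sketch (the function name, waypoints, and numbers are illustrative, not from the study):

```python
import numpy as np

def locate_phone(anchors, distances):
    """Estimate a 2D phone position from time-of-flight ranges.

    anchors:   (n, 2) drone positions where each range was measured
    distances: (n,)   ranges derived from signal timing
    Linearizes the range equations against the first anchor and
    solves the resulting system by least squares.
    """
    anchors = np.asarray(anchors, float)
    d = np.asarray(distances, float)
    a0, d0 = anchors[0], d[0]
    # |x - a_i|^2 = d_i^2, subtracted pairwise to cancel |x|^2:
    # 2 (a_i - a_0) . x = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# The drone sweeps four waypoints and measures a range at each one.
waypoints = [(0, 0), (100, 0), (0, 100), (100, 100)]
true_pos = np.array([30.0, 40.0])
ranges = [np.linalg.norm(true_pos - np.array(w)) for w in waypoints]
est = locate_phone(waypoints, ranges)   # recovers roughly (30, 40)
```

With noisy ranges, the same least-squares solve simply returns a best-fit position; a learned correction for rubble-blocked signals, as in SARDO, would sit on top of something like this.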

If a victim is on the move in the wake of a disaster, a second machine learning algorithm, tasked with estimating the person’s trajectory based on their current movement, kicks in—potentially helping first responders locate the person sooner.   
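The article doesn’t describe the trajectory model in detail, but the simplest stand-in for it is constant-velocity extrapolation from two earlier position fixes; a learned estimator like SARDO’s would refine this. A sketch (hypothetical function and numbers, not the paper’s actual algorithm):

```python
def predict_position(fixes, t_future):
    """Extrapolate a moving victim's position with a constant-velocity
    model: a minimal stand-in for a learned trajectory estimator.

    fixes: list of (t, x, y) position estimates from earlier sweeps
    """
    (t0, x0, y0), (t1, x1, y1) = fixes[0], fixes[-1]
    vx = (x1 - x0) / (t1 - t0)
    vy = (y1 - y0) / (t1 - t0)
    dt = t_future - t1
    return (x1 + vx * dt, y1 + vy * dt)

# Two fixes 60 s apart; where should the drone look 30 s later?
fixes = [(0, 30.0, 40.0), (60, 36.0, 40.0)]  # drifting east at 0.1 m/s
predict_position(fixes, 90)                  # approximately (39.0, 40.0)
```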

After sweeping an area, the drone is programmed to automatically maneuver closer to the position of a suspected victim to retrieve more accurate distance measurements. If too many errors are interfering with the drone’s ability to locate victims, it’s programmed to enlarge the scanning area.

In their study, Albanese and his colleagues tested SARDO in several field experiments without rubble, and used simulations to test the approach in a scenario where rubble interfered with some signals. In the field experiments, the drone was able to pinpoint the location of missing people to within a few tens of meters, requiring approximately three minutes to locate each victim within a field roughly 200 meters square. As would be expected, SARDO was less accurate when rubble was present or when the drone was flying at higher speeds or altitudes.

Albanese notes that a limitation of SARDO, as with all drone-based approaches, is the drone’s battery life. But, he says, the energy consumption of the NEC team’s design remains relatively low.

The group is consulting the laboratory’s business experts on the possibility of commercializing this tech.  Says Albanese: “There is interest, especially from the public safety divisions, but still no final decision has been taken.”

In the meantime, SARDO may undergo further advances. “We plan to extend SARDO to emergency indoor localization so [it is] capable of working in any emergency scenario where buildings might not be accessible [to human rescuers],” says Albanese.

Folding Drone Can Drop Into Inaccessible Mines

Post Syndicated from Rahul Rao original https://spectrum.ieee.org/automaton/robotics/drones/folding-drone-can-drop-into-inaccessible-mines

Inspecting old mines is a dangerous business. For humans, mines can be lethal: prone to rockfalls and filled with noxious gases. Robots can go where humans might suffocate, but even robots can only do so much when mines are inaccessible from the surface.

Now, researchers in the UK, led by Headlight AI, have developed a drone that could cast a light in the darkness. Named Prometheus, this drone can enter a mine through a borehole not much larger than a football, before unfurling its arms and flying around the void. Once down there, it can use its payload of scanning equipment to map mines where neither humans nor robots can presently go. This, the researchers hope, could make mine inspection quicker and easier. The team behind Prometheus published its design in November in the journal Robotics.

Mine inspection might seem like a peculiarly specific task to fret about, but old mines can collapse, causing the ground to sink and damaging nearby buildings. It’s a far-reaching threat: the geotechnical engineering firm Geoinvestigate, based in Northeast England, estimates that around 8 percent of all buildings in the UK are at risk from any of the thousands of abandoned coal mines near the country’s surface. It’s also a threat to transport, such as road and rail. Indeed, Prometheus is backed by Network Rail, which operates Britain’s railway infrastructure.

Such grave dangers mean that old mines need periodic check-ups. To enter depths that are forbidden to traditional wheeled robots—such as those featured in the DARPA SubT Challenge—inspectors today drill boreholes down into the mine and lower scanners into the darkness.

But that can be an arduous and often fruitless process. Inspecting the entirety of a mine can take multiple boreholes, and that still might not be enough to chart a complete picture. Mines are jagged, labyrinthine places, and much of the void might lie out of sight. Furthermore, many old mines aren’t well-mapped, so it’s hard to tell where best to enter them.

Prometheus can fly around some of those challenges. Inspectors can lower Prometheus, tethered to a docking apparatus, down a single borehole. Once inside the mine, the drone can undock and fly around, using LIDAR scanners—common in mine inspection today—to generate a 3D map of the unknown void. Prometheus can fly through the mine autonomously, using infrared data to plot out its own course.

Other drones exist that can fly underground, but they’re either too small to carry a relatively heavy payload of scanning equipment, or too large to easily fit down a borehole. What makes Prometheus unique is its ability to fold its arms, allowing it to squeeze down spaces its counterparts cannot.

It’s that ability to fold and enter a borehole that makes Prometheus remarkable, says Jason Gross, a professor of mechanical and aerospace engineering at West Virginia University. Gross calls Prometheus “an exciting idea,” but he does note that it has a relatively short flight window and few abilities beyond scanning.

The researchers have conducted a number of successful test flights, both in a basement and in an old mine near Shrewsbury, England. Not only was Prometheus able to map out its space, the drone was able to plot its own course in an unknown area.

The researchers’ next steps, according to Puneet Chhabra, co-founder of Headlight AI, will be to test Prometheus’s ability to unfold in an actual mine. Following that, researchers plan to conduct full-scale test flights by the end of 2021.

New Drone Software Handles Motor Failures Even Without GPS

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/new-drone-software-handles-motor-failures-even-without-gps

Good as some drones are becoming at obstacle avoidance, accidents do still happen. And as far as robots go, drones are very much on the fragile side of things.  Any sort of significant contact between a drone and almost anything else usually results in a catastrophic, out-of-control spin followed by a death plunge to the ground. Bad times. Bad, expensive times.

A few years ago, we saw some interesting research into software that can keep the most common drone form factor, the quadrotor, aloft and controllable even after the failure of one motor. The big caveat to that software was that it relied on GPS for state estimation, meaning that without a GPS signal, the drone is unable to get the information it needs to keep itself under control. In a paper recently accepted to RA-L, researchers at the University of Zurich report that they have developed a vision-based system that brings state estimation completely on-board. The upshot: potentially any drone with some software and a camera can keep itself safe even under the most challenging conditions.

Smellicopter Drone Uses Live Moth Antenna to Track Scents

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/smellicopter-drone-live-moth-antenna

Research into robotic sensing has, understandably I guess, been very human-centric. Most of us navigate and experience the world visually and in 3D, so robots tend to get covered with things like cameras and lidar. Touch is important to us, as is sound, so robots are getting pretty good with understanding tactile and auditory information, too. Smell, though? In most cases, smell doesn’t convey nearly as much information for us, so while it hasn’t exactly been ignored in robotics, it certainly isn’t the sensing modality of choice in most cases.

Part of the problem with smell sensing is that we just don’t have a good way of doing it, from a technical perspective. This has been a challenge for a long time, and it’s why we either bribe or trick animals like dogs, rats, and vultures into being our sensing systems for airborne chemicals. If only they’d do exactly what we wanted them to do all the time, this would be fine, but they don’t, so it’s not.

Until we get better at making chemical sensors, leveraging biology is the best we can do, and what would be ideal would be some sort of robot-animal hybrid cyborg thing. We’ve seen some attempts at remote controlled insects, but as it turns out, you can simplify things if you don’t use the entire insect, but instead just find a way to use its sensing system. Enter the Smellicopter.

New FAA Drone Rules: What Recreational and Commercial Pilots Need to Know

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/faa-drone-rules-what-recreational-and-commercial-pilots-need-to-know

The United States Federal Aviation Administration has been desperately trying to keep up with the proliferation of recreational and commercial drones. It hasn’t been as successful as all of us might have wanted, but some progress is certainly being made, most recently with new rules about flying drones at night and over people and vehicles, as well as a requirement for a remote-identification system on all drones.

Over the next few years, the FAA’s drone rules are going to affect you even if you just fly a drone for fun in your backyard, so we’ll take a detailed look at what changes are coming and how you can prepare.

Dart-Shooting Drone Attacks Trees for Science

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/dart-shooting-drone

We all know how robots are great at going to places where you can’t (or shouldn’t) send a human. We also know how robots are great at doing repetitive tasks. These characteristics have the potential to make robots ideal for setting up wireless sensor networks in hazardous environments—that is, they could deploy a whole bunch of self-contained sensor nodes that create a network that can monitor a very large area for a very long time.

When it comes to using drones to set up sensor networks, you’ve generally got two options: A drone that just drops sensors on the ground (easy but inaccurate and limited locations), or using a drone with some sort of manipulator on it to stick sensors in specific places (complicated and risky). A third option, under development by researchers at Imperial College London’s Aerial Robotics Lab, provides the accuracy of direct contact with the safety and ease of use of passive dropping by instead using the drone as a launching platform for laser-aimed, sensor-equipped darts. 

AI-Powered Drone Learns Extreme Acrobatics

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/ai-powered-drone-extreme-acrobatics

Quadrotors are among the most agile and dynamic machines ever created. In the hands of a skilled human pilot, they can perform astonishing sequences of maneuvers. And while autonomous flying robots have been getting better at flying dynamically in real-world environments, they still haven’t demonstrated the agility of manually piloted ones.

Now researchers from the Robotics and Perception Group at the University of Zurich and ETH Zurich, in collaboration with Intel, have developed a neural network training method that “enables an autonomous quadrotor to fly extreme acrobatic maneuvers with only onboard sensing and computation.” Extreme.

There are two notable things here: First, the quadrotor can do these extreme acrobatics outdoors without any kind of external camera or motion-tracking system to help it out (all sensing and computing is onboard). Second, all of the AI training is done in simulation, without the need for an additional simulation-to-real-world (what researchers call “sim-to-real”) transfer step. Usually, a sim-to-real transfer step means putting your quadrotor into one of those aforementioned external tracking systems, so that it doesn’t completely bork itself while trying to reconcile the differences between the simulated world and the real world, where, as the researchers wrote in a paper describing their system, “even tiny mistakes can result in catastrophic outcomes.”

To enable “zero-shot” sim-to-real transfer, the neural net training in simulation uses an expert controller that knows exactly what’s going on to teach a “student controller” that has much less perfect knowledge. That is, the simulated sensory input that the student ends up using as it learns to follow the expert has been abstracted to present the kind of imperfect, imprecise data it’s going to encounter in the real world. This can involve things like abstracting away the image part of the simulation until you’d have no way of telling the difference between abstracted simulation and abstracted reality, which is what allows the system to make that sim-to-real leap.
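A toy illustration of the principle (not the paper’s actual feature-track pipeline, just a sketch of the idea): two renderings of the same scene can disagree at every pixel, yet an abstraction that keeps only the locations of features makes them indistinguishable.

```python
def render(points, style, size=16):
    """Draw feature points into a size x size grid; 'style' mimics the
    sim-vs-real rendering gap (different background and intensities)."""
    bg, fg = (0, 9) if style == "sim" else (3, 7)
    img = [[bg] * size for _ in range(size)]
    for x, y in points:
        img[y][x] = fg
    return img

def abstract(img):
    """Abstraction function: keep feature locations, drop rendering."""
    peak = max(v for row in img for v in row)
    return sorted((x, y) for y, row in enumerate(img)
                  for x, v in enumerate(row) if v == peak)

scene = [(2, 3), (8, 8), (12, 5)]          # the shared underlying scene
sim, real = render(scene, "sim"), render(scene, "real")
assert sim != real                         # raw images differ...
assert abstract(sim) == abstract(real)     # ...abstractions match
```

A policy trained on the abstracted representation never sees the rendering differences, which is what lets it cross from simulation to reality without a transfer step.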

The simulation environment that the researchers used was Gazebo, slightly modified to better simulate quadrotor physics. Meanwhile, over in reality, a custom 1.5-kilogram quadrotor with a 4:1 thrust-to-weight ratio performed the physical experiments, using only an Nvidia Jetson TX2 computing board and an Intel RealSense T265, a dual fisheye camera module optimized for V-SLAM. To challenge the learning system, it was trained to perform three acrobatic maneuvers (the Power Loop, the Barrel Roll, and the Matty Flip), plus a combination of all of them.

All of these maneuvers require high accelerations of up to 3 g’s and careful control, and the Matty Flip is particularly challenging, at least for humans, because the whole thing is done while the drone is flying backwards. Still, after just a few hours of training in simulation, the drone was totally real-world competent at these tricks, and could even extrapolate a little bit to perform maneuvers that it was not explicitly trained on, like doing multiple loops in a row. Where humans still have the advantage over drones (as you might expect, since we’re talking about robots) is in quickly reacting to novel or unexpected situations. And when you’re doing this sort of thing outdoors, novel and unexpected situations are everywhere, from a gust of wind to a jealous bird.

For more details, we spoke with Antonio Loquercio from the University of Zurich’s Robotics and Perception Group.

IEEE Spectrum: Can you explain how the abstraction layer interfaces with the simulated sensors to enable effective sim-to-real transfer?

Antonio Loquercio: The abstraction layer applies a specific function to the raw sensor information. Exactly the same function is applied to the real and simulated sensors. The result of the function, which is “abstracted sensor measurements,” makes simulated and real observation of the same scene similar. For example, suppose we have a sequence of simulated and real images. We can very easily tell apart the real from the simulated ones given the difference in rendering. But if we apply the abstraction function of “feature tracks,” which are point correspondences in time, it becomes very difficult to tell which are the simulated and real feature tracks, since point correspondences are independent of the rendering. This applies for humans as well as for neural networks: Training policies on raw images gives low sim-to-real transfer (since images are too different between domains), while training on the abstracted images has high transfer abilities.

How useful is visual input from a camera like the Intel RealSense T265 for state estimation during such aggressive maneuvers? Would using an event camera substantially improve state estimation?

Our end-to-end controller does not require a state estimation module. It does, however, share some components with traditional state estimation pipelines, specifically the feature extractor and the inertial measurement unit (IMU) pre-processing and integration function. The inputs of the neural network are feature tracks and integrated IMU measurements. When looking at images with few features (for example, when the camera points to the sky), the neural net will mainly rely on the IMU. When more features are available, the network uses them to correct the accumulated drift from the IMU. Overall, we noticed that for very short maneuvers IMU measurements were sufficient for the task. However, for longer ones, visual information was necessary to successfully address the IMU drift and complete the maneuver. Indeed, visual information reduces the odds of a crash by up to 30 percent in the longest maneuvers. We definitely think that event cameras could improve the current approach even further, since they could provide valuable visual information at high speeds.

You describe being able to train on “maneuvers that stretch the abilities of even expert human pilots.” What are some examples of acrobatics that your drones might be able to do that most human pilots would not be capable of?

The Matty Flip is probably one of the maneuvers that our approach can do very well but human pilots find very challenging. It basically entails doing a high-speed power loop while always looking backward. It is super challenging for humans, since they don’t see where they’re going and have trouble estimating their speed. For our approach the maneuver is no problem at all, since we can estimate forward velocities as well as backward velocities.

What are the limits to the performance of this system?

At the moment the main limitation is the maneuver duration. We never trained a controller that could perform maneuvers longer than 20 seconds. In the future, we plan to address this limitation and train general controllers which can fly in that agile way for significantly longer with relatively small drift. In this way, we could start being competitive against human pilots in drone racing competitions.

Can you talk about how the techniques developed here could be applied beyond drone acrobatics?

The current approach allows us to do acrobatics and agile flight in free space. We are now working to perform agile flight in cluttered environments, which requires a higher degree of understanding of the surroundings than this project did. Drone acrobatics is of course only an example application. We selected it because it serves as a stress test of the controller’s performance. However, several other applications that require fast and agile flight can benefit from our approach. Examples are delivery (we want our Amazon packages always faster, don’t we?), search and rescue, or inspection. Going faster allows us to cover more space in less time, saving battery costs. Indeed, agile flight has very similar battery consumption to slow hovering for an autonomous drone.


“Deep Drone Acrobatics,” by Elia Kaufmann, Antonio Loquercio, René Ranftl, Matthias Müller, Vladlen Koltun, and Davide Scaramuzza from the Robotics and Perception Group at the University of Zurich and ETH Zurich, and Intel’s Intelligent Systems Lab, was presented at RSS 2020.

Why You Should Be Very Skeptical of Ring’s Indoor Security Drone

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/ring-indoor-security-drone

Yesterday, Ring, the smart home company owned by Amazon, announced the Always Home Cam, a “next-level indoor security” system in the form of a small autonomous drone. It costs US $250 and is designed to closely integrate with the rest of Ring’s home security hardware and software. Technologically, it’s impressive. But you almost certainly don’t want one.

Zipline Partners With Walmart on Commercial Drone Delivery

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/zipline-walmart-drone-delivery

Today, Walmart and Zipline are announcing preliminary plans “to bring first-of-its kind drone delivery service to the United States.” What makes this drone-delivery service the first of its kind is that Zipline uses fixed-wing drones rather than rotorcraft, giving them a relatively large payload capacity and very long range at the cost of a significantly more complicated launch, landing, and delivery process. Zipline has made this work very well in Rwanda, and more recently in North Carolina. But expanding into commercial delivery to individual households is a much different challenge. 

Along with a press release that doesn’t say much, Walmart and Zipline have released a short video of how they see the delivery operation happening, and it’s a little bit more, uh, optimistic than we’re entirely comfortable with.

These Underwater Drones Use Water Temperature Differences To Recharge

Post Syndicated from Jeremy Hsu original https://spectrum.ieee.org/automaton/robotics/drones/renewable-power-underwater-drones

Yi Chao likes to describe himself as an “armchair oceanographer” because he got incredibly seasick the one time he spent a week aboard a ship. So it’s maybe not surprising that the former NASA scientist has a vision for promoting remote study of the ocean on a grand scale by enabling underwater drones to recharge on the go using his company’s energy-harvesting technology.

Many of the robotic gliders and floating sensor stations currently monitoring the world’s oceans are effectively treated as disposable devices, because the research community has limited ships and funding with which to retrieve drones after they’ve accomplished their mission of beaming data back home. That’s not only a waste of money, but may also contribute to a growing assortment of abandoned lithium-ion batteries polluting the ocean with their leaking toxic materials—a decidedly unsustainable approach to studying the secrets of the underwater world.

“Our goal is to deploy our energy harvesting system to use renewable energy to power those robots,” says Chao, president and CEO of the startup Seatrec. “We’re going to save one battery at a time, so hopefully we’re not going to dispose of more toxic batteries in the ocean.”

Chao’s California-based startup claims that its SL1 Thermal Energy Harvesting System can already cut the cost of collecting oceanographic data with robotic probes by an order of magnitude. The startup is working on adapting its system to work with autonomous underwater gliders. And it has partnered with defense giant Northrop Grumman to develop an underwater recharging station for oceangoing drones that incorporates Northrop Grumman’s self-insulating electrical connector, capable of operating while the powered electrical contacts are submerged.

Seatrec’s energy-harvesting system works by taking advantage of how certain substances transition from solid to liquid, or from liquid to gas, when they heat up. The company’s technology harnesses the pressure changes that result from such phase changes to generate electricity.

To make the phase changes happen, Seatrec’s solution taps the temperature differences between warmer water at the ocean surface and colder water at the ocean depths. Even a relatively simple robotic probe can generate additional electricity by changing its buoyancy to either float at the surface or sink down into the colder depths.
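The energy available from one dive cycle can be estimated with a back-of-the-envelope calculation: the phase-change material expands as it melts, and that expansion does mechanical work against pressure. The sketch below uses illustrative assumptions (a 1-liter paraffin-like material, roughly 10 percent expansion on melting, an assumed 200-bar working pressure, and 50 percent conversion efficiency), not Seatrec’s actual specifications.

```python
# Back-of-the-envelope estimate of electrical energy from one melt/freeze
# cycle of a phase-change material (PCM), the principle behind thermal
# energy harvesters. All numbers are illustrative assumptions.

PCM_VOLUME_L = 1.0          # assumed PCM volume in the harvester, liters
EXPANSION_FRACTION = 0.10   # paraffin-like PCMs expand roughly 10% on melting
WORKING_PRESSURE_PA = 20e6  # assumed hydraulic pressure developed (200 bar)
CONVERSION_EFFICIENCY = 0.5 # assumed hydraulic-motor + generator efficiency

def energy_per_cycle_joules(volume_l=PCM_VOLUME_L,
                            expansion=EXPANSION_FRACTION,
                            pressure_pa=WORKING_PRESSURE_PA,
                            efficiency=CONVERSION_EFFICIENCY):
    """Mechanical work W = P * dV, derated by conversion efficiency."""
    dv_m3 = volume_l * 1e-3 * expansion  # liters -> m^3, then the expansion
    return pressure_pa * dv_m3 * efficiency

joules = energy_per_cycle_joules()
print(f"{joules:.0f} J (~{joules / 3600.0:.2f} Wh) per dive cycle")
```

Under these assumptions each dive cycle yields on the order of a kilojoule, a fraction of a watt-hour, which is why the approach suits low-power floats and gliders that only need to trickle-charge between profiles.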

By attaching an external energy-harvesting module, Seatrec has already begun transforming robotic probes into assets that can be recharged and reused more affordably than sending out a ship each time to retrieve the probes. This renewable energy approach could keep such drones going almost indefinitely barring electrical or mechanical failures. “We just attach the backpack to the robots, we give them a cable providing power, and they go into the ocean,” Chao explains. 

The early buyers of Seatrec’s products are primarily academic researchers who use underwater drones to collect oceanographic data. But the startup has also attracted military and government interest. It has already received small business innovation research contracts from both the U.S. Office of Naval Research and National Oceanic and Atmospheric Administration (NOAA).

Seatrec has also won two $10,000 prizes under the Powering the Blue Economy: Ocean Observing Prize administered by the U.S. Department of Energy and NOAA. The prizes awarded during the DISCOVER Competition phase back in March 2020 included one prize split with Northrop Grumman for the joint Mission Unlimited UUV Station concept. The startup and defense giant are currently looking for a robotics company to partner with for the DEVELOP Competition phase of the Ocean Observing Prize that will offer a total of $3 million in prizes.

In the long run, Seatrec hopes its energy-harvesting technology can support commercial ventures such as the aquaculture industry that operates vast underwater farms. The technology could also support underwater drones carrying out seabed surveys that pave the way for deep sea mining ventures, although those are not without controversy because of their projected environmental impacts.

Among all the possible applications, Chao seems especially enthusiastic about the prospect of Seatrec’s renewable power technology enabling underwater drones and floats to collect oceanographic data for much longer periods of time. He spent the better part of two decades working at the NASA Jet Propulsion Laboratory in Pasadena, Calif., where he helped develop a satellite designed for monitoring the Earth’s oceans. But he and the JPL engineering team that developed Seatrec’s core technology believe that swarms of underwater drones can provide a continuous monitoring network to truly begin understanding the oceans in depth.

The COVID-19 pandemic has slowed production and delivery of Seatrec’s products somewhat given local shutdowns and supply chain disruptions. Still, the startup has been able to continue operating in part because it’s considered to be a defense contractor that is operating an essential manufacturing facility. Seatrec’s engineers and other staff members are working in shifts to practice social distancing.

“Rather than building one or two for the government, we want to scale up to build thousands, hundreds of thousands, hopefully millions, so we can improve our understanding and provide that data to the community,” Chao says. 

Caltech’s Canon-Launched SQUID Drone Doubles in Size, Goes Autonomous

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/caltech-canon-launched-squid-drone

Journal Watch report logo, link to report landing page

At IROS last year, Caltech and JPL presented a prototype for a ballistically launched quadrotor—once folded up into a sort of football shape with fins, the drone is stuffed into a tube and then fired straight up with a blast of compressed CO2, at which point it unfolds itself, stabilizes, and then flies off. It’s been about half a year, and the prototype has been scaled up in both size and capability, now with a half-dozen rotors and full onboard autonomy that can (barely) squeeze into a 6-inch tube.

High Performance Ornithopter Drone Is Quiet, Efficient, and Safe

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/high-performance-ornithopter-drone

The vast majority of drones are rotary-wing systems (like quadrotors), and for good reason: They’re cheap, they’re easy, they scale up and down well, and we’re getting quite good at controlling them, even in very challenging environments. For most applications, though, drones lose out to birds and their flapping wings in almost every way—flapping wings are very efficient, enable astonishing agility, and are much safer, able to make compliant contact with surfaces rather than shredding them like a rotor system does. But flapping wings have their challenges too: Making flapping-wing robots is so much more difficult than just duct taping spinning motors to a frame that, with a few exceptions, we haven’t seen nearly as much improvement as we have in more conventional drones.

In Science Robotics last week, a group of roboticists from Singapore, Australia, China, and Taiwan described a new design for a flapping-wing robot that offers enough thrust and control authority to make stable transitions between aggressive flight modes—like flipping and diving—while also being able to efficiently glide and gently land. While still more complex than a quadrotor in both hardware and software, this ornithopter’s advantages might make it worthwhile.

Video Friday: Skydio 2 Drone Is Back on Sale, Gets Major Software Update

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/video-friday-skydio-2-back-on-sale

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado

Let us know if you have suggestions for next week, and enjoy today’s videos.


Drone With Bubble Machine Can Pollinate Flowers Like a Bee

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/drone-bubble-machine-pollinate-flowers-like-a-bee

The tiny biological machines that farms rely on to pollinate the flowers of fruiting plants have been having a tough time of it lately. While folks around the world are working on different artificial pollination systems, there’s really no replacing the productivity, efficiency, and genius of bees, and protecting them is incredibly important. That said, there’s no reason not to also work on alternate methods of pollination, and researchers at the Japan Advanced Institute of Science and Technology (JAIST) have come up with something brilliant: pollen-infused soap bubbles blown out of a bubble maker mounted to a drone. And it apparently works really well.

Delivery Drones Could Hitchhike on Public Transit to Massively Expand Their Range

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/delivery-drones-could-hitchhike-on-public-transit-to-massively-expand-their-range

Beyond the technical and social issues with drone delivery, there are real questions about whether it would actually be an efficient and cost-effective way of moving stuff around urban environments. A significant problem with delivery drones right now is that they’re generally not much use if you want to send something relatively heavy very far away, especially if you want them to also be able to make pinpoint deliveries throughout cities safely. The problem is that drones run on batteries, which substantially limit their range, especially once you load them up with cargo.

One approach to offsetting the limited range of delivery drones is to fly them from vehicles that can serve as mobile base stations. This idea has been tested by companies like Mercedes-Benz and Matternet, and also by UPS and Workhorse, among others. Now here’s another idea: Instead of deploying a fleet of private vans, you could rely on a vast network of vehicles that’s already on the road: public buses. In a paper presented at ICRA this month, researchers from Stanford’s Intelligent Systems Laboratory and Autonomous Systems Lab have explored how a transit-based delivery drone system might work, and it turns out that it might work really well—in cities like San Francisco and Washington, D.C., hitchhiking on buses could potentially help drones more than quadruple their package delivery range.
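The intuition behind the range boost is simple: a leg ridden on a bus costs the drone no battery, so the flight budget only has to cover the hop to the bus stop and the final hop to the doorstep. The toy model below illustrates that arithmetic with made-up numbers (a 4-km flight budget, a single 12-km bus leg); the one-bus-leg simplification and all distances are assumptions for illustration, not figures from the Stanford paper.

```python
# Toy model of the transit-hitchhiking idea: a drone flies a short hop to
# board a bus, rides for free, then flies the final hop to the delivery
# point. All numbers are illustrative assumptions.

BATTERY_KM = 4.0  # assumed one-way flight distance available per charge

def reach_without_transit(battery_km=BATTERY_KM):
    """Farthest delivery point using battery alone (one-way trip)."""
    return battery_km

def reach_with_transit(battery_km=BATTERY_KM, bus_leg_km=12.0):
    """Farthest delivery point when one bus leg costs no battery: the drone
    splits its charge between the hop to the bus and the final hop."""
    return battery_km + bus_leg_km

print(reach_without_transit())  # 4.0 km
print(reach_with_transit())     # 16.0 km, a 4x stretch under these assumptions
```

In a real network the gain depends on route geometry, bus schedules, and how many transfers the drone can chain together, which is where the paper’s scheduling and routing algorithms come in.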

Zipline Launches Long Distance Drone Delivery of COVID-19 Supplies in the U.S.

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/zipline-long-distance-delivery-covid19-supplies

Eighteen months ago, we traveled to Rwanda to see how Zipline had made fast, dependable drone delivery a critical part of medical supply infrastructure on a national scale. But outside of Africa, Zipline’s long-distance delivery drones have had to contend with complex and crowded airspace, decades of stale regulation, and a healthcare system that’s at least (sort of) functional, if not particularly agile. 

Along with several other drone delivery companies, Zipline has been working with the U.S. Federal Aviation Administration (FAA) on small scale pilot projects over the past year or so to prove out the drone delivery concept, but progress has been slow. Now, though, COVID-19 has put enough additional stress on the U.S. healthcare system that the FAA has granted an emergency waiver to the Part 107 drone rules to allow North Carolina–based Novant Health to partner with Zipline on a beyond line-of-sight autonomous drone delivery service through controlled airspace—the first of its kind in the United States.