All posts by Evan Ackerman

Amazon Redesigns Its Prime Air Delivery Drone

Post Syndicated from Evan Ackerman original

Amazon is finally starting to address some of the actual challenges with drone delivery, making us slightly less skeptical

Amazon has been working away at its Prime Air urban and suburban drone delivery for years. Many years. It’s been at least half a decade now. And for the entire time, we’ve been complaining that Amazon has been focusing on how to build drones that can physically transport objects rather than how to build drones that can safely and reliably transport objects in a manner that makes economic sense and that people actually want.

At its re:MARS conference today, Amazon showed off a brand-new version of its Prime Air drone. The design is certainly unique, featuring a hybrid tailsitter configuration with six degrees of freedom, but people have been futzing with weird drone designs for a long time, and this may or may not be (a) what Amazon has actually settled on long-term or (b) the best way of doing things, versus other techniques like Google Wing's dangly box.

What’s much more exciting is that Amazon seems to now be addressing the issue of safety, and has added a comprehensive suite of on-board sensing and computing that will help the drone deal with many of the complex obstacles that it’s likely to encounter while doing its job.

Amazon Introduces Two New Warehouse Robots

Xanthus is a major upgrade, and Pegasus is doing something new

At Amazon’s re:MARS conference in Las Vegas today, who else but Amazon is introducing two new robots designed to make its fulfillment centers even more fulfilling. Xanthus (named after a mythological horse that could very briefly talk but let’s not read too much into that) is a completely redesigned drive unit, one of the robotic mobile bases that carries piles of stuff around for humans to pick from. It has a thinner profile, a third of the parts, costs half as much, and can wear different modules on top to perform a much wider variety of tasks than its predecessor.

Pegasus (named after a mythological horse that could fly but let’s not read too much into that either) is also a mobile robot, but much smaller than Xanthus, designed to help the company quickly and accurately sort individual packages. For Amazon, it’s a completely new large-scale robotic system involving tightly coordinated fleets of robots tossing boxes down chutes, and it’s just as fun to watch as it sounds.

VelociRoACH Gets New Shell for Robot-on-Robot Smashing

Repeated headbutting helps little legged robots flip each other over

UC Berkeley’s VelociRoACH robots are something like a decade old—the first one, DASH, was presented at IROS 2009, back when IROS was small enough to fit into a Hyatt in St. Louis, Mo. The upgraded VelociRoACH showed up the next year, and we’re still seeing it being used for innovative new research. The great thing about these little robots is that they’re cheap, easy to build (mostly cardboard), and even easier to modify, so they’ve evolved rapidly over the years with things like wings and winches and drone launchers.

A few years ago, the addition of a shell (which actual roaches have) plus a tail (which actual roaches thankfully do not have) allowed VelociRoACH to flip itself over if it ended up upside down. This worked really well, but it did add a little bit of complication and expense to the VelociRoACH design. Not complication or expense that you’d care about if you were just making one robot, or 10 robots, or maybe even 100 robots, but the whole point of making super cheap little mobile robots like VelociRoACH is that you want to be able to churn out thousands of them, and then deploy them in ginormous swarms to (say) find people in rubble after an earthquake.

The latest version of VelociRoACH leverages the swarm idea to solve the flipped-over robot problem using nothing more than a slightly redesigned shell. Instead of a cockroach-like rounded shell, the new square-fronted shell allows one robot to simply smash itself headfirst into another robot until it flips it over.

iRobot Completely Redesigns Its Floor Care Robots With New m6 and s9

The Braava m6 and Roomba s9 work together to vacuum and then mop your floors

It was just late last year that iRobot announced the i7, a top-of-the-line Roomba that could map your home, remember those maps, clean the rooms that you wanted it to, and then dump all of the dirt that it picked up into a docking station so you didn’t even need to think about the robot for weeks, or even months. After testing out the i7, we found that it really did work as advertised. For the first time, you could have consistently cleaner floors with literally zero effort. Thanks, robots!

We thought, somewhat naively as it turns out, that iRobot would be content to take a little bit of a break; historically, the Roomba innovation and commercial release cycle hasn’t been particularly aggressive. The i7 seemed like a big enough step forward that iRobot would let folks adjust to the idea of a self-emptying Roomba, which is (to be fair) a pretty impressive engineering trick.

Instead, iRobot is completely overshadowing the Roomba i7 just nine months later with two new floor cleaning robots: the Braava m6, which is a beefed-up version of the Braava Jet with a mapping system in it; and the Roomba s9, a square-fronted (!) Roomba vacuum with a 3D sensor in the front to help it manage tricky areas of your home better than ever. The reason that both of these robots are being announced together is because they work as a team, communicating with each other to first vacuum your floors, and then mop them. 

Buddy the Social Robot is Somehow Not Dead Yet

More than two years after Buddy was supposed to ship to Indiegogo backers, its creators have a plan

Nearly four years ago (September of 2015), over a thousand Indiegogo backers pledged a total of $657,000 for Buddy, “the first social robot that connects, protects, and interacts with each member of your family.” This was only about a year after Jibo raised more than $3 million—a time when social robots (especially crowdfunded social robots) seemed like they would absolutely, positively, definitely be the next big thing.

Buddy was supposed to ship one year later, in September of 2016. It didn’t. And despite making appearances at shows and events over the next several years, the last actual public status update on Indiegogo is from 25 July, 2016. That statement didn’t say much. 

But this past weekend, out of the blue, Rodolphe Hasselvander, the creator of Buddy and CEO of Blue Frog Robotics, sent a message to backers that opened with the picture at the top of this article (which is theirs, not something that we put together). They are very sorry, and if you forgive them and contribute just a little more money, they have a plan.

Watch the HyQReal Robot Pull an Airplane

The HyQReal quadruped robot is big, powerful, rugged, and capable of walking off with your Piaggio Avanti

The Dynamic Legged Systems Lab at the Italian Institute of Technology (IIT) has been working with hydraulic quadrupedal robots since about 2010—we first met their HyQ research platform at IROS 2011, and they’ve been using it for some consistently impressive research since then. But nine years is a very, very long time in robotics, and HyQ has been showing its age relative to the more recent (and more dynamic) generation of quadrupeds like SpotMini, ANYmal, and Cheetah 3.

Today, IIT is announcing a brand new and massively upgraded quadruped called HyQReal. It’s designed to be big, powerful, and rugged, and to demonstrate its capabilities, IIT figured that they might as well see if it could pull a three-ton airplane.

UC Berkeley’s “Hyper-Aggressive Pogo-Stick” Robot Now Works Outdoors

Salto gets an upgrade and now it can go for a bounce outside the lab

UC Berkeley’s Salto has been one of our favorite robots since 2016, which makes it old-ish in robot years. While it’s kept the same “hyper-aggressive pogo-stick” concept, a series of upgrades has endowed Salto with the ability to perform increasingly dynamic maneuvers.

The original Salto could make two jumps in a row. Thrusters added in 2017 gave the robot the control it needed to chain together multiple jumps. And late last year at IROS, an improved controller gave Salto the intelligence that it needed to make pinpoint jumps that allowed it to traverse a series of vertical obstacles (and more).

The big constraint has always been that in order for Salto to keep itself upright and in one piece, it had to jump within a motion-capture environment, which limited its usefulness to (let’s be honest) not much more than a cool research project and highly effective YouTube video view generator.

Today at ICRA, UC Berkeley roboticists Justin Yim and Eric Wang (from Ron Fearing’s Biomimetic Millisystems Lab) presented the latest version of Salto, which adds the sensing and computing required to do away with the motion-capture system completely. Salto can now jump as much as you want out of the lab, and in fact completely outdoors.

NVIDIA Brings Robot Simulation Closer to Reality by Making Humans Redundant

Learning in simulation no longer takes human expertise to make it useful in the real world

We all know how annoying real robots are. They’re expensive, they’re finicky, and teaching them to do anything useful takes an enormous amount of time and effort. One way of making robot learning slightly more bearable is to program robots to teach themselves things, which is not as fast as having a human instructor in the loop, but can be much more efficient because that human can be off doing something else more productive instead. Google industrialized this process by running a bunch of robots in parallel, which sped things up enormously, but you’re still constrained by those pesky physical arms.

The way to really scale up robot learning is to do as much of it as you can in simulation instead. You can use as many virtual robots running in virtual environments testing virtual scenarios as you have the computing power to handle, and then push the fast-forward button so that they’re learning faster than real time. Since no simulation is perfect, it’ll take some careful tweaking to get it to actually be useful and reliable in reality, and that means that humans have to get back involved in the process. Ugh.

A team of NVIDIA researchers, working at the company’s new robotics lab in Seattle, is taking a crack at eliminating this final human-dependent step in a paper that they’re presenting at ICRA today. There’s still some tuning that has to happen to match simulation with reality, but now, it’s tuning that happens completely autonomously, meaning that the gap between simulation and reality can be closed without any human involvement at all.
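The core loop is easy to picture even without NVIDIA's actual machinery. Here's a deliberately tiny Python sketch of the idea, with stand-in functions (the "real" robot, the simulator, and the true friction value are all made up for illustration): run the real robot a few times, then let the tuner adjust a simulation parameter until simulated rollouts match the logged real ones, with no human picking the numbers.

```python
import numpy as np

# Toy sketch of closing the sim-to-real gap autonomously. Both rollout
# functions are stand-ins; the point is the tuning loop, which adjusts a
# simulation parameter until simulated behavior matches logged real data.

def real_rollout(push_force):
    # Stand-in for data logged from a physical robot; the true friction
    # coefficient (0.35) is unknown to the tuner.
    return push_force - 0.35 * push_force

def sim_rollout(push_force, friction):
    # Simulator with a tunable friction parameter.
    return push_force - friction * push_force

def tune_friction(real_data, candidates):
    """Pick the friction value whose simulated rollouts best match reality."""
    def gap(friction):
        return sum((sim_rollout(f, friction) - d) ** 2 for f, d in real_data)
    return min(candidates, key=gap)

forces = [1.0, 2.0, 3.0]
real_data = [(f, real_rollout(f)) for f in forces]
best = tune_friction(real_data, candidates=np.linspace(0.0, 1.0, 101))
print(round(float(best), 2))  # 0.35
```

Real systems search over many physical parameters at once and compare whole trajectories rather than single numbers, but the shape of the loop—simulate, measure the gap to reality, adjust, repeat—is the same.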

Ultrafast Motion-Planning Chip Could Make Autonomous Cars Safer

Realtime Robotics’ motion-planning processor helps autonomous cars make better decisions

About two years ago, we covered a research project from Duke University that sped up motion planning for a tabletop robot arm by several orders of magnitude. The robot relied on a custom processor to do in milliseconds what normally takes seconds. The Duke researchers formed a company based on this tech called Realtime Robotics, and recently they’ve been focused on applying it to autonomous vehicles.

The reason that you should care about fast motion planning for autonomous vehicles is because motion planning encompasses the process by which the vehicle decides what it’s going to do next. Making this process faster doesn’t just mean that the vehicle can make decisions more quickly, but that it can make much better decisions as well—keeping you, and everyone around you, as safe as possible.
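To make that concrete, here's a bare-bones Python sketch of what a planning cycle looks like (this is purely illustrative, not Realtime Robotics' algorithm): collision-check a set of candidate motions against obstacles, then pick the best safe one. The faster each check runs, the more candidates fit into one decision cycle, which is exactly where a dedicated motion-planning processor pays off.

```python
# Illustrative-only planning cycle: check candidate paths for collisions,
# then choose the collision-free path ending closest to the goal.

def collides(path, obstacles, radius=1.0):
    """A path is unsafe if any waypoint comes within `radius` of an obstacle."""
    return any(
        (x - ox) ** 2 + (y - oy) ** 2 < radius ** 2
        for (x, y) in path
        for (ox, oy) in obstacles
    )

def best_safe_path(candidate_paths, obstacles, goal):
    """Among collision-free candidates, pick the one ending closest to the goal."""
    safe = [p for p in candidate_paths if not collides(p, obstacles)]
    if not safe:
        return None  # no safe motion found this cycle
    gx, gy = goal
    return min(safe, key=lambda p: (p[-1][0] - gx) ** 2 + (p[-1][1] - gy) ** 2)

# Three candidate swerves; the straight-ahead one drives into an obstacle.
candidates = [
    [(0, 0), (1, -1), (2, -1)],  # swerve right
    [(0, 0), (1, 0), (2, 0)],    # straight ahead
    [(0, 0), (1, 1), (2, 1)],    # swerve left
]
obstacles = [(2, 0)]
print(best_safe_path(candidates, obstacles, goal=(3, 0)))
```

A real vehicle evaluates thousands of candidates against a constantly changing obstacle map, many times per second; hardware acceleration of exactly this inner check is what turns "the least-bad plan we had time to find" into "the best plan available."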

Robonaut and Astrobee Will Work Together on Space Station

NASA’s robots will help each other with useful tasks on the International Space Station

NASA has two robots that will, hopefully, be operating on the International Space Station (ISS) this year. There’s Robonaut, a humanoid (complete with legs) that will be on its way up there later this year, as well as Astrobee, a family of three free-flying robotic cubes that are already on the ISS as of a few weeks ago.

Astrobee and Robonaut are totally different in both form and function, but that just means that they have skills and abilities that complement each other, and the teams working on these robots have been making plans for on-orbit teamwork. To learn more about this collaboration, we spoke to Astrobee technical lead Trey Smith and Robonaut project manager Julia Badger.

Event Camera Helps Drone Dodge Thrown Objects

Watch this drone not get hit by a soccer ball

Davide Scaramuzza’s Robotics and Perception Group at the University of Zurich pioneered the use of event cameras on drones. We first wrote about event cameras back in 2014: These are sensors that are not good at interpreting a scene visually like a regular camera, but they’re extremely sensitive to motion, responding to changes in a scene on a per-pixel basis in microseconds. A regular camera that detects motion by comparing one frame with another takes milliseconds to do the same thing, which might not seem like much, but for a fast-moving drone it could easily be the difference between crashing into something and avoiding it successfully.
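The per-pixel idea is simple enough to sketch. The snippet below is a rough Python illustration (not any real sensor's API; the threshold value and function names are made up): each pixel fires an independent event only when its log intensity changes by more than a contrast threshold, so a static scene produces nothing and a moving edge produces a sparse, immediate burst.

```python
import numpy as np

# Illustrative model of event-camera output: instead of full frames,
# each pixel emits an (x, y, t, polarity) event when its log intensity
# changes by more than a contrast threshold. Threshold is made up.

CONTRAST_THRESHOLD = 0.2  # log-intensity change needed to trigger an event

def events_from_frames(prev_frame, new_frame, timestamp):
    """Emit (x, y, t, polarity) events for pixels whose log intensity
    changed by more than the contrast threshold."""
    log_prev = np.log(prev_frame.astype(float) + 1e-6)
    log_new = np.log(new_frame.astype(float) + 1e-6)
    diff = log_new - log_prev
    ys, xs = np.nonzero(np.abs(diff) > CONTRAST_THRESHOLD)
    return [(int(x), int(y), timestamp, 1 if diff[y, x] > 0 else -1)
            for x, y in zip(xs, ys)]

# A static scene produces no events; a brightening pixel produces one.
static = np.full((4, 4), 100, dtype=np.uint8)
moved = static.copy()
moved[1, 2] = 200  # one pixel brightens as an object edge passes
print(events_from_frames(static, static, 0.001))  # []
print(events_from_frames(static, moved, 0.002))   # [(2, 1, 0.002, 1)]
```

An actual event camera does this in analog circuitry at each pixel, which is why its latency is measured in microseconds rather than the milliseconds a frame-differencing pipeline needs.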

In a paper recently accepted to IEEE Robotics and Automation Letters, Davide Falanga and Suseong Kim from Scaramuzza’s group take a look at exactly how much of a difference it can make to use an event camera on drones moving at high speeds. And to validate their research, they hurl soccer balls at a drone as hard as they can, and see if it can dodge them.

This Robot Hummingbird Is Almost as Agile as the Real Thing

Purdue roboticists have built a bio-inspired micro air vehicle that flies much like a real hummingbird

Hummingbirds are some of the most nimble fliers on Earth. Their speed and agility are spectacular, driven by the complex muscles that control their wings. This is a difficult system for robots to emulate, and in general, the small winged robots that we’ve seen have relied on compromises in control in order to be able to use flapping wings for flight.

At Purdue University’s Bio-Robotics Lab, Xinyan Deng and her students are taking a very deliberately bio-inspired approach towards winged robotic flight that has resulted in one of the most capable robotic hummingbirds we’ve ever seen. It’s just about the same size and shape as the real thing, and the researchers hope it will be able to perform the same sorts of acrobatic maneuvers as an actual hummingbird. And more importantly, it’s robust enough that it can use its wings as sensors to navigate around obstacles, meaning that it has a shot at being useful outside of a lab.

These Robotic Objects Are Designed to Be Stabbed and Beaten to Help You Feel Better

Cathartic objects help users physically express strong emotional states

At a human-computer interaction conference this week in Glasgow, U.K., Carnegie Mellon University researcher Michal Luria is presenting a paper on “Challenges of Designing HCI for Negative Emotions.” The discussion includes a case study involving what Luria calls “cathartic objects”: robotic contraptions that you can beat, stab, smash, and swear at to help yourself feel better.

Is There a Future for Laundry-Folding Robots?

The company behind the Laundroid robot has folded, but that’s not the end for laundry-folding robots (yet)

The promising thing about laundry-folding robots is that they target a job that everybody does frequently, and nobody really likes. But to be successful in robotics, especially in consumer robotics, you have to be both affordable and reliable, and robots are, still, generally awful at those things. Laundroid, a robotic system that could ingest wads of laundry and somehow spit out neatly folded clothes, put on a few demos at CES over the past few years, but the Japanese company behind it just announced bankruptcy—probably because the robot didn’t work all the time, and would likely have been absurdly expensive.

Laundroid may not have been a success, but does that mean that other laundry-folding robots, most notably Foldimate, are doomed as well? Of course it doesn’t, although I’m not particularly optimistic.

NASA’s Robonaut to Return to Space Station With Legs Attached

NASA has fixed Robonaut and is nearly ready to send it back to the International Space Station

A little over a year ago, we reported on the status of the Robonaut 2 on the International Space Station. Things had not gone all that well for R2 ever since an attempt had been made to install a pair of legs back in 2014, leading to an intermittent power problem that was very hard to diagnose. NASA brought Robonaut back to Earth last year for repairs, and a few weeks ago, we stopped by NASA’s Johnson Space Center (JSC) in Houston, Texas, to visit the Robonaut lab and get an update on what’s been happening with R2.

Zipline Expands Medical Drone-Delivery Service to Ghana

With today’s official launch in Ghana, Zipline has vastly expanded the largest drone-delivery network in the world

Today, Zipline is officially opening the first of four distribution centers in Ghana, inaugurating a drone-delivery network that will eventually serve 2,000 hospitals and clinics covering 12 million people. We’re very familiar with Zipline’s dropping-packages-of-blood-from-the-sky operations in Rwanda, but Ghana will be on a much larger scale, with more drones flying more frequently delivering more items.