Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/trhex-hexapod-robot-microspines
Tiny spikes allow this robot to climb its way up steep slopes and grip onto rough surfaces
In Aaron Johnson’s “Robot Design & Experimentation” class at CMU, teams of students have a semester to design and build an experimental robotic system based on a theme. For spring 2019, that theme was “Bioinspired Robotics,” which is definitely one of our favorite kinds of robotics—animals can do all kinds of crazy things, and it’s always a lot of fun watching robots try to match them. They almost never succeed, of course, but even basic imitation can lead to robots with some unique capabilities.
One of the projects from this year’s course, from Team ScienceParrot, is a new version of RHex called T-RHex (pronounced T-Rex, like the dinosaur). T-RHex comes with a tail, but more importantly, its feet are fringed with tiny spikes called microspines, which help it grip onto rough surfaces like brick, wood, and concrete. It’s able to climb its way up very steep slopes, and hang from them, relying on those microspines to keep itself from falling off.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/sensitive-whiskers-for-tiny-drones
This whisker sensing system can detect air pressure from objects even before they make physical contact
Animals of all shapes and sizes have whiskers of some sort. Cats and dogs and rodents have them. Seals have them too. Some birds have them, as do insects and fish. Whiskers have shown up across such a diversity of animals because they’re an efficient and effective method of short range sensing. Besides just being able to detect objects that they come into direct contact with, whiskers can also sense fluid flows (like the speed and direction of moving air or water), and they work even if it’s dark or foggy or smoky.
While we’ve seen some research on whiskers before—I’m sure you remember the utterly adorable ShrewBot—there hasn’t been too much emphasis on adding whiskers to robots, likely because lidar and cameras offer more useful data at longer ranges. And that’s totally fine, if you can afford the lidar or the computing necessary to make adequate use of cameras. For very small, very cheap drones, investing in sophisticated sensing and computing may not make sense, especially if you’re only interested in simple behaviors like not crashing into stuff.
At ICRA last month, Pauline Pounds from the University of Queensland in Brisbane, Australia, demonstrated a new whisker sensing system for drones. The whiskers are tiny, cheap, and sensitive enough to detect air pressure from objects even before they make physical contact.
The lightweight carbon fiber arm has three fingers and six degrees of freedom
We usually think of robots as taking the place of humans in various tasks, but robots of all kinds can also enhance human capabilities. This may be especially true for people with disabilities. And while the Cybathlon competition showed what’s possible when cutting-edge research robotics is paired with expert humans, that competition isn’t necessarily reflective of the kind of robotics available to most people today.
Kinova Robotics’s Jaco arm is an assistive robotic arm designed to be mounted on an electric wheelchair. With six degrees of freedom plus a three-fingered gripper, the lightweight carbon fiber arm is frequently used in research because it’s rugged and versatile. But from the start, Kinova created it to add autonomy to the lives of people with mobility constraints.
Earlier this year, Kinova shared the story of Mary Nelson, an 11-year-old girl with spinal muscular atrophy, who uses her Jaco arm to show her horse in competition. Spinal muscular atrophy is a neuromuscular disorder that impairs voluntary muscle movement, including muscles that help with respiration, and Mary depends on a power chair for mobility.
We wanted to learn more about how Kinova designs its Jaco arm, and what that means for folks like Mary, so we spoke with both Kinova and Mary’s parents to find out how much of a difference a robot arm can make.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/cassie-on-hovershoes
Cassie gets a speedy new pair of shoes. Wheee!
Proponents of legged robots say that they make sense because legs are often required to go where humans go. Proponents of wheeled robots say, “Yeah, that’s great, but watch how fast and efficient my robot is compared to yours.” Some robots try to take advantage of both wheels and legs with hybrid designs like whegs or wheeled feet, but a simpler and more versatile solution is to do what humans do, and just take advantage of wheels when you need them.
We’ve seen a few experiments with this. The University of Michigan managed to convince Cassie to ride a Segway, with mostly positive (but occasionally quite negative) results. A Segway, and hoverboard-like systems, can provide wheeled mobility for legged robots over flat terrain, but they can’t handle things like stairs, which is kind of the whole point of having a robot with legs anyway.
At UC Berkeley’s Hybrid Robotics Lab, led by Koushil Sreenath, researchers have taken things a step further. They are teaching their Cassie bipedal robot (called Cassie Cal) to wheel around on a pair of hovershoes. Hovershoes are like hoverboards that have been chopped in half, resulting in a pair of motorized single-wheel skates. You balance on the skates, and control them by leaning forwards and backwards and left and right, which causes each skate to accelerate or decelerate in an attempt to keep itself upright. It’s not easy to get these things to work, even for a human, but by adding a sensor package to Cassie, the UC Berkeley researchers have managed to get it to zip around campus fully autonomously.
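Conceptually, a self-balancing hovershoe is a wheeled inverted pendulum: the skate measures its tilt and accelerates the wheel in the direction of the lean to catch the fall. As a rough illustration only (the class, gains, and dynamics below are made-up placeholders, not the Berkeley controller), a minimal proportional-derivative sketch of that idea looks like this:

```python
# Toy sketch of lean-to-drive balancing: a PD law maps body tilt to
# wheel acceleration, so leaning forward speeds the wheel up and
# leaning back slows it down. Gains are illustrative, not tuned.
from dataclasses import dataclass

@dataclass
class HovershoeBalancer:
    kp: float = 40.0   # acceleration per radian of tilt
    kd: float = 4.0    # damping on tilt rate

    def wheel_accel(self, tilt: float, tilt_rate: float) -> float:
        """Wheel acceleration command that drives tilt back to zero."""
        return self.kp * tilt + self.kd * tilt_rate

balancer = HovershoeBalancer()
print(balancer.wheel_accel(0.1, 0.0))   # lean forward -> positive accel
print(balancer.wheel_accel(-0.1, 0.0))  # lean back -> wheel slows down
```

The same feedback loop runs whether a human or a robot is standing on top; what Cassie adds is the sensing and whole-body control to produce the lean in the first place.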
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/amazon-redesigned-prime-air-drone
Amazon is finally starting to address some of the actual challenges with drone delivery, making us slightly less skeptical
Amazon has been working away at its Prime Air urban and suburban drone delivery for years. Many years. It’s been at least half a decade now. And for the entire time, we’ve been complaining that Amazon has been focusing on how to build drones that can physically transport objects rather than how to build drones that can safely and reliably transport objects in a manner that makes economic sense and that people actually want.
At its re:MARS conference today, Amazon showed off a brand-new version of its Prime Air drone. The design is certainly unique, featuring a hybrid tailsitter design with 6 degrees of freedom, but people have been futzing with weird drone designs for a long time, and this may or may not be a.) what Amazon has actually settled on long-term or b.) the best way of doing things, versus other techniques like Google Wing’s dangly box.
What’s much more exciting is that Amazon seems to now be addressing the issue of safety, and has added a comprehensive suite of on-board sensing and computing that will help the drone deal with many of the complex obstacles that it’s likely to encounter while doing its job.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/industrial-robots/amazon-introduces-two-new-warehouse-robots
Xanthus is a major upgrade, and Pegasus is doing something new
At Amazon’s re:MARS conference in Las Vegas today, who else but Amazon is introducing two new robots designed to make its fulfillment centers even more fulfilling. Xanthus (named after a mythological horse that could very briefly talk, but let’s not read too much into that) is a completely redesigned drive unit, one of the robotic mobile bases that carries piles of stuff around for humans to pick from. It has a thinner profile and a third of the parts, costs half as much, and can wear different modules on top to perform a much wider variety of tasks than its predecessor.
Pegasus (named after a mythological horse that could fly but let’s not read too much into that either) is also a mobile robot, but much smaller than Xanthus, designed to help the company quickly and accurately sort individual packages. For Amazon, it’s a completely new large-scale robotic system involving tightly coordinated fleets of robots tossing boxes down chutes, and it’s just as fun to watch as it sounds.
Repeated headbutting helps little legged robots flip each other over
UC Berkeley’s VelociRoACH robots are something like a decade old—the first one, DASH, was presented at IROS 2009, back when IROS was small enough to fit into a Hyatt in St. Louis, Mo. The upgraded VelociRoACH showed up next year, and we’re still seeing it being used for innovative new research. The great thing about these little robots is that they’re cheap, easy to build (mostly cardboard), and even easier to modify, so they’ve evolved rapidly over the years with things like wings and winches and drone launchers.
A few years ago, the addition of a shell (which actual roaches have) plus a tail (which actual roaches thankfully do not have) allowed VelociRoACH to flip itself over if it ended up upside down. This worked really well, but it did add a little bit of complication and expense to the VelociRoACH design. Not complication or expense that you’d care about if you were just making one robot, or 10 robots, or maybe even 100 robots, but the whole point of making super cheap little mobile robots like VelociRoACH is that you want to be able to churn out thousands of them, and then deploy them in ginormous swarms to (say) find people in rubble after an earthquake.
The latest version of VelociRoACH leverages the swarm idea to solve the flipped-over robot problem using nothing more than a slightly-redesigned shell. Instead of using a cockroach-like rounded robot, a square-fronted shell allows one robot to simply smash itself headfirst into another robot until it flips it over.
The Braava m6 and Roomba s9 work together to vacuum and then mop your floors
It was just late last year that iRobot announced the i7, a top of the line Roomba that could map your home, remember those maps, clean the rooms that you wanted it to, and then dump all of the dirt that it picked up into a docking station so you didn’t have to think about the robot for weeks, or even months. After testing out the i7, we found that it really did work as advertised. For the first time, you could have consistently cleaner floors with literally zero effort. Thanks, robots!
We thought, somewhat naively as it turns out, that iRobot would be content to take a little bit of a break; historically, the Roomba innovation and commercial release cycle hasn’t been particularly aggressive. The i7 seemed like a big enough step forward that iRobot would let folks adjust to the idea of a self-emptying Roomba, which is (to be fair) a pretty impressive engineering trick.
Instead, iRobot is completely overshadowing the Roomba i7 just nine months later with two new floor cleaning robots: the Braava m6, which is a beefed-up version of the Braava Jet with a mapping system in it; and the Roomba s9, a square-fronted (!) Roomba vacuum with a 3D sensor in the front to help it manage tricky areas of your home better than ever. Both robots are being announced together because they work as a team, communicating with each other to first vacuum your floors, and then mop them.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/home-robots/buddy-the-social-robot-is-somehow-not-dead-yet
More than two years after Buddy was supposed to ship to Indiegogo backers, its creators have a plan
Nearly four years ago (September of 2015), over a thousand Indiegogo backers pledged a total of $657,000 for Buddy, “the first social robot that connects, protects, and interacts with each member of your family.” This was only about a year after Jibo raised more than $3 million—a time when social robots (especially crowdfunded social robots) seemed like they would absolutely, positively, definitely be the next big thing.
Buddy was supposed to ship one year later, in September of 2016. It didn’t. And despite making appearances at shows and events over the next several years, the last actual public status update on Indiegogo is from 25 July, 2016. That statement didn’t say much.
But this past weekend, out of the blue, Rodolphe Hasselvander, the creator of Buddy and CEO of Blue Frog Robotics, sent a message to backers that opened with the picture at the top of this article (which is theirs, not something that we put together). They are very sorry, and if you forgive them and contribute just a little more money, they have a plan.
The HyQReal quadruped robot is big, powerful, rugged, and capable of walking off with your Piaggio Avanti
The Dynamic Legged Systems Lab at the Italian Institute of Technology (IIT) has been working with hydraulic quadrupedal robots since about 2010—we first met their HyQ research platform at IROS 2011, and they’ve been using it for some consistently impressive research since then. But nine years is a very, very long time in robotics, and HyQ has been showing its age relative to the more recent (and more dynamic) generation of quadrupeds like SpotMini, ANYmal, and Cheetah 3.
Today, IIT is announcing a brand new and massively upgraded quadruped called HyQReal. It’s designed to be big, powerful, and rugged, and to demonstrate its capabilities, IIT figured that they might as well see if it could pull a three-ton airplane.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/upgraded-salto-goes-for-a-bounce-outside-the-lab
Salto gets an upgrade and now it can go for a bounce outside the lab
UC Berkeley’s Salto has been one of our favorite robots since 2016, which makes it old-ish in robot years. While it’s kept the same “hyper-aggressive pogo-stick” concept, a series of upgrades has endowed Salto with the ability to perform increasingly dynamic maneuvers.
The original Salto could make two jumps in a row. Thrusters added in 2017 gave the robot the control it needed to chain together multiple jumps. And late last year at IROS, an improved controller gave Salto the intelligence that it needed to make pinpoint jumps that allowed it to traverse a series of vertical obstacles (and more).
The big constraint has always been that in order for Salto to keep itself upright and in one piece, it had to jump within a motion-capture environment, which limited its usefulness to (let’s be honest) not much more than a cool research project and highly effective YouTube video view generator.
Today at ICRA, UC Berkeley roboticists Justin Yim and Eric Wang (from Ron Fearing’s Biomimetic Millisystems Lab) presented the latest version of Salto, which adds the sensing and computing required to do away with the motion-capture system completely. Salto can now jump as much as you want out of the lab, and in fact completely outdoors.
Learning in simulation no longer takes human expertise to make it useful in the real world
We all know how annoying real robots are. They’re expensive, they’re finicky, and teaching them to do anything useful takes an enormous amount of time and effort. One way of making robot learning slightly more bearable is to program robots to teach themselves things, which is not as fast as having a human instructor in the loop, but can be much more efficient because that human can be off doing something else more productive instead. Google industrialized this process by running a bunch of robots in parallel, which sped things up enormously, but you’re still constrained by those pesky physical arms.
The way to really scale up robot learning is to do as much of it as you can in simulation instead. You can use as many virtual robots running in virtual environments testing virtual scenarios as you have the computing power to handle, and then push the fast forward button so that they’re learning faster than real time. Since no simulation is perfect, it’ll take some careful tweaking to get it to actually be useful and reliable in reality, and that means that humans have to get back involved in the process. Ugh.
A team of NVIDIA researchers, working at the company’s new robotics lab in Seattle, is taking a crack at eliminating this final human-dependent step in a paper that they’re presenting at ICRA today. There’s still some tuning that has to happen to match simulation with reality, but now, it’s tuning that happens completely autonomously, meaning that the gap between simulation and reality can be closed without any human involvement at all.
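As a rough sketch of the general idea only (this is not NVIDIA’s actual method; the parameter names, rollout functions, and random-search strategy below are illustrative assumptions), automatic tuning can be framed as adjusting simulator parameters until simulated trajectories match real ones:

```python
# Hedged sketch of closing the sim-to-real gap without a human:
# run a fixed action sequence on the real robot and in simulation,
# then search over simulator parameters to shrink the trajectory
# mismatch. Real methods are far more sophisticated than this
# random search, but the loop structure is the same.
import random

def tune_sim_params(params, rollout_real, rollout_sim,
                    iters=100, step=0.05):
    """Random-search tuning of simulator parameters.

    params: dict of simulator parameters (e.g. friction, mass).
    rollout_real / rollout_sim: return a trajectory (list of floats)
    for the same fixed action sequence.
    """
    real_traj = rollout_real()

    def error(p):
        return sum((s - r) ** 2
                   for s, r in zip(rollout_sim(p), real_traj))

    best, best_err = dict(params), error(params)
    for _ in range(iters):
        # Perturb every parameter a little and keep improvements.
        cand = {k: v + random.uniform(-step, step)
                for k, v in best.items()}
        err = error(cand)
        if err < best_err:
            best, best_err = cand, err
    return best

random.seed(0)
tuned = tune_sim_params(
    {"friction": 0.3},
    rollout_real=lambda: [0.7 * t for t in range(5)],
    rollout_sim=lambda p: [p["friction"] * t for t in range(5)],
)
print(tuned)  # "friction" should land near the real value of 0.7
```

The point is that nothing in this loop needs a human: the “tweaking” that used to be expert judgment becomes just another optimization problem.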
Realtime Robotics’ motion-planning processor helps autonomous cars make better decisions
About two years ago, we covered a research project from Duke University that sped up motion planning for a tabletop robot arm by several orders of magnitude. The robot relied on a custom processor to do in milliseconds what normally takes seconds. The Duke researchers formed a company based on this tech called Realtime Robotics, and recently they’ve been focused on applying it to autonomous vehicles.
Fast motion planning matters for autonomous vehicles because motion planning is the process by which the vehicle decides what it’s going to do next. Making this process faster doesn’t just mean that the vehicle can make decisions more quickly, but that it can make much better decisions as well—keeping you, and everyone around you, as safe as possible.
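The speedup comes, as we understood it from the original Duke work, from doing the expensive geometry ahead of time: the processor effectively precomputes which volume of space each candidate motion sweeps through, so runtime collision checking collapses to intersecting voxel sets. Here’s a toy software sketch of that idea (the roadmap, edges, and voxel IDs are all made up for illustration, and the real system does these checks in parallel in hardware):

```python
# Toy sketch of precomputed swept-volume collision checking: each
# motion-plan edge is stored with the set of voxels it sweeps
# through, so checking it against sensed obstacles is just a fast
# set intersection instead of online geometric computation.

# Offline: swept volume of each roadmap edge, as frozen voxel sets.
SWEPT = {
    ("A", "B"): frozenset({1, 2, 3}),
    ("B", "C"): frozenset({3, 4}),
    ("A", "C"): frozenset({5, 6}),
}

def collision_free_edges(obstacle_voxels):
    """Return the edges whose swept volumes miss every obstacle voxel."""
    occupied = frozenset(obstacle_voxels)
    return [edge for edge, voxels in SWEPT.items()
            if not (voxels & occupied)]

# Sensor reports an obstacle in voxel 3: both edges sweeping through
# voxel 3 get pruned instantly, leaving the direct A-to-C motion.
print(collision_free_edges({3}))  # [('A', 'C')]
```

A real roadmap has millions of edges and voxels, which is why doing every one of these intersections simultaneously in custom silicon, rather than sequentially in software, buys you orders of magnitude.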
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/space-robots/robonaut-and-astrobee-to-will-work-together-on-iss
NASA’s robots will help each other with useful tasks on the International Space Station
NASA has two robots that will, hopefully, be operating on the International Space Station (ISS) this year. There’s Robonaut, a humanoid (complete with legs) that will be on its way up there later this year, as well as Astrobee, a family of three free-flying robotic cubes that are already on the ISS as of a few weeks ago.
Astrobee and Robonaut are totally different in both form and function, but that just means that they have skills and abilities that complement each other, and the teams working on these robots have been making plans for on-orbit teamwork. To learn more about this collaboration, we spoke to Astrobee technical lead Trey Smith and Robonaut project manager Julia Badger.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/event-camera-helps-drone-dodge-thrown-objects
Watch this drone not get hit by a soccer ball
Davide Scaramuzza’s Robotics and Perception Group at the University of Zurich pioneered the use of event cameras on drones. We first wrote about event cameras back in 2014: These are sensors that are not good at interpreting a scene visually like a regular camera, but they’re extremely sensitive to motion, responding to changes in a scene on a per-pixel basis in microseconds. A regular camera that detects motion by comparing one frame with another takes milliseconds to do the same thing, which might not seem like much, but for a fast-moving drone it could easily be the difference between crashing into something and avoiding it successfully.
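To make the difference concrete, here’s a toy sketch of the event-generation rule itself (the frames and threshold are made up, and a real event camera does this asynchronously in analog hardware rather than by comparing frames):

```python
# Minimal sketch of what an event camera reports: each pixel
# independently emits an event whenever its log brightness changes
# by more than a threshold. This simulates events between two
# frames; real sensors fire per-pixel with microsecond latency.
import math

def events_between(frame_a, frame_b, threshold=0.2):
    """Return (row, col, polarity) events where log intensity jumped."""
    events = []
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            delta = math.log(b + 1e-6) - math.log(a + 1e-6)
            if abs(delta) > threshold:
                events.append((r, c, 1 if delta > 0 else -1))
    return events

# A bright object shifts one pixel to the right: events fire only at
# the pixels that changed, not across the whole frame.
prev = [[0.1, 0.9, 0.1]]
curr = [[0.1, 0.1, 0.9]]
print(events_between(prev, curr))  # [(0, 1, -1), (0, 2, 1)]
```

Because static pixels stay silent, the output is sparse and nearly instantaneous, which is exactly what you want when something is flying at your drone.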
In a paper recently accepted to IEEE Robotics and Automation Letters, Davide Falanga and Suseong Kim from Scaramuzza’s group take a look at exactly how much of a difference it can make to use an event camera on drones moving at high speeds. And to validate their research, they hurl soccer balls at a drone as hard as they can, and see if it can dodge them.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/robot-hummingbird-is-almost-as-agile-as-the-real-thing
Purdue roboticists have built a bio-inspired micro air vehicle that flies much like a real hummingbird
Hummingbirds are some of the most nimble fliers on Earth. Their speed and agility are spectacular, driven by the complex muscles that control their wings. This is a difficult system for robots to emulate, and in general, the small winged robots that we’ve seen have relied on compromises in control in order to be able to use flapping wings for flight.
At Purdue University’s Bio-Robotics Lab, Xinyan Deng and her students are taking a very deliberately bio-inspired approach towards winged robotic flight that has resulted in one of the most capable robotic hummingbirds we’ve ever seen. It’s just about the same size and shape as the real thing, and the researchers hope it will be able to perform the same sorts of acrobatic maneuvers as an actual hummingbird. And more importantly, it’s robust enough that it can use its wings as sensors to navigate around obstacles, meaning that it has a shot at being useful outside of a lab.
Cathartic objects help users physically express strong emotional states
At a human-computer interaction conference this week in Glasgow, U.K., Carnegie Mellon University researcher Michal Luria is presenting a paper on “Challenges of Designing HCI for Negative Emotions.” The discussion includes a case study involving what Luria calls “cathartic objects”: robotic contraptions that you can beat, stab, smash, and swear at to help yourself feel better.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/home-robots/is-there-a-future-for-laundry-folding-robots
The company behind the Laundroid robot folds itself, but that’s not the end for laundry folding robots (yet)
The promising thing about laundry-folding robots is that they target a job that everybody does frequently, and nobody really likes. But to be successful in robotics, especially in consumer robotics, you have to be both affordable and reliable, and robots are still generally awful at both. Laundroid, a robotic system that could ingest wads of laundry and somehow spit out neatly folded clothes, put on a few demos at CES over the past few years, but the Japanese company behind it just announced bankruptcy—probably because the robot didn’t work all the time, and would likely have been absurdly expensive.
Laundroid may not have been a success, but does that mean that other laundry-folding robots, most notably Foldimate, are doomed as well? Of course it doesn’t, although I’m not particularly optimistic.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/space-robots/nasas-robonaut-to-return-to-iss-with-legs-attached
NASA has fixed Robonaut and is nearly ready to send it back to the International Space Station
A little over a year ago, we reported on the status of the Robonaut 2 on the International Space Station. Things had not gone all that well for R2 ever since an attempt had been made to install a pair of legs back in 2014, leading to an intermittent power problem that was very hard to diagnose. NASA brought Robonaut back to Earth last year for repairs, and a few weeks ago, we stopped by NASA’s Johnson Space Center (JSC) in Houston, Texas, to visit the Robonaut lab and get an update on what’s been happening with R2.
Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/zipline-expands-medical-drone-delivery-service-to-ghana
With today’s official launch in Ghana, Zipline has vastly expanded the largest drone-delivery network in the world
Today, Zipline is officially opening the first of four distribution centers in Ghana, inaugurating a drone-delivery network that will eventually serve 2,000 hospitals and clinics covering 12 million people. We’re very familiar with Zipline’s dropping-packages-of-blood-from-the-sky operations in Rwanda, but Ghana will be on a much larger scale, with more drones flying more frequently delivering more items.