Over the last half decade or so, the commercialization of autonomous robots that can operate outside of structured environments has dramatically increased. But this relatively new transition of robotic technologies from research projects to commercial products comes with its share of challenges, many of which relate to the rapidly increasing visibility that these robots have in society.
Whether it’s because of their appearance of agency, or because of their history in popular culture, robots frequently inspire people’s imagination. Sometimes this is a good thing, like when it leads to innovative new use cases. And sometimes this is a bad thing, like when it leads to use cases that could be classified as irresponsible or unethical. Can the people selling robots do anything about the latter? And even if they can, should they?
It’s been very cool to watch 3D printers and laser cutters evolve into fairly common tools over the last decade-ish, finding useful niches across research, industry, and even with hobbyists at home. Capable as these fabricators are, they tend to be good at just one specific thing: making shapes out of polymer. Which is great! But we have all kinds of other techniques for making things that are even more useful, like adding computers and actuators and stuff like that. You can’t do any of that with your 3D printer or laser cutter, because it only does its one thing—which is too bad.
At CHI this year, researchers from MIT CSAIL are presenting LaserFactory, an integrated fabrication system that turns any laser cutter into a device that can (with just a little bit of assistance) build you an entire drone at a level of finish that means when the fabricator is done, the drone can actually fly itself off of the print bed. Sign me up.
Over the past few weeks, we’ve seen a couple of new robots from Hyundai Motor Group. This is a couple more robots than I think I’ve seen from Hyundai Motor Group, like, ever. We’re particularly interested in them right now mostly because Hyundai Motor Group are the new owners of Boston Dynamics, and so far, these robots represent one of the most explicit indications we’ve got about exactly what Hyundai Motor Group wants their robots to be doing.
We know it would be a mistake to read too much into these new announcements, but we can’t help reading something into them, right? So let’s take a look at what Hyundai Motor Group has been up to recently. This first robot is DAL-e, what HMG is calling an “Advanced Humanoid Robot.”
According to Hyundai, DAL-e is “designed to pioneer the future of automated customer services,” and is equipped with “state-of-the-art artificial intelligence technology for facial recognition as well as an automatic communication system based on a language-comprehension platform.” You’ll find it in car showrooms, but only in Seoul, for now.
We don’t normally write about robots like these because they tend not to represent much that’s especially new or interesting in terms of robotic technology, capabilities, or commercial potential. There’s certainly nothing wrong with DAL-e—it’s moderately cute and appears to be moderately functional. We’ve seen other platforms (like Pepper) take on similar roles, and our impression is that the long-term cost effectiveness of these greeter robots tends to be somewhat limited. And unless there’s some hidden functionality that we’re not aware of, this robot doesn’t really seem to be pushing the envelope, but we’d love to be wrong about that.
We’ve talked about how adding wheels can make legged robots faster and more efficient, but I’m honestly not sure it works as well the other way around (adding legs to wheeled robots). Rather than adding a little complexity to get a multi-modal system that you can use much of the time, you’re adding a lot of complexity to get a multi-modal system that you’ll only use sometimes.
You could argue, as perhaps Hyundai would, that the multi-modal system is critical to get TIGER to do what they want it to do, which seems to be primarily remote delivery. They mention operating in urban areas as well, where TIGER could use its legs to climb stairs, but I think it would be beaten by more traditional wheeled platforms, or even whegged platforms, that are almost as capable while being much simpler and cheaper. For remote delivery, though, legs might be a necessary feature.
That is, if you assume that using a ground-based system is really the best way to go.
The TIGER concept can be integrated with a drone to transport it from place to place, so why not just use the drone to make the remote delivery instead? I guess maybe if you’re dealing with a thick tree canopy, the drone could drop TIGER off in a clearing and the robot could drive to its destination, but now we’re talking about developing a very complex system for a very specific use case. Even though Hyundai has said that they’re going to attempt to commercialize TIGER over the next five years, I think it’ll be tricky for them to successfully do so.
The best part about these robots from Hyundai is that between the two of them, they suggest that the company is serious about developing commercial robots as well as willing to invest in something that seems a little crazy. And you know who else is both of those things? Boston Dynamics. To be clear, it’s almost certain that both of Hyundai’s robots were developed well before the company was even thinking about acquiring Boston Dynamics, so the real question is: Where do these two companies go from here?
Boston Dynamics has been working on an arm for its Spot quadruped for at least five years now. There have been plenty of teasers along the way, including this 45-second clip from early 2018 of Spot using its arm to open a door, which at 85 million views seems to be Boston Dynamics’ most popular video ever by a huge margin. Obviously, there’s a substantial amount of interest in turning Spot from a highly dynamic but mostly passive sensor platform into a mobile manipulator that can interact with its environment.
As anyone who’s done mobile manipulation will tell you, actually building an arm is just the first step—the really tricky part is getting that arm to do exactly what you want it to do. In particular, Spot’s arm needs to be able to interact with the world with some amount of autonomy in order to be commercially useful, because you can’t expect a human (remote or otherwise) to spend all their time positioning individual joints or whatever to pick something up. So the real question about this arm is whether Boston Dynamics has managed to get it to a point where it’s autonomous enough that users with relatively little robotics experience will be able to get it to do useful tasks without driving themselves nuts.
Today, Boston Dynamics is announcing commercial availability of the Spot arm, along with some improved software called Scout plus a self-charging dock that’ll give the robot even more independence. And to figure out exactly what Spot’s new arm can do, we spoke with Zachary Jackowski, Spot Chief Engineer at Boston Dynamics.
The fish collective called Blueswarm was created by a team led by Radhika Nagpal, whose lab is a pioneer in self-organizing systems. The oddly adorable robots can sync their movements like biological fish, taking cues from their plastic-bodied neighbors with no external controls required. Nagpal told IEEE Spectrum that this marks a milestone, demonstrating complex 3D behaviors with implicit coordination in underwater robots.
“Insights from this research will help us develop future miniature underwater swarms that can perform environmental monitoring and search in visually-rich but fragile environments like coral reefs,” Nagpal said. “This research also paves a way to better understand fish schools, by synthetically recreating their behavior.”
The research is published in Science Robotics, with Florian Berlinger as first author. Berlinger said the “Bluebot” robots integrate a trio of blue LED lights, a lithium-polymer battery, a pair of cameras, a Raspberry Pi computer, and four controllable fins within a 3D-printed hull. The fish-eye-lens cameras detect the LEDs of their fellow swimmers and apply a custom algorithm to calculate distance, direction, and heading.
Based on that simple production and detection of LED light, the team showed that Blueswarm could self-organize behaviors including aggregation, dispersal, and circle formation—basically, swimming in clockwise synchrony. The researchers also simulated a successful search mission, an autonomous Finding Nemo. Using their dispersion algorithm, the robot school spread out until one robot could detect a red light in the tank. Its blue LEDs then flashed, triggering the aggregation algorithm to gather the school around it. Such a robot swarm might prove valuable in search-and-rescue missions at sea, covering miles of open water and reporting back to its mates.
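The disperse-then-aggregate logic described above can be sketched in a few lines. This is a simplified 2D illustration under assumed kinematics, not the Blueswarm codebase: the function names, step sizes, and the `sees_target` predicate are all ours.

```python
import math

def disperse_step(pos, neighbors, step=0.1):
    """Implicit dispersion: move away from the centroid of visible neighbors."""
    if not neighbors:
        return pos
    cx = sum(p[0] for p in neighbors) / len(neighbors)
    cy = sum(p[1] for p in neighbors) / len(neighbors)
    dx, dy = pos[0] - cx, pos[1] - cy
    d = math.hypot(dx, dy) or 1.0
    return (pos[0] + step * dx / d, pos[1] + step * dy / d)

def aggregate_step(pos, beacon, step=0.1):
    """Implicit aggregation: move toward the robot whose LEDs are flashing."""
    dx, dy = beacon[0] - pos[0], beacon[1] - pos[1]
    d = math.hypot(dx, dy) or 1.0
    return (pos[0] + step * dx / d, pos[1] + step * dy / d)

def search_mission(swarm, sees_target, steps=50):
    """Disperse until some robot detects the red light, then gather toward it."""
    for _ in range(steps):
        finder = next((p for p in swarm if sees_target(p)), None)
        if finder is not None:
            return [aggregate_step(p, finder) for p in swarm], finder
        swarm = [disperse_step(p, [q for q in swarm if q != p]) for p in swarm]
    return swarm, None
```

Note that each robot only ever reads its neighbors’ positions—there is no central controller, which is the “implicit coordination” the paper demonstrates.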
“Each Bluebot implicitly reacts to its neighbors’ positions,” Berlinger said. The fish—RoboCod, perhaps?—also integrate a Wi-Fi module that allows new behaviors to be uploaded remotely. The lab’s previous efforts include a 1,000-strong army of “Kilobots,” and a robotic construction crew inspired by termites. Both projects operated in two-dimensional space. But a 3D environment like air or water poses a tougher challenge for sensing and movement.
In nature, Berlinger notes, there’s no scaly CEO to direct the school’s movements. Nor do fish communicate their intentions. Instead, so-called “implicit coordination” guides the school’s collective behavior, with individual members executing high-speed moves based on what they see their neighbors doing. That decentralized, autonomous organization has long fascinated scientists, including in robotics.
“In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient. By using implicit rules and 3D visual perception, we were able to create a system with a high degree of autonomy and flexibility underwater where things like GPS and WiFi are not accessible.”
Berlinger adds the research could one day translate to anything that requires decentralized robots, from self-driving cars and Amazon warehouse vehicles to exploration of faraway planets, where poor latency makes it impossible to transmit commands quickly. Today’s semi-autonomous cars face their own technical hurdles in reliably sensing and responding to their complex environments, including when foul weather obscures onboard sensors or road markers, or when they can’t fix position via GPS. An entire subset of autonomous-car research involves vehicle-to-vehicle (V2V) communications that could give cars a hive mind to guide individual or collective decisions— avoiding snarled traffic, driving safely in tight convoys, or taking group evasive action during a crash that’s beyond their sensory range.
“Once we have millions of cars on the road, there can’t be one computer orchestrating all the traffic, making decisions that work for all the cars,” Berlinger said.
The miniature robots could also work long hours in places that are inaccessible to humans and divers, or even large tethered robots. Nagpal said the synthetic swimmers could monitor and collect data on reefs or underwater infrastructure 24/7, and work into tiny places without disturbing fragile equipment or ecosystems.
“If we could be as good as fish in that environment, we could collect information and be non-invasive, in cluttered environments where everything is an obstacle,” Nagpal said.
Last week’s announcement that Hyundai acquired Boston Dynamics from SoftBank left us with a lot of questions. We attempted to answer many of those questions ourselves, which is typically bad practice, but sometimes it’s the only option when news like that breaks.
Fortunately, yesterday we were able to speak with Michael Patrick Perry, vice president of business development at Boston Dynamics, who candidly answered our questions about Boston Dynamics’ new relationship with Hyundai and what the near future has in store.
Coconuts may be delicious and useful for producing a wide range of products, but harvesting them is no easy task. Specially trained harvesters must risk their lives by climbing trees roughly 15 meters high to hack off just one bunch of coconuts. A group of researchers in India has designed a robot, named Amaran, that could reduce the need for human harvesters to take such a risk. But is the robot up to the task?
The researchers describe the tree-climbing robot in a paper published in the latest issue of IEEE/ASME Transactions on Mechatronics. Along with lab tests, they compared Amaran’s ability to harvest coconuts to that of a 50-year-old veteran harvester. Whereas the man bested the robot in terms of overall speed, the robot excelled in endurance.
To climb, Amaran relies on a ring-shaped body that clasps around trees of varying diameter. The robot carries a control module, motor drivers, a power management unit, and a wireless communications interface. Eight wheels allow it to move up and down a tree, as well as rotate around the trunk. Amaran is controlled by a person on the ground, who can use an app or joystick system to guide the robot’s movements.
Once Amaran approaches its target, it wields an attached robotic arm with 4 degrees of freedom to snip the coconut bunch. As a safety feature, if Amaran’s main battery dies, a backup unit kicks in, helping the robot return to the ground.
Rajesh Kannan Megalingam, an assistant professor at Amrita Vishwa Vidyapeetham University, in South India, says his team has been working on Amaran since 2014. “No two coconut trees are the same anywhere in the world. Each one is unique in size, and has a unique alignment of coconut bunches and leaves,” he explains. “So building a perfect robot is an extremely challenging task.”
While testing the robot in the lab, Megalingam and his colleagues found that Amaran is capable of climbing trees when the inclination of the trunk is up to 30 degrees with respect to the vertical axis. Megalingam says that many coconut trees, especially under certain environmental conditions, grow at such an angle.
Next, the researchers tested Amaran in the field, and compared its ability to harvest coconuts to the human volunteer. The trees ranged from 6.2 to 15.2 m in height.
It took the human an average of 11.8 minutes to harvest one tree, whereas it took Amaran an average of 21.9 minutes per tree (notably, 14 of those minutes were spent setting up the robot at the base of the tree before it even began to climb).
But Megalingam notes that Amaran can harvest more trees in a given day. For example, the human harvester in their trials could scale about 15 trees per day before getting tired, while the robot can harvest up to 22 trees per day, if the operator does not get tired. And although the robot is currently teleoperated, future improvements could make it more autonomous, improving its climbing speed and harvesting capabilities.
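A quick back-of-the-envelope calculation with the figures quoted above shows where the robot’s advantage actually comes from. The numbers are the article’s; the arithmetic and variable names are ours.

```python
# Figures reported in the field trials (minutes and trees per day).
human_min_per_tree = 11.8
robot_min_per_tree = 21.9   # includes ~14 min of setup at the base of the tree
setup_min = 14.0

human_trees_per_day = 15    # limited by fatigue
robot_trees_per_day = 22    # limited by the workday, not fatigue

human_work_min = human_min_per_tree * human_trees_per_day   # 177.0 min
robot_work_min = robot_min_per_tree * robot_trees_per_day   # ~481.8 min

# Strip out the setup overhead and the robot's climb-and-cut time per tree
# is already shorter than the human's total time per tree.
robot_climb_min = robot_min_per_tree - setup_min            # ~7.9 min
```

In other words, the robot wins on endurance (22 versus 15 trees per day) despite being slower per tree, and most of its per-tree deficit is setup time rather than climbing or cutting.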
“Our ultimate aim is to commercialize this product and to help the coconut farmers,” says Megalingam. “In Kerala state, there are only 7,000 trained coconut tree climbers, whereas the requirement is about 50,000 trained climbers. The situation is similar in other states in India like Tamil Nadu, Andhra, and Karnataka, where coconut is grown in large numbers.”
He acknowledges that the current cost of the robot is a barrier to broader deployment, but notes that community members could pool resources to share the cost and use of the robot. Most importantly, he notes, “Coconut harvesting using Amaran does not involve risk for human life. Any properly trained person can operate Amaran. Usually only male workers take up this tree climbing job. But Amaran can be operated by anyone irrespective of gender, physical strength, and skills.”
Boston Dynamics has been fielding questions about when its robots are going to go on sale and how much they’ll cost for at least a dozen years now. I can say this with confidence, because that’s how long I’ve been a robotics journalist, and I’ve been pestering them about it the entire time. But it’s only relatively recently that the company started to make a concerted push away from developing robots exclusively for the likes of DARPA into platforms with more commercial potential, starting with a compact legged robot called Spot, first introduced in 2016.
When Rovenso’s co-founder and CEO Thomas Estier started thinking about how autonomous security and monitoring robots could be helpful during the COVID-19 pandemic, adapting them for UV-C disinfection seemed like it made a lot of sense—while you patrol at night, why not also lower the viral load of shared areas? But arguably the first question that a company has to ask when considering a new application, Estier tells us, is whether they can offer something unique.
“For me, what was also interesting is that the crisis motivated us to consider existing solutions for disinfection, and then understanding that [those solutions] are not adapted for large workshops and offices,” he says. “Instead, it would make sense for a robot to ‘understand’ its environment and act intelligently and to better spend its energy, and this loop of sense-analyze-act is the essence of robotics. When you use the full power of robotics, then you can really innovate with new use cases.”
In three weeks, Estier and his team developed what he’s calling “a hack,” turning their highly mobile security robot into an autonomous and efficient coronavirus destroyer.
We’ll take a look at what’s new with Spot, and talk with Boston Dynamics founder Marc Raibert as well as Zack Jackowski, lead robotics engineer on Spot, about some of the highlights of the 2.0 update, how Spot now understands what stairs are, and when we’ll finally be seeing that arm hit commercial production.
Two years ago, we wrote about an AI startup from UC Berkeley and OpenAI called Embodied Intelligence, founded by robot laundry-folding expert Pieter Abbeel. What exactly Embodied was going to do wasn’t entirely clear, and honestly, it seemed like Embodied itself didn’t really know—they talked about “building technology that enables existing robot hardware to handle a much wider range of tasks where existing solutions break down,” and gave some examples of how that might be applied (including in manufacturing and logistics), but nothing more concrete.
Since then, a few things have happened. Thing one is that Embodied is now Covariant.ai. Thing two is that Covariant.ai spent almost a year talking with literally hundreds of different companies about how smarter robots could potentially make a difference for them. These companies represent sectors that include electronics manufacturing, car manufacturing, textiles, bio labs, construction, farming, hotels, elder care—“pretty much anything you could think about where maybe a robot could be helpful,” Pieter Abbeel tells us. “Over time, it became clear to us that manufacturing and logistics are the two spaces where there’s most demand now, and logistics especially is just hurting really hard for more automation.” And the really hard part of logistics is what Covariant decided to tackle.
At first glance, the crops don’t look any different from other crops blanketing the Salinas Valley, in California, which is often called “America’s salad bowl.” All you see are rows and rows of lettuce, broccoli, and cauliflower stretching to the horizon. But then the big orange robots roll through.
The machines are on a search-and-destroy mission. Their target? Weeds. Equipped with tractorlike wheels and an array of cameras and environmental sensors, they drive autonomously up and down the rows of produce, hunting for any leafy green invaders. Rather than spraying herbicides, they deploy a retractable hoe that kills the weeds swiftly and precisely.
The robots belong to FarmWise, a San Francisco startup that wants to use robotics and artificial intelligence to make agriculture more sustainable—and tastier. The company has raised US $14.5 million in a recent funding round, and in 2020 it plans to deploy its first commercial fleet of robots, with more than 10 machines serving farmers in the Salinas Valley.
FarmWise says that although its robots are currently optimized for weeding, future designs will do much more. “Our goal is to become a universal farming platform,” says cofounder and CEO Sébastien Boyer. “We want to automate pretty much all tasks from seeding all the way to harvesting.”
Boyer envisions the robots collecting vast amounts of data, including detailed images of the crops and parameters that affect their health such as temperature, humidity, and soil conditions. But it’s what the robots will do with the data that makes them truly remarkable. Using machine learning, they’ll identify each plant individually, determine whether it’s thriving, and tend to it accordingly. Thanks to these AI-powered robots, every broccoli stalk will get the attention it needs to be the best broccoli it can be.
Automation is not new to agriculture. Wheeled harvesters are increasingly autonomous, and farmers have long been flying drones to monitor their crops from above. Also under development are robots designed to pick fruits and vegetables—apples, peppers, strawberries, tomatoes, grapes, cucumbers, asparagus. More recently, a number of robotics companies have turned their attention to ways they can improve the quality or yield of crops.
Farming robots are still a “very nascent market,” says Rian Whitton, a senior analyst at ABI Research, in London, but it’s one that will “expand significantly over the next 10 years.” ABI forecasts that annual shipments of mobile robots for agriculture will exceed 100,000 units globally by 2030, 100 times the volume deployed today.
It’s still a small number compared with the millions of tractors and other farming vehicles sold each year, but Whitton notes that demand for automation will likely accelerate due to labor shortages in many parts of the world.
FarmWise says it has worked closely with farmers to understand their needs and develop its robots based on their feedback. So how do they work? Boyer is not prepared to reveal specifics about the company’s technology, but he says the machines operate in three steps.
First, the sensor array captures images and other relevant data about the crops and stores that information on both onboard computers and cloud servers. The second step is the decision-making process, in which specialized deep-learning algorithms analyze the data. There’s an algorithm trained to detect plants in an image, and the robots combine that output with GPS and other location data to precisely identify each plant. Another algorithm is trained to decide whether a plant is, say, a lettuce head or a weed. The final step is the physical action that the machines perform on the crops—for example, deploying the weeding hoe.
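The three-step loop can be sketched as code. To be clear, this is our own hedged illustration, not FarmWise’s stack: the detector and classifier are passed in as plain functions standing in for the trained deep-learning models, and every name here is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Coord = Tuple[float, float]  # field position from GPS + onboard localization

@dataclass
class Observation:
    position: Coord
    is_crop: bool

def weeding_pass(frames: List[dict],
                 detect: Callable[[dict], List[Coord]],   # step 2a: find plants
                 is_crop: Callable[[dict, Coord], bool],  # step 2b: crop or weed?
                 hoe: Callable[[Coord], None]) -> List[Observation]:
    """Sense -> decide -> act: hoe every detection classified as a weed."""
    log = []
    for frame in frames:                  # step 1: captured sensor data
        for pos in detect(frame):         # step 2: decision-making
            crop = is_crop(frame, pos)
            log.append(Observation(pos, crop))
            if not crop:
                hoe(pos)                  # step 3: physical action on the crop row
    return log

# Toy usage with stub models: one frame containing a lettuce head and a weed.
hoed: List[Coord] = []
log = weeding_pass(
    frames=[{"plants": {(0.0, 0.0): True, (0.3, 0.1): False}}],
    detect=lambda f: list(f["plants"]),
    is_crop=lambda f, p: f["plants"][p],
    hoe=hoed.append,
)
```

Structuring the loop this way—models injected as functions—also reflects why the sub-second latency Boyer describes matters: all three steps run once per frame while the machine keeps rolling.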
Boyer says the robots perform the three steps in less than a second. Indeed, the robots can drive through the fields clearing the soil at a pace that would be virtually impossible for humans to match. FarmWise says its robots have removed weeds from more than 10 million plants to date.
Whitton, the ABI analyst, says focusing on weeding as an initial application makes sense. “There are potentially billions of dollars to be saved from less pesticide use, so that’s the fashionable use case,” he says. But he adds that commercial success for agriculture automation startups will depend on whether they can expand their services to perform additional farming tasks as well as operate in a variety of regions and climates.
FarmWise says it has recently completed a redesign of its robots. The new version is better suited to withstand the harsh conditions often found in the field, including mud, dust, and water. The company is now expanding its staff as it prepares to deploy its robotic fleet in California, and eventually in other parts of the United States and abroad.
Boyer is confident that farms everywhere will one day be filled with robots—and that they’ll grow some of the best broccoli you’ve ever tasted.
Sarcos has been developing powered exoskeletons and the robotic technologies that make them possible for decades, and the lobby of the company’s headquarters is a resting place for concepts and prototype hardware that’s been abandoned along the way. But now, Sarcos is ready to unveil the prototype of the Guardian XO, a strength-multiplying exoskeleton that’s about to begin shipping.
As our introductory briefing concludes, Sarcos CEO Ben Wolff is visibly excited to be able to show off what they’ve been working on in their lab. “If you were to ask the question, What does 30 years and $300 million look like,” Wolff tells us, “you’re going to see it downstairs.”
This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.
Autonomous robots are coming along slowly. We already have autonomous vacuum cleaners, autonomous lawn mowers, toys that bleep and blink, and (maybe) soon autonomous cars. Yet, generation after generation, we keep waiting for the robots we all know from movies and TV shows. Instead, businesses seem to be getting farther and farther away from robots that can do a wide variety of tasks using general-purpose, human-anatomy-inspired hardware.
Although these are the droids we have been looking for, anything that came close, such as Willow Garage’s PR2 or Rethink Robotics’ Baxter, has bitten the dust. Building a robotics company is particularly hard, compounding business risk with technological risk, so the trend has gone from selling robots to selling actual services: mowing your lawn, providing taxi rides, fulfilling retail orders, or picking strawberries by the pound. Unfortunately for fans of R2-D2 and C-3PO, these kinds of business models emphasize specialized, room- or fridge-sized hardware that is optimized for one very specific task but does not contribute to a general-purpose robotic platform.
Boston Dynamics is announcing this morning that Spot, its versatile quadruped robot, is now for sale. The machine’s animal-like behavior regularly electrifies crowds at tech conferences, and like other Boston Dynamics’ robots, Spot is a YouTube sensation whose videos amass millions of views.
Now anyone interested in buying a Spot—or a pack of them—can go to the company’s website and submit an order form. But don’t pull out your credit card just yet. Spot may cost as much as a luxury car, and it is not really available to consumers. The initial sales push, described as an “early adopter program,” targets businesses. Boston Dynamics wants to find customers in select industries and help them deploy Spots in real-world scenarios.
We humans spend most of our time getting hungry or eating, which must be really inconvenient for the people who have to produce food for everyone. For a sustainable and tasty future, we’ll need to make the most of what we’ve got by growing more food with less effort, and that’s where the robots can help us out a little bit.
FarmWise, a California-based startup, is looking to enhance farming efficiency by automating everything from seeding to harvesting, starting with the worst task of all: weeding. And they’ve just raised US $14.5 million to do it.
Universal Robots, already the dominant force in collaborative robots, is flexing its muscles in an effort to further expand its reach in the cobots market. The Danish company is introducing today the UR16e, its strongest robotic arm yet, with a payload capability of 16 kilograms (35.3 lbs), reach of 900 millimeters, and repeatability of +/- 0.05 mm. Universal says the new “heavy duty payload cobot” will allow customers to automate a broader range of processes, including packaging and palletizing, nut and screw driving, and high-payload and CNC machine tending.