Tag Archives: robotics

Why Boston Dynamics Is Putting Legged Robots in Hospitals

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/medical-robots/boston-dynamics-legged-robots-hospitals

For the past eight months, Boston Dynamics has been trying to find ways in which their friendly yellow quadruped, Spot, can provide some kind of useful response to COVID-19. The company has been working with researchers from MIT and Brigham and Women’s Hospital in Massachusetts to use Spot as a telepresence-based extension for healthcare workers in suitable contexts, with the goal of minimizing exposure and preserving supplies of PPE.

For triaging sick patients, it’s necessary to collect a variety of vital data, including body temperature, respiration rate, pulse rate, and oxygen saturation. Boston Dynamics has helped to develop “a set of contactless monitoring systems for measuring vital signs and a tablet computer to enable face-to-face medical interviewing,” all of which fits neatly on Spot’s back. This system was recently tested in a medical tent for COVID-19 triage, which appeared to be a well-constrained and very flat environment that left us wondering whether a legged platform like Spot was really necessary in this particular application. What makes Spot unique (and relatively complex and expensive) is its ability to navigate around complex environments in an agile manner. But in a tent in a hospital parking lot, are you really getting your US $75k worth out of those legs, or would a wheeled platform do almost as well while being significantly simpler and more affordable?

As it turns out, we weren’t the only ones who wondered whether Spot is really the best platform for this application. “We had the same response when we started getting pitched these opportunities in Feb / March,” Michael Perry, Boston Dynamics’ VP of business development commented on Twitter. “As triage tents started popping up in late March, though, there wasn’t confidence wheeled robots would be able to handle arbitrary triage environments (parking lots, lawns, etc).”

To better understand Spot’s value in this role, we sent Boston Dynamics a few questions about their approach to healthcare robots.

iRobot Remembers That Robots Are Expensive, Gives Us a Break With More Affordable Roomba i3

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/home-robots/irobot-roomba-i3

iRobot has released several new robots over the last few years, including the i7 and s9 vacuums. Both of these models are very fancy and very capable, packed with innovative and useful features that we’ve been impressed by. They’re both also quite expensive—with dirt docks included, you’re looking at US $800 for the i7+, and a whopping $1,100 for the s9+. You can knock a couple hundred bucks off of those prices if you don’t want the docks, but still, these vacuums are absolutely luxury items.

If you just want something that’ll do some vacuuming so that you don’t have to, iRobot has recently announced a new Roomba option. The Roomba i3 is iRobot’s new low to midrange vacuum, starting at $400. It’s not nearly as smart as the i7 or the s9, but it can navigate (sort of) and make maps (sort of) and do some basic smart home integration. If that sounds like all you need, the i3 could be the robot vacuum for you.

Video Friday: Bittle Is a Palm-Sized Robot Dog Now on Kickstarter

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-bittle-robot-dog

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online]
IROS 2020 – October 25-29, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.


Rongzhong Li, who is responsible for the adorable robotic cat Nybble, has an updated and even more adorable quadruped that’s more robust and agile but only costs around US $200 in kit form on Kickstarter.

Looks like the early bird options are sold out, but a full kit is a $225 pledge, for delivery in December.

[ Kickstarter ]

Thanks Rz!


I still maintain that Stickybot was one of the most elegantly designed robots ever.

[ Stanford ]


With the unpredictable health crisis of COVID-19 continuing to place high demands on hospitals, PAL Robotics have successfully completed testing of their delivery robots in Barcelona hospitals this summer. The TIAGo Delivery and TIAGo Conveyor robots were deployed in Hospital Municipal of Badalona and Hospital Clínic Barcelona following a winning proposal submitted to the European DIH-Hero project. Accerion sensors were integrated onto the TIAGo Delivery Robot and TIAGo Conveyor Robot for use in this project.

[ PAL Robotics ]


Energy Robotics, a leading developer of software solutions for mobile robots used in industrial applications, announced that its remote sensing and inspection solution for Boston Dynamics’s agile mobile robot Spot was successfully deployed at Merck’s thermal exhaust treatment plant at its headquarters in Darmstadt, Germany. Energy Robotics equipped Spot with sensor technology and remote supervision functions to support the inspection mission.

Combining Boston Dynamics’ intuitive controls, robotic intelligence and open interface with Energy Robotics’ control and autonomy software, user interface and encrypted cloud connection, Spot can be taught to autonomously perform a specific inspection round while being supervised remotely from anywhere with internet connectivity. Multiple cameras and industrial sensors enable the robot to find its way around while recording and transmitting information about the facility’s onsite equipment operations.

Spot reads the displays of gauges in its immediate vicinity and can also zoom in on distant objects using an externally-mounted optical zoom lens. In the thermal exhaust treatment facility, for instance, it monitors cooling water levels and notes whether condensation water has accumulated. Outside the facility, Spot monitors pipe bridges for anomalies.

Among the robot’s many abilities, it can detect defects in wires or measure the temperature of pump components using thermal imaging. The robot was put through its paces on a comprehensive course that tested its ability to handle special challenges such as climbing stairs, scaling embankments, and walking over grating.

[ Energy Robotics ]

Thanks Stefan!


Boston Dynamics really should give Dr. Guero an Atlas just to see what he can do with it.

[ DrGuero ]


World’s First Socially Distanced Birthday Party: The robotic arm, located in London, was piloted in real time to light the candles on the cake by Extend Robotics founder Chang Liu, who was sitting 50 miles away in Reading. Other team members in Manchester and Reading were also able to join in the celebration.

[ Extend Robotics ]


The Robocon in-person competition was canceled this year, but check out Tokyo University’s robots in action:

[ Robocon ]


Sphero has managed to pack an entire Sphero into a much smaller sphere.

[ Sphero ]


Squishy Robotics, a small business funded by the National Science Foundation (NSF), is developing mobile sensor robots for use in disaster rescue, remote monitoring, and space exploration. The shape-shifting mobile sensor robots from UC Berkeley spin-off Squishy Robotics can be dropped from airplanes or drones and can provide first responders with ground-based situational awareness during fires, hazardous materials (HazMat) releases, and natural and man-made disasters.

[ Squishy Robotics ]


Meet Jasper, the small girl with big dreams to FLY. Created by the UTS Animal Logic Academy in partnership with the Royal Australian Air Force to encourage girls to soar above the clouds, Jasper was made using a hybrid of traditional animation techniques and technology such as robotics and 3D printing. A KUKA QUANTEC robot was used during filmmaking to help the Royal Australian Air Force tell its story in a unique way. UTS adapted its high-accuracy robot to film consistent paths, creating a video with physical sets and digital characters.

[ AU AF ]


Impressive what the Ghost Robotics V60 can do without any vision sensors on it.

[ Ghost Robotics ]


Is your job moving tiny amounts of liquid around? Would you rather be doing something else? ABB’s YuMi got you.

[ YuMi ]


For his PhD work at the MIT Media Lab, biomechatronics researcher Roman Stolyarov developed a terrain-adaptive control system for robotic leg prostheses, aiming to help people with amputations feel as able-bodied and mobile as possible by allowing them to walk seamlessly regardless of the ground terrain.

[ MIT ]


This robot collects data on each cow when she enters to be milked. Milk samples and 3D photos can be taken to monitor the cow’s health status. The Ontario Dairy Research Centre in Elora, Ontario, is leading dairy innovation through education and collaboration. It is a state-of-the-art 175,000 square foot facility for discovery, learning and outreach. This centre is a partnership between the Agricultural Research Institute of Ontario, OMAFRA, the University of Guelph and the Ontario dairy industry.

[ University of Guelph ]


Australia has one of these now, should the rest of us panic?

[ Boeing ]


Daimler and Torc are developing Level 4 automated trucks for the real world. Here is a glimpse into our closed-course testing, routes on public highways in Virginia, and self-driving capabilities development. Our year of collaborating on the future of transportation culminated in the announcement of our new truck testing center in New Mexico.

[ Torc Robotics ]


GITAI Sending Autonomous Robot to Space Station

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/space-robots/gitai-autonomous-robot-iss

We’ve been keeping a close watch on GITAI since early last year—what caught our interest initially is the history of the company, which includes a bunch of folks who started in the JSK Lab at the University of Tokyo, won the DARPA Robotics Challenge Trials as SCHAFT, got swallowed by Google, narrowly avoided being swallowed by SoftBank, and are now designing robots that can work in space.

The GITAI YouTube channel has kept us more or less up to date on their progress so far, and GITAI has recently announced the next step in this effort: The deployment of one of their robots on board the International Space Station in 2021.

AWS Architecture Monthly Magazine: Robotics

Post Syndicated from Annik Stahl original https://aws.amazon.com/blogs/architecture/architecture-monthly-magazine-robotics/

September’s issue of AWS Architecture Monthly is all about robotics. Discover why iRobot, the creator of your favorite (though maybe not your pet’s favorite) little robot vacuum, decided to move its mission-critical platform to the serverless architecture of AWS. Learn how and why you sometimes need to test in a virtual environment instead of a physical one. You’ll also have the opportunity to hear from technical experts from across the robotics industry who came together for the AWS Cloud Robotics Summit in August.

Our expert this month, Matt Hansen (who has dreamed of building robots since he was a teen), gives us his outlook for the industry and explains why cloud will be an essential part of that.

In September’s Robotics issue

  • Ask an Expert: Matt Hansen, Principal Solutions Architect
  • Blog: Testing a PR2 Robot in a Simulated Hospital
  • Case Study: iRobot
  • Blog: Introduction to Automatic Testing of Robotics Applications
  • Case Study: Multiply Labs Uses AWS RoboMaker to Manufacture Individualized Medicines
  • Demos & Videos: AWS Cloud Robotics Summit (August 18-19, 2020)
  • Related Videos: iRobot and ZS Associates

Survey opportunity

This month, we’re also asking you to take a 10-question survey about your experiences with this magazine. The survey is hosted by an external company (Qualtrics), so the below survey button doesn’t lead to our website. Please note that AWS will own the data gathered from this survey, and we will not share the results we collect with survey respondents. Your responses to this survey will be subject to Amazon’s Privacy Notice. Please take a few moments to give us your opinions.

How to access the magazine

We hope you’re enjoying Architecture Monthly, and we’d like to hear from you—leave us a star rating and comment on the Amazon Kindle Newsstand page or contact us anytime at [email protected].

Zipline Partners With Walmart on Commercial Drone Delivery

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/zipline-walmart-drone-delivery

Today, Walmart and Zipline are announcing preliminary plans “to bring first-of-its kind drone delivery service to the United States.” What makes this drone-delivery service the first of its kind is that Zipline uses fixed-wing drones rather than rotorcraft, giving them a relatively large payload capacity and very long range at the cost of a significantly more complicated launch, landing, and delivery process. Zipline has made this work very well in Rwanda, and more recently in North Carolina. But expanding into commercial delivery to individual households is a much different challenge. 

Along with a press release that doesn’t say much, Walmart and Zipline have released a short video of how they see the delivery operation happening, and it’s a little bit more, uh, optimistic than we’re entirely comfortable with.

Video Friday: Drone Helps Explore World’s Deepest Ice Caves

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-flyability-drone-greenland

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.


Video Friday: Even Robots Know That You Need a Mask

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-9420

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
Other Than Human – September 3-10, 2020 – Stockholm, Sweden
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.


Field Notes: Deploying UiPath RPA Software on AWS

Post Syndicated from Yuchen Lin original https://aws.amazon.com/blogs/architecture/field-notes-deploying-uipath-rpa-software-on-aws/

Running UiPath RPA software on AWS leverages the elasticity of the AWS Cloud to set up, operate, and scale robotic process automation. It provides cost-efficient and resizable capacity, and scales the robots to meet your business workload. This reduces the need for administration tasks, such as hardware provisioning, environment setup, and backups. It frees you to focus on business process optimization by automating more processes.

This blog post guides you in deploying UiPath robotic process automation (RPA) software on AWS. RPA software uses the user interface to capture data and manipulate applications just like humans do. It runs as a software robot that interprets data, triggers responses, and communicates with other systems to perform a variety of repetitive tasks.

UiPath Enterprise RPA Platform provides the full automation lifecycle including discover, build, manage, run, engage, and measure with different products. This blog post focuses on the Platform’s core products: build with UiPath Studio, manage with UiPath Orchestrator and run with UiPath Robots.

About UiPath software

UiPath Enterprise RPA Platform’s core products are:

  • UiPath Studio
  • UiPath Robot
  • UiPath Orchestrator

UiPath Studio and UiPath Robot are individual products; you can deploy each on a standalone machine.

UiPath Orchestrator contains Web Servers, a SQL Server, and an Indexer Server (Elasticsearch). You can use a Single-Machine deployment or a Multi-Node deployment, depending on your workload capacity and availability requirements.

For information on UiPath platform offerings, review UiPath platform products.

UiPath on AWS

You can deploy all UiPath products on AWS.

  • UiPath Studio is needed for automation design jobs and runs on a single machine. You deploy it with Amazon EC2.
  • UiPath Robots are needed for automation tasks; each runs on a single machine and scales with the business workload. You deploy them with Amazon EC2 and scale with Amazon EC2 Auto Scaling.
  • UiPath Orchestrator is needed for automation administration jobs and contains three logical components that run on multiple machines. You deploy the Web Server with Amazon EC2, the SQL Server with Amazon RDS, and the Indexer Server with Amazon Elasticsearch Service. For Multi-Node deployment, you deploy the High Availability Add-On with Amazon EC2.

The architecture of UiPath Enterprise RPA Platform on AWS looks like the following diagram:

Figure 1 – UiPath Enterprise RPA Platform on AWS

By deploying the UiPath Enterprise RPA Platform on AWS, you can set up, operate, and scale automation workloads while keeping infrastructure costs matched to demand.

Prerequisites

For this walkthrough, you should have the following prerequisites:

  • An AWS account
  • AWS resources
  • UiPath Enterprise RPA Platform software
  • Basic knowledge of Amazon EC2, EC2 Auto Scaling, Amazon RDS, Amazon Elasticsearch Service.
  • Basic knowledge to set up Windows Server, IIS, SQL Server, Elasticsearch.
  • Basic knowledge of Redis Enterprise to set up High Availability Add-on.
  • Basic knowledge of UiPath Studio, UiPath Robot, UiPath Orchestrator.

Deployment Steps

Deploy UiPath Studio
UiPath Studio deploys on a single machine. Amazon EC2 instances provide secure and resizable compute capacity in the cloud, and the ability to launch applications when needed without upfront commitments.

  1. Download the UiPath Enterprise RPA Platform. UiPath Studio is integrated in the installation package.
  2. Launch an EC2 instance with a Windows OS-based Amazon Machine Image (AMI) that meets the UiPath Studio hardware requirements and software requirements.
  3. Install the UiPath Studio software. For UiPath Studio installation steps, review the UiPath Studio Guide.

Optionally, you can save the installation and pre-configuration work completed for UiPath Studio as a custom Amazon Machine Image (AMI). Then, you can launch more UiPath Studio instances from this AMI. For details, visit Launch an EC2 instance from a custom AMI tutorial.
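As a rough sketch, creating that custom AMI programmatically could look like the following. The instance ID and AMI name here are placeholder assumptions; the resulting payload would be passed to the EC2 create_image API call (for example via boto3).

```python
# Build the parameters for an EC2 create-image call that snapshots the
# configured Studio instance as a custom AMI. The instance ID and AMI name
# are hypothetical; you would pass this payload to
# boto3.client("ec2").create_image(**params).

def studio_ami_params(instance_id: str) -> dict:
    return {
        "InstanceId": instance_id,
        "Name": "uipath-studio-base",          # hypothetical AMI name
        "Description": "UiPath Studio pre-installed and configured",
        "NoReboot": False,                     # reboot for a consistent snapshot
    }

params = studio_ami_params("i-0123456789abcdef0")  # placeholder instance ID
```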

UiPath Robot Deployment

Each UiPath Robot deploys on a single machine with Amazon EC2. Amazon EC2 Auto Scaling helps you add or remove Robots to meet changes in automation workload demand.

  1. Download the UiPath Enterprise RPA Platform. The UiPath Robot is integrated in the installation package.
  2. Launch an EC2 instance with a Windows OS based Amazon Machine Image (AMI) that meets the UiPath Robot hardware requirements and software requirements.
  3. Install the business application (Microsoft Office, SAP, etc.) required for your business processes. Alternatively, select the business application AMI from the AWS Marketplace.
  4. Install the UiPath Robot software. For UiPath Robot installation steps, review Installing the Robot.

Optionally, you can save the installation and pre-configuration work completed for UiPath Robot as a custom Amazon Machine Image (AMI). You can then create launch templates with the instance configuration information, create Auto Scaling groups from those templates, and scale the Robots.

Scale the Robots’ Capacity

Amazon EC2 Auto Scaling groups help you use scaling policies to scale compute capacity based on resource use. By monitoring the process queue and creating a customized scaling policy, the UiPath Robot can automatically scale based on the workload. For details, review Scaling the size of your Auto Scaling group.
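A customized scaling policy of the kind described above might be sketched like this. The Auto Scaling group name and the custom CloudWatch metric are assumptions (you would publish a queue-depth metric from Orchestrator yourself); the payload would go to the Auto Scaling put_scaling_policy call.

```python
# Sketch of a target-tracking scaling policy that keeps the number of pending
# Orchestrator jobs per Robot near a target value. The group name and the
# custom metric (name/namespace) are hypothetical; the dict would be passed
# to boto3.client("autoscaling").put_scaling_policy(**policy).

def robot_scaling_policy(asg_name: str, target_jobs_per_robot: float) -> dict:
    return {
        "AutoScalingGroupName": asg_name,
        "PolicyName": "scale-robots-on-queue-depth",
        "PolicyType": "TargetTrackingScaling",
        "TargetTrackingConfiguration": {
            # Custom metric: pending Orchestrator jobs divided by running Robots.
            "CustomizedMetricSpecification": {
                "MetricName": "PendingJobsPerRobot",   # hypothetical metric
                "Namespace": "UiPath/Orchestrator",    # hypothetical namespace
                "Statistic": "Average",
            },
            "TargetValue": target_jobs_per_robot,
        },
    }

policy = robot_scaling_policy("uipath-robot-asg", target_jobs_per_robot=2.0)
```

With a target-tracking policy, Auto Scaling adds Robots when the metric rises above the target and removes them when it falls below, so you don't write explicit scale-out/scale-in rules.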

Use the Robot Logs

UiPath Robot generates multiple diagnostic and execution logs. Amazon CloudWatch provides the log collection, storage, and analysis, and enables the complete visibility of the Robots and automation tasks. For CloudWatch agent setup on Robot, review Quick Start: Enable Your Amazon EC2 Instances Running Windows Server to Send logs to CloudWatch Logs.
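A minimal "logs" section for the CloudWatch agent configuration might look like the following. The log file path and log group name are assumptions (the default Robot log location varies by UiPath version), so adjust them to your installation.

```python
import json

# Minimal CloudWatch agent "logs" section for shipping UiPath Robot execution
# logs from a Windows instance. The file path and log group name are assumed
# examples, not official defaults.

agent_config = {
    "logs": {
        "logs_collected": {
            "files": {
                "collect_list": [
                    {
                        # Assumed Robot execution log location on Windows.
                        "file_path": "C:\\Windows\\SysWOW64\\config\\systemprofile\\AppData\\Local\\UiPath\\Logs\\*.log",
                        "log_group_name": "uipath-robot-logs",  # hypothetical
                        "log_stream_name": "{instance_id}",
                    }
                ]
            }
        }
    }
}

# The agent reads this as JSON (e.g., amazon-cloudwatch-agent.json).
config_json = json.dumps(agent_config, indent=2)
```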

Monitor the Automation Jobs

UiPath Robot uses the user interface to capture data and manipulate applications. When UiPath Robot runs, it is important to capture processing screens for troubleshooting and auditing purposes. This screen-capture activity can be built into the process with UiPath Studio.

Amazon S3 provides cost-effective storage for retaining all Robot logs and processing screen captures. Amazon S3 Object Lifecycle Management automates the transition between different storage classes, and helps you manage the screenshots so that they are stored cost effectively throughout their lifecycle. For lifecycle policy creation, review How Do I Create a Lifecycle Policy for an S3 Bucket?.
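As an illustration, a lifecycle configuration for the screenshot bucket could look like this. The prefix and the 30/90/365-day thresholds are assumptions to adapt to your retention policy; the payload would go to the S3 put_bucket_lifecycle_configuration call.

```python
# Sketch of an S3 lifecycle configuration that moves Robot screenshots to
# cheaper storage classes over time and eventually expires them. Prefix and
# day thresholds are illustrative; pass the dict as LifecycleConfiguration to
# boto3.client("s3").put_bucket_lifecycle_configuration.

lifecycle = {
    "Rules": [
        {
            "ID": "archive-robot-screenshots",
            "Filter": {"Prefix": "screenshots/"},   # hypothetical key prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
                {"Days": 90, "StorageClass": "GLACIER"},      # long-term archive
            ],
            "Expiration": {"Days": 365},                      # delete after a year
        }
    ]
}
```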

UiPath Orchestrator Deployment

Deployment Components
UiPath Orchestrator Server Platform has many logical components, grouped in three layers:

  • presentation layer
  • web service layer
  • persistence layer

The presentation layer and web service layer are built into one ASP.NET website. The persistence layer contains SQL Server and Elasticsearch. There are three deployment components to be set up:

  • web application
  • SQL Server
  • Elasticsearch

The Web Server, SQL Server, and Elasticsearch Server each require a different environment. Review the hardware requirements and software requirements for more details.

Note: Set up the Web Server, SQL Server, and Elasticsearch Server environments before running the UiPath Enterprise Platform installation wizard.

Set up Web Server with Amazon EC2

UiPath Orchestrator Web Server deploys on Windows Server with IIS 7.5 or later. For details, review the software requirements.

AWS provides various AMIs for Windows Server that can help you set up the environment required for the Web Server.

The Microsoft Windows Server 2019 Base AMI includes most prerequisites for installation except some features of Web Server (IIS) to be enabled. For configuration steps, review Server Roles and Features.

The Web Server should be placed in the correct subnet (public or private) and given a security group that allows HTTPS traffic, according to your business requirements. Review Allow user to connect EC2 on HTTP or HTTPS.

Set up SQL Server with Amazon RDS

Amazon Relational Database Service (Amazon RDS) provides you with a managed database service. With a few clicks, you can set up, operate, and scale a relational database in the AWS Cloud.

Amazon RDS supports the SQL Server engine. For UiPath Orchestrator, both Standard Edition and Enterprise Edition are supported. For details, review the software requirements.

Amazon RDS can be set up in multiple Availability Zones to meet high availability requirements.

UiPath Orchestrator can connect to the created Amazon RDS database with SQL Server Authentication.
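The SQL Server Authentication connection string Orchestrator needs can be assembled as below. The RDS endpoint, database name, and credentials are placeholders; in practice, keep the password in AWS Secrets Manager or SSM Parameter Store rather than in plain text.

```python
# Sketch: build the SQL Server connection string Orchestrator uses to reach
# the RDS instance with SQL Server Authentication. All values shown are
# placeholders, not real endpoints or credentials.

def orchestrator_db_connection(endpoint: str, db: str, user: str, password: str) -> str:
    return (
        f"Server=tcp:{endpoint},1433;"          # RDS SQL Server default port
        f"Initial Catalog={db};"
        f"User Id={user};Password={password};"
        "Encrypt=True;TrustServerCertificate=False;"
    )

conn = orchestrator_db_connection(
    "uipath-db.xxxxxxxx.us-east-1.rds.amazonaws.com",  # hypothetical endpoint
    "UiPathOrchestrator", "orchestrator_user", "<from-secrets-manager>",
)
```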

Set up Elasticsearch Server with Amazon Elasticsearch Service (Amazon ES)

Amazon ES is a fully managed service that lets you deploy, secure, and operate Elasticsearch at scale with essentially zero downtime.

Elasticsearch Service provides a managed Elasticsearch stack, with no upfront costs or usage requirements, and without the operational overhead.

All messages logged by UiPath Robots are sent through the Logging REST endpoint to the Indexer Server where they are indexed for future utilization.

Install UiPath Orchestrator on the Web Server

After the Web Server, SQL Server, and Elasticsearch Server environments are ready, download the UiPath Enterprise RPA Platform and install it on the Web Server.

The UiPath Enterprise Platform installation wizard guides you in configuring and setting up each environment, including connecting to SQL Server and configuring the Elasticsearch API URL.

After you complete setup, the UiPath Orchestrator Portal is available for you to visit and manage processes, jobs, and robots.

The UiPath Orchestrator dashboard appears as in the following screenshot:

Figure 2 – UiPath Orchestrator Portal

Set up Orchestrator High Availability Architecture

One Orchestrator can handle many robots in a typical configuration, but any product running on a single server is vulnerable to failure if something happens to that server.

The High Availability Add-on (HAA) enables you to add a second Orchestrator server to your environment that stays fully synchronized with the first server.

To set up multi-node deployment, launch Amazon EC2 instances with a Linux OS-based Amazon Machine Image (AMI) that meets the HAA hardware and software requirements. Follow the installation guide to set up HAA.

Elastic Load Balancing automatically distributes incoming application traffic across multiple targets. Network Load Balancer should be set up to allow Robots to communicate with multi-node Orchestrators.
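The NLB setup can be sketched as three ELBv2 payloads, one per API call. Subnet and VPC IDs are placeholders, and the listener would reference the ARNs returned by the first two calls.

```python
# Sketch of the ELBv2 payloads for a Network Load Balancer in front of the
# multi-node Orchestrator. IDs and names are placeholders; each dict would go
# to the matching boto3.client("elbv2") call: create_load_balancer,
# create_target_group, and create_listener.

nlb = {
    "Name": "uipath-orchestrator-nlb",
    "Type": "network",
    "Scheme": "internal",                       # Robots reach it from the VPC
    "Subnets": ["subnet-aaaa1111", "subnet-bbbb2222"],  # two AZs for HA
}

target_group = {
    "Name": "orchestrator-nodes",
    "Protocol": "TCP",
    "Port": 443,                                # Orchestrator serves HTTPS
    "VpcId": "vpc-cccc3333",
    "TargetType": "instance",                   # the two Orchestrator EC2 nodes
}

listener = {
    # LoadBalancerArn and the DefaultActions TargetGroupArn come from the
    # responses to the two create calls above.
    "Protocol": "TCP",
    "Port": 443,
}
```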

Cleaning up

To avoid incurring future charges, delete all the resources created in this walkthrough.

Conclusion

In this post, I showed you how to deploy the UiPath Enterprise RPA Platform on AWS to further optimize and automate your business processes. AWS Managed Services like Amazon EC2, Amazon RDS, and Amazon Elasticsearch Service help you set up the environment with high availability. This reduces the maintenance effort of backend services, as well as scaling Orchestrator capabilities. Amazon EC2 Auto Scaling helps you add or remove robots to meet automation workload changes in demand.

To learn more about how to integrate UiPath with AWS services, check out The UiPath and AWS partnership.

Field Notes provides hands-on technical guidance from AWS Solutions Architects, consultants, and technical account managers, based on their experiences in the field solving real-world business problems for customers.

These Underwater Drones Use Water Temperature Differences To Recharge

Post Syndicated from Jeremy Hsu original https://spectrum.ieee.org/automaton/robotics/drones/renewable-power-underwater-drones

Yi Chao likes to describe himself as an “armchair oceanographer” because he got incredibly seasick the one time he spent a week aboard a ship. So it’s maybe not surprising that the former NASA scientist has a vision for promoting remote study of the ocean on a grand scale by enabling underwater drones to recharge on the go using his company’s energy-harvesting technology.

Many of the robotic gliders and floating sensor stations currently monitoring the world’s oceans are effectively treated as disposable devices because the research community has a limited number of both ships and funding to retrieve drones after they’ve accomplished their mission of beaming data back home. That’s not only a waste of money, but may also contribute to a growing assortment of abandoned lithium-ion batteries polluting the ocean with their leaking toxic materials—a decidedly unsustainable approach to studying the secrets of the underwater world.

“Our goal is to deploy our energy harvesting system to use renewable energy to power those robots,” says Chao, president and CEO of the startup Seatrec. “We’re going to save one battery at a time, so hopefully we’re going to not to dispose more toxic batteries in the ocean.”

Chao’s California-based startup claims that its SL1 Thermal Energy Harvesting System can already cut the cost of using robotic probes for oceanographic data collection by an order of magnitude. The startup is working on adapting its system to work with autonomous underwater gliders. And it has partnered with defense giant Northrop Grumman to develop an underwater recharging station for oceangoing drones that incorporates Northrop Grumman’s self-insulating electrical connector, capable of operating while the powered electrical contacts are submerged.

Seatrec’s energy-harvesting system works by taking advantage of how certain substances transition from solid-to-liquid phase and liquid-to-gas phase when they heat up. The company’s technology harnesses the pressure changes that result from such phase changes in order to generate electricity. 

To make the phase changes happen, Seatrec’s solution taps the temperature differences between warmer water at the ocean surface and colder water at the ocean depths. Even a relatively simple robotic probe can generate additional electricity by changing its buoyancy to either float at the surface or sink down into the colder depths.
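To get a feel for the scale involved, here's a back-of-the-envelope estimate of the mechanical work one such phase-change cycle could yield. All of the numbers below are our own illustrative assumptions, not Seatrec specifications.

```python
# Rough energy estimate for one phase-change cycle: the expanding material
# does work W = P * dV against a hydraulic pressure. Every figure here is an
# illustrative assumption, not a Seatrec spec.

pcm_volume_m3 = 1.0e-3          # ~1 liter of phase-change material (assumed)
expansion_fraction = 0.10       # ~10% volume change on melting (assumed)
working_pressure_pa = 1.0e7     # ~100 bar hydraulic pressure (assumed)

delta_v = pcm_volume_m3 * expansion_fraction     # volume change, m^3
work_joules = working_pressure_pa * delta_v      # W = P * dV

# At an assumed 50% conversion efficiency, usable electrical energy per cycle:
electrical_joules = 0.5 * work_joules

# Roughly 1000 J of mechanical work, ~500 J electrical per dive cycle.
print(work_joules, electrical_joules)
```

A few hundred joules per dive is modest, but it accumulates over the hundreds of surface-to-depth cycles a long-lived float performs, which is the point of the technology.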

By attaching an external energy-harvesting module, Seatrec has already begun transforming robotic probes into assets that can be recharged and reused more affordably than sending out a ship each time to retrieve the probes. This renewable energy approach could keep such drones going almost indefinitely barring electrical or mechanical failures. “We just attach the backpack to the robots, we give them a cable providing power, and they go into the ocean,” Chao explains. 

The early buyers of Seatrec’s products are primarily academic researchers who use underwater drones to collect oceanographic data. But the startup has also attracted military and government interest. It has already received small business innovation research contracts from both the U.S. Office of Naval Research and National Oceanic and Atmospheric Administration (NOAA).

Seatrec has also won two $10,000 prizes under the Powering the Blue Economy: Ocean Observing Prize administered by the U.S. Department of Energy and NOAA. The prizes awarded during the DISCOVER Competition phase back in March 2020 included one prize split with Northrop Grumman for the joint Mission Unlimited UUV Station concept. The startup and defense giant are currently looking for a robotics company to partner with for the DEVELOP Competition phase of the Ocean Observing Prize that will offer a total of $3 million in prizes.

In the long run, Seatrec hopes its energy-harvesting technology can support commercial ventures such as the aquaculture industry that operates vast underwater farms. The technology could also support underwater drones carrying out seabed surveys that pave the way for deep sea mining ventures, although those are not without controversy because of their projected environmental impacts.

Among all the possible applications, Chao seems especially enthusiastic about the prospect of Seatrec’s renewable power technology enabling underwater drones and floaters to collect oceanographic data for much longer periods of time. He spent the better part of two decades working at the NASA Jet Propulsion Laboratory in Pasadena, Calif., where he helped develop a satellite designed for monitoring the Earth’s oceans. But he and the JPL engineering team that developed Seatrec’s core technology believe that swarms of underwater drones can provide a continuous monitoring network to truly begin understanding the oceans in depth.

The COVID-19 pandemic has slowed production and delivery of Seatrec’s products somewhat given local shutdowns and supply chain disruptions. Still, the startup has been able to continue operating in part because it’s considered to be a defense contractor that is operating an essential manufacturing facility. Seatrec’s engineers and other staff members are working in shifts to practice social distancing.

“Rather than building one or two for the government, we want to scale up to build thousands, hundreds of thousands, hopefully millions, so we can improve our understanding and provide that data to the community,” Chao says. 

Amaran the Tree-Climbing Robot Can Safely Harvest Coconuts

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/automaton/robotics/industrial-robots/amaran-tree-climbing-robot-can-safely-harvest-coconuts


Coconuts may be delicious and useful for producing a wide range of products, but harvesting them is no easy task. Specially trained harvesters must risk their lives by climbing trees roughly 15 meters high to hack off just one bunch of coconuts. A group of researchers in India has designed a robot, named Amaran, that could reduce the need for human harvesters to take such a risk. But is the robot up to the task?

The researchers describe the tree-climbing robot in a paper published in the latest issue of IEEE/ASME Transactions on Mechatronics. Along with lab tests, they compared Amaran’s ability to harvest coconuts to that of a 50-year-old veteran harvester. Whereas the man bested the robot in terms of overall speed, the robot excelled in endurance.

To climb, Amaran relies on a ring-shaped body that clasps around trees of varying diameter. The robot carries a control module, motor drivers, a power management unit, and a wireless communications interface. Eight wheels allow it to move up and down a tree, as well as rotate around the trunk. Amaran is controlled by a person on the ground, who can use an app or joystick system to guide the robot’s movements.

Once Amaran approaches its target, it wields an attached robotic arm with 4 degrees of freedom to snip the coconut bunch. As a safety feature, if Amaran’s main battery dies, a backup unit kicks in, helping the robot return to the ground.

Rajesh Kannan Megalingam, an assistant professor at Amrita Vishwa Vidyapeetham University, in South India, says his team has been working on Amaran since 2014. “No two coconut trees are the same anywhere in the world. Each one is unique in size, and has a unique alignment of coconut bunches and leaves,” he explains. “So building a perfect robot is an extremely challenging task.”

While testing the robot in the lab, Megalingam and his colleagues found that Amaran is capable of climbing trees when the inclination of the trunk is up to 30 degrees with respect to the vertical axis. Megalingam says that many coconut trees, especially under certain environmental conditions, grow at such an angle.

Next, the researchers tested Amaran in the field, and compared its ability to harvest coconuts to the human volunteer. The trees ranged from 6.2 to 15.2 m in height.

It took the human an average of 11.8 minutes to harvest one tree, whereas it took Amaran an average of 21.9 minutes per tree (notably, 14 of those minutes were spent setting up the robot at the base of the tree before it even began to climb).

But Megalingam notes that Amaran can harvest more trees in a given day. For example, the human harvester in their trials could scale about 15 trees per day before tiring, while the robot can harvest up to 22 trees per day, provided its operator doesn’t tire. And although the robot is currently teleoperated, future improvements could make it more autonomous, improving its climbing speed and harvesting capabilities.
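The daily-throughput comparison follows from simple arithmetic on the reported per-tree times. The sketch below reproduces it under one added assumption that is not stated in the article: an 8-hour working day.

```python
# Back-of-the-envelope throughput comparison from the reported figures.
# Assumption (not from the article): an 8-hour working day.

MINUTES_PER_DAY = 8 * 60  # assumed working day

human_min_per_tree = 11.8   # reported average per tree
robot_min_per_tree = 21.9   # reported average, including ~14 min of setup

# The human is limited by fatigue (~15 trees/day, per the article),
# while the robot is limited only by its cycle time.
human_trees = min(MINUTES_PER_DAY / human_min_per_tree, 15)
robot_trees = MINUTES_PER_DAY / robot_min_per_tree

print(f"human: {human_trees:.0f} trees/day, robot: {robot_trees:.0f} trees/day")
```

Under that assumption the robot's cycle time works out to about 22 trees per day, matching the figure Megalingam cites.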

“Our ultimate aim is to commercialize this product and to help the coconut farmers,” says Megalingam. “In Kerala state, there are only 7,000 trained coconut tree climbers, whereas the requirement is about 50,000 trained climbers. The situation is similar in other states in India like Tamil Nadu, Andhra, and Karnataka, where coconut is grown in large numbers.”

He acknowledges that the current cost of the robot is a barrier to broader deployment, but notes that community members could pool their resources to share the cost and use of the robot. Most importantly, he notes, “Coconut harvesting using Amaran does not involve risk for human life. Any properly trained person can operate Amaran. Usually only male workers take up this tree climbing job. But Amaran can be operated by anyone irrespective of gender, physical strength, and skills.”

Autonomous Robots Could Mine the Deep Seafloor

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/automaton/robotics/robotics-software/autonomous-robots-could-mine-the-deep-seafloor

A battle is brewing over the fate of the deep ocean. Huge swaths of seafloor are rich in metals—nickel, copper, cobalt, zinc—that are key to making electric vehicle batteries, solar panels, and smartphones. Mining companies have proposed scraping and vacuuming the dark expanse to provide supplies for metal-intensive technologies. Marine scientists and environmentalists oppose such plans, warning of huge and potentially permanent damage to fragile ecosystems.

Pietro Filardo is among the technology developers who are working to find common ground.

His company, Pliant Energy Systems, has built what looks like a black mechanical stingray. Its soft, rippling fins use hyperbolic geometry to move in a traveling wave pattern, propelling the skateboard-sized device through water. From an airy waterfront lab in Brooklyn, New York, Filardo’s team is developing tools and algorithms to transform the robot into an autonomous device equipped with grippers. Their goal is to pluck polymetallic nodules—potato-sized deposits of precious ores—off the seafloor without disrupting precious habitats.

“On the one hand, we need these metals to electrify and decarbonize. On the other hand, people worry we’re going to destroy deep ocean ecosystems that we know very little about,” Filardo said. He described deep sea mining as the “killer app” for Pliant’s robot—a potentially lucrative use for the startup’s minimally invasive design.

How deep seas will be mined, and where, is ultimately up to the International Seabed Authority (ISA), a group of 168 member countries. In October, the intergovernmental body is expected to adopt a sweeping set of technical and environmental standards, known as the Mining Code, that could pave the way for private companies to access large tracts of seafloor. 

The ISA has already awarded 30 exploratory permits to contractors in sections of the Atlantic, Pacific, and Indian Oceans. Over half the permits are for prospecting for polymetallic nodules, primarily in the Clarion-Clipperton Zone, a hotspot south of Hawaii and west of Mexico.

Researchers have tested nodule mining technology since the 1970s, mainly in national waters. Existing approaches include sweeping the seafloor with hydraulic suction dredges to pump up sediment, filter out minerals, and dump the resulting slurry in the ocean or tailing ponds. In India, the National Institute of Ocean Technology is building a tracked “crawler” vehicle with a large scoop to collect, crush, and pump nodules up to a mother ship.

Mining proponents say such techniques are better for people and the environment than dangerous, exploitative land-based mining practices. Yet ocean experts warn that stirring up sediment and displacing organisms that live on nodules could destroy deep sea habitats that took millions of years to develop. 

“One thing I often talk about is, ‘How do we fix it if we break it? How are we going to know we broke it?’” said Cindy Lee Van Dover, a deep sea biologist and professor at Duke University’s Nicholas School of the Environment. She said much more research is required to understand the potential effects on ocean ecosystems, which foster fisheries, absorb carbon dioxide, and produce most of the Earth’s oxygen.

Significant work is also needed to transform robots into metal collectors that can operate some 6,000 meters below the ocean surface.

Pliant’s first prototype, called Velox, can navigate the depths of a swimming pool and the shallow ocean “surf zone” where waves crash into the sand. Inside Velox, an onboard CPU distributes power to actuators that drive the undulating motions in the flexible fins. Unlike a propeller thruster, which uses a rapidly rotating blade to move small jets of water at high velocity, Pliant’s undulating fins move large volumes of water at low velocity. Because the fins present a large surface area to the water, the robot can make rapid local maneuvers on relatively little battery power, allowing it to operate for longer periods before needing to recharge, Filardo said.

The design also stirs up less sediment on the seafloor, a potential advantage in sensitive deep sea environments, he added.

The Brooklyn company is partnering with the Massachusetts Institute of Technology to develop a larger next-generation robot, called C-Ray. The highly maneuverable device will twist and roll like a sea otter. Using metal detectors and a mix of camera hardware and computer algorithms, C-Ray will likely be used to surveil the surf zone for potential hazards to the U.S. Navy, which is sponsoring the research program.

The partners ultimately aim to deploy “swarms” of autonomous C-Rays that communicate via a “hive mind”—applications that would also serve to mine polymetallic nodules. Pliant envisions launching hundreds of gripper-equipped robots that roam the seafloor and place nodules in cages that float to the surface on gas-filled lift bags. Filardo suggested that C-Ray could also swap nodules with lower-value stones, allowing organisms to regrow on the seafloor.

A separate project in Italy may also yield new tools for plucking the metal-rich orbs.

SILVER2 is a six-legged robot that can feel its way around the dark and turbid seafloor, without the aid of cameras or lasers, by pushing its legs in repeated, frequent cycles.

“We started by looking at what crabs did underwater,” said Marcello Calisti, an assistant professor at the BioRobotics Institute, in the Sant’Anna School of Advanced Studies. He likened the movements to people walking waist-deep in water and using the sand as leverage, or the “punter” on a flat-bottomed river boat who uses a long wooden pole to propel the vessel forward.

Calisti and colleagues spent most of July at a seaside lab in Livorno, Italy, testing the 20-kilogram prototype in shallow water. SILVER2 is equipped with a soft elastic gripper that gently envelopes objects, as if cupping them in the palm of a hand. Researchers used the crab-like robot to collect plastic litter on the seabed and deposit the debris in a central collection bin.

Although SILVER2 isn’t intended for deep sea mining, Calisti said he could foresee potential applications in the sector if his team can scale the technology.

For developers like Pliant, the ability to raise funding and bring their mining robots to fruition will largely depend on the International Seabed Authority’s next meeting. Opponents of ocean mining are pushing to pause discussions on the Mining Code to give scientists more time to evaluate risks, and to allow companies like Tesla or Apple to devise technologies that require fewer or different metal parts. Such regulatory uncertainty could dissuade investors from backing new mining approaches that might never be used.

The biologist Van Dover said she doesn’t outright oppose the Mining Code; rather, rules should include stringent stipulations, such as requirements to monitor environmental impacts and immediately stop operations once damage is detected. “I don’t see why the code couldn’t be so well-written that it would not allow the ISA to make a mistake,” she said.

Video Friday: This Robot Will Restock Shelves at Japanese Convenience Stores

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-telexistence-model-t-robot

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.


Caltech’s Canon-Launched SQUID Drone Doubles in Size, Goes Autonomous

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/drones/caltech-canon-launched-squid-drone


At IROS last year, Caltech and JPL presented a prototype for a ballistically launched quadrotor—once folded up into a sort of football shape with fins, the drone is stuffed into a tube and then fired straight up with a blast of compressed CO2, at which point it unfolds itself, stabilizes, and then flies off. It’s been about half a year, and the prototype has been scaled up in both size and capability, now with a half-dozen rotors and full onboard autonomy that can (barely) squeeze into a 6-inch tube.

iRobot Announces Major Software Update, Shift From Pure Autonomy to Human-Robot Collaboration

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/home-robots/irobot-home-autonomy-update

Since the release of the very first Roomba in 2002, iRobot’s long-term goal has been to deliver cleaner floors in a way that’s effortless and invisible. Which sounds pretty great, right? And arguably, iRobot has managed to do exactly this, with their most recent generation of robot vacuums that make their own maps and empty their own dustbins. For those of us who trust our robots, this is awesome, but iRobot has gradually been realizing that many Roomba users either don’t want this level of autonomy, or aren’t ready for it.

Today, iRobot is announcing a major new update to its app that represents a significant shift of its overall approach to home robot autonomy. Humans are being brought back into the loop through software that tries to learn when, where, and how you clean so that your Roomba can adapt itself to your life rather than the other way around.

Boston Dynamics’ Handle robot recreated with Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/boston-dynamics-handle-robot-recreated-with-raspberry-pi/

You in the community seemed so impressed with this recent Boston Dynamics–inspired build that we decided to feature another. This time, maker Harry was inspired by Boston Dynamics’ research robot Handle, which stands 6.5 feet tall, travels at 9 mph, and jumps 4 feet vertically. Here’s how Harry made his miniature version, MABEL (Multi Axis Balancer Electronically Levelled).

MABEL has individually articulated legs to enhance off-road stability, prevent it from tipping, and even make it jump (if you use some really fast servos). Harry is certain that anyone with a 3D printer and a “few bits” can build one.

MABEL builds on the open-source YABR project for its PID controller, adding servos and a Raspberry Pi that interfaces with them and controls everything.

Installing MABEL’s Raspberry Pi brain and wiring the servos

Thanks to a program based on the open-source YABR firmware, an Arduino handles all of the PID calculations using data from an MPU-6050 accelerometer/gyro. The Raspberry Pi, using Python code, manages Bluetooth and servo control, running an inverse kinematics algorithm to translate the robot’s legs in two axes.
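For readers curious what those PID calculations look like, here is a minimal sketch of one control cycle. The gains and the 4-millisecond loop period are illustrative placeholders, not values from Harry's build or the YABR firmware.

```python
# A minimal PID step of the kind a YABR-style balancer runs each loop.
# Gains and loop period here are made-up placeholders.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        """Return the motor correction for one control cycle."""
        error = setpoint - measured
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. hold the robot upright (0 degrees pitch) given a fused gyro/accel angle
pid = PID(kp=15.0, ki=1.5, kd=30.0, dt=0.004)
correction = pid.step(setpoint=0.0, measured=2.0)  # robot leaning 2 degrees
```

The sign and magnitude of `correction` then set the wheel motors' direction and speed, driving the base back under the robot's center of mass.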

Kit list

If you want to attempt this project yourself, the files for all the hard 3D-printed bits are on Thingiverse, and all the soft insides are on GitHub.

IKSolve is the class that handles the inverse kinematics functionality for MABEL (IKSolve.py) and allows the legs to be translated using (x, y) coordinates. It’s really simple to use: all you need to specify are the home values of each servo (the angles that, when sent to your servos, make the legs point straight down, at 90 degrees).
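The actual solver lives in IKSolve.py on Harry's GitHub; as a rough idea of what such a two-axis leg solver does, here is the standard two-link planar solution. The function name, link lengths, and target point are all placeholders, not taken from MABEL's code.

```python
import math

# Sketch of a standard two-link planar inverse kinematics solution,
# the textbook version of what a class like IKSolve implements.
# Link lengths (in mm) are hypothetical.

def leg_ik(x, y, l1=55.0, l2=55.0):
    """Return (hip, knee) angles in degrees placing the foot at (x, y).

    (0, 0) is the hip joint; y is measured straight down from it.
    """
    r2 = x * x + y * y
    # Law of cosines gives the knee bend from the hip-to-foot distance.
    cos_knee = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))  # clamp for safety
    hip = math.atan2(x, y) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return math.degrees(hip), math.degrees(knee)

# Foot directly below the hip at full extension: both angles are zero,
# matching the "legs point straight down" home pose.
print(leg_ik(0.0, 110.0))  # → (0.0, 0.0)
```

In a real build the returned angles would be added to each servo's home value before being written out as pulse widths.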

When MABEL was just a twinkle in Harry’s eye

MABEL is designed to work by listening for commands on the Arduino (PID controller) end, which are sent to it by the Raspberry Pi over serial using pySerial. Joystick data is sent to the Raspberry Pi using the Input Python library. Harry first tried to get the joystick data from an old PlayStation 3 controller, but went with the PiHut’s Raspberry Pi Compatible Wireless Gamepad in the end for ease.
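As a sketch of what sending joystick data over serial might look like: a compact frame with a start byte, the two axes scaled into a byte range, and a checksum. The frame layout, scaling, and port settings below are assumptions for illustration, not Harry's actual protocol.

```python
# Hypothetical 4-byte frame the Pi could send the Arduino each loop:
# start byte, joystick x/y mapped to 0-200, and a simple checksum.

def encode_frame(joy_x, joy_y):
    """Pack joystick axes (-1.0..1.0) into a 4-byte frame."""
    def to_byte(v):
        return max(0, min(200, int(round((v + 1.0) * 100))))
    x, y = to_byte(joy_x), to_byte(joy_y)
    checksum = (x + y) & 0xFF  # lets the Arduino reject corrupted frames
    return bytes([0xFF, x, y, checksum])  # 0xFF marks the frame start

frame = encode_frame(0.0, -0.5)
# With pySerial this would be written out with something like:
#   serial.Serial('/dev/ttyUSB0', 115200).write(frame)
print(frame.hex())  # → "ff643296"
```

On the Arduino side, the sketch would wait for the 0xFF start byte, read three more bytes, and verify the checksum before feeding the axes into the PID loop.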

Keep up with Harry’s blog or give Raspibotics a follow on Twitter, as part 3 of his build write-up should be dropping imminently, featuring updates that will hopefully get MABEL jumping!

The post Boston Dynamics’ Handle robot recreated with Raspberry Pi appeared first on Raspberry Pi.

Robotic Tank Is Designed to Crawl Through Your Intestine

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/medical-robots/robotic-tank-is-designed-to-crawl-through-your-intestine

Let’s talk about bowels! Most of us have them, most of us use them a lot, and like anything that gets used a lot, they eventually need to get checked out to help make sure that everything will keep working the way it should for as long as you need it to. Generally, this means a colonoscopy, and while there are other ways of investigating what’s going on in your guts, a camera on a flexible tube is still “the gold-standard method of diagnosis and intervention,” according to some robotics researchers who want to change that up a bit.

The University of Colorado’s Advanced Medical Technologies Lab has been working on a tank robot called Endoculus that’s able to actively drive itself through your intestines, rather than being shoved. The good news is that it’s very small, and the bad news is that it’s probably not as small as you’d like it to be.

Video Friday: Child Robot Learning to Express Emotions Using Body Language

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-ibuki-robot-child-emotional-gait



Raspberry Pi Off-World Bartender

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-off-world-bartender/

Three things we like: Blade Runner, robots, and cocktails. That’s why we LOVE Donald Bell‘s Raspberry Pi–packed ‘VK-01 Off-World Bartender‘ cocktail making machine.

This machine was due to be Donald’s entry into the Cocktail Robotics Grand Challenge, an annual event in San Francisco. By the time the event was cancelled, he was too deep into his awesome build to give up, so he decided to share it with the Instructables community instead.

Donald wanted users to get as much interaction and feedback as possible, rather than simply pressing a button and receiving a random drink. So with this machine, the interaction comes in four ways: instructions provided on the screen, using a key card to bypass security, placing and removing a cup on the tray, and entering an order number on the keypad.

In addition to that, feedback is provided by way of lighting changes, music, video dialogue, pump motors whirring, and even the clicks of relays at each stage of the cocktail making process.

Ordering on the keypad


The keypad allows people to punch in a number to trigger their order, like on a vending machine. The drink order is sent to the Hello Drinkbot software running on the Raspberry Pi 3B that controls the pumps.

Getting your cup filled

The switch under the lid and ring of LEDs on the base

So that the machine can tell when a vessel is placed under the dispenser spout, and when it’s removed, Donald built a switch in under a 3D-printed tray. Provided the vessel has at least one ice cube in it, even the lightest plastic cup is heavy enough to trigger the switch.
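Mechanical switches like this one "bounce," flickering between states for a few milliseconds when pressed, so reading them reliably usually means accepting a new state only after it has been stable across several polls. This is a generic debouncing sketch, not Donald's code, and the poll count is arbitrary.

```python
# Generic software debounce: a new switch state is accepted only after
# it has been seen on several consecutive polls.

class DebouncedSwitch:
    def __init__(self, stable_polls=5):
        self.stable_polls = stable_polls
        self.state = False       # debounced output
        self._candidate = False  # most recently seen raw state
        self._count = 0          # consecutive polls at _candidate

    def update(self, raw_reading):
        """Feed one raw poll; return the debounced state."""
        if raw_reading == self._candidate:
            self._count += 1
        else:
            self._candidate, self._count = raw_reading, 1
        if self._count >= self.stable_polls:
            self.state = self._candidate
        return self.state

sw = DebouncedSwitch()
readings = [True, False, True, True, True, True, True]  # one bounce, then a cup
print([sw.update(r) for r in readings])
```

On a Raspberry Pi the raw readings would come from polling a GPIO pin wired to the switch; the single bounce in the example never reaches the output.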

The RFID card reader

Cocktail machine customers are asked to scan a special ID card to start. To make this work, Donald adapted a sample script that blinks the card reader’s internal LED when any RFID card is detected.

Interactive video screen


This bit is made possible by MP4Museum, a “bare-bones” kiosk video player software that the second Raspberry Pi inside the machine runs on boot. By connecting a switch to the Raspberry Pi’s GPIO, Donald enabled customers to advance through the videos one by one. And yes, that’s an official Raspberry Pi Touch Display.

Behind the scenes of the screen with the Raspberry Pi A+ running the show

The Hello Drinkbot ‘bartender’


Donald used the Python-based Hello Drinkbot software as the brains of the machine. With it, you can configure which liquors or juices are connected to which pumps, and send instructions on exactly how much to pour of each ingredient. Everything is configured via a web interface.
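Under the hood, dosing by volume reduces to running each pump for volume divided by flow rate seconds. The sketch below shows that reduction; the pump assignments, flow rate, and recipe are made up for illustration and are not Hello Drinkbot's actual configuration format.

```python
# Hypothetical pump configuration and recipe, reduced to per-pump run times.

PUMP_FLOW_ML_PER_S = 2.0  # assumed peristaltic pump flow rate

pumps = {1: "gin", 2: "tonic", 3: "lime juice"}      # pump number -> ingredient
recipe_ml = {"gin": 50, "tonic": 100, "lime juice": 10}

ingredient_to_pump = {name: n for n, name in pumps.items()}

def pour_schedule(recipe):
    """Return {pump_number: seconds_to_run} for a recipe in milliliters."""
    return {
        ingredient_to_pump[name]: ml / PUMP_FLOW_ML_PER_S
        for name, ml in recipe.items()
    }

print(pour_schedule(recipe_ml))  # → {1: 25.0, 2: 50.0, 3: 5.0}
```

The controller would then switch each pump's relay on for its scheduled duration, either one at a time or in parallel.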

Via a bank of relays, microcontrollers connect all the signals from the Touch Display, keypad, RFID card reader, and switch under the spout.

Here’s the Fritzing diagram for this beast

Supplies

Donald shared an exhaustive kit list on his original post, but basically, what you’re looking at is…

Donald’s friend Jim Burke‘s beautiful concept sketches

And finally, check out the Raspberry Pi–based Hello Drinkbot project by Rich Gibson, which inspired Donald’s build.

The post Raspberry Pi Off-World Bartender appeared first on Raspberry Pi.

Minuscule RoBeetle Turns Liquid Methanol Into Muscle Power

Post Syndicated from Evan Ackerman original https://spectrum.ieee.org/automaton/robotics/robotics-hardware/robeetle-liquid-methanol

It’s no secret that one of the most significant constraints on robots is power. Most robots need lots of it, and it has to come from somewhere, with that somewhere usually being a battery because there simply aren’t many other good options. Batteries, however, are famous for having poor energy density, and the smaller your robot is, the more of a problem this becomes. And the issue with batteries goes beyond the battery itself, but also carries over into all the other components that it takes to turn the stored energy into useful work, which again is a particular problem for small-scale robots.

In a paper published this week in Science Robotics, researchers from the University of Southern California, in Los Angeles, demonstrate RoBeetle, an 88-milligram four legged robot that runs entirely on methanol, a power-dense liquid fuel. Without any electronics at all, it uses an exceptionally clever bit of mechanical autonomy to convert methanol vapor directly into forward motion, one millimeter-long step at a time.