All posts by Phil King

Machine Learning Prosthetic Arm | The MagPi #110

Post Syndicated from Phil King original https://www.raspberrypi.org/blog/machine-learning-prosthetic-arm-the-magpi-110/

This intelligent arm learns how to move naturally, based on what the wearer is doing, as Phil King discovers in the latest issue of The MagPi, out now.

Known for his robotic creations, popular YouTuber James Bruton is also a keen Iron Man cosplayer, and his latest invention would surely impress Tony Stark: an intelligent prosthetic arm that can move naturally and autonomously, depending on the wearer’s body posture and limb movements.

Equipped with three heavy-duty servos, the prosthetic arm moves naturally based on the data from IMU sensors on the wearer’s other limbs

“It’s a project I’ve been thinking about for a while, but I’ve never actually attempted properly,” James tells us. “I thought it would be good to have a work stream of something that could be useful.”

Motion capture suit

To obtain the body movement data on which to base the arm’s movements, James considered using a brain-computer interface, but this would be unreliable without embedding electrodes in his head! So, he instead opted to train it with machine learning.

For this he created a motion capture suit from 3D-printed parts to gather all the data from his body motions: arms, legs, and head. The suit measures joint movements using rotating pieces with magnetic encoders, along with limb and head positions – via a special headband – using MPU-6050 inertial measurement units and Teensy LC boards.

Part of the motion capture suit, the headband is equipped with an IMU to gather movement data
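James’s suit reads its IMUs with Teensy LC boards running their own firmware, so the following is not his code; it’s just a minimal MicroPython-style sketch of reading the same MPU-6050 sensor over I2C, with the wiring (I2C bus 0, SCL on GP9, SDA on GP8) as an assumption:

```python
# Illustrative sketch (not James's Teensy firmware): read raw
# accelerometer/gyro data from an MPU-6050 over I2C in MicroPython.
from machine import I2C, Pin
import struct, time

MPU_ADDR = 0x68                        # default MPU-6050 address (AD0 low)
i2c = I2C(0, scl=Pin(9), sda=Pin(8))   # assumed wiring; adjust to your board

i2c.writeto_mem(MPU_ADDR, 0x6B, b"\x00")  # clear PWR_MGMT_1 to wake the sensor

while True:
    # 14 bytes from register 0x3B: accel XYZ, temperature, gyro XYZ
    raw = i2c.readfrom_mem(MPU_ADDR, 0x3B, 14)
    ax, ay, az, temp, gx, gy, gz = struct.unpack(">7h", raw)
    print("accel:", ax, ay, az, "gyro:", gx, gy, gz)
    time.sleep_ms(100)
```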

Collected by a Teensy 4.1, this data is then fed into a machine learning model running on the suit’s Raspberry Pi Zero using AOgmaNeo, a lightweight C++ software library designed to run on low-power devices such as microcontrollers.

“AOgmaNeo is a reinforcement machine learning system which learns what all of the data is doing in relation to itself,” James explains. “This means that you can remove any piece of data and, after training, the software will do its best to replace the missing piece with a learned output. In my case, I’m removing the right arm and using the learned output to drive the prosthetic arm, but it could be any limb.”

While James notes that AOgmaNeo is actually meant for reinforcement learning, “in this case we know what the output should be rather than it being unknown and learning through binary reinforcement.”
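To get a feel for the “remove one limb and predict it” idea, here is a toy NumPy sketch that learns to reconstruct one masked sensor channel from the others using ordinary least squares. It’s a deliberately simple stand-in for illustration, nothing like AOgmaNeo’s predictive hierarchy, and the synthetic “walking” signals are invented:

```python
# Toy illustration of the masked-channel idea (NOT AOgmaNeo): learn to
# predict one limb's channel from the others, then use that prediction
# when the real channel is absent.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 2000)

# Fake motion-capture channels: during walking, limbs move as
# correlated, phase-shifted oscillations.
X_others = np.stack(
    [np.sin(t), np.sin(t + np.pi), np.sin(t + 0.5), np.cos(t)], axis=1
)
y_right_arm = np.sin(t + np.pi) + 0.05 * rng.standard_normal(t.size)

# "Training": fit the right-arm channel from all the other channels.
W, *_ = np.linalg.lstsq(X_others, y_right_arm, rcond=None)

# "Playback": the right-arm sensor is gone; predict it from the rest.
y_pred = X_others @ W
print("mean abs error:", np.abs(y_pred - y_right_arm).mean())
```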

The motion capture suit comprises 3D-printed parts, each equipped with a magnetic rotary encoder, MPU-6050 IMU, and Teensy LC

To train the model, James used distinctive repeated motions, such as walking, so that the prosthetic arm would later be able to predict what it should do from incoming sensor data. He also spent some time standing still so that the arm would know what to do in that situation.

New model arm

With the machine learning model trained, Raspberry Pi Zero can be put into playback mode to control the backpack-mounted arm’s movements intelligently, duplicating what the wearer’s real right arm did during training based on the positions and movements of his other body parts.
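James hasn’t published the playback code, but conceptually it maps each predicted joint angle onto one of the arm’s three servos. A hypothetical sketch using the gpiozero library, where the GPIO pins, the dummy model, and the sensor-reading stub are all illustrative assumptions:

```python
# Hypothetical playback loop (not James's published code): map predicted
# joint angles in degrees onto three servos via gpiozero.
from gpiozero import Servo
from time import sleep

servos = [Servo(17), Servo(27), Servo(22)]  # assumed GPIO pins

def read_suit_sensors():
    # Stub: in the real suit these values arrive from the Teensy 4.1.
    return [0.0] * 8

class DummyModel:
    def predict(self, sensors):
        return [0.0, 30.0, -30.0]  # placeholder joint angles in degrees

model = DummyModel()

def angle_to_value(deg, lo=-90.0, hi=90.0):
    """Scale an angle in [lo, hi] degrees to gpiozero's -1..+1 range."""
    return max(-1.0, min(1.0, 2 * (deg - lo) / (hi - lo) - 1))

while True:
    angles = model.predict(read_suit_sensors())
    for servo, deg in zip(servos, angles):
        servo.value = angle_to_value(deg)
    sleep(0.02)  # ~50 Hz update rate
```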

So, as he demonstrates in his YouTube video, if James starts walking on the spot, the prosthetic arm swings the opposite way to his left arm as he strides along, and moves forward as he raises his left leg. If he stands still, the arm will hang down by his side. The 3D-printed hand was added purely for aesthetic reasons and the fingers don’t move.

Subscribe to James’ YouTube channel

James admits that the project is highly experimental and currently an early work in progress. “I’d like to develop this concept further,” he says, “although the current setup is slightly overambitious and impractical. I think the next step will be to have a simpler set of inputs and outputs.”

While he generally publishes his CAD designs and code, the arm “doesn’t work all that well, so I haven’t this time. AOgmaNeo is open-source, though (free for personal use), so you can make something similar if you wished.” What would you do with an extra arm? 

Get The MagPi #110 NOW!

MagPi 110 Halloween cover

You can grab the brand-new issue right now from the Raspberry Pi Press store, or via our app on Android or iOS. You can also pick it up from supermarkets and newsagents. There’s also a free PDF you can download.

The post Machine Learning Prosthetic Arm | The MagPi #110 appeared first on Raspberry Pi.

New book: Get Started with MicroPython on Raspberry Pi Pico

Post Syndicated from Phil King original https://www.raspberrypi.org/blog/new-book-get-started-with-micropython-on-raspberry-pi-pico/

So, you’ve got a brand new Raspberry Pi Pico and want to know how to get started with this tiny but powerful microcontroller? We’ve got just the book for you.

Get Started with Raspberry Pi Pico book

Beginner-friendly

In Get Started with MicroPython on Raspberry Pi Pico, you’ll learn how to use the beginner-friendly language MicroPython to write programs and connect hardware to make your Raspberry Pi Pico interact with the world around it. Using these skills, you can create your own electro-mechanical projects, whether for fun or to make your life easier.

Inside the pages of the Raspberry Pi Pico book

After taking you on a guided tour of Pico, the book shows you how to get it up and running with a step-by-step illustrated guide to soldering pin headers to the board and installing the MicroPython firmware via a computer.

Programming basics

Inside the pages of the Raspberry Pi Pico book

Next, we take you through the basics of programming in MicroPython, a Python-based programming language developed specifically for microcontrollers such as Pico. From there, we explore the wonderful world of physical computing and connect a variety of electronic components to Pico using a breadboard. Controlling LEDs and reading input from push buttons, you’ll start by creating a pedestrian crossing simulation, before moving on to projects such as a reaction game, burglar alarm, temperature gauge, and data logger.
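To give a flavour of this kind of physical computing, here’s a minimal button-and-LED loop in MicroPython. The book’s pedestrian crossing project is more elaborate, and the pin choices here (LED on GP15, button to 3V3 on GP14) are assumptions:

```python
# Minimal flavour of a button-and-LED project on Pico: press the button
# to flash the "crossing" light. Pins are assumptions; rewire to taste.
from machine import Pin
import time

led = Pin(15, Pin.OUT)
button = Pin(14, Pin.IN, Pin.PULL_DOWN)

while True:
    if button.value():          # button pressed
        for _ in range(5):      # flash the crossing light
            led.toggle()
            time.sleep(0.5)
        led.off()
```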

Inside the pages of the Raspberry Pi Pico book

Raspberry Pi Pico also supports the I2C and SPI protocols for communicating with devices, which we explore by connecting it up to an LCD display. You can even use MicroPython to take advantage of one of Pico’s most powerful features, Programmable I/O (PIO), which we explore by controlling NeoPixel LED strips.
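As a small taste of what PIO looks like from MicroPython, here’s the classic state-machine blink (not the book’s NeoPixel driver): a tiny program runs on PIO state machine 0, toggling the original Pico’s onboard LED on GP25 with no CPU involvement. At 2 kHz with 32-cycle instructions this flickers fast, around 16 Hz:

```python
# Classic PIO example: blink the onboard LED from a PIO state machine,
# independently of the main CPU.
import rp2
from machine import Pin

@rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
def blink():
    set(pins, 1) [31]   # LED on, then 31 extra delay cycles
    nop()        [31]
    set(pins, 0) [31]   # LED off, then 31 extra delay cycles
    nop()        [31]

# Run the program on state machine 0 at 2 kHz; add more delay
# instructions to slow the blink further.
sm = rp2.StateMachine(0, blink, freq=2000, set_base=Pin(25))
sm.active(1)
```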

Get your copy today!

You can buy Get Started with MicroPython on Raspberry Pi Pico now from the Raspberry Pi Press online store. If you don’t need the lovely new book, with its new-book smell, in your hands in real life, you can download a PDF version for free (or a small voluntary contribution).

STOP PRESS: we’ve spotted an error in the first print run of the book, affecting the code examples in Chapters 4 to 7. We’re sorry! Fortunately it’s easy for readers to correct in their own code; see here for everything you need to know. We’ve already corrected this in the PDF version.

The post New book: Get Started with MicroPython on Raspberry Pi Pico appeared first on Raspberry Pi.