All posts by Ashley Whittaker

13 Raspberry Pis slosh-test space shuttle tanks in zero gravity

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/13-raspberry-pis-slosh-test-space-shuttle-tanks-in-zero-gravity/

High-school student Eleanor Sigrest successfully crowdfunded her way onto a zero-G flight to test her latest Raspberry Pi-powered project. NASA Goddard engineers peer reviewed Eleanor’s experimental design, which detects unwanted movement (or ‘slosh’) in spacecraft fluid tanks.

The Raspberry Pi-packed setup

The apparatus features an accelerometer to precisely determine the moment of zero gravity, along with 13 Raspberry Pis and 12 Raspberry Pi cameras to capture the slosh movement.

What’s wrong with slosh?

The Broadcom Foundation shared a pretty interesting minute-by-minute report on Eleanor’s first parabolic flight and how she got everything working. But, in a nutshell…

The full apparatus onboard the zero gravity flight

You don’t want the fluid in your space shuttle tanks sloshing around too much. It’s a mission-ending problem. Slosh occurs on take-off and also in microgravity during manoeuvres, so Eleanor devised this novel approach to managing it in place of the costly, heavy subsystems currently used on board spacecraft.

Eleanor wanted to prove that the fluid inside tanks treated with superhydrophobic and superhydrophilic coatings settled quicker than in uncoated tanks. And she was right: settling times were reduced by 73% in some cases.

Eleanor at work

A continuation of this experiment is due to go up on Blue Origin’s New Shepard rocket – and yes, a patent is already pending.

Curiosity, courage & compromise

At just 13 years old, Eleanor won the Samueli Prize at the 2016 Broadcom MASTERS for her mastery of STEM principles and team leadership during a rigorous week-long competition. High praise came from Paula Golden, President of Broadcom Foundation, who said: “Eleanor is the epitome of a young woman scientist and engineer. She combines insatiable curiosity with courage: two traits that are essential for a leader in these fields.”

Eleanor aged 13 with her award-winning project ‘Rockets & Nozzles & Thrust… Oh My’

That week-long experience also included a Raspberry Pi Challenge, and Eleanor explained: “During the Raspberry Pi Challenge, I learned that sometimes the simplest solutions are the best. I also learned it’s important to try everyone’s ideas because you never know which one might work the best. Sometimes it’s a compromise of different ideas, or a compromise between complicated and simple. The most important thing is to consider them all.”

Get this girl to Mars already.

The post 13 Raspberry Pis slosh-test space shuttle tanks in zero gravity appeared first on Raspberry Pi.

Raspberry Pi powered e-paper display takes months to show a movie

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-powered-e-paper-display-takes-months-to-show-a-movie/

We loved the filmic flair of Tom Whitwell‘s super slow e-paper display, which takes months to play a film in full.

Living art

His creation plays films at about two minutes of screen time per 24 hours, taking a little under two months for a 110-minute film. Psycho played in a corner of his dining room for two months. The infamous shower scene lasted a day and a half.

Tom enjoys the opportunity for close study of iconic filmmaking, but you might like this project for the living artwork angle. How cool would this be playing your favourite film onto a plain wall somewhere you can see it throughout the day?

The Raspberry Pi wearing its e-Paper HAT

Four simple steps

Luckily, this is a relatively simple project – no hardcore coding, no soldering required – with just four steps to follow if you’d like to recreate it:

  1. Get the Raspberry Pi working in headless mode without a monitor, so you can upload files and run code
  2. Connect to an e-paper display via an e-paper HAT (see above image; Tom is using this one) and install the driver code on the Raspberry Pi
  3. Use Tom’s code to extract frames from a movie file, resize and dither those frames, display them on the screen, and keep track of progress through the film (there’s a rough sketch of this step below)
  4. Find some kind of frame to keep it all together (Tom went with a trusty IKEA number)
Living artwork: the Psycho shower scene playing alongside still artwork in Tom’s home
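
Tom’s own code does the real work, but if you’re curious what step 3 involves, here’s a minimal sketch of the extract-resize-dither loop using OpenCV and Pillow. The panel resolution, file name, and frame position are placeholders for illustration, not Tom’s actual settings.

```python
# Sketch of the extract-resize-dither step. Pillow's convert('1') applies
# Floyd-Steinberg dithering by default, which gives the print-like look.
import cv2
from PIL import Image

PANEL_SIZE = (264, 176)  # assumed e-paper panel resolution

def extract_frame(video_path, frame_no):
    """Grab one frame, resize it to the panel, and dither to 1-bit."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_no)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    # OpenCV gives BGR arrays; convert to RGB for Pillow
    img = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    return img.resize(PANEL_SIZE).convert("1")

frame = extract_frame("psycho.mp4", 1000)  # hypothetical file and position
```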

Affordably arty

The entire build cost £120. Tom chose a 2GB Raspberry Pi 4 and a 64GB NOOBS SD card, which he bought from Pimoroni, one of our approved resellers. NOOBS included almost all the libraries he needed for this project, which made life a lot easier.

His original post is a dream of a comprehensive walkthrough, including all the aforementioned code.

2001: A Space Odyssey would take months to play on Tom’s creation

Head to the comments section with your vote for the creepiest film to watch in ultra slow motion. I came over all peculiar imagining Jaws playing on my living room wall for months. Big bloody mouth opening slooooowly (pales), big bloody teeth clamping down slooooowly (heart palpitations). Yeah, not going to try that. Sorry Tom.

The post Raspberry Pi powered e-paper display takes months to show a movie appeared first on Raspberry Pi.

Raspberry Pi turns retro radio into interactive storyteller

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-turns-retro-radio-into-interactive-storyteller/

8 Bits and a Byte created this voice-controllable, interactive, storytelling device, hidden inside a 1960s radio for extra aesthetic wonderfulness.

A Raspberry Pi 3B works with an AIY HAT, a microphone, and the device’s original speaker to run chatbot and speech-to-text artificial intelligence.

This creature is a Bajazzo TS, made by Telefunken sometime during the 1960s in West Germany, and this detail inspired the espionage-themed story that 8 Bits and a Byte retrofitted it to tell. Users are intelligence agents whose task is to find the evil Dr Donogood.

The device works like one of those ‘choose your own adventure’ books, asking you a series of questions and offering you several options. The story unfolds according to the options you choose, and leads you to a choice of endings.

In with the new (Raspberry Pi tucked in the lower right corner)

What’s the story?

8 Bits and a Byte designed a decision tree to provide a tight story frame, so users can’t go off on question-asking tangents.

When you see the ‘choose your own adventure’ frame set out like this, you can see how easy it is to create something that feels interactive, but really only needs to understand the difference between a few phrases: ‘laser pointer’, ‘lockpick’, ‘drink’, ‘take bribe’, and ‘refuse bribe’.
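
To make the idea concrete, here’s a toy version of such a decision tree in Python. The node names and story text are invented for illustration; 8 Bits and a Byte’s actual tree lives in Dialogflow rather than in code like this.

```python
# Toy decision tree: each node maps the handful of phrases the chatbot
# needs to recognise onto the next scene. Story text is made up here.
story = {
    "start": {
        "prompt": "Agent, the door is locked. Laser pointer or lockpick?",
        "options": {"laser pointer": "guard", "lockpick": "guard"},
    },
    "guard": {
        "prompt": "A guard offers you money. Take bribe or refuse bribe?",
        "options": {"take bribe": "ending_bad", "refuse bribe": "ending_good"},
    },
    "ending_bad": {"prompt": "You took the money. Mission failed.", "options": {}},
    "ending_good": {"prompt": "You found Dr Donogood. Mission complete!", "options": {}},
}

def next_node(current, heard_text):
    """Match the user's transcribed speech against this node's options."""
    for phrase, target in story[current]["options"].items():
        if phrase in heard_text.lower():
            return target
    return current  # didn't understand; stay put and re-prompt

print(story["start"]["prompt"])
current = next_node("start", "I'll use the lockpick")
print(story[current]["prompt"])
```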

How does it interact with the user?

Skip to 03mins 30secs to see the storytelling in action

Google Dialogflow is a free natural language understanding platform that makes it easy to design a conversational user interface, which is long-speak for ‘chatbot’.

There are a few steps between the user talking to the radio, and the radio figuring out how to respond. The speech-to-text and chatbot software need to work in tandem. For this project, the data flow runs like so:

1: The microphone detects that someone is speaking and records the audio.

2-3: Google AI (the Speech-To-Text box) processes the audio and extracts the words the user spoke as text.

4-5: The chatbot (Google Dialogflow) receives this text and matches it with the correct response, which is sent back to the Raspberry Pi.

6-7: Some more artificial intelligence uses this text to generate artificial speech.

8: This audio is played to the user via the speaker.
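
Steps 4 and 5 are the Dialogflow round trip. As a rough sketch, assuming a Google Cloud project with Dialogflow enabled and credentials configured on the Raspberry Pi, sending the transcribed text and getting the matched response back looks something like this:

```python
# Hedged sketch of the Dialogflow round trip (steps 4-5), based on the
# google-cloud-dialogflow v2 Python client. Project/session IDs are yours.
from google.cloud import dialogflow

def ask_chatbot(project_id, session_id, text, language="en"):
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code=language)
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    # The fulfillment text is the reply the radio will speak aloud
    return response.query_result.fulfillment_text
```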

Make sure to check out more of 8 Bits and a Byte’s projects on YouTube. We recommend Mooomba the cow roomba.

The post Raspberry Pi turns retro radio into interactive storyteller appeared first on Raspberry Pi.

Raspberry Pi enables world’s smallest iMac

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-enables-worlds-smallest-imac/

This project goes a step further than most custom-made Raspberry Pi cases: YouTuber Michael Pick hacked a Raspberry Pi 4 and stuffed it inside this Apple lookalike to create the world’s smallest ‘iMac’.

Michael designed and 3D printed this miniature ‘iMac’ with what he calls a “gently modified” Raspberry Pi 4 at the heart. Everything you see is hand-painted and -finished to achieve an authentic, sleek Apple look.

This is the “gentle modification” we just mentioned

Even after all that power tool sparking, this miniature device is capable of playing Minecraft at 1000 frames per second. Michael was set on making the finished project as thin as possible, so he had to slice off a couple of his Raspberry Pi’s USB ports and the Ethernet socket to make everything fit inside the tiny, custom-made case. That leaves you with just Bluetooth and wireless internet connections, but as Michael explains in the build video, “if you’re a Mac user, that’s all you’re ever going to need.”

We love watching 3D printer footage set to relaxed elevator music

This teeny yet impactful project has even been featured on forbes.com, and that’s where we learned how the tightly packed tech manages to work in such a restricted space:

“A wireless dongle is plugged into one of the remaining USB ports to ensure it’s capable of connecting to a wireless keyboard and mouse, and a low-profile ribbon cable is used to connect the display to the Raspberry Pi. Careful crimping of cables and adapters ensures the mini iMac can be powered from a USB-C extension cable that feeds in under the screen, while the device also includes a single USB 2 port.”

Barry Collins | forbes.com

The maker also told forbes.com that this build was inspired by an iRaspbian software article from tech writer Barry Collins. iRaspbian puts a Mac-like interface — including Dock, Launcher and even the default macOS wallpaper — on top of a Linux distro. We guess Michael just wanted the case to match the content, hey?

Check out Michael’s YouTube channel for more inexplicably cool builds, such as a one billion volt Thor hammer.

The post Raspberry Pi enables world’s smallest iMac appeared first on Raspberry Pi.

Global sunrise/sunset Raspberry Pi art installation

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/global-sunrise-sunset-raspberry-pi-art-installation/

24h Sunrise/Sunset is a digital art installation that displays a live sunset and sunrise happening somewhere in the world with the use of CCTV.

Image by fotoswiss.com

Artist Dries Depoorter wanted to prove that “CCTV cameras can show something beautiful”, and turned to Raspberry Pi to power this global project.

Image by fotoswiss.com

Harnessing CCTV

The arresting visuals are beamed to viewers by two Raspberry Pi 3B+ computers and an Arduino Nano Every; the Raspberry Pis stream footage from internet protocol (IP) cameras using the command-line media player OMXPlayer.

Dual Raspberry Pi power

The two Raspberry Pis communicate with each other using the MQTT protocol — a standard messaging protocol for the Internet of Things (IoT) that’s ideal for connecting remote devices with a small code footprint and minimal network bandwidth.

One of the Raspberry Pis checks at which location in the world a sunrise or sunset is happening and streams the closest CCTV camera.
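
As an illustration of that Pi-to-Pi messaging (not Dries’ actual code), here’s how one Raspberry Pi might tell the other which stream to show, assuming the paho-mqtt client library and a broker on the local network; the topic name and addresses are invented:

```python
# Hypothetical sketch of the inter-Pi messaging using paho-mqtt's
# publish helper. Topic and broker address are invented placeholders.
import json
import paho.mqtt.publish as publish

BROKER = "192.168.1.10"  # assumed local MQTT broker

def announce_stream(camera_ip, event):
    """Tell the display Pi which CCTV stream to switch to."""
    payload = json.dumps({"camera": camera_ip, "event": event})
    publish.single("sunset/stream", payload, hostname=BROKER)

announce_stream("203.0.113.5", "sunrise")  # example values
```

On the receiving side, the second Raspberry Pi would subscribe to the same topic and hand the camera address to OMXPlayer.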

The insides of the sleek display screen…

Beam me out, Scotty

The big screens are connected to the Arduino using the I2C protocol, and the Arduino is connected over serial to the second Raspberry Pi. Dries also made a custom printed circuit board (PCB) so the build looks cleaner.

All that hardware is powered by an industrial power supply, just because Dries liked the style of it.

Software

Everything is written in Python 3, and Dries harnessed the Python 3 libraries BeautifulSoup, Sun, Geopy, and Pytz to calculate sunrise and sunset times at specific locations. Google Firebase databases in the cloud help with admin by way of saving timestamps and the IP addresses of the cameras.
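
The gist of the sunrise/sunset check can be sketched with the astral library as a stand-in for the libraries Dries used: given a camera’s coordinates, decide whether a sunrise or sunset is happening there right now.

```python
# Stand-in sketch (astral, not Dries' Sun/Geopy/Pytz combination): is a
# sunrise or sunset within a few minutes at these coordinates?
from datetime import datetime, timedelta, timezone
from astral import LocationInfo
from astral.sun import sun

def is_event_near(lat, lon, window_minutes=10):
    now = datetime.now(timezone.utc)
    observer = LocationInfo(latitude=lat, longitude=lon).observer
    events = sun(observer, date=now.date())  # times returned in UTC
    margin = timedelta(minutes=window_minutes)
    return (abs(now - events["sunrise"]) < margin
            or abs(now - events["sunset"]) < margin)

print(is_event_near(1.29, 103.85))  # example coordinates (Singapore)
```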

Hardware

The artist in front of the two large display screens (image by fotoswiss.com)

And, lastly, Dries requested a shoutout for his favourite local Raspberry Pi shop Gotron in Ghent.

The post Global sunrise/sunset Raspberry Pi art installation appeared first on Raspberry Pi.

What the blink is my IP address?

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/what-the-blink-is-my-ip-address/

Picture the scene: you have a Raspberry Pi configured to run on your network, you power it up headless (without a monitor), and now you need to know which IP address it was assigned.

Matthias came up with this solution, which makes your Raspberry Pi blink its IP address, because he uses a headless Raspberry Pi Zero W for most of his projects and got tired of having to look the address up on his DHCP server or hunt for it by pinging different IP addresses.

How does it work?

A script runs when you start your Raspberry Pi and indicates which IP address is assigned to it by blinking it out on the device’s LED. The script comprises about 100 lines of Python, and you can get it on GitHub.

A screen running Python
Easy peasy GitHub breezy

The power/status LED on the edge of the Raspberry Pi blinks numbers in a Roman numeral-like scheme. You can tell which number it’s blinking based on the length of the blink and the gaps between each blink, rather than, for example, having to count nine blinks for a number nine.

Blinking in Roman numerals

Short, fast blinks represent the numbers one to four, depending on how many short, fast blinks you see. A gap between short, fast blinks means the LED is about to blink the next digit of the IP address, and a longer blink represents the number five. So reading the combination of short and long blinks will give you your device’s IP address.

You can see this in action at this exact point in the video. You’ll see the LED blink fast once, then leave a gap, blink fast once again, then leave a gap, then blink fast twice. That means the device’s IP address ends in 112.
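
If you want a feel for how few lines this takes, here’s a cut-down sketch of the scheme (not Matthias’ actual script, which is on GitHub). It drives the status LED through sysfs, which needs root, assumes the led0 path found on many Raspberry Pi models, and guesses at how zero is encoded.

```python
# Minimal sketch of the Roman-numeral-style blinking. Run as root; you may
# also need to set the LED trigger to 'none' so the kernel releases it.
import time

LED = "/sys/class/leds/led0/brightness"  # status LED on many Pi models

def set_led(on):
    with open(LED, "w") as f:
        f.write("1" if on else "0")

def blink(duration):
    set_led(True)
    time.sleep(duration)
    set_led(False)
    time.sleep(0.2)

def blink_digit(d):
    if d == 0:
        blink(1.2)  # assumption: encode zero as one extra-long blink
    for _ in range(d // 5):
        blink(0.6)   # long blink = five
    for _ in range(d % 5):
        blink(0.15)  # short, fast blink = one
    time.sleep(0.8)  # longer gap marks the next digit

for digit in "112":  # the last octet from the video example
    blink_digit(int(digit))
```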

What are octets?

Luckily, you usually only need to know the last block of the IP address (the final octet), as the earlier octets will almost always be the same for all the other computers on the LAN.

The script blinks out the last octet ten times, to give you plenty of chances to read it. Then it returns the LED to its default functionality.

Which LED on which Raspberry Pi?

On a Raspberry Pi Zero W, the script uses the green status/power LED, and on other Raspberry Pis it uses the green LED next to the red power LED.

The green LED blinking the IP address (the red power LED is slightly hidden by Matthias’ thumb)

Once you get the hang of the Morse code-like blinking style, this is a really nice quick solution to find your device’s IP address and get on with your project.

The post What the blink is my IP address? appeared first on Raspberry Pi.

Turn a watermelon into a RetroPie games console

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/turn-a-watermelon-into-a-retropie-games-console/

OK Cedrick, we don’t need to know why, but we have to know how you turned a watermelon into a games console.

This has got to be a world first. What started out as a regular RetroPie project blew up on Reddit thanks to the unusual choice of casing for the games console: nearly 50,000 redditors upvoted this build within a week of Cedrick sharing it.

See, we’re not kidding

What’s inside?

  • Raspberry Pi 3
  • Jingo Dot power bank (that yellow thing you can see below)
  • Speakers
  • Buttons
  • Small 1.8″ screen
Cedrick’s giggling really makes this video

RetroPie

While this build looks epic, it isn’t too tricky to make. First, Cedrick flashed the RetroPie image onto an SD card, then he wired up a Raspberry Pi’s GPIO pins to the red console buttons, speakers, and the screen.

Cedrick achieved audio output by adding just a few lines of code to the config file, and he downloaded libraries for screen configuration and button input. That’s it! That’s all you need to get a games console up and running.
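
For flavour, the config file changes for a build like this are typically a few lines in /boot/config.txt. This is a hypothetical example, not Cedrick’s actual file: the exact overlay names and parameters depend on which 1.8″ screen and audio route you use.

```
dtparam=audio=on                 # enable onboard audio
dtoverlay=adafruit18,rotate=90   # common overlay for ST7735 1.8" SPI TFTs
dtoverlay=audremap,pins_18_19    # remap PWM audio to GPIO 18/19 (assumed)
```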

Cedrick just hanging on the train with his WaterBoy

Now for the messy bit

Cedrick had to gut an entire watermelon before he could start getting all the hardware in place. He power-drilled holes for the buttons to stick through, and a Stanley knife provided the precision he needed to get the right-sized gap for the screen.

A gutted watermelon with gaps cut to fit games console buttons and a screen

Rather than drill even more holes for the speakers, Cedrick stuck them in place inside the watermelon using toothpicks. He did try hot glue first but… yeah. Turns out fruit guts are impervious to glue.

Moisture was going to be a huge problem, so to protect all the hardware from the watermelon’s sticky insides, Cedrick lined it with clingfilm.

Infinite lives

And here’s how you can help: Cedrick is open to any tips as to how to preserve the perishable element of his project: the watermelon. Resin? Vaseline? Time machine? How can he keep the watermelon fresh?

Share your ideas on Reddit or YouTube, and remember to subscribe to see more of Cedrick’s maverick making in the wild.

The post Turn a watermelon into a RetroPie games console appeared first on Raspberry Pi.

It’s a brand-new NODE Mini Server!

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/its-a-brand-new-node-mini-server/

NODE has long been working to create open-source resources to help more people harness the decentralised internet, and their easily 3D-printed designs are perfect to optimise your Raspberry Pi.

NODE wanted to take advantage of the faster processor and up to 8GB RAM on Raspberry Pi 4 when it came out last year. Now that our tiny computer is more than capable of being used as a general Linux desktop system, the NODE Mini Server version 3 has been born.

As with previous versions of NODE’s Mini Server, one of the main goals for this new iteration was to package Raspberry Pi in a way that makes it a little easier to use as a regular mini server or computer. In other words, it’s put inside a neat little box with all the ports accessible on one side.

Black is incredibly slimming

Slimmer and simpler

The latest design is simplified compared to previous versions. Everything lives in a 92mm × 92mm enclosure that isn’t much thicker than Raspberry Pi itself.

The slimmed-down new case comprises a single 3D-printed piece and a top cover made from a custom-designed printed circuit board (PCB) that has four brass-threaded inserts soldered into the corners, giving you a simple way to screw everything together.

The custom PCB cover

What are the new features?

Another goal for version 3 of NODE’s Mini Server was to include as much modularity as possible. That’s why this new mini server requires no modifications to the Raspberry Pi itself, thanks to a range of custom-designed adapter boards. How to take advantage of all these new features is explained at this point in NODE’s YouTube video.

Ooh, shiny and new and new and shiny

Just like for previous versions, all the files and a list of the components you need to create your own Mini Server are available for free on the NODE website.

Leave comments on NODE’s YouTube video if you’d like to create and sell your own Mini Server kits or pre-made servers. NODE is totally open to showcasing any add-ons or extras you come up with yourself.

Looking ahead, making the Mini Server stackable and improving fan circulation is next on NODE’s agenda.

The post It’s a brand-new NODE Mini Server! appeared first on Raspberry Pi.

Give your voice assistant a retro Raspberry Pi makeover

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/give-your-voice-assistant-a-retro-raspberry-pi-makeover/

Do you feel weird asking the weather or seeking advice from a faceless device? Would you feel better about talking to a classic 1978 2-XL educational robot from Mego Corporation? Matt over at element14 Community, where tons of interesting stuff happens, has got your back.

Watch Matt explain how the 2-XL toy robot worked before he started tinkering with it. This robot works with Google Assistant on a Raspberry Pi, and answers to a custom wake word.

Kit list

Our recent blog about repurposing a Furby as a voice assistant device would have excited Noughties kids, but this one is mostly for our beautiful 1970s- and 1980s-born fanbase.

Time travel

2-XL, Wikipedia tells us, is considered the first “smart toy”, marketed way back in 1978, and exhibiting “rudimentary intelligence, memory, gameplay, and responsiveness”. 2-XL had a personality that kept kids’ attention, telling jokes and offering verbal support as they learned.

Teardown

Delve under the robot’s armour to see how the toy was built, understand the basic working mechanism, and watch Matt attempt to diagnose why his 2-XL is not working.

Setting up Google Assistant

The Matrix Creator daughter board mentioned in the kit list is an ideal platform for developing your own AI assistant. It’s the daughter board’s 8-microphone array that makes it so brilliant for this task. Learn how to set up Google Assistant on the Matrix board in this video.

What if you don’t want to wake your retrofit voice assistant in the same way as all the other less dedicated users, the ones who didn’t spend hours of love and care refurbishing an old device? Instead of having your homemade voice assistant answer to “OK Google” or “Alexa”, you can train it to recognise a phrase of your choice. In this tutorial, Matt shows you how to set up a custom wake word with your voice assistant, using word detection software called Snowboy.
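
As a flavour of what that looks like, here’s a minimal Snowboy hotword loop in Python, based on Snowboy’s demo code. It assumes you’ve built the snowboydecoder module from the Snowboy repo and trained a personal model; the .pmdl path is a placeholder.

```python
# Minimal Snowboy hotword sketch: block and listen until the custom wake
# word is heard, then run the callback. Model path is a placeholder.
import snowboydecoder

def on_wake_word():
    print("Wake word heard! Hand off to the assistant here.")

detector = snowboydecoder.HotwordDetector("resources/my_wake_word.pmdl",
                                          sensitivity=0.5)
detector.start(detected_callback=on_wake_word)  # blocks, listening
detector.terminate()
```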

Keep an eye on element14 on YouTube for the next instalment of this excellent retrofit project.

The post Give your voice assistant a retro Raspberry Pi makeover appeared first on Raspberry Pi.

Nandu’s lockdown Raspberry Pi robot project

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/nandus-lockdown-raspberry-pi-robot-project/

Nandu Vadakkath was inspired by a line-following robot built (literally) entirely from salvage materials that could wait patiently and purchase beer for its maker in Tamil Nadu, India. So he set about making his own, but with the goal of making it capable of slightly more sophisticated tasks.

“Robot, can you play a song?”

Hardware

Robot comes when called, and recognises you as its special human

Software

Nandu had ambitious plans for his robot: navigation, speech and listening, recognition, and much more were on the list of things he wanted it to do. And in order to make it do everything he wanted, he incorporated a lot of software, including:

Robot shares Nandu’s astrological chart
  • Python 3
  • virtualenv, a tool for creating isolated virtual Python environments
  • the OpenCV open source computer vision library
  • the spaCy open source natural language processing library
  • the TensorFlow open source machine learning platform
  • Haar cascade algorithms for object detection
  • A ResNet neural network with the COCO dataset for object detection
  • DeepSpeech, an open source speech-to-text engine
  • eSpeak NG, an open source speech synthesiser
  • The MySQL database service

So how did Nandu go about trying to make the robot do some of the things on his wishlist?

Context and intents engine

The engine uses spaCy to analyse sentences, classify all the elements it identifies, and store all this information in a MySQL database. When the robot encounters a sentence with a series of possible corresponding actions, it weighs them to see what the most likely context is, based on sentences it has previously encountered.
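
As a rough illustration of the spaCy side of that engine (Nandu’s real code and database schema aren’t shown here), pulling verbs, objects, and named entities out of a sentence looks something like this, assuming the small English model is installed:

```python
# Illustrative sketch of the sentence-analysis step using spaCy's small
# English model (python -m spacy download en_core_web_sm).
import spacy

nlp = spacy.load("en_core_web_sm")

def classify(sentence):
    doc = nlp(sentence)
    return {
        "verbs": [t.lemma_ for t in doc if t.pos_ == "VERB"],
        "objects": [t.text for t in doc if t.dep_ in ("dobj", "pobj")],
        "entities": [(ent.text, ent.label_) for ent in doc.ents],
    }

# These fields could then be written to a MySQL table for later weighing
print(classify("Robot, can you play a song?"))
```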

Getting to know you

The robot has been trained to follow Nandu around but it can get to know other people too. When it meets a new person, it takes a series of photos and processes them in the background, so it learns to remember them.

There she blows! (Nandu’s homemade robot)

Speech

Nandu didn’t like the thought of a basic robotic voice, so he searched high and low until he came across the MBROLA UK English voice. Have a listen in the videos above!

Object and people detection

The robot has an excellent group photo function: it looks for a person, calculates the distance between the top of their head and the top of the frame, then tilts the camera until this distance is about 60 pixels. This is a lot more effort than some human photographers put into getting all of everyone’s heads into the frame.
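
Here’s a sketch of that framing logic using OpenCV’s bundled Haar face detector. The servo-control functions are placeholders for whatever actually tilts Nandu’s camera; the 60-pixel target comes straight from the write-up.

```python
# Sketch of the group-photo framing idea: find the highest face, then
# nudge the tilt servo until about 60px of headroom remains.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def headroom(frame):
    """Pixels between the top of the frame and the highest detected face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        return None
    return min(y for (x, y, w, h) in faces)

def adjust_tilt(frame, tilt_up, tilt_down, target=60, tolerance=10):
    gap = headroom(frame)
    if gap is None:
        return
    if gap > target + tolerance:
        tilt_down()   # too much headroom: tilt the camera down
    elif gap < target - tolerance:
        tilt_up()     # heads too close to the top: tilt the camera up
```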

Nandu has created a YouTube channel for his robot companion, so be sure to keep up with its progress!

The post Nandu’s lockdown Raspberry Pi robot project appeared first on Raspberry Pi.

Raspberry Pi retro player

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-retro-player/

We found this project at TeCoEd and we loved the combination of an OLED display housed inside a retro Argus slide viewer. It uses a Raspberry Pi 3 with Python and OpenCV to pull out single frames from a video and write them to the display in real time.

TeCoEd names this creation the Raspberry Pi Retro Player, or RPRP, or – rather neatly – RP squared. The Argus viewer, he tells us, was a charity-shop find that cost just 50p. It sat collecting dust for a few years until he came across an OLED setup guide on hackster.io, which inspired the birth of the RPRP.

Timelapse of the build and walk-through of the code

At the heart of the project is a Raspberry Pi 3 which is running a Python program that uses the OpenCV computer vision library. The code takes a video clip and breaks it down into individual frames. Then it resizes each frame and converts it to black and white, before writing it to the OLED display. The viewer sees the video play in pleasingly retro monochrome on the slide viewer.

Tiny but cute, like us!

TeCoEd ran into some frustrating problems with the OLED display, which, he discovered, uses the SH1106 driver rather than the standard SSD1306 driver that the Adafruit CircuitPython library expects. Many OLED displays use the SSD1306 driver, but it turns out that cheaper displays like the one in this project use the SH1106. He has made a video to spare other makers this particular throw-it-all-in-the-bin moment.

Tutorial for using the SH1106 driver for cheap OLED displays
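
For anyone wiring up a similar panel in Python, the luma.oled library supports the SH1106 directly. This is a generic sketch, not TeCoEd’s exact setup (his own code and wiring are linked below); it pushes a single dithered frame to the screen over I2C.

```python
# Generic SH1106-over-I2C sketch using luma.oled; 0x3C is a common I2C
# address for these boards, but check yours with i2cdetect.
from luma.core.interface.serial import i2c
from luma.oled.device import sh1106
from PIL import Image

serial = i2c(port=1, address=0x3C)
device = sh1106(serial)

# Load a frame, dither to 1-bit, and fit it to the panel before display
frame = Image.open("frame.png").convert("1").resize((device.width,
                                                     device.height))
device.display(frame)
```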

If you’d like to try this build for yourself, here’s all the code and setup advice on GitHub.

Wiring diagram

TeCoEd is, as ever, our favourite kind of maker – the sharing kind! He has collated everything you’ll need to get to grips with OpenCV, connecting the SH1106 OLED screen over I2C, and more. He’s even told us where we can buy the OLED board.

The post Raspberry Pi retro player appeared first on Raspberry Pi.

Raspberry Pi + Furby = ‘Furlexa’ voice assistant

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-furby-furlexa-voice-assistant/

How can you turn a redundant, furry, slightly annoying tech pet into a useful home assistant? Zach took to howchoo to show you how to combine a Raspberry Pi Zero W with Amazon’s Alexa Voice Service software and a Furby to create Furlexa.

Furby was pretty impressive technology, considering that it’s over 20 years old. It could learn to speak English, sort of, by listening to humans. It communicated with other Furbies via infrared sensor. It even slept when its light sensor registered that it was dark.

Furby innards, exploded

Zach explains why Furby is so easy to hack:

Furby is comprised of a few primary components — a microprocessor, infrared and light sensors, microphone, speaker, and — most impressively — a single motor that uses an elaborate system of gears and cams to drive Furby’s ears, eyes, mouth and rocker. A cam position sensor (switch) tells the microprocessor what position the cam system is in. By driving the motor at varying speeds and directions and by tracking the cam position, the microprocessor can tell Furby to dance, sing, sleep, or whatever.

The original CPU and related circuitry were replaced with a Raspberry Pi Zero W

Zach continues: “Though the microprocessor isn’t worth messing around with (it’s buried inside a blob of resin to protect the IP), it would be easy to install a small Raspberry Pi computer inside of Furby, use it to run Alexa, and then track Alexa’s output to make Furby move.”

What you’ll need:

Harrowing

Running Alexa

The Raspberry Pi is running Alexa Voice Service (AVS) to provide full Amazon Echo functionality. Amazon AVS doesn’t officially support the tiny Raspberry Pi Zero, so lots of hacking was required. Point 10 on Zach’s original project walkthrough explains how to get AVS working with the Pimoroni Speaker pHAT.

Animating Furby

A small motor driver board is connected to the Raspberry Pi’s GPIO pins, and controls Furby’s original DC motor and gearbox: when Alexa speaks, so does Furby. The Raspberry Pi Zero can’t supply enough juice to power the motor, so instead, it’s powered by Furby’s original battery pack.
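
The idea is simple enough to sketch with gpiozero’s Motor class: pulse the motor whenever audio is playing. The GPIO pin numbers are placeholders rather than Zach’s actual wiring, and is_speaking stands in for whatever audio-output check you use (see the software list below).

```python
# Hedged sketch: twitch Furby's motor in short pulses while Alexa talks.
from gpiozero import Motor
import time

furby_motor = Motor(forward=17, backward=27)  # assumed driver board pins

def animate_while_speaking(is_speaking):
    """Pulse the motor for as long as the audio check returns True."""
    while is_speaking():
        furby_motor.forward(speed=0.6)  # partial speed for gentler movement
        time.sleep(0.15)
        furby_motor.stop()
        time.sleep(0.05)
```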

Software

There are three key pieces of software that make Furlexa possible:

  1. Amazon Alexa on Raspberry Pi – there are tonnes of tutorials showing you how to get Amazon Alexa up and running on your Raspberry Pi. Try this one on instructables.
  2. A script to control Furby’s motor – howchooer Tyler wrote the Python script that Zach is using to drive the motor, and you can copy and paste it from Zach’s howchoo walkthrough.
  3. A script that detects when Alexa is speaking and calls the motor program – Furby detects when Alexa is speaking by monitoring the contents of a file whose contents change when audio is being output (there’s a rough sketch of the idea below). Zach has written a separate guide for driving a DC motor based on Linux sound output.
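
The “is Alexa speaking?” check in point 3 can be sketched like this: on Linux, ALSA exposes a status file per playback device that reads RUNNING while audio is playing. The card and device numbers below are placeholders that vary between setups, and this shows the general idea rather than Zach’s exact script.

```python
# Hedged sketch of detecting Linux audio output via ALSA's proc interface.
STATUS_FILE = "/proc/asound/card0/pcm0p/sub0/status"  # path varies per setup

def is_speaking():
    """True while the playback device is actively outputting audio."""
    try:
        with open(STATUS_FILE) as f:
            return "RUNNING" in f.read()
    except FileNotFoundError:
        return False
```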
Teeny tiny living space

The real challenge was cramming the Raspberry Pi Zero plus the Speaker pHAT, the motor controller board, and all the wiring back inside Furby, where space is at a premium. Soldering wires directly to the GPIO saved a bit of room, and foam tape holds all of the above together nice and tightly. It’s a squeeze!

Zach is a maker extraordinaire, so check out his projects page on howchoo.

The post Raspberry Pi + Furby = ‘Furlexa’ voice assistant appeared first on Raspberry Pi.

Self-driving trash can controlled by Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/self-driving-trash-can-controlled-by-raspberry-pi/

YouTuber extraordinaire Ahad Cove HATES taking out the rubbish, so he decided to hack a rubbish bin/trash can – let’s go with trash can from now on – to take itself out to be picked up.

Sounds simple enough? The catch is that Ahad wanted to create an AI that can see when the garbage truck is approaching his house and trigger the garage door to open, then tell the trash can to drive itself out and stop in the right place. This way, Ahad doesn’t need to wake up early enough to spot the truck and manually trigger the trash can to drive itself.

Hardware

The trash can’s original wheels weren’t enough on their own, so Ahad brought in an electronic scooter wheel with a hub motor, powered by a 36V lithium ion battery, to guide and pull them. Check out this part of the video to hear how tricky it was for Ahad to install a braking system using a very strong servo motor.

The new wheel sits at the front of the trash can and drags the original wheels at the back along with it

An affordable driver board controls the speed, power, and braking system of the garbage can.

The driver board

Tying everything together is a Raspberry Pi 3B+. Ahad uses one of the GPIO pins on the Raspberry Pi to send the signal to the driver board. He started off the project with a Raspberry Pi Zero W, but found that it was too fiddly to get it to handle the crazy braking power needed to stop the garbage can on his sloped driveway.

The Raspberry Pi Zero W, which ended up getting replaced in an upgrade

Everything is kept together and dry with a plastic snap-close food container Ahad lifted from his wife’s kitchen collection. Ssh, don’t tell.

Software

Ahad uses an object detection machine learning model to spot when the garbage truck passes his house. He handles this part of the project with an Nvidia Jetson Xavier NX board, connected to a webcam positioned to look out of the window watching for garbage trucks.

Object detected!
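
Ahad’s model runs on the Jetson and isn’t published in this post, but the shape of the detection loop is easy to sketch with OpenCV’s dnn module and a COCO-trained SSD. The model file names are placeholders you’d download yourself, and class ID 8 is “truck” in the COCO label map these models use.

```python
# Illustrative stand-in for the detection step (not Ahad's actual code):
# watch the webcam until a COCO "truck" detection crosses the threshold.
import cv2

TRUCK_CLASS_ID = 8  # "truck" in the COCO label map for this model family
net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb",    # placeholder
                                    "ssd_mobilenet_v2_coco.pbtxt")  # placeholder
cap = cv2.VideoCapture(0)  # webcam pointed out of the window

truck_seen = False
while not truck_seen:
    ok, frame = cap.read()
    if not ok:
        break
    blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True)
    net.setInput(blob)
    for det in net.forward()[0, 0]:
        if int(det[1]) == TRUCK_CLASS_ID and det[2] > 0.6:
            truck_seen = True  # here you'd open the garage and launch the can
cap.release()
```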

Opening the garage door

Ahad’s garage door has a wireless internet connection, so he connected the door to an app that communicates with his home assistant device. The app opens the garage door when the webcam and object detection software see the garbage truck turning into his street. All this works with the kit inside the trash can to get it to drive itself out to the end of Ahad’s driveway.

There she goes! (With her homemade paparazzi setup behind her)

Check out the end of Ahad’s YouTube video to see how human error managed to put a comical damper on the maiden voyage of this epic build.

The post Self-driving trash can controlled by Raspberry Pi appeared first on Raspberry Pi.

Boston Dynamics’ Handle robot recreated with Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/boston-dynamics-handle-robot-recreated-with-raspberry-pi/

You in the community seemed so impressed with this recent Boston Dynamics–inspired build that we decided to feature another. This time, maker Harry was inspired by Boston Dynamics’ research robot Handle, which stands 6.5 ft tall, travels at 9 mph, and jumps 4 feet vertically. Here’s how Harry made his miniature version, MABEL (Multi Axis Balancer Electronically Levelled).

MABEL has individually articulated legs to enhance off-road stability, prevent it from tipping, and even make it jump (if you use some really fast servos). Harry is certain that anyone with a 3D printer and a “few bits” can build one.

MABEL builds on the open-source YABR project for its PID controller, adding servos and a Raspberry Pi to interface with them and control everything.

Installing MABEL’s Raspberry Pi brain and wiring the servos

Thanks to a program based on the open-source YABR firmware, an Arduino handles all of the PID calculations using data from an MPU-6050 accelerometer/gyro. Raspberry Pi, using Python code, manages Bluetooth and servo control, running an inverse kinematics algorithm to translate the robot legs perfectly in two axes.

Kit list

If you want to attempt this project yourself, the files for all the hard 3D-printed bits are on Thingiverse, and all the soft insides are on GitHub.

IKSolve is the class that handles the inverse kinematics functionality for MABEL (IKSolve.py) and allows the legs to be translated using (x, y) coordinates. It’s really simple to use: all you need to specify are the home values of each servo (the angles that, when passed to your servos, make the legs point straight downwards at 90 degrees).
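
For a feel of what an inverse kinematics solver like this does, here’s a generic two-joint leg solver using the law of cosines. The link lengths and angle conventions are invented, so this illustrates the maths rather than reproducing IKSolve.py.

```python
# Generic 2-link leg IK sketch: foot target (x, y) in mm -> joint angles.
import math

def leg_ik(x, y, l1=50.0, l2=50.0):
    """Return (hip, knee) angles in degrees for a foot target (x, y)."""
    d = math.hypot(x, y)
    # Clamp to the reachable annulus so acos stays in its domain
    d = max(min(d, l1 + l2 - 1e-6), abs(l1 - l2) + 1e-6)
    knee = math.acos((l1**2 + l2**2 - d**2) / (2 * l1 * l2))
    hip = math.atan2(y, x) - math.acos((l1**2 + d**2 - l2**2) / (2 * l1 * d))
    return math.degrees(hip), math.degrees(knee)

print(leg_ik(20.0, -80.0))  # e.g. foot 20mm forward, 80mm below the hip
```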

When MABEL was just a twinkle in Harry’s eye

MABEL is designed to work by listening to commands on the Arduino (PID controller) end that are sent to it by Raspberry Pi over serial using pySerial. Joystick data is sent to Raspberry Pi using the Input Python library. Harry first tried to get the joystick data from an old PlayStation 3 controller, but went with The Pi Hut’s Raspberry Pi Compatible Wireless Gamepad in the end for ease.

Keep up with Harry’s blog or give Raspibotics a follow on Twitter, as part 3 of his build write-up should be dropping imminently, featuring updates that will hopefully get MABEL jumping!

The post Boston Dynamics’ Handle robot recreated with Raspberry Pi appeared first on Raspberry Pi.

Raspberry Pi listening posts ‘hear’ the Borneo rainforest

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-listening-posts-hear-the-borneo-rainforest/

These award-winning, solar-powered audio recorders, built on Raspberry Pi, have been installed in the Borneo rainforest so researchers can listen to the local ecosystem 24/7. The health of a forest ecosystem can often be gauged by how much noise it creates, as this signals how many species are around.

And you can listen to the rainforest too! The SAFE Acoustics website, funded by the World Wide Fund for Nature (WWF), streams audio from recorders placed around a region of the Bornean rainforest in Southeast Asia. Visitors can listen to live audio or skip back through the day’s recording, for example to listen to the dawn chorus.

Listen in on the Imperial College podcast

What’s inside?

We borrowed this image of the flux tower from Sarab Sethi’s site

The device records data in the field and uploads it to a central server continuously and robustly over long time periods. And it was built for around $305.

Here’s all the code for the platform, on GitHub.

The 12V-to-5V micro USB converter feeds the power socket of the Anker USB hub, which is connected to the Raspberry Pi.

The Imperial College London team behind the project has provided really good step-by-step photo instructions for anyone interested in the fine details.

Here’s the full set up in the field. The Raspberry Pi-powered brains of the kit are safely inside the green box

The recorders have been installed by Imperial College London researchers as part of the SAFE Project – one of the largest ecological experiments in the world.

Screenshot of the SAFE Project website

Dr Sarab Sethi designed the audio recorders with Dr Lorenzo Picinali. They wanted to quantify the changes in rainforest soundscape as land use changes, for example when forests are logged. Sarab is currently working on algorithms to analyse the gathered data with Dr Nick Jones from the Department of Mathematics.

The lovely cross-disciplinary research team based at Imperial College London

Let the creators of the project tell you more on the Imperial College London website.

The post Raspberry Pi listening posts ‘hear’ the Borneo rainforest appeared first on Raspberry Pi.