Tag Archives: Your Projects

Raspberry Pi LEGO sorter

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-lego-sorter/

Raspberry Pi is at the heart of this AI–powered, automated sorting machine that is capable of recognising and sorting any LEGO brick.

And its maker Daniel West believes it to be the first of its kind in the world!

Best ever

This mega-machine was two years in the making and is a LEGO creation itself, built from over 10,000 LEGO bricks.

A beast of 10,000 bricks

It can sort any LEGO brick you place in its input bucket into one of 18 output buckets, at the rate of one brick every two seconds.

While Daniel was inspired by previous LEGO sorters, his creation is a huge step up from them: it can recognise absolutely every LEGO brick ever created, even bricks it has never seen before. Hence the ‘universal’ in the name ‘universal LEGO sorting machine’.

Hardware

There we are, tucked away, just doing our job

Software

The artificial intelligence algorithm behind the LEGO sorting is a convolutional neural network, the go-to for image classification.

What makes Daniel’s project a ‘world first’ is that he trained his classifier using 3D model images of LEGO bricks, which is how the machine can classify absolutely any LEGO brick it’s faced with, even if it has never seen it in real life before.

We LOVE a thorough project video, and we love TWO of them even more

Daniel has made a whole extra video (above) explaining how the AI in this project works. He shouts out all the open source software he used to run the Raspberry Pi Camera Module, access 3D training images, and more, at this point in the video.

LEGO brick separation

The vibration plate in action, feeding single parts into the scanner

Daniel needed the input bucket to carefully pick out a single LEGO brick from the mass he chucks in at once.

This is achieved with a primary and secondary belt slowly pushing parts onto a vibration plate. The vibration plate uses a super fast LEGO motor to shake the bricks around so they aren’t sitting on top of each other when they reach the scanner.

Scanning and sorting

A side view of the LEGO sorting machine showing a large white chute built from LEGO bricks
The underside of the beast

A Raspberry Pi Camera Module captures video of each brick, which Raspberry Pi 3 Model B+ then processes and wirelessly sends to a more powerful computer able to run the neural network that classifies the parts.

The classification decision is then sent back to the sorting machine so it can spit the brick, using a series of servo-controlled gates, into the right output bucket.
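To picture how that capture, classify, sort loop hangs together, here’s a rough Python sketch (not Daniel’s code). It assumes a hypothetical classification endpoint on the desktop machine, and the URL, GPIO pin, and angle mapping are placeholders for illustration only.

# Illustrative sketch of the capture -> classify -> sort loop (not Daniel's software).
# CLASSIFIER_URL, the GPIO pin and the angle mapping are all assumptions.
import io
import time
import requests
from picamera import PiCamera
from gpiozero import AngularServo

CLASSIFIER_URL = "http://192.168.1.50:8000/classify"   # assumed endpoint on the desktop machine
gate_servo = AngularServo(17)                          # assumed pin driving one sorting gate

camera = PiCamera(resolution=(640, 480))
time.sleep(2)                                          # let the camera warm up

while True:
    stream = io.BytesIO()
    camera.capture(stream, format="jpeg", use_video_port=True)
    stream.seek(0)
    # Ship the frame to the more powerful computer running the neural network
    response = requests.post(CLASSIFIER_URL, files={"frame": stream})
    bucket = response.json()["bucket"]                 # assumed reply: an output bucket number
    gate_servo.angle = -90 + bucket * 10               # purely illustrative bucket-to-angle mapping
    time.sleep(2)                                      # roughly one brick every two seconds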

Extra-credit homework

A front view of the LEGO sorter with the sorting boxes visible underneath
In all its bricky beauty, with the 18 output buckets visible at the bottom

Daniel is such a boss maker that he wrote not one, but two further reading articles for those of you who want to deep-dive into this mega LEGO creation:

The post Raspberry Pi LEGO sorter appeared first on Raspberry Pi.

Deter burglars with a Raspberry Pi chatbot

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/deter-burglars-with-a-raspberry-pi-chatbot/

How to improve upon the standard burglar-deterring method of leaving lights switched on? Dennis Mellican turned to Raspberry Pi for a much more effective solution. It actually proved too effective when a neighbour stopped by, but more on that in a bit.

Here you can see Dennis’s system in action scaring off a trespasser:

Good job, Raspberry Pi chatbots!

The burglar deterrent started out as Dennis’s regular home automation system. Not content with the current software offerings, and having worked in DevOps, Dennis decided to create his own solution. Enter Raspberry Pi (well, several of them).

Chatterboxes

Dennis has multiple Raspberry Pi–powered devices dotted around his home, doing things such as turning on lights, powering up a garden sprinkler, and playing fake dog barks on wireless speakers. All these burglar deterrents work together and are run by a chatbot.

A simulation of the chatbots responding to Dennis’s commands

Each Raspberry Pi controls a single automated item in Dennis’s home. All the Raspberry Pis communicate with each other via Slack. Dennis issues commands if he, for example, wants lights to turn on while he is away, but the Raspberry Pis can also talk to each other when a trigger event occurs, such as when a motion sensor is tripped.
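Dennis’s own code is linked a little further down, but to give a flavour of how one node might announce a trigger event over Slack, here’s a minimal sketch using the slack_sdk and gpiozero libraries. The bot token, channel name, and GPIO pin are placeholders rather than details from Dennis’s setup.

# Minimal sketch: one node posts to Slack when its motion sensor trips (not Dennis's code).
from signal import pause
from gpiozero import MotionSensor
from slack_sdk import WebClient

slack = WebClient(token="xoxb-your-bot-token")   # assumed bot token
pir = MotionSensor(4)                            # assumed PIR sensor on GPIO 4

def report_motion():
    # Other Raspberry Pis watching this channel can react: lights, sprinkler, dog barks
    slack.chat_postMessage(channel="#home-security", text="Motion detected by the front-door sensor")

pir.when_motion = report_motion
pause()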

Smart sound

speaker, chromecast device, cctv camera and the Raspberry Pi connected for the anti burglary chatbot
Speaker, Google Chromecast, CCTV camera and Raspberry Pi

Google Chromecast enables ‘dumb’ speakers to be smart. Dennis has such speakers set up inside, close to windows at the front and back of the house, and they play an .mp3 file of a fake dog bark when commanded.
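If you fancy trying the dog-bark trick yourself, something along these lines would do it with the pychromecast library. The speaker name and the URL of the bark file are assumptions, not details from Dennis’s build.

# Sketch: play a dog-bark .mp3 on a Chromecast-connected speaker (illustrative only).
import pychromecast

chromecasts, browser = pychromecast.get_listed_chromecasts(friendly_names=["Front Window Speaker"])
cast = chromecasts[0]
cast.wait()                                           # block until the device is ready

media = cast.media_controller
media.play_media("http://192.168.1.10/sounds/dog-bark.mp3", "audio/mp3")
media.block_until_active()
browser.stop_discovery()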

The security cameras Dennis uses in his home setup are a wireless CCTV variety, and the lights are a mix of TP-Link and Lifx smart bulbs.

Here’s all the Python code running Dennis’s entire security system.

Too effective?

Dennis’s smart system has backfired on him a few times. Once a neighbour visited while he was out and thought Dennis was rudely not answering the door, because she saw the lights go on inside, making it appear like he was home. Awkward.

The fake dog barking has also startled the postman and a few joggers — Dennis says it adds to the realism.

You’re cute, but you wreck stuff, so get out

The troupe of Raspberry Pis has also scared away an Australian possum (video above). These critters are notorious for making nests in roof cavities, so Dennis dodged another problematic home invasion there.

Future upgrades

Dennis is a maker after our own hearts when explaining where he’d like to go next with his anti-burglary build:

“I feel like Kevin McCallister from Home Alone, with these home security ‘traps’. I’m still waiting to catch the Wet Bandits for the sequel to this story. So far only stray cats have been caught by the sprinkler. Perhaps the next adventure of the chat bot is to order pizza and have Gangster ‘Johnny’ complete the transaction when the pizza delivery triggers the sensors.”

Go for it, Dennis!

The post Deter burglars with a Raspberry Pi chatbot appeared first on Raspberry Pi.

RetroPie booze barrel

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/retropie-booze-barrel/

What do we want? Retro gaming, adult beverages, and our favourite Spotify playlist. When do we want them? All at the same time.

Luckily, u/breadtangle took to reddit to answer our rum-soaked prayers with this beautifully crafted beer barrel-cum-arcade machine-cum-drinks cabinet.

A beer barrel with drinks inside two opening doors cut into the front of the barrel and a retro arcade console serving as the lid of the barrel with joystick and buttons on a ledge in front
We approve of this drink selection

The addition of a sneaky hiding spot for your favourite tipple, plus a musical surprise, set this build apart from the popular barrel arcade projects we’ve seen before, like this one featured a few years back on the blog.

Retro gaming

A Raspberry Pi 3 Model B+ runs RetroPie, offering all sorts of classic games to entertain you while you sample from the grownup goodies hidden away in the drinks cabinet.

The maker’s top choice is Tetris Attack for the SNES.

A beer barrel with drinks inside two opening doors cut into the front of the barrel and a retro arcade console serving as the lid of the barrel with joystick and buttons on a ledge in front
Such a beautiful finish

Background music

What more could you want now you’ve got retro games and an elegantly hidden drinks cabinet at your fingertips? u/breadtangle’s creation has another trick hidden inside its smooth wooden curves.

The Raspberry Pi computer used in this build also runs Raspotify, a Spotify Connect client for Raspberry Pi that allows you to stream your favourite tunes and playlists from your phone while you game.

You can set Raspotify to play via Bluetooth speakers, but if you’re using regular speakers and are after a quick install, whack this command in your Terminal:

curl -sL https://dtcooper.github.io/raspotify/install.sh | sh
Booze barrel joystick and buttons panel during the making process
Behind the scenes

u/breadtangle neatly tucked a pair of Logitech z506 speakers on the sides of the barrel, where they could be protected by the overhang of the glass screen cover.

Hardware

The build’s joysticks and buttons came from Amazon, and they’re set into an off-cut piece of kitchen countertop. The glass screen protector is another Amazon find and sits on a rubber car-door edge protector.

The screen itself is lovingly tilted towards the controls, to keep players’ necks comfortable, and u/breadtangle finished off the build’s look with a barstool to sit on while gaming.

We love it, but we have one very important question left…

Can we come round and play?

The post RetroPie booze barrel appeared first on Raspberry Pi.

Get VMWare on Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/get-vmware-on-raspberry-pi/

Hacking apart a sweet, innocent Raspberry Pi – who would do such a thing? Network Chuck, that’s who. But he has a very cool reason for it, so we’ll let him off the hook.

Subscribe to Network Chuck on YouTube

He’s figured out how to install VMware ESXi on Raspberry Pi, and he’s sharing the step-by-step process with you because he loves you. And us. We think. We hope.

Get cutting

In a nutshell, Chuck hacks apart a Raspberry Pi, turning it into three separate computers, each running different software at the same time. He’s a wizard.

Our poor sweet baby 😮

VMware is cool because it’s virtual machine software big companies use on huge servers, but you can deploy it on one of our tiny devices and learn how to use it in the comfort of your own home if you follow Chuck’s instructions.

Raspberry Pi cut into three pieces with labels showing how powerful each bit is and what it's capable of
Useful labels explaining which bit of Raspberry Pi is capable of what

What do you need?

Make sure you’re up to date

So easy, it only takes 40 seconds to explain

Firstly, you need to make sure you’re running the latest version of Raspberry Pi OS. Chuck uses Raspberry Pi Imager to do this, and the video above shows you how to do the same.

Format your SD card

Network Chuck removing SD card from Raspberry Pi 4
It’s teeny, but powerful

Then you’ll need to format your SD card ready for VMware ESXi. This can be done with Raspberry Pi Imager too. You’ll need to download these two things:

Chuck is the kind of good egg who walks you through how to do this on screen at this point in the project video.

VMware installation

Then you’ll need to create the VMware installer to install the actual software. It’s at this point your USB flash drive takes centre stage. Here’s everything you’ll need:

And this is the point in the video at which Chuck walks you through the process.

Once that’s all done, stick your USB flash drive into your Raspberry Pi and get going. You need to be quick off the mark for this bit – there’s some urgent Escape key pressing required, but don’t worry, Chuck walks you through everything.

Create a VM and expand your storage

Once you’ve followed all those steps, you will be up, running, and ready to go. The installation process only takes up the first 15 minutes of Chuck’s project video, and he spends the rest of his time walking you through creating your first VM and adding more storage.

Top job, Chuck.

Keep up with Chuck

Network Chuck holding a Raspberry Pi 4 next to his broadcasting microphone
Fun fact: Raspberry Pi 4 is the same length as Network Chuck’s beard

Network Chuck live-streams every Monday on his YouTube channel, and you can follow him on Twitter too.

There’s also the brilliant networkchuck.com.

The post Get VMWare on Raspberry Pi appeared first on Raspberry Pi.

These Furby-‘controlled’ Raspberry Pi-powered eyes follow you

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/these-furby-controlled-raspberry-pi-powered-eyes-follow-you/

Sam Battle aka LOOK MUM NO COMPUTER couldn’t resist splashing out on a clear Macintosh case for a new project in his ‘Cosmo’ series of builds, which inject new life into retro hardware.

furby facial recognition robot in a clear case in front of a dark background
AAGGGGHHHHHHH!

This time around, a Raspberry Pi, running facial recognition software, and one of our Camera Modules enable Furby-style eyes to track movement, detect faces, and follow you around the room.
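Sam’s exact code isn’t shown on screen, but face tracking along these lines can be done with OpenCV’s bundled Haar cascade detector. Treat this as an illustrative sketch rather than the software running inside the Macintosh case.

# Illustrative face-detection loop; the offset from frame centre could drive the eye servos.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
capture = cv2.VideoCapture(0)          # Raspberry Pi Camera Module exposed via V4L2

while True:
    ok, frame = capture.read()
    if not ok:
        break
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("eyes", frame)
    if cv2.waitKey(1) == 27:           # press Esc to quit
        break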

Give LOOK MUM NO COMPUTER a follow on YouTube

He loves a good Furby does Sam. Has a whole YouTube playlist dedicated to projects featuring them. Seriously.

Raspberry Pi with camera module attached to small screen loading software needed to run face recognition
Sam got all the Raspberry Pi kit needed from Pimoroni

Our favourite bit of the video is when Sam meets Raspberry Pi for the first time, boots it up, and says:

“Wait, I didn’t know it was a computer. It’s an actual computer computer. What?!”

face recognition software running on small screen with raspberry pi camera behind it, looking at the maker
Face recognition software up and running on Raspberry Pi

The eyes are ping pong balls cut in half so you can fit a Raspberry Pi Camera Module inside them. (Don’t forget to make a hole in the ‘pupil’ so the lens can peek through).

Maker inserting raspberry pi camera module inside a sliced ping pong ball. You can see the ribbons of the camera module sticking out of the ping pong ball half
Raspberry Pi Camera Module tucked inside ping pong ball as it’s mounted to a 3D-printed part

The Raspberry Pi and display screen are neatly mounted on the side of the Macintosh so they’re easily accessible should you need to make any changes.

Raspberry Pi and display screen mounted on the side of a clear macintosh frame
Easy access

All the hacked, repurposed junky bits sit inside or are mounted on swish 3D-printed parts.

Add some joke shop chatterbox teeth, and you’ve got what looks like the innards of a Furby staring at you. See below for a harrowing snapshot of Zach’s ‘Furlexa’ project, featured on our blog last year. We still see it when we sleep.

It gets worse the more you look around

It wasn’t enough for Furby-mad Sam to have created a Furby lookalike face-tracking robot; he needed to go further. Inside the clear Macintosh case, you can see a de-furred Furby skeleton atop a 3D-printed plinth, with redundant ribbon cables flowing from its eyes into the back of the face-tracking robot face, making it appear as though the Furby is the brains behind this creepy creation that follows your every move.

a side view of the entire build with a furby skeleton visible inside
Hey in there. We see you! You dark lord of robo-controlling

Eventually, Sam’s Raspberry Pi–powered creation will be on display at the Museum of Everything Else, so you can go visit it and play with all the “obsolete and experimental technology” housed there. The museum is funded by the Look Mum No Computer Patreon page.

The post These Furby-‘controlled’ Raspberry Pi-powered eyes follow you appeared first on Raspberry Pi.

Smart Fairy Tale

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/smart-fairy-tale/

This is creepy, and we love it. OK, it’s not REALLY creepy, it’s just that some people have an aversion to dolls that appear to move of their own accord, due to a disturbing childhood experience — but enough about me.

Smart Fairy Tale is a whimsical, unique community project created by Berlin-based installation artist Niklas Roy and interaction designer Felix Fisgus.

Using a smartphone app, viewers determine which way a ball travels through transparent pipes, and depending on which light barriers the ball interrupts on its journey, various toys are animated to tell different stories.

The server of the installation is a Raspberry Pi 4. Via its GPIO pins, it controls the track switches and releases the ball.

Raspberry Pi 4 mounted onto plastic with the installation's servo and all the microcontrollers
Raspberry Pi 4 tucked in the top right-hand corner, mounted together with the router. Photo courtesy of Niklas’ project page
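The installation’s own control code isn’t reproduced here, but a short gpiozero sketch gives a feel for the Raspberry Pi side of things: one servo to release the ball and another to set a track switch. The pin numbers and angles are made up for illustration.

# Illustrative sketch of the ball release and a track switch (pins and angles are assumptions).
from time import sleep
from gpiozero import AngularServo

ball_release = AngularServo(18)        # assumed pin: servo that lets a ball into the pipes
track_switch = AngularServo(23)        # assumed pin: servo that points the ball down one branch

def send_ball(to_left: bool):
    track_switch.angle = -45 if to_left else 45   # set the junction before releasing
    sleep(0.5)
    ball_release.angle = 60                       # open the gate...
    sleep(0.5)
    ball_release.angle = -60                      # ...and close it behind the ball

send_ball(to_left=True)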

The apparatus is full of toys donated by residents of Wolfsburg, Germany. The artists wanted local people to not only be able to operate the mechanical piece, but also to have a hand in creating it. Each animatronic toy is made as a separate module, controlled by its own Arduino Nano.

Smart Fairy Tale can be remotely controlled by viewers who want to check in on the toys they gifted to the installation, and by any other curious people elsewhere in the world.

A phone using the app to control the installation. The installation is out of focus in the background
The app in action. Photo from Felix’s project page.

Better yet, the stories the toys tell were devised by local school students. The artists showed the gifted toys to a few elementary school classes, and the students drew several stories featuring toys they liked. The makers then programmed the toys to match what the drawings said they could do. A servo here, a couple of LEDs there, and the students’ stories were brought to life.

Some drawings local children made suggesting storylines for each of the gifted toys
Some of the storylines drawn by local children. Photo courtesy of Felix’s project page.

So what kind of stories did Wolfsburg’s finest come up with? One of the creators explains:

“There were a lot of scenes to interpret, like the blow-up love story, the chemtrail conspiracy, and the fossil fuel disaster, which culminates in a major traffic jam. The latter one even involved a laboratory for breeding synthetic dinosaurs by the use of renewable energies.”

Felix Fisgus

We LOVE it. Don’t tell me this isn’t creepy though…

WHY DO YOU HAUNT MY DREAMS???

You’ll find tonnes of extra technical specs and images in the project posts on both Felix’s and Niklas’s websites.

The post Smart Fairy Tale appeared first on Raspberry Pi.

Raspberry Pi ‘Swear Bear’ keeps your potty mouth in check

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-swear-bear-keeps-your-potty-mouth-in-check/

Why use a regular swear jar to retrain your potty-mouthed brain when you can build a Swear Bear to help you instead?

Swear Bear listens to you. All the time. And Swear Bear can tell when a swear word is used. Swear Bear tells you off and saves all the swear words you said to the cloud to shame you. Swear Bear subscribes to the school of tough love.

Artificial intelligence

The Google AIY kit allows you to build your own natural language recogniser. This page shows you how to assemble the Voice HAT from the kit, and it also includes the code you’ll need to make your project capable of speech-to-text AI.

Black AIY HAT stuck on top of a Raspberry Pi
Image of the Voice HAT mounted onto a Raspberry Pi 3 courtesy of aiyprojects.withgoogle.com

To teach Swear Bear the art of profanity detection, Swear Bear creators 8 Bits and a Byte turned to the profanity-check Python library. You can find the info to install and use the library on this page, as well as info on how it works and why it’s so accurate.

You’ll hear at this point in the video that Swear Bear says “Oh dear” when a swear word is used within earshot.
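To get a feel for the detection itself, here’s roughly what that check looks like with the profanity-check library. The transcribed phrase and the cloud-logging helper are placeholders, not 8 Bits and a Byte’s actual code.

# Sketch of the profanity check on a transcribed phrase (illustrative only).
from profanity_check import predict, predict_prob

def check_phrase(transcript: str) -> None:
    # predict() returns 1 for text it classifies as profane, 0 otherwise
    if predict([transcript])[0] == 1:
        print("Oh dear")               # in the real bear this line is spoken aloud
        log_to_cloud(transcript)       # hypothetical helper that saves the evidence
    else:
        print("Such lovely language")

def log_to_cloud(text: str) -> None:
    # Placeholder: Swear Bear stores offending words in the cloud to shame you later
    pass

check_phrase("oh fudge, I dropped it")
print(predict_prob(["oh fudge, I dropped it"]))   # the probability version, if you prefer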

Hardware

Birds eye view of each of the hardware components used in the project on a green table

This project uses the first version of Google’s AIY Voice Kit, which comes with a larger black AIY Voice HAT and is compatible with Raspberry Pi 3 Model B. The kit also includes a little Voice HAT microphone board.

Version 2 of the kit comprises the smaller Raspberry Pi Zero WH and a slimmer ‘Voice Bonnet’.

The microphone allows Swear Bear to ‘hear’ your speech, and through its speakers it can then tell you off for swearing.

All of the hardware is squeezed into the stuffing-free bear once the text-to-speech and profanity detection software is working.

Babbage Bear hack?

Babbage the Bear

8 Bits and a Byte fan Ben Scarboro took to the comments on YouTube to suggest they rework one of our Babbage Bears into a Swear Bear. Babbage is teeny tiny, so maybe you would need to fashion a giant version to accomplish this. Just don’t make us watch while you pull out its stuffing.

The post Raspberry Pi ‘Swear Bear’ keeps your potty mouth in check appeared first on Raspberry Pi.

Personal Raspberry Pi music streamer

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/personal-raspberry-pi-music-streamer/

Mike Perez from Audio Arkitekts took to YouTube to show you how to build your own music streamer using a Raspberry Pi. Haters of Bluetooth and RCA plugs, he’s done this for you.

Mike reports a “substantial difference in sound quality” compared to his previous setup (the aforementioned and reviled Bluetooth and RCA plug options).

This project lets you use a Raspberry Pi as a music player and control it from your mobile phone.

Hardware

Unboxing the Argon Neo Raspberry Pi Bundle

Mike started out with an $80 Argon Neo Raspberry Pi Bundle, which includes a Raspberry Pi 4 Model B. He made a separate video to show you how to put everything together.

This bundle comes with a nice, sleek case, so your music player can be on display discreetly.

Pretty case

Not sure about spending $80 on this kit? In the project video, Mike says it’s “totally, totally worth it” — partly due to the quality of the case.

Software

Mike grabbed a compatible Volumio image from Volumio’s ‘Get Started’ page and flashed it onto a microSD card for his Raspberry Pi with Etcher.

Volumio app in action

You can use an Ethernet cable, but Mike wanted to utilise Raspberry Pi 4’s wireless connectivity to boot the Volumio app. This way, the Raspberry Pi music player can be used anywhere in the house, as it’ll create its own wireless hotspot within your home network called ‘Volumio’.

Eew! No more direct audio connection to your phone to listen to music.

You’ll need a different version of the Volumio app depending on whether you have an Android phone or iPhone. Mike touts the app as “super easy, really robust”. You just select the music app you usually use from the ‘Plugins’ section of the Volumio app, and all your music, playlists, and cover art will be there ready for you once downloaded.

And that’s basically it. Just connect to the Volumio OS via the app and tell your Raspberry Pi what to play.

Amp it up

To get his new music player booming all around the house, Mike used a Starke Sound AD4, which you can watch him unbox and review.

What kind of amplification system have you got paired up with your Raspberry Pi–powered music player?

The post Personal Raspberry Pi music streamer appeared first on Raspberry Pi.

Vulkan update: we’re conformant!

Post Syndicated from Eben Upton original https://www.raspberrypi.org/blog/vulkan-update-were-conformant/

Today we have a guest post from Igalia’s Iago Toral, who has spent the past year working on the Mesa graphic driver stack for Raspberry Pi 4.

It’s been nearly a year since we first announced that we were developing a Vulkan driver for the latest generation of Raspberry Pi devices (Raspberry Pi 4, Raspberry Pi 400, and Compute Module 4).

Sascha Willems’ Vulkan radial blur demo

In June we released the source code for our prototype driver, and last month we announced that the driver had been successfully merged to Mesa upstream.

Today we have some very exciting news to share: as of 24 November the V3DV Vulkan Mesa driver for Raspberry Pi 4 has demonstrated Vulkan 1.0 conformance.

Khronos describes the conformance process as a way to ensure that its standards are consistently implemented by multiple vendors, so as to create a reliable platform for application developers. For each standard, Khronos provides a large conformance test suite (CTS) that implementations must pass successfully to be declared conformant; in the case of Vulkan 1.0, the CTS contains over 100,000 tests.

Vulkan 1.0 conformance is a major milestone in bringing Vulkan to Raspberry Pi, but it isn’t the end of the journey. Our team continues to work on all fronts to expand the Vulkan feature set, improve performance, and fix bugs. So stay tuned for future Vulkan updates!

The post Vulkan update: we’re conformant! appeared first on Raspberry Pi.

Classify your trash with Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/classify-your-trash-with-raspberry-pi/

Maker Jen Fox took to hackster.io to share a Raspberry Pi–powered trash classifier that tells you whether the trash in your hand is recyclable, compostable, or just straight-up garbage.

Jen reckons this project is beginner-friendly, as you don’t need any code to train the machine learning model, just a little to load it on Raspberry Pi. It’s also a pretty affordable build, costing less than $70 including a Raspberry Pi 4.

“Haz waste”?!

Hardware:

  • Raspberry Pi 4 Model B
  • Raspberry Pi Camera Module
  • Adafruit push button
  • Adafruit LEDs
Watch Jen giving a demo of her creation

Software

The code-free machine learning model is created using Lobe, a desktop tool that automatically trains a custom image classifier based on what objects you’ve shown it.

The image classifier correctly guessing it has been shown a bottle cap

Training the image classifier

Basically, you upload a tonne of photos and tell Lobe what object each of them shows. Jen told the empty classification model which photos were of compostable waste, which were of recyclable items, and which were of garbage or bio-hazardous waste. Of course, as Jen says, “the more photos you have, the more accurate your model is.”

Loading up Raspberry Pi

Birds eye view of Raspberry Pi 4 with a camera module connected
The Raspberry Pi Camera Module attached to Raspberry Pi 4

As promised, you only need a little bit of code to load the image classifier onto your Raspberry Pi. The Raspberry Pi Camera Module acts as the image classifier’s “eyes” so Raspberry Pi can find out what kind of trash you hold up for it.

The push button and LEDs are wired up to the Raspberry Pi GPIO pins, and they work together with the camera and light up according to what the image classifier “sees”.

Here’s the Fritzing diagram showing how to wire the push button and LEDs to the Raspberry Pi GPIO pins
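If you want a sense of how little code the Raspberry Pi side needs, here’s a hedged sketch that ties the pieces together: a button press captures a photo, the exported model (loaded with the lobe-python package) classifies it, and the matching LED lights up. The pins, file paths, and label names are assumptions rather than Jen’s exact values.

# Hedged sketch of the classify-on-button-press loop; pins, paths and labels are assumptions.
from signal import pause
from gpiozero import Button, LED
from picamera import PiCamera
from lobe import ImageModel            # lobe-python, for models exported from the Lobe desktop app

model = ImageModel.load("/home/pi/lobe-model")                       # assumed export path
button = Button(21)                                                  # assumed wiring
leds = {"recycle": LED(17), "compost": LED(27), "garbage": LED(22)}  # assumed labels and pins
camera = PiCamera(resolution=(640, 480))

def classify():
    camera.capture("/home/pi/frame.jpg")
    result = model.predict_from_file("/home/pi/frame.jpg")
    label = result.prediction.lower()
    for name, led in leds.items():
        led.on() if name in label else led.off()

button.when_pressed = classify
pause()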

You’ll want to create a snazzy case so your trash classifier looks good mounted on the wall. Jen cut holes in a cardboard box to make sure that the camera could “see” out, the user can see the LEDs, and the push button is accessible. Remember to leave room for Raspberry Pi’s power supply to plug in.

Jen’s hand-painted case mounted to the wall, having a look at a plastic bag

Jen has tonnes of other projects on her Hackster profile — check out the micro:bit Magic Wand.

The post Classify your trash with Raspberry Pi appeared first on Raspberry Pi.

Defeat evil with a Raspberry Pi foam-firing spy camera

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/defeat-evil-with-a-raspberry-pi-foam-firing-spy-camera/

Ruth and Shawn from YouTube channel Kids Invent Stuff picked a cool idea to recreate in real life: 9-year-old Nathan’s drawing of a Foam-Firing Spy Camera.

FYI: that’s not really a big camera lens…

The trick with spy devices is to make sure they look as much like the object they’re hidden inside as possible. Where Raspberry Pi comes in is making sure the foam camera can be used as a real photo-taking camera too, to throw the baddies off the scent if they start fiddling with your spyware.

Here’s the full build video by Kids Invent Stuff

The foam-firing bit of Nathan’s invention was relatively simple to recreate – a modified chef’s squirty cream dispenser, hidden inside a camera-shaped box, gets the job done.

Squirty cream thing painted black and mounted onto camera-shaped frame

Ruth and Shawn drew a load of 3D-printed panels to mount on the box frame in the image above. One of those cool coffee cups that look like massive camera lenses hides the squirty cream dispenser and gives this build an authentic camera look.

THOSE cool camera lens-shaped coffee cups, see?

Techy bits from the build:

  • Raspberry Pi
  • Infrared LED
  • Camera module
  • Mini display screen
All the bits mentioned in the list above

The infrared LED is mounted next to the camera module and switches on when it gets dark, giving you night vision.
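The video doesn’t spell out how that switching is done, so here’s just one plausible approach: sample the camera’s average brightness and toggle the LED over GPIO. The pin number and threshold are guesses, not details from the build.

# One possible dusk switch: measure frame brightness, then drive the IR LED (pin is an assumption).
import time
import numpy as np
from picamera import PiCamera
from gpiozero import LED

ir_led = LED(18)                       # assumed GPIO pin, driving the IR LED via a transistor
camera = PiCamera(resolution=(640, 480))
frame = np.empty((480, 640, 3), dtype=np.uint8)

while True:
    camera.capture(frame, "rgb", use_video_port=True)
    if frame.mean() < 40:              # arbitrary darkness threshold
        ir_led.on()
    else:
        ir_led.off()
    time.sleep(30)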

The mini display screen serves as a ‘lid’ to the blue case protecting the Raspberry Pi and mounts into the back panel of the ‘camera’

The Raspberry Pi computer and its power bank are crammed inside the box-shaped part, with the camera module and infrared LED mounted to peek out of custom-made holes in one of the 3D-printed panels on the front of the box frame.

The night vision mini display screen in action on the back of the camera

The foam-firing chef’s thingy is hidden inside the big fake lens, and it’s wedged inside so that when you lift the big fake lens, the lever on the chef’s squirty thing is depressed and foam fires out of a tube near to where the camera lens and infrared LED peek out on the front panel of the build.

Watch the #KidsInventStuff presenters test out Nathan’s invention

Baddies don’t stand a chance!

The post Defeat evil with a Raspberry Pi foam-firing spy camera appeared first on Raspberry Pi.

Raspberry Pi smart IoT glove

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-smart-iot-glove/

Animator/engineer Ashok Fair has put witch-level finger pointing powers in your hands by sticking a SmartEdge Agile, wirelessly controlled by Raspberry Pi Zero, to a golf glove. You could have really freaked the bejeezus out of Halloween party guests with this (if we were allowed to have Halloween parties that is).

The build uses a Smart Edge Agile IoT device with Brainium, a cloud-based tool for performing machine learning tasks.

The Rapid IoT kit is interfaced with Raspberry Pi Zero and creates a Thread network connecting to light, car, and fan controller nodes.

The Brainium app is installed on Raspberry Pi and bridges between the cloud and the SmartEdge Agile device. An MQTT client running in Python processes the Rapid IoT kit’s data.
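As a rough idea of what that MQTT glue can look like in Python, here’s a paho-mqtt sketch. The broker address, topic names, and gesture labels are assumptions, not Ashok’s actual configuration.

# Sketch of the MQTT plumbing on the Raspberry Pi Zero (topics and payloads are assumptions).
import json
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe("glove/gesture")              # gestures recognised by the SmartEdge Agile

def on_message(client, userdata, msg):
    gesture = json.loads(msg.payload)["name"]
    if gesture == "circle_clockwise":
        client.publish("nodes/light", "on")
    elif gesture == "circle_counterclockwise":
        client.publish("nodes/light", "off")
    elif gesture == "swipe_up":
        client.publish("nodes/fan", "faster")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)                  # assumed local broker
client.loop_forever()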

The device is mounted onto a golf glove, giving the wearer seemingly magical powers with the wave of a hand.

Kit list

  • Raspberry Pi Zero
  • Avnet SmartEdge Agile (the white box attached to the glove)
  • NXP Rapid IoT Prototyping Kit (the square blue screen stuck on the adaptor board with the Raspberry Pi Zero)
  • Brainium AI Studio app
  • Golf glove
Waking up the Rapid IoT screen

To get started, the glove wearer draws a pattern above the screen attached to the Raspberry Pi to unlock it and wake up all the controller nodes.

The light controller node is turned on by drawing a clockwise circle, and turned off with a counter-clockwise circle.

The full kit and caboodle

The fan is turned on and off in the same way, and you can increase the fan’s speed by moving your hand upwards and reduce the speed by moving your hand down. You know it’s working by the look of the fan’s LEDs: they blink faster as the fan speeds up.

Make a pushing motion in the air above the car to make it move forward, and you can also make it turn and reverse.

“Driving glove”

If you wear the glove while driving, it collects data in real time and logs it on the Brainium cloud so you can review your driving style.

Keep up with Ashok’s projects on Twitter or Facebook.

The post Raspberry Pi smart IoT glove appeared first on Raspberry Pi.

Hire Raspberry Pi as a robot sous-chef in your kitchen

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/hire-raspberry-pi-as-a-robot-sous-chef-in-your-kitchen/

Design Engineering student Ben Cobley has created a Raspberry Pi–powered sous-chef that automates the easier pan-cooking tasks so the head chef can focus on culinary creativity.

Ben named his invention OnionBot, as the idea came to him when looking for an automated way to perfectly soften onions in a pan while he got on with the rest of his dish. I have yet to manage to retrieve onions from the pan before they blacken so… *need*.

OnionBot robotic sous-chef set up in a kitchen
The full setup (you won’t need a laptop while you’re cooking, so you’ll have counter space)

A Raspberry Pi 4 Model B is the brains of the operation, with a Raspberry Pi Touch Display showing the instructions, and a Raspberry Pi Camera Module keeping an eye on the pan.

OnionBot robotic sous-chef hardware mounted on a board
Close up of the board-mounted hardware and wiring

Ben’s affordable solution is much better suited to home cooking than the big, expensive robotic arms used in industry. Using our tiny computer also allowed Ben to create something that fits on a kitchen counter.

OnionBot robotic sous-chef hardware list

What can OnionBot do?

  • Tells you on-screen when it is time to advance to the next stage of a recipe
  • Autonomously controls the pan temperature using PID feedback control
  • Detects when the pan is close to boiling over and automatically turns down the heat
  • Reminds you if you haven’t stirred the pan in a while
OnionBot robotic sous-chef development stages
Images from Ben’s blog on DesignSpark

How does it work?

A thermal sensor array suspended above the stove detects the pan temperature, and the Raspberry Pi Camera Module helps track the cooking progress. A servo motor controls the dial on the induction stove.
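Ben’s control code lives in his repo (linked at the end of this post), but the PID idea itself fits in a few lines. This is a bare-bones sketch rather than OnionBot’s implementation: the gains, setpoint, servo pin, and the read_pan_temperature() helper are all stand-ins.

# Bare-bones PID temperature loop (illustrative; not OnionBot's code).
import time
from gpiozero import AngularServo

dial_servo = AngularServo(12, min_angle=0, max_angle=180)   # turns the induction hob dial
KP, KI, KD = 2.0, 0.05, 1.0                                 # assumed gains
SETPOINT = 120.0                                            # target pan temperature in Celsius

def read_pan_temperature() -> float:
    # Stand-in: replace with a reading from the thermal sensor array
    return 25.0

integral, previous_error = 0.0, 0.0
while True:
    error = SETPOINT - read_pan_temperature()
    integral += error
    derivative = error - previous_error
    output = KP * error + KI * integral + KD * derivative
    dial_servo.angle = max(0, min(180, output))             # clamp to the dial's range
    previous_error = error
    time.sleep(1)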

Screenshot of the image classifier of OnionBot robotic sous-chef
Labelling images to train the image classifier

No machine learning expertise was required to train an image classifier, running on Raspberry Pi, for Ben’s robotic creation; you’ll see in the video that the classifier is a really simple drag-and-drop affair.

Ben has only taught his sous-chef one pasta dish so far, and we admire his dedication to carbs.

Screenshot of the image classifier of OnionBot robotic sous-chef
Training the image classifier to know when you haven’t stirred the pot in a while

Ben built a control panel for labelling training images in real time and added labels at key recipe milestones while he cooked under the camera’s eye. This process required 500–1000 images per milestone, so Ben made a LOT of pasta while training his robotic sous-chef’s image classifier.

Diagram of networked drivers and devices in OnionBot robotic sous-chef

Ben open-sourced this project so you can collaborate to suggest improvements or teach your own robot sous-chef some more dishes. Here’s OnionBot on GitHub.

The post Hire Raspberry Pi as a robot sous-chef in your kitchen appeared first on Raspberry Pi.

Pumpkin Pi Build Monitor

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/pumpkin-pi-build-monitor/

Following on from Rob Zwetsloot’s Haunted House Hacks in the latest issue of The MagPi magazine, GitHub’s Martin Woodward has created a spooky pumpkin that warns you about the thing programmers find scariest of all — broken builds. Here’s his guest post describing the project:

“When you are browsing code looking for open source projects, seeing a nice green passing build badge in the ReadMe file lets you know everything is working with the latest version of that project. As a programmer you really don’t want to accidentally commit bad code, which is why we often set up continuous integration builds that constantly check the latest code in our project.”

“I decided to create a 3D-printed pumpkin that would hold a Raspberry Pi Zero with an RGB LED pHat on top to show me the status of my build for Halloween. All the code is available on GitHub alongside the 3D printing models which are also available on Thingiverse.”

Components

  • Raspberry Pi Zero (I went for the WH version to save me soldering on the header pins)
  • Unicorn pHat from Pimoroni
  • Panel mount micro-USB extension
  • M2.5 hardware for mounting (screws, male PCB standoffs, and threaded inserts)

“For the 3D prints, I used a glow-in-the-dark PLA filament for the main body and Pi holder, along with a dark green PLA filament for the top plug.”

“I’ve been using M2.5 threaded inserts quite a bit when printing parts to fit a Raspberry Pi, as it allows you to simply design a small hole in your model and then you push the brass thread into the gap with your soldering iron to melt it securely into place ready for screwing in your device.”

Threaded insert

“Once the inserts are in, you can screw the Raspberry Pi Zero into place using some brass PCB stand-offs, place the Unicorn pHAT onto the GPIO ports, and then screw that down.”

pHAT install

“Then you screw in the panel-mounted USB extension into the back of the pumpkin, connect it to the Raspberry Pi, and snap the Raspberry Pi holder into place in the bottom of your pumpkin.”

Inserting the base

Code along with Martin

“Now you are ready to install the software. You can get the latest version from my PumpkinPi project on GitHub.”

“Format the micro SD Card and install Raspberry Pi OS Lite. Rather than plugging in a keyboard and monitor, you probably want to do a headless install, configuring SSH and WiFi by dropping an ssh file and a wpa_supplicant.conf file onto the root of the SD card after copying over the Raspbian files.”

“You’ll need to install the Unicorn HAT software, but they have a cool one-line installer that takes care of all the dependencies including Python and Git.”

\curl -sS https://get.pimoroni.com/unicornhat | bash

“In addition, we’ll be using the requests module in Python which you can install with the following command:”

sudo pip install requests

“Next you want to clone the git repo.”

git clone https://github.com/martinwoodward/PumpkinPi.git

“You then need to modify the settings to point at your build badge. First of all copy the sample settings provided in the repo:”

cp ~/PumpkinPi/src/local_settings.sample ~/PumpkinPi/src/local_settings.py

“Then edit the BADGE_LINK variable and point at the URL of your build badge.”

# Build Badge for the build you want to monitor
BADGE_LINK = "https://github.com/martinwoodward/calculator/workflows/CI/badge.svg?branch=main"

# How often to check (in seconds). Remember - be nice to the server. Once every 5 minutes is plenty.
REFRESH_INTERVAL = 300

“Finally you can run the script as root:”

sudo python ~/PumpkinPi/src/pumpkinpi.py &

“Once you are happy everything is running how you want, don’t forget you can run the script at boot time. The easiest way to do this is to use crontab. See this cool video from Estefannie to learn more. But basically you do sudo crontab -e then add the following:”

@reboot /bin/sleep 10 ; /usr/bin/python /home/pi/PumpkinPi/src/pumpkinpi.py &

“Note that we are pausing for 10 seconds before running the Python script. This is to allow the WiFi network to connect before we check on the state of our build.”

“The current version of the pumpkinpi script works with all the SVG files produced by the major hosted build providers, including GitHub Actions, which is free for open source projects. But if you want to improve the code in any way, I’m definitely accepting pull requests on it.”
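For a sense of what that check boils down to, here’s a stripped-down sketch (not the pumpkinpi.py script itself): fetch the badge SVG with requests and colour the Unicorn pHAT with the unicornhat library accordingly.

# Stripped-down version of the idea: poll the badge SVG and colour the pHAT (illustrative only).
import time
import requests
import unicornhat as unicorn

BADGE_LINK = "https://github.com/martinwoodward/calculator/workflows/CI/badge.svg?branch=main"

unicorn.set_layout(unicorn.PHAT)
unicorn.brightness(0.5)

while True:
    svg = requests.get(BADGE_LINK, timeout=10).text.lower()
    if "passing" in svg or "succeeded" in svg:
        unicorn.set_all(0, 255, 0)     # green: all is well
    else:
        unicorn.set_all(255, 0, 0)     # red: the build is broken, be afraid
    unicorn.show()
    time.sleep(300)                    # be nice to the server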

“Using the same hardware you could monitor lots of different things, such as when someone posts on Twitter, what the weather will be tomorrow, or maybe just code your own unique multi-coloured display that you can leave flickering in your window.”

“If you build this project or create your own pumpkin display, I’d love to see pictures. You can find me on Twitter @martinwoodward and on GitHub.”

The post Pumpkin Pi Build Monitor appeared first on Raspberry Pi.

Raspberry Pi High Quality security camera

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-high-quality-security-camera/

DJ from the element14 community shows you how to build a red-lensed security camera in the style of Portal 2 using the Raspberry Pi High Quality Camera.

The finished camera mounted on the wall

Portal 2 is a puzzle platform game developed by Valve — a “puzzle game masquerading as a first-person shooter”, according to Forbes.

DJ playing with the Raspberry Pi High Quality Camera

Kit list

No code needed!

DJ was pleased to learn that you don’t need to write any code to make your own security camera; you can just use a package called motionEyeOS. All you have to do is download the motionEyeOS image, pop the flashed SD card into your Raspberry Pi, and you’re pretty much good to go.

DJ got everything set up on a 5″ screen attached to the Raspberry Pi

You’ll find that the default resolution is 640×480, so it will show up as a tiny window on your monitor of choice, but that can be amended.

Simplicity

While this build is very simple electronically, the 20-part 3D-printed shell is beautiful. A Raspberry Pi is positioned on a purpose-built platform in the middle of the shell, connected to the Raspberry Pi High Quality Camera, which sits at the front of that shell, peeking out.

All the 3D printed parts ready to assemble

The 5V power supply is routed through the main shell into the base, which mounts the build to the wall. In order to keep the Raspberry Pi cool, DJ made some vent holes in the lens of the shell. The red LED is routed out of the side and sits on the outside body of the shell.

Magnetising

Raspberry Pi 4 (centre) and Raspberry Pi High Quality Camera (right) sat inside the 3D printed shell

This build is also screwless: the halves of the shell have what look like screw holes along the edges, but they are actually 3mm neodymium magnets, so assembly and repair are super easy as everything just pops on and off.

The final picture (that’s DJ!)

You can find all the files you need to recreate this build, or you can ask DJ a question, at element14.com/presents.

The post Raspberry Pi High Quality security camera appeared first on Raspberry Pi.

Global sunrise/sunset Raspberry Pi art installation

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/global-sunrise-sunset-raspberry-pi-art-installation/

24h Sunrise/Sunset is a digital art installation that displays a live sunset and sunrise happening somewhere in the world with the use of CCTV.

Image by fotoswiss.com

Artist Dries Depoorter wanted to prove that “CCTV cameras can show something beautiful”, and turned to Raspberry Pi to power this global project.

Image by fotoswiss.com

Harnessing CCTV

The arresting visuals are beamed to viewers using two Raspberry Pi 3B+ computers and an Arduino Nano Every, which stream footage from internet protocol (IP) cameras using the command-line media player OMXPlayer.

Dual Raspberry Pi power

The two Raspberry Pis communicate with each other using the MQTT protocol — a standard messaging protocol for the Internet of Things (IoT) that’s ideal for connecting remote devices with a small code footprint and minimal network bandwidth.

One of the Raspberry Pis checks at which location in the world a sunrise or sunset is happening and streams the closest CCTV camera.

The insides of the sleek display screen…

Beam me out, Scotty

The big screens are connected to the Arduino over the I2C protocol, and the Arduino is connected over serial to the second Raspberry Pi. Dries also made a custom printed circuit board (PCB) so the build looks cleaner.

All that hardware is powered by an industrial power supply, just because Dries liked the style of it.

Software

Everything is written in Python 3, and Dries harnessed the Python 3 libraries BeautifulSoup, Sun, Geopy, and Pytz to calculate sunrise and sunset times at specific locations. Google Firebase databases in the cloud help with admin by way of saving timestamps and the IP addresses of the cameras.
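As a rough illustration of that sunrise/sunset maths, here’s a short sketch. It assumes the ‘Sun’ library refers to the suntime package, and the example location is just that, an example; Dries’s actual code will differ.

# Illustrative sunrise/sunset lookup for one camera location (not Dries's code).
from datetime import datetime, timezone
from geopy.geocoders import Nominatim
from suntime import Sun
import pytz

geolocator = Nominatim(user_agent="sunrise-sunset-installation")
location = geolocator.geocode("Ghent, Belgium")          # example location only

sun = Sun(location.latitude, location.longitude)
sunrise = sun.get_sunrise_time()                         # timezone-aware UTC datetime
sunset = sun.get_sunset_time()

now = datetime.now(timezone.utc)
local_tz = pytz.timezone("Europe/Brussels")
print("local sunrise:", sunrise.astimezone(local_tz))
print("minutes until sunset:", (sunset - now).total_seconds() / 60)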

Hardware

The artist stood in front of the two large display screens
Image of the artist with his work by fotoswiss.com

And, lastly, Dries requested a shoutout for his favourite local Raspberry Pi shop Gotron in Ghent.

The post Global sunrise/sunset Raspberry Pi art installation appeared first on Raspberry Pi.

It’s a brand-new NODE Mini Server!

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/its-a-brand-new-node-mini-server/

NODE has long been working to create open-source resources to help more people harness the decentralised internet, and their easily 3D-printed designs are perfect to optimise your Raspberry Pi.

NODE wanted to take advantage of the faster processor and up to 8GB RAM on Raspberry Pi 4 when it came out last year. Now that our tiny computer is more than capable of being used as a general Linux desktop system, the NODE Mini Server version 3 has been born.

As with previous versions of NODE’s Mini Server, one of their main goals for this new iteration was to package Raspberry Pi in a way which makes it a little easier to use as a regular mini server or computer. In other words, it’s put inside a neat little box with all the ports accessible on one side.

Black is incredibly slimming

Slimmer and simpler

The latest design is simplified compared to previous versions. Everything lives in a 92mm × 92mm enclosure that isn’t much thicker than Raspberry Pi itself.

The slimmed-down new case comprises a single 3D-printed piece and a top cover made from a custom-designed printed circuit board (PCB) that has four brass-threaded inserts soldered into the corners, giving you a simple way to screw everything together.

The custom PCB cover

What are the new features?

Another goal for version 3 of NODE’s Mini Server was to include as much modularity as possible. That’s why this new mini server requires no modifications to the Raspberry Pi itself, thanks to a range of custom-designed adapter boards. How to take advantage of all these new features is explained at this point in NODE’s YouTube video.

Ooh, shiny and new and new and shiny

Just like for previous versions, all the files and a list of the components you need to create your own Mini Server are available for free on the NODE website.

Leave comments on NODE’s YouTube video if you’d like to create and sell your own Mini Server kits or pre-made servers. NODE is totally open to showcasing any add-ons or extras you come up with yourself.

Looking ahead, making the Mini Server stackable and improving fan circulation is next on NODE’s agenda.

The post It’s a brand-new NODE Mini Server! appeared first on Raspberry Pi.

Raspberry Pi Off-World Bartender

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-off-world-bartender/

Three things we like: Blade Runner, robots, and cocktails. That’s why we LOVE Donald Bell‘s Raspberry Pi–packed ‘VK-01 Off-World Bartender‘ cocktail making machine.

This machine was due to be Donald’s entry into the Cocktail Robotics Grand Challenge, an annual event in San Francisco. By the time the event was cancelled, he was too deep into his awesome build to give up, so he decided to share it with the Instructables community instead.

Donald wanted users to get as much interaction and feedback as possible, rather than simply pressing a button and receiving a random drink. So with this machine, the interaction comes in four ways: instructions provided on the screen, using a key card to bypass security, placing and removing a cup on the tray, and entering an order number on the keypad.

In addition to that, feedback is provided by way of lighting changes, music, video dialogue, pump motors whirring, and even the clicks of relays at each stage of the cocktail making process.

Ordering on the keypad

close up of the black keypad

The keypad allows people to punch in a number to trigger their order, like on a vending machine. The drink order is sent to the Hello Drinkbot software running on the Raspberry Pi 3B that controls the pumps.

Getting your cup filled

Inside the cup dispenser sensor showing the switch and LEDs
The switch under the lid and ring of LEDs on the base

In order for the machine to be able to tell when a vessel is placed under the dispenser spout, and when it’s removed, Donald built in a switch under a 3D-printed tray. Provided the vessel has at least one ice cube in it, even the lightest plastic cup is heavy enough to trigger the switch.
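To show how simple that cup detection can be in gpiozero terms, here’s an illustrative sketch; the pin numbers and the single LED standing in for the lighting are assumptions, not Donald’s wiring.

# Sketch of the cup-detection logic (pins are assumptions, not Donald's wiring).
from signal import pause
from gpiozero import Button, PWMLED

cup_switch = Button(5, pull_up=True)   # microswitch hidden under the 3D-printed tray
ring_led = PWMLED(13)                  # stands in for the ring of LEDs around the base

def cup_placed():
    ring_led.pulse()                   # invite the customer to order
    print("Cup detected: pouring is allowed")

def cup_removed():
    ring_led.off()
    print("Cup removed: pumps locked out")

cup_switch.when_pressed = cup_placed
cup_switch.when_released = cup_removed
pause()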

The RFID card reader

Cocktail machine customers are asked to scan a special ID card to start. To make this work, Donald adapted a sample script that blinks the card reader’s internal LED when any RFID card is detected.

Interactive video screen

close up of the interactive screen on the machine showing Japanese style script

This bit is made possible by MP4Museum, a “bare-bones” kiosk video player software that the second Raspberry Pi inside the machine runs on boot. By connecting a switch to the Raspberry Pi’s GPIO, Donald enabled customers to advance through the videos one by one. And yes, that’s an official Raspberry Pi Touch Display.

Behind the scenes of the interactive screen with the Raspberry Pi wired up
Behind the scenes of the screen with the Raspberry Pi A+ running the show

The Hello Drinkbot ‘bartender’

screen grab of the hello drinkbot web interface

Donald used the Python-based Hello Drinkbot software as the brains of the machine. With it, you can configure which liquors or juices are connected to which pumps, and send instructions on exactly how much to pour of each ingredient. Everything is configured via a web interface.

Via a bank of relays, microcontrollers connect all the signals from the Touch Display, keypad, RFID card reader, and switch under the spout.

Here’s the Fritzing diagram for this beast

Supplies

Donald shared an exhaustive kit list on his original post, but basically, what you’re looking at is…

Pencil sketches of the machine from different angles
Donald’s friend Jim Burke‘s beautiful concept sketches

And finally, check out the Raspberry Pi–based Hello Drinkbot project by Rich Gibson, which inspired Donald’s build.

The post Raspberry Pi Off-World Bartender appeared first on Raspberry Pi.

Steampunk ‘Help is coming’ Raspberry Pi alert system

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/steampunk-help-is-coming-raspberry-pi-alert-system/

Tom Lee decided to combine his household with his sister-in-law during lockdown so that she could help him make childcare more manageable. The problem was, Tom’s household was a smidge frantic in the mornings, as the family struggled to be up and ready in time for his sister-in-law’s arrival.

Enter this Raspberry Pi–powered tracking device, which tells Tom when the family car is on its way with childcare support. The DIY appliance helps his household manage childcare routines like clockwork.

The magic is in the wooden box, but the light cage and electrical meter are all part of the show

When the family car is moving, a light turns on, and an antique electrical meter points to 30…20…10 to show the estimated minutes until the driver arrives. The movements of the car come in from a cellular Sinotrack OBD2 dongle pointed at a traccar server running on Raspberry Pi 3.

We see you in there, Raspberry Pi…

Tom explains: “I have not found traccar to be the greatest to work with, but you can make it forward everything it decodes to your own script pretty easily.”
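Tom’s real code is all on GitHub (linked at the end of this post), but to show the shape of the idea, here’s a tiny receiver that a tracking server could forward positions to. The JSON fields, GPIO pin, and speed threshold are assumptions, not taken from Tom’s setup.

# Tiny position receiver that switches the lamp relay when the car is moving (illustrative only).
from flask import Flask, request
from gpiozero import OutputDevice

app = Flask(__name__)
lamp_relay = OutputDevice(17)          # assumed GPIO pin driving the solid state relay

@app.route("/position", methods=["POST"])
def position():
    fix = request.get_json(force=True) or {}
    moving = fix.get("speed", 0) > 2   # km/h; treat anything faster as "on the way"
    lamp_relay.on() if moving else lamp_relay.off()
    return "ok"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5055)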

Materials:

  • Arduino microcontrollers (ATMega328P & ESP8266 based)
  • Raspberry Pi (Model 1 and 3)
  • Dongle device in car (with SIM card and cellular service)
  • Light device with bulb and solid state relay
  • Antique electrical meter (for the steampunks among you – any similar device will do the job!) 
The light safety cage was rescued from an old workshop

The case (below) is a lasercut design Tom had made by online laser cutting business Ponoko.

Inside there’s a solid state relay and a first-generation Raspberry Pi (hidden under the black cable in the photo below). This Raspberry Pi model doesn’t have wireless connectivity, and Tom found that getting wireless working was a bit tricky for this project.

Tom produced a nice long webinar to show you exactly how this all works. So if you’d like to give this project a try, watch it for yourself.

You’ll learn how to…

Code resources

Oh, and he’s only gone and uploaded every single bit of code you’ll need on GitHub (what an angel):

The post Steampunk ‘Help is coming’ Raspberry Pi alert system appeared first on Raspberry Pi.

Mini Raspberry Pi Boston Dynamics–inspired robot

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/mini-raspberry-pi-boston-dynamics-inspired-robot/

This is a ‘Spot Micro’ walking quadruped robot running on Raspberry Pi 3B. By building this project, redditor /thetrueonion (aka Mike) wanted to teach themself robotic software development in C++ and Python, get the robot walking, and master velocity and directional control.

Mike was inspired by Spot, one of Boston Dynamics’ robots developed for industry to perform remote operation and autonomous sensing.

What’s it made of?

  • Raspberry Pi 3B
  • Servo control board: PCA9685, controlled via I2C
  • Servos: 12 × PDI-HV5523MG
  • LCD Panel: 16×2 I2C LCD panel
  • Battery: 2s 4000 mAh LiPo, direct connection to power servos
  • UBEC: HKU5 5V/5A ubec, used as 5V voltage regulator to power Raspberry Pi, LCD panel, PCA9685 control board
  • Thingiverse 3D-printed Spot Micro frame

How does it walk?

The mini ‘Spot Micro’ bot rocks a three-axis angle command/body pose control mode via keyboard and can achieve ‘trot gait’ or ‘walk gait’. The former is a four-phase gait with symmetric motion of two legs at a time (like a horse trotting). The latter is an eight-phase gait with one leg swinging at a time and a body shift in between for balance (like humans walking).

Mike breaks down how they got the robot walking, right down to the order the servos need to be connected to the PCA9685 control board, in this extensive walkthrough.
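The walking itself is handled by Mike’s ROS nodes (linked below), but if you just want a taste of talking to the PCA9685 from Python, the Adafruit ServoKit library does the I2C heavy lifting. The channel numbers and angles here are assumptions, not Mike’s servo map.

# A taste of driving the PCA9685 servo board from Python (not the ROS code itself).
from time import sleep
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)            # PCA9685 on the default I2C address

NEUTRAL = 90
for channel in range(12):              # 12 leg servos
    kit.servo[channel].angle = NEUTRAL
sleep(1)

# Crude single-joint wiggle; the channel number is an assumption
for angle in (70, 110, 90):
    kit.servo[2].angle = angle
    sleep(0.5)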

Here’s the code

And yes, this is one of those magical projects with all the code you need stored on GitHub. The software is implemented on a Raspberry Pi 3B running Ubuntu 16.04. It’s composed of C++ and Python nodes in a ROS framework.

What’s next?

Mike isn’t finished yet: they are looking to improve their yellow beast by incorporating a lidar to achieve simple 2D mapping of a room. Also on the list is developing an autonomous motion-planning module to guide the robot to execute a simple task around a sensed 2D environment. And finally, adding a camera or webcam to conduct basic image classification would finesse their creation.

The post Mini Raspberry Pi Boston Dynamics–inspired robot appeared first on Raspberry Pi.