Raspberry Pi is at the heart of this AI–powered, automated sorting machine that is capable of recognising and sorting any LEGO brick.
And its maker Daniel West believes it to be the first of its kind in the world!
This mega-machine was two years in the making and is a LEGO creation itself, built from over 10,000 LEGO bricks.
It can sort any LEGO brick you place in its input bucket into one of 18 output buckets, at the rate of one brick every two seconds.
While Daniel was inspired by previous LEGO sorters, his creation is a huge step up from them: it can recognise absolutely every LEGO brick ever created, even bricks it has never seen before. Hence the ‘universal’ in the name ‘universal LEGO sorting machine’.
What makes Daniel’s project a ‘world first’ is that he trained his classifier using 3D model images of LEGO bricks, which is how the machine can classify absolutely any LEGO brick it’s faced with, even if it has never seen it in real life before.
Daniel has made a whole extra video (above) explaining how the AI in this project works. He shouts out all the open-source software he used to run the Raspberry Pi Camera Module, access 3D training images, and more, at this point in the video.
LEGO brick separation
Daniel needed the input bucket to carefully pick out a single LEGO brick from the mass he chucks in at once.
This is achieved with a primary and secondary belt slowly pushing parts onto a vibration plate. The vibration plate uses a super fast LEGO motor to shake the bricks around so they aren’t sitting on top of each other when they reach the scanner.
Why use a regular swear jar to retrain your potty-mouthed brain when you can build a Swear Bear to help you instead?
Swear Bear listens to you. All the time. And Swear Bear can tell when a swear word is used. Swear Bear tells you off and saves all the swear words you said to the cloud to shame you. Swear Bear subscribes to the school of tough love.
The microphone allows Swear Bear to ‘hear’ your speech, and through its speakers it can then tell you off for swearing.
All of the hardware is squeezed into the stuffing-free bear once the text-to-speech and profanity detection software is working.
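8 Bits and a Byte's own code may differ, but as a rough sketch of the idea, here's how the listening-and-scolding loop could look using the SpeechRecognition and pyttsx3 libraries (the swear list is, of course, yours to supply):

import speech_recognition as sr
import pyttsx3

SWEARS = {'heck', 'darn'}  # hypothetical vocabulary of shame; supply your own
recogniser = sr.Recognizer()
voice = pyttsx3.init()

with sr.Microphone() as mic:
    recogniser.adjust_for_ambient_noise(mic)
    while True:
        audio = recogniser.listen(mic)
        try:
            # cloud speech recognition, so this needs an internet connection
            words = set(recogniser.recognize_google(audio).lower().split())
        except sr.UnknownValueError:
            continue  # Swear Bear couldn't make out any speech
        if words & SWEARS:
            voice.say('Language, please!')
            voice.runAndWait()
            # the real bear also logs offending words to the cloud to shame you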
Babbage Bear hack?
8 Bits and a Byte fan Ben Scarboro took to the comments on YouTube to suggest they rework one of our Babbage Bears into a Swear Bear. Babbage is teeny tiny, so maybe you would need to fashion a giant version to accomplish this. Just don’t make us watch while you pull out its stuffing.
OK Cedrick, we don’t need to know why, but we have to know how you turned a watermelon into a games console.
This has got to be a world first. What started out as a regular RetroPie project has blown up reddit due to the unusual choice of casing for the games console: nearly 50,000 redditors upvoted this build within a week of Cedrick sharing it.
• Jingo Dot power bank (that yellow thing you can see below)
• Small 1.8″ screen
While this build looks epic, it isn’t too tricky to make. First, Cedrick flashed the RetroPie image onto an SD card, then he wired up a Raspberry Pi’s GPIO pins to the red console buttons, speakers, and the screen.
Cedrick achieved audio output by adding just a few lines of code to the config file, and he downloaded libraries for screen configuration and button input. That’s it! That’s all you need to get a games console up and running.
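Cedrick doesn't name the exact libraries, but to give a sense of how little code GPIO button input needs, here's a minimal sketch using the gpiozero library (the pin assignments are hypothetical):

from signal import pause
from gpiozero import Button

def make_handler(name):
    def handler():
        print(name, 'pressed')  # a real setup would inject a key event here
    return handler

buttons = []
# hypothetical BCM pin numbers for the red console buttons
for pin, name in {17: 'A', 27: 'B', 22: 'start', 23: 'select'}.items():
    btn = Button(pin)
    btn.when_pressed = make_handler(name)
    buttons.append(btn)  # keep a reference so the Button isn't garbage-collected

pause()  # keep the script alive, waiting for presses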
Now for the messy bit
Cedrick had to gut an entire watermelon before he could start getting all the hardware in place. He power-drilled holes for the buttons to stick through, and a Stanley knife provided the precision he needed to get the right-sized gap for the screen.
Rather than drill even more holes for the speakers, Cedrick stuck them in place inside the watermelon using toothpicks. He did try hot glue first but… yeah. Turns out fruit guts are impervious to glue.
Moisture was going to be a huge problem, so to protect all the hardware from the watermelon’s sticky insides, Cedrick lined it with plastic clingfilm.
And here’s how you can help: Cedrick is open to any tips as to how to preserve the perishable element of his project: the watermelon. Resin? Vaseline? Time machine? How can he keep the watermelon fresh?
Share your ideas on reddit or YouTube, and remember to subscribe to see more of Cedrick’s maverick making in the wild.
We found this project at TeCoEd and we loved the combination of an OLED display housed inside a retro Argus slide viewer. It uses a Raspberry Pi 3 with Python and OpenCV to pull out single frames from a video and write them to the display in real time.
TeCoEd names this creation the Raspberry Pi Retro Player, or RPRP, or -- rather neatly -- RP squared. The Argus viewer, he tells us, was a charity-shop find that cost just 50p. It sat collecting dust for a few years until he came across an OLED setup guide on hackster.io, which inspired the birth of the RPRP.
At the heart of the project is a Raspberry Pi 3 which is running a Python program that uses the OpenCV computer vision library. The code takes a video clip and breaks it down into individual frames. Then it resizes each frame and converts it to black and white, before writing it to the OLED display. The viewer sees the video play in pleasingly retro monochrome on the slide viewer.
TeCoEd ran into some frustrating problems with the OLED display, which, he discovered, uses the SH1106 driver rather than the SSD1306 driver that the Adafruit CircuitPython library expects. Many OLED displays use the SSD1306, but it turns out that cheaper displays like the one in this project use the SH1106. He has made a video to spare other makers this particular throw-it-all-in-the-bin moment.
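TeCoEd's own code is in his guide; purely as a sketch of the core loop, here's how the frame-grabbing idea might look using OpenCV and the luma.oled library, which supports the SH1106 natively (the file name and I2C address are assumptions):

import cv2
from PIL import Image
from luma.core.interface.serial import i2c
from luma.oled.device import sh1106

device = sh1106(i2c(port=1, address=0x3C))  # 0x3C is a common SH1106 address
video = cv2.VideoCapture('clip.mp4')        # hypothetical video file

while True:
    ok, frame = video.read()
    if not ok:
        break  # end of clip
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)     # convert to black and white
    small = cv2.resize(grey, (device.width, device.height))  # shrink to 128x64
    device.display(Image.fromarray(small).convert(device.mode))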
TeCoEd is, as ever, our favourite kind of maker -- the sharing kind! He has collated everything you’ll need to get to grips with OpenCV, connecting the SH1106 OLED screen over I2C, and more. He’s even told us where we can buy the OLED board.
This is a ‘Spot Micro’ walking quadruped robot running on Raspberry Pi 3B. By building this project, redditor /thetrueonion (aka Mike) wanted to teach themself robotic software development in C++ and Python, get the robot walking, and master velocity and directional control.
Mike was inspired by Spot, one of Boston Dynamics’ robots developed for industry to perform remote operation and autonomous sensing.
The mini ‘Spot Micro’ bot rocks a three-axis angle command/body pose control mode via keyboard and can achieve ‘trot gait’ or ‘walk gait’. The former is a four-phase gait with symmetric motion of two legs at a time (like a horse trotting). The latter is an eight-phase gait with one leg swinging at a time and a body shift in between for balance (like humans walking).
Mike breaks down how they got the robot walking, right down to the order the servos need to be connected to the PCA9685 control board, in this extensive walkthrough.
Here’s the code
And yes, this is one of those magical projects with all the code you need stored on GitHub. The software is implemented on a Raspberry Pi 3B running Ubuntu 16.04. It’s composed of C++ and Python nodes in a ROS framework.
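The repo holds the real thing, which is far more involved; just to illustrate the shape of a Python ROS node, a minimal publisher might look like this (the topic name and speed are assumptions):

#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('walk_forward')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)  # topic name is an assumption
rate = rospy.Rate(10)  # publish commands at 10 Hz

while not rospy.is_shutdown():
    cmd = Twist()
    cmd.linear.x = 0.05  # amble forward at 5 cm/s
    pub.publish(cmd)
    rate.sleep()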
Mike isn’t finished yet: they are looking to improve their yellow beast by incorporating a lidar to achieve simple 2D mapping of a room. Also on the list is developing an autonomous motion-planning module to guide the robot to execute a simple task around a sensed 2D environment. And finally, adding a camera or webcam to conduct basic image classification would finesse their creation.
One of our Approved Resellers in the Netherlands, Daniël from Raspberry Store, shared this Raspberry Pi–powered prayer reminder with us. It’s a useful application one of his customers made using a Raspberry Pi purchased from Daniël’s store.
As a Raspberry Pi Official Reseller, I love to see how customers use Raspberry Pi to create innovative products. Spying on bird nests, streaming audio to several locations, using them as a menu in a restaurant, or in a digital signage solution… just awesome. But a few weeks ago, a customer showed me a new use of Raspberry Pi: a prayer clock for mosques.
Made by Mawaqit, this is a narrowcasting solution with a Raspberry Pi at its heart, and it can be used from any browser or smartphone.
If you do not have an internet connection, you’ll also need an RTC (real-time clock) module
With the HDMI cable, Raspberry Pi can broadcast the clock — plus other useful info like the weather, or a reminder to silence your phone — on a wall in the mosque. Awesome! So simple, and yet I have not seen a solution like this before, despite Mawaqit’s application now being used in 51 countries and over 4,609 mosques. And, last I checked, it has more than 185,000 active users!
There are then two options: connected and offline. If you set yourself up using the connected option, you’ll be able to remotely control the app from your smartphone or any computer and tablet, which will be synchronised across all the screens connected to Raspberry Pi. You can also send messages and announcements. The latest updates from Mawaqit will install automatically.
If you need to choose the offline option because you’re not able to use the internet at your mosque, it’s important to equip your Raspberry Pi with an RTC, because Raspberry Pi can’t keep time by itself.
The Mawaqit project is free of charge, and the makers actually prohibit harnessing it for any monetary gain. They’ve even created an API for you to create your own extensions — how great is that? So, if you want your own prayer clock for a mosque, school, or just at home, take a look at Mawaqit.net.
Redditor Mark Hank missed the tactile experience of vinyl records, so they removed the insides of an old Sonos Boost to turn it into a Raspberry Pi- and NFC-powered music player. Yes, this really works:
The Sonos Boost was purchased for just £3 on eBay. Mark pulled all the original insides out of it and repurposed it as what they call a ‘vinyl emulator’, which replicates the experience of playing records far better than a simple touchscreen does.
The Boost now contains a Raspberry Pi 3A+ and an ACR122U NFC reader, and it plays a specific album, playlist, or radio station when you tap a specific NFC tag on it. It’s teamed with Sonos speakers and NTAG213 NFC tags. The maker recommends you go with the largest tags you can find, as it will improve read performance; they went with these massive ones.
One of the album covers printed onto thick card
The tags are inside printouts mounted on 1mm thick card (those album cover artwork squares getting chucked at the Sonos in the video), and they’re “super cheap” according to the maker.
You’ll need to install the node-sonos-http-api package on your Raspberry Pi; it’s the basis of the whole back-end of the project. The maker provides full instructions on their original post, including on how to get Spotify up and running on your Raspberry Pi.
The whole setup neatened up
Rather than manually typing HTTP requests into a web browser, the maker wanted to automate the process so that the Raspberry Pi sends them when presented with a certain stimulus (i.e. when the NFC reader is triggered). They also walk you through this process in their step-by-step instructions.
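The step-by-step instructions have the real details; as a rough sketch of the glue logic, here's how the nfcpy and requests libraries could map tag UIDs to node-sonos-http-api calls (the UIDs, room name, and actions below are all made up):

import nfc
import requests

API = 'http://localhost:5005'  # node-sonos-http-api's default port
ROOM = 'Living Room'           # hypothetical Sonos room name
TAGS = {                       # hypothetical tag UIDs mapped to actions
    '04a1b2c3d4e5f6': '/favorite/OK Computer',
    '04f6e5d4c3b2a1': '/playlist/Sunday Morning',
}

def on_connect(tag):
    action = TAGS.get(tag.identifier.hex())
    if action:
        requests.get(API + '/' + ROOM + action)
    return True  # wait for the tag to be removed before reading again

clf = nfc.ContactlessFrontend('usb')  # the ACR122U shows up as a USB reader
while True:
    clf.connect(rdwr={'on-connect': on_connect})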
How the maker hid the mess under the display table
The entire build cost around £50, and the great thing is that it doesn’t need to sit inside an old Sonos Boost if you don’t want it to. The reader works through wood of modest thickness, so you can mount it under a counter, install it in a ‘now listening’ stand, whatever — it’s really up to you.
Adrien Castel’s idea of converting an old electronic toy into a retro games machine was no flight of fancy, as David Crookes discovers
The 1980s was a golden era for imaginative electronic toys. Children would pester their parents for a Tomytronic 3D or a Nintendo Game & Watch. And they would enviously eye anyone who had a Tomy Turnin’ Turbo Dashboard with its promise of replicating the thrill of driving (albeit without the traffic jams).
All of the buttons, other than the joystick, are original to the toy – as are the seven red LEDs
Two years ago, maker Matt Brailsford turned that amazing toy into a fully working Out Run arcade machine and Adrien Castel was smitten. “I loved the fact that he’d upcycled an old toy and created something that could be enjoyed as a grown-up,” he says. “But I wanted to push the simulation a bit further and I thought a flying sim could do the trick.”
“I didn’t want to modify the look of the toy”
Ideas began flying around Adrien’s mind. “I knew what I wanted to achieve so I made an overall plan in my head,” he recalls. First he found the perfect toy: a battery-powered Sky Fighter F-16 tabletop game made by Dival. He then decided to base his build around a Raspberry Pi 3A+. “It’s the perfect hardware for projects like this because of its flexibility,” Adrien says.
The toy needed some work. Its original bright red joystick was missing and Adrien knew he’d have to replace the original screen with a TFT LCD. To do this, he 3D-printed a frame to fit the TFT display and he created a smaller base for the replacement joystick. Adrien also changed the microswitches for greater sensitivity but he didn’t go overboard with the changes.
The games can make use of the full screen. Adrien would have liked a larger screen, but the original ratio oddly lay between 4:3 and 16:9, making a bigger display harder to find
“I knew I would have to adapt some parts for the joystick and for the screen, but I didn’t want to modify the look of the toy,” Adrien explains. “To be honest, modifying the toy would have involved some sanding and painting and I was worried that it would ruin the overall effect of the project if it was badly executed.”
A Raspberry Pi 3A+ sits at the heart of the Pi Commander, alongside a mini audio amplifier, and it’s wired up to components within the toy
As such, a challenge was set. “I had to keep most of the original parts such as throttle levers and LEDs and adapt them to the new build,” he says. “This meant getting them to work together with the system and it also meant using the original PCB, getting rid of the components and re-routing the electronics to plug into the GPIOs.”
There were some enhancements. Adrien soldered a PAM8403 3W class-D audio amplifier to Raspberry Pi and this allowed a basic speaker to replace the original for better sound. But there were some compromises too.
The original PCB was used and the electronics were re-routed. All the components need to work between 3.3 and 5 V with the lowest possible current draw while fitting into a tight space
“At first I thought the screen could be bigger than the one I used, but the round shape of the cockpit didn’t give much space to fit a screen larger than four inches.” He also believes the project could be improved with a better joystick: “The one I’ve used is a simple two-button arcade stick with a jet fighter look.”
By using the retro gaming OS Recalbox (based on EmulationStation and RetroArch), however, he’s been able to perfect the overall feel. “Recalbox allowed me to create a custom front end that matches the look of a jet fighter,” he explains. It also means the Pi Commander plays shoot-’em-up games alongside open-source simulators like FlightGear (flightgear.org). “It’s a lot of fun.”
Read The MagPi for free!
Find more fantastic projects, tutorials, and reviews in The MagPi #95, out now! You can get The MagPi #95 online at our store, or in print from all good newsagents and supermarkets. You can also access The MagPi magazine via our Android and iOS apps.
Don’t forget our super subscription offers, which include a free gift of a Raspberry Pi Zero W when you subscribe for twelve months.
Nixie tubes: these electronic devices, which can display numerals or other information using glow discharge, made their first appearance in 1955, and they remain popular today because of their cool, vintage aesthetic. Though lots of companies manufactured these items back in the day, the name ‘Nixie’ is said to derive from a Burroughs Corporation device named NIX I, an abbreviation of ‘Numeric Indicator eXperimental No. 1’.
We liked this recent project shared on reddit, where user farrp2011 used Raspberry Pi to make his Nixie tube display smart enough to tell the time.
A still from Farrp2011’s video shows he’s linked the bulb displays up to tell the time
Farrp2011’s set-up comprises six Nixie tubes controlled by a Raspberry Pi 3, along with eight SN74HC shift registers that switch the 60 transistors grounding the cathode pins for the digits displayed on the Nixie tubes. Sounds complicated? Well, that’s why farrp2011 is our favourite kind of DIY builder — they’ve put all the code for the project on GitHub.
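The real mapping of bits to digits lives in that GitHub repo; purely as an illustration of driving chained shift registers from Python, a sketch might look like this (the pin choices and bit layout are assumptions):

import RPi.GPIO as GPIO

DATA, CLOCK, LATCH = 17, 27, 22  # hypothetical BCM pin choices
GPIO.setmode(GPIO.BCM)
GPIO.setup([DATA, CLOCK, LATCH], GPIO.OUT, initial=0)

def shift_out(bits):
    # clock 64 bits through the chained registers, then latch them to the outputs
    GPIO.output(LATCH, 0)
    for bit in bits:
        GPIO.output(DATA, bit)
        GPIO.output(CLOCK, 1)
        GPIO.output(CLOCK, 0)
    GPIO.output(LATCH, 1)

def time_to_bits(digits):
    # one transistor per cathode: 6 tubes x 10 digits = 60 lines,
    # padded to 64 outputs across the eight 8-bit registers
    bits = [0] * 64
    for tube, digit in enumerate(digits):
        bits[tube * 10 + digit] = 1
    return bits

shift_out(time_to_bits([1, 2, 3, 4, 5, 6]))  # display "123456"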
Tales of financial woe from users trying to source their own Nixie tubes litter the comments section on the reddit post, but farrp2011 says they were able to purchase the ones used in this project for about $15 each on eBay. Here’s a closer look at the bulbs, courtesy of a previous post by farrp2011 sharing an earlier stage of the project…
Farrp2011 got started with one, then two Nixie bulbs before building up to six for the final project
Digging through the comments, we learned that for the video, farrp2011 turned their house lights off to give the Nixie tubes a stronger glow, so the tubes are not as bright in real life as they appear. We also found out that the dropping resistor is 22 kΩ, with a 170 V supply. Another comments-section nugget we liked was the name of the voltage booster boards used for each bulb: “Pile o’Poo”.
Upcoming improvements farrp2011 has planned include displaying the date, temperature, and Bitcoin exchange rate, but more suggestions are welcome. They’re also going to add some more capacitors to help with a noise problem and to remove the need for the tubes to be turned off before changing the display.
And for extra nerd-points, we found this mesmerising video from Dalibor Farný showing the process of making Nixie tubes:
If you own a 3D printer, you’ll likely have at least heard of OctoPrint from the ever benevolent 3D printing online community. It has the potential to transform your 3D printing workflow for the better, and it’s very easy to set up. This guide will take you through the setup process step by step, and give you some handy tips along the way.
Before we start finding out how to install OctoPrint, let’s look at why you might want to. OctoPrint is a piece of open-source software that allows us to add WiFi functionality to any 3D printer with a USB port (which is pretty much all of them). More specifically, you’ll be able to drop files from your computer onto your printer, start/stop prints, monitor your printer via a live video feed, control the motors, control the temperature, and more, all from your web browser. Of course, with great power comes great responsibility — 3D printers have parts that are hot enough to cause fires, so make sure you have a safe setup, which may include not letting it run unsupervised.
• Raspberry Pi 3 (or newer)
• MicroSD card
• Raspberry Pi power adapter
• USB cable (the connector type will depend on your printer)
• Webcam/Raspberry Pi Camera Module (optional)
• 3D-printed camera mount (optional)
Before we get started, it is not recommended that anything less than a Raspberry Pi 3 is used for this project. There have been reports of limited success using OctoPrint on a Raspberry Pi Zero W, but only if you have no intention of using a camera to monitor your prints. If you want to try this with a Pi Zero or an older Raspberry Pi, you may experience unexpected print failures.
Firstly, you will need to download the latest version of OctoPi from the OctoPrint website. OctoPi is a Raspbian distribution that comes with OctoPrint, video streaming software, and CuraEngine for slicing models on your Raspberry Pi. When this has finished downloading, unzip the file and put the resulting IMG file somewhere handy.
Next, we need to flash this image onto our microSD card. We recommend using Etcher to do this, due to its minimal UI and ease of use, plus it’s available for both Windows and Mac. Get it here: balena.io/etcher. When Etcher is installed and running, you’ll see the UI displayed. Simply click the Select Image button and find the IMG file you unzipped earlier. Next, put your microSD card into your computer and select it in the middle column of the Etcher interface.
Finally, click on Flash!, and while the image is being burned onto the card, get your WiFi router details, as you’ll need them for the next step.
Now that you have your operating system, you’ll want to add your WiFi details so that the Raspberry Pi can automatically connect to your network after it’s booted. To do this, remove the microSD card from your computer (Etcher will have ‘ejected’ the card after it has finished burning the image onto it) and then plug it back in again. Navigate to the microSD card on your computer — it should now be called boot — and open the file called octopi-wpa-supplicant.txt. Editing this file using WordPad or TextEdit can cause formatting issues; we recommend using Notepad++ to update this file, but there are instructions within the file itself to mitigate formatting issues if you do choose to use another text editor. Find the section that begins ## WPA/WPA2 secured and remove the hash signs from the four lines below this one to uncomment them. Finally, replace the SSID value and the PSK value with the name and password for your WiFi network, respectively (keeping the quotation marks). See the example below for how this should look.
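Here's roughly how that section of octopi-wpa-supplicant.txt should look once edited (the network name and password are placeholders):

## WPA/WPA2 secured
network={
  ssid="My Network Name"
  psk="MyPassword"
}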
Further down in the file, there is a section for what country you are in. If you are using OctoPrint in the UK, leave this as is (by default, the UK is selected). However, if you wish to change this, simply comment the UK line again by adding a # before it, and uncomment whichever country you are setting up OctoPrint in. The example below shows how the file will look if you are setting this up for use in the US:
# Uncomment the country your Pi is in to activate Wifi in RaspberryPi 3 B+ and above
# For full list see: https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2
#country=GB # United Kingdom
#country=CA # Canada
#country=DE # Germany
#country=FR # France
country=US # United States
When the changes have been made, save the file and then eject/unmount and remove the microSD card from your computer and put it into your Raspberry Pi. Plug the power supply in, and go and make a cup of tea while it boots up for the first time (this may take around ten minutes). Make sure the Raspberry Pi is running as expected (i.e. check that the green status LED is flashing intermittently). If you’re using macOS, visit octopi.local in your browser of choice. If you’re using Windows, you can find OctoPrint by clicking on the Network tab in the sidebar. It should be called OctoPrint instance on octopi – double-clicking on this will open the OctoPrint dashboard in your browser.
If you see the screen shown above, then congratulations! You have set up OctoPrint.
Not seeing that OctoPrint splash screen? Fear not, you are not the first. While a full list of issues is beyond the scope of this article, common fixes include double-checking that your WiFi details are entered correctly in the octopi-wpa-supplicant.txt file, making sure your Raspberry Pi is working correctly (plug it into a monitor and watch what happens during boot), and checking that your Raspberry Pi isn’t out of range of your WiFi router. There’s a detailed list of troubleshooting suggestions on the OctoPrint website.
Printing with OctoPrint
We now have the opportunity to set up OctoPrint for our printer using the handy wizard. Most of this is very straightforward — setting up a password, signing up to send anonymous usage stats, etc. — but there are a few sections which require a little more thought.
We recommend enabling the connectivity check and the plug-ins blacklist to help keep things nice and stable. If you plan on using OctoPrint as your slicer as well as a monitoring tool, then you can use this step to import a Cura profile. However, we recommend skipping this step as it’s much quicker (and you can use a slicer of your choice) to slice the model on your computer, and then send the finished G-code over.
Finally, we need to put in our printer details. Above, we’ve included some of the specs of the Creality Ender-3 as an example. If you can’t find the exact details of your printer, a quick web search should show what you need for this section.
The General tab can have anything in it; it’s just an identifier for your own use. Print bed & build volume should be easy to find out — if not, you can measure your print bed and find the position of the origin by looking at your Cura printer profile. Leave Axes at the defaults. For the Hotend and extruder section, the defaults are almost certainly fine (unless you’ve changed your nozzle; 0.4 mm is the default diameter for most consumer printers).
OctoPrint is better with a camera
Now that you’re set up with OctoPrint, you’re ready to start printing. Turn off your Raspberry Pi, then plug it into your 3D printer. After it has booted up, open OctoPrint again in your browser and take your newly WiFi-enabled printer for a spin by clicking the Connect button. After it has connected, you’ll be able to set the hot end and bed temperature, then watch as the real-time readings are updated.
In the Control tab, we can see the camera stream (if you’re using one) and the motor controls, as well as commands to home the axes. There’s a G-code file viewer to look through a cross-section of the currently loaded model, and a terminal to send custom G-code commands to your printer. The last tab is for making time-lapses; however, there is a plug-in available to help with this process.
Undoubtedly the easiest way to set up video monitoring of your prints is to use the official Raspberry Pi Camera Module. There are dozens of awesome mounts on Thingiverse for a Raspberry Pi Camera Module, to allow you to get the best angle of your models as they print. There are also some awesome OctoPrint-themed Raspberry Pi cases to house your new printer brains. While it isn’t officially supported by OctoPrint, you can use a USB webcam instead if you have one handy, or just want some very high-quality video streams. The OctoPrint wiki has a crowdsourced list of webcams known to work, as well as a link for the extra steps needed to get the webcam working correctly.
As mentioned earlier, our recommended way of printing a model using OctoPrint is to first use your slicer as you would if you were creating a file to save to a microSD card. Once you have the file, save it somewhere handy on your computer, and open the OctoPrint interface. In the bottom left of the screen, you will see the Upload File button — click this and upload the G-code you wish to print.
You’ll see the file/print details appear, including information on how long it’ll take for the object to print. Before you kick things off, check out the G-code Viewer tab on the right. You can not only scroll through the layers of the object, but, using the slider at the bottom, you can see the exact pattern the 3D printer will use to ‘draw’ each layer. Now click Print and watch your printer jump into action!
OctoPrint has scores of community-created plug-ins, but our favourite, Octolapse, makes beautiful hypnotic time-lapses. What makes them so special is that the plug-in alters the G-code of whatever object you are printing so that once each layer has finished, the extruder moves away from the print to let the camera take an unobstructed shot of the model. The result is an object that seems to grow out of the build plate as if by magic. You’ll not find a finer example of it than here.
Thanks to Glenn and HackSpace magazine
This tutorial comes fresh from the pages of HackSpace magazine issue 26 and was written by Glenn Horan. Thanks, Glenn.
You might have a baby/dog/hamster that you want to keep an eye on when you’re not there. We understand: they’re lovely, especially hamsters. Here’s how HackSpace magazine contributor Dr Andrew Lewis built a Raspberry Pi baby cam to watch over his small creatures…
When a project is going to be used in the home, it pays to take a little bit of extra time on appearance
Wireless baby monitors
You can get wireless baby monitors that have a whole range of great features for making sure your little ones are safe, sound, and sleeping happily, but they come with a hefty price tag.
In this article, you’ll find out how to make a Raspberry Pi-powered streaming camera, and combine it with a built-in I2C sensor pack that monitors temperature, pressure, and humidity. You’ll also see how you can use the GPIO pins on Raspberry Pi to turn an LED night light on and off using a web interface.
The hardware for this project is quite simple, and involves minimal soldering, but the first thing you need to do is to install Raspbian onto a microSD card for your Raspberry Pi. If you’re planning on doing a headless install, you’ll also need to enable SSH by creating an empty file called SSH on the root of the Raspbian install, and a file with your wireless LAN details called wpa_supplicant.conf.
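For reference, a minimal wpa_supplicant.conf for a headless install looks something like this (substitute your own country code and network details):

country=GB
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="My Network Name"
    psk="MyPassword"
}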
You can download the code for this as well as the 3D-printable files from our GitHub. You’ll need to transfer the code to the Raspberry Pi. Next, connect the camera, the BME280 board, and the LEDs to the Raspberry Pi, as shown in the circuit diagram.
The BME280 module uses the I2C connection on pins 3 and 5 of the GPIO, taking power from pin 1 and ground from pin 9. The LEDs connect directly to pins 19 and 20, and the camera cable fits into the camera connector.
Insert the microSD card into the Raspberry Pi and boot up. If everything is working OK, you should be able to see the IP address for your device listed on your hub or router, and you should be able to connect to it via SSH. If you don’t see the Raspberry Pi listed, check your wireless connection details and make sure your adapter is supplying enough power. It’s worth taking the time to assign your Raspberry Pi a static IP address on your network, so it can’t change its IP address unexpectedly.
Smile for Picamera
Use the raspi-config application to enable the camera interface and the I2C interface. If you’re planning on modifying the code yourself, we recommend enabling VNC access as well, because it will make editing and debugging the code once the device is put together much easier. All that remains on the software side is to update APT, download the babycam.py script, install any dependencies with PIP, and set the script to run automatically. The main dependencies for the babycam.py script are the RPi.bme280 module, Flask, PyAudio, picamera, and NumPy. Chances are that these are already installed on your system by default, with the exception of RPi.bme280, which can be installed by typing sudo pip3 install RPi.bme280 from the terminal. Once all of the dependencies are present, load up the script and give it a test run, and point your web browser at port 8000 on the Raspberry Pi. You should see a webpage with a camera image, controls for the LED lights, and a read-out of the temperature, pressure, and humidity of the room.
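The full babycam.py does much more (camera stream, LED control, audio), but a stripped-down sketch of just the sensor read-out side might look like this (the I2C address is the common default, so check yours with i2cdetect):

import bme280          # the RPi.bme280 module
import smbus2
from flask import Flask

ADDRESS = 0x76  # common BME280 address; check yours with i2cdetect -y 1
bus = smbus2.SMBus(1)
calibration = bme280.load_calibration_params(bus, ADDRESS)

app = Flask(__name__)

@app.route('/')
def readings():
    sample = bme280.sample(bus, ADDRESS, calibration)
    return 'Temperature {:.1f}C, pressure {:.0f}hPa, humidity {:.0f}%'.format(
        sample.temperature, sample.pressure, sample.humidity)

app.run(host='0.0.0.0', port=8000)  # the article's page lives on port 8000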
Finishing a 3D print by applying a thin layer of car body filler and sanding back will give a much smoother surface. This isn’t always necessary, but if your filament is damp or your nozzle is worn, it can make a model look much better when it’s painted
The easiest way to get the babycam.py script to run on boot is to add a line to the rc.local file. Assuming that the babycam.py file is located in your home directory, you should add the line python3 /home/pi/babycam.py & to the rc.local file, just before the line that reads exit 0. It’s very important that you include the ampersand at the end of the line, otherwise the Python script will not be run in a separate process, the rc.local file will never complete, and your Raspberry Pi will never boot.
Tinned Raspberry Pi
With the software and hardware working, you can start putting the case together. You might need to scale the 3D models to suit the tin can you have before you print them out, so measure your tin before you click Print. You’ll also want to remove any inner lip from the top of the can using a can opener, and make a small hole in the side of the can near the bottom for the USB power cable. Next, make a hole in the bottom of the can for the LED cables to pass through.
If you want to add more than a couple of LEDs (or want to use brighter LEDs), you should connect your LEDs to the power input, and use a transistor on the GPIO to trigger them
If you haven’t already done so, solder appropriate leads to your LEDs, and don’t forget to put a 330 Ω resistor in-line on the positive side. The neck of the camera is supported by two lengths of aluminium armature wire. Push the wire up through each of the printed neck pieces, and use a clean soldering iron to weld the pieces together in the middle. Push the neck into the printed top section, and weld into place with a soldering iron from underneath. Be careful not to block the narrow slot with plastic, as this is where the camera cable passes up through the neck and into the camera.
You need to mount the BME280 so that the sensor is exposed to the air in the room. Do this by drilling a small hole in the 3D-printed top piece and hot-gluing the sensor into position. If you’re going to use the optional microphone, you can add an extra hole and glue the mic into place in the same way. A short USB port extender will give you enough cable to plug the USB microphone into the socket on your Raspberry Pi.
Paint the tin can and the 3D-printed parts. We found that spray blackboard paint gives a good effect on 3D-printed parts, and PlastiKote stone effect paint made the tin can look a little more tactile than a flat colour. Once the paint is dry, pass the camera cable up through the slot in the neck, and then apply the heat-shrink tubing to cover the neck with a small gap at the top and bottom. Connect the camera to the top of the cable, and push the front piece on to hold it into place. Glue shouldn’t be necessary, but a little hot glue might help if the front parts don’t hold together well.
Push the power cable through the hole in the case, and secure it with a knot and some hot glue. Leave enough cable free to easily remove the top section from the can in future without stressing the wires.
If you’re having trouble getting the armature wire through the 3D-printed parts, try using a drill to help twist the wire through
This is getting heavy
Glue the bottom section onto the can with hot glue, and hot-glue the LEDs into place on the bottom, feeding the cable up through the hole and into the GPIO header. This is a good time to hot-glue a weight into the bottom of the can to improve its stability. I used an old weight from some kitchen scales, but any small weight should be fine. Finally, fix the Raspberry Pi into place on the top piece by either drilling or gluing, then reconnect the rest of the cables, and push the 3D-printed top section into the tin can. If the top section is too loose, you can add a little bit of hot glue to hold things together once you know everything is working.
With the right type of paint, even old tin cans make a good-looking enclosure for a project
That should be all of the steps complete. Plug in the USB and check the camera from a web browser. The babycam.py script includes video, sensors, and light control. If you are using the optional USB microphone, you can expand the functionality of the app to include audio streaming, use cry detection to activate the LEDs (don’t make the LEDs too stimulating or you’ll never get a night’s sleep again), or maybe even add a Bluetooth speaker and integrate a home assistant.
HackSpace magazine issue 26
HackSpace magazine is out now, available in print from your local newsagent, the Raspberry Pi Store in Cambridge, and online from Raspberry Pi Press.
If you love HackSpace magazine as much as we do, why not have a look at the subscription offers available, including the 12-month deal that comes with a free Adafruit Circuit Playground!
And, as always, you can download the free PDF here.
“In this video,” Floyd explains on YouTube, “I show how to set up a Raspberry Pi 3 as a virtual analogue synthesiser with keyboard and knobs for real-time sound tweaking, using standard MIDI controllers and some very minor shell script editing. The result is a battery-powered mini synth creating quite impressive sounds!”
We know a fair few of you (Raspberry Pi staff included) love dabbling in the world of Raspberry Pi synth sound, so be sure to watch the video to see what Floyd gets up to while turning a Raspberry Pi 3 into a virtual analogue synthesiser.
These Raspberry Pis take hourly photographs of snails in plastic container habitats, sharing them to the Snail Habitat website.
While some might find them kind of icky, I am in love with snails (less so with their homeless cousin, the slug), so this snail habitat project from Mrs Nation’s class is right up my alley.
This project was done in a classroom with 22 students. We broke the kids out into groups and created 5 snail habitats. It would be a great project to do school-wide too, where you create 1 snail habitat per class. This would allow the entire school to get involved and monitor each other’s habitats.
Each snail habitat in Mrs Nation’s class is monitored by a Raspberry Pi and camera module, and Misty Lackie has written specific code to take a photo every hour, uploading the image to the dedicated Snail Habitat website. This allows the class to check in on their mollusc friends without disturbing their environment.
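Misty's actual code is on the project's GitHub repo; the gist of it fits in a few lines. Here's a sketch of the hourly capture-and-upload loop (the upload endpoint is a placeholder):

import time
from datetime import datetime

import requests
from picamera import PiCamera

UPLOAD_URL = 'https://example.com/upload'  # hypothetical endpoint; the real one feeds snailhabitat.com

camera = PiCamera()
while True:
    filename = datetime.now().strftime('snail-%Y%m%d-%H%M.jpg')
    camera.capture(filename)
    with open(filename, 'rb') as photo:
        requests.post(UPLOAD_URL, files={'image': photo})
    time.sleep(3600)  # one photo per hour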
“I would love to see others’ habitats,” Misty states on the project’s GitHub repo, “so if you create one, please share it and I would be happy to publish it on snailhabitat.com.”
Snail facts according to Emma, our resident Bug Doctor
The World Snail Racing Championships take place in Norfolk every year. Emma’s friend took a snail there once, but it didn’t win.
Roman snails, while common in the UK, aren’t native to the country. They were brought to the country by the Romans. Emma is 99% sure this fact is correct.
Garlic snails, when agitated, emit a garlic scent. Helen likes the idea of self-seasoning escargots. Alex is less than convinced.
Snails have no backbone, making them awful wingmen during late-night pub brawls and confrontations.
Does it use a Raspberry Pi, please?
Sometimes we’ll see a project online and find ourselves hoping and praying that it uses a Raspberry Pi, just so we have a reason to share it with you all.
That’s how it was when I saw Cheeseborg, the grilled cheese robot, earlier this week. “Please, please, please…” I prayed to the robot gods, as I chowed down on a grilled cheese at my desk (true story), and, by the grace of all that is good in this world, my plea was answered.
Cheeseborg: the grilled cheese robot
Cheeseborg uses both an Arduino Mega and a Raspberry Pi 3 in its quest to be the best ever automated chef in the world. The Arduino handles the mechanics, while our deliciously green wonder board runs the Google Assistant SDK, allowing you to make grilled cheese via voice command.
Saying “Google, make me a grilled cheese” will set in motion a series of events leading to the production of a perfectly pressed sammie, ideal for soup dunking or solo snacking.
The robot uses a vacuum lifter to pick up a slice of bread, dropping it onto an acrylic tray before repeating the process with a slice of cheese and then a second slice of bread. Then the whole thing is pushed into a panini press that has been liberally coated in butter spray (not shown for video aesthetics), and the sandwich is toasted, producing delicious ooey-gooey numminess out the other side.
Here at Raspberry Pi, we give the Cheeseborg five slices out of five, and look forward to one day meeting Cheeseborg for real, so we can try out its scrummy wares.
Low-cost open labware is a good thing in the world, and I was particularly pleased when micropalaeontologist Martin Tetard got in touch about the Raspberry Pi-based microscope he is developing. The project is called microscoPI (what else?), and it can capture, process, and store images and image analysis results. Martin is engaged in climate research: he uses microscopy to study tiny fossil remains, from which he gleans information about the environmental conditions that prevailed in the far-distant past.
microscoPI is a project that aims to design a multipurpose, open-source, and inexpensive micro-computer-assisted microscope (Raspberry Pi 3). This microscope can automatically take images, process them, and save them together with the results of image analyses on a flash drive. It is multipurpose, as it can be used on various kinds of images.
Martin repurposed an old microscope with a Z-axis adjustable stage for accurate focusing, and sourced an inexpensive X/Y movable stage to allow more accurate horizontal positioning of samples under the camera. He emptied the head of the scope to install a Raspberry Pi Camera Module, and he uses an M12 lens adapter to attach lenses suitable for single-specimen close-ups or for imaging several specimens at once. A Raspberry Pi 3B sits above the head of the microscope, and a 3.5-inch TFT touchscreen mounted on top of the Raspberry Pi allows the user to check images as they are captured and processed.
The Raspberry Pi runs our free operating system, Raspbian, and the free image-processing software ImageJ. Martin and his colleagues use a number of plugins, some of which they developed themselves and some written by others, to support the specific requirements of their research. With this software, microscoPI can capture and analyse microfossil images automatically: it can count particles, including tiny specimens that are touching, analyse their shape and size, and save images and results before prompting the user for the name of the next sample.
microscoPI is compact – less than 30 cm in height – and it’s powered by a battery bank secured under the base of the microscope, so it’s easily portable. The entire build comes in at under 160 euros. You can find out more, and get in touch with Martin, on the microscoPI website.
Chris Aviles, aka the teacher we all wish we’d had when we were at school, discusses how his school in New Jersey is directly linking data with life itself…
Over to you, Chris.
Every year, our students take federal or state-mandated testing, but what significant changes have we made to their education with the results of these tests? We have never collected more data about our students and society in general. The problem is most people and institutions do a poor job interpreting data and using it to make meaningful change. This problem was something I wanted to tackle in FH Grows.
FH Grows is the name of my seventh-grade class, and is a student-run agriculture business at Knollwood Middle School in Fair Haven, New Jersey. In FH Grows, we sell our produce both online and through our student-run farmers markets. Any produce we don’t sell is donated to our local soup kitchen. To get the most out of our school gardens, students have built sensors and monitors using Raspberry Pis. These sensors collect data which then allows me to help students learn to better interpret data themselves and turn it into action.
Turning data into action
In the greenhouse, our gardens, and alternative growing stations (hydroponics, aquaponics, aeroponics) we have sensors that log the temperature, humidity, and other important data points that we want to know about our garden. This data is then streamed in real time, online at FHGrows.com. When students come into the classroom, one of the first things we look at is the current, live data on the site and find out what is going on in our gardens. Over the course of the semester, students are taught about the ideal growing conditions of our garden. When looking at the data, if we see that the conditions in our gardens aren’t ideal, we get to work.
If we see that the greenhouse is too hot, over 85 degrees, students will go and open the greenhouse door. We check the temperature a little bit later, and if it’s still too hot, students will go turn on the fan. But how many fans do you turn on? After experimenting, we know that each fan lowers the greenhouse temperature by 7 to 10 degrees Fahrenheit. Opening the door and turning on both fans can bring a greenhouse that can push close to 100 degrees in late May or early June down to a manageable 80 degrees.
Turning data into action can allow for some creativity as well. Over-watering plants can be a real problem. We found that our plants were turning yellow because we were watering them every day when we didn’t need to. How could we solve this problem and become more efficient at watering? Students built a Raspberry Pi that used a moisture sensor to find out when a plant needed to be watered. We used a plant with the moisture sensor in the soil as our control plant. We figured that if we watered the control plant at the same time we watered all our other plants, when the control plant was dry (gave a negative moisture signal) the rest of the plants in the greenhouse would need to be watered as well.
This method of determining when to water our plants worked well. We rarely ever saw our plants turn yellow from overwatering. Here is where the creativity came in. Since we received a signal from the Raspberry Pi when the soil was not wet enough, we played around with what we could do with that signal. We displayed it on the dashboard along with our other data, but we also decided to make the signal send as an email from the plant. When I showed students how this worked, they decided to write the message from the plant in the first person. Every week or so, we received an email from Carl the Control Plant asking us to come out and water him!
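Chris doesn't reproduce the class's code, but the idea is compact. Here's a sketch of a dry-soil email alert (the sensor pin, addresses, SMTP server, and the sensor's dry-equals-high polarity are all assumptions):

import smtplib
import time
from email.message import EmailMessage

import RPi.GPIO as GPIO

SENSOR_PIN = 21  # hypothetical pin for the digital moisture sensor
GPIO.setmode(GPIO.BCM)
GPIO.setup(SENSOR_PIN, GPIO.IN)

def email_from_carl():
    msg = EmailMessage()
    msg['Subject'] = 'Water me, please!'
    msg['From'] = 'carl@example.com'      # hypothetical addresses and server
    msg['To'] = 'fhgrows@example.com'
    msg.set_content("It's Carl. The soil is dry out here - come water us!")
    with smtplib.SMTP('smtp.example.com') as server:
        server.send_message(msg)

while True:
    if GPIO.input(SENSOR_PIN):  # many digital moisture boards read high when dry
        email_from_carl()
    time.sleep(3600)  # check once an hour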
If students didn’t honour Carl’s request for water, didn’t use data to know when to cool our greenhouse, or hadn’t done the fan experiments to see how much cooler the fans make the greenhouse, all our plants, like the basil we sell to the pizza places in town, would die. This is the beauty of combining data literacy with a school garden: failure to interpret data and then act on that interpretation has real consequences — our produce could die. When it takes 60-120 days to grow the average vegetable, the loss of plants is a significant event. We lose all the time and energy that went into growing those plants, as well as all the revenue they would have brought in for us. Further, I love the urgency that combining data and the school garden creates, because many students have learned the valuable life lesson that not making a decision is making a decision. If students freeze or do nothing when confronted with the data about the garden, that too has consequences.
Using data to spot trends and make predictions
The other major way we use data in FH Grows is to spot trends and make predictions. Different to using data to create the ideal growing conditions in our garden every day, the sensors that we use also provide a way for us to use information about the past to predict the future. FH Grows has about two years’ worth of weather data from our Raspberry Pi weather station (there are guides online if you wish to build a weather station of your own). Using weather data year over year, we can start to determine important events like when it is best to plant our veggies in our garden.
For example, one of the most useful data points on the Raspberry Pi weather station is the ground temperature sensor. Last semester, we wanted to squeeze in a cool weather grow in our garden. This post-winter grow can be done between March and June if you time it right. Getting an extra growing cycle from our garden is incredibly valuable, not only to FH Grows as business (since we would be growing more produce to turn around and sell) but as a way to get an additional learning cycle out of the garden.
So, using two seasons’ worth of ground temperature data, we set out to predict when the ground in our garden would be cool enough to do this cool veggie grow. Students looked at the data we had from our weather station and compared it to different websites that predicted the last frost of the season in our area. We found that the ground right outside our door warmed up two weeks earlier than the more general prediction given by websites. With this information we were able to get a full cool crop grow at a time where our garden used to lay dormant.
We also used our Raspberry Pi to help us predict whether or not it was going to rain over the weekend. Using a Raspberry Pi connected to Weather Underground and previous years’ data, if we believed it would not rain over the weekend we would water our gardens on Friday. If it looked like rain over the weekend, we let Mother Nature water our garden for us. Our prediction using the Pi and previous data was more accurate for our immediate area than compared to the more general weather reports you would get on the radio or an app, since those considered a much larger area when making their prediction.
It seems like we are going to be collecting even more data in the future, not less. It is important that we get our students comfortable working with data. The school garden supported by Raspberry Pi’s amazing ability to collect data is a boon for any teacher who wants to help students learn how to interpret data and turn it into action.
Click here to download the PDF right now. Right this second. If you want to be a love, click here to subscribe, again for free. Subscribers will receive an email when the latest issue is out, and we won’t use your details for anything nasty.
If you’re an educator in the UK, click here and you’ll receive the printed version of Hello World direct to your door. And, guess what? Yup, that’s free too!
What I’m trying to say here is that there is a group of hard-working, passionate educators who take the time to write incredible content for Hello World, for free, and you would be doing them (and us, and your students, kids and/or friends) a solid by reading it 🙂
Normal wind chimes pale in comparison
I don’t like wind chimes. There, I said it. I also don’t like the ticking of the second hand of analogue clocks, and I think these two dislikes might be related. There’s probably a name for this type of dislike, but I’ll leave the Googling to you.
Sound designer Frazer Merrick’s interactive wind chimes may actually be the only wind chimes I can stand. And this is due, I believe, to the wonderful sounds they create when they touch, much more wonderful than regular wind chime sounds. And, obviously, because these wind chimes incorporate a Raspberry Pi 3.
Perpetual Chimes is a set of augmented wind chimes that offer an escapist experience where your collaboration composes the soundscape. Since there is no wind indoors, the chimes require audience interaction to gently tap or waft them and encourage/nurture the hidden sounds within — triggering sounds as the chimes strike one another. Since the chimes make little acoustic noise, essentially they’re broken until you collaborate with them.
This isn’t the first mineral oil bath we’ve seen for the Raspberry Pi, but it’s definitely the first we’ve seen with added fish tank decorations.
Using the see-through casing of an old Apple PowerMac G4, Reddit user u/mjh2901 decided to build a mineral oil tank for their Raspberry Pi, and it looks fabulous. Renamed the Apple Pi, the build uses mineral oil to manage the heat produced by the tech inside: oil can transfer heat up to five times more efficiently than air, and some mineral oil projects use a separate radiator to dissipate the heat back into the air.
So, how did they do it?
“Started with a PowerMac G4 case I previously used as a fish tank, then a candy dish. I had cut a piece of acrylic and glued it into the bottom.”
They then placed a Raspberry Pi 3, attached to a two-line, 16-character LCD, into the tank, along with various decorations, and began to fill the tank with store-bought mineral oil. Once full, the project was complete, the Raspberry Pi forever submerged.
You can find more photos here. But, one question still remains…
Have you ever witnessed something marvellous but, by the time you get your camera out to record it, the moment has passed? Johan Link‘s Film in the Past hat-mounted camera is here to save the day!
Record the past
As 18-year-old student Johan explains, “Imagine you are walking in the street and you see a meteorite in the sky – obviously you don’t have time to take your phone to film it.” While I haven’t seen many meteorites in the sky, I have found myself wishing I’d had a camera to hand more than once in my life – usually when a friend trips over or says something ridiculous. “Fortunately after the passage of the meteorite, you just have to press a button on the hat and the camera will record the last 7 seconds”, Johan continues. “Then you can download the video from an application on your phone.”
Johan’s project, Film in the Past, consists of a Raspberry Pi 3 with USB camera attached, mounted to the peak of a baseball cap.
The camera is always on, and, at the press of a button, will save the last seven seconds of footage to the Raspberry Pi. You can then access the saved footage from an application on your smartphone. It’s a bit like the video capture function on the Xbox One or, as I like to call it, the option to record hilarious glitches during gameplay. But, unlike the Xbox One, it’s a lot easier to get the footage off the Raspberry Pi and onto your phone.
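Johan's build uses a USB camera, but if you fancy trying the same trick with an official Camera Module, the picamera library's circular buffer handles the 'last seven seconds' part almost for free (the button pin is an assumption):

import picamera
from gpiozero import Button

camera = picamera.PiCamera()
stream = picamera.PiCameraCircularIO(camera, seconds=7)  # rolling 7-second buffer
camera.start_recording(stream, format='h264')
button = Button(2)  # hypothetical GPIO pin for the hat's button

while True:
    button.wait_for_press()
    camera.wait_recording(0.5)                 # let the current frame finish
    stream.copy_to('moment.h264', seconds=7)   # dump the last seven seconds to disk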
A fleet of driverless cars working together to keep traffic moving smoothly can improve overall traffic flow by at least 35 percent, researchers have shown. The researchers, from the University of Cambridge, programmed a small fleet of miniature robotic cars to drive on a multi-lane track and observed how the traffic flow changed when one of the cars stopped.
So long, traffic!
By using Raspberry Pis and onboard sensors to program scale-model versions of commercially available cars, undergraduate researchers have built a fleet of driverless cars that ‘talk to each other’. They did this because they are studying how driverless technology can help reduce traffic incidents on our roads.
The researchers investigated how a car stalled on a multi-lane track affects the buildup of traffic, and how communication between driverless cars can prevent these buildups.
When the cars acted independently of each other, a stalled car caused other vehicles in the same lane to slow or stop in order to merge into the adjacent lane. This soon led to queues forming along the track. But when the cars communicated via Raspberry Pis, they could tell each other about obstacles on the track, and this allowed cars to shift lanes with the cooperation of other road users.
The researchers recently presented their paper on the subject at the International Conference on Robotics and Automation (ICRA 2019) in Montréal, Canada. You can find links to their results, plus more information, on the University of Cambridge blog.