“Wait, I didn’t know it was a computer. It’s an actual computer computer. What?!”
The eyes are ping pong balls cut in half so you can fit a Raspberry Pi Camera Module inside them. (Don’t forget to make a hole in the ‘pupil’ so the lens can peek through).
The Raspberry Pi and display screen are neatly mounted on the side of the Macintosh so they’re easily accessible should you need to make any changes.
All the hacked, repurposed junky bits sit inside or are mounted on swish 3D-printed parts.
Add some joke shop chatterbox teeth, and you’ve got what looks like the innards of a Furby staring at you. See below for a harrowing snapshot of Zach’s ‘Furlexa’ project, featured on our blog last year. We still see it when we sleep.
It wasn’t enough for Furby-mad Sam to have created a Furby look-a-like face-tracking robot; he needed to go further. Inside the clear Macintosh case, you can see a de-furred Furby skeleton atop a 3D-printed plinth, with redundant ribbon cables flowing from its eyes into the back of the face-tracking robot face, thus making it appear as though the Furby is the brains behind this creepy creation that is following your every move.
The Edwards Lab at the University of Reading has developed a flexible, low-cost, open source lab robot for capturing images of microbiology samples with a Raspberry Pi camera module. It’s called POLIR, for Raspberry Pi camera Open-source Laboratory Imaging Robot. Here’s a timelapse video of them assembling it.
Measuring antibiotic resistance with colour-changing dye
The robot is useful for all kinds of microbiology imaging, but at the moment the lab is using it to measure antimicrobial resistance in bacteria. They’re doing this by detecting the colour change in a dye called resazurin, which changes from blue to pink in the presence of metabolically active cells: if bacteria incubated with antibiotics grow, their metabolic activity causes the dye to turn pink. However, if the antibiotics stop or impede the growth of the bacteria, their lower levels of metabolic activity will cause less colour change, or none at all. In the photo below, the colourful microtitre plate holds bacterial samples with and without resistance to the antibiotics against which they’re being tested.
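The blue-to-pink colour shift described above lends itself to simple image analysis. As a hedged sketch (this is an illustration of the principle, not the Edwards Lab's actual analysis code), a well can be classified from the mean colour of its pixels: metabolically active samples push the red channel above the blue one.

```python
import numpy as np

def resazurin_state(rgb_pixels):
    """Classify a well as 'pink' (growth, i.e. resistant bacteria)
    or 'blue' (growth inhibited) from the mean RGB of its pixels."""
    r, g, b = np.asarray(rgb_pixels, dtype=float).reshape(-1, 3).mean(axis=0)
    return "pink" if r > b else "blue"

# synthetic wells: a resistant sample (dye reduced to pink) and a
# susceptible one (dye still blue)
resistant = [[200, 80, 120]] * 10
susceptible = [[60, 90, 200]] * 10
print(resazurin_state(resistant), resazurin_state(susceptible))
```

On real plate images you would first segment each well, then apply a classifier like this per well; thresholding on the red/blue ratio rather than a hard comparison would make it more robust to lighting.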
POLIR, an open source 3D printer-based Raspberry Pi lab imaging robot
An imaging system based on 3D-printer designs
The researchers adapted existing open source 3D printer designs and used v-slot aluminium extrusion with custom 3D-printed joints to make a frame. Instead of a printer extrusion head, a Raspberry Pi and camera module are mounted on the frame. An Arduino running open-source Repetier software controls x-y-z stepper motors to adjust the position of the computer and camera.
Front and top views of POLIR
Open-source OctoPrint software controls the camera position by supplying scripts from the Raspberry Pi to the Arduino. OctoPrint also allows remote access and control, which gives researchers flexibility in when they run experiments and check progress. Images are acquired using a Python script configured with the appropriate settings (eg image exposure), and are stored on the Raspberry Pi’s SD card. From there, they can be accessed via FTP.
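A capture script of the kind described would fix the camera settings (so exposure is consistent across a timelapse) and write timestamped files to the SD card. Here's a minimal sketch, assuming a picamera-style camera object; the settings and naming scheme are illustrative, not the Reading team's actual values.

```python
from datetime import datetime

# fixed settings so frames are comparable across a timelapse
# (illustrative values, not the lab's actual configuration)
SETTINGS = {"iso": 100, "shutter_speed": 8000, "awb_mode": "off"}

def frame_filename(sample, t=None):
    """Timestamped name so frames sort chronologically on the SD card."""
    t = t or datetime(2019, 1, 1, 12, 0, 0)
    return f"{sample}_{t:%Y%m%d_%H%M%S}.jpg"

def capture(camera, sample):
    """Apply fixed settings and capture one frame.

    `camera` is assumed to behave like picamera.PiCamera; on a Pi you
    would pass a real camera and the file lands on the SD card,
    retrievable later over FTP as described above."""
    for key, value in SETTINGS.items():
        setattr(camera, key, value)
    name = frame_filename(sample, datetime.now())
    camera.capture(name)
    return name

print(frame_filename("comb01"))
```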
More flexibility, lower cost
Off-the-shelf lab automation systems are extremely expensive and remain out of the reach of most research groups. POLIR cost just £600.
The system has a number of advantages over higher-cost off-the-shelf imaging systems. One is its flexibility: the robot can image a range of sample formats, including agar plates like those in the video above, microtitre plates like the one in the first photograph, and microfluidic “lab-on-a-comb” devices. A comb looks much like a small, narrow rectangle of clear plastic with striations running down its length; each striation is a microcapillary with capacity for a 1μl sample, and each comb has ten microcapillaries. These microfluidic devices let scientists run experiments on a large number of samples at once, while using a minimum of space on a lab bench, in an incubator, or in an imaging robot like POLIR.
POLIR accommodates 2160 individual capillaries and a 96-well plate, with room to spare
High spatial and temporal resolution
For lab-on-a-comb images, POLIR gives the Reading team four times the spatial resolution they get with a static camera. The moveable Raspberry Pi camera with a short focus yields images with 6 pixels per capillary, compared to 1.5 pixels per capillary using a $700 static Canon camera with a macro lens.
Because POLIR is automated, it brings higher temporal resolution within reach, too. A non-automated system, by contrast, can only be used for timelapse imaging if a researcher repeatedly intervenes at fixed time intervals. Capturing kinetic data with timelapse imaging is valuable because it can be significant if different samples reach the same endpoint but at different rates, and because some dyes can give a transient signal that would be missed by an endpoint measurement alone.
Dr Alexander Edwards of the University of Reading comments:
We built the robot with a simple purpose, to make antimicrobial resistance testing more robust without resorting to expensive and highly specialised lab equipment […] The beauty of the POLIR kit is that it’s based on open source designs and we have likewise published our own designs and modifications, allowing everyone and anyone to benefit from the original design and the modifications in other contexts. We believe that open source hardware is a game changer that will revolutionise microbiological and other life science lab work by increasing data production whilst reducing hands-on labour time in the lab.
Cornell University: ECE 5725 Michael Xiao and Thomas Scavella
Building a Raspberry Pi laser scanner
The ingredients you’ll need to build the laser scanner are:
Raspberry Pi
Raspberry Pi Camera Module v2
Line laser
Stepper motor and driver
Various LEDs, resistors, and wires
To complete the build, access to a 3D printer and laser cutter would come in handy. If you don’t have access to such tools, we trust you to think of an alternative housing for the scanner. You’re a maker, you’re imaginative — it’s what you do.
How does the laser scanner work?
The line laser projects a line onto an object, highlighting a slice of it. The Raspberry Pi Camera Module captures this slice, recording the shape of the laser line on the object’s surface. Then the stepper motor rotates the object. When the object has completed a full rotation and the camera has taken an image of every slice, the Raspberry Pi processes all the images to create one virtual 3D object.
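The core of each slice's processing is finding where the laser line sits in the image: the horizontal displacement of the bright line encodes the surface depth. A hedged sketch of that extraction step (an illustration of the technique, not mfx2's actual code):

```python
import numpy as np

def laser_profile(frame):
    """For each image row, return the column of the brightest red pixel,
    i.e. where the laser line falls on the object's surface. The offset
    of each column from the laser's baseline position gives the depth of
    that point by triangulation."""
    red = np.asarray(frame, dtype=float)[:, :, 0]  # red channel only
    return red.argmax(axis=1)

# synthetic 4x5 RGB frame with the 'laser' lighting up column 2
frame = np.zeros((4, 5, 3))
frame[:, 2, 0] = 255
print(laser_profile(frame).tolist())  # -> [2, 2, 2, 2]
```

Repeating this for every rotation step yields one depth profile per angle, which can then be assembled into a cylindrical point cloud.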
Instructables user mfx2 has written a wonderful tutorial for the project, which also includes all files needed to build and program your own version.
If you own a 3D printer, you’ll likely have at least heard of OctoPrint from the ever benevolent 3D printing online community. It has the potential to transform your 3D printing workflow for the better, and it’s very easy to set up. This guide will take you through the setup process step by step, and give you some handy tips along the way.
Before we start finding out how to install OctoPrint, let’s look at why you might want to. OctoPrint is a piece of open-source software that allows us to add WiFi functionality to any 3D printer with a USB port (which is pretty much all of them). More specifically, you’ll be able to drop files from your computer onto your printer, start/stop prints, monitor your printer via a live video feed, control the motors, control the temperature, and more, all from your web browser. Of course, with great power comes great responsibility — 3D printers have parts that are hot enough to cause fires, so make sure you have a safe setup, which may include not letting it run unsupervised.
• Raspberry Pi 3 (or newer)
• MicroSD card
• Raspberry Pi power adapter
• USB cable (the connector type will depend on your printer)
• Webcam/Raspberry Pi Camera Module (optional)
• 3D-printed camera mount (optional)
Before we get started, we don’t recommend using anything less powerful than a Raspberry Pi 3 for this project. There have been reports of limited success using OctoPrint on a Raspberry Pi Zero W, but only if you have no intention of using a camera to monitor your prints. If you want to try this with a Pi Zero or an older Raspberry Pi, you may experience unexpected print failures.
Firstly, you will need to download the latest version of OctoPi from the OctoPrint website. OctoPi is a Raspbian distribution that comes with OctoPrint, video streaming software, and CuraEngine for slicing models on your Raspberry Pi. When this has finished downloading, unzip the file and put the resulting IMG file somewhere handy.
Next, we need to flash this image onto our microSD card. We recommend using Etcher to do this, due to its minimal UI and ease of use; plus it’s also available to use on both Windows and Mac. Get it here: balena.io/etcher. When Etcher is installed and running, you’ll see the UI displayed. Simply click the Select Image button and find the IMG file you unzipped earlier. Next, put your microSD card into your computer and select it in the middle column of the Etcher interface.
Finally, click on Flash!, and while the image is being burned onto the card, get your WiFi router details, as you’ll need them for the next step.
Now that you have your operating system, you’ll want to add your WiFi details so that the Raspberry Pi can automatically connect to your network after it’s booted. To do this, remove the microSD card from your computer (Etcher will have ‘ejected’ the card after it has finished burning the image onto it) and then plug it back in again. Navigate to the microSD card on your computer — it should now be called boot — and open the file called octopi-wpa-supplicant.txt. Editing this file using WordPad or TextEdit can cause formatting issues; we recommend using Notepad++ to update this file, but there are instructions within the file itself to mitigate formatting issues if you do choose to use another text editor. Find the section that begins ## WPA/WPA2 secured and remove the hash signs from the four lines below this one to uncomment them. Finally, replace the SSID value and the PSK value with the name and password for your WiFi network, respectively (keeping the quotation marks). See the example below for how this should look.
Further down in the file, there is a section for what country you are in. If you are using OctoPrint in the UK, leave this as is (by default, the UK is selected). However, if you wish to change this, simply comment the UK line again by adding a # before it, and uncomment whichever country you are setting up OctoPrint in. The example below shows how the file will look if you are setting this up for use in the US:
# Uncomment the country your Pi is in to activate Wifi in RaspberryPi 3 B+ and above
# For full list see: https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2
#country=GB # United Kingdom
#country=CA # Canada
#country=DE # Germany
#country=FR # France
country=US # United States
When the changes have been made, save the file and then eject/unmount and remove the microSD card from your computer and put it into your Raspberry Pi. Plug the power supply in, and go and make a cup of tea while it boots up for the first time (this may take around ten minutes). Make sure the Raspberry Pi is running as expected (i.e. check that the green status LED is flashing intermittently). If you’re using macOS, visit octopi.local in your browser of choice. If you’re using Windows, you can find OctoPrint by clicking on the Network tab in the sidebar. It should be called OctoPrint instance on octopi – double-clicking on this will open the OctoPrint dashboard in your browser.
If you see the screen shown above, then congratulations! You have set up OctoPrint.
Not seeing that OctoPrint splash screen? Fear not, you are not the first. While a full list of issues is beyond the scope of this article, common issues include: double-checking your WiFi details are entered correctly in the octopi-wpa-supplicant.txt file, ensuring your Raspberry Pi is working correctly (plug the Raspberry Pi into a monitor and watch what happens during boot), or your Raspberry Pi may be out of range of your WiFi router. There’s a detailed list of troubleshooting suggestions on the OctoPrint website.
Printing with OctoPrint
We now have the opportunity to set up OctoPrint for our printer using the handy wizard. Most of this is very straightforward — setting up a password, signing up to send anonymous usage stats, etc. — but there are a few sections which require a little more thought.
We recommend enabling the connectivity check and the plug-ins blacklist to help keep things nice and stable. If you plan on using OctoPrint as your slicer as well as a monitoring tool, then you can use this step to import a Cura profile. However, we recommend skipping this step as it’s much quicker (and you can use a slicer of your choice) to slice the model on your computer, and then send the finished G-code over.
Finally, we need to put in our printer details. Above, we’ve included some of the specs of the Creality Ender-3 as an example. If you can’t find the exact details of your printer, a quick web search should show what you need for this section.
The General tab can contain anything; it’s just an identifier for your own use. Print bed & build volume should be easy to find out — if not, you can measure your print bed and find the position of the origin by looking at your Cura printer profile. Leave Axes at the defaults; for the Hotend and extruder section, the defaults are almost certainly fine (unless you’ve changed your nozzle; 0.4 mm is the default diameter for most consumer printers).
OctoPrint is better with a camera
Now that you’re set up with OctoPrint, you’re ready to start printing. Turn off your Raspberry Pi, then plug it into your 3D printer. After it has booted up, open OctoPrint again in your browser and take your newly WiFi-enabled printer for a spin by clicking the Connect button. After it has connected, you’ll be able to set the hot end and bed temperature, then watch as the real-time readings are updated.
In the Control tab, we can see the camera stream (if you’re using one) and the motor controls, as well as commands to home the axes. There’s a G-code file viewer to look through a cross-section of the currently loaded model, and a terminal to send custom G-code commands to your printer. The last tab is for making time-lapses; however, there is a plug-in available to help with this process.
Undoubtedly the easiest way to set up video monitoring of your prints is to use the official Raspberry Pi Camera Module. There are dozens of awesome mounts on Thingiverse for a Raspberry Pi Camera Module, to allow you to get the best angle of your models as they print. There are also some awesome OctoPrint-themed Raspberry Pi cases to house your new printer brains. While it isn’t officially supported by OctoPrint, you can use a USB webcam instead if you have one handy, or just want some very high-quality video streams. The OctoPrint wiki has a crowdsourced list of webcams known to work, as well as a link for the extra steps needed to get the webcam working correctly.
As mentioned earlier, our recommended way of printing a model using OctoPrint is to first use your slicer as you would if you were creating a file to save to a microSD card. Once you have the file, save it somewhere handy on your computer, and open the OctoPrint interface. In the bottom left of the screen, you will see the Upload File button — click this and upload the G-code you wish to print.
You’ll see the file/print details appear, including information on how long it’ll take for the object to print. Before you kick things off, check out the G-code Viewer tab on the right. You can not only scroll through the layers of the object, but, using the slider at the bottom, you can see the exact pattern the 3D printer will use to ‘draw’ each layer. Now click Print and watch your printer jump into action!
OctoPrint has scores of community-created plug-ins, but our favourite, Octolapse, makes beautiful hypnotic time-lapses. What makes them so special is that the plug-in alters the G-code of whatever object you are printing so that once each layer has finished, the extruder moves away from the print to let the camera take an unobstructed shot of the model. The result is an object that seems to grow out of the build plate as if by magic. You’ll not find a finer example of it than here.
3D Printing timelapses of models printed on the Prusa i3 MK3! Here’s another compilation of my recent timelapses. I got some shots that I think came out really great, and I hope you enjoy them! As always, if you want to see some of these timelapses before they come out, or want to catch some behind-the-scenes action, check out my Instagram!
Thanks to Glenn and HackSpace magazine
This tutorial comes fresh from the pages of HackSpace magazine issue 26 and was written by Glenn Horan. Thanks, Glenn.
Time to pull out all the stops for the biggest Super Make Something project to date! Using 3D printing, laser cutting, a Raspberry Pi, computer vision, Python, and nearly 600 Neopixel LEDs, I build a low resolution LED mirror that displays your reflection on a massive 3 foot by 3 foot grid made from an array of 24 by 24 RGB LEDs!
If you’re into cool uses of tech, you may be aware of Daniel Rozin, the creative artist building mechanical mirrors out of wooden panels, trash, and…penguins, to name but a few of his wonderful builds.
Yup, this is a mechanical mirror made of toy penguins.
A digital mechanical mirror?
Inspired by Daniel Rozin’s work, Alex, the person behind Super Make Something, put an RGB LED spin on the concept, producing this stunning mirror that thoroughly impressed visitors at Cleveland Maker Faire last month.
“Inspired by Danny Rozin’s mechanical mirrors, this 3 foot by 3 foot mirror is powered by a Raspberry Pi, and uses Python and OpenCV computer vision libraries to process captured images in real time to light up 576 individual RGB LEDs!” Alex explains on Instagram. “Also onboard are nearly 600 3D-printed squares to diffuse the light from each NeoPixel, as well as 16 laser-cut panels to hold everything in place!”
The video above gives a brilliantly detailed explanation of how Alex made the mirror, so we highly recommend giving it a watch if you’re feeling inspired to make your own.
Seriously, we really want to make one of these for Raspberry Pi Towers!
As always, be sure to subscribe to Super Make Something on YouTube and leave a comment on the video if, like us, you love the project. Most online makers are producing content such as this with very little return on their investment, so every like and subscriber really does make a difference.
Last week, lots and lots of you shared your Raspberry Pi builds with us on social media using the hashtag #IUseMyRaspberryPiFor. Jay Wainwright from Liverpool noticed the conversation and got in touch to tell us about The Nest Box, which uses Raspberry Pi to bring impressively high-quality images and video from British bird boxes to your Facebook feed.
Jay runs a small network of livestreaming nest box cameras, with three currently sited and another three in the pipeline; excitingly, the new ones will include a kestrel box and a barn owl box! During the spring, all the cameras stream live to The Nest Box’s Facebook page, which has steadily built a solid following of several thousand wildlife fans.
The Nest Box’s setup uses a Raspberry Pi and Camera Module, along with a Raspberry Pi PoE HAT to provide both power and internet connectivity, so there’s only one cable connection to weatherproof. There’s also a custom HAT that Jay has designed to control LED lights and to govern the Raspberry Pi Camera Module’s IR filter, ensuring high-quality images both during the day and at night. To top it all off, he has written some Python code to record visitors to the nest boxes and go into live streaming mode whenever the action is happening.
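The "go live when the action is happening" behaviour described above usually comes down to a motion trigger. As a hedged sketch (an illustration of the common frame-differencing approach, not Jay's actual code), the Pi can compare consecutive greyscale frames and start streaming when enough pixels change:

```python
import numpy as np

def visitor_detected(prev, curr, threshold=25, min_fraction=0.02):
    """Crude frame-differencing trigger: report motion when more than
    min_fraction of pixels change by more than `threshold` between two
    consecutive greyscale frames. Thresholds are illustrative."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return (diff > threshold).mean() > min_fraction

# synthetic frames: an empty box, then a bright 'bird' entering
empty = np.zeros((10, 10), dtype=np.uint8)
bird = empty.copy()
bird[2:6, 2:6] = 200
print(visitor_detected(empty, empty), visitor_detected(empty, bird))
```

On the real camera, infrared-lit night frames are noisier, so in practice you'd smooth each frame and tune the thresholds per box.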
As we can see from this nest box design for swifts, shown on the project’s crowdfunding profile, plenty of thought has evidently been put into the design of the boxes so that they provide tempting quarters for their feathered occupants while also accommodating all the electronic components.
Follow The Nest Box on Facebook to add British birds into your social media mix — whatever you’ve got now, I’ll bet all tomorrow’s coffees that it’ll be an improvement. And if you’re using Raspberry Pi for a wildlife project, or you’ve got plans along those lines, let us know in the comments.
These Raspberry Pis take hourly photographs of snails in plastic container habitats, sharing them to the Snail Habitat website.
While some might find them kind of icky, I am in love with snails (less so with their homeless cousin, the slug), so this snail habitat project from Mrs Nation’s class is right up my alley.
This project was done in a classroom with 22 students. We broke the kids out into groups and created 5 snail habitats. It would be a great project to do school-wide too, where you create 1 snail habitat per class. This would allow the entire school to get involved and monitor each other’s habitats.
Each snail habitat in Mrs Nation’s class is monitored by a Raspberry Pi and camera module, and Misty Lackie has written specific code to take a photo every hour, uploading the image to the dedicated Snail Habitat website. This allows the class to check in on their mollusc friends without disturbing their environment.
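The hourly capture-and-upload loop is simple enough to sketch. This is a hedged illustration of the pattern, not Misty's actual code; the upload endpoint and the capture/upload callables are placeholders (on the Pi, `capture` might wrap `raspistill` and `upload` might be a `requests.post`).

```python
import time

UPLOAD_URL = "https://snailhabitat.example/upload"  # hypothetical endpoint

def capture_and_upload(capture, upload, interval=3600, cycles=None):
    """Capture a photo and upload it once per `interval` seconds.

    `cycles=None` runs forever (the real deployment); passing a number
    makes the loop finite so it can be dry-run and tested."""
    n = 0
    while cycles is None or n < cycles:
        path = capture()            # e.g. snap a photo with the camera
        upload(UPLOAD_URL, path)    # e.g. POST the image to the website
        n += 1
        if cycles is None or n < cycles:
            time.sleep(interval)
    return n

# dry run with stand-ins and no waiting
print(capture_and_upload(lambda: "snail.jpg", lambda url, path: None,
                         interval=0, cycles=3))
```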
“I would love to see others’ habitats,” Misty states on the project’s GitHub repo, “so if you create one, please share it and I would be happy to publish it on snailhabitat.com.”
Snail facts according to Emma, our resident Bug Doctor
The World Snail Racing Championships take place in Norfolk every year. Emma’s friend took a snail there once, but it didn’t win.
Roman snails, while common in the UK, aren’t native to the country. They were brought to the country by the Romans. Emma is 99% sure this fact is correct.
Garlic snails, when agitated, emit a garlic scent. Helen likes the idea of self-seasoning escargots. Alex is less than convinced.
Snails have no backbone, making them awful wingmen during late-night pub brawls and confrontations.
When we invited Estefannie Explains It All to present at Coolest Projects International, she decided to make something cool with a Raspberry Pi to bring along. But being Estefannie, she didn’t just make something a little bit cool. She went ahead and made Raspberry Pi Zero-powered Jurassic Park goggles, or, as she calls them, the world’s first globally triggered, mass broadcasting, photon-emitting and -collecting head unit.
Is it heavy? Yes. But these goggles are not expensive. Follow along as I make the classic Jurassic Park Goggles from scratch!! The 3D Models: https://www.thingiverse.com/thing:3732889 My code: https://github.com/estefanniegg/estefannieExplainsItAll/blob/master/makes/JurassicGoggles/jurassic_park.py Thank you Coolest Projects for bringing me over to speak in Ireland!! https://coolestprojects.org/ Thank you Polymaker for sending me the Polysher and the PolySmooth filament!!!!
3D-printing, sanding, and sanding
Estefannie’s starting point was the set of excellent 3D models of the iconic goggles that Jurassicpaul has kindly made available on Thingiverse. There followed several 3D printing attempts and lots of sanding, sanding, sanding, spray painting, and sanding, then some more printing with special Polymaker filament that can be ethanol polished.
Adding the electronics and assembling the goggles
Estefannie soldered rings of addressable LEDs and created custom models for 3D-printable pieces to fit both them and the goggles. She added a Raspberry Pi Zero, some more LEDs and buttons, an adjustable headgear part from a welding mask, and – importantly – four circles of green acetate. After quite a lot of gluing, soldering, and wiring, she ended up with an entirely magnificent set of goggles.
Here, they’re modelled magnificently by Raspberry Pi videographer Brian. I think you’ll agree he cuts quite a dash.
Coding and LED user interface
Estefannie wrote a Python script to interact with Twitter, take photos, and provide information about the goggles’ current status via the LED rings. When Estefannie powers up the Raspberry Pi, it runs a script on startup and connects to her phone’s wireless hotspot. A red LED on the front of the goggles indicates that the script is up and running.
Once it’s running, pressing a button at the back of the head unit makes the Raspberry Pi search Twitter for mentions of @JurassicPi. The LEDs light up green while it searches, just like you remember from the film. If Estefannie’s script finds a mention, the LEDs flash white and the Raspberry Pi camera module takes a photo. Then they light up blue while the script tweets the photo.
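That search-flash-capture-tweet sequence maps neatly onto a small state machine, with the LED colour tracking each stage. Here's a hedged sketch of one button-press cycle; the function names and structure are illustrative, not Estefannie's actual script (which is on her GitHub).

```python
def goggles_cycle(search_mentions, take_photo, tweet, set_leds):
    """One button-press cycle of the goggles, with the LED ring colour
    reporting the current stage. The collaborators passed in stand for
    the Twitter client, camera, and NeoPixel driver on the real build."""
    set_leds("green")                      # searching, Jurassic Park style
    mentions = search_mentions("@JurassicPi")
    if not mentions:
        return None                        # nothing to do this cycle
    set_leds("white")                      # flash while the camera fires
    photo = take_photo()
    set_leds("blue")                       # tweeting the photo back
    tweet(photo, reply_to=mentions[0])
    return photo

# dry run with stand-ins, recording the LED sequence
states = []
print(goggles_cycle(lambda q: ["tweet123"], lambda: "img.jpg",
                    lambda photo, reply_to: None, states.append))
print(states)
```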
All the code is available on Estefannie’s GitHub. I love this project – I love the super clear, simple user experience provided by the LED rings, and there’s something I find really appealing about the asynchronous Twitter interaction, where you mention @JurassicPi and then get an image later, the next time the goggles are turned on.
Extra bonus Coolest Projects
If you read the beginning of this post and thought, “wait, what’s Coolest Projects?” then be sure to watch to the end of Estefannie’s video to catch her excellent Coolest Projects mini vlog. And then sign up for updates about Coolest Projects events near you, so you can join in next year, or help a team of young people to join in.
Plant scientists and agronomists use growth chambers to provide consistent growing conditions for the plants they study. This reduces confounding variables – inconsistent temperature or light levels, for example – that could render the results of their experiments less meaningful. To make sure that conditions really are consistent both within and between growth chambers, which minimises experimental bias and ensures that experiments are reproducible, it’s helpful to monitor and record environmental variables in the chambers.
In a recent paper in Applications in Plant Sciences, Brandin Grindstaff and colleagues at the universities of Missouri and Arizona describe how they developed Growth Monitor pi, or GMpi: an affordable growth chamber monitor that provides wider functionality than other devices. As well as sensing growth conditions, it sends the gathered data to cloud storage, captures images, and generates alerts to inform scientists when conditions drift outside of an acceptable range.
The authors emphasise – and we heartily agree – that you don’t need expertise with software and computing to build, use, and adapt a system like this. They’ve written a detailed protocol and made available all the necessary software for any researcher to build GMpi, and they note that commercial solutions with similar functionality range in price from $10,000 to $1,000,000 – something of an incentive to give the DIY approach a go.
The team used open-source app Rclone to upload sensor data to a cloud service, choosing Google Drive since it’s available for free. To alert users when growing conditions fall outside of a set range, they use the incoming webhooks app to generate notifications in a Slack channel. Sensor operation, data gathering, and remote monitoring are supported by a combination of software that’s available for free from the open-source community and software the authors developed themselves. Their package GMPi_Pack is available on GitHub.
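The alerting half of that pipeline is easy to sketch: compare each reading against its acceptable range, and post a message to a Slack incoming webhook when it drifts outside. This is a hedged illustration of the pattern, not GMPi_Pack's actual code; the webhook URL and thresholds are placeholders.

```python
import json

WEBHOOK = "https://hooks.slack.com/services/XXX"  # your incoming webhook URL

def check_and_alert(reading, low, high, post=None):
    """Return an alert message (and optionally post it to Slack) when a
    sensor reading falls outside [low, high]; return None otherwise.

    `post` stands in for an HTTP client, e.g.
    requests.post(WEBHOOK, data=json.dumps(payload)) on the real system."""
    if low <= reading["value"] <= high:
        return None
    text = f"{reading['name']} out of range: {reading['value']}"
    if post:
        post(WEBHOOK, json.dumps({"text": text}))
    return text

print(check_and_alert({"name": "temperature", "value": 35.2}, 20, 30))
```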
With a bill of materials amounting to something in the region of $200, GMpi is another excellent example of affordable, accessible, customisable open labware that’s available to researchers and students. If you want to find out how to build GMpi for your lab, or just for your greenhouse, Affordable remote monitoring of plant growth in facilities using Raspberry Pi computers by Brandin et al. is available on PubMed Central, and it includes appendices with clear and detailed set-up instructions for the whole system.
Low-cost open labware is a good thing in the world, and I was particularly pleased when micropalaeontologist Martin Tetard got in touch about the Raspberry Pi-based microscope he is developing. The project is called microscoPI (what else?), and it can capture, process, and store images and image analysis results. Martin is engaged in climate research: he uses microscopy to study tiny fossil remains, from which he gleans information about the environmental conditions that prevailed in the far-distant past.
microscoPI is a project that aims to design a multipurpose, open-source, and inexpensive micro-computer-assisted microscope (Raspberry Pi 3). This microscope can automatically take images, process them, and save them, together with the results of image analyses, on a flash drive. It is multipurpose, as it can be used on various kinds of images (e.g.
Martin repurposed an old microscope with a Z-axis adjustable stage for accurate focusing, and sourced an inexpensive X/Y movable stage to allow more accurate horizontal positioning of samples under the camera. He emptied the head of the scope to install a Raspberry Pi Camera Module, and he uses an M12 lens adapter to attach lenses suitable for single-specimen close-ups or for imaging several specimens at once. A Raspberry Pi 3B sits above the head of the microscope, and a 3.5-inch TFT touchscreen mounted on top of the Raspberry Pi allows the user to check images as they are captured and processed.
The Raspberry Pi runs our free operating system, Raspbian, and free image-processing software ImageJ. Martin and his colleagues use a number of plugins, some developed themselves and some by others, to support the specific requirements of their research. With this software, microscoPI can capture and analyse microfossil images automatically: it can count particles, including tiny specimens that are touching, analyse their shape and size, and save images and results before prompting the user for the name of the next sample.
microscoPI is compact – less than 30cm in height – and it’s powered by a battery bank secured under the base of the microscope, so it’s easily portable. The entire build comes in at under 160 Euros. You can find out more, and get in touch with Martin, on the microscoPI website.
Wanting to break from the standard practice of updating old analogue cameras with digital technology, Alan Wang decided to retrofit a broken vintage camera flash with a Raspberry Pi Zero W to produce a video-capturing action cam.
Full story of this project: https://www.hackster.io/alankrantas/raspberry-pi-zero-flash-cam-359875
By hacking a somewhat gnarly hole into the body of the broken flash unit, Alan fit in the Raspberry Pi Zero W and Camera Module, along with a few other components. He powers the whole unit via a USB power bank.
At every touch of the onboard touchpad, the retrofit camera films 12 seconds of footage and saves it as an MP4 file on the onboard SD card or an optional USB flash drive.
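Touch-triggered clip recording like this is typically a short record-then-convert routine: the Pi camera records raw H.264, which is then wrapped into an MP4 container. Here's a hedged sketch of that routine, assuming a picamera-style camera object; it's an illustration of the pattern, not Alan's actual code, and `convert` stands in for a shell call to a muxing tool such as MP4Box.

```python
def record_clip(camera, convert, stem="clip", seconds=12):
    """Record a fixed-length H.264 clip and convert it to MP4.

    `camera` is assumed to behave like picamera.PiCamera;
    `convert(src, dst)` wraps the raw stream into an MP4 container."""
    h264 = f"{stem}.h264"
    camera.start_recording(h264)
    camera.wait_recording(seconds)   # record for N seconds
    camera.stop_recording()
    return convert(h264, f"{stem}.mp4")

# dry run with a stand-in camera and converter
class FakeCam:
    def start_recording(self, path): self.path = path
    def wait_recording(self, seconds): pass
    def stop_recording(self): pass

print(record_clip(FakeCam(), lambda src, dst: dst))
```

On the real build, this function would be wired to the touchpad's event callback, with `seconds=12` matching the behaviour described above.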
While the project didn’t technically bring the flash unit back to life — as the flash function is still broken — it’s a nice example of upcycling old tech, and it looks pretty sweet. Plus, you can attach it to your existing film camera to produce some cool side-by-side comparison imagery, as seen in the setup above.
How to build a night vision camera: a video showing the process, and the problems I came across when building this camera
Raspberry Pi night vision camera
Built into the body of an old camera flash, Dan’s Raspberry Pi night vision camera is a homage to a childhood spent sneaking around the levels of Splinter Cell. Says Dan:
The iconic image from the game is the night vision goggles that Sam Fisher wears. I have always been fascinated by the idea that you can see in the dark and this formed the foundation of my idea to build a portable hand-held night vision piece of equipment.
The camera, running on Raspbian, boasts several handy functions, including touchscreen controls courtesy of the Pimoroni HyperPixel, realtime video and image capture, and a viewing distance of two to five metres.
It’s okay to FAIL
Embracing the FAIL (First Attempt In Learning) principle, Dan goes into detail about the issues he had to overcome while building the camera, which is another reason why we really enjoyed this project. It’s okay to fail when trying your hand at digital making, because you learn from your mistakes! Dan’s explanations of the struggles he faced and how he overcame them are well worth a read.
Time-lapse over a Finnish lake from July 2019. Shot with a DIY all-weather HDR time-lapse camera built from a ZWO ASI 224MC and a Raspberry Pi 3. The camera was built to function as an all-sky camera for recording the night sky year round, but since the stars were not visible in Finland in July, I decided to test it aimed horizontally over a lake, and was positively surprised by the results.
Filmed over 6 days using a Raspberry Pi Zero W and Raspberry Pi Camera. One photo was taken every 5 minutes and played back at 24 fps. I removed the night-time photos, and then the images were stitched together using the ‘Stop Motion’ app on an iPhone.
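The numbers in that caption are easy to sanity-check: one photo every 5 minutes over 6 days gives 1,728 frames, or 72 seconds of footage at 24 fps (less in the final video, since the night-time photos were removed):

```python
days = 6
photos_per_hour = 60 // 5            # one photo every 5 minutes -> 12 per hour
frames = days * 24 * photos_per_hour # total frames captured
playback_seconds = frames / 24       # played back at 24 fps

# frames == 1728, playback_seconds == 72.0
```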
Time-lapse of salad growth. Shooting period: 3 April to 2 May 2016. The camera shot 2087 pictures at 20-minute intervals. Camera: Raspberry Pi Camera Module. Music: Valesco – Stay With Me: http://soundcloud.com/valesco_official/stay-with-me. Project website: https://pimeetsplants.com
I think I have a thing for time-lapse videos of plant growth. They’re just so friggin’ cool!
More info: https://www.sainsmart.com/products/wide-angle-fov160-5-megapixel-camera-module-for-raspberry-pi
Given that we had access to a bunch of Raspberry Pis, we thought that we should use some of them to get some timelapse footage of the shop being set up. Read more about the Raspberry Pi shop on our blog: http://rpf.io/ytstoreblog
We couldn’t help ourselves. When the time came to set up the Raspberry Pi retail store in Cambridge, we just had to install a time-lapse camera in the corner.
While this time-lapse wasn’t taken with a Raspberry Pi Camera Module, the slider moving the camera was controlled using a Raspberry Pi. That counts, right?
The Burren is a karst landscape region in north-west Co. Clare in Ireland. It is one of the largest karst regions in Europe. I have been photographing The Burren over the last 5 years, and recently got into time-lapse photography. The Burren was an obvious place for me to shoot this first video.
Want to set up your own Raspberry Pi time-lapse camera? Our handy guide shows you how.
Do you have a time-lapse video you’d like to share with us? Then please post your link in the comments below.
Some of you may wonder why you wouldn’t have your records with your record player and, as such, use that record player to play those records. If you are one of these people, then consider, for example, the beautiful Damien Rice LP I own that tragically broke during a recent house move. While I can no longer play the LP, its artwork is still worthy of a place on my record shelf, and with Plynth I can still play the album as well.
In addition, instead of album artwork, you could use photographs, doodles, or type to play curated playlists, or, as mentioned on the website, use DVDs to play a movie’s soundtrack, or CDs to select the right disc in a disc changer.
Convinced or not, I think what we can all agree on is that Plynth is a good-looking bit of kit, and we at Pi Towers look forward to seeing where the project leads.
Long-time readers will remember Penguin Lifelines, one of our very favourite projects from back in the mists of time (which is to say 2014 — we have short memories around here).
Click on penguins for fun and conservation
Penguin Lifelines was a programme run by the Zoological Society of London, crowdsourcing the tracking of penguin colonies in Antarctica. It’s since evolved into something called Penguin Watch, now working with the World Wildlife Fund (WWF) and British Antarctic Survey (BAS). It’s citizen science on a big scale: thousands of people from all over the world come together on the internet to…click on penguins. By counting the birds in their colonies, users help penguinologists measure changes in the birds’ behaviour and habitat, and in the larger ecosystem, thus assisting in their conservation.
The penguin people say this about Penguin Watch:
Some of these colonies are so difficult to get to that they haven’t been visited for 50 years! The images contain unprecedented detail, giving us the opportunity to gather new data on the number of penguins in the region. This information will help us understand how they are being affected by climate change, the potential impact of local fisheries, and how we can help conserve these incredible species.
Pis in the coldest, wildest place
And what are those special cameras? The static ones providing time-lapse images are Raspberry Pi Camera Modules, mounted on Raspberry Pi Zeros, and we’re really proud to see just how robust they’ve been in the face of Antarctic winters.
Success! The @arribada_i timelapse @Raspberry_Pi Zero cameras built for @penguin_watch survived the Antarctic winter! They captured these fantastic photos of a Gentoo penguin rookery for https://t.co/MEzxbqSyc1 #WorldPenguinDay @helenlynn @philipcolligan https://t.co/M0TK5NLT6G
These things are incredibly tough. They’re the same cameras that Alasdair and colleagues have been sticking on turtles, at depths of up to 500m; I can’t think of a better set of tests for robustness.
Want to get involved? Head over to Penguin Watch, and get clicking! We warn you, though — it’s a little addictive.
You’re most likely aware of the Astro Pi Challenge. In case you’re not, it’s a wonderfully exciting programme organised by the European Space Agency (ESA) and us at Raspberry Pi. Astro Pi challenges European young people to write scientific experiments in code, and the best experiments run aboard the International Space Station (ISS) on two Astro Pi units: Raspberry Pi 1 B+ computers with Sense HATs, encased in flight-grade aluminium spacesuits.
It’s very cool. So, so cool. As adults, we’re all extremely jealous that we’re unable to take part. We all love space and, to be honest, we all want to be astronauts. Astronauts are the coolest.
So imagine our excitement at Pi Towers when ESA shared this photo on Friday:
This is a Soyuz vehicle on its way to dock with the International Space Station. And while Soyuz vehicles ferry between Earth and the ISS all the time, what’s so special about this occasion is that this very photo was captured using a Raspberry Pi 1 B+ and a Raspberry Pi Camera Module, together known as Izzy, one of the Astro Pi units!
So if anyone ever asks you whether the Raspberry Pi Camera Module is any good, just show them this photo. We don’t think you’ll need to provide any further evidence after that.
Kids of the 1980s, rejoice: the age of the digital Etch-A-Sketch is now!
What is an Etch-A-Sketch?
Introduced in 1960, the Etch-A-Sketch was invented by Frenchman André Cassagnes and manufactured by the Ohio Art Company.
The back of the Etch-A-Sketch screen is covered in very fine aluminium powder. Turning one of the two directional knobs runs a stylus across the back of the screen, displacing the powder and creating a dark grey line visible on the front side.
Etch-A-Snap is (probably) the world’s first Etch-A-Sketch Camera. Powered by a Raspberry Pi Zero (or Zero W), it snaps photos just like any other camera, but outputs them by drawing them to a Pocket Etch-A-Sketch screen. Quite slowly.
Unless someone can show us another Etch-A-Sketch camera like this, we’re happy to agree that this is a first!
Raspberry Pi–powered Etch-A-Sketch
Powered by four AA batteries and three 18650 LiPo cells, Etch-A-Snap houses the $5 Raspberry Pi Zero and two 5V stepper motors within a 3D-printed case mounted on the back of a pocket-sized Etch-A-Sketch.
Photos taken using the Raspberry Pi Camera Module are converted into 1-bit, 100px × 60px, black-and-white images using Pillow and OpenCV. Next, these smaller images are turned into plotter commands using networkx. Finally, the Raspberry Pi engages the two 5V stepper motors to move the Etch-A-Sketch control knobs, producing a sketch within 15 minutes to an hour, depending on the level of detail in the image.
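The first stage — shrinking a photo to a 1-bit, 100 × 60 image — can be sketched even without Pillow or OpenCV. This pure-Python stand-in (an illustration, not the Etch-A-Snap code) downscales by averaging blocks of pixels, then thresholds each block to black or white:

```python
def to_one_bit(pixels, out_w, out_h, threshold=128):
    """Downscale a grayscale image (list of rows, values 0-255) by block
    averaging, then threshold to a 1-bit black-and-white image.

    A pure-Python stand-in for what Etch-A-Snap does with Pillow and OpenCV
    before turning the result into plotter commands.
    """
    in_h, in_w = len(pixels), len(pixels[0])
    bh, bw = in_h // out_h, in_w // out_w          # input block per output pixel
    out = []
    for oy in range(out_h):
        row = []
        for ox in range(out_w):
            block = [pixels[oy*bh + y][ox*bw + x]
                     for y in range(bh) for x in range(bw)]
            mean = sum(block) / len(block)
            row.append(1 if mean >= threshold else 0)  # 1 = white, 0 = black
        out.append(row)
    return out

# 4x4 image -> 2x2: bright top-left quadrant, dark elsewhere
img = [
    [200, 200,  10,  10],
    [200, 200,  10,  10],
    [ 10,  10,  10,  10],
    [ 10,  10,  10,  10],
]
```

For Etch-A-Snap the call would be `to_one_bit(photo, 100, 60)`; the resulting grid of black pixels is what the plotter path is computed from.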
Build your own Etch-A-Snap
On his website, Martin goes into some serious detail about Etch-A-Snap, perfect for anyone interested in building their own, or in figuring out how it all works. You’ll find an overview with videos, along with breakdowns of the build, processing, drawing, and plotter.
Tired of opening the refrigerator only to find that your favourite snack is missing? Get video evidence of sneaky fridge thieves sent to your phone, with Adrian Rosebrock’s Raspberry Pi security camera project.
Learn how to build an IoT + Raspberry Pi security camera using OpenCV and computer vision. Send TXT/MMS message notifications, images, and video clips when the security camera is triggered. Full tutorial (including code) here: https://www.pyimagesearch.com/2019/03/25/building-a-raspberry-pi-security-camera-with-opencv
Adrian loves hummus. And, as you can see from my author bio, so do I. So it wasn’t hard for me to relate to Adrian’s story about his college roommates often stealing his cherished chickpea dip.
“Of course, back then I wasn’t as familiar with computer vision and OpenCV as I am now,” he explains on his blog. “Had I known what I do at present, I would have built a Raspberry Pi security camera to capture the hummus heist in action!”
Raspberry Pi security camera
So, in homage to his time as an undergrad, Adrian decided to finally build that security camera for his fridge, despite now only needing to protect his hummus from his wife. And to build it, he opted to use OpenCV, a Raspberry Pi, and a Raspberry Pi Camera Module.
Adrian’s camera is an IoT project: it not only captures footage but also uses Twilio to send that footage, via a cloud service (AWS), to a smartphone.
Because the content of your fridge lives in the dark when you’re not inspecting it, the code for capturing video footage detects light and dark, and records everything that occurs between the fridge door opening and closing. “You could also deploy this inside a mailbox that opens/closes,” suggests Adrian.
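The recording logic amounts to a small state machine over frame brightness. Here's a hedged pure-Python sketch of the idea (the function name and threshold are made up for illustration; Adrian's real code works on OpenCV frames, where the brightness of a grayscale frame is just its mean pixel value):

```python
def detect_events(brightness, threshold=60):
    """Turn a stream of mean frame brightness values into (start, end) spans.

    The fridge interior is dark until the door opens, so 'light on' marks the
    start of a recording and 'light off' marks the end — the same trick works
    for a mailbox, as Adrian suggests.
    """
    events, start = [], None
    for i, level in enumerate(brightness):
        light_on = level > threshold
        if light_on and start is None:
            start = i                      # door just opened: start recording
        elif not light_on and start is not None:
            events.append((start, i))      # door closed: stop and save the clip
            start = None
    if start is not None:                  # door still open at end of stream
        events.append((start, len(brightness)))
    return events

# Dark, then the door opens for three frames, then closes again
levels = [5, 8, 120, 130, 125, 7, 6]
```

Here `detect_events(levels)` reports one open-door span covering frames 2–4; in the real project, the frames inside that span are what gets written out and sent via Twilio.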
Get the code and more
Adrian provides all the code for the project on his blog, pyimagesearch, with a full explanation of why each piece of code is used — thanks, Adrian!
My love for stereoscopic photography goes way back
My great-uncle Eric was a keen stereoscopic photographer and member of The Stereoscopic Society. Every memory I have of visiting him includes looking at his latest stereo creations through a pair of gorgeously antique-looking, wooden viewers. And I’ve since inherited the beautiful mahogany viewing cabinet that used to stand in his dining room.
It looks like this, but fancier
Stereoscopic photography has always fascinated me. Two images that seem identical suddenly become, as if by magic, a three-dimensional wonder. As a child, I couldn’t make sense of it. And even now, while I do understand how it actually works, it remains magical in my mind — like fairies at the bottom of the garden. Or magnets.
So it’s no wonder that I was instantly taken with StereoPi when I stumbled across its crowdfunding campaign on Twitter. Having wanted to make a Pi-based stereoscopic camera ever since I joined the organisation, but not knowing how best to go about it, I thought this new board seemed ideal for me.
The StereoPi board
Despite its name, StereoPi is more than just a stereoscopic camera board. How to attach two Camera Modules to a Raspberry Pi is a question people frequently ask us, for all kinds of projects: home security systems, robots, cameras, and VR.
Slim and standard editions of the StereoPi
The board attaches to any version of the Raspberry Pi Compute Module, including the newly released CM3+, and you can use it in conjunction with Raspbian to control it via the Python module picamera.
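With two Camera Modules on a Compute Module, picamera exposes each sensor through its `camera_num` argument — `PiCamera(camera_num=0)` and `PiCamera(camera_num=1)` — and one simple way to build a stereo pair is to join the two captured frames side by side. The helper below sketches that join in pure Python on lists of pixel rows, just to show the shape of the operation (the capture itself needs the hardware):

```python
def stitch_side_by_side(left, right):
    """Join two equally sized images (lists of pixel rows) into one wide frame.

    On a StereoPi, `left` and `right` would come from
    picamera.PiCamera(camera_num=0) and picamera.PiCamera(camera_num=1);
    the Pi firmware also offers a combined stereoscopic mode of its own.
    """
    assert len(left) == len(right), "stereo pair must have the same height"
    return [l_row + r_row for l_row, r_row in zip(left, right)]

# Tiny 2x2 "frames" standing in for the two camera images
left = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
```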
StereoPi stereoscopic livestream over 4G. Project site: http://StereoPi.com
When it comes to what you can do with StereoPi, the possibilities are almost endless: mount two wide-angle lenses for 360º recording, build a VR rig to test out virtual reality games, or, as I plan to do, build a stereoscopic camera!
It’s on Crowd Supply now!
StereoPi is currently available to back on Crowd Supply, with purchase options starting from $69. The campaign is 69% funded with 30 days still to go, and we have faith that the StereoPi project will reach its goal and make its way into the world of impressive Raspberry Pi add-ons.
SelfieBot is a project Kim and I originally made for our booth at Seattle Mini Maker Faire 2017. Now, you can build your own! A full tutorial for SelfieBot is up on the Adafruit Learning System at https://learn.adafruit.com/raspberry-pi-selfie-bot/ This was our first Raspberry Pi project, and is an experiment in DIY AI.
Pasties, projects, and plans
Last year, I built a Raspberry Pi photobooth for a friend’s wedding, complete with a thermal printer for instant printouts, and a Twitter feed to keep those unable to attend the event in the loop. I called the project PastyCam, because I built it into the papier-mâché body of a Cornish pasty, and I planned on creating a tutorial blog post for the build. But I obviously haven’t. And I think it’s time, a year later, to admit defeat.
The wedding was in Cornwall, so the Cornish pasty totally makes sense, alright?
But lucky for us, Sophy Wong has gifted us all with SelfieBot.
If you subscribe to HackSpace magazine, you’ll recognise Sophy from issue 4, where she adorned the cover, complete with glowing fingernails. And if you’re like me, you instantly wanted to be her as soon as you saw that image.
Makers should also know Sophy from her impressive contributions to the maker community, including her tutorials for Adafruit, her YouTube channel, and most recently her work with Mythbusters Jr.
Filming for #MythbustersJr is wrapped, and I’m heading home to Seattle. What an incredible summer filled with amazing people. I’m so inspired by every single person, crew and cast, on this show, and I’ll miss you all until our paths cross again someday
SelfieBot at Maker Faire
I saw SelfieBot in passing at Maker Faire Bay Area earlier this year. Yet somehow I managed to not introduce myself to Sophy and have a play with her Pi-powered creation. So a few weeks back at World Maker Faire New York, I accosted Sophy as soon as I could, and we bonded by swapping business cards and Pimoroni pins.
SelfieBot is more than just a printing photo booth. It giggles, it talks, it reacts to movement. It’s the robot version of that friend of yours who’s always taking photos. Always. All the time, Amy. It’s all the time! *ahem*
SelfieBot consists of a Raspberry Pi 2, a Pi Camera Module, a 5″ screen, an accelerometer, a mini thermal printer, and more, including 3D-printed and laser-cut parts.
Getting SelfieBot ready for Maker Faire Bay Area next weekend! Super excited to be talking on Sunday with @kpimmel – come see us and meet SelfieBot!
If you want to build your own SelfieBot — and obviously you do — then you can find a complete breakdown of the build process, including info on all parts you’ll need, files for 3D printing, and so, so many wonderfully informative photographs, on the Adafruit Learning System!