Tag Archives: Raspberry Pi 3B+

Be a better Scrabble player with a Raspberry Pi High Quality Camera

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/be-a-better-scrabble-player-with-a-raspberry-pi-high-quality-camera/

One of our fave makers, Wayne from Devscover, got a bit sick of losing at Scrabble (and his girlfriend was likely raging at being stuck in lockdown with a lesser opponent). So he came up with a Raspberry Pi–powered solution!

Using a Raspberry Pi High Quality Camera and a bit of Python, you can quickly figure out the highest-scoring word your available Scrabble tiles allow you to play.

Hardware

  • Raspberry Pi 3B
  • Compatible touchscreen
  • Raspberry Pi High Quality Camera
  • Power supply for the touchscreen and Raspberry Pi
  • Scrabble board

You don’t have to use a Raspberry Pi 3B, but you do need a model that has both display and camera ports. Wayne also chose to use an official Raspberry Pi Touch Display because it can power the computer, but any screen that can talk to your Raspberry Pi should be fine.

Software

Firstly, the build takes a photo of your Scrabble tiles using raspistill.

Next, a Python script processes the image of your tiles and then relays the highest-scoring word you can play to your touchscreen.

The key bit of code here is twl, a Python script that contains every possible word you can play in Scrabble.
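
Wayne walks through his own script in the video, but the word-finding step is easy to sketch. Assuming the twl module exposes an anagram() helper that yields every valid word you can form from a rack of letters (as the popular twl.py word list does), picking the best play over standard tile values (ignoring board multipliers) only takes a few lines:

import twl   # TWL06 Scrabble word list module (assumed to provide anagram())

# Standard Scrabble tile values
LETTER_SCORES = {
    **dict.fromkeys('aeilnorstu', 1), **dict.fromkeys('dg', 2),
    **dict.fromkeys('bcmp', 3), **dict.fromkeys('fhvwy', 4),
    'k': 5, **dict.fromkeys('jx', 8), **dict.fromkeys('qz', 10),
}

def best_word(rack):
    """Return the highest-scoring word playable from a rack of tiles."""
    candidates = twl.anagram(rack.lower())   # every valid word formable from the rack
    return max(candidates, key=lambda w: sum(LETTER_SCORES[c] for c in w), default=None)

print(best_word('airnpes'))   # e.g. a seven-tile rack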

From 4.00 minutes into his build video, Wayne walks you through what each bit of code does and how he made it work for this project, including how he installed and used the Scrabble dictionary.

Fellow Scrabble-strugglers have suggested sneaky upgrades in the comments of Wayne’s YouTube video, such as having the build relay answers to a more discreet smartwatch.

No word yet on how the setup deals with the blank Scrabble tiles; those things are like gold dust.

In case you haven’t met the Raspberry Pi High Quality Camera yet, Wayne also did this brilliant unboxing and tutorial video for our newest piece of hardware.

And for more projects from Devscover, check out this great Amazon price tracker using a Raspberry Pi Zero W, and make sure to subscribe to the channel for more content.

The post Be a better Scrabble player with a Raspberry Pi High Quality Camera appeared first on Raspberry Pi.

Raspberry Pi-powered wedding memories record player

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-powered-wedding-memories-record-player/

We’re a sentimental bunch and were bowled over by this intricate, musical wedding gift. It’s powered by a Raspberry Pi and has various other bits of geeky goodness under the hood. Honestly, the extra features just keep coming — you’ll see.

This beautifully crafted ‘record player’ plays one pair of newlyweds’ Spotify accounts, and there’s a special visual twist when their ‘first dance’ wedding song plays.

Midway through the build process

First, a little background: the newlyweds, Holly and Dougie, have been sweethearts since their early high school days. Their wedding took place on a farm near the village they grew up in, Fintry in rural Scotland.

Throughout the wedding day, the phrase “Music is a huge deal” was repeated often, which gave the bride’s older brother Ben Howell the idea for a homemade, Raspberry Pi–powered gift.

Custom tagline laser-cut and spray-painted

He built the couple a neatly finished music box, known as HD-001 (HD for ‘Holly Dougie’ of course) and home to a ‘smart turntable’. It can connect to a wireless network and has a touch screen where the record label would normally sit. When you lift the lid and switch it on, it asks “Hello. Who’s listening?”

Once you tap on the picture of either the bride or groom, it accesses their Spotify account and fetches the album artwork of whatever song it plays.

What’s inside?

The main brain is a Raspberry Pi 3 running Raspberry Pi OS. The interface is built as a web page, mostly in PHP and JavaScript. It uses the Spotify API to get the ‘now playing’ track of the bride’s or groom’s account, and to fish out the album artwork URL from the return data so it can display this on a rotating panel.
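
Ben’s interface is PHP and JavaScript, but the Spotify call itself is simple enough to sketch in a few lines of Python. This isn’t his code, and the access token here is a placeholder you’d obtain through Spotify’s OAuth flow (with the user-read-currently-playing scope), but it shows the ‘now playing’ request and where the artwork URL lives in the response:

import requests

ACCESS_TOKEN = "paste-a-valid-token-here"   # placeholder: obtained via Spotify's OAuth flow

resp = requests.get(
    "https://api.spotify.com/v1/me/player/currently-playing",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)

if resp.status_code == 200 and resp.content:
    item = resp.json()["item"]
    print("Now playing:", item["name"])
    print("Album art:", item["album"]["images"][0]["url"])   # largest artwork image URL
else:
    print("Nothing playing right now")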

The audio side is powered by a 50W Bluetooth amplifier, which is entirely independent of the Raspberry Pi computer.

The build details

The enclosure is all custom-designed and built using scrap wood wrapped in green faux leather material. Ben sourced most of the other materials — rubber feet, hinges, switches, metal grille — on Amazon.

The HD-001 also features a hand-built 4-way speaker system and a custom-made speaker grille with that famous phrase “Music is a huge deal” on the front.

The lettering on the grille was laser-cut by a company in Glasgow to order, and Ben spray-painted it metallic grey. The LCD panel and driver board are also from Amazon.

To play and pause music, Ben sourced a tone-arm online and routed cabling from the Raspberry Pi GPIO pins through to a micro-switch where the original needle should sit. That’s how lifting the arm pauses playback, and replacing it resumes the music.
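
We don’t have Ben’s GPIO code, but the pause/resume logic can be sketched with the gpiozero library. This assumes the micro-switch is wired between a GPIO pin and ground; the pin number is an illustration, not his actual wiring:

from signal import pause
from gpiozero import Button

tonearm_switch = Button(17)   # the micro-switch under the tone-arm (pin is an assumption)

def pause_playback():
    print("Arm lifted: pause the music")     # in the real build this would pause playback

def resume_playback():
    print("Arm replaced: resume the music")

tonearm_switch.when_released = pause_playback   # lifting the arm releases the switch
tonearm_switch.when_pressed = resume_playback   # dropping the arm presses it again

pause()   # keep the script running and waiting for switch events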

Getting the audio to work

Ben explains: “Essentially, it’s a fancy Bluetooth speaker system disguised as an old-fashioned turntable and designed to behave and work like an old-fashioned turntable (skeuomorphism gone mad!).”

Oh, and our favourite adorable bonus feature? If the first dance song from Holly’s and Dougie’s wedding is played, the album artwork on the LCD panel fades away, to be replaced by a slideshow of photos from their wedding.

And for extra, extra big brother points, Ben even took the time to create a manual to make sure the newlyweds got the most out of their musical gift.

We have it on good authority that Ben will entertain anyone who would like to place a pre-order for the HD-002.

The post Raspberry Pi-powered wedding memories record player appeared first on Raspberry Pi.

Meet your new robotic best friend: the MiRo-E dog

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/meet-your-new-robotic-best-friend-the-miro-e-dog/

When you’re learning a new language, it’s easier the younger you are. But how can we show very young students that learning to speak code is fun? Consequential Robotics has an answer…

The MiRo-E is an ‘emotionally engaging’ robot platform that was created on a custom PCB and has since moved onto Raspberry Pi. The creators made the change because they saw that schools were more familiar with Raspberry Pi and realised the potential in being able to upgrade the robotic learning tools with new Raspberry Pi boards.

The MiRo-E was born from a collaboration between Sheffield Robotics, London-based SCA design studio, and Bristol Robotics Lab. The cute robo-doggo has been shipping with Raspberry Pi 3B+ (they work well with the Raspberry Pi 4 too) for over a year now.

While the robot started as a developers’ tool (MiRo-B), the creators completely re-engineered MiRo’s mechatronics and software to turn it into an educational tool purely for the classroom environment.

Three school children in uniforms stroke the robot dog's chin

MiRo-E with students at a school in North London, UK

MiRo-E can see, hear, and interact with its environment, providing endless programming possibilities. It responds to human interaction, making it a fun, engaging way for students to learn coding skills. If you stroke it, it purrs, lights up, moves its ears, and wags its tail. Making a sound or clapping makes MiRo move towards you, or away if it is alarmed. And it especially likes movement, following you around like a real, loyal canine friend. These functionalities are just the basic starting point, however: students can make MiRo do much more once they start tinkering with their programmable pet.

These opportunities are provided on MiRoCode, a user-friendly web-based coding interface, where students can run through lesson plans and experiment with new ideas. They can test code on a virtual MiRo-E to create new skills that can be applied to a real-life MiRo-E.

What’s inside?

Here are the full technical specs. But basically, MiRo-E comprises a Raspberry Pi 3B+ as its core, light sensors, cliff sensors, an HD camera, and a variety of connectivity options.

How does it interact?

MiRo reacts to sound, touch, and movement in a variety of ways. Twenty-eight capacitive touch sensors tell it when it is being petted or stroked. Six independent RGB LEDs allow it to show emotion, and several degrees of freedom (DOF) let it move its eyes, tail, and ears. Its ears also house four 16-bit microphones and a loudspeaker. And two differential drive wheels with opto-sensors help MiRo move around.

What else can it do?

The ‘E’ bit of MiRo-E means it’s emotionally engaging, and the intelligent pet’s potential in healthcare has already been explored. Interaction with animals has been shown to be positive for patients of all ages, but sometimes it’s not possible for ‘real’ animals to comfort people. MiRo-E can fill the gap for young children who would benefit from animal comfort, but where healthcare or animal welfare risks are barriers.

The same researchers who created this emotionally engaging robo-dog for young people are also working with project partners in Japan to develop ‘telepresence robots’ for older patients to interact with their families over video calls.

The post Meet your new robotic best friend: the MiRo-E dog appeared first on Raspberry Pi.

Make it rain chocolate with a Raspberry Pi-powered dispenser

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/make-it-rain-chocolate-with-a-raspberry-pi-powered-dispenser/

This fully automated M&M’s-launching machine delivers chocolate on voice command, wherever you are in the room.

A quick lesson in physics

To get our head around Harrison McIntyre‘s project, first we need to understand parabolas. Harrison explains: “If we ignore air resistance, a parabola can be defined as the arc an object describes when launching through space. The shape of a parabolic arc is determined by three variables: the object’s departure angle; initial velocity; and acceleration due to gravity.”

Harrison uses a basketball shooter to illustrate parabolas

Lucky for us, gravity is always the same, so you really only have to worry about angle and velocity. You could also get away with only changing one variable and still be able to determine where a launched object will land. But adjusting both the angle and the velocity grants much greater precision, which is why Harrison’s machine controls both exit angle and velocity of the M&M’s.
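
As a quick illustration of the relationship (this is just the textbook projectile formula, not Harrison’s code), here’s how you could work out the launch speed needed to cover a given distance at a chosen angle, ignoring air resistance and assuming the launch and landing heights are the same:

import math

G = 9.81   # acceleration due to gravity, m/s^2

def launch_speed(distance_m, angle_deg):
    """Speed needed to land distance_m away, from the range formula R = v^2 * sin(2*theta) / g."""
    theta = math.radians(angle_deg)
    return math.sqrt(distance_m * G / math.sin(2 * theta))

# A mouth 2 m away can be reached slowly at 45 degrees, or faster at a shallower angle
print(round(launch_speed(2.0, 45), 2), "m/s")   # ~4.43 m/s
print(round(launch_speed(2.0, 20), 2), "m/s")   # ~5.52 m/s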

Kit list

The M&M’s launcher comprises:

  • 2 Arduino Nanos
  • 1 Raspberry Pi 3
  • 3 servo motors
  • 2 motor drivers
  • 1 DC motor
  • 1 Hall effect limit switch
  • 2 voltage converters
  • 1 USB camera
  • “Lots” of 3D printed parts
  • 1 Amazon Echo Dot

A cordless drill battery is the primary power source.

The project relies on similar principles as a baseball pitching machine. A compliant wheel is attached to a shaft sitting a few millimetres above a feeder chute that can hold up to ten M&M’s. To launch an M&M’s piece, the machine spins up the shaft to around 1500 rpm, pushes an M&M’s piece into the wheel using a servo, and whoosh, your M&M’s piece takes flight.

Controlling velocity, angle and direction

To measure the velocity of the fly wheel in the machine, Harrison installed a Hall effect magnetic limit switch, which gets triggered every time it is near a magnet.

Two magnets were placed on opposite sides of the shaft, and these pass by the switch. By counting the time in between each pulse from the limit switch, the launcher determines how fast the fly wheel is spinning. In response, the microcontroller adjusts the motor output until the encoder reports the desired rpm. This is how the machine controls the speed at which the M&M’s pieces are fired.
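
The speed control itself runs on one of the Arduino Nanos rather than the Raspberry Pi, so the following is only a Python illustration of the idea: timing the gap between Hall-sensor pulses, converting it to rpm, and nudging the motor output towards the target.

PULSES_PER_REV = 2      # two magnets pass the Hall effect switch per shaft revolution
TARGET_RPM = 1500

def rpm_from_pulse_gap(seconds_between_pulses):
    """Convert the time between two sensor pulses into shaft rpm."""
    return 60.0 / (seconds_between_pulses * PULSES_PER_REV)

def corrected_output(current_output, measured_rpm, gain=0.0005):
    """Crude proportional correction of the motor duty cycle (0.0-1.0) towards the target rpm."""
    error = TARGET_RPM - measured_rpm
    return min(1.0, max(0.0, current_output + gain * error))

print(rpm_from_pulse_gap(0.02))        # 20 ms between pulses works out at 1500 rpm
print(corrected_output(0.5, 1400))     # spinning too slowly, so nudge the output up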

Now, to control the angle at which the M&M’s pieces fly out of the machine, Harrison mounted the fly wheel assembly onto a turret with two degrees of freedom, driven by servos. The turret controls the angle at which the sweets are ‘pitched’, as well as the direction of the ‘pitch’.

So how does it know where I am?

With the angle, velocity, and direction at which the M&M’s pieces fly out of the machine taken care of, the last thing to determine is the expectant snack-eater’s location. For this, Harrison harnessed vision processing.


Harrison used a USB camera and a Python script running on Raspberry Pi 3 to determine when a human face comes into view of the machine, and to calculate how far away it is. The turret then rotates towards the face, the appropriate parabola is calculated, and an M&M’s piece is fired at the right angle and velocity to reach your mouth. Harrison even added facial recognition functionality so the machine only fires M&M’s pieces at his face. No one is stealing this guy’s candy!
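
Harrison’s exact pipeline isn’t published here, but the general approach is easy to sketch with OpenCV: detect a face, then estimate its distance from how wide it appears, using a simple pinhole-camera approximation. The focal length and face width below are placeholder values you’d calibrate yourself:

import cv2

# Haar cascade face detector bundled with OpenCV
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

KNOWN_FACE_WIDTH_CM = 15.0   # rough average face width; an assumption
FOCAL_LENGTH_PX = 600.0      # found by calibration in practice; an assumption

cap = cv2.VideoCapture(0)    # the USB camera
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.1, 5)
    for (x, y, w, h) in faces:
        # pinhole approximation: distance = real_width * focal_length / pixel_width
        distance_cm = KNOWN_FACE_WIDTH_CM * FOCAL_LENGTH_PX / w
        print(f"face at x={x}, roughly {distance_cm:.0f} cm away")
cap.release()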

So what’s Alexa for?

This project is topped off with a voice-activation element, courtesy of an Amazon Echo Dot, and a Python library called Sinric. This allowed Harrison to disguise his Raspberry Pi as a smart TV named ‘Chocolate’ and command Alexa to “increase the volume of ‘Chocolate’ by two” in order to get his machine to fire two M&M’s pieces at him.

Drawbacks

In his video, Harrison explains that other snack-launching machines involve a spring-loaded throwing mechanism, which doesn’t let you determine the snack’s exit velocity. That means you have less control over how fast your snack goes and where it lands. The only drawback to Harrison’s model? His machine needs objects that are uniform in shape and size, which means no oddly shaped peanut M&M’s pieces for him.

He’s created quite the monster here: at first, the machine’s maximum firing speed was 40 mph. And no one wants crispy-shelled chocolate firing at their face at that speed. To keep his teeth safe, Harrison switched out the original motor for one with a lower rpm, which reduced the maximum exit velocity to a much more sensible 23 mph… Please make sure you test your own snack-firing machine outdoors before aiming it at someone’s face.

Go subscribe

Check out the end of Harrison’s videos for some more testing to see what his machine was capable of: he takes out an entire toy army and a LEGO Star Wars squad by firing M&M’s pieces at them. And remember to subscribe to his channel and like the video if you enjoyed what you saw, because that’s just a nice thing to do.

The post Make it rain chocolate with a Raspberry Pi-powered dispenser appeared first on Raspberry Pi.

Build a Raspberry Pi laser scanner

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/build-a-raspberry-pi-laser-scanner/

You really don’t need anything too fancy to build this Raspberry Pi laser scanner, and that’s why we think it’s pretty wonderful.

Raspberry Pi 3D Laser Scanner

Cornell University: ECE 5725 Michael Xiao and Thomas Scavella

Building a Raspberry Pi laser scanner

The ingredients you’ll need to build the laser scanner are:

  • Raspberry Pi
  • Raspberry Pi Camera Module v2
  • Stepper motor and driver
  • Line laser
  • Various LEDs, resistors, and wires
  • Button

To complete the build, access to a 3D printer and laser cutter would come in handy. If you don’t have access to such tools, we trust you to think of an alternative housing for the scanner. You’re a maker, you’re imaginative — it’s what you do.

How does the laser scanner work?

The line laser projects a line onto the object, highlighting a slice of it. The Raspberry Pi Camera Module captures this slice, recording the shape of the laser line on the object’s surface. Then the stepper motor rotates the object. When the object has completed a full rotation and the camera has taken an image of every slice, the Raspberry Pi processes all the images to create one virtual 3D object.
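
The tutorial linked below has the real code; purely to illustrate the capture loop, a sketch with the picamera library might step the motor, grab a frame, and pick out the laser line by looking for the reddest pixel in each row. The step count, delay, and detection method here are simplifying assumptions:

import numpy as np
from time import sleep
from picamera import PiCamera

camera = PiCamera(resolution=(640, 480))
frame = np.empty((480, 640, 3), dtype=np.uint8)
STEPS_PER_REV = 200                        # a 1.8-degree stepper; adjust for your motor

def step_motor():
    print("advance the stepper one step")  # stand-in for the real stepper driver code

profiles = []
for step in range(STEPS_PER_REV):
    camera.capture(frame, format='rgb')
    redness = frame[:, :, 0].astype(int) - frame[:, :, 2]   # red channel minus blue
    profiles.append(redness.argmax(axis=1))                 # laser column in each image row
    step_motor()
    sleep(0.1)

# 'profiles' now holds one laser-line profile per rotation step, ready for triangulation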

Instructables user mfx2 has written a wonderful tutorial for the project, which also includes all files needed to build and program your own version.

The post Build a Raspberry Pi laser scanner appeared first on Raspberry Pi.

Playing The Doors with a door (and a Raspberry Pi)

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/playing-the-doors-with-a-door-and-a-raspberry-pi/

Floyd Steinberg is back with more synthy Raspberry Pi musical magic, this time turning a door into a MIDI controller.

I played The Doors on a door – using a Raspberry PI DIY midi controller and a Yamaha EX5

You see that door? You secretly want that to be a MIDI controller? Here’s how to do it, and how to play a cover version of “Break On Through” by The Doors on a door 😉 Link to source code and the DIY kit below.

If you don’t live in a home with squeaky doors — living room door, I’m looking at you — you probably never think about the musical potential of mundane household objects.

Unless you’re these two, I guess:

When Mama Isn’t Home / When Mom Isn’t Home ORIGINAL (the Oven Kid) Timmy Trumpet – Freaks

We thought this was hilarious. Hope you enjoy! This video has over 60 million views worldwide!

If the sound of a slammed oven door isn’t involved in your ditty of choice, you may instead want to add some electronics to that sweet, sweet harmony maker, just like Floyd.

Trusting in the melodic possibilities of incorporating a Raspberry Pi 3B+ and various sensory components into a humble door, Floyd created The Doors Door, a musical door that plays… well, I’m sure you can guess.

If you want to build your own, you can practice some sophisticated ‘copy and paste’ programming after downloading the code. And for links to all the kit you need, check out the description of the video over on YouTube. While you’re there, be sure to give the video a like, and subscribe to Floyd’s channel.

And now, to get you pumped for the weekend, here’s Jim:

The Doors – Break On Through HQ (1967)

The post Playing The Doors with a door (and a Raspberry Pi) appeared first on Raspberry Pi.

How to set up OctoPrint on your Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/how-to-set-up-octoprint-on-your-raspberry-pi/

If you own a 3D printer, you’ll likely have at least heard of OctoPrint from the ever benevolent 3D printing online community. It has the potential to transform your 3D printing workflow for the better, and it’s very easy to set up. This guide will take you through the setup process step by step, and give you some handy tips along the way.

Octoprint

Before we start finding out how to install OctoPrint, let’s look at why you might want to. OctoPrint is a piece of open-source software that allows us to add WiFi functionality to any 3D printer with a USB port (which is pretty much all of them). More specifically, you’ll be able to drop files from your computer onto your printer, start/stop prints, monitor your printer via a live video feed, control the motors, control the temperature, and more, all from your web browser. Of course, with great power comes great responsibility — 3D printers have parts that are hot enough to cause fires, so make sure you have a safe setup, which may include not letting it run unsupervised.

OctoPrint ingredients

• Raspberry Pi 3 (or newer)
• MicroSD card
• Raspberry Pi power adapter
• USB cable (the connector type will depend on your printer)
• Webcam/Raspberry Pi Camera Module (optional)
• 3D-printed camera mount (optional)

Before we get started, we don’t recommend using anything less powerful than a Raspberry Pi 3 for this project. There have been reports of limited success using OctoPrint on a Raspberry Pi Zero W, but only if you have no intention of using a camera to monitor your prints. If you want to try this with a Raspberry Pi Zero or an older Raspberry Pi, you may experience unexpected print failures.

Download OctoPi

Firstly, you will need to download the latest version of OctoPi from the OctoPrint website. OctoPi is a Raspbian distribution that comes with OctoPrint, video streaming software, and CuraEngine for slicing models on your Raspberry Pi. When this has finished downloading, unzip the file and put the resulting IMG file somewhere handy.

Next, we need to flash this image onto our microSD card. We recommend using Etcher to do this, due to its minimal UI and ease of use; it’s also available for both Windows and Mac. Get it here: balena.io/etcher. When Etcher is installed and running, you’ll see the UI displayed. Simply click the Select Image button and find the IMG file you unzipped earlier. Next, put your microSD card into your computer and select it in the middle column of the Etcher interface.

Finally, click on Flash!, and while the image is being burned onto the card, get your WiFi router details, as you’ll need them for the next step.

Now that you have your operating system, you’ll want to add your WiFi details so that the Raspberry Pi can automatically connect to your network after it’s booted. To do this, remove the microSD card from your computer (Etcher will have ‘ejected’ the card after it has finished burning the image onto it) and then plug it back in again. Navigate to the microSD card on your computer — it should now be called boot — and open the file called octopi-wpa-supplicant.txt. Editing this file using WordPad or TextEdit can cause formatting issues; we recommend using Notepad++ to update this file, but there are instructions within the file itself to mitigate formatting issues if you do choose to use another text editor. Find the section that begins ## WPA/WPA2 secured and remove the hash signs from the four lines below this one to uncomment them. Finally, replace the SSID value and the PSK value with the name and password for your WiFi network, respectively (keeping the quotation marks). See the example below for how this should look.
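
The uncommented block should end up looking roughly like this, with placeholder values swapped for your own network name and password:

network={
  ssid="YourNetworkName"
  psk="YourWiFiPassword"
}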

Further down in the file, there is a section for what country you are in. If you are using OctoPrint in the UK, leave this as is (by default, the UK is selected). However, if you wish to change this, simply comment the UK line again by adding a # before it, and uncomment whichever country you are setting up OctoPrint in. The example below shows how the file will look if you are setting this up for use in the US:

# Uncomment the country your Pi is in to activate Wifi in RaspberryPi 3 B+ and above
# For full list see: https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2
#country=GB # United Kingdom
#country=CA # Canada
#country=DE # Germany
#country=FR # France
country=US # United States

When the changes have been made, save the file and then eject/unmount and remove the microSD card from your computer and put it into your Raspberry Pi. Plug the power supply in, and go and make a cup of tea while it boots up for the first time (this may take around ten minutes). Make sure the Raspberry Pi is running as expected (i.e. check that the green status LED is flashing intermittently). If you’re using macOS, visit octopi.local in your browser of choice. If you’re using Windows, you can find OctoPrint by clicking on the Network tab in the sidebar. It should be called OctoPrint instance on octopi – double-clicking on this will open the OctoPrint dashboard in your browser.

If you see the screen shown above, then congratulations! You have set up OctoPrint.

Not seeing that OctoPrint splash screen? Fear not, you are not the first. While a full list of issues is beyond the scope of this article, common things to check are that your WiFi details are entered correctly in the octopi-wpa-supplicant.txt file, that your Raspberry Pi is working correctly (plug it into a monitor and watch what happens during boot), and that your Raspberry Pi isn’t out of range of your WiFi router. There’s a detailed list of troubleshooting suggestions on the OctoPrint website.

Printing with OctoPrint

We now have the opportunity to set up OctoPrint for our printer using the handy wizard. Most of this is very straightforward — setting up a password, signing up to send anonymous usage stats, etc. — but there are a few sections which require a little more thought.

We recommend enabling the connectivity check and the plug-ins blacklist to help keep things nice and stable. If you plan on using OctoPrint as your slicer as well as a monitoring tool, then you can use this step to import a Cura profile. However, we recommend skipping this step as it’s much quicker (and you can use a slicer of your choice) to slice the model on your computer, and then send the finished G-code over.

Finally, we need to put in our printer details. Above, we’ve included some of the specs of the Creality Ender-3 as an example. If you can’t find the exact details of your printer, a quick web search should show what you need for this section.

The General tab can have anything in it; it’s just an identifier for your own use. Print bed & build volume should be easy to find out — if not, you can measure your print bed and find the position of the origin by looking at your Cura printer profile. Leave Axes as default; for the Hotend and extruder section, the defaults are almost certainly fine (unless you’ve changed your nozzle; 0.4mm is the default diameter for most consumer printers).

OctoPrint is better with a camera

Now that you’re set up with OctoPrint, you’re ready to start printing. Turn off your Raspberry Pi, then plug it into your 3D printer. After it has booted up, open OctoPrint again in your browser and take your newly WiFi-enabled printer for a spin by clicking the Connect button. After it has connected, you’ll be able to set the hot end and bed temperature, then watch as the real-time readings are updated.

In the Control tab, we can see the camera stream (if you’re using one) and the motor controls, as well as commands to home the axes. There’s a G-code file viewer to look through a cross-section of the currently loaded model, and a terminal to send custom G-code commands to your printer. The last tab is for making time-lapses; however, there is a plug-in available to help with this process.

Undoubtedly the easiest way to set up video monitoring of your prints is to use the official Raspberry Pi Camera Module. There are dozens of awesome mounts on Thingiverse for a Raspberry Pi Camera Module, to allow you to get the best angle of your models as they print. There are also some awesome OctoPrint-themed Raspberry Pi cases to house your new printer brains. While it isn’t officially supported by OctoPrint, you can use a USB webcam instead if you have one handy, or just want some very high-quality video streams. The OctoPrint wiki has a crowdsourced list of webcams known to work, as well as a link for the extra steps needed to get the webcam working correctly.

As mentioned earlier, our recommended way of printing a model using OctoPrint is to first use your slicer as you would if you were creating a file to save to a microSD card. Once you have the file, save it somewhere handy on your computer, and open the OctoPrint interface. In the bottom left of the screen, you will see the Upload File button — click this and upload the G-code you wish to print.

You’ll see the file/print details appear, including information on how long it’ll take for the object to print. Before you kick things off, check out the G-code Viewer tab on the right. You can not only scroll through the layers of the object, but, using the slider at the bottom, you can see the exact pattern the 3D printer will use to ‘draw’ each layer. Now click Print and watch your printer jump into action!

OctoPrint has scores of community-created plug-ins, but our favourite, Octolapse, makes beautiful hypnotic time-lapses. What makes them so special is that the plug-in alters the G-code of whatever object you are printing so that once each layer has finished, the extruder moves away from the print to let the camera take an unobstructed shot of the model. The result is an object that seems to grow out of the build plate as if by magic. You’ll not find a finer example of it than here.

Satisfying 3D Prints TimeLapse episode 7 (Prusa I3 Mk3 octopi)

3D Printing timelapses of models printed on the Prusa i3 MK3! Here’s another compilation of my recent timelapses. I got some shots that i think came out really great and i hope you enjoy them! as always if you want to see some of these timelapses before they come out or want to catch some behind the scenes action check out my instagram!

Thanks to Glenn and HackSpace magazine

This tutorial comes fresh from the pages of HackSpace magazine issue 26 and was written by Glenn Horan. Thanks, Glenn.

To get your copy of HackSpace magazine issue 26, visit your local newsagent, the Raspberry Pi Store, Cambridge, or the Raspberry Pi Press online store.

Fans of HackSpace magazine will also score themselves a rather delightful Adafruit Circuit Playground Express with a 12-month subscription. Sweet!

The post How to set up OctoPrint on your Raspberry Pi appeared first on Raspberry Pi.

Really, really awesome Raspberry Pi NeoPixel LED mirror

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/awesome-neopixel-led-mirror/

Check out Super Make Something’s awesome NeoPixel LED mirror: a 576 RGB LED display that converts images via the Raspberry Pi Camera Module and Raspberry Pi 3B+ into a pixelated light show.

Neopixel LED Mirror (Python, Raspberry Pi, Arduino, 3D Printing, Laser Cutting!) DIY How To

Time to pull out all the stops for the biggest Super Make Something project to date! Using 3D printing, laser cutting, a Raspberry Pi, computer vision, Python, and nearly 600 Neopixel LEDs, I build a low resolution LED mirror that displays your reflection on a massive 3 foot by 3 foot grid made from an array of 24 by 24 RGB LEDs!

Mechanical mirrors

If you’re into cool uses of tech, you may be aware of Daniel Rozin, the creative artist building mechanical mirrors out of wooden panels, trash, and…penguins, to name but a few of his wonderful builds.

A woman standing in front of a mechanical mirror made of toy penguins

Yup, this is a mechanical mirror made of toy penguins.

A digital mechanical mirror?

Inspired by Daniel Rozin’s work, Alex, the person behind Super Make Something, put an RGB LED spin on the concept, producing this stunning mirror that thoroughly impressed visitors at Cleveland Maker Faire last month.

“Inspired by Danny Rozin’s mechanical mirrors, this 3 foot by 3 foot mirror is powered by a Raspberry Pi, and uses Python and OpenCV computer vision libraries to process captured images in real time to light up 576 individual RGB LEDs!” Alex explains on Instagram. “Also onboard are nearly 600 3D-printed squares to diffuse the light from each NeoPixel, as well as 16 laser-cut panels to hold everything in place!”

The video above gives a brilliantly detailed explanation of how Alex made the mirror, so we highly recommend giving it a watch if you’re feeling inspired to make your own.

Seriously, we really want to make one of these for Raspberry Pi Towers!

As always, be sure to subscribe to Super Make Something on YouTube and leave a comment on the video if, like us, you love the project. Most online makers are producing content such as this with very little return on their investment, so every like and subscriber really does make a difference.

The post Really, really awesome Raspberry Pi NeoPixel LED mirror appeared first on Raspberry Pi.

The Nest Box: DIY Springwatch with Raspberry Pi

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/the-nest-box-diy-springwatch/

Last week, lots and lots of you shared your Raspberry Pi builds with us on social media using the hashtag #IUseMyRaspberryPiFor. Jay Wainwright from Liverpool noticed the conversation and got in touch to tell us about The Nest Box, which uses Raspberry Pi to bring impressively high-quality images and video from British bird boxes to your Facebook feed.

Jay runs a small network of livestreaming nest box cameras, with three currently sited and another three in the pipeline; excitingly, the new ones will include a kestrel box and a barn owl box! During the spring, all the cameras stream live to The Nest Box’s Facebook page, which has steadily built a solid following of several thousand wildlife fans.

A pair of blue tits feeds their chicks in a woolly nest

The Nest Box’s setup uses a Raspberry Pi and Camera Module, along with a Raspberry Pi PoE HAT to provide both power and internet connectivity, so there’s only one cable connection to weatherproof. There’s also a custom HAT that Jay has designed to control LED lights and to govern the Raspberry Pi Camera Module’s IR filter, ensuring high-quality images both during the day and at night. To top it all off, he has written some Python code to record visitors to the nest boxes and go into live streaming mode whenever the action is happening.
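
Jay’s own code isn’t published in the post, but the core idea (watch for activity, then record) can be sketched with the picamera library and a simple frame-differencing check. The resolution, threshold, and clip length below are arbitrary placeholder choices:

import numpy as np
from time import sleep
from picamera import PiCamera

camera = PiCamera(resolution=(320, 240))
previous = np.empty((240, 320, 3), dtype=np.uint8)
current = np.empty((240, 320, 3), dtype=np.uint8)

camera.capture(previous, format='rgb')
while True:
    sleep(1)
    camera.capture(current, format='rgb')
    # mean absolute difference between frames as a crude "is anything moving?" measure
    change = np.abs(current.astype(int) - previous.astype(int)).mean()
    if change > 10:
        camera.start_recording('visit.h264')   # in the real build: stream or save the visit
        camera.wait_recording(30)              # record a 30-second clip
        camera.stop_recording()
    previous, current = current, previous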

As we can see from this nest box design for swifts, shown on the project’s crowdfunding profile, plenty of thought has evidently been put into the design of the boxes so that they provide tempting quarters for their feathered occupants while also accommodating all the electronic components.

Follow The Nest Box on Facebook to add British birds into your social media mix — whatever you’ve got now, I’ll bet all tomorrow’s coffees that it’ll be an improvement. And if you’re using Raspberry Pi for a wildlife project, or you’ve got plans along those lines, let us know in the comments.

The post The Nest Box: DIY Springwatch with Raspberry Pi appeared first on Raspberry Pi.

Real-life DOR-15 bowler hat from Disney’s Meet the Robinsons

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/real-life-dor-15-bowler-hat-from-disneys-meet-the-robinsons/

Why wear a boring bowler hat when you can add technology to make one of Disney’s most evil pieces of apparel?

Meet the Robinsons

Meet the Robinsons is one of Disney’s most underrated movies. Thank you for coming to my TED talk.

What’s not to love? Experimental, futuristic technology, a misunderstood villain, lessons of love and forgiveness aplenty, and a talking T-Rex!

For me, one of the stand-out characters of Meet the Robinsons is DOR-15, a best-of-intentions experiment gone horribly wrong. Designed as a helper hat, DOR-15 instead takes over the mind of whoever is wearing it, hellbent on world domination.

Real-life DOR-15

Built using a Raspberry Pi and the MATRIX Voice development board, the real-life DOR-15, from Team MATRIX Labs, may not be ready to take over the world, but it’s still really cool.

With a plethora of built-in audio sensors, the MATRIX Voice directs DOR-15 towards whoever is making sound, while a series of servos wiggle 3D‑printed legs for added creepiness.

This project uses ODAS (Open embeddeD Audition System) and some custom code to move a servo motor in the direction of the most concentrated incoming sound across a 180-degree arc. This enables the hat to face a person calling to it.
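
The team’s walkthrough has the full code; the final servo move boils down to mapping the direction-of-arrival angle reported by ODAS onto a servo position, something like this gpiozero sketch (the GPIO pin and angle limits are assumptions):

from gpiozero import AngularServo

servo = AngularServo(18, min_angle=-90, max_angle=90)   # servo signal wire on GPIO 18

def face_sound(doa_degrees):
    """Turn the hat towards a direction of arrival given in the 0-180 degree range."""
    servo.angle = max(-90, min(90, doa_degrees - 90))

face_sound(135)   # a voice 45 degrees to one side of centre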

The added wiggly spider legs come courtesy of this guide by the delightful Jorvon Moss, whom HackSpace readers will remember from issue 21.

In their complete Hackster walkthrough, Team MATRIX Labs talk you through how to build your own DOR-15, including all the files needed to 3D‑print the legs.

Realising animated characters and props

So, what fictional wonder would you bring to life? Your own working TARDIS? Winifred’s spellbook? Mary Poppins’ handbag? Let us know in the comments below.

The post Real-life DOR-15 bowler hat from Disney’s Meet the Robinsons appeared first on Raspberry Pi.

Securely tailor your TV viewing with BBC Box and Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/securely-tailor-your-tv-viewing-with-bbc-box-and-raspberry-pi/

Thanks to BBC Box, you might be able to enjoy personalised services without giving up all your data. Sean McManus reports:

One day, you could watch TV shows that are tailored to your interests, thanks to BBC Box. It pulls together personal data from different sources in a household device, and gives you control over which apps may access it.

“If we were to create a device like BBC Box and put it out there, it would allow us to create personalised services without holding personal data,” says Max Leonard.

TV shows could be edited on the device to match the user’s interests, without those interests being disclosed to the BBC. One user might see more tech news and less sport news, for example.

BBC Box was partly inspired by a change in the law that gives us all the right to reuse data that companies hold on us. “You can pull out data dumps, but it’s difficult to do anything with them unless you’re a data scientist,” explains Max. “We’re trying to create technologies to enable people to do interesting things with their data, and allow organisations to create services based on that data on your behalf.”

Building the box

BBC Box is based on Raspberry Pi 3B+, the most powerful model available when this project began. “Raspberry Pi is an amazing prototyping platform,” says Max. “Relatively powerful, inexpensive, with GPIO, and able to run a proper OS. Most importantly, it can fit inside a small box!”

That prototype box is a thing of beauty, a hexagonal tube made of cedar wood. “We created a set of principles for experience and interaction with BBC Box and themes of strength, protection, and ownership came out very strongly,” says Jasmine Cox. “We looked at shapes in nature and architecture that were evocative of these themes (beehives, castles, triangles) and played with how they could be a housing for Raspberry Pi.”

The core software for collating and managing access to data is called Databox. Alpine Linux was chosen because it’s “lightweight, speedy but most importantly secure”, in Max’s words. To get around problems making GPIO access work on Alpine Linux, an Arduino Nano is used to control the LEDs. Storage is a 64GB microSD card, and apps run inside Docker containers, which helps to isolate them from each other.

Combining data securely

The BBC has piloted two apps based on BBC Box. One collects your preferred type of TV programme from BBC iPlayer and your preferred music genre from Spotify. That unique combination of data can be used to recommend events you might like from Skiddle’s database.

Another application helps two users to plan a holiday together. It takes their individual preferences and shows them the destinations they both want to visit, with information about them brought in from government and commercial sources. The app protects user privacy, because neither user has to reveal places they’d rather not visit to the other user, or the reason why.

The team is now testing these concepts with users and exploring future technology options for BBC Box.

The MagPi magazine

This article was lovingly yoinked from the latest issue of The MagPi magazine. You can read issue 87 today, for free, right now, by visiting The MagPi website.

You can also purchase issue 87 from the Raspberry Pi Press website with free worldwide delivery, from the Raspberry Pi Store, Cambridge, and from newsagents and supermarkets across the UK.

The post Securely tailor your TV viewing with BBC Box and Raspberry Pi appeared first on Raspberry Pi.

Build a Raspberry Pi chartplotter for your boat

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/build-a-raspberry-pi-chartplotter-for-your-boat/

Earlier this year, James Conger built a chartplotter for his boat using a Raspberry Pi. Here he is with a detailed explanation of how everything works:

Building your own Chartplotter with a Raspberry Pi and OpenCPN

Provides an overview of the hardware and software needed to put together a home-made Chartplotter with its own GPS and AIS receiver. Cost for this project was about $350 US in 2019.

The entire build cost approximately $350. It incorporates a Raspberry Pi 3 Model B+, a dAISy AIS receiver HAT, a USB GPS module, and a touchscreen display, all hooked up to his boat.



Perfect for navigating the often foggy San Francisco Bay, the chartplotter allows James to track the position, speed, and direction of major vessels in the area, superimposed over high-quality NOAA nautical charts.
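
OpenCPN does the heavy lifting of plotting here; under the hood, the GPS and AIS receivers simply stream NMEA 0183 sentences over serial for it to consume. Purely as an illustration (this isn’t part of James’s build), reading the position sentences in Python might look like this, with typical values assumed for the device path and baud rate:

import serial    # pyserial
import pynmea2

with serial.Serial('/dev/ttyACM0', 9600, timeout=1) as gps:
    for _ in range(50):
        line = gps.readline().decode('ascii', errors='ignore').strip()
        if line.startswith(('$GPRMC', '$GNRMC')):    # recommended minimum position sentence
            msg = pynmea2.parse(line)
            print(f"lat {msg.latitude:.5f}  lon {msg.longitude:.5f}  speed {msg.spd_over_grnd} kn")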

Raspberry Pi at sea

For more nautically themed Raspberry Pi projects, check out Rekka Bellum and Devine Lu Linvega’s stunning Barometer and Ufuk Arslan’s battery-saving IoT boat hack.

The post Build a Raspberry Pi chartplotter for your boat appeared first on Raspberry Pi.

Raspberry Pi retro gaming on Reddit

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-retro-gaming-on-reddit/

Reddit was alive with the sound of retro gaming this weekend.

First out to bat is this lovely minimalist, wall-mounted design built by u/sturnus-vulgaris, who states:

I had planned on making a bar top arcade, but after I built the control panel, I kind of liked the simplicity. I mounted a frame of standard 2×4s cut with a miter saw. Might trim out in black eventually (I have several panels I already purchased), but I do like the look of wood.

Next up, a build with Lego bricks, because who doesn’t love Lego bricks?

Just completed my mini arcade cabinet that consists of approximately 1,000 [Lego bricks], a Raspberry Pi, a SNES style controller, Amazon Basics computer speakers, and a 3.5″ HDMI display.

u/RealMagicman03 shared the build here, so be sure to give them an upvote and leave a comment if, like us, you love Raspberry Pi projects that involve Lego bricks.

And lastly, this wonderful use of the Raspberry Pi Compute Module 3+, proving yet again how versatile the form factor can be.

CM3+Lite cartridge for GPi case. I made this cartridge for fun at first, and it works as all I expected. Now I can play more games I like on this lovely portable stuff. And CM3+ is as powerful as RPi3B+, I really like it.

Creator u/martinx72 goes into far more detail in their post, so be sure to check it out.

What other projects did you see this weekend? Share your links with us in the comments below.

The post Raspberry Pi retro gaming on Reddit appeared first on Raspberry Pi.

Musically synced car windscreen wipers using Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/musically-synced-car-windscreen-wipers-using-raspberry-pi/

Hey there! I’ve just come back from a two-week vacation, Liz and Helen are both off sick, and I’m not 100% sure I remember how to do my job.

So, while I figure out how to social media and word write, here’s this absolutely wonderful video from Ian Charnas, showing how he hacked his car windscreen wipers to sync with his stereo.

FINALLY! Wipers Sync to Music

In this video, I modify my car so the windshield wipers sync to the beat of whatever music I’m listening to. You can own this idea!

Ian will be auctioning off the intellectual property rights to his dancing wipers on eBay, with all proceeds going to a charity supporting young makers.

The post Musically synced car windscreen wipers using Raspberry Pi appeared first on Raspberry Pi.

Growth Monitor pi: an open monitoring system for plant science

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/growth-monitor-pi-an-open-monitoring-system-for-plant-science/

Plant scientists and agronomists use growth chambers to provide consistent growing conditions for the plants they study. This reduces confounding variables – inconsistent temperature or light levels, for example – that could render the results of their experiments less meaningful. To make sure that conditions really are consistent both within and between growth chambers, which minimises experimental bias and ensures that experiments are reproducible, it’s helpful to monitor and record environmental variables in the chambers.

A neat grid of small leafy plants on a black plastic tray. Metal housing and tubing is visible to the sides.

Arabidopsis thaliana in a growth chamber on the International Space Station. Many experimental plants are less well monitored than these ones.
(“Arabidopsis thaliana plants […]” by Rawpixel Ltd (original by NASA) / CC BY 2.0)

In a recent paper in Applications in Plant Sciences, Brandin Grindstaff and colleagues at the universities of Missouri and Arizona describe how they developed Growth Monitor pi, or GMpi: an affordable growth chamber monitor that provides wider functionality than other devices. As well as sensing growth conditions, it sends the gathered data to cloud storage, captures images, and generates alerts to inform scientists when conditions drift outside of an acceptable range.

The authors emphasise – and we heartily agree – that you don’t need expertise with software and computing to build, use, and adapt a system like this. They’ve written a detailed protocol and made available all the necessary software for any researcher to build GMpi, and they note that commercial solutions with similar functionality range in price from $10,000 to $1,000,000 – something of an incentive to give the DIY approach a go.

GMpi uses a Raspberry Pi 3 Model B+, to which are connected temperature-humidity and light sensors from our friends at Adafruit, as well as a Raspberry Pi Camera Module.

The team used open-source app Rclone to upload sensor data to a cloud service, choosing Google Drive since it’s available for free. To alert users when growing conditions fall outside of a set range, they use the incoming webhooks app to generate notifications in a Slack channel. Sensor operation, data gathering, and remote monitoring are supported by a combination of software that’s available for free from the open-source community and software the authors developed themselves. Their package GMPi_Pack is available on GitHub.
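
The authors’ own alerting code lives in GMPi_Pack, but the Slack side of it amounts to posting a small piece of JSON to an incoming webhook URL whenever a reading drifts out of range, roughly like this sketch (the webhook URL and temperature range are placeholders):

import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"   # placeholder
ACCEPTABLE_TEMP_C = (20.0, 26.0)                                     # placeholder range

def check_temperature(temp_c):
    """Post an alert to the Slack channel if the chamber temperature is out of range."""
    low, high = ACCEPTABLE_TEMP_C
    if not low <= temp_c <= high:
        requests.post(SLACK_WEBHOOK_URL,
                      json={"text": f"Growth chamber at {temp_c:.1f} C, outside {low}-{high} C"})

check_temperature(31.2)   # would trigger an alert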

With a bill of materials amounting to something in the region of $200, GMpi is another excellent example of affordable, accessible, customisable open labware that’s available to researchers and students. If you want to find out how to build GMpi for your lab, or just for your greenhouse, Affordable remote monitoring of plant growth in facilities using Raspberry Pi computers by Grindstaff et al. is available on PubMed Central, and it includes appendices with clear and detailed set-up instructions for the whole system.

The post Growth Monitor pi: an open monitoring system for plant science appeared first on Raspberry Pi.

Raspberry Pi in space!

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-in-space/

We love ‘Raspberry Pi + space’ stuff. There, I’ve said it. No taksies backsies.

From high-altitude balloon projects transporting Raspberry Pis to near space, to our two Astro Pi units living aboard the International Space Station, we simply can’t get enough.

Seriously, if you’ve created anything space-related using a Raspberry Pi, please tell us!

Capturing Earth from low orbit

Surrey Satellite Technology Ltd (SSTL) sent a Raspberry Pi Zero to space as part of their Demonstration of Technology (DoT-1) satellite, launched aboard a Soyuz rocket in July.

Earth captured from Low Earth Orbit by a Raspberry Pi

So, not that we’re complaining, but why did they send the Raspberry Pi Zero to space to begin with? Well, why not? As SSTL state:

Whilst the primary objective of the 17.5kg self-funded DoT-1 satellite is to demonstrate SSTL’s new Core Data Handling System (Core-DHS), accommodation was made available for some additional experimental payloads including the Raspberry Pi camera experiment which was designed and implemented in conjunction with the Surrey Space Centre.

Essentially, if you can fit a Raspberry Pi into your satellite, you should.


Managing Director of SSTL Sarah Parker went on to say that “the success of the Raspberry Pi camera experiment is an added bonus which we can now evaluate for future missions where it could be utilised for spacecraft ‘selfies’ to check the operation of key equipments, and also for outreach activities.”

SSTL’s very snazzy-looking Demonstration of Technology (DoT-1) satellite

The onboard Raspberry Pi Zero was equipped with a Raspberry Pi Camera Module and a DesignSpark M12 Mount Lens. Image data captured on the space-bound Raspberry Pi was sent back to the SSTL ground station via the Core-DHS.

So, have you sent a Raspberry Pi to space? Or anywhere else we wouldn’t expect a Raspberry Pi to go? Let us know in the comments!

The post Raspberry Pi in space! appeared first on Raspberry Pi.

Control a vintage Roland pen plotter with Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/control-vintage-roland-pen-plotter/

By refitting a vintage Roland DG DXY-990 pen plotter using Raspberry Pi, the members of Liege Hackerspace in Belgium have produced a rather nifty build that writes out every tweet mentioning a specific hashtag.

Liege Hackerspace member u/iooner first shared an image of the plotter yesterday, and fellow Redditors called for video of the project in action immediately.

Watch the full video here. And to see the code for the project, visit the Liege Hackerspace GitHub.
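
For a flavour of what driving one of these plotters involves: Roland’s DXY series understands an HPGL-compatible command set sent over a serial link, so a minimal sketch (not the hackerspace’s code; the port, baud rate, mode, and coordinates are all assumptions) could look like this:

import serial   # pyserial

with serial.Serial('/dev/ttyUSB0', 9600, timeout=1) as plotter:
    plotter.write(b'IN;SP1;')                # initialise and pick up pen 1
    plotter.write(b'PU;PA500,500;')          # lift the pen and move to an absolute position
    plotter.write(b'LBHello, Twitter!\x03')  # draw a text label, terminated by ETX
    plotter.write(b'PU;SP0;')                # pen up and park the pen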

The post Control a vintage Roland pen plotter with Raspberry Pi appeared first on Raspberry Pi.

Controlling a boom lift with a Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/controlling-a-boom-lift-with-a-raspberry-pi/

Do you have a spare Raspberry Pi lying around? And a Bluetooth games controller? Do you have access to boom lifts or other heavy machinery?

Well, then we most certainly (do not) have the project for you.

Allow us to introduce what is (possibly, probably, hopefully) the world’s first Raspberry Pi–controlled boom lift. Weighing in at 13,000lb, this is the epitome of DON’T try this at home.

Please don’t!

Raspberry Pi-controlled boom lift

Shared on Reddit over the weekend, u/Ccundiff12’s project received many an upvote and concerned comment, but, as the poster explains, hacking the boom is a personal project for personal use to fix a specific problem — thankfully not something built for the sake of having some fun.

Meet STRETCH. Circa 1989 Genie Boom that I bought (cheap) from a neighbor. I use it to trim trees around my property. Its biggest problem was that it always got stuck. It’s not really an off-road vehicle. It used to take two people to move it around… one to drive the lift, and the other to push it with the tractor when it lost traction. The last time it got stuck, I asked my wife to assist by driving one of the two…….. the next day I started splicing into the control system. Now I can push with the tractor & run the boom via remote!

Visit the original Reddit post for more information on the build. And remember: please do not try this at home.

The post Controlling a boom lift with a Raspberry Pi appeared first on Raspberry Pi.

Use PlayStation Buzz! controllers with a Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/use-playstation-buzz-controllers-with-a-raspberry-pi/

Buzz! was a favourite amongst my university housemates and me. With popular culture questions asked by an animated Jason Donovan, answered using real-life quiz controllers with a big red button, what’s not to like?

But, as with most of the tech available in the early 2000s, my Buzz! controllers now sit in a box somewhere, dusty and forgotten.

That’s why it is so goshdarn delightful to see PiMyLifeUp breathe new life into these awesome-looking games controllers.

Bringing Buzz! back

The tutorial uses the hidapi library to communicate with the controllers, allowing them to control functions through the Raspberry Pi, and the Raspberry Pi to control the LED within the big red button.

By the end of this tutorial, you will have learned how to read information about all your USB devices, learned how to read data that the devices are sending back and also how to write a library that will act as a simple wrapper to dealing with the device.

Aside from the Buzz! controllers, available on eBay or similar for a few pounds, you only need a Raspberry Pi and its essential peripherals to get started, as the controllers connect directly via USB — thanks, Buzz!

PiMyLifeUp’s tutorial is wonderfully detailed, explaining the hows and whys of the lines of code needed to turn your old Buzz! controllers into a quiz game written in Python that uses the coloured buttons to answer multiple-choice questions.
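
To give a flavour of what the tutorial’s wrapper is doing under the hood, here’s a rough sketch using the Python hid package. The vendor and product IDs below are what wired Buzz! controllers are commonly reported as, but treat them as assumptions and check the hid.enumerate() output for your own hardware:

import time
import hid   # Python bindings for hidapi

VENDOR_ID, PRODUCT_ID = 0x054C, 0x1000   # assumed IDs for wired Buzz! controllers

for info in hid.enumerate():
    print(hex(info['vendor_id']), hex(info['product_id']), info['product_string'])

device = hid.device()
device.open(VENDOR_ID, PRODUCT_ID)
device.set_nonblocking(True)

while True:
    report = device.read(8)       # each report packs the button states into a few bytes
    if report:
        print(report)             # decode these bytes to work out which button was pressed
    time.sleep(0.01)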

Guitar Hero, dance mats, Donkey Kong Bongos — what other gaming peripherals would you like to bring back to life?

The post Use PlayStation Buzz! controllers with a Raspberry Pi appeared first on Raspberry Pi.

Raspberry Pi Sense HAT impact recorder for your car

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-sense-hat-impact-recorder-for-your-car/

Let the accelerometer and gyroscope of your Raspberry Pi Sense HAT measure and record impact sustained in a car collision.

Raspberry Pi Sense HAT

The Raspberry Pi Sense HAT was originally designed for the European Astro Pi Challenge, inviting schoolchildren to code their own experiments for two Raspberry Pi units currently orbiting the Earth aboard the International Space Station.

The Sense HAT is kitted out with an 8×8 RGB LED matrix and a five-button joystick, and it houses an array of useful sensors, including an accelerometer and gyroscope.

And it’s these two sensors that Instructables user Ashu_d has used for their Impact Recorder for Vehicles.

Impact Recorder for Vehicles

“Impact Recorder is designed to record impact sustained to a vehicle while driving or stationary,” Ashu_d explains. Alongside the Raspberry Pi and Sense HAT, the build also uses a Raspberry Pi Camera Module to record footage, saving video and/or picture files to the SD card for you to examine after a collision. “The impacts are stored in the database in the form of readings as well as video/picture.”

By following Ashu_d’s Instructables tutorial, you’re essentially building yourself a black box for your car, recording impact data whenever the Sense HAT detects readings outside the standard parameters of your daily commute.
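
Ashu_d’s full project layers Node-RED, MQTT, and the Camera Module on top, but the basic trigger (watching the accelerometer for a spike) looks something like this minimal sketch, where the 2.5g threshold is an arbitrary placeholder:

from time import sleep
from sense_hat import SenseHat

sense = SenseHat()
IMPACT_THRESHOLD_G = 2.5   # placeholder; tune to what counts as a real impact

while True:
    accel = sense.get_accelerometer_raw()                       # x, y, z readings in g
    g_force = (accel['x']**2 + accel['y']**2 + accel['z']**2) ** 0.5
    if g_force > IMPACT_THRESHOLD_G:
        print(f"Impact detected: {g_force:.2f} g")              # the real build logs this and saves footage
    sleep(0.01)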

“Upon impact, remote users can be verified in real time,” they continue, “and remote users can then watch the saved video or take remote access to the Pi Camera Module and watch events accordingly.”

Ashu_d goes into great detail on how to use Node-RED and MQTT to complete the project, how you can view video in real time using VLC, and how each element works to create the final build over at Instructables.

The post Raspberry Pi Sense HAT impact recorder for your car appeared first on Raspberry Pi.