Tag Archives: raspberry pi 3

(Raspberry) Pi Commander | The MagPi 95

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/pi-commander-the-magpi-95/

Adrien Castel’s idea of converting an old electronic toy into a retro games machine was no flight of fancy, as David Crookes discovers

The 1980s was a golden era for imaginative electronic toys. Children would pester their parents for a Tomytronic 3D or a Nintendo Game & Watch. And they would enviously eye anyone who had a Tomy Turnin’ Turbo Dashboard with its promise of replicating the thrill of driving (albeit without the traffic jams).

All of the buttons, other than the joystick, are original to the toy – as are the seven red LED lights

Two years ago, maker Matt Brailsford turned that amazing toy into a fully working Out Run arcade machine and Adrien Castel was smitten. “I loved the fact that he’d upcycled an old toy and created something that could be enjoyed as a grown-up,” he says. “But I wanted to push the simulation a bit further and I thought a flying sim could do the trick.”

“I didn’t want to modify the look of the toy”

Ideas began flying around Adrien’s mind. “I knew what I wanted to achieve so I made an overall plan in my head,” he recalls. First he found the perfect toy: a battery-powered Sky Fighter F-16 tabletop game made by Dival. He then decided to base his build around a Raspberry Pi 3A+. “It’s the perfect hardware for projects like this because of its flexibility,” Adrien says.

Taking off

The toy needed some work. Its original bright red joystick was missing and Adrien knew he’d have to replace the original screen with a TFT LCD. To do this, he 3D-printed a frame to fit the TFT display and he created a smaller base for the replacement joystick. Adrien also changed the microswitches for greater sensitivity but he didn’t go overboard with the changes.

The games can make use of the full screen. Adrien would have liked a larger screen, but the original ratio oddly lay between 4:3 and 16:9, making a bigger display harder to find

“I knew I would have to adapt some parts for the joystick and for the screen, but I didn’t want to modify the look of the toy,” Adrien explains. “To be honest, modifying the toy would have involved some sanding and painting and I was worried that it would ruin the overall effect of the project if it was badly executed.”

A Raspberry Pi 3A+ sits at the heart of the Pi Commander, alongside a mini audio amplifier, and it’s wired up to components within the toy

As such, a challenge was set. “I had to keep most of the original parts such as throttle levers and LEDs and adapt them to the new build,” he says. “This meant getting them to work together with the system and it also meant using the original PCB, getting rid of the components and re-routing the electronics to plug on the GPIOs.”

There were some enhancements. Adrien soldered a PAM8403 3W class-D audio amplifier to Raspberry Pi and this allowed a basic speaker to replace the original for better sound. But there were some compromises too.

The original PCB was used and the electronics were re-routed. All the components need to work between 3.3V and 5V, drawing as little current as possible, while fitting into a tight space

“At first I thought the screen could be bigger than the one I used, but the round shape of the cockpit didn’t give much space to fit a screen larger than four inches.” He also believes the project could be improved with a better joystick: “The one I’ve used is a simple two-button arcade stick with a jet fighter look.”

Flying high

By using the retro gaming OS Recalbox (based on EmulationStation and RetroArch), however, he’s been able to perfect the overall feel. “Recalbox allowed me to create a custom front end that matches the look of a jet fighter,” he explains. It also means the Pi Commander plays shoot-’em-up games alongside open-source simulators like FlightGear (flightgear.org). “It’s a lot of fun.”

Read The MagPi for free!

Find more fantastic projects, tutorials, and reviews in The MagPi #95, out now! You can get The MagPi #95 online at our store, or in print from all good newsagents and supermarkets. You can also access The MagPi magazine via our Android and iOS apps.

Don’t forget our super subscription offers, which include a free gift of a Raspberry Pi Zero W when you subscribe for twelve months.

And, as with all our Raspberry Pi Press publications, you can download the free PDF from our website.

The post (Raspberry) Pi Commander | The MagPi 95 appeared first on Raspberry Pi.

Retro Nixie tube lights get smart

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/retro-nixie-tube-lights-get-smart/

Nixie tubes: these electronic devices, which can display numerals or other information using glow discharge, made their first appearance in 1955, and they remain popular today because of their cool, vintage aesthetic. Though lots of companies manufactured these items back in the day, the name ‘Nixie’ is said to derive from a Burroughs Corporation device named NIX I, an abbreviation of ‘Numeric Indicator eXperimental No. 1’.

We liked this recent project shared on reddit, where user farrp2011 used Raspberry Pi to make their Nixie tube display smart enough to tell the time.

A still from Farrp2011’s video shows they’ve linked the bulb displays up to tell the time

Farrp2011’s set-up comprises six Nixie tubes controlled by Raspberry Pi 3, along with eight SN74HC shift registers that switch the 60 transistors grounding the cathode pins of the digits to be displayed on the Nixie tubes. Sounds complicated? Well, that’s why farrp2011 is our favourite kind of DIY builder — they’ve put all the code for the project on GitHub.
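The GitHub repo has farrp2011’s actual code; purely as an illustration of the general approach — clocking one bit per digit-driving transistor out to a shift-register chain — a minimal Python sketch might look like this. The pin numbers, bit ordering, and the use of the RPi.GPIO library are assumptions for the sketch, not details taken from the project.

import time
import RPi.GPIO as GPIO   # standard Raspberry Pi GPIO library

# Assumed wiring for the shift-register chain: data, clock, and latch lines
DATA, CLOCK, LATCH = 17, 27, 22

GPIO.setmode(GPIO.BCM)
GPIO.setup([DATA, CLOCK, LATCH], GPIO.OUT, initial=GPIO.LOW)

def shift_out(bits):
    # Clock a list of 0/1 values into the chain, then latch them onto the outputs
    GPIO.output(LATCH, GPIO.LOW)
    for bit in bits:
        GPIO.output(DATA, bit)
        GPIO.output(CLOCK, GPIO.HIGH)
        GPIO.output(CLOCK, GPIO.LOW)
    GPIO.output(LATCH, GPIO.HIGH)

def time_to_bits(now):
    # One bit per digit-driving transistor: six tubes x ten cathodes = 60 bits
    bits = []
    for digit in time.strftime("%H%M%S", now):
        column = [0] * 10
        column[int(digit)] = 1   # ground only the cathode of the digit to show
        bits.extend(column)
    return bits

try:
    while True:
        shift_out(time_to_bits(time.localtime()))
        time.sleep(1)
finally:
    GPIO.cleanup()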

Tales of financial woe from users trying to source their own Nixie tubes litter the comments section on the reddit post, but farrp2011 says they were able to purchase the ones used in this project for about $15 each on eBay. Here’s a closer look at the bulbs, courtesy of a previous post by farrp2011 sharing an earlier stage of the project…

Farrp2011 got started with one, then two Nixie bulbs before building up to six for the final project

Digging through the comments, we learned that for the video, farrp2011 turned their house lights off to give the Nixie tubes a stronger glow. So the tubes are not as bright in real life as they appear. We also found out that the drop resistor is 22 kΩ, with a 170 V supply. Another comments-section nugget we liked was the name of the voltage booster boards used for each bulb: “Pile o’Poo”.

Upcoming improvements farrp2011 has planned include displaying the date, temperature, and Bitcoin exchange rate, but more suggestions are welcome. They’re also going to add some more capacitors to help with a noise problem and remove the need for the tubes to be turned off before changing the display.

And for extra nerd-points, we found this mesmerising video from Dalibor Farný showing the process of making Nixie tubes:

The post Retro Nixie tube lights get smart appeared first on Raspberry Pi.

How to set up OctoPrint on your Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/how-to-set-up-octoprint-on-your-raspberry-pi/

If you own a 3D printer, you’ll likely have at least heard of OctoPrint from the ever benevolent 3D printing online community. It has the potential to transform your 3D printing workflow for the better, and it’s very easy to set up. This guide will take you through the setup process step by step, and give you some handy tips along the way.

OctoPrint

Before we start finding out how to install OctoPrint, let’s look at why you might want to. OctoPrint is a piece of open-source software that allows us to add WiFi functionality to any 3D printer with a USB port (which is pretty much all of them). More specifically, you’ll be able to drop files from your computer onto your printer, start/stop prints, monitor your printer via a live video feed, control the motors, control the temperature, and more, all from your web browser. Of course, with great power comes great responsibility — 3D printers have parts that are hot enough to cause fires, so make sure you have a safe setup, which may include not letting it run unsupervised.

OctoPrint ingredients

• Raspberry Pi 3 (or newer)
• microSD card
• Raspberry Pi power adapter
• USB cable (the connector type will depend on your printer)
• Webcam/Raspberry Pi Camera Module (optional)
• 3D-printed camera mount (optional)

Before we get started, it is not recommended that anything less than a Raspberry Pi 3 is used for this project. There have been reports of limited success using OctoPrint on a Raspberry Pi Zero W, but only if you have no intention of using a camera to monitor your prints. If you want to try this with a Pi Zero or an older Raspberry Pi, you may experience unexpected print failures.

Download OctoPi

Firstly, you will need to download the latest version of OctoPi from the OctoPrint website. OctoPi is a Raspbian distribution that comes with OctoPrint, video streaming software, and CuraEngine for slicing models on your Raspberry Pi. When this has finished downloading, unzip the file and put the resulting IMG file somewhere handy.

Next, we need to flash this image onto our microSD card. We recommend using Etcher to do this, due to its minimal UI and ease of use; plus it’s also available to use on both Windows and Mac. Get it here: balena.io/etcher. When Etcher is installed and running, you’ll see the UI displayed. Simply click the Select Image button and find the IMG file you unzipped earlier. Next, put your microSD card into your computer and select it in the middle column of the Etcher interface.

Finally, click on Flash!, and while the image is being burned onto the card, get your WiFi router details, as you’ll need them for the next step.

Now that you have your operating system, you’ll want to add your WiFi details so that the Raspberry Pi can automatically connect to your network after it’s booted. To do this, remove the microSD card from your computer (Etcher will have ‘ejected’ the card after it has finished burning the image onto it) and then plug it back in again. Navigate to the microSD card on your computer — it should now be called boot — and open the file called octopi-wpa-supplicant.txt. Editing this file using WordPad or TextEdit can cause formatting issues; we recommend using Notepad++ to update this file, but there are instructions within the file itself to mitigate formatting issues if you do choose to use another text editor. Find the section that begins ## WPA/WPA2 secured and remove the hash signs from the four lines below this one to uncomment them. Finally, replace the SSID value and the PSK value with the name and password for your WiFi network, respectively (keeping the quotation marks). See the example below for how this should look.
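Once the hashes are removed and your own details are in place, that part of octopi-wpa-supplicant.txt should look something like this (the network name and password below are placeholders for your own):

## WPA/WPA2 secured
network={
  ssid="YourNetworkName"
  psk="YourWiFiPassword"
}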

Further down in the file, there is a section for what country you are in. If you are using OctoPrint in the UK, leave this as is (by default, the UK is selected). However, if you wish to change this, simply comment the UK line again by adding a # before it, and uncomment whichever country you are setting up OctoPrint in. The example below shows how the file will look if you are setting this up for use in the US:

# Uncomment the country your Pi is in to activate Wifi in RaspberryPi 3 B+ and above
# For full list see: https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2
#country=GB # United Kingdom
#country=CA # Canada
#country=DE # Germany
#country=FR # France
country=US # United States

When the changes have been made, save the file and then eject/unmount and remove the microSD card from your computer and put it into your Raspberry Pi. Plug the power supply in, and go and make a cup of tea while it boots up for the first time (this may take around ten minutes). Make sure the Raspberry Pi is running as expected (i.e. check that the green status LED is flashing intermittently). If you’re using macOS, visit octopi.local in your browser of choice. If you’re using Windows, you can find OctoPrint by clicking on the Network tab in the sidebar. It should be called OctoPrint instance on octopi – double-clicking on this will open the OctoPrint dashboard in your browser.

If you see the screen shown above, then congratulations! You have set up OctoPrint.

Not seeing that OctoPrint splash screen? Fear not, you are not the first. While a full list of issues is beyond the scope of this article, common things to check include: that your WiFi details are entered correctly in the octopi-wpa-supplicant.txt file, that your Raspberry Pi is working correctly (plug it into a monitor and watch what happens during boot), and that it isn’t out of range of your WiFi router. There’s a detailed list of troubleshooting suggestions on the OctoPrint website.

Printing with OctoPrint

We now have the opportunity to set up OctoPrint for our printer using the handy wizard. Most of this is very straightforward — setting up a password, signing up to send anonymous usage stats, etc. — but there are a few sections which require a little more thought.

We recommend enabling the connectivity check and the plug-ins blacklist to help keep things nice and stable. If you plan on using OctoPrint as your slicer as well as a monitoring tool, then you can use this step to import a Cura profile. However, we recommend skipping this step as it’s much quicker (and you can use a slicer of your choice) to slice the model on your computer, and then send the finished G-code over.

Finally, we need to put in our printer details. Above, we’ve included some of the specs of the Creality Ender-3 as an example. If you can’t find the exact details of your printer, a quick web search should show what you need for this section.

The General tab can have anything in it; it’s just an identifier for your own use. Print bed & build volume should be easy to find out — if not, you can measure your print bed and find out the position of the origin by looking at your Cura printer profile. Leave Axes as default; for the Hotend and extruder section, the defaults are almost certainly fine (unless you’ve changed your nozzle; 0.4mm is the default diameter for most consumer printers).

OctoPrint is better with a camera

Now that you’re set up with OctoPrint, you’re ready to start printing. Turn off your Raspberry Pi, then plug it into your 3D printer. After it has booted up, open OctoPrint again in your browser and take your newly WiFi-enabled printer for a spin by clicking the Connect button. After it has connected, you’ll be able to set the hot end and bed temperature, then watch as the real-time readings are updated.

In the Control tab, we can see the camera stream (if you’re using one) and the motor controls, as well as commands to home the axes. There’s a G-code file viewer to look through a cross-section of the currently loaded model, and a terminal to send custom G-code commands to your printer. The last tab is for making time-lapses; however, there is a plug-in available to help with this process.

Undoubtedly the easiest way to set up video monitoring of your prints is to use the official Raspberry Pi Camera Module. There are dozens of awesome mounts on Thingiverse for a Raspberry Pi Camera Module, to allow you to get the best angle of your models as they print. There are also some awesome OctoPrint-themed Raspberry Pi cases to house your new printer brains. While it isn’t officially supported by OctoPrint, you can use a USB webcam instead if you have one handy, or just want some very high-quality video streams. The OctoPrint wiki has a crowdsourced list of webcams known to work, as well as a link for the extra steps needed to get the webcam working correctly.

As mentioned earlier, our recommended way of printing a model using OctoPrint is to first use your slicer as you would if you were creating a file to save to a microSD card. Once you have the file, save it somewhere handy on your computer, and open the OctoPrint interface. In the bottom left of the screen, you will see the Upload File button — click this and upload the G-code you wish to print.

You’ll see the file/print details appear, including information on how long it’ll take for the object to print. Before you kick things off, check out the G-code Viewer tab on the right. You can not only scroll through the layers of the object, but, using the slider at the bottom, you can see the exact pattern the 3D printer will use to ‘draw’ each layer. Now click Print and watch your printer jump into action!

OctoPrint has scores of community-created plug-ins, but our favourite, Octolapse, makes beautiful hypnotic time-lapses. What makes them so special is that the plug-in alters the G-code of whatever object you are printing so that once each layer has finished, the extruder moves away from the print to let the camera take an unobstructed shot of the model. The result is an object that seems to grow out of the build plate as if by magic. You’ll not find a finer example of it than here.

Satisfying 3D Prints TimeLapse episode 7 (Prusa I3 Mk3 octopi)


Thanks to Glenn and HackSpace magazine

This tutorial comes fresh from the pages of HackSpace magazine issue 26 and was written by Glenn Horan. Thanks, Glenn.

To get your copy of HackSpace magazine issue 26, visit your local newsagent, the Raspberry Pi Store, Cambridge, or the Raspberry Pi Press online store.

Fans of HackSpace magazine will also score themselves a rather delightful Adafruit Circuit Playground Express with a 12-month subscription. Sweet!

The post How to set up OctoPrint on your Raspberry Pi appeared first on Raspberry Pi.

Raspberry Pi 3 baby monitor | HackSpace magazine #26

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/raspberry-pi-3-baby-monitor-hackspace-magazine-26/

You might have a baby/dog/hamster that you want to keep an eye on when you’re not there. We understand: they’re lovely, especially hamsters. Here’s how HackSpace magazine contributor Dr Andrew Lewis built a Raspberry Pi baby cam to watch over his small creatures…

When a project is going to be used in the home, it pays to take a little bit of extra time on appearance

Wireless baby monitors

You can get wireless baby monitors that have a whole range of great features for making sure your little ones are safe, sound, and sleeping happily, but they come with a hefty price tag.

In this article, you’ll find out how to make a Raspberry Pi-powered streaming camera, and combine it with a built-in I2C sensor pack that monitors temperature, pressure, and humidity. You’ll also see how you can use the GPIO pins on Raspberry Pi to turn an LED night light on and off using a web interface.

The hardware for this project is quite simple, and involves minimal soldering, but the first thing you need to do is to install Raspbian onto a microSD card for your Raspberry Pi. If you’re planning on doing a headless install, you’ll also need to enable SSH by creating an empty file called ssh in the root of the card’s boot partition, along with a file containing your wireless LAN details called wpa_supplicant.conf.
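As a rough example of that second file — the network name, password, and country code below are placeholders you’ll need to change — a minimal wpa_supplicant.conf for a headless Raspbian install looks something like this:

country=GB
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourWiFiPassword"
}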

You can download the code for this as well as the 3D-printable files from our GitHub. You’ll need to transfer the code to the Raspberry Pi. Next, connect the camera, the BME280 board, and the LEDs to the Raspberry Pi, as shown in the circuit diagram.

The BME280 module uses the I2C connection on pins 3 and 5 of the GPIO, taking power from pins 1 and 9. The LEDs connect directly to pins 19 and 20, and the camera cable fits into the camera connector.

Insert the microSD card into the Raspberry Pi and boot up. If everything is working OK, you should be able to see the IP address for your device listed on your hub or router, and you should be able to connect to it via SSH. If you don’t see the Raspberry Pi listed, check your wireless connection details and make sure your adapter is supplying enough power. It’s worth taking the time to assign your Raspberry Pi with a static IP address on your network, so it can’t change its IP address unexpectedly.

Smile for Picamera

Use the raspi-config application to enable the camera interface and the I2C interface. If you’re planning on modifying the code yourself, we recommend enabling VNC access as well, because it will make editing and debugging the code once the device is put together much easier. All that remains on the software side is to update APT, download the babycam.py script, install any dependencies with PIP, and set the script to run automatically. The main dependencies for the babycam.py script are the RPi.bme280 module, Flask, PyAudio, picamera, and NumPy. Chances are that these are already installed on your system by default, with the exception of RPi.bme280, which can be installed by typing sudo pip3 install RPi.bme280 from the terminal. Once all of the dependencies are present, load up the script and give it a test run, and point your web browser at port 8000 on the Raspberry Pi. You should see a webpage with a camera image, controls for the LED lights, and a read-out of the temperature, pressure, and humidity of the room.
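If you want to check the sensor wiring before running the full script, a minimal read-out using the RPi.bme280 module looks something like this. The I2C address 0x76 is typical for BME280 breakout boards, but some use 0x77 — running i2cdetect -y 1 will tell you which yours is.

import smbus2
import bme280

PORT = 1          # I2C bus 1 lives on GPIO pins 3 and 5
ADDRESS = 0x76    # some breakout boards use 0x77 instead

bus = smbus2.SMBus(PORT)
calibration = bme280.load_calibration_params(bus, ADDRESS)

reading = bme280.sample(bus, ADDRESS, calibration)
print("Temperature: {:.1f} °C".format(reading.temperature))
print("Pressure:    {:.1f} hPa".format(reading.pressure))
print("Humidity:    {:.1f} %".format(reading.humidity))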

Finishing a 3D print by applying a thin layer of car body filler and sanding back will give a much smoother surface. This isn’t always necessary, but if your filament is damp or your nozzle is worn, it can make a model look much better when it’s painted

The easiest way to get the babycam.py script to run on boot is to add a line to the rc.local file. Assuming that the babycam.py file is located in your home directory, you should add the line python3 /home/pi/babycam.py & to the rc.local file, just before the line that reads exit 0. It’s very important that you include the ampersand at the end of the line, otherwise the Python script will not be run in a separate process, the rc.local file will never complete, and your Raspberry Pi will never boot.
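The tail end of /etc/rc.local should then look something like this (assuming babycam.py really is in /home/pi):

# ...everything else rc.local already does...
python3 /home/pi/babycam.py &
exit 0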

Tinned Raspberry Pi

With the software and hardware working, you can start putting the case together. You might need to scale the 3D models to suit the tin can you have before you print them out, so measure your tin before you click Print. You’ll also want to remove any inner lip from the top of the can using a can opener, and make a small hole in the side of the can near the bottom for the USB power cable. Next, make a hole in the bottom of the can for the LED cables to pass through.

If you want to add more than a couple of LEDs (or want to use brighter LEDs), you should connect your LEDs to the power input, and use a transistor on the GPIO to trigger them

If you haven’t already done so, solder appropriate leads to your LEDs, and don’t forget to put a 330 Ω resistor in-line on the positive side. The neck of the camera is supported by two lengths of aluminium armature wire. Push the wire up through each of the printed neck pieces, and use a clean soldering iron to weld the pieces together in the middle. Push the neck into the printed top section, and weld into place with a soldering iron from underneath. Be careful not to block the narrow slot with plastic, as this is where the camera cable passes up through the neck and into the camera.

You need to mount the BME280 so that the sensor is exposed to the air in the room. Do this by drilling a small hole in the 3D-printed top piece and hot gluing the sensor into position. If you’re going to use the optional microphone, you can add an extra hole and glue the mic into place in the same way. A short USB port extender will give you enough cable to plug the USB microphone into the socket on your Raspberry Pi.

Paint the tin can and the 3D-printed parts. We found that spray blackboard paint gives a good effect on 3D-printed parts, and PlastiKote stone effect paint made the tin can look a little more tactile than a flat colour. Once the paint is dry, pass the camera cable up through the slot in the neck, and then apply the heat-shrink tubing to cover the neck with a small gap at the top and bottom. Connect the camera to the top of the cable, and push the front piece on to hold it into place. Glue shouldn’t be necessary, but a little hot glue might help if the front parts don’t hold together well.

Push the power cable through the hole in the case, and secure it with a knot and some hot glue. Leave enough cable free to easily remove the top section from the can in future without stressing the wires.

If you’re having trouble getting the armature wire through the 3D-printed parts, try using a drill to help twist the wire through

This is getting heavy

Glue the bottom section onto the can with hot glue, and hot-glue the LEDs into place on the bottom, feeding the cable up through the hole and into the GPIO header. This is a good time to hot-glue a weight into the bottom of the can to improve its stability. I used an old weight from some kitchen scales, but any small weight should be fine. Finally, fix the Raspberry Pi into place on the top piece by either drilling or gluing, then reconnect the rest of the cables, and push the 3D-printed top section into the tin can. If the top section is too loose, you can add a little bit of hot glue to hold things together once you know everything is working.

With the right type of paint, even old tin cans make a good-looking enclosure for a project

That should be all of the steps complete. Plug in the USB and check the camera from a web browser. The babycam.py script includes video, sensors, and light control. If you are using the optional USB microphone, you can expand the functionality of the app to include audio streaming, use cry detection to activate the LEDs (don’t make the LEDs too stimulating or you’ll never get a night’s sleep again), or maybe even add a Bluetooth speaker and integrate a home assistant.

HackSpace magazine issue 26

HackSpace magazine is out now, available in print from your local newsagent, the Raspberry Pi Store in Cambridge, and online from Raspberry Pi Press.

If you love HackSpace magazine as much as we do, why not have a look at the subscription offers available, including the 12-month deal that comes with a free Adafruit Circuit Playground!

And, as always, you can download the free PDF here.

The post Raspberry Pi 3 baby monitor | HackSpace magazine #26 appeared first on Raspberry Pi.

Using a Raspberry Pi as a synthesiser

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/virtual-analogue-synthesiser/

Synthesiser? Synthesizer? Whichever it is*, check out this video of Floyd Steinberg showing how he set up his Raspberry Pi as one of them.

How to use a Raspberry PI as a synthesizer


“In this video,” Floyd explains on YouTube, “I show how to set up a Raspberry Pi 3 as a virtual analogue synthesiser with keyboard and knobs for real-time sound tweaking, using standard MIDI controllers and some very minor shell script editing. The result is a battery-powered mini synth creating quite impressive sounds!”

The components of a virtual analogue Raspberry Pi synthesiser

We know a fair few of you (Raspberry Pi staff included) love dabbling in the world of Raspberry Pi synth sound, so be sure to watch the video to see what Floyd gets up to while turning a Raspberry Pi 3 into a virtual analogue synthesiser.

Be sure to check out Floyd’s other videos for more synthy goodness, and comment on his video if you’d like him to experiment further with Raspberry Pi. (The answer is yes, yes we would 🙏🙌)

 

*[Editor’s note: it’s spelled with a z in US English, and with an s in UK English. You’re welcome, Alex.]

The post Using a Raspberry Pi as a synthesiser appeared first on Raspberry Pi.

Raspberry Pi snail habitats for Mrs Nation’s class

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-snail-habitats-for-mrs-nations-class/

These Raspberry Pis take hourly photographs of snails in plastic container habitats, sharing them to the Snail Habitat website.

Snails

While some might find them kind of icky, I am in love with snails (less so with their homeless cousin, the slug), so this snail habitat project from Mrs Nation’s class is right up my alley.

Snail Habitats



This project was done in a classroom with 22 students. We broke the kids out into groups and created 5 snail habitats. It would be a great project to do school-wide too, where you create 1 snail habitat per class. This would allow the entire school to get involved and monitor each other’s habitats.

Each snail habitat in Mrs Nation’s class is monitored by a Raspberry Pi and camera module, and Misty Lackie has written specific code to take a photo every hour, uploading the image to the dedicated Snail Habitat website. This allows the class to check in on their mollusc friends without disturbing their environment.

“I would love to see others habitats,” Misty states on the project’s GitHub repo, “so if you create one, please share it and I would be happy to publish it on snailhabitat.com.”

Snail facts according to Emma, our resident Bug Doctor

  • The World Snail Racing Championships take place in Norfolk every year. Emma’s friend took a snail there once, but it didn’t win.
  • Roman snails, while common in the UK, aren’t native to the country. They were brought to the country by the Romans. Emma is 99% sure this fact is correct.
  • Garlic snails, when agitated, emit a garlic scent. Helen likes the idea of self-seasoning escargots. Alex is less than convinced.
  • Snails have no backbone, making them awful wingmen during late-night pub brawls and confrontations.
  • This GIF may be fake:

The post Raspberry Pi snail habitats for Mrs Nation’s class appeared first on Raspberry Pi.

The grilled cheese-making robot of your dreams

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/the-grilled-cheese-making-robot-of-your-dreams/

Ummm…YES PLEASE!

Cheeseborg: The Grilled Cheese Robot!

More cool stuff at http://www.tabb.me and http://www.evankhill.com Cheeseborg has one purpose: to create the best grilled cheese it possibly can! Cheeseborg is fully automated, voice activated, and easy to move. With Google Assistant SDK integration, Cheeseborg can even be used as a part of your smart home.

Does it use a Raspberry Pi, please?

Sometimes we’ll see a project online and find ourselves hoping and praying that it uses a Raspberry Pi, just so we have a reason to share it with you all.

That’s how it was when I saw Cheeseborg, the grilled cheese robot, earlier this week. “Please, please, please…” I prayed to the robot gods, as I chowed down on a grilled cheese at my desk (true story), and, by the grace of all that is good in this world, my plea was answered.

Cheeseborg: the grilled cheese robot

Cheeseborg uses both an Arduino Mega and a Raspberry Pi 3 in its quest to be the best ever automated chef in the world. The Arduino handles the mechanics, while our deliciously green wonder board runs the Google Assistant SDK, allowing you to make grilled cheese via voice command.

Saying “Google, make me a grilled cheese” will set in motion a series of events leading to the production of a perfectly pressed sammie, ideal for soup dunking or solo snacking.

The robot uses a vacuum lifter to pick up a slice of bread, dropping it onto an acrylic tray before repeating the process with a slice of cheese and then a second slice of bread. Then the whole thing is pushed into a panini press that has been liberally coated in butter spray (not shown for video aesthetics), and the sandwich is toasted, producing delicious ooey-gooey numminess out the other side.

Pareidolia much?

Here at Raspberry Pi, we give the Cheeseborg five slices out of five, and look forward to one day meeting Cheeseborg for real, so we can try out its scrummy wares.

ooooey-gooey numminess

You can find out more about Cheeseborg here.

Toastie or grilled cheese


Yes, there’s a difference: but which do you prefer? What makes them different? And what’s your favourite filling for this crispy, cheesy delight?

The post The grilled cheese-making robot of your dreams appeared first on Raspberry Pi.

A low-cost, open-source, computer-assisted microscope

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/a-low-cost-open-source-computer-assisted-microscope/

Low-cost open labware is a good thing in the world, and I was particularly pleased when micropalaeontologist Martin Tetard got in touch about the Raspberry Pi-based microscope he is developing. The project is called microscoPI (what else?), and it can capture, process, and store images and image analysis results. Martin is engaged in climate research: he uses microscopy to study tiny fossil remains, from which he gleans information about the environmental conditions that prevailed in the far-distant past.

microscoPI: a microcomputer-assisted microscope

microscoPI is a project that aims to design a multipurpose, open-source, and inexpensive microcomputer-assisted microscope based on a Raspberry Pi 3. The microscope can automatically take images, process them, and save them, together with the results of image analyses, to a flash drive. It is multipurpose in that it can be used on various kinds of images.

Martin repurposed an old microscope with a Z-axis adjustable stage for accurate focusing, and sourced an inexpensive X/Y movable stage to allow more accurate horizontal positioning of samples under the camera. He emptied the head of the scope to install a Raspberry Pi Camera Module, and he uses an M12 lens adapter to attach lenses suitable for single-specimen close-ups or for imaging several specimens at once. A Raspberry Pi 3B sits above the head of the microscope, and a 3.5-inch TFT touchscreen mounted on top of the Raspberry Pi allows the user to check images as they are captured and processed.

The Raspberry Pi runs our free operating system, Raspbian, and free image-processing software ImageJ. Martin and his colleagues use a number of plugins, some developed themselves and some by others, to support the specific requirements of their research. With this software, microscoPI can capture and analyse microfossil images automatically: it can count particles, including tiny specimens that are touching, analyse their shape and size, and save images and results before prompting the user for the name of the next sample.

microscoPI is compact – less than 30cm in height – and it’s powered by a battery bank secured under the base of the microscope, so it’s easily portable. The entire build comes in at under 160 Euros. You can find out more, and get in touch with Martin, on the microscoPI website.

The post A low-cost, open-source, computer-assisted microscope appeared first on Raspberry Pi.

Using data to help a school garden

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/using-data-to-help-a-school-garden/

Chris Aviles, aka the teacher we all wish we’d had when we were at school, discusses how his school in New Jersey is directly linking data with life itself…

Over to you, Chris.

Every year, our students take federal or state-mandated testing, but what significant changes have we made to their education with the results of these tests? We have never collected more data about our students and society in general. The problem is most people and institutions do a poor job interpreting data and using it to make meaningful change. This problem was something I wanted to tackle in FH Grows.

FH Grows is the name of my seventh-grade class, and is a student-run agriculture business at Knollwood Middle School in Fair Haven, New Jersey. In FH Grows, we sell our produce both online and through our student-run farmers markets. Any produce we don’t sell is donated to our local soup kitchen. To get the most out of our school gardens, students have built sensors and monitors using Raspberry Pis. These sensors collect data which then allows me to help students learn to better interpret data themselves and turn it into action.

Turning data into action

In the greenhouse, our gardens, and alternative growing stations (hydroponics, aquaponics, aeroponics) we have sensors that log the temperature, humidity, and other important data points that we want to know about our garden. This data is then streamed in real time, online at FHGrows.com. When students come into the classroom, one of the first things we look at is the current, live data on the site and find out what is going on in our gardens. Over the course of the semester, students are taught about the ideal growing conditions of our garden. When looking at the data, if we see that the conditions in our gardens aren’t ideal, we get to work.

If we see that the greenhouse is too hot, over 85 degrees, students will go and open the greenhouse door. We check the temperature a little bit later, and if it’s still too hot, students will go turn on the fan. But how many fans do you turn on? After experimenting, we know that each fan lowers the greenhouse temperature by between 7 and 10 degrees Fahrenheit. Opening the door and turning on both fans can bring a greenhouse that can push close to 100 degrees in late May or early June down to a manageable 80 degrees.

Turning data into action can allow for some creativity as well. Over-watering plants can be a real problem. We found that our plants were turning yellow because we were watering them every day when we didn’t need to. How could we solve this problem and become more efficient at watering? Students built a Raspberry Pi that used a moisture sensor to find out when a plant needed to be watered. We used a plant with the moisture sensor in the soil as our control plant. We figured that if we watered the control plant at the same time we watered all our other plants, when the control plant was dry (gave a negative moisture signal) the rest of the plants in the greenhouse would need to be watered as well.

Chris Aviles Innovation Lab Raspberry Pi Certified Educator

This method of determining when to water our plants worked well. We rarely ever saw our plants turn yellow from overwatering. Here is where the creativity came in. Since we received a signal from the Raspberry Pi when the soil was not wet enough, we played around with what we could do with that signal. We displayed it on the dashboard along with our other data, but we also decided to make the signal send as an email from the plant. When I showed students how this worked, they decided to write the message from the plant in the first person. Every week or so, we received an email from Carl the Control Plant asking us to come out and water him!
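The class’s own code isn’t reproduced here, but the idea is simple enough to sketch: poll the sensor’s digital output on a GPIO pin and send an email when the soil reads dry. Everything below that isn’t in the article — the pin number, the email addresses, the SMTP server, and the assumption that the sensor output goes high when dry — is a placeholder for illustration.

import time
import smtplib
from email.message import EmailMessage

import RPi.GPIO as GPIO

MOISTURE_PIN = 21        # assumed wiring for the sensor's digital output

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOISTURE_PIN, GPIO.IN)

def send_thirsty_email():
    msg = EmailMessage()
    msg["Subject"] = "Please water me!"
    msg["From"] = "carl.the.control.plant@example.com"   # placeholder addresses
    msg["To"] = "class@example.com"
    msg.set_content("The soil around my roots is dry - time to water the greenhouse.")
    with smtplib.SMTP("smtp.example.com", 587) as server:  # placeholder SMTP details
        server.starttls()
        server.login("username", "password")
        server.send_message(msg)

try:
    while True:
        # Assumes the sensor output reads high when the soil is dry;
        # flip this test if your board does the opposite.
        if GPIO.input(MOISTURE_PIN):
            send_thirsty_email()
            time.sleep(6 * 60 * 60)   # don't nag more than every few hours
        time.sleep(60)
finally:
    GPIO.cleanup()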

 

If students don’t honour Carl’s request for water, don’t use data to know when to cool our greenhouse, or haven’t done the fan experiments to see how much cooler the fans make the greenhouse, all our plants, like the basil we sell to the pizza places in town, would die. This is the beauty of combining data literacy with a school garden: failure to interpret data and then act on that interpretation has real consequences — our produce could die. When it takes 60-120 days to grow the average vegetable, the loss of plants is a significant event. We lose all the time and energy that went into growing those plants, as well as all the revenue they would have brought in for us. Further, I love the urgency that combining data and the school garden creates, because many students have learned the valuable life lesson that not making a decision is making a decision. If students freeze or do nothing when confronted with the data about the garden, that too has consequences.

Using data to spot trends and make predictions

The other major way we use data in FH Grows is to spot trends and make predictions. Different to using data to create the ideal growing conditions in our garden every day, the sensors that we use also provide a way for us to use information about the past to predict the future. FH Grows has about two years’ worth of weather data from our Raspberry Pi weather station (there are guides online if you wish to build a weather station of your own). Using weather data year over year, we can start to determine important events like when it is best to plant our veggies in our garden.

For example, one of the most useful data points on the Raspberry Pi weather station is the ground temperature sensor. Last semester, we wanted to squeeze in a cool-weather grow in our garden. This post-winter grow can be done between March and June if you time it right. Getting an extra growing cycle from our garden is incredibly valuable, not only to FH Grows as a business (since we would be growing more produce to turn around and sell), but as a way to get an additional learning cycle out of the garden.

So, using two seasons’ worth of ground temperature data, we set out to predict when the ground in our garden would be cool enough to do this cool veggie grow. Students looked at the data we had from our weather station and compared it to different websites that predicted the last frost of the season in our area. We found that the ground right outside our door warmed up two weeks earlier than the more general prediction given by websites. With this information we were able to get in a full cool-crop grow at a time when our garden used to lie dormant.

We also used our Raspberry Pi to help us predict whether or not it was going to rain over the weekend. Using a Raspberry Pi connected to Weather Underground and previous years’ data, if we believed it would not rain over the weekend we would water our gardens on Friday. If it looked like rain over the weekend, we let Mother Nature water our garden for us. Our prediction using the Pi and previous data was more accurate for our immediate area than compared to the more general weather reports you would get on the radio or an app, since those considered a much larger area when making their prediction.

It seems like we are going to be collecting even more data in the future, not less. It is important that we get our students comfortable working with data. The school garden, supported by Raspberry Pi’s amazing ability to collect data, is a boon for any teacher who wants to help students learn how to interpret data and turn it into action.
 

Hello World issue 10

Issue 10 of Hello World magazine is out today, and it’s free. 100% free.

Click here to download the PDF right now. Right this second. If you want to be a love, click here to subscribe, again for free. Subscribers will receive an email when the latest issue is out, and we won’t use your details for anything nasty.

If you’re an educator in the UK, click here and you’ll receive the printed version of Hello World direct to your door. And, guess what? Yup, that’s free too!

What I’m trying to say here is that there is a group of hard-working, passionate educators who take the time to write incredible content for Hello World, for free, and you would be doing them (and us, and your students, kids and/or friends) a solid by reading it 🙂

The post Using data to help a school garden appeared first on Raspberry Pi.

Raspberry Pi interactive wind chimes

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/interactive-wind-chimes/

Grab yourself a Raspberry Pi, a Makey Makey, and some copper pipes: it’s interactive wind chime time!

Perpetual Chimes


Normal wind chimes pale in comparison

I don’t like wind chimes. There, I said it. I also don’t like the ticking of the second hand of analogue clocks, and I think these two dislikes might be related. There’s probably a name for this type of dislike, but I’ll leave the Googling to you.

Sound designer Frazer Merrick’s interactive wind chimes may actually be the only wind chimes I can stand. And this is due, I believe, to the wonderful sounds they create when they touch, much more wonderful than regular wind chime sounds. And, obviously, because these wind chimes incorporate a Raspberry Pi 3.

Perpetual Chimes is a set of augmented wind chimes that offer an escapist experience where your collaboration composes the soundscape. Since there is no wind indoors, the chimes require audience interaction to gently tap or waft them and encourage/nurture the hidden sounds within — triggering sounds as the chimes strike one another. Since the chimes make little acoustic noise, essentially they’re broken until you collaborate with them.
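A Makey Makey shows up to a Raspberry Pi as a USB keyboard, so one plausible way to wire sound to the chime contacts — not necessarily how Frazer built it — is to map each contact to a key and play a sample whenever that key press arrives. The key choices and sample file names below are made up for illustration.

import pygame

# Both the keys and the sample file names below are placeholders
SOUNDS = {
    pygame.K_UP: "chime1.wav",
    pygame.K_DOWN: "chime2.wav",
    pygame.K_LEFT: "chime3.wav",
    pygame.K_RIGHT: "chime4.wav",
}

pygame.init()
pygame.display.set_mode((200, 200))   # pygame needs a window to receive key events
samples = {key: pygame.mixer.Sound(name) for key, name in SOUNDS.items()}

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key in samples:
            samples[event.key].play()   # a chime strike closes a Makey Makey contact

pygame.quit()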

Follow the Instructables tutorial to create your own!

The post Raspberry Pi interactive wind chimes appeared first on Raspberry Pi.

Raspberry Pi mineral oil tank with added pizzazz

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-mineral-oil-tank-with-added-pizzazz/

This isn’t the first mineral oil bath we’ve seen for the Raspberry Pi, but it’s definitely the first we’ve seen with added fish tank decorations.

Using the see-through casing of an old Apple PowerMac G4, Reddit user u/mjh2901 decided to build a mineral oil tank for their Raspberry Pi, and it looks fabulous. Renamed Apple Pi, the build submerges the board in mineral oil, a technique some use to manage the heat produced by their tech. Oil is able to transfer heat up to five times more efficiently than air, with some mineral oil projects using a separate radiator to dissipate the heat back into the air.

So, how did they do it?

“Started with a PowerMac G4 case I previously used as a fish tank, then a candy dish. I had cut a piece of acrylic and glued it into the bottom.”


They then placed a Raspberry Pi 3, attached to a two-line, 16-character LCD, into the tank, along with various decorations, and began to fill the tank with store-bought mineral oil. Once full, the project was complete, the Raspberry Pi forever submerged.

You can find more photos here. But, one question still remains…

…who would use an old fish tank as a candy bowl?! 🤢

The post Raspberry Pi mineral oil tank with added pizzazz appeared first on Raspberry Pi.

Record the last seven seconds of everything you see

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/record-the-last-seven-seconds-of-everything-you-see/

Have you ever witnessed something marvellous but, by the time you get your camera out to record it, the moment has passed? Johan’s Film in the Past hat-mounted camera is here to save the day!

Record the past

As 18-year-old student Johan explains, “Imagine you are walking in the street and you see a meteorite in the sky – obviously you don’t have time to take your phone to film it.” While I haven’t seen many meteorites in the sky, I have found myself wishing I’d had a camera to hand more than once in my life – usually when a friend trips over or says something ridiculous. “Fortunately after the passage of the meteorite, you just have to press a button on the hat and the camera will record the last 7 seconds”, Johan continues. “Then you can download the video from an application on your phone.”

Johan’s project, Film in the Past, consists of a Raspberry Pi 3 with USB camera attached, mounted to the peak of a baseball cap.

The camera is always on, and, at the press of a button, will save the last seven seconds of footage to the Raspberry Pi. You can then access the saved footage from an application on your smartphone. It’s a bit like the video capture function on the Xbox One or, as I like to call it, the option to record hilarious glitches during gameplay. But, unlike the Xbox One, it’s a lot easier to get the footage off the Raspberry Pi and onto your phone.
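Johan’s own code — downloadable via GitHub, as mentioned below — drives a USB camera; purely to illustrate the rolling-buffer idea, here is what the same trick looks like with the official Camera Module and the picamera library’s circular buffer. The button pin is an assumption for the sketch.

import picamera
from gpiozero import Button

button = Button(2)   # assumed wiring for the 'save the past' button

with picamera.PiCamera() as camera:
    # Keep roughly the last seven seconds of H.264 video in memory at all times
    stream = picamera.PiCameraCircularIO(camera, seconds=7)
    camera.start_recording(stream, format='h264')
    try:
        while True:
            camera.wait_recording(0.1)
            if button.is_pressed:
                # Dump the buffered footage to disk, then carry on recording
                stream.copy_to('last-seven-seconds.h264')
                stream.clear()
    finally:
        camera.stop_recording()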

Fancy building your own? The full Python code for the project can be downloaded via GitHub, and more information can be found on Instructables and Johan’s website.

The post Record the last seven seconds of everything you see appeared first on Raspberry Pi.

Driverless cars run by Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/driverless-cars-run-by-raspberry-pi/

Could the future of driverless cars be shaped by Raspberry Pi? For undergraduate researchers at the University of Cambridge, the answer is a resounding yes!

Can cars talk to each other?

A fleet of driverless cars working together to keep traffic moving smoothly can improve overall traffic flow by at least 35 percent, researchers have shown. The researchers, from the University of Cambridge, programmed a small fleet of miniature robotic cars to drive on a multi-lane track and observed how the traffic flow changed when one of the cars stopped.

So long, traffic!

By using Raspberry Pis and onboard sensors to program scale-model versions of commercially available cars, undergraduate researchers have built a fleet of driverless cars that ‘talk to each other’. They did this because they are studying how driverless technology can help reduce traffic incidents on our roads.

Cambridge University Driverless cars using Raspberry Pi

The researchers investigated how a car stalled on a multi-lane track affects the buildup of traffic, and how communication between driverless cars can prevent these buildups.


When the cars acted independently of each other, a stalled car caused other vehicles in the same lane to slow or stop in order to merge into the adjacent lane. This soon led to queues forming along the track. But when the cars communicated via Raspberry Pis, they could tell each other about obstacles on the track, and this allowed cars to shift lanes with the cooperation of other road users.

The researchers recently presented their paper on the subject at the International Conference on Robotics and Automation (ICRA 2019) in Montréal, Canada. You can find links to their results, plus more information, on the University of Cambridge blog.

The post Driverless cars run by Raspberry Pi appeared first on Raspberry Pi.

Retrofit a handheld Casio portable TV with a Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/retrofit-raspberry-pi-handheld-casio-portable-tv/

What do we say to the god of outdated tech? Not today! Revive an old portable television with a Raspberry Pi 3!

Pocket televisions

In the late 1980s, when I was a gadget-savvy kid, my mother bought me a pocket TV as a joint Christmas and birthday present. The TV’s image clarity was questionable, its sound tinny, and its aerial so long that I often poked myself and others in the eye while trying to find a signal. Despite all this, it was one of the coolest, most futuristic things I’d ever seen, and I treasured it. But, like most tech of its day, the pocket TV is no longer needed: I can watch TV in high definition on my phone — a device half the size, with a screen thrice as large, and no insatiable hunger for AA batteries.

So what do we do with this old tech to save it from the tip?

We put a Raspberry Pi in it, of course!

JaguarWong’s Raspberry Pi 3 pocket TV!

“I picked up a broken Casio TV-400 for the princely sum of ‘free’ a few weeks back. And I knew immediately what I wanted to do with it,” imgur user JaguarWong states in the introduction for the project.

I got the Pi for Christmas a couple of years back and have never really had any plans for it. Not long after I got it, I picked up the little screen from eBay to play with but again, with no real purpose in mind — but when I got the pocket TV everything fell into place.

Isn’t it wonderful when things fall so perfectly into place?

Thanks to an online pinout guide, JW was able to determine how to connect the screen and the Raspberry Pi; fortunately, only a few jumper wires were needed — “which was handy given the limits on space.”

With slots cut into the base of the TV for the USB and Ethernet ports, the whole project fit together like a dream, with little need for modification of the original housing.

The final result is wonderful. And while JW describes the project as “fun, if mostly pointless”, we think it’s great — another brilliant example of retrofitting old tech with Raspberry Pi!

10/10 would recommend to a friend.

The post Retrofit a handheld Casio portable TV with a Raspberry Pi appeared first on Raspberry Pi.

Raspberry Pi underwater camera drone | The MagPi 80

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/raspberry-pi-underwater-camera-drone-magpi-80/

Never let it be said that some makers won’t jump in at the deep end for their ambitious experiments with the Raspberry Pi. When Ievgenii Tkachenko fancied a challenge, he sought to go where few had gone before by creating an underwater drone, successfully producing a working prototype that he’s now hard at work refining.

Inspired by watching inventors on the Discovery Channel, Ievgenii has learned much from his endeavour. “For me it was a significant engineering challenge,” he says, and while he has ended up submerging himself within a process of trial-and-error, the results so far have been impressive.

Pi dive

The project began with a loose plan in Ievgenii’s head. “I knew what I should have in the project as a minimum: motions, lights, camera, and a gyroscope inside the device and smartphone control outside,” he explains. “Pretty simple, but I didn’t have a clue what equipment I would be able to use for the drone, and I was limited by finances.”

Bearing that in mind, one of his first moves was to choose a Raspberry Pi 3B, which he says was perfect for controlling the motors, diodes, and gyroscope while sending video streams from a camera and receiving commands from a control device.

The Raspberry Pi 3 sits in the housing and connects to a LiPo battery that also powers the LEDs and motors

“I was really surprised that this small board has a fully functional UNIX-based OS and that software like the Node.js server can be easily installed,” he tells us. “It has control input and output pins and there are a lot of libraries. With an Ethernet port and wireless LAN and a camera, it just felt plug-and-play. I couldn’t find a better solution.”

The LEDs are attached to radiators to prevent overheating, and a pulse driver is used for flashlight control

Working with a friend, Ievgenii sought to create suitable housing for the components, which included a twin twisted-pair wire suitable for transferring data underwater, an electric motor, an electronic speed control, an LED together with a pulse driver, and a battery. Four motors were attached to the outside of the housing, and efforts were made to ensure it was waterproof. Tests in a bath and out on a lake were conducted.

Streaming video

With a WiFi router on the shore connected to the Raspberry Pi via RJ45 connectors and an Ethernet cable, Ievgenii developed an Android application to connect to the Raspberry Pi by address and port (“as an Android developer, I’m used to working with the platform”). This also allowed movement to be controlled via the touchscreen, although he says a gamepad for Android can also be used. When it’s up and running, the Pi streams a video from the camera to the app — “live video streaming is not simple, and I spent a lot of time on the solution” — but the wired connection means the drone can only currently travel as far as the cable length allows.
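Ievgenii’s own streaming stack is built around a Node.js server, but for a sense of how little code a bare-bones version takes, here is the classic picamera recipe for pushing raw H.264 down a TCP socket to a single connected viewer. The port number and resolution are arbitrary choices, not details from his build.

import socket
import picamera

# Listen for a single viewer (the control app) on an arbitrary port
server = socket.socket()
server.bind(('0.0.0.0', 8000))
server.listen(0)

connection = server.accept()[0].makefile('wb')
try:
    with picamera.PiCamera(resolution='1280x720', framerate=25) as camera:
        # Push raw H.264 straight down the socket; the viewer needs a player
        # that understands a bare H.264 stream (VLC does, for example)
        camera.start_recording(connection, format='h264')
        camera.wait_recording(60 * 60)   # stream for up to an hour
        camera.stop_recording()
finally:
    connection.close()
    server.close()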

The camera was placed in this transparent waterproof case attached to the front of the waterproof housing

In that sense, it’s not perfect. “It’s also hard to handle the drone, and it needs to be enhanced with an additional controls board and a few more electromotors for smooth movement,” Ievgenii admits. But as well as wanting to base the project on fast and reliable C++ code and make use of a USB 4K camera, he can see the future potential and he feels it will swim rather than sink.

“Similar drones are used for boat inspections, and they can also be used by rescue squads or for scientific purposes,” he points out. “They can be used to discover a vast marine world without training and risks too. In fact, now that I understand the Raspberry Pi, I know I can create almost anything, from a radio electronic toy car to a smart home.”

The MagPi magazine

This article was lovingly borrowed from the latest issue of The MagPi magazine. Pick up your copy of issue 80 from your local stockist, online, or by downloading the free PDF.

Subscribers to The MagPi also get a rather delightful subscription gift!

The post Raspberry Pi underwater camera drone | The MagPi 80 appeared first on Raspberry Pi.

Play Heverlee’s Sjoelen and win beer

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/play-heverlees-sjoelen-win-beer/

Chances are you’ve never heard of the Dutch table shuffleboard variant Sjoelen. But if you have, then you’ll know it has a basic premise – to slide wooden pucks into a set of four scoring boxes – but some rather complex rules.

Sjoelen machine (video uploaded by Grant Gibson)

Sjoelen

It may seem odd that a game which relies so much on hand-eye coordination and keeping score could be deemed a perfect match for a project commissioned by a beer brand. Yet Grant Gibson is toasting success with his refreshing interpretation of Sjoelen, having simplified the rules and incorporated a Raspberry Pi to serve special prizes to the winners.

“Sjoelen’s traditional scoring requires lots of addition and multiplication, but our version simply gives players ten pucks and gets them to slide three through any one of the four gates within 30 seconds,” Grant explains.

As they do this, the Pi (a Model 3B) keeps track of how many pucks slide through each gate, works out how much time the player has left, and displays a winning message on a screen. A Logitech HD webcam films the player in action, so bystanders can watch their reactions as they veer between frustration and success.

Taking the plunge

Grant started the project with a few aims in mind: “I wanted something that could be transported in a small van and assembled by a two-person team, and I wanted it to have a vintage look.” Inspired by pinball tables, he came up with a three-piece unit that could be flat-packed for transport, then quickly assembled on site. The Pi 3B proved a perfect component.

Grant has tended to use full-size PCs in his previous builds, but he says the Pi allowed him to use simpler software and less hardware to control input and output. He used Python for the input and output tasks, passing data as JSON to a full-screen Chromium browser, which handles the scoring and display in JavaScript.
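
Grant hasn’t published his code, and the article doesn’t say exactly how that JSON travels between Python and Chromium, so the sketch below assumes the simplest option: a tiny local HTTP endpoint that the full-screen browser polls. The port and field names are made up for illustration.

# Illustrative only: expose the current game state as JSON so a browser
# running the scoreboard can poll it.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# shared game state, updated elsewhere by the GPIO/sensor code
state = {'scores': [0, 0, 0, 0], 'seconds_left': 30}


class StateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(state).encode()
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# The JavaScript side would simply fetch('http://localhost:8080/') on a
# timer and redraw the scoreboard from the returned JSON.
HTTPServer(('127.0.0.1', 8080), StateHandler).serve_forever()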

“We used infrared (IR) sensors to detect when a puck passed through the gate bar to score a point,” Grant adds. “Because of the speed of the pucks, we had to poll each of the four IR sensors over 100 times per second to ensure that the pucks were always detected. Optimising the Python code to run fast enough, whilst also leaving enough processing power to run a full-screen web browser and HD webcam, was definitely the biggest software challenge on this project.”
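
As a rough sketch of that polling loop (not Grant’s actual code; the pin numbers and poll interval are assumptions), the Python side could look like this:

# Poll four IR break-beam sensors rapidly and count a puck each time a
# beam goes from unbroken to broken.
import time
import RPi.GPIO as GPIO

GATE_PINS = [5, 6, 13, 19]          # hypothetical GPIO pins, one per gate

GPIO.setmode(GPIO.BCM)
for pin in GATE_PINS:
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)

scores = [0, 0, 0, 0]
previous = [GPIO.input(pin) for pin in GATE_PINS]

try:
    while True:
        for i, pin in enumerate(GATE_PINS):
            state = GPIO.input(pin)
            # falling edge = beam just broken = puck passing through gate i
            if previous[i] == 1 and state == 0:
                scores[i] += 1
            previous[i] = state
        time.sleep(0.005)           # roughly 200 polls per second, comfortably over 100
finally:
    GPIO.cleanup()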

Bottoms up

The Raspberry Pi’s GPIO pins are used to trigger the dispensing of a can of Heverlee beer to the winner. The cans are stocked inside the machine, but building the vending mechanism was a major headache, since it needed to be lightweight and compact, and to keep the cans cool.

No off-the-shelf vending unit offered a solution, and Grant’s initial attempts with stepper motors and clear laser-cut acrylic gears proved disastrous. “After a dozen successful vends, the prototype went out of alignment and started slicing through cans, creating a huge frothy fountain of beer. Impressive to watch, but not a great mix with electronics,” Grant laughs.

Instead, he drew up a final design that was laser-cut from poplar plywood. “It uses automotive central locking motors to operate a see-saw mechanism that serves the cans. A custom Peltier-effect heat exchanger and a couple of salvaged PC fans keep the cans cool inside the machine,” reveals Grant.
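
For the GPIO side of a vend, a minimal sketch, assuming the locking motor sits behind a relay or driver board on a single pin, might be as simple as this (pin number and pulse length are assumptions):

# Not Grant's code: pulse a relay-driven motor briefly to tip the see-saw
# and release one can.
import time
import RPi.GPIO as GPIO

VEND_PIN = 21                       # hypothetical pin driving the motor relay

GPIO.setmode(GPIO.BCM)
GPIO.setup(VEND_PIN, GPIO.OUT, initial=GPIO.LOW)


def vend_can(pulse_seconds=0.4):
    """Energise the vend motor just long enough to release one can."""
    GPIO.output(VEND_PIN, GPIO.HIGH)
    time.sleep(pulse_seconds)       # tune to the mechanism's travel time
    GPIO.output(VEND_PIN, GPIO.LOW)


if __name__ == '__main__':
    vend_can()
    GPIO.cleanup()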

“I’d now love to make a lightweight version sometime, perhaps with a folding Sjoelen table and pop-up scoreboard screen, that could be carried by one person,” he adds. We’d certainly drink to that.

More from The MagPi magazine

Get your copy now from the Raspberry Pi Press store, major newsagents in the UK, or Barnes & Noble, Fry’s, or Micro Center in the US. Or, download your free PDF copy from The MagPi magazine website.

MagPi 79 cover

Subscribe now

Subscribe to The MagPi on a monthly, quarterly, or twelve-month basis to save money against newsstand prices!

Twelve-month print subscribers get a free Raspberry Pi 3A+, the perfect Raspberry Pi to try your hand at some of the latest projects covered in The MagPi magazine.

The post Play Heverlee’s Sjoelen and win beer appeared first on Raspberry Pi.

Build a dial-up ISP server using a Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-dial-up-server/

Trying to connect an old, dial-up–compatible computer to modern-day broadband internet can be a chore. A new tutorial by Doge Microsystems walks you through the process of using a Raspberry Pi to bridge the gap.

The Sound of dial-up Internet (video)

Dial-up internet

Ah, there really is nothing quite like it: listen to the sweet sound of dial-up internet in the video above and reminisce about the days of yore that you spent waiting for your computer to connect and trying to convince other members of your household not to use the landline for a few hours.

But older computers have fallen behind these times of ever faster broadband and ever more powerful processors, and getting your beloved vintage computer online isn’t as easy as it once was.

For one thing, does anyone even have a landline anymore?

Enter Doge Microsystems, who save the day with their Linux-based dial-up server, the perfect tool for connecting computers of yesteryear to today’s broadband using a Raspberry Pi.

Disclaimer: I’m going to pre-empt a specific topic of conversation in the comment section by declaring that, no, I don’t like the words ‘vintage’, ‘retro’, and ‘yesteryear’ any more than you do. But we all need to accept that the times, they are a-changing, OK? We’re all in this together. Let’s continue.

Building a Raspberry Pi dial-in server

For the build, you’ll need a hardware modem — any model should work, as long as it presents as a serial device to the operating system. You’ll also need a Linux device such as a Raspberry Pi, a client device with a modem, and ‘some form of telephony connection to link the two modems’, described by Doge Microsystems as one of the following:

We need a way to connect our ISP modem to clients. There are many ways to approach this:

  • Use the actual PSTN (i.e. real phone lines)
  • Use a PBX to provide local connectivity
  • Build your own circuitry (not covered here, as it would require extra configuration)
  • Build a fake PSTN using VoIP ATAs and a software PBX

I’ve gone with the fourth option. Here’s the breakdown:

  • Asterisk — a VoIP PBX — is configured on the dial-in server to accept connections from two SIP client accounts and route calls between them
  • A Linksys PAP2T ATA — which supports two phone lines — is set up as both of those SIP clients connected to the PBX
  • The ISP-side modem is connected to the first line, and the client device to the second line

Doge Microsystems explains how to set up everything, including the Linux device, on the wiki for the project. Have a look for yourself if you want to try out the dial-up server first-hand.
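
Before you dive into the wiki, it’s worth confirming that your modem really does present as a serial device and answers AT commands. A minimal check with pyserial might look like this; the device path and baud rate are assumptions, so adjust them to whatever your modem enumerates as.

# Quick sanity check (not part of the Doge Microsystems guide): send "AT"
# to the modem and look for an "OK" response.
import serial  # pyserial

with serial.Serial('/dev/ttyUSB0', 115200, timeout=2) as modem:
    modem.write(b'AT\r')            # basic attention command
    reply = modem.read(64)
    if b'OK' in reply:
        print('Modem is responding; carry on with the rest of the guide')
    else:
        print('No "OK" from modem, got:', reply)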

The sound of dial-up

For funsies, I asked our Twitter followers how they would write down the sound of a dial-up internet connection. Check them out.

Alex on Twitter

@Raspberry_Pi dialtone, (phone beeps), rachh racchh rachh rechhhhhhh reccchhhhhh rechhhh, DEE-DONG-DEE-DONG-DI, BachhhhhhhhhhhhBACHHHHBACHHHHHHHHHHHHHHHHHHHHHHHHH

The post Build a dial-up ISP server using a Raspberry Pi appeared first on Raspberry Pi.

Monitoring insects at the Victoria and Albert Museum

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/monitoring-insects-at-the-victoria-and-albert-museum/

A simple Raspberry Pi camera setup is helping staff at the Victoria and Albert Museum track and identify insects that are threatening priceless exhibits.

“Fiacre, I need an image of bug infestation at the V&A!”

The problem with bugs

Bugs: there’s no escaping them. Whether it’s ants in your kitchen or cockroaches in your post-apocalyptic fallout shelter, insects have a habit of inconveniently infesting edifices, intent on damaging beloved belongings.

And museums are as likely as anywhere to be hit by creepy-crawly visitors. Especially when many of their exhibits are old and deliciously dusty. Yum!

Tracking insects at the V&A

As Bhavesh Shah and Maris Ines Carvalho state on the V&A blog, monitoring insect activity has become common practice at their workplace. As part of the Integrated Pest Monitoring (IPM) strategy at the museum, they even have trained staff members who inspect traps and report back their findings.

“But what if we could develop a system that gives more insight into the behaviour of insects and then use this information to prevent future outbreaks?” ask Shah and Carvalho.

The team spent around £50 on a Raspberry Pi and a 160° camera, and used them, together with Claude Pageau’s PI-TIMOLO software, to build an insect monitoring system. The system is now integrated into the museum, tracking insects and recording their movements — even in low-light conditions.
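
The museum’s system runs PI-TIMOLO rather than custom code, but the underlying idea — compare successive camera frames and save a picture when something moves, even in poor light — can be sketched in a few lines of Python with the picamera library. The thresholds, resolutions, and ‘night’ exposure mode below are illustrative assumptions, not the V&A’s settings.

# Bare-bones motion capture: grab low-resolution frames, compare them,
# and save a full frame when enough pixels change.
import time
import numpy as np
import picamera
import picamera.array

THRESHOLD = 25      # per-pixel difference that counts as "changed" (assumption)
MIN_PIXELS = 200    # how many changed pixels trigger a capture (assumption)

with picamera.PiCamera(resolution=(1280, 720), framerate=10) as camera:
    camera.exposure_mode = 'night'          # help in low light
    time.sleep(2)                           # let the sensor settle
    previous = None
    while True:
        with picamera.array.PiRGBArray(camera, size=(320, 240)) as stream:
            camera.capture(stream, format='rgb', resize=(320, 240),
                           use_video_port=True)
            current = stream.array.astype(np.int16)
        if previous is not None:
            changed = np.abs(current - previous).max(axis=2) > THRESHOLD
            if changed.sum() > MIN_PIXELS:
                filename = time.strftime('insect-%Y%m%d-%H%M%S.jpg')
                camera.capture(filename)    # evidence shot at full resolution
        previous = current
        time.sleep(0.5)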

Emma Ormond, Raspberry Pi Trading Office Manager and Doctor of Bugs, believes this to be a Bristletail or Silverfish.

“The initial results were promising. Temperature, humidity, and light sensors could also be added to find out, for example, what time of day insects are more active or if they favour particular environmental conditions.”

For more information on the project, visit the Victoria & Albert Museum blog. And for more information on the Victoria & Albert Museum, visit the Victoria & Albert Museum, London — it’s delightful. We highly recommend attending their Videogames: Design/Play/Disrupt exhibition, which is running until 24 February.

The post Monitoring insects at the Victoria and Albert Museum appeared first on Raspberry Pi.

Upcycle a vintage TV with the Raspberry Pi TV HAT | The MagPi #78

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/magpi-78-upcycled-vintage-tv-hat/

When Martin Mander’s portable Hitachi television was manufactured in 1975, there were just three UK channels and you’d need to leave the comfort of your sofa in order to switch between them.

A page layout of the upcycled vintage television project using the Raspberry Pi TV HAT from The MagPi issue 78

Today, we have multiple viewing options and even a cool Raspberry Pi TV HAT that lets us enjoy DVB-T2 broadcasts via a suitable antenna. So what did nostalgia-nut Martin decide to do when he connected his newly purchased TV HAT to the Pi’s 40-pin GPIO header? Why, he stuck it in his old-fashioned TV set with its butt-busting rotary switch and limited the number of channels to those he could count on one hand – dubbing it “the 1982 experience” because he wanted to enjoy Channel 4, which was launched that year.

Going live

Martin is a dab hand at CRT television conversions (he’s created six since 2012, using monitors, photo frames, and neon signs to replace the displays). “For my latest project, I wanted to have some fun with the new HAT and see if I’d be able to easily display and control its TV streams on some of my converted televisions,” he says. It has now been promoted to his office, providing some background viewing as he works. “I had great fun getting the TV HAT streams working with the rotary dial,” he adds.

Raspberry Pi TV HAT

The project was made possible thanks to the new Raspberry Pi TV HAT

Although Martin jumped straight into the HAT without reading the instructions or connecting an aerial, he eventually followed the guide and found getting it up and running rather straightforward. He then decided to repurpose his Hitachi Pi project, which he’d already fitted with an 8-inch 4:3 screen.

Upcycled television using the Raspberry Pi TV HAT

The boards, screen, and switches installed inside the repurposed Hitachi television

“It’s powered by a Pi 3 and it already had the rotary dial set up and connected to the GPIO,” he explains. “This meant I could mess about with the TV HAT, but still fall back on the original project’s script if needed, with no hardware changes required.”

Change the channel

Indeed, Martin’s main task was to ensure he could switch channels using the rotary dial, and this was easier to achieve than he expected. “When you go to watch a show from the Tvheadend web interface, it downloads an M3U playlist file for you, which you can then open in VLC or another media player,” he says.

Upcycled television using the Raspberry Pi TV HAT

– The Hitachi television is fitted with a Pimoroni 8-inch 4:3 screen and a Raspberry Pi 3
– Programmes stream from a Pi 2 server and the channels are changed by turning the dial
– The name of the channel briefly appears at the bottom of the screen – the playlist files are edited in Notepad

“At first, I thought the playlist file was specific to the individual TV programme, as the show’s name is embedded in the file, but actually each playlist file is specific to the channel itself, so it meant I could download a set of playlists, one per channel, and store them in a folder to give me a full range of watching options.”

Sticking to his theme, he stored playlists for the four main channels of 1982 (BBC1, BBC2, ITV, and Channel 4) in a folder and renamed them channel1, channel2, channel3, and channel4.

Upcycled television using the Raspberry Pi TV HAT

A young Martin Mander decides the blank screen of his black and white Philips TX with six manual preset buttons is preferable to the shows (but he’d like to convert one of these in the future)

“Next, I created a script with an infinite loop that would look out for any action on the GPIO pin that was wired to the rotary dial,” he continues. “If the script detects that the switch has been moved, then it opens the first playlist file in VLC, full-screen. The next time the switch moves, the script loops around and adds ‘1’ to the playlist name, so that it will open the next one in the folder.”
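
Martin’s script itself isn’t reproduced here, but the behaviour he describes maps onto just a few lines of Python. The sketch below is purely illustrative, not his code: the GPIO pin number, the playlist folder, and the use of cvlc are all assumptions.

# Watch the rotary dial's GPIO pin and, each time it moves, launch VLC
# full-screen with the next channel playlist in the folder.
import subprocess
import RPi.GPIO as GPIO

DIAL_PIN = 17                      # hypothetical GPIO pin wired to the dial
PLAYLISTS = ['/home/pi/channels/channel{}.m3u'.format(n) for n in range(1, 5)]

GPIO.setmode(GPIO.BCM)
GPIO.setup(DIAL_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

player = None
channel = 0
try:
    while True:
        # block until the dial moves (any edge on the switch contact)
        GPIO.wait_for_edge(DIAL_PIN, GPIO.BOTH, bouncetime=300)
        if player:
            player.terminate()                    # stop the current channel
        playlist = PLAYLISTS[channel % len(PLAYLISTS)]
        player = subprocess.Popen(['cvlc', '--fullscreen', playlist])
        channel += 1                              # next dial click, next channel
finally:
    GPIO.cleanup()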

Martin is now planning the next stage of the project: expanding the channel-changing script to include streams from his IP cameras, replacing the rechargeable speaker with a speaker HAT, and making the original volume controls work with the Pi’s audio. “It’s been really satisfying to get this project working, and there are many possibilities ahead,” he says.

More from The MagPi magazine

The MagPi magazine issue 78 is out today. Buy your copy now from the Raspberry Pi Press store, major newsagents in the UK, or Barnes & Noble, Fry’s, or Micro Center in the US. Or, download your free PDF copy from The MagPi magazine website.

The MagPi magazine issue 78

Subscribe now

Subscribe to The MagPi magazine on a monthly, quarterly, or twelve-month basis to save money against newsstand prices!

Twelve-month print subscribers get a free Raspberry Pi 3A+, the perfect Raspberry Pi to try your hand at some of the latest projects covered in The MagPi magazine.

The post Upcycle a vintage TV with the Raspberry Pi TV HAT | The MagPi #78 appeared first on Raspberry Pi.

Spirit Animal: a guitar with a built-in synthesiser

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/spirit-animal-synth-guitar/

UK-based Lucem Custom Instruments has teamed up with Seattle’s Tracktion Corporation to create an electric guitar with a built-in Raspberry Pi synthesiser, which they call Spirit Animal.

Raspberry Pi inside a guitar body - Spirit Animal

The Spirit Animal concept guitar

We love seeing the Raspberry Pi incorporated into old technology such as radios, games consoles and unwanted toys. And we also love Pi-based music projects. So can you imagine how happy we were to see an electric guitar with an onboard Raspberry Pi synthesiser?

Tracktion, responsible for synth software BioTek 2, ran their product on a Raspberry Pi, and Lucem fitted this Pi and associated tech inside the hollow body of a through-neck Visceral guitar. The concept guitar made its debut at NAMM 2019 last weekend, where attendees at the National Association of Music Merchants event had the chance to get hands-on with the new instrument.

The instrument boasts an onboard Li-ion battery granting about 8 hours of play time, and a standard 1/4″ audio jack for connecting to an amp. To permit screen-sharing, updates, and control via SSH, the guitar allows access to the Pi’s Ethernet port and wireless functionality.

See more

You can find more information about the design on the Gear News website, and see the instrument in action at NAMM on the Lucem Custom Instruments Facebook page. We look forward to seeing where this collaboration will lead!

Music and Pi

If you’re a guitarist and keen to incorporate a Raspberry Pi into your music, then also check out these other projects:

  • pisound — the Raspberry Pi–powered guitar pedal

  • Pedalumi — the illuminated pedal board

  • Guitar Boy — is it a Game Boy? Is it a guitar? Unclear, but it’s awesome!

Guitar Boy video

The Guitar Boy is a guitar. The Guitar Boy is a Game Boy. The Guitar Boy is the best of both worlds! Created for the BitFix Gaming 2015 Game Boy Classic build-off, this Game Boy guitar plays both Pokemon and rock and roll!

The post Spirit Animal: a guitar with a built-in synthesiser appeared first on Raspberry Pi.