Tag Archives: Raspberry Pi Camera Module

SelfieBot: taking and printing photos with a smile

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/selfiebot-sophy-wong-raspberry-pi-camera/

Does your camera giggle and smile as it takes your photo? Does your camera spit out your image from a thermal printer? No? Well, Sophy Wong’s SelfieBot does!

Raspberry Pi SelfieBot: Selfie Camera with a Personality

SelfieBot is a project Kim and I originally made for our booth at Seattle Mini Maker Faire 2017. Now, you can build your own! A full tutorial for SelfieBot is up on the Adafruit Learning System at https://learn.adafruit.com/raspberry-pi-selfie-bot/. This was our first Raspberry Pi project, and is an experiment in DIY AI.

Pasties, projects, and plans

Last year, I built a Raspberry Pi photobooth for a friend’s wedding, complete with a thermal printer for instant printouts, and a Twitter feed to keep those unable to attend the event in the loop. I called the project PastyCam, because I built it into the papier-mâché body of a Cornish pasty, and I planned on creating a tutorial blog post for the build. But I obviously haven’t. And I think it’s time, a year later, to admit defeat.

A photo of the Cornish Pasty photo booth Alex created for a wedding in Cornwall - SelfieBot Raspberry Pi Camera

The wedding was in Cornwall, so the Cornish pasty totally makes sense, alright?

But lucky for us, Sophy Wong has gifted us all with SelfieBot.

Sophy Wong

If you subscribe to HackSpace magazine, you’ll recognise Sophy from issue 4, where she adorned the cover, complete with glowing fingernails. And if you’re like me, you instantly wanted to be her as soon as you saw that image.

SelfieBot Raspberry Pi Camera

Makers should also know Sophy from her impressive contributions to the maker community, including her tutorials for Adafruit, her YouTube channel, and most recently her work with Mythbusters Jr.

sophy wong on Twitter

Filming for #MythbustersJr is wrapped, and I’m heading home to Seattle. What an incredible summer filled with amazing people. I’m so inspired by every single person, crew and cast, on this show, and I’ll miss you all until our paths cross again someday 😊

SelfieBot at Maker Faire

I saw SelfieBot in passing at Maker Faire Bay Area earlier this year. Yet somehow I managed to not introduce myself to Sophy and have a play with her Pi-powered creation. So a few weeks back at World Maker Faire New York, I accosted Sophy as soon as I could, and we bonded by swapping business cards and Pimoroni pins.

Creating SelfieBot

SelfieBot is more than just a printing photo booth. It giggles, it talks, it reacts to movement. It’s the robot version of that friend of yours who’s always taking photos. Always. All the time, Amy. It’s all the time! *ahem*

SelfieBot Raspberry Pi Camera

SelfieBot consists of a Raspberry Pi 2, a Pi Camera Module, a 5″ screen, an accelerometer, a mini thermal printer, and more, including 3D-printed and laser-cut parts.
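Sophy’s Adafruit tutorial has the full software walkthrough, but to give a flavour of the core loop, here’s a minimal sketch of snapping a photo with the picamera library and sending it to a mini thermal printer. The printer library, serial port, and baud rate below are assumptions on my part, not details from Sophy’s build.

```python
# Minimal sketch: snap a photo and print it on a mini thermal printer.
# Assumes Adafruit's Python thermal-printer library and a printer on
# /dev/serial0; SelfieBot's real code (see the Adafruit tutorial) does far more.
from time import sleep

from picamera import PiCamera
from PIL import Image
from Adafruit_Thermal import Adafruit_Thermal  # from Adafruit's Python-Thermal-Printer repo

camera = PiCamera(resolution=(640, 480))
printer = Adafruit_Thermal("/dev/serial0", 19200, timeout=5)  # port and baud are assumptions

camera.start_preview()
sleep(2)                          # let the sensor settle before capturing
camera.capture("selfie.jpg")
camera.stop_preview()

# Mini thermal printers are 1-bit and roughly 384 dots wide, so shrink and convert first
photo = Image.open("selfie.jpg").resize((384, 288)).convert("1")
printer.printImage(photo, True)   # True = line-at-a-time mode for a steadier feed
printer.feed(3)
```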

sophy wong on Twitter

Getting SelfieBot ready for Maker Faire Bay Area next weekend! Super excited to be talking on Sunday with @kpimmel – come see us and meet SelfieBot!

If you want to build your own SelfieBot — and obviously you do — then you can find a complete breakdown of the build process, including info on all parts you’ll need, files for 3D printing, and so, so many wonderfully informative photographs, on the Adafruit Learning System!

The post SelfieBot: taking and printing photos with a smile appeared first on Raspberry Pi.

A waterproof Raspberry Pi?! Five 3D-printable projects to try

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/waterproof-3d-printing-raspberry-pi/

Summer is coming to a close. The evenings grow darker. So pack away your flip flops, hang up your beach towel, and settle in for the colder months with these fun 3D-printable projects to make at home or in your local makerspace.

Fallout 4 desktop terminal

Power Up Props’ replica of the Fallout desktop terminals fits a 3.5″ screen and a Raspberry Pi 3B. Any Fallout fans out there will be pleased to know that you don’t need to raise your Science level to hack into this terminal — you’ll just need access to a 3D printer and these free files from My Mini Factory.

Fallout 4 terminal 3d-printable raspberry pi case

And while you’re waiting for this to print, check out Power Up Props’ wall-mounted terminal!

Fallout 4 – Working Terminal (Raspberry Pi Version) – Power Up Props

Howdy neighbors, grab some fusion cores and put on your power armor because today we’re making a working replica of the wall mounted computer “terminals” from the Fallout series, powered by a Raspberry Pi! Want one of your very own terminals?

Falcon Heavy night light

Remixing DAKINGINDANORF‘s low-poly Arduino-based design, this 3D-printable night light is a replica of the SpaceX Falcon Heavy rocket. The replica uses a Raspberry Pi Zero and a Pimoroni Unicorn pHAT to create a rather lovely rocket launch effect. Perfect for the budding space explorer in your home!

Falcon Heavy night light

I 3D printed a SpaceX Falcon Heavy night light, with some nice effects like it’s actually launching. Useful? Hell no. Cool? Hell yes! Blogpost with files and code: https://www.dennisjanssen.be/tutorials/falcon-heavy-night-light/

You can download the files directly from Dennis Janssen’s website.
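Dennis’s own code is linked from his blog post; purely as a flavour of how a launch effect like this can be driven, here’s a minimal sketch using Pimoroni’s unicornhat library to flicker warm colours across the pHAT as if the engines were firing. The colours and timing are invented for illustration.

```python
# Minimal "engine flicker" sketch for a Pimoroni Unicorn pHAT (8x4 RGB LEDs).
# Colours and timing are made up; Dennis's blog post has his actual code.
import random
import time

import unicornhat as unicorn

unicorn.set_layout(unicorn.PHAT)   # 8x4 pHAT layout
unicorn.brightness(0.6)
width, height = unicorn.get_shape()

try:
    while True:
        for x in range(width):
            for y in range(height):
                # Random warm colour: lots of red, a little green, no blue
                red = random.randint(180, 255)
                green = random.randint(0, red // 2)
                unicorn.set_pixel(x, y, red, green, 0)
        unicorn.show()
        time.sleep(0.05)           # ~20 updates a second gives a convincing flicker
except KeyboardInterrupt:
    unicorn.off()
```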

Swimming IoT satellite

We’re really excited about this design and already thinking about how we’ll use it for our own projects:

Floating Raspberry Pi case

Using an acrylic Christmas bauble and 3D-printed parts, you can set your Raspberry Pi Zero W free in local bodies of water — ideal for nature watching and citizen science experiments.

Art Deco clock and weather display

Channel your inner Jay Gatsby with this Art Deco-effect clock and weather display.

Art Deco Raspberry Pi Clock

Fitted with a Raspberry Pi Zero W and an Adafruit piTFT display, this build is ideally suited for any late-night cocktail parties you may have planned.

High-altitude rocket holder

Send four Raspberry Pi Zeros and Camera Modules into the skies with this holder design from Thingiverse user randysteck.

Raspberry Pi Zero rocket holder

The 3D-printable holder will keep your boards safe and sound while they simultaneously record photos or video of their airborne adventure.

More more more

What projects did we miss? Share your favourite 3D-printable designs for Raspberry Pis in the comments so we can see more builds from the internet’s very best community!

The post A waterproof Raspberry Pi?! Five 3D-printable projects to try appeared first on Raspberry Pi.

Your face, 14 ft tall: image mapping with As We Are

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/as-we-are-face-mapping/

While at World Maker Faire New York last weekend, I found myself chatting to a rather lovely gentleman by the name of Mac Pierce. During our conversation, Mac mentioned a project he’d worked on called As We Are, an interactive art installation located in the Greater Columbus Convention Center in Columbus, Ohio.

as we are

“So it’s this 14-foot head covered in LEDs…”, Mac began, and after his brief explanation, I found myself grabbing nearby makers to have him tell them about the project too. I was hooked! I hadn’t even seen photos of the sculpture, yet I was hooked. And true to his word, Mac had the press release for As We Are sitting in my inbox when I returned to Pi Towers.

So here it is:

The Greater Columbus Convention Center: “As We Are” – Creating the Ultimate Selfie Machine

DCL, an award-winning fabricator of architectural specialties and custom experiential design elements, worked with artist Matthew Mohr to develop, engineer and fabricate this 14ft, 7,000lb, interactive digital sculpture. Featuring custom LED modules, an integrated 3D photobooth, 32 cameras, and a touch-screen display – this unique project combines technologies to present a seamless experience for visitors to display their own portrait on the sculpture.

As We Are

The brainchild of artist Matthew Mohr, As We Are was engineered and produced by DCL, an award-winning Boston-based fabricator whose greatest achievement to date, in my opinion at least, is hiring Mac Pierce.

as we are

YAY!

DCL built the 14-foot structure using 24 layers of aluminium ‘ribs’ covered in custom Sansi LED modules. These modules add up to an astounding 850,000 individual LEDs, allowing the sculpture to display images in crisp detail.

as we are

When a visitor to the Convention Center steps inside the interactive sculpture, they’re met with a wall of 32 Raspberry Pis plus Camera Modules. The Pis use facial recognition software to 3D-scan the visitor’s face, flatten the image, and then map the face across the outer surface of the structure.
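The software inside As We Are is custom, so nothing below is the installation’s actual code, but the very first step each Pi has to perform, finding a face in a camera frame, is easy to sketch with OpenCV’s bundled Haar cascade. The 3D reconstruction and mapping are well beyond this snippet.

```python
# Minimal sketch: grab a frame from a Pi Camera Module and detect a face with
# OpenCV. Only the detection step is shown; As We Are's scanning and mapping
# pipeline is custom and not reproduced here.
import cv2
from picamera import PiCamera
from picamera.array import PiRGBArray

camera = PiCamera(resolution=(640, 480))
frame = PiRGBArray(camera)
camera.capture(frame, format="bgr")          # BGR ordering is what OpenCV expects

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
grey = cv2.cvtColor(frame.array, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.imwrite("face.jpg", frame.array[y:y + h, x:x + w])   # save the face crop
    break
```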

Matthew Mohr was inspired to show off the diversity of Columbus, OH, while also creating a sense of oneness with As We Are. Combining technology and interaction, the sculpture has been called “the ultimate selfie machine”.

If you’re in or near Columbus and able to visit the installation, we’d love to see your photos, so please share them with us on our social media platforms.

Raspberry Pi facial mapping as we are

You see now why I was dumbstruck when Mac told me about this project, yes?

Always tell us

Had it not been for a chance encounter with Mac at Maker Faire, we may never have heard of As We Are. While Matthew Mohr and DCL installed the sculpture in 2017, very little fuss was made about the use of Raspberry Pis within it, and it completely slipped under our radar. So if you are working on a project for your business, as a maker, or for any other reason, and you’re using a Raspberry Pi, please make sure to let us know by emailing [email protected].

The post Your face, 14 ft tall: image mapping with As We Are appeared first on Raspberry Pi.

Rock, paper, scissors, lizard, Spock, fire, water balloon!

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/rock-paper-scissors-lizard-spock-fire-water-balloon/

Use a Raspberry Pi and a Pi Camera Module to build your own machine learning–powered rock paper scissors game!

Rock-Paper-Scissors game using computer vision and machine learning on Raspberry Pi

A Rock-Paper-Scissors game using computer vision and machine learning on the Raspberry Pi. Project GitHub page: https://github.com/DrGFreeman/rps-cv PROJECT ORIGIN: This project results from a challenge my son gave me when I was teaching him the basics of computer programming by making a simple text-based Rock-Paper-Scissors game in Python.

Virtual rock paper scissors

Here’s why you should always leave comments on our blog: this project from Julien de la Bruère-Terreault instantly had our attention when he shared it on our recent Android Things post.

Julien and his son were building a text-based version of rock paper scissors in Python when his son asked him: “Could you make a rock paper scissors game that uses the camera to detect hand gestures?” Obviously, Julien really had no choice but to accept the challenge.

“The game uses a Raspberry Pi computer and Raspberry Pi Camera Module installed on a 3D-printed support with LED strips to achieve consistent images,” Julien explains in the tutorial for the build. “The pictures taken by the camera are processed and fed to an image classifier that determines whether the gesture corresponds to ‘Rock’, ‘Paper’, or ‘Scissors’ gestures.”

How does it work?

Physically, the build uses a Pi 3 Model B and a Camera Module V2 alongside 3D-printed parts. The parts are all green, since a consistent colour allows easy subtraction of background from the captured images. You can download the files for the setup from Thingiverse.
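Julien’s repository has the real image-processing code; the snippet below is only a rough sketch of the idea behind the green parts, using OpenCV to mask out everything green so that just the hand remains. The HSV thresholds are illustrative guesses, not Julien’s values.

```python
# Rough sketch of why the 3D-printed parts are green: a uniform green backdrop
# is easy to remove with a simple HSV threshold, leaving only the hand for the
# classifier. The threshold values here are guesses, not the project's own.
import cv2
import numpy as np

image = cv2.imread("capture.jpg")
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

lower_green = np.array([40, 60, 60])      # rough green range in HSV
upper_green = np.array([85, 255, 255])
background = cv2.inRange(hsv, lower_green, upper_green)

hand_only = cv2.bitwise_and(image, image, mask=cv2.bitwise_not(background))
cv2.imwrite("hand.jpg", hand_only)
```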

rock paper scissors raspberry pi

To illustrate how the software works, Julien has created a rather delightful pipeline demonstrating where computer vision and machine learning come in.

rock paper scissors using raspberry pi

The way the software works means the game doesn’t need to be limited to the standard three hand signs. If you wanted to, you could add other signs such as ‘lizard’ and ‘Spock’! Or ‘fire’ and ‘water balloon’. Or any other alterations made to the game in your pop culture favourites.
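Because the classifier simply learns whatever labels it’s trained on, extra gestures are mostly a matter of collecting more labelled images. As a very rough sketch (Julien’s real pipeline on GitHub uses its own feature extraction and model), training a scikit-learn classifier on five labels instead of three might look like this:

```python
# Very rough sketch: the classifier doesn't care how many gestures there are,
# it just learns the labels it's given. Julien's real pipeline differs; the
# placeholder data here stands in for features extracted from masked hand images.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

labels = ["rock", "paper", "scissors", "lizard", "spock"]

# Placeholder data: in the real project each row would be features from one image
X = np.random.rand(500, 200)
y = np.random.choice(labels, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = SVC(kernel="rbf", gamma="scale")
model.fit(X_train, y_train)
print("accuracy on held-out images:", model.score(X_test, y_test))
```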

rock paper scissors lizard spock

Check out Julien’s full tutorial to build your own AI-powered rock paper scissors game here on Julien’s GitHub. Massive kudos to Julien for spending a year learning the skills required to make it happen. And a massive thank you to Julien’s son for inspiring him! This is why it’s great to do coding and digital making with kids — they have the best project ideas!

Sharing is caring

If you’ve built your own project using Raspberry Pi, please share it with us in the comments below, or via social media. As you can tell from today’s blog post, we love to see them and share them with the whole community!

The post Rock, paper, scissors, lizard, Spock, fire, water balloon! appeared first on Raspberry Pi.

Recording lost seconds with the Augenblick blink camera

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/augenblick-camera/

Warning: a GIF used in today’s blog contains flashing images.

Students at the University of Bremen, Germany, have built a wearable camera that records the seconds of vision lost when you blink. Augenblick uses a Raspberry Pi Zero and Camera Module alongside muscle sensors to record footage whenever you close your eyes, producing a rather disjointed film of the sights you miss out on.

Augenblick blink camera recording using a Raspberry Pi Zero

Blink and you’ll miss it

The average person blinks up to five times a minute, with each blink lasting 0.5 to 0.8 seconds. These half-seconds add up to about 30 minutes a day. What sights are we losing during these minutes? That is the question asked by students Manasse Pinsuwan and René Henrich when they set out to design Augenblick.

Blinking is a highly invasive mechanism for our eyesight. Every day we close our eyes thousands of times without noticing it. Our mind manages to never let us wonder what exactly happens in the moments that we miss.

Capturing lost moments

For Augenblick, the wearer sticks MyoWare Muscle Sensor pads to their face, and these detect the electrical impulses that trigger blinking.

Augenblick blink camera recording using a Raspberry Pi Zero

Two pads are applied over the orbicularis oculi muscle that forms a ring around the eye socket, while the third pad is attached to the cheek as a neutral point.

Biology fact: there are two muscles responsible for blinking. The orbicularis oculi muscle closes the eye, while the levator palpebrae superioris muscle opens it — and yes, they both sound like the names of Harry Potter spells.

The sensor is read 25 times a second. Whenever it detects that the orbicularis oculi is active, the Camera Module records video footage.
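The students’ own code isn’t reproduced here, but the control loop is easy to picture: poll the muscle sensor 25 times a second and start or stop recording when the reading crosses a threshold. The sketch below assumes the MyoWare’s analogue output is read through an MCP3008 ADC (the Pi has no analogue inputs), and the threshold value is a guess, so treat both as assumptions.

```python
# Sketch of the Augenblick control loop: sample the muscle sensor at 25 Hz and
# record video while the eye is closed. The MCP3008 ADC and the 0.6 threshold
# are assumptions for illustration; the students' wiring and values may differ.
import time

from gpiozero import MCP3008
from picamera import PiCamera

sensor = MCP3008(channel=0)        # MyoWare output into channel 0 of the ADC
camera = PiCamera(resolution=(1280, 720))
recording = False
clip = 0

while True:
    blinking = sensor.value > 0.6  # muscle active means the eye is closing
    if blinking and not recording:
        clip += 1
        camera.start_recording(f"blink_{clip:04d}.h264")
        recording = True
    elif not blinking and recording:
        camera.stop_recording()
        recording = False
    time.sleep(1 / 25)             # 25 samples per second, as in the project
```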

Augenblick blink recording using a Raspberry Pi Zero

Pressing a button on the side of the Augenblick glasses sets the code running. An LED lights up whenever the camera is recording and also serves to confirm the correct placement of the sensor pads.

Augenblick blink camera recording using a Raspberry Pi Zero

The Pi Zero saves the footage so that it can be stitched together later to form a continuous, if disjointed, film.

Learn more about the Augenblick blink camera

You can find more information on the conception, design, and build process of Augenblick here in German, with a shorter explanation including lots of photos here in English.

And if you’re keen to recreate this project, our free project resource for a wearable Pi Zero time-lapse camera will come in handy as a starting point.

The post Recording lost seconds with the Augenblick blink camera appeared first on Raspberry Pi.

Own your own working Pokémon Pokédex!

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/deep-learning-pokedex/

Squeal with delight as your inner Pokémon trainer witnesses the wonder of Adrian Rosebrock’s deep learning Pokédex.

Creating a real-life Pokedex with a Raspberry Pi, Python, and Deep Learning

This video demos a real-life Pokedex, complete with visual recognition, that I created using a Raspberry Pi, Python, and Deep Learning. You can find the entire blog post, including code, using this link: https://www.pyimagesearch.com/2018/04/30/a-fun-hands-on-deep-learning-project-for-beginners-students-and-hobbyists/ Music credit to YouTube user “No Copyright” for providing royalty free music: https://www.youtube.com/watch?v=PXpjqURczn8

The history of Pokémon in 30 seconds

The Pokémon franchise was created by video game designer Satoshi Tajiri in 1995. In the fictional world of Pokémon, Pokémon Trainers explore the vast landscape, catching and training small creatures called Pokémon. To date, there are 802 different types of Pokémon. They range from the ever recognisable Pikachu, a bright yellow electric Pokémon, to the highly sought-after Shiny Charizard, a metallic, playing-card-shaped Pokémon that your mate Alex claims she has in mint condition, but refuses to show you.

Pokemon GIF

In the world of Pokémon, children as young as ten-year-old protagonist and all-round annoyance Ash Ketchum are allowed to leave home and wander the wilderness. There, they hunt vicious, deadly creatures in the hope of becoming a Pokémon Master.

Adrian’s deep learning Pokédex

Adrian is a bit of a deep learning pro, as demonstrated by his Santa/Not Santa detector, which we wrote about last year. For that project, he also provided a great explanation of what deep learning actually is. In a nutshell:

…a subfield of machine learning, which is, in turn, a subfield of artificial intelligence (AI). While AI embodies a large, diverse set of techniques and algorithms related to automatic reasoning (inference, planning, heuristics, etc.), the machine learning subfields are specifically interested in pattern recognition and learning from data.

As with his earlier Raspberry Pi project, Adrian uses the Keras deep learning library with the TensorFlow backend, plus a few other packages such as Adrian’s own imutils functions and OpenCV.

Adrian trained a Convolutional Neural Network using Keras on a dataset of 1191 Pokémon images, obtaining 96.84% accuracy. As Adrian explains, this model is able to identify Pokémon via still image and video. It’s perfect for creating a Pokédex – an interactive Pokémon catalogue that should, according to the franchise, be able to identify and read out information on any known Pokémon when captured by camera. More information on model training can be found on Adrian’s blog.
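Adrian’s post walks through his actual network and training script in detail; as a heavily simplified sketch of what “training a CNN with Keras” looks like, assuming images resized to 96×96 and a handful of Pokémon classes (the layer sizes and class count below are illustrative only):

```python
# Heavily simplified sketch of a Keras CNN for image classification.
# Adrian's real model and training code are on pyimagesearch; the layer sizes
# and the 96x96, five-class setup here are only for illustration.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

num_classes = 5                      # e.g. pikachu, charmander, squirtle, ...

model = Sequential([
    Conv2D(32, (3, 3), activation="relu", input_shape=(96, 96, 3)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (3, 3), activation="relu"),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(128, activation="relu"),
    Dropout(0.5),
    Dense(num_classes, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# train_images: (N, 96, 96, 3) array, train_labels: one-hot (N, num_classes)
# model.fit(train_images, train_labels, epochs=100, batch_size=32)
```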

Adrian Rosebeck deep learning pokemon pokedex

For the physical build, a Raspberry Pi 3 with camera module is paired with the Raspberry Pi 7″ touch display to create a portable Pokédex. And while Adrian comments that the same result can be achieved using your home computer and a webcam, that’s not how Adrian rolls as a Raspberry Pi fan.

Adrian Rosebeck deep learning pokemon pokedex

Plus, the smaller size of the Pi is perfect for one of you to incorporate this deep learning model into a 3D-printed Pokédex for ultimate Pokémon glory, pretty please, thank you.

Adrian Rosebeck deep learning pokemon pokedex

Adrian has gone into impressive detail about how the project works and how you can create your own on his blog, pyimagesearch. So if you’re interested in learning more about deep learning, and making your own Pokédex, be sure to visit.

The post Own your own working Pokémon Pokédex! appeared first on Raspberry Pi.

AIY Projects 2: Google’s AIY Projects Kits get an upgrade

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/google-aiy-projects-2/

After the outstanding success of their AIY Projects Voice and Vision Kits, Google has announced the release of upgraded kits, complete with Raspberry Pi Zero WH, Camera Module, and preloaded SD card.

Google AIY Projects Vision Kit 2 Raspberry Pi

Google’s AIY Projects Kits

Google launched the AIY Projects Voice Kit last year, first as a cover gift with The MagPi magazine and later as a standalone product.

Makers needed to provide their own Raspberry Pi for the original kit. The new kits include everything you need, from Pi to SD card.

Within a DIY cardboard box, makers were able to assemble their own voice-activated AI assistant akin to the Amazon Alexa, Apple’s Siri, and Google’s own Google Home Assistant. The Voice Kit was an instant hit that spurred no end of maker videos and tutorials, including our own free tutorial for controlling a robot using voice commands.

Later in the year, the team followed up the success of the Voice Kit with the AIY Projects Vision Kit — the same cardboard box hosting a camera perfect for some pretty nifty image recognition projects.

For more on the AIY Voice Kit, here’s our release video hosted by the rather delightful Rob Zwetsloot.

AIY Projects adds natural human interaction to your Raspberry Pi

Check out the exclusive Google AIY Projects Kit that comes free with The MagPi 57! Grab yourself a copy in stores or online now: http://magpi.cc/2pI6IiQ This first AIY Projects kit taps into the Google Assistant SDK and Cloud Speech API using the AIY Projects Voice HAT (Hardware Attached on Top) board, stereo microphone, and speaker (included free with the magazine).

AIY Projects 2

So what’s new with version 2 of the AIY Projects Voice Kit? The kit now includes the recently released Raspberry Pi Zero WH, our Zero W with pre-soldered header pins for instant digital making accessibility. Purchasers of the kits will also get a micro SD card with the OS preloaded, so they can get started without having to set the card up themselves.

Google AIY Projects Vision Kit 2 Raspberry Pi

Everything you need to build your own Raspberry Pi-powered Google voice assistant

In the newly upgraded AIY Projects Vision Kit v1.2, makers are also treated to an official Raspberry Pi Camera Module v2, the latest model of our add-on camera.

Google AIY Projects Vision Kit 2 Raspberry Pi

“Everything you need to get started is right there in the box,” explains Billy Rutledge, Google’s Director of AIY Projects. “We knew from our research that even though makers are interested in AI, many felt that adding it to their projects was too difficult or required expensive hardware.”

Google AIY Projects Vision Kit 2 Raspberry Pi

Google is also hard at work producing AIY Projects companion apps for Android, iOS, and Chrome. The Android app is available now to coincide with the launch of the upgraded kits, with the other two due for release soon. The app supports wireless setup of the AIY Kit, though avid coders will still be able to hack theirs to better suit their projects.

Google has also updated the AIY Projects website with an AIY Models section highlighting a range of neural network projects for the kits.

Get your kit

The updated Voice and Vision Kits were announced last night, and in the US they are available now from Target. UK-based makers should be able to get their hands on them this summer — keep an eye on our social channels for updates and links.

The post AIY Projects 2: Google’s AIY Projects Kits get an upgrade appeared first on Raspberry Pi.

Build a solar-powered nature camera for your garden

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/solar-powered-nature-camera/

Spring has sprung, and with it, sleepy-eyed wildlife is beginning to roam our gardens and local woodlands. So why not follow hackster.io maker reichley’s tutorial and build your own solar-powered squirrelhouse nature cam?

Raspberry Pi- and solar-powered nature camera

Inspiration

“I live half a mile above sea level and am SURROUNDED by animals…bears, foxes, turkeys, deer, squirrels, birds”, reichley explains in his tutorial. “Spring has arrived, and there are LOADS of squirrels running around. I was in the building mood and, being a nerd, wished to combine a common woodworking project with the connectivity and observability provided by single-board computers (and their camera add-ons).”

Building a tiny home

reichley started by sketching out a design for the house to determine where the various components would fit.

Raspberry Pi- and solar-powered nature camera

Since he’s a fan of autonomy and renewable energy, he decided to run the project’s Raspberry Pi Zero W via solar power. To do so, he iterated on the design to include the necessary tech, scaling the roof to fit the solar panels.

Raspberry Pi- and solar-powered squirrel cam

To keep the project running 24/7, reichley had to figure out the overall power consumption of both the Zero W and the Raspberry Pi Camera Module, factoring in the constant WiFi connection and the sunshine hours in his garden.
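reichley’s tutorial has his actual measurements; purely to illustrate the kind of back-of-the-envelope sum involved, here’s a sketch with made-up numbers:

```python
# Back-of-the-envelope solar power budget with made-up numbers; reichley's
# tutorial has the real figures for his Zero W, camera, panel, and location.
draw_amps = 0.15          # assumed average draw of Zero W + camera + WiFi, in A
hours_per_day = 24
panel_watts = 8.0         # assumed solar panel rating
sun_hours = 4.0           # assumed usable sunshine hours per day
system_volts = 5.0

needed_wh = draw_amps * system_volts * hours_per_day    # energy used per day
harvested_wh = panel_watts * sun_hours * 0.7            # ~70% to allow for charging losses

print(f"daily use: {needed_wh:.1f} Wh, daily harvest: {harvested_wh:.1f} Wh")
print(f"margin: {harvested_wh - needed_wh:.1f} Wh per day")
```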

Raspberry Pi- and solar-powered nature camera

He used a LiPo SHIM to boost the battery’s output to the 5V the Zero needs. He also added a BH1750 lux sensor to shut off the LiPo SHIM, and thus the Pi, whenever it’s too dark for decent video.
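The BH1750 is an I2C sensor, so reading it from Python takes only a few lines. The sketch below reads the light level with the smbus library and decides whether it’s bright enough to keep filming; the address and mode byte are the sensor’s common defaults, the 10-lux cut-off is my own guess, and how the reading actually gates the LiPo SHIM is specific to reichley’s wiring.

```python
# Minimal BH1750 light-level read over I2C. The 0x23 address and continuous
# high-resolution mode (0x10) are the sensor's usual defaults; the 10-lux
# threshold is a guess, and the LiPo SHIM shutdown wiring is project-specific.
import smbus

BH1750_ADDR = 0x23         # default I2C address (0x5C if the ADDR pin is pulled high)
CONT_HIGH_RES = 0x10       # continuous high-resolution measurement mode

bus = smbus.SMBus(1)
raw = bus.read_i2c_block_data(BH1750_ADDR, CONT_HIGH_RES, 2)
lux = ((raw[0] << 8) | raw[1]) / 1.2   # conversion factor from the datasheet

if lux < 10:
    print(f"{lux:.1f} lx - too dark, power down the camera")
else:
    print(f"{lux:.1f} lx - bright enough to record")
```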

Raspberry Pi- and solar-powered nature camera

To control the project, he used Calin Crisan’s motionEyeOS video surveillance operating system for single-board computers.

Build your own nature camera

To build your own version, follow reichley’s tutorial, in which you can also find links to all the necessary code and components. You can also check out our free tutorial for building an infrared bird box using the Raspberry Pi NoIR Camera Module. As Eben said in our YouTube live Q&A last week, we really like nature cameras here at Pi Towers, and we’d love to see yours. So if you have any live-stream links or photography from your Raspberry Pi–powered nature cam, please share them with us!

The post Build a solar-powered nature camera for your garden appeared first on Raspberry Pi.

The robotic teapot from your nightmares

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/robotic-teapot/

For those moments when you wish the cast of Disney’s Beauty and the Beast was real, only to realise what a nightmare that would be, here’s Paul-Louis Ageneau’s robotic teapot!

Paul-Louis Ageneau Robotic teapot Raspberry Pi Zero

See what I mean?

Tale as old as time…

It’s the classic story of guy meets digital killer teapot, digital killer teapot inspires him to 3D print his own. Loosely based on a boss level of the video game Alice: Madness Returns, Paul-Louis’s creation is a one-eyed walking teapot robot with a (possible) thirst for blood.

Kill Build the beast

“My new robot is based on a Raspberry Pi Zero W with a camera,” Paul-Louis explains in his blog. “It is connected via a serial link to an Arduino Pro Mini board, which drives servos.”

Each leg has two points of articulation, one for the knee and one for the ankle. In order to move each of the joints, the teapot uses eight servo motors in total.

Paul-Louis Ageneau Robotic teapot Raspberry Pi Zero

Paul-Louis designed and 3D printed the body of the teapot to fit the components needed. So if you’re considering this build as a means of acquiring tea on your laziest of days, I hate to be the bearer of bad news, but the most you’ll get from your pour will be jumper leads and Pi.

Paul-Louis Ageneau Robotic Raspberry Pi Zero teapot

While the Arduino board controls the legs, it’s the Raspberry Pi’s job to receive user commands and tell the board how to direct the servos. The protocol for moving the servos is simple: short lines of characters specify the instructions. First a digit from 0 to 7 selects a servo; next comes the angle of movement, such as 45 or 90; and finally, a C commits the instruction.
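Paul-Louis’s blog has the real firmware and command set; as a rough sketch of what sending those commands from the Pi might look like with pyserial (the exact framing, serial device, baud rate, and servo numbering below are assumptions):

```python
# Sketch of driving the teapot's servos from the Pi over the serial link.
# The command format follows the description above (servo digit, angle, then C
# to commit), but the framing, port, and baud rate are assumptions;
# Paul-Louis's blog has the real protocol.
import time
import serial

link = serial.Serial("/dev/serial0", 9600, timeout=1)   # UART to the Arduino Pro Mini

def move(servo: int, angle: int) -> None:
    """Select a servo (0-7), send an angle, then commit with 'C'."""
    command = f"{servo} {angle} C\n"
    link.write(command.encode("ascii"))

# Bend one leg slightly: knee then ankle (servo numbering is assumed)
move(0, 45)
time.sleep(0.3)
move(1, 90)
```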

Typing in commands is great for debugging, but you don’t want to be glued to a keyboard. Therefore, Paul-Louis continued to work on the code in order to string together several lines to create larger movements.

Paul-Louis Ageneau Robotic teapot Raspberry Pi Zero

The final control system of the teapot runs on a web browser as a standard four-axis arrow pad, with two extra arrows for turning.

Something there that wasn’t there before

Paul-Louis also included an ‘eye’ in the side of the pot to fit the Raspberry Pi Camera Module as another nod to the walking teapot from the video game, but with a purpose other than evil and wrongdoing. As you can see from the image above, the camera live-streams footage, allowing for remote control of the monster teapot regardless of your location.

If you like it all that much, it’s yours

In case you fancy yourself as an inventor, Paul-Louis has provided the entire build process and the code on his blog, documenting how to bring your own teapot to life. And if you’ve created any robotic household items or any props from video games or movies, we’d love to see them, so leave a link in the comments or share it with us across social media using the hashtag #IBuiltThisAndNowIThinkItIsTryingToKillMe.

The post The robotic teapot from your nightmares appeared first on Raspberry Pi.

PipeCam: the low-cost underwater camera

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/pipecam-low-cost-underwater-camera/

Fred Fourie is building a low-cost underwater camera for shallow deployment, and his prototypes are already returning fascinating results. You can build your own PipeCam, and explore the undiscovered depths with a Raspberry Pi and off-the-shelf materials.

PipeCam underwater Raspberry Pi Camera

Materials and build

In its latest iteration, PipeCam consists of a 110mm PVC waste pipe with fittings and a 10mm perspex window at one end. Previous prototypes have also used plumbing materials for the body, but this latest version employs heavy-duty parts that deliver the good seal this project needs.

PipeCam underwater Raspberry Pi Camera

In testing, Fred and a friend determined that the rig could withstand 4 bar of pressure. This is enough to protect the tech inside at the depths Fred plans for, and a significant performance improvement on previous prototypes.

PipeCam underwater Raspberry Pi Camera

Inside the pipe are a Raspberry Pi 3, a camera module, and a real-time clock add-on board. A 2.4Ah rechargeable lead acid battery powers the set-up via a voltage regulator.

Using foam and fibreboard, Fred made a mount that holds everything in place and fits snugly inside the pipe.

PipeCam underwater Raspberry Pi Camera

PipeCam will be subject to ocean currents, not to mention the attentions of sea creatures, so it’s essential to make sure that everything is held securely inside the pipe – something Fred has learned from previous versions of the project.

Software

It’s straightforward to write time-lapse code for a Raspberry Pi using Python and one of our free online resources, but Fred has more ambitious plans for PipeCam. As well as a Python script to control the camera, Fred made a web page to display the health of the device. It shows battery level and storage availability, along with the latest photo taken by the camera. He also made adjustments to the camera’s exposure settings using raspistill. You can see the effect in this side-by-side comparison of the default python-picam image and the edited raspistill one.
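Fred’s scripts are linked from his build log; for a flavour of the time-lapse side, here’s a minimal picamera sketch that captures a frame every minute and nudges the exposure settings, roughly the kind of adjustment the raspistill comparison below is about. The specific settings are illustrative, not Fred’s.

```python
# Minimal underwater-style time-lapse sketch with picamera. The exposure tweaks
# are illustrative; Fred tuned his via raspistill options, and his actual
# scripts are linked from the Hackaday.io build log.
from time import sleep
from picamera import PiCamera

camera = PiCamera(resolution=(1920, 1080))
camera.awb_mode = "auto"
camera.exposure_mode = "auto"
camera.iso = 200                 # lower ISO for less noise in dim water (a guess)
sleep(2)                         # give the auto-exposure time to settle

# capture_continuous yields numbered filenames until the loop is stopped
for filename in camera.capture_continuous("pipecam_{counter:04d}.jpg"):
    print("saved", filename)
    sleep(60)                    # one frame per minute
```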

PipeCam underwater Raspberry Pi Camera

Underwater testing

Fred has completed the first test of PipeCam, running the device for an hour at a depth of two metres off the coast near his home. And the results? Well, see for yourself:

PipeCam underwater Raspberry Pi Camera

PipeCam is a work in progress, and you can read Fred’s build log at the project’s Hackaday.io page, so be sure to follow along.

The post PipeCam: the low-cost underwater camera appeared first on Raspberry Pi.

qrocodile: the kid-friendly Sonos system

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/qrocodile-kid-friendly-sonos-system/

Chris Campbell’s qrocodile uses a Raspberry Pi, a camera, and QR codes to allow Chris’s children to take full control of the Sonos home sound system. And we love it!

qrocodile

Introducing qrocodile, a kid-friendly system for controlling your Sonos with QR codes. Source code is available at: https://github.com/chrispcampbell/qrocodile Learn more at: http://labonnesoupe.org https://twitter.com/chrscmpbll

Sonos

SONOS is SONOS backwards. It’s also SONOS upside down, and SONOS upside down and backwards. I just learnt that this means SONOS is an ambigram. Hurray for learning!

Sonos (the product, not the ambigram) is a multi-room speaker system controlled by an app. Speakers in different rooms can play different tracks or join forces to play one track for a smooth musical atmosphere throughout your home.

sonos raspberry pi

If you have a Sonos system in your home, I would highly recommend setting up access to it from outside your home and having it play the Imperial March as you walk through the front door. Why wouldn’t you?

qrocodile

One day, Chris’s young children wanted to play an album while eating dinner. This one request inspired him to create qrocodile, a musical jukebox enabling his children to control the songs Sonos plays, and where it plays them, via QR codes.

It all started one night at the dinner table over winter break. The kids wanted to put an album on the turntable (hooked up to the line-in on a Sonos PLAY:5 in the dining room). They’re perfectly capable of putting vinyl on the turntable all by themselves, but using the Sonos app to switch over to play from the line-in is a different story.

The QR codes represent commands (such as Play in the living room, Use the turntable, or Build a song list) and artists (such as my current musical crush Courtney Barnett or the Ramones).

qrocodile raspberry Pi

A camera attached to a Raspberry Pi 3 feeds the Pi the QR code that’s presented, and the Pi runs a script that recognises the code and sends instructions to Sonos accordingly.


Chris used a custom version of the Sonos HTTP API created by Jimmy Shimizu to gain access to Sonos from his Raspberry Pi. To build the QR codes, he wrote a script that utilises the Spotify API via the Spotipy library.
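Chris’s scripts are on GitHub; the fragment below is only a sketch of the two halves of the loop, decoding a QR code from a camera still with pyzbar and forwarding the result to a Sonos HTTP API endpoint. The endpoint paths shown are the style used by node-sonos-http-api servers, but treat the URL, room name, and command mapping as assumptions rather than Chris’s actual scheme.

```python
# Sketch of the qrocodile loop: snap a frame, decode the QR code, forward it to
# the Sonos HTTP API. The endpoint format, room name, and "cmd:" convention are
# assumptions; Chris's real scripts and command scheme are in his GitHub repo.
import requests
from picamera import PiCamera
from PIL import Image
from pyzbar.pyzbar import decode

SONOS_API = "http://localhost:5005"   # node-sonos-http-api style server (assumed)
ROOM = "Kitchen"

camera = PiCamera(resolution=(1024, 768))
camera.capture("frame.jpg")

codes = decode(Image.open("frame.jpg"))
if codes:
    payload = codes[0].data.decode("utf-8")      # e.g. a Spotify URI or a command
    if payload.startswith("spotify:"):
        requests.get(f"{SONOS_API}/{ROOM}/spotify/now/{payload}")
    elif payload == "cmd:turntable":             # invented command name
        requests.get(f"{SONOS_API}/{ROOM}/linein")
```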

His children are now able to present recognisable album art to the camera in order to play their desired track.

It’s been interesting seeing the kids putting the thing through its paces during their frequent “dance parties”, queuing up their favorite songs and uncovering new ones. I really like that they can use tangible objects to discover music in much the same way I did when I was their age, looking through my parents records, seeing which ones had interesting artwork or reading the song titles on the back, listening and exploring.

Chris has provided all the scripts for the project, along with a tutorial of how to set it up, on his GitHub — have a look if you want to recreate it or learn more about his code. Also check out Chris’ website for more on qrocodile and to see some of his other creations.

The post qrocodile: the kid-friendly Sonos system appeared first on Raspberry Pi.

facepunch: the facial recognition punch clock

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/facepunch-facial-recognition/

Get on board with facial recognition and clock your screen time with facepunch, the facial recognition punch clock from dekuNukem.

dekuNukem facepunch raspberry pi facial recognition

image c/o dekuNukem

How it works

dekuNukem uses a Raspberry Pi 3, the Raspberry Pi camera module, and an OLED screen for the build. You don’t strictly need to include the OLED board, but it definitely adds to the overall effect, letting you view your daily and weekly screen time at a glance without having to access your Raspberry Pi for data.

As dekuNukem explains in the GitHub repo for the build, they used a perf board to mount the screen and attached it to the Raspberry Pi. This is a nice, simple means of pulling the whole project together without loose wires or the need for a modified case.

dekuNukem facepunch raspberry pi facial recognition

image c/o dekuNukem

The face_recognition library lets the Pi and camera register your face. You’ll also need a well-lit 400×400 photograph of yourself to act as a reference for the library. From there, a few commands should get you started.
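dekuNukem’s repo has the full scripts and setup steps; as a minimal sketch of the recognition step itself with the face_recognition library (the filenames and tolerance below are illustrative):

```python
# Minimal sketch of the recognition step with the face_recognition library.
# Filenames are illustrative; dekuNukem's repo has the actual scripts and the
# punch-clock logic that sits on top of this.
import face_recognition

# The well-lit 400x400 reference photo mentioned above
reference = face_recognition.load_image_file("me.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

# A frame grabbed from the Pi camera (e.g. with picamera or raspistill)
frame = face_recognition.load_image_file("frame.jpg")
encodings = face_recognition.face_encodings(frame)

at_desk = any(
    face_recognition.compare_faces([reference_encoding], enc, tolerance=0.6)[0]
    for enc in encodings
)
print("clocked in" if at_desk else "away from desk")
```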

Uses for facial recognition

You could simply use facepunch for its intended purpose, but here at Pi Towers we’ve been discussing further uses for the build. We’re all guilty of sitting for too long at our desks, so why not incorporate a “get up and walk around” notification? How about a flashing LED that tells you to “drink some water”? You could even go a little deeper (though possibly a little Big Brother) and set up an “I’m back at my desk” notification on Slack, to let your colleagues know you’re available.

You could also take this foray into facial recognition and incorporate it into home automation projects: a user-identifying Magic Mirror, perhaps, or a doorbell that recognises friends and family.

What would you do with facial recognition on a Raspberry Pi?

The post facepunch: the facial recognition punch clock appeared first on Raspberry Pi.