
HIIT Pi makes Raspberry Pi your home workout buddy

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/hiit-pi-makes-raspberry-pi-your-home-workout-buddy/

Has your fitness suffered during lockdown? Have you been able to keep up diligently with your usual running routine? Maybe you found it easy to recreate your regular gym classes in your lounge with YouTube coaches. Or maybe, like a lot of us, you’ve not felt able to do very much at all, and needed a really big push to keep moving.

From James’s YouTube channel

Maker James Wong took to Raspberry Pi to develop something that would hold him accountable for his daily HIIT workouts, and hopefully keep his workouts on track while alone in lockdown.

What is a HIIT workout?

HIIT stands for High-Intensity Interval Training. It’s the best kind of exercise, in that it doesn’t last long and it’s effective: you do short bursts of high-intensity physical movement between short, regular rest periods.

James’s model can detect how well you perform a burpee, as well as many other exercise movements

James was attracted to HIIT during lockdown as it didn’t require any gym visits or expensive exercise equipment. He had access to endless online training sessions, but felt he needed that extra level of accountability to make sure he kept up with his at-home fitness regime. Hence, HIIT Pi.

So what does HIIT Pi actually do?

HIIT Pi is a web app that uses machine learning on Raspberry Pi to help track your workout in real time. Users can interact with the app via any web browser running on the same local network as the Raspberry Pi, be that on a laptop, tablet, or smartphone.

An iPad accessing a remote server running on James’s Raspberry Pi

HIIT Pi is simple in that it only does two things:

  • Uses computer vision to automatically capture and track detected poses and movement
  • Scores them according to a set of rules and standards
HIIT Pi is watching you via a Raspberry Pi camera module (top right)

So, essentially, you’ve got a digital personal trainer in the room monitoring your movements and letting you know whether they’re up to standard and whether you’re likely to achieve your fitness goals.

James calls HIIT Pi an “electronic referee”, and we agree that if we had one of those in the room while muddling through a Yoga With Adriene session on YouTube, we would try a LOT harder.

How does it work?

A Raspberry Pi Camera Module streams raw image data from the sensor at roughly 30 frames per second. James devised a custom recording stream handler that feeds frames from the video stream into a pose estimation model, spitting out pose confidence scores using pre-set keypoint position coordinates.
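James’s own code does the heavy lifting here, but the general pattern is easy to sketch. The snippet below is a minimal illustration rather than HIIT Pi itself: the model file name, input format, and scoring step are all assumptions.

import io
import numpy as np
from picamera import PiCamera
from PIL import Image
from tflite_runtime.interpreter import Interpreter

# Load a TFLite pose estimation model (file name is a placeholder;
# this sketch assumes a quantised model taking uint8 input)
interpreter = Interpreter(model_path="pose_model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
_, height, width, _ = inp["shape"]

with PiCamera(resolution=(640, 480), framerate=30) as camera:
    stream = io.BytesIO()
    # Pull JPEG frames continuously from the camera's video port
    for _ in camera.capture_continuous(stream, format="jpeg", use_video_port=True):
        stream.seek(0)
        frame = Image.open(stream).convert("RGB").resize((width, height))
        interpreter.set_tensor(inp["index"], np.expand_dims(np.asarray(frame), 0))
        interpreter.invoke()
        keypoints = interpreter.get_tensor(out["index"])  # coordinates and confidences
        # ...score the pose against pre-set keypoint positions here...
        stream.seek(0)
        stream.truncate()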

HIIT Pi uses Dash, a laudable open source tool from the Plotly team

James’s original project post details the inner workings. You can also grab the code needed to create your own at-home Raspberry Pi personal trainer.

Get in touch with James here.


Machine Learning made easy with Raspberry Pi, Adafruit and Microsoft

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/machine-learning-made-easy-with-raspberry-pi-adafruit-and-microsoft/

Machine learning can sound daunting even for experienced Raspberry Pi hobbyists, but Microsoft and Adafruit Industries are determined to make it easier for everyone to have a go. Microsoft’s Lobe tool takes the stress out of training machine learning models, and Adafruit have developed an entire kit around their BrainCraft HAT, featuring Raspberry Pi 4 and a Raspberry Pi Camera, to get your own machine learning project off to a flying start.

Adafruit developed this kit especially for the BrainCraft HAT to be used with Microsoft Lobe on Raspberry Pi

Adafruit’s BrainCraft HAT

Adafruit’s BrainCraft HAT fits on top of Raspberry Pi 4 and makes it really easy to connect hardware and debug machine learning projects. The 240 x 240 colour display screen also lets you see what the camera sees. Two microphones allow for audio input, and access to the GPIO means you can connect things like relays and servos, depending on your project.

Adafruit’s BrainCraft HAT in action detecting a coffee mug

Microsoft Lobe

Microsoft Lobe is a free tool for creating and training machine learning models that you can deploy almost anywhere. The hardest part of machine learning is arguably creating and training a new model, so this tool is a great way for newbies to get stuck in, as well as being a fantastic time-saver for people who have more experience.

Get started with one of the three tutorials (easy, medium, and hard) featured in the lobe-adafruit-kit GitHub repo.

This is just a quick snippet of Microsoft’s full Lobe tutorial video.
Look how quickly the tool takes enough photos to train a machine learning model

‘Bakery’ identifies and prices different pastries

Lady Ada demonstrated Bakery: a machine learning model that uses an Adafruit BrainCraft HAT, a Raspberry Pi camera, and Microsoft Lobe. Watch how easy it is to train a new machine learning model in Microsoft Lobe from this point in the Microsoft Build Keynote video.

A quick look at Bakery from Adafruit’s delightful YouTube channel

Bakery identifies different baked goods based on images taken by the Raspberry Pi camera, then automatically identifies and prices them, in the absence of barcodes or price tags. You can’t stick a price tag on a croissant. There’d be flakes everywhere.

Extra functionality

Running this project on Raspberry Pi means that Lady Ada was able to hook up lots of other useful tools. In addition to the Raspberry Pi camera and the HAT, she is using:

  • Three LEDs that glow green when an object is detected
  • A speaker and some text-to-speech code that announces which object is detected
  • A receipt printer that prints out the product name and the price
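Lobe can export the trained model for Raspberry Pi, and its Python library turns a prediction into a one-liner. Here’s a rough sketch of the detection-plus-LED pattern; the pin number, model path, and label are made up for illustration, and this isn’t Adafruit’s actual code.

from gpiozero import LED
from lobe import ImageModel  # pip3 install lobe

led = LED(17)                                    # detection LED (pin assumed)
model = ImageModel.load("/home/pi/lobe-model")   # folder exported from Lobe (path assumed)

result = model.predict_from_file("snapshot.jpg")
print(result.prediction)                         # the top label
if result.prediction == "croissant":             # hypothetical label from training
    led.on()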

All of this running on Raspberry Pi, and made super easy with Microsoft Lobe and Adafruit’s BrainCraft HAT. Adafruit’s Microsoft Machine Learning Kit for Lobe contains everything you need to get started.

The full Microsoft Machine Learning Kit for Lobe with Raspberry Pi 4 kit

Watch the Microsoft Build keynote

And finally, watch Microsoft CTO Kevin Scott introduce Limor Fried, aka Lady Ada, owner of Adafruit Industries. Lady Ada joins remotely from the Adafruit factory in Manhattan, NY, to show how the BrainCraft HAT and Lobe work to make machine learning accessible.


Classify your trash with Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/classify-your-trash-with-raspberry-pi/

Maker Jen Fox took to hackster.io to share a Raspberry Pi–powered trash classifier that tells you whether the trash in your hand is recyclable, compostable, or just straight-up garbage.

Jen reckons this project is beginner-friendly, as you don’t need any code to train the machine learning model, just a little to load it on Raspberry Pi. It’s also a pretty affordable build, costing less than $70 including a Raspberry Pi 4.

“Haz waste”?!

Hardware:

  • Raspberry Pi 4 Model B
  • Raspberry Pi Camera Module
  • Adafruit push button
  • Adafruit LEDs
Watch Jen giving a demo of her creation

Software

The code-free machine learning model is created using Lobe, a desktop tool that automatically trains a custom image classifier based on what objects you’ve shown it.

The image classifier correctly guessing it has been shown a bottle cap

Training the image classifier

Basically, you upload a tonne of photos and tell Lobe what object each of them shows. Jen told the empty classification model which photos were of compostable waste, which were of recyclable items, and which were of garbage or bio-hazardous waste. Of course, as Jen says, “the more photos you have, the more accurate your model is.”

Loading up Raspberry Pi

The Raspberry Pi Camera Module attached to Raspberry Pi 4

As promised, you only need a little bit of code to load the image classifier onto your Raspberry Pi. The Raspberry Pi Camera Module acts as the image classifier’s “eyes” so Raspberry Pi can find out what kind of trash you hold up for it.

The push button and LEDs are wired up to the Raspberry Pi GPIO pins, and they work together with the camera and light up according to what the image classifier “sees”.

Here’s the Fritzing diagram showing how to wire the push button and LEDs to the Raspberry Pi GPIO pins
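Putting those pieces together, the glue code can stay very small. The sketch below shows the general shape; the pin numbers, labels, and file paths are assumptions rather than Jen’s actual values.

from signal import pause
from gpiozero import Button, LED
from picamera import PiCamera
from lobe import ImageModel

button = Button(21)
leds = {"compost": LED(16), "recycle": LED(20), "trash": LED(12)}  # assumed labels and pins
model = ImageModel.load("/home/pi/trash-model")  # assumed path to the exported Lobe model
camera = PiCamera()

def classify():
    # Snap a photo, classify it, and light only the matching LED
    camera.capture("/tmp/trash.jpg")
    label = model.predict_from_file("/tmp/trash.jpg").prediction
    for name, led in leds.items():
        if name == label:
            led.on()
        else:
            led.off()

button.when_pressed = classify
pause()  # wait for button presses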

You’ll want to create a snazzy case so your trash classifier looks good mounted on the wall. Jen cut holes in a cardboard box to make sure that the camera could “see” out, the user can see the LEDs, and the push button is accessible. Remember to leave room for Raspberry Pi’s power supply to plug in.

Jen’s hand-painted case mounted to the wall, having a look at a plastic bag

Jen has tonnes of other projects on her Hackster profile — check out the micro:bit Magic Wand.


Defeat evil with a Raspberry Pi foam-firing spy camera

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/defeat-evil-with-a-raspberry-pi-foam-firing-spy-camera/

Ruth and Shawn from YouTube channel Kids Invent Stuff picked a cool idea by 9-year-old Nathan, who drew a Foam-Firing Spy Camera, to recreate in real life.

FYI: that’s not really a big camera lens…

The trick with spy devices is to make sure they look as much as possible like the object they’re hidden inside. Raspberry Pi comes in to make sure the foam camera can also be used as a real photo-taking camera, to throw the baddies off the scent if they start fiddling with your spyware.

Here’s the full build video by Kids Invent Stuff

The foam-firing bit of Nathan’s invention was relatively simple to recreate – a modified chef’s squirty cream dispenser, hidden inside a camera-shaped box, gets the job done.

Squirty cream thing painted black and mounted onto camera-shaped frame

Ruth and Shawn designed a load of 3D-printed panels to mount on the box frame in the image above. One of those cool coffee cups that look like massive camera lenses hides the squirty cream dispenser and gives this build an authentic camera look.

THOSE cool camera lens-shaped coffee cups, see?

Techy bits from the build:

  • Raspberry Pi
  • Infrared LED
  • Camera module
  • Mini display screen
All the bits mentioned in the list above

The infrared LED is mounted next to the camera module and switches on when it gets dark, giving you night vision.
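The video doesn’t show how the switching is done, but one plausible way to get that behaviour on a Raspberry Pi is gpiozero’s LightSensor class; the pins and the light-dependent resistor circuit below are assumptions.

from signal import pause
from gpiozero import LightSensor, LED

ir_led = LED(18)         # infrared LED (pin assumed)
sensor = LightSensor(4)  # LDR circuit on GPIO 4 (assumed)

sensor.when_dark = ir_led.on    # night vision on when the light level drops
sensor.when_light = ir_led.off
pause()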

The mini display screen serves as a ‘lid’ to the blue case protecting the Raspberry Pi and mounts into the back panel of the ‘camera’

The Raspberry Pi computer and its power bank are crammed inside the box-shaped part, with the camera module and infrared LED mounted to peek out of custom-made holes in one of the 3D-printed panels on the front of the box frame.

The night vision mini display screen in action on the back of the camera

The foam-firing chef’s thingy is hidden inside the big fake lens, wedged in so that lifting the lens depresses the lever on the squirty cream dispenser, firing foam out of a tube near where the camera lens and infrared LED peek out on the front panel of the build.

Watch the #KidsInventStuff presenters test out Nathan’s invention

Baddies don’t stand a chance!


13 Raspberry Pis slosh-test space shuttle tanks in zero gravity

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/13-raspberry-pis-slosh-test-space-shuttle-tanks-in-zero-gravity/

High-school student Eleanor Sigrest successfully crowdfunded her way onto a zero-G flight to test her latest Raspberry Pi-powered project. NASA Goddard engineers peer-reviewed Eleanor’s experimental design, which detects unwanted movement (or ‘slosh’) in spacecraft fluid tanks.

The Raspberry Pi-packed setup

The apparatus features an accelerometer to precisely determine the moment of zero gravity, along with 13 Raspberry Pis and 12 Raspberry Pi cameras to capture the slosh movement.

What’s wrong with slosh?

The Broadcom Foundation shared a pretty interesting minute-by-minute report on Eleanor’s first parabolic flight and how she got everything working. But, in a nutshell…

The full apparatus onboard the zero gravity flight

You don’t want the fluid in your space shuttle tanks sloshing around too much. It’s a mission-ending problem. Slosh occurs on take-off and also in microgravity during manoeuvres, so Eleanor devised this novel approach to managing it in place of the costly, heavy subsystems currently used on board spacecraft.

Eleanor wanted to prove that the fluid inside tanks treated with superhydrophobic and superhydrophilic coatings settled quicker than in uncoated tanks. And she was right: settling times were reduced by 73% in some cases.

Eleanor at work

A continuation of this experiment is due to go up on Blue Origin’s New Shepard rocket – and yes, a patent is already pending.

Curiosity, courage & compromise

At just 13 years old, Eleanor won the Samueli Prize at the 2016 Broadcom MASTERS for her mastery of STEM principles and team leadership during a rigorous week-long competition. High praise came from Paula Golden, President of Broadcom Foundation, who said: “Eleanor is the epitome of a young woman scientist and engineer. She combines insatiable curiosity with courage: two traits that are essential for a leader in these fields.”

Eleanor aged 13 with her award-winning project ‘Rockets & Nozzles & Thrust… Oh My’

That week-long experience also included a Raspberry Pi Challenge, and Eleanor explained: “During the Raspberry Pi Challenge, I learned that sometimes the simplest solutions are the best. I also learned it’s important to try everyone’s ideas because you never know which one might work the best. Sometimes it’s a compromise of different ideas, or a compromise between complicated and simple. The most important thing is to consider them all.”

Get this girl to Mars already.


3D-printable cases for the Raspberry Pi High Quality Camera

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/3d-printable-cases-for-the-raspberry-pi-high-quality-camera/

Earlier this year, we released the Raspberry Pi High Quality Camera, a brand-new 12.3 megapixel camera that allows you to use C- and CS-mount lenses with Raspberry Pi boards.

We love it. You love it.

How do we know you love it? Because the internet is now full of really awesome 3D-printable cases and add-ons our community has created in order to use their High Quality Camera out and about…or for Octoprint…or home security…or SPACE PHOTOGRAPHY, WHAT?!

The moon, captured by a Raspberry Pi High Quality Camera. Credit: Greg Annandale

We thought it would be fun to show you some of the 3D designs we’ve seen pop up on sites like Thingiverse and MyMiniFactory, so that anyone with access to a 3D printer can build their own camera setup too!

Adafruit did a thing, obvs

Shout out to our friends at Adafruit for this really neat, retro-looking camera case designed by the Ruiz Brothers. The brown filament used for the casing is so reminiscent of the leather bodies of SLRs from my beloved 1980s childhood that I can’t help but be drawn to it. And, with snap-fit parts throughout, you can modify this case model as you see fit. Not bad. Not bad at all.

Nikon to Raspberry Pi

While the Raspberry Pi High Quality Camera is suitable for C- and CS-mount lenses out of the box, this doesn’t mean you’re limited to only these sizes! There’s a plethora of C- and CS-mount adapters available on the market, and you can also 3D print your own adapter.

Thingiverse user UltiArjan has done exactly that and designed this adapter for using Nikon lenses with the High Quality Camera. Precision is key here to get a snug thread, so you may have to fiddle with your printer settings to get the right fit.

And, for the Canon users out there, here’s Zimbo1’s adapter for Canon EF lenses!

Raspberry Pi Zero minimal adapter

If you’re not interested in a full-body camera case and just need something to attach A to B, this minimal adapter for the Raspberry Pi Zero will be right up your street.

Designer ed7coyne put this model together in order to use Raspberry Pi Zero as a webcam; according to Cura on my laptop, it should only take about 2 hours to print at 0.1mm layer height with supports. In fact, since I’ve got Cura open already…

3D print a Raspberry Pi High Quality Camera?!

Not a working one, of course, but if you’re building something around the High Quality Camera and want to make sure everything fits without putting the device in jeopardy, you could always print a replica for prototyping!

Thingiverse user tmomas produced this scale replica of the Raspberry Pi High Quality Camera with the help of reference photos and technical drawings, and a quick search online will uncover similar designs for replicas of other Raspberry Pi products you might want to use while building a prototype.

Bonus content alert

We made this video for HackSpace magazine earlier this year, and it’s a really handy resource if you’re new to the 3D printing game.

Also…

…I wasn’t lying when I said I was going to print ed7coyne’s minimal adapter.


Teaching pigeons with Raspberry Pi

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/teaching-pigeons-with-raspberry-pi/

It’s been a long lockdown for one of our favourite makers, Pi & Chips. Like most of us (probably), they have turned their hand to training small animals that wander into their garden to pass the time — in this case, pigeons. I myself enjoy raising my glass to the squirrel that runs along my back fence every evening at 7pm.

Of course, Pi & Chips has taken this one step further and created a food dispenser with a motion-activated camera, powered by a Raspberry Pi 3B+, to test the intelligence of these garden critters and capture their efforts live.

Bird behaviour

Looking into the cognitive behaviour of birds (and finding the brilliantly titled paper Maladaptive gambling by pigeons), Pi & Chips discovered that pigeons can, with practice, recognise objects including buttons and then make the mental leap to realise that touching these buttons actually results in something happening. So they set about building a project to see this in action.

Enter the ‘SmartFrank 3000’, named after the bossiest bird to grace Pi & Chips’s shed roof over the summer.

Steppers and servos

The build itself is a simple combo of a switch and dispenser. But it quickly became apparent that any old servo wasn’t going to be up to the job: it couldn’t open and close a hatch quickly enough, or hold the hatch shut strongly enough.

The motor setup

Running a few tests with a stepper motor confirmed that this was the perfect choice, as it could move quickly enough, and was strong enough to hold back a fair weight of seed when not in operation.

It took a while to get the timing on the stepper just right to give a pretty consistent delivery of the seed…
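For anyone attempting something similar, the dispense routine boils down to stepping the motor a fixed distance to open the flap, pausing, and stepping back. This is a hedged sketch rather than Pi & Chips’s code: the pins, step count, and delays are guesses to tune against your own seed.

import time
import RPi.GPIO as GPIO

STEP_PIN, DIR_PIN = 20, 21  # connections to the stepper driver board (assumed)

GPIO.setmode(GPIO.BCM)
GPIO.setup([STEP_PIN, DIR_PIN], GPIO.OUT)

def step(n, delay=0.002):
    # Pulse the driver n times; a shorter delay means faster movement
    for _ in range(n):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(delay)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay)

def dispense(steps=200):
    GPIO.output(DIR_PIN, GPIO.HIGH)  # open the flap
    step(steps)
    time.sleep(0.5)                  # let one dose of seed fall through
    GPIO.output(DIR_PIN, GPIO.LOW)   # reverse direction
    step(steps)                      # close the flap again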

A 3D-printed flap for the stepper was also fashioned, plus a nozzle that fits over the neck of a two-litre drinks bottle, and some laser-cut pieces to make a frame to hold it all together.

The switch

Now for the switch that Frank the pigeon was going to have to touch if it wanted any bird seed. Pi & Chips came up with this design made from 3mm ply and some sponge as the spring.

They soldered some wires to a spring clip from an old photo frame and added a bolt and two nuts. The second nut allowed very fine adjustment of the distance to make sure the switch could be triggered by as light a touch as possible.

Behind the scenes

Behind the scenes setup

Behind the scenes there’s a Raspberry Pi 3B+ running the show, together with a motor controller board for the stepper motor. This board runs from its own battery pack, as it needs 12V power, which is more than Raspberry Pi can supply directly. A Raspberry Pi Camera Module has also been added and runs this motion detection script to start recording whenever a likely bird candidate steps up to the plate for dinner. Hopefully, we can soon get some footage of Frank the pigeon learning and earning!
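The script itself isn’t reproduced here, but the usual picamera recipe is to compare successive low-resolution frames and start recording when enough pixels change. A sketch of that idea, with made-up thresholds:

import time
import numpy as np
from picamera import PiCamera

THRESHOLD = 25        # per-pixel difference that counts as "changed"
TRIGGER_PIXELS = 300  # how many changed pixels trigger a recording

with PiCamera(resolution=(1280, 720), framerate=30) as camera:
    prev = None
    while True:
        # Grab a tiny RGB frame for a cheap frame-to-frame comparison
        current = np.empty((48, 64, 3), dtype=np.uint8)
        camera.capture(current, format="rgb", resize=(64, 48), use_video_port=True)
        if prev is not None:
            diff = np.abs(current.astype(int) - prev.astype(int)).max(axis=2)
            if (diff > THRESHOLD).sum() > TRIGGER_PIXELS:
                stamp = time.strftime("%Y%m%d-%H%M%S")
                camera.start_recording("/home/pi/bird-%s.h264" % stamp)
                camera.wait_recording(10)  # record ten seconds of pigeon
                camera.stop_recording()
        prev = current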


Processing raw image files from a Raspberry Pi High Quality Camera

Post Syndicated from David Plowman original https://www.raspberrypi.org/blog/processing-raw-image-files-from-a-raspberry-pi-high-quality-camera/

When taking photos, most of us simply like to press the shutter button on our cameras and phones so that a viewable image is produced almost instantaneously, usually encoded in the well-known JPEG format. However, there are some applications where a little more control over the production of that JPEG is desirable. For instance, you may want more or less de-noising, or you may feel that the colours are not being rendered quite right.

This is where raw (sometimes RAW) files come in. A raw image in this context is a direct capture of the pixels output from the image sensor, with no additional processing. Normally this is in a relatively standard format known as a Bayer image, named after Bryce Bayer who pioneered the technique back in 1974 while working for Kodak. The idea is not to let the on-board hardware ISP (Image Signal Processor) turn the raw Bayer image into a viewable picture, but instead to do it offline with an additional piece of software, often referred to as a raw converter.

A Bayer image records only one colour at each pixel location, in the pattern shown

The raw image is sometimes likened to the old photographic negative, and whilst many camera vendors use their own proprietary formats, the most portable form of raw file is the Digital Negative (or DNG) format, defined by Adobe in 2004. The question at hand is how to obtain DNG files from Raspberry Pi, in such a way that we can process them using our favourite raw converters.

Obtaining a raw image from Raspberry Pi

Many readers will be familiar with the raspistill application, which captures JPEG images from the attached camera. raspistill includes the -r option, which appends all the raw image data to the end of the JPEG file. JPEG viewers will still display the file as normal but ignore the (many megabytes of) raw data tacked on the end. Such a “JPEG+RAW” file can be captured using the terminal command:

raspistill -r -o image.jpg

Unfortunately this JPEG+RAW format is merely what comes out of the camera stack and is not supported by any raw converters. So to make use of it we will have to convert it into a DNG file.

PyDNG

This Python utility converts the Raspberry Pi’s native JPEG+RAW files into DNGs. PyDNG can be installed from github.com/schoolpost/PyDNG, where more complete instructions are available. In brief, we need to perform the following steps:

git clone https://github.com/schoolpost/PyDNG
cd PyDNG
pip3 install src/.  # note that PyDNG requires Python3

PyDNG can be used as part of larger Python scripts, or it can be run stand-alone. Continuing the raspistill example from before, we can enter in a terminal window:

python3 examples/utility.py image.jpg
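And if you’d rather call PyDNG from a larger Python script than from the shell, the project’s README (at the time of writing) exposes the converter as a class, along these lines:

from pydng.core import RPICAM2DNG

d = RPICAM2DNG()
d.convert("image.jpg")  # writes image.dng alongside the input file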

The resulting DNG file can be processed by a variety of raw converters. Some are free (such as RawTherapee or dcraw, though the latter is no longer officially developed or supported), and there are many well-known proprietary options (Adobe Camera Raw or Lightroom, for instance). Perhaps users will post in the comments any that they feel have given them good results.

White balancing and colour matrices

Now, one of the bugbears of processing Raspberry Pi raw files up to this point has been the problem of getting sensible colours. Previously, the images have been rendered with a sickly green cast, simply because no colour balancing is being done and green is normally the most sensitive colour channel. In fact it’s even worse than this, as the RGB values in the raw image merely reflect the sensitivity of the sensor’s photo-sites to different wavelengths, and do not a priori have more than a general correlation with the colours as perceived by our own eyes. This is where we need white balancing and colour matrices.

Correct white balance multipliers are required if neutral parts of the scene are to look, well, neutral. We can use raspistill’s guesstimate of them, found in the JPEG+RAW file (or you can measure your own on a neutral part of the scene, like a grey card). Matrices and look-up tables are then required to convert colour from ‘camera’ space to the final colour space of choice, mostly sRGB or Adobe RGB.
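As a toy numpy illustration of those two steps, with invented numbers that bear no relation to a calibrated profile:

import numpy as np

rgb = np.random.rand(4, 4, 3)          # stand-in for demosaicked camera RGB data
wb_gains = np.array([1.8, 1.0, 1.6])   # example R, G, B white balance multipliers
ccm = np.array([[ 1.6, -0.4, -0.2],    # illustrative camera-to-sRGB colour matrix
                [-0.3,  1.5, -0.2],
                [ 0.0, -0.6,  1.6]])

balanced = rgb * wb_gains                   # neutral greys now have R = G = B
srgb = np.clip(balanced @ ccm.T, 0.0, 1.0)  # rotate into the output colour space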

My thanks go to forum contributors Jack Hogan for measuring these colour matrices, and to Csaba Nagy for implementing them in the PyDNG tool. The results speak for themselves.

Results

Previous attempts at raw conversion are on the left; the results using the updated PyDNG are on the right.

DCP files

For those familiar with DNG files, we include links to DCP (DNG Camera Profile) files (warning: binary format). You can try different ones out in raw converters, and we would encourage users to experiment, to perhaps create their own, and to share their results!

  1. This is a basic colour profile baked into PyDNG, and is the one shown in the results above. It’s sufficiently small that we can view it as a JSON file.
  2. This is an improved (and larger) profile involving look-up tables, and aiming for an overall balanced colour rendition.
  3. This is similar to the previous one, but with some adjustments for skin tones and sky colours.

Note, however, that these files come with a few caveats. Specifically:

  • The calibration is only for a single Raspberry Pi High Quality Camera rather than a known average or “typical” module.
  • The illuminants used for the calibration are merely the ones that we had to hand — the D65 lamp in particular appears to be some way off.
  • The calibration only really works when the colour temperature lies between, or not too far from, the two calibration illuminants, approximately 2900K to 6000K in our case.

So there remains room for improvement. Nevertheless, results across a number of modules have shown these parameters to be a significant step forward.

Acknowledgements

My thanks again to Jack Hogan for performing the colour matrix calibration with DCamProf, and to Csaba Nagy for adding these new features to PyDNG.

Further reading

  1. There are many resources explaining how a raw (Bayer) image is converted into a viewable RGB or YUV image, among them Jack’s blog post.
  2. To understand the role of the colour matrices in a DNG file, please refer to the DNG specification. Chapter 6 in particular describes how they are used.


Auto-blow bubbles with a Raspberry Pi-powered froggy

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/auto-blow-bubbles-with-a-raspberry-pi-powered-froggy/

8 Bits and a Byte created this automatic bubble machine, which is powered and controlled by a Raspberry Pi and can be switched on via the internet by fans of robots and/or bubbles.

They chose a froggy-shaped bubble machine, but you can repurpose whichever type you desire; it’s just easier to adapt a model running on two AA batteries.

Raspberry Pi connected to the relay module

Before the refurb, 8 Bits and a Byte’s battery-powered bubble machine was controlled by a manual switch, which turned the motor on and off inside the frog. If you wanted to watch the motor make the frog burp out bubbles, you needed to flick this switch yourself.

After dissecting their plastic amphibian friend, 8 Bits and a Byte hooked up its motor to Raspberry Pi using a relay module. They point to this useful walkthrough for help with connecting a relay module to Raspberry Pi’s GPIO pins.

Now the motor inside the frog can be turned on and off with the power of code. And you can become the controller of bubbles by logging in here and commanding the Raspberry Pi to switch on.
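On the Raspberry Pi side, driving a relay takes only a few lines of gpiozero. This is a sketch, not 8 Bits and a Byte’s code: the pin number is assumed, and many relay boards are active-low.

from time import sleep
from gpiozero import OutputDevice

relay = OutputDevice(17, active_high=False)  # relay IN pin on GPIO 17 (assumed)

def blow_bubbles(seconds=5):
    relay.on()  # energise the relay, powering the frog's motor
    sleep(seconds)
    relay.off()

blow_bubbles()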

A screenshot of the now-automated frog in situ, as seen on the remo.tv website

To let the internet’s bubble fans see the fruits of their one-click labour, 8 Bits and a Byte set up a Raspberry Pi Camera Module and connected their build to robot streaming platform remo.tv.

Bubble soap being poured into the plastic frog's mouth
Don’t forget your bubble soap!

Kit list:

  • Froggy bubble machine (or any battery-powered model)
  • Raspberry Pi
  • Relay module
  • Raspberry Pi Camera Module
  • Bubble soap

The only remaining question is: what’s the best bubble soap recipe?


Go sailing with this stop-motion 3D-printed boat

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/go-sailing-with-this-stop-motion-3d-printed-boat/

Shot on a Raspberry Pi Camera Module, this stop-motion sequence is made up of 180 photos that took two hours to shoot and another hour to process.

The trick lies in the Camera Module enabling you to change the alpha transparency of the overlay image, which is the previous frame. It’s all explained in the official documentation, but basically, the Camera Module’s preview permits multiple layers to be rendered simultaneously: text, image, etc. Being able to change the transparency from the command line means this maker could see how the next frame (or the object) should be aligned. In 2D animation, this process is called ‘onion skinning’.
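In picamera terms, the onion-skinning trick boils down to something like the sketch below; the resolution, alpha values, and file names are placeholders.

import time
from PIL import Image
from picamera import PiCamera

camera = PiCamera(resolution=(1024, 768))
camera.start_preview()

# Ghost the previous frame over the live preview at half transparency
prev = Image.open("frame_001.jpg").resize((1024, 768))
overlay = camera.add_overlay(prev.tobytes(), size=prev.size,
                             format="rgb", layer=3, alpha=128)

overlay.alpha = 64         # tweak the transparency while lining up the shot
time.sleep(30)             # move the model until it matches the ghosted frame
camera.remove_overlay(overlay)
camera.capture("frame_002.jpg")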

You can see the Raspberry Pi Camera Module on the bottom left in front of Yuksel’s hand

So why the Raspberry Pi Camera Module? Redditor /DIY_Maxwell aka Yuksel Temiz explains: “I make stop-motion animations as a hobby, using either my SLR or phone with a remote shutter. In most cases I didn’t need precision, but some animations like this are very challenging because I need to know the exact position of my object (the boat in this case) in each frame. The Raspberry Pi camera was great because I could overlay the previously captured frame into the live preview, and I could quickly change the transparency of the overlay to see how precise the location and how smooth the motion.”

You can easily make simple, linear stop-motion videos by just capturing your 3D printer while it’s doing its thing. Yuksel created a bolting horse (above) in that way. The boat sequence was more complicated though, because it rotates, and because pieces had to be added and removed.

The official docs are really comprehensive and span basic to advanced skill levels. Yuksel even walks you through getting started with the installation of Raspberry Pi OS.

Yuksel’s Raspberry Pi + Lego microscope

We’ve seen Yuksel’s handiwork before, and this new project was made in part by modifying the code from the open-source microscope (above) they made using Raspberry Pi and LEGO. They’re now planning to make a nice GUI and share the project as an open-source stop-motion animation tool.
