All posts by Andrew Gregory

Meet Anna Ploszajski: Where making and materials meet

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/meet-anna-ploszajski-where-making-and-materials-meet/

In the latest issue of HackSpace magazine, Andrew Gregory meets Anna Ploszajski to explore the bit of the Venn diagram where making and materials meet.

Anna Ploszajski (pronounced Por-shy-ski) is a cross-channel swimmer, a materials scientist, a writer, and a breaker-down of barriers to scientific understanding. 50% of the HackSpace editorial team listen to her podcast, Handmade, from which has arisen a book: Handmade: A scientist’s search for meaning through making. Naturally, we wanted to talk to her to find out why we humans do what we do when we turn object A into object B. That’s a pretty big question, but if anyone can answer it for us, Anna can.

anna ploszajski glass blowing
Anna’s journey into making began with watching a bit of broken glassware getting fixed
(Image: Charlie Murphy)

HackSpace: Hi Anna! You’ve written a book about making. Before we get on to that, though, we’d like to ask you about something you’ve been working on in your non-writing life – 4D printing. A while ago we saw a box with a hinged lid; the hinges were fabric, and the box was PLA, so you get the benefits of two types of material in one object. I guess what you’re doing is rather more advanced than that?

Anna Ploszajski: You say that, but I’ve been doing quite a lot of experiments in 3D printing onto fabric to try and make a 4D thing, because PLA has a kind of shape memory. It’s the experiment I describe at the end of the book, where I try to draw a conclusion about how my adventures in craft have also impacted my scientific research life. And the example that I use is this experiment that I did, 3D printing onto fabrics.

What I was doing began with pre-stressing normal textiles – I think there was a cotton and a linen – just stretching them out with my hands and then attaching them onto the print bed. So you’ve already put a kind of internal strain into the fabric; you then 3D-print a very simple design onto it, either a circle or just simple lines. And obviously, when you print onto it, the PLA plastic is bonded onto the textile. My idea was that if you then heated that material up, it would soften, and the tension that you’d put into the fabric would be released. So that was my idea. 

anna ploszajski with woollen materials
Anna’s mission is to make science available to non-scientists
(Image: Steve Cross)

My project was all to do with exploring this idea of 4D printing. So printing, using 3D printing, to make objects that move in some way after you’ve printed them. The thing about it is, it’s adjacent to this topic of smart materials. There’s a family of materials that have some kind of smart property, usually it’s colour-changing or shape-changing in response to an external stimulus. So, that could be temperature change, or light levels or moisture levels. 

And those smart materials are not actually that smart, it turns out, because what they do is really simple. Let’s take the example of a really simple shape change: wood is a really good example. It expands when it gets wet, and it contracts when it dries out. By our definition of a smart material, that is a smart material, because it changes shape when there’s a change in environment. And that’s a very simple movement. These smart materials tend to just flip-flop between two simple states – either an expanded state or a contracted state, in this example. That’s not actually that useful, unless you can use clever design to turn that simple movement into a more sophisticated kind of motion. 

A really good example in nature is the pine cone; the spines of a pine cone have this really ingenious bi-layer structure, where one side of them has a very hygroscopic wood – it expands a lot when it gets wet. And the other side doesn’t expand a lot when it gets wet. So, when the pine cone gets wet, it’s that bi-layer structure that causes that movement. The wood itself is just expanding. But the contrast between the two is what causes that motion. So I was trying to get inspired by that and combine, using clever design, a quite simple, smart material with some design that would combine it with a non-smart material that would cause some kind of motion.

It’s all to do with stored tension, and triggering that tension release. And to be honest with you, I didn’t get very far with it. I understand the material side; that was fine. And I could do all my experiments in the lab, and I could characterise the materials fine, but I just don’t have a designer’s brain. 

And that is what the book is about in a way: trying to access or tap into these other skills that designers and makers and craftspeople have which I don’t.

hand made bookcover
Anna’s book Handmade: A scientist’s search for meaning through making is available to buy now

HS: How much have you learned over the course of writing the book? You must have had to speak to all sorts of people to research it.

AP:  I think that meeting all those craftspeople, and getting a view into their world, really gave me an appreciation for exactly how much work and time and skill and practice goes into really honing these skills. Wood is a really good example: when I did the wood carving workshop with Barn the Spoon, it took me hours to make a spoon, and when I’d done it, mine didn’t look anything like his spoons. 

The skills themselves are often not that complicated or difficult to do. It’s the constant practice, refinement, and design that are the skills I didn’t necessarily have.

HS: What led you to write the book? 

AP:  A few things. Firstly, I wanted to write a popular science book that didn’t cater to the normal popular science audience, by which I mean people who are already relatively interested in science, the types of people who would browse the popular science sections in a bookshop and pick things up about space, or the gut, or whatever. I feel like that audience is already very well catered for.

What I wanted to do was try and write a popular book that would be read by someone who would never normally read a science book – that’s the whole of the rest of the population. So you’ll notice in the book that there are a lot of scientific analyses and explanations, but they’re all quite short. And my hope was that, if someone’s coming at this with not very much prior knowledge of science, they get to a description of the quantum mechanics behind why glass is transparent, but on the next page, we’re back to the story. And it’s really those stories that were the most important thing to me.

anna ploszajski
Like the sound of a materials scientist on a journey into making? Listen to Anna’s excellent Handmade podcast
(Image: Steve Cross)

And so, in each of the ten chapters on different materials, the story isn’t the story of the material – it’s the story of something else. So in Plastics, it’s the story of my Polish grandad and, you know, his life story throughout the 20th century, which intertwines with the story of the rise and fall of plastics. 

I wanted to draw all these other audiences in by storytelling, and then hopefully, sneak the science in when they weren’t looking. 

The story of the book itself is to do with feeling very inadequate, I suppose. I had this realisation, having walked into the Institute of Making for the first time, that I was supposedly this expert in materials, having studied the science of it, having studied it all on paper; but actually, there were all of these different people that had so much more in-depth knowledge than me. The craftspeople and the makers and the artists and the historians and the designers and the architects… And so it was them that I really wanted to spend time with and learn from. 

That was four years ago. That was when I started my podcast, which is also called Handmade. And that was where I started interviewing makers and craftspeople. And the book just grew from that. Quite a few of the people that I interviewed on the podcast have ended up being featured in the book as the very, very, very kind craftspeople that took me under their wing and showed me the ropes of what they do.

To take blacksmithing as one example – I thought I was an expert in materials, but I had never felt metal softening under my fingers. Yes, I knew the theory: I could draw you the iron-carbon phase diagram, I could talk about the phases and melting, and all of the ways that carbon and iron interact at the atomic level inside steel. But I’d never done it. And I didn’t know how hard you had to hit it to make it change shape. Agnes, the blacksmith who taught me, is just so, so brilliant – I’m such a huge fangirl of hers. And it was very humbling, actually, to spend time with people like that. 

anna ploszajski mug from ceramic materials
It’s one thing to understand the molecular changes that occur when you fire clay; it’s another thing entirely to be able to make a pot

HS: Getting to touch and feel the materials rather than study them, was there any one in particular that you gained an appreciation of? 

AP:  My favourite chapter in the book is Sugar, because it was the most fun story to write. And it’s the story of my English Channel swim. [Yes, you read that right – Anna has swum the English Channel.] One of the reasons, I think, is that it’s one of the strongest chapters for storytelling, because it’s this kind of archetypal physical journey from A to B, but also a journey of discovery about yourself. And intertwined in that story is the story of sugar, and all its different forms, and how it affects the body and the mind. 

In terms of the crafts, it was really wool that caught my imagination, and I’ve stuck with it. The story of wool is the story of my camper-van trip around Scotland and the north of England. I acquired wool from all these different places that I went to on my trip, and then knitted a patchwork blanket with all the wool I’d got from the different places. And through doing that, I taught myself how to knit, and I met all of these amazing knitters and wool-craft people throughout Scotland and the north of England, and chatted to them and got an insight into this amazing world of women who knit – and they were all women – and what it means to them, and how it connects them. And it’s very meditative, I find. That’s the craft that I’ve continued with since finishing the book a year ago. 

anna ploszajski made a blanket from wool materials
Knitting contains loads of mathematical patterns, which knitters seem to understand intuitively

I don’t know what it is about it. It just feels so nice to create something, you know, especially in the last year when we were all sitting at home watching Netflix and trawling through the movies and TV shows on there. That felt like perhaps a bit of a waste of time; but if I was knitting while watching TV, it wasn’t all a waste of time – I had something to show for it at the end. And I think that’s what craft gives us: a sense of purpose, almost, and a sense of achievement at the end. 

You know, to have that sense of achievement of ‘I’ve made this’ and now I can wear it, or now I can use it. I haven’t had that in science before. I only got that when I started entering this world of craft. 

HS: It sounds like you see a disconnect between science and making. Is that fair to say?

AP:  I’ve thought a lot about this: this kind of compartmentalising of making and science, or art and science as I talk about in the book (and I know that art and making are absolutely not the same thing). And I think there are a lot of reasons why the arts and sciences have been sort of severed from each other. In formal education, we separate them: at school, we often have to choose between those types of subjects. I ended up going down the science route, but I did A-level music. I love writing and music and history, and I was always crap at art, but I enjoy it. I think it’s really unhelpful that we do that, because it means that we brand people as ‘you’re a scientist’, or ‘you’re more of an artist’. And actually, I think the majority of people are probably somewhere in the middle: they have an interest in both. 

anna ploszajski socks
Wool was hugely important for England’s development into a major mediaeval power. It’s also good for keeping your feet warm

It’s a real shame that we often get siphoned off into these different camps, and often don’t get the chance to rediscover the other one. As someone who was siphoned off into the scientific track, it was really liberating to be able to discover the craft and artistic world. It was, like I say, very humbling. It was also really nice to be a complete beginner again at something, to be able to ask the silly questions from a place of curiosity, with no pressure, no educational pressure. I wasn’t trying to achieve anything apart from trying to make a spoon, or forge a knife, or throw a pot, or whatever it was. 

Materials is a really interesting subject because it can sit at this intersection between the artistic world and the scientific world. Materials, perhaps uniquely in the sciences, is a really lovely way to explore the more artistic side. And what I’ve discovered through the book and through the podcast is that we all understand these materials, maybe in slightly different ways; quite often, we just use different language to talk about them. I remember interviewing a silversmith called John Cussell on my podcast, who described cold-working silver to me as making the atoms angry. When you cold-work silver, it becomes more and more stiff. I would describe that as putting dislocations into the material, and putting in internal stresses and strains that make it more brittle. We’re both talking about the same thing in different ways. And I think that, really, the wonderful thing that I love about materials is that it can be this common substance, literally, through which all sorts of different people can talk to each other. 

anna ploszajski smiling in a blue top
We’re fascinated by the idea of 4D printing – printing an object that’s designed to move
(Image: Steve Cross)

HS: Citizen science has taken huge steps forward recently in broadening access to scientific research, but very often science is still locked away inside university buildings, which is a real shame. What do you think can be done about that?

AP:  That’s my life’s mission: to try and break science out of universities through doing things like writing the book and the podcast, and the talks that I give. I really want to invite people in and show them that science – it’s a huge cliché, but science really is everywhere. It’s never been more important than in the last 18 months to understand science, virology, how contagions spread – that’s all science. And the science communication that’s going on around that has been mixed. Some of it’s been really good, but some of it’s been really damaging. Breaking science out of these institutions matters to me because a lot of people are turned off science at a very early age, and unlike a lot of other areas, it’s impossible to turn back: if you go down a non-scientific route through school, and then maybe through university or through a job, you can’t pick science up again later. I feel like subjects like history and literature are much more accessible to everybody, whereas science is considered to be more for a select few, you know, a chosen few who are allowed to do it. And that’s really not fair. 

HS: Are craftspeople scientists? There must be a lot of crossover in terms of learning, experimentation, and so on. 

AP:  I think you’d have to ask them, but whatever it is they do is experimentation, right? They do experiments all the time – what temperature does my steel need to be to make it do X? What composition does my clay need to be to make it do Y? What settings do I use on my furnace to make sure my pots don’t explode? That is exactly the sort of stuff we would do in the lab, you know: methodical experimentation. So in that way, definitely – I can’t see that there’s any difference at all between the two. In terms of the way that craftspeople and scientists think, that’s much more difficult to answer. 

Most science has arisen from craftspeople and early experimenters. The subject of materials science arose out of the subject of metallurgy, which arose out of blacksmiths like Agnes. If you go back far enough, it’s all the same thing.

HackSpace magazine issue 46 out NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents.

hackspace front cover red and yellow graphics featuring a spanner and test tube

As always, every issue is free to download from the HackSpace magazine website.

The post Meet Anna Ploszajski: Where making and materials meet appeared first on Raspberry Pi.

Make an animated sign with Raspberry Pi Pico

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/make-an-animated-sign-with-raspberry-pi-pico/

Light up your living room like Piccadilly Circus with this Raspberry Pi Pico project from the latest issue of HackSpace magazine. Don’t forget, it’s not too late to get your hands on our new microcontroller for FREE if you subscribe to HackSpace magazine.

HUB75 LED panels provide an affordable way to add graphical output to your projects. They were originally designed for large advertising displays (such as the ones made famous by Piccadilly Circus in London, and Times Square in New York), but we can use a little chunk of these bright lights in our projects. They’re often given a ‘P’ value, such as P3 or P5, denoting the number of millimetres between the individual RGB LEDs; this doesn’t affect the working or wiring in any way.

We used a 32×32 Adafruit screen. Other screens of this size may work, or may be wired differently. It should be possible to get screens of different sizes working, but you’ll have to dig through the code a little more to get it running properly.

The most cost-effective way to add 1024 RGB LEDs to your project

The protocol for running these displays involves throwing large amounts of data down six different data lines. This lets you light up one portion of the display. You then switch to a different portion of the display and throw the data down the data lines again. When you’re not actively writing to a particular segment of the display, those LEDs are off.
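To make the row-scanning idea concrete, here’s a minimal sketch of one scan in MicroPython. The pin numbers and function shape are assumptions for illustration only; the project’s real code is linked below.

from machine import Pin

# Assumed wiring, for illustration only: match these to your own setup.
rgb1 = [Pin(n, Pin.OUT) for n in (0, 1, 2)]     # R1, G1, B1: top half of panel
rgb2 = [Pin(n, Pin.OUT) for n in (3, 4, 5)]     # R2, G2, B2: bottom half
addr = [Pin(n, Pin.OUT) for n in (6, 7, 8, 9)]  # A-D: select 1 of 16 row pairs
clk = Pin(10, Pin.OUT)
lat = Pin(11, Pin.OUT)
oe = Pin(12, Pin.OUT)

def scan_row(row, top_pixels, bottom_pixels):
    # top_pixels and bottom_pixels are 32 (r, g, b) tuples of 0/1 values.
    oe.value(1)                        # blank the panel while shifting data in
    for x in range(32):
        for pin, bit in zip(rgb1, top_pixels[x]):
            pin.value(bit)
        for pin, bit in zip(rgb2, bottom_pixels[x]):
            pin.value(bit)
        clk.value(1)                   # clock one pixel pair into the panel
        clk.value(0)
    for i, pin in enumerate(addr):     # choose which row pair lights up
        pin.value((row >> i) & 1)
    lat.value(1)                       # latch the freshly shifted data
    lat.value(0)
    oe.value(0)                        # display this row until the next scan

Call scan_row() for each of the 16 row pairs in a tight loop and the whole panel appears lit, even though only one row pair is ever on at once.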

There’s no in-built control over the brightness levels – each LED is either on or off. You can add some control over brightness by flicking pixels on and off for different amounts of time, but you have to manage this yourself. We won’t get into that in this tutorial, but if you’d like to investigate this, take a look at the box on ‘Going Further’.
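If you do want to experiment, the core idea is simple enough to sketch. The following is a hedged illustration, not part of the project code: each pixel gets a crude brightness level by being lit on only some of several repeated passes over the frame.

def duty_pattern(brightness, passes=8):
    # brightness 0..8 -> one on/off value per refresh pass,
    # e.g. duty_pattern(2) -> [1, 1, 0, 0, 0, 0, 0, 0]
    return [1 if p < brightness else 0 for p in range(passes)]

# During pass p of the refresh loop, draw a pixel only if its pattern[p]
# is 1: a pixel at brightness 2 is then lit for a quarter of the time.

The faster you can refresh, the more passes (and therefore brightness levels) you can afford before flicker becomes visible.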

The code for this is on GitHub (hsmag.cc/Hub75). If you spot a way of improving it, send us a pull request

The first thing you need to do is wire up the screen. There are 16 connectors, and there are three different types of data sent – colour values, address values, and control values. You can wire this up in different ways, but we just used header wires to connect between a cable and a breadboard. See here for details of the connections.

These screens can draw a lot of power, so it’s best not to power them from your Pico’s 5V output. Instead, use a separate 5V supply which can output enough current. A 1A supply should be more than enough for this example. If you’re changing it, start with a small number of pixels lit up and use a multimeter to read the current.

With it wired up, the first thing to do is grab the code and run it. If everything’s working correctly, you should see the word Pico bounce up and down on the screen. It is a little sensitive to the wiring, so if you see some flickering, make sure that the wires are properly seated. You may want to just display the word ‘Pico’. If so, congratulations, you’re finished!

However, let’s take a look at how to customise the display. The first things you’ll need to adapt if you want to display different data are the text functions – there’s one of these for each letter in Pico. For example, the following draws a lower-case ‘i’:

def i_draw(init_x, init_y, r, g, b):
    for i in range(4):
        light_xy(init_x, init_y+i+2, r, g, b)  # the stem of the 'i'
    light_xy(init_x, init_y, r, g, b)          # the dot above it

As you can see, this uses the light_xy method to set a particular pixel a particular colour (r, g, and b can all be 0 or 1). You’ll also need your own draw method. The current one is as follows:

def draw_text():
    global text_y
    global direction
    global writing
    global current_rows
    global rows

    writing = True
    text_y = text_y + direction
    if text_y > 20: direction = -1
    if text_y < 5: direction = 1
    rows = [0]*num_rows
    #fill with black
    for j in range(num_rows):
        rows[j] = [0]*blocks_per_row

    p_draw(3, text_y-4, 1, 1, 1)
    i_draw(9, text_y, 1, 1, 0)
    c_draw(11, text_y, 0, 1, 1)
    o_draw(16, text_y, 1, 0, 1)
    writing = False

This sets the writing global variable to stop the output routine drawing the frame while it’s still being updated, and then just scrolls the text_y variable between 5 and 20 to bounce the text up and down in the middle of the screen.

This method runs on the second core of Pico, so it can still throw out data constantly from the main processing core without it slowing down to draw images.
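In MicroPython on the Pico, that split is done with the _thread module, which runs a function on the second core. As a rough sketch (refresh_display is a hypothetical stand-in for the loop that streams data to the panel):

import _thread

def text_updater():
    # Core 1: keep recalculating the frame buffer.
    while True:
        draw_text()

_thread.start_new_thread(text_updater, ())

# Core 0: keep pushing pixel data out to the panel, uninterrupted.
while True:
    refresh_display()   # hypothetical name for the row-scanning output loop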

Get HackSpace magazine – Issue 40

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

When you subscribe, we’ll send you a Raspberry Pi Pico for FREE.


The post Make an animated sign with Raspberry Pi Pico appeared first on Raspberry Pi.

NeoPixel fireflies jar with Raspberry Pi | HackSpace 39

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/neopixel-fireflies-jar-with-raspberry-pi-hackspace-39/

This twinkly tutorial is fresh from the latest issue of HackSpace magazine, out now.

Adding flashing lights to a project is a great way to make it a little more visually appealing, and WS2812B LEDs (sometimes known as NeoPixels) are a great way to do that.

They have their own mini communications protocol, so you can control lots of them with just a single pin on your microcontroller, and there’s a handy library for Pico MicroPython that lets you control them.

First, you need to grab the library from hsmag.cc/PicoPython and copy the PY file to your Pico device. You can do this by opening the file in Thonny and clicking Save As, and then selecting your MicroPython device and calling it ws2812b.py.

You create an object with the following parameters: number of LEDs, state machine ID, and GPIO number, in that order. So, to create a strip of ten LEDs on state machine 0 and GPIO 0, you use:

pixels = ws2812b.ws2812b(10,0,0)

This object has two main methods: show(), which sends the data to the strip, and set_pixel, which sets the colour values for a particular LED (there’s also fill, used below, which sets every LED at once). The parameters for set_pixel are LED number, red, green, blue, with the colours taking values between 0 and 255.

At the time of writing, there’s an issue using this library in the interpreter. The author is investigating, but it’s best to run it from saved files to ensure everything runs properly. Create a file with the following and run it:

import ws2812b
import time

pixels = ws2812b.ws2812b(10, 0, 0)  # 10 LEDs, state machine 0, GPIO 0
pixels.set_pixel(5, 10, 0, 0)       # LED 5: dim red
pixels.show()
time.sleep(2)
pixels.set_pixel(5, 0, 10, 0)       # LED 5: dim green
pixels.show()
time.sleep(2)
pixels.fill(0, 0, 10)               # every LED: dim blue
pixels.show()
time.sleep(2)

So, now we can light up some LEDs, let’s take a look at how to turn this into an interesting light fixture.

We originally created the fireflies example in the WS2812B project for Christmas tree lights but, once the festive season was over, we liked them so much that we wanted to keep them going year round. Obviously, we can’t just keep a tree up all the time, so we needed another way to display them. We’re running them on the thin-wire WS2812B LED strings that are available from direct-from-China sellers, but they should work on other types of WS2812B-compatible LEDs.

There are some other methods in the WS2812B module, such as set_pixel_line_gradient() to add effects to your projects

For display, we’ve put the string of LEDs into a glass demijohn that we used to use for brewing, but any large glass jar would work. This gives an effect inspired by fireflies trapped in a jar. You can just download the code and run it (it’s in the examples folder in the above repository), but let’s take a look and see how it works. The first part of the code sets everything up:

import time
import ws2812b
import random

bright_div = 20
numpix = 50  # Number of NeoPixels
strip = ws2812b.ws2812b(numpix, 0, 0)

colors = [
    [232, 100, 255],  # Purple
    [200, 200, 20],   # Yellow
    [30, 200, 200],   # Blue
    [150, 50, 10],    # Orange
    [50, 200, 10],    # Green
]

max_len = 20
min_len = 5

flashing = []

num_flashes = 10
You can change numpix, and the details for creating the WS2812B object, to whatever’s suitable for your setup. The colors array holds the different colours that you want your LEDs to flash (in red, green, blue format). You can add to these or change them. We like the subtle pastels of this palette, but you can make it bolder by having more pure colours.

The max_len and min_len variables control the length of time each light flashes for. They’re not in any units (other than iterations of the main loop), so you may need a little trial and error to get settings that are pleasing for you. The remaining code is what actually does the work of flashing each LED:

for i in range(num_flashes):
    pix = random.randint(0, numpix - 1)
    col = random.randint(0, len(colors) - 1)  # pick a random palette entry
    flash_len = random.randint(min_len, max_len)
    flashing.append([pix, colors[col], flash_len, 0, 1])

strip.fill(0, 0, 0)

while True:
    strip.show()
    for i in range(num_flashes):
        pix = flashing[i][0]
        brightness = flashing[i][3] / flashing[i][2]
        colr = (int(flashing[i][1][0] * brightness),
                int(flashing[i][1][1] * brightness),
                int(flashing[i][1][2] * brightness))
        strip.set_pixel(pix, colr[0], colr[1], colr[2])

        # Reached full brightness: start dimming.
        if flashing[i][2] == flashing[i][3]:
            flashing[i][4] = -1
        # Fully dimmed: replace with a fresh random firefly.
        if flashing[i][3] == 0 and flashing[i][4] == -1:
            pix = random.randint(0, numpix - 1)
            col = random.randint(0, len(colors) - 1)
            flash_len = random.randint(min_len, max_len)
            flashing[i] = [pix, colors[col], flash_len, 0, 1]
        flashing[i][3] = flashing[i][3] + flashing[i][4]
        time.sleep(0.005)

The flashing list contains an entry for every LED that’s currently flashing. Each entry stores the LED position, colour, length of the flash, current position in the flash, and whether it’s getting brighter or dimmer. These are initially seeded with random data; then we start a loop that keeps updating the display.
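Concretely, each entry follows this layout (the values shown are just an illustration):

# [led_index, [r, g, b], flash_length, current_step, direction]
example_entry = [17, [200, 200, 20], 12, 3, 1]  # LED 17, yellow, brightening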

That’s all there is to it. You can tweak this code or create your very own custom display.

Issue 40 of HackSpace magazine is out NOW

Front cover of HackSpace magazine featuring Pico on pink and black background

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post NeoPixel fireflies jar with Raspberry Pi | HackSpace 39 appeared first on Raspberry Pi.

Add face recognition with Raspberry Pi | Hackspace 38

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/add-face-recognition-with-raspberry-pi-hackspace-38/

It’s hard to comprehend how far machine learning has come in the past few years. You can now use a sub-£50 computer to reliably recognise someone’s face with surprising accuracy.

Although this kind of computing power is normally out of reach of microcontrollers, adding a Raspberry Pi computer to your project with the new High Quality Camera opens up a range of possibilities. From simple alerting applications (‘Mum’s arrived home!’), to dynamically adjusting settings based on the person using the project, there’s a lot of fun to be had.

Here’s a beginner’s guide to getting face recognition up and running.

Face recognition using machine learning is hard work, so the latest, greatest Raspberry Pi 4 is a must

1. Prepare your Raspberry Pi
For face recognition to work well, we’re going to need some horsepower, so we recommend a minimum of Raspberry Pi 3B+, ideally a Raspberry Pi 4. The extra memory will make all the difference. To keep as much resource as possible available for our project, we’ve gone for a Raspberry Pi OS Lite installation with no desktop.

Make sure you’re on the network, have set a new password, enabled SSH if you need to, and updated everything with sudo apt -y update && sudo apt -y full-upgrade. Finally, go into settings by running sudo raspi-config and enable the camera in ‘Interfacing Options’.

2. Attach the camera
This project will work well with the original Raspberry Pi Camera, but the new official HQ Camera will give you much better results. Be sure to connect the camera to your Raspberry Pi 4 with the power off. Connect the ribbon cable as instructed in hsmag.cc/HQCameraGetStarted. Once installed, boot up your Raspberry Pi 4 and test the camera is working. From the command line, run the following:
raspivid -o test.h264 -t 10000
This will record ten seconds of video to your microSD card. If you have an HDMI cable plugged in, you’ll see what the camera can see in real time. Take some time to make sure the focus is correct before proceeding.

3. Install dependencies
The facial recognition library we are using is one that has been maintained for many years by Adam Geitgey. It contains many examples, including Python 3 bindings to make it really simple to build your own facial recognition applications. What is not so easy is the number of dependencies that need to be installed first. There are way too many to list here, and you probably won’t want to type them out, so head over to hsmag.cc/FacialRec so that you can cut and paste the commands. This step will take a while to complete on a Raspberry Pi 4, and significantly longer on a Model 3 or earlier.

4. Install the libraries
Now that we have everything in place, we can install Adam’s applications and Python bindings with a simple, single command:
sudo pip3 install face_recognition
Once installed, there are some examples we can download to try everything out.
cd
git clone --single-branch https://github.com/ageitgey/face_recognition.git
In this repository is a range of examples showing the different ways the software can be used, including live video recognition. Feel free to explore and remix.

5. Example images
The examples come with a training image of Barack Obama. To run the example:
cd ./face_recognition/examples
python3 facerec_on_raspberry_pi.py

On your smartphone, find an image of Obama using your favourite search engine and point it at the camera. Providing focus and light are good, you will see:
“I see someone named Barack Obama!”
If you see a message saying it can’t recognise the face, then try a different image or try to improve the lighting if you can. Also, check the focus for the camera and make sure the distance between the image and camera is correct.

Who are you? What even is a name? Can a computer decide your identity?

6. Training time
The final step is to start recognising your own faces. Create a directory and, in it, place some good-quality passport-style photos of yourself or those you want to recognise. You can then edit the facerec_on_raspberry_pi.py script to use those files instead. You’ve now got a robust prototype of face recognition. This is just the beginning. These libraries can also identify ‘generic’ faces, meaning it can detect whether a person is there or not, and identify features such as the eyes, nose, and mouth. There’s a world of possibilities available, starting with these simple scripts. Have fun!

Issue 38 of Hackspace Magazine is out NOW

Front cover of hack space magazine featuring a big striped popcorn bucket filled with maker tools and popcorn

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post Add face recognition with Raspberry Pi | Hackspace 38 appeared first on Raspberry Pi.

Read RFID and NFC tokens with Raspberry Pi | HackSpace 37

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/read-rfid-and-nfc-tokens-with-raspberry-pi-hackspace-37/

Add a bit of security to your project, or make things selectable by using different cards. In the latest issue of HackSpace magazine, PJ Evans goes contactless.

The HAT is not hard on resources, so you can use many variants of Raspberry Pi

NFC (near-field communication) is based on the RFID (radio-frequency identification) standard. Both allow a device to receive data from a passive token or tag (meaning it doesn’t require external power to work). RFID supports a simple ID message that shouts ‘I exist’, whereas NFC allows for both reading and writing of data.

Most people come into contact with these systems every day, whether it’s using contactless payment, or a card to unlock a hotel or office door. In this tutorial we’ll look at the Waveshare NFC HAT, an add-on for Raspberry Pi computers that allows you to interact with NFC and RFID tokens.

Prepare your Raspberry Pi

We start with the usual step of preparing a Raspberry Pi model for the job. Reading RFID tags is not strenuous work for our diminutive friend, so you can use pretty much any variant of the Raspberry Pi range you like, so long as it has the 40-pin GPIO. We only need Raspberry Pi OS Lite (Buster) for this tutorial; however, you can install any version you wish. Make sure you’ve configured it how you want, have a network connection, and have updated everything by running sudo apt -y update && sudo apt -y upgrade on the command line.

Enable the serial interface

This NFC HAT is capable of communicating over three different interfaces: I2C, SPI, and UART. We’re going with UART as it’s the simplest to demonstrate, but you may wish to use the others. Start by running sudo raspi-config, going to ‘Interfacing options’, and selecting ‘Serial Interface’. When asked if you want to log into the console, say ‘No’. Then, when asked if you want to enable the serial interface, say ‘Yes’. You’ll need to reboot now. This will allow the HAT to talk to our Raspberry Pi over the serial interface.

Configure and install the HAT

As mentioned in the previous step, we have a choice of interfaces, and swapping between them means changing some physical settings on the NFC HAT itself. Do not do this while the HAT is powered up in any way. Our HAT is configured for UART/serial by default, but do check on the wiki at hsmag.cc/iHj1XA. The jumpers at I1 and I0 should both be shorting ‘L’; D16 and D20 should be shorted; and on the DIP switch, everything should be off except RX and TX. Check, double-check, attach the HAT to the GPIO, and boot up.

The Waveshare HAT contains many settings. Make sure to read the instructions!

Download the examples

You can download some examples directly from Waveshare. First, we need to install some dependencies. Run the following at the command line:
sudo apt install python3-rpi.gpio p7zip-full python3-pip
pip3 install spidev pyserial

Now, download the files and unpack them:
cd
wget https://www.waveshare.com/w/upload/6/67/Pn532-nfc-hat-code.7z
7z x Pn532-nfc-hat-code.7z

Before you try anything out, you need to edit the example file so that we use UART (see the accompanying code listing).
cd ~/raspberrypi/python
nano example_get_uid.py

Find the three lines that start pn532 = and add a # to the top one (to comment it out). Now remove the # from the line starting pn532 = PN532_UART. Save, and exit.
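After the edit, that section of the file should look something like this. It’s a sketch: the exact constructor arguments in Waveshare’s example may differ from what’s shown here.

# Only the UART interface is left uncommented:
# pn532 = PN532_SPI(debug=False, reset=20, cs=4)
# pn532 = PN532_I2C(debug=False, reset=20, req=16)
pn532 = PN532_UART(debug=False, reset=20)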

Try it out!

Finally, we get to the fun part. Start the example code as follows:
python3 example_get_uid.py
If all is well, the connection to the HAT will be announced. You can now place your RFID token over the area of the HAT marked ‘NFC’. Hexadecimal numbers will start scrolling up the screen; your token has been detected! Each RFID token has a unique number, so it can be used to uniquely identify someone. However, this HAT is capable of much more than that as it also supports NFC and can communicate with common standards like MIFARE Classic, which allows for 1kB of storage on the card. Check out example_dump_mifare.py in the same directory (but make sure you make the same edits as above to use the serial connection).

Going further

You can now read unique identifiers on RFID and NFC tokens. As we just mentioned, if you’re using the MIFARE or NTAG2 standards, you can also write data back to the card. The examples folder contains some C programs that let you do just that. The ability to read and write small amounts of data onto cards can lead to some fun projects. At the Electromagnetic Field festival in 2018, an entire game was based around finding physical locations and registering your presence with a MIFARE card. Even more is possible with smartphones, where NFC can be used to exchange data in any form.

Get HackSpace magazine 37 – Out Now!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post Read RFID and NFC tokens with Raspberry Pi | HackSpace 37 appeared first on Raspberry Pi.

Talk to your Raspberry Pi | HackSpace 36

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/talk-to-your-raspberry-pi-hackspace-36/

In the latest issue of HackSpace Magazine, out now, @MrPJEvans shows you how to add voice commands to your projects with a Raspberry Pi 4 and a microphone.

You’ll need: a Raspberry Pi 4, a Seeed ReSpeaker four-microphone array HAT, a speaker, and a Google account.

It’s amazing how we’ve come from everything being keyboard-based to so much voice control in our lives. Siri, Alexa, and Cortana are everywhere and happy to answer questions, play you music, or help automate your household.

For the keen maker, these offerings may not be ideal for augmenting their latest project as they are closed systems. The good news is, with a bit of help from Google, you can add voice recognition to your project and have complete control over what happens. You just need a Raspberry Pi 4, a speaker array, and a Google account to get started.

Set up your microphone

This clever speaker uses four microphones working together to increase accuracy. A ring of twelve RGB LEDs can be coded to react to events, just like an Amazon Echo

For a home assistant device, being able to hear you clearly is essential. Many microphones are either too low-quality for the task, or are unidirectional: they only hear well in one direction. To the rescue comes Seeed’s ReSpeaker, an array of four microphones with some clever digital processing to provide the kind of listening capability normally found on an Amazon Echo device or Google Assistant. It’s also in a convenient HAT form factor, and comes with a ring of twelve RGB LEDs, so you can add visual effects too. Start with a Raspberry Pi OS Lite installation, and follow these instructions to get your ReSpeaker ready for use.

Install Snowboy

You’ll see later on that we can add the power of Google’s speech-to-text API by streaming audio over the internet. However, we don’t want to be doing that all the time. Snowboy is an offline ‘hotword’ detector. We can have Snowboy running all the time, and when your choice of word is ‘heard’, we switch to Google’s system for accurate processing. Snowboy can only handle a few words, so we only use it for the ‘trigger’ words. It’s not the friendliest of installations so, to get you up and running, we’ve provided step-by-step instructions.

There’s also a two-microphone ReSpeaker for the Raspberry Pi Zero

Create your own hotword

As we’ve just mentioned, we can have a hotword (or trigger word) to activate full speech recognition so we can stay offline. To do this, Snowboy must be trained to understand the word chosen. The code that describes the word (and specifically your pronunciation of it) is called the model. Luckily, this whole process is handled for you at snowboy.kitt.ai, where you can create a model file in a matter of minutes and download it. Just say your choice of words three times, and you’re done. Transfer the model to your Raspberry Pi 4 and place it in your home directory.

Let’s go Google

ReSpeaker can use its multiple mics to detect distance and direction

After the trigger word is heard, we want Google’s fleet of super-servers to help us transcribe what is being said. To use Google’s speech-to-text API, you will need to create a Google application and give it permissions to use the API. When you create the application, you will be given the opportunity to download ‘credentials’ (a small text file) which will allow your setup to use the Google API. Please note that you will need a billable account for this, although you get one hour of free speech-to-text per month. Full instructions on how to get set up can be found here.

Install the SDK and transcriber

To use Google’s API, we need to install the firm’s speech-to-text SDK for Python so we can stream audio and get the results. On the command line, run the following:

pip3 install google-cloud-speech

(If you get an error, run sudo apt install python3-pip, then try again.)

Remember that credentials file? We need to tell the SDK where it is:

export GOOGLE_APPLICATION_CREDENTIALS="/home/pi/[FILE_NAME].json"

(Don’t forget to replace [FILE_NAME] with the actual name of the JSON file.)

Now download and run this test file. Try saying something and see what happens!
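If you’d like to see the shape of the code before downloading the test file, here’s a hedged sketch of a one-shot transcription with the SDK. The file name, sample rate, and language code are assumptions to adapt.

from google.cloud import speech

client = speech.SpeechClient()  # picks up the exported credentials file

# Read a short mono 16kHz WAV clip recorded from the microphone (assumed name).
with open("clip.wav", "rb") as f:
    audio = speech.RecognitionAudio(content=f.read())

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-GB",
)

response = client.recognize(config=config, audio=audio)
for result in response.results:
    print(result.alternatives[0].transcript)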

Putting it all together

Now we can talk to our Raspberry Pi, it’s time to link the hotword system to the Google transcription service to create our very own virtual assistant. We’ve provided sample code so that you can see these two systems running together. Run it, then say your chosen hotword. Now ask ‘what time is it?’ to get a response. (Don’t forget to connect a speaker to the audio output if you’re not using HDMI.) Now it’s over to you. Try adding code to respond to certain commands such as ‘turn the light on’, or ‘what time is it?’
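Structurally, the glue code is just a callback: Snowboy listens continuously, and when your model fires, you hand the microphone over to Google. Here’s a rough sketch using the detector API from Snowboy’s standard Python demo; transcribe_with_google and say_the_time are hypothetical stand-ins for your own routines.

import snowboydecoder

def on_hotword():
    text = transcribe_with_google()   # hypothetical: your Google STT routine
    if "time" in text:
        say_the_time()                # hypothetical response handler

detector = snowboydecoder.HotwordDetector("my_hotword.pmdl", sensitivity=0.5)
detector.start(detected_callback=on_hotword)  # blocks, listening for the hotword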

Get HackSpace magazine 36 – Out Now!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post Talk to your Raspberry Pi | HackSpace 36 appeared first on Raspberry Pi.