Tag Archives: HackSpace

RetroPie Cyberdeck | HackSpace #47

Post Syndicated from Ben Everard original https://www.raspberrypi.org/blog/retropie-cyberdeck-hackspace-47/

You know we love a good cyberdeck around here, and we think you’ll love this video game emulator fresh from the latest issue of HackSpace magazine, out now.

We’ve only just finished printing a series on building a games cabinet using the RetroPie games emulator on a Raspberry Pi… and now something comes along that makes our plywood, full-size arcade machine look old hat. 

hackspace cyberdeck

This mostly 3D-printed cyberdeck features a 5-inch 800 × 480 touchscreen display, as well as the usual ports available through the Raspberry Pi 3 Model B+ that powers it. Quite how useful the screen’s portrait orientation will be for Sonic The Hedgehog is anyone’s guess, but if you’re playing any sort of top-down shooter, you’re laughing. The maker describes this project as a “video game emulator with some edge” – we think it’s pretty impressive for a project that began as an excuse to learn 3D design.

hackspace cyberdeck

HackSpace magazine issue 47 out NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents.

hackspace 47 cover

As always, every issue is free to download in PDF format from the HackSpace magazine website.

The post RetroPie Cyberdeck | HackSpace #47 appeared first on Raspberry Pi.

Meet Anna Ploszajski: Where making and materials meet

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/meet-anna-ploszajski-where-making-and-materials-meet/

In the latest issue of HackSpace magazine, Andrew Gregory meets Anna Ploszajski to explore the bit of the Venn diagram where making and materials meet.

Anna Ploszajski (pronounced Por-shy-ski) is a cross-channel swimmer, a materials scientist, a writer, and a breaker-down of barriers to scientific understanding. 50% of the HackSpace editorial team listen to her podcast, Handmade, from which has arisen a book: Handmade: A scientist’s search for meaning through making. Naturally, we wanted to talk to her to find out why we humans do what we do when we turn object A into object B. That’s a pretty big question, but if anyone can answer it for us, Anna can.

anna ploszajski glass blowing
Anna’s journey into making began with watching a bit of broken glassware getting fixed
(Image: Charlie Murphy)

HackSpace: Hi Anna! You’ve written a book about making. Before we get on to that, though, we’d like to ask you about something you’ve been working on in your non-writing life – 4D printing. A while ago we saw a box with a hinged lid; the hinges were fabric, and the box was PLA, so you get the benefits of two types of material in one object. I guess what you’re doing is rather more advanced than that?

Anna Ploszajski: You say that, but I’ve been doing quite a lot of experiments in 3D printing onto fabric to try and make a 4D thing, because PLA has a kind of shape memory. I’d been wanting to do this experiment (which I actually describe at the end of the book), where I try to draw a conclusion about how my adventures in craft have also impacted my scientific research life. And the example that I use is this experiment that I did, 3D printing onto fabrics.

What I was doing began with sort of pre-stressing just normal textiles – I think there was a cotton and a linen – stretching them out with my hands, and then attaching them onto the print bed. And so, you’ve already put a kind of internal strain into the fabric; then you 3D-print a very simple design that was either a circle, or just simple lines. And then obviously, when you print onto it, the PLA plastic is bonded onto the textile. My idea was that if you then heated that material up, it would soften, and that tension that you’d put into the fabric would be released. So that was my idea.

anna ploszajski with woollen materials
Anna’s mission is to make science available to non-scientists
(Image: Steve Cross)

My project was all to do with exploring this idea of 4D printing. So printing, using 3D printing, to make objects that move in some way after you’ve printed them. The thing about it is, it’s adjacent to this topic of smart materials. There’s a family of materials that have some kind of smart property, usually it’s colour-changing or shape-changing in response to an external stimulus. So, that could be temperature change, or light levels or moisture levels. 

And those smart materials are not actually that smart, it turns out, because what they do is really simple. Let’s take the example of a really simple shape change: wood is a really good example. It expands when it gets wet. And it contracts when it dries out. By our definition of a smart material, that is a smart material because it changes shape when there’s a change in environment. And that’s a very simple  movement. And these smart materials tend to just have this kind of flip-flopping between two simple states – either, you know, an expanded state or a contracted state in this example. That’s not actually that useful, unless you can do a clever design to use that movement to form a clever kind of motion. 

A really good example in nature is the pine cone; the spines of a pine cone have this really ingenious bi-layer structure, where one side of them has a very hygroscopic wood – it expands a lot when it gets wet. And the other side doesn’t expand a lot when it gets wet. So, when the pine cone gets wet, it’s that bi-layer structure that causes that movement. The wood itself is just expanding. But the contrast between the two is what causes that motion. So I was trying to get inspired by that and combine, using clever design, a quite simple, smart material with some design that would combine it with a non-smart material that would cause some kind of motion.

It’s all to do with stored tension, and  triggering that tension release. And to be honest with you, I didn’t get very far with it. I understand the material side; that was fine. And I could do all my experiments in the lab, and I could characterise the materials fine, but I just don’t have a designer’s brain. 

And that is what the book is about in a way: trying to access or tap into these other skills that designers and makers and craftspeople have which I don’t.

hand made bookcover
Anna’s book Handmade: A scientist’s search for meaning through making is available to buy now

HS: How much have you learned over the course of writing the book? You must have had to speak to all sorts of people to research it.

AP:  I think that meeting all those craftspeople, and getting a view into their world, really gave me an appreciation for exactly how much work and time and skill and practice goes into really honing these skills. Wood is a really good example: when I did the wood carving workshop with Barn the Spoon, it took hours trying to make a spoon, but when I did it, mine didn’t look anything like his spoons.

The skills themselves are often not that complicated or difficult to do. It’s the constant practice in refinement and design, which are the skills that I didn’t necessarily have.

HS: What led you to write the book? 

AP:  A few things. Firstly, I wanted to write a popular science book that didn’t cater to the normal popular science audience, by which I mean people who are already relatively interested in science, the types of people who would browse the popular science sections in a bookshop and pick things up about space, or the gut, or whatever. I feel like that audience is already very well catered for.

What I wanted to do was try and write a popular book that would be read by someone who would never normally read a science book – that’s the whole of the rest of the population. So you’ll notice in the book that there are a lot of scientific analyses and explanations, but they’re all quite short. And my hope was that, if someone’s coming at this with not very much prior knowledge of science, they get to a description of the quantum mechanics behind why glass is transparent. But on the next page, we’re back to the story. And it’s really those stories that were the most important thing to me.

anna ploszajski
Like the sound of a materials scientist on a journey into making? Listen to Anna’s excellent Handmade podcast
(Image: Steve Cross)

And so, in each of the ten chapters on different materials, the story isn’t the story of the material – it’s the story of something else. So in Plastics, it’s the story of my Polish grandad and, you know, his life story throughout the 20th century, which intertwines with the story of the rise and fall of plastics. 

I wanted to draw all these other audiences in by storytelling, and then hopefully, sneak the science in when they weren’t looking. 

The story of the book itself is to do with feeling very inadequate, I suppose. I had this realisation, having walked into the Institute of Making for the first time, that I was supposedly this expert in materials, having studied the science of it, having studied it all on paper, but actually, there were all of these different people that had so much more in-depth knowledge than me. The craftspeople and the makers and the artists and the historians and the designers and the architects… And so it was them that I really wanted to spend time with and learn from.

That was four years ago. That was when I started my podcast, which is also called Handmade. And that was where I started interviewing makers and craftspeople. And the book just grew from that. Quite a few of the people that I interviewed on the podcast have ended up being featured in the book as the very, very, very kind craftspeople that took me under their wing and showed me the ropes of what they do.

To take blacksmithing as one example – I thought I was an expert in materials, but I had never felt metal softening under my fingers. Yes, I knew the theory, I could draw you the iron-carbon phase diagram, I could talk about the phases and melting, and all of the ways that carbon and iron interact at the atomic level inside steel. But I’ve never done it. And I didn’t know how hard you had to hit it to make it change shape. Agnes, the blacksmith who taught me, is just so, so brilliant. I’m such a huge fangirl of her. And it was very humbling, actually, to spend time with people like that. 

anna ploszajski mug from ceramic materials
It’s one thing to understand the molecular changes that occur when you fire clay; it’s another thing entirely to be able to make a pot

HS: Getting to touch and feel the materials rather than study them, was there any one in particular that you gained an appreciation of? 

AP:  My favourite chapter in the book is Sugar, because it was the most fun story to write. And it’s the story of my English Channel swim. [Yes, you read that right – Anna has swum the English Channel.] One of the reasons, I think, is that it’s already one of the strongest chapters for storytelling, because it is this kind of archetypal physical journey from A to B, but also a journey of discovery about yourself. And intertwined in that story is the story of sugar, and all its different forms, and how it affects the body and the mind.

In terms of the crafts, it was really wool that caught my imagination, and I’ve stuck with it. The story of wool is the story of my camper-van trip around Scotland and the north of England. I acquired wool from all these different places that I went to on my trip, and then knitted a patchwork blanket with all the wool I got from the different places. And through doing that, I taught myself how to knit and I met all of these kinds of amazing knitters and wool-craft people throughout Scotland and the north of England, and chatted to them and got an insight into this amazing world of women who knit – and they were all women – and what it means to them, and how it connects them. And it’s very meditative, I find, and that’s the craft that I’ve taken through since finishing the book a year ago. That’s the craft that I’ve continued with. 

anna ploszajski made a blanket from wool materials
Knitting contains loads of mathematical patterns, which knitters seem to understand intuitively

I don’t know what it is about it. It just feels so nice to create something, you know, especially in the last year when we were all sitting at home watching Netflix and trawling through the movies and TV shows on there. Although that felt like perhaps a bit of a waste of time, actually, if I was knitting while watching TV, it wasn’t all a waste of time; I had something to show for it at the end. And I think that’s what craft gives us – it’s a sense of purpose almost, and a sense of achievement at the end. 

You know, to have that sense of achievement of ‘I’ve made this’ and now I can wear it, or now I can use it. I haven’t had that in science before. I only got that when I started entering this world of craft.

HS: It sounds like you see a disconnect between science and making. Is that fair to say?

AP:  I’ve thought a lot about this: this kind of compartmentalising of making and science, or art and science as I talk about in the book (and I know that art and making are absolutely not the same thing). And I think there are a lot of reasons why the arts and sciences have been sort of severed from each other. In formal education, we separate them. At school, we often have to choose between those types of subjects. I ended up going down the science route, but I did A-level music. I love writing and music and history, and I was always crap at art, but I enjoy it. I think it’s really unhelpful that we do that, because it means that we brand people as ‘you’re a scientist’, or ‘you’re more of an artist’. And actually, I think the majority of people are probably somewhere in the middle – they have an interest in both.

anna ploszajski socks
Wool was hugely important for England’s development into a major mediaeval power. It’s also good for keeping your feet warm

It’s a real shame that we often get siphoned off into these different camps, and often don’t get the chance to rediscover the other one. As someone who was siphoned off into the scientific track, it was really liberating to be able to discover the craft and artistic world. It was, like I say, very humbling. It was also really nice to be a complete beginner again at something, to be able to ask the silly questions from a place of curiosity, with no pressure, no educational pressure. I wasn’t trying to achieve anything apart from trying to make a spoon, or forge a knife, or throw a pot, or whatever it was. 

Materials is a really interesting subject because it can sit at this intersection between the artistic world and the scientific world. Materials, perhaps uniquely in the sciences, is a really lovely way to explore the more artistic side. And what I’ve discovered through the book and through the podcast, is that we all understand these materials, maybe in slightly different ways. But quite often, it’s just that we use different language to talk about them. I remember interviewing a silversmith on my podcast called John Cussell, who described cold-working silver metal to me as making the atoms angry. So, when you cold-work silver, it becomes more and more stiff. I would describe that as putting dislocations into the material and putting internal stresses and strains to make them more brittle. We’re both talking about the same thing in different ways. And I think that, really, the wonderful thing that I love about materials is that it can be this  common substance, literally, through which all sorts of different people can talk to each other. 

anna ploszajski smiling in a blue top
We’re fascinated by the idea of 4D printing – printing an object that’s designed to move
(Image: Steve Cross)

HS: Citizen science has taken huge steps forward recently in broadening access to scientific research, but very often it’s locked away inside university buildings and it’s a real shame. What do you think can be done about that?

AP:  That’s my life’s mission, to try and break science out of universities through doing things like writing the book and the podcast, and the talks that I give. I really want to invite people in and show them that science – it’s a huge cliché but science really is everywhere. It’s never been more important than in the last 18 months to understand science, virology, how contagions spread – that’s all science. And the science communication that’s going on around that has been mixed. Some of it’s been really good, but some of it’s been really damaging. What’s important to me is to break science out of these institutions, because a lot of people are turned off science at a very early age. And unlike a lot of other areas, if you go down a non-scientific route, through school, and then maybe through university or through a job, it’s impossible to go back on that and pick it up again later. I feel like subjects like history and literature are much more accessible to everybody. Whereas science is considered to be more for a select few, you know, a chosen few who are allowed to do it. And that’s really not fair.

HS: Are craftspeople scientists? There must be a lot of crossover in terms of learning, experimentation, and so on. 

AP:  I think you’d have to ask them, but whatever it is they do is experimentation, right? And they do experiments all the time – what temperature do I need to make my steel to make it do X? Or, what composition do I need my clay to be to make it do Y? How do I do the settings on my furnace to make sure that my pots don’t explode? And that is exactly the sort of stuff that we would do in the lab, you know: methodical experimentation. So in that way, definitely. I can’t see that there’s any difference at all between that. And in terms of the way that craftspeople and scientists think, that’s much more difficult to answer. 

Most science has arisen from craftspeople and early experimenters. The subject of material science arose out of the subject of metallurgy, which arose out of blacksmiths like Agnes. If you go back far enough, it’s all the same thing.

HackSpace magazine issue 46 out NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents.

hackspace front cover red and yellow graphics featuring a spanner and test tube

As always, every issue is free to download from the HackSpace magazine website.

The post Meet Anna Ploszajski: Where making and materials meet appeared first on Raspberry Pi.

Add 57,600 pixels to your Raspberry Pi Pico

Post Syndicated from Ben Everard original https://www.raspberrypi.org/blog/add-57600-pixels-to-your-raspberry-pi-pico/

In the latest issue of HackSpace magazine, Ben Everard tests whether a bit of kit from Spotpear can turn Raspberry Pi Pico into a games machine.

The snappily named Raspberry Pi Pico display 1.54-inch LCD by Spotpear ($11.89) brings in a 240×240 pixel IPS screen and ten buttons in a joypad-like arrangement. There are four for direction, four for action, a select, and a start. At least, that’s how they’re labelled – you can use them for anything you like.

Spotpear Pico screen front
The buttons are just a bit too small and fiddly for us

To help you get started, there’s a short manual, which includes example code for MicroPython and C.

This example code is easy enough to use, but it is a little messy. The mechanism for controlling the hardware isn’t separated into its own module, so you’re left with either the task of building the library yourself or having slightly untidy code. Not the biggest inconvenience, but compared to how neatly some maker hardware companies manage their code, we found ourselves off to a disappointing start.

There are also some sample UF2 files included along with the C example code, but these appear to have been built for different hardware and work either partially or not at all. The actual example code did compile and work properly.

Impressive quality

When we ran the example code, we were impressed with the quality of the screen. With 240×240 pixels in just 1.54 inches, there’s a high pixel density that can give crisp graphics. Obviously, high pixel densities are a double-edged sword. While they can look great, they also mean higher RAM use, more time transferring data, and more data to process.

Fortunately, Pico is well-suited to the task of driving screens. Each pixel can take 16 bits of colour data, so a full-frame buffer is just 115,200 bytes. The display data is transferred by SPI, and Pico has a maximum SPI frequency of half the clock speed. For MicroPython, that means 62.5MHz. The actual data transfer rate is a little less than this because of the overhead of the protocol, but we were able to drive full-frame refreshes at over 40 fps, which is plenty for smooth animations.
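
If you want a rough sanity check of those numbers, the short MicroPython sketch below times how long one full 115,200-byte frame takes to push out over SPI. It only measures the raw transfer (it isn’t a working display driver), and the SPI bus and pins are placeholders – check the wiring for your particular board.

from machine import Pin, SPI
import time

WIDTH, HEIGHT = 240, 240
frame = bytearray(WIDTH * HEIGHT * 2)   # 16 bits per pixel = 115,200 bytes

# MicroPython picks the closest achievable baudrate; on Pico the fastest
# SPI clock is half the system clock, i.e. 62.5MHz
spi = SPI(0, baudrate=62500000, sck=Pin(18), mosi=Pin(19))

start = time.ticks_us()
spi.write(frame)                        # push one full frame of pixel data
elapsed = time.ticks_diff(time.ticks_us(), start)
print("one frame:", elapsed, "us, roughly", 1000000 // elapsed, "fps")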

Spotpear Pico screen back
Pico slots in the back, which is perfect for space-constrained builds

Obviously, if you’re looking to do animations, sending the data is only half the story. You also need to calculate the frame before it’s displayed. If you’re using MicroPython, you are quite limited by the amount of processing you can do and still keep a high frame rate (though you could use the second core to offload some of the processing). With C, you’ve got much more scope, especially as you could potentially offload the data transfer using direct memory access (DMA).

Battery-sucking light

The one disappointing thing about the screen is that there’s no control over the backlight. According to the documentation, it should be attached to pin 13, but it isn’t. You can’t turn it on or off – it’s just permanently on, and quite bright. That’s a deal-breaker for anything running off battery power, as it will suck up a lot of power. However, if you want a display permanently on, this might be perfectly acceptable.

While we were quite impressed by the screen, we can’t say the same for the other part of the hardware – the buttons. They’re small, stiff, and have very little movement. The end result is buttons that are hard to press, and it’s hard to tell whether you’ve pressed them. They’re the sort of buttons that are commonly used as reset buttons because they’re hard to press accidentally.

We had hoped that this screen would make a good base for a games console, but unfortunately these buttons would just make for a frustrating experience. They might be OK for a menu-driven user interface, but that’s about it.

Another minor annoyance is the lack of any mounting holes, which makes it hard to embed the board into a project as the user interface.

We wanted to like this project. It’s got a good, high-res screen and a nice layout of buttons. However, the choice of components makes it hard to see how we’ll use this in our projects. We’re considering removing the surface-mount buttons and soldering wires onto them to make a more useful device, but if you’re going to go to that level of surgery, it’s probably better to start with a plain screen and work your way up from there.

Verdict

5/10

Good screen, but awful buttons

Price: $11.89

HackSpace magazine issue 46 out NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents.

hackspace front cover red and yellow graphics featuring a spanner and test tube

As always, every issue is free to download from the HackSpace magazine website.

The post Add 57,600 pixels to your Raspberry Pi Pico appeared first on Raspberry Pi.

Meet Laura Kampf: Wood and metalworker

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/meet-laura-kampf-wood-and-metalworker/

Laura Kampf, the Köln-based wood and metalworker with a mild tiny house and Leatherman obsession, sat down (virtually) with Alex Bate to talk about prison tattoo machines, avoiding your nightmares, and why aggressive hip-hop and horror movies inspire her weekly project builds.

Smudo the workshop dog was also there, which seems to be becoming a recurring and pleasant feature of HackSpace magazine interviews.

laura kampf
In five years, Laura has uploaded over 200 videos to YouTube

Alex: Your videos feel very unique in how they’re produced. It feels as though we’re in the workshop watching you get on with your day. That you’d be doing this regardless of whether the camera was there or not.

Laura: Yeah, that’s absolutely it. I mean, I document it for YouTube because I’m aware that this is the only place for me. And the documentation, that’s the work part, like setting up the camera, thinking about the story. But the physical work of building something, that’s a form of meditation. That’s just my happy place. And I know I have to document my work because I have to do something to make a living, right? I can’t just play. So YouTube is my work. But making is just, it’s just what I do, and I feel more and more that this is the only place for me.

And this is probably how musicians feel when they are performing on stage. You know, this – being in my shop, I feel so comfortable. And I feel so good. I don’t have that anywhere else. 

Subscribe to Laura Kampf on YouTube

I remember seeing that you went to design school. Is that where your journey as a maker started or does creativity run in your family? I know your brother is creative  (instagram.com/zooburger), but what about your parents?

My brother is super-creative, but my parents, not so much. My grandfather was an engineer. So I think it kind of skipped a generation.

In design school, there was a project where we had to build something out of everyday objects. And it was for us, the designers, to get away from the computers and just do something with our hands. I built a tattoo machine, like a prison-style tattoo machine. And I was hooked. I remember coming home and I was so moved by the whole thing. Even though the machine looks terrible, everything fell into place. 

Because all my life, people were telling me you need to find this one thing that you’re really good at and then just keep doing that. I think it’s also a German thing, you know, like, be perfect at one thing, and then you’ll be the best in your field. And I could never focus on one thing. But building this tattoo machine – there were so many different things coming together.

I had all this interest in so many different fields and I could use them for the project – I enjoyed drawing fonts and learned how to do old-school tattoo lettering, and I could do a little bit of electronics to hook up a switch. All these things, I thought it was super-interesting. It was the first time I could just use little bits of everything I knew to make something that was really cool, and I was hooked after that.

laura kampf
Laura works mostly with wood and metal in her weekly videos

Have you tattooed yourself with the tattoo machine?

I wanted to, and then – thank God, because I was really young, it’s very likely that I would have done it – a tattoo artist came by and I showed him the machine, and he was like, “Don’t do it. It’s running way too fast. You will make mincemeat out of your skin”. But I bought pigs’ legs and pigs’ ears and tattooed them. I couldn’t eat pig for probably two years after that. It was so warm, and tattooing the piece for a couple of hours, the fat was running out of it – it was disgusting.

It’s interesting that if you look at the stuff that you’re making now, the anchor point that started all of this is a prison tattoo machine.

Looking back, I remember the little things that I made; when I showed them to people, they just didn’t show the same excitement for them as I did. And it was such a disappointment until I realised that no, the stuff I was making was really bad. That’s why no one was excited, because I didn’t know what I was doing. Once I got better and better, and especially with YouTube and talking to the community – well, I’m preaching to the choir here; everyone knows making is fantastic, and we have a very focused, niche community – they get it.

laura kampf dog
All good workshops need a dog

Do you feel a bit of a sense of responsibility, being a woman in this community, being queer in this community – two groups that are in the minority in this field? Do you feel that affects your work at all?

I didn’t to begin with, I have to say. In the beginning, I felt more that it wasn’t about me, it was about the things that I make, and my sexuality and my gender don’t play a role in this. I don’t think about my sexuality all day long; I don’t think about the fact I’m a girl all day long, so why would it be in my videos? But I have to say that I changed my mind about these things. Because visibility is really important.

I had this really weird experience at the 10 Maker event a few years ago. I was wearing this T-shirt I got for free on one Christopher Street Day, it says ‘Gay Okay’. I love that shirt; it’s a really nice fit. I went to get some groceries with Brett from Skull and Spade and Hassan from HABU, and there was this girl, maybe eleven or twelve years old, and she saw me wearing that shirt, hanging out with regular dudes, doing regular stuff in a regular supermarket, and her jaw dropped. We were in the countryside, you don’t see things like rainbow flags there. And I could tell she’s maybe gay too, and it was so good for her to see that. There’s nothing different about you – you can still hang out with guys, you can still,   you know, go shopping and all these things. That’s when I realised, institutions like Christopher Street Day are so important, but it’s also important to just have it integrated into regular stuff, not just special occasions. Today’s International Women’s Day? Well, we need to celebrate girls every day; every day you need to celebrate these things. 

So, I kind of made it a habit to have rainbow flags in my videos. Not every video, and never super-obvious, but in the background, when I talk to the camera sometimes. I do wear my Gay Okay shirt every once in a while. I don’t want to make it a point because people like to put you in drawers. And, once you’re the queer maker, you’re the queer maker, and that’s all people want to talk about. And I don’t want that because I still think, at the end of the day, it’s about the things I build and not about me and my sexuality and gender. But, yeah, to just sprinkle it in every once in a while, I think it’s very important.

I don’t get much negativity about this. I was surprised, pleasantly so, obviously, but yeah, a couple of days ago, I wore my Gay Okay t-shirt in my Instagram Stories, and people applauded me for it, and that’s really interesting. I would never have thought that.

laura kampf
It might not look like it now, but this will become a pub on wheels

Do you get much trolling at all? Or are you spared from it?

I think, at the beginning of my YouTube career, I was growing really fast and really, like, exponentially. And I had a couple of videos that went viral, like the beer bike, that went outside of the community. For those viral videos, you get negativity. They don’t know who you are, they don’t know the context, they don’t know what I’m doing. That’s why I hate having viral videos. It brings in the worst. I like to be in this little lake, surrounded by my followers.

A few people have said that, actually. That it’s the worst. It’s the thing everybody aims for and then, when you get there, you wish you weren’t.

Yeah, they take you out of context. Those people, they see one of my videos, they don’t know that I’m building something. And that’s another interesting thing that your community learns about you. They know I’ve built something every week for the past six years. It can’t be the Holy Grail every freaking week. Sometimes it’s bad, but it’s stuff that I did that week – it’s documentation.

When I was a kid, I remember my mind was blown that The Simpsons had a different intro every episode. Something different happens every time. I couldn’t believe that, and how much work went into it. I think it primed me for being a weekly creator.

The tattoo machine that started it all

It’s impressive. There aren’t a lot of makers releasing weekly videos, and many that do are releasing build videos in weekly parts. And you just come along and go ta-da!

Haha, but not every video is a good idea. Some of them are really bad ideas. But that’s my privilege, you know, that I can still do that. Because I have to, otherwise there wouldn’t be a video, and I love that because the pressure helps me keep going. And the process is the same. It doesn’t matter if you’re building a tiny house or a scratch post for a cat. The process is me, being in the shop, listening to podcasts, listening to music, enjoying my tools, playing with the material – it’s all the same, it doesn’t really matter. 

So, the public bench stuff that I’ve been doing lately, I get so many questions like, “Oh no, how could you leave the bench” and, like, I don’t give a damn about the bench. It’s not the bench, it’s the process that I enjoy. I could literally throw everything that I build away – I could throw it in the trash right away. I wouldn’t mind. I’m so focused on the process.

I was going to ask you about the bench, because it was recently vandalised and so you made another one. Most people would probably just raise their hands in defeat and leave it. But you just made it again.

I was expecting it to break eventually. And, to be honest, I was kinda hoping for it because I wanted to do it again. And, this time, I’m actually hoping for it to get broken again because I want to do it again.

Laura’s bike frame cup holder

I may be making this up, but I’m sure you once mentioned that it’s illegal to sell furniture in Germany unless you’re registered. Is that right?

Yeah, it’s a very broad description of this, but the craftsmanship in Germany is of a very high standard, right? At least we like to think so. So, if you want to be a carpenter, you’re first an apprentice for three years or so, then you can be a carpenter and work under a master carpenter. If you want to educate other apprentices, or if you want to sell certain furniture, I think chairs is one of them, then you have to be a master. And it’s the same for every field. I think the most plausible is electricians. If you are not a master electrician, you cannot, say, make a lamp and sell it.

But my interest is so general. I wanted to make lamps, but the notion of designing a lamp that’s made out of wood and then obviously has electricity in it, it’s just impossible. 

I spoke to the TÜV and asked them: if I design a lamp and want to sell it in a store, how do I do it? And I would have to get it checked by their institution, which is a couple of hundred euros, and get a certificate. But I would have to do this for the next lamp design, and the next. And that makes them so expensive. I can’t sell a lamp for 150 euros if it costs me more than that to get it checked. I’m not interested in mass production, I want to make one-off pieces.

I had already quit my job when I discovered this and remember having a big knot in my stomach thinking, ‘what do I do?’, and YouTube was the answer. 

Could you not use YouTube as a way of selling lamps? It’s not a lamp, it’s a video prop?

Yeah, there are loopholes – this is not a lamp, this is art. But, when I quit my job to become a self-employed lamp seller, I really only quit my job because I hated working for other people, not because it was my dream to sell furniture and lamps. I didn’t know YouTube really existed as a thing for me, and once I figured out people were actually making money off this, I was like, OK, I need to get a camera, I need to give this a try. Because that would be better than building stuff to sell it. I wasn’t interested in selling stuff. I don’t want clients. I don’t want that pressure from anyone else except me, so YouTube worked out perfectly for me. 

How to build a tattoo machine from scratch – one of Laura’s most popular videos

The job you quit was as a Display Artist for Urban Outfitters, if I remember correctly? Designing displays within a store. That sounded like a brilliant job.

It was. It was a great job, but it wasn’t for me. It was probably the perfect job, but I am not a good employee. I was asked a couple of years ago if I would do a talk about my career and how I made this job for myself and followed my dreams, blah, blah. I don’t like ‘follow your dreams’. It was the other way around. I avoided my nightmares. That’s how I got here. I never dreamed of this, I didn’t know this existed. So, I think avoiding your nightmares is much more efficient than following your dreams.

With your design school background, when you create something, how much of that project is art over functionality? Dovetails versus pocket holes for instance. 

It’s more, and this is hard to explain, but I have this internal measuring unit of how much work should go into a project. I know how much time I can put into a project, and there’s this bucket of work I’ve put into it, and how full the bucket is determines how the project looks and whether I use pocket holes or dovetails, for example.

You work a lot with wood and with metal, as well as a few other materials, all of which require their own set of skills. Where have you learned all your techniques?

All YouTube. That’s the cool thing. It’s all full circle. There are some things – I had a couple of jobs where I learned some skills. I worked as a flight case builder for three years, just filling those black flight cases, which sounds very, very trivial. It’s not, though – it’s crazy. You have to work so precisely, otherwise the catches won’t close and all these things, and everything is building boxes. So I learned a bunch of stuff there. It was my Karate Kid apprenticeship. But a lot of it is YouTube. I remember watching Jimmy DiResta – I saw his TV show online, and then I watched a bunch of his videos without realising it was the same guy. Eventually, I noticed he had a weekly schedule and a podcast, and it was all exactly what I needed to see and hear. Right when I quit my job and I couldn’t sell lamps, there were these people telling me that they do this for a living. It was perfect timing. I feel like I’m a second-generation YouTuber and they’re the first.

Laura’s cargo bike

As well as those makers, what else influences your work?

I like to listen to a lot of hip-hop, like super-aggressive hip-hop that is the complete opposite of me and has nothing to do with my world. And I like to watch horror movies, super-scary and bloody horror movies. I like to explore the opposite of what I have. A view into a completely different world. The Fantasy Filmfest is a huge inspiration for me. These movies that go right to DVD; they don’t go into the big theatres. I like to think about how they got made. How did they think of that? That’s the biggest inspiration. And, with hip-hop, the personas, and why they feel like they do, and how do they come up with those lines. They’re in their own universe, they have their own rules. I just love that. It’s how I feel when I’m building stuff. I’m telling myself a story that I don’t know the ending of. I don’t like to make sketches, I don’t like to know if it works. If I see someone else had the idea and did a full video about it, I don’t even want to do the idea anymore. I want to have that unknown. This is the idea, this is the stuff you have, now try to make it happen.

Is there anything still on the list? Projects you still want to work on?

I don’t know if you saw it on Instagram, but I bought a Multicar. It’s so good. It’s the slowest car ever; it is painfully slow – 45 kilometres an hour and that’s it. But it has torque; you can tow pretty much everything with it. So my plan is to take the world’s smallest pub that I built a couple of months ago and put it on the back of the Multicar.

Something is holding me back at the moment, though. I have all the parts, I should be able to do it, but I don’t know what it is. I experience that quite often – I have an idea, and everything should be good to go, but I’m not doing it. And then, eventually, it turns out I wasn’t sure about the colour, or something else that was missing that I didn’t know at the time. So I don’t push it. But that’s the project I’m looking forward to.

Do you think you’ll ever just get to the point where you’re going to stop doing weekly videos? Or is this you for life?

I don’t know. Like, that’s the one thing that I’m really scared of, like, what happens when I get sick? Because at the beginning of the year, I hired my best friend. So now we’re both relying on my mental and physical health. So I think it’s a good idea to broaden stuff and have more income streams. I love doing the TV stuff [Laura recently started presenting a new TV show], because whenever I’m working with the TV people, I think, like, oh man, I love YouTube. And, when I do too much YouTube, I start really looking forward to working with actual professionals again. It’s a cool balance. I kind of hope that I can keep doing this. You know, I think it’s really cool. And as I said, there’s no other place for me. Where would I go?

Clever keyring with screwdriver

You have YouTube, you have TV, you have your podcast, and you sell merchandise. Is there anything left?

I think I would like to actually have a couple of products now. Some of the furniture I’m building – if you look at it in a different context to ‘this is just what I built this week’, as more than the product of seven days’ worth of work – I think some of those ideas aren’t that bad. And if you put some more work into them, they could be pieces that would sell. But I would want someone who takes the prototypes and does the whole production for me. I’m not interested in all that. But I think it would be cool to have a line of plywood furniture.

So we won’t be seeing a run of the Laura Kampf bench across Köln?

A newspaper interviewed me about the bench. And for the interview, they also approached the city saying, hey, wouldn’t you want to work with her? And you know, maybe collaborate on this because this might be a cool thing. And they said that they don’t have the personnel. But honestly, if they would have done it, that would have made it so boring. Working with somebody in an office telling me where the broken benches are so I can go and fix them. That’s a job.

Laura’s beer bike

Is there anything you’ve ever made that you haven’t wanted to share? A build just for you?

Until I hit publish, I feel like that every week. It feels like I’m just making it for myself. I talk very positively about YouTube, and that’s genuinely how I feel about it, but sometimes it’s really hard on me because I’ll work seven days on a video, think it’s the best thing I ever did, and it makes me so happy. I’ll edit it for hours, sink all this time into it, all this energy, and then the video tanks, and it kinda ruins it for me. I’m in a super-good mood right now because the bench video did so well, and people understand what I’m trying to say. But, there are other cases where it doesn’t work as well, and where I feel like I’ve dropped the ball and couldn’t get my excitement across. And that’s always super-disappointing because I’m always excited about the stuff I make; I always have some angle I find super-interesting, otherwise, I’m not motivated to do it. And, when the video tanks, it makes me feel like I lost the opportunity to spread that excitement, to spread that motivation, and that feels like I wasted my time. And that’s the downside of YouTube. 

I mean, I think every creator takes it in a different way. And you need to find a way to deal with this, and it’s really important to talk about it. This is my dream job and I can do whatever I want to do as long as I don’t drop the ball. I hired my friend, so now I can’t drop the ball for the both of us.

Laura appeared in our video “How do you define ‘maker’?”

Laura Kampf produces a video every Sunday on her YouTube channel. You can also follow her on Instagram and, for any German-speaking readers, her podcast – Raabe & Kampf – with friend and journalist Melanie Raabe can be found wherever you listen to podcasts. 

HackSpace magazine issue 45 out NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents.

Hack space magazine issue 45 front cover

As always, every issue is free to download from the HackSpace magazine website.

The post Meet Laura Kampf: Wood and metalworker appeared first on Raspberry Pi.

Archimedes the AI robot | HackSpace #45

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/archimedes-the-ai-robot-hackspace-45/

When we saw Alex Glow’s name in the latest issue of HackSpace magazine, we just had to share her project. HackSpace #45 celebrates the best Raspberry Pi builds of all time, and we remembered spotting Alex’s wearable robotic owl familiar back in the day. For those of you yet to have had the pleasure, meet Archimedes…

archimedes owl on maker's shoulder
Archimedes taking a perch on his maker’s shoulder

Back in 2018, Hackster’s Alex Glow built Archimedes, an incredible robot companion using a combination of Raspberry Pi Zero W and Arduino with the Google AIY Vision Kit for its ‘brain’.

An updated model, Archie 2 – using Raspberry Pi 3B, an ESP32-powered Matrix Voice, and an SG90 micro-servo motor – saw the personable owl familiar toughen up (Alex says the 3D-printed case is far more durable), as well as gaining better voice interaction options using Matrix HAL (for which installer packages are provided for Raspberry Pi and Python), plus Mycroft and Snips.ai voice assistant software.

archimedes owl insides laid out on table
Owl innards

Other refinements included incorporating compact discs into the owl’s wings to provide an iridescent sheen. Slots in the case allowed Alex to feed through cable ties to attach Archie’s wings, which she says now “provide a lively bounce to the wings, in tune with his active movements (as well as my own).”

archimedes owl wing detail
Raspberry Pi getting stuffed into Archimedes’ head

HackSpace magazine issue 45 out NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents.

Hack space magazine issue 45 front cover

As always, every issue is free to download from the HackSpace magazine website.

The post Archimedes the AI robot | HackSpace #45 appeared first on Raspberry Pi.

Make an animated sign with Raspberry Pi Pico

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/make-an-animated-sign-with-raspberry-pi-pico/

Light up your living room like Piccadilly Circus with this Raspberry Pi Pico project from the latest issue of HackSpace magazine. Don’t forget, it’s not too late to get your hands on our new microcontroller for FREE if you subscribe to HackSpace magazine.

HUB75 LED panels provide an affordable way to add graphical output to your projects. They were originally designed for large advertising displays (such as the ones made famous by Piccadilly Circus in London, and Times Square in New York). However, we can use a little chunk of these bright lights in our projects. They’re often given a ‘P’ value, such as P3 or P5, which gives the number of millimetres between the individual RGB LEDs. These values don’t affect the working or wiring in any way.

We used a 32×32 Adafruit screen. Other screens of this size may work, or may be wired differently. It should be possible to get screens of different sizes working, but you’ll have to dig through the code a little more to get it running properly.

The most cost-effective way to add 1024 RGB LEDs to your project

The protocol for running these displays involves throwing large amounts of data down six different data lines. This lets you light up one portion of the display. You then switch to a different portion of the display and throw the data down the data lines again. When you’re not actively writing to a particular segment of the display, those LEDs are off.

There’s no in-built control over the brightness levels – each LED is either on or off. You can add some control over brightness by flicking pixels on and off for different amounts of time, but you have to manage this yourself. We won’t get into that in this tutorial, but if you’d like to investigate this, take a look at the box on ‘Going Further’.

The code for this is on GitHub (hsmag.cc/Hub75). If you spot a way of improving it, send us a pull request

The first thing you need to do is wire up the screen. There are 16 connectors, and there are three different types of data sent – colour values, address values, and control values. You can wire this up in different ways, but we just used header wires to connect between a cable and a breadboard. See here for details of the connections.

These screens can draw a lot of power, so it’s best not to power them from your Pico’s 5V output. Instead, use a separate 5V supply which can output enough current. A 1A supply should be more than enough for this example. If you’re changing it, start with a small number of pixels lit up and use a multimeter to read the current.

With it wired up, the first thing to do is grab the code and run it. If everything’s working correctly, you should see the word Pico bounce up and down on the screen. It is a little sensitive to the wiring, so if you see some flickering, make sure that the wires are properly seated. You may want to just display the word ‘Pico’. If so, congratulations, you’re finished!

However, let’s take a look at how to customise the display. The first things you’ll need to adapt if you want to display different data are the text functions – there’s one of these for each letter in Pico. For example, the following draws a lower-case ‘i’:

def i_draw(init_x, init_y, r, g, b):
    for i in range(4):
        light_xy(init_x, init_y+i+2, r, g, b)
    light_xy(init_x, init_y, r, g, b)

As you can see, this uses the light_xy method to set a particular pixel to a particular colour (r, g, and b can all be 0 or 1). You’ll also need your own draw method. The current one is as follows:

def draw_text():
    global text_y
    global direction
    global writing
    global current_rows
    global rows

    writing = True
    text_y = text_y + direction
    if text_y > 20: direction = -1
    if text_y < 5: direction = 1
    rows = [0]*num_rows
    #fill with black
    for j in range(num_rows):
        rows[j] = [0]*blocks_per_row

    p_draw(3, text_y-4, 1, 1, 1)
    i_draw(9, text_y, 1, 1, 0)
    c_draw(11, text_y, 0, 1, 1)
    o_draw(16, text_y, 1, 0, 1)
    writing = False

This sets the writing global variable to stop it drawing this frame if it’s still being updated, and then just scrolls the text_y variable between 5 and 20 to bounce the text up and down in the middle of the screen.

This method runs on the second core of Pico, so it can still throw out data constantly from the main processing core without it slowing down to draw images.
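
If you’re wondering how code ends up on that second core, MicroPython exposes it through the _thread module. Here’s a minimal, self-contained sketch of the pattern – the two tasks are just stand-ins for the tutorial’s drawing and output functions, so that it runs on its own:

import _thread
import time

frame_count = 0          # shared state: prepared on core 1, consumed on core 0

def prepare_frames():    # stand-in for draw_text(), running on the second core
    global frame_count
    while True:
        frame_count += 1
        time.sleep_ms(50)

_thread.start_new_thread(prepare_frames, ())

while True:              # core 0: stand-in for the loop that streams data to the panel
    print("displaying frame", frame_count)
    time.sleep_ms(500)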

Get HackSpace magazine – Issue 40

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

When you subscribe, we’ll send you a Raspberry Pi Pico for FREE.


The post Make an animated sign with Raspberry Pi Pico appeared first on Raspberry Pi.

NeoPixel dithering with Pico

Post Syndicated from Ben Everard original https://www.raspberrypi.org/blog/neopixel-dithering-with-pico/

In the extra special Raspberry Pi Pico launch issue of HackSpace magazine, editor Ben Everard shows you how to get extra levels of brightness out of your LEDs with our new board.

WS2812B LEDs, commonly known as NeoPixels, are cheap and widely available LEDs. They have red, green, and blue LEDs in a single package with a microcontroller that lets you control a whole string of them using just one pin on your microcontroller.

The three connections may be in a different order on your LED strip, so check the labels to make sure they’re connected correctly

However, they do have a couple of disadvantages:

1) The protocol needed to control them is timing-dependent and often has to be bit-banged.

2) Each colour has 8 bits, so has 256 levels of brightness. However, these aren’t gamma-corrected, so the low levels of brightness have large steps between them. For small projects, we often find ourselves only using the lower levels of brightness, so often only have 10 or 20 usable levels of brightness (the short sketch below illustrates this).
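
To get a feel for how coarse that low end is, here’s a quick illustration in ordinary Python (nothing Pico-specific): build a standard gamma table mapping perceived brightness to raw 8-bit values, then look at how few distinct raw values cover the dim part of the range.

GAMMA = 2.2
table = [round(255 * (i / 255) ** GAMMA) for i in range(256)]

print(table[:32])            # the first couple of dozen inputs collapse onto raw 0 or 1
print(len(set(table[:64])))  # only around a dozen distinct raw values for the dimmest quarter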

There will usually be wires already connected to your strip, but if you cut it, you’ll need to solder new wires on

We’re going to look at how two features of Pico help solve these problems. Firstly, Programmable I/O (PIO) lets us implement the control protocol on a state machine rather than the main processing cores. This means that we don’t have to dedicate any processor time to sending the data out. Secondly, having two cores means we can use one of the processing cores to dither the NeoPixels. This means shifting them rapidly between different brightness levels to make pseudo-levels of brightness.

For example, if we wanted a brightness level halfway between levels 3 and 4, we’d flick the brightness back and forth between 3 and 4. If we can do this fast enough, our eyes blur this into a single brightness level and we don’t see the flicker. By varying the amount of time at levels 3 and 4, we can make many virtual levels of brightness. While one core is doing this, we still have a processing core completely free to manipulate the data we want to display.
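
As a concrete sketch of that idea (plain Python here – the real implementation, in C, appears later in the article), suppose we want a ‘virtual’ brightness of 3.25: each refresh we show the nearest whole level we can, keep track of the shortfall, and carry it into the next refresh.

target = 3.25          # virtual brightness level we want the eye to see
error = 0.0
shown = []

for _ in range(8):     # eight refreshes of the same LED
    value = int(target + error)        # whole level the hardware actually shows
    error = (target + error) - value   # remainder carried over to the next frame
    shown.append(value)

print(shown)           # [3, 3, 3, 4, 3, 3, 3, 4] – which averages out to 3.25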

First, we’ll need a PIO program to communicate with the WS2812B LEDs. The Pico development team have provided an example PIO program to work with – you can see the full details in the pico-examples repository, but we’ll cover the essentials here. The PIO code is:

.program ws2812
.side_set 1

.define public T1 2
.define public T2 5
.define public T3 3

bitloop:
    out x, 1       side 0 [T3 - 1]
    jmp !x do_zero side 1 [T1 - 1]
do_one:
    jmp bitloop    side 1 [T2 - 1]
do_zero:
    nop            side 0 [T2 - 1]

We looked at the PIO syntax in the main cover feature, but it’s basically an assembly language for the PIO state machine. The WS2812B protocol uses pulses at a rate of 800kHz, but the length of the pulse determines if a 1 or a 0 is being sent. This code uses jumps to move through the loop to set the timings depending on whether the bit (stored in the register x) is 0 or 1. The T1, T2, and T3 variables hold the timings, so are used to calculate the delays (with 1 taken off as the instruction itself takes one clock cycle). There’s also a section in the pio file that links the PIO code and the C code:

% c-sdk {
#include "hardware/clocks.h"

static inline void ws2812_program_init(PIO pio, uint sm, uint offset,
                                       uint pin, float freq, bool rgbw) {
    pio_gpio_init(pio, pin);
    pio_sm_set_consecutive_pindirs(pio, sm, pin, 1, true);

    pio_sm_config c = ws2812_program_get_default_config(offset);
    sm_config_set_sideset_pins(&c, pin);
    sm_config_set_out_shift(&c, false, true, rgbw ? 32 : 24);
    sm_config_set_fifo_join(&c, PIO_FIFO_JOIN_TX);

    int cycles_per_bit = ws2812_T1 + ws2812_T2 + ws2812_T3;
    float div = clock_get_hz(clk_sys) / (freq * cycles_per_bit);
    sm_config_set_clkdiv(&c, div);

    pio_sm_init(pio, sm, offset, &c);
    pio_sm_set_enable(pio, sm, true);
}
%}

Most of this is setting the various PIO options – the full range is detailed in the Raspberry Pi Pico C/C++ SDK document.

sm_config_set_out_shift(&c, false, true, rgbw ? 32 : 24);

This line sets up the output shift register which holds each 32 bits of data before it’s moved bit by bit into the PIO state machine. The parameters are the config (that we’re setting up and will use to initialise the state machine); a Boolean value for shifting right or left (false being left); and a Boolean value for autopull which we have set to true. This means that whenever the output shift register falls below a certain threshold (set in the next parameter), the PIO will automatically pull in the next 32 bits of data.

Using a text editor with programmer’s features such as syntax highlighting will make the job a lot easier

The final parameter is set using the expression rgbw ? 32 : 24. This means that if the variable rgbw is true, the value 32 is passed, otherwise 24 is passed. The rgbw variable is passed into this function when we create the PIO program from our C program, and is used to specify whether each LED in our strip has four elements (one red, one green, one blue, and one white) or three (red, green, and blue).

The PIO hardware works on 32-bit words, so each chunk of data we write with the values we want to send to the LEDs has to be 32 bits long. However, if we’re using RGB LED strips, we actually want to work in 24-bit lengths. By setting the autopull threshold to 24, we still pull in 32 bits each time, but once 24 bits have been shifted out, another 32 bits are pulled in, overwriting the remaining 8 bits.

sm_config_set_fifo_join(&c, PIO_FIFO_JOIN_TX);

Each state machine has two four-word FIFOs attached to it: normally one is used for data going in and the other for data coming out. However, as we only send data into our state machine, we can join them together to form a single eight-word FIFO using the above line. This gives us a small buffer to write data into, helping to avoid the state machine running out of data and stalling. The following three lines set the speed the state machine runs at:

int cycles_per_bit = ws2812_T1 + ws2812_T2 + ws2812_T3;
float div = clock_get_hz(clk_sys) / (freq * cycles_per_bit);
sm_config_set_clkdiv(&c, div);

The WS2812B protocol demands that data is sent out at a rate of 800kHz. However, each bit of data requires a number of state machine cycles. In this case, they’re defined in the variables T1, T2, and T3. If you look back at the original PIO program, you’ll see that these are used in the delays (always with 1 taken off the value because the initial instruction takes one cycle before the delay kicks in). Every loop of the PIO program will take T1 + T2 + T3 cycles. We use these values to calculate the speed we want the state machine to run at, and from there we can work out the divider we need to slow the system clock down to the right speed for the state machine. The final two lines just initialise and enable the state machine.
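
To make those numbers concrete, here’s a quick back-of-the-envelope check of the divider calculation in plain Python. It assumes Pico’s default 125MHz system clock, which is an assumption on our part rather than something set in the code above:

# Sanity-check the clock divider sums, assuming a 125 MHz system clock.
# This mirrors the C calculation above; the numbers are for illustration only.
T1, T2, T3 = 2, 5, 3              # cycles spent in each phase of the PIO loop
cycles_per_bit = T1 + T2 + T3     # 10 state machine cycles per WS2812B bit

freq = 800000                     # target bit rate of 800kHz
clk_sys = 125000000               # assumed Pico system clock

div = clk_sys / (freq * cycles_per_bit)
print(div)                        # 15.625: the state machine ticks at clk_sys / 15.625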

The main processor

That’s the code that’s running on the state machine, so let’s now look at the code that’s running on our main processor cores. The full code is on GitHub. Let’s first look at the code running on the second core (we’ll look at how to start this code running shortly), as this controls the light levels of the LEDs.

static inline void put_pixel(uint32_t pixel_grb) {
    pio_sm_put_blocking(pio0, 0, pixel_grb << 8u);
}

static inline uint32_t urgb_u32(uint8_t r, uint8_t g, uint8_t b) {
    return ((uint32_t) (r) << 8) |
           ((uint32_t) (g) << 16) |
           (uint32_t) (b);
}

void ws2812b_core() {
    int valuer, valueg, valueb;
    int shift = bit_depth - 8;

    while (1) {
        for (int i = 0; i < STRING_LEN; i++) {
            valueb = (pixelsb[i] + errorsb[i]) >> shift;
            valuer = (pixelsr[i] + errorsr[i]) >> shift;
            valueg = (pixelsg[i] + errorsg[i]) >> shift;
            put_pixel(urgb_u32(valuer, valueg, valueb));
            errorsb[i] = (pixelsb[i] + errorsb[i]) - (valueb << shift);
            errorsr[i] = (pixelsr[i] + errorsr[i]) - (valuer << shift);
            errorsg[i] = (pixelsg[i] + errorsg[i]) - (valueg << shift);
        }
        sleep_us(400);
    }
}

We start by defining a virtual bit depth. This is the number of brightness bits per colour channel that we want to simulate. Our code will then attempt to create the necessary additional brightness levels. It will run as fast as it can drive the LED strip, but if you try to do too many brightness levels, you’ll start to notice flickering.

We found twelve to be about the best with strings up to around 100 LEDs, but you can experiment with others. Our code works with two arrays – pixels which holds the values that we want to display, and errors which holds the error in what we’ve displayed so far (there are three of each for the different colour channels).

If you just want to see this in action, you can download the UF2 file from hsmag.cc/orfgBD and flash it straight to your Pico

To explain that latter point, let’s take a look at the algorithm for determining how to light the LED. We borrowed this from the source code of Fadecandy by Micah Scott, but it’s a well-used algorithm for calculating error rates. We have an outer while loop that just keeps pushing out data to the LEDs as fast as possible. We don’t care about precise timings and just want as much speed as possible. We then go through each pixel.

The corresponding item in the errors array holds the cumulative amount our LED has been underlit so far compared to what we want it to be. Initially, this will be zero, but with each loop (if there’s a difference between what we want to light the LED at and what we can actually light it at) this error value will increase. The pixel value and the accumulated error added together give the brightness we want at the virtual bit depth, so we bit-shift this down by the difference between our virtual bit depth and the 8 bits that are available to get the level to write out.

This gives us the value for this pixel which we write out. We then need to calculate the new error level. Let’s take a look at what this means in practice. Suppose we want a brightness level halfway between 1 and 2 in the 8-bit levels. To simplify things, we’ll use nine virtual bits. 1 and 2 in 8-bit is 2 and 4 in 9 bits (adding an extra 0 to the end multiplies everything by a power of 2), so halfway between these two is a 9-bit value of 3 (or 11 in binary, which we’ll use from now on).

In the first iteration of our loop, pixels is 11, errors is 0, and shift is 1.

value = 11 >> 1 = 1
errors = 11 – 10 = 1

So this time, the brightness level of 1 is written out. The second iteration, we have:

value = 100 >> 1 = 10
errors = 100 – 100 = 0

So this time, the brightness level of 10 (in binary, or 2 in base 10) is written out, and the errors go back to 0, so we’re in the same position as at the start of the first loop. The LED will therefore flick between the two brightness levels on alternate loops, giving a perceived brightness halfway between the two.

Using this simple algorithm, we can experiment with different virtual bit-depths. The algorithm will always handle the calculations for us, but we just have to see what creates the most pleasing visual effect for the eye. The larger the virtual bit depth, the more potential iterations you have to go through before the error accumulates enough to create a correction, so the more likely you are to see flicker. The biggest blocker to increasing the virtual bit depth is the sleep_us(400). This is needed to reset the LED strip.
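
If you want a feel for the dithering away from the hardware, here’s a small plain-Python simulation of the same error-feedback idea. It’s an illustration of the algorithm rather than code from the project, and the target value is just an arbitrary example:

# Simulate the temporal dithering for a single channel of one pixel.
# 'target' is the brightness at the virtual bit depth; each pass we output the
# nearest real 8-bit level and carry the shortfall forward in 'error'.
bit_depth = 12
shift = bit_depth - 8

target = 0x47A           # an arbitrary 12-bit brightness with no exact 8-bit match
error = 0
outputs = []

for _ in range(16):
    value = (target + error) >> shift            # nearest 8-bit level this pass
    error = (target + error) - (value << shift)  # remember how far short we fell
    outputs.append(value)

print(outputs)                       # flips between neighbouring levels...
print(sum(outputs) / len(outputs))   # ...averaging out to target / 2**shift (71.625)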

NeoPixels come in many different shapes and sizes

Essentially, we throw out bits at 800kHz, and each block of 24 bits is sent, in turn, to the next LED. However, once there’s a long enough pause, everything resets and it goes back to the first LED. How big that pause is can vary. The truth is that a huge proportion of WS2812B LEDs are clones rather than official parts – and even for official parts, the length of the pause needed to reset has changed over the years.

400 microseconds is conservative and should work, but you may be able to get away with less (possibly even as low as 50 microseconds for some LEDs). The urgb_u32 method simply amalgamates the red, blue, and green values into a single 32-bit string (well, a 24-bit string that’s held inside a 32-bit string), and put_pixel sends this to the state machine. The bit shift there is to make sure the data is in the right place so the state machine reads the correct 24 bits from the output shift register.
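
As a quick plain-Python illustration of that alignment (nothing Pico-specific, just bit arithmetic, with arbitrary colour values), you can see why the 24 bits of colour have to sit at the top of the 32-bit word:

# Show how put_pixel() lines the colour data up for the state machine.
# The PIO shifts bits out MSB-first, so the GRB value must occupy the top 24 bits.
r, g, b = 0x40, 0x80, 0x20

grb = (r << 8) | (g << 16) | b    # what urgb_u32() builds: green, red, blue bytes
word = (grb << 8) & 0xFFFFFFFF    # what put_pixel() hands to the state machine

print(f"{word:032b}")             # green byte first, then red and blue,
                                  # with 8 unused bits at the end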

Getting it running

We’ve now dealt with all the mechanics of the code. The only bit left is to stitch it all together.

int main() {
    PIO pio = pio0;
    int sm = 0;
    uint offset = pio_add_program(pio, &ws2812_program);
    ws2812_program_init(pio, sm, offset, PIN_TX, 1000000, false);
    multicore_launch_core1(ws2812b_core);

    while (1) {
        for (int i = 0; i < 30; ++i) {
            pixels[i] = i;

            for (int j = 0; j < 30; ++j) {
                pixels[0] = j;
                if (j % 8 == 0) { pixels[1] = j; }
                sleep_ms(50);
            }
            for (int j = 30; j > 0; --j) {
                pixels[0] = j;
                if (j % 8 == 0) { pixels[1] = j; }
                sleep_ms(50);
            }
        }
    }
}

The ws2812_program_init function is the helper defined in the PIO file’s c-sdk section, and calling it sets everything up. To launch the algorithm creating the virtual bit depth, we just have to use multicore_launch_core1 to set a function running on the other core. Once that’s done, whatever we put in the pixels array will be reflected as accurately as possible in the WS2812B LEDs. In this case, we simply fade it in and out, but you could do any animation you like.

Get a free Raspberry Pi Pico

Would you like a free Raspberry Pi Pico? Subscribe to HackSpace magazine via your preferred option here, and you’ll receive your new microcontroller in the mail before the next issue arrives.

The post NeoPixel dithering with Pico appeared first on Raspberry Pi.

Add face recognition with Raspberry Pi | Hackspace 38

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/add-face-recognition-with-raspberry-pi-hackspace-38/

It’s hard to comprehend how far machine learning has come in the past few years. You can now use a sub-£50 computer to reliably recognise someone’s face with surprising accuracy.

Although this kind of computing power is normally out of reach of microcontrollers, adding a Raspberry Pi computer to your project with the new High Quality Camera opens up a range of possibilities. From simple alerting applications (‘Mum’s arrived home!’), to dynamically adjusting settings based on the person using the project, there’s a lot of fun to be had.

Here’s a beginner’s guide to getting face recognition up and running.

Face recognition using machine learning is hard work, so the latest, greatest Raspberry Pi 4 is a must

1. Prepare your Raspberry Pi
For face recognition to work well, we’re going to need some horsepower, so we recommend a minimum of Raspberry Pi 3B+, ideally a Raspberry Pi 4. The extra memory will make all the difference. To keep as much resource as possible available for our project, we’ve gone for a Raspberry Pi OS Lite installation with no desktop.

Make sure you’re on the network, have set a new password, enabled SSH if you need to, and updated everything with sudo apt -y update && sudo apt -y full-upgrade. Finally, go into settings by running sudo raspi-config and enable the camera in ‘Interfacing Options’.

2. Attach the camera
This project will work well with the original Raspberry Pi Camera, but the new official HQ Camera will give you much better results. Be sure to connect the camera to your Raspberry Pi 4 with the power off. Connect the ribbon cable as instructed in hsmag.cc/HQCameraGetStarted. Once installed, boot up your Raspberry Pi 4 and test the camera is working. From the command line, run the following:
raspivid -o test.h264 -t 10000
This will record ten seconds of video to your microSD card. If you have an HDMI cable plugged in, you’ll see what the camera can see in real-time. Take some time to make sure the focus is correct before proceeding.

3. Install dependencies
The facial recognition library we are using is one that has been maintained for many years by Adam Geitgey. It contains many examples, including Python 3 bindings to make it really simple to build your own facial recognition applications. What is not so easy is the number of dependencies that need to be installed first. There are way too many to list here, and you probably won’t want to type them out, so head over to hsmag.cc/FacialRec so that you can cut and paste the commands. This step will take a while to complete on a Raspberry Pi 4, and significantly longer on a Model 3 or earlier.

4. Install the libraries
Now that we have everything in place, we can install Adam’s applications and Python bindings with a simple, single command:
sudo pip3 install face_recognition
Once installed, there are some examples we can download to try everything out.
cd
git clone --single-branch https://github.com/ageitgey/face_recognition.git
In this repository is a range of examples showing the different ways the software can be used, including live video recognition. Feel free to explore and remix.

5. Example images
The examples come with a training image of Barack Obama. To run the example:
cd ./face_recognition/examples
python3 facerec_on_raspberry_pi.py

On your smartphone, find an image of Obama using your favourite search engine and point it at the camera. Providing focus and light are good, you will see:
“I see someone named Barack Obama!”
If you see a message saying it can’t recognise the face, then try a different image or try to improve the lighting if you can. Also, check the focus for the camera and make sure the distance between the image and camera is correct.

Who are you? What even is a name? Can a computer decide your identity?

6. Training time
The final step is to start recognising your own faces. Create a directory and, in it, place some good-quality passport-style photos of yourself or those you want to recognise. You can then edit the facerec_on_raspberry_pi.py script to use those files instead. You’ve now got a robust prototype of face recognition. This is just the beginning. These libraries can also identify ‘generic’ faces, meaning it can detect whether a person is there or not, and identify features such as the eyes, nose, and mouth. There’s a world of possibilities available, starting with these simple scripts. Have fun!
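
If you’d rather start from scratch than edit the example script, a custom recogniser can be surprisingly short. Here’s a minimal sketch using the face_recognition Python bindings; the file names are placeholders, and it assumes each photo contains exactly one clearly visible face:

# Minimal sketch: learn one face from a reference photo, then check whether it
# appears in a second image. File names are placeholders - use your own photos.
import face_recognition

known_image = face_recognition.load_image_file("me.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]  # assumes one face

unknown_image = face_recognition.load_image_file("snapshot.jpg")

for encoding in face_recognition.face_encodings(unknown_image):
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    print("I see someone I know!" if match else "I don't recognise this face.")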

Issue 38 of Hackspace Magazine is out NOW

Front cover of hack space magazine featuring a big striped popcorn bucket filled with maker tools and popcorn

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post Add face recognition with Raspberry Pi | Hackspace 38 appeared first on Raspberry Pi.

Read RFID and NFC tokens with Raspberry Pi | HackSpace 37

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/read-rfid-and-nfc-tokens-with-raspberry-pi-hackspace-37/

Add a bit of security to your project or make things selectable by using different cards. In the latest issue of HackSpace magazine, PJ Evans goes contactless.

The HAT is not hard on resources, so you can use many variants of Raspberry Pi

NFC (near-field communication) is based on the RFID (radio-frequency identification) standard. Both allow a device to receive data from a passive token or tag (meaning it doesn’t require external power to work). RFID supports a simple ID message that shouts ‘I exist’, whereas NFC allows for both reading and writing of data.

Most people come into contact with these systems every day, whether it’s using contactless payment, or a card to unlock a hotel or office door. In this tutorial we’ll look at the Waveshare NFC HAT, an add-on for Raspberry Pi computers that allows you to interact with NFC and RFID tokens.

Prepare your Raspberry Pi

We start with the usual step of preparing a Raspberry Pi model for the job. Reading RFID tags is not strenuous work for our diminutive friend, so you can use pretty much any variant of the Raspberry Pi range you like, so long as it has the 40-pin GPIO. We only need Raspberry Pi OS Lite (Buster) for this tutorial; however, you can install any version you wish. Make sure you’ve configured it how you want, have a network connection, and have updated everything by running sudo apt -y update && sudo apt -y upgrade on the command line.

Enable the serial interface

This NFC HAT is capable of communicating over three different interfaces: I2C, SPI, and UART. We’re going with UART as it’s the simplest to demonstrate, but you may wish to use the others. Start by running sudo raspi-config, going to ‘Interfacing options’, and selecting ‘Serial Interface’. When asked if you want to log into the console, say ‘No’. Then, when asked if you want to enable the serial interface, say ‘Yes’. You’ll need to reboot now. This will allow the HAT to talk to our Raspberry Pi over the serial interface.

Configure and install the HAT

As mentioned in the previous step, we have a choice of interfaces, and swapping between them means changing some physical settings on the NFC HAT itself. Do not do this while the HAT is powered up in any way. Our HAT should be configured for UART/serial by default, but do check on the wiki at hsmag.cc/iHj1XA. The jumpers at I1 and I0 should both be shorting ‘L’; D16 and D20 should be shorted; and on the DIP switch, everything should be off except RX and TX. Check, double-check, attach the HAT to the GPIO, and boot up.

The Waveshare HAT contains many settings. Make sure to read the instructions!

Download the examples

You can download some examples directly from Waveshare. First, we need to install some dependencies. Run the following at the command line:
sudo apt install python3-rpi.gpio p7zip-full python3-pip
pip3 install spidev pyserial

Now, download the files and unpack them:
cd
wget https://www.waveshare.com/w/upload/6/67/Pn532-nfc-hat-code.7z
7z x Pn532-nfc-hat-code.7z

Before you try anything out, you need to edit the example file so that we use UART (see the accompanying code listing).
cd ~/raspberrypi/python
nano example_get_uid.py

Find the three lines that start pn532 = and add a # to the top one (to comment it out). Now remove the # from the line starting pn532 = PN532_UART. Save, and exit.

Try it out!

Finally, we get to the fun part. Start the example code as follows:
python3 example_get_uid.py
If all is well, the connection to the HAT will be announced. You can now place your RFID token over the area of the HAT marked ‘NFC’. Hexadecimal numbers will start scrolling up the screen; your token has been detected! Each RFID token has a unique number, so it can be used to uniquely identify someone. However, this HAT is capable of much more than that as it also supports NFC and can communicate with common standards like MIFARE Classic, which allows for 1kB of storage on the card. Check out example_dump_mifare.py in the same directory (but make sure you make the same edits as above to use the serial connection).
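
For a feel of what the example script boils down to, here’s a rough sketch of the polling loop, modelled on the Waveshare example you edited above. The constructor arguments and module layout are taken from those examples and may differ between library versions, so treat this as an outline rather than a drop-in replacement:

# Rough outline of polling for a card UID with the Waveshare pn532 library.
# Constructor arguments follow the supplied examples and are not guaranteed
# to match every version of the library.
from pn532 import *

pn532 = PN532_UART(debug=False, reset=20)   # the serial variant you uncommented
pn532.SAM_configuration()                   # put the chip into card-reading mode

print("Waiting for a card...")
while True:
    uid = pn532.read_passive_target(timeout=0.5)   # returns None if nothing is seen
    if uid is not None:
        print("Card UID:", "".join("{:02x}".format(b) for b in uid))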

Going further

You can now read unique identifiers on RFID and NFC tokens. As we just mentioned, if you’re using the MIFARE or NTAG2 standards, you can also write data back to the card. The examples folder contains some C programs that let you do just that. The ability to read and write small amounts of data onto cards can lead to some fun projects. At the Electromagnetic Field festival in 2018, an entire game was based around finding physical locations and registering your presence with a MIFARE card. Even more is possible with smartphones, where NFC can be used to exchange data in any form.

Get HackSpace magazine 37 – Out Now!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post Read RFID and NFC tokens with Raspberry Pi | HackSpace 37 appeared first on Raspberry Pi.

Talk to your Raspberry Pi | HackSpace 36

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/talk-to-your-raspberry-pi-hackspace-36/

In the latest issue of HackSpace Magazine, out now, @MrPJEvans shows you how to add voice commands to your projects with a Raspberry Pi 4 and a microphone.

You’ll need:

It’s amazing how we’ve come from everything being keyboard-based to so much voice control in our lives. Siri, Alexa, and Cortana are everywhere and happy to answer questions, play you music, or help automate your household.

For the keen maker, these offerings may not be ideal for augmenting their latest project as they are closed systems. The good news is, with a bit of help from Google, you can add voice recognition to your project and have complete control over what happens. You just need a Raspberry Pi 4, a microphone array, and a Google account to get started.

Set up your microphone

This clever speaker uses four microphones working together to increase accuracy. A ring of twelve RGB LEDs can be coded to react to events, just like an Amazon Echo

For a home assistant device, being able to hear you clearly is essential. Many microphones are either too low-quality for the task, or are unidirectional: they only hear well in one direction. To the rescue comes Seeed’s ReSpeaker, an array of four microphones with some clever digital processing to provide the kind of listening capability normally found on an Amazon Echo device or Google Assistant. It’s also in a convenient HAT form factor, and comes with a ring of twelve RGB LEDs, so you can add visual effects too. Start with a Raspberry Pi OS Lite installation, and follow these instructions to get your ReSpeaker ready for use.

Install Snowboy

You’ll see later on that we can add the power of Google’s speech-to-text API by streaming audio over the internet. However, we don’t want to be doing that all the time. Snowboy is an offline ‘hotword’ detector. We can have Snowboy running all the time, and when your choice of word is ‘heard’, we switch to Google’s system for accurate processing. Snowboy can only handle a few words, so we only use it for the ‘trigger’ words. It’s not the friendliest of installations so, to get you up and running, we’ve provided step-by-step instructions.

There’s also a two-microphone ReSpeaker for the Raspberry Pi Zero

Create your own hotword

As we’ve just mentioned, we can have a hotword (or trigger word) to activate full speech recognition so we can stay offline. To do this, Snowboy must be trained to understand the word chosen. The code that describes the word (and specifically your pronunciation of it) is called the model. Luckily, this whole process is handled for you at snowboy.kitt.ai, where you can create a model file in a matter of minutes and download it. Just say your choice of words three times, and you’re done. Transfer the model to your Raspberry Pi 4 and place it in your home directory.

Let’s go Google

ReSpeaker can use its multiple mics to detect distance and direction

After the trigger word is heard, we want Google’s fleet of super-servers to help us transcribe what is being said. To use Google’s speech-to-text API, you will need to create a Google application and give it permissions to use the API. When you create the application, you will be given the opportunity to download ‘credentials’ (a small text file) which will allow your setup to use the Google API. Please note that you will need a billable account for this, although you get one hour of free speech-to-text per month. Full instructions on how to get set up can be found here.

Install the SDK and transcriber

To use Google’s API, we need to install the firm’s speech-to-text SDK for Python so we can stream audio and get the results. On the command line, run the following:
pip3 install google-cloud-speech
(If you get an error, run sudo apt install python3-pip then try again).
Remember that credentials file? We need to tell the SDK where it is:
export GOOGLE_APPLICATION_CREDENTIALS="/home/pi/[FILE_NAME].json"
(Don’t forget to replace [FILE_NAME] with the actual name of the JSON file.)
Now download and run this test file. Try saying something and see what happens!
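
If you’d like to see roughly what that test file is doing, here’s a minimal sketch of a one-shot transcription using the google-cloud-speech SDK. Class names can vary slightly between SDK versions, and the file name and sample rate here are assumptions for illustration:

# Minimal one-shot transcription sketch with the google-cloud-speech SDK.
# The WAV file name and sample rate are placeholders; adjust to your recording.
from google.cloud import speech

client = speech.SpeechClient()   # uses GOOGLE_APPLICATION_CREDENTIALS from above

with open("recording.wav", "rb") as f:
    audio = speech.RecognitionAudio(content=f.read())

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-GB",
)

response = client.recognize(config=config, audio=audio)
for result in response.results:
    print("Transcript:", result.alternatives[0].transcript)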

Putting it all together

Now we can talk to our Raspberry Pi, it’s time to link the hotword system to the Google transcription service to create our very own virtual assistant. We’ve provided sample code so that you can see these two systems running together. Run it, then say your chosen hotword. Now ask ‘what time is it?’ to get a response. (Don’t forget to connect a speaker to the audio output if you’re not using HDMI.) Now it’s over to you. Try adding code to respond to certain commands such as ‘turn the light on’, or ‘what time is it?’

Get HackSpace magazine 36 Out Now!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

The post Talk to your Raspberry Pi | HackSpace 36 appeared first on Raspberry Pi.

Build an arcade cabinet | Hackspace 35

Post Syndicated from Ben Everard original https://www.raspberrypi.org/blog/build-an-arcade-cabinet-hackspace-35/

Games consoles might be fast and have great graphics, but they’re no match for the entertainment value of a proper arcade machine. In this month’s issue of Hackspace magazine, you’re invited to relive your misspent youth with this huge build project.

There’s something special about the comforting solidity of a coin-eating video game monolith, and nothing screams retro fun like a full-sized arcade cabinet sitting in the corner of the room. Classic arcade machines can be a serious investment. Costing thousands of pounds and weighing about the same as a giant panda, they’re out of reach for all but the serious collector. Thankfully, you can recreate that retro experience using modern components for a fraction of the price and weight.

An arcade cabinet is much easier to make than you might expect. It’s essentially a fancy cupboard that holds a monitor, speakers, a computer, a keyboard, and some buttons. You can make your own cabinet using not much more than a couple of sheets of MDF, some clear plastic, and a few cans of spray paint.

If you want a really authentic-looking cabinet, you can find plenty of plans and patterns online. However, most classic cabinets are a bit bigger than you might remember, occupying almost a square metre of floor space. If you scale that down to approximately 60 cm square, you can make an authentic-looking home arcade cabinet that won’t take over the entire room, and can be cut from just two pieces of 8 × 4 (2440 mm × 1220 mm) MDF. You can download our plans, but these are rough plans designed for you to tweak into your own creation. A sheet of 18 mm MDF is ideal for making the body of the cabinet, and 12 mm MDF works well to fill in the front and back panels. You can use thinner sheets of wood to make a lighter cabinet, but you might find it less sturdy and more difficult to screw into.

The sides of the machine should be cut from 18 mm MDF, and will be 6 feet high. The sides need to be as close to identical as possible, so mark out the pattern for the side on one piece of 18 mm MDF, and screw the boards together to hold them while you cut. You can avoid marking the sides by placing the screws through the waste areas of the MDF. Keep these offcuts to make internal supports or brackets. You can cut the rest of the pieces of MDF using the project plans as a guide. 

Why not add a coin machine for extra authenticity

Attach the side pieces to the base, so that the sides hang lower than the base by an inch or two. If you’re more accomplished at woodworking and want to make the strongest cabinet possible, you can use a router to joint and glue the pieces of wood together. This will make the cabinet very slightly narrower and will affect some measurements, but if you follow the old adage to measure twice and cut once, you should be fine. If you don’t want to do this, you can use large angle brackets and screws to hold everything together. The cabinet will still be strong, and you’ll have the added advantage that you can disassemble it in the future if necessary.

Keep attaching the 18 mm MDF pieces, starting with the top piece and the rear brace. Once you have these pieces attached, the cabinet should be sturdy enough to start adding the thinner panels. Insetting the panels by about an inch gives the cabinet that retro look, and also hides any design crimes you might have committed while cutting out the side panels.

The absolute sizing of the cabinet isn’t critical unless you’re trying to make an exact copy of an old machine, so don’t feel too constrained by measuring things down to the millimetre. As long as the cabinet is wide enough to accept your monitor, everything else is moveable and can be adjusted to suit your needs.

Make it shiny

You can move onto decoration once the cabinet woodwork is fitted together. This is mostly down to personal preference, although it’s wise to think about which parts of the case will be touched more often, and whether your colour choices will cause any problems with screen reflection. Matt black is a popular choice for arcade cabinets because it’s non-reflective and any surface imperfections are less noticeable with a matt paint finish.

Aluminium checker plate is a good way of protecting your cabinet from damage, and it can be cut and shaped easily.

Wallpaper or posters make a great choice for decorating the outside of the cabinet, and they are quick to apply. Just be sure to paste all the way up to the edge, and protect any areas that will be handled regularly with aluminium checker plate or plastic sheet. The edges of MDF sheets can be finished with iron-on worktop edging, or with the chrome detailing tape used on cars. You can buy detailing tape in 12 mm and 18 mm widths, which makes it great for finishing edges. The adhesive tape provided with the chrome edging isn’t always very good, so it’s worth investing in some high-strength, double-sided clear vinyl foam tape.

You’ve made your cabinet, but it’s empty at the moment. You’re going to add a Raspberry Pi, monitor, speakers, and a panel for buttons and joysticks. To find out how, you can read the full article in HackSpace magazine 35.  

Get HackSpace magazine 35 Out Now!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.

Each issue is free to download from the HackSpace magazine website.

If you subscribe for 12 months, you get an Adafruit Circuit Playground Express , or can choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

The post Build an arcade cabinet | Hackspace 35 appeared first on Raspberry Pi.

Rotary encoders: Raise a Glitch Storm | Hackspace 34

Post Syndicated from Ben Everard original https://www.raspberrypi.org/blog/rotary-encoders-raise-a-glitch-storm-hackspace-34/

A Glitch Storm is an explosive torrent of musical rhythms and sound, all generated from a single line of code. In theory, you can’t do this with a Raspberry Pi running Python – in this month’s new issue, out now, the HackSpace magazine team lovingly acquired a tutorial from The MagPi team to throw theory out the window and show you how.

What is a Glitch Storm?

A Glitch Storm is a user-influenceable version of bytebeat music. We love definitions like that here at the Bakery: something you have never heard of is simply a development of something else you have never heard of. Bytebeat music was at the heart of the old Commodore 64 demo scene, a competition to see who could produce the most impressive graphics and music in a very limited number of bytes. This was revived/rediscovered and christened by Viznut, aka Ville-Matias Heikkilä, in 2011. And then JC Ureña of the ‘spherical sound society’ converted the concept into the interactive Glitch Storm.

Figure 1: Schematic for the sound-generating circuit

So what is it?

Most random music generators work on the level of notes; that is, notes are chosen one at a time and then played, like our Fractal Music project in The MagPi #66. However, with bytebeat music, an algorithm generates the actual sample levels that make up the sound. This algorithm performs bitwise operations on a tick variable that increments with each sample. Depending on the algorithm used, this may or may not produce something musically interesting. Often, the samples produced exhibit a fractal structure, which is self-similar on many levels, thus providing both the notes and structure.

Enter the ‘Glitch Storm’

With a Glitch Storm, three user-controlled variables – a, b, and c – can be added to this algorithm, allowing the results to be fine-tuned. In the ‘Algorithms’ box, you can see that the bytebeat algorithms simply run; they all repeat after a certain time, but this time can be long, in the order of hours for some. A Glitch Storm algorithm, on the other hand, contains variables that a user can change in real-time while the sample is playing. This is exactly what we can do with rotary encoders, without having to interrupt the algorithm to check their state all the time.

Figure 2: Schematic for the control box

What hardware?

In order to produce music like this on the Raspberry Pi, we need some extra hardware to generate the sound samples, and also a bunch of rotary encoders to control things. The samples are produced by using a 12-bit D/A converter connected to one of the SPI ports. The schematic of this is shown in Figure 1. The clock rate for the transfer of data to this can be controlled and provides a simple way of controlling, to some extent, the sample rate of the sound. Figure 2 shows the wiring diagram of the five rotary encoders we used.

Making the hardware

The hardware comes as two parts: the D/A converter and associated audio components. These are built on a board that hangs off Raspberry Pi’s GPIO pins. Also on this board is a socket that carries the wires to the control box. We used an IDC (insulation displacement connector) to connect between the board and the box, as we wanted the D/A connection wires to be as short as possible because they carry a high frequency signal. We used a pentagonal box just for fun, with a control in each corner, but the box shape is not important here.

Figure 3: Front physical layout of the interface board

Construction

The board is built on a 20-row by 24-hole piece of stripboard. Figure 3 and Figure 4 show the physical layout for the front and back of the board. The hole number 5 on row 4 is enlarged to 2.5mm and a new hole is drilled between rows 1 and 2 to accommodate the audio jack socket. A 40-way surface-mount socket connector is soldered to the back of the board, and a 20-way socket is soldered to the front. You could miss this out and wire the 20-way ribbon cable direct to the holes in these positions if you want to economise.

Figure 4: Rear physical layout of the interface board

Further construction notes

Note: as always, the physical layout diagram shows where the wires go, not necessarily the route they will take. Here, we don’t want wires crossing the 20-way connector, so the upper four wires use 30AWG Kynar wire to pop under the connector and out through a track hole, without soldering, on the other side. When putting the 20-way IDC pin connector on the ribbon cable, make sure the red end connector wire is connected to the pin next to the downward-pointing triangle on the pin connector. Figure 5 shows a photograph of the control box wiring.

Figure 5: Wiring of the control board

Testing the D/A

The live_byte_beat.py listing on GitHub is a minimal program for trying out a bytebeat algorithm. It will play until stopped by pressing CTRL+C. The variable v holds the value of the sample, which is then transferred to the D/A over SPI in two bytes. The format of these two bytes is shown in Figure 6, along with how we have to manipulate v to achieve an 8-bit or 12-bit sample output. Note that all algorithms were designed for an 8-bit sample size, and using 12 bits is a free bonus here: it does sound radically different, and not always in a good way.

The main software

The main software for this project is on our GitHub page, and contains 24 Pythonised algorithms. The knobs control the user variables as well as the sample rate and which algorithm to use. You can add extra algorithms, but if you are searching online for them, you will find they are written in C. There are two major differences you need to note when converting from C to Python: the first is C’s ternary operator, written with a question mark, which becomes a conditional expression in Python; the second is the modulus operator, the percent sign, which treats negative numbers differently in the two languages. See the notes that accompany the main code about these.
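
To see those two differences in practice, here’s a small illustrative conversion in Python. The bytebeat expression itself is just a generic example, not one of the 24 algorithms from the project’s GitHub page:

# Illustration of converting a C bytebeat expression to Python.
# C's ternary 'cond ? x : y' becomes a conditional expression, and C's %
# truncates toward zero while Python's % floors, so we emulate C behaviour.

def c_mod(a, b):
    r = abs(a) % abs(b)
    return r if a >= 0 else -r   # result takes the sign of the dividend, as in C

def sample(t, a=3, b=2, c=1):
    # C: (t % 256 > 128) ? (t >> a) : (t * b + c)
    v = (t >> a) if c_mod(t, 256) > 128 else (t * b + c)
    return v & 0xFF              # clamp to an 8-bit sample

samples = [sample(t) for t in range(8000)]   # roughly a second of 'music'
print(samples[:16])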

Figure 6: How to program the registers in the D/A converter

Why does this work?

There are a few reasons why you would not expect this to work on a Raspberry Pi in Python, the most obvious being the operating system regularly interrupting the flow of output samples. Well, it turns out that this is not as bad as you might fear, and the extra ‘noise’ this causes is at a low level and is masked by the glitchy nature of the sound. As Python is an interpreted language, it is just about fast enough to give an adequate sample rate on a Raspberry Pi 4.

Make some noise

You can now explore the wide range of algorithms for generating a Glitch Storm and interact with the sound. On our GitHub page there’s a list of useful links allowing you to explore what others have done so far. For a sneak preview of the bytebeat type of sound, visit magpi.cc/bytebeatdemo; you can even add your own algorithms here. For interaction, however, there’s no substitute for having your own hardware. The best settings are often found by making small adjustments and listening to the long-term effects – some algorithms surprise you about a minute or two into a sequence by changing dramatically.

Get HackSpace magazine issue 34 — out today

HackSpace magazine issue 34: on sale now!

HackSpace magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download it directly as a PDF from the HackSpace magazine website.

Subscribe to HackSpace magazine for 12 months to get a free Adafruit Circuit Playground Express, or choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

If you liked this project, it was first featured in The MagPi Magazine. Download the latest issue for free or subscribe here.

The post Rotary encoders: Raise a Glitch Storm | Hackspace 34 appeared first on Raspberry Pi.

OctoPrint: a baby monitor for your 3D printer

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/octoprint-a-baby-monitor-for-your-3d-printer/

In issue 32 of HackSpace magazine, out now, we talk to Gina Häußge, creator of OctoPrint – it sits on a Raspberry Pi and monitors your 3D printer.

Gina Häußge, creator and maintainer of OctoPrint

There’s something enchanting about watching a 3D printer lay down hot plastic. Seeing an object take shape before your eyes is utterly compelling, which is perhaps why we love watching 3D printing time-lapse videos so much.

Despite this, it would be impractical and inefficient to sit and watch every time you sent a print job through. That’s why we should all be grateful for OctoPrint. This free, open-source software monitors your 3D printer for you, keeping you from wasting plastic and ensuring that you can go about your business without fearing for your latest build.
OctoPrint is the creation of Gina Häußge. We enjoyed a socially distant chat with her about the challenges of running an open-source project, making, and what it’s like to have a small project become huge.

HackSpace: Most people who have used a 3D printer will have heard of OctoPrint, but for the benefit of those who haven’t, what is it?

Gina Häußge: Somebody once called it a baby monitor for your 3D printer. I really like this description. It’s pretty much a combination of a baby monitor and a remote control, because it allows you to go through any web browser on your network and monitor what your printer is currently up to, how much the current job has progressed. If you have a webcam set up, it can show you the print itself, so you can see that everything is working correctly, it’s still on the bed, and all that.

It also offers a plug-in interface so that it can be expanded with various features and functionality, and people have written a ton of integrations with notification systems. And all of this runs on pretty much any system that runs Python. I have to say Python, not MicroPython, the full version. Usually Linux, and the most common use case is to run it on a Raspberry Pi, and this is also how I originally set it out to work.

Most people think it only runs on a Raspberry Pi, but no. It will run on any old laptop that you still have lying around. It’s cross-platform, so you don’t need to buy a Raspberry Pi if you have another machine that will fit the bill.

OctoPrint is most commonly run on a Raspberry Pi

HS: How long have you been working on it?

GH: I originally sat down to write it over my Christmas break in 2012, because I had got my first 3D printer back then. It was sitting in my office producing fumes and noise for hours on end, which was annoying when trying to work, or game, or anything else.

I thought there must be a solution involving attaching one of these nifty new Raspberry Pis that had just come out. Someone must have written something, right? I browsed around the internet, realised that the closest thing to what I was looking for treated the printer as a black box – to fire job data at it and hope that it gets it right. That was not what I wanted; I wanted this feedback channel. I wanted to see what was happening; I wanted to monitor the temperatures; I wanted to monitor the job progress.

The very first version back then was a plug-in for Cura, before Cura even supported plug-ins. After my Christmas break, I went, OK, it’s doing everything I wanted it to do; back to work at my normal regular job. And then it exploded. I started getting emails, issue reports, and feature requests from all over the world. ‘Can you make it also do this?’ ‘Hey, I have this other printer with this slightly different firmware that behaves like this; can you adapt it so that it works with this?’. ‘Can you remove it from Cura, and have it so it works standalone?’ Suddenly I had this huge open-source project on my hands. I didn’t do any kind of promotion for it or anything like that. I just posted about it in a Google+ community, of all things, and from there it grew by word of mouth.

A year or so later, I reduced my regular job to 80%, to have one day a week for OctoPrint, but that didn’t suffice either with everything that was going on. Then I had the opportunity to go full-time, sponsored by a single company who also made 3D printers, and they ran out of money in 2016. That was when I turned to crowdfunding, which has been the mode of operation ever since. Around 95% of everything that is done on OctoPrint is run by me, and I work on it full-time now. Since 2014.

A lot of the stuff that I have been adding over the years, for instance, the plug-in system itself, would not have been possible as a pet side project, not with a day job.

HS: What are you working on at the moment?

GH: In March just gone, I released the next big version, to make OctoPrint Python 3-compatible, because at the start of the year Python 2 was deemed end of life, so I had to do something. The problem is that there’s a flourishing plug-in ecosystem written in Python 2, so for now, I’m stuck with having to support both, and trying to motivate the plug-in maintainers to also migrate, which is a ton of fun actually. I wrote a migration guide, tracking in the plug-in repository how many plug-ins are compatible. Newly registered plug-ins have to be compatible too.

HS: Do you have any idea how many people use OctoPrint?

GH: Nine months to a year ago, I introduced usage tracking. It’s my own bundled plug-in that ships with OctoPrint that does anonymous user tracking through my own platform, so no GDPR issues should arise there. And what this shows me is that, over the course of the last seven days, I saw 66,000 instances, and in the last 30 days, I saw 91,000 instances.

But that’s only those who have opted into the usage tracking, which obviously is only a fraction. I have no idea about the fraction – whether the real number is five times, ten times higher, I’ve no way of knowing.

When I did the most recent big update, I got some statistics back from piwheels [a Python package repository]. They saw a spike in repositories that were being pulled from their index, which corresponded to dependencies that the new version of OctoPrint depends on, and the spike that they saw corresponded with the day that I rolled out the new version. Based on that, it looks like there’s probably ten times as many instances out there. I didn’t expect that. So the total number of users could be 700,000, it could be over a million, I have no idea. But based on these piwheels stats, it’s in that ballpark.

HS: And are you seeing a growth in those figures?

GH: Yes. Especially now, with the pandemic going on. If you had asked me three or four months ago, just when the pandemic started, I would have told you more like 60,000 per 30 days. So I saw a significant increase. I also saw a significant usage increase in the last couple of weeks.

I also saw a significant increase in support overheads in the last couple of weeks, which was absolutely insane. It was like everyone and their mother wanted to know something from me, writing me emails, opening tickets and all that, and this influx of people has not stopped yet. At first I thought, well I’ll just go into crunch mode and weather this out, but that didn’t work out. I had to find new ways to cope in order to keep this sustainable.

HS: You can’t have crunch mode for three months!

GH: I mean it’s OK for four weeks or so, but then you start to notice side effects on your own well-being. It’s not a good idea. I’m in for the long haul.

HS: Wanting a feedback channel instead of just firing off commands that work silently makes a lot of sense.

GH: It’s not like a paper printer where you fire and forget, so treating it as a black box, where you don’t get anything back on status and all that, is bound to be trouble. This is a complicated machine where a lot of stuff can go wrong, so it makes sense to have a feedback channel — at least that was my intuition back then, and evidently, a lot of people thought the same.

HS: You must have saved people countless hours and hours of wasted time, filament, and energy.

GH: I’ve also heard that I’ve saved at least one marriage! Someone wrote me an email a couple of years ago thanking me because the person had a new printer in their garage and was constantly monitoring it, sitting in front of it. Apparently the wife and kids were not too thrilled by this. They installed OctoPrint, and since then they’ve been happy again.

Get HackSpace magazine issue 32 — out today

HackSpace magazine issue 32: on sale now!

You can read the rest of HackSpace magazine’s interview with Gina Häußge in issue 32, out today and available online from the Raspberry Pi Press online store. You can also download issue 32 for free.

The post OctoPrint: a baby monitor for your 3D printer appeared first on Raspberry Pi.

The Raspberry Pi Press store is looking mighty fine

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/the-raspberry-pi-press-store-is-looking-mighty-fine/

Eagle-eyed Raspberry Pi Press fans might have noticed some changes over the past few months to the look and feel of our website. Today we’re pleased to unveil a new look for the Raspberry Pi Press website and its online store.

Did you know?

Raspberry Pi Press is the publishing imprint of Raspberry Pi (Trading) Ltd, which is part of the Raspberry Pi Foundation, a UK-based charity that does loads of cool stuff with computers and computer education.

Did you also know?

Raspberry Pi Press publishes five monthly magazines: The MagPi, HackSpace Magazine, Wireframe, Custom PC, and Digital SLR Photography. It also produces a plethora of project books and gorgeous hardback beauties, such as retro gamers’ delight Code the Classics, as well as Hello World, the computing and digital making magazine for educators! Phew!

And did you also, also know?

The Raspberry Pi Press online store ships around the globe, with copies of our publications making their way to nearly every single continent on planet earth. Antarctica, we’re looking at you, kid.

It’s upgrade time!

With all this exciting work going on, it seemed only fair that Raspberry Pi Press should get itself a brand new look. We hope you’ll enjoy skimming the sparkling shelves of our online newsagents and bookshop.

Ain’t nothin’ wrong with a little tsundoku

You can pick up all the latest issues of your favourite magazines or treat yourself to a book or three, and you can also subscribe to all our publications with ease. We’ve even added a few new payment options to boot.

New delivery options

We’ve made a few changes to our shipping options, with additional choices for some regions to make sure that you can easily track your purchases and receive timely and reliable deliveries, even if you’re a long way from the Raspberry Pi Press printshop.

Customers in the UK, the EU, North America, Australia, and New Zealand won’t see any changes to delivery options. We continue to work to make sure we’re offering the best price and service we can for everyone, no matter where you are.

Have a look and see what you think!

So hop on over to the new and improved Raspberry Pi Press website to see the changes for yourself. And if you have any feedback, feel free to drop Oli and the team an email at [email protected].

The post The Raspberry Pi Press store is looking mighty fine appeared first on Raspberry Pi.

Design your own Internet of Things with HackSpace magazine

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/design-your-own-internet-of-things-with-hackspace-magazine/

In issue 31 of HackSpace magazine, out today, PJ Evans looks at DIY smart homes and homemade Internet of Things devices.

In the last decade, various companies have come up with ‘smart’ versions of almost everything. Microcontrollers have been unceremoniously crowbarred into devices that had absolutely no need for microcontrollers, and often tied to phone apps or web services that are hard to use and don’t work well with other products.

Put bluntly, the commercial world has struggled to deliver an ecosystem of useful smart products. However, the basic principle behind the connected world is good – by connecting together sensors, we can understand our local environment and control it to make our lives better. That could be as simple as making sure the plants are correctly watered, or something far more complex.

The simple fact is that we each lead different lives, and we each want different things out of our smart homes. This is why companies have struggled to create a useful smart home system, but it’s also why we, as makers, are perfectly placed to build our own. Let’s dive in and take a look at one way of doing this – using the TICK Stack – but there are many more, and we’ll explore a few alternatives later on.

Many of our projects create data, sometimes a lot of it. This could be temperature, humidity, light, position, speed, or anything else that we can measure electronically. To be useful, that data needs to be turned into information. A list of numbers doesn’t tell you a lot without careful study, but a line graph based on those numbers can show important information in an instant. Often makers will happily write scripts to produce charts and other types of infographics, but now open-source software allows anyone to log data to a database, generate dashboards of graphs, and even trigger alerts and scripts based on the incoming data. There are several solutions out there, so we’re going to focus on just one: a suite of products from InfluxData collectively known as the TICK Stack.

InfluxDB

The ‘I’ in TICK is the database that stores your precious data. InfluxDB is a time series database. It differs from regular SQL databases as it always indexes based on the time stamp of the incoming data. You can use a regular SQL database if you wish (and we’ll show you how later), but what makes InfluxDB compelling for logging data is not only its simplicity, but also its data-management features and built-in web-based API interface. Getting data into InfluxDB can be as easy as a web post, which places it within the reach of most internet-capable microcontrollers.
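
As an example of just how simple that web post can be, here’s a hedged sketch of logging a single reading with Python’s requests library and InfluxDB’s 1.x line protocol. The host name, database, and measurement names are assumptions for illustration:

# Log one reading to InfluxDB over its HTTP write API (1.x line protocol).
# Host, database, and measurement names are placeholders.
import requests

host = "http://raspberrypi.local:8086"   # wherever your InfluxDB instance lives
database = "greenhouse"                  # created beforehand with CREATE DATABASE

line = "environment,location=greenhouse temperature=21.3,humidity=48.2"
response = requests.post(f"{host}/write", params={"db": database}, data=line)

print(response.status_code)              # 204 means the point was accepted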

Kapacitor

Next up is our ‘K’. Kapacitor is a complex data processing engine that acts on data coming into your InfluxDB. It has several purposes, but the common use is to generate alerts based on data readings. Kapacitor supports a wide range of alert ‘endpoints’, from sending a simple email to alerting notification services like Pushover, or posting a message to the ubiquitous Slack. Multiple alerts to multiple destinations can be configured, and what constitutes an alert status is up to you. More advanced uses of Kapacitor include machine learning and anomaly detection.

Chronograf

The problem with Kapacitor is the configuration. It’s a lot of work with config files and the command line. Thoughtfully, InfluxData has created Chronograf, a graphical user interface to both Kapacitor and InfluxDB. If you prefer to keep away from the command line, you can query and manage your databases here as well as set up alerts, metrics that trigger an alert, and the configurations for the various handlers. This is all presented through a web app that you can access from anywhere on your network. You can also build ‘Dashboards’ – collections of charts displayed on a single page based on your InfluxDB data.

Telegraf

Finally, our ‘T’ in TICK. One of the most common uses for time series databases is measuring computer performance. Telegraf provides the link between the machine it is installed on and InfluxDB. After a simple install, Telegraf will start logging all kinds of data about its host machine to your InfluxDB installation. Memory usage, CPU temperatures and load, disk space, and network performance can all be logged to your database and charted using Chronograf. This reflects the Stack’s more common use for monitoring servers, but it’s still useful for making sure the brains of our network-of-things is working properly. If you get a problem, Kapacitor can not only trigger alerts but also user-defined scripts that may be able to remedy the situation.

Get HackSpace magazine issue 31 — out today

HackSpace magazine issue 31: on sale now!

You can read the rest of HackSpace magazine’s DIY IoT feature in issue 31, out today and available online from the Raspberry Pi Press online store. You can also download issue 31 for free.

The post Design your own Internet of Things with HackSpace magazine appeared first on Raspberry Pi.

Build low-power, clock-controlled devices

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/build-low-power-clock-controlled-devices/

Do you want to make a sensor with a battery life you can measure in days rather than hours? Even if it contains a (relatively!) power-hungry device like a Raspberry Pi? By cunning use of a real-time clock module, you can make something that wakes up, does its thing, and then goes back to sleep. While asleep, the sensor will sip a tiny amount of current, making it possible to remotely monitor the temperature of your prize marrow in the greenhouse for days on end from a single battery. Read on to find out how to do it.

A sleeping Raspberry Pi Zero apparently consuming no current!

You’ll need:

  • DS3231 powered real-time clock module with battery backup: make sure it has a battery holder and an INT/SQW output pin
  • P-channel MOSFET: the IRF9540N works well
  • Three resistors: 2.2 kΩ, 4.7 kΩ, and 220 Ω
  • A device you want to control: this can be a PIC, Arduino, ESP8266, ESP32, or Raspberry Pi. My software is written in Python and works in MicroPython or on Raspberry Pi, but you can find DS3231 driver software for lots of devices
  • Sensor you want to use: we’re using a BME280 to get air temperature, pressure, and humidity
  • Breadboard or prototype board to build up the circuit

We’ll be using a DS3231 real-time clock which is sold in a module, complete with a battery. The DS3231 contains two alarms and can produce a trigger signal to control a power switch. To keep our software simple, we are going to implement an interval timer, but there is nothing to stop you developing software that turns on your hardware on particular days of the week or days in the month. The DS3231 is controlled using I2C, which means it can be used with lots of devices.

You can pick up one of these modules from lots of suppliers. Make sure that you get one with the SQW connection, as that provides the alarm signal

MOSFET accompli

The power to our Raspberry Pi Zero is controlled via a P-channel MOSFET device operating as a switch. The 3.3 V output from Raspberry Pi is used to power the DS3231 and our BME280 sensor. The gate on the MOSFET is connected via a resistor network to the SQW output from the DS3231.

You can think of a MOSFET as a kind of switch. It has a source pin (where we supply power), a drain pin (which is the output the MOSFET controls), and a gate pin. If we change the voltage on the gate pin, this will control whether the MOSFET conducts or not.

We use a P-channel MOSFET to switch the power because the gate voltage must be pulled down to cause the MOSFET to conduct, and that is how P-channel devices function.

MOSFET devices are all about voltage. Specifically, when the voltage difference between the source and the gate pin reaches a particular value, called the threshold voltage, the MOSFET will turn on. The threshold voltage is expressed as a negative value because the voltage on the gate must be lower than the voltage on the source. The MOSFET that we’re using turns on at a threshold voltage of around -3.7 volts and off at a voltage of -1.75 volts.

The SQW signal from the DS3231 is controlled by a transistor which is acting as a switch connected to ground inside the DS3231. When the alarm is triggered, this transistor is turned on, connecting the SQW pin to ground. The diagram below shows how this works.

The resistors R1 and R2 are linked to the supply voltage at one end and the SQW pin and the MOSFET gate on the other. When SQW is turned off the voltage on the MOSFET gate is pulled high by the resistors, so the MOSFET turns off. When SQW is turned on, it pulls the voltage on the MOSFET gate down, turning it on.

Unfortunately, current leaking through R1 and R2 to the DS3231 means that we are not going to get zero current consumption when the MOSFET is turned off, but it is much less than 1 milliamp.
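
To put a rough number on that leakage: assuming a 5 V supply, and that the 2.2 kΩ and 4.7 kΩ resistors from the parts list end up in series between the supply and the low SQW pin, the ‘off’ current works out at around 0.7 mA, comfortably under a milliamp.

# Back-of-the-envelope estimate of the 'off' state leakage current.
# Assumption: R1 and R2 sit in series across roughly 5 V while SQW is low.
supply_voltage = 5.0   # volts
r1 = 2200              # ohms (2.2 kΩ from the parts list)
r2 = 4700              # ohms (4.7 kΩ from the parts list)

leakage_ma = supply_voltage / (r1 + r2) * 1000
print("Leakage current: {:.2f} mA".format(leakage_ma))   # about 0.72 mA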

We’re using a BME280 environmental sensor on this device. It is connected via I2C to Raspberry Pi. You don’t need this sensor to implement the power saving

Power control

Now that we have our hardware built, we can get some code running to control the power. The DS3231 is connected to Raspberry Pi using I2C. Before you start, you must enable I2C on your Raspberry Pi using the raspi-config tool. Use sudo raspi-config and select Interfacing Options. Next, you need to make sure that you have all the I2C libraries installed by issuing this command at a Raspberry Pi console:

sudo apt-get install python3-smbus python3-dev i2c-tools
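
With the libraries installed, a quick way to confirm that the module is wired up and talking over I2C is to read the time straight back from it. This is just a sanity-check sketch, separate from the sensor program below; it relies on the DS3231’s standard register layout (seconds, minutes, and hours stored as BCD at addresses 0x00 to 0x02, assuming 24-hour mode) and the usual I2C address of 0x68.

# Read the current time from the DS3231 and print it.
import smbus

bus = smbus.SMBus(1)
DS3231 = 0x68   # default I2C address of the DS3231

def bcd_to_int(value):
    # The DS3231 stores each time field as binary-coded decimal
    return (value >> 4) * 10 + (value & 0x0F)

seconds = bcd_to_int(bus.read_byte_data(DS3231, 0x00))
minutes = bcd_to_int(bus.read_byte_data(DS3231, 0x01))
hours = bcd_to_int(bus.read_byte_data(DS3231, 0x02) & 0x3F)  # mask the 12/24-hour bits

print("{:02d}:{:02d}:{:02d}".format(hours, minutes, seconds))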

The sequence of operation of our sensor is as follows:

  1. The program does whatever it needs to do. This is the action that you want to perform at regular intervals. That may be to read a sensor and send the data onto the network, or write it to a local SD card or USB memory key. It could be to read something and update an e-ink display. You can use your imagination here.
  2. The program then sets an alarm in the DS3231 at a point in the future, when it wants the power to come back on.
  3. Finally, the program acknowledges the alarm in the DS3231, causing the SQW alarm output to change state and turn off the power.

Clock setting

The program below only uses a fraction of the capabilities of the DS3231 device. It creates an interval timer that can time hours, minutes, and seconds. Each time the program runs, the clock is set to zero, and the alarm is configured to trigger when the target time is reached.

Put the program into a file called SensorAction.py on your Raspberry Pi, and put the code that you want to run into the section indicated.

import smbus

bus = smbus.SMBus(1)

DS3231 = 0x68

SECONDS_REG = 0x00
ALARM1_SECONDS_REG = 0x07

CONTROL_REG = 0x0E
STATUS_REG = 0x0F

def int_to_bcd(x):
    return int(str(x)[-2:], 0x10)

def write_time_to_clock(pos, hours, minutes, seconds):
    bus.write_byte_data(DS3231, pos, int_to_bcd(seconds))
    bus.write_byte_data(DS3231, pos + 1, int_to_bcd(minutes))
    bus.write_byte_data(DS3231, pos + 2, int_to_bcd(hours))

def set_alarm1_mask_bits(bits):
    pos = ALARM1_SECONDS_REG
    for bit in reversed(bits):
        reg = bus.read_byte_data(DS3231, pos)
        if bit:
            reg = reg | 0x80
        else:
            reg = reg & 0x7F
        bus.write_byte_data(DS3231, pos, reg)
        pos = pos + 1

def enable_alarm1():
    reg = bus.read_byte_data(DS3231, CONTROL_REG)
    bus.write_byte_data(DS3231, CONTROL_REG, reg | 0x05)

def clear_alarm1_flag():
    reg = bus.read_byte_data(DS3231, STATUS_REG)
    bus.write_byte_data(DS3231, STATUS_REG, reg & 0xFE)

def check_alarm1_triggered():
    return bus.read_byte_data(DS3231, STATUS_REG) & 0x01 != 0

def set_timer(hours, minutes, seconds):
    # zero the clock
    write_time_to_clock(SECONDS_REG, 0, 0, 0)
    # set the alarm
    write_time_to_clock(ALARM1_SECONDS_REG, hours, minutes, seconds)
    # set the alarm to match hours minutes and seconds
    # need to set some flags
    set_alarm1_mask_bits((True, False, False, False))
    enable_alarm1()
    clear_alarm1_flag()

#
# Your sensor behaviour goes here
#
set_timer(1,30,0)

The set_timer function is called to set the timer and clear the alarm flag. This resets the alarm signal and powers off the sensor. The example above will cause the sensor to shut down for 1 hour 30 minutes.
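
As an example of what might go in the ‘your sensor behaviour goes here’ section, here’s a hedged sketch that reads the BME280 and publishes the readings over MQTT before the timer is set. It reuses the bus object already created at the top of SensorAction.py, and assumes the RPi.bme280 and paho-mqtt packages are installed, the sensor sits at I2C address 0x76, and an MQTT broker is reachable at the placeholder hostname broker.local; the full application on the project’s GitHub page is the definitive version.

# Example sensor behaviour: read a BME280 and publish the values over MQTT.
# Assumptions: pip3 install RPi.bme280 paho-mqtt, sensor at I2C address 0x76,
# broker reachable at 'broker.local' (change to suit your network).
import bme280
import paho.mqtt.publish as publish

BME280_ADDR = 0x76
calibration = bme280.load_calibration_params(bus, BME280_ADDR)
reading = bme280.sample(bus, BME280_ADDR, calibration)

payload = "{:.1f},{:.1f},{:.1f}".format(
    reading.temperature, reading.pressure, reading.humidity)
publish.single("greenhouse/environment", payload, hostname="broker.local")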

You can use any other microcontroller that implements I2C

Power down

The SensorAction program turns off your Raspberry Pi without shutting it down properly, which is something your mother probably told you never to do. The good news is that in extensive testing, we’ve not experienced any problems with this. However, if you want to make your Raspberry Pi totally safe in this situation, you should make its file system ‘read-only’, which means that it never changes during operation and therefore can’t be damaged by untimely power cuts. There are some good instructions from Adafruit here: hsmag.cc/UPgJSZ.

Note: making the operating system file store read-only does not prevent you creating a data logging application, but you would have to log the data to an external USB key or SD card and then dismount the storage device before killing the power.
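
If you do log to a USB key, a simple way to keep it safe is to flush and unmount the drive just before calling set_timer, so the power can be cut without risk. A minimal sketch, assuming the key is mounted at the placeholder path /mnt/usbdata and the script is allowed to call umount:

# Flush pending writes and unmount the USB data drive before the power is cut.
# Assumption: the key is mounted at /mnt/usbdata (placeholder path).
import subprocess

def release_usb_storage(mount_point="/mnt/usbdata"):
    subprocess.run(["sync"], check=True)                      # flush writes to disk
    subprocess.run(["sudo", "umount", mount_point], check=True)

release_usb_storage()
# ...now it is safe to call set_timer() and let the power drop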

If you are using a different device, such as an ESP8266 or an Arduino, you don’t need to worry about this as the software in them is inherently read-only.

The SQW output from the DS3231 will pull the gate of the MOSFET low to turn on the power to Raspberry Pi

Always running

To get the program to run when the Raspberry Pi boots, use the Nano editor to add a line at the end of the rc.local file that runs your program.

sudo nano /etc/rc.local

Use the line above at the command prompt to start editing the rc.local file and add the following line at the end of the file:

python3 /home/pi/SensorAction.py &

This statement runs Python 3, opens the SensorAction.py file, and runs it. Don’t forget the ampersand (&) at the end of the command: this starts your program as a separate process, allowing the boot to complete. Now, when Raspberry Pi boots up, it will run your program and then shut itself down. You can find a full sample application on the GitHub pages for this project (hsmag.cc/Yx7q6t). It logs air temperature, pressure, and humidity to an MQTT endpoint at regular intervals. Now, go and start tracking that marrow temperature!

Issue 30 of HackSpace magazine is out now

The latest issue of HackSpace magazine is on sale now, and you can get your copy from the Raspberry Pi Press online store. You can also download it for free to check it out first.

UK readers can take advantage of our special subscriptions offer at the moment.

3 issues for £10 & get a free book worth £10…

If you’re in the UK, get your first three issues of HackSpace magazine, The MagPi, Custom PC, or Digital SLR Photography delivered to your door for £10, and choose a free book (itself worth £10) on top!

The post Build low-power, clock-controlled devices appeared first on Raspberry Pi.

Special offer for magazine readers

Post Syndicated from Russell Barnes original https://www.raspberrypi.org/blog/special-offer-for-magazine-readers/

You don’t need me to tell you about the unprecedented situation that the world is facing at the moment. We’re all in the same boat, so I won’t say anything about it other than I hope you stay safe and take care of yourself and your loved ones.

The other thing I will say is that every year, Raspberry Pi Press produces thousands of pages of exciting, entertaining, and often educational content for lovers of computing, technology, games, and photography.

In times of difficulty, it’s not uncommon for people to find solace in their hobbies. The problem you’ll find yourself with is that it’s almost impossible to buy a magazine at the moment, at least in the UK: most of the shops that sell them are closed (and even most of their online stores are too).

We’re a proactive bunch, so we’ve done something about that:


From today, you can subscribe to The MagPi, HackSpace magazine, Custom PC, or Digital SLR Photography at a cost of three issues for £10 in the UK – and we’re giving you a little extra too.

We like to think we produce some of the best-quality magazines on the market today (and you only have to ask our mums if you want a second opinion). In fact, we’d go as far as to say our magazines are exactly the right mix of words and pictures for making the most of all the extra home-time you and your loved ones are having.

Take your pick for three issues at £10 and get a free book worth £10!

If you take us up on this offer, we’ll send the magazines direct to your door in the UK, with free postage. And we’re also adding a gift to thank you for signing up: on top of your magazines, you’ll get to choose a book that’s worth £10 in itself.

In taking up this offer, you’ll get some terrific reading material, and we’ll deliver it all straight to you — no waiting around. You’ll also be actively supporting our print magazines and the charitable work of the Raspberry Pi Foundation.

I hope that among our magazines, you’ll find something that’s of interest to you or, even better yet, something that sparks a new interest. Enjoy your reading!

The post Special offer for magazine readers appeared first on Raspberry Pi.

Build a physical game controller for Infinite Bunner

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/build-a-physical-game-controller-for-infinite-bunner/

In HackSpace magazine issue 28 we had a look at how to create an ultrasonic controller for a version of Pong called Boing!. This month, we’re going to take a step further forward through video game history and look at the game Frogger. In this classic game, you control a frog as it makes its way across logs, roads, and train tracks, avoiding falling in the water or getting hit.

Infinite Bunner

The tribute to Frogger in the new Code the Classics Volume 1 book is called Infinite Bunner, and works in much the same way, except you control a bunny.

Jump along the logs, dodge the traffic, avoid the trains, and keep your bunny alive for as long as possible

All this hopping got us thinking about a controller. Our initial idea was that since the animals jump, so should the controller. An accelerometer can detect freefall, so it shouldn’t be too hard to convert that into button presses. However, it turns out that computer-controlled frogs and rabbits can jump much, much faster than humans can, and we really struggled to get a working game mechanic, so we compromised a little and worked with ‘flicks’.

The flick controller

The basic idea is that you tilt the controller left or right to move left or right, but you have to flick it up to register a jump (simply holding it upright won’t work).

We’ve used a Circuit Playground Bluefruit as our hardware, but it would work equally well with a Circuit Playground Express. There are two key parts to the software. The first is reading in accelerometer values and using them to work out the board’s orientation; the second is having the board mimic a USB keyboard and send keystrokes to whatever software is running on the host computer.

Playing Infinite Bunner

The first step is to get Infinite Bunner working on your machine.

Get your copy of Code the Classics today

You can download the code for all the Code the Classics Volume 1 games here. Click on Clone or Download > Download ZIP. Unzip the download somewhere.

You’ll need Python 3 with Pygame Zero installed. The process for this differs a little between different computers, but there’s a good overview of all the different options on page 186 of Code the Classics.

Subscribe to HackSpace magazine for twelve months and you get a Circuit Playground Express for free! Then you can make your very own Infinite Bunner controller

Once everything’s set up, open a terminal and navigate to the directory you unzipped the code in. Then, inside that, you should find a folder called bunner-master and move into that. You can then run:

python3 bunner.py

Have a few goes playing the game, and you’ll find that you need the left, right, and up arrow keys to play (there is also the down arrow, but we’ve ignored this since we’ve never actually used it in gameplay – if you’re a Frogger/Bunner aficionado, you may wish to implement this as well).

Reading the accelerometer is as easy as importing the appropriate module and running one line:

from adafruit_circuitplayground import cp

x, y, z = cp.acceleration

Sending key presses is similarly easy. You can set up a keyboard with the following:

import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keyboard_layout_us import KeyboardLayoutUS
from adafruit_hid.keycode import Keycode

keyboard = Keyboard(usb_hid.devices)

Then send key presses with code such as this:

keyboard.press(Keycode.LEFT_ARROW)
time.sleep(0.1)
keyboard.release_all()

The only thing left is to slot in our mechanics. The X-axis on the accelerometer can determine if the controller is tilted left or right. The output is between 10 (all the way left) and -10 (all the way right). We chose to threshold it at 7 and -7 to require the user to tilt it most of the way. There’s a little bit of fuzz in the readings, especially as the user flicks the controller up, so having a high threshold helps avoid erroneous readings.
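
If the fuzz bothers you, another option (one we didn’t use in the final code) is to average a handful of consecutive readings before comparing them against the thresholds. A minimal sketch of the idea:

# Average a few accelerometer samples to smooth out jitter.
# A simple alternative to raising the thresholds; not used in the final code.
import time
from adafruit_circuitplayground import cp

def smoothed_acceleration(samples=5, delay=0.01):
    totals = [0.0, 0.0, 0.0]
    for _ in range(samples):
        x, y, z = cp.acceleration
        totals[0] += x
        totals[1] += y
        totals[2] += z
        time.sleep(delay)
    return [t / samples for t in totals]

x, y, z = smoothed_acceleration()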

The Y-axis is for jumping. In this case, we require a ‘flap’ where the user first lifts it up (over a threshold of 5), then back down again.

The full code for our controller is:

import time
from adafruit_circuitplayground import cp
import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keyboard_layout_us import KeyboardLayoutUS
from adafruit_hid.keycode import Keycode

keyboard = Keyboard(usb_hid.devices)

jumping = 0
up=False
while True:
    x, y, z = cp.acceleration
    if abs(y) > 5:
        up=True
    if y < 5 and up:
        keyboard.press(Keycode.UP_ARROW)
        time.sleep(0.3)
        keyboard.release_all()
        up=False
    if x < -7:
        keyboard.press(Keycode.LEFT_ARROW)
        time.sleep(0.1)
        keyboard.release_all()
    if x > 7:
        keyboard.press(Keycode.RIGHT_ARROW)
        time.sleep(0.1)
        keyboard.release_all()
        time.sleep(0.1)
    if jumping > 0:
        jumping=jumping-1

It doesn’t take much CircuitPython to convert a microcontroller into a games controller

The final challenge we had was that there’s a bit of wobble when moving the controller around – especially when trying to jump repeatedly and quickly. After fiddling with thresholds for a while, we found that there’s a much simpler solution: increase the weight of the controller. The easiest way to do this is to place it inside a book. If you’ve ever held a copy of Code the Classics, you’ll know that it’s a fairly weighty tome. Just place the board inside and close the book around it, and all the jitter disappears.

That’s all there is to the controller. You can use it to play the game, just as you would any joypad. Start the game as usual, then start flapping the book around to get hopping.

HackSpace magazine is out now

The latest issue of HackSpace magazine is out today and can be purchased from the Raspberry Pi Press online store. You can also download a copy if you want to see what all the fuss is about.


Code the Classics is available from Raspberry Pi Press as well, and comes with free UK shipping. And here’s a lovely video about Code the Classics artist Dan Malone and the gorgeous artwork he created for the book:

Code the Classics: Artist Dan Malone

No Description

The post Build a physical game controller for Infinite Bunner appeared first on Raspberry Pi.

Play Pong with ultrasonic sensors and a Raspberry Pi | HackSpace magazine

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/play-pong-with-ultrasonic-sensors-and-a-raspberry-pi-hackspace-magazine/

Day three of our Pong celebration leads us here, to HackSpace magazine’s ultrasonic hack of Eben’s Code the Classics Pong tribute, Boing!

If you haven’t yet bought your copy of Code the Classics, you have until 11:59pm GMT tonight to get £1 off using the discount code PONG. Click here to visit the Raspberry Pi Press online store to secure your copy, and read on to see how you can use ultrasonic sensors to turn this classic game into something a lot more physical.

Over to the HackSpace magazine team…

Code the Classics is an entertaining book for a whole bunch of reasons, but one aspect of it that is particularly exciting to us makers is that it means there are some games out there that are really fun to play, but also written to be easy to understand and have high-quality game art to go along with them. Why does this excite us as makers? Because it makes them ideal candidates for testing out novel DIY games controllers!

Pong

We’re going to start right at the beginning of the book (and also at the beginning of computer game history) with the game Pong. There’s a great chapter on this seminal game in the book, but we’ll dive straight into the source code of our Boing! tribute game. This code should run on any computer with Python 3 (and a few dependencies) installed, but we’ll use a Raspberry Pi, as this has GPIO pins that we can use to add on our extra controller.

Download the code here by clicking the ‘Clone or download’ button, and then ‘Download ZIP’. Unzip the downloaded file, and you should have a directory called Code-The-Classics-master, and inside this, a directory called boing-master.

Open a terminal and navigate to this directory, then run:

python3 boing.py

If everything works well, you’ll get a screen asking you to select one or two players – press SPACE to confirm your selection, and have a play.

Hacking Code the Classics

So that’s how Eben Upton designed the game to be played. Let’s put our own spin on it. Games controllers are basically just sensors that take input from the real world in some way and translate that into in-game actions. Most commonly, these sensors are buttons that you press, but there’s no need for that to be the case. You can use almost any sensor you can get input from – it sounds trite, but the main limitation really is your imagination!

We were playing with ultrasonic distance sensors in the last issue of HackSpace magazine, and a Pong controller sprang to mind. After all, distance sensors measure in one dimension and Pong bats travel in one dimension.

Last issue we learned that the main challenge when using the cheap HC-SR04 sensors with 3.3V devices is that they use 5V, so we need to reduce their output to 3.3V. A simple voltage divider does the trick, and we used three 330Ω resistors to achieve this – see Figure 1 for more details.

There’s support for these sensors in the GPIO Zero Python library. As a simple test, you can obtain the distance with the following Python code:

import gpiozero
import time
sensor = gpiozero.DistanceSensor(echo=15,trigger=14)

while True:
    print(sensor.distance)
    time.sleep(0.1)

That will give you a constant read-out of the distance between the ultrasonic sensor and whatever object is in front of it. If you wave your hand around in front of the sensor, you’ll see the numbers change between 0 and 1 – the distance in metres.

So far, so straightforward. We only need to add a few bits to the code of our Boing! game to make it interact with the sensor. You can download an updated version of Boing! here, but the changes are as follows.

Add this line to the import statements at the top:

import gpiozero

Add this line to instantiate the distance sensor object at the end of the file (just before pgzrun.go()):

p1_distance = gpiozero.DistanceSensor(echo=15, trigger=14, queue_len=5)

We added the queue_len parameter to get the distances through a little quicker.

Finally, overwrite the p1_controls function with the following:

def p1_controls():
    move = 0
    distance = p1_distance.distance
    print(distance)
    if distance < 0.1:
        move = PLAYER_SPEED
    elif distance > 0.2:
        move = -PLAYER_SPEED
    return move

This uses the rather arbitrary settings of 10 cm and 20 cm to define whether the paddle moves up or down. You can adjust these as required.
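
If you fancy something smoother than simple up/down switching, one variation (ours, not from the book) is to scale the paddle’s speed with how far your hand is from a centre point, clamped to the game’s maximum speed. A hedged sketch that drops in place of p1_controls, using the same p1_distance object and the PLAYER_SPEED constant already defined in boing.py:

def p1_controls():
    # Proportional variant: speed scales with distance from a 15 cm centre
    # point, clamped to PLAYER_SPEED in either direction.
    centre = 0.15        # metres
    gain = 40            # tune to taste
    distance = p1_distance.distance
    move = (centre - distance) * gain * PLAYER_SPEED
    return max(-PLAYER_SPEED, min(PLAYER_SPEED, move))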

That’s all there is to our ultrasonic Pong. It’s great fun to play, but there are, no doubt, loads of other versions of this classic game you can make by adding different sensors. Why not see what you can come up with?

Code the Classics

Today is the last day to get £1 off Code the Classics with the promo code PONG, so visit the Raspberry Pi Press online store to order your discounted copy before 11:59pm GMT tonight.

You can also download Code the Classics as a free PDF here, but the book, oh, the book – it’s a marvellous publication that deserves a physical presence in your home.

The post Play Pong with ultrasonic sensors and a Raspberry Pi | HackSpace magazine appeared first on Raspberry Pi.

How to play sound and make noise with your Raspberry Pi

Post Syndicated from Andrew Gregory original https://www.raspberrypi.org/blog/how-to-play-sound-and-make-noise-with-your-raspberry-pi/

If your amazing project is a little too quiet, add high-fidelity sound with Raspberry Pi and the help of this handy guide from HackSpace magazine, written by PJ Evans.

The PecanPi HAT features best-in-class components and dual DACs for superior audio reproduction

It’s no surprise that we love microcontrollers at HackSpace magazine. Their versatility and simplicity make them a must for electronics projects. Although a dab hand at reading sensors or illuminating LEDs, Arduinos and their friends do struggle when it comes to high-quality audio. If you need to add music or speech to your project, it may be worth getting a Raspberry Pi computer to do the heavy lifting. We’re going to look at the various audio output options available for our favourite small computer, from a simple buzz, through to audiophile bliss.

Get buzzing

Need to keep it simple and under a pound?
An active buzzer is what you need

The simplest place to start is with the humble buzzer. A cheap active buzzer can be quickly added to Raspberry Pi’s GPIO. It’s surprisingly easy too. Try connecting a buzzer’s red wire (positive) to GPIO pin 22 (Broadcom numbering) and the black wire (ground) to any GND pin. Now, install the GPIO Zero Python library by typing this at the command line:

sudo apt install python3-gpiozero

Create a file called buzz.py in your favourite editor and enter the following:

import time
from gpiozero import Buzzer
buzzer = Buzzer(22)
buzzer.on()
time.sleep(1)
buzzer.off()

Run it at the command line:

python3 buzz.py

You should hear a one-second buzz. See if you can make Morse code sounds by changing the duration of the sleep statement.
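
For instance, the Buzzer class has a beep() method that handles the on/off timing for you. Here’s a quick sketch (ours, not from the original tutorial) that taps out a rough SOS:

# Tap out a rough SOS using GPIO Zero's beep() helper.
from time import sleep
from gpiozero import Buzzer

buzzer = Buzzer(22)   # same GPIO pin as before

buzzer.beep(on_time=0.1, off_time=0.1, n=3, background=False)  # S: three short
sleep(0.2)
buzzer.beep(on_time=0.3, off_time=0.1, n=3, background=False)  # O: three long
sleep(0.2)
buzzer.beep(on_time=0.1, off_time=0.1, n=3, background=False)  # S: three short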

Passive but not aggressive

Raspberry Pi computers, with the exception of the Zero range, all have audio output on board. The original Raspberry Pi featured a stereo 3.5mm socket, and all A and B models since feature a four-pole socket that also includes composite video. This provides your cheapest route to getting audio from your Raspberry Pi computer.

A low-cost passive speaker can be directly plugged in to provide sound, albeit probably quieter than you’d like. Of course, add an amplifier or active speaker and you have sound as loud as you like. This is the most direct way of adding sound to your project, but how to get the sound out?

Need a simple solution? USB audio devices come in all shapes and sizes but are mostly plug-and-play

Normally, the Raspbian operating system will recognise that an audio device has been connected and route audio through it. Sometimes, especially if you’ve connected an HDMI monitor with sound capability (e.g. an HDMI TV), sound will not come out of the correct device.

To fix this, open up a terminal window and run sudo raspi-config. When the menu appears, go to Advanced Options and select Audio, then select the option to force the output through the audio jack. You may need to reboot Raspbian for all changes to take effect.

Plug and playback

A USB sound device is another simple choice for audio playback on Raspberry Pi. Literally hundreds are available, and a basic input/output device with better audio quality than the on-board system can be purchased for a few pounds online. Installation tends to be no more complicated than plugging the device into the USB port. You may need to select the new output, as the underlying audio system, ALSA (see the ALSA and PulseAudio section for more), may mute it by default. To fix this, run alsamixer from the command line, press F6 to select the new sound device, and if you see ‘MM’ at the bottom of the volume indicator, press M to unmute and adjust the volume with the cursor keys.

Many DACs also come with on-board amplifiers. Perfect for passive speakers

Unsurprisingly, when choosing your USB sound device, you can start at a few pounds and go right up to professional equipment costing hundreds. As they are low-power, USB devices do not tend to feature amplification, unless they have a separate power source.

Let’s play

The simplest way to play audio on Raspbian is to use OMXPlayer. This is a dedicated hardware-accelerated command-line tool that takes full advantage of Raspberry Pi’s capabilities. It sends audio to the analogue audio jack by default, so playing back an audio file is as simple as running:

omxplayer /path/to/audio/file.wav

There are many command-line options that allow you to control how the audio is played. Want the audio to loop forever? Just add --loop to the command. You’ll notice that when it’s running, OMXPlayer provides a user interface of sorts, allowing you to control playback from within the terminal. If you’d just like it to run in the background without user input, run the command like this:

omxplayer --no-keys example.wav &

Here, --no-keys removes the interface, and the ampersand (&) tells the operating system to run the job ‘in the background’ so that it won’t block anything else you want to do.

OMXPlayer is a great choice for Raspbian, but other players such as mpg321 are available, so find the tool that’s best for you.
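
If you’d rather trigger playback from inside a Python project than type commands by hand, one simple approach is to call OMXPlayer through the subprocess module. A minimal sketch, assuming an example.wav file in the current directory:

# Launch OMXPlayer from Python and wait for playback to finish.
import subprocess

def play(path):
    # --no-keys removes the interactive interface, as described above
    subprocess.run(["omxplayer", "--no-keys", path], check=True)

play("example.wav")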

Another useful utility is speaker-test. This can produce white noise or vocal confirmation so you can check your speakers are working properly. It’s as simple as this:

speaker-test -t wav -c 2

The first parameter sets the sound to be a voice, and the -c tests stereo channels only: front left and front right.

Phat Beats

If space is an issue, a Raspberry Pi 4, amplifier, and speaker may not be what you have in mind. After all, your cool wearable project is going to be problematic if you’re trailing an amplifier on a cart with a 50-metre extension lead powering everything. Luckily, the clever people at Pimoroni have you covered. The Speaker pHAT is a Raspberry Pi Zero-sized HAT that not only adds audio capability to the smallest of the Raspberry Pi family, but also sports a 3 W speaker. Now you can play any audio with a tiny device and a USB battery pack.

Small, cheap, and fun, the Speaker pHAT features a 3 W speaker and LED VU meter

The installation process is fully automated, so no messing around with drivers and config files. Once the script has completed, you can run any audio tool as before, and the sound will be routed through the speaker. No, the maximum volume won’t be troubling any heavy metal concerts, but you can’t knock the convenience and form factor.

Playing the blues

An easy way to get superior audio quality using a Raspberry Pi computer is Bluetooth. Recent models such as the 3B, 4, and even the Zero W support Bluetooth devices, and can be paired with most Bluetooth speakers, even from the command line. Once connected, you have a range of options on size and output power, plus the advantage of wireless connectivity.

Setting up a Bluetooth connection, especially if you are using the command line, can be a little challenging (see the Bluetooth cheatsheet section). There is a succinct guide here: hsmag.cc/N6p2IB. If you are using Raspbian Desktop, it’s a lot easier. Simply click on the Bluetooth logo on the top-right, and follow the instructions to pair your device.

If you find OMXPlayer isn’t outputting any audio, try installing mpg321:

sudo apt install mpg321

And try again:

mpg321 /path/to/audio/file.mp3

But seriously

If your project needs good audio, and the standard 3.5 mm output just isn’t cutting it, then it’s time to look at the wide range of DACs (digital-to-analogue converters) available in HAT format. It’s a crowded market, and the prices vary significantly depending on what you want from your device. Let’s start at the lower end, with major player HiFiBerry’s DAC+ Zero. This tiny HAT adds 192kHz/24-bit playback via two RCA phono ports for £12.50. If you’re serious about your audio, then you can consider the firm’s full HAT format high-resolution DAC+ Pro for £36, or really go for it with the DSP (digital sound processing) version for £67. All of these will require amplification, but the sound quality will rival audio components of a much higher price.

Money no object? The Allo Katana is a monster DAC, and weighs in at £240, but outperforms £1000 equivalents

If money is no object and your project requires the best possible reproduction, then you can consider going full audiophile. There are some amazing high-end HATs out there, but one of the best-performing ones we’ve seen is the PecanPi DAC. Its creator Leonid Ayzenshtat sourced each individual component carefully, always choosing the best-in-class. He even used a separate DAC for each audio channel. The resulting board may make your wallet wince at around £200 for the bare board, but the resulting audio is good enough to be used in professional recording studios. If you’ve restored a gorgeous old radio back to showroom condition, you could do a lot worse than add the board in with a great amp and speaker.

ALSA and PulseAudio

There’s often confusion between these two systems. Raspbian comes pre-installed with ALSA (Advanced Linux Sound Architecture), which is the low-level software that makes sound work. It comes with a range of utilities to control output device, volume, and more. PulseAudio is a software layer that sits on top of ALSA to provide more features, including streaming capabilities. Chances are, if you need to do something a bit more clever than just play audio, you’ll need to install a PulseAudio server.

Bluetooth cheatsheet

If you want to pair a Bluetooth audio device (A2DP) on the command line, it can be a little hairy. Here’s a quick guide:

First-time installation:

sudo apt-get install pulseaudio pulseaudio-module-bluetooth
sudo usermod -G bluetooth -a pi
sudo reboot

Start the PulseAudio server:

pulseaudio --start

Run the Bluetooth utility:

bluetoothctl

Put your speaker into pairing mode. Now, within the utility, run the following commands (pressing Enter after each one):

power on
agent on
scan on

Now wait for the list to populate. When you see your device…

pair <dev>

Where <dev> is the displayed long identifier for your device. You can just type in the first few characters and press Tab to auto-complete. Do the same for the following steps.

trust <dev>
connect <dev>

Wait for the confirmation, then enter:

quit

Now try to play some audio using aplay (for WAV files) or mpg321 (for mp3). These instructions are adapted from the guide by Actuino at hsmag.cc/N6p2IB.

File types

There are command-line players available for just about every audio format in common use. Generally, MP3 provides the best balance of quality and space, but lower bit-rates result in lower sound quality. WAV is completely uncompressed, but can eat up your SD card. If you don’t want to compromise on audio quality, try FLAC, which is identical in quality to WAV, but much smaller. To convert between audio types, consider installing FFmpeg, a powerful audio and video processing tool.
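
As a quick example of that last point, converting an uncompressed WAV recording to FLAC with FFmpeg is a one-liner; here it is wrapped in Python so it can sit inside a project script (assuming ffmpeg is installed and input.wav exists):

# Convert an uncompressed WAV file to FLAC using FFmpeg.
import subprocess

subprocess.run(["ffmpeg", "-i", "input.wav", "output.flac"], check=True)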

HackSpace magazine

This article comes direct from HackSpace magazine issue 28, out now and available in print from your local newsagent, the Raspberry Pi Store in Cambridge, and online from Raspberry Pi Press.

If you love HackSpace magazine as much as we do, why not have a look at the subscription offers available, including the 12-month deal that comes with a free Adafruit Circuit Playground! Subscribers in the USA can now get a 12-month subscription for $60 when joining by the end of March!

And, as always, you can download the free PDF from the Raspberry Pi Press website.

The post How to play sound and make noise with your Raspberry Pi appeared first on Raspberry Pi.