A portable, affordable Raspberry Pi-powered blood analyser is helping to “establish a robust healthcare ecosystem” in remote parts of India. Samples can be tested in just 30 seconds, and the cost and size of the parts make it an attractive solution for rural and resource-strapped areas.
It is the work of researchers Sangeeta Palekar and Jayu Kalambe from the Department of Electronics Engineering at Shri Ramdeobaba College of Engineering and Management.
Tiny computer — massive processing power
Regular blood tests are vital in the tracking and elimination of many diseases, but there is a huge time and monetary cost currently tied to this type of laboratory work.
The researchers’ device measures light absorbance through a blood sample, a common type of analysis, and they harnessed the processing capability of Raspberry Pi 4 Model B to analyse the absorbance data. Their Raspberry Pi-powered solution was found to perform on a par with the kind of expensive lab-based blood test typically used.
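The full paper is paywalled, so we haven’t seen the researchers’ code, but the principle behind absorbance measurements is standard chemistry: the Beer-Lambert law relates how much light a sample soaks up to the concentration of the substance being measured. Here’s a minimal Python sketch of that relationship (our illustration, not the team’s implementation):

```python
import math

def absorbance(incident_intensity, transmitted_intensity):
    """Absorbance A = log10(I0 / I): shine light of intensity I0 through
    the sample and measure the intensity I that makes it out the other side."""
    return math.log10(incident_intensity / transmitted_intensity)

def concentration(a, molar_absorptivity, path_length_cm):
    """Beer-Lambert law: A = epsilon * c * l, rearranged for concentration c."""
    return a / (molar_absorptivity * path_length_cm)

# A sample that transmits 10% of the incident light has an absorbance of 1.0
print(absorbance(100.0, 10.0))  # → 1.0
```

On the real device, the two intensities would come from light-sensor readings taken with and without the sample in the light path.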
Quick and easy
Sangeeta and Jayu’s analyser is not only cheaper to build and maintain than the lab-based version, it also does the job better. Using the lab-based method means that samples from patients in rural areas must be sent away for analysis, with results communicated back to patients at a much later date. In contrast, Sangeeta and Jayu’s device can process blood samples there and then. All you need is an electricity source. Patients get their results immediately, and there is no need to transport delicate samples across rural terrain.
Incorporating an IoT element into their design, which would allow for remote monitoring, is the next step for the researchers. They also intend to develop their invention to allow it to carry out different types of blood analyses.
Read more about the science behind the creation
The full research paper is behind a paywall, but the abstract does a great job of succinctly explaining all the science. Sangeeta herself also explains a lot of the magic behind her creation in this interview with IEEE Spectrum.
Raspberry Pi Pico powers this real-time audio spectrogram visualiser using a digital microphone to pick up the sound and an LCD display to show us what those sounds ‘look’ like.
First things first
OK firstly, let’s make sure we know what all of those words mean, because ‘audio spectrogram visualiser’ is a bit of a mouthful:
A ‘spectrogram’ is a visual way of representing the strength, or “loudness”, of a signal over time at the various frequencies it contains.
The ‘visualiser’ bit comes in when these frequencies are presented as waveforms on the little screen.
And the ‘audio’ is simply because Sandeep is visualising sounds in this project.
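To make that a little more concrete, here’s a minimal Python sketch of how a spectrogram is typically computed (our illustration, not Sandeep’s actual Pico code): chop the audio into short overlapping windows and take the FFT of each one.

```python
import numpy as np

def spectrogram(signal, window_size=256, hop=128):
    """Split the signal into overlapping windows and FFT each one.
    Rows of the result are time frames; columns are frequency bins."""
    frames = []
    for start in range(0, len(signal) - window_size + 1, hop):
        window = signal[start:start + window_size] * np.hanning(window_size)
        frames.append(np.abs(np.fft.rfft(window)))  # magnitude spectrum
    return np.array(frames)

# A pure 1 kHz tone should put its energy in a single frequency bin
rate = 8000
t = np.arange(rate) / rate
frames = spectrogram(np.sin(2 * np.pi * 1000 * t))
peak_bin = frames[0].argmax()
print(peak_bin * rate / 256)  # → 1000.0
```

Drawing each frame as a column of coloured pixels on the LCD gives you the ‘visualiser’ part.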
Perfectly portable sound monitor
This pocket-sized device can be carried around with you and lets you see a visual representation of your surrounding audio environment in real time. So, if you wander into a peaceful bird reserve or something, the LCD display will show you something very different from what you’d see in, say, Wembley Stadium during an FA Cup final.
Above, you can see Sandeep’s creation in action in the vicinity of a crying baby.
Baltic is a handsome 1962 vintage tugboat that was built in Norway, where she operated until the 1980s. She’s now in English waters, having been registered in Southampton once renovations were complete. After some initial hull restoration work in France, she sailed to the western Ligurian coast in Italy, where the restoration took about five years to complete. The boat’s original exterior was restored, while the inside was fully refurbished to the standard of a luxury yacht.
But where is the Raspberry Pi?
Ulderico Arcidiaco, who coordinated the digital side of Baltic’s makeover, is the CEO of Sfera Labs, so naturally he turned to Raspberry Pi Compute Module 3+ in the guise of Sfera’s Strato Pi CM Duo for the new digital captain of the vessel.
Strato Pi CM Duo is an industrial server comprising a Raspberry Pi Compute Module 3+ inside a DIN-rail case with a slew of additional features. The MagPi magazine took a good look at them when they launched.
The Strato Pi units are the four with red front panels in the cabinet pictured below. There are four other Raspberry Pi Compute Modules elsewhere onboard. Two are identical to the Strato Pi CM Duos in this photo; another is inside an Iono Pi Max; and there’s a Compute Module 4 inside an Exo Sense Pi down in the galley.
Thoroughly modern makeover
Baltic now has fully integrated control of all core and supplementary functions, from power distribution to tanks and pump control, navigation, alarms, fire, lighting, stabilisers, chargers, inverters, battery banks, and video. All powered by Raspberry Pi.
“When it was built sixty years ago, not even the wildest science fiction visionary could have imagined she would one day be fully computer controlled, and not by expensive dedicated computer hardware, but by a tiny and inexpensive device that any kid can easily buy and play with to have fun learning.
And, if there is some old-fashioned patriotism in things, the Raspberry Pi on board will surely like the idea of being back under their home British Flag.”
The Sky Vane provides the soundtrack to an immersive sky-driven experience. Just lie down on the grass, gaze up at the sky, and listen to the changing soundscape through the day.
A Raspberry Pi powers the arresting structure in the middle of the circle of comfy skygazing mats in the photo above, and is connected to an array of atmospheric sensors. These sensors detect changes in light, temperature, pressure, and humidity. Then they send real-time data to the Raspberry Pi computer in order to create a dynamic soundtrack.
The Sky Vane’s creators produced a carefully written soundtrack for the experience. Raspberry Pi triggers changes to the number of musical layers, sequences, audio effects processing, and so on, based on the information the sensors read. That’s the “dynamic” bit. A huge gust of wind, for example, leads to a different musical change than the setting sun.
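pyka haven’t published their code, so the mapping below is entirely invented, but it sketches the shape of that “dynamic” idea: sensor readings gate how many musical layers are playing at once.

```python
def choose_layers(light_lux, wind_ms, max_layers=6):
    """Toy mapping from sensor readings to a count of active musical layers.
    The thresholds and layer counts are made-up illustration values."""
    layers = 1  # a base drone is always playing
    layers += min(3, int(light_lux // 20000))  # brighter daylight adds up to 3 layers
    layers += min(2, int(wind_ms // 5))        # stronger wind adds up to 2 more
    return min(layers, max_layers)

# A bright, breezy afternoon sounds fuller than a still, dark evening
print(choose_layers(light_lux=45000, wind_ms=7))  # → 4
print(choose_layers(light_lux=0, wind_ms=0))      # → 1
```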
A portable Minirig sound system generates a seriously high-fidelity audio experience that can be heard clearly within a 25-metre radius of The Sky Vane.
Everything hides underneath the dome-shaped “shroom pod”, which in turn sits beneath the big sculpture
Inspiration behind the installation
The Sky Vane is the latest installation from pyka, a collective of experienced designers who create digital artefacts that enable you to explore the world of sound. Commissioned by Tin Shed Theatre Company and Our Living Levels, The Sky Vane’s big debut was at the Big Skies 2021 event in south Wales.
When they were planning this installation, the creators at pyka weren’t sure how it would go down in a post-pandemic world. They’re used to building things that bring people together, but they were mindful of people’s anxiety around shared public activities. This led to a design that promotes quiet contemplation and mindfulness whilst enjoying the freedom of the outdoors. We think it’s lovely.
You sit down with your six-string, ready to bash out that new song you recently mastered, but find you’re out of tune. Redditor u/thataintthis (Guyrandy Jean-Gilles) has taken the pain out of tuning your guitar, so those of us lacking this necessary skill can skip the boring bit and get back to playing.
Before you dismiss this project as just a Raspberry Pi Pico-powered guitar tuning box, read on, because when the maker said this is a fully automatic tuner, they meant it.
How does it work?
Guyrandy’s device listens to the sound of a string being plucked and decides which note it needs to be tuned to. Then it automatically turns the tuning keys on the guitar’s headstock just the right amount until it achieves the correct note.
If this were a regular tuning box, it would be up to the musician to fiddle with the tuning keys while twanging the string until they hit a note that matches the one being made by the tuning box.
It’s currently hardcoded to do standard tuning, but it could be tweaked to do things like Drop D tuning.
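The maker wrote their own pitch-detection code in C; here’s a Python sketch of just the note-selection step that follows it. The tuning table and helper names are ours, not the maker’s:

```python
import math

# Open-string frequencies (Hz) for standard tuning, low E to high E
STANDARD_TUNING = {"E2": 82.41, "A2": 110.00, "D3": 146.83,
                   "G3": 196.00, "B3": 246.94, "E4": 329.63}

def nearest_string(freq_hz):
    """Pick the target string closest (in cents) to the detected pitch.
    The error is negative when the string is flat, positive when sharp."""
    def cents(f, target):
        return 1200 * math.log2(f / target)
    name = min(STANDARD_TUNING,
               key=lambda n: abs(cents(freq_hz, STANDARD_TUNING[n])))
    return name, cents(freq_hz, STANDARD_TUNING[name])

name, error = nearest_string(108.0)
print(name, round(error, 1))  # the A string, roughly 32 cents flat
```

Drop D tuning would just mean swapping the E2 entry for D2 (73.42 Hz) in the table.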
Commenters were quick to share great ideas to make this build even better. Issues of harmonics were raised, and possible new algorithms to get around them were shared. Another commenter noticed the maker wrote their own code in C and suggested making use of the existing ulab FFT in MicroPython. And a final great idea was training the Raspberry Pi Pico to accept the guitar’s audio output as input and analyse the note that way, rather than using a microphone, which picks up a less clean signal.
These upgrades seemed to pique the maker’s interest. So maybe watch this space for a v2.0 of this project…
(Watch out for some spicy language in the comments section of the original reddit post. People got pretty lively when articulating their love for this build.)
This project was inspired by the Roadie automatic tuning device. Roadie is sleek but it costs big cash money. And it strips you of the hours of tinkering fun you get from making your own version.
A team from National Yang Ming Chiao Tung University has developed a foot-pressure-sensing insole to detect Parkinson’s disease. Using our tiny computers, they managed to create something discreet that can monitor people as they walk around in their own shoes.
What is Parkinson’s disease?
Parkinson’s disease is a neurodegenerative disorder that mostly affects people aged over 60, though it can affect younger people too. One symptom that can suggest a diagnosis of Parkinson’s disease is an abnormal gait – that is, when someone’s walk has changed from its usual pattern. It’s this that the project aims to detect.
While there is currently no cure for Parkinson’s, many people respond well to treatment with medication and physical therapy, and early detection gives people a better chance of a good quality of life for as long as possible.
Eight FlexiForce sensors are placed evenly on each insole of a user’s shoes to measure their gait as they go about their day:
A Raspberry Pi 3 is fixed to a strap around the user’s knee and paired with a Himax WE-I Plus. Cables connect the knee-strapped hardware to the sensors in the insoles.
How does it work?
The sensors in the user’s shoes detect pressure across the whole foot while walking. Data is then processed by the Raspberry Pi and the user’s gait is assessed. Users pair the device with a mobile app to see their results. The app also shows real-time data while they’re walking.
The team took advantage of a free online database that collects foot pressure data from both Parkinson’s disease patients and people without Parkinson’s who have a typical gait. They used this to train their own machine learning model, which predicts whether a user has a gait that may indicate Parkinson’s disease.
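The team’s model architecture isn’t described in detail here, so as a stand-in, below is a deliberately simple nearest-centroid classifier over made-up eight-sensor pressure features. It illustrates the train-then-predict workflow only, not the team’s actual model:

```python
import numpy as np

def train_centroids(samples, labels):
    """'Training' here just averages the feature vectors for each class."""
    groups = {}
    for x, y in zip(samples, labels):
        groups.setdefault(y, []).append(x)
    return {y: np.mean(xs, axis=0) for y, xs in groups.items()}

def predict(centroids, x):
    """Classify a new gait sample by its closest class centroid."""
    return min(centroids, key=lambda y: np.linalg.norm(np.asarray(x) - centroids[y]))

# Toy eight-sensor pressure averages: the numbers are purely illustrative
typical = [[4, 5, 5, 4, 4, 5, 5, 4], [5, 5, 4, 4, 5, 4, 5, 5]]
atypical = [[2, 2, 6, 6, 2, 2, 6, 6], [2, 3, 6, 5, 2, 3, 6, 6]]
model = train_centroids(typical + atypical, ["typical"] * 2 + ["atypical"] * 2)
print(predict(model, [2, 2, 6, 6, 3, 2, 6, 5]))  # → atypical
```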
Check out a live demo from this point in the project video.
The team submitted this project in the 2021 Synopsys ARC AIoT Design Contest and scored a second-place prize. Check out more project videos from this year’s submissions.
Assessing gait as part of a diagnosis of potential Parkinson’s disease usually requires that patients take trips to the hospital to have tests on large pressure-sensing walking mats. The team’s new device offers a much more portable and affordable approach.
Researchers from the University of Trento have developed a Raspberry Pi-powered device that automatically detects pests in fruit orchards so they can get sorted out before they ruin a huge amount of crop. There’s no need for farmer intervention either, saving their time as well as their harvest.
The researchers devised an embedded system that uses machine learning to process images captured inside pheromone traps. The pheromones lure the potential pests in to have their picture taken.
Each trap is built on a custom hardware platform that comprises:
Sony IMX219 image sensor to collect images (chosen because it’s small and low-power)
Intel Neural Compute module for machine learning optimisation
Long-range radio chip for communication
Solar energy-harvesting power system
The research paper mentions that Raspberry Pi 3 was chosen because it offered the best trade-off between computing capability, energy demand, and cost. It doesn’t say which Raspberry Pi 3 model they used, but we’re chuffed nonetheless.
How does it work?
The Raspberry Pi computer manages the sensor, processing the captured images and transmitting them for classification.
Then the Intel Neural Compute Stick is activated to perform the machine learning task. It provides a boost to the project by reducing the inference time, so we can tell more quickly whether a potentially disruptive bug has been caught, or just a friendly bug.
In this case, it’s codling moths we want to watch out for. They are major pests to agricultural crops, mainly fruits, and they’re the reason you end up with apples that look like they’ve been feasted on by hundreds of maggots.
When this task is done manually, farmers typically check codling moth traps twice a week. But this automated system checks the pheromone traps twice every day, making it much more likely to detect an infestation before it gets out of hand.
Maker Mellow was inspired by watching the progress of NASA’s Perseverance Mars rover and wanted in on the interplanetary robot scene. Their first idea was to build a scale version of Perseverance, but when their partner stepped in to suggest that starting smaller might be a little easier, Zippy was born.
Zippy is built on ProtoTank, a bolt-together modular tank-style robotics platform.
Zippy’s basic parts haven’t changed much through its three iterations. You can follow the journey of Zippy 1.0 through 3.0 on Mellow’s website. You’ll see that some additional hardware was required when Mellow made some improvements.
The first version of Mellow’s mini Mars rover was just a motor on a 3D-printed body, controlled by plugging in wires to the battery. But Mellow was desperate to level up their robot and build something that could be controlled by an Xbox controller. They reached that goal with Zippy 2.0 and can drive the mini Mars rover remotely via Bluetooth. However, the Bluetooth range is quite short, so slow runners need not apply for the job of pilot.
Zippy 3.0 comes complete with a DJI Osmo Action camera to capture its adventures.
What surfaces can Zippy ride on?
Our favourite part of Mellow’s original project post is the list rating how good Zippy is at navigating various types of terrain (some of which are showcased in the video up top):
Sand – NO, it gets stuck in the wheels
Big rocks – NO, the robot is too low to the ground and gets stuck
Pebbles – with determination
Grass – only very short grass
Human bodies – surprisingly well
Carpets – Zippy loves carpets
Flat terrain – definitely
Here’s all the code you need to build your own mini Mars rover.
Follow the real thing on Mars
Keep up with NASA’s Perseverance Mars rover on Twitter. Perseverance spent its summer drilling into rocks, and has photos to prove it.
The maker turned to PowerShell – a cross-platform task automation solution – to create a script (available on GitHub) that tells the Raspberry Pi which album is playing, and sends it the album artwork for display on the LED matrix.
Raspberry Pi runs a flaschen-taschen server to display the album artwork. The PowerShell script runs a ‘send image’ command every time the album art updates. Then the Raspberry Pi switches the display to reflect what is now playing. In the demo video, the maker runs this from iTunes, but says that any PowerShell-compatible music player (e.g. Spotify) will work too.
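If you’re curious what the ‘send image’ step involves, the flaschen-taschen protocol is pleasingly simple: a binary PPM (P6) image in a single UDP packet, sent to the server’s port (1337 by default, as far as we’re aware). A rough Python sketch:

```python
import socket

def send_to_flaschen_taschen(host, width, height, rgb_bytes, port=1337):
    """Send one frame to a flaschen-taschen server: a PPM (P6) header
    followed by raw RGB pixel data, all in a single UDP packet."""
    header = f"P6\n{width} {height}\n255\n".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(header + rgb_bytes, (host, port))

# A 2x2 all-red test frame: four pixels, three bytes (R, G, B) each.
# Swap 127.0.0.1 for your Raspberry Pi's address.
send_to_flaschen_taschen("127.0.0.1", 2, 2, bytes([255, 0, 0] * 4))
```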
Setting up your own LED album art display
The maker’s original reddit post shares a step-by-step guide to follow on the software side of things. And they detail the terminal code you’ll need to run on the Raspberry Pi to get your LED Matrix displaying the correct image.
An artist and maker, Geeky Faye describes themself as a one-man band, tackling whole areas of creation. In the latest issue of The MagPi Magazine, Rob Zwetsloot meets the cosplaying polymath.
Having multiple hobbies and interests can be fun, but they can sometimes get on top of you. Allie, also known online as Geeky Faye, seems to have thrived with so many. “As it currently stands, I will happily refer to myself as a maker, artist, designer, and filmmaker because all of those are quite accurate to describe the stuff I do!” Allie tells us.
“I’ve been making almost my whole life. I dove headlong into art as a young teen, to be quickly followed by cosplay and building things that I needed for myself. I would go on to get a degree in fine arts and pursue a professional career as an artist, but that actually ended out resulting in me being on a computer all day more than anything! I’ve always needed to use my hands to create, which is why I’ve always been drawn to picking up as many making skills as possible… These days my making is all very ‘multimedia’ so to speak, involving 3D printing, textiles, electronics, wood working, digital design, and lots of paint!”
When did you learn about Raspberry Pi?
I’d heard about Raspberry Pi years ago, but I didn’t really learn about it until a few years back when I started getting into 3D printing and discovered that you could use one to act as a remote controller for the printer. That felt like an amazing use for a tool I had previously never gotten involved with, but once I started to use them for that, I became more curious and started learning a bit more about them. I’m still quite a Raspberry Pi novice and I am continually blown away by what they are capable of.
What have you made with Raspberry Pi?
I am actually working on my first ever proper Raspberry Pi project as we speak! Previously I have only set them up for use with OctoPrint, 3D-printed them a case, and then let them do their thing. Starting from that base need, I decided to take an OctoPrint server [Raspberry] Pi to the next level and started creating BMOctoPrint: an OctoPrint server in the body of a BMO (from Adventure Time). Of course, it would be boring to just slap a Raspberry Pi inside a BMO-shaped case and call it a day.
So, in spite of zero prior experience (I’m even new to electronics in general), I decided to add in functionality like physical buttons that correspond to printer commands, a touchscreen to control OctoPrint (or anything on Raspberry Pi) directly, speakers for sound, and of course user-triggered animations to bring BMO to life… I even ended out designing a custom PCB for the project, which makes the whole thing so clean and straightforward.
What’s your favourite project that you’ve done?
Most recently I redesigned my teleprompter for the third time and I’m finally really happy with it. It is 3D-printable, prints in just two pieces that assemble with a bit of glue, and is usable with most kinds of lens adapters that you can buy off the internet along with a bit of cheap plastic for the ‘glass’. It is small, easy to use, and will work with any of my six camera lenses, a problem the previous teleprompter struggled with! That said, I still think my modular picture frame is one of the coolest, hackiest things that I’ve made. I highly recommend anyone who frames more than a single thing over the course of their lives to pick up the files, as you will basically never need to buy a picture frame again, and that’s pretty awesome, I think.
Get The MagPi #109 NOW!
You can grab the brand-new issue right now from the Raspberry Pi Press store, or via our app on Android or iOS. You can also pick it up from supermarkets and newsagents. There’s also a free PDF you can download.
The maker of this robotic waiter had almost all of the parts for this project just sitting around collecting dust on a shelf. We’re delighted they decided to take the time to pick up the few extra bits they needed online, then take the extra hour (just an hour?!) to write a program in Python to get this robotic waiter up and running.
We are also thrilled to report (having spotted it in the reddit post we found this project on) that the maker had “so much fun picking up and sometimes crushing small things with this claw.” The line between serving drinks and wanting to crush things is thinner than you might imagine.
And in even better news, all the code you need to recreate this build is on GitHub.
One of our favourite things about finding Raspberry Pi-powered projects on reddit is the comments section. It’s (usually) the perfect mix of light adoration, constructive suggestions, and gateways to tangents we cannot ignore.
Like this one recalling the Rick and Morty sketch in which a cute tiny robot realises their sole purpose is to pass butter:
And also this one pointing us to another robotic arm having a grand old time picking up a tiny ball, sending it down a tiny slide, and then doing it all over again. Because it’s important we know how to make our own fun:
We also greatly enjoyed the fact that the original maker couldn’t use the Rick and Morty “what is my purpose” line to share this project because they are such an uber fan that they already used it for a project they posted just the day before. This cute creation’s sole reason for existing is to hold an Apple pencil while looking fabulous. And we are HERE for it:
It hurts our aged soul to think how many of you won’t know what a teasmade is. So here is a quick overview of this classic 20th-century technology. Now we will tell you how VEEB brought such a contraption back to life with Raspberry Pi.
Yeah, we love the project video as much as you do. The clattering trolley rolling in with this ancient tea-making machine on top. Then loudly making a Google Calendar note to brew the tea for you while you do something more useful. Genius.
Raspberry Pi reads your Google Calendar and automatically activates the kettle ten minutes before the time you’ve said you want a coffee.
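VEEB’s script itself isn’t reproduced here, but the scheduling decision is easy to sketch. Fetching upcoming events (via the Google Calendar API or a shared iCal feed) is left out; this hypothetical helper just decides whether it’s kettle time:

```python
from datetime import datetime, timedelta

LEAD_TIME = timedelta(minutes=10)  # start the kettle ten minutes early

def should_start_kettle(now, coffee_events):
    """coffee_events: start times pulled from the calendar. Returns True
    when any of them falls within the next ten minutes."""
    return any(timedelta(0) <= event - now <= LEAD_TIME for event in coffee_events)

now = datetime(2021, 9, 1, 7, 50)
print(should_start_kettle(now, [datetime(2021, 9, 1, 8, 0)]))   # → True
print(should_start_kettle(now, [datetime(2021, 9, 1, 12, 0)]))  # → False
```

On the real device, a True result would flip whatever switch is wired to the kettle.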
Then it gets super noisy. Teasmades are like that. But it’s worth it, trust me. To cover the sound of the janky old machine, VEEB has added a speaker that plays ‘God Save the Queen’ as the water heats up and pours into the clever dripper with the coffee filter in it. I’m not sure there is anything more English than that, other than if this project actually made tea and not coffee. I think coffee belongs to Seattle, but I’m not sure Seattle has a national anthem of its own. Correct me in the comments. Maybe Nirvana?
Anywho, then you sprinkle your coffee grounds into the hot water, give it a stir with a spoon, and hey presto, you have [kind of automatically brewed] coffee!
File this in the list of projects we love because engineers like to spend several hours building something to automate an activity that takes one second. In this case, switching on a kettle to boil water for your coffee.
A quick PSA to share with those not in the know the wonder that is the limited television series Father Ted. The Mrs Doyle character was infamous for her fervent insistence on making everyone a cup of tea and she was crushed when her parochial employer, Father Ted, gave her a Teasmade for Christmas to take the “misery” out of making tea. It is not a miserable task. It is a calming, soothing ritual. Stupid Father Ted.
Clem from element14 found a discarded Super 8 camera and wanted to channel his inner filmmaking hipster, but he didn’t want to spend tons of money on analogue film, so he digitised the camera with Raspberry Pi.
Clem recreated an original Super 8 cartridge and packed it with tiny hardware to do the job of the 8mm analogue film digitally. Doing it this way also means you can just drop the new cartridge into any Super 8 camera and use it as a digital device. It also means you don’t need to cut up any part of your gorgeous retro device in the process.
Getting the Raspberry Pi camera lens lined up perfectly with the original lens was the hardest part of this build. But using our tiny camera meant that the lens could be placed at exactly the right angle, because it doesn’t have to be fixed to the PCB.
Super 8 cartridges are pretty small, so the super compact Raspberry Pi 3A+ was just the right size for this project, especially as Clem needed wireless connectivity. He had to get the power supply, Raspberry Pi brain, camera, and all the wires into a tight space.
Clem wanted to be able to walk around and use the Super 8 as originally intended, so an external screen with a keyboard and a mouse wouldn’t have worked to control the device. Instead, he rigged up some buttons and an LED to the Raspberry Pi’s GPIO ports. He explains it all from this point in the build video.
We love that the final output looks just like the kind of films the original camera would have captured back in the day.
In the latest issue of The MagPi Magazine, Jeroen Domburg showcases his refurbed Nintendo Game Boy.
The Nintendo Game Boy – the iconic handheld video game console launched in 1989 – is no stranger to the pages of The MagPi. We’ve seen makers stuff a Raspberry Pi computer into an original case, buy off-the-shelf kits such as the superb RetroFlag GPi, or create their own from scratch. It’s great to see the device kept alive.
But just as we thought we’d seen it all, along came Jeroen Domburg, aka Sprite_tm. Like us, he’d seen a reasonable number of people modifying Game Boy cases to create portable RetroPie machines. “But because they wanted the thing to emulate as many consoles as possible, they usually went all-out with the modifications: high-resolution screen, Li-ion battery, HDMI and USB, multiple front buttons, shoulder buttons, the works,” he says.
“Obviously this would work really well, but it went against the original Game Boy looks. The projects could look like a weird mutation and it made me think, what if I went the other way? What if instead of sacrificing the original looks for playability, I sacrificed playability for the original looks?” Welcome then, DMGPlus: a handheld that looks familiar but has its internals replaced by something more powerful.
Pressing the right buttons
That something includes a Raspberry Pi Zero computer and a replacement motherboard containing a low-power, high-performance iCE40 field-programmable gate array (FPGA). These are fixed on either side of a new printed circuit board, replacing the CPU, GPU, and memory.
Jeroen has retained the buttons, cartridge port, speaker, and link port, with everything capable of being run from four AA batteries, just like the original. “I did change the LCD a little bit by driving it in a smart way so that it can display 16 greys instead of the original four,” he enthuses.
And the upshot of that? “It ends up substantially increasing the number of games the Game Boy can play,” he continues. “Because of emulation, all of a sudden you can have access to games that originally ran on other consoles, some of which have specs way better than the original Game Boy.”
Work hard, play hard
Making the build extra-special is its use of original carts, emulating the Game Boy experience so closely it’s difficult to tell if anything has changed. It uses the emulator Gnuboy and when Jeroen uses his own reproduction carts containing games not originally made for the Game Boy, Raspberry Pi Zero kicks in and runs the title natively.
“Getting Raspberry Pi Zero to boot as fast as possible was tricky because it needed some rethinking of the boot process, as well as a kernel recompile to make it load within the time it took the Game Boy startup screen to finish,” Jeroen explains. “My hardware also takes a longer path: Raspberry Pi has to talk through the SPI port to the FPGA, which then needs to control the cartridge. Doing this for every byte that the game needs would be very slow, so the emulator uses caching.”
Raspberry Pi Zero seemed the perfect choice. Aside from being able to fit in the case, Jeroen said he knew he could get the video interface to do what he wanted. “Raspberry Pi has proper DPI support, outputting video over the GPIO pins so I could make the Game Boy LCD show up as just another frame buffer device,” he says. “That was important because I didn’t want to hack the video output system of every emulator or game I wanted to run.”
The result is a stunning handheld console, but not one for the faint-hearted. “The big challenge was the need for custom hardware, custom software, custom gateware, and so on and it took a fair bit of time and effort to develop,” he says. “If you’re looking to replicate it, be prepared to put some work into tweaking and fixing things.”
Get The MagPi #109 NOW!
You can grab the brand-new issue right now from the Raspberry Pi Press store, or via our app on Android or iOS. You can also pick it up from supermarkets and newsagents. There’s also a free PDF you can download.
Russell coded different sequences to make the NeoPixel lights turn to a calming blue at bedtime, and then brighten up to sunshine yellow when it’s time for the kids to wake up.
mintyPi retro gaming handheld
How could we not share a retro gaming device hidden in an Altoids tin? Raspberry Pi Zero runs RetroPie software, and gameplay lasts up to five hours!
mintyPi is a nice simple do-it-yourself project, and you can find links for the parts at sudomod.com. Good luck finding an Altoids tin in the UK though. If you’ve found a good alternative, let us know in the comments.
Fresh coffee monitor
We picked this project because Raspberry Pi Towers coffee drinkers have been thinking about making something similar.
Caleb Brewer made his office coffee machine smart by building an alert system that sends notifications when someone brews a fresh pot. A waterproof temperature sensor constantly monitors the coffee pot, and Raspberry Pi Zero W turns on an alert light and sends a Slack notification when a new hot tasty brew is ready.
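Caleb’s own code isn’t shown here, but the core logic fits in a few lines. The 70°C threshold and helper names below are our assumptions; Slack’s incoming webhooks genuinely are just a JSON POST:

```python
import json
import urllib.request

BREW_THRESHOLD_C = 70.0  # assumed temperature a freshly brewed pot climbs past

def brew_just_finished(readings):
    """readings: recent temperatures, oldest first. A fresh pot shows up
    as the temperature crossing the threshold since the previous reading."""
    return len(readings) >= 2 and readings[-2] < BREW_THRESHOLD_C <= readings[-1]

def notify_slack(webhook_url, text):
    """Slack incoming webhooks accept a plain JSON POST."""
    req = urllib.request.Request(webhook_url,
                                 data=json.dumps({"text": text}).encode(),
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

if brew_just_finished([21.5, 40.2, 68.9, 74.3]):
    print("Fresh pot!")  # with a real webhook URL, call notify_slack() here
```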
If you’re into compilation videos featuring Raspberry Pi and Arduino projects, follow Top Projects Compilation on YouTube, Facebook or Instagram for more.
We love Wireframe magazine’s regular feature ‘The principles of game design’. They’re written by video game pioneer Howard Scott Warshaw, who authored several of Atari’s most famous and infamous titles. In the latest issue of Wireframe, he provides a snapshot of the hell-raising that went on behind the scenes at Atari…
Video game creation is unusual in that developers need to be focused intently on achieving design goals while simultaneously battling tunnel vision and re-evaluating those goals. It’s a demanding and frustrating predicament. Therefore, a solid video game creator needs two things: a way to let ideas simmer (since rumination is how games grow from mediocre to fabulous) and a way to blow off steam (since frustration abounds while trying to achieve fabulous). At Atari, there was one place where things both simmered and got steamy… the hot tub. The only thing we couldn’t do was keep a lid on the antics cooked up inside.
The hot tub was situated in the two-storey engineering building. This was ironic, because the hot tub generated way more than two stories in that building. The VCS/2600 and Home Computer development groups were upstairs. The first floor held coin-op development, a kitchen/cafeteria, and an extremely well-appointed gym. The gym featured two appendages: a locker area and the hot tub room. Many shenanigans were hatched and/or executed in the hot tub. One from the more epic end of the spectrum comes to mind: the executive birthday surprise.
It was during the birthday celebration of a VP who shall remain nameless, but it might have been the one who used to keep a canister of nitrous oxide and another of pure oxygen in his office. The nitrous oxide was for getting high and laughing some time away, while the oxygen was used for rapid sobering up in the event a spontaneous meeting was called (which happened regularly at Atari). As the party raged on, a small crew of revellers migrated to the small but accommodating hot tub room. Various intoxicants (well beyond the scope of nitrous) were being consumed in celebration of the special event (although by this standard, nearly every day was a special event at Atari).
As the party rolled on, inhibitions were shed along with numerous articles of clothing. At one point, the birthday boy was adjudged to be in dire need of a proper tubbing as he hadn’t lost sufficient layers to keep pace with the party at large. The birthday boy disagreed, and the ensuing negotiation took the form of a lively chase around the area. The VP ran out of the hot tub room and headed for the workout area with a wet posse in hot pursuit, all in varying stages of undress.
It’s important to note here that although refreshments and revelry were widely available at Atari, one item in short supply was conference rooms. Consequently, meetings could pop up in odd locales. Any place an aggregation could be achieved was a potential meeting spot. The sensitivity of the subject matter would determine the level of privacy required on a case-by-case basis. Since people weren’t always working out, the gym had enough places to sit that it could serve as a decent host for gatherings. And as for sensitivity, the hot tub room was well sound-proofed, so intruding ears weren’t a concern.
As the crew of rowdy revellers followed the VP into the workout area, they were confronted by just such a collection of executives who happened to be meeting at the time. I don’t think the birthday party was on the agenda. However, they may have been pleased that the absentee VP had ultimately decided to join their number. It was embarrassing for some, entertaining for others, and nearly career-ending for a couple. The moral of this story being that Atari executives should never go anywhere without their oxygen tanks in tow.
But morals aside, there was work to be done at Atari. In a place where work can lead to antics and antics can lead to work breakthroughs, it’s difficult at times to suss out the precise boundary between work and antics. It takes passion and commitment to pursue side quests productively and yet remain on task when necessary.
The main reason this was a challenge comes down to the fact there are so many distractions constantly going on. Creative people tend to be creative frequently and spontaneously. Also, their creativity is much more motivated by fascination and interest than it is by task lists or project plans. Fun can break out at any moment, and answering the call isn’t always the right choice, no matter how compelling the siren.
Rob Fulop, creator of Missile Command and Demon Attack for the Atari 2600 (among many other hits) isn’t only a great game maker, he’s also a keen observer of human nature. We used to chat about just where the edge is between work and play at Atari. Those who misjudge it can easily fall off the cliff.
Likewise, we explored the concept of what makes a good game designer. Rob said it’s just the right combination of silly and anal. He believed that the people who did well at Atari (and as game makers in general) were the people who could be silly enough to recognise fun, and anal enough to get all the minutiae and details aligned correctly in order to deliver the fun. Of course, Rob (being the poet he is) created a wonderful phrasing to describe those with the right stuff. He put it like this: the people who did well at Atari were the people who could goof around as much as possible but still go to heaven.
Get your copy of Wireframe issue 53
You can read more features like this one in Wireframe issue 53, available directly from Raspberry Pi Press — we deliver worldwide.
Ever wondered what to do with Raspberry Pi boards you haven’t used in a while? Do you tend to upgrade your projects to newer models, leaving previous ones languishing at the back of a drawer? There are a lot of venerable Raspberry Pis out there doing useful stuff just as well as ever, and we take great care to make sure new versions of Raspberry Pi OS continue to run on these models, but we’re realists: we understand that ending up with older boards lying around doing nothing is a thing. Rather than leave them to gather dust, you now have a sustainable way to get your unused tech back in the hands of makers who’ll put it to work.
OKdo has partnered with Sony to launch the first official Raspberry Pi recycling initiative. OKdo Renew gives you rewards in return for your preloved boards.
Which boards can I recycle?
If you have any of these boards sitting around unused, you can recycle them:
Our Raspberry Pi boards are manufactured at the Sony Technology Centre in Wales, and that’s where OKdo returns all the hardware you donate. When it gets there, it’ll be tested, reconditioned, and repackaged, ready to go to a new home. OKdo will be offering the refurbished boards at a lower price than new boards, and they all come with a twelve-month warranty.
How do I send my preloved Raspberry Pi boards to Sony?
If you have one of the boards listed above and it’s still in working order, you can register to renew your Raspberry Pi. Print the prepaid label so you can return your board for free! Then package up your board to avoid damage, being careful not to exceed the dimensions listed here.
Make sure you remove your memory card before posting your board. Sony can’t return them and we don’t want you to lose any important stuff you’ve got stored.
What’s my reward?
In return for recycling your board, you will get a £10 voucher to use towards your next OKdo purchase. You could upgrade to a faster board than the one you recycled, or pick up a new accessory.
We’re not going to lie — the thing we like most about this automated plant watering project is the timelapse at the very end of the build video. But we also thought now might be a good time to show you another Raspberry Pi project for keeping an eye on your plants: some of us are getting back to something more like our usual routines, so our houseplants are no longer our best friends and might need a little extra automated attention.
Maker Christopher Barnatt chose Raspberry Pi Zero for this project because although Raspberry Pi Pico could handle it, he needed a camera connector to record timelapse video of his plants’ growth.
Christopher is a gem and has included links to all the hardware he used. There are also some cheaper, smaller alternatives listed in the info section of his build video.
The moisture sensor checks every half hour whether the plant has enough water, and reports to the Raspberry Pi. Water flow is controlled by the solenoid valve: if the Raspberry Pi finds the soil is too dry, it opens the valve for a set amount of time to let water out.
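That check-then-water loop can be sketched in a few lines. This is a minimal, hardware-agnostic illustration of the logic described above, not Christopher’s actual code: the `read_moisture`, `open_valve`, and `close_valve` callables are hypothetical stand-ins for whatever GPIO or sensor library the real build uses.

```python
import time

def watering_cycle(read_moisture, open_valve, close_valve,
                   dry_threshold=0.3, watering_seconds=5):
    """One half-hourly check: if the soil reads too dry, open the valve briefly."""
    moisture = read_moisture()        # hypothetical sensor read: 0.0 (dry) .. 1.0 (saturated)
    if moisture < dry_threshold:
        open_valve()                  # energise the solenoid to let water flow
        time.sleep(watering_seconds)  # keep it open for a set amount of time
        close_valve()
        return True                   # we watered the plant
    return False                      # soil is moist enough; do nothing
```

In the real project this function would be called from a scheduler (or a simple `while` loop with a 30-minute sleep), with the callables wired to the moisture sensor input pin and the relay driving the solenoid valve.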
Code your own plant watering machine
Christopher has shared all the code you need to make your own plant watering system:
Watering.py — the final watering system and timelapse code
Check out Christopher’s YouTube channel Explaining Computers where he posts new videos every week on topics like PC hardware, single board computers such as Raspberry Pi, AI, Big Data, and quantum computing.
Today, it’s easy to run Edge Impulse machine learning on any operating system, like Raspberry Pi OS, and on every cloud, like Microsoft’s Azure IoT. Evan Rust, Technology Ambassador for Edge Impulse, walks us through it.
Building enterprise-grade IoT solutions takes a lot of practical effort and a healthy dose of imagination. As a foundation, you start with highly secure and reliable communication between your IoT application and the devices it manages. We picked our favorite integration, the Microsoft Azure IoT Hub, which provides us with a cloud-hosted solution backend to connect virtually any device. For our hardware, we selected the ubiquitous Raspberry Pi 4, and of course Edge Impulse, which will connect to both platforms and extend our showcased solution from cloud to edge, including device authentication, out-of-box device management, and model provisioning.
From edge to cloud – getting started
Edge machine learning devices fall into two categories: some are able to run very simple models locally, and others have more advanced capabilities that allow them to be more powerful and have cloud connectivity. The second group is often expensive to develop and maintain, as training and deploying models can be an arduous process. That’s where Edge Impulse comes in to simplify the pipeline, as data can be gathered remotely, used effortlessly to train models, downloaded to the devices directly from the Azure IoT Hub, and then run – fast.
This reference project will serve as a guide for quickly getting started with Edge Impulse on Raspberry Pi 4 and Azure IoT, to train a model that detects lug nuts on a wheel and sends alerts to the cloud.
country=<Insert 2 letter ISO 3166-1 country code here>
ssid="<Name of your wireless LAN>"
psk="<Password for your wireless LAN>"
along with an empty file named ssh (both within the /boot directory), you can go ahead and power up the board. Once you’ve successfully SSH’d into the device with
and the password raspberry, it’s time to install the dependencies for the Edge Impulse Linux SDK. Simply run the next three commands to set up the NodeJS environment and everything else that’s required for the edge-impulse-linux wizard:
$ npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm
Since this project deals with images, we’ll need some way to capture them. The wizard supports both the Pi Camera modules and standard USB webcams, so make sure to enable the camera module first with
$ sudo raspi-config
if you plan on using one. With that completed, go to the Edge Impulse Studio and create a new project, then run the wizard with
and make sure your device appears within the Edge Impulse Studio’s device section after logging in and selecting your project.
Capturing your data
Training accurate machine learning models requires plenty of varied data, which for this use case means a lot of images. I captured around 50 images of a wheel that had lug nuts on it. After I was done, I headed to the Labeling queue in the Data Acquisition page and added bounding boxes around each lug nut within every image, along with every wheel.
To add some test data, I went back to the main Dashboard page and clicked the Rebalance dataset button, which moves 20% of the training data to the test data bin.
Training your models
So now that we have plenty of training data, it’s time to do something with it, namely train a model. The first block in the impulse is an Image Data block, and it scales each image to a size of 320 by 320 pixels. Next, image data is fed to the Image processing block which takes the raw RGB data and derives features from it.
Finally, these features are sent to the Transfer Learning Object Detection model which learns to recognize the objects. I set my model to train for 30 cycles at a learning rate of 0.15, but this can be adjusted to fine-tune the accuracy.
As you can see from the screenshot below, the model I trained was able to achieve an initial accuracy of 35.4%, but after some fine-tuning, it was able to correctly recognize objects at an accuracy of 73.5%.
Testing and deploying your models
In order to verify that the model works correctly in the real world, we’ll need to deploy it to our Raspberry Pi 4. This is a simple task thanks to the Edge Impulse CLI, as all we have to do is run
which downloads the model and creates a local webserver. From here, we can open a browser tab and visit the address listed after we run the command to see a live camera feed and any objects that are currently detected.
Integrating your models with Microsoft Azure IoT
With the model working locally on the device, let’s add an integration with an Azure IoT Hub that will allow our Raspberry Pi to send messages to the cloud. First, make sure you’ve installed the Azure CLI and have signed in using az login. Then get the name of the resource group you’ll be using for the project. If you don’t have one, you can follow this guide on how to create a new resource group. After that, return to the terminal and run the following commands to create a new IoT Hub and register a new device ID:
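The exact commands aren’t reproduced here, but the standard Azure CLI workflow for this step looks roughly like the following. The resource group, hub, and device names are hypothetical placeholders; substitute your own (the `az iot` device-identity commands come from the Azure IoT CLI extension, which may need to be installed first).

```shell
# Create an IoT Hub in your existing resource group (names are placeholders).
az iot hub create --resource-group my-resource-group --name my-lugnut-hub --sku F1

# Register a device identity for the Raspberry Pi.
az iot hub device-identity create --hub-name my-lugnut-hub --device-id raspberrypi4

# Retrieve the device connection string to supply to your application.
az iot hub device-identity connection-string show \
    --hub-name my-lugnut-hub --device-id raspberrypi4
```

The connection string printed by the last command is what the device-side code needs, as described below.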
to add the necessary libraries. (Note: if you do not set the environment variable or pass it in as an argument, the program will not work!) The connection string contains the information required for the device to establish a connection with the IoT Hub service and communicate with it. You can then monitor output in the Hub with
To make sure it works, download and run this example to make sure you can see the test message. For the second half of deployment, we’ll need a way to customize how our model is used within the code. Thankfully, Edge Impulse provides a Python SDK for this purpose. Install it with
where <LUG_NUT_COUNT> is the correct number of lug nuts that should be attached to the wheel (you might have to use python3 if both Python 2 and 3 are installed).
Now, whenever a wheel is detected, the number of lug nuts is calculated. If this number falls short of the target, a message is sent to the Azure IoT Hub. And by only sending messages when there’s something wrong, we avoid wasting bandwidth on empty payloads.
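The alerting logic just described can be sketched as follows. This is an illustrative outline, not the project’s actual script: the bounding-box dictionaries, the `"lug_nut"` label, and the `send_message` callable (which would wrap the Azure IoT Hub device client in the real code) are all assumptions for the sake of the example.

```python
EXPECTED_LUG_NUTS = 5  # hypothetical target; set to the wheel's correct count

def check_wheel(bounding_boxes, send_message, expected=EXPECTED_LUG_NUTS):
    """Count detected lug nuts; alert the IoT Hub only when some are missing."""
    lug_nuts = [box for box in bounding_boxes if box.get("label") == "lug_nut"]
    missing = expected - len(lug_nuts)
    if missing > 0:
        # Only send when something is wrong -- empty payloads waste bandwidth.
        send_message({"alert": "missing lug nuts", "missing": missing})
        return missing
    return 0  # all lug nuts present; stay silent
```

In the deployed version, `bounding_boxes` would come from the Edge Impulse inference result for each camera frame, and `send_message` would serialise the payload and hand it to the Azure IoT device SDK.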
The possibilities are endless
Imagine utilizing object detection for an industrial task such as quality control on an assembly line, or identifying ripe fruit amongst rows of crops, or detecting machinery malfunction, or remote, battery-powered inferencing devices. Between Edge Impulse, hardware like Raspberry Pi, and the Microsoft Azure IoT Hub, you can design endless models and deploy them on every device, while authenticating each and every device with built-in security.
You can set up individual identities and credentials for each of your connected devices to help retain the confidentiality of both cloud-to-device and device-to-cloud messages, revoke access rights for specific devices, transmit code and services between the cloud and the edge, and benefit from advanced analytics on devices running offline or with intermittent connectivity. And if you’re really looking to scale your operation and enjoy a complete dashboard view of the device fleets you manage, it is also possible to receive IoT alerts in Microsoft’s Connected Field Service from Azure IoT Central – directly.
A whole lot of super free hands-on activities are happening at the Raspberry Pi Store this summer.
We have teamed up with the Centre for Computing History to create an interactive learning space that’s accessible to all ages and abilities. Best of all, everything is free. It’s all happening in a big new space we’ve borrowed a few doors down from the Raspberry Pi Store in the Grand Arcade in Cambridge, UK.
What is Raspberry Pi doing?
Everyone aged seven to 107 can get hands-on and creative with our free beginner-friendly workshops. You can make games with Scratch on Raspberry Pi, learn simple electronics for beginners, or get hands-on with the Raspberry Pi camera and Python programming.
If you don’t know anything about coding, don’t worry: there are friendly people on hand to help you learn.
The workshops take place every Monday, Wednesday and Friday until 3 September. Pre-booking is highly advisable. If the one you want is fully booked, it’s well worth dropping by if you’re in the neighbourhood, because spaces often become available at the last minute. And if you book and find you can no longer come along, please do make sure you cancel, because there will be lots of people who would love to take your space!
Come and celebrate thirty years of the World Wide Web and see how things have changed over the last three decades.
This interactive exhibition celebrates the years since Tim Berners-Lee changed the world forever by publishing the very first website at CERN in 1991. You can trace the footsteps of the early web, and have a go on some original hardware.
Here are some of the things you can do:
Browse the very first website from 1991
Search the web with Archie, the first search engine
While we would love to have a Raspberry Pi store in every town in every country all over the world (cackles maniacally), we are sticking with just the one in our hometown for now. But we make lots of cool stuff you can access online to relieve the FOMO.
The Raspberry Pi Foundation’s livestreamed Digital Making at Home videos are all still available for young people to watch and learn along with. You can chat, code together, hear from cool people, and see amazing digital making projects from kids who love making with technology.
There are also more than thirty Raspberry Pi courses available for free on FutureLearn. There’s something for every type of user and level of learner, from coders looking to move from Scratch to Python programming, to people looking to start up their own CoderDojo. Plus tons of materials for teachers sharing practical resources for the classroom.