Python code creates curious, wordless comic strips at random, spewing them from the thermal printer mouth of a laser-cut body reminiscent of Disney Pixar’s WALL-E: meet the Vomit Comic Robot!
The age of the thermal printer!
Thermal printers allow you to instantly print photos, data, and text using a few lines of code, with no need for ink. More and more makers are using this handy, low-maintenance bit of kit for truly creative projects, from Pierre Muth’s tiny PolaPi-Zero camera to the sound-printing Waves project by Eunice Lee, Matthew Zhang, and Bomani McClendon (and our own Secret Santa Babbage).
Vomiting robots
Interaction designer and developer Cadin Batrack, whose background is in game design and interactivity, has built the Vomit Comic Robot, which creates “one-of-a-kind comics on demand by processing hand-drawn images through a custom software algorithm.”
The robot is made up of a Raspberry Pi 3, a USB thermal printer, and a handful of LEDs.
At the press of a button, Processing code selects one of a set of Cadin’s hand-drawn empty comic grids and then randomly picks images from a library to fill in the gaps.
Each image is associated with data that allows the code to fit it correctly into the available panels. Cadin says this about the concept behind his build:
Although images are selected and placed randomly, the comic panel format suggests relationships between elements. Our minds create a story where there is none in an attempt to explain visuals created by a non-intelligent machine.
The Raspberry Pi saves the final image as a high-resolution PNG file (so that Cadin can sell prints on thick paper via Etsy), and a Python script sends it to be vomited up by the thermal printer.
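Cadin’s own script isn’t shown here, but the printing step can be sketched in a few lines of Python. This is a minimal sketch assuming the python-escpos library and a USB thermal printer; the USB IDs are placeholders:

# Minimal sketch, assuming python-escpos; replace the USB IDs
# (find yours with lsusb) with those of your printer.
from escpos.printer import Usb

printer = Usb(0x0416, 0x5011)
printer.image("comic.png")  # dithers and prints the rendered comic
printer.cut()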
For more about the Vomit Comic Robot, check out Cadin’s blog. If you want to recreate it, you can find the info you need in the Imgur album he has put together.
We cute robots
We have a soft spot for cute robots here at Pi Towers, and of course we make no exception for the Vomit Comic Robot. If, like us, you’re a fan of adorable bots, check out Mira, the tiny interactive robot by Alonso Martinez, and Peeqo, the GIF bot by Abhishek Singh.
Warning: a GIF used in today’s blog contains flashing images.
Students at the University of Bremen, Germany, have built a wearable camera that records the seconds of vision lost when you blink. Augenblick uses a Raspberry Pi Zero and Camera Module alongside muscle sensors to record footage whenever you close your eyes, producing a rather disjointed film of the sights you miss out on.
Blink and you’ll miss it
The average person blinks up to five times a minute, with each blink lasting 0.5 to 0.8 seconds. These half-seconds add up to about 30 minutes a day. What sights are we losing during these minutes? That is the question asked by students Manasse Pinsuwan and René Henrich when they set out to design Augenblick.
Blinking is a highly invasive mechanism for our eyesight. Every day we close our eyes thousands of times without noticing it. Our mind manages to never let us wonder what exactly happens in the moments that we miss.
Capturing lost moments
For Augenblick, the wearer sticks MyoWare Muscle Sensor pads to their face, and these detect the electrical impulses that trigger blinking.
Two pads are applied over the orbicularis oculi muscle that forms a ring around the eye socket, while the third pad is attached to the cheek as a neutral point.
Biology fact: there are two muscles responsible for blinking. The orbicularis oculi muscle closes the eye, while the levator palpebrae superioris muscle opens it — and yes, they both sound like the names of Harry Potter spells.
The sensor is read 25 times a second. Whenever it detects that the orbicularis oculi is active, the Camera Module records video footage.
Pressing a button on the side of the Augenblick glasses sets the code running. An LED lights up whenever the camera is recording and also serves to confirm the correct placement of the sensor pads.
The Pi Zero saves the footage so that it can be stitched together later to form a continuous, if disjointed, film.
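The students’ own code isn’t reproduced here, but the sensing loop can be sketched in Python. This is a minimal sketch assuming the muscle sensor is read through an MCP3008 ADC, with a placeholder threshold:

# Poll the muscle sensor at 25 Hz and record while the eye is closed.
# Assumes a MyoWare sensor wired through an MCP3008 ADC; tune THRESHOLD
# to your own sensor placement.
import time
from gpiozero import MCP3008
from picamera import PiCamera

sensor = MCP3008(channel=0)
camera = PiCamera()
THRESHOLD = 0.5
recording = False

while True:
    if sensor.value > THRESHOLD and not recording:
        camera.start_recording('/home/pi/blink-%d.h264' % int(time.time()))
        recording = True
    elif sensor.value <= THRESHOLD and recording:
        camera.stop_recording()
        recording = False
    time.sleep(1 / 25)  # 25 samples per second, as in the original build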
Learn more about the Augenblick blink camera
You can find more information on the conception, design, and build process of Augenblick here in German, with a shorter explanation including lots of photos here in English.
And if you’re keen to recreate this project, our free project resource for a wearable Pi Zero time-lapse camera will come in handy as a starting point.
Michael Portera‘s trading card scanner uses LEGO, servo motors, and a Raspberry Pi and Camera Module to scan Magic: The Gathering cards and look up their prices online. This is a neat and easy-to-recreate project that you can adapt for whatever your, or your younger self’s, favourite trading cards are.
For those of you who aren’t this nerdy [Janina is 100% this nerdy – Ed.], Magic: The Gathering (or MTG for short) is a trading card game first launched in 1993. It’s based on a sprawling fantasy multiverse storyline, and is very heavy on mechanics — the current comprehensive rules fill 228 pages! You can imagine it as being a bit like Dungeons and Dragons, with less role-playing and more of a chess vibe. Unlike in chess, however, you can beat your MTG opponent in one turn with just the right combination of cards. If that’s your style of play, that is.
Scanning trading cards
So far, there are around 20,000 official MTG cards, and, as with other types of trading cards, some of them are worth a lot of money.
Michael is one of the many people who were keen MTG players in their youth. Here’s how he came up with his project idea:
I was really into trading cards as a kid. I recently came across a lot of Magic: The Gathering cards in a box and thought to myself — I wonder how many cards I have and how much they’re worth?! Logging and looking these up manually would take a while, so I decided to see if I could automate some of the process. Somehow, the process led to building a platform out of Lego and leveraging AWS S3 and Rekognition.
LEGO, servos and camera
To build the housing of the scanner, Michael used LEGO, stating “I’m not good at wood working, and I thought that it might be rough on the cards.” While he doesn’t provide a build plan for the housing, Michael only used bricks from the LEGO Medium Creative Brick Box he bought for the project. In addition, his tutorial includes a lot of pictures to guide you.
Servo motors spin plastic wheels to move single cards from a stack set into the scanner. Michael positioned a Raspberry Pi Camera Module so that it can take a picture of the title of each card as it is set before the lens. The length of the camera’s ribbon cable gave Michael a little difficulty, so he recommends getting an extension for it if you’re planning to recreate the build.
Optical character recognition and MTG card price API
On the software side, Michael wrote three scripts. One is a Python script to control the servos and take pictures. This, he says, “[records] about 20–25 cards a minute.”
Another script identifies the cards and looks up their prices automatically. Michael tried out OpenCV and Tesseract for optical character recognition (OCR) first, before settling on AWS S3 and Rekognition for storing and processing images, respectively. You’ll need an AWS account to do this — Michael used the free tier, which he says allows him to process 5,000 pictures per month.
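Michael’s full scripts are in his write-up, but the recognition step might be sketched like this with boto3 — the bucket name, file name, and region here are placeholders, not his actual values:

# Upload a card photo to S3, then read its title with Rekognition.
import boto3

s3 = boto3.client('s3')
rekognition = boto3.client('rekognition', region_name='us-east-1')

s3.upload_file('card.jpg', 'my-card-bucket', 'card.jpg')
response = rekognition.detect_text(
    Image={'S3Object': {'Bucket': 'my-card-bucket', 'Name': 'card.jpg'}}
)

# Keep only whole detected lines; the card title is usually the first one
lines = [t['DetectedText'] for t in response['TextDetections'] if t['Type'] == 'LINE']
print(lines[0] if lines else 'No text found')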
A sizeable collection
Finally, the data that Rekognition sends back gets processed by another Python script that looks up the identified cards on the TCGplayer API to find their price.
Michael says he’s very satisfied with the accuracy of the project’s OCR. He found out that the 920 Magic: The Gathering cards he scanned are worth about $275 in total. He provides a full write-up plus code over on hackster.io.
And for my next trick…
You might be thinking what I’m thinking: the logical next step for this project is to turn it into a card sorter. Then you could input a list of the cards in the deck you want to put together, and presto! The device picks out the right cards from your collection. Building a Commander deck just became a little easier!
What trading cards would you use this project with, and how would you extend it? Also, what’s your favourite commander? Let me know in the comments!
Germany-based Andreas Rottach’s multi-purpose LED table is an impressive build within a gorgeous-looking body. Play games, view (heavily pixelated) images, and become hypnotised by flashy lights, once you’ve built your own using his newly released tutorial.
This is a short presentation of my LED-Matrix Table. The table is controlled by a raspberry pi computer that executes a control engine, written in c++. It supports input from keyboards or custom made game controllers. A full list of all features as well as the source code is available on GitHub (https://github.com/rottaca/LEDTableEngine).
Much excitement
Andreas uploaded a video of his LED Matrix Table to YouTube back in February, with the promise of publishing a complete write-up within the coming weeks. And so the members of Pi Towers sat, eagerly waiting and watching. Now the write-up has arrived, to our cheers of acclaim for this beautiful, shiny, flashy, LED-based wonderment.
Andreas created the table’s impressive light matrix using a strip of 300 LEDs, chained together and connected to the Raspberry Pi via an LED controller.
The LEDs are set out in zigzags
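Because the strip snakes back and forth, turning an (x, y) pixel into a position along the strip means flipping every other row. Here’s a minimal sketch of that mapping in Python — the 20×15 dimensions are an assumption for illustration, not necessarily Andreas’ actual layout:

WIDTH, HEIGHT = 20, 15  # 300 LEDs in total (assumed layout)

def xy_to_index(x, y):
    # Map grid coordinates to a position on the zigzag LED strip
    if y % 2 == 0:
        return y * WIDTH + x  # even rows run left to right
    return y * WIDTH + (WIDTH - 1 - x)  # odd rows run right to left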
For the code, he used several open-source tools, such as SDL for image and audio support, and CMake for building the project software.
Anyone planning to recreate Andreas’ table can compile its engine by downloading the project repository from GitHub. Again, find full instructions for this on his GitHub.
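His README has the exact steps, but on most Linux systems the usual CMake recipe should look something like this:

git clone https://github.com/rottaca/LEDTableEngine.git
cd LEDTableEngine
mkdir build && cd build
cmake ..
make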
Features
The table boasts multiple cool features, including games and visualisation tools. Using the controllers, you can play simplified versions of Flappy Bird and Minesweeper, or go on a nostalgia trip with Tetris, Pong, and Snake.
There’s also a version of Conway’s Game of Life. Andreas explains: “The lifespan of each cell is color-coded. If the game field gets static, the animation is automatically reset to a new random cell population.”
The table can also display downsampled Bitmap images, or show clear static images such as a chess board, atop which you can place physical game pieces.
Find all the 3D-printable aspects of the LED table on Thingiverse here and here, and the full GitHub tutorial and repository here. If you build your own, or have already dabbled in LED tables and displays, be sure to share your project with us, either in the comments below or via our social media accounts. What other functions would you integrate into this awesome build?
Chris Campbell’s qrocodile uses a Raspberry Pi, a camera, and QR codes to allow Chris’s children to take full control of the Sonos home sound system. And we love it!
Introducing qrocodile, a kid-friendly system for controlling your Sonos with QR codes. Source code is available at: https://github.com/chrispcampbell/qrocodile Learn more at: http://labonnesoupe.org https://twitter.com/chrscmpbll
Sonos
SONOS is SONOS backwards. It’s also SONOS upside down, and SONOS upside down and backwards. I just learnt that this means SONOS is an ambigram. Hurray for learning!
Sonos (the product, not the ambigram) is a multi-room speaker system controlled by an app. Speakers in different rooms can play different tracks or join forces to play one track for a smooth musical atmosphere throughout your home.
If you have a Sonos system in your home, I would highly recommend setting up access to it from outside your home, and having it play the Imperial March as you walk through the front door. Why wouldn’t you?
qrocodile
One day, Chris’s young children wanted to play an album while eating dinner. This one request inspired him to create qrocodile, a musical jukebox enabling his children to control the songs Sonos plays, and where it plays them, via QR codes.
It all started one night at the dinner table over winter break. The kids wanted to put an album on the turntable (hooked up to the line-in on a Sonos PLAY:5 in the dining room). They’re perfectly capable of putting vinyl on the turntable all by themselves, but using the Sonos app to switch over to play from the line-in is a different story.
The QR codes represent commands (such as Play in the living room, Use the turntable, or Build a song list) and artists (such as my current musical crush Courtney Barnett or the Ramones).
A camera attached to a Raspberry Pi 3 feeds the Pi the QR code that’s presented, and the Pi runs a script that recognises the code and sends instructions to Sonos accordingly.
Chris used a custom version of the Sonos HTTP API created by Jimmy Shimizu to gain access to Sonos from his Raspberry Pi. To build the QR codes, he wrote a script that utilises the Spotify API via the Spotipy library.
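Chris’s actual scripts are the ones to study, but the core loop might be sketched like this in Python — the pyzbar library, the port, and the command format are assumptions for illustration:

# Capture a frame, decode any QR code in it, and forward the encoded
# command to a local Sonos HTTP API instance.
import requests
from PIL import Image
from picamera import PiCamera
from pyzbar.pyzbar import decode

camera = PiCamera()
camera.capture('frame.jpg')

for code in decode(Image.open('frame.jpg')):
    command = code.data.decode('utf-8')  # e.g. 'livingroom/play'
    requests.get('http://localhost:5005/' + command)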
His children are now able to present recognisable album art to the camera in order to play their desired track.
It’s been interesting seeing the kids putting the thing through its paces during their frequent “dance parties”, queuing up their favorite songs and uncovering new ones. I really like that they can use tangible objects to discover music in much the same way I did when I was their age, looking through my parents records, seeing which ones had interesting artwork or reading the song titles on the back, listening and exploring.
Chris has provided all the scripts for the project, along with a tutorial on how to set it up, on his GitHub — have a look if you want to recreate it or learn more about his code. Also check out Chris’s website for more on qrocodile and to see some of his other creations.
This column is from The MagPi issue 59. You can download a PDF of the full issue for free, or subscribe to receive the print edition through your letterbox or the digital edition on your tablet. All proceeds from the print and digital editions help the Raspberry Pi Foundation achieve our charitable goals.
“Hey, world!” Estefannie exclaims, a wide grin across her face as the camera begins to roll for another YouTube tutorial video. With a growing number of followers and wonderful support from her fans, Estefannie is building a solid reputation as an online maker, creating unique, fun content accessible to all.
It’s as if she was born into performing and making for an audience, but this fun, enjoyable journey to social media stardom came not from a desire to be in front of the camera, but rather as a unique approach to her own learning. While studying, Estefannie decided the best way to confirm her knowledge of a subject was to create an educational video explaining it. If she could teach a topic successfully, she knew she’d retained the information. And so her YouTube channel, Estefannie Explains It All, came into being.
Her first videos featured pages of notes with voice-over explanations of data structure and algorithm analysis. Then she moved in front of the camera, and expanded her skills in the process.
But YouTube isn’t her only outlet. With nearly 50,000 followers, Estefannie’s Instagram game is strong, adding to an increasing number of female coders taking to the platform. Across her Instagram grid, you’ll find insights into her daily routine, from programming on location for work to behind-the-scenes troubleshooting as she begins to create another tutorial video. It’s hard work, with content creation for both Instagram and YouTube forever on her mind as she continues to work and progress successfully as a software engineer.
As a thank you to her Instagram fans for helping her reach 10,000 followers, Estefannie created a free game for Android and iOS called Gravitris — imagine Tetris with balance issues!
Estefannie was born and raised in Mexico, with ambitions to become a graphic designer and animator. However, a documentary on coding at Pixar, and the beauty of Merida’s hair in Brave, opened her mind to the opportunities of software engineering in animation. She altered her career path, moved to the United States, and switched to a Computer Science course.
With a constant desire to make and to learn, Estefannie combines her software engineering profession with her hobby to create fun, exciting content for YouTube.
While studying, Estefannie started a Computer Science Girls Club at the University of Houston, Texas, and she found herself eager to put more time and effort into the movement to increase the percentage of women in the industry. The club was a success, and still is to this day. While Estefannie has handed over the reins, she’s still very involved in the cause.
Through her YouTube videos, Estefannie continues the theme of inclusion, with every project offering a warm sense of approachability for all, regardless of age, gender, or skill. From exploring Scratch and Makey Makey with her young niece and nephew to creating her own Disney ‘Made with Magic’ backpack for a trip to Disney World, Florida, Estefannie’s videos are essentially a documentary of her own learning process, produced so viewers can learn with her — and learn from her mistakes — to create their own tech wonders.
Estefannie’s automated gingerbread house project was a labour of love, with electronics, wires, and candy strewn across both her living room and kitchen for weeks before completion. While she was already a skilled programmer, the world of physical digital making was still fairly new for Estefannie. Having ditched her hot glue gun in favour of a soldering iron in a previous video, she continued to experiment and try out new, interesting techniques that are now second nature to many members of the maker community. With the gingerbread house, Estefannie was able to research and apply techniques such as light controls, servos, and app making, although the latter was already firmly within her skill set. The result? A fun video of ups and downs that resulted in a wonderful, festive treat. She even gave her holiday home its own solar panel!
Estefannie Explains It All (@estefanniegg) on Instagram: “A DAY AT RASPBERRY PI TOWERS!! LINK IN BIO @raspberrypifoundation”
And that’s just the beginning of her adventures with Pi…but we won’t spoil her future plans by telling you what’s coming next. Sorry! However, since this article was written last year, Estefannie has released a few more Pi-based project videos, plus some awesome interviews and live-streams with other members of the maker community such as Simone Giertz. She even made us an awesome video for our Raspberry Pi YouTube channel! So be sure to check out her latest releases.
Estefannie Explains It All (@estefanniegg) on Instagram: “Best day yet!! I got to hangout, play Jenga with a huge arm robot, and have afternoon tea with…”
While many wonderful maker videos show off a project without much explanation, or expect a certain level of skill from viewers hoping to recreate the project, Estefannie’s videos exist almost within their own category. We can’t wait to see where Estefannie Explains It All goes next!
Jennifer Fox, founder of FoxBot Industries, uses a Raspberry Pi pet monitor to check the sound levels of her home while she is out, allowing her to keep track of when her dog Marley gets noisy or agitated, and to interact with the gorgeous furball accordingly.
A quick overview and demo of the Bark Back, a project to monitor and interact with your pet. Check out the full tutorial here: https://learn.sparkfun.com/tutorials/bark-back-interactive-pet-monitor For any licensing requests please contact [email protected]
Marley, bark!
Using a Raspberry Pi 3, speakers, SparkFun’s MEMS microphone breakout board, and an analogue-to-digital converter (ADC), the IoT Pet Monitor is fairly easy to recreate, all thanks to Jennifer’s full tutorial on the FoxBot website.
Building the pet monitor
In a nutshell, once the Raspberry Pi and the appropriate bits and pieces are set up, you’ll need to sign up at CloudMQTT — it’s free if you select the Cute Cat account. CloudMQTT will create an invisible bridge between your home and wherever you are that isn’t home, so that you can check in on your pet monitor.
Image c/o FoxBot Industries
Within the project code, you’ll be able to calculate the peak-to-peak amplitude of sound the microphone picks up. Then you can decide how noisy is too noisy when it comes to the occasional whine and bark of your beloved pup.
The MEMS microphone breakout board collects sound data and relays it back to the Raspberry Pi via the ADC. Image c/o FoxBot Industries
Next you can import sounds to a preset song list that will be played back when the volume rises above your predefined threshold. As Jennifer states in the tutorial, the sounds can easily be recorded via apps such as Garageband, or even on your mobile phone.
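Jennifer’s tutorial has the real code; here’s a minimal sketch of the core idea, assuming an MCP3002 ADC and the paho-mqtt client (the threshold and CloudMQTT details are placeholders):

# Measure the mic's peak-to-peak amplitude and report when it's too loud.
import paho.mqtt.publish as publish
from gpiozero import MCP3002

mic = MCP3002(channel=0)
THRESHOLD = 0.2  # how noisy is too noisy — tune for your pet

def peak_to_peak(num_samples=128):
    samples = [mic.value for _ in range(num_samples)]
    return max(samples) - min(samples)

while True:
    if peak_to_peak() > THRESHOLD:
        publish.single('pet/bark', 'Noise threshold exceeded!',
                       hostname='your-instance.cloudmqtt.com', port=1883,
                       auth={'username': 'user', 'password': 'pass'})
        # ...then play back a clip from the preset song list here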
Using the pet monitor
Whenever the Bark Back IoT Pet Monitor is triggered to play back audio, this information is fed to the CloudMQTT service, allowing you to see if anything is going on back home.
*incoherent coos of affection from Alex* Image c/o FoxBot Industries
And as Jennifer recommends, an update of the project could include a camera or sensors to feed back more information about your home environment.
If you’ve created something similar, be sure to let us know in the comments. And if you haven’t, but you’re now planning to build your own IoT pet monitor, be sure to let us know in the comments. And if you don’t have a pet but just want to say hi…that’s right, be sure to let us know in the comments.
AWS IoT is a managed cloud platform that lets connected devices easily and securely interact with cloud applications and other devices by using the Message Queuing Telemetry Transport (MQTT) protocol, HTTP, and the MQTT over the WebSocket protocol. Every connected device must authenticate to AWS IoT, and AWS IoT must authorize all requests to determine if access to the requested operations or resources is allowed. Until now, AWS IoT has supported two kinds of authentication techniques: the Transport Layer Security (TLS) mutual authentication protocol and the AWS Signature Version 4 algorithm. Callers must possess either an X.509 certificate or AWS security credentials to be able to authenticate their calls. The requests are authorized based on the policies attached to the certificate or the AWS security credentials.
However, many of our customers have their own systems that issue custom authorization tokens to their devices. These systems use different access control mechanisms such as OAuth over JWT or SAML tokens. AWS IoT now supports a custom authorizer to enable you to use custom authorization tokens for access control. You can now use custom tokens to authenticate and authorize HTTPS over the TLS server authentication protocol and MQTT over WebSocket connections to AWS IoT.
In this blog post, I explain the AWS IoT custom authorizer design and then demonstrate the end-to-end process of setting up a custom authorizer to authorize an HTTPS over TLS server authentication connection to AWS IoT using a custom authorization token. In this process, you configure an AWS Lambda function, which will validate the token and provide the policies necessary to control the access privileges on the connection.
Note: This post assumes you are familiar with AWS IoT and AWS Lambda enough to perform steps using the AWS CLI and OpenSSL. Make sure you are running the latest version of the AWS CLI.
Overview of the custom authorizer workflow
The following numbered diagram illustrates the custom authorizer workflow. The diagram is followed by explanations of the steps.
To explain the steps of the workflow as illustrated in the preceding diagram:
1. The connected device uses the AWS SDK or a custom client to make an HTTPS or MQTT over WebSocket request to the AWS IoT gateway. The request includes a custom authorization token and the name of a preconfigured authorizer Lambda function that is to be invoked to validate the authorization token.
2. The AWS IoT gateway identifies the authorization token in the request and determines that it is a custom authorization request. The gateway checks whether a Lambda authorization function is configured for the AWS account that owns the device. If so, the gateway invokes the Lambda function, passing it the authorization token.
3. The Lambda function verifies the authorization token and returns to the AWS IoT gateway a principal that identifies the connection and a set of AWS IoT policies that determine permissions for the requested operation. The Lambda function also returns two time-to-live values that determine the validity (in seconds) of the connection and policy documents.
4. The AWS IoT gateway invokes the AWS policy evaluation engine to authorize the requested operation against the set of policies received from the authorizer Lambda function.
5. The AWS policy evaluation engine evaluates the policy documents and returns the result to the AWS IoT gateway. The gateway then caches the policy documents.
6. If the policy evaluation allows the requested operation, the AWS IoT gateway serves the request. If the requested operation is denied, the AWS IoT gateway returns an AccessDenied exception with the status code of 403 to the device (the red line in the preceding diagram).
Outline of the steps to configure and use the custom authorizer
The following are the steps you will perform as part of the solution:
1. Create a Lambda function: Start by creating a Lambda function. The function takes the authorization token in the input, verifies it, and returns authorization policies to determine the caller’s permissions for the requested operation.
2. Create an authorizer: Create an authorizer in AWS IoT. An authorizer is a data model that points to a pre-created Lambda function. You specify an authorizer name in the custom authorization request, and AWS IoT invokes the corresponding Lambda function to verify the authorization token. You may update the authorizer to point to a different Lambda function and thus easily control which Lambda function to invoke to verify an authorization token.
3. Designate the default authorizer: You may designate one of your authorizers as the default authorizer. AWS IoT invokes the default authorizer implicitly when a custom authorization request does not include a specific authorizer name.
4. Add Lambda invocation permissions: AWS IoT needs to be able to call your Lambda function on your behalf to verify the token in the custom authorization request. You need to associate a resource policy with the Lambda function to allow this.
5. Test the Lambda function: When the Lambda function and the custom authorizer are configured, use the TestInvokeAuthorizer API to verify that they are functioning correctly.
6. Invoke the custom authorizer: Finally, make an HTTPS request to the gateway that includes a custom authorization token. The request invokes the custom authorizer.
Deploy the solution
1. Create a Lambda function
In this step, I show you how to create a Lambda function that runs your custom authorizer code and returns a set of essential attributes to authorize the request.
Sign in to your AWS account and create from the Lambda console a Lambda function that takes as input an authorization token and performs whichever actions are necessary to validate the token, as shown in the following code example. The output, in JSON format, must contain the following attributes:
IsAuthenticated: This is a Boolean (true/false) attribute that indicates whether the request is authenticated.
PrincipalId: This is an alphanumeric string; the minimum length is 1 character, and the maximum length is 128 characters. This string acts as an identifier associated with the token that is received in the custom authorization request.
PolicyDocuments: This is a list of JSON formatted policy documents following the same conventions as an AWS IoT policy. The list contains at most 10 policy documents, each of which can be a maximum of 2,048 characters.
DisconnectAfterInSeconds: This indicates the maximum duration (in seconds) of the connection to the AWS IoT gateway, after which it will be disconnected. The minimum value is 300 seconds, and the maximum value is 86,400 seconds.
RefreshAfterInSeconds: This is the period between policy refreshes. When it lapses, the Lambda function is invoked again to allow for policy refreshes. The minimum value is 300 seconds, and the maximum value is 86,400 seconds.
The following code example is a Lambda function in Node.js 6.10 that authenticates a token.
// A simple authorizer Lambda function demonstrating
// how to parse the auth token and generate a response
exports.handler = function(event, context, callback) {
    var token = event.token;
    switch (token.toLowerCase()) {
        case 'allow':
            callback(null, generateAuthResponse(token, 'Allow'));
            break;
        case 'deny':
            callback(null, generateAuthResponse(token, 'Deny'));
            break;
        default:
            callback("Error: Invalid token");
    }
};

// Helper function to generate an authorization response
var generateAuthResponse = function(token, effect) {
    // Invoke your preferred identity provider
    // to get the authN and authZ response.
    // The following is hard-coded for simplicity's sake.
    var authResponse = {};
    authResponse.isAuthenticated = true;
    authResponse.principalId = 'principalId';

    var policyDocument = {};
    policyDocument.Version = '2012-10-17';
    policyDocument.Statement = [];

    var statement = {};
    statement.Action = 'iot:Publish';
    statement.Effect = effect;
    statement.Resource = "arn:aws:iot:us-east-1:<your_aws_account_id>:topic/customauthtesting";
    policyDocument.Statement[0] = statement;

    authResponse.policyDocuments = [policyDocument];
    authResponse.disconnectAfterInSeconds = 3600;
    authResponse.refreshAfterInSeconds = 600;
    return authResponse;
};
The preceding function takes an authorization token and returns an object containing the five attributes described earlier in this step.
2. Create an authorizer
Now that you have created the Lambda function, you will create an authorizer with AWS IoT pointing to the Lambda function. You do this so that you can easily control which Lambda function to invoke to verify an authorization token. The following attributes are required to create an authorizer:
AuthorizerName: This is the name of the authorizer. It is a string; the minimum length is 1 character, and the maximum length is 128 characters.
AuthorizerFunctionArn: This is the Amazon Resource Name (ARN) of the Lambda function that you created in the previous step.
TokenKeyName: This specifies the name of the HTTP request header in which your device sends the custom authorization token. It is a string; the minimum length is 1 character, and the maximum length is 128 characters.
TokenSigningPublicKeys: This is a map of at least one and at most two public keys, each at least 2,048 bits. You need to generate a key pair, sign the custom authorization token with the private key, and include the digital signature in the request in order to use custom authorization successfully. AWS IoT needs the corresponding public key to verify the digital signature. Therefore, you must provide the public key while creating an authorizer.
Status: This specifies the status (ACTIVE or INACTIVE) of the authorizer. It is optional. The default value is INACTIVE.
Run the following command in OpenSSL to create an RSA key pair that will be used to generate a token signature.
openssl genrsa -out private.pem 2048
Run the following command to create the public key out of the key pair generated in the previous step.
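openssl rsa -in private.pem -pubout -out public.pem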
You need to store the key pair securely and pass the public key in the TokenSigningPublicKeys parameter when creating the authorizer in AWS IoT. You will use the private key in the key pair to sign the authorization token and include the signature in the custom authorization request.
Run the following command in the AWS CLI to create an authorizer.
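The authorizer name, Lambda function ARN, and token key name below are example values; substitute your own.

aws iot create-authorizer \
    --authorizer-name MyAuthorizer \
    --authorizer-function-arn arn:aws:lambda:us-east-1:<your_aws_account_id>:function:MyAuthorizerFunction \
    --token-key-name MyAuthToken \
    --token-signing-public-keys FirstKey="$(cat public.pem)" \
    --status ACTIVE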
You must set the authorizer in the ACTIVE status to be invoked. The describe-authorizer API returns the attributes of an existing authorizer. You can use the following command to verify if all the attributes in the authorizer are set correctly.
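aws iot describe-authorizer --authorizer-name MyAuthorizer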
3. Designate the default authorizer

You can have multiple authorizers in your account. You can designate one of them as the default so that AWS IoT invokes it if the custom authorization request does not specify an authorizer name. Run the following command in the AWS CLI to designate a default authorizer.
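aws iot set-default-authorizer --authorizer-name MyAuthorizer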
4. Add Lambda invocation permissions

AWS IoT needs to invoke your authorizer Lambda function to evaluate the custom authorizer token. You need to give AWS IoT the appropriate permissions to invoke your Lambda function when a custom authorization request is made. Use the AWS CLI with the AddPermission command to grant the AWS IoT service principal permission to call lambda:InvokeFunction, as shown in the following command.
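aws lambda add-permission \
    --function-name MyAuthorizerFunction \
    --principal iot.amazonaws.com \
    --source-arn arn:aws:iot:us-east-1:<your_aws_account_id>:authorizer/MyAuthorizer \
    --statement-id Id-123 \
    --action "lambda:InvokeFunction"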
Note that you are using the precreated AuthorizerArn as the SourceArn while granting the permission. The Lambda function gets triggered only if the source ARN provided by AWS IoT during the invocation matches the SourceArn for which you have granted permission. This means that even if your Lambda function ARN is put in an authorizer owned by someone else, they cannot trigger the function and cause illegitimate charges to you.
5. Test the Lambda function
To verify the configuration, test to see if AWS IoT can invoke the Lambda function and get the correct output. You can do this by using the TestInvokeAuthorizer API. The API takes the following input:
AuthorizerName: This is the name of the authorizer. It is a string; the minimum length is 1 character, and the maximum length is 128 characters.
Token: This specifies the custom authorization token to authorize the request to the AWS IoT gateway. It is a string; the minimum length is 1 character, and the maximum length is 1,024 characters.
TokenSignature: This is the token signature generated by using the private key with the SHA256withRSA algorithm. It is a string with a maximum length of 2,560 characters. You must Base64-encode the signature while passing it as an input (the command follows).
Run the following command in OpenSSL to generate a signature for the string allow.
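echo -n allow | openssl dgst -sha256 -sign private.pem | openssl base64 -A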
If AWS IoT is able to invoke the Lambda function successfully, the output of the TestInvokeAuthorizer API will be exactly the same as the output of the Lambda function. The following is sample output of the test-invoke-authorizer command for the Lambda function you created in Step 1 earlier in this post.
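The invocation itself looks something like this (the signature is the Base64 string generated in the previous step):

aws iot test-invoke-authorizer \
    --authorizer-name MyAuthorizer \
    --token allow \
    --token-signature <Base64-encoded signature>

The response should mirror the Lambda function’s return values, along these lines:

{
    "isAuthenticated": true,
    "principalId": "principalId",
    "policyDocuments": [
        "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Action\":\"iot:Publish\",\"Effect\":\"Allow\",\"Resource\":\"arn:aws:iot:us-east-1:<your_aws_account_id>:topic/customauthtesting\"}]}"
    ],
    "disconnectAfterInSeconds": 3600,
    "refreshAfterInSeconds": 600
}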
6. Invoke the custom authorizer

You can now make a custom authorization request to the AWS IoT gateway. Custom authorization is supported for HTTPS over TLS server authentication and MQTT over WebSocket connections. Regardless of the protocol, requests must include the following attributes in the header. Note that supplying these attributes through query parameters is not allowed for security reasons.
Token: This specifies the authorization token. The header name must be the same as the TokenKeyName of the authorizer.
TokenSignature: This specifies the Base64-encoded digital signature of the token string. The header name must be x-amz-customauthorizer-signature.
AuthorizerName: This specifies the name of one of the ACTIVE authorizers preconfigured in your account. The header name must be x-amz-customauthorizer-name. If this field is not present and you have preconfigured a default custom authorizer for your account, the AWS IoT gateway invokes the default authorizer to authenticate and authorize the request.
Run the following command in the AWS CLI to obtain your AWS account-specific AWS IoT endpoint. See the DescribeEndpoint API documentation for further details. You need to specify the endpoint when making requests to the AWS IoT gateway.
aws iot describe-endpoint
The following is sample output of the describe-endpoint command. It contains the endpointAddress.
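{
    "endpointAddress": "<your_account_prefix>.iot.us-east-1.amazonaws.com"
}

(Illustrative output; the address prefix and region will be specific to your account.)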
Now, make an HTTPS request to the AWS IoT gateway to post a message. You may use your preferred HTTP client for the request. I use curl in the following example, which posts the message, “Hello from custom auth,” to an MQTT topic, customauthtesting, by using the token, allow.
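The request might look like the following sketch — the MyAuthToken header name must match your authorizer’s TokenKeyName, and the endpoint and signature are placeholders. HTTPS on the default port 443 is assumed here; check the AWS IoT documentation for the publish endpoint details in your region.

curl -X POST \
    -H "MyAuthToken: allow" \
    -H "x-amz-customauthorizer-name: MyAuthorizer" \
    -H "x-amz-customauthorizer-signature: <Base64-encoded signature>" \
    -d "Hello from custom auth" \
    "https://<endpointAddress>/topics/customauthtesting?qos=0"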
If the command succeeds, you will see the following output.
HTTP/1.1 200 OK
content-type: application/json
content-length: 65
date: Sun, 07 Jan 2018 19:56:12 GMT
x-amzn-RequestId: e7c13873-6c61-a12c-1216-9a935901c130
connection: keep-alive
{"message":"OK","traceId":"e7c13873-6c61-a12c-1216-9a935901c130"}
Conclusion
In this blog post, I have shown how to configure a custom authorizer in your AWS account and use it to authorize an HTTPS over TLS server authentication connection and publish a message to a specific MQTT topic by using a custom authorization token. Similarly, you can use custom authorizers to authorize MQTT over WebSocket requests to the AWS IoT gateway to publish messages to a specific topic or subscribe to a topic filter.
If you have comments about this blog post, submit them in the “Comments” section below. If you have questions about or issues implementing this solution, start a new thread in the AWS IoT forum.
Today I’m back with an update on the Pi Plug I made a while back. This prototype is still in the works, and is much more modular than the previous version. https://N-O-D-E.net/piplug2.html https://github.com/N-O-D-E/piplug
The Pi Zero Power Case
In a video early last year, YouTuber N-O-D-E revealed his Pi Zero Power Case, an all-in-one always-on networked computer that fits snugly against a wall power socket.
The project uses an official Raspberry Pi power supply, a Zero4U USB hub, and a Raspberry Pi Zero W, and it allows completely wireless connection to a network. N-O-D-E cut the power cord and soldered its wires directly to the power input of the USB hub. The hub powers the Zero via pogo pins that connect directly to the test pads beneath.
The Power Case is a neat project, but it may be a little daunting for anyone not keen on cutting and soldering the power supply wires.
Pi Plug 2
In his overhaul of the design, N-O-D-E has created a modular reimagining of his previous always-on networked computer that has a more streamlined fit against the wall socket and requires absolutely no soldering or hacking of physical hardware.
The Pi Plug 2 uses a USB power supply alongside two custom PCBs and a Zero W. While one PCB houses a USB connector that slots directly into the power supply, two blobs of solder on the second PCB press against the test pads beneath the Zero W. When connected, the PCBs run power directly from the wall socket to the Raspberry Pi Zero W. Neat!
While N-O-D-E isn’t currently selling these PCBs in his online store, all files are available on GitHub, so have a look if you want to recreate the Pi Plug.
Uses
In another video — and seriously, if you haven’t checked out N-O-D-E’s YouTube channel yet, you really should — he demonstrates a few changes that can turn your Zero into a USB dongle computer. This is a great hack if you don’t want to carry a power supply around in your pocket. As N-O-D-E explains:
Besides simply SSH’ing into the Pi, you could also easily install a remote desktop client and use the GUI. You can share your computer’s internet connection with the Pi and use it just like you would normally, but now without the need for a monitor, chargers, adapters, cables, or peripherals.
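His exact steps are in the video, but the usual recipe for Ethernet-over-USB gadget mode on a Zero involves two small edits on the boot partition:

# In config.txt, enable the USB device-mode driver:
dtoverlay=dwc2

# In cmdline.txt, load the gadget Ethernet module (added after rootwait, on the same line):
modules-load=dwc2,g_ether

After a reboot, the Zero shows up as a USB network device on the host computer, ready for SSH.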
We’re keen to see how our community is hacking their Zeros and Zero Ws in order to take full advantage of the small footprint of the computer, so be sure to share your projects and ideas with us, either in the comments below or via social media.
This post summarizes the responses we received to our November 28 post asking our readers how they handle the challenge of digital asset management (DAM).
This past November, we published a blog post entitled What’s the Best Solution for Managing Digital Photos and Videos? We asked our readers to tell us how they’re currently backing up their digital media assets and what their ideal system might be. We posed these questions:
How are you currently backing up your digital photos, video files, and/or file libraries/catalogs? Do you have a backup system that uses attached drives, a local network, the cloud, or offline storage media? Does it work well for you?
Imagine your ideal digital asset backup setup. What would it look like? Don’t be constrained by current products, technologies, brands, or solutions. Invent a technology or product if you wish. Describe an ideal system that would work the way you want it to.
We were thrilled to receive a large number of responses from readers. What was clear from the responses is that there is no consensus on solutions for either amateurs or professionals, and that users had many ideas for how digital media management could be improved to meet their needs.
We asked our readers to contribute to this dialog for a number of reasons. As a cloud backup and cloud storage service provider, we want to understand how our users are working with digital media so we know how to improve our services. Also, we want to participate in the digital media community, and hope that sharing the challenges our readers are facing and the solutions they are using will make a contribution to that community.
The State of Managing Digital Media
While a few readers told us they had settled on a system that worked for them, most said that they were still looking for a better solution. Many expressed frustration with managing the volume of data for digital photos and videos, which only grows with the increasing resolution of still and video cameras. Amateurs are making do with a number of consumer services, while professionals employ a wide range of commercial, open source, or jury-rigged solutions for managing data and maintaining its integrity.
I’ve summarized the responses we received in three sections: 1) what readers are doing today, 2) common wishes they have for improvements, and 3) concerns that were expressed by a number of respondents.
The Digital Media Workflow
Protecting Media From Camera to Cloud
We heard from a wide range of smartphone users, DSLR and other format photographers, and digital video creators. Speed of operation, the ability to share files with collaborators and clients, and product feature sets were frequently cited as reasons for selecting their particular solution. Also of great importance was protecting the integrity of media through the entire capture, transfer, editing, and backup workflow.
Avid Media Composer
Many readers said they backed up their camera memory cards as soon as possible to a computer or external drive and erased cards only when they had more than one backup of the media. Some said that they used dual memory cards that are written to simultaneously by the camera for peace of mind.
While some cameras now come equipped with Wi-Fi, no one other than smartphone users said they were using Wi-Fi as part of their workflow. Also, we didn’t receive feedback from any photographers who regularly shoot tethered.
Some readers said they still use CDs and DVDs for storing media. One user admitted to previously using VHS tape.
NAS (Network Attached Storage) is in wide use. Synology, Drobo, FreeNAS, and other RAID and non-RAID storage devices were frequently mentioned.
A number were backing up their NAS to the cloud for archiving. Others said they had duplicate external drives that were stored onsite or offsite, including in a physical safe, other business locations, a bank lock box, and even “mom’s house.”
Many said they had regular backup practices, including nightly backups, weekly and other regularly scheduled backups, often in non-work hours.
One reader said that a monthly data scrub was performed on the NAS to ensure data integrity.
Hardware used for backups included Synology, QNAP, Drobo, and FreeNAS systems.
Services used by our readers for backing up included Backblaze Backup, Backblaze B2 Cloud Storage, CrashPlan, SmugMug, Amazon Glacier, Google Photos, Amazon Prime Photos, Adobe Creative Cloud, Apple Photos, Lima, DropBox, and Tarsnap. Some readers made a distinction between how they used sync (such as DropBox), backup (such as Backblaze Backup), and storage (such as Backblaze B2), but others did not. (See Sync vs. Backup vs. Storage on our blog for an explanation of the differences.)
Software used for backups and maintaining file integrity included Arq, Carbon Copy Cloner, ChronoSync, SoftRAID, FreeNAS, corz checksum, rclone, rsync, Apple Time Machine, Capture One, Btrfs, BorgBackup, SuperDuper, restic, Acronis True Image, custom Python scripts, and smartphone apps PhotoTransfer and PhotoSync.
Cloud torrent services mentioned were Offcloud, Bitport, and Seedr.
A common practice mentioned is to use SSDs (solid-state drives) in the working computer or attached drives (or both) to improve speed and reliability. Protection from magnetic fields was another reason given to use SSDs.
Many users copy their media to multiple attached or network drives for redundancy.
Users of Lightroom reported keeping their Lightroom catalog on a local drive and their photo files on an attached drive. They frequently had different backup schemes for the catalog and the media. Many readers are careful to have multiple backups of their Lightroom catalog. Some expressed the desire to back up both their original raw files and their edited (working) raw files, but limitations in bandwidth and backup media caused some to give priority to good backups of their raw files, since the edited files could be recreated if necessary.
A number of smartphone users reported using Apple or Google Photos to store their photos and share them.
Digital Editing and Enhancement
Adobe still rules for many users for photo editing. Some expressed interest in alternatives from Phase One, Skylum (formerly Macphun), ON1, and DxO.
Adobe Lightroom
While Adobe Lightroom (and Adobe Photoshop for some) are the foundation of many users’ photo media workflow, others are still looking for something that might better suit their needs. A number of comments were made regarding Adobe’s switch to a subscription model.
Software used for image and video editing and enhancement included Adobe Lightroom, Adobe Photoshop, Luminar, Affinity Photo, Phase One, DxO, ON1, GoPro Quik, Apple Aperture (discontinued), Avid Media Composer, Adobe Premiere, and Apple Final Cut Studio (discontinued) or Final Cut Pro.
Luminar 2018 DAM preview
Managing, Archiving, Adding Metadata, Searching for Media Files
While some of our respondents are casual or serious amateur digital media users, others make a living from digital photography and videography. A number of our readers report having hundreds of thousands of files and many terabytes of data — even approaching one petabyte of data for one professional who responded. Whether amateur or professional, all shared the desire to preserve their digital media assets for the future. Consequently, they want to be able to attach metadata quickly and easily, and search for and retrieve files from wherever they are stored when necessary.
It’s not surprising that metadata was of great interest to our readers. Tagging, categorizing, and maintaining searchable records is important to anyone dealing with digital media.
While Lightroom was frequently used to manage catalogs, metadata, and files, others used spreadsheets to record archive location and grep for searching records.
Some liked the idea of Adobe’s Creative Cloud but weren’t excited about its cost and lack of choice in cloud providers.
Others reported using Photo Mechanic, DxO, digiKam, Google Photos, Daminion, Photo Supreme, Phraseanet, Phase One Media Pro, Google Picasa (discontinued), Adobe Bridge, Synology Photo Station, FotoStation, PhotoShelter, Flickr, and SmugMug.
Photo Mechanic 5
Common Wishes For Managing Digital Media in the Future
Our readers came through with numerous suggestions for how digital media management could be improved. There were a number of common themes centered around bigger and better storage, faster broadband or other ways to get data into the cloud, managing metadata, and ensuring integrity of their data.
Many wished for faster internet speeds that would make transferring and backing up files more efficient. This desire was expressed multiple times. Many said that the sheer volume of digital data they worked with made cloud services and storage impractical.
A number of readers would like the option to ship files on a physical device to a cloud provider so that the initial large transfer would not take as long. Some wished to be able to send monthly physical transfers with incremental transfers sent over the internet. (Note that Backblaze supports adding data via a hardware drive to B2 Cloud Storage with our Fireball service.)
Reasonable service cost, not surprisingly, was a desire expressed by just about everyone.
Many wished for not just backup, but long-term archiving of data. One suggestion was to be able to specify the length-of-term for archiving and pay by that metric for specific sets of files.
An easy-to-use Windows, Macintosh, or Linux client was a feature that many appreciated. Some were comfortable with using third-party apps for cloud storage and others wanted a vendor-supplied client.
A number of users like the combination of NAS and cloud. Many backed up their NAS devices to the cloud. Some suggested that the NAS should be the local gateway to unlimited virtual storage in the cloud. (They should read our recent blog post on Morro Data’s CloudNAS solution.)
Some just wanted the storage problem solved. They would like the computer system to manage storage intelligently so they don’t have to. One reader said that storage should be managed and optimized by the system, as RAM is, and not by the user.
Common Concerns Expressed by our Readers
Over and over again our readers expressed similar concerns about the state of digital asset management.
Dealing with large volumes of data was a common challenge. As digital media files increase in size, readers struggle to manage the amount of data they have to deal with. As one reader wrote, “Why don’t I have an online backup of my entire library? Because it’s too much damn data!”
Many said they would back up more often, or back up even more files if they had the bandwidth or storage media to do so.
The cloud is attractive to many, but some said that they didn’t have the bandwidth to get their data into the cloud in an efficient manner, the cloud is too expensive, or they have other concerns about trusting the cloud with their data.
Most of our respondents are using Apple computer systems, some Windows, and a few Linux. A lot of the Mac users are using Time Machine. Some liked the concept of Time Machine but said they had experienced corrupted data when using it.
Visibility into the backup process was mentioned many times. Users want to know what’s happening to their data. A number said they wanted automatic integrity checks of their data and reports sent to them if anything changes.
A number of readers said they didn’t want to be locked into one vendor’s proprietary solution. They prefer open standards to prevent loss if a vendor leaves the market, changes the product, or makes a turn in strategy that they don’t wish to follow.
A number of users talked about how their practices differed depending on whether they were working in the field or working in a studio or at home. Access to the internet and data transfer speed was an issue for many.
It’s clear that people working in high resolution photography and videography are pushing the envelope for moving data between storage devices and the cloud.
Some readers expressed concern about the integrity of their stored data. They were concerned that over time, files would degrade. Some asked for tools to verify data integrity manually, or that data integrity should be monitored and reported by the storage vendor on a regular basis. The OpenZFS and Btrfs file systems were mentioned by some.
A few readers mentioned that they preferred redundant data centers for cloud storage.
Metadata is an important element for many, and making sure that metadata is easily and permanently associated with their files is essential.
The ability to share working files with collaborators or finished media with clients, friends, and family also is a common requirement.
Thank You for Your Comments and Suggestions
As a cloud backup and storage provider, we found your contributions to be of great interest. A number of readers made suggestions for how we can improve or augment our services to increase the options for digital media management. We listened and are considering your comments. They will be included in our discussions and planning for possible future services and offerings from Backblaze. We thank everyone for your contributions.
Digital media management
Let’s Keep the Conversation Going!
Were you surprised by any of the responses? Do you have something further to contribute? This is by no means the end of our exploration of how to better serve media professionals, so let’s keep the lines of communication open.
Recreate Snake, the favourite mobile phone game from the late nineties, using a slug*, a Raspberry Pi, a Sense HAT, and our free resource!
*A virtual slug. Not a real slug. Please leave the real slugs out in nature.
Snake SLUG!
Move aside, Angry Birds! On your bike, Pokémon Go! When it comes to the cream of the crop of mobile phone games, Snake holds the top spot.
I could while away the hours…
You may still have an old Nokia 3310 lost in the depths of a drawer somewhere — the drawer that won’t open all the way because something inside is jammed at an odd angle. So it will be far easier to grab your Pi and Sense HAT, or use the free Sense HAT emulator (online or on Raspbian), and code Snake SLUG yourself. In doing so, you can introduce the smaller residents of your household to the best reptile-focused game ever made…now with added mollusc.
The resource
To try out the game for yourself, head to our resource page, where you’ll find the online Sense HAT emulator embedded and ready to roll.
It’ll look just like this, and you can use your computer’s arrow keys to direct your slug toward her tasty treats.
From there, you’ll be taken on a step-by-step journey from zero to SLUG glory while coding your own version of the game in Python. On the way, you’ll learn to work with two-dimensional lists and to use the Sense HAT’s pixel display and joystick input. And by completing the resource, you’ll expand your understanding of applying abstraction and decomposition to solve more complex problems, in line with our Digital Making Curriculum.
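If you’d like a taste before you start, a few lines of Python with the Sense HAT library (swap in sense_emu if you’re using the emulator) will light a pixel and react to the joystick:

# Works on a real Sense HAT; on the emulator, import from sense_emu instead.
from signal import pause
from sense_hat import SenseHat

sense = SenseHat()
sense.set_pixel(0, 0, 0, 255, 0)  # one green slug segment at (0, 0)

def move_up(event):
    if event.action == 'pressed':
        print('The slug slides up!')

sense.stick.direction_up = move_up  # fires whenever the joystick is pushed up
pause()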
The Sense HAT
The Raspberry Pi Sense HAT was originally designed and made as part of the Astro Pi mission in December 2015. With an 8×8 RGB LED matrix, a joystick, and a plethora of on-board sensors including an accelerometer, gyroscope, and magnetometer, it’s a great add-on for your digital making toolkit, and excellent for projects involving data collection and evaluation.
As more and more digital home assistants are appearing on the consumer market, it’s not uncommon to see the towering Amazon Echo or sleek Google Home when visiting friends or family. But we, the maker community, are rarely happy unless our tech stands out from the rest. So without further ado, here’s a roundup of some fantastic retrofitted home assistant projects you can recreate and give pride of place in your kitchen, on your bookshelf, or wherever else you’d like to talk to your virtual, disembodied PA.
Turned an 80s Tomy Mr Money into a little Google AIY / Raspberry Pi based assistant.
Matt ‘Circuitbeard’ Brailsford’s Tomy Mr Money Google AIY Assistant is just one of many home-brew home assistants makers have built since the release of APIs for Amazon Alexa and Google Home. Here are some more…
Teddy Ruxpin
Oh Teddy, how exciting and mysterious you were when I unwrapped you back in the mid-eighties. With your awkwardly moving lips and twitching eyelids, you were the cream of the crop of robotic toys! How was I to know that during my thirties, you would become augmented with home assistant software and suddenly instil within me a fear unlike any I’d felt before? (Save for my lifelong horror of ET…)
There are tons of virtual assistants out on the market: Siri, Ok Google, Alexa, etc. I had this crazy idea…what if I made the virtual assistant real…kinda. I decided to take an old animatronic teddy bear and hack it so that it ran Amazon Alexa.
Several makers around the world have performed surgery on Teddy to install a Raspberry Pi within his stomach and integrate him with Amazon Alexa Voice or Google’s AIY Projects Voice kit. And because these makers are talented, they’ve also managed to hijack Teddy’s wiring to make his lips move in time with his responses to your commands. Freaky…
Speaking of freaky: check out Zack’s Furlexa — an Amazon Alexa Furby that will haunt your nightmares.
Give old tech new life
Devices that were the height of technology when you purchased them may now be languishing in your attic collecting dust. With new and improved versions of gadgets and gizmos being released almost constantly, it is likely that your household harbours a spare whosit or whatsit which you can dismantle and give a new Raspberry Pi heart and purpose.
Take, for example, Martin Mander’s Google Pi intercom. By gutting and thoroughly cleaning a vintage intercom, Martin fashioned a suitable housing for the Google AIY Projects Voice kit to create a new home assistant for his house:
This is a 1986 Radio Shack Intercom that I’ve converted into a Google Home style device using a Raspberry Pi and the Google AIY (Artificial Intelligence Yourself) kit that came free with the MagPi magazine (issue 57). It uses the Google Assistant to answer questions and perform actions, using IFTTT to integrate with smart home accessories and other web services.
Not only does this build look fantastic, it’s also a great conversation starter for any visitors who had a similar device during the eighties.
…and then I’ll put that box inside of another box, and then I’ll mail that box to myself, and when it arrives…
A GIF. A harmless, little GIF…and proof of the comms team’s obsession with The Emperor’s New Groove.
You don’t have to be fancy when it comes to housing your home assistant. And often, especially if you’re working with the smaller people in your household, the results of a simple homespun approach are just as delightful.
Here are Hannah and her dad Tom, explaining how they built a home assistant together and fit it inside an old cigar box:
My 7 year old daughter and I decided to play around with the Raspberry Pi and build ourselves an Amazon Echo (Alexa). The video tells you about what we did and the links below will take you to all the sites we used to get this up and running.
And now it’s your turn! I challenge you all (and also myself) to create a home assistant using the Raspberry Pi. Whether you decide to fit Amazon Alexa inside an old shoebox or Google Home inside your sister’s Barbie, I’d love to see what you create using the free home assistant software available online.
Check out these other home assistants for Raspberry Pi, and keep an eye on our blog to see what I manage to create as part of the challenge.
Tim Rowledge produces and sells wonderful replicas of the cases which our Astro Pis live in aboard the International Space Station. Here is the story of how he came to do this. Over to you, Tim!
When the Astro Pi case was first revealed a couple of years ago, the collective outpouring of ‘Squee!’ it elicited may have been heard on board the ISS itself. People wanted to buy it or build it at home, and someone wanted to know whether it would blend. (There’s always one.)
The Sense HAT and its Pi tucked snugly in the original Astro Pi flight case — gorgeous, isn’t it?
Replicating the Astro Pi case
Some months later the STL files for printing your own Astro Pi case were released, and people jumped at the chance to use them. Soon reports appeared saying you had to make quite a few attempts before getting a good print — normal for any complex 3D-printing project. A fellow member of my local makerspace successfully made a couple of cases, but it took a lot of time, filament, and post-print finishing work. And of course, a plastic Astro Pi case simply doesn’t look or feel like the original made of machined aluminium — or ‘aluminum’, as they tend to say over here in North America.
A batch of tops designed by Tim
I wanted to build an Astro Pi case which would more closely match the original. Fortunately, someone else at my makerspace happens to have some serious CNC machining equipment at his small manufacturing company. Therefore, I focused on creating a case design that could be produced with his three-axis device. This meant simplifying some parts to avoid expensive, slow, complex multi-fixture work. It took us a while, but we ended up with a design we can efficiently make using his machine.
Tim’s first lasered case
And the resulting case looks really, really like the original — in fact, upon receiving one of the final prototypes, Eben commented:
“I have to say, at first glance they look spectacular: unless you hold them side by side with the originals, it’s hard to pinpoint what’s changed. I’m looking forward to seeing one built up and then seeing them in the wild.”
Inside the Astro Pi case
Making just the bare case is nice, but there are other parts required to recreate a complete Astro Pi unit. Thus I got my local electronics company to design a small HAT to provide much the same support the mezzanine board offers: an RTC and nice, clean connections to the six buttons. We also added well-labelled, grouped pads for all the other GPIO lines, along with space for an ADC. If you’re making your own Astro Pi replica, you might like the Switchboard.
The electronics supply industry just loves to offer *some* of what you need, so that one supplier never has everything: we had to obtain the required stand-offs, screws, spacers, and JST wires from assorted other sources. Jeff at my nearby Industrial Paint & Plastics took on the laser engraving of our cases, leaving out copyrighted logos etcetera.
Lasering the top of a case
Get your own Astro Pi case
Should you like to buy one of our Astro Pi case kits, pop over to www.astropicase.com, and we’ll get it on its way to you pronto. If you’re an institutional or corporate customer, the fully built option might make more sense for you — ordering the Pi and other components, and having a staff member assemble it all, may well be more work than is sensible.
Tim’s first full Astro Pi case replica, complete with shiny APEM buttons
To put the kit together yourself, all you need to do is add a Pi, Sense HAT, Camera Module, and RTC battery, and choose your buttons. An illustrated manual explains the process step by step. Our version of the Astro Pi case uses the same APEM buttons as the units in orbit, and whilst they are expensive, just clicking them is a source of great joy. It comes in a nice travel case too.
This is Tim. Thanks, Tim!
Take part in Astro Pi
If having an Astro Pi replica is not enough for you, this is your chance: the 2017-18 Astro Pi challenge is open! Do you know a teenager who might be keen to design an experiment to run on the Astro Pis in space? Are you one yourself? You have until 29 October to send us your Mission Space Lab entry and become part of the next generation of space scientists. Head over to the Astro Pi website to find out more.
Joe Birch has built a simple device that converts online news stories to Braille, inspired by his family’s predisposition to loss of eyesight. He has based his BrailleBox on Android Things, News API, and a Raspberry Pi 3.
The background
Braille is a symbol system for people with visual impairment which represents letters and numbers as raised points. Commercial devices that dynamically produce Braille are very expensive, so Joe decided to build a low-cost alternative that is simple to recreate.
News API is a tool for fetching JSON metadata from over 70 online news sources. You can use it to integrate headlines or articles into websites and text-based applications.
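To make that concrete, here is a hedged Python sketch of a News API request using the requests library; the API key is a placeholder, and the v2 endpoint shown is the current one, which may differ from the version Joe built against:

```python
import requests

API_KEY = "YOUR_NEWSAPI_KEY"  # placeholder: get a free key from newsapi.org

resp = requests.get(
    "https://newsapi.org/v2/top-headlines",
    params={"sources": "bbc-news", "apiKey": API_KEY},
)
resp.raise_for_status()

# The response is JSON metadata; print just the headline of each article.
for article in resp.json()["articles"]:
    print(article["title"])
```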
The BrailleBox
To create the six nubs necessary to form Braille symbols, Joe topped solenoids with wooden balls. He then wired them up to GPIO pins of the Pi 3 via a breadboard.
One of the solenoids Joe built into the BrailleBox
Next, he took control of the solenoids using Android Things. He set up the BrailleBox software to start running on boot, and added a push button. When he presses the button, the program fetches a news story using News API, and the solenoids start moving.
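BrailleBox itself runs on Android Things, so the following is not Joe’s code: it is a hedged Python sketch of the same idea for anyone prototyping on Raspbian. The GPIO pin numbers are invented, and in a real build the solenoids would be switched through driver transistors rather than powered directly from GPIO pins:

```python
import time
import RPi.GPIO as GPIO

# One GPIO pin per solenoid, in Braille dot order 1-6 (pin numbers made up).
DOT_PINS = [17, 27, 22, 5, 6, 13]

# Dot patterns for a few letters; standard Braille numbers dots 1-3 down
# the left column of the cell and 4-6 down the right.
LETTERS = {"a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}}

GPIO.setmode(GPIO.BCM)
for pin in DOT_PINS:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def show(letter, hold=1.0):
    """Raise the solenoids for one Braille cell, then hold the pattern."""
    for dot, pin in enumerate(DOT_PINS, start=1):
        GPIO.output(pin, GPIO.HIGH if dot in LETTERS[letter] else GPIO.LOW)
    time.sleep(hold)

try:
    for ch in "abcd":
        show(ch)
finally:
    GPIO.cleanup()  # release the pins even if something goes wrong
```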
Since Joe is an Android Engineer, looking through his write-up and code for BrailleBox might be useful for anyone interested in Android Things.
If you like this project, make sure you keep an eye on Joe’s Twitter, since he has plans to update the BrailleBox design. His next step is to move on from the prototyping stage and house all the hardware inside the box. Moreover, he is thinking about adding a potentiometer so that users can choose their preferred reading speed.
Accessibility
If you want to find our community’s conversation about accessibility and assistive technology, head to the forums. And if you’re working to make computing more accessible, or if you’ve built an assistive project, let us know in the comments or on social media, so that we can boost the signal!
In a previous AWS Security Blog post, Jeff Levine showed how you can monitor changes to your Amazon EC2 security groups. The methods he describes in that post are examples of detective controls, which can help you determine when changes are made to security controls on your AWS resources.
In this post, I take that approach a step further by introducing an example of a responsive control, which you can use to automatically respond to a detected security event by applying a chosen security mitigation. I demonstrate a solution that continuously monitors changes made to an Amazon VPC security group, and if a new ingress rule (the same as an inbound rule) is added to that security group, the solution removes the rule and then sends you a notification after the changes have been automatically reverted.
The scenario
Let’s say you want to reduce your infrastructure complexity by replacing your Secure Shell (SSH) bastion hosts with Amazon EC2 Systems Manager (SSM). SSM allows you to run commands on your hosts remotely, removing the need to manage bastion hosts or rely on SSH to execute commands. To support this objective, you must prevent your staff members from opening SSH ports to your web server’s Amazon VPC security group. If one of your staff members does modify the VPC security group to allow SSH access, you want the change to be automatically reverted, and you want to receive a notification that this has happened. If you are not yet familiar with security groups, see Security Groups for Your VPC before reading the rest of this post.
Solution overview
This solution begins with a directive control to mandate that no web server should be accessible using SSH. The directive control is enforced using a preventive control, which is implemented using a security group rule that prevents ingress on port 22 (typically used for SSH). The detective control is a “listener” that identifies any changes made to your security group. Finally, the responsive control reverts changes made to the security group and then sends a notification of this security mitigation.
The detective control, in this case, is an Amazon CloudWatch event that detects changes to your security group and triggers the responsive control, which in this case is an AWS Lambda function. I use AWS CloudFormation to simplify the deployment.
The following diagram shows the architecture of this solution.
Here is how the process works:
1. Someone on your staff adds a new ingress rule to your security group.
2. A CloudWatch event that continually monitors changes to your security groups detects the new ingress rule and invokes a designated Lambda function (with Lambda, you can run code without provisioning or managing servers).
3. The Lambda function evaluates the event to determine whether you are monitoring this security group and reverts the new security group ingress rule (see the sketch below).
4. Finally, the Lambda function sends you an email to let you know what the change was, who made it, and that the change was reverted.
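As an illustration of steps 3 and 4, here is a hedged Python sketch of what such a Lambda function might look like; it is not the code from the CloudFormation template. The security group ID and SNS topic ARN are placeholders, and the field names follow the CloudTrail record format for AuthorizeSecurityGroupIngress:

```python
import boto3

MONITORED_GROUP = "sg-0123456789abcdef0"  # placeholder: the group to protect
TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:RevertNotify"  # placeholder

ec2 = boto3.client("ec2")
sns = boto3.client("sns")

def handler(event, context):
    detail = event["detail"]
    group_id = detail["requestParameters"]["groupId"]
    if group_id != MONITORED_GROUP:
        return  # not a security group we are monitoring

    # Rebuild the permissions that were just authorized, then revoke them.
    revoked = []
    for item in detail["requestParameters"]["ipPermissions"]["items"]:
        perm = {"IpProtocol": item["ipProtocol"]}
        if "fromPort" in item:  # absent when the protocol is "all" (-1)
            perm["FromPort"] = item["fromPort"]
            perm["ToPort"] = item["toPort"]
        perm["IpRanges"] = [
            {"CidrIp": r["cidrIp"]}
            for r in item.get("ipRanges", {}).get("items", [])
        ]
        ec2.revoke_security_group_ingress(GroupId=group_id, IpPermissions=[perm])
        revoked.append(perm)

    # Tell the subscriber what changed, who changed it, and that it was reverted.
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="Security group ingress rule reverted",
        Message=(
            f"User {detail['userIdentity']['arn']} added {revoked} to "
            f"{group_id}; the change has been reverted."
        ),
    )
```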
Deploy the solution by using CloudFormation
In this section, you will click the Launch Stack button shown below to launch the CloudFormation stack and deploy the solution.
Prerequisites
You must have AWS CloudTrail already enabled in the AWS Region where you will be deploying the solution. CloudTrail lets you log, continuously monitor, and retain events related to API calls across your AWS infrastructure. See Getting Started with CloudTrail for more information.
You must have a default VPC in the region in which you will be deploying the solution. AWS accounts have one default VPC per AWS Region. If you’ve deleted your VPC, see Creating a Default VPC to recreate it.
Resources that this solution creates
When you launch the CloudFormation stack, it creates the following resources:
A sample VPC security group in your default VPC, which is used as the target for reverting ingress rule changes.
A CloudWatch event rule that monitors changes to your AWS infrastructure.
A Lambda function that reverts changes to the security group and sends you email notifications.
A permission that allows CloudWatch to invoke your Lambda function.
An Amazon SNS topic to which the Lambda function publishes notifications.
Launch the CloudFormation stack
The link in this section uses the us-east-1 Region (the US East [N. Virginia] Region). Change the region if you want to use this solution in a different region. See Selecting a Region for more information about changing the region.
To deploy the solution, click the following Launch Stack button to launch the stack. After you click the button, you must sign in to the AWS Management Console if you have not already done so.
Then:
1. Choose Next to proceed to the Specify Details page.
2. On the Specify Details page, type your email address in the Send notifications to box. This is the email address to which change notifications will be sent. (After the stack is launched, you will receive a confirmation email that you must accept before you can receive notifications.)
3. Choose Next until you get to the Review page, and then choose the I acknowledge that AWS CloudFormation might create IAM resources check box. This confirms that you are aware that the CloudFormation template includes an IAM resource.
4. Choose Create. CloudFormation displays the stack status, CREATE_COMPLETE, when the stack has launched completely, which should take less than two minutes.
Testing the solution
1. Check your email for the SNS confirmation email. You must confirm this subscription to receive future notification emails. If you don’t confirm the subscription, your security group ingress rules still will be automatically reverted, but you will not receive notification emails.
2. Navigate to the EC2 console and choose Security Groups in the navigation pane.
3. Choose the security group created by CloudFormation. Its name is Web Server Security Group.
4. Choose the Inbound tab in the bottom pane of the page. Note that only one rule allows HTTPS ingress on port 443 from 0.0.0.0/0 (from anywhere).
5. Choose Edit to display the Edit inbound rules dialog box (again, an inbound rule and an ingress rule are the same thing).
6. Choose Add Rule.
7. Choose SSH from the Type drop-down list.
8. Choose My IP from the Source drop-down list. Your IP address is populated for you. By adding this rule, you are simulating one of your staff members violating your organization’s policy (in this blog post’s hypothetical example) against allowing SSH access to your EC2 servers. You are testing the solution created when you launched the CloudFormation stack in the previous section. The solution should remove this newly created SSH rule automatically.
9. Choose Save.
Adding this rule creates an EC2 AuthorizeSecurityGroupIngress service event, which triggers the Lambda function created in the CloudFormation stack. After a few moments, choose the refresh button to see that the new SSH ingress rule that you just created has been removed by the solution you deployed earlier with the CloudFormation stack. If the rule is still there, wait a few more moments and choose the refresh button again.
You should also receive an email to notify you that the ingress rule was added and subsequently reverted.
Cleaning up
If you want to remove the resources created by this CloudFormation stack, delete the CloudFormation stack in the CloudFormation console.
CloudFormation will display a status of DELETE_IN_PROGRESS while it deletes the resources created with the stack. After a few moments, the stack should no longer appear in the list of completed stacks.
Other applications of this solution
I have shown one way to use multiple AWS services to help continuously ensure that your security controls haven’t deviated from your security baseline. However, you also could use the CIS Amazon Web Services Foundations Benchmarks, for example, to establish a governance baseline across your AWS accounts and then use the principles in this blog post to automatically mitigate changes to that baseline.
To scale this solution, you can create a framework that uses resource tags to identify particular resources for monitoring. You also can use a consolidated monitoring approach by using cross-account event delivery. See Sending and Receiving Events Between AWS Accounts for more information. You also can extend the principle of automatic mitigation to detect and revert changes to other resources such as IAM policies and Amazon S3 bucket policies.
Summary
In this blog post, I demonstrated how you can automatically revert changes to a VPC security group and have a notification sent about the changes. You can use this solution in your own AWS accounts to enforce your security requirements continuously.
If you have comments about this blog post or other ideas for ways to use this solution, submit a comment in the “Comments” section below. If you have implementation questions, start a new thread in the EC2 forum or contact AWS Support.
Think you can create a really spooky Halloween video?
We’re giving out $100 Visa gift cards just in time for the holidays. Want a chance to win? You’ll need to make a spooky 30-second Halloween-themed video. We had a lot of fun with this the last time we did it a few years back so we’re doing it again this year.
Here’s How to Enter
Prepare a short, 30 seconds or less, video recreating your favorite horror movie scene using your computer or hard drive as the victim — or make something original!
Insert the following image at the end of the video (right-click and save as):
Upload your video to YouTube
Post a link to your video on the Backblaze Facebook wall or on Twitter with the hashtag #Backblaze so we can see it and enter it into the contest. Or, link to it in the comments below!
Share your video with friends
Common Questions
Q: How many people can be in the video?
A: However many you need in order to recreate the scene!
Q: Can I make it longer than 30 seconds?
A: Maybe 32 seconds, but that’s it. If you want to make a longer “director’s cut,” we’d love to see it, but the contest video should be close to 30 seconds. Please keep it short and spooky.
Q: Can I record it on an iPhone, Android, iPad, Camera, etc?
A: You can use whatever device you wish to record your video.
Q: Can I submit multiple videos?
A: If you have multiple favorite scenes, make a vignette! But please submit only one video.
Q: How many winners will there be?
A: We will select up to three winners total.
Contest Rules
To upload the video to YouTube, you must have a valid YouTube account and comply with all YouTube rules for age, content, copyright, etc.
To post a link to your video on the Backblaze Facebook wall, you must use a valid Facebook account and comply with all Facebook rules for age, content, copyrights, etc.
We reserve the right to remove and/or not consider as a valid entry, any videos which we deem inappropriate. We reserve the exclusive right to determine what is inappropriate.
Backblaze reserves the right to use your video for promotional purposes.
The contest will end on October 29, 2017 at 11:59:59 PM Pacific Daylight Time. The winners (up to three) will be selected by Backblaze and will be announced on October 31, 2017.
We will be giving away gift cards to the top winners. The prize will be mailed to the winner in a timely manner.
Please keep the content of the post PG rated — no cursing or extreme gore/violence.
By submitting a video you agree to all of these rules.
WordPress is the most popular CMS (Content Management System) for websites, with almost 30% of all websites in the world using WordPress. That’s a lot of sites — over 350 million!
In this post we’ll talk about the different approaches to keeping the data on your WordPress website safe.
Stop the Presses! (Or the Internet!)
As we were getting ready to publish this post, we received news from UpdraftPlus, one of the biggest WordPress plugin developers, that they are supporting Backblaze B2 as a storage solution for their backup plugin. They shipped the update (1.13.9) this week. This is great news for Backblaze customers! UpdraftPlus is also offering a 20% discount to Backblaze customers wishing to purchase or upgrade to UpdraftPlus Premium. The complete information is below.
Your WordPress website data is on a web server that’s most likely located in a large data center. You might wonder why it is necessary to have a backup of your website if it’s in a data center. Website data can be lost in a number of ways, including mistakes by the website owner (been there), hacking, or even domain ownership dispute (I’ve seen it happen more than once). A website backup also can provide a history of changes you’ve made to the website, which can be useful. As an overall strategy, it’s best to have a backup of any data that you can’t afford to lose for personal or business reasons.
Your web hosting company might provide backup services as part of your hosting plan. If you are using their service, you should know where and how often your data is being backed up. You don’t want to find out too late that your backup plan was not adequate.
Sites on WordPress.com are automatically backed up by VaultPress (Automattic), which also is available for self-hosted WordPress installations. If you don’t want the work or decisions involved in managing the hosting for your WordPress site, WordPress.com will handle it for you. You do, however, give up some customization abilities, such as the option to add plugins of your own choice.
Very large and active websites might consider WordPress VIP by Automattic, or another premium WordPress hosting service such as Pagely.com.
This post is about backing up self-hosted WordPress sites, so we’ll focus on those options.
WordPress Backup
Backup strategies for WordPress can be divided into broad categories depending on 1) what you back up, 2) when you back up, and 3) where the data is backed up.
With server data, such as with a WordPress installation, you should plan to have three copies of the data (the 3-2-1 backup strategy). The first is the active data on the WordPress web server, the second is a backup stored on the web server or downloaded to your local computer, and the third should be in another location, such as the cloud.
We’ll talk about the different approaches to backing up WordPress, but we recommend using a WordPress plugin to handle your backups. A backup plugin can automate the task, optimize your backup storage space, and alert you of problems with your backups or WordPress itself. We’ll cover plugins in more detail, below.
What to Back Up?
The main components of your WordPress installation are the database, the WordPress core installation, your themes, your plugins, and your media files, plus any other files you’ve customized.
You should decide which of these elements you wish to back up. The database is the top priority, as it contains all your website posts and pages (exclusive of media). Your current theme is important, as it likely contains customizations you’ve made. Following those in priority are any other files you’ve customized or made changes to.
You can choose to back up the WordPress core installation and plugins, if you wish, but these files can be downloaded again if necessary from the source, so you might not wish to include them. You likely have all the media files you use on your website on your local computer (which should be backed up), so it is your choice whether to back these up from the server as well.
If you wish to be able to recreate your entire website easily in case of data loss or disaster, you might choose to back up everything, though on a large website this could be a lot of data.
Generally, you should 1) prioritize any file that you’ve customized that you can’t afford to lose, and 2) decide whether you need a copy of everything in order to get your site back up quickly. These choices will determine your backup method and the amount of storage you need.
A good backup plugin for WordPress enables you to specify which files you wish to back up, and even to create separate backups and schedules for different backup contents. That’s another good reason to use a plugin for backing up WordPress.
When to Back Up?
You can back up manually at any time by using the Export tool in WordPress. This is handy if you wish to do a quick backup of your site or parts of it. Since it is manual, however, it is not a substitute for a dependable backup plan that runs regularly. If you wish to use this tool, go to Tools, Export, and select what you wish to back up. The output will be an XML file that uses the WordPress Extended RSS format, also known as WXR. You can create a WXR file that contains all of the information on your site or just portions of it, by selecting All content, Posts, Pages, or Media. Note: You can use WordPress’s Export tool for sites hosted on WordPress.com as well.
Many of the backup plugins we’ll be discussing later also let you do a manual backup on demand in addition to regularly scheduled or continuous backups.
Note: Another use of the WordPress Export tool and the WXR file is to transfer or clone your website to another server. Once you have exported the WXR file from the website you wish to transfer from, you can import the WXR file from the Tools, Import menu on the new WordPress destination site. Be aware that there are file size limits depending on the settings on your web server. See the WordPress Codex entry for more information. To make this job easier, you may wish to use one of a number of WordPress plugins designed specifically for this task.
You also can manually back up the WordPress MySQL database using a number of tools or a plugin. The WordPress Codex has good information on this. The WordPress backup plugins we discuss below will handle the database for you and do it automatically. They also typically include tools for optimizing the database tables, which is just good housekeeping.
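For instance, a minimal manual database backup might shell out to mysqldump from Python and compress the result. In this hedged sketch, the database name, user, and password are placeholders:

```python
import gzip
import os
import subprocess
from datetime import datetime

DB_NAME = "wordpress"   # hypothetical database name
DB_USER = "wp_user"     # hypothetical database user

outfile = f"{DB_NAME}-{datetime.now():%Y%m%d}.sql.gz"
# MySQL clients read the password from the MYSQL_PWD environment variable.
env = dict(os.environ, MYSQL_PWD="your-password-here")  # placeholder

# mysqldump writes the SQL dump to stdout; compress it before saving.
dump = subprocess.run(["mysqldump", "-u", DB_USER, DB_NAME],
                      capture_output=True, check=True, env=env)
with gzip.open(outfile, "wb") as f:
    f.write(dump.stdout)
print(f"Wrote {outfile}")
```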
A dependable backup strategy doesn’t rely on manual backups, which means you should consider using one of the many backup plugins available either free or for purchase. We’ll talk more about them below.
Which Format To Back Up In?
In addition to the WordPress WXR format, plugins and server tools will use various file formats and compression algorithms to store and compress your backup. You may get to choose between zip, tar, tar.gz, tar.bz2, and others. See The Most Common Archive File Formats for more information on these formats.
Select a format that you know you can access and unarchive should you need access to your backup. All of these formats are standard and supported across operating systems, though you might need to download a utility to access the file.
Where To Back Up?
Once you have your data in a suitable format for backup, where do you back it up to?
We want to have multiple copies of our active website data, so we’ll choose more than one destination for our backup data. The backup plugins we’ll discuss below enable you to specify one or more possible destinations for your backup. The possible destinations for your backup include:
A backup folder on your web server
A backup folder on your web server is an OK solution if you also have a copy elsewhere. Depending on your hosting plan, the size of your site, and what you include in the backup, you may or may not have sufficient disk space on the web server. Some backup plugins allow you to configure the plugin to keep only a certain number of recent backups and delete older ones, saving you disk space on the server.
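The retention logic such plugins implement is simple enough to sketch. This hedged Python example assumes backups accumulate as .zip files in a single, hypothetical folder:

```python
from pathlib import Path

BACKUP_DIR = Path("/var/backups/wordpress")  # hypothetical backup folder
KEEP = 5  # number of recent backups to retain

# Sort newest first by modification time, then delete everything older.
backups = sorted(BACKUP_DIR.glob("*.zip"),
                 key=lambda p: p.stat().st_mtime, reverse=True)
for old in backups[KEEP:]:
    old.unlink()
    print(f"Removed {old.name}")
```

Run on a schedule, this keeps the backup folder from quietly eating your hosting plan’s disk space.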
Email to you
Because email servers have size limitations, the email option is not the best one to use unless you use it to specifically back up just the database or your main theme files.
FTP, SFTP, SCP, WebDAV
FTP, SFTP, SCP, and WebDAV are all widely-supported protocols for transferring files over the internet and can be used if you have access credentials to another server or supported storage device that is suitable for storing a backup.
Sync service (Dropbox, SugarSync, Google Drive, OneDrive)
A sync service is another possible server storage location though it can be a pricier choice depending on the plan you have and how much you wish to store.
Cloud storage (Backblaze B2, Amazon S3, Google Cloud, Microsoft Azure, Rackspace)
A cloud storage service can be an inexpensive and flexible option with pay-as-you go pricing for storing backups and other data.
A good website backup strategy would be to have multiple backups of your website data: one in a backup folder on your web hosting server, one downloaded to your local computer, and one in the cloud, such as with Backblaze B2.
If I had to choose just one of these, I would choose backing up to the cloud because it is geographically separated from both your local computer and your web host, it uses fault-tolerant and redundant data storage technologies to protect your data, and it is available from anywhere if you need to restore your site.
Backup Plugins for WordPress
Probably the easiest and most common way to implement a solid backup strategy for WordPress is to use one of the many backup plugins available for WordPress. Fortunately, there are a number of good ones, available free or in “freemium” plans in which you can use the free version and pay for more features and capabilities only if you need them. The premium options can give you more flexibility in configuring backups or have additional options for where you can store the backups.
How to Choose a WordPress Backup Plugin
When considering which plugin to use, you should take into account a number of factors in making your choice.
Is the plugin actively maintained and up-to-date? You can determine this from the listing in the WordPress Plugin Repository. You also can look at reviews and support comments to get an idea of user satisfaction and how well issues are resolved.
Does the plugin work with your web hosting provider? Generally, well-supported plugins do, but you might want to check to make sure there are no issues with your hosting provider.
Does it support the cloud service or protocol you wish to use? This can be determined from looking at the listing in the WordPress Plugin Repository or on the developer’s website. Developers often will add support for cloud services or other backup destinations based on user demand, so let the developer know if there is a feature or backup destination you’d like them to add to their plugin.
Other features and options to consider in choosing a backup plugin are:
Whether encryption of your backup data is available
What are the options for automatically deleting backups from the storage destination?
Can you globally exclude files, folders, and specific types of files from the backup?
Do the options for scheduling automatic backups meet your needs for frequency?
Can you exclude/include specific database tables (a good way to save space in your backup)?
WordPress Backup Plugins Review
Let’s review a few of the top choices for WordPress backup plugins.
UpdraftPlus
UpdraftPlus is one of the most popular backup plugins for WordPress with over one million active installations. It is available in both free and Premium versions.
UpdraftPlus just released support for Backblaze B2 Cloud Storage in their 1.13.9 update on September 25. According to the developer, support for Backblaze B2 was the most frequent request for a new storage option for their plugin. B2 support is available in their Premium plugin and as a stand-alone update to their standard product.
Note: The developers of UpdraftPlus are offering a special 20% discount to Backblaze customers on the purchase of UpdraftPlus Premium by using the coupon code backblaze20. The discount is valid until the end of Friday, October 6th, 2017.
XCloner
XCloner supports B2 Cloud Storage in their free plugin.
BlogVault
BlogVault describes themselves as a “complete WordPress backup solution.” They offer a free trial of their paid WordPress backup subscription service that features real-time backups of changes to your WordPress site, as well as many other features.
BlogVault has announced their intent to support Backblaze B2 Cloud Storage in a future update.
BackWPup
BackWPup is a popular and free option for backing up WordPress. It supports a number of options for storing your backup, including the cloud, FTP, email, or on your local computer.
WPBackItUp
WPBackItUp has been around since 2012 and is highly rated. It has both free and paid versions.
VaultPress
VaultPress is part of Automattic’s well-known WordPress product, JetPack. You will need a JetPack subscription plan to use VaultPress. There are different pricing plans with different sets of features.
Backup by Supsystic
Backup by Supsystic supports a number of options for backup destinations, encryption, and scheduling.
BackUpWordPress
BackUpWordPress is an open-source project on GitHub that has a popular and active following and many positive reviews.
BackupBuddy
BackupBuddy, from iThemes, is the old-timer of backup plugins, having been around since 2010. iThemes knows a lot about WordPress, as they develop plugins, themes, utilities, and provide training in WordPress.
BackupBuddy’s backup includes all WordPress files, all files in the WordPress Media library, WordPress themes, and plugins. BackupBuddy generates a downloadable zip file of the entire WordPress website. Remote storage destinations also are supported.
WordPress and the Cloud
Do you use WordPress and back up to the cloud? We’d like to hear about it. We’d also like to hear whether you are interested in using B2 Cloud Storage for storing media files served by WordPress. If you are, we’ll write about it in a future post.
In the meantime, keep your eye out for new plugins supporting Backblaze B2, or better yet, urge them to support B2 if they’re not already.
The Best Backup Strategy is the One You Use
There are other approaches and tools for backing up WordPress that you might use. If you have an approach that works for you, we’d love to hear about it in the comments.
Did you realise the Sense HAT has been available for over two years now? Used by astronauts on the International Space Station, the exact same hardware is available to you on Earth. With a new Astro Pi challenge just launched, it’s time for a retrospective/roundup/inspiration post about this marvellous bit of kit.
The Sense HAT on a Pi in full glory
The Sense HAT explained
We developed our scientific add-on board to be part of the Astro Pi computers we sent to the International Space Station with ESA astronaut Tim Peake. For a play-by-play of Astro Pi’s history, head to the blog archive.
Just to remind you, this is all the cool stuff our engineers have managed to fit onto the HAT: an 8×8 RGB LED matrix, a five-button joystick, and sensors for temperature, humidity, and barometric pressure, plus an accelerometer, a gyroscope, and a magnetometer.
Use the LED matrix and joystick to recreate games such as Pong or Flappy Bird. Of course, you could also add sensor input to your game: code an egg drop game or a Magic 8 Ball that reacts to how the device moves.
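As a hedged illustration (not code from any of the games mentioned above), this Python sketch shows the two ingredients they share: joystick events for input and the LED matrix for output. It moves a single lit pixel around the display:

```python
from sense_hat import SenseHat

sense = SenseHat()
x, y = 4, 4
sense.set_pixel(x, y, 255, 255, 255)

while True:
    # Poll the joystick; each event carries a direction and an action.
    for event in sense.stick.get_events():
        if event.action != "pressed":
            continue
        sense.set_pixel(x, y, 0, 0, 0)  # clear the old position
        if event.direction == "up":
            y = max(0, y - 1)
        elif event.direction == "down":
            y = min(7, y + 1)
        elif event.direction == "left":
            x = max(0, x - 1)
        elif event.direction == "right":
            x = min(7, x + 1)
        sense.set_pixel(x, y, 255, 255, 255)
```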
If you like the great outdoors, you could also use your Sense HAT to recreate this Hiking Companion by Marcus Johnson. Take it with you on your next hike!
It’s also possible to incorporate Sense HAT data into your digital art! The Python Turtle module and the Processing language are both useful tools for creating beautiful animations based on real-world information.
A Sense HAT project that also uses this principle is Giorgio Sancristoforo’s Tableau, a ‘generative music album’. This device creates music according to the sensor data:
“There is no doubt that, as music is removed by the phonograph record from the realm of live production and from the imperative of artistic activity and becomes petrified, it absorbs into itself, in this process of petrification, the very life that would otherwise vanish.”
Our online resource shows you how to record the information your HAT picks up. Next you can analyse and graph your data using Mathematica, which is included for free on Raspbian. This resource walks you through how this software works.
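As a hedged example of the recording half, this Python sketch logs a minute of readings to a CSV file that you can then graph in Mathematica or any other tool; the filename, fields, and one-second interval are arbitrary choices:

```python
import csv
import time
from sense_hat import SenseHat

sense = SenseHat()
with open("sensehat_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "temperature_C", "humidity_pct", "pressure_mbar"])
    for _ in range(60):  # one reading per second for a minute
        writer.writerow([time.time(),
                         sense.get_temperature(),
                         sense.get_humidity(),
                         sense.get_pressure()])
        time.sleep(1)
```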
If you’re seeking inspiration for experiments you can do on our Astro Pis Izzy and Ed on the ISS, check out the winning entries of previous rounds of the Astro Pi challenge.
Thomas Pesquet with Ed and Izzy
But you can also stick to terrestrial scientific investigations. For example, why not build a weather station and share its data on your own web server or via Weather Underground?
Your code in space!
If you’re a student or an educator in one of the 22 ESA member states, you can get a team together to enter our 2017-18 Astro Pi challenge. There are two missions to choose from, including Mission Zero: follow a few guidelines, and your code is guaranteed to run in space!
We have no choice – we must help her! If things are as bad as she says they are, our only hope of survival is to work together.
We know you have the skills and imagination required to make something. We’ve seen that in previous Pioneers challenges. That’s why we’re coming directly to you with this: we know you won’t let her down.
What you need to do
We’ve watched back through the recording and pulled out as much information as we can:
To save us, you have ten weeks to create something using tech. This means you need to be done by 1 December, or it will be too late!
The build you will create needs to help her in the treacherous situation she’s in. What you decide to make is completely up to you.
Her call is for those of you aged between 11 and 16 who are based in the UK or Republic of Ireland. You need to work in groups of up to five, and you need to find someone aged 18 or over to act as a mentor and support your project.
Any tech will do. We work for the Raspberry Pi Foundation, but this doesn’t mean you need to use a Raspberry Pi. Use anything at all — from microcontrollers to repurposed devices such as laptops and cameras.
To keep in contact with you, it looks like she’s created a form for you to fill in and share your team name and details with her. In return she will trade some items with you — things that will help inspire you in your mission. We’ve managed to find the link to the form: you can fill it in here.
In order to help her (and any others who might still be out there!) to recreate your project, you need to make sure you record your working process. Take photos and footage to document how you build your make, and put together a video to send to her when you’re done making.
If you manage to access social media, you could also share your progress as you go along! Make sure to use #MakeYourIdeas, so that other survivors can see your work.
We’ve assembled some more information on the Pioneers website to create a port of call for you. Check it out, and let us know if you have any questions. We will do whatever we can to help you protect the world.
There’s a little company called Apple making big announcements this week, but about 45% of you are on Windows machines, so we thought it would be a good idea to devote a blog post today to Windows users and the options they have for backing up Windows computers.
We’ll be talking about the various options for backing up the Windows desktop OSs 7, 8, and 10, as well as Windows servers. We’ve written previously about this topic in How to Back Up Windows, and Computer Backup Options, but we’ll be covering some new topics and ways to combine strategies in this post. So, if you’re a Windows user looking for shelter from all the Apple hoopla, welcome to our Windows Backup Day post.
First, Let’s Talk About What We Mean by Backup
This might seem to our readers like an unneeded appetizer on the way to the main course of our post, but we at Backblaze know that people often mean very different things when they use backup and related terms. Let’s start by defining what we mean when we say backup, cloud storage, sync, and archive.
Backup
A backup is an active copy of the system or files that you are using. It is distinguished from an archive, which is the storing of data that is no longer in active use. Backups fall into two main categories: file and image. File backup software will back up whichever files you designate by either letting you include files you wish backed up or by excluding files you don’t want backed up, or both. An image backup, sometimes called a disaster recovery backup or a system clone, is useful if you need to recreate your system on a new drive or computer.
The first backup generally will be a full backup of all files. After that, the backup will be incremental, meaning that only files that have been changed since the full backup will be added. Often, the software will keep changed versions of the files for some period of time, so you can maintain a number of previous revisions of your files in case you wish to return to something in an earlier version of your file.
The destination for your backup could be another drive on your computer, an attached drive, network-attached storage (NAS), or the cloud.
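To make “incremental” concrete, here is a hedged Python sketch that copies only files that are new or have changed since the last run; both paths are hypothetical stand-ins for your data and your attached drive:

```python
import shutil
from pathlib import Path

SRC = Path("C:/Users/me/Documents")   # hypothetical source folder
DST = Path("D:/Backup/Documents")     # hypothetical attached drive

for src_file in SRC.rglob("*"):
    if not src_file.is_file():
        continue
    dst_file = DST / src_file.relative_to(SRC)
    # Copy when the destination copy is missing or older than the source.
    if not dst_file.exists() or dst_file.stat().st_mtime < src_file.stat().st_mtime:
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dst_file)  # copy2 preserves timestamps
```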
Cloud Storage
Cloud storage vendors supply data storage just as a utility company supplies power, gas, or water. Cloud storage can be used for data backups, but it can also be used for data archives, application data, records, or libraries of photos, videos, and other media.
You contract with the service for storing any type of data, and the storage location is available to you via the internet. Cloud storage providers generally charge by some combination of data ingress, egress, and the amount of data stored.
Sync
File sync is useful for files that you wish to have access to from different places or computers, or for files that you wish to share with others. While sync has its uses, it has limitations for keeping files safe. As opposed to backup, which keeps revisions of files, sync is designed to keep two or more locations exactly the same. Sync costs are based on how much data you sync and can get expensive for large amounts of data.
Archive
A data archive is for data that is no longer in active use but needs to be saved, and may or may not ever be retrieved again. In old-style storage parlance, it is called cold storage. An archive could be stored with a cloud storage provider, or put on a hard drive or flash drive that you disconnect and put in the closet, or mail to your brother in Idaho.
What’s the Best Strategy for Backing Up?
Now that we’ve got our terminology clear, let’s talk backup strategies for Windows.
At Backblaze, we advocate the 3-2-1 strategy for safeguarding your data, which means that you should maintain three copies of any valuable data — two copies stored locally and one stored remotely. I follow this strategy at home by working on the active data on my Windows 10 desktop computer (copy one), which is backed up to a Drobo RAID device attached via USB (copy two), and backing up the desktop to Backblaze’s Personal Backup in the cloud (copy three). I also keep an image of my primary disk on a separate drive and frequently update it using Windows 10’s image tool.
I use Dropbox for sharing specific files I am working on that I might wish to have access to when I am traveling or on another computer. Once my subscription with Dropbox expires, I’ll use the latest release of Backblaze that has individual file preview with sharing built-in.
Before you decide which backup strategy will work best for your situation, you’ll need to ask yourself a number of questions. These questions include where you wish to store your backups, whether you wish to supply your own storage media, whether the backups will be manual or automatic, and whether limited or unlimited data storage will work best for you.
Strategy 1 — Back Up to a Local or Attached Drive
The first copy of the data you are working on is often on your desktop or laptop. You can create a second copy of your data on another drive or directory on your computer, or copy the data to a drive directly attached to your computer, such as via USB.
Windows has built-in tools for both file and image level backup. Depending on which version of Windows you use, these tools are called Backup and Restore, File History, or Image. These tools enable you to set a schedule for automatic backups, which ensures that it is done regularly. You also have the choice to use Windows Explorer (aka File Explorer) to manually copy files to another location. Some external disk drives and USB Flash Drives come with their own backup software, and other backup utilities are available for free or for purchase.
This is a supply-your-own media solution, meaning that you need to have a hard disk or other medium available of sufficient size to hold all your backup data. When a disk becomes full, you’ll need to add a disk or swap out the full disk to continue your backups.
Strategy 2 — Back Up to a Local Area Network (LAN)
Computers, servers, and network-attached storage (NAS) on your local network all can be used for backing up data. Microsoft’s built-in backup tools can be used for this job, as can any utility that supports network protocols such as NFS or SMB/CIFS, which are common protocols that allow shared access to files on a network for Windows and other operating systems. There are many third-party applications available as well that provide extensive options for managing and scheduling backups and restoring data when needed.
Multiple computers can be backed up to a single network-shared computer, server, or NAS, which also could then be backed up to the cloud, which rounds out a nice backup strategy, because it covers both local and remote copies of your data. System images of multiple computers on the LAN can be included in these backups if desired.
Again, you are managing the backup media on the local network, so you’ll need to be sure you have sufficient room on the destination drives to store all your backup data.
Strategy 3 — Back Up to Detached Drive at Another Location
You may have read our recent blog post, Getting Data Archives Out of Your Closet, in which we discuss the practice of filling hard drives and storing them in a closet. Of course, to satisfy the off-site backup guideline, these drives would need to be stored in a closet that’s in a different geographical location than your main computer. If you’re willing to do all the work of copying the data to drives and transporting them to another location, this is a viable option.
The only limitation to the amount of backup data is the number of hard drives you are willing to purchase — and maybe the size of your closet.
Strategy 4 — Back Up to the Cloud
Backing up to the cloud has become a popular option for a number of reasons. Internet speeds have made moving large amounts of data possible, and not having to worry about supplying the storage media simplifies choices for users. Additionally, cloud vendors implement features such as data protection, deduplication, and encryption as part of their services that make cloud storage reliable, secure, and efficient. Unlimited cloud storage for data from a single computer is a popular option.
A backup vendor likely will provide a software client that runs on your computer and backs up your data to the cloud in the background while you’re doing other things, such as Backblaze Personal Backup, which has clients for Windows computers, Macintosh computers, and mobile apps for both iOS and Android. For restores, Backblaze users can download one or all of their files for free from anywhere in the world. Optionally, a 128 GB flash drive or 4 TB drive can be overnighted to the customer, with a refund available if the drive is returned.
Backblaze B2 Cloud Storage is an option for those who need capabilities beyond Backblaze’s Personal Backup. B2 provides cloud storage that is priced based on the amount of data the customer uses, and is suitable for long-term data storage. B2 supports integrations with NAS devices, as well as Windows, Macintosh, and Linux computers and servers.
Services such as Backblaze B2 are often called Cloud Object Storage or IaaS (Infrastructure as a Service), because they provide a complete solution for storing all types of data in partnership with vendors who integrate various solutions for working with B2. B2 has its own API (Application Programming Interface) and CLI (Command-Line Interface), but it becomes even more powerful when paired with any one of a number of other solutions for data storage and management provided by third parties who offer both hardware and software solutions.
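As a hedged illustration of scripting against B2, this Python sketch uploads a backup archive with the b2sdk library (pip install b2sdk); the key ID, application key, bucket name, and file names are all placeholders:

```python
from b2sdk.v2 import B2Api, InMemoryAccountInfo

b2_api = B2Api(InMemoryAccountInfo())
b2_api.authorize_account("production", "YOUR_KEY_ID", "YOUR_APP_KEY")

# Upload a local archive into a hypothetical bucket, under a backups/ prefix.
bucket = b2_api.get_bucket_by_name("my-backups")
bucket.upload_local_file(
    local_file="wordpress-20171006.sql.gz",
    file_name="backups/wordpress-20171006.sql.gz",
)
```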
Backing Up Windows Servers
Windows Servers are popular workstations for some users, and provide needed network services for others. They also can be used to store backups from other computers on the network. They, in turn, can be backed up to attached drives or the cloud. While our Personal Backup client doesn’t support Windows servers, our B2 Cloud Storage has a number of integrations with vendors who supply software or hardware for storing data both locally and on B2. We’ve written a number of blog posts and articles that address these solutions, including How to Back Up your Windows Server with B2 and CloudBerry.
Sometimes the Best Strategy is to Mix and Match
The great thing about computers, software, and networks is that there is an endless number of ways to combine them. Our users and hardware and software partners are ingenious in configuring solutions that save data locally, copy it to an attached or network drive, and then store it to the cloud.
Among our B2 partners, Synology, CloudBerry, Archiware, QNAP, Morro Data, and GoodSync have integrations that allow their NAS devices to store and retrieve data to and from B2 Cloud Storage. For a drag-and-drop experience on the desktop, take a look at CyberDuck, MountainDuck, and Dropshare, which provide users with an easy and interactive way to store and use data in B2.
If you’d like to explore more options for combining software, hardware, and cloud solutions, we invite you to browse the integrations for our many B2 partners.
Have Questions?
Windows versions, tools, and backup terminology all can be confusing, and we know how hard it can be to make sense of all of it. If there’s something we haven’t addressed here, or if you have a question or contribution, please let us know in the comments.
And happy Windows Backup Day! (Just don’t tell Apple.)