We know crawl spaces are creepy, sweaty, and confining, but hear us out…
You need to keep an eye on the humidity level in your crawl space, as it can seriously affect the whole house’s overall health. It’s ideal to be able to do this remotely (given the creepy, sweaty atmosphere of the space), and a Raspberry Pi allows this.
Jamie Bailey took to Medium to share his Raspberry Pi setup that allows him to monitor the humidity of the crawl space in his home from a mobile device and/or laptop. His setup lets you check on the current humidity level and also see the historical data over time. You can also set alarms to be sent to you via text or email whenever the humidity level exceeds a certain threshold.
The hardware you need
Power outlet or extension cord in your crawl space
Raspberry Pi (3 or 4) or Raspberry Pi Zero W (or WH)
BME280 temperature, humidity, and pressure sensor
The BME280 sensor has four pins you need to connect to your Raspberry Pi. This will send the humidity data to your Raspberry Pi, which you’ll have already set up to let you know what’s happening remotely.
BME280 VIN pin connects to GPIO pin 1 (3.3V)
BME280 GND pin connects to GPIO pin 6 (GND)
BME280 SCL pin connects to GPIO pin 5 (SCL)
BME280 SDA pin connects to GPIO pin 3 (SDA)
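With the sensor wired up as above, a few lines of Python are enough to check that readings are coming through. This is a minimal sketch using the smbus2 and RPi.bme280 libraries, not Jamie's exact code; the I2C address 0x76 is an assumption (some breakouts use 0x77):

import smbus2
import bme280

port = 1          # I2C bus number on recent Raspberry Pi models
address = 0x76    # change to 0x77 if your breakout uses the alternative address
bus = smbus2.SMBus(port)

# The BME280 needs its factory calibration data to compensate raw readings
calibration = bme280.load_calibration_params(bus, address)

sample = bme280.sample(bus, address, calibration)
print('Humidity: {:.1f}%  Temperature: {:.1f}C'.format(sample.humidity, sample.temperature))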
You can see the Raspberry Pi in a black case hanging in the centre against a floor joist.
Once you have all your software sorted and your hardware connected, turn your Raspberry Pi off and take it down to your crawl space (monitor, keyboard, and mouse are no longer necessary). Jamie advises hanging your Raspberry Pi from the floor joists instead of letting it touch the ground, to avoid contact with any water. He put a nail in one of the floor joists and draped the power cord over the nail (see above). Turn your tiny computer on, make sure data starts flowing into your dashboard, and you’ve got yourself a remote humidity sensor!
PS We’re English so… is a crawl space the same as an attic or what? Asking for a friend!
The Internet of Things (IoT) has precipitated an influx of connected devices and data that can be mined to gain useful business insights. If you own an IoT device, you might want the data to be uploaded seamlessly from your connected devices to the cloud so that you can make use of cloud storage and processing power to perform sophisticated analysis of the data. To upload the data to the AWS Cloud, devices must pass authentication and authorization checks performed by the respective AWS services. The standard way of authenticating AWS requests is the Signature Version 4 algorithm, which requires the caller to have an access key ID and secret access key. Consequently, you need to hardcode the access key ID and the secret access key on your devices. Alternatively, you can use the built-in X.509 certificate as the unique device identity to authenticate AWS requests.
AWS IoT has introduced the credentials provider feature that allows a caller to authenticate AWS requests by having an X.509 certificate. The credentials provider authenticates a caller using an X.509 certificate, and vends a temporary, limited-privilege security token. The token can be used to sign and authenticate any AWS request. Thus, the credentials provider relieves you from having to manage and periodically refresh the access key ID and secret access key remotely on your devices.
In the process of retrieving a security token, you use AWS IoT to create a thing (a representation of a specific device or logical entity), register a certificate, and create AWS IoT policies. You also configure an AWS Identity and Access Management (IAM) role and attach appropriate IAM policies to the role so that the credentials provider can assume the role on your behalf. You also make an HTTP-over-Transport Layer Security (TLS) mutual authentication request to the credentials provider that uses your preconfigured thing, certificate, policies, and IAM role to authenticate and authorize the request, and obtain a security token on your behalf. You can then use the token to sign any AWS request using Signature Version 4.
In this blog post, I explain the AWS IoT credentials provider design and then demonstrate the end-to-end process of retrieving a security token from AWS IoT and using the token to write a temperature and humidity record to a specific Amazon DynamoDB table.
Note: This post assumes you are familiar with AWS IoT and IAM to perform steps using the AWS CLI and OpenSSL. Make sure you are running the latest version of the AWS CLI.
Overview of the credentials provider workflow
The following numbered diagram illustrates the credentials provider workflow. The diagram is followed by explanations of the steps.
To explain the steps of the workflow as illustrated in the preceding diagram:
The AWS IoT device uses the AWS SDK or custom client to make an HTTPS request to the credentials provider for a security token. The request includes the device X.509 certificate for authentication.
The credentials provider forwards the request to the AWS IoT authentication and authorization module to verify the certificate and the permission to request the security token.
If the certificate is valid and has permission to request a security token, the AWS IoT authentication and authorization module returns success. Otherwise, it returns failure, which goes back to the device with the appropriate exception.
If the check succeeds, the credentials provider invokes AWS STS to assume the IAM role associated with the requested role alias. If assuming the role succeeds, AWS STS returns a temporary, limited-privilege security token to the credentials provider.
The credentials provider returns the security token to the device.
The AWS SDK on the device uses the security token to sign an AWS request with AWS Signature Version 4.
The requested service invokes IAM to validate the signature and authorize the request against access policies attached to the preconfigured IAM role.
If IAM validates the signature successfully and authorizes the request, the request goes through.
In another solution, you could configure an AWS IoT rule that invokes an AWS Lambda function to ingest your device data and send it to another AWS service. However, in applications that require the uploading of large files such as videos or aggregated telemetry to the AWS Cloud, you may want your devices to be able to authenticate and send data directly to the AWS service of your choice. The credentials provider enables you to do that.
Outline of the steps to retrieve and use security token
Perform the following steps as part of this solution:
Create an AWS IoT thing: Start by creating a thing that corresponds to your home thermostat in the AWS IoT thing registry database. This allows you to authenticate the request as a thing and use thing attributes as policy variables in AWS IoT and IAM policies.
Register a certificate: Create and register a certificate with AWS IoT, and attach it to the thing for successful device authentication.
Create and configure an IAM role: Create an IAM role to be assumed by the credentials provider on behalf of your device. I illustrate how to configure a trust policy and an access policy so that AWS IoT has permission to assume the role, and the token has the necessary permission to make requests to DynamoDB.
Create a role alias: Create a role alias in AWS IoT. A role alias is an alternate data model pointing to an IAM role. The credentials provider request must include a role alias name to indicate which IAM role to assume for obtaining a security token from AWS STS. You may update the role alias on the server to point to a different IAM role and thus make your device obtain a security token with different permissions.
Attach a policy: Create an authorization policy with AWS IoT and attach it to the certificate to control which device can assume which role aliases.
Request a security token: Make an HTTPS request to the credentials provider to retrieve a security token, which you then use to sign a DynamoDB request with Signature Version 4.
Use the security token to sign a request: Use the retrieved token to sign a request to DynamoDB and successfully write a temperature and humidity record from your home thermostat in a specific table. Thus, starting with an X.509 certificate on your home thermostat, you can successfully upload your thermostat record to DynamoDB and use it for further analysis. Before the availability of the credentials provider, you could not do this.
Deploy the solution
1. Create an AWS IoT thing
Register your home thermostat in the AWS IoT thing registry database by creating a thing type and a thing. You can do this with the AWS CLI create-thing-type and create-thing commands. The thing type allows you to store description and configuration information that is common to a set of things.
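If you would rather script these calls, a minimal boto3 sketch might look like the following; the thing type name and the attributes are illustrative assumptions, not values the post prescribes:

import boto3

iot = boto3.client('iot')

# Thing type and attribute names here are assumptions; adjust them to your setup
iot.create_thing_type(thingTypeName='thermostat')
iot.create_thing(
    thingName='MyHomeThermostat',
    thingTypeName='thermostat',
    attributePayload={'attributes': {'Owner': 'Alice'}},
)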
2. Register a certificate
Now, you need to have a Certificate Authority (CA) certificate, sign a device certificate using the CA certificate, and register both certificates with AWS IoT before your device can authenticate to AWS IoT. If you do not already have a CA certificate, you can use OpenSSL to create a CA certificate, as described in Use Your Own Certificate. To register your CA certificate with AWS IoT, follow the steps on Registering Your CA Certificate.
You then have to create a device certificate signed by the CA certificate and register it with AWS IoT, which you can do by following the steps on Creating a Device Certificate Using Your CA Certificate. Save the certificate and the corresponding key pair; you will use them when you request a security token later. Also, remember the password you provide when you create the certificate.
Run the following command in the AWS CLI to attach the device certificate to your thing so that you can use thing attributes in policy variables.
aws iot attach-thing-principal --thing-name MyHomeThermostat --principal <certificate-arn>
If the attach-thing-principal command succeeds, the output is empty.
3. Configure an IAM role
Next, configure an IAM role in your AWS account that will be assumed by the credentials provider on behalf of your device. You are required to associate two policies with the role: a trust policy that controls who can assume the role, and an access policy that controls which actions can be performed on which resources by assuming the role.
The following trust policy grants the credentials provider permission to assume the role. Put it in a text document and save the document with the name, trustpolicyforiot.json.
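A sketch of such a trust policy is shown below; verify it against your own security requirements before using it:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"Service": "credentials.iot.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }
  ]
}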
The following access policy allows DynamoDB operations on the table that has the same name as the thing name that you created in Step 1, MyHomeThermostat, by using credentials-iot:ThingName as a policy variable. I explain after Step 5 about using thing attributes as policy variables. Put the following policy in a text document and save the document with the name, accesspolicyfordynamodb.json.
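A sketch of such an access policy, with credentials-iot:ThingName as the policy variable, might look like this; the specific DynamoDB actions are assumptions to tailor to your use case:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:PutItem", "dynamodb:GetItem"],
      "Resource": "arn:aws:dynamodb:*:*:table/${credentials-iot:ThingName}"
    }
  ]
}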
Finally, run the following command in the AWS CLI to attach the access policy to your role.
aws iam attach-role-policy --role-name dynamodb-access-role --policy-arn arn:aws:iam::<your_aws_account_id>:policy/accesspolicyfordynamodb
If the attach-role-policy command succeeds, the output is empty.
Configure the PassRole permissions
The IAM role that you have created must be passed to AWS IoT to create a role alias, as described in Step 4. The user who performs the operation requires iam:PassRole permission to authorize this action. You also should add permission for the iam:GetRole action to allow the user to retrieve information about the specified role. Create the following policy to grant iam:PassRole and iam:GetRole permissions. Name this policy, passrolepermission.json.
Now, run the following command to attach the policy to the user.
aws iam attach-user-policy --policy-arn arn:aws:iam::<your_aws_account_id>:policy/passrolepermission --user-name <user_name>
If the attach-user-policy command succeeds, the output is empty.
4. Create a role alias
Now that you have configured the IAM role, you will create a role alias with AWS IoT. You must provide the following pieces of information when creating a role alias:
RoleAlias: This is the primary key of the role alias data model and hence a mandatory attribute. It is a string; the minimum length is 1 character, and the maximum length is 128 characters.
RoleArn: This is the Amazon Resource Name (ARN) of the IAM role you have created. This is also a mandatory attribute.
CredentialDurationSeconds: This is an optional attribute specifying the validity (in seconds) of the security token. The minimum value is 900 seconds (15 minutes), and the maximum value is 3,600 seconds (60 minutes); the default value is 3,600 seconds, if not specified.
You can create the role alias with the AWS CLI create-role-alias command. Use the credentials of the user to whom you have given the iam:PassRole permission.
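The equivalent call from Python with boto3 might look like this rough sketch; the role name and duration reuse the example values from earlier, so adjust them as needed:

import boto3

iot = boto3.client('iot')

iot.create_role_alias(
    roleAlias='Thermostat-dynamodb-access-role-alias',
    roleArn='arn:aws:iam::<your_aws_account_id>:role/dynamodb-access-role',
    credentialDurationSeconds=3600,   # token validity in seconds (900 to 3600)
)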
5. Attach a policy
You created and registered a certificate with AWS IoT earlier for successful authentication of your device. Now, you need to create and attach a policy to the certificate to authorize the request for the security token.
Let’s say you want to allow a thing to get credentials for the role alias, Thermostat-dynamodb-access-role-alias, with thing owner Alice, thing type thermostat, and the thing attached to a principal. The following policy, with thing attributes as policy variables, achieves these requirements. After this step, I explain more about using thing attributes as policy variables. Put the policy in a text document, and save it with the name, alicethermostatpolicy.json.
If the attach-policy command succeeds, the output is empty.
You have completed all the necessary steps to request an AWS security token from the credentials provider!
Using thing attributes as policy variables
Before I show how to request a security token, I want to explain more about how to use thing attributes as policy variables and the advantage of using them. As a prerequisite, a device must provide a thing name in the credentials provider request.
Thing substitution variables in AWS IoT policies
AWS IoT Simplified Permission Management allows you to associate a connection with a specific thing, and allow the thing name, thing type, and other thing attributes to be available as substitution variables in AWS IoT policies. You can write a generic AWS IoT policy as in alicethermostatpolicy.json in Step 5, attach it to multiple certificates, and authorize the connection as a thing. For example, you could attach alicethermostatpolicy.json to certificates corresponding to each of the thermostats you have that you want to assume the role alias, Thermostat-dynamodb-access-role-alias, and allow operations only on the table with the name that matches the thing name. For more information, see the full list of thing policy variables.
Thing substitution variables in IAM policies
You also can use the following three substitution variables in the IAM role’s access policy (I used credentials-iot:ThingName in accesspolicyfordynamodb.json in Step 3):
credentials-iot:ThingName
credentials-iot:ThingTypeName
credentials-iot:AwsCertificateId
When the device provides the thing name in the request, the credentials provider fetches these three variables from the database and adds them as context variables to the security token. When the device uses the token to access DynamoDB, the variables in the role’s access policy are replaced with the corresponding values in the security token. Note that you also can use credentials-iot:AwsCertificateId as a policy variable; AWS IoT returns certificateId during registration.
6. Request a security token
Make an HTTPS request to the credentials provider to fetch a security token. You have to supply the following information:
Certificate and key pair: Because this is an HTTP request over TLS mutual authentication, you have to provide the certificate and the corresponding key pair to your client while making the request. Use the same certificate and key pair that you used during certificate registration with AWS IoT.
RoleAlias: Provide the role alias (in this example, Thermostat-dynamodb-access-role-alias) to be assumed in the request.
ThingName: Provide the thing name that you created earlier in the AWS IoT thing registry database. This is passed as a header with the name, x-amzn-iot-thingname. Note that the thing name is mandatory only if you have thing attributes as policy variables in AWS IoT or IAM policies.
You can obtain your AWS account-specific endpoint for the credentials provider with the AWS CLI describe-endpoint command, specifying the endpoint type iot:CredentialProvider. See the DescribeEndpoint API documentation for further details.
Note that if you are on Mac OS X, you need to export your certificate to a .pfx or .p12 file before you can pass it in the HTTPS request. Use OpenSSL with the following command to convert the device certificate from .pem to .pfx format. Remember the password because you will need it subsequently in a curl command.
Now, make an HTTPS request to the credentials provider to fetch a security token. You may use your preferred HTTP client for the request. I use curl in the following examples.
This command returns a security token object that has an accessKeyId, a secretAccessKey, a sessionToken, and an expiration. The following is sample output of the curl command.
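If you prefer to script the exchange rather than use curl, a minimal Python sketch of the same request might look like this; the role alias, thing name, and certificate file paths are assumptions, and the response layout may differ slightly in your account:

import boto3
import requests

# Look up the account-specific credentials provider endpoint
iot = boto3.client('iot')
endpoint = iot.describe_endpoint(endpointType='iot:CredentialProvider')['endpointAddress']

url = 'https://{}/role-aliases/{}/credentials'.format(endpoint, 'Thermostat-dynamodb-access-role-alias')
response = requests.get(
    url,
    cert=('deviceCert.pem', 'deviceCert.key'),            # device certificate and private key
    headers={'x-amzn-iot-thingname': 'MyHomeThermostat'},  # needed when thing attributes are used as policy variables
    # verify='AmazonRootCA1.pem',                          # point at the CA bundle if it is not in your trust store
)
response.raise_for_status()
creds = response.json()['credentials']                     # accessKeyId, secretAccessKey, sessionToken, expiration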
7. Use the security token to sign a request
Create a DynamoDB table called MyHomeThermostat in your AWS account. You will have to choose the hash (partition key) and the range (sort key) while creating the table to uniquely identify a record. Make the hash the serial_number of the thermostat and the range the timestamp of the record. Create a text file with the following JSON to put a temperature and humidity record in the table. Name the file, item.json.
You can use the accessKeyId, secretAccessKey, and sessionToken retrieved from the output of the curl command to sign a request that writes the temperature and humidity record to the DynamoDB table, for example with the AWS CLI or one of the AWS SDKs.
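As a rough illustration, here is how the write could be signed from Python with boto3 using the temporary credentials; the item attribute names and values are placeholders, not the post's exact item.json:

import boto3

# creds is the credentials object returned by the credentials provider in the previous step
creds = {
    'accessKeyId': '<accessKeyId>',
    'secretAccessKey': '<secretAccessKey>',
    'sessionToken': '<sessionToken>',
}

session = boto3.Session(
    aws_access_key_id=creds['accessKeyId'],
    aws_secret_access_key=creds['secretAccessKey'],
    aws_session_token=creds['sessionToken'],
)
dynamodb = session.client('dynamodb')

# MyHomeThermostat uses serial_number as the hash key and timestamp as the range key
dynamodb.put_item(
    TableName='MyHomeThermostat',
    Item={
        'serial_number': {'S': 'T-001'},
        'timestamp': {'S': '2018-02-15T10:00:00Z'},
        'temperature': {'N': '22'},
        'humidity': {'N': '45'},
    },
)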
In this blog post, I demonstrated how to retrieve a security token by using an X.509 certificate and then writing an item to a DynamoDB table by using the security token. Similarly, you could run applications on surveillance cameras or sensor devices that exchange the X.509 certificate for an AWS security token and use the token to upload video streams to Amazon Kinesis or telemetry data to Amazon CloudWatch.
If you have comments about this blog post, submit them in the “Comments” section below. If you have questions about or issues implementing this solution, start a new thread on the AWS IoT forum.
Right now, 400km above the Earth aboard the International Space Station, are two very special Raspberry Pi computers. They were launched into space on 6 December 2015 and are, most assuredly, the farthest-travelled Raspberry Pi computers in existence. Each year they run experiments that school students create in the European Astro Pi Challenge.
Left: Astro Pi Vis (Ed); right: Astro Pi IR (Izzy). Image credit: ESA.
The European Columbus module
Today marks the tenth anniversary of the launch of the European Columbus module. The Columbus module is the European Space Agency’s largest single contribution to the ISS, and it supports research in many scientific disciplines, from astrobiology and solar science to metallurgy and psychology. More than 225 experiments have been carried out inside it during the past decade. It’s also home to our Astro Pi computers.
Here’s a video from 7 February 2008, when Space Shuttle Atlantis went skywards carrying the Columbus module in its cargo bay.
Video: NASA TV coverage of the STS-122 launch, the 121st Space Shuttle launch, on 7 February 2008 at 2:45 p.m. ET.
Today, coincidentally, is also the deadline for the European Astro Pi Challenge: Mission Space Lab. Participating teams have until midnight tonight to submit their experiments.
Anniversary celebrations
At 16:30 GMT today there will be a live event on NASA TV for the Columbus module anniversary with NASA flight engineers Joe Acaba and Mark Vande Hei.
Our Astro Pi computers will be joining in the celebrations by displaying a digital birthday candle that the crew can blow out. It works by detecting an increase in humidity when someone blows on it. The video below demonstrates the concept.
The exact Astro Pi code that will run on the ISS today is available for you to download and run on your own Raspberry Pi and Sense HAT. You’ll notice that the program includes code to make it stop automatically when the date changes to 8 February. This is just to save time for the ground control team.
If you have a Raspberry Pi and a Sense HAT, you can use the terminal commands below to download and run the code yourself:
When you see a blank blue screen with the brightness increasing, the Sense HAT is measuring the baseline humidity. It does this every 15 minutes so it can recalibrate to take account of natural changes in background humidity. A humidity increase of 2% is needed to blow out the candle, so if the background humidity changes by more than 2% in 15 minutes, it’s possible to get a false positive. Press Ctrl + C to quit.
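If you are curious how the detection works, the core idea fits in a few lines of Python with the sense_hat library. This is a simplified sketch of the mechanism described above (baseline recalibration every 15 minutes, a 2% rise counts as a blow), not the actual flight code:

from sense_hat import SenseHat
import time

sense = SenseHat()

BLOW_THRESHOLD = 2.0            # % rise in relative humidity that blows out the candle
RECALIBRATE_SECONDS = 15 * 60   # re-measure the baseline every 15 minutes

baseline = sense.get_humidity()
last_calibration = time.time()

while True:
    if time.time() - last_calibration > RECALIBRATE_SECONDS:
        baseline = sense.get_humidity()   # track natural changes in background humidity
        last_calibration = time.time()
    if sense.get_humidity() - baseline >= BLOW_THRESHOLD:
        sense.clear()                     # candle blown out: blank the display
        break
    time.sleep(0.1)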
Please tweet pictures of your candles to @astro_pi – we might share yours! And if we’re lucky, we might catch a glimpse of the candle on the ISS during the NASA TV event at 16:30 GMT today.
Scale takes on a whole new meaning when it comes to IoT. Last year I was lucky enough to tour a gigantic factory that had, on average, one environment sensor per square meter. The sensors measured temperature, humidity, and air purity several times per second, and served as an early warning system for contaminants. I’ve heard customers express interest in deploying IoT-enabled consumer devices in the millions or tens of millions.
With powerful, long-lived devices deployed in a geographically distributed fashion, managing security challenges is crucial. However, the limited amount of local compute power and memory can sometimes limit the ability to use encryption and other forms of data protection.
To address these challenges and to allow our customers to confidently deploy IoT devices at scale, we are working on IoT Device Defender. While the details might change before release, AWS IoT Device Defender is designed to offer these benefits:
Continuous Auditing – AWS IoT Device Defender monitors the policies related to your devices to ensure that the desired security settings are in place. It looks for drifts away from best practices and supports custom audit rules so that you can check for conditions that are specific to your deployment. For example, you could check to see if a compromised device has subscribed to sensor data from another device. You can run audits on a schedule or on an as-needed basis.
Real-Time Detection and Alerting – AWS IoT Device Defender looks for and quickly alerts you to unusual behavior that could be coming from a compromised device. It does this by monitoring the behavior of similar devices over time, looking for unauthorized access attempts, changes in connection patterns, and changes in traffic patterns (either inbound or outbound).
Fast Investigation and Mitigation – In the event that you get an alert that something unusual is happening, AWS IoT Device Defender gives you the tools, including contextual information, to help you to investigate and mitigate the problem. Device information, device statistics, diagnostic logs, and previous alerts are all at your fingertips. You have the option to reboot the device, revoke its permissions, reset it to factory defaults, or push a security fix.
Stay Tuned
I’ll have more info (and a hands-on post) as soon as possible, so stay tuned!
Since we launched the Oracle Weather Station project, we’ve collected more than six million records from our network of stations at schools and colleges around the world. Each one of these records contains data from ten separate sensors — that’s over 60 million individual weather measurements!
Weather station measurements in Oracle database
Weather data collection
Having lots of data covering a long period of time is great for spotting trends, but to do so, you need some way of visualising your measurements. We’ve always had great resources like Graphing the weather to help anyone analyse their weather data.
And from now on it’s going to be even easier for our Oracle Weather Station owners to display and share their measurements. I’m pleased to announce a new partnership with our friends at Initial State: they are generously providing a white-label platform to which all Oracle Weather Station recipients can stream their data.
Using Initial State
Initial State makes it easy to create vibrant dashboards that show off local climate data. The service is perfect for having your Oracle Weather Station data on permanent display, for example in the school reception area or on the school’s website.
But that’s not all: the Initial State toolkit includes a whole range of easy-to-use analysis tools for extracting trends from your data. Distribution plots and statistics are just a few clicks away!
Looks like Auntie Beryl is right — it has been a damp old year! (Humidity value distribution May–Nov 2017)
The wind direction data from my Weather Station supports my excuse as to why I’ve not managed a high-altitude balloon launch this year: to use my launch site, I need winds coming from the east, and those have been in short supply.
Chart showing wind direction over time
Initial State credentials
Every Raspberry Pi Oracle Weather Station school will shortly be receiving the credentials needed to start streaming their data to Initial State. If you’re super keen though, please email [email protected] with a photo of your Oracle Weather Station, and I’ll let you jump the queue!
The Initial State folks are big fans of Raspberry Pi and have a ton of Pi-related projects on their website. They even included shout-outs to us in the music video they made to celebrate the publication of their 50th tutorial. Can you spot their weather station?
Your home-brew weather station
If you’ve built your own Raspberry Pi–powered weather station and would like to dabble with the Initial State dashboards, you’re in luck! The team at Initial State is offering 14-day trials for everyone. For more information on Initial State, and to sign up for the trial, check out their website.
When James Puderer moved to Lima, Peru, his roadside runs left a rather nasty taste in his mouth. Hit by the pollution from old diesel cars in the area, he decided to monitor the air quality in his new city using Raspberry Pis and the abundant taxis as his tech carriers.
How to assemble the enclosure for my Taxi Datalogger project: https://www.hackster.io/james-puderer/distributed-air-quality-monitoring-using-taxis-69647e
Sensing air quality in Lima
Luckily for James, almost all taxis in Lima are equipped with the standard hollow vinyl roof sign seen in the video above, which makes them ideal for hacking.
With the onboard tech, the device collects data on longitude, latitude, humidity, temperature, pressure, and airborne particle count, feeding it back to an Android Things datalogger. This data is then pushed to Google IoT Core, where it can be remotely accessed.
Next, the data is processed by Google Dataflow and turned into a BigQuery table. Users can then visualize the collected measurements. And while James uses Google Maps to analyse his data, there are many tools online that will allow you to organise and study your figures depending on what final result you’re hoping to achieve.
James hopped in a taxi and took his monitor on the road, collecting results throughout the journey
James has provided the complete build process, including all tech ingredients and code, on his Hackster.io project page, and urges makers to create their own air quality monitor for their local area. He also plans on building upon the existing design by adding a 12V power hookup for connecting to the taxi, functioning lights within the sign, and companion apps for drivers.
Sensing the world around you
We’ve seen a wide variety of Raspberry Pi projects using sensors to track the world around us, such as Kasia Molga’s Human Sensor costume series, which reacts to air pollution by lighting up, and Clodagh O’Mahony’s Social Interaction Dress, which she created to judge how conversation and physical human interaction can be scored and studied.
Kasia Molga’s Human Sensor — a collection of hi-tech costumes that react to air pollution within the wearer’s environment.
Many people also build their own Pi-powered weather stations, or use the Raspberry Pi Oracle Weather Station, to measure and record conditions in their towns and cities from the roofs of schools, offices, and homes.
Have you incorporated sensors into your Raspberry Pi projects? Share your builds in the comments below or via social media by tagging us.
Did you realise the Sense HAT has been available for over two years now? Used by astronauts on the International Space Station, the exact same hardware is available to you on Earth. With a new Astro Pi challenge just launched, it’s time for a retrospective/roundup/inspiration post about this marvellous bit of kit.
The Sense HAT on a Pi in full glory
The Sense HAT explained
We developed our scientific add-on board to be part of the Astro Pi computers we sent to the International Space Station with ESA astronaut Tim Peake. For a play-by-play of Astro Pi’s history, head to the blog archive.
Just to remind you, this is all the cool stuff our engineers have managed to fit onto the HAT:
An 8×8 RGB LED matrix
A five-button joystick
Gyroscope, accelerometer, and magnetometer
Temperature, barometric pressure, and humidity sensors
Use the LED matrix and joystick to recreate games such as Pong or Flappy Bird. Of course, you could also add sensor input to your game: code an egg drop game or a Magic 8 Ball that reacts to how the device moves.
If you like the great outdoors, you could also use your Sense HAT to recreate this Hiking Companion by Marcus Johnson. Take it with you on your next hike!
It’s also possible to incorporate Sense HAT data into your digital art! The Python Turtle module and the Processing language are both useful tools for creating beautiful animations based on real-world information.
A Sense HAT project that also uses this principle is Giorgio Sancristoforo’s Tableau, a ‘generative music album’. This device creates music according to the sensor data:
“There is no doubt that, as music is removed by the phonograph record from the realm of live production and from the imperative of artistic activity and becomes petrified, it absorbs into itself, in this process of petrification, the very life that would otherwise vanish.”
Our online resource shows you how to record the information your HAT picks up. Next you can analyse and graph your data using Mathematica, which is included for free on Raspbian. This resource walks you through how this software works.
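If you just want to start collecting readings right away, a small sketch along these lines will log Sense HAT data to a CSV file you can open in Mathematica or a spreadsheet; the file name and sampling interval are arbitrary choices:

import csv
import time
from sense_hat import SenseHat

sense = SenseHat()

with open('sensehat_log.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['timestamp', 'temperature', 'humidity', 'pressure'])
    for _ in range(60):   # one reading per second for a minute; extend as you like
        writer.writerow([
            time.strftime('%Y-%m-%d %H:%M:%S'),
            round(sense.get_temperature(), 2),
            round(sense.get_humidity(), 2),
            round(sense.get_pressure(), 2),
        ])
        time.sleep(1)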
If you’re seeking inspiration for experiments you can do on our Astro Pis Izzy and Ed on the ISS, check out the winning entries of previous rounds of the Astro Pi challenge.
Thomas Pesquet with Ed and Izzy
But you can also stick to terrestrial scientific investigations. For example, why not build a weather station and share its data on your own web server or via Weather Underground?
Your code in space!
If you’re a student or an educator in one of the 22 ESA member states, you can get a team together to enter our 2017-18 Astro Pi challenge. There are two missions to choose from, including Mission Zero: follow a few guidelines, and your code is guaranteed to run in space!
Customers across the world are increasingly building innovative Internet of Things (IoT) workloads on AWS. With AWS, they can handle the constant stream of data coming from millions of new, internet-connected devices. This data can be a valuable source of information if it can be processed, analyzed, and visualized quickly in a scalable, cost-efficient manner. Engineers and developers can monitor performance and troubleshoot issues while sales and marketing can track usage patterns and statistics to base business decisions.
In this post, I demonstrate a sample solution to build a quick and easy monitoring and visualization dashboard for your IoT data using AWS serverless and managed services. There’s no need for purchasing any additional software or hardware. If you are already using AWS IoT, you can build this dashboard to tap into your existing device data. If you are new to AWS IoT, you can be up and running in minutes using sample data. Later, you can customize it to your needs, as your business grows to millions of devices and messages.
Architecture
The following is a high-level architecture diagram showing the serverless setup to configure.
AWS service overview
AWS IoT is a managed cloud platform that lets connected devices interact easily and securely with cloud applications and other devices. AWS IoT can process and route billions of messages to AWS endpoints and to other devices reliably and securely.
Amazon Kinesis Firehose is the easiest way to capture, transform, and load streaming data continuously into AWS from thousands of data sources, such as IoT devices. It is a fully managed service that automatically scales to match the throughput of your data and requires no ongoing administration.
Amazon Kinesis Analytics allows you to process streaming data coming from IoT devices in real time with standard SQL, without having to learn new programming languages or processing frameworks, providing actionable insights promptly.
The processed data is fed into Amazon QuickSight, which is a fast, cloud-powered business analytics service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get business insights from the data.
The most popular way for Internet-connected devices to send data is using MQTT messages. The AWS IoT gateway receives these messages from registered IoT devices. The solution in this post uses device data from AWS Simple Beer Service (SBS), a series of internet-connected kegerators sending sensor outputs such as temperature, humidity, and sound levels in a JSON payload. You can use any existing IoT data source that you may have.
The AWS IoT rules engine allows selecting data from message payloads, processing it, and sending it to other services. You forward the data to a Firehose delivery stream to consolidate the continuous data stream into batches for further processing. The batched data is also stored temporarily in an Amazon S3 bucket for later retrieval and can be set for deletion after a specified time using S3 Lifecycle Management rules.
The incoming data from the Firehose delivery stream is fed into an Analytics application that provides an easy way to process the data in real time using standard SQL queries. Analytics allows writing standard SQL queries to extract specific components from the incoming data stream and perform real-time ETL on it. In this post, you use this feature to aggregate minimum and maximum temperature values from the sensors per minute. You load it in Amazon QuickSight to create a monitoring dashboard and check if the devices are over-heating or cooling down during use. You also extract every device’s location, parameters such as temperature, sound levels, humidity, and the time stamp in Analytics to use on the visualization dashboard.
The processed data from the two queries is fed into two Firehose delivery streams, both of which batch the data into CSV files every minute and store it in S3. The batching time interval is configurable between 1 and 15 minutes in 1-second intervals.
Finally, you use Amazon QuickSight to ingest the processed CSV files from S3 as a data source to build visualizations. Amazon QuickSight’s super-fast, parallel, in-memory, calculation engine (SPICE) parses the ingested data and allows you to create a variety of visualizations with different graph types. You can also use the Amazon QuickSight built-in Story feature to combine visualizations into business dashboards that can be shared in a secure manner.
Implementation
AWS IoT, Amazon Kinesis, and Amazon QuickSight are all fully managed services, which means you can complete the entire setup in just a few steps using the AWS Management Console. Don’t worry about setting up any underlying hardware or installing any additional software. So, let’s get started.
Step 1. Set up your AWS IoT data source
Do you currently use AWS IoT? If you have an existing IoT thing set up and running on AWS IoT, you can skip to Step 2.
If you have an AWS IoT button or other IoT devices that can publish MQTT messages and would like to use that for the setup, follow the Getting Started with AWS IoT topic to connect your thing to AWS IoT. Continue to Step 2.
If you do not have an existing IoT device, you can generate simulated device data using a script on your local machine and have it publish to AWS IoT. The following script lets you set up your AWS IoT environment and publish simulated data that mimics device data from Simple Beer Service.
Generate sample data
Running the sbs.py Python script generates fictitious AWS IoT messages from multiple SBS devices. The IoT rule sends the message to Firehose for further processing.
Run the script and keep it running for the duration of the project to generate sufficient data.
Tip: If you encounter any issues running the script from your local machine, launch an EC2 instance and run the script there as a root user. Remember to assign an appropriate IAM role to your instance at the time of launch that allows it to access AWS IoT.
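If you want to write your own simulator along the same lines, a minimal boto3-based publisher might look like the following sketch. The field names match the columns referenced in the Analytics SQL later in this post; the topic name and value ranges are assumptions:

import json
import random
import time

import boto3

iot_data = boto3.client('iot-data')   # optionally pass endpoint_url for your account-specific endpoint

devices = ['SBS01', 'SBS02', 'SBS03']

while True:
    for device in devices:
        payload = {
            'deviceId': device,
            'deviceParameter': random.choice(['Temperature', 'Humidity', 'Sound']),
            'deviceValue': random.randint(10, 100),
            'dateTime': time.strftime('%Y-%m-%d %H:%M:%S'),
        }
        iot_data.publish(topic='/sbs/devicedata/' + device, qos=1, payload=json.dumps(payload))
    time.sleep(2)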
Step 2. Create three Firehose delivery streams
For this post, you require three Firehose delivery streams: one to batch raw data from AWS IoT, and two to batch output device data and aggregated data from Analytics.
Step 3: Create an IoT rule
In the AWS IoT console, create a rule that selects the incoming device messages and add an action to Send messages to an Amazon Kinesis Firehose stream (select IoT-Source-Stream from the dropdown). For Select Separator, choose “\n (newline)”.
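If you prefer to create the rule programmatically instead of in the console, a rough boto3 sketch might look like this; the rule name, topic filter, and role ARN are assumptions, and the role must allow firehose:PutRecord on the delivery stream:

import boto3

iot = boto3.client('iot')

iot.create_topic_rule(
    ruleName='SBS_to_Firehose',
    topicRulePayload={
        'sql': "SELECT * FROM '/sbs/devicedata/#'",
        'ruleDisabled': False,
        'actions': [{
            'firehose': {
                'roleArn': 'arn:aws:iam::<your_aws_account_id>:role/<iot-to-firehose-role>',
                'deliveryStreamName': 'IoT-Source-Stream',
                'separator': '\n',
            }
        }],
    },
)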
A quick check before proceeding further: make sure that you have run the script to generate simulated IoT data or that your IoT Thing is running and delivering data. If not, set it up now. The Amazon Kinesis Analytics application you set up in the next step needs the data to process it further.
Step 4: Create an Analytics application to process data
Enter a name of your choice, for example, SBS-IoT-Data.
For the source, choose IoT-Source-Stream.
Analytics auto-discovers the schema on the data by sampling records from the input stream. It also includes an in-built SQL editor that allows you to write standard SQL queries to transform incoming data.
Tip: If Analytics is unable to discover your incoming data, it may be missing the appropriate IAM permissions. In the IAM console, select the role that you assigned to your IoT rule in Step 3. Make sure that it has the ARN of the IoT-Source-Data Firehose stream listed in the firehose:putRecord section.
Here is a sample SQL query that generates two output streams:
DESTINATION_SQL_BASIC_STREAM contains the device ID, device parameter, its value, and the time stamp from the incoming stream.
DESTINATION_SQL_AGGREGATE_STREAM aggregates the maximum and minimum values of temperatures from the sensors over a one-minute period from the incoming data.
-- Create an output stream with four columns, which is used to send IoT data to the destination
CREATE OR REPLACE STREAM "DESTINATION_SQL_BASIC_STREAM" (dateTime TIMESTAMP, deviceId VARCHAR(8), deviceParameter VARCHAR(16), deviceValue INTEGER);
-- Create a pump that continuously selects from the source stream and inserts it into the output data stream
CREATE OR REPLACE PUMP "STREAM_PUMP_1" AS INSERT INTO "DESTINATION_SQL_BASIC_STREAM"
-- Filter specific columns from the source stream
SELECT STREAM "dateTime", "deviceId", "deviceParameter", "deviceValue" FROM "SOURCE_SQL_STREAM_001";
-- Create a second output stream with three columns, which is used to send aggregated min/max data to the destination
CREATE OR REPLACE STREAM "DESTINATION_SQL_AGGREGATE_STREAM" (dateTime TIMESTAMP, highestTemp SMALLINT, lowestTemp SMALLINT);
-- Create a pump that continuously selects from a source stream
CREATE OR REPLACE PUMP "STREAM_PUMP_2" AS INSERT INTO "DESTINATION_SQL_AGGREGATE_STREAM"
-- Extract time in minutes, plus the highest and lowest value of device temperature in that minute, into the destination aggregate stream, aggregated per minute
SELECT STREAM FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE) AS "dateTime", MAX("deviceValue") AS "highestTemp", MIN("deviceValue") AS "lowestTemp" FROM "SOURCE_SQL_STREAM_001" WHERE "deviceParameter"='Temperature' GROUP BY FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE);
Real-time analytics shows the results of the SQL query. If everything is working correctly, you see three streams listed, similar to the following screenshot.
Step 5: Connect the Analytics application to output Firehose delivery streams
You create two destinations for the two delivery streams that you created in the previous step. A single Analytics application can have multiple destinations defined; however, this needs to be set up using the AWS CLI, not from the console. If you do not already have it, install the AWS CLI on your local machine and configure it with your credentials.
Tip: If you are running the IoT script from an EC2 instance, it comes pre-installed with the AWS CLI.
Create the first destination delivery stream
The AWS CLI command to create a new output Firehose delivery stream is as follows:
aws kinesisanalytics add-application-output --application-name <Name of Analytics Application> --current-application-version-id <number> --application-output 'Name=DESTINATION_SQL_BASIC_STREAM,KinesisFirehoseOutput={ResourceARN=<ARN of IoT-Destination-Data-Stream>,RoleARN=<ARN of Analytics application>},DestinationSchema={RecordFormatType=CSV}'
Do not copy this into the CLI just yet! Before entering this command, make the following four changes to personalize it:
For Name of Analytics Application, enter the value from Step 4, or from the Analytics console.
For current-application-version-ID, run the following command:
aws kinesisanalytics describe-application --application-name <application name from above> | grep ApplicationVersionId
For RoleARN, run the following command:
aws kinesisanalytics describe-application --application-name <application name from above> | grep RoleARN
Now, paste the complete command in the AWS CLI and press Enter. If there are any errors, the response provides details. If everything goes well, a new destination delivery stream is created to send the first query (DESTINATION_SQL_BASIC_STREAM) to IoT-Destination-Data-Stream.
Create the second destination delivery stream
Following similar steps as above, create a second destination Firehose delivery stream with the following changes:
For Name of Analytics Application, enter the same name as the first delivery stream.
For current-application-version-ID, increment by 1 from the previous value (unless you made other changes in between these steps). If unsure, run the same command as above to get it again.
For ResourceARN, get the value by running the following CLI command:
For RoleArn, enter the same value as the first stream.
Run the aws kinesisanalytics CLI command, similar to the previous step but with the new parameters substituted. This creates the second output Firehose destination delivery stream.
Update the IAM role for Analytics to allow writing to both output streams.
In the console, choose IAM, Roles.
Select the role that you created with Analytics in Step 4.
Choose Policy, JSON, and Edit.
Find “Sid”: “WriteOutputFirehose” in the JSON document, go to the “Resource” section and make sure that it includes Resource ARNs of both streams that you found in the previous step.
If it has only one ARN, add the second ARN and choose Save.
This completes the Amazon Kinesis setup. The incoming IoT data is processed by Analytics and delivered, using two output delivery streams, to two separate folders in your S3 bucket.
Step 6: Set up Amazon QuickSight to analyze the data
To build the visualization dashboard, ingest the processed CSV files from the S3 bucket into Amazon QuickSight.
In the console, choose QuickSight.
If this is your first time using Amazon QuickSight, you are asked to create a new account. Follow the prompts.
When you are logged in to your account, choose New Analysis and enter a name of your choice.
Choose New data set for the analysis or, if you have previously imported your data set, select one from the available data sets.
You import two data sets: one with general device parameters information, and the other with aggregates of maximum and minimum temperatures for monitoring. For the first data set, choose S3 from the list of available data sources and enter a name, for example, IoT Device Data.
The location of the S3 bucket and the objects to use are provided to Amazon QuickSight as a manifest file. Create a new manifest file following the supported formats for Amazon S3 manifest files.
In the URIPrefixes section, provide your appropriate S3 bucket and folder location for the general device data. Hint: it should include <your unique name>-kinesis/data/.
Your manifest file should look similar to the following:
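A minimal manifest in the format Amazon QuickSight accepts might look like this sketch; the bucket name is a placeholder:

{
  "fileLocations": [
    {"URIPrefixes": ["https://s3.amazonaws.com/<your unique name>-kinesis/data/"]}
  ],
  "globalUploadSettings": {"format": "CSV", "delimiter": ","}
}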
Amazon QuickSight imports and parses the data set, and provides available data fields that can be used for making graphs. The Edit/Preview data button allows you to format and transform the data, change data types, and filter or join your data. Make sure that the columns have the correct titles. If not, you can edit them and then save.
Tip: choose the downward arrow on the top right and unselect Files include headers to give each column appropriate headers. Choose Save. This takes you back to the data sets page.
Follow the same steps as above to import the second data set. This time, your manifest should include your aggregate data set folder on S3, which is named <your unique name>-kinesis/aggregate/. Update headers if necessary and choose Save & visualize.
Build an analysis
The visualization screen shows the data set that you last imported, which in this case is the aggregate data. To include the general device data as well, for Fields on the top left, choose Edit analysis data sets. Choose Add data set and select the other data set that you saved earlier.
Now both data sets are available on the analysis screen. For Visual Types at bottom left, select the type of graph to make. For Fields, select the fields to visualize. For example, drag Device ID, Device Parameter, and Value to Field wells, as shown in the screenshot below, to generate a visualization of average parameter values compared across devices.
You can create another visual by choosing +Add. This time, select a line graph to show monitoring of the maximum temperature values of the sensors in any minute, from the aggregate data set.
If you would like to create an interactive story to present to your team or organization, you can choose the Story option on the left panel. Create a dashboard with multiple visualizations, to save and share securely with the intended audience. An example of a story is shown below.
Conclusion
Any data is valuable only when it can actually be put to use. In this post, you’ve seen how it’s possible to quickly build a simple Analytics application to ingest, process, and visualize IoT data in near real time, entirely using AWS managed services. This solution is scalable and reliable, and costs a fraction of what comparable business intelligence solutions cost. It is easy enough that anyone with an AWS account can build and use it without any special training.
If you have any questions or suggestions, please comment below.
About the Author
Karan Desai is a Solutions Architect with Amazon Web Services. He works with startups and small businesses in the US, helping them adopt cloud technology to build scalable and secure solutions using AWS. In his spare time, he likes to build personal IoT projects, travel to offbeat places and write about it.
OpenEnergyMonitor, who make open-source tools for energy monitoring, have been using Raspberry Pi since we launched in 2012. Like Raspberry Pi, they manufacture their hardware in Wales and send it to people all over the world. We invited co-founder Glyn Hudson to tell us why they do what they do, and how Raspberry Pi helps.
Hi, I’m Glyn from OpenEnergyMonitor. The OpenEnergyMonitor project was founded out of a desire for open-source tools to help people understand and relate to their use of energy, their energy systems, and the challenge of sustainable energy.
The next 20 years will see a revolution in our energy systems, as we switch away from fossil fuels towards a zero-carbon energy supply.
By using energy monitoring, modelling, and assessment tools, we can take an informed approach to determine the best energy-saving measures to apply. We can then check to ensure solutions achieve their expected performance over time.
We started the OpenEnergyMonitor project in 2009, and the first versions of our energy monitoring system used an Arduino with Ethernet Shield, and later a Nanode RF with an embedded Ethernet controller. These early versions were limited by a very basic TCP/IP stack; running any sort of web application locally was totally out of the question!
I can remember my excitement at getting hold of the very first version of the Raspberry Pi in early 2012. Within a few hours of tearing open the padded envelope, we had Emoncms (our open-source web logging, graphing, and visualisation application) up and running locally on the Raspberry Pi. The Pi quickly became our web-connected base station of choice (emonBase). The following year, 2013, we launched the RFM12Pi receiver board (now updated to RFM69Pi). This allowed the Raspberry Pi to receive data via low-power 433MHz RF from our emonTx energy monitoring unit, and later from our emonTH remote temperature and humidity monitoring node.
In 2015 we went all-in with Raspberry Pi when we launched the emonPi, an all-in-one Raspberry Pi energy monitoring unit, via Kickstarter. Thanks to the hard work of the Raspberry Pi Foundation, the emonPi has enjoyed several upgrades: extra processing power from the Raspberry Pi 2, then even more power and integrated wireless LAN thanks to the Raspberry Pi 3. With all this extra processing power, we have been able to build an open software stack including Emoncms, MQTT, Node-RED, and openHAB, allowing the emonPi to function as a powerful home automation hub.
Emoncms Apps interface to emonPi home automation hub
Inspired by the Raspberry Pi Foundation, we manufacture and assemble our hardware in Wales, UK, and ship worldwide via our online store.
All of our work is fully open source. We believe this is a better way of doing things: we can learn from and build upon each other’s work, creating better solutions to the challenges we face. Using Raspberry Pi has allowed us to draw on the expertise and work of many other projects. With lots of help from our fantastic community, we have built an online learning resource section of our website to help others get started: it covers things like basic AC power theory, Arduino, and the bigger picture of sustainable energy.
To learn more about OpenEnergyMonitor systems, take a look at our Getting Started User Guide. We hope you’ll join our community.
Last year, we partnered with Trinket to develop a web-based emulator for the Sense HAT, the multipurpose add-on board for the Raspberry Pi. Today, we are proud to announce an exciting new upgrade to the emulator. We hope this will make it even easier for you to design amazing experiments with the Sense HAT!
What’s new?
The original release of the emulator didn’t fully support all of the Sense HAT features. Specifically, the movement sensors were not emulated. Thanks to funding from the UK Space Agency, we are delighted to announce that a new round of development has just been completed. From today, the movement sensors are fully supported. The emulator also comes with a shiny new 3D interface, Astro Pi skin mode, and Pygame event handling. Click the ▶︎ button below to see what’s new!
Upgraded sensors
On a physical Sense HAT, real sensors react to changes in environmental conditions like fluctuations in temperature or humidity. The emulator has sliders which are designed to simulate this. However, emulating the movement sensor is a bit more complicated. The upgrade introduces a 3D slider, which is essentially a model of the Sense HAT that you can move with your mouse. Moving the model affects the readings provided by the accelerometer, gyroscope, and magnetometer sensors.
Code written in this emulator is directly portable to a physical Raspberry Pi and Sense HAT without modification. This means you can now develop and test programs using the movement sensors from any internet-connected computer, anywhere in the world.
Astro Pi mode
Astro Pi is our series of competitions offering students the chance to have their code run in space! The code is run on two space-hardened Raspberry Pi units, with attached Sense HATs, on the International Space Station.
Astro Pi skin mode
There are a number of practical things that can catch you out when you are porting your Sense HAT code to an Astro Pi unit, though, such as the orientation of the screen and joystick. Just as having a 3D-printed Astro Pi case enables you to discover and overcome these, so does the Astro Pi skin mode in this emulator. In the bottom right-hand panel, there is an Astro Pi button which enables the mode: click it again to go back to the Sense HAT.
The joystick and push buttons are operated by pressing your keyboard keys: use the cursor keys and Enter for the joystick, and U, D, L, R, A, and B for the buttons.
Sense HAT resources for Code Clubs
Click the image to visit the Code Club projects page
We also have a new range of Code Club resources which are based on the emulator. Of these, three use the environmental sensors and two use the movement sensors. The resources are an ideal way for any Code Club to get into physical computing.
The technology
The 3D models in the emulator are represented entirely with HTML and CSS. “This project pushed the Trinket team, and the 3D web, to its limit,” says Elliott Hauser, CEO of Trinket. “Our first step was to test whether pure 3D HTML/CSS was feasible, using Julian Garnier’s Tridiv.”
The Trinket team’s preliminary 3D model of the Sense HAT
“We added JavaScript rotation logic and the proof of concept worked!” Elliott continues. “Countless iterations, SVG textures, and pixel-pushing tweaks later, the finished emulator is far more than the sum of its parts.”
The finished Sense HAT model: doesn’t it look amazing?
Check out this blog post from Trinket for more on the technology and mathematics behind the models.
One of the compromises we’ve had to make is browser support. Unfortunately, browsers like Firefox and Microsoft Edge don’t fully support this technology yet. Instead, we recommend that you use Chrome, Safari, or Opera to access the emulator.
Where do I start?
If you’re new to the Sense HAT, you can simply copy and paste many of the code examples from our educational resources, like this one. Alternatively, you can check out our Sense HAT Essentials e-book. For a complete list of all the functions you can use, have a look at the Sense HAT API reference here.
Multi-talented maker Giorgio Sancristoforo has used a Raspberry Pi and Sense HAT to create Tableau, a generative music album. It’s an innovative idea: the music constantly evolves as it reacts to environmental stimuli like atmospheric pressure, humidity, and temperature.
“There is no doubt that, as music is removed by the phonograph record from the realm of live production and from the imperative of artistic activity and becomes petrified, it absorbs into itself, in this process of petrification, the very life that would otherwise vanish.”
Creating generative music
“I’ve been dreaming about using portable microcomputers to create a generative music album,” explains Giorgio. “Now my dream is finally a reality: this is my first portable generative LP (PGLP)”. Tableau uses both a Raspberry Pi 2 and a Sense HAT: the HAT provides the data for the album’s musical evolution via its range of onboard sensors.
Photo credit: Giorgio Sancristoforo
The Sense HAT was originally designed for use aboard the International Space Station (ISS) as part of the ongoing Astro Pi challenge. It has, however, become a staple within the Raspberry Pi maker community. This is partly thanks to the myriad of possibilities offered by its five onboard sensors, five-button joystick, and 8 × 8 LED matrix.
Photo credit: Giorgio Sancristoforo
Limited edition
The final release of Tableau consists of a limited edition of fifty PGLPs: each is set up to begin playing immediately power is connected, and the music will continue to evolve indefinitely. “Instead of being reproduced as on a CD or in an MP3 file, the music is spontaneously generated and arranged while you are listening to it,” Giorgio explains on his website. “It never sounds the same. Tableau creates an almost endless number of mixes of the LP (4 × 12 factorial). Each time you will listen, the music will be different, and it will keep on evolving until you switch the power off.”
Photo credit: Giorgio Sancristoforo
Experiment with the Sense HAT
What really interests us is how the sound of Tableau might alter in different locations. Would it sound different in Cambridge as opposed to the deserts of Mexico? What about Antarctica versus the ISS?
If Giorgio’s project has piqued your interest, why not try using our free data logging resource for the Sense HAT? You can use it to collect information from the HAT’s onboard sensors and create your own projects. How about collecting data over a year, and transforming this into your own works of art?
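The resource walks you through the details, but the core idea is simple: read the sensors at a regular interval and append the values to a file. Here is a rough sketch; the file name and the one-minute interval are arbitrary choices of ours, not taken from the resource.

import csv
import time
from datetime import datetime

from sense_hat import SenseHat

sense = SenseHat()

with open("sense_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "temperature_c", "humidity_pct", "pressure_mbar"])
    while True:
        writer.writerow([
            datetime.now().isoformat(),
            round(sense.get_temperature(), 2),
            round(sense.get_humidity(), 2),
            round(sense.get_pressure(), 2),
        ])
        f.flush()  # make sure each row reaches the disk
        time.sleep(60)  # log once a minute (arbitrary interval)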
Even if you don’t have access to the Sense HAT, you can experience it via the Sense HAT desktop emulator. This is a great solution if you want to work on Sense HAT-based projects in the classroom, as it reduces the amount of hardware you need.
If you’ve already built a project using the Sense HAT, make sure to share it in the comments below. We would love to see what you have been making!
Following a post-Christmas decision to keep illuminated decorations on her stairway bannister throughout the year, Lorraine Underwood found a new purpose for a strip of NeoPixels she had lying around.
Changed the stair lights from a string to a strip & they look awesome! #neopixel #raspberrypi https://t.co/dksLwy1SE1
Simply running the lights up the stairs, blinking and flashing to a bit of random code, wasn’t enough for her. By using an API to check the outdoor weather, Lorraine’s lights went from decorative to informative: they now give an indication of outside weather conditions through their colour and the quantity illuminated.
“The idea is that more lights will light up as it gets warmer,” Lorraine explains. “The temperature is checked every five minutes (I think that may even be a little too often). I am looking forward to walking downstairs to a nice warm yellow light instead of the current blue!”
Lorraine’s strip has 240 lights in total; she created a chart mapping ranges of outside temperature to the number of lights to illuminate for each value, as well as specifying the colour of those lights, running from chilly blue through to scorching red.
Oh, Lorraine! We love your optimistic dreams of the British summer being more than its usual rainy 16 Celsius…
The lights are controlled by a Raspberry Pi Zero running code that can be found on Lorraine’s blog. The code dictates which lights are lit and when.
“Do I need a coat today? I’ll check the stairs.”
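Lorraine’s own code is the place to look for the real details, but the general idea is easy to sketch. The following illustration uses the Adafruit CircuitPython neopixel library, which is an assumption on our part, as are the pin, the brightness, and the temperature-to-colour mapping; only the 240-pixel count comes from her write-up.

import board
import neopixel

NUM_PIXELS = 240  # the full strip running up the bannister

# Pin and brightness are assumptions; the real wiring may differ
pixels = neopixel.NeoPixel(board.D18, NUM_PIXELS, brightness=0.3, auto_write=False)

def colour_for(temp_c):
    """Very rough blue-to-red mapping; Lorraine's actual chart is on her blog."""
    if temp_c <= 5:
        return (0, 0, 255)    # chilly blue
    if temp_c <= 15:
        return (0, 255, 255)  # cool cyan
    if temp_c <= 22:
        return (255, 200, 0)  # warm yellow
    return (255, 0, 0)        # scorching red

def show_temperature(temp_c, min_c=-5, max_c=35):
    """Light a proportion of the strip according to the outside temperature."""
    fraction = max(0.0, min(1.0, (temp_c - min_c) / (max_c - min_c)))
    pixels.fill((0, 0, 0))
    for i in range(int(fraction * NUM_PIXELS)):
        pixels[i] = colour_for(temp_c)
    pixels.show()

# temp_c would come from the weather API that is polled every five minutes
show_temperature(16)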
Lorraine is planning some future additions to the build, including a toddler-proof 3D housing, powering the Zero from the lights’ power supply, and gathering her own temperature data instead of relying on a third-party API.
While gathering the temperature data from outside her house, she may also want to look into building an entire weather station, collecting extra data on rain, humidity, and wind conditions. After all, this is the UK: just because it’s hot outside, it doesn’t mean it’s not also raining.
As well as working with classroom teachers and supporting learning in schools, Raspberry Pi brings computing and digital making to educators and learners in all sorts of other settings. I recently attended Wintercamp, a camp for Scouts at Gilwell Park. With some help from Brian and Richard from Vodafone, I ran a Raspberry Pi activity space introducing Scouts to digital making with Raspberry Pi, using the Sense HAT, the Camera Module, and GPIO, based on some of our own learning resources.
Today I’m running @Raspberry_Pi activities for @UKScouting at @gpwintercamp with @VodafoneUK!
Note the plastic sheeting on the floor! Kids were dropping into our sessions all day with muddy boots, having taken part in all sorts of fun activities, indoors and out.
In the UK, the Scouts have Digital Citizen and Digital Maker badges, and we’re currently working with the Scout Association to help deliver content for the Digital Maker badge, as supported by the Vodafone Foundation.
The activities we ran were just a gentle introduction to creative tech and experimenting with sensors, but they went down really well, and many of the participants felt happy to move beyond the worksheets and try out their own ideas. We set challenges, and got them to think about how they could incorporate technology like this into their Scouting activities.
Having been through the Scouting movement myself, it’s amazing to be involved in working to show young people how technology can be applied to projects related to their other hobbies and interests. I loved introducing the Scouts to the idea that programming and making can be tools to help solve problems that are relevant to them and to others in their communities, as well as enabling them to do some good in the world, and to be creative.
“Can you breathe on the Sense HAT to make the humidity read 90?” “That’s cool. It makes you light-headed…”
While conducting a survey of Raspberry Jam organisers recently, I discovered that a high proportion of those who run Jams are also involved in other youth organisations. Many were Scout leaders. Other active Pi community folk happen to be involved in Scouting too, like Brian and Richard, who helped out at the weekend, and who are Scout and Cub leaders. I’m interested to speak to anyone in the Pi community who has an affiliation with the Scouts to share ideas on how they think digital making can be incorporated in Scouting activities. Please do get in touch!
Not a great picture but the Scouts made a Fleur de Lys on the Sense HAT at @gpwintercamp
The timing is perfect for young people in this age group to get involved with digital making, as we’ve just launched our first Pioneers challenge. There’s plenty of scope there for outdoor tech projects.
Thanks to UK Scouting and the Wintercamp team for a great weekend. Smiles all round!
Over the last few months, we’ve been working with US-based startup Trinket to develop a web-based emulator for the Sense HAT, the multipurpose add-on board for the Raspberry Pi which is also the core component of the Astro Pi units on the International Space Station. We wanted to provide a unique, free learning resource that brings the excitement of programming our space-qualified hardware to students, teachers, and others all over the world.
We’re delighted to announce its release today, and you can try it for yourself right now. Click the Run ▻ button below and see what happens!
The emulator will allow more people to participate in future Astro Pi competitions – you’ll be able to join in without needing to own a Raspberry Pi computer or a Sense HAT.
British ESA Astronaut Tim Peake with the Astro Pi. Image credit ESA
The new emulator builds on Trinket’s existing Python-in-browser platform, and provides the following features:
Virtual Sense HAT with environmental controls and joystick input
Full Python syntax highlighting
Contextual auto-complete
Intuitive error reporting and highlighting
Image upload
HTML page embedding
Social media integration
Project sharing via direct URL
Project download as zip (for moving to Raspberry Pi)
All major browsers supported
The Sense HAT has temperature, pressure and humidity sensors, and can change its behaviour according to the values they report. The Sense HAT emulator has sliders you can move to change these values, so you can test how your code responds to environmental variables.
You can move the sliders to change what the sensors are reporting
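A quick way to watch the sliders doing something is to poll the sensors and react to the values they report. The sketch below turns the matrix red once the temperature you dial in passes a threshold; the 30-degree figure is just an arbitrary value for the demonstration.

import time

from sense_hat import SenseHat

sense = SenseHat()

RED = (255, 0, 0)
GREEN = (0, 255, 0)

# Poll the (real or emulated) sensors and react to what they report
while True:
    if sense.get_temperature() > 30:  # arbitrary threshold for the demo
        sense.clear(RED)
    else:
        sense.clear(GREEN)
    time.sleep(0.5)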
Code written in this emulator is directly portable to a physical Raspberry Pi with a Sense HAT without modification. This means any code you write can be run by the Astro Pi units on board the ISS! It is our hope that, within the next 12 months, code that has been written in the emulator will run in space. Look out for news on this, coming soon on the Astro Pi site!
We owe huge thanks to Trinket, who have been wonderful partners in this project. The development work has been completed in just over two months, and has been a huge collaborative effort from the start. The software relies heavily on open-source technology and a global community of developers who are committed to making the power of code more accessible to students.
A closed group of beta testers, made up of previous Astro Pi participants and Code Club champions, has been putting the emulator through its paces over recent weeks. We’re proud to say that we’ve just had a bug-free open beta over the weekend, and now we’re looking forward to seeing it used as widely as possible.
So, where do you start? If you’re new to the Sense HAT, you can just copy and paste a lot of the code examples from our educational resources like this one. You can also check out our e-book Sense HAT Essentials. For a complete list of all the functions you can use, have a look at the Sense HAT API reference here; please note that the IMU (movement-sensing) functions will be supported in a future update. Head over to the main Sense HAT emulator site to see loads of other cool examples of what’s possible. Flappy LED, anyone?
We keep a very close eye on the annual Tech4Good competition, and especially the children who are nominated for their BT Young Pioneer award; there are some fiercely smart kids there doing some hugely impressive work. This year’s was a very close field (I would not like to have been judging – there were some extraordinary projects presented).
Tech4Good award winners 2016
Arnav Sharma, nine years old, was the Winner of Winners as well as the winner of the Young Pioneer section with this asthma monitor, which runs on Raspberry Pi. Arnav started by learning about the causes and effects of asthma, and thought about ways to help patients. He discovered that asthma is hard to diagnose, but can be fatal if left undetected. This leads to many children being over-diagnosed and over-medicated; inhalers are often given as treatment to reduce the symptoms of asthma, but come with side-effects like reduced growth and immunity. Arnav discovered that the best way to manage asthma is to prevent attacks by understanding what triggers asthma attacks and following a treatment plan.
AsthmaPi
Arnav’s AsthmaPi uses a Raspberry Pi, a Sense HAT, an MQ-135 Gas Sensor, a Sharp Optical Dust Sensor, and an Arduino Uno. The sensors on the Sense HAT are used to measure temperature and humidity, while the MQ-135 gas sensor detects nitrogen compounds, carbon dioxide, cigarette smoke, smog, ammonia, and alcohol, all known asthma triggers. The dust sensor measures the size of dust particles and their density. The AsthmaPi is programmed in Python and C++, and triggers email and SMS text message alerts to remind the owner to take medication and to go for review visits.
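Arnav’s own code isn’t reproduced here, but the alerting pattern he describes, reading a value, comparing it with a threshold, and sending a message if it’s exceeded, is easy to illustrate. The sketch below is purely hypothetical: the serial port, threshold, and addresses are placeholders rather than details from the project.

import smtplib
from email.message import EmailMessage

import serial  # pyserial, reading values forwarded by the Arduino Uno

THRESHOLD = 400  # placeholder air-quality threshold, not a value from the project

def send_alert(reading):
    msg = EmailMessage()
    msg["Subject"] = "AsthmaPi alert"
    msg["From"] = "asthmapi@example.com"  # placeholder addresses
    msg["To"] = "parent@example.com"
    msg.set_content("Air quality reading {} exceeded the threshold.".format(reading))
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as port:
    while True:
        line = port.readline().decode(errors="ignore").strip()
        if line.isdigit() and int(line) > THRESHOLD:
            send_alert(int(line))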
Here’s Arnav’s very impressive project video, which will walk you through what he’s put together, and how it all works.
Video demo of the AsthmaPi, an affordable asthma management kit made by Arnav Sharma, aged 9, a finalist in the Tech4Good competition. You can tweet him at #T4GArnavSharma, visit his finalist page at http://www.tech4goodawards.com/finalist/arnav-sharma/ or vote for him at http://www.tech4goodawards.com/peoples-award/
The Raspberry Pi official forums are the central online meeting place for the Raspberry Pi community. They’re where you’ll find support from hundreds of thousands (141,183, as of this morning) of other Pi users, including people from our own engineering team, as well as lots of inspiration for your own projects and loads of advice. You can chat to a selection of those of us who work at Raspberry Pi there too – we’re usually poking around in there for part of the day.
Commence to poking.
We found this rather brilliant hack to ventilate and maintain a cellar’s humidity on the forums. Forum member DasManul, from Frankfurt, put this together to measure temperature and relative humidity inside and outside his cellar, and to use those values to calculate absolute humidity. The setup then ventilates the space if the humidity inside is higher than it is outside.
On reading what he was up to, I assumed DasManul was looking after a cellarful of wine. Then I saw his pictures. He’s actually tending bottles of fabric softener and yoghurt.
Nestled next to the nonalcoholic liquids, you’ll find a touchscreen controller for the system, along with a USB receiver. Here’s a closer look at the display:
The system also outputs more detailed graphs (daily, weekly, or monthly) to a website served by nginx, which allows you to control the system remotely if you don’t happen to be down in the cellar conditioning your fabric.
DasManul says that he’s not much for hardware tinkering, and didn’t want to start drilling into his house’s infrastructure, so he used off-the-shelf parts for sensing and controlling. Two inexpensive wireless sensors, one indoors and one outdoors, from elv.de, do all the work checking the humidity; they feed information to the USB receiver, and intake and exhaust fans are controlled with an Energenie plug strip. (These things are great – I use an Energenie plug strip to turn lamps on and off via a remote PIR sensor in my living room).
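The decision logic itself is compact enough to sketch: convert each sensor’s temperature and relative humidity into absolute humidity (the Magnus approximation below is one common way to do this, and is our choice rather than necessarily DasManul’s), then run the fans only when the inside value is higher than the outside one. The fan-switching function is a placeholder, not his Energenie control code.

import math

def absolute_humidity(temp_c, rel_humidity_pct):
    """Approximate absolute humidity in g/m^3 via the Magnus formula."""
    svp = 6.112 * math.exp((17.62 * temp_c) / (243.12 + temp_c))  # saturation vapour pressure, hPa
    vapour_pressure = svp * rel_humidity_pct / 100.0
    return 216.7 * vapour_pressure / (273.15 + temp_c)

def switch_fans(on):
    """Placeholder for driving the intake and exhaust fans via the Energenie plug strip."""
    print("Fans", "on" if on else "off")

def control(inside, outside):
    """Ventilate only when the air inside holds more water than the air outside."""
    switch_fans(absolute_humidity(*inside) > absolute_humidity(*outside))

# Example readings as (temperature in Celsius, relative humidity in %)
control(inside=(14.0, 75.0), outside=(20.0, 60.0))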
DasManul has made all the code available (with German and English documentation) over at BitBucket so you can replicate the project. There’s plenty more like this over at the Raspberry Pi Forums – get stuck in!