Tag Archives: humidity

Astro Pi celebrates anniversary of ISS Columbus module

Post Syndicated from David Honess original https://www.raspberrypi.org/blog/astro-pi-celebrates-anniversary/

Right now, 400km above the Earth aboard the International Space Station, are two very special Raspberry Pi computers. They were launched into space on 6 December 2015 and are, most assuredly, the farthest-travelled Raspberry Pi computers in existence. Each year they run experiments that school students create in the European Astro Pi Challenge.

Raspberry Astro Pi units on the International Space Station

Left: Astro Pi Vis (Ed); right: Astro Pi IR (Izzy). Image credit: ESA.

The European Columbus module

Today marks the tenth anniversary of the launch of the European Columbus module. The Columbus module is the European Space Agency’s largest single contribution to the ISS, and it supports research in many scientific disciplines, from astrobiology and solar science to metallurgy and psychology. More than 225 experiments have been carried out inside it during the past decade. It’s also home to our Astro Pi computers.

Here’s a video from 7 February 2008, when Space Shuttle Atlantis went skywards carrying the Columbus module in its cargo bay.

STS-122 Launch NASA TV Coverage

NASA TV coverage of the 121st Space Shuttle launch, STS-122, on 7 February 2008. Liftoff was at 2:45:30 p.m. ET; coverage begins one hour before launch.

Today, coincidentally, is also the deadline for the European Astro Pi Challenge: Mission Space Lab. Participating teams have until midnight tonight to submit their experiments.

Anniversary celebrations

At 16:30 GMT today there will be a live event on NASA TV for the Columbus module anniversary with NASA flight engineers Joe Acaba and Mark Vande Hei.

Our Astro Pi computers will be joining in the celebrations by displaying a digital birthday candle that the crew can blow out. It works by detecting an increase in humidity when someone blows on it. The video below demonstrates the concept.

AstroPi candle

Uploaded by Effi Edmonton on 2018-01-17.

Do try this at home

The exact Astro Pi code that will run on the ISS today is available for you to download and run on your own Raspberry Pi and Sense HAT. You’ll notice that the program includes code to make it stop automatically when the date changes to 8 February. This is just to save time for the ground control team.

If you have a Raspberry Pi and a Sense HAT, you can use the terminal commands below to download and run the code yourself:

wget http://rpf.io/colbday -O birthday.py
chmod +x birthday.py
./birthday.py

When you see a blank blue screen with the brightness increasing, the Sense HAT is measuring the baseline humidity. It does this every 15 minutes so it can recalibrate to take account of natural changes in background humidity. A humidity increase of 2% is needed to blow out the candle, so if the background humidity changes by more than 2% in 15 minutes, it’s possible to get a false positive. Press Ctrl + C to quit.
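If you'd like to see how the blow-detection works in code, here is a minimal sketch of the same idea using the Python sense_hat library. It isn't the flight program (download that from the link above); the threshold and recalibration interval simply mirror the behaviour described here.

from sense_hat import SenseHat
import time

sense = SenseHat()
BLOW_THRESHOLD = 2.0        # rise in % relative humidity treated as a "blow"
RECALIBRATE_SECS = 15 * 60  # re-measure the baseline every 15 minutes

baseline = sense.get_humidity()
last_calibration = time.time()

try:
    while True:
        # Recalibrate periodically so natural drift in background humidity
        # doesn't trigger a false positive
        if time.time() - last_calibration > RECALIBRATE_SECS:
            baseline = sense.get_humidity()
            last_calibration = time.time()

        if sense.get_humidity() - baseline > BLOW_THRESHOLD:
            sense.clear()   # the "candle" goes out
            break

        time.sleep(0.1)
except KeyboardInterrupt:
    sense.clear()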

Please tweet pictures of your candles to @astro_pi – we might share yours! And if we’re lucky, we might catch a glimpse of the candle on the ISS during the NASA TV event at 16:30 GMT today.

The post Astro Pi celebrates anniversary of ISS Columbus module appeared first on Raspberry Pi.

In the Works – AWS IoT Device Defender – Secure Your IoT Fleet

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/in-the-works-aws-sepio-secure-your-iot-fleet/

Scale takes on a whole new meaning when it comes to IoT. Last year I was lucky enough to tour a gigantic factory that had, on average, one environment sensor per square meter. The sensors measured temperature, humidity, and air purity several times per second, and served as an early warning system for contaminants. I’ve heard customers express interest in deploying IoT-enabled consumer devices in the millions or tens of millions.

With powerful, long-lived devices deployed in a geographically distributed fashion, managing security challenges is crucial. However, the limited amount of local compute power and memory can sometimes limit the ability to use encryption and other forms of data protection.

To address these challenges and to allow our customers to confidently deploy IoT devices at scale, we are working on IoT Device Defender. While the details might change before release, AWS IoT Device Defender is designed to offer these benefits:

Continuous Auditing – AWS IoT Device Defender monitors the policies related to your devices to ensure that the desired security settings are in place. It looks for drifts away from best practices and supports custom audit rules so that you can check for conditions that are specific to your deployment. For example, you could check to see if a compromised device has subscribed to sensor data from another device. You can run audits on a schedule or on an as-needed basis.

Real-Time Detection and Alerting – AWS IoT Device Defender looks for and quickly alerts you to unusual behavior that could be coming from a compromised device. It does this by monitoring the behavior of similar devices over time, looking for unauthorized access attempts, changes in connection patterns, and changes in traffic patterns (either inbound or outbound).

Fast Investigation and Mitigation – In the event that you get an alert that something unusual is happening, AWS IoT Device Defender gives you the tools, including contextual information, to help you to investigate and mitigate the problem. Device information, device statistics, diagnostic logs, and previous alerts are all at your fingertips. You have the option to reboot the device, revoke its permissions, reset it to factory defaults, or push a security fix.

Stay Tuned
I’ll have more info (and a hands-on post) as soon as possible, so stay tuned!

Jeff;

Visualising Weather Station data with Initial State

Post Syndicated from Richard Hayler original https://www.raspberrypi.org/blog/initial-state/

Since we launched the Oracle Weather Station project, we’ve collected more than six million records from our network of stations at schools and colleges around the world. Each one of these records contains data from ten separate sensors — that’s over 60 million individual weather measurements!

Weather station measurements in Oracle database - Initial State

Weather station measurements in Oracle database

Weather data collection

Having lots of data covering a long period of time is great for spotting trends, but to do so, you need some way of visualising your measurements. We’ve always had great resources like Graphing the weather to help anyone analyse their weather data.

And from now on it’s going to be even easier for our Oracle Weather Station owners to display and share their measurements. I’m pleased to announce a new partnership with our friends at Initial State: they are generously providing a white-label platform to which all Oracle Weather Station recipients can stream their data.

Using Initial State

Initial State makes it easy to create vibrant dashboards that show off local climate data. The service is perfect for having your Oracle Weather Station data on permanent display, for example in the school reception area or on the school’s website.

But that’s not all: the Initial State toolkit includes a whole range of easy-to-use analysis tools for extracting trends from your data. Distribution plots and statistics are just a few clicks away!

Humidity value distribution (May-Nov 2017) - Raspberry Pi Oracle Weather Station Initial State

Looks like Auntie Beryl is right — it has been a damp old year! (Humidity value distribution May–Nov 2017)

The wind direction data from my Weather Station supports my excuse as to why I’ve not managed a high-altitude balloon launch this year: to use my launch site, I need winds coming from the east, and those have been in short supply.

Chart showing wind direction over time - Raspberry Pi Oracle Weather Station Initial State

Chart showing wind direction over time

Initial State credentials

Every Raspberry Pi Oracle Weather Station school will shortly be receiving the credentials needed to start streaming their data to Initial State. If you’re super keen though, please email [email protected] with a photo of your Oracle Weather Station, and I’ll let you jump the queue!

The Initial State folks are big fans of Raspberry Pi and have a ton of Pi-related projects on their website. They even included shout-outs to us in the music video they made to celebrate the publication of their 50th tutorial. Can you spot their weather station?

Your home-brew weather station

If you’ve built your own Raspberry Pi–powered weather station and would like to dabble with the Initial State dashboards, you’re in luck! The team at Initial State is offering 14-day trials for everyone. For more information on Initial State, and to sign up for the trial, check out their website.
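If you want a feel for what the streaming side looks like, here is a rough sketch using Initial State’s ISStreamer Python package, with a Sense HAT standing in as the data source. The bucket name, bucket key, and access key are placeholders you would replace with values from your own Initial State account.

from ISStreamer.Streamer import Streamer
from sense_hat import SenseHat
import time

sense = SenseHat()
streamer = Streamer(bucket_name="My Weather Station",
                    bucket_key="YOUR_BUCKET_KEY",
                    access_key="YOUR_ACCESS_KEY")

while True:
    # Log one reading per sensor; each key becomes a series on the dashboard
    streamer.log("Temperature (C)", round(sense.get_temperature(), 1))
    streamer.log("Humidity (%)", round(sense.get_humidity(), 1))
    streamer.log("Pressure (mbar)", round(sense.get_pressure(), 1))
    streamer.flush()
    time.sleep(60)  # one sample per minute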

The post Visualising Weather Station data with Initial State appeared first on Raspberry Pi.

Using taxis to monitor air quality in Peru

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/air-quality-peru/

When James Puderer moved to Lima, Peru, his roadside runs left a rather nasty taste in his mouth. Hit by the pollution from old diesel cars in the area, he decided to monitor the air quality in his new city using Raspberry Pis and the abundant taxis as his tech carriers.

Taxi Datalogger – Assembly

How to assemble the enclosure for my Taxi Datalogger project: https://www.hackster.io/james-puderer/distributed-air-quality-monitoring-using-taxis-69647e

Sensing air quality in Lima

Luckily for James, almost all taxis in Lima are equipped with the standard hollow vinyl roof sign seen in the video above, which makes them ideal for hacking.

Using a Raspberry Pi alongside various Adafruit tech including the BME280 Temperature/Humidity/Pressure Sensor and GPS Antenna, James created a battery-powered retrofit setup that fits snugly into the vinyl sign.

The schematic of the air quality monitor tech inside the taxi sign

With the onboard tech, the device collects data on longitude, latitude, humidity, temperature, pressure, and airborne particle count, feeding it back to an Android Things datalogger. This data is then pushed to Google IoT Core, where it can be remotely accessed.

Next, the data is processed by Google Dataflow and turned into a BigQuery table. Users can then visualize the collected measurements. And while James uses Google Maps to analyse his data, there are many tools online that will allow you to organise and study your figures depending on what final result you’re hoping to achieve.

A heat map of James' local area showing air quality

James hopped in a taxi and took his monitor on the road, collecting results throughout the journey

James has provided the complete build process, including all tech ingredients and code, on his Hackster.io project page, and urges makers to create their own air quality monitor for their local area. He also plans on building upon the existing design by adding a 12V power hookup for connecting to the taxi, functioning lights within the sign, and companion apps for drivers.
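James’s logger runs on Android Things, but if you’d like to experiment with the same BME280 sensor from a Raspberry Pi in Python, a rough sketch looks like this. It assumes Adafruit’s CircuitPython BME280 library (adafruit-circuitpython-bme280) with the sensor wired to the Pi’s I2C pins; the exact import path can vary between library versions.

import time
import board
from adafruit_bme280 import basic as adafruit_bme280

i2c = board.I2C()  # uses the Pi's default SCL/SDA pins
sensor = adafruit_bme280.Adafruit_BME280_I2C(i2c)

while True:
    # Print temperature, relative humidity, and pressure every five seconds
    print("%.1f C  %.1f %%RH  %.1f hPa"
          % (sensor.temperature, sensor.humidity, sensor.pressure))
    time.sleep(5)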

Sensing the world around you

We’ve seen a wide variety of Raspberry Pi projects using sensors to track the world around us, such as Kasia Molga’s Human Sensor costume series, which reacts to air pollution by lighting up, and Clodagh O’Mahony’s Social Interaction Dress, which she created to judge how conversation and physical human interaction can be scored and studied.

Human Sensor

Kasia Molga’s Human Sensor — a collection of hi-tech costumes that react to air pollution within the wearer’s environment.

Many people also build their own Pi-powered weather stations, or use the Raspberry Pi Oracle Weather Station, to measure and record conditions in their towns and cities from the roofs of schools, offices, and homes.

Have you incorporated sensors into your Raspberry Pi projects? Share your builds in the comments below or via social media by tagging us.

The post Using taxis to monitor air quality in Peru appeared first on Raspberry Pi.

timeShift(GrafanaBuzz, 1w) Issue 18

Post Syndicated from Blogs on Grafana Labs Blog original https://grafana.com/blog/2017/10/20/timeshiftgrafanabuzz-1w-issue-18/

Welcome to another issue of timeShift. This week we released Grafana 4.6.0-beta2, which includes some fixes for alerts, annotations, the Cloudwatch data source, and a few panel updates. We’re also gearing up for Oredev, one of the biggest tech conferences in Scandinavia, November 7-10. In addition to sponsoring, our very own Carl Bergquist will be presenting “Monitoring for everyone.” Hope to see you there – swing by our booth and say hi!


Latest Release

Grafana 4.6.0-beta2 is now available! It adds fixes for:

  • ColorPicker display
  • Alerting test
  • Cloudwatch improvements
  • CSV export
  • Text panel enhancements
  • Annotation fix for MySQL

To see more details on what’s in the newest version, please see the release notes.

Download Grafana 4.6.0-beta2 Now


From the Blogosphere

Screeps and Grafana: Graphing your AI: If you’re unfamiliar with Screeps, it’s an MMO RTS game for programmers, where the objective is to grow your colony by programming your units’ AI. You control your colony by writing JavaScript, which operates 24/7 in a single persistent real-time world filled with other players. This article walks you through graphing all your game stats with Grafana.

ntopng Grafana Integration: The Beauty of Data Visualization: Our friends at ntop created a tutorial so that you can graph ntop monitoring data in Grafana. They go through the metrics exposed, configuring the ntopng Data Source plugin, and building your first dashboard. They’ve also created a nice video tutorial of the process.

Installing Graphite and Grafana to Display the Graphs of Centreon: This article provides a step-by-step guide to getting your Centreon data into Graphite and visualizing the data in Grafana.

Bit v. Byte Episode 3 – Metrics for the Win: Bit v. Byte is a new weekly podcast about the web industry and the tools and techniques in use today and on the horizon. This episode dives into metrics and discusses Grafana, Prometheus, and NGINX Amplify.

Code-Quickie: Visualize heating with Grafana: With the winter weather coming, Reinhard wanted to monitor the stats in his boiler room. This article covers not only the visualization of the data, but also the different devices and sensors you can use in your own home.

RuuviTag with C.H.I.P – BLE – Node-RED: Following the temperature-monitoring theme from the last article, Tobias writes about his journey of hooking up his new RuuviTag to Grafana to measure temperature, relative humidity, air pressure and more.


Early Bird will be Ending Soon

Early bird discounts will be ending soon, but you still have a few days to lock in the lower price. We will be closing early bird on October 31, so don’t wait until the last minute to take advantage of the discounted tickets!

Also, there’s still time to submit your talk. We’ll accept submissions through the end of October. We’re looking for technical and non-technical talks of all sizes. Submit a CFP now.

Get Your Early Bird Ticket Now


Grafana Plugins

This week we have updates to two panels and a brand new panel that can add some animation to your dashboards. Installing plugins in Grafana is easy: for on-prem Grafana, use the grafana-cli tool; if you are using Hosted Grafana, it only takes one click.

NEW PLUGIN

Geoloop Panel – The Geoloop panel is a simple visualizer for joining GeoJSON to Time Series data, and animating the geo features in a loop. An example of using the panel would be showing the rate of rainfall during a 5-hour storm.

Install Now

UPDATED PLUGIN

Breadcrumb Panel – This plugin keeps track of dashboards you have visited within one session and displays them as a breadcrumb. The latest update fixes some issues with back navigation and url query params.

Update

UPDATED PLUGIN

Influx Admin Panel – The Influx Admin panel replicates features from the now-deprecated Web Admin Interface for InfluxDB, such as letting you see the currently running queries, which can also be easily killed.

Changes in the latest release:

  • Converted to typescript project based on typescript-template-datasource
  • Select Databases. This only works with PR#8096
  • Added time format options
  • Show tags from response
  • Support template variables in the query

Update


Contribution of the week:

Each week we highlight some of the important contributions from our amazing open source community. Thank you for helping make Grafana better!

The Stockholm Go Meetup had a hackathon this week and sent a PR for letting whitelisted cookies pass through the Grafana proxy. Thanks to everyone who worked on this PR!


Tweet of the Week

We scour Twitter each week to find an interesting/beautiful dashboard and show it off! #monitoringLove

This is awesome – we can’t get enough of these public dashboards!

We Need Your Help!

Do you have a graph that you love because the data is beautiful or because the graph provides interesting information? Please get in touch. Tweet or send us an email with a screenshot, and we’ll tell you about this fun experiment.

Tell Me More


Grafana Labs is Hiring!

We are passionate about open source software and thrive on tackling complex challenges to build the future. We ship code from every corner of the globe and love working with the community. If this sounds exciting, you’re in luck – WE’RE HIRING!

Check out our Open Positions


How are we doing?

Please tell us how we’re doing. Submit a comment on this article below, or post something at our community forum. Help us make these weekly roundups better!

Follow us on Twitter, like us on Facebook, and join the Grafana Labs community.

The possibilities of the Sense HAT

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/sense-hat-projects/

Did you realise the Sense HAT has been available for over two years now? Used by astronauts on the International Space Station, the exact same hardware is available to you on Earth. With a new Astro Pi challenge just launched, it’s time for a retrospective/roundup/inspiration post about this marvellous bit of kit.

Sense HAT attached to Pi and power cord

The Sense HAT on a Pi in full glory

The Sense HAT explained

We developed our scientific add-on board to be part of the Astro Pi computers we sent to the International Space Station with ESA astronaut Tim Peake. For a play-by-play of Astro Pi’s history, head to the blog archive.

Astro Pi logo with starry background

Just to remind you, this is all the cool stuff our engineers have managed to fit onto the HAT:

  • A gyroscope (sensing pitch, roll, and yaw)
  • An accelerometer
  • A magnetometer
  • Sensors for temperature, humidity, and barometric pressure
  • A joystick
  • An 8×8 LED matrix

You can find a roundup of the technical specs here on the blog.
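If you have the sense_hat Python library installed, reading most of those sensors takes a single line each. A quick, illustrative sketch:

from sense_hat import SenseHat

sense = SenseHat()

print("Temperature: %.1f C" % sense.get_temperature())
print("Humidity:    %.1f %%" % sense.get_humidity())
print("Pressure:    %.1f mbar" % sense.get_pressure())

orientation = sense.get_orientation()  # pitch, roll, and yaw, in degrees
print("Pitch %(pitch).1f  Roll %(roll).1f  Yaw %(yaw).1f" % orientation)

sense.show_message("Hello from the Sense HAT!")  # scroll text on the LED matrix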

How to Sense HAT

It’s easy to begin exploring this device: take a look at our free Getting started with the Sense HAT resource, or use one of our Code Club Sense HAT projects. You can also try out the emulator, available offline on Raspbian and online on Trinket.

Sense HAT emulator on Trinket

The Sense HAT emulator on trinket.io

Fun and games with the Sense HAT

Use the LED matrix and joystick to recreate games such as Pong or Flappy Bird. Of course, you could also add sensor input to your game: code an egg drop game or a Magic 8 Ball that reacts to how the device moves.

Sense HAT Random Sparkles

Create random sparkles on the Sense HAT

Once December rolls around, you could brighten up your home with a voice-controlled Christmas tree or an advent calendar on your Sense HAT.

If you like the great outdoors, you could also use your Sense HAT to recreate this Hiking Companion by Marcus Johnson. Take it with you on your next hike!

Art with the Sense HAT

The LED matrix is perfect for getting creative. To draw something basic without having to squint at a Python list, use this app by our very own Richard Hayler. Feeling more ambitious? The MagPi will teach you how to create magnificent pixel art. Ben Nuttall has created this neat little Python script for displaying a photo taken by the Raspberry Pi Camera Module on the Sense HAT.

Brett Haines Mathematica on the Sense HAT

It’s also possible to incorporate Sense HAT data into your digital art! The Python Turtle module and the Processing language are both useful tools for creating beautiful animations based on real-world information.

A Sense HAT project that also uses this principle is Giorgio Sancristoforo’s Tableau, a ‘generative music album’. This device creates music according to the sensor data:

Tableau Generative Album

“There is no doubt that, as music is removed by the phonograph record from the realm of live production and from the imperative of artistic activity and becomes petrified, it absorbs into itself, in this process of petrification, the very life that would otherwise vanish.”

Science with the Sense HAT

This free Essentials book from The MagPi team covers all the Sense HAT science basics. You can, for example, learn how to measure gravity.

Cropped cover of Experiment with the Sense HAT book

Our online resource shows you how to record the information your HAT picks up. Next you can analyse and graph your data using Mathematica, which is included for free on Raspbian. This resource walks you through how this software works.
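As a starting point for your own investigations, here is a minimal, illustrative logger that appends Sense HAT readings to a CSV file you can later graph in Mathematica or a spreadsheet. The filename and sampling interval are arbitrary choices.

import csv
import time
from sense_hat import SenseHat

sense = SenseHat()

with open("sense_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "temperature", "humidity", "pressure"])
    while True:
        writer.writerow([
            time.strftime("%Y-%m-%d %H:%M:%S"),
            round(sense.get_temperature(), 2),
            round(sense.get_humidity(), 2),
            round(sense.get_pressure(), 2),
        ])
        f.flush()       # make sure each row reaches the disk
        time.sleep(10)  # one sample every ten seconds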

If you’re seeking inspiration for experiments you can do on our Astro Pis Izzy and Ed on the ISS, check out the winning entries of previous rounds of the Astro Pi challenge.

Thomas Pesquet with Ed and Izzy

Thomas Pesquet with Ed and Izzy

But you can also stick to terrestrial scientific investigations. For example, why not build a weather station and share its data on your own web server or via Weather Underground?

Your code in space!

If you’re a student or an educator in one of the 22 ESA member states, you can get a team together to enter our 2017-18 Astro Pi challenge. There are two missions to choose from, including Mission Zero: follow a few guidelines, and your code is guaranteed to run in space!

The post The possibilities of the Sense HAT appeared first on Raspberry Pi.

Build a Visualization and Monitoring Dashboard for IoT Data with Amazon Kinesis Analytics and Amazon QuickSight

Post Syndicated from Karan Desai original https://aws.amazon.com/blogs/big-data/build-a-visualization-and-monitoring-dashboard-for-iot-data-with-amazon-kinesis-analytics-and-amazon-quicksight/

Customers across the world are increasingly building innovative Internet of Things (IoT) workloads on AWS. With AWS, they can handle the constant stream of data coming from millions of new, internet-connected devices. This data can be a valuable source of information if it can be processed, analyzed, and visualized quickly in a scalable, cost-efficient manner. Engineers and developers can monitor performance and troubleshoot issues, while sales and marketing can track usage patterns and statistics on which to base business decisions.

In this post, I demonstrate a sample solution to build a quick and easy monitoring and visualization dashboard for your IoT data using AWS serverless and managed services. There’s no need for purchasing any additional software or hardware. If you are already using AWS IoT, you can build this dashboard to tap into your existing device data. If you are new to AWS IoT, you can be up and running in minutes using sample data. Later, you can customize it to your needs, as your business grows to millions of devices and messages.

Architecture

The following is a high-level architecture diagram showing the serverless setup to configure.

 

AWS service overview

AWS IoT is a managed cloud platform that lets connected devices interact easily and securely with cloud applications and other devices. AWS IoT can process and route billions of messages to AWS endpoints and to other devices reliably and securely.

Amazon Kinesis Firehose is the easiest way to capture, transform, and load streaming data continuously into AWS from thousands of data sources, such as IoT devices. It is a fully managed service that automatically scales to match the throughput of your data and requires no ongoing administration.

Amazon Kinesis Analytics allows you to process streaming data coming from IoT devices in real time with standard SQL, without having to learn new programming languages or processing frameworks, providing actionable insights promptly.

The processed data is fed into Amazon QuickSight, which is a fast, cloud-powered business analytics service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get business insights from the data.

The most popular way for Internet-connected devices to send data is using MQTT messages. The AWS IoT gateway receives these messages from registered IoT devices. The solution in this post uses device data from AWS Simple Beer Service (SBS), a series of internet-connected kegerators sending sensor outputs such as temperature, humidity, and sound levels in a JSON payload. You can use any existing IoT data source that you may have.

The AWS IoT rules engine allows selecting data from message payloads, processing it, and sending it to other services. You forward the data to a Firehose delivery stream to consolidate the continuous data stream into batches for further processing. The batched data is also stored temporarily in an Amazon S3 bucket for later retrieval and can be set for deletion after a specified time using S3 Lifecycle Management rules.

The incoming data from the Firehose delivery stream is fed into an Analytics application that provides an easy way to process the data in real time using standard SQL queries. Analytics allows writing standard SQL queries to extract specific components from the incoming data stream and perform real-time ETL on it. In this post, you use this feature to aggregate minimum and maximum temperature values from the sensors per minute. You load it in Amazon QuickSight to create a monitoring dashboard and check if the devices are over-heating or cooling down during use. You also extract every device’s location, parameters such as temperature, sound levels, humidity, and the time stamp in Analytics to use on the visualization dashboard.

The processed data from the two queries is fed into two Firehose delivery streams, both of which batch the data into CSV files every minute and store it in S3. The batching time interval is configurable between 1 and 15 minutes in 1-second intervals.

Finally, you use Amazon QuickSight to ingest the processed CSV files from S3 as a data source to build visualizations. Amazon QuickSight’s super-fast, parallel, in-memory, calculation engine (SPICE) parses the ingested data and allows you to create a variety of visualizations with different graph types. You can also use the Amazon QuickSight built-in Story feature to combine visualizations into business dashboards that can be shared in a secure manner.

Implementation

AWS IoT, Amazon Kinesis, and Amazon QuickSight are all fully managed services, which means you can complete the entire setup in just a few steps using the AWS Management Console. You don’t need to set up any underlying hardware or install any additional software, so let’s get started.

Step 1. Set up your AWS IoT data source

Do you currently use AWS IoT? If you have an existing IoT thing set up and running on AWS IoT, you can skip to Step 2.

If you have an AWS IoT button or other IoT devices that can publish MQTT messages and would like to use that for the setup, follow the Getting Started with AWS IoT topic to connect your thing to AWS IoT. Continue to Step 2.

If you do not have an existing IoT device, you can generate simulated device data using a script on your local machine and have it publish to AWS IoT. The following script lets you set up your AWS IoT environment and publish simulated data that mimics device data from Simple Beer Service.

Generate sample data

Running the sbs.py Python script generates fictitious AWS IoT messages from multiple SBS devices. The IoT rule sends the message to Firehose for further processing.

The script requires access to AWS CLI credentials and boto3 installation on the machine running the script. Download and run the following Python script:

https://github.com/awslabs/sbs-iot-data-generator/blob/master/sbs.py

The script generates random data that looks like the following:

{"deviceParameter": "Temperature", "deviceValue": 33, "deviceId": "SBS01", "dateTime": "2017-02-03 11:29:37"}
{"deviceParameter": "Sound", "deviceValue": 140, "deviceId": "SBS03", "dateTime": "2017-02-03 11:29:38"}
{"deviceParameter": "Humidity", "deviceValue": 63, "deviceId": "SBS01", "dateTime": "2017-02-03 11:29:39"}
{"deviceParameter": "Flow", "deviceValue": 80, "deviceId": "SBS04", "dateTime": "2017-02-03 11:29:41"}

Run the script and keep it running for the duration of the project to generate sufficient data.

Tip: If you encounter any issues running the script from your local machine, launch an EC2 instance and run the script there as a root user. Remember to assign an appropriate IAM role to your instance at the time of launch that allows it to access AWS IoT.
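Roughly speaking, the script publishes small JSON messages to an MQTT topic through the AWS IoT data endpoint. Here is a stripped-down, illustrative sketch with boto3 (not the sbs.py script itself); the topic matches the filter used by the IoT rule in Step 3, and the payload shape follows the sample records above.

import datetime
import json
import random
import time
import boto3

iot = boto3.client("iot-data")  # uses your configured AWS credentials and region

while True:
    message = {
        "deviceParameter": "Temperature",
        "deviceValue": random.randint(20, 40),
        "deviceId": "SBS01",
        "dateTime": datetime.datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S"),
    }
    # Topic matches the /sbs/devicedata/# filter used by the IoT rule
    iot.publish(topic="/sbs/devicedata/SBS01", qos=0, payload=json.dumps(message))
    time.sleep(1)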

Step 2. Create three Firehose delivery streams

For this post, you require three Firehose delivery streams:  one to batch raw data from AWS IoT, and two to batch output device data and aggregated data from Analytics.

  1. In the console, choose Firehose.
  2. Create all three Firehose delivery streams using the following field values.

Delivery stream 1:

  • Name: IoT-Source-Stream
  • S3 bucket: <your unique name>-kinesis
  • S3 prefix: source/

Delivery stream 2:

  • Name: IoT-Destination-Data-Stream
  • S3 bucket: <your unique name>-kinesis
  • S3 prefix: data/

Delivery stream 3:

  • Name: IoT-Destination-Aggregate-Stream
  • S3 bucket: <your unique name>-kinesis
  • S3 prefix: aggregate/
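The console steps above are all you need, but if you prefer to script the setup, the same three delivery streams could be created with boto3 along the following lines. The IAM role that allows Firehose to write to your S3 bucket is assumed to already exist; both ARNs below are placeholders.

import boto3

firehose = boto3.client("firehose")
ROLE_ARN = "arn:aws:iam::123456789012:role/firehose-to-s3"  # placeholder
BUCKET_ARN = "arn:aws:s3:::<your unique name>-kinesis"

for name, prefix in [
    ("IoT-Source-Stream", "source/"),
    ("IoT-Destination-Data-Stream", "data/"),
    ("IoT-Destination-Aggregate-Stream", "aggregate/"),
]:
    firehose.create_delivery_stream(
        DeliveryStreamName=name,
        DeliveryStreamType="DirectPut",
        ExtendedS3DestinationConfiguration={
            "RoleARN": ROLE_ARN,
            "BucketARN": BUCKET_ARN,
            "Prefix": prefix,
        },
    )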

Step 3. Set up AWS IoT to receive and forward incoming data

  1. In the console, choose IoT.
  2. Create a new AWS IoT rule with the following field values.
  • Name: IoT_to_Firehose
  • Attribute: *
  • Topic filter: /sbs/devicedata/#
  • Add action: Send messages to an Amazon Kinesis Firehose stream (select IoT-Source-Stream from the dropdown)
  • Select separator: “\n (newline)”
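Again, the console is the quickest route, but for completeness here is a rough boto3 equivalent of the rule above. The role ARN is a placeholder for an IAM role that allows AWS IoT to put records into the Firehose stream.

import boto3

iot = boto3.client("iot")

iot.create_topic_rule(
    ruleName="IoT_to_Firehose",
    topicRulePayload={
        "sql": "SELECT * FROM '/sbs/devicedata/#'",
        "actions": [{
            "firehose": {
                "roleArn": "arn:aws:iam::123456789012:role/iot-to-firehose",  # placeholder
                "deliveryStreamName": "IoT-Source-Stream",
                "separator": "\n",
            }
        }],
    },
)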

A quick check before proceeding further: make sure that you have run the script to generate simulated IoT data or that your IoT Thing is running and delivering data. If not, set it up now. The Amazon Kinesis Analytics application you set up in the next step needs the data to process it further.

Step 4: Create an Analytics application to process data

  1. In the console, choose Kinesis.
  2. Create a new application.
  3. Enter a name of your choice, for example, SBS-IoT-Data.
  4. For the source, choose IoT-Source-Stream.

Analytics auto-discovers the schema on the data by sampling records from the input stream. It also includes an in-built SQL editor that allows you to write standard SQL queries to transform incoming data.

Tip: If Analytics is unable to discover your incoming data, it may be missing the appropriate IAM permissions. In the IAM console, select the role that you assigned to your IoT rule in Step 3. Make sure that it has the ARN of the IoT-Source-Stream Firehose delivery stream listed in the firehose:putRecord section.

Here is a sample SQL query that generates two output streams:

  • DESTINATION_SQL_BASIC_STREAM contains the device ID, device parameter, its value, and the time stamp from the incoming stream.
  • DESTINATION_SQL_AGGREGATE_STREAM aggregates the maximum and minimum values of temperatures from the sensors over a one-minute period from the incoming data.
-- Create an output stream with four columns, which is used to send IoT data to the destination
CREATE OR REPLACE STREAM "DESTINATION_SQL_BASIC_STREAM" (dateTime TIMESTAMP, deviceId VARCHAR(8), deviceParameter VARCHAR(16), deviceValue INTEGER);

-- Create a pump that continuously selects from the source stream and inserts it into the output data stream
CREATE OR REPLACE PUMP "STREAM_PUMP_1" AS INSERT INTO "DESTINATION_SQL_BASIC_STREAM"

-- Filter specific columns from the source stream
SELECT STREAM "dateTime", "deviceId", "deviceParameter", "deviceValue" FROM "SOURCE_SQL_STREAM_001";

-- Create a second output stream with three columns, which is used to send aggregated min/max data to the destination
CREATE OR REPLACE STREAM "DESTINATION_SQL_AGGREGATE_STREAM" (dateTime TIMESTAMP, highestTemp SMALLINT, lowestTemp SMALLINT);

-- Create a pump that continuously selects from a source stream 
CREATE OR REPLACE PUMP "STREAM_PUMP_2" AS INSERT INTO "DESTINATION_SQL_AGGREGATE_STREAM"

-- Extract time in minutes, plus the highest and lowest value of device temperature in that minute, into the destination aggregate stream, aggregated per minute
SELECT STREAM FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE) AS "dateTime", MAX("deviceValue") AS "highestTemp", MIN("deviceValue") AS "lowestTemp" FROM "SOURCE_SQL_STREAM_001" WHERE "deviceParameter"='Temperature' GROUP BY FLOOR("SOURCE_SQL_STREAM_001".ROWTIME TO MINUTE);

Real-time analytics shows the results of the SQL query. If everything is working correctly, you see three streams listed, similar to the following screenshot.

Step 5: Connect the Analytics application to output Firehose delivery streams

You create two destinations for the two delivery streams that you created in the previous step. A single Analytics application can have multiple destinations defined; however, this needs to be set up using the AWS CLI, not from the console. If you do not already have it, install the AWS CLI on your local machine and configure it with your credentials.

Tip: If you are running the IoT script from an EC2 instance, it comes pre-installed with the AWS CLI.

Create the first destination delivery stream 

The AWS CLI command to create a new output Firehose delivery stream is as follows:

aws kinesisanalytics add-application-output --application-name <Name of Analytics Application> --current-application-version-id <number> --application-output 'Name=DESTINATION_SQL_BASIC_STREAM,KinesisFirehoseOutput={ResourceARN=<ARN of IoT-Destination-Data-Stream>,RoleARN=<ARN of Analytics application>,DestinationSchema={RecordFormatType=CSV}}'

Do not copy this into the CLI just yet! Before entering this command, make the following four changes to personalize it:

  • For Name of Analytics Application, enter the value from Step 4, or from the Analytics console.
  • For current-application-version-ID, run the following command:
aws kinesisanalytics describe-application --application-name <application name from above> | grep ApplicationVersionId
  • For ResourceARN, run the following command:
aws firehose describe-delivery-stream --delivery-stream-name IoT-Destination-Data-Stream | grep DeliveryStreamARN
  • For RoleARN, run the following command:
aws kinesisanalytics describe-application --application-name <application name from above> | grep RoleARN

Now, paste the complete command in the AWS CLI and press Enter. If there are any errors, the response provides details. If everything goes well, a new destination delivery stream is created to send the first query (DESTINATION_SQL_BASIC_STREAM) to IoT-Destination-Data-Stream.

Create the second destination delivery stream

Following similar steps as above, create a second destination Firehose delivery stream with the following changes:

  • For Name of Analytics Application, enter the same name as the first delivery stream.
  • For current-application-version-ID, increment by 1 from the previous value (unless you made other changes in between these steps). If unsure, run the same command as above to get it again.
  • For ResourceARN, get the value by running the following CLI command:
aws firehose describe-delivery-stream --delivery-stream-name IoT-Destination-Aggregate-Stream | grep DeliveryStreamARN
  • For RoleArn, enter the same value as the first stream.

Run the aws kinesisanalytics CLI command, similar to the previous step but with the new parameters substituted. This creates the second output Firehose destination delivery stream.

Update the IAM role for Analytics to allow writing to both output streams.

  1. In the console, choose IAM, Roles.
  2. Select the role that you created with Analytics in Step 4.
  3. Choose Policy, JSON, and Edit.
  4. Find "Sid": "WriteOutputFirehose" in the JSON document, go to the "Resource" section, and make sure that it includes the Resource ARNs of both streams that you found in the previous step.
  5. If it has only one ARN, add the second ARN and choose Save.

This completes the Amazon Kinesis setup. The incoming IoT data is processed by Analytics and delivered, using two output delivery streams, to two separate folders in your S3 bucket.

Step 6: Set up Amazon QuickSight to analyze the data

To build the visualization dashboard, ingest the processed CSV files from the S3 bucket into Amazon QuickSight.

  1. In the console, choose QuickSight.
  2. If this is your first time using Amazon QuickSight, you are asked to create a new account. Follow the prompts.
  3. When you are logged in to your account, choose New Analysis and enter a name of your choice.
  4. Choose New data set for the analysis or, if you have previously imported your data set, select one from the available data sets.
  5. You import two data sets: one with general device parameters information, and the other with aggregates of maximum and minimum temperatures for monitoring. For the first data set, choose S3 from the list of available data sources and enter a name, for example, IoT Device Data.
  6. The location of the S3 bucket and the objects to use are provided to Amazon QuickSight as a manifest file. Create a new manifest file following the supported formats for Amazon S3 manifest files.
  7. In the URIPrefixes section, provide your appropriate S3 bucket and folder location for the general device data. Hint: it should include <your unique name>-kinesis/data/.

Your manifest file should look similar to the following:

{
    "fileLocations": [
        {"URIPrefixes": ["https://s3.amazonaws.com/<YOUR_BUCKET_NAME>/data/<YEAR>/<MONTH>/<DATE>/<HOUR>/"]}
    ],
    "globalUploadSettings": {
        "format": "CSV",
        "delimiter": ","
    }
}

Amazon QuickSight imports and parses the data set, and provides available data fields that can be used for making graphs. The Edit/Preview data button allows you to format and transform the data, change data types, and filter or join your data. Make sure that the columns have the correct titles. If not, you can edit them and then save.

Tip: choose the downward arrow on the top right and unselect Files include headers to give each column appropriate headers. Choose Save. This takes you back to the data sets page.

Follow the same steps as above to import the second data set. This time, your manifest should include your aggregate data set folder on S3, which is named <your unique name>-kinesis/aggregate/. Update headers if necessary and choose Save & visualize.

Build an analysis

The visualization screen shows the data set that you last imported, which in this case is the aggregate data. To include the general device data as well, for Fields on the top left, choose Edit analysis data sets. Choose Add data set and select the other data set that you saved earlier.

Now both data sets are available on the analysis screen. For Visual Types at bottom left, select the type of graph to make. For Fields, select the fields to visualize. For example, drag Device ID, Device Parameter, and Value to Field wells, as shown in the screenshot below, to generate a visualization of average parameter values compared across devices.

You can create another visual by choosing +Add. This time, select a line graph to show monitoring of the maximum temperature values of the sensors in any minute, from the aggregate data set.

If you would like to create an interactive story to present to your team or organization, you can choose the Story option on the left panel. Create a dashboard with multiple visualizations, to save and share securely with the intended audience. An example of a story is shown below.

Conclusion

Any data is valuable only when it can be actually put to use. In this post, you’ve seen how it’s possible to quickly build a simple Analytics application to ingest, process, and visualize IoT data in near real time entirely using AWS managed services. This solution is scalable and reliable, and costs a fraction of other business intelligence solutions. It is easy enough that anyone with an AWS account can build and use it without any special training.

If you have any questions or suggestions, please comment below.


About the Author

Karan Desai is a Solutions Architect with Amazon Web Services. He works with startups and small businesses in the US, helping them adopt cloud technology to build scalable and secure solutions using AWS. In his spare time, he likes to build personal IoT projects, travel to offbeat places and write about it.

 

 



Open source energy monitoring using Raspberry Pi

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/open-source-energy-monitoring-raspberry-pi/

OpenEnergyMonitor, who make open-source tools for energy monitoring, have been using Raspberry Pi since we launched in 2012. Like Raspberry Pi, they manufacture their hardware in Wales and send it to people all over the world. We invited co-founder Glyn Hudson to tell us why they do what they do, and how Raspberry Pi helps.

Hi, I’m Glyn from OpenEnergyMonitor. The OpenEnergyMonitor project was founded out of a desire for open-source tools to help people understand and relate to their use of energy, their energy systems, and the challenge of sustainable energy.

Photo: an emonPi energy monitoring unit in an aluminium case with an aerial and an LCD display, a mobile phone showing daily energy use as a histogram, and a bunch of daffodils in a glass bottle

The next 20 years will see a revolution in our energy systems, as we switch away from fossil fuels towards a zero-carbon energy supply.

By using energy monitoring, modelling, and assessment tools, we can take an informed approach to determine the best energy-saving measures to apply. We can then check to ensure solutions achieve their expected performance over time.

We started the OpenEnergyMonitor project in 2009, and the first versions of our energy monitoring system used an Arduino with Ethernet Shield, and later a Nanode RF with an embedded Ethernet controller. These early versions were limited by a very basic TCP/IP stack; running any sort of web application locally was totally out of the question!

I can remember my excitement at getting hold of the very first version of the Raspberry Pi in early 2012. Within a few hours of tearing open the padded envelope, we had Emoncms (our open-source web logging, graphing, and visualisation application) up and running locally on the Raspberry Pi. The Pi quickly became our web-connected base station of choice (emonBase). The following year, 2013, we launched the RFM12Pi receiver board (now updated to RFM69Pi). This allowed the Raspberry Pi to receive data via low-power 433 MHz RF from our emonTx energy monitoring unit, and later from our emonTH remote temperature and humidity monitoring node.

Diagram: communication between OpenEnergyMonitor monitoring units, base station and web interface

In 2015 we went all-in with Raspberry Pi when we launched the emonPi, an all-in-one Raspberry Pi energy monitoring unit, via Kickstarter. Thanks to the hard work of the Raspberry Pi Foundation, the emonPi has enjoyed several upgrades: extra processing power from the Raspberry Pi 2, then even more power and integrated wireless LAN thanks to the Raspberry Pi 3. With all this extra processing power, we have been able to build an open software stack including Emoncms, MQTT, Node-RED, and openHAB, allowing the emonPi to function as a powerful home automation hub.
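If you’d like to poke at the data an emonPi publishes, a rough sketch with the paho-mqtt Python package looks like the following. The hostname, credentials, and topic layout are assumptions (and the callback style matches paho-mqtt 1.x), so check your own emonPi and Emoncms configuration.

import paho.mqtt.client as mqtt

EMONPI_HOST = "emonpi.local"  # placeholder hostname
TOPIC = "emon/#"              # assumed topic root for emonPi feeds

def on_connect(client, userdata, flags, rc):
    print("Connected with result code", rc)
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.username_pw_set("USERNAME", "PASSWORD")  # your emonPi MQTT credentials
client.on_connect = on_connect
client.on_message = on_message
client.connect(EMONPI_HOST, 1883, 60)
client.loop_forever()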

Screenshot: Emoncms Apps interface to emonPi home automation hub, with histogram of daily electricity use

Emoncms Apps interface to emonPi home automation hub

Inspired by the Raspberry Pi Foundation, we manufacture and assemble our hardware in Wales, UK, and ship worldwide via our online store.

All of our work is fully open source. We believe this is a better way of doing things: we can learn from and build upon each other’s work, creating better solutions to the challenges we face. Using Raspberry Pi has allowed us to draw on the expertise and work of many other projects. With lots of help from our fantastic community, we have built an online learning resource section of our website to help others get started: it covers things like basic AC power theory, Arduino, and the bigger picture of sustainable energy.

To learn more about OpenEnergyMonitor systems, take a look at our Getting Started User Guide. We hope you’ll join our community.

The post Open source energy monitoring using Raspberry Pi appeared first on Raspberry Pi.

Sense HAT Emulator Upgrade

Post Syndicated from David Honess original https://www.raspberrypi.org/blog/sense-hat-emulator-upgrade/

Last year, we partnered with Trinket to develop a web-based emulator for the Sense HAT, the multipurpose add-on board for the Raspberry Pi. Today, we are proud to announce an exciting new upgrade to the emulator. We hope this will make it even easier for you to design amazing experiments with the Sense HAT!

What’s new?

The original release of the emulator didn’t fully support all of the Sense HAT features. Specifically, the movement sensors were not emulated. Thanks to funding from the UK Space Agency, we are delighted to announce that a new round of development has just been completed. From today, the movement sensors are fully supported. The emulator also comes with a shiny new 3D interface, Astro Pi skin mode, and Pygame event handling. Click the ▶︎ button below to see what’s new!

Upgraded sensors

On a physical Sense HAT, real sensors react to changes in environmental conditions like fluctuations in temperature or humidity. The emulator has sliders which are designed to simulate this. However, emulating the movement sensor is a bit more complicated. The upgrade introduces a 3D slider, which is essentially a model of the Sense HAT that you can move with your mouse. Moving the model affects the readings provided by the accelerometer, gyroscope, and magnetometer sensors.

Code written in this emulator is directly portable to a physical Raspberry Pi and Sense HAT without modification. This means you can now develop and test programs using the movement sensors from any internet-connected computer, anywhere in the world.
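In other words, a movement-sensor program like the illustrative one below runs unchanged in the emulator and on a physical Sense HAT.

from sense_hat import SenseHat  # same import in the web emulator and on a real Pi
# (the separate desktop emulator on Raspbian uses `from sense_emu import SenseHat`)
import time

sense = SenseHat()

while True:
    o = sense.get_orientation()  # pitch, roll, and yaw in degrees
    print("pitch=%.1f roll=%.1f yaw=%.1f" % (o["pitch"], o["roll"], o["yaw"]))
    time.sleep(0.5)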

Astro Pi mode

Astro Pi is our series of competitions offering students the chance to have their code run in space! The code is run on two space-hardened Raspberry Pi units, with attached Sense HATs, on the International Space Station.

Image of Astro Pi unit Sense HAT emulator upgrade

Astro Pi skin mode

There are a number of practical things that can catch you out when you are porting your Sense HAT code to an Astro Pi unit, though, such as the orientation of the screen and joystick. Just as having a 3D-printed Astro Pi case enables you to discover and overcome these, so does the Astro Pi skin mode in this emulator. In the bottom right-hand panel, there is an Astro Pi button which enables the mode: click it again to go back to the Sense HAT.

The joystick and push buttons are operated by pressing your keyboard keys: use the cursor keys and Enter for the joystick, and U, D, L, R, A, and B for the buttons.

Sense HAT resources for Code Clubs

Image of gallery of Code Club Sense HAT projects Sense HAT emulator upgrade

Click the image to visit the Code Club projects page

We also have a new range of Code Club resources which are based on the emulator. Of these, three use the environmental sensors and two use the movement sensors. The resources are an ideal way for any Code Club to get into physical computing.

The technology

The 3D models in the emulator are represented entirely with HTML and CSS. “This project pushed the Trinket team, and the 3D web, to its limit,” says Elliott Hauser, CEO of Trinket. “Our first step was to test whether pure 3D HTML/CSS was feasible, using Julian Garnier’s Tridiv.”

Sense HAT 3D image mockup Sense HAT emulator upgrade

The Trinket team’s preliminary 3D model of the Sense HAT

“We added JavaScript rotation logic and the proof of concept worked!” Elliott continues. “Countless iterations, SVG textures, and pixel-pushing tweaks later, the finished emulator is far more than the sum of its parts.”

Sense HAT emulator 3d image final version Sense HAT emulator upgrade

The finished Sense HAT model: doesn’t it look amazing?

Check out this blog post from Trinket for more on the technology and mathematics behind the models.

One of the compromises we’ve had to make is browser support. Unfortunately, browsers like Firefox and Microsoft Edge don’t fully support this technology yet. Instead, we recommend that you use Chrome, Safari, or Opera to access the emulator.

Where do I start?

If you’re new to the Sense HAT, you can simply copy and paste many of the code examples from our educational resources, like this one. Alternatively, you can check out our Sense HAT Essentials e-book. For a complete list of all the functions you can use, have a look at the Sense HAT API reference here.

The post Sense HAT Emulator Upgrade appeared first on Raspberry Pi.

Tableau: a generative music album based on the Sense HAT

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/tableau-generative-music-album/

Multi-talented maker Giorgio Sancristoforo has used a Raspberry Pi and Sense HAT to create Tableau, a generative music album. It’s an innovative idea: the music constantly evolves as it reacts to environmental stimuli like atmospheric pressure, humidity, and temperature.

Tableau Generative Album

“There is no doubt that, as music is removed by the phonograph record from the realm of live production and from the imperative of artistic activity and becomes petrified, it absorbs into itself, in this process of petrification, the very life that would otherwise vanish.”

Creating generative music

“I’ve been dreaming about using portable microcomputers to create a generative music album,” explains Giorgio. “Now my dream is finally a reality: this is my first portable generative LP (PGLP)”. Tableau uses both a Raspberry Pi 2 and a Sense HAT: the HAT provides the data for the album’s musical evolution via its range of onboard sensors.

Image of Tableau generative music device with Sense HAT illuminated

Photo credit: Giorgio Sancristoforo

The Sense HAT was originally designed for use aboard the International Space Station (ISS) as part of the ongoing Astro Pi challenge. It has, however, become a staple within the Raspberry Pi maker community. This is partly thanks to the myriad of possibilities offered by its five onboard sensors, five-button joystick, and 8 × 8 LED matrix.

Image of Tableau generative music device with Sense HAT illuminated

Photo credit: Giorgio Sancristoforo

Limited edition

The final release of Tableau consists of a limited edition of fifty PGLPs: each is set up to begin playing immediately power is connected, and the music will continue to evolve indefinitely. “Instead of being reproduced as on a CD or in an MP3 file, the music is spontaneously generated and arranged while you are listening to it,” Giorgio explains on his website. “It never sounds the same. Tableau creates an almost endless number of mixes of the LP (4 × 12 factorial). Each time you will listen, the music will be different, and it will keep on evolving until you switch the power off.”

Image of Tableau generative music device with Sense HAT illuminated

Photo credit: Giorgio Sancristoforo

Experiment with the Sense HAT

What really interests us is how the sound of Tableau might alter in different locations. Would it sound different in Cambridge as opposed to the deserts of Mexico? What about Antarctica versus the ISS?

If Giorgio’s project has piqued your interest, why not try using our free data logging resource for the Sense HAT? You can use it to collect information from the HAT’s onboard sensors and create your own projects. How about collecting data over a year, and transforming this into your own works of art?

Even if you don’t have access to the Sense HAT, you can experience it via the Sense HAT desktop emulator. This is a great solution if you want to work on Sense HAT-based projects in the classroom, as it reduces the amount of hardware you need.

If you’ve already built a project using the Sense HAT, make sure to share it in the comments below. We would love to see what you have been making!

 

The post Tableau: a generative music album based on the Sense HAT appeared first on Raspberry Pi.

NeoPixel Temperature Stair Lights

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/neopixel-temperature-stair-lights/

Following a post-Christmas decision to keep illuminated decorations on her stairway bannister throughout the year, Lorraine Underwood found a new purpose for a strip of NeoPixels she had lying around.

Lorraine Underwood on Twitter

Changed the stair lights from a string to a strip & they look awesome! #neopixel #raspberrypi https://t.co/dksLwy1SE1

Simply running the lights up the stairs, blinking and flashing to a random code, wasn’t enough for her. By using an API to check the outdoor weather, Lorraine’s lights went from decorative to informative: they now give an indication of outside weather conditions through their colour and the quantity illuminated.

“The idea is that more lights will light up as it gets warmer,” Lorraine explains. “The temperature is checked every five minutes (I think that may even be a little too often). I am looking forward to walking downstairs to a nice warm yellow light instead of the current blue!”

In total, Lorraine had 240 lights in the strip; she created a chart mapping a range of outside temperatures to the quantity of lights lit for each value, as well as specifying the colour of those lights, running from chilly blue through to scorching red.

Lorraine Underwood Neopixel stair way lights

Oh, Lorraine! We love your optimistic dreams of the British summer being more than its usual rainy 16 Celsius…

The lights are controlled by a Raspberry Pi Zero running code that can be found on Lorraine’s blog. The code dictates which lights are lit and when.
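Lorraine’s own code is the place to start, but purely as an illustration of the idea, here is a rough sketch that maps an outside temperature onto how many pixels light up, assuming the Adafruit CircuitPython NeoPixel library and a 240-LED strip on GPIO 18. The temperature range and colours here are made up, not Lorraine’s values.

import board
import neopixel

NUM_PIXELS = 240
pixels = neopixel.NeoPixel(board.D18, NUM_PIXELS, auto_write=False)

def show_temperature(temp_c, min_c=-5, max_c=30):
    # Light a proportion of the strip: blue when cold, shading to red when hot
    fraction = max(0.0, min(1.0, (temp_c - min_c) / (max_c - min_c)))
    lit = int(fraction * NUM_PIXELS)
    colour = (int(255 * fraction), 0, int(255 * (1 - fraction)))
    pixels.fill((0, 0, 0))
    for i in range(lit):
        pixels[i] = colour
    pixels.show()

show_temperature(16)  # the usual British summer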

Lorraine Underwood Neopixel stair way lights

“Do I need a coat today? I’ll check the stairs.”

Lorraine is planning some future additions to the build, including a toddler-proof 3D housing, powering the Zero from the lights’ power supply, and gathering her own temperature data instead of relying on a third-party API.

While gathering the temperature data from outside her house, she may also want to look into building an entire weather station, collecting extra data on rain, humidity, and wind conditions. After all, this is the UK: just because it’s hot outside, it doesn’t mean it’s not also raining.

The post NeoPixel Temperature Stair Lights appeared first on Raspberry Pi.

Raspberry Pi at Scouts Wintercamp

Post Syndicated from Ben Nuttall original https://www.raspberrypi.org/blog/raspberry-pi-at-scouts-wintercamp/

As well as working with classroom teachers and supporting learning in schools, Raspberry Pi brings computing and digital making to educators and learners in all sorts of other settings. I recently attended Wintercamp, a camp for Scouts at Gilwell Park. With some help from Brian and Richard from Vodafone, I ran a Raspberry Pi activity space introducing Scouts to digital making with Raspberry Pi, using the Sense HAT, the Camera Module, and GPIO, based on some of our own learning resources.

Ben Nuttall on Twitter

Today I’m running @Raspberry_Pi activities for @UKScouting at @gpwintercamp with @VodafoneUK!

Note the plastic sheeting on the floor! Kids were dropping into our sessions all day with muddy boots, having taken part in all sorts of fun activities, indoors and out.

Ben Nuttall on Twitter

@gpwintercamp

In the UK, the Scouts have Digital Citizen and Digital Maker badges, and we’re currently working with the Scout Association to help deliver content for the Digital Maker badge, as supported by the Vodafone Foundation.

The activities we ran were just a gentle introduction to creative tech and experimenting with sensors, but they went down really well, and many of the participants felt happy to move beyond the worksheets and try out their own ideas. We set challenges, and got them to think about how they could incorporate technology like this into their Scouting activities.

Having been through the Scouting movement myself, it’s amazing to be involved in showing young people how technology can be applied to projects related to their other hobbies and interests. I loved introducing the Scouts to the idea that programming and making can be tools to help solve problems that are relevant to them and to others in their communities, enabling them to do some good in the world and to be creative.

Scouts coding

Ben Nuttall on Twitter

Can you breathe on the Sense HAT to make the humidity read 90?” “That’s cool. It makes you light-headed…
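If you want to try that challenge at home, a minimal sketch using the standard sense_hat library might look like the following; the 90% target is just the figure from the tweet.

# Hedged sketch of the "breathe on the Sense HAT" challenge from the session.
from sense_hat import SenseHat

sense = SenseHat()
TARGET = 90  # % relative humidity, as in the challenge above

while True:
    humidity = sense.get_humidity()
    if humidity >= TARGET:
        sense.show_message("You did it!")
        break
    sense.show_message("%.0f%%" % humidity, scroll_speed=0.05)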

While conducting a survey of Raspberry Jam organisers recently, I discovered that a high proportion of those who run Jams are also involved in other youth organisations. Many were Scout leaders. Other active Pi community folk happen to be involved in Scouting too, like Brian and Richard, who helped out at the weekend, and who are Scout and Cub leaders. I’m interested to speak to anyone in the Pi community who has an affiliation with the Scouts to share ideas on how they think digital making can be incorporated in Scouting activities. Please do get in touch!

Ben Nuttall on Twitter

Not a great picture but the Scouts made a Fleur de Lys on the Sense HAT at @gpwintercamp



The timing is perfect for young people in this age group to get involved with digital making, as we’ve just launched our first Pioneers challenge. There’s plenty of scope there for outdoor tech projects.

Thanks to UK Scouting and the Wintercamp team for a great weekend. Smiles all round!

The post Raspberry Pi at Scouts Wintercamp appeared first on Raspberry Pi.

Sense HAT emulator

Post Syndicated from David Honess original https://www.raspberrypi.org/blog/sense-hat-emulator/

Over the last few months, we’ve been working with US-based startup Trinket to develop a web-based emulator for the Sense HAT, the multipurpose add-on board for the Raspberry Pi which is also the core component of the Astro Pi units on the International Space Station. We wanted to provide a unique, free learning resource that brings the excitement of programming our space-qualified hardware to students, teachers, and others all over the world.

We’re delighted to announce its release today, and you can try it for yourself right now. Click the Run button below and see what happens!

trinket-logo

The emulator will allow more people to participate in future Astro Pi competitions – you’ll be able to join in without needing to own a Raspberry Pi computer or a Sense HAT.

British ESA Astronaut Tim Peake with an Astro Pi unit on the International Space Station

British ESA Astronaut Tim Peake with the Astro Pi. Image credit ESA

The new emulator builds on Trinket’s existing Python-in-browser platform, and provides the following features:

  • Virtual Sense HAT with environmental controls and joystick input
  • Full Python syntax highlighting
  • Contextual auto-complete
  • Intuitive error reporting and highlighting
  • Image upload
  • HTML page embedding
  • Social media integration
  • Project sharing via direct URL
  • Project download as zip (for moving to Raspberry Pi)
  • All major browsers supported

sense_hat_emu

The Sense HAT has temperature, pressure and humidity sensors, and can change its behaviour according to the values they report. The Sense HAT emulator has sliders you can move to change these values, so you can test how your code responds to environmental variables.

Part of a screenshot of the Astro Pi emulator, showing three sliders with buttons that can be dragged to change the temperature, pressure and humidity that the virtual Sense HAT's sensors are reporting

You can move the sliders to change what the sensors are reporting
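As an illustration of the kind of program the sliders are useful for, here’s a small made-up example (not one of the official resources); the 30°C threshold is arbitrary, so drag the temperature slider past it and watch the matrix change.

# Hedged example: turn the LED matrix red when it gets hot, blue otherwise.
from sense_hat import SenseHat

sense = SenseHat()

while True:
    temperature = sense.get_temperature()
    if temperature > 30:
        sense.clear(255, 0, 0)   # red
    else:
        sense.clear(0, 0, 255)   # blue

The same file runs unchanged on a physical Sense HAT, which is exactly the point of the emulator.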

Code written in this emulator is directly portable to a physical Raspberry Pi with a Sense HAT without modification. This means any code you write can be run by the Astro Pi units on board the ISS! It is our hope that, within the next 12 months, code that has been written in the emulator will run in space. Look out for news on this, coming soon on the Astro Pi site!

We owe huge thanks to Trinket, who have been wonderful partners in this project. The development work has been completed in just over two months, and has been a huge collaborative effort from the start. The software relies heavily on open-source technology and a global community of developers who are committed to making the power of code more accessible to students.

A closed group of beta testers, made up of previous Astro Pi participants and Code Club champions, has been putting the emulator through its paces over recent weeks. We’re proud to say that we’ve just had a bug-free open beta over the weekend, and now we’re looking forward to seeing it used as widely as possible.

So, where do you start? If you’re new to the Sense HAT, you can just copy and paste a lot of the code examples from our educational resources like this one. You can also check out our e-book Sense HAT Essentials. For a complete list of all the functions you can use, have a look at the Sense HAT API reference here; please note that the IMU (movement-sensing) functions will be supported in a future update. Head over to the main Sense HAT emulator site to see loads of other cool examples of what’s possible. Flappy LED, anyone?

Don’t forget to share your projects!

The post Sense HAT emulator appeared first on Raspberry Pi.

Nine-year-old inventor’s award-winning asthma monitor

Post Syndicated from Liz Upton original https://www.raspberrypi.org/blog/nine-year-old-inventors-asthma-monitor/

We keep a very close eye on the annual Tech4Good competition, and especially the children who are nominated for their BT Young Pioneer award; there are some fiercely smart kids there doing some hugely impressive work. This year’s field was very close (I would not like to have been judging – there were some extraordinary projects presented).

Tech4Good award winners 2016

Tech4Good award winners 2016

Arnav Sharma, nine years old, was the Winner of Winners as well as the winner of the Young Pioneer section with this asthma monitor, which runs on Raspberry Pi. Arnav started by learning about the causes and effects of asthma, and thought about ways to help patients. He discovered that asthma is hard to diagnose, but can be fatal if left undetected. This leads to many children being over-diagnosed and over-medicated; inhalers are often given as treatment to reduce the symptoms of asthma, but come with side-effects like reduced growth and immunity. Arnav discovered that the best way to manage asthma is to prevent attacks by understanding what triggers asthma attacks and following a treatment plan.

Asthma Pi

AsthmaPi

Arnav’s AsthmaPi uses a Raspberry Pi, a Sense HAT, an MQ-135 Gas Sensor, a Sharp Optical Dust Sensor and an Arduino Uno. The sensors on the Sense HAT are used to measure temperature and humidity, while the MQ gas sensor detects nitrogen compounds, carbon dioxide, cigarette smoke, smog, ammonia and alcohol, all known asthma triggers. The dust sensor measures the size of dust particles and their density. The AsthmaPi is programmed in Python and C++, and triggers email and SMS text message alerts to remind the owner to take medication and to go for review visits.
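Arnav’s full code isn’t reproduced here, but the alerting idea can be sketched with the Sense HAT and Python’s standard smtplib. The thresholds, addresses, credentials, and SMTP server below are all placeholders, not Arnav’s values.

# Hedged sketch of an environmental email alert in the spirit of AsthmaPi.
import smtplib
from email.message import EmailMessage
from sense_hat import SenseHat

sense = SenseHat()
HUMIDITY_LIMIT = 70   # % relative humidity (placeholder threshold)
TEMP_LIMIT = 28       # degrees C (placeholder threshold)

humidity = sense.get_humidity()
temperature = sense.get_temperature()

if humidity > HUMIDITY_LIMIT or temperature > TEMP_LIMIT:
    msg = EmailMessage()
    msg["Subject"] = "AsthmaPi alert"
    msg["From"] = "pi@example.com"
    msg["To"] = "parent@example.com"
    msg.set_content("Humidity %.0f%%, temperature %.1f C - check triggers."
                    % (humidity, temperature))
    with smtplib.SMTP("smtp.example.com", 587) as server:
        server.starttls()
        server.login("user", "password")  # placeholder credentials
        server.send_message(msg)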

Here’s Arnav’s very impressive project video, which will walk you through what he’s put together, and how it all works.

AsthmaPi Asthma Management Kit Arnav, Asthma, Allergy, Raspberry Pi, Dust Sensor, Gas Sensor

This is the video demo for the AsthmaPi: An affordable asthma management kit made by Arnav Sharma, aged 9, finalist of Tech For Good competition. Please tweet him at #T4GArnavSharma or visit his page here http://www.tech4goodawards.com/finalist/arnav-sharma/ or vote for him at http://www.tech4goodawards.com/peoples-award/ Thank you.

Well done Arnav!

The post Nine-year-old inventor’s award-winning asthma monitor appeared first on Raspberry Pi.

Forums project: humidity-controlled cellar ventilation

Post Syndicated from Liz Upton original https://www.raspberrypi.org/blog/forums-project-humidity-controlled-cellar-ventilation/

The Raspberry Pi official forums are the central online meeting place for the Raspberry Pi community. They’re where you’ll find support from hundreds of thousands (141,183, as of this morning) of other Pi users, including people from our own engineering team; lots of inspiration for your own projects, and loads of advice. You can chat to a selection of those of us who work at Raspberry Pi there too – we’re usually poking around in there for part of the day.

Commence to poking

Commence to poking.

We found this rather brilliant hack to ventilate and maintain a cellar’s humidity on the forums. Forum member DasManul, from Frankfurt, put this together to measure temperature and relative humidity inside and outside his cellar, and to use those values to calculate absolute humidity. The setup then ventilates the space if the humidity inside is higher than it is outside.

On reading what he was up to, I assumed DasManul was looking after a cellarful of wine. Then I saw his pictures. He’s actually tending bottles of fabric softener and yoghurt.

installation

Nestled next to the nonalcoholic liquids, you’ll find a touchscreen controller for the system, along with a USB receiver. Here’s a closer look at the display:

touchscreen

(For non-German speakers, DasManul says: “Keller – cellar, Aussen – outside, TP (Taupunkt) – dew point, RF/AF (Relative/Absolute Feuchte) – relative/absolute humidity, Lüfter – fan, An/Aus – on/off”.)

The system also outputs more detailed graphs (daily, weekly, or monthly) to a website served by nginx, which allows you to control the system remotely if you don’t happen to be down in the cellar conditioning your fabric.

1Mavg 1w

DasManul says that he’s not much for hardware tinkering, and didn’t want to start drilling into his house’s infrastructure, so he used off-the-shelf parts for sensing and controlling. Two inexpensive wireless sensors, one indoors and one outdoors, from elv.de, do all the work checking the humidity; they feed information to the USB receiver, and intake and exhaust fans are controlled with an Energenie plug strip. (These things are great – I use an Energenie plug strip to turn lamps on and off via a remote PIR sensor in my living room).
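The interesting bit is the comparison itself: converting each sensor’s temperature and relative humidity into absolute humidity, so that inside and outside air can be compared fairly before deciding whether ventilating will actually dry the cellar out. Here’s a rough sketch of that calculation using the common Magnus approximation; it isn’t DasManul’s actual code (that’s linked below), and the sensor readings and fan switching are stubbed out.

# Hedged sketch of the ventilation decision using the Magnus approximation.
# Sensor reads and fan switching are stubbed - DasManul uses ELV wireless
# sensors and an Energenie plug strip for those parts.
import math

def absolute_humidity(temp_c, rel_humidity):
    # Approximate absolute humidity in g/m^3 from temperature and RH
    saturation_hpa = 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))
    vapour_hpa = saturation_hpa * rel_humidity / 100.0
    return 216.7 * vapour_hpa / (273.15 + temp_c)

inside = absolute_humidity(14.0, 75.0)    # example cellar reading
outside = absolute_humidity(8.0, 60.0)    # example outdoor reading

# Only ventilate when the outside air holds less water than the cellar air.
if inside > outside:
    print("Fans on")    # e.g. switch the Energenie sockets on
else:
    print("Fans off")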

DasManul has made all the code available (with German and English documentation) over at BitBucket so you can replicate the project. There’s plenty more like this over at the Raspberry Pi Forums – get stuck in!


The post Forums project: humidity-controlled cellar ventilation appeared first on Raspberry Pi.