Tag Archives: AWS IoT

A Data Sharing Platform Based on AWS Lambda

Post Syndicated from Bryan Liston original https://aws.amazon.com/blogs/compute/a-data-sharing-platform-based-on-aws-lambda/

Julien Lepine
Solutions Architect

As developers, one of our top priorities is to build reliable systems; this is a core pillar of the AWS Well-Architected Framework. A common pattern for achieving this goal is an architecture built around loosely coupled components.

Amazon Kinesis Streams offers an excellent answer for this, as the events generated can be consumed independently by multiple consumers and remain available for 24 hours by default, extendable to 7 days. You can build an Amazon Kinesis consumer application with the Amazon Kinesis Client Library (KCL) or through native integration with AWS Lambda.

As I spoke with other developers and customers about their use of Amazon Kinesis, a few common patterns came up. This post addresses those patterns.

Protecting streams

Amazon Kinesis has made the implementation of event buses easy and inexpensive, so that applications can send meaningful information to their surrounding ecosystem. As your applications grow and get more usage within your company, more teams will want to consume the data generated, possibly even external parties such as business partners or customers.

When the applications get more usage, some concerns may arise:

  • When a new consumer starts (or re-starts after some maintenance), it needs to read a lot of data from the stream (its backlog) in a short amount of time in order to get up to speed
  • A customer may start many consumers at the same time, reading a lot of events in parallel or having a high call rate to Amazon Kinesis
  • A consumer may have an issue (such as infinite loop, retry error) that causes it to call Amazon Kinesis at an extremely high rate

These cases may lead to a depletion of the resources available in your stream, and that could potentially impact all your consumers.

Managing the increased load can be done by leveraging the scale-out model of Amazon Kinesis: adding shards to an existing stream. Each shard adds both input (ingestion) and output (consumption) capacity to your stream:

  • 1,000 records and up to 1 MB per second for ingesting events
  • 5 read transactions and up to 2 MB per second for consuming events

You could avoid these scenarios by scaling out your streams and provisioning for peak, but that would create inefficiencies and may not even fully protect your consumers from the behavior of others.
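If you do decide to scale out, each open shard can be split at the midpoint of its hash-key range with the SplitShard API. The following is a rough sketch, not part of the original post, assuming the AWS SDK for JavaScript v2 and a Node.js runtime with BigInt support; the stream name is supplied by the caller:

```javascript
// Midpoint of a shard's hash-key range, as a decimal string.
function midpointHashKey(start, end) {
  return ((BigInt(start) + BigInt(end)) / 2n).toString();
}

// Split every open shard in two, doubling the stream's shard count.
async function doubleShards(streamName) {
  const AWS = require('aws-sdk'); // deferred so the helper above stays SDK-free
  const kinesis = new AWS.Kinesis();
  const { StreamDescription } = await kinesis
    .describeStream({ StreamName: streamName }).promise();
  for (const shard of StreamDescription.Shards) {
    // Closed shards (from earlier splits/merges) carry an ending sequence number.
    if (shard.SequenceNumberRange.EndingSequenceNumber) continue;
    await kinesis.splitShard({
      StreamName: streamName,
      ShardToSplit: shard.ShardId,
      NewStartingHashKey: midpointHashKey(
        shard.HashKeyRange.StartingHashKey,
        shard.HashKeyRange.EndingHashKey),
    }).promise();
  }
}
```

Remember that each new shard adds to the hourly cost of the stream, so splitting everything is only worthwhile when the whole stream is hot.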

What becomes apparent in these cases is the impact that a single failing consumer may have on all other consumers, a symptom known as the “noisy neighbor” problem. The key is to manage the blast radius of your system by limiting the impact that a single consumer can have on others.

A solution is to compartmentalize your platform: this method consists of creating multiple streams and then creating groups of consumers that share the same stream. This gives you the possibility to limit the impact a single consumer can have on its neighbors, and potentially to propose a model where some customers have a dedicated stream.

You can build an Amazon Kinesis consumer application (via the KCL or Lambda) that reads a source stream and sends the messages to the “contained” streams that the actual consumers will use.

Transforming streams

Another use case I see from customers is the need to transfer the data in their stream to other services:

  • Some applications may have limitations in their ability to receive or process the events
  • They may not have connectors to Amazon Kinesis, and only support Amazon SQS
  • They may only support a push model, where their APIs need to be called directly when a message arrives
  • Some analytics/caching/search may be needed on the events generated
  • Data may need to be archived or sent to a data warehouse engine

There are many other cases, but the core need is having the ability to get the data from Amazon Kinesis into other platforms.

The solution for these use cases is to build an Amazon Kinesis consumer application that reads a stream and prepares these messages for other services.

Sharing data with external parties

The final request I have seen is the possibility to process a stream from a different AWS account or region. While you can give access to your resources to an external AWS account through cross-account IAM roles, that feature requires development and is not supported natively by some services. For example, you cannot subscribe a Lambda function to a stream in a different AWS account or region.

The solution is to replicate the Amazon Kinesis stream or the events to another environment (AWS account, region, or service).

This can be done through an Amazon Kinesis consumer application that reads a stream and forwards the events to the remote environment.

Solution: A Lambda-based fan-out function

These three major needs have a common solution: the deployment of an Amazon Kinesis consumer application that listens to a stream and is able to send messages to other instances of Amazon Kinesis, services, or environments (AWS accounts or regions).

In the aws-lambda-fanout GitHub repository, you’ll find a Lambda function that specifically supports this scenario. This function is made to forward incoming messages from Amazon Kinesis or DynamoDB Streams.

The architecture of the function is made to be simple and extensible, with one core file fanout.js that loads modules for the different providers. The currently supported providers are as follows:

  • Amazon SNS
  • Amazon SQS
  • Amazon Elasticsearch Service
  • Amazon Kinesis Streams
  • Amazon Kinesis Firehose
  • AWS IoT
  • AWS Lambda
  • Amazon ElastiCache for Memcached
  • Amazon ElastiCache for Redis

The function is built to support multiple inputs:

  • Amazon Kinesis streams
  • Amazon Kinesis streams containing Amazon Kinesis Producer Library (KPL) records
  • DynamoDB Streams records

It relies on Lambda for a fully-managed environment where scaling, logging, and monitoring are automated by the platform. It also supports Lambda functions in a VPC for Amazon ElastiCache.

The configuration is stored in a DynamoDB table, and associates the output configuration with each function. This table has a simple schema:

  • sourceArn (Partition Key): The Amazon Resource Name (ARN) of the input Amazon Kinesis stream
  • id [String]: The name of the mapping
  • type [String]: The destination type
  • destination [String]: The ARN or name of the destination
  • active [Boolean]: Whether that mapping is active

Depending on the target, some other properties are also stored.
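As an example, a mapping from an input stream to an Amazon Kinesis destination might be stored as an item similar to the following (the account ID and names are illustrative):

```json
{
  "sourceArn": "arn:aws:kinesis:us-east-1:123456789012:stream/inputStream",
  "id": "target1",
  "type": "kinesis",
  "destination": "outputStream",
  "active": true
}
```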

The function can also group records together for services that don’t natively support batching, such as Amazon SQS, Amazon SNS, or AWS IoT. Amazon DynamoDB Streams records can also be transformed into plain JSON objects to simplify processing in later stages. The function comes with a Bash-based command-line interface to make deployment and management easier.

As an example, the following commands deploy the function, register a mapping from one stream (inputStream) to another (outputStream), and hook the function to the source stream:

./fanout deploy --function fanout

./fanout register kinesis --function fanout --source-type kinesis --source inputStream --id target1 --destination outputStream --active true

./fanout hook --function fanout --source-type kinesis --source inputStream

Summary

There are many options available for you to forward your events from one service or environment to another. For more information about this topic, see Using AWS Lambda with Amazon Kinesis. Happy eventing!

If you have questions or suggestions, please comment below.

AWS Week in Review – August 29, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-august-29-2016/

This is the second community-driven edition of the AWS Week in Review. Special thanks are due to the 13 external contributors who helped to make this happen. If you would like to contribute, please take a look at the AWS Week in Review on GitHub. Adding relevant content is fast and easy and can be done from the comfort of your web browser! Just to be clear, it is perfectly fine for you to add content written by someone else. The goal is to catch it all, as they say!


Monday

August 29

Tuesday

August 30

Wednesday

August 31

Thursday

September 1

Friday

September 2

New & Notable Open Source

  • apilogs is a command-line utility to help aggregate, stream, and filter CloudWatch Log events produced by API Gateway and Lambda serverless APIs.
  • MoonMail is a fully Lambda / SES powered email marketing tool.

New SlideShare Presentations

New Customer Success Stories

  • Bustle uses AWS Lambda to process high volumes of data generated by the website in real-time, allowing the team to make faster, data-driven decisions. Bustle.com is a news, entertainment, lifestyle, and fashion website catering to women.
  • Graze continually improves its customers’ experience by staying agile—including in its infrastructure. The company sells healthy snacks through its website and via U.K. retailers. It runs all its infrastructure on AWS, including its customer-facing websites and all its internal systems from the factory floor to business intelligence.
  • Made.com migrated to AWS to support a record-breaking sales period with no downtime. The company provides a website that links home-furnishings designers directly to consumers. It now runs its e-commerce platform, website, and customer-facing applications on AWS, using services such as Amazon EC2, Amazon RDS, and Auto Scaling groups.
  • Sony DADC New Media Solutions (NMS) distributes hundreds of thousands of hours of video content monthly, spins up data analytics, renders solutions in days instead of months, and saves millions of dollars in hardware refresh costs by going all in on AWS. The organization distributes and delivers content to film studios, television broadcasters, and other providers across the globe. NMS runs its content distribution platform, broadcast playout services, and video archive on the AWS Cloud.
  • Upserve quickly develops and trains more than 100 learning models, streams restaurant sales and menu item data in real time, and gives restaurateurs the ability to predict their nightly business using Amazon Machine Learning. The company provides online payment and analytical software to thousands of restaurant owners throughout the U.S. Upserve uses Amazon Machine Learning to provide predictive analysis through its Shift Prep application.

Upcoming Events

Help Wanted

Stay tuned for next week! In the meantime, follow me on Twitter, subscribe to the RSS feed, and contribute some content!

Jeff;

Implementing a Serverless AWS IoT Backend with AWS Lambda and Amazon DynamoDB

Post Syndicated from Bryan Liston original https://aws.amazon.com/blogs/compute/implementing-a-serverless-aws-iot-backend-with-aws-lambda-and-amazon-dynamodb/

Ed Lima
Cloud Support Engineer

Does your IoT device fleet scale to hundreds or thousands of devices? Do you find it somewhat challenging to retrieve the details for multiple devices? AWS IoT provides a platform to connect those devices and build a scalable solution for your Internet of Things workloads.

Out of the box, the AWS IoT console gives you your own searchable device registry with access to the device state and information about device shadows. You can enhance and customize the service using AWS Lambda and Amazon DynamoDB to build a serverless backend with a customizable device database that can be used to store useful information about the devices as well as helping to track what devices are activated with an activation code, if required.

You can use DynamoDB to extend the AWS IoT internal device registry to help manage the device fleet, as well as storing specific additional data about each device. Lambda provides the link between AWS IoT and DynamoDB allowing you to add, update, and query your new device database backend.

In this post, you learn how to use AWS IoT rules to trigger specific device registration logic using Lambda in order to populate a DynamoDB table. You then use a second Lambda function to search the database for a specific device serial number and a randomly generated activation code, to activate the device and register the email address of the device owner in the same table. After you’re done, you’ll have a fully functional serverless IoT backend, allowing you to focus on your own IoT solution and logic instead of managing the infrastructure to do so.

Prerequisites

You must have the following before you can create and deploy this framework:

  • An AWS account
  • An IAM user with permissions to create AWS resources (AWS IoT things and rules, Lambda functions, DynamoDB tables, IAM policies and roles, etc.)
  • Node.js and the AWS SDK for JavaScript installed locally to test the deployment

Building a backend

In this post, I assume that you have some basic knowledge about the services involved. If not, you can review the documentation for AWS IoT, AWS Lambda, and Amazon DynamoDB.

For this use case, imagine that you have a fleet of devices called “myThing”. These devices can be anything: a smart lightbulb, smart hub, Internet-connected robot, music player, smart thermostat, or anything with specific sensors that can be managed using AWS IoT.

When you create a myThing device, there is some specific information that you want to be available in your database, namely:

  • Client ID
  • Serial number
  • Activation code
  • Activation status
  • Device name
  • Device type
  • Owner email
  • AWS IoT endpoint

The following is a sample payload with details of a single myThing device to be sent to a specific MQTT topic, which triggers an IoT rule. The data is in a format that AWS IoT can understand, good old JSON. For example:

{
  "clientId": "ID-91B2F06B3F05",
  "serialNumber": "SN-D7F3C8947867",
  "activationCode": "AC-9BE75CD0F1543D44C9AB",
  "activated": "false",
  "device": "myThing1",
  "type": "MySmartIoTDevice",
  "email": "[email protected]",
  "endpoint": "<endpoint prefix>.iot.<region>.amazonaws.com"
}

The rule then invokes the first Lambda function, which you create now. Open the Lambda console, choose Create a Lambda function, and follow the steps. Here’s the code:

console.log('Loading function');
var AWS = require('aws-sdk');
var dynamo = new AWS.DynamoDB.DocumentClient();
var table = "iotCatalog";

exports.handler = function(event, context) {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    var params = {
        TableName: table,
        Item: {
            "serialNumber": event.serialNumber,
            "clientId": event.clientId,
            "device": event.device,
            "endpoint": event.endpoint,
            "type": event.type,
            "certificateId": event.certificateId,
            "activationCode": event.activationCode,
            "activated": event.activated,
            "email": event.email
        }
    };

    console.log("Adding a new IoT device...");
    dynamo.put(params, function(err, data) {
        if (err) {
            console.error("Unable to add device. Error JSON:", JSON.stringify(err, null, 2));
            context.fail();
        } else {
            console.log("Added device:", JSON.stringify(data, null, 2));
            context.succeed();
        }
    });
};

The function adds an item to a DynamoDB table called iotCatalog based on events like the JSON data provided earlier. You now need to create that table, and also make sure the Lambda function has permissions to add items to it, by configuring the appropriate execution role.

Open the DynamoDB console, choose Create table and follow the steps. For this table, use the following details.

The serial number uniquely identifies your device; if, for instance, it is a smart hub that has different client devices connecting to it, use the client ID as the sort key.
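If you prefer to create the table programmatically rather than in the console, a sketch with the AWS SDK for JavaScript might look like the following; the throughput values are illustrative, not from the original post:

```javascript
// Table definition matching the post: serialNumber is the partition key,
// clientId is the sort key.
function iotCatalogTableParams(tableName) {
  return {
    TableName: tableName,
    KeySchema: [
      { AttributeName: 'serialNumber', KeyType: 'HASH' },  // partition key
      { AttributeName: 'clientId', KeyType: 'RANGE' }      // sort key
    ],
    AttributeDefinitions: [
      { AttributeName: 'serialNumber', AttributeType: 'S' },
      { AttributeName: 'clientId', AttributeType: 'S' }
    ],
    ProvisionedThroughput: { ReadCapacityUnits: 5, WriteCapacityUnits: 5 }
  };
}

async function createCatalogTable() {
  const AWS = require('aws-sdk'); // deferred so the params helper stays SDK-free
  const dynamodb = new AWS.DynamoDB();
  await dynamodb.createTable(iotCatalogTableParams('iotCatalog')).promise();
}
```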

The backend is good to go! Now you just need to make the new resources work together, which you do by configuring an IoT rule.

On the AWS IoT console, choose Create a resource and Create a rule, and use the following settings to point the rule to your newly created Lambda function, also called iotCatalog.

After you create the rule, AWS IoT adds permissions in the background to allow it to trigger the Lambda function whenever a message is published to the MQTT topic called registration. You can use the following Node.js script to test:

var AWS = require('aws-sdk');
AWS.config.region = 'ap-northeast-1';

var crypto = require('crypto');
var endpoint = "<endpoint prefix>.iot.<region>.amazonaws.com";
var iot = new AWS.Iot();
var iotdata = new AWS.IotData({endpoint: endpoint});
var topic = "registration";
var type = "MySmartIoTDevice";

//Create 50 AWS IoT Things
for(var i = 1; i < 51; i++) {
  var serialNumber = "SN-"+crypto.randomBytes(Math.ceil(12/2)).toString('hex').slice(0,15).toUpperCase();
  var clientId = "ID-"+crypto.randomBytes(Math.ceil(12/2)).toString('hex').slice(0,12).toUpperCase();
  var activationCode = "AC-"+crypto.randomBytes(Math.ceil(20/2)).toString('hex').slice(0,20).toUpperCase();
  var thing = "myThing"+i.toString();
  var thingParams = {
    thingName: thing
  };
  
  iot.createThing(thingParams).on('success', function(response) {
    //Thing Created!
  }).on('error', function(response) {
    console.log(response);
  }).send();

  //Publish JSON to Registration Topic

  var registrationData = '{\n \"serialNumber\": \"'+serialNumber+'\",\n \"clientId\": \"'+clientId+'\",\n \"device\": \"'+thing+'\",\n \"endpoint\": \"'+endpoint+'\",\n\"type\": \"'+type+'\",\n \"activationCode\": \"'+activationCode+'\",\n \"activated\": \"false\",\n \"email\": \"[email protected]\" \n}';

  var registrationParams = {
    topic: topic,
    payload: registrationData,
    qos: 0
  };

  iotdata.publish(registrationParams, function(err, data) {
    if (err) console.log(err, err.stack); // an error occurred
    // else Published Successfully!
  });
}

//Checking all devices were created

iot.listThings().on('success', function(response) {
  var things = response.data.things;
  var myThings = [];
  for(var i = 0; i < things.length; i++) {
    if (things[i].thingName.includes("myThing")){
      myThings.push(things[i].thingName);
    }
  }

  if (myThings.length === 50){
    console.log("myThing1 to 50 created and registered!");
  }
}).on('error', function(response) {
  console.log(response);
}).send();

console.log("Registration data on the way to Lambda and DynamoDB");

The code above creates 50 things in AWS IoT, generating random client IDs, serial numbers, and activation codes for each device. It then publishes each device’s data as a JSON payload to the registration topic, which in turn triggers the Lambda function:

And here it is! The function was triggered successfully by your IoT rule and created your database of IoT devices with all the custom information you need. You can query the database to find your things and any other details related to them.

In the AWS IoT console, the newly-created things are also available in the thing registry.

Now you can create certificates and policies, attach them to each “myThing” AWS IoT thing, and install each certificate as you provision the physical devices.

Activation and registration logic

However, you’re not done yet. What if you want to activate a device in the field with its pre-generated activation code, and register the email details of whoever activated the device?

You need a second Lambda function for that, with the same execution role from the first function (Basic with DynamoDB). Here’s the code:

console.log('Loading function');

var AWS = require('aws-sdk');
var dynamo = new AWS.DynamoDB.DocumentClient();
var table = "iotCatalog";

exports.handler = function(event, context) {
    //console.log('Received event:', JSON.stringify(event, null, 2));

   var params = {
    TableName:table,
    Key:{
        "serialNumber": event.serialNumber,
        "clientId": event.clientId,
        }
    };

    console.log("Getting IoT device details...");
    dynamo.get(params, function(err, data) {
    if (err) {
        console.error("Unable to get device details. Error JSON:", JSON.stringify(err, null, 2));
        context.fail();
    } else {
        console.log("Device data:", JSON.stringify(data, null, 2));
        console.log(data.Item.activationCode);
        if (data.Item.activationCode == event.activationCode){
            console.log("Valid Activation Code! Proceed to register owner e-mail and update activation status");
            var params = {
                TableName:table,
                Key:{
                    "serialNumber": event.serialNumber,
                    "clientId": event.clientId,
                },
                UpdateExpression: "set email = :val1, activated = :val2",
                ExpressionAttributeValues:{
                    ":val1": event.email,
                    ":val2": "true"
                },
                ReturnValues:"UPDATED_NEW"
            };
            dynamo.update(params, function(err, data) {
                if (err) {
                    console.error("Unable to update item. Error JSON:", JSON.stringify(err, null, 2));
                    context.fail();
                } else {
                    console.log("Device now active!", JSON.stringify(data, null, 2));
                    context.succeed("Device now active! Your e-mail is now registered as device owner, thank you for activating your Smart IoT Device!");
                }
            });
        } else {
            context.fail("Activation Code Invalid");
        }
    }
});
}

The function needs just a small subset of the data used earlier:

{
  "clientId": "ID-91B2F06B3F05",
  "serialNumber": "SN-D7F3C8947867",
  "activationCode": "AC-9BE75CD0F1543D44C9AB",
  "email": "[email protected]"
}

Lambda uses the hash and range keys (serialNumber and clientId) to query the database, then compares the stored pre-generated activation code with the code supplied by the device owner along with their email address. If the activation code matches the one in the database, the activation status and email details are updated in DynamoDB accordingly. If not, the user gets an error message stating that the code is invalid.

You can turn it into an API with Amazon API Gateway. In order to do so, go to the Lambda function and add an API endpoint, as follows.

Now test the access to the newly-created API endpoint, using a tool such as Postman.

If an invalid code is provided, the requester gets an error message accordingly.

Back in the database, you can confirm the record was updated as required.

Cleanup

After you finish the tutorial, delete all the newly created resources (IoT things, Lambda functions, and DynamoDB table). Alternatively, you can keep the Lambda function code for future reference, as you won’t incur charges unless the functions are invoked.

Conclusion

As you can see, by leveraging the power of the AWS IoT Rules Engine, you can take advantage of the seamless integration with AWS Lambda to create a flexible and scalable IoT backend powered by Amazon DynamoDB that can be used to manage your growing Internet of Things fleet.

You can also configure an activation API to make use of the newly-created backend and activate devices as well as register email contact details from the device owner; this information could be used to get in touch with your users regarding marketing campaigns or newsletters about new products or new versions of your IoT products.

If you have questions or suggestions, please comment below.

AWS Webinars – August, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-webinars-august-2016/

Everyone on the AWS team understands the value of educating our customers on the best ways to use our services. We work hard to create documentation, training materials, and blog posts for you! We run live events such as our Global AWS Summits and AWS re:Invent where the focus is on education. Last but not least, we put our heads together and create a fresh lineup of webinars for you each and every month.

We have a great selection of webinars on the schedule for August. As always, they are free, but they do fill up, so I strongly suggest that you register ahead of time. All times are PT, and each webinar runs for one hour:

August 23

August 24

August 25

August 30

August 31


Jeff;

PS – Check out the AWS Webinar Archive for more great content!

 

New – Just-in-Time Certificate Registration for AWS IoT

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/new-just-in-time-certificate-registration-for-aws-iot/

We launched AWS IoT at re:Invent (read AWS IoT – Cloud Services for Connected Devices for an introduction) and made it generally available last December.

Earlier this year my colleague Olawale Oladehin showed you how to Use Your Own Certificate with AWS IoT. Before that, John Renshaw talked about Predictive Maintenance with AWS IoT and Amazon Machine Learning.

Just-in-Time Registration
Today we are making AWS IoT even more flexible by giving you the ability to do Just-in-Time registration of device certificates. This expands on the feature described by Olawale, and simplifies the process of building systems that make use of millions of connected devices. Instead of having to build a separate database to track the certificates and the associated devices, you can now arrange to automatically register new certificates as part of the initial communication between the device and AWS IoT.

In order to do this, you start with a CA (Certificate Authority) certificate that you later use to sign the per-device certificates (this is a great example of the chain of trust model that is fundamental to the use of digital certificates).

Putting this new feature to use is pretty easy, but you do have to take care of some important details. Here are the principal steps:

  1. Register & activate the CA certificate that will sign the other certificates.
  2. Use the certificate to generate and sign certificates for each device.
  3. Have the device present the certificate to AWS IoT and then activate it.

The final step can be implemented using an AWS Lambda function. The function simply listens on a designated MQTT topic using an AWS IoT Rule Engine action. A notification is sent to the topic each time a new certificate is presented to AWS IoT. The function can then activate the device certificate and take care of any other initialization or registration required by your application.

Learn More
To learn more about this important new feature and to review all of the necessary steps in detail, read about Just in Time Registration of Device Certificates on AWS IoT on The Internet of Things on AWS Blog.


Jeff;

 

Watch the AWS Summit – Santa Clara Keynote in Real Time on July 13

Post Syndicated from Craig Liebendorfer original https://blogs.aws.amazon.com/security/post/Tx1UMV1L79BHDWJ/Watch-the-AWS-Summit-Santa-Clara-Keynote-in-Real-Time-on-July-13

Join us online Wednesday, July 13, at 10:00 A.M. Pacific Time for the AWS Summit – Santa Clara Livestream! This keynote presentation, given by Dr. Matt Wood, AWS General Manager of Product Strategy, will highlight the newest AWS features and services, and select customer stories. Don’t miss this live presentation!

Join us in person at the Santa Clara Convention Center
If you are in the Santa Clara area and would like to attend the free Summit, you still have time. Register now to attend.

The Summit includes:

  • More than 50 technical sessions, including these security-related sessions:

    • Automating Security Operations in AWS (Deep Dive)
    • Securing Cloud Workloads with DevOps Automation
    • Deep Dive on AWS IoT
    • Getting Started with AWS Security (Intro)
    • Network Security and Access Control within AWS (Intro)
  • Training opportunities in Hands-on Labs.
  • Full-day training bootcamps. Registration is $600.
  • The opportunity to learn best practices and get questions answered from AWS engineers, expert customers, and partners.
  • Networking opportunities with your cloud and IT peers.

– Craig 

P.S. Can’t make the Santa Clara event? Check out our other AWS Summit locations. If you have summit questions, please contact us at [email protected]

Arduino Web Editor and Cloud Platform – Powered by AWS

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/arduino-web-editor-and-cloud-platform-powered-by-aws/

Last night I spoke with Luca Cipriani from Arduino to learn more about the new AWS-powered Arduino Web Editor and Arduino Cloud Platform offerings. Luca was en-route to the Bay Area Maker Faire and we had just a few minutes to speak, but that was enough time for me to learn a bit about what they have built.

If you have ever used an Arduino, you know that there are several steps involved. First you need to connect the board to your PC’s serial port using a special cable (you can also use Wi-Fi if you have the appropriate add-on “shield”), ensure that the port is properly configured, and establish basic communication. Then you need to install, configure, and launch your development environment, make sure that it can talk to your Arduino, tell it which make and model of Arduino that you are using, and select the libraries that you want to call from your code. With all of that taken care of, you are ready to write code, compile it, and then download it to the board for debugging and testing.

Arduino Code Editor
Luca told me that the Arduino Code Editor was designed to simplify and streamline the setup and development process. The editor runs within your browser and is hosted on AWS (although we did not have time to get in to the details, I understand that they made good use of AWS Lambda and several other AWS services).

You can write and modify your code, save it to the cloud and optionally share it with your colleagues and/or friends. The editor can also detect your board (using a small native plugin) and configure itself accordingly; it even makes sure that you can only write code using libraries that are compatible with your board. All of your code is compiled in the cloud and then downloaded to your board for execution.

Here’s what the editor looks like (see Sneak Peek on the New, Web-Based Arduino Create for more):

Arduino Cloud Platform
Because Arduinos are small, easy to program, and consume very little power, they work well in IoT (Internet of Things) applications. Even better, it is easy to connect them to all sorts of sensors, displays, and actuators so that they can collect data and effect changes.

The new Arduino Cloud Platform is designed to simplify the task of building IoT applications that make use of Arduino technology. Connected devices will be able to connect to the Internet, upload information derived from sensors, and effect changes upon command from the cloud. Building upon the functionality provided by AWS IoT, this new platform will allow devices to communicate with the Internet and with each other. While the final details are still under wraps, I believe that this will pave the way for sensors to activate Lambda functions and for Lambda functions to take control of displays and actuators.

I look forward to learning more about this platform as the details become available!


Jeff;

 

AWS Week in Review – April 25, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-april-25-2016/

Let’s take a quick look at what happened in AWS-land last week:

Monday

April 25

Tuesday

April 26

Wednesday

April 27

Thursday

April 28

Friday

April 29

Saturday

April 30

Sunday

May 1

New & Notable Open Source

New SlideShare Presentations

New Customer Success Stories

Upcoming Events

Help Wanted

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

Jeff;

Register Now – AWS DevDay in San Francisco

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/register-now-aws-devday-in-san-francisco/

I am a firm believer in the value of continuing education. These days, the half-life of knowledge in any particular technical topic seems to be less than a year. Put another way, once you stop learning, your knowledge base will be just about obsolete within 2 or 3 years!

In order to make sure that you stay on top of your field, you need to decide to learn something new every week. Continuous learning will leave you in a great position to capitalize on the latest and greatest languages, tools, and technologies. By committing to a career marked by lifelong learning, you can be sure that your skills will remain relevant in the face of all of this change.

Keeping all of this in mind, I am happy to be able to announce that we will be holding an AWS DevDay in San Francisco on June 21st. The day will be packed with technical sessions, live demos, and hands-on workshops, all focused on some of today’s hottest and most relevant topics. If you attend the AWS DevDay, you will also have the opportunity to meet and speak with AWS engineers and to network with the AWS technical community.

Here are the tracks:

  • Serverless – Build and run applications without having to provision, manage, or scale infrastructure. We will demonstrate how you can build a range of applications from data processing systems to mobile backends to web applications.
  • Containers – Package your application’s code, configurations, and dependencies into easy-to-use building blocks. Learn how to run Docker-enabled applications on AWS.
  • IoT – Get the most out of connecting IoT devices to the cloud with AWS. We will highlight best practices using the cloud for IoT applications, connecting devices with AWS IoT, and using AWS endpoints.
  • Mobile – When developing mobile apps, you want to focus on the activities that make your app great and not the heavy lifting required to build, manage, and scale the backend infrastructure. We will demonstrate how AWS helps you easily develop and test your mobile apps and scale to millions of users.

We will also be running a series of hands-on workshops that day:

  • Zombie Apocalypse Workshop: Building Serverless Microservices.
  • Develop a Snapchat Clone in One Hour.
  • Connecting to AWS IoT.

Registration and Location
There’s no charge for this event, but space is limited and you need to register quickly in order to attend.

All sessions will take place at the AMC Metreon at 135 4th Street in San Francisco.


Jeff;

 

 

 

AWS Week in Review – April 18, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-april-18-2016/

Let’s take a quick look at what happened in AWS-land last week:

Monday

April 18

Tuesday

April 19

Wednesday

April 20

Thursday

April 21

Friday

April 22

Saturday

April 23

Sunday

April 24

New & Notable Open Source

New SlideShare Presentations

New Customer Success Stories

New YouTube Videos

Upcoming Events

Help Wanted

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

Jeff;

AWS Week in Review – April 11, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-april-11-2016/

Let’s take a quick look at what happened in AWS-land last week:

Monday

April 11

Tuesday

April 12

Wednesday

April 13

Thursday

April 14

Friday

April 15

Saturday

April 16

Sunday

April 17

New & Notable Open Source

  • cfn-include implements a Fn::Include for CloudFormation templates.
  • TumblessTemplates is a set of CloudFormation templates for quick setup of the Tumbless blogging platform.
  • s3git is Git for cloud storage.
  • s3_uploader is an S3 file uploader GUI written in Python.
  • SSH2EC2 lets you connect to EC2 instances via tags and metadata.
  • lambada is AWS Lambda for silly people.
  • aws-iam-proxy is a proxy that signs requests with IAM credentials.
  • hyperion is a Scala library and a set of abstractions for AWS Data Pipeline.
  • dynq is a DynamoDB query library.
  • cloud-custodian is a policy rules engine for AWS management.

New SlideShare Presentations

New Customer Success Stories

New YouTube Videos

Upcoming Events

Help Wanted

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

Jeff;

AWS Week in Review – April 4, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-april-4-2016/

Let’s take a quick look at what happened in AWS-land last week:

Monday

April 4

Tuesday

April 5

Wednesday

April 6

Thursday

April 7

Friday

April 8

Upcoming Events

Help Wanted

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

Jeff;

AWS Week in Review – March 21, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-march-21-2016/

Let’s take a quick look at what happened in AWS-land last week:

Monday

March 21

Tuesday

March 22

Wednesday

March 23

Thursday

March 24

Friday

March 25

Saturday

March 26

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

Jeff;

AWS Week in Review – March 14, 2016

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-week-in-review-march-14-2016/

Let’s take a quick look at what happened in AWS-land last week:

Monday
March 14

We announced that the Developer Preview of AWS SDK for C++ is Now Available.
We celebrated Ten Years in the AWS Cloud.
We launched Amazon EMR 4.4.0 with Sqoop, HCatalog, Java 8, and More.
The AWS Compute Blog announced the Launch of AWS Lambda and Amazon API Gateway in the EU (Frankfurt) Region.
The Amazon Simple Email Service Blog announced that Amazon SES Now Supports Custom Email From Domains.
The AWS Java Blog talked about Using Amazon SQS with Spring Boot and Spring JMS.
The AWS Partner Network Blog urged you to Take Advantage of AWS Self-Paced Labs.
The AWS Windows and .NET Developer Blog showed you how to Retrieve Request Metrics from the AWS SDK for .NET.
The AWS Government, Education, & Nonprofits Blog announced the New Amazon-Busan Cloud Innovation and Technology Center.
We announced Lumberyard Beta 1.1 is Now Available.
Bometric shared AWS Security Best Practices: Network Security.
CloudCheckr listed 5 AWS Security Traps You Might be Missing.
Serverless Code announced that ServerlessConf is Here!
Cloud Academy launched 2 New AWS Courses – (Advanced Techniques for AWS Monitoring, Metrics and Logging and Advanced Deployment Techniques on AWS).
Cloudonaut reminded you to Avoid Sharing Key Pairs for EC2.
8KMiles talked about How Cloud Computing Can Address Healthcare Industry Challenges.
Evident discussed the CIS Foundations Benchmark for AWS Security.
Talkin’ Cloud shared 10 Facts About AWS as it Celebrates 10 Years.
The Next Platform reviewed Ten Years of AWS And a Status Check for HPC Clouds.
ZephyCloud is AWS-powered Wind Farm Design Software.

Tuesday
March 15

We announced the AWS Database Migration Service.
We announced that AWS CloudFormation Now Supports Amazon GameLift.
The AWS Partner Network Blog reminded everyone that Friends Don’t Let Friends Build Data Centers.
The Amazon GameDev Blog talked about Using Autoscaling to Control Costs While Delivering Great Player Experiences.
We updated the AWS SDK for JavaScript, the AWS SDK for Ruby, and the AWS SDK for Go.
Calorious talked about Uploading Images into Amazon S3.
Serverless Code showed you How to Use LXML in Lambda.
The Acquia Developer Center talked about Open-Sourcing Moonshot.
Concurrency Labs encouraged you to Hatch a Swarm of AWS IoT Things Using Locust, EC2 and Get Your IoT Application Ready for Prime Time.

Wednesday
March 16

We announced an S3 Lifecycle Management Update with Support for Multipart Upload and Delete Markers.
We announced that the EC2 Container Service is Now Available in the US West (Oregon) Region.
We announced that Amazon ElastiCache now supports the R3 node family in AWS China (Beijing) and AWS South America (Sao Paulo) Regions.
We announced that AWS IoT Now Integrates with Amazon Elasticsearch Service and CloudWatch.
We published the Puppet on the AWS Cloud: Quick Start Reference Deployment.
We announced that Amazon RDS Enhanced Monitoring is now available in the Asia Pacific (Seoul) Region.
I wrote about Additional Failover Control for Amazon Aurora (this feature was launched earlier in the year).
The AWS Security Blog showed you How to Set Up Uninterrupted, Federated User Access to AWS Using AD FS.
The AWS Java Blog talked about Migrating Your Databases Using AWS Database Migration Service.
We updated the AWS SDK for Java and the AWS CLI.
CloudWedge asked Cloud Computing: Cost Saver or Additional Expense?
Gathering Clouds reviewed New 2016 AWS Services: Certificate Manager, Lambda, Dev SecOps.

Thursday
March 17

We announced the new Marketplace Metering Service for 3rd Party Sellers.
We announced Amazon VPC Endpoints for Amazon S3 in South America (Sao Paulo) and Asia Pacific (Seoul).
We announced AWS CloudTrail Support for Kinesis Firehose.
The AWS Big Data Blog showed you How to Analyze a Time Series in Real Time with AWS Lambda, Amazon Kinesis and Amazon DynamoDB Streams.
The AWS Enterprise Blog showed you How to Create a Cloud Center of Excellence in your Enterprise, and then talked about Staffing Your Enterprise’s Cloud Center of Excellence.
The AWS Mobile Development Blog showed you How to Analyze Device-Generated Data with AWS IoT and Amazon Elasticsearch Service.
Stelligent initiated a series on Serverless Delivery.
CloudHealth Academy talked about Modeling RDS Reservations.
N2W Software talked about How to Pre-Warm Your EBS Volumes on AWS.
ParkMyCloud explained How to Save Money on AWS With ParkMyCloud.

Friday
March 18

The AWS Government, Education, & Nonprofits Blog told you how AWS GovCloud (US) Helps ASD Cut Costs by 50% While Dramatically Improving Security.
The Amazon GameDev Blog discussed Code Archeology: Crafting Lumberyard.
Calorious talked about Importing JSON into DynamoDB.
DZone Cloud Zone talked about Graceful Shutdown Using AWS AutoScaling Groups and Terraform.

Saturday
March 19

DZone Cloud Zone wants to honor some Trailblazing Women in the Cloud.

Sunday
March 20

Cloudability talked about How Atlassian Nailed the Reserved Instance Buying Process.
DZone Cloud Zone talked about Serverless Delivery Architectures.
Gorillastack explained Why the Cloud is THE Key Technology Enabler for Digital Transformation.

New & Notable Open Source

Tumbless is a blogging platform based only on S3 and your browser.
aws-amicleaner cleans up old, unused AMIs and related snapshots.
alexa-aws-administration helps you to do various administration tasks in your AWS account using an Amazon Echo.
aws-s3-zipper takes an S3 bucket folder and zips it for streaming.
aws-lambda-helper is a collection of helper methods for Lambda.
CloudSeed lets you describe a list of AWS stack components, then configure and build a custom stack.
aws-ses-sns-dashboard is a Go-based dashboard with SES and SNS notifications.
snowplow-scala-analytics-sdk is a Scala SDK for working with Snowplow-enriched events in Spark using Lambda.
StackFormation is a lightweight CloudFormation stack manager.
aws-keychain-util is a command-line utility to manage AWS credentials in the OS X keychain.

New SlideShare Presentations

Account Separation and Mandatory Access Control on AWS.
Crypto Options in AWS.
Security Day IAM Recommended Practices.
What’s Nearly New.

New Customer Success Stories

AdiMap measures online advertising spend, app financials, and salary data. Using AWS, AdiMap builds predictive financial models without spending millions on compute resources and hardware, providing scalable financial intelligence and reducing time to market for new products.
Change.org is the world’s largest and fastest growing social change platform, with more than 125 million users in 196 countries starting campaigns and mobilizing support for local causes and global issues. The organization runs its website and business intelligence cluster on AWS, and runs its continuous integration and testing on Solano CI from APN member Solano Labs.
Flatiron Health has been able to reach 230 cancer clinics and 2,200 clinicians across the United States with a solution that captures and organizes oncology data, helping to support cancer treatments. Flatiron moved its solution to AWS to improve speed to market and to minimize the time and expense that the startup company needs to devote to its IT infrastructure.
Global Red specializes in lifecycle marketing, including strategy, data, analytics, and execution across all digital channels. By re-architecting and migrating its data platform and related applications to AWS, Global Red reduced the time to onboard new customers for its advertising trading desk and marketing automation platforms by 50 percent.
GMobi primarily sells its products and services to Original Design Manufacturers and Original Equipment Manufacturers in emerging markets. By running its “over the air” firmware updates, mobile billing, and advertising software development kits in an AWS infrastructure, GMobi has grown to support 120 million users while maintaining more than 99.9 percent availability.
Time Inc.’s new chief technology officer joined the renowned media organization in early 2014, and promised big changes. With AWS, Time Inc. can leverage security features and functionality that mirror the benefits of cloud computing, including rich tools, best-in-class industry standards and protocols and lower costs.
Seaco Global is one of the world’s largest shipping companies. By using AWS to run SAP applications, it also reduced the time needed to complete monthly business processes to just one day, down from four days in the past.

New YouTube Videos

AWS Database Migration Service.
Introduction to Amazon WorkSpaces.
AWS Pop-up Loft.
Save the Date – AWS re:Invent 2016.

Upcoming Events

March 22nd – Live Event (Seattle, Washington) – AWS Big Data Meetup – Intro to SparkR.
March 22nd – Live Broadcast – VoiceOps: Commanding and Controlling Your AWS environments using Amazon Echo and Lambda.
March 23rd – Live Event (Atlanta, Georgia) – AWS Key Management Service & AWS Storage Services for a Hybrid Cloud (Atlanta AWS Community).
April 6th – Live Event (Boston, Massachusetts) – AWS at Bio-IT World.
April 18th & 19th – Live Event (Chicago, Illinois) – AWS Summit – Chicago.
April 20th – Live Event (Melbourne, Australia) – Inaugural Melbourne Serverless Meetup.
April 26th – Live Event (Sydney, Australia) – AWS Partner Summit.
April 26th – Live Event (Sydney, Australia) – Inaugural Sydney Serverless Meetup.
ParkMyCloud 2016 AWS Cost-Reduction Roadshow.
AWS Loft – San Francisco.
AWS Loft – New York.
AWS Loft – Tel Aviv.
AWS Zombie Microservices Roadshow.
AWS Public Sector Events.
AWS Global Summit Series.

Help Wanted

AWS Careers.

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.
Jeff;
