Tag Archives: Amazon Macie

How to create custom alerts with Amazon Macie

Post Syndicated from Jeremy Haynes original https://aws.amazon.com/blogs/security/how-to-create-custom-alerts-with-amazon-macie/

Amazon Macie is a security service that makes it easy for you to discover, classify, and protect sensitive data in Amazon Simple Storage Service (Amazon S3). Macie collects AWS CloudTrail events and Amazon S3 metadata such as permissions and content classification. In this post, I’ll show you how to use Amazon Macie to create custom alerts for those data sets to notify you of events and objects of interest. I’ll go through the various types of data you can find in Macie, and talk about how to identify fields that are relevant to a given security use case. What you’ll learn is a method of investigation and alerting that you’ll be able to apply to your own situation.

If you’re totally new to Macie, make sure you first head over to the AWS Blog launch post for instructions on how to get started.

Understand the types of data in Macie

There are three sources of data found in Macie: CloudTrail events, S3 bucket metadata, and S3 object metadata. Each data source is stored separately in its own index, so when you write queries, it's important to keep fields from different sources in separate queries.

Within each data source there are two types of fields: Extracted and Generated. Extracted fields come directly from pre-existing AWS fields associated with that data source. For example, Extracted fields from the CloudTrail data source come directly from the events recorded in the CloudTrail logs that correspond to the actions taken by an IAM user, role, or an AWS service. See these CloudTrail log file examples for more details.

Generated fields, on the other hand, are created by Macie. These fields provide additional security value and context for creating more powerful queries. For example, the Internet Service Provider (ISP) field is populated by taking the IP addresses from a CloudTrail event and matching them against a global service provider list. This enables searching for the ISP associated with an API call.

In the sections below, I’ll explore each of the three data sources in more detail. There are reference guides in the Macie documentation dedicated to each source that provide an extensive list of fields and descriptions. I’ll use these lists to discover what fields are appropriate for investigating a particular security use case.

Choose a security use case to instrument alerts

For the purposes of this post, I’ll choose one particular security use case to focus on. The process of discovering relevant data fields collected by Macie and turning those into custom alerts will be the same for any use case you wish to investigate. With that in mind, let’s choose the theme of sensitive or critical data stored in S3 to explore because it can affect many AWS customers.

When beginning to design alerts, the first step is to think about all of the resources, attributes, actions, and identities related to the subject. In this case, I’m looking at sensitive or critical data stored in S3, so the following are some potentially useful fields of data to consider:

  • S3 bucket and object resources
  • S3 configuration and security attributes
  • Read, write, and delete actions on S3
  • IAM users, roles, and access policies associated with the S3 resources

I’ll use this list to guide my search for relevant fields in the Macie reference documentation. First, I’ll dive into the CloudTrail data.

Explore CloudTrail data

The best way to build an understanding of what activities are happening in your AWS environment is by using the CloudTrail data source because it contains your AWS API calls and the identities that made them. If you’re unfamiliar with CloudTrail, head over to the AWS CloudTrail documentation for more information.

Ok, I have a critical S3 bucket and I want to be notified each time an object write is attempted to it outside of the typical access pattern used in my corporate environment. In this case, write actions to the bucket are normally controlled by my organization’s own identity system via federated access. So that means I want to write a query that searches for object write attempts by a non-federated principal. If you’re unfamiliar with what federation is, that’s ok, you can still follow along. If you’re curious about federation, see the IAM Identity Providers and Federation documentation.

I begin by opening the Macie CloudTrail Reference documentation and looking at the section titled “CloudTrail Data Fields Extracted by Macie.” Skimming over the list, I see that the objectsWritten.key description matches my criteria for investigating actions on an S3 resource: “A list of S3 objects’ ARNs that were part of a PutObject, CopyObject, or CompleteMultipartUpload API calls.”

Next, I take a look at the CloudTrail field names that are extracted from the userIdentity object in an event. The userIdentity.type field looks promising, but I’m unsure what values this field can accept. Using the CloudTrail userIdentity Element documentation as a reference, I look up the type field and see that FederatedUser is listed as one of the values.

Great! I have all of the information necessary to write my first custom query. I’ll search for all “user sessions” (a 5-minute aggregation of CloudTrail events corresponding to a user) that include both an attempted write to my S3 bucket and any API call with a userIdentity type that is not FederatedUser. Note that I’m searching these aggregated sessions rather than individual events: the write attempt and the non-federated API call don’t have to be the same event, they only have to occur in the same user session. To match values that do not equal FederatedUser, I use a regular expression and place FederatedUser inside parentheses with a tilde character (~) at the beginning. Here’s what I came up with, using the corresponding Macie field names and example search queries as a formatting guide:
 
objectsWritten.key:/arn:aws:s3:::my_sensitive_bucket.*/ AND userIdentityType.key:/(~FederatedUser)/
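If you maintain several sensitive buckets, you can generate query strings like this programmatically before pasting them into the Research page. Here's a minimal Python sketch; the helper function and bucket name are my own illustration, not part of Macie:

```python
def build_write_query(bucket_name, excluded_type="FederatedUser"):
    """Build a Macie CloudTrail query matching sessions that include an
    object write to the bucket AND an API call whose userIdentity type is
    NOT the excluded type (the ~ negates the match in Macie's syntax)."""
    return (
        f"objectsWritten.key:/arn:aws:s3:::{bucket_name}.*/"
        f" AND userIdentityType.key:/(~{excluded_type})/"
    )

print(build_write_query("my_sensitive_bucket"))
# objectsWritten.key:/arn:aws:s3:::my_sensitive_bucket.*/ AND userIdentityType.key:/(~FederatedUser)/
```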
 
Now, it’s time to test this query on the Research page and turn it into a custom alert.

Create custom alerts

The first step to create a custom alert is to run the proposed query from the Research page. I do this so I can verify that the results match my expectations and to get an idea of how often the alert will occur. Once I’m happy with the query, setting up a custom alert is only a few clicks away.

  1. In the Macie navigation pane, select Research.
  2. Select the data source matching the query from the options in the drop-down menu. In this case, I select CloudTrail data.
     
    Figure 1: Select the data source on the Research page


  3. To run the search, I copy my custom query, paste it into the query box, and then press Enter.
     
    objectsWritten.key:/arn:aws:s3:::my_sensitive_bucket.*/ AND userIdentityType.key:/(~FederatedUser)/
     
  4. At this point, I verify that the results match my expectations and make any necessary modifications to the query. To learn more about how the features of the Research page can help with verifying and modifying queries, see Using the Macie Research Tab.
  5. Once I’m confident in the results, I select the Save query as alert icon.
     
    Figure 2: Select the "Save query as alert" icon


  6. Now, I fill in the remaining fields: Alert title, Description, Min number of matches, and Severity. For more information about each of these fields, see Macie Adding New Custom Basic Alerts.
     
    Figure 3: Fill in the remaining "Alert title," "Description," "Min number of matches," and "Severity" fields


  7. In the Whitelisted users field, I add any users which appeared in my Research page results that I would like to exclude from the alert. For more details on this feature, see Whitelisting Users or Buckets for Basic Alerts.
  8. Finally, I save the alert.

That’s it! I now have a custom basic alert that will alert me every time there’s an attempt to write to my bucket in the same user session as an API call with a userIdentityType other than FederatedUser. Now, I’ll use the S3 object and S3 bucket data sources to look for some more useful fields.

Use S3 object data

The S3 object data source contains fields extracted from S3 API metadata, as well as fields populated by Macie relating to content classification. To expand my search and alerts for sensitive or critical data stored in S3, I’ll consult the S3 Object Data Reference documentation and look for promising candidate fields.

I see that S3 Server Side Encryption Settings metadata could be useful because I know that all of the objects in a certain bucket should be encrypted using AES256, and I’d like to be notified every time an object is uploaded that doesn’t match that attribute.

To create the query, I combine the server_encryption field and the bucket field to match on all S3 objects within the specified bucket. Note the forward slashes and “.*” that make this a regular expression search. This allows me to match all buckets that share my project name, even when the full bucket name is different.
 
filesystem_metadata.bucket:/.*my_sensitive_project.*/ AND NOT filesystem_metadata.server_encryption:"AES256"
 
Next, I have a different bucket that I know should never have any objects which contain personally identifiable information (PII). This includes such information as names, email addresses, and credit card numbers. For the full list of what Macie considers PII, see Classifying Data with Amazon Macie. I’d like to set up an alert that notifies me every time an object containing any type of PII is added to this bucket. Since this is a data field that’s provided by Macie, I look under the Generated Fields heading and find the field pii_impact. I’m looking for all levels of PII impact, so my query will search for any value which isn’t equal to none.

As before, I’ll combine this with the bucket field to include all S3 objects matching the bucket name.
 
filesystem_metadata.bucket:"my_logs_bucket" AND NOT pii_impact:"none"
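To sanity-check queries like these before saving them as alerts, I find it helps to model the match logic locally. The sketch below mirrors the two queries above; the field names follow the Macie S3 object reference, but the metadata records are invented examples:

```python
import re

def unencrypted_match(obj):
    """Mirrors the first query: bucket name contains the project string
    AND server-side encryption is not AES256."""
    return (re.search(r"my_sensitive_project", obj["bucket"]) is not None
            and obj.get("server_encryption") != "AES256")

def pii_match(obj):
    """Mirrors the second query: exact bucket name AND any PII impact
    other than 'none'."""
    return obj["bucket"] == "my_logs_bucket" and obj.get("pii_impact") != "none"

sample = {"bucket": "team-my_sensitive_project-data", "server_encryption": "aws:kms"}
log_obj = {"bucket": "my_logs_bucket", "pii_impact": "moderate"}
print(unencrypted_match(sample), pii_match(log_obj))  # True True
```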
 

Use S3 bucket data

The S3 bucket data source contains information extracted from S3 bucket APIs, as well as fields that Macie generates by processing bucket metadata and access control lists (ACLs). Following the same method as before, I head over to the S3 Bucket Data Reference documentation and look for fields that will help me create useful alerts.

There are plenty of fields that could be useful here, depending on how much I know in advance about my bucket and which threats I want to protect against. To narrow my search, I decide to add some protection against accidental or unauthorized data destruction.

In the S3 Versioning section, I see that the Multi-Factor Authentication Delete setting is one of the available fields. Since this bucket is locked down to require MFA for delete actions, I can create an alert that notifies me whenever MFA Delete is disabled.
 
bucket:"my_critical_bucket" AND versioning.MFADelete:"disabled"
 
Another potential avenue of data destruction is the bucket lifecycle expiration settings, which automatically remove data after a period of time. I’d like to know if someone changes this to a low number so I can intervene and prevent losing any recent data.
 
bucket:"my_critical_bucket" AND lifecycle_configuration.Rules.Expiration.Days:<3
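The numeric comparison in the lifecycle query (Days:<3) is easy to model locally as well. This sketch uses a simplified stand-in for Macie's S3 bucket metadata; the record shape is my own invention:

```python
def lifecycle_alert(bucket, threshold_days=3):
    """True if the bucket is the critical bucket and any lifecycle rule
    expires objects in fewer than threshold_days days (mirrors the
    lifecycle_configuration query above)."""
    if bucket["name"] != "my_critical_bucket":
        return False
    rules = bucket.get("lifecycle_rules", [])
    return any(r.get("expiration_days", float("inf")) < threshold_days
               for r in rules)

risky = {"name": "my_critical_bucket", "lifecycle_rules": [{"expiration_days": 1}]}
safe = {"name": "my_critical_bucket", "lifecycle_rules": [{"expiration_days": 30}]}
print(lifecycle_alert(risky), lifecycle_alert(safe))  # True False
```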
 
Now that I have gathered another set of potential alert queries, I can walk through the same steps I used for CloudTrail data to turn them into custom alerts. Once saved, I’ll begin to receive notifications in the Macie console whenever a match is found. I can view the alerts I’ve received by selecting Alerts from the Macie navigation pane.

Summary

I described the various types and sources of data available in Macie. After demonstrating how to take a security use case and discover relevant fields, I stepped through the process of creating queries and turning them into custom alerts. My goal has been to show you how to build alerts that are tailored to your specific environment and solve your own individual needs.

If you have comments about this post, submit them in the Comments section below. If you have questions about how to use Macie, or you’d like to request new fields and data sources, start a new thread on the Macie forum or contact AWS Support.

Want more AWS Security news? Follow us on Twitter.

AWS Adds 16 More Services to Its PCI DSS Compliance Program

Post Syndicated from Chad Woolf original https://aws.amazon.com/blogs/security/aws-adds-16-more-services-to-its-pci-dss-compliance-program/

PCI logo

AWS has added 16 more AWS services to its Payment Card Industry Data Security Standard (PCI DSS) compliance program, giving you more options, flexibility, and functionality to process and store sensitive payment card data in the AWS Cloud. The services were audited by Coalfire to ensure that they meet strict PCI DSS standards.

The newly compliant AWS services are:

AWS now offers 58 services that are officially PCI DSS compliant, giving administrators more service options for implementing a PCI-compliant cardholder environment.

For more information about the AWS PCI DSS compliance program, see Compliance Resources, AWS Services in Scope by Compliance Program, and PCI DSS Compliance.

– Chad Woolf

AWS Updated Its ISO Certifications and Now Has 67 Services Under ISO Compliance

Post Syndicated from Chad Woolf original https://aws.amazon.com/blogs/security/aws-updated-its-iso-certifications-and-now-has-67-services-under-iso-compliance/

ISO logo

AWS has updated its certifications against ISO 9001, ISO 27001, ISO 27017, and ISO 27018 standards, bringing the total to 67 services now under ISO compliance. We added the following 29 services this cycle:

  • Amazon Aurora
  • Amazon Cloud Directory
  • Amazon CloudWatch Logs
  • Amazon Cognito
  • Amazon Connect
  • Amazon Elastic Container Registry
  • Amazon Inspector
  • Amazon Kinesis Data Streams
  • Amazon Macie
  • Amazon QuickSight
  • Amazon S3 Transfer Acceleration
  • Amazon SageMaker
  • Amazon Simple Notification Service
  • Auto Scaling
  • AWS Batch
  • AWS CodeBuild
  • AWS CodeCommit
  • AWS CodeDeploy
  • AWS CodePipeline
  • AWS IoT Core
  • AWS Lambda@Edge
  • AWS Managed Services
  • AWS OpsWorks Stacks
  • AWS Shield
  • AWS Snowball Edge
  • AWS Snowmobile
  • AWS Step Functions
  • AWS Systems Manager (formerly Amazon EC2 Systems Manager)
  • AWS X-Ray

For the complete list of services under ISO compliance, see AWS Services in Scope by Compliance Program.

AWS maintains certifications through extensive audits of its controls to ensure that information security risks that affect the confidentiality, integrity, and availability of company and customer information are appropriately managed.

You can download copies of the AWS ISO certificates that contain AWS’s in-scope services and Regions, and use these certificates to jump-start your own certification efforts:

AWS does not increase service costs in any AWS Region as a result of updating its certifications.

To learn more about compliance in the AWS Cloud, see AWS Cloud Compliance.

– Chad

Introducing the New GDPR Center and “Navigating GDPR Compliance on AWS” Whitepaper

Post Syndicated from Chad Woolf original https://aws.amazon.com/blogs/security/introducing-the-new-gdpr-center-and-navigating-gdpr-compliance-on-aws-whitepaper/

European Union flag

At AWS re:Invent 2017, the AWS Compliance team participated in excellent engagements with AWS customers about the General Data Protection Regulation (GDPR), including discussions that generated helpful input. Today, I am announcing resulting enhancements to our recently launched GDPR Center and the release of a new whitepaper, Navigating GDPR Compliance on AWS. The resources available on the GDPR Center are designed to give you GDPR basics, and provide some ideas as you work out the details of the regulation and find a path to compliance.

In this post, I focus on two GDPR requirements, Article 25 and Article 32, and explain some of the AWS services and other resources that can help you meet them.

Background about the GDPR

The GDPR is a European privacy law that will become enforceable on May 25, 2018, and is intended to harmonize data protection laws throughout the European Union (EU) by applying a single data protection law that is binding throughout each EU member state. The GDPR not only applies to organizations located within the EU, but also to organizations located outside the EU if they offer goods or services to, or monitor the behavior of, EU data subjects. All AWS services will comply with the GDPR in advance of the May 25, 2018, enforcement date.

We are already seeing customers move personal data to AWS to help solve challenges in complying with the EU’s GDPR because of AWS’s advanced toolset for identifying, securing, and managing all types of data, including personal data. Steve Schmidt, the AWS CISO, has already written about the internal and external work we have been undertaking to help you use AWS services to meet your own GDPR compliance goals.

Article 25 – Data Protection by Design and by Default (Privacy by Design)

Privacy by Design is the integration of data privacy and compliance into the systems development process, enabling applications, systems, and accounts, among other things, to be secure by default. To secure your AWS account, we offer a script to evaluate your AWS account against the full Center for Internet Security (CIS) Amazon Web Services Foundations Benchmark 1.1. You can access this public benchmark on GitHub. Additionally, AWS Trusted Advisor is an online resource to help you improve security by optimizing your AWS environment. Among other things, Trusted Advisor lists a number of security-related controls you should be monitoring. AWS also offers AWS CloudTrail, a logging tool to track usage and API activity. Another example of tooling that enables data protection is Amazon Inspector, which includes a knowledge base of hundreds of rules (regularly updated by AWS security researchers) mapped to common security best practices and vulnerability definitions. Examples of built-in rules include checking for remote root login being enabled or vulnerable software versions installed. These and other tools enable you to design an environment that protects customer data by design.

An accurate inventory of all the GDPR-impacting data is important but sometimes difficult to assess. AWS has some advanced tooling, such as Amazon Macie, to help you determine where customer data is present in your AWS resources. Macie uses advanced machine learning to automatically discover and classify data so that you can protect data, per Article 25.

Article 32 – Security of Processing

You can use many AWS services and features to secure the processing of data regulated by the GDPR. Amazon Virtual Private Cloud (Amazon VPC) lets you provision a logically isolated section of the AWS Cloud where you can launch resources in a virtual network that you define. You have complete control over your virtual networking environment, including the selection of your own IP address range, creation of subnets, and configuration of route tables and network gateways. With Amazon VPC, you can make the Amazon Cloud a seamless extension of your existing on-premises resources.

AWS Key Management Service (AWS KMS) is a managed service that makes it easy for you to create and control the encryption keys used to encrypt your data, and uses hardware security modules (HSMs) to help protect your keys. Managing keys with AWS KMS allows you to choose to encrypt data either on the server side or the client side. AWS KMS is integrated with several other AWS services to help you protect the data you store with these services. AWS KMS is also integrated with CloudTrail to provide you with logs of all key usage to help meet your regulatory and compliance needs. You can also use the AWS Encryption SDK to correctly generate and use encryption keys, as well as protect keys after they have been used.

We also recently announced new encryption and security features for Amazon S3, including default encryption and a detailed inventory report. Services of this type as well as additional GDPR enablers will be published regularly on our GDPR Center.

Other resources

As you prepare for GDPR, you may want to visit our AWS Customer Compliance Center or Tools for Amazon Web Services to learn about options for building anything from small scripts that delete data to a full orchestration framework that uses AWS Code services.

-Chad

How to Query Personally Identifiable Information with Amazon Macie

Post Syndicated from Chad Woolf original https://aws.amazon.com/blogs/security/how-to-query-personally-identifiable-information-with-amazon-macie/

Amazon Macie logo

In August 2017 at the AWS Summit New York, AWS launched a new security and compliance service called Amazon Macie. Macie uses machine learning to automatically discover, classify, and protect sensitive data in AWS. In this blog post, I demonstrate how you can use Macie to help enable compliance with applicable regulations, starting with data retention.

How to query retained PII with Macie

Data retention and mandatory data deletion are common topics across compliance frameworks, so knowing what is stored and how long it has been or needs to be stored is of critical importance. For example, you can use Macie for Payment Card Industry Data Security Standard (PCI DSS) 3.2, requirement 3, “Protect stored cardholder data,” which mandates a “quarterly process for identifying and securely deleting stored cardholder data that exceeds defined retention.” You also can use Macie for ISO 27017 requirement 12.3.1, which calls for “retention periods for backup data.” In each of these cases, you can use Macie’s built-in queries to identify the age of data in your Amazon S3 buckets and to help meet your compliance needs.

To get started with Macie and run your first queries of personally identifiable information (PII) and sensitive data, follow the initial setup as described in the launch post on the AWS Blog. After you have set up Macie, walk through the following steps to start running queries. Start by focusing on the S3 buckets that you want to inventory so that you capture important compliance-related activity and data.

To start running Macie queries:

  1. In the AWS Management Console, launch the Macie console (you can type macie to find the console).
  2. Click Dashboard in the navigation pane. This shows you an overview of the risk level and data classification type of all inventoried S3 buckets, categorized by date and type.
    Screenshot of "Dashboard" in the navigation pane
  3. Choose S3 objects by PII priority. This dashboard lets you sort by PII priority and PII types.
    Screenshot of "S3 objects by PII priority"
  4. In this case, I want to find information about credit card numbers. I choose the magnifying glass for the type cc_number (note that PII types can be used for custom queries). This view shows the events where PII classified data has been uploaded to S3. When I scroll down, I see the individual files that have been identified.
    Screenshot showing the events where PII classified data has been uploaded to S3
  5. Before looking at the files, I want to continue to build the query by only showing items with high priority. To do so, I choose the row called Object PII Priority and then the magnifying glass icon next to High.
    Screenshot of refining the query for high priority events
  6. To view the results matching these queries, I scroll down and choose any file listed. This shows vital information such as creation date, location, and object access control list (ACL).
  7. The piece I am most interested in this case is the Object PII details line to understand more about what was found in the file. In this case, I see name and credit card information, which is what caused the high priority. Scrolling up again, I also see that the query fields have updated as I interacted with the UI.
    Screenshot showing "Object PII details"

Let’s say that I want to get an alert every time Macie finds new data matching this query. This alert can be used to automate response actions by using AWS Lambda and Amazon CloudWatch Events.

  1. I choose the left green icon called Save query as alert.
    Screenshot of "Save query as alert" button
  2. I can customize the alert and change things like category or severity to fit my needs based on the alert data.
  3. Another way to find the information I am looking for is to run custom queries. To start using custom queries, I choose Research in the navigation pane.
    1. To learn more about custom Macie queries and what you can do on the Research tab, see Using the Macie Research Tab.
  4. I change the type of query I want to run from CloudTrail data to S3 objects in the drop-down list menu.
    Screenshot of choosing "S3 objects" from the drop-down list menu
  5. Because I want PII data, I start typing in the query box, which has an autocomplete feature. I choose the pii_types: query. I can now type the data I want to look for. In this case, I want to see all files matching the credit card filter so I type cc_number and press Enter. The query box now says, pii_types:cc_number. I press Enter again to enable autocomplete, and then I type AND pii_types:email to require both a credit card number and email address in a single object.
    The query looks for all files matching the credit card filter ("cc_number")
  6. I choose the magnifying glass to search, and Macie shows me all S3 objects that are tagged as PII of type Credit Cards. I can further specify that I only want to see PII of type Credit Cards that is classified as High priority by adding AND pii_impact:high to the query.
    Screenshot showing narrowing the query results further

As before, I can save this new query as an alert by clicking Save query as alert; going forward, the alert will be triggered by new data matching the query.
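Conceptually, the query built up in these steps acts as a filter over each object's classification metadata. Here's a rough local model in Python; the field names follow Macie's S3 object reference, but the sample records are invented:

```python
def matches(obj, required_types, impact=None):
    """True if the object's PII types include every required type and,
    when an impact level is given, the object's pii_impact equals it."""
    return (all(t in obj.get("pii_types", []) for t in required_types)
            and (impact is None or obj.get("pii_impact") == impact))

objects = [
    {"key": "exports/batch1.csv", "pii_types": ["cc_number", "email"], "pii_impact": "high"},
    {"key": "logs/app.log", "pii_types": ["email"], "pii_impact": "moderate"},
]
hits = [o["key"] for o in objects if matches(o, ["cc_number", "email"], impact="high")]
print(hits)  # ['exports/batch1.csv']
```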

Advanced tip

Try the following advanced queries using Lucene query syntax and save the queries as alerts in Macie.

  • Use a regular-expression based query to search for a minimum of 10 credit card numbers and 10 email addresses in a single object:
    • pii_explain.cc_number:/([1-9][0-9]|[0-9]{3,}) distinct Credit Card Numbers.*/ AND pii_explain.email:/([1-9][0-9]|[0-9]{3,}) distinct Email Addresses.*/
  • Search for objects containing at least one credit card, name, and email address that have an object policy enabling global access (searching for S3 AllUsers or AuthenticatedUsers permissions):
    • (object_acl.Grants.Grantee.URI:"http\://acs.amazonaws.com/groups/global/AllUsers" OR object_acl.Grants.Grantee.URI:"http\://acs.amazonaws.com/groups/global/AuthenticatedUsers") AND (pii_types.cc_number AND pii_types.email AND pii_types.name)
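The first advanced query leans on a regular expression over Macie's human-readable pii_explain text, matching counts of 10 or more. You can test the count pattern locally before saving the alert; the sample strings here are invented stand-ins for that text:

```python
import re

# Two digits with a nonzero leading digit (10-99), or three or more
# digits: i.e., a count of at least 10.
pattern = re.compile(r"([1-9][0-9]|[0-9]{3,}) distinct Credit Card Numbers.*")

samples = [
    "12 distinct Credit Card Numbers detected",
    "9 distinct Credit Card Numbers detected",
    "250 distinct Credit Card Numbers detected",
]
for s in samples:
    print(s, "->", bool(pattern.match(s)))
# 12 -> True, 9 -> False, 250 -> True
```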

These are two ways to identify and be alerted about PII by using Macie. In a similar way, you can create custom alerts for various AWS CloudTrail events by choosing a different data set on which to run the queries again. In the examples in this post, I identified credit cards stored in plain text (all data in this post is example data only), determined how long they had been stored in S3 by viewing the result details, and set up alerts to notify or trigger actions on new sensitive data being stored. With queries like these, you can build a reliable data validation program.

If you have comments about this post, submit them in the “Comments” section below. If you have questions about how to use Macie, start a new thread on the Macie forum or contact AWS Support.

-Chad

AWS Summit New York – Summary of Announcements

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-summit-new-york-summary-of-announcements/

Whew – what a week! Tara, Randall, Ana, and I have been working around the clock to create blog posts for the announcements that we made at the AWS Summit in New York. Here’s a summary to help you to get started:

Amazon Macie – This new service helps you to discover, classify, and secure content at scale. Powered by machine learning and making use of Natural Language Processing (NLP), Macie looks for patterns and alerts you to suspicious behavior, and can help you with governance, compliance, and auditing. You can read Tara’s post to see how to put Macie to work; you select the buckets of interest, customize the classification settings, and review the results in the Macie Dashboard.

AWS Glue – Randall’s post (with deluxe animated GIFs) introduces you to this new extract, transform, and load (ETL) service. Glue is serverless and fully managed. As you can see from the post, Glue crawls your data, infers schemas, and generates ETL scripts in Python. You define jobs that move data from place to place, with a wide selection of transforms, each expressed as code and stored in human-readable form. Glue uses Development Endpoints and notebooks to provide you with a testing environment for the scripts you build. We also announced that Amazon Athena now integrates with AWS Glue, as do Apache Spark and Hive on Amazon EMR.

AWS Migration Hub – This new service will help you to migrate your application portfolio to AWS. My post outlines the major steps and shows you how the Migration Hub accelerates, tracks, and simplifies your migration effort. You can begin with a discovery step, or you can jump right in and migrate directly. Migration Hub integrates with tools from our migration partners and builds upon the Server Migration Service and the Database Migration Service.

CloudHSM Update – We made a major upgrade to AWS CloudHSM, making the benefits of hardware-based key management available to a wider audience. The service is offered on a pay-as-you-go basis, and is fully managed. It is open and standards compliant, with support for multiple APIs, programming languages, and cryptography extensions. CloudHSM is an integral part of AWS and can be accessed from the AWS Management Console, AWS Command Line Interface (CLI), and through API calls. Read my post to learn more and to see how to set up a CloudHSM cluster.

Managed Rules to Secure S3 Buckets – We added two new rules to AWS Config that will help you to secure your S3 buckets. The s3-bucket-public-write-prohibited rule identifies buckets that have public write access and the s3-bucket-public-read-prohibited rule identifies buckets that have global read access. As I noted in my post, you can run these rules in response to configuration changes or on a schedule. The rules make use of some leading-edge constraint solving techniques, as part of a larger effort to use automated formal reasoning about AWS.

CloudTrail for All Customers – Tara’s post revealed that AWS CloudTrail is now available and enabled by default for all AWS customers. As a bonus, Tara reviewed the principal benefits of CloudTrail and showed you how to review your event history and to deep-dive on a single event. She also showed you how to create a second trail, for use with CloudWatch Events.

Encryption of Data at Rest for EFS – When you create a new file system, you now have the option to select a key that will be used to encrypt the contents of the files on the file system. The encryption is done using an industry-standard AES-256 algorithm. My post shows you how to select a key and to verify that it is being used.

Watch the Keynote
My colleagues Adrian Cockcroft and Matt Wood talked about these services and others on the stage, and also invited some AWS customers to share their stories. Here’s the video:

Jeff;

 

AWS Announces Amazon Macie

Post Syndicated from Stephen Schmidt original https://aws.amazon.com/blogs/security/aws-announces-amazon-macie/

I’m pleased to announce that today we’ve launched a new security service, Amazon Macie.

This service leverages machine learning to help customers prevent data loss by automatically discovering, classifying, and protecting sensitive data in AWS. Amazon Macie recognizes sensitive data such as personally identifiable information (PII) or intellectual property, providing customers with dashboards and alerts that give visibility into how data is being accessed or moved. This enables customers to apply machine learning to a wide array of security and compliance workloads; we think this will be a significant enabler for our customers.

To learn more, see the full AWS Blog post.

– Steve