Tag Archives: Compliance

New – Enhanced Amazon Macie Now Available with Substantially Reduced Pricing

Post Syndicated from Danilo Poccia original https://aws.amazon.com/blogs/aws/new-enhanced-amazon-macie-now-available/

Amazon Macie is a fully managed service that helps you discover and protect your sensitive data, using machine learning to automatically spot and classify data for you.

Over time, Macie customers told us what they liked, and what they didn’t. The service team has worked hard to address this feedback, and today I am very happy to share that we are making available a new, enhanced version of Amazon Macie!

This new version has simplified the pricing plan: you are now charged based on the number of Amazon Simple Storage Service (S3) buckets that are evaluated, and the amount of data processed for sensitive data discovery jobs. The new tiered pricing plan has reduced the price by 80%. With higher volumes, you can reduce your costs by more than 90%.

At the same time, we have introduced many new features:

  • Expanded sensitive data discovery, including updated machine learning models for personally identifiable information (PII) detection, and customer-defined sensitive data types using regular expressions.
  • Multi-account support with AWS Organizations.
  • Full API coverage for programmatic use of the service with AWS SDKs and AWS Command Line Interface (CLI).
  • Expanded regional availability to 17 Regions.
  • A new, simplified free tier and free trial to help you get started and understand your costs.
  • A completely redesigned console and user experience.

Macie is now tightly integrated with S3 in the backend, providing more advantages:

  • Enabling S3 data events in AWS CloudTrail is no longer a requirement, further reducing overall costs.
  • All buckets are now continually evaluated, with security findings issued for any public or unencrypted bucket, and for buckets shared with (or replicated to) an AWS account outside of your Organization.

The anomaly detection features that monitor S3 data access activity, previously available in Macie, are now in private beta as part of Amazon GuardDuty, and have been enhanced to include deeper capabilities to protect your data in S3.

Enabling Amazon Macie
In the Macie console, I choose Enable Macie. If you use AWS Organizations, you can delegate an AWS account to administer Macie for your Organization.

After it has been enabled, Amazon Macie automatically provides a summary of my S3 buckets in the region, and continually evaluates those buckets to generate actionable security findings for any unencrypted or publicly accessible data, including buckets shared with AWS accounts outside of my Organization.

Below the summary, I see the top findings by type and by S3 bucket. Overall, this page provides a great overview of the status of my S3 buckets.

In the Findings section I have the full list of findings, and I can select them to archive, unarchive, or export them. I can also select one of the findings to see the full information collected by Macie.

Findings can be viewed in the web console and are sent to Amazon CloudWatch Events for easy integration with existing workflow or event management systems, or to be used in combination with AWS Step Functions to take automated remediation actions. This can help meet regulations such as the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), the General Data Protection Regulation (GDPR), and the California Consumer Privacy Act (CCPA).
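
As a quick illustration of that integration (a minimal sketch, not from the original post), the following boto3 snippet creates a CloudWatch Events rule that matches Macie findings and routes them to an SNS topic for triage; the topic ARN is a placeholder:

    import boto3

    events = boto3.client("events")

    # Match events emitted by Amazon Macie ("aws.macie" is the event source)
    events.put_rule(
        Name="macie-findings-to-sns",
        EventPattern='{"source": ["aws.macie"]}',
        State="ENABLED",
    )

    # Send matching events to a (placeholder) SNS topic for triage
    events.put_targets(
        Rule="macie-findings-to-sns",
        Targets=[{
            "Id": "sns-triage",
            "Arn": "arn:aws:sns:us-east-1:123456789012:macie-triage",
        }],
    )

From there, a Step Functions state machine or a Lambda function subscribed to the topic can apply whatever remediation you choose.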

In the S3 Buckets section, I can search and filter on buckets of interest to create sensitive data discovery jobs across one or multiple buckets to discover sensitive data in objects, and to check encryption status and public accessibility at object level. Jobs can be executed once, or scheduled daily, weekly, or monthly.

For jobs, Amazon Macie automatically tracks changes to the buckets and only evaluates new or modified objects over time. In the additional settings, I can include or exclude objects based on tags, size, file extensions, or last modified date.
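
As a minimal sketch of what this looks like programmatically (assuming the macie2 API in the AWS SDK for Python; the account ID, bucket name, and size threshold are placeholders), a weekly job that skips objects larger than 100 MB could be created like this:

    import boto3

    macie = boto3.client("macie2")

    macie.create_classification_job(
        name="weekly-bucket-scan",
        jobType="SCHEDULED",
        scheduleFrequency={"weeklySchedule": {"dayOfWeek": "MONDAY"}},
        s3JobDefinition={
            "bucketDefinitions": [
                {"accountId": "123456789012", "buckets": ["my-example-bucket"]}
            ],
            "scoping": {
                "excludes": {
                    "and": [{
                        # Skip objects larger than 100 MB (size is in bytes)
                        "simpleScopeTerm": {
                            "comparator": "GT",
                            "key": "OBJECT_SIZE",
                            "values": ["104857600"],
                        }
                    }]
                }
            },
        },
    )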

To monitor my costs, and the use of the free trial, I look at the Usage section of the console.

Creating Custom Data Identifiers
Amazon Macie natively supports the most common sensitive data types, including personally identifiable information (PII) and credential data. You can extend that list with custom data identifiers to discover proprietary or unique sensitive data for your business.

For example, companies often have a specific syntax for their employee IDs. One possible syntax is a capital letter (indicating whether the employee is full-time or part-time), followed by a dash and then eight digits. Possible values in this case are F-12345678 or P-87654321.

To create this custom data identifier, I enter a regular expression (regex) to describe the pattern to match:

[A-Z]-\d{8}

To avoid false positives, I require that the keyword employee is found near the identifier (by default, less than 50 characters apart). I use the Evaluate box to test that this configuration works with sample text, then I select Submit.
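
The same identifier can also be created programmatically. Here is a minimal sketch with boto3 (macie2), reusing the regex and the keyword proximity described above:

    import boto3

    macie = boto3.client("macie2")

    macie.create_custom_data_identifier(
        name="employee-id",
        regex=r"[A-Z]-\d{8}",
        keywords=["employee"],     # require this keyword near a match
        maximumMatchDistance=50,   # the default proximity described above
    )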

Available Now
For Amazon Macie regional availability, please see the AWS Region Table. You can find more information on the new, enhanced Macie in the documentation.

This release of Amazon Macie remains optimized for S3. However, anything you can get into S3, permanently or temporarily, in an object format supported by Macie, can be scanned for sensitive data. This allows you to expand the coverage to data residing outside of S3 by pulling data out of custom applications, databases, and third-party services, temporarily placing it in S3, and using Amazon Macie to identify sensitive data.

For example, we’ve made this even easier with RDS and Aurora now supporting snapshot export to S3 in Apache Parquet, which is a format Macie supports. Similarly, in DynamoDB, you can use AWS Glue to export tables to S3, which can then be scanned by Macie. With the new API and SDK coverage, you can use the new enhanced Amazon Macie as a building block in an automated process that exports data to S3 to discover and protect your sensitive data across multiple sources.

Danilo

The DoD Isn’t Fixing Its Security Problems

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2020/04/the_dod_isnt_fi.html

The DoD has produced several reports outlining what’s wrong and what needs to be fixed. It’s not fixing them:

GAO looked at three DoD-designed initiatives to see whether the Pentagon is following through on its own goals. In a majority of cases, DoD has not completed the cybersecurity training and awareness tasks it set out to. The status of various efforts is simply unknown because no one has tracked their progress. While an assessment of “cybersecurity hygiene” like this doesn’t directly analyze a network’s hardware and software vulnerabilities, it does underscore the need for people who use digital systems to interact with them in secure ways. Especially when those people work on national defense.

[…]

The report focuses on three ongoing DoD cybersecurity hygiene initiatives. The 2015 Cybersecurity Culture and Compliance Initiative outlined 11 education-related goals for 2016; the GAO found that the Pentagon completed only four of them. Similarly, the 2015 Cyber Discipline plan outlined 17 goals related to detecting and eliminating preventable vulnerabilities from DoD’s networks by the end of 2018. GAO found that DoD has met only six of those. Four are still pending, and the status of the seven others is unknown, because no one at DoD has kept track of the progress.

GAO repeatedly identified lack of status updates and accountability as core issues within DoD’s cybersecurity awareness and education efforts. It was unclear in many cases who had completed which training modules. There were even DoD departments lacking information on which users should have their network access revoked for failure to complete trainings.

The report.

New IRAP report provides Australian public sector the ability to leverage additional services at PROTECTED level

Post Syndicated from John Hildebrandt original https://aws.amazon.com/blogs/security/new-irap-report-australian-public-sector-leverage-services-protected-level/

Following the award of PROTECTED certification to AWS in January 2019, we have now released updated Information Security Registered Assessors Program (IRAP) PROTECTED documentation via AWS Artifact. This information provides the ability to plan, architect, and self-assess systems built in AWS under the Digital Transformation Agency’s Secure Cloud Guidelines. The new documentation expands the scope to 64 PROTECTED services, including new category areas such as artificial intelligence (AI), machine learning (ML), and IoT services. For example, Amazon SageMaker is a service that provides every developer and data scientist with tools to build, train, and deploy machine learning models quickly.

This documentation gives public sector customers everything needed to evaluate AWS at the PROTECTED level. AWS is making this resource available to download on-demand through AWS Artifact. The guide provides a mapping of AWS controls for securing PROTECTED data.

The AWS IRAP PROTECTED documentation helps individual agencies simplify the process of adopting AWS services. The information enables individual agencies to complete their own assessments and adopt AWS for a broader range of services. These assessed AWS services are available within the existing AWS Asia-Pacific (Sydney) Region and cover service categories such as compute, storage, network, database, security, analytics, AI/ML, IoT, application integration, management, and governance. This means you can take advantage of all the security benefits without paying a price premium, or needing to modify your existing applications or environments.

The newly added services to the scope of the IRAP document are listed below.

For the full list of services in scope of the IRAP report, see the services in scope page (select the IRAP tab).

If you have questions about our PROTECTED certification or would like to inquire about how to use AWS for your highly sensitive workloads, contact your account team.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


John Hildebrandt

John is Head of Security Assurance for Australia and New Zealand at AWS in Canberra Australia. He is passionate about removing regulatory barriers of cloud adoption for customers. John has been working with Government customers at AWS for over 7 years, as the first employee for the ANZ Public Sector team.

AWS achieves FedRAMP JAB High and Moderate Provisional Authorization across 16 services in the AWS US East/West and AWS GovCloud (US) Regions

Post Syndicated from Amendaze Thomas original https://aws.amazon.com/blogs/security/aws-achieves-fedramp-jab-high-moderate-provisional-authorization-16-services-us-east-west-govcloud-us-regions/

AWS is continually expanding the scope of our compliance programs to help your organization run sensitive and regulated workloads. Today, we’re pleased to announce an additional array of AWS services that are available in the AWS US East/West and AWS GovCloud (US) Regions, marking a 17.7% increase in our number of FedRAMP authorizations since the beginning of December 2019. We’ve achieved authorizations for 16 additional services, 6 of which have been authorized for both the AWS US East/West and AWS GovCloud (US) Regions.

We’ve achieved FedRAMP authorizations for the following 7 services in our AWS US East/West Regions:

We also received 15 service authorizations in our AWS GovCloud (US) Regions:

In total, we now offer 78 AWS services authorized in the AWS US East/West Regions under FedRAMP Moderate and 70 services authorized in the AWS GovCloud (US) Regions under FedRAMP High. You can see our full, updated list of authorizations on the FedRAMP Marketplace. We also list all of our services in scope by compliance program on our Services in Scope by Compliance Program page.

Our FedRAMP assessment was completed with an accredited third-party assessment organization (3PAO) to ensure an independent validation of our technical, management, and operational security controls against the FedRAMP NIST requirements and baselines.

We care deeply about our customers’ needs, and compliance is my team’s priority. We want to continue to onboard services into the compliance programs our customers are using, such as FedRAMP.

To learn what other public sector customers are doing on AWS, see our Government, Education, and Nonprofits Case Studies and Customer Success Stories. Stay tuned for future updates on our Services in Scope by Compliance Program page. If you have feedback about this blog post, let us know in the Comments section below.


Amendaze Thomas

Amendaze is the manager of the AWS Government Assessments and Authorization Program (GAAP). He has 15 years of experience providing advisory services to clients in the Federal government, and over 13 years’ experience supporting CISO teams with risk management framework (RMF) activities.

55 additional AWS services achieve HITRUST CSF Certification

Post Syndicated from Hadis Ali original https://aws.amazon.com/blogs/security/55-additional-aws-services-achieve-hitrust-csf-certification/

We’re excited to announce the addition of 55 new services in scope under our latest Health Information Trust Alliance (HITRUST) Common Security Framework (CSF) certification, for a total of 119 AWS services in scope.

You can deploy environments onto AWS and inherit our HITRUST certification provided that you use only in-scope services and apply the controls detailed in the CSF. The full list of HITRUST CSF certified AWS services is available on our Services in Scope by Compliance program page. The HITRUST CSF certification of AWS is valid for two years.

AWS has released a Quick Start reference architecture for HITRUST on AWS. The Quick Start includes AWS CloudFormation templates to automate building a baseline architecture that fits within your organization’s larger HITRUST program. It also includes a security controls reference (XLS download), which maps HITRUST controls to architecture decisions, features, and configuration of the baseline.

You can download the new HITRUST CSF certificate now through AWS Artifact in the AWS Management Console. As always, we value your feedback and questions. Please feel free to reach out to the team through the Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Hadis Ali

Hadis is a Security and Privacy Manager at Amazon Web Services. He leads multiple security and privacy initiatives within AWS Security Assurance. Hadis holds Bachelor’s degrees in Accounting and Information Systems from the University of Washington.

Fall 2019 PCI DSS report now available with 7 services added in scope

Post Syndicated from Nivetha Chandran original https://aws.amazon.com/blogs/security/fall-2019-pci-dss-report-available-7-services-added/

We’re pleased to announce that seven services have been added to the scope of our Payment Card Industry Data Security Standard (PCI DSS) certification, providing our customers more options to process and store their payment card data and architect their Cardholder Data Environment (CDE) securely in AWS.

In the past year we have increased the scope of our PCI DSS certification by 19%. With these additions, you can now select from a total of 118 PCI-certified services. You can see the full list on our Services in Scope by Compliance program page. The seven new services are:

We were evaluated by third-party auditors from Coalfire, and their report is available through AWS Artifact.

To learn more about our PCI program and other compliance and security programs, see the AWS Compliance Programs page. As always, we value your feedback and questions; reach out to the team through the Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Nivetha Chandran

Nivetha is a Security Assurance Manager at Amazon Web Services on the Global Audits team, managing the PCI compliance program. Nivetha holds a Master’s degree in Information Management from the University of Washington.

AWS achieves FedRAMP JAB High and Moderate Provisional Authorization across 26 services in the AWS US East/West and AWS GovCloud (US) Regions

Post Syndicated from Amendaze Thomas original https://aws.amazon.com/blogs/security/aws-achieves-fedramp-jab-high-and-moderate-provisional-authorization-across-26-services-in-the-aws-us-east-west-and-aws-govcloud-us-regions/

AWS continues to expand the number of services that customers can use to run sensitive and highly regulated workloads in the federal government space. Today, I’m pleased to announce another expansion of our FedRAMP program, marking a 36.2% increase in our number of FedRAMP authorizations. We’ve achieved authorizations for 26 additional services, 7 of which have been authorized for both the AWS US East/West and AWS GovCloud (US) Regions.

We’ve achieved FedRAMP authorizations for the following 22 services in our AWS US East/West Regions:

We also received 11 service authorizations in our AWS GovCloud (US) Regions:

In total, we now offer 70 AWS services authorized in the AWS US East/West Regions under FedRAMP Moderate and 54 services authorized in the AWS GovCloud (US) Regions under FedRAMP High. You can see our full, updated list of authorizations on the FedRAMP Marketplace. We also list all of our services in scope by compliance program on our Services in Scope page.

Our FedRAMP assessment was completed with a third-party assessment partner to ensure an independent validation of our technical, management, and operational security controls against the FedRAMP baselines.

We care deeply about our customers’ needs, and compliance is my team’s priority. We want to continue to onboard services into the compliance programs our customers are using, such as FedRAMP.

To learn what other public sector customers are doing on AWS, see our Government, Education, and Nonprofits Case Studies and Customer Success Stories. Stay tuned for future updates on our Services in Scope by Compliance Program page. If you have feedback about this blog post, let us know in the Comments section below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


Amendaze Thomas

Amendaze is the manager of the AWS Government Assessments and Authorization Program (GAAP). He has 15 years of experience providing advisory services to clients in the Federal government, and over 13 years’ experience supporting CISO teams with risk management framework (RMF) activities.

Identify Unintended Resource Access with AWS Identity and Access Management (IAM) Access Analyzer

Post Syndicated from Brandon West original https://aws.amazon.com/blogs/aws/identify-unintended-resource-access-with-aws-identity-and-access-management-iam-access-analyzer/

Today I get to share my favorite kind of announcement. It’s the sort of thing that will improve security for just about everyone who builds on AWS, it can be turned on with almost no configuration, and it costs nothing to use. We’re launching a new, first-of-its-kind capability called AWS Identity and Access Management (IAM) Access Analyzer. IAM Access Analyzer mathematically analyzes access control policies attached to resources and determines which resources can be accessed publicly or from other accounts. It continuously monitors all policies for Amazon Simple Storage Service (S3) buckets, IAM roles, AWS Key Management Service (KMS) keys, AWS Lambda functions, and Amazon Simple Queue Service (SQS) queues. With IAM Access Analyzer, you have visibility into the aggregate impact of your access controls, so you can be confident your resources are protected from unintended access from outside of your account.

Let’s look at a couple examples. An IAM Access Analyzer finding might indicate an S3 bucket named my-bucket-1 is accessible to an AWS account with the ID 123456789012 when originating from the source IP 11.0.0.0/15. Or IAM Access Analyzer may detect a KMS key policy that allows users from another account to delete the key, identifying a data loss risk you can fix by adjusting the policy. If the findings show intentional access paths, they can be archived.

So how does it work? Using the kind of math that shows up on unexpected final exams in my nightmares, IAM Access Analyzer evaluates your policies to determine how a given resource can be accessed. Critically, this analysis is not based on historical events or pattern matching or brute force tests. Instead, IAM Access Analyzer understands your policies semantically. All possible access paths are verified by mathematical proofs, and thousands of policies can be analyzed in a few seconds. This is done using a field of computer science called automated reasoning. IAM Access Analyzer is the first service powered by automated reasoning available to builders everywhere, offering functionality unique to AWS. To start learning about automated reasoning, I highly recommend this short video explainer. If you are interested in diving a bit deeper, check out this re:Invent talk on automated reasoning from Byron Cook, Director of the AWS Automated Reasoning Group. And if you’re really interested in understanding the methodology, make yourself a nice cup of chamomile tea, grab a blanket, and get cozy with a copy of Semantic-based Automated Reasoning for AWS Access Policies using SMT.

Turning on IAM Access Analyzer is way less stressful than an unexpected nightmare final exam. There’s just one step. From the IAM Console, select Access analyzer from the menu on the left, then click Create analyzer.

Creating an Access Analyzer

Analyzers generate findings in the account from which they are created. Analyzers also work within the region defined when they are created, so create one in each region for which you’d like to see findings.

Once our analyzer is created, findings that show accessible resources appear in the Console. My account has a few findings that are worth looking into, such as KMS keys and IAM roles that are accessible by other accounts and federated users.

Viewing Access Analyzer Findings

I’m going to click on the first finding and take a look at the access policy for this KMS key.

An Access Analyzer Finding

From here we can see the open access paths and details about the resources and principals involved. I went over to the KMS console and confirmed that this is intended access, so I archived this particular finding.

All IAM Access Analyzer findings are visible in the IAM Console, and can also be accessed using the IAM Access Analyzer API. Findings related to S3 buckets can be viewed directly in the S3 Console. Bucket policies can then be updated right in the S3 Console, closing the open access pathway.

An Access Analyzer finding in S3
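
For a rough sketch of what that API usage looks like with boto3 (the analyzer name and finding ID below are placeholders), you can create an analyzer, list its findings, and archive one that reflects intended access:

    import boto3

    aa = boto3.client("accessanalyzer")

    # Create an account-level analyzer and keep its ARN for later calls
    analyzer_arn = aa.create_analyzer(
        analyzerName="my-analyzer", type="ACCOUNT"
    )["arn"]

    # List current findings and print a short summary of each
    for finding in aa.list_findings(analyzerArn=analyzer_arn)["findings"]:
        print(finding["id"], finding["resourceType"], finding["status"])

    # Archive a finding once you've confirmed the access is intentional
    aa.update_findings(
        analyzerArn=analyzer_arn,
        ids=["example-finding-id"],
        status="ARCHIVED",
    )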

You can also see high-priority findings generated by IAM Access Analyzer in AWS Security Hub, ensuring a comprehensive, single source of truth for your compliance and security-focused team members. IAM Access Analyzer also integrates with CloudWatch Events, making it easy to automatically respond to or send alerts regarding findings through the use of custom rules.

Now that you’ve seen how IAM Access Analyzer provides a comprehensive overview of cloud resource access, you should probably head over to IAM and turn it on. One of the great advantages of building in the cloud is that the infrastructure and tools continue to get stronger over time and IAM Access Analyzer is a great example. Did I mention that it’s free? Fire it up, then send me a tweet sharing some of the interesting things you find. As always, happy building!

— Brandon

Use AWS Fargate and Prowler to send security configuration findings about AWS services to Security Hub

Post Syndicated from Jonathan Rau original https://aws.amazon.com/blogs/security/use-aws-fargate-prowler-send-security-configuration-findings-about-aws-services-security-hub/

In this blog post, I’ll show you how to integrate Prowler, an open-source security tool, with AWS Security Hub. Prowler provides dozens of security configuration checks related to services such as Amazon Redshift, Amazon ElastiCache, Amazon API Gateway, and Amazon CloudFront. Integrating Prowler with Security Hub will provide posture information about resources not currently covered by existing Security Hub integrations or compliance standards. You can use Prowler checks to supplement the existing CIS AWS Foundations compliance standard Security Hub already provides, as well as other compliance-related findings you may be ingesting from partner solutions.

In this post, I’ll show you how to containerize Prowler using Docker and host it on the serverless container service AWS Fargate. By running Prowler on Fargate, you no longer have to provision, configure, or scale infrastructure, and it will only run when needed. Containers provide a standard way to package your application’s code, configurations, and dependencies into a single object that can run anywhere. Serverless applications automatically run and scale in response to events you define, rather than requiring you to provision, scale, and manage servers.

Solution overview

The following diagram shows the flow of events in the solution I describe in this blog post.
 

Figure 1: Prowler on Fargate Architecture

 

The integration works as follows:

  1. A time-based CloudWatch Event starts the Fargate task on a schedule you define.
  2. Fargate pulls a Prowler Docker image from Amazon Elastic Container Registry (ECR).
  3. Prowler scans your AWS infrastructure and writes the scan results to a CSV file.
  4. Python scripts in the Prowler container convert the CSV to JSON and load an Amazon DynamoDB table with formatted Prowler findings.
  5. A DynamoDB stream invokes an AWS Lambda function.
  6. The Lambda function maps Prowler findings into the AWS Security Finding Format (ASFF) before importing them to Security Hub (a minimal sketch of such a function follows this list).
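
To make step 6 concrete, here is a minimal, hypothetical sketch of such a function. The actual function is deployed for you by the CloudFormation template in Step 3; the severity value, finding type, and AWS_ACCOUNT_ID environment variable below are illustrative assumptions, not the template’s exact implementation.

    import datetime
    import os

    import boto3

    securityhub = boto3.client("securityhub")
    ACCOUNT = os.environ["AWS_ACCOUNT_ID"]  # hypothetical env var
    REGION = os.environ["AWS_REGION"]       # set by the Lambda runtime

    def handler(event, context):
        findings = []
        for record in event["Records"]:
            # Only map newly inserted Prowler results
            if record["eventName"] != "INSERT":
                continue
            item = record["dynamodb"]["NewImage"]
            now = datetime.datetime.utcnow().isoformat() + "Z"
            findings.append({
                "SchemaVersion": "2018-10-08",
                "Id": item["TITLE_ID"]["S"],
                "ProductArn": f"arn:aws:securityhub:{REGION}:{ACCOUNT}:product/{ACCOUNT}/default",
                "GeneratorId": "prowler",
                "AwsAccountId": ACCOUNT,
                "Types": ["Software and Configuration Checks/Industry and Regulatory Standards"],
                "CreatedAt": now,
                "UpdatedAt": now,
                "Severity": {"Normalized": 40},  # illustrative fixed severity
                "Title": item["TITLE_TEXT"]["S"],
                "Description": item["NOTES"]["S"],
                "Resources": [{"Type": "AwsAccount", "Id": f"AWS::::Account:{ACCOUNT}"}],
            })
        if findings:
            securityhub.batch_import_findings(Findings=findings)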

Except for an ECR repository, you’ll deploy all of the above via AWS CloudFormation. You’ll also need the following prerequisites to supply as parameters for the CloudFormation template.

Prerequisites

  • A VPC with at least 2 subnets that have access to the Internet, plus a security group that allows access on port 443 (HTTPS).
  • An ECS task role with the permissions that Prowler needs to complete its scans. You can find more information about these permissions on the official Prowler GitHub page.
  • An ECS task execution IAM role to allow Fargate to publish logs to CloudWatch and to download your Prowler image from Amazon ECR.

Step 1: Create an Amazon ECR repository

In this step, you’ll create an ECR repository. This is where you’ll upload your Docker image for Step 2.

  1. Navigate to the Amazon ECR Console and select Create repository.
  2. Enter a name for your repository (I’ve named my example securityhub-prowler, as shown in figure 2), then choose Mutable as your image tag mutability setting, and select Create repository.
     
    Figure 2: ECR Repository Creation

    Figure 2: ECR Repository Creation

Keep the browser tab in which you created the repository open so that you can easily reference the Docker commands you’ll need in the next step.

Step 2: Build and push the Docker image

In this step, you’ll create a Docker image that contains scripts that will map Prowler findings into DynamoDB. Before you begin step 2, ensure your workstation has the necessary permissions to push images to ECR.

  1. Create a Dockerfile via your favorite text editor, and name it Dockerfile.
    
    FROM python:latest
    
    # Declare env vars (names must match the variables loader.py reads)
    ENV MY_DYANMODB_TABLE=MY_DYANMODB_TABLE
    ENV AWS_REGION=AWS_REGION
    
    # Install Dependencies
    RUN \
        apt update && \
        apt upgrade -y && \
        pip install awscli && \
        apt install -y python3-pip
    
    # Place scripts
    ADD converter.py /root
    ADD loader.py /root
    ADD script.sh /root
    
    # Installs prowler, moves scripts into prowler directory
    RUN \
        git clone https://github.com/toniblyx/prowler && \
        mv root/converter.py /prowler && \
        mv root/loader.py /prowler && \
        mv root/script.sh /prowler
    
    # Runs prowler, ETLs output with converter and loads DynamoDB with loader
    WORKDIR /prowler
    RUN pip3 install boto3
    CMD bash script.sh
    

  2. Create a new file called script.sh and paste in the below code. This script will call the remaining scripts (which you’re about to create) in a specific order.

    Note: Change the AWS Region in the Prowler command on line 3 to the region in which you’ve enabled Security Hub.

    
    #!/bin/bash
    echo "Running Prowler Scans"
    ./prowler -b -n -f us-east-1 -g extras -M csv > prowler_report.csv
    echo "Converting Prowler Report from CSV to JSON"
    python converter.py
    echo "Loading JSON data into DynamoDB"
    python loader.py
    

  3. Create a new file called converter.py and paste in the below code. This Python script will convert the Prowler CSV report into JSON, and both versions will be written to the local storage of the Prowler container.
    
    import csv
    import json
    
    # Set variables for within container
    CSV_PATH = 'prowler_report.csv'
    JSON_PATH = 'prowler_report.json'
    
    # Reads prowler CSV output
    csv_file = csv.DictReader(open(CSV_PATH, 'r'))
    
    # Create empty JSON list, read out rows from CSV into it
    json_list = []
    for row in csv_file:
        json_list.append(row)
    
    # Writes the rows out as a JSON file on the container's local storage
    open(JSON_PATH, 'w').write(json.dumps(json_list))
    
    # open newly converted prowler output
    with open('prowler_report.json') as f:
        data = json.load(f)
    
    # remove data not needed for Security Hub BatchImportFindings    
    for element in data: 
        del element['PROFILE']
        del element['SCORED']
        del element['LEVEL']
        del element['ACCOUNT_NUM']
        del element['REGION']
    
    # writes out to a new file, prettified
    with open('format_prowler_report.json', 'w') as f:
        json.dump(data, f, indent=2)
    

  4. Create your last file, called loader.py, and paste in the code below. This Python script will read values from the JSON file and send them to DynamoDB.
    
    from __future__ import print_function # Python 2/3 compatibility
    import boto3
    import json
    import decimal
    import os
    
    awsRegion = os.environ['AWS_REGION']
    prowlerDynamoDBTable = os.environ['MY_DYANMODB_TABLE']
    
    dynamodb = boto3.resource('dynamodb', region_name=awsRegion)
    
    table = dynamodb.Table(prowlerDynamoDBTable)
    
    # CHANGE FILE AS NEEDED
    with open('format_prowler_report.json') as json_file:
        findings = json.load(json_file, parse_float = decimal.Decimal)
        for finding in findings:
            TITLE_ID = finding['TITLE_ID']
            TITLE_TEXT = finding['TITLE_TEXT']
            RESULT = finding['RESULT']
            NOTES = finding['NOTES']
    
            print("Adding finding:", TITLE_ID, TITLE_TEXT)
    
            table.put_item(
               Item={
                   'TITLE_ID': TITLE_ID,
                   'TITLE_TEXT': TITLE_TEXT,
                   'RESULT': RESULT,
                   'NOTES': NOTES,
                }
            )
    

  5. From the ECR console, within your repository, select View push commands to get operating system-specific instructions and additional resources to build, tag, and push your image to ECR. See Figure 3 for an example.
     
    Figure 3: ECR Push Commands

    Note: If you’ve built Docker images previously within your workstation, pass the --no-cache flag with your docker build command. An illustrative set of these build and push commands is sketched after this list.

  6. After you’ve built and pushed your Image, note the URI within the ECR console (such as 12345678910.dkr.ecr.us-east-1.amazonaws.com/my-repo), as you’ll need this for a CloudFormation parameter in step 3.
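
For reference, the push commands the console provides typically look like the following (the Region, account ID, and repository name are placeholders; older versions of the AWS CLI use aws ecr get-login instead of get-login-password):

    # Authenticate Docker to your ECR registry
    aws ecr get-login-password --region us-east-1 | \
        docker login --username AWS --password-stdin 12345678910.dkr.ecr.us-east-1.amazonaws.com

    # Build (optionally with --no-cache), tag, and push the image
    docker build -t securityhub-prowler .
    docker tag securityhub-prowler:latest 12345678910.dkr.ecr.us-east-1.amazonaws.com/securityhub-prowler:latest
    docker push 12345678910.dkr.ecr.us-east-1.amazonaws.com/securityhub-prowler:latest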

Step 3: Deploy CloudFormation template

Download the CloudFormation template from GitHub and create a CloudFormation stack. For more information about how to create a CloudFormation stack, see Getting Started with AWS CloudFormation in the CloudFormation User Guide.

You’ll need the values you noted in Step 2 and during the “Solution overview” prerequisites. The description of each parameter is provided on the Parameters page of the CloudFormation deployment (see Figure 4).
 

Figure 4: CloudFormation Parameters

After the CloudFormation stack finishes deploying, click the Resources tab to find your Task Definition (called ProwlerECSTaskDefinition). You’ll need this during Step 4.
 

Figure 5: CloudFormation Resources

Step 4: Manually run ECS task

In this step, you’ll run your ECS task manually to verify the integration works. (Once you’ve tested it, this step will run automatically based on CloudWatch Events.) A hypothetical CLI equivalent of the console steps is sketched at the end of this section.

  1. Navigate to the Amazon ECS console and from the navigation pane select Task Definitions.
  2. As shown in Figure 6, select the check box for the task definition you deployed via CloudFormation, then select the Actions dropdown menu and choose Run Task.
     
    Figure 6: ECS Run Task

  3. Configure the following settings (shown in Figure 7), then select Run Task:
    1. Launch Type: FARGATE
    2. Platform Version: Latest
    3. Cluster: Select the cluster deployed by CloudFormation
    4. Number of tasks: 1
    5. Cluster VPC: Enter the VPC of the subnets you provided as CloudFormation parameters
    6. Subnets: Select 1 or more subnets in the VPC
    7. Security groups: Enter the same security group you provided as a CloudFormation parameter
    8. Auto-assign public IP: ENABLED
       
      Figure 7: ECS Task Settings

  4. Depending on the size of your account and the resources within it, your task can take up to an hour to complete. Select your task and follow the progress on the Logs tab within the Task view (Figure 8). The stdout from Prowler will appear in the logs.

    Note: Once the task has completed it will automatically delete itself. You do not need to take additional actions for this to happen during this or subsequent runs.

     

    Figure 8: ECS Task Logs

  5. Under the Details tab, monitor the status. When the status reads Stopped, navigate to the DynamoDB console.
  6. Select your table, then select the Items tab. Your findings will be indexed under the primary key NOTES, as shown in Figure 9. From here, the Lambda function will trigger each time new items are written into the table from Fargate and will load them into Security Hub.
     
    Figure 9: DynamoDB Items

  7. Finally, navigate to the Security Hub console, select the Findings menu, and wait for findings from Prowler to arrive in the dashboard as shown in Figure 10.
     
    Figure 10: Prowler Findings in Security Hub
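
As mentioned at the start of this step, a hypothetical AWS CLI equivalent of the console run-task settings looks like the following; the cluster name, subnet ID, and security group ID are placeholders that come from your CloudFormation stack:

    # Run the Prowler task once on Fargate with a public IP
    aws ecs run-task \
        --cluster prowler-cluster \
        --launch-type FARGATE \
        --platform-version LATEST \
        --task-definition ProwlerECSTaskDefinition \
        --count 1 \
        --network-configuration "awsvpcConfiguration={subnets=[subnet-0123456789abcdef0],securityGroups=[sg-0123456789abcdef0],assignPublicIp=ENABLED}"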

If you run into errors when running your Fargate task, refer to the Amazon ECS Troubleshooting guide. Log errors commonly come from missing permissions or disabled Regions; refer back to the Prowler GitHub page for troubleshooting information.

Conclusion

In this post, I showed you how to containerize Prowler, run it manually, create a schedule with CloudWatch Events, and use custom Python scripts along with DynamoDB streams and Lambda functions to load Prowler findings into Security Hub. By using Security Hub, you can centralize and aggregate security configuration information from Prowler alongside findings from AWS and partner services.

From Security Hub, you can use custom actions to send one or a group of findings from Prowler to downstream services such as ticketing systems or to take custom remediation actions. You can also use Security Hub custom insights to create saved searches from your Prowler findings. Lastly, you can use Security Hub in a master-member format to aggregate findings across multiple accounts for centralized reporting.

If you have feedback about this blog post, submit comments in the Comments section below. If you have questions about this blog post, start a new thread on the AWS Security Hub forum.

Want more AWS Security news? Follow us on Twitter.


Jonathan Rau

Jonathan is the Senior TPM for AWS Security Hub. He holds an AWS Certified Security – Specialty certification and is extremely passionate about cybersecurity, data privacy, and new emerging technologies, such as blockchain. He devotes personal time to research and advocacy on those same topics.

15 additional AWS services receive DoD Impact Level 4 and 5 authorization

Post Syndicated from Tyler Harding original https://aws.amazon.com/blogs/security/15-additional-aws-services-receive-dod-impact-level-4-and-5-authorization/

I’m pleased to announce that the Defense Information Systems Agency (DISA) has extended the Provisional Authorization to Operate (P-ATO) of AWS GovCloud (US) Regions for Department of Defense (DoD) workloads at DoD Impact Levels (IL) 4 and 5 under the DoD’s Cloud Computing Security Requirements Guide (DoD CC SRG). Our authorizations at DoD IL 4 and IL 5 allow DoD Mission Owners to process unclassified, mission critical workloads for National Security Systems in the AWS GovCloud (US) Regions.

AWS successfully completed an independent, third-party evaluation that confirmed we effectively implement over 400 security controls using applicable criteria from NIST SP 800-53 Rev 4, the US General Services Administration’s FedRAMP High baseline, the DoD CC SRG, and the Committee on National Security Systems Instruction No. 1253 at the High Confidentiality, High Integrity, and High Availability impact levels.

In addition to a P-ATO extension through November 2020 for the existing 24 services, DISA authorized an additional 15 AWS services at DoD Impact Levels 4 and 5 as listed below.

Newly authorized AWS services at DoD Impact Levels 4 and 5

With these additional 15 services, we can now offer AWS customers a total of 39 services authorized to store, process, and analyze DoD mission critical data at Impact Level 4 and Impact Level 5.

To learn more about AWS solutions for DoD, please see our AWS solution offerings. Stay tuned for future updates on our Services in Scope by Compliance Program page. If you have feedback about this blog post, let us know in the Comments section below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Tyler Harding

Tyler is the DoD Compliance Program Manager within AWS Security Assurance. He has over 20 years of experience providing information security solutions to federal civilian, DoD, and intelligence agencies.

New guidance to help you navigate Australian Prudential Regulation Authority requirements

Post Syndicated from Paul Curtis original https://aws.amazon.com/blogs/security/new-guidance-navigate-australian-prudential-regulation-authority-requirements/

There have been two noteworthy 2019 updates for Australian Prudential Regulation Authority (APRA) regulated entities such as banks, insurance companies, credit unions, deposit takers, and the superannuation industry.

On June 25, APRA released an updated version of the Prudential Practice Guide CPG 234 Information Security, which provides guidance on how to implement the revised Prudential Standard CPS 234 Information Security. The new Prudential Practice Guide has been expanded significantly compared to the previous version. The revised guidance reflects the evolving cybersecurity landscape and focuses on areas of importance to APRA regulated entities.

On July 1, APRA’s Prudential Standard CPS 234 Information Security became effective. This standard represents a set of legally enforceable information security requirements for APRA regulated entities. CPS 234 aims to:

“…ensure that an APRA regulated entity takes measures to be resilient against information security incidents (including cyberattacks) by maintaining an information security capability commensurate with information security vulnerabilities and threats.”

In response to these updates, we have updated our AWS User Guide to Financial Services Regulations & Guidelines in Australia, which provides APRA regulated entities with a summary of APRA’s requirements, plus recommendations related to outsourcing and IT risk in the cloud. In addition to introducing the shared responsibility model, AWS Compliance Assurance Programs, and the AWS Global Infrastructure, our user guide summarizes four APRA documents that regulated entities should be aware of: APRA’s Prudential Standard CPS 231 Outsourcing, Information Paper on Outsourcing Involving Cloud Computing Services, CPS 234 Information Security, and the updated CPG 234 Information Security.

To assist our customers in meeting the updated recommendations in CPG 234 Information Security, we’ve also updated the APRA CPG 234 Workbook. The workbook is available for download through AWS Artifact (you’ll need an AWS account). Our updates reflect the revised content in APRA’s guidance, and the workbook now includes guidance on how to meet CPG 234’s recommendations for security “IN the cloud” by mapping to the five pillars of the AWS Well-Architected Framework.

As the regulatory environment continues to evolve, we’ll provide further updates on the AWS Security Blog and the AWS Compliance page. The user guide and workbook add to the resources AWS provides about financial services regulation across the world. You can find more information on cloud-related regulatory compliance at the AWS Compliance Center. You can also reach out to your AWS account manager for help finding the resources you need.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


Paul Curtis

As an FSI Compliance Specialist in the Global Financial Services Industry Team, Paul works with financial organisations, supporting their risk management and compliance functions. He works with our customers’ risk teams to help them manage their technology risk in a scalable way that unlocks their ability to realize cloud benefits. Paul has over fifteen years of experience working in risk and technology across the financial services industry in Australia, Southeast Asia and South Africa. Paul holds an MBA (Corporate Governance) and is a Graduate member of the Australian Institute of Company Directors.

AWS achieves FedRAMP JAB High and Moderate Provisional Authorization across 18 services in the AWS US East/West and AWS GovCloud (US) Regions

Post Syndicated from Amendaze Thomas original https://aws.amazon.com/blogs/security/aws-achieves-fedramp-jab-high-moderate-provisional-authorization-18-services/

It’s my pleasure to announce that we’ve expanded the number of AWS services that customers can use to run sensitive and highly regulated workloads in the federal government space. This expansion of our FedRAMP program marks a 28.6% increase in our number of FedRAMP authorizations.

Today, we’ve achieved FedRAMP authorizations for 6 services in our AWS US East/West Regions:

We also received 14 service authorizations in our AWS GovCloud (US) Regions:

In total, we now offer 48 AWS services authorized in the AWS US East/West Regions under FedRAMP Moderate and 43 services authorized in our AWS GovCloud (US) Regions under FedRAMP High. You can see our full, updated list of authorizations on the FedRAMP Marketplace. We also list all of our services in scope by compliance program on our Services in Scope page.

Our FedRAMP assessment was completed with a third-party assessment partner to ensure an independent validation of our technical, management, and operational security controls against the FedRAMP baselines.

We care deeply about our customers’ needs, and compliance is my team’s priority. As we expand in the federal space, we want to continue to onboard services into the compliance programs our customers are using, such as FedRAMP.

To learn what other public sector customers are doing on AWS, see our Government, Education, and Nonprofits Case Studies and Customer Success Stories. Stay tuned for future updates on our Services in Scope by Compliance Program page. If you have feedback about this blog post, let us know in the Comments section below.

Want more AWS Security news? Follow us on Twitter.


Amendaze Thomas

Amendaze is the manager of AWS Security’s Government Assessments and Authorization Program (GAAP). He has 15 years of experience providing advisory services to clients in the Federal government, and over 13 years’ experience supporting CISO teams with risk management framework (RMF) activities.

Updated whitepaper available: “Navigating GDPR Compliance on AWS”

Post Syndicated from Carmela Gambardella original https://aws.amazon.com/blogs/security/updated-whitepaper-available-navigating-gdpr-compliance-on-aws/

The European Union’s General Data Protection Regulation 2016/679 (GDPR) safeguards EU citizens’ fundamental right to privacy and to personal data protection. To make local regulations coherent and homogeneous, the GDPR introduces and defines stringent new standards for compliance, security, and data protection.

The updated version of our Navigating GDPR Compliance on AWS whitepaper (.pdf) explains the role that AWS plays in your GDPR compliance process and shows how AWS can help your organization accelerate the process of aligning your compliance programs to the GDPR by using AWS cloud services.

AWS compliance, data protection, and security experts work with customers across the world to help them run workloads in the AWS Cloud, including customers who must operate within GDPR requirements. AWS teams also review what AWS is responsible for to make sure that our operations comply with the requirements of the GDPR so that customers can continue to use AWS services. The whitepaper provides guidelines to better orient you to the wide variety of AWS security offerings and to help you identify the service that best suits your GDPR compliance needs.

If you have feedback about this blog post, please submit comments in the Comments section below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


Carmela Gambardella

Carmela graduated in Computer Science at the Federico II University of Naples, Italy. She has worked in a variety of roles at large IT companies, including as a software engineer, security consultant, and security solutions architect. Her areas of interest include data protection, security and compliance, application security, and software engineering. In April 2018, she joined the AWS Public Sector Solution Architects team in Italy.


Giuseppe Russo

Giuseppe is a Security Assurance Manager for AWS in Italy. He has a Master’s Degree in Computer Science with a specialization in cryptography, security, and coding theory. Giuseppe is a seasoned information security practitioner with many years of experience engaging key stakeholders, developing guidelines, and influencing the security market on strategic topics such as privacy and critical infrastructure protection.

Compliance, and why Raspberry Pi 4 may not be available in your country yet

Post Syndicated from Roger Thornton original https://www.raspberrypi.org/blog/compliance-and-why-raspberry-pi-4-may-not-be-available-in-your-country-yet/

In June we launched Raspberry Pi 4, and it has been selling extremely well, with over 1 million devices already made. We launched the product in a select set of countries in June, and ever since, we’ve been steadily making it available in more and more places; currently, Raspberry Pi 4 is on the market in 55 countries.

Raspberry Pi 4 and compliance

There have been many questions around why Raspberry Pi 4 isn’t available in certain countries, and this post will give you some insight into this.

Whenever a company wants to sell a product on a market, it first has to prove that selling it is safe and legal. Compliance requirements vary between different products; rules that would apply to a complicated machine like a car will, naturally, not be the same as those that apply to a pair of trainers (although there is some overlap in the Venn diagram of rules).


Regions of the world within each of which products have to be separately tested

Different countries usually have slightly different sets of regulations, and testing has to be conducted at an accredited facility for the region the company intends to sell the product in.

Compliance for a country is broken into the following: testing, certification, and marking.

Testing

Compliance testing requirements vary from country to country; there is no single set of tests or approvals that allow you to sell a product globally. Often, it’s necessary to test the product within the country that compliance is needed for; only some countries accept test reports from other countries.

For the launch of Raspberry Pi 4, we tested to EU, FCC (USA), and IC (Canada) regulations, and we’ve used these test reports to apply for compliance in as many countries as possible.

Certification

Once testing is complete, a certificate is issued for the product. The time this takes is variable, and some countries post such certificates online publicly so people can search for products.

Testing in the remaining countries that require in-country testing is now complete, and the respective certificates are being granted for Raspberry Pi 4 right now. However, whilst a certificate is being issued, the product isn’t yet compliant; we need to add the regulatory markings for this to happen.

Marking

Like testing requirements, product marking requirements may differ from country to country. The main difficulty of marking is that many countries require a unique certificate number to be printed on packaging, leaflets, and the product itself.

Some countries, such as the USA, allow companies to create the certificate number themselves (hence jazzy numbers like 2ABCB-RPI4B), and so we can place these on the product before launch. In other countries, however, the certificate number is issued at the end of the certification process.

For Raspberry Pi 4, we are now at the final stage for compliance: marking. All our certificates have been issued, and we are updating the packaging, leaflet, and product with the various certificate numbers needed to unlock the last few countries.

The countries for which we have certificates but still need to add markings are: China, South Korea, Brazil, Mexico, Taiwan, Chile, and Japan.

The process is beginning, and Raspberry Pi 4 should be available in these markets soon.

We post all our product compliance information online.

Conclusion

This is a broad overview of the compliance process for Raspberry Pi, and there are some details omitted for the sake of clarity. Compliance is a complex and varied task, but it is very important to demonstrate that Raspberry Pi 4 is a compliant, safe, and trustworthy product.

We aim to make Raspberry Pi 4 available in more countries than ever before, ensuring that everyone can take advantage of the amazing features, power, and cost-effectiveness it offers.


Learn about AWS Services & Solutions – September AWS Online Tech Talks

Post Syndicated from Jenny Hang original https://aws.amazon.com/blogs/aws/learn-about-aws-services-solutions-september-aws-online-tech-talks/

Learn about AWS Services & Solutions – September AWS Online Tech Talks

AWS Tech Talks

Join us this September to learn about AWS services and solutions. The AWS Online Tech Talks are live, online presentations that cover a broad range of topics at varying technical levels. These tech talks, led by AWS solutions architects and engineers, feature technical deep dives, live demonstrations, customer examples, and Q&A with AWS experts. Register Now!

Note – All sessions are free and in Pacific Time.

Tech talks this month:

 

Compute:

September 23, 2019 | 11:00 AM – 12:00 PM PT – Build Your Hybrid Cloud Architecture with AWS – Learn about the extensive range of services AWS offers to help you build a hybrid cloud architecture best suited for your use case.

September 26, 2019 | 1:00 PM – 2:00 PM PT – Self-Hosted WordPress: It’s Easier Than You Think – Learn how you can easily build a fault-tolerant WordPress site using Amazon Lightsail.

October 3, 2019 | 11:00 AM – 12:00 PM PT – Lower Costs by Right Sizing Your Instance with Amazon EC2 T3 General Purpose Burstable Instances – Get an overview of T3 instances, understand what workloads are ideal for them, and understand how the T3 credit system works so that you can lower your EC2 instance costs today.

 

Containers:

September 26, 2019 | 11:00 AM – 12:00 PM PT – Develop a Web App Using Amazon ECS and AWS Cloud Development Kit (CDK) – Learn how to build your first app using CDK and AWS container services.

 

Data Lakes & Analytics:

September 26, 2019 | 9:00 AM – 10:00 AM PT – Best Practices for Provisioning Amazon MSK Clusters and Using Popular Apache Kafka-Compatible Tooling – Learn best practices on running Apache Kafka production workloads at a lower cost on Amazon MSK.

 

Databases:

September 25, 2019 | 1:00 PM – 2:00 PM PT – What’s New in Amazon DocumentDB (with MongoDB compatibility) – Learn what’s new in Amazon DocumentDB, a fully managed MongoDB compatible database service designed from the ground up to be fast, scalable, and highly available.

October 3, 2019 | 9:00 AM – 10:00 AM PT – Best Practices for Enterprise-Class Security, High-Availability, and Scalability with Amazon ElastiCache – Learn about new enterprise-friendly Amazon ElastiCache enhancements like customer managed key and online scaling up or down to make your critical workloads more secure, scalable, and available.

 

DevOps:

October 1, 2019 | 9:00 AM – 10:00 AM PT – CI/CD for Containers: A Way Forward for Your DevOps Pipeline – Learn how to build CI/CD pipelines using AWS services to get the most out of the agility afforded by containers.

 

Enterprise & Hybrid:

September 24, 2019 | 1:00 PM – 2:30 PM PT – Virtual Workshop: How to Monitor and Manage Your AWS Costs – Learn how to visualize and manage your AWS cost and usage in this virtual hands-on workshop.

October 2, 2019 | 1:00 PM – 2:00 PM PT – Accelerate Cloud Adoption and Reduce Operational Risk with AWS Managed Services – Learn how AMS accelerates your migration to AWS, reduces your operating costs, improves security and compliance, and enables you to focus on your differentiating business priorities.

 

IoT:

September 25, 2019 | 9:00 AM – 10:00 AM PT – Complex Monitoring for Industrial with AWS IoT Data Services – Learn how to solve your complex event monitoring challenges with AWS IoT Data Services.

 

Machine Learning:

September 23, 2019 | 9:00 AM – 10:00 AM PT – Training Machine Learning Models Faster – Learn how to train machine learning models quickly and with a single click using Amazon SageMaker.

September 30, 2019 | 11:00 AM – 12:00 PM PT – Using Containers for Deep Learning Workflows – Learn how containers can help address challenges in deploying deep learning environments.

October 3, 2019 | 1:00 PM – 2:30 PM PT – Virtual Workshop: Getting Hands-On with Machine Learning and Ready to Race in the AWS DeepRacer League – Join DeClercq Wentzel, Senior Product Manager for AWS DeepRacer, for a presentation on the basics of machine learning and how to build a reinforcement learning model that you can use to join the AWS DeepRacer League.

 

AWS Marketplace:

September 30, 2019 | 9:00 AM – 10:00 AM PT – Advancing Software Procurement in a Containerized World – Learn how to deploy applications faster with third-party container products.

 

Migration:

September 24, 2019 | 11:00 AM – 12:00 PM PT – Application Migrations Using AWS Server Migration Service (SMS) – Learn how to use AWS Server Migration Service (SMS) for automating application migration and scheduling continuous replication, from your on-premises data centers or Microsoft Azure to AWS.

 

Networking & Content Delivery:

September 25, 2019 | 11:00 AM – 12:00 PM PT – Building Highly Available and Performant Applications using AWS Global Accelerator – Learn how to build highly available and performant architectures for your applications with AWS Global Accelerator, now with source IP preservation.

September 30, 2019 | 1:00 PM – 2:00 PM PT – AWS Office Hours: Amazon CloudFront – Just getting started with Amazon CloudFront and Lambda@Edge? Get answers directly from our experts during AWS Office Hours.


Robotics:

October 1, 2019 | 11:00 AM – 12:00 PM PT – Robots and STEM: AWS RoboMaker and AWS Educate Unite! – Come join members of the AWS RoboMaker and AWS Educate teams as we provide an overview of our education initiatives and walk you through the newly launched RoboMaker Badge.


Security, Identity & Compliance:

October 1, 2019 | 1:00 PM – 2:00 PM PT – Deep Dive on Running Active Directory on AWS – Learn how to deploy Active Directory on AWS and start migrating your Windows workloads.


Serverless:

October 2, 2019 | 9:00 AM – 10:00 AM PT – Deep Dive on Amazon EventBridge – Learn how to optimize event-driven applications and use rules and policies to route, transform, and control access to events, including events that react to data from SaaS apps.


Storage:

September 24, 2019 | 9:00 AM – 10:00 AM PT – Optimize Your Amazon S3 Data Lake with S3 Storage Classes and Management Tools – Learn how to use the Amazon S3 Storage Classes and management tools to better manage your data lake at scale and to optimize storage costs and resources.

October 2, 2019 | 11:00 AM – 12:00 PM PT – The Great Migration to Cloud Storage: Choosing the Right Storage Solution for Your Workload – Learn more about AWS storage services and identify which service is the right fit for your business.


AWS and the European Banking Authority Guidelines on Outsourcing

Post Syndicated from Chad Woolf original https://aws.amazon.com/blogs/security/aws-european-banking-authority-guidelines-on-outsourcing/

Financial institutions across the globe use AWS to transform the way they do business. It’s exciting to watch our customers in the financial services industry innovate on AWS in unique ways, across all geos and use cases. Regulations continue to evolve in this space, and we’re working hard to help customers proactively respond to new rules and guidelines. In many cases, the AWS Cloud makes it easier than ever before for customers to comply with different regulations and frameworks around the world.

The European Banking Authority (EBA), an EU financial supervisory authority, recently provided EU financial institutions (which includes credit institutions, certain investment firms, and payment institutions) with new outsourcing guidelines (PDF), which also apply to the use of cloud services. We’re ready and able to support our customers’ compliance with their obligations under the EBA Guidelines and to help meet and exceed their regulators’ expectations. We offer our customers a wide range of services that can simplify and directly assist in complying with the new guidelines, which take effect on September 30, 2019.

What do the EBA Guidelines mean for AWS customers?

The EBA Guidelines establish technology-neutral outsourcing requirements for EU financial institutions, with a particular focus on the outsourcing of “critical or important functions.” For AWS and our customers, the key takeaway is that the EBA Guidelines allow EU financial institutions to use cloud services for material, regulated workloads. When considering or using third-party services, many EU financial institutions already follow due diligence, risk management, and regulatory notification processes that are similar to those laid out in the EBA Guidelines. To meet and exceed the EBA Guidelines’ requirements on security, resiliency, and assurance, EU financial institutions can use a variety of AWS security and compliance services.

Risk-based approach

The EBA Guidelines incorporate a risk-based approach that expects regulated entities to identify, assess, and mitigate the risks associated with any outsourcing arrangement. The risk-based approach outlined in the EBA Guidelines is consistent with the long-standing AWS shared responsibility model. This approach applies throughout the EBA Guidelines, including the areas of risk assessment, contractual and audit requirements, data location and transfer, and security implementation.

  • Risk assessment: The EBA Guidelines emphasize the need for EU financial institutions to assess the potential impact of outsourcing arrangements on their operational risk. The AWS shared responsibility model helps customers formulate their risk assessment approach because it illustrates how their security and management responsibilities change depending on the AWS services they use. For example, AWS operates some controls on behalf of customers, such as data center security, while customers operate other controls, such as event logging. In practice, AWS services help customers assess and improve their risk profile relative to traditional, on-premises environments.
  • Contractual and audit requirements: The EBA Guidelines lay out requirements for the written agreement between an EU financial institution and its service provider, including access and audit rights. For EU financial institutions running regulated workloads on AWS services, we offer the EBA Financial Services Addendum to address the EBA Guidelines’ contractual requirements. We also provide these institutions the ability to comply with the audit requirements in the EBA Guidelines through the AWS Security & Audit Series, including participation in an Audit Symposium, to facilitate customer audits. To align with regulatory requirements and expectations, our EBA addendum and audit program incorporate feedback that we’ve received from a variety of financial supervisory authorities across EU member states. EU financial services customers interested in learning more about the addendum or about the audit engagements offered by AWS can reach out to their AWS account teams.
  • Data location and transfer: The EBA Guidelines do not put restrictions on where an EU financial institution can store and process its data, but rather state that EU financial institutions should “adopt a risk-based approach to data storage and data processing location(s) (i.e. country or region) and information security considerations.” Customers can choose the AWS Regions in which they store their content, and we will not move or replicate that content outside of the chosen Regions unless instructed to do so. Customers can also replicate and back up their content in more than one AWS Region to meet a variety of objectives, such as availability goals and geographic requirements.
  • Security implementation: The EBA Guidelines require EU financial institutions to consider, implement, and monitor various security measures. Using AWS services, customers can meet this requirement in a scalable and cost-effective way while improving their security posture. Customers can use AWS Config or AWS Security Hub to simplify auditing, security analysis, change management, and operational troubleshooting. As part of their cybersecurity measures, customers can activate Amazon GuardDuty, which provides intelligent threat detection and continuous monitoring, to generate detailed and actionable security alerts. Amazon Inspector automatically assesses a customer’s AWS resources for vulnerabilities or deviations from best practices and then produces a detailed list of security findings prioritized by level of severity. Customers can also enhance their security by using AWS Key Management Service (creation and control of encryption keys), AWS Shield (DDoS protection), and AWS WAF (filtering of malicious web traffic). These are just a few of the 500+ services and features we offer that enable strong availability, security, and compliance for our customers. A minimal sketch of two of these services in action follows this list.
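
To make the security-implementation point concrete, here is a minimal sketch, using the AWS SDK for Python (boto3), of how a customer might enable Amazon GuardDuty and verify that an S3 bucket has default encryption configured. The bucket name is a hypothetical placeholder, the snippet assumes credentials and a Region are already configured, and it is an illustration rather than a complete security implementation.

```python
import boto3
from botocore.exceptions import ClientError

# Enable GuardDuty in the current Region, reusing a detector if one exists.
guardduty = boto3.client("guardduty")
detector_ids = guardduty.list_detectors()["DetectorIds"]
if detector_ids:
    detector_id = detector_ids[0]
else:
    detector_id = guardduty.create_detector(Enable=True)["DetectorId"]
print(f"GuardDuty detector: {detector_id}")

# Verify that a bucket has default (server-side) encryption configured.
s3 = boto3.client("s3")
bucket = "example-regulated-workload-bucket"  # hypothetical bucket name
try:
    config = s3.get_bucket_encryption(Bucket=bucket)
    rules = config["ServerSideEncryptionConfiguration"]["Rules"]
    print(f"{bucket} default encryption rules: {rules}")
except ClientError as error:
    if error.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
        print(f"{bucket} has no default encryption configured")
    else:
        raise
```

In practice, one-off checks like the encryption test above are better expressed as continuously evaluated AWS Config rules, which aligns with the ongoing monitoring the EBA Guidelines expect.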

As reflected in the EBA Guidelines, it’s important to take a balanced approach when evaluating responsibilities in a cloud implementation. We are responsible for the security of the AWS Global Infrastructure. In the EU, we currently operate AWS Regions in Ireland, Frankfurt, London, Paris, and Stockholm, with our new Milan Region opening soon. For all of our data centers, we assess and manage environmental risks, employ extensive physical and personnel security controls, and guard against outages through our resiliency and testing procedures. In addition, independent, third-party auditors test more than 2,600 standards and requirements in the AWS environment throughout the year.

Conclusion

We encourage customers to learn about how the EBA Guidelines apply to their organization. Our teams of security, compliance, and legal experts continue to work with our EU financial services customers, both large and small, to support their journey to the AWS Cloud. AWS is closely following how regulatory authorities apply the EBA Guidelines locally and will provide further updates as needed. If you have any questions about compliance with the EBA Guidelines and their application to your use of AWS, or if you require the EBA Financial Services Addendum, please reach out to your account representative or request to be contacted.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


Chad Woolf

Chad joined Amazon in 2010 and built the AWS compliance functions from the ground up, including audit and certifications, privacy, contract compliance, control automation engineering, and security process monitoring. Chad’s work also includes enabling public sector and regulated industry adoption of the AWS Cloud, compliance with complex privacy regulations such as GDPR, and operating a trade and product compliance team in conjunction with global region expansion. Prior to joining AWS, Chad spent 12 years with Ernst & Young as a Senior Manager, working directly with Fortune 100 companies on IT process, security, risk, and vendor management advisory work, as well as designing and deploying global security and assurance software solutions. Chad holds a master’s degree in Information Systems Management and a bachelor’s degree in Accounting from Brigham Young University, Utah. Follow Chad on Twitter.

64 AWS services achieve HITRUST certification

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/64-aws-services-achieve-hitrust-certification/

We’re excited to announce that 64 AWS services are now certified for the Health Information Trust Alliance (HITRUST) Common Security Framework (CSF).

The full list of AWS services that were audited by a third-party auditor and certified under the HITRUST CSF is available on our Services in Scope by Compliance Program page. You can also view and download our HITRUST CSF certification here.

HITRUST CSF certification allows AWS customers to tailor their security control baselines to a variety of factors, including, but not limited to, regulatory requirements and organization type.

The HITRUST Alliance has established the CSF as a certifiable framework that organizations can leverage to comply with ISO/IEC 27000-series and HIPAA-related requirements. The HITRUST CSF is already widely adopted by leading organizations in a variety of industries as part of their approach to security and privacy. Please visit the HITRUST Alliance website for more information.

As always, we value your feedback and questions and commit to helping customers achieve and maintain the highest standard of security and compliance. Please feel free to reach out to the team through the AWS Compliance Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

AWS achieves OSPAR outsourcing standard for Singapore financial industry

Post Syndicated from Brandon Lim original https://aws.amazon.com/blogs/security/aws-achieves-ospar-outsourcing-standard-for-singapore-financial-industry/

AWS has achieved the Outsourced Service Provider Audit Report (OSPAR) attestation for 66 services in the Asia Pacific (Singapore) Region. The OSPAR assessment is performed by an independent third-party auditor. AWS’s OSPAR demonstrates that AWS has a system of controls in place that meets the Association of Banks in Singapore’s Guidelines on Control Objectives and Procedures for Outsourced Service Providers (ABS Guidelines).

The ABS Guidelines are intended to assist financial institutions in understanding approaches to due diligence, vendor management, and key technical and organizational controls that should be implemented in cloud outsourcing arrangements, particularly for material workloads. The ABS Guidelines are closely aligned with the Monetary Authority of Singapore’s Outsourcing Guidelines, and they’re one of the standards that the financial services industry in Singapore uses to assess the capability of their outsourced service providers (including cloud service providers).

AWS’s alignment with the ABS Guidelines demonstrates to customers AWS’s commitment to meeting the high expectations for cloud service providers set by the financial services industry in Singapore. Customers can leverage OSPAR to conduct their due diligence, minimizing the effort and costs required for compliance. AWS’s OSPAR report is now available in AWS Artifact.

You can find additional resources about regulatory requirements in the Singapore financial industry at the AWS Compliance Center. If you have questions about AWS’s OSPAR, or if you’d like to inquire about how to use AWS for your material workloads, please contact your AWS account team.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


Brandon Lim

Brandon is the Head of Security Assurance for Financial Services, Asia-Pacific. Brandon leads AWS’s regulatory and security engagement efforts for the Financial Services industry across the Asia Pacific region. He is passionate about working with Financial Services Regulators in the region to drive innovation and cloud adoption for the financial industry.

Introducing the “Preparing for the California Consumer Privacy Act” whitepaper

Post Syndicated from Julia Soscia original https://aws.amazon.com/blogs/security/introducing-the-preparing-for-the-california-consumer-privacy-act-whitepaper/

AWS has published a whitepaper, Preparing for the California Consumer Privacy Act, to provide guidance on designing and updating your cloud architecture to follow the requirements of the California Consumer Privacy Act (CCPA), which goes into effect on January 1, 2020.

The whitepaper is intended for engineers and solution builders, but it also serves as a guide for qualified security assessors (QSAs) and internal security assessors (ISAs) so that you can better understand the range of AWS products and services that are available for you to use.

The CCPA was enacted into law on June 28, 2018 and grants California consumers certain privacy rights, including the right to request that a business disclose the categories and specific pieces of personal information collected about the consumer, the categories of sources from which that information is collected, the “business purposes” for collecting or selling the information, and the categories of third parties with whom the information is shared. This whitepaper addresses the three main subsections of the CCPA: data collection, data retrieval and deletion, and data awareness.
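
As a purely illustrative sketch of the “data retrieval and deletion” theme, the following snippet (AWS SDK for Python, boto3) assumes each consumer’s personal data is stored in Amazon S3 under a per-consumer key prefix, so a disclosure request maps to a listing and a deletion request maps to a batch delete. The bucket name and prefix layout are hypothetical assumptions for illustration, not prescriptions from the whitepaper.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-consumer-data"  # hypothetical bucket name
PREFIX = "consumers/12345/"       # hypothetical per-consumer prefix

def list_consumer_objects(bucket, prefix):
    """Return all object keys under a consumer's prefix (disclosure request)."""
    keys = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    return keys

def delete_consumer_objects(bucket, prefix):
    """Delete all objects under a consumer's prefix (deletion request)."""
    keys = list_consumer_objects(bucket, prefix)
    # DeleteObjects accepts at most 1,000 keys per request.
    for start in range(0, len(keys), 1000):
        batch = [{"Key": key} for key in keys[start:start + 1000]]
        s3.delete_objects(Bucket=bucket, Delete={"Objects": batch})

if __name__ == "__main__":
    print(list_consumer_objects(BUCKET, PREFIX))
```

A real deployment would also have to account for versioned buckets, backups, and copies in downstream data stores, which is where the whitepaper’s broader guidance on data awareness comes in.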

To read the text of the CCPA, please visit the website for California Legislative Information.

If you have questions or want to learn more, contact your account executive or leave a comment below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.


Julia Soscia

Julia is a Solutions Architect at Amazon Web Services based out of New York City. Her main focus is to help customers create well-architected environments on the AWS cloud platform. She is an experienced data analyst with a focus on Big Data and Analytics.


Anthony Pasquarielo

Anthony is a Solutions Architect at Amazon Web Services. He’s based in New York City. His main focus is providing customers technical guidance and consultation during their cloud journey. Anthony enjoys delighting customers by designing well-architected solutions that drive value and provide growth opportunity for their business.


Justin De Castri

Justin is a Manager of Solutions Architecture at Amazon Web Services based in New York City. His primary focus is helping customers build secure, scalable, and cost-optimized solutions that are aligned with their business objectives.

Spring 2019 PCI DSS report now available, 12 services added in scope

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/spring-2019-pci-dss-report-now-available-12-services-added-in-scope/

At AWS Security, continuously raising the cloud security bar for our customers is central to all that we do. Part of that work is focused on our formal compliance certifications, which enable our customers to use the AWS Cloud for highly sensitive and/or regulated workloads. We see our customers constantly developing creative and innovative solutions, and for them to continue to do so, we need to increase the availability of services within our certifications. I’m pleased to tell you that in the past year we’ve increased our Payment Card Industry Data Security Standard (PCI DSS) certification scope by 79%, from 62 services to 111 services, including 12 newly added services in our latest PCI report (listed below). The audit was performed by our third-party auditor, Coalfire.

The PCI DSS report and certification cover the 111 services currently in scope that are used by our customers to architect a secure Cardholder Data Environment (CDE) to protect important workloads. The full list of PCI DSS certified AWS services is available on our Services in Scope by Compliance program page. The 12 newly added services for our Spring 2019 report are:

Our compliance reports, including this latest PCI report, are available on-demand through AWS Artifact.

To learn more about our PCI program and other compliance and security programs, please visit the AWS Compliance Programs page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.