Tag Archives: Compliance

Identify Unintended Resource Access with AWS Identity and Access Management (IAM) Access Analyzer

Post Syndicated from Brandon West original https://aws.amazon.com/blogs/aws/identify-unintended-resource-access-with-aws-identity-and-access-management-iam-access-analyzer/

Today I get to share my favorite kind of announcement. It’s the sort of thing that will improve security for just about everyone that builds on AWS, it can be turned on with almost no configuration, and it costs nothing to use. We’re launching a new, first-of-its-kind capability called AWS Identity and Access Management (IAM) Access Analyzer. IAM Access Analyzer mathematically analyzes access control policies attached to resources and determines which resources can be accessed publicly or from other accounts. It continuously monitors all policies for Amazon Simple Storage Service (S3) buckets, IAM roles, AWS Key Management Service (KMS) keys, AWS Lambda functions, and Amazon Simple Queue Service (SQS) queues. With IAM Access Analyzer, you have visibility into the aggregate impact of your access controls, so you can be confident your resources are protected from unintended access from outside of your account.

Let’s look at a couple of examples. An IAM Access Analyzer finding might indicate that an S3 bucket named my-bucket-1 is accessible to an AWS account with the ID 123456789012 when requests originate from the source IP range 11.0.0.0/15. Or IAM Access Analyzer may detect a KMS key policy that allows users from another account to delete the key, identifying a data loss risk you can fix by adjusting the policy. If the findings show intentional access paths, they can be archived.

So how does it work? Using the kind of math that shows up on unexpected final exams in my nightmares, IAM Access Analyzer evaluates your policies to determine how a given resource can be accessed. Critically, this analysis is not based on historical events, pattern matching, or brute-force tests. Instead, IAM Access Analyzer understands your policies semantically. All possible access paths are verified by mathematical proofs, and thousands of policies can be analyzed in a few seconds. This is done using a field of computer science called automated reasoning. IAM Access Analyzer is the first service powered by automated reasoning available to builders everywhere, offering functionality unique to AWS. To start learning about automated reasoning, I highly recommend this short video explainer. If you are interested in diving a bit deeper, check out this re:Invent talk on automated reasoning from Byron Cook, Director of the AWS Automated Reasoning Group. And if you’re really interested in understanding the methodology, make yourself a nice cup of chamomile tea, grab a blanket, and get cozy with a copy of Semantic-based Automated Reasoning for AWS Access Policies using SMT.

Turning on IAM Access Analyzer is way less stressful than an unexpected nightmare final exam. There’s just one step. From the IAM Console, select Access analyzer from the menu on the left, then click Create analyzer.

Creating an Access Analyzer

Analyzers generate findings in the account from which they are created. Analyzers also work within the region defined when they are created, so create one in each region for which you’d like to see findings.

Once our analyzer is created, findings that show accessible resources appear in the Console. My account has a few findings that are worth looking into, such as KMS keys and IAM roles that are accessible by other accounts and federated users.

Viewing Access Analyzer Findings

I’m going to click on the first finding and take a look at the access policy for this KMS key.

An Access Analyzer Finding

From here we can see the open access paths and details about the resources and principals involved. I went over to the KMS console and confirmed that this is intended access, so I archived this particular finding.

All IAM Access Analyzer findings are visible in the IAM Console, and can also be accessed using the IAM Access Analyzer API. Findings related to S3 buckets can be viewed directly in the S3 Console. Bucket policies can then be updated right in the S3 Console, closing the open access pathway.

An Access Analyzer finding in S3
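If you prefer to work with findings programmatically, here is a minimal sketch using the AWS SDK for Python (boto3). It assumes an account analyzer already exists in the current Region; treat it as an illustration of the API rather than a complete tool.

    import boto3

    # The Access Analyzer API is exposed through the 'accessanalyzer' client
    client = boto3.client('accessanalyzer')

    # Look up the account analyzer created earlier in this Region
    # (assumes one already exists)
    analyzers = client.list_analyzers(type='ACCOUNT')['analyzers']
    analyzer_arn = analyzers[0]['arn']

    # List findings that are still active (not yet archived)
    findings = client.list_findings(
        analyzerArn=analyzer_arn,
        filter={'status': {'eq': ['ACTIVE']}},
    )['findings']

    for finding in findings:
        print(finding['id'], finding['resourceType'], finding.get('resource'))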

You can also see high-priority findings generated by IAM Access Analyzer in AWS Security Hub, ensuring a comprehensive, single source of truth for your compliance and security-focused team members. IAM Access Analyzer also integrates with CloudWatch Events, making it easy to automatically respond to or send alerts regarding findings through the use of custom rules.

Now that you’ve seen how IAM Access Analyzer provides a comprehensive overview of cloud resource access, you should probably head over to IAM and turn it on. One of the great advantages of building in the cloud is that the infrastructure and tools continue to get stronger over time and IAM Access Analyzer is a great example. Did I mention that it’s free? Fire it up, then send me a tweet sharing some of the interesting things you find. As always, happy building!

— Brandon

Use AWS Fargate and Prowler to send security configuration findings about AWS services to Security Hub

Post Syndicated from Jonathan Rau original https://aws.amazon.com/blogs/security/use-aws-fargate-prowler-send-security-configuration-findings-about-aws-services-security-hub/

In this blog post, I’ll show you how to integrate Prowler, an open-source security tool, with AWS Security Hub. Prowler provides dozens of security configuration checks related to services such as Amazon Redshift, Amazon ElastiCache, Amazon API Gateway, and Amazon CloudFront. Integrating Prowler with Security Hub will provide posture information about resources not currently covered by existing Security Hub integrations or compliance standards. You can use Prowler checks to supplement the existing CIS AWS Foundations compliance standard Security Hub already provides, as well as other compliance-related findings you may be ingesting from partner solutions.

In this post, I’ll show you how to containerize Prowler using Docker and host it on the serverless container service AWS Fargate. By running Prowler on Fargate, you no longer have to provision, configure, or scale infrastructure, and it will only run when needed. Containers provide a standard way to package your application’s code, configurations, and dependencies into a single object that can run anywhere. Serverless applications automatically run and scale in response to events you define, rather than requiring you to provision, scale, and manage servers.

Solution overview

The following diagram shows the flow of events in the solution I describe in this blog post.
 

Figure 1: Prowler on Fargate Architecture

Figure 1: Prowler on Fargate Architecture

 

The integration works as follows:

  1. A time-based CloudWatch Event starts the Fargate task on a schedule you define.
  2. Fargate pulls a Prowler Docker image from Amazon Elastic Container Registry (ECR).
  3. Prowler scans your AWS infrastructure and writes the scan results to a CSV file.
  4. Python scripts in the Prowler container convert the CSV to JSON and load an Amazon DynamoDB table with formatted Prowler findings.
  5. A DynamoDB stream invokes an AWS Lambda function.
  6. The Lambda function maps Prowler findings into the AWS Security Finding Format (ASFF) before importing them to Security Hub (a simplified sketch of this mapping follows the list).
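Step 6 is handled by a Lambda function that the CloudFormation template (Step 3) deploys for you. As a rough sketch only, a simplified version of that mapping could look like the code below; the environment variables, severity mapping, and ASFF field choices here are illustrative assumptions, not the exact code in the template.

    import datetime
    import os
    import uuid

    import boto3

    securityhub = boto3.client('securityhub')

    # Illustrative environment variables; the deployed function gets its
    # configuration from the CloudFormation template.
    ACCOUNT_ID = os.environ['AWS_ACCOUNT_ID']
    REGION = os.environ['AWS_REGION']


    def lambda_handler(event, context):
        findings = []
        for record in event['Records']:
            # Only map items newly inserted into the DynamoDB table by Fargate
            if record['eventName'] != 'INSERT':
                continue
            item = record['dynamodb']['NewImage']
            title_id = item['TITLE_ID']['S']
            title_text = item['TITLE_TEXT']['S']
            result = item['RESULT']['S']
            notes = item['NOTES']['S']

            now = datetime.datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%SZ')
            findings.append({
                'SchemaVersion': '2018-10-08',
                'Id': str(uuid.uuid4()),
                'ProductArn': 'arn:aws:securityhub:{0}:{1}:product/{1}/default'.format(REGION, ACCOUNT_ID),
                'GeneratorId': title_id,
                'AwsAccountId': ACCOUNT_ID,
                'Types': ['Software and Configuration Checks'],
                'CreatedAt': now,
                'UpdatedAt': now,
                # Simplified severity mapping: failed checks are HIGH, the rest informational
                'Severity': {'Label': 'HIGH' if result == 'FAIL' else 'INFORMATIONAL'},
                'Title': '{0} {1}'.format(title_id, title_text),
                'Description': notes if notes else title_text,
                'Resources': [{
                    'Type': 'AwsAccount',
                    'Id': 'AWS::::Account:{0}'.format(ACCOUNT_ID),
                    'Region': REGION,
                }],
            })

        # BatchImportFindings accepts at most 100 findings per call
        for i in range(0, len(findings), 100):
            securityhub.batch_import_findings(Findings=findings[i:i + 100])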

Except for an ECR repository, you’ll deploy all of the above via AWS CloudFormation. You’ll also need the following prerequisites to supply as parameters for the CloudFormation template.

Prerequisites

  • A VPC with at least two subnets that have access to the internet, plus a security group that allows access on port 443 (HTTPS).
  • An ECS task role with the permissions that Prowler needs to complete its scans. You can find more information about these permissions on the official Prowler GitHub page; a sketch of one way to create such a role follows this list.
  • An ECS task execution IAM role to allow Fargate to publish logs to CloudWatch and to download your Prowler image from Amazon ECR.
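As one possible way to satisfy the ECS task role prerequisite, the boto3 sketch below creates a role that ECS tasks can assume and attaches the AWS-managed SecurityAudit policy, which covers most of Prowler’s read-only checks. The role name is an example, and you should confirm the exact permissions Prowler currently requires against the Prowler GitHub page.

    import json

    import boto3

    iam = boto3.client('iam')

    # Trust policy that lets ECS tasks assume the role
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ecs-tasks.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    role = iam.create_role(
        RoleName='ProwlerTaskRole',  # example name
        AssumeRolePolicyDocument=json.dumps(trust_policy),
        Description='Task role assumed by the Prowler Fargate task',
    )

    # SecurityAudit is an AWS-managed, read-only auditing policy; check the
    # Prowler documentation for any additional permissions specific checks need.
    iam.attach_role_policy(
        RoleName='ProwlerTaskRole',
        PolicyArn='arn:aws:iam::aws:policy/SecurityAudit',
    )

    print(role['Role']['Arn'])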

Step 1: Create an Amazon ECR repository

In this step, you’ll create an ECR repository. This is where you’ll upload your Docker image for Step 2.

  1. Navigate to the Amazon ECR Console and select Create repository.
  2. Enter a name for your repository (I’ve named my example securityhub-prowler, as shown in figure 2), then choose Mutable as your image tag mutability setting, and select Create repository.
     
    Figure 2: ECR Repository Creation

    Figure 2: ECR Repository Creation

Keep the browser tab in which you created the repository open so that you can easily reference the Docker commands you’ll need in the next step.

Step 2: Build and push the Docker image

In this step, you’ll create a Docker image that contains scripts that will map Prowler findings into DynamoDB. Before you begin step 2, ensure your workstation has the necessary permissions to push images to ECR.

  1. Create a Dockerfile via your favorite text editor, and name it Dockerfile.
    
    FROM python:latest

    # Declare Env Vars
    ENV MY_DYANMODB_TABLE=MY_DYANMODB_TABLE
    ENV AWS_REGION=AWS_REGION

    # Install Dependencies
    RUN \
        apt update && \
        apt upgrade -y && \
        pip install awscli && \
        apt install -y python3-pip

    # Place scripts
    ADD converter.py /root
    ADD loader.py /root
    ADD script.sh /root

    # Installs Prowler, moves scripts into the Prowler directory
    RUN \
        git clone https://github.com/toniblyx/prowler && \
        mv /root/converter.py /prowler && \
        mv /root/loader.py /prowler && \
        mv /root/script.sh /prowler

    # Runs Prowler, ETLs output with converter and loads DynamoDB with loader
    WORKDIR /prowler
    RUN pip3 install boto3
    CMD bash script.sh
    

  2. Create a new file called script.sh and paste in the below code. This script calls the remaining scripts (which you’re about to create) in a specific order.

    Note: Change the AWS Region in the Prowler command on line 3 to the region in which you’ve enabled Security Hub.

    
    #!/bin/bash
    echo "Running Prowler Scans"
    ./prowler -b -n -f us-east-1 -g extras -M csv > prowler_report.csv
    echo "Converting Prowler Report from CSV to JSON"
    python converter.py
    echo "Loading JSON data into DynamoDB"
    python loader.py
    

  3. Create a new file called converter.py and paste in the below code. This Python script will convert the Prowler CSV report into JSON, and both versions will be written to the local storage of the Prowler container.
    
    import csv
    import json
    
    # Set variables for within container
    CSV_PATH = 'prowler_report.csv'
    JSON_PATH = 'prowler_report.json'
    
    # Reads prowler CSV output
    csv_file = csv.DictReader(open(CSV_PATH, 'r'))
    
    # Create empty JSON list, read out rows from CSV into it
    json_list = []
    for row in csv_file:
        json_list.append(row)
    
    # Writes the rows out to a JSON file on the container's local storage
    open(JSON_PATH, 'w').write(json.dumps(json_list))
    
    # open newly converted prowler output
    with open('prowler_report.json') as f:
        data = json.load(f)
    
    # remove data not needed for Security Hub BatchImportFindings    
    for element in data: 
        del element['PROFILE']
        del element['SCORED']
        del element['LEVEL']
        del element['ACCOUNT_NUM']
        del element['REGION']
    
    # writes out to a new file, prettified
    with open('format_prowler_report.json', 'w') as f:
        json.dump(data, f, indent=2)
    

  4. Create your last file, called loader.py, and paste in the below code. This Python script will read values from the JSON file and send them to DynamoDB.
    
    from __future__ import print_function # Python 2/3 compatibility
    import boto3
    import json
    import decimal
    import os
    
    awsRegion = os.environ['AWS_REGION']
    prowlerDynamoDBTable = os.environ['MY_DYANMODB_TABLE']
    
    dynamodb = boto3.resource('dynamodb', region_name=awsRegion)
    
    table = dynamodb.Table(prowlerDynamoDBTable)
    
    # CHANGE FILE AS NEEDED
    with open('format_prowler_report.json') as json_file:
        findings = json.load(json_file, parse_float = decimal.Decimal)
        for finding in findings:
            TITLE_ID = finding['TITLE_ID']
            TITLE_TEXT = finding['TITLE_TEXT']
            RESULT = finding['RESULT']
            NOTES = finding['NOTES']
    
            print("Adding finding:", TITLE_ID, TITLE_TEXT)
    
            table.put_item(
               Item={
                   'TITLE_ID': TITLE_ID,
                   'TITLE_TEXT': TITLE_TEXT,
                   'RESULT': RESULT,
                   'NOTES': NOTES,
                }
            )
    

  5. From the ECR console, within your repository, select View push commands to get operating system-specific instructions and additional resources to build, tag, and push your image to ECR. See Figure 3 for an example.
     
    Figure 3: ECR Push Commands

    Figure 3: ECR Push Commands

    Note: If you’ve built Docker images previously within your workstation, pass the --no-cache flag with your docker build command.

  6. After you’ve built and pushed your image, note the URI within the ECR console (such as 12345678910.dkr.ecr.us-east-1.amazonaws.com/my-repo), as you’ll need this for a CloudFormation parameter in step 3.

Step 3: Deploy CloudFormation template

Download the CloudFormation template from GitHub and create a CloudFormation stack. For more information about how to create a CloudFormation stack, see Getting Started with AWS CloudFormation in the CloudFormation User Guide.

You’ll need the values you noted in Step 2 and the prerequisites from the “Prerequisites” section. The description of each parameter is provided on the Parameters page of the CloudFormation deployment (see Figure 4).
 

Figure 4: CloudFormation Parameters

Figure 4: CloudFormation Parameters

After the CloudFormation stack finishes deploying, click the Resources tab to find your Task Definition (called ProwlerECSTaskDefinition). You’ll need this during Step 4.
 

Figure 5: CloudFormation Resources

Figure 5: CloudFormation Resources

Step 4: Manually run ECS task

In this step, you’ll run your ECS task manually to verify the integration works. (Once you’ve confirmed it, the scheduled CloudWatch Event will run the task automatically on the schedule you defined.)
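If you’d rather start the task from the AWS SDK instead of the console, a minimal boto3 sketch along the following lines launches the same Fargate task; the cluster name, task definition, subnet, and security group values are placeholders for the resources created by your CloudFormation stack. Otherwise, follow the console steps below.

    import boto3

    ecs = boto3.client('ecs')

    response = ecs.run_task(
        cluster='prowler-cluster',                  # placeholder: cluster created by the stack
        taskDefinition='ProwlerECSTaskDefinition',  # task definition from the stack
        launchType='FARGATE',
        platformVersion='LATEST',
        count=1,
        networkConfiguration={
            'awsvpcConfiguration': {
                'subnets': ['subnet-0123456789abcdef0'],      # placeholder subnet ID
                'securityGroups': ['sg-0123456789abcdef0'],   # placeholder security group ID
                'assignPublicIp': 'ENABLED',
            }
        },
    )

    print(response['tasks'][0]['taskArn'])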

  1. Navigate to the Amazon ECS console and from the navigation pane select Task Definitions.
  2. As shown in Figure 6, select the check box for the task definition you deployed via CloudFormation, then select the Actions dropdown menu and choose Run Task.
     
    Figure 6: ECS Run Task

    Figure 6: ECS Run Task

  3. Configure the following settings (shown in Figure 7), then select Run Task:
    1. Launch Type: FARGATE
    2. Platform Version: Latest
    3. Cluster: Select the cluster deployed by CloudFormation
    4. Number of tasks: 1
    5. Cluster VPC: Enter the VPC of the subnets you provided as CloudFormation parameters
    6. Subnets: Select 1 or more subnets in the VPC
    7. Security groups: Enter the same security group you provided as a CloudFormation parameter
    8. Auto-assign public IP: ENABLED
       
      Figure 7: ECS Task Settings

      Figure 7: ECS Task Settings

  4. Depending on the size of your account and the resources within it, your task can take up to an hour to complete. Follow the progress by selecting your task and looking at the Logs tab within the Task view (Figure 8). The stdout from Prowler will appear in the logs.

    Note: Once the task has completed, it will automatically delete itself. You do not need to take additional actions for this to happen during this or subsequent runs.

     

    Figure 8: ECS Task Logs

    Figure 8: ECS Task Logs

  5. Under the Details tab, monitor the status. When the status reads Stopped, navigate to the DynamoDB console.
  6. Select your table, then select the Items tab. Your findings will be indexed under the primary key NOTES, as shown in Figure 9. From here, the Lambda function will trigger each time new items are written into the table from Fargate and will load them into Security Hub.
     
    Figure 9: DynamoDB Items

    Figure 9: DynamoDB Items

  7. Finally, navigate to the Security Hub console, select the Findings menu, and wait for findings from Prowler to arrive in the dashboard as shown in figure 10.
     
    Figure 10: Prowler Findings in Security Hub

    Figure 10: Prowler Findings in Security Hub

If you run into errors when running your Fargate task, refer to the Amazon ECS Troubleshooting guide. Log errors commonly come from missing permissions or disabled Regions; refer back to the Prowler GitHub page for troubleshooting information.

Conclusion

In this post, I showed you how to containerize Prowler, run it manually, create a schedule with CloudWatch Events, and use custom Python scripts along with DynamoDB streams and Lambda functions to load Prowler findings into Security Hub. By using Security Hub, you can centralize and aggregate security configuration information from Prowler alongside findings from AWS and partner services.

From Security Hub, you can use custom actions to send one or a group of findings from Prowler to downstream services such as ticketing systems or to take custom remediation actions. You can also use Security Hub custom insights to create saved searches from your Prowler findings. Lastly, you can use Security Hub in a master-member format to aggregate findings across multiple accounts for centralized reporting.

If you have feedback about this blog post, submit comments in the Comments section below. If you have questions about this blog post, start a new thread on the AWS Security Hub forum.

Want more AWS Security news? Follow us on Twitter.


Jonathan Rau

Jonathan is the Senior TPM for AWS Security Hub. He holds an AWS Certified Specialty-Security certification and is extremely passionate about cyber security, data privacy, and emerging technologies such as blockchain. He devotes personal time to research and advocacy on those same topics.

15 additional AWS services receive DoD Impact Level 4 and 5 authorization

Post Syndicated from Tyler Harding original https://aws.amazon.com/blogs/security/15-additional-aws-services-receive-dod-impact-level-4-and-5-authorization/

I’m pleased to announce that the Defense Information Systems Agency (DISA) has extended the Provisional Authorization to Operate (P-ATO) of AWS GovCloud (US) Regions for Department of Defense (DoD) workloads at DoD Impact Levels (IL) 4 and 5 under the DoD’s Cloud Computing Security Requirements Guide (DoD CC SRG). Our authorizations at DoD IL 4 and IL 5 allow DoD Mission Owners to process unclassified, mission critical workloads for National Security Systems in the AWS GovCloud (US) Regions.

AWS successfully completed an independent, third-party evaluation that confirmed we effectively implement over 400 security controls using applicable criteria from NIST SP 800-53 Rev 4, the US General Services Administration’s FedRAMP High baseline, the DoD CC SRG, and the Committee on National Security Systems Instruction No. 1253 at the High Confidentiality, High Integrity, and High Availability impact levels.

In addition to a P-ATO extension through November 2020 for the existing 24 services, DISA authorized an additional 15 AWS services at DoD Impact Levels 4 and 5 as listed below.

Newly authorized AWS services at DoD Impact Levels 4 and 5

With these additional 15 services, we can now offer AWS customers a total of 39 services, authorized to store, process, and analyze DoD mission critical data at Impact Level 4 and Impact Level 5.

To learn more about AWS solutions for DoD, please see our AWS solution offerings. Stay tuned for future updates on our Services in Scope by Compliance Program page. If you have feedback about this blog post, let us know in the Comments section below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Tyler Harding

Tyler is the DoD Compliance Program Manager within AWS Security Assurance. He has over 20 years of experience providing information security solutions to federal civilian, DoD, and intelligence agencies.

New guidance to help you navigate Australian Prudential Regulation Authority requirements

Post Syndicated from Paul Curtis original https://aws.amazon.com/blogs/security/new-guidance-navigate-australian-prudential-regulation-authority-requirements/

There have been two noteworthy 2019 updates for Australian Prudential Regulation Authority (APRA) regulated entities such as banks, insurance companies, credit unions, deposit takers, and the superannuation industry.

On June 25, APRA released an updated version of the Prudential Practice Guide CPG 234 Information Security, which provides guidance on how to implement the revised Prudential Standard CPS 234 Information Security. The new Prudential Practice Guide has been expanded significantly compared to the previous version. The revised guidance reflects the evolving cybersecurity landscape and focuses on areas of importance to APRA regulated entities.

On July 1, APRA’s Prudential Standard CPS 234 Information Security became effective. This standard represents a set of legally enforceable information security requirements for APRA regulated entities. CPS 234 aims to:

“…ensure that an APRA regulated entity takes measures to be resilient against information security incidents (including cyberattacks) by maintaining an information security capability commensurate with information security vulnerabilities and threats.”

In response to these updates, we have updated our AWS User Guide to Financial Services Regulations & Guidelines in Australia, which provides APRA regulated entities with a summary of APRA’s requirements, plus recommendations related to outsourcing and IT risk in the cloud. In addition to introducing the shared responsibility model, AWS Compliance Assurance Programs, and the AWS Global Infrastructure, our user guide summarizes four APRA documents that regulated entities should be aware of: APRA’s Prudential Standard CPS 231 Outsourcing, Information Paper on Outsourcing Involving Cloud Computing Services, CPS 234 Information Security, and the updated CPG 234 Information Security.

To assist our customers in meeting the updated recommendations in CPG 234 Information Security, we’ve also updated the APRA CPG 234 Workbook. The workbook is available for download through AWS Artifact (you’ll need an AWS account). Our updates reflect the revised content in APRA’s guidance, and the workbook now includes guidance on how to meet CPG 234’s recommendations for security “IN the cloud” by mapping to the five pillars of the AWS Well-Architected Framework.

As the regulatory environment continues to evolve, we’ll provide further updates on the AWS Security Blog and the AWS Compliance page. The user guide and workbook add to the resources AWS provides about financial services regulation across the world. You can find more information on cloud-related regulatory compliance at the AWS Compliance Center. You can also reach out to your AWS account manager for help finding the resources you need.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Author

Paul Curtis

As an FSI Compliance Specialist in the Global Financial Services Industry Team, Paul works with financial organisations, supporting their risk management and compliance functions. He works with our customers’ risk teams to help them manage their technology risk in a scalable way that unlocks their ability to realize cloud benefits. Paul has over fifteen years of experience working in risk and technology across the financial services industry in Australia, Southeast Asia and South Africa. Paul holds an MBA (Corporate Governance) and is a Graduate member of the Australian Institute of Company Directors.

AWS achieves FedRAMP JAB High and Moderate Provisional Authorization across 18 services in the AWS US East/West and AWS GovCloud (US) Regions

Post Syndicated from Amendaze Thomas original https://aws.amazon.com/blogs/security/aws-achieves-fedramp-jab-high-moderate-provisional-authorization-18-services/

It’s my pleasure to announce that we’ve expanded the number of AWS services that customers can use to run sensitive and highly regulated workloads in the federal government space. This expansion of our FedRAMP program marks a 28.6% increase in our number of FedRAMP authorizations.

Today, we’ve achieved FedRAMP authorizations for 6 services in our AWS US East/West Regions:

We also received 14 service authorizations in our AWS GovCloud (US) Regions:

In total, we now offer 48 AWS services authorized in the AWS US East/West Regions under FedRAMP Moderate and 43 services authorized in our AWS GovCloud (US) Regions under FedRAMP High. You can see our full, updated list of authorizations on the FedRAMP Marketplace. We also list all of our services in scope by compliance program on our Services in Scope page.

Our FedRAMP assessment was completed with a third-party assessment partner to ensure an independent validation of our technical, management, and operational security controls against the FedRAMP baselines.

We care deeply about our customers’ needs, and compliance is my team’s priority. As we expand in the federal space, we want to continue to onboard services into the compliance programs our customers are using, such as FedRAMP.

To learn what other public sector customers are doing on AWS, see our Government, Education, and Nonprofits Case Studies and Customer Success Stories. Stay tuned for future updates on our Services in Scope by Compliance Program page. If you have feedback about this blog post, let us know in the Comments section below.

Want more AWS Security news? Follow us on Twitter.

author photo

Amendaze Thomas

Amendaze is the manager of AWS Security’s Government Assessments and Authorization Program (GAAP). He has 15 years of experience providing advisory services to clients in the Federal government, and over 13 years’ experience supporting CISO teams with risk management framework (RMF) activities.

Updated whitepaper available: “Navigating GDPR Compliance on AWS”

Post Syndicated from Carmela Gambardella original https://aws.amazon.com/blogs/security/updated-whitepaper-available-navigating-gdpr-compliance-on-aws/

The European Union’s General Data Protection Regulation 2016/679 (GDPR) safeguards EU citizens’ fundamental right to privacy and to personal data protection. In order to make local regulations coherent and homogeneous, the GDPR introduces and defines stringent new standards in terms of compliance, security and data protection.

The updated version of our Navigating GDPR Compliance on AWS whitepaper (.pdf) explains the role that AWS plays in your GDPR compliance process and shows how AWS can help your organization accelerate the process of aligning your compliance programs to the GDPR by using AWS cloud services.

AWS compliance, data protection, and security experts work with customers across the world to help them run workloads in the AWS Cloud, including customers who must operate within GDPR requirements. AWS teams also review what AWS is responsible for to make sure that our operations comply with the requirements of the GDPR so that customers can continue to use AWS services. The whitepaper provides guidelines to better orient you to the wide variety of AWS security offerings and to help you identify the service that best suits your GDPR compliance needs.

If you have feedback about this blog post, please submit comments in the Comments section below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Author photo

Carmela Gambardella

Carmela graduated in Computer Science at the Federico II University of Naples, Italy. She has worked in a variety of roles at large IT companies, including as a software engineer, security consultant, and security solutions architect. Her areas of interest include data protection, security and compliance, application security, and software engineering. In April 2018, she joined the AWS Public Sector Solution Architects team in Italy.

Author photo

Giuseppe Russo

Giuseppe is a Security Assurance Manager for AWS in Italy. He has a Master’s Degree in Computer Science with a specialization in cryptography, security and coding theory. Giuseppe is a seasoned information security practitioner with many years of experience engaging key stakeholders, developing guidelines, and influencing the security market on strategic topics such as privacy and critical infrastructure protection.

Compliance, and why Raspberry Pi 4 may not be available in your country yet

Post Syndicated from Roger Thornton original https://www.raspberrypi.org/blog/compliance-and-why-raspberry-pi-4-may-not-be-available-in-your-country-yet/

In June we launched Raspberry Pi 4, and it has been selling extremely well, with over 1 million devices already made. We launched the product in a select set of countries in June, and ever since, we’ve been steadily making it available in more and more places; currently, Raspberry Pi 4 is on the market in 55 countries.

Raspberry Pi 4 and compliance

There have been many questions around why Raspberry Pi 4 isn’t available in certain countries, and this post will give you some insight into this.

Whenever a company wants to sell a product on a market, it first has to prove that selling it is safe and legal. Compliance requirements vary between different products; rules that would apply to a complicated machine like a car will, naturally, not be the same as those that apply to a pair of trainers (although there is some overlap in the Venn diagram of rules).


Regions of the world within each of which products have to be separately tested

Different countries usually have slightly different sets of regulations, and testing has to be conducted at an accredited facility for the region the company intends to sell the product in.

Compliance for a country is broken into the following: testing, certification, and marking.

Testing

Compliance testing requirements vary from country to country; there is no single set of tests or approvals that allow you to sell a product globally. Often, it’s necessary to test the product within the country that compliance is needed for; only some countries accept test reports from other countries.

For the launch of Raspberry Pi 4, we tested to EU, FCC (USA), and IC (Canada) regulations, and we’ve used these test reports to apply for compliance in as many countries as possible.

Certification

Once testing is complete, a certificate is issued for the product. The time this takes is variable, and some countries post such certificates online publicly so people can search for products.

Testing in the remaining countries that require testing to happen in-country is now complete, and the respective certificates are being granted for Raspberry Pi 4 right now. However, whilst the certificate is being issued, the product isn’t yet compliant; we need to add the regulatory markings for this to happen.

Marking

Like testing requirements, product marking requirements may differ from country to country. The main difficulty of marking is that many countries require a unique certificate number to be printed on packaging, leaflets, and the product itself.

Some countries, such as the USA, allow companies to create the certificate number themselves (hence jazzy numbers like 2ABCB-RPI4B), and so we can place these on the product before launch. In other countries, however, the certificate number is issued at the end of the certification process.

For Raspberry Pi 4, we are now at the final stage for compliance: marking. All our certificates have been issued, and we are updating the packaging, leaflet, and product with the various certificate numbers needed to unlock the last few countries.

The countries for which we have certificates, but which still require markings to be added, are: China, South Korea, Brazil, Mexico, Taiwan, Chile, and Japan.

The process is beginning, and Raspberry Pi 4 should be available in these markets soon.

We post all our product compliance information online.

Conclusion

This is a broad overview of the compliance process for Raspberry Pi, and there are some details omitted for the sake of clarity. Compliance is a complex and varied task, but it is very important to demonstrate that Raspberry Pi 4 is a compliant, safe, and trustworthy product.

We aim to make Raspberry Pi 4 available in more countries than ever before, ensuring that everyone can take advantage of the amazing features, power, and cost-effectiveness it offers.

The post Compliance, and why Raspberry Pi 4 may not be available in your country yet appeared first on Raspberry Pi.

Learn about AWS Services & Solutions – September AWS Online Tech Talks

Post Syndicated from Jenny Hang original https://aws.amazon.com/blogs/aws/learn-about-aws-services-solutions-september-aws-online-tech-talks/

Learn about AWS Services & Solutions – September AWS Online Tech Talks

AWS Tech Talks

Join us this September to learn about AWS services and solutions. The AWS Online Tech Talks are live, online presentations that cover a broad range of topics at varying technical levels. These tech talks, led by AWS solutions architects and engineers, feature technical deep dives, live demonstrations, customer examples, and Q&A with AWS experts. Register Now!

Note – All sessions are free and in Pacific Time.

Tech talks this month:

 

Compute:

September 23, 2019 | 11:00 AM – 12:00 PM PT – Build Your Hybrid Cloud Architecture with AWS – Learn about the extensive range of services AWS offers to help you build a hybrid cloud architecture best suited for your use case.

September 26, 2019 | 1:00 PM – 2:00 PM PT – Self-Hosted WordPress: It’s Easier Than You Think – Learn how you can easily build a fault-tolerant WordPress site using Amazon Lightsail.

October 3, 2019 | 11:00 AM – 12:00 PM PT – Lower Costs by Right Sizing Your Instance with Amazon EC2 T3 General Purpose Burstable Instances – Get an overview of T3 instances, understand what workloads are ideal for them, and understand how the T3 credit system works so that you can lower your EC2 instance costs today.

 

Containers:

September 26, 2019 | 11:00 AM – 12:00 PM PT – Develop a Web App Using Amazon ECS and AWS Cloud Development Kit (CDK) – Learn how to build your first app using CDK and AWS container services.

 

Data Lakes & Analytics:

September 26, 2019 | 9:00 AM – 10:00 AM PT – Best Practices for Provisioning Amazon MSK Clusters and Using Popular Apache Kafka-Compatible Tooling – Learn best practices on running Apache Kafka production workloads at a lower cost on Amazon MSK.

 

Databases:

September 25, 2019 | 1:00 PM – 2:00 PM PT – What’s New in Amazon DocumentDB (with MongoDB compatibility) – Learn what’s new in Amazon DocumentDB, a fully managed MongoDB compatible database service designed from the ground up to be fast, scalable, and highly available.

October 3, 2019 | 9:00 AM – 10:00 AM PT – Best Practices for Enterprise-Class Security, High-Availability, and Scalability with Amazon ElastiCache – Learn about new enterprise-friendly Amazon ElastiCache enhancements like customer managed key and online scaling up or down to make your critical workloads more secure, scalable and available.

 

DevOps:

October 1, 2019 | 9:00 AM – 10:00 AM PT – CI/CD for Containers: A Way Forward for Your DevOps Pipeline – Learn how to build CI/CD pipelines using AWS services to get the most out of the agility afforded by containers.

 

Enterprise & Hybrid:

September 24, 2019 | 1:00 PM – 2:30 PM PT – Virtual Workshop: How to Monitor and Manage Your AWS Costs – Learn how to visualize and manage your AWS cost and usage in this virtual hands-on workshop.

October 2, 2019 | 1:00 PM – 2:00 PM PT – Accelerate Cloud Adoption and Reduce Operational Risk with AWS Managed Services – Learn how AMS accelerates your migration to AWS, reduces your operating costs, improves security and compliance, and enables you to focus on your differentiating business priorities.

 

IoT:

September 25, 2019 | 9:00 AM – 10:00 AM PT – Complex Monitoring for Industrial with AWS IoT Data Services – Learn how to solve your complex event monitoring challenges with AWS IoT Data Services.

 

Machine Learning:

September 23, 2019 | 9:00 AM – 10:00 AM PT – Training Machine Learning Models Faster – Learn how to train machine learning models quickly and with a single click using Amazon SageMaker.

September 30, 2019 | 11:00 AM – 12:00 PM PT – Using Containers for Deep Learning Workflows – Learn how containers can help address challenges in deploying deep learning environments.

October 3, 2019 | 1:00 PM – 2:30 PM PT – Virtual Workshop: Getting Hands-On with Machine Learning and Ready to Race in the AWS DeepRacer League – Join DeClercq Wentzel, Senior Product Manager for AWS DeepRacer, for a presentation on the basics of machine learning and how to build a reinforcement learning model that you can use to join the AWS DeepRacer League.

 

AWS Marketplace:

September 30, 2019 | 9:00 AM – 10:00 AM PT – Advancing Software Procurement in a Containerized World – Learn how to deploy applications faster with third-party container products.

 

Migration:

September 24, 2019 | 11:00 AM – 12:00 PM PT – Application Migrations Using AWS Server Migration Service (SMS) – Learn how to use AWS Server Migration Service (SMS) for automating application migration and scheduling continuous replication, from your on-premises data centers or Microsoft Azure to AWS.

 

Networking & Content Delivery:

September 25, 2019 | 11:00 AM – 12:00 PM PT – Building Highly Available and Performant Applications using AWS Global Accelerator – Learn how to build highly available and performant architectures for your applications with AWS Global Accelerator, now with source IP preservation.

September 30, 2019 | 1:00 PM – 2:00 PM PT – AWS Office Hours: Amazon CloudFront – Just getting started with Amazon CloudFront and Lambda@Edge? Get answers directly from our experts during AWS Office Hours.

 

Robotics:

October 1, 2019 | 11:00 AM – 12:00 PM PT – Robots and STEM: AWS RoboMaker and AWS Educate Unite! – Come join members of the AWS RoboMaker and AWS Educate teams as we provide an overview of our education initiatives and walk you through the newly launched RoboMaker Badge.

 

Security, Identity & Compliance:

October 1, 2019 | 1:00 PM – 2:00 PM PT – Deep Dive on Running Active Directory on AWS – Learn how to deploy Active Directory on AWS and start migrating your Windows workloads.

 

Serverless:

October 2, 2019 | 9:00 AM – 10:00 AM PT – Deep Dive on Amazon EventBridge – Learn how to optimize event-driven applications, and use rules and policies to route, transform, and control access to these events that react to data from SaaS apps.

 

Storage:

September 24, 2019 | 9:00 AM – 10:00 AM PT – Optimize Your Amazon S3 Data Lake with S3 Storage Classes and Management Tools – Learn how to use the Amazon S3 Storage Classes and management tools to better manage your data lake at scale and to optimize storage costs and resources.

October 2, 2019 | 11:00 AM – 12:00 PM PT – The Great Migration to Cloud Storage: Choosing the Right Storage Solution for Your Workload – Learn more about AWS storage services and identify which service is the right fit for your business.

 

 

AWS and the European Banking Authority Guidelines on Outsourcing

Post Syndicated from Chad Woolf original https://aws.amazon.com/blogs/security/aws-european-banking-authority-guidelines-on-outsourcing/

Financial institutions across the globe use AWS to transform the way they do business. It’s exciting to watch our customers in the financial services industry innovate on AWS in unique ways, across all geos and use cases. Regulations continue to evolve in this space, and we’re working hard to help customers proactively respond to new rules and guidelines. In many cases, the AWS Cloud makes it easier than ever before for customers to comply with different regulations and frameworks around the world.

The European Banking Authority (EBA), an EU financial supervisory authority, recently provided EU financial institutions (which includes credit institutions, certain investment firms, and payment institutions) with new outsourcing guidelines (PDF), which also apply to the use of cloud services. We’re ready and able to support our customers’ compliance with their obligations under the EBA Guidelines and to help meet and exceed their regulators’ expectations. We offer our customers a wide range of services that can simplify and directly assist in complying with the new guidelines, which take effect on September 30, 2019.

What do the EBA Guidelines mean for AWS customers?

The EBA Guidelines establish technology-neutral outsourcing requirements for EU financial institutions, and there is a particular focus on the outsourcing of “critical or important functions.” For AWS and our customers, the key takeaway is that the EBA Guidelines allow for EU financial institutions to use cloud services for material, regulated workloads. When considering or using third-party services, many EU financial institutions already follow due diligence, risk management, and regulatory notification processes that are similar to those processes laid out in the EBA Guidelines. To meet and exceed the EBA Guidelines’ requirements on security, resiliency, and assurance, EU financial institutions can use a variety of AWS security and compliance services.

Risk-based approach

The EBA Guidelines incorporate a risk-based approach that expects regulated entities to identify, assess, and mitigate the risks associated with any outsourcing arrangement. The risk-based approach outlined in the EBA Guidelines is consistent with the long-standing AWS shared responsibility model. This approach applies throughout the EBA Guidelines, including the areas of risk assessment, contractual and audit requirements, data location and transfer, and security implementation.

  • Risk assessment: The EBA Guidelines emphasize the need for EU financial institutions to assess the potential impact of outsourcing arrangements on their operational risk. The AWS shared responsibility model helps customers formulate their risk assessment approach because it illustrates how their security and management responsibilities change depending on the AWS services they use. For example, AWS operates some controls on behalf of customers, such as data center security, while customers operate other controls, such as event logging. In practice, AWS services help customers assess and improve their risk profile relative to traditional, on-premises environments.
  • Contractual and audit requirements: The EBA Guidelines lay out requirements for the written agreement between an EU financial institution and its service provider, including access and audit rights. For EU financial institutions running regulated workloads on AWS services, we offer the EBA Financial Services Addendum to address the EBA Guidelines’ contractual requirements. We also provide these institutions the ability to comply with the audit requirements in the EBA Guidelines through the AWS Security & Audit Series, including participation in an Audit Symposium, to facilitate customer audits. To align with regulatory requirements and expectations, our EBA addendum and audit program incorporate feedback that we’ve received from a variety of financial supervisory authorities across EU member states. EU financial services customers interested in learning more about the addendum or about the audit engagements offered by AWS can reach out to their AWS account teams.
  • Data location and transfer: The EBA Guidelines do not put restrictions on where an EU financial institution can store and process its data, but rather state that EU financial institutions should “adopt a risk-based approach to data storage and data processing location(s) (i.e. country or region) and information security considerations.” Our customers can choose which AWS Regions they store their content in, and we will not move or replicate your customer content outside of your chosen Regions unless you instruct us to do so. Customers can replicate and back up their customer content in more than one AWS Region to meet a variety of objectives, such as availability goals and geographic requirements.
  • Security implementation: The EBA Guidelines require EU financial institutions to consider, implement, and monitor various security measures. Using AWS services, customers can meet this requirement in a scalable and cost-effective way while improving their security posture. Customers can use AWS Config or AWS Security Hub to simplify auditing, security analysis, change management, and operational troubleshooting. As part of their cybersecurity measures, customers can activate Amazon GuardDuty, which provides intelligent threat detection and continuous monitoring, to generate detailed and actionable security alerts. Amazon Inspector automatically assesses a customer’s AWS resources for vulnerabilities or deviations from best practices and then produces a detailed list of security findings prioritized by level of severity. Customers can also enhance their security by using AWS Key Management Service (creation and control of encryption keys), AWS Shield (DDoS protection), and AWS WAF (filtering of malicious web traffic). These are just a few of the 500+ services and features we offer that enable strong availability, security, and compliance for our customers.

As reflected in the EBA Guidelines, it’s important to take a balanced approach when evaluating responsibilities in a cloud implementation. We are responsible for the security of the AWS Global Infrastructure. In the EU, we currently operate AWS Regions in Ireland, Frankfurt, London, Paris, and Stockholm, with our new Milan Region opening soon. For all of our data centers, we assess and manage environmental risks, employ extensive physical and personnel security controls, and guard against outages through our resiliency and testing procedures. In addition, independent, third-party auditors test more than 2,600 standards and requirements in the AWS environment throughout the year.

Conclusion

We encourage customers to learn about how the EBA Guidelines apply to their organization. Our teams of security, compliance, and legal experts continue to work with our EU financial services customers, both large and small, to support their journey to the AWS Cloud. AWS is closely following how regulatory authorities apply the EBA Guidelines locally and will provide further updates as needed. If you have any questions about compliance with the EBA Guidelines and their application to your use of AWS, or if you require the EBA Financial Services Addendum, please reach out to your account representative or request to be contacted.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Author

Chad Woolf

Chad joined Amazon in 2010 and built the AWS compliance functions from the ground up, including audit and certifications, privacy, contract compliance, control automation engineering and security process monitoring. Chad’s work also includes enabling public sector and regulated industry adoption of the AWS Cloud, compliance with complex privacy regulations such as GDPR and operating a trade and product compliance team in conjunction with global region expansion. Prior to joining AWS, Chad spent 12 years with Ernst & Young as a Senior Manager working directly with Fortune 100 companies consulting on IT process, security, risk, and vendor management advisory work, as well as designing and deploying global security and assurance software solutions. Chad holds a Masters of Information Systems Management and a Bachelors of Accounting from Brigham Young University, Utah. Follow Chad on Twitter.

64 AWS services achieve HITRUST certification

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/64-aws-services-achieve-hitrust-certification/

We’re excited to announce that 64 AWS services are now certified for the Health Information Trust Alliance (HITRUST) Common Security Framework (CSF).

The full list of AWS services that were audited by a third-party auditor and certified under the HITRUST CSF is available on our Services in Scope by Compliance Program page. You can view and download our HITRUST CSF certification here:

The HITRUST certification allows AWS customers to tailor their security control baselines to a variety of factors including, but not limited to, regulatory requirements and organization type.

The HITRUST Alliance has established the CSF as a certifiable framework that organizations can leverage to comply with ISO/IEC 27000-series and HIPAA-related requirements. The HITRUST CSF is already widely adopted by leading organizations in a variety of industries in their approach to security and privacy. Please visit the HITRUST Alliance website for more information.

As always, we value your feedback and questions and commit to helping customers achieve and maintain the highest standard of security and compliance. Please feel free to reach out to the team through the AWS Compliance Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

AWS achieves OSPAR outsourcing standard for Singapore financial industry

Post Syndicated from Brandon Lim original https://aws.amazon.com/blogs/security/aws-achieves-ospar-outsourcing-standard-for-singapore-financial-industry/

AWS has achieved the Outsourced Service Provider Audit Report (OSPAR) attestation for 66 services in the Asia Pacific (Singapore) Region. The OSPAR assessment is performed by an independent third-party auditor. AWS’s OSPAR demonstrates that AWS has a system of controls in place that meets the Association of Banks in Singapore’s Guidelines on Control Objectives and Procedures for Outsourced Service Providers (ABS Guidelines).

The ABS Guidelines are intended to assist financial institutions in understanding approaches to due diligence, vendor management, and key technical and organizational controls that should be implemented in cloud outsourcing arrangements, particularly for material workloads. The ABS Guidelines are closely aligned with the Monetary Authority of Singapore’s Outsourcing Guidelines, and they’re one of the standards that the financial services industry in Singapore uses to assess the capability of their outsourced service providers (including cloud service providers).

AWS’s alignment with the ABS Guidelines demonstrates to customers AWS’s commitment to meeting the high expectations for cloud service providers set by the financial services industry in Singapore. Customers can leverage OSPAR to conduct their due diligence, minimizing the effort and costs required for compliance. AWS’s OSPAR report is now available in AWS Artifact.

You can find additional resources about regulatory requirements in the Singapore financial industry at the AWS Compliance Center. If you have questions about AWS’s OSPAR, or if you’d like to inquire about how to use AWS for your material workloads, please contact your AWS account team.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Author photo

Brandon Lim

Brandon is the Head of Security Assurance for Financial Services, Asia-Pacific. Brandon leads AWS’s regulatory and security engagement efforts for the Financial Services industry across the Asia Pacific region. He is passionate about working with Financial Services Regulators in the region to drive innovation and cloud adoption for the financial industry.

Introducing the “Preparing for the California Consumer Privacy Act” whitepaper

Post Syndicated from Julia Soscia original https://aws.amazon.com/blogs/security/introducing-the-preparing-for-the-california-consumer-privacy-act-whitepaper/

AWS has published a whitepaper, Preparing for the California Consumer Privacy Act, to provide guidance on designing and updating your cloud architecture to follow the requirements of the California Consumer Privacy Act (CCPA), which goes into effect on January 1, 2020.

The whitepaper is intended for engineers and solution builders, but it also serves as a guide for qualified security assessors (QSAs) and internal security assessors (ISAs) so that you can better understand the range of AWS products and services that are available for you to use.

The CCPA was enacted into law on June 28, 2018 and grants California consumers certain privacy rights. The CCPA grants consumers the right to request that a business disclose the categories and specific pieces of personal information collected about the consumer, the categories of sources from which that information is collected, the “business purposes” for collecting or selling the information, and the categories of third parties with whom the information is shared. This whitepaper looks to address the three main subsections of the CCPA: data collection, data retrieval and deletion, and data awareness.

To read the text of the CCPA, please visit the California Legislative Information website.

If you have questions or want to learn more, contact your account executive or leave a comment below.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Author photo

Julia Soscia

Julia is a Solutions Architect at Amazon Web Services based out of New York City. Her main focus is to help customers create well-architected environments on the AWS cloud platform. She is an experienced data analyst with a focus in Big Data and Analytics.

Author photo

Anthony Pasquarielo

Anthony is a Solutions Architect at Amazon Web Services. He’s based in New York City. His main focus is providing customers technical guidance and consultation during their cloud journey. Anthony enjoys delighting customers by designing well-architected solutions that drive value and provide growth opportunity for their business.

Author photo

Justin De Castri

Justin is a Manager of Solutions Architecture at Amazon Web Services based in New York City. His primary focus is helping customers build secure, scalable, and cost-optimized solutions that are aligned with their business objectives.

Spring 2019 PCI DSS report now available, 12 services added in scope

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/spring-2019-pci-dss-report-now-available-12-services-added-in-scope/

At AWS Security, continuously raising the cloud security bar for our customers is central to all that we do. Part of that work is focused on our formal compliance certifications, which enable our customers to use the AWS cloud for highly sensitive and/or regulated workloads. We see our customers constantly developing creative and innovative solutions—and in order for them to continue to do so, we need to increase the availability of services within our certifications. I’m pleased to tell you that in the past year, we’ve increased our Payment Card Industry – Data Security Standard (PCI DSS) certification scope by 79%, from 62 services to 111 services, including 12 newly added services in our latest PCI report (listed below), and we were audited by our third-party auditor, Coalfire.

The PCI DSS report and certification cover the 111 services currently in scope that are used by our customers to architect a secure Cardholder Data Environment (CDE) to protect important workloads. The full list of PCI DSS certified AWS services is available on our Services in Scope by Compliance program page. The 12 newly added services for our Spring 2019 report are:

Our compliance reports, including this latest PCI report, are available on-demand through AWS Artifact.

To learn more about our PCI program and other compliance and security programs, please visit the AWS Compliance Programs page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Security Compliance at Cloudflare

Post Syndicated from Rebecca Rogers original https://blog.cloudflare.com/security-compliance-at-cloudflare/


Cloudflare believes trust is fundamental to helping build a better Internet. One way Cloudflare is helping our customers earn their users’ trust is through industry standard security compliance certifications and regulations.

Security compliance certifications are reports created by independent, third-party auditors that validate and document a company’s commitment to security. These external auditors conduct a rigorous review of a company’s technical environment and evaluate whether there are thorough controls – or safeguards – in place to protect the security, confidentiality, and availability of information stored and processed in the environment. SOC 2 was established by the American Institute of CPAs and is important to many of our U.S. customers, as it is a standardized set of requirements a company must meet in order to comply. Additionally, PCI DSS and ISO 27001 are international standards. Cloudflare cares about achieving certifications because our adherence to these standards gives customers across the globe confidence that we are committed to security. So, the Security team has been hard at work obtaining these meaningful compliance certifications.

Since the beginning of this year, we have renewed our PCI DSS certification in February, achieved SOC 2 Type 1 compliance in March, and obtained our ISO 27001 certification in April. Today, we are proud to announce we are SOC 2 Type 2 compliant!

Our SOC 2 Journey

SOC 2 is a compliance certification that focuses on an organization’s internal controls related to five trust services criteria: Security, Confidentiality, Availability, Processing Integrity, and Privacy. Each criterion defines a set of control standards, established by the American Institute of Certified Public Accountants (AICPA), that a company uses to implement controls over its information systems.

Cloudflare’s Security team made the decision to evaluate our company’s controls against three of the five criteria. We chose to pursue SOC 2 compliance by evaluating our controls around Security, Confidentiality, and Availability across our entire organization. We first worked across the company to design and implement strong controls that meet the requirements set forth by the AICPA. This took effort and collaboration between teams in Engineering, IT, Legal, and HR to create strong controls that also make sense for our environment. Our external auditors then performed an audit of Cloudflare’s controls and determined that our security controls were suitably designed as of January 31, 2019.


Three months after obtaining SOC 2 Type 1 compliance, the next step for Cloudflare was to demonstrate that the controls we designed were actually operating effectively. Our SOC 2 Type 2 audit tested the operating effectiveness of Cloudflare’s security controls over this three-month period. Cloudflare’s SOC 2 Type 2 report is available upon request and describes the design of Cloudflare’s internal control framework around security, confidentiality, and availability, as well as the products and services in scope for our certification.

What else?

SOC 3

In addition to SOC 2 Type 2, Cloudflare also obtained a SOC 3 report from our independent external auditors. SOC 3 is a report intended for public consumption that contains the external auditor’s opinion and a narrative description of Cloudflare’s control environment. Cloudflare’s Security team chose to obtain a SOC 3 report so that all customers and prospects can access our auditor’s opinion of our implementation of security, confidentiality, and availability controls.

ISO/IEC 27001:2013

Prior to Cloudflare’s SOC audit, Cloudflare was working to mature our organization’s Information Security Management System in order to obtain our ISO/IEC 27001:2013 certification. ISO 27001 is an international management system standard developed by the International Organization for Standardization (ISO) and is a widely accepted information security certification. Cloudflare pursued ISO/IEC 27001:2013 certification to demonstrate to our customers that we are committed to preserving the confidentiality, integrity, and availability of information on a global scale.

The primary focus of the ISO 27001:2013 requirements is the implementation of an Information Security Management System (ISMS) and a comprehensive risk management program. Cloudflare worked across the organization to implement the ISMS and ensure that sensitive company information remains secure.


Cloudflare’s ISMS was assessed by a third-party auditor, A-LIGN, and we received our ISO 27001:2013 certification in April 2019. Cloudflare’s ISO 27001:2013 certificate is also available to customers upon request.

PCI DSS v3.2.1

Although Cloudflare has been PCI certified as a Level 1 Service Provider since 2014, our latest certification adheres to the newest version of the security standard. The Payment Card Industry Data Security Standard (PCI DSS) is a global financial information security standard that helps ensure customers’ credit card data stays safe and secure.

Maintaining PCI DSS compliance is important for Cloudflare because we are evaluated not only as a merchant but also as a service provider. Cloudflare’s WAF product satisfies PCI requirement 6.6 and may be used by Cloudflare’s customers, deployed in front of public-facing web applications, as a solution to prevent web-based attacks.


Early in 2019, Cloudflare was audited by an independent Qualified Security Assessor to validate our adherence to the PCI DSS security requirements. Cloudflare’s latest PCI Attestation of Compliance (AOC) is available to customers upon request.

Compliance Page on the Website

Cloudflare is committed to helping our customers earn their users’ trust by ensuring our products are secure. The Security team is committed to maintaining the security compliance certifications and adhering to the regulations that preserve the security, confidentiality, and availability of company and client information.
To help our customers keep track of our latest certifications, Cloudflare has launched a compliance certification page – www.cloudflare.com/compliance. Today, you can view our status on all compliance certifications and download our SOC 3 report.

Spring 2019 SOC 2 Type 1 Privacy report now available

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/spring-2019-soc-2-type-1-privacy-report-now-available/

At AWS, our customers’ security and privacy are of the highest importance, and we continue to provide transparency into our security and privacy posture. Following our first SOC 2 Type 1 Privacy report released in December 2018, AWS is proud to announce the release of the Spring 2019 SOC 2 Type 1 Privacy report. The Spring 2019 SOC 2 Privacy report provides you with a third-party attestation of our systems and the suitability of the design of our privacy controls. The report also provides a detailed description of those controls, the same controls that AWS uses to address the GDPR requirements around data security and privacy.

This updated report is part of the SOC family of reports and has been updated to align with the new American Institute of Certified Public Accountants (AICPA) Trust Services Criteria. The Trust Services Criteria align with the Committee of Sponsoring Organizations of the Treadway Commission (COSO) 2013 framework, which has been designed to better address cybersecurity risks.

The highlights of the new Trust Service Criteria include:

  • A definition of principal service commitments and system requirements.
  • Restructuring and addition of supplemental criteria to better address cybersecurity risks.
  • New description criteria requiring the disclosure of system incidents.

The scope of the privacy report includes systems that AWS uses to collect personal information and all 104 services and locations in scope for the latest AWS SOC reports. You can download the new SOC 2 Type I Privacy report now through AWS Artifact in the AWS Management Console.

As always, we value your feedback and questions. Please feel free to reach out to the team through the Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Registration for AWS re:Inforce 2019 now open!

Post Syndicated from Stephen Schmidt original https://aws.amazon.com/blogs/security/registration-for-aws-reinforce-2019-now-open/

AWS re:Inforce

In late November, I announced AWS re:Inforce, a standalone conference where we will dive deep into the latest approaches to security, identity, and risk management using AWS services, features, and tools. Now, after months of planning, the time has arrived to open registration! Ticket sales begin on March 12th at 10:00 AM PDT, and you can access the ticket sales website here. We do expect to sell out, so please consider registering soon to secure a hotel and take advantage of our travel discounts. In celebration, we are offering a limited, while-supplies-last $300 discount on the full conference ticket price of $1,099. Register with code RFSAL19 to take advantage of this limited offer.

The benefits of attending AWS re:Inforce 2019 are considerable. The conference will be built around gaining hands-on, tactical knowledge of cloud security, identity, and compliance. Over 100 security-focused AWS Partners will be featured in our learning hub to help you tackle all manner of security concerns. Additionally, we’ll have bootcamps where you can meet with like-minded professionals and learn skills that are applicable to your individual job scope. More details about specific session offerings will be announced in the next few weeks, but you can already find details on the track types and session levels here.

Taking a step back for a moment, creating a conference focused on cloud security was important to AWS because, as we’ve often stated, security is job zero for us. While re:Invent is a great opportunity to check in yearly with customers on our new features and services, we felt a conference tailored specifically to cloud security and identity professionals offered a great opportunity for everyone to strengthen their own security programs from the ground up. We’ll have four tracks, geared toward audiences ranging from those just starting out to those pursuing next-generation, aspirational security. We want to be at the forefront of an industry shift from reactive to proactive security, and our inaugural re:Inforce gathering is a great chance for us to hear from customers about their real-world concerns, from encryption to resiliency. We also think building an ongoing community of security stakeholders is critical; we know that excellent guidance for customers doesn’t always come directly from AWS. It can also spring from peer conversations and networking opportunities. The strength of the AWS cloud is its customers. Our customers see use cases every day that both inform our security roadmap and make our cloud stronger for everyone. Simply put, there is no AWS security story without the tremendous diligence of customers and partners. Creating a space where all parties can come together to exchange knowledge and ideas, whether in a formal session or at a casual dinner, was at the forefront of our thinking when we first considered launching re:Inforce. Seeing the threads and details come together for this re:Inforce has been personally exciting and professionally validating; I can’t wait to see you all there in late June.

Purchase tickets for AWS re:Inforce via the ticket sales website here.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Follow Steve on Twitter.

Author

Steve Schmidt

Steve is Vice President and Chief Information Security Officer for AWS. His duties include leading product design, management, and engineering development efforts focused on bringing the competitive, economic, and security benefits of cloud computing to business and government customers. Prior to AWS, he had an extensive career at the Federal Bureau of Investigation, where he served as a senior executive and section chief. He currently holds five patents in the field of cloud security architecture.

New AWS services launch with HIPAA, PCI, ISO, and SOC – a company first

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/new-aws-services-launch-with-hipaa-pci-iso-and-soc/

Our security culture is one of the things that sets AWS apart. Security is job zero — it is the foundation for all AWS employees and impacts the work we do every day, across the company. And that’s reflected in our services, which undergo exacting internal and external security reviews before being released. From there, we have historically waited for customer demand to begin the complex process of third-party assessment and validation of services under specific compliance programs. However, we’ve heard you tell us that you want every generally available (GA) service in scope, so you can keep up with the pace of your innovation while meeting rigorous compliance and regulatory requirements.

I wanted to share how we’re meeting this challenge with a more proactive approach to service certification by certifying services at launch. For the first time, we’ve launched new GA services with PCI DSS, ISO 9001/27001/27017/27018, SOC 2, and HIPAA eligibility. That means customers who rely on or require these compliance programs can select from 10 brand new services right away, without having to wait for one or more trailing audit cycles.

Verifying the security and compliance of the following new services is as simple as going to the console and using AWS Artifact to download the audit reports.

  • Amazon DocumentDB (with MongoDB compatibility) [HIPAA, PCI, ISO, SOC 2]
  • Amazon FSx [HIPAA, PCI, ISO]
  • Amazon Route 53 Resolver [ISO]
  • AWS Amplify [HIPAA, ISO]
  • AWS DataSync [HIPAA, PCI, ISO]
  • AWS Elemental MediaConnect [HIPAA, PCI, ISO]
  • AWS Global Accelerator [PCI, ISO]
  • AWS License Manager [ISO]
  • AWS RoboMaker [HIPAA, PCI, ISO]
  • AWS Transfer for SFTP [HIPAA, PCI, ISO]

This proactive compliance approach means we move upstream in the product development process. Over the last several months, we’ve made significant process improvements to deliver additional services with compliance certifications and HIPAA eligibility. Our security, compliance, and service teams have partnered in new ways to implement controls and audit earlier in a service’s development phase to demonstrate operating effectiveness. We also integrated auditing mechanisms into multiple stages of the launch process, enabling our security and compliance teams, as well as auditors, to assess controls throughout a service’s preview period. Additionally, we increased our audit frequency to meet services’ GA deadlines.

The work reflects a meaningful shift in our business. We’re excited to get these services into your hands sooner and wanted to report our overall progress. We also ask for your continued feedback, since it drives our decisions and prioritization. Going forward, we’ll continue to iterate and innovate until all of our services are certified at launch.

New SOC 2 Report Available: Privacy

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/new-soc-2-report-available-privacy/

Maintaining your trust is an ongoing commitment of ours, and your voice drives our growing portfolio of compliance reports, attestations, and certifications. As a result of your feedback and deep interest in privacy and data security, we are happy to announce the publication of our new SOC 2 Type I Privacy report.

Keeping you informed of the privacy and data security policies, practices, and technologies we’ve put in place is important to us, and the SOC 2 Type I Privacy report is complementary to that effort. The SOC 2 Privacy Trust Principle, developed by the American Institute of CPAs (AICPA), establishes the criteria for evaluating controls related to how personal information is collected, used, retained, disclosed, and disposed of to meet the entity’s objectives. The AWS SOC 2 Type I Privacy report provides you with a third-party attestation of our systems and the suitability of the design of our privacy controls, as stated in our Privacy Notice.

The scope of the privacy report includes systems AWS uses to collect personal information and all 72 services and locations in scope for the latest AWS SOC reports. You can download the new SOC 2 Type I Privacy report now through AWS Artifact in the AWS Management Console.

As always, we value your feedback and questions. Please feel free to reach out to the team through the Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

New podcast: VP of Security answers your compliance and data privacy questions

Post Syndicated from Katie Doptis original https://aws.amazon.com/blogs/security/new-podcast-vp-of-security-answers-your-compliance-and-data-privacy-questions/

Does AWS comply with X program? How about GDPR? What about after Brexit? And what happens with machine learning data?

In the latest AWS Security & Compliance Podcast, we sit down with VP of Security Chad Woolf, who answers your compliance and data privacy questions, including one of the most frequently asked questions from customers around the world: how many compliance programs does AWS attest to and audit against?

Chad also shares what it was like to work at AWS in the early days. When he joined, AWS was housed on just a handful of floors, in a single building. Over the course of nearly nine years with the company, he has witnessed tremendous growth of the business and industry.

Listen to the podcast to hear about the company’s history and get answers to your tough questions. If you have a compliance or data privacy question, you can submit it through our Contact Us form.

Want more AWS news? Follow us on Twitter.

AWS re:Invent Security Recap: Launches, Enhancements, and Takeaways

Post Syndicated from Stephen Schmidt original https://aws.amazon.com/blogs/security/aws-reinvent-security-recap-launches-enhancements-and-takeaways/

For more from Steve, follow him on Twitter

Customers continue to tell me that our AWS re:Invent conference is a winner. It’s a place where they can learn, meet their peers, and rediscover the art of the possible. Of course, there is always an air of anticipation around what new AWS service releases will be announced. This time around, we went even bigger than we ever have before. There were over 50,000 people in attendance, spread across the Las Vegas strip, with over 2,000 breakout sessions and jam-packed hands-on learning opportunities, including multi-day hackathons, workshops, and bootcamps.

A big part of all this activity included sharing knowledge about the latest AWS Security, Identity, and Compliance services and features, as well as announcing new technology that we’re excited to see adopted so quickly across so many use cases.

Here are the top Security, Identity, and Compliance releases from re:Invent 2018:

Keynotes: All that’s new

New AWS offerings provide more prescriptive guidance

The AWS re:Invent keynotes from Andy Jassy, Werner Vogels, and Peter DeSantis, as well as my own leadership session, featured the following new releases and service enhancements. We continue to strive to make architecting easier for developers, as well as our partners and our customers, so they stay secure as they build and innovate in the cloud.

  • We launched several prescriptive security services to assist developers and customers in understanding and managing their security and compliance postures in real time. My favorite new service is AWS Security Hub, which helps you centrally manage your security and compliance controls. With Security Hub, you now have a single place that aggregates, organizes, and prioritizes your security alerts, or findings, from multiple AWS services, such as Amazon GuardDuty, Amazon Inspector, and Amazon Macie, as well as from AWS Partner solutions. Findings are visually summarized on integrated dashboards with actionable graphs and tables. You can also continuously monitor your environment using automated compliance checks based on the AWS best practices and industry standards your organization follows. You can enable Security Hub on a single account with one click in the AWS Security Hub console or with a single API call (see the sketch after this list); once enabled, Security Hub begins aggregating and prioritizing findings.
  • Another prescriptive service we launched is called AWS Control Tower. One of the first things customers think about when moving to the cloud is how to set up a landing zone for their data. AWS Control Tower removes the guesswork, automating the setup of an AWS landing zone that is secure and well-architected and that supports multiple accounts. AWS Control Tower does this by using a set of blueprints that embody AWS best practices. Guardrails, both mandatory and recommended, are available for high-level, rule-based governance, allowing you to have the right operational control over your accounts. An integrated dashboard enables you to keep a watchful eye over the accounts provisioned, the guardrails that are enabled, and your overall compliance status. Sign up for the Control Tower preview here.
  • The third prescriptive service, called AWS Lake Formation, will reduce your data lake build time from months to days. Prior to AWS Lake Formation, setting up a data lake involved numerous granular tasks. Creating a data lake with Lake Formation is as simple as defining where your data resides and what data access and security policies you want to apply. Lake Formation then collects and catalogs data from databases and object storage, moves the data into your new Amazon S3 data lake, cleans and classifies data using machine learning algorithms, and secures access to your sensitive data. Get started with a preview of AWS Lake Formation here.
  • Next up, AWS IoT Greengrass enables enhanced security through hardware root of trust private key storage on hardware secure elements, including Trusted Platform Modules (TPMs) and Hardware Security Modules (HSMs). Storing your private key on a hardware secure element adds hardware-root-of-trust security to existing AWS IoT Greengrass security features that include X.509 certificates for TLS mutual authentication and encryption of data both in transit and at rest. You can also use the hardware secure element to protect secrets that you deploy to your AWS IoT Greengrass device using AWS IoT Greengrass Secrets Manager. To try these security enhancements for yourself, check out https://aws.amazon.com/greengrass/.
  • You can now use the AWS Key Management Service (KMS) custom key store feature to gain more control over your KMS keys. Previously, KMS offered the ability to store keys in shared HSMs managed by KMS. However, we heard from customers that their needs were more nuanced. In particular, they needed to manage keys in single-tenant HSMs under their exclusive control. With KMS custom key store, you can configure your own CloudHSM cluster and authorize KMS to use it as a dedicated key store for your keys. Then, when you create keys in KMS, you can choose to generate the key material in your CloudHSM cluster. Get started with KMS custom key store by following the steps in this blog post.
  • We’re excited to announce the release of ATO on AWS to help customers and partners speed up the FedRAMP approval process (which has traditionally taken SaaS providers up to 2 years to complete). We’ve already had customers, such as Smartsheet, complete the process in less than 90 days with ATO on AWS. Customers will have access to training, tools, pre-built CloudFormation templates, control implementation details, and pre-built artifacts. Additionally, customers are able to access direct engagement and guidance from AWS compliance specialists and support from expert AWS consulting and technology partners who are a part of our Security Automation and Orchestration (SAO) initiative, including GitHub, Yubico, RedHat, Splunk, Allgress, Puppet, Trend Micro, Telos, CloudCheckr, Saint, Center for Internet Security (CIS), OKTA, Barracuda, Anitian, Kratos, and Coalfire. To get started with ATO on AWS, contact the AWS partner team at [email protected].
  • Finally, I announced our first conference dedicated to cloud security, identity and compliance: AWS re:Inforce. The inaugural AWS re:Inforce, a hands-on gathering of like-minded security professionals, will take place in Boston, MA on June 25th and 26th, 2019 at the Boston Convention and Exhibition Center. The cost for a full conference pass will be $1,099. I’m hoping to see you all there. Sign up here to be notified of when registration opens.
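
The Security Hub item above mentions enabling the service with a single API call. Below is a minimal boto3 sketch of that call, followed by a findings query; it is an illustrative example rather than anything from the announcement, and the region, severity filter, and result limit are assumptions you would adjust for your own account.

```python
# Minimal sketch: enable AWS Security Hub and pull prioritized findings.
# The region, filters, and MaxResults below are illustrative assumptions.
import boto3

securityhub = boto3.client("securityhub", region_name="us-east-1")

# The "single API call" referenced above; enables Security Hub for this
# account and region (raises ResourceConflictException if already enabled).
securityhub.enable_security_hub()

# Retrieve active, high-severity findings aggregated from services such as
# Amazon GuardDuty, Amazon Inspector, Amazon Macie, and partner integrations.
response = securityhub.get_findings(
    Filters={
        "SeverityLabel": [{"Value": "HIGH", "Comparison": "EQUALS"}],
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
    },
    MaxResults=10,
)

for finding in response["Findings"]:
    print(finding["Title"], "-", finding.get("Severity", {}).get("Label"))
```

The same kind of filter can drive an automated workflow, for example routing high-severity findings to a ticketing system instead of printing them.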

Key re:Invent Takeaways

AWS is here to help you build

  1. Customers want to innovate, and the cloud needs to securely enable this. Companies need to be able to innovate to meet rapidly evolving consumer demands. This means they need cloud security capabilities they can rely on to meet their specific security requirements, while allowing them to continue to meet and exceed customer expectations. AWS Lake Formation, AWS Control Tower, and AWS Security Hub aggregate and automate otherwise manual processes involved with setting up a secure and compliant cloud environment, giving customers greater flexibility to innovate, create, and manage their businesses.
  2. Cloud security is as much art as it is science. Getting to what you really need to know about your security posture can be a challenge. At AWS, we’ve found that the sweet spot lies in services and features that enable you to continuously gain greater depth of knowledge into your security posture, while automating mission-critical tasks that relieve you from having to constantly monitor your infrastructure. This manifests itself in having an end-to-end automated remediation workflow. I spent some time covering this in my re:Invent session and will continue to advocate using a combination of services, such as AWS Lambda, AWS WAF, Amazon S3, AWS CloudTrail, and AWS Config, to proactively identify, mitigate, and remediate threats that may arise as your infrastructure evolves.
  3. Remove human access to data. I’ve set a goal at AWS to reduce human access to data by 80%. While that number may sound lofty, it’s purposeful, because the only way to achieve this is through automation. There have been a number of security incidents in the news across industries, ranging from inappropriate access to personal information in healthcare, to credential stuffing in financial services. The way to protect against such incidents? Automate key security measures and minimize your attack surface by enabling access control and credential management with services like AWS IAM and AWS Secrets Manager (a minimal sketch follows this list). Additional gains can be found by leveraging threat intelligence through continuous monitoring of incidents via services such as Amazon GuardDuty, Amazon Inspector, and Amazon Macie (intelligence from these services will now be available in AWS Security Hub).
  4. Get your leadership on board with your security plan. We offer 500+ security services and features; however, new services and technology can’t be wholly responsible for implementing reliable security measures. Security teams need to set expectations with leadership early, aligning on a number of critical protocols, including how to restrict and monitor human access to data, patching and log retention duration, credential lifespan, blast radius reduction, embedded encryption throughout AWS architecture, and canaries and invariants for security functionality. It’s also important to set security Key Performance Indicators (KPIs) to continuously track. At AWS, we monitor the number of AppSec reviews, how many security checks we can automate, third-party compliance audits, metrics on internal time spent, and conformity with Service Level Agreements (SLAs). While the needs of your business may vary, we find baseline KPIs to be consistent measures of security assurance that can be easily communicated to leadership.
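
As a small illustration of the credential-management point in item 3, the sketch below reads a database credential from AWS Secrets Manager at runtime instead of embedding it in code. The secret name and JSON keys are hypothetical placeholders, not values from this post.

```python
# Minimal sketch: fetch a credential from AWS Secrets Manager at runtime
# rather than hard-coding it. The secret name and keys are hypothetical.
import json
import boto3

secretsmanager = boto3.client("secretsmanager", region_name="us-east-1")

def get_db_credentials(secret_id="prod/app/db-credentials"):
    # Retrieve and decode the secret; access is governed by IAM policy,
    # and the API call is recorded in AWS CloudTrail.
    response = secretsmanager.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])

creds = get_db_credentials()
print("Retrieved credential for user:", creds.get("username"))
```

Because Secrets Manager can also rotate secrets automatically, this pattern supports the broader goal of reducing long-lived human access to data.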

Final Thoughts

Queen’s famous lyric, “I want it all, I want it all, and I want it now,” accurately captures the sentiment at re:Invent this year. Security will always be job zero for us, and we continue to iterate on behalf of customers so they can securely build, experiment and create … right now! AWS is trusted by many of the world’s most risk-sensitive organizations precisely because we have demonstrated this unwavering commitment to putting security above all. Still, I believe we are in the early days of innovation and adoption of the cloud, and I look forward to seeing both the gains and use cases that come out of our latest batch of tools and services.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Author

Steve Schmidt

Steve is Vice President and Chief Information Security Officer for AWS. His duties include leading product design, management, and engineering development efforts focused on bringing the competitive, economic, and security benefits of cloud computing to business and government customers. Prior to AWS, he had an extensive career at the Federal Bureau of Investigation, where he served as a senior executive and section chief. He currently holds five patents in the field of cloud security architecture. Follow Steve on Twitter