
Security Compliance at Cloudflare

Post Syndicated from Rebecca Rogers original https://blog.cloudflare.com/security-compliance-at-cloudflare/


Cloudflare believes trust is fundamental to helping build a better Internet. One way Cloudflare is helping our customers earn their users’ trust is through industry-standard security compliance certifications and regulations.

Security compliance certifications are reports created by independent, third-party auditors that validate and document a company’s commitment to security. These external auditors conduct a rigorous review of a company’s technical environment and evaluate whether thorough controls – or safeguards – are in place to protect the security, confidentiality, and availability of information stored and processed in the environment. SOC 2, established by the American Institute of CPAs, is important to many U.S. companies because it defines a standardized set of requirements a company must meet in order to comply, while PCI DSS and ISO 27001 are recognized internationally. Cloudflare cares about achieving certifications because our adherence to these standards gives customers across the globe confidence that we are committed to security. So, the Security team has been hard at work obtaining these meaningful compliance certifications.

Since the beginning of this year, we have renewed our PCI DSS certification in February, achieved SOC 2 Type 1 compliance in March, and obtained our ISO 27001 certification in April. Today, we are proud to announce that we are SOC 2 Type 2 compliant!

Our SOC 2 Journey

SOC 2 is a compliance certification that focuses on an organization’s internal controls related to five trust services criteria: Security, Confidentiality, Availability, Processing Integrity, and Privacy. Each criterion comprises a set of control standards, established by the American Institute of Certified Public Accountants (AICPA), to be used when implementing controls over a company’s information systems.

Cloudflare’s Security team made the decision to evaluate our company’s controls against three of the five criteria. We chose to pursue our SOC 2 compliance by evaluating our controls around Security, Confidentiality, and Availability across our entire organization. We first worked across the company to design and implement strong controls that meet the requirements set forth by the AICPA. This took effort and collaboration between teams in Engineering, IT, Legal, and HR to create strong controls that also make sense for our environment. Our external auditors then performed an audit of Cloudflare’s controls and determined that our security controls were suitably designed as of January 31, 2019.


Three months after obtaining SOC 2 Type 1 compliance, the next step for Cloudflare was to demonstrate that the controls we designed were actually operating effectively. Our SOC 2 Type 2 audit tested the operating effectiveness of Cloudflare’s security controls over this three-month period. Cloudflare’s SOC 2 Type 2 report is available upon request and describes the design of Cloudflare’s internal control framework around security, confidentiality, and availability, as well as the products and services in scope for our certification.

What else?

SOC 3

In addition to SOC 2 Type 2, Cloudflare also obtained our SOC 3 report from our independent external auditors. SOC 3 is a publicly available report containing the external auditor’s opinion and a narrative description of Cloudflare’s control environment. Cloudflare’s Security team decided to obtain our SOC 3 report so that all customers and prospects could access our auditor’s opinion of our implementation of security, confidentiality, and availability controls.

ISO/IEC 27001:2013

Prior to Cloudflare’s SOC audit, Cloudflare was working to mature our organization’s Information Security Management System in order to obtain our ISO/IEC 27001:2013 certification. ISO 27001 is an international management system standard developed by the International Organization for Standardization (ISO) and is an industry-wide accepted information security certification. Cloudflare pursued ISO/IEC 27001:2013 certification to demonstrate to our customers that we are committed to preserving the confidentiality, integrity, and availability of information on a global scale.

ISO 27001:2013 requirements focus primarily on the implementation of an Information Security Management System (ISMS) and a comprehensive risk management program. Cloudflare worked across the organization to implement the ISMS and ensure sensitive company information remains secure.


Cloudflare’s ISMS was assessed by a third-party auditor, A-LIGN, and we received our ISO 27001:2013 certification in April 2019. Cloudflare’s ISO 27001:2013 certificate is also available to customers upon request.

PCI DSS v3.2.1

Although Cloudflare has been PCI certified as a Level 1 Service Provider since 2014, our latest certification adheres to the newest security standards. The Payment Card Industry Data Security Standard (PCI DSS) is a global financial information security standard that ensures customers’ credit card data is kept safe and secure.

Maintaining PCI DSS compliance is important for Cloudflare because we are evaluated not only as a merchant, but also as a service provider. Cloudflare’s WAF product satisfies PCI requirement 6.6 and may be used by Cloudflare’s customers as a solution to prevent web-based attacks in front of public-facing web applications.


Early in 2019, Cloudflare was audited by an independent Qualified Security Assessor to validate our adherence to the PCI DSS security requirements. Cloudflare’s latest PCI Attestation of Compliance (AOC) is available to customers upon request.

Compliance Page on the Website

Cloudflare is committed to helping our customers earn their users’ trust by ensuring our products are secure. The Security team is committed to adhering to security compliance certifications and regulations that maintain the security, confidentiality, and availability of company and client information.
In order to help our customers keep track of the latest certifications, Cloudflare has launched our Compliance certification page – www.cloudflare.com/compliance. Today, you can view our status on all compliance certifications and download our SOC 3 report.

Spring 2019 SOC 2 Type 1 Privacy report now available

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/spring-2019-soc-2-type-1-privacy-report-now-available/

At AWS, our customers’ security and privacy are of the highest importance, and we continue to provide transparency into our security and privacy posture. Following our first SOC 2 Type 1 Privacy report, released in December 2018, AWS is proud to announce the release of the Spring 2019 SOC 2 Type 1 Privacy report. The Spring 2019 SOC 2 Privacy report provides you with a third-party attestation of our systems and the suitability of the design of our privacy controls. The report also provides a detailed description of those controls, the same controls that AWS uses to address the GDPR requirements around data security and privacy.

This updated report is a part of the SOC family of reports, and has been updated to align with the new Association of International Certified Professional Accountants (AICPA) Trust Service Criteria. The Trust Service Criteria align with the Committee of Sponsoring Organizations of the Treadway Commission (COSO) 2013 framework which has been designed to better address cybersecurity risks.

The highlights of the new Trust Service Criteria include:

  • A definition of principal service commitments and system requirements.
  • Restructuring and addition of supplemental criteria to better address cybersecurity risks.
  • New description criteria requiring the disclosure of system incidents.

The scope of the privacy report includes systems that AWS uses to collect personal information and all 104 services and locations in scope for the latest AWS SOC reports. You can download the new SOC 2 Type I Privacy report now through AWS Artifact in the AWS Management Console.

As always, we value your feedback and questions. Please feel free to reach out to the team through the Contact Us page.

Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.

Registration for AWS re:Inforce 2019 now open!

Post Syndicated from Stephen Schmidt original https://aws.amazon.com/blogs/security/registration-for-aws-reinforce-2019-now-open/

AWS re:Inforce

In late November, I announced AWS re:Inforce, a standalone conference where we will deep dive into the latest approaches to security, identity, and risk management using AWS services, features, and tools. Now, after months of planning, the time has arrived to open registration! Ticket sales begin on March 12th at 10:00am PDT, and you can access the ticket sales website here. We do expect to sell out, so please consider registering soon to secure a hotel as well (and take advantage of our travel discounts). In celebration, we are offering a limited, while-supplies-last $300 discount on the full conference ticket price of $1,099. Register with code RFSAL19 to take advantage of this limited offer.

The benefits of attending AWS re:Inforce 2019 are considerable. The conference will be built around gaining hands-on tactical knowledge of cloud security, identity, and compliance. Over 100 security-specific AWS Partners will be featured in our learning hub to help you tackle all manner of security concerns. Additionally, we’ll have bootcamps where you can meet with likeminded professionals to learn skills that are applicable to your individual job scope. More details about specific session offerings will be announced in the next few weeks, but you can already find details on the track types and session levels here.

Taking a step back for a moment, creating a conference focused on cloud security was important to AWS because, as we’ve often stated, security is job zero for us. While re:Invent is a great opportunity to check in yearly with customers on our new features and services, we felt a conference tailored specifically to cloud security and identity professionals offered a great opportunity for everyone to strengthen their own security program from the ground up. We’ll have four tracks, geared for those just starting out all the way up to next-generation aspirational security.

We want to be at the forefront of an industry shift from reactive to proactive security, and our inaugural re:Inforce gathering is a great chance for us to hear from customers about their real-world concerns, from encryption to resiliency. We also think building an ongoing community of security stakeholders is critical; we know that excellent guidance for customers doesn’t always come directly from AWS. It can also spring from peer conversations and networking opportunities. The strength of the AWS cloud is our customers, who see use cases every day that both inform our security roadmap and make our cloud stronger for everyone. Simply put, there is no AWS security story without the tremendous diligence of customers and partners.

Creating a space where all parties can come together to exchange knowledge and ideas, whether in a formal session or at a casual dinner, was at the forefront of our thinking when we first considered launching re:Inforce. Seeing the threads and details of this re:Inforce come together has been personally exciting and professionally validating; I can’t wait to see you all there in late June.

Purchase tickets for AWS re:Inforce via the ticket sales website here.


Follow Steve on Twitter.

Author

Steve Schmidt

Steve is Vice President and Chief Information Security Officer for AWS. His duties include leading product design, management, and engineering development efforts focused on bringing the competitive, economic, and security benefits of cloud computing to business and government customers. Prior to AWS, he had an extensive career at the Federal Bureau of Investigation, where he served as a senior executive and section chief. He currently holds five patents in the field of cloud security architecture.

New AWS services launch with HIPAA, PCI, ISO, and SOC – a company first

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/new-aws-services-launch-with-hipaa-pci-iso-and-soc/

Our security culture is one of the things that sets AWS apart. Security is job zero — it is the foundation for all AWS employees and impacts the work we do every day, across the company. And that’s reflected in our services, which undergo exacting internal and external security reviews before being released. From there, we have historically waited for customer demand to begin the complex process of third-party assessment and validating services under specific compliance programs. However, we’ve heard you tell us you want every generally available (GA) service in scope to keep up with the pace of your innovation and at the same time, meet rigorous compliance and regulatory requirements.

I wanted to share how we’re meeting this challenge with a more proactive approach to service certification by certifying services at launch. For the first time, we’ve launched new GA services with PCI DSS, ISO 9001/27001/27017/27018, SOC 2, and HIPAA eligibility. That means customers who rely on or require these compliance programs can select from 10 brand new services right away, without having to wait for one or more trailing audit cycles.

Verifying the security and compliance of the following new services is as simple as going to the console and using AWS Artifact to download the audit reports.

  • Amazon DocumentDB (with MongoDB compatibility) [HIPAA, PCI, ISO, SOC 2]
  • Amazon FSx [HIPAA, PCI, ISO]
  • Amazon Route 53 Resolver [ISO]
  • AWS Amplify [HIPAA, ISO]
  • AWS DataSync [HIPAA, PCI, ISO]
  • AWS Elemental MediaConnect [HIPAA, PCI, ISO]
  • AWS Global Accelerator [PCI, ISO]
  • AWS License Manager [ISO]
  • AWS RoboMaker [HIPAA, PCI, ISO]
  • AWS Transfer for SFTP [HIPAA, PCI, ISO]

This proactive compliance approach means we move upstream in the product development process. Over the last several months, we’ve made significant process improvements to deliver additional services with compliance certifications and HIPAA eligibility. Our security, compliance, and service teams have partnered in new ways to implement controls and audit earlier in a service’s development phase to demonstrate operating effectiveness. We also integrated auditing mechanisms into multiple stages of the launch process, enabling our security and compliance teams, as well as auditors, to assess controls throughout a service’s preview period. Additionally, we increased our audit frequency to meet services’ GA deadlines.

The work reflects a meaningful shift in our business. We’re excited to get these services into your hands sooner and wanted to report our overall progress. We also ask for your continued feedback since it drives our decisions and prioritization. Because going forward, we’ll continue to iterate and innovate until all of our services are certified at launch.

New SOC 2 Report Available: Privacy

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/new-soc-2-report-available-privacy/

Maintaining your trust is an ongoing commitment of ours, and your voice drives our growing portfolio of compliance reports, attestations, and certifications. As a result of your feedback and deep interest in privacy and data security, we are happy to announce the publication of our new SOC 2 Type I Privacy report.

Keeping you informed of the privacy and data security policies, practices, and technologies we’ve put in place is important to us. The SOC 2 Privacy Type I report is complementary to that effort. The SOC 2 Privacy Trust Principle, developed by the American Institute of CPAs (AICPA), establishes the criteria for evaluating controls related to how personal information is collected, used, retained, disclosed, and disposed of to meet the entity’s objectives. The AWS SOC 2 Privacy Type I report provides you with a third-party attestation of our systems and the suitability of the design of our privacy controls, as stated in our Privacy Notice.

The scope of the privacy report includes systems AWS uses to collect personal information and all 72 services and locations in scope for the latest AWS SOC reports. You can download the new SOC 2 Type I Privacy report now through AWS Artifact in the AWS Management Console.

As always, we value your feedback and questions. Please feel free to reach out to the team through the Contact Us page.


New podcast: VP of Security answers your compliance and data privacy questions

Post Syndicated from Katie Doptis original https://aws.amazon.com/blogs/security/new-podcast-vp-of-security-answers-your-compliance-and-data-privacy-questions/

Does AWS comply with X program? How about GDPR? What about after Brexit? And what happens with machine learning data?

In the latest AWS Security & Compliance Podcast, we sit down with VP of Security Chad Woolf, who answers your compliance and data privacy questions, including one of the most frequently asked questions from customers around the world: how many compliance programs does AWS have, attest to, and audit against?

Chad also shares what it was like to work at AWS in the early days. When he joined, AWS was housed on just a handful of floors, in a single building. Over the course of nearly nine years with the company, he has witnessed tremendous growth of the business and industry.

Listen to the podcast to hear about company history and get answers to your tough questions. If you have a compliance or data privacy question, you can submit it through our contact us form.

Want more AWS news? Follow us on Twitter.

AWS re:Invent Security Recap: Launches, Enhancements, and Takeaways

Post Syndicated from Stephen Schmidt original https://aws.amazon.com/blogs/security/aws-reinvent-security-recap-launches-enhancements-and-takeaways/

For more from Steve, follow him on Twitter

Customers continue to tell me that our AWS re:Invent conference is a winner. It’s a place where they can learn, meet their peers, and rediscover the art of the possible. Of course, there is always an air of anticipation around what new AWS service releases will be announced. This time around, we went bigger than ever before. There were over 50,000 people in attendance, spread across the Las Vegas strip, with over 2,000 breakout sessions and jam-packed hands-on learning opportunities, including multi-day hackathons, workshops, and bootcamps.

A big part of all this activity included sharing knowledge about the latest AWS Security, Identity and Compliance services and features, as well as announcing new technology that we’re excited to see adopted so quickly across so many use cases.

Here are the top Security, Identity and Compliance releases from re:Invent 2018:

Keynotes: All that’s new

New AWS offerings provide more prescriptive guidance

The AWS re:Invent keynotes from Andy Jassy, Werner Vogels, and Peter DeSantis, as well as my own leadership session, featured the following new releases and service enhancements. We continue to strive to make architecting easier for developers, as well as our partners and our customers, so they stay secure as they build and innovate in the cloud.

  • We launched several prescriptive security services to assist developers and customers in understanding and managing their security and compliance postures in real time. My favorite new service is AWS Security Hub, which helps you centrally manage your security and compliance controls. With Security Hub, you now have a single place that aggregates, organizes, and prioritizes your security alerts, or findings, from multiple AWS services, such as Amazon GuardDuty, Amazon Inspector, and Amazon Macie, as well as from AWS Partner solutions. Findings are visually summarized on integrated dashboards with actionable graphs and tables. You can also continuously monitor your environment using automated compliance checks based on the AWS best practices and industry standards your organization follows. You can enable Security Hub on a single account with one click in the AWS Security Hub console or with a single API call, and once enabled, Security Hub begins aggregating and prioritizing findings.
  • Another prescriptive service we launched is called AWS Control Tower. One of the first things customers think about when moving to the cloud is how to set up a landing zone for their data. AWS Control Tower removes the guesswork, automating the set-up of an AWS landing zone that is secure, well-architected, and supports multiple accounts. AWS Control Tower does this by using a set of blueprints that embody AWS best practices. Guardrails, both mandatory and recommended, are available for high-level, rule-based governance, allowing you to have the right operational control over your accounts. An integrated dashboard enables you to keep a watchful eye over the accounts provisioned, the guardrails that are enabled, and your overall compliance status. Sign up for the Control Tower preview here.
  • The third prescriptive service, called AWS Lake Formation, will reduce your data lake build time from months to days. Prior to AWS Lake Formation, setting up a data lake involved numerous granular tasks. Creating a data lake with Lake Formation is as simple as defining where your data resides and what data access and security policies you want to apply. Lake Formation then collects and catalogs data from databases and object storage, moves the data into your new Amazon S3 data lake, cleans and classifies data using machine learning algorithms, and secures access to your sensitive data. Get started with a preview of AWS Lake Formation here.
  • Next up, AWS IoT Greengrass enables enhanced security through hardware-root-of-trust private key storage on hardware secure elements, including Trusted Platform Modules (TPMs) and Hardware Security Modules (HSMs). Storing your private key on a hardware secure element adds hardware-root-of-trust security to existing AWS IoT Greengrass security features that include X.509 certificates for TLS mutual authentication and encryption of data both in transit and at rest. You can also use the hardware secure element to protect secrets that you deploy to your AWS IoT Greengrass device using AWS IoT Greengrass Secrets Manager. To try these security enhancements for yourself, check out https://aws.amazon.com/greengrass/.
  • You can now use the AWS Key Management Service (KMS) custom key store feature to gain more control over your KMS keys. Previously, KMS offered the ability to store keys in shared HSMs managed by KMS. However, we heard from customers that their needs were more nuanced. In particular, they needed to manage keys in single-tenant HSMs under their exclusive control. With KMS custom key store, you can configure your own CloudHSM cluster and authorize KMS to use it as a dedicated key store for your keys. Then, when you create keys in KMS, you can choose to generate the key material in your CloudHSM cluster. Get started with KMS custom key store by following the steps in this blog post.
  • We’re excited to announce the release of ATO on AWS to help customers and partners speed up the FedRAMP approval process (which has traditionally taken SaaS providers up to 2 years to complete). We’ve already had customers, such as Smartsheet, complete the process in less than 90 days with ATO on AWS. Customers will have access to training, tools, pre-built CloudFormation templates, control implementation details, and pre-built artifacts. Additionally, customers are able to access direct engagement and guidance from AWS compliance specialists and support from expert AWS consulting and technology partners who are a part of our Security Automation and Orchestration (SAO) initiative, including GitHub, Yubico, RedHat, Splunk, Allgress, Puppet, Trend Micro, Telos, CloudCheckr, Saint, Center for Internet Security (CIS), OKTA, Barracuda, Anitian, Kratos, and Coalfire. To get started with ATO on AWS, contact the AWS partner team at [email protected].
  • Finally, I announced our first conference dedicated to cloud security, identity and compliance: AWS re:Inforce. The inaugural AWS re:Inforce, a hands-on gathering of like-minded security professionals, will take place in Boston, MA on June 25th and 26th, 2019 at the Boston Convention and Exhibition Center. The cost for a full conference pass will be $1,099. I’m hoping to see you all there. Sign up here to be notified of when registration opens.
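The one-click or single-API-call Security Hub enablement described in the first bullet above can also be captured as infrastructure as code. The fragment below is a minimal sketch, not something announced in this post: the `AWS::SecurityHub::Hub` resource type is taken from the CloudFormation resource reference (support for it shipped separately from the service launch), and the logical name `SecurityHubAccount` is illustrative.

```yaml
# Minimal CloudFormation sketch: enables AWS Security Hub for the
# account and region in which the stack is deployed. The
# AWS::SecurityHub::Hub resource requires no properties; creating it
# is equivalent to the one-click/single-API-call enablement above.
Resources:
  SecurityHubAccount:
    Type: 'AWS::SecurityHub::Hub'
```

Deleting the stack disables Security Hub in that account and region; while the hub resource exists, findings from services such as GuardDuty, Inspector, and Macie flow into the Security Hub dashboards.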

Key re:Invent Takeaways

AWS is here to help you build

  1. Customers want to innovate, and the cloud needs to securely enable this. Companies need to be able to innovate to meet rapidly evolving consumer demands. This means they need cloud security capabilities they can rely on to meet their specific security requirements, while allowing them to continue to meet and exceed customer expectations. AWS Lake Formation, AWS Control Tower, and AWS Security Hub aggregate and automate otherwise manual processes involved with setting up a secure and compliant cloud environment, giving customers greater flexibility to innovate, create, and manage their businesses.
  2. Cloud Security is as much art as it is science. Getting to what you really need to know about your security posture can be a challenge. At AWS, we’ve found that the sweet spot lies in services and features that enable you to continuously gain greater depth of knowledge into your security posture, while automating mission critical tasks that relieve you from having to constantly monitor your infrastructure. This manifests itself in having an end-to-end automated remediation workflow. I spent some time covering this in my re:Invent session, and will continue to advocate using a combination of services, such as AWS Lambda, WAF, S3, AWS CloudTrail, and AWS Config to proactively identify, mitigate, and remediate threats that may arise as your infrastructure evolves.
  3. Remove human access to data. I’ve set a goal at AWS to reduce human access to data by 80%. While that number may sound lofty, it’s purposeful, because the only way to achieve this is through automation. There have been a number of security incidents in the news across industries, ranging from inappropriate access to personal information in healthcare, to credential stuffing in financial services. The way to protect against such incidents? Automate key security measures and minimize your attack surface by enabling access control and credential management with services like AWS IAM and AWS Secrets Manager. Additional gains can be found by leveraging threat intelligence through continuous monitoring of incidents via services such as Amazon GuardDuty, Amazon Inspector, and Amazon Macie (intelligence from these services will now be available in AWS Security Hub).
  4. Get your leadership on board with your security plan. We offer 500+ security services and features; however, new services and technology can’t be wholly responsible for implementing reliable security measures. Security teams need to set expectations with leadership early, aligning on a number of critical protocols, including how to restrict and monitor human access to data, patching and log retention duration, credential lifespan, blast radius reduction, embedded encryption throughout AWS architecture, and canaries and invariants for security functionality. It’s also important to set security Key Performance Indicators (KPIs) to continuously track. At AWS, we monitor the number of AppSec reviews, how many security checks we can automate, third-party compliance audits, metrics on internal time spent, and conformity with Service Level Agreements (SLAs). While the needs of your business may vary, we find baseline KPIs to be consistent measures of security assurance that can be easily communicated to leadership.

Final Thoughts

Queen’s famous lyric, “I want it all, I want it all, and I want it now,” accurately captures the sentiment at re:Invent this year. Security will always be job zero for us, and we continue to iterate on behalf of customers so they can securely build, experiment and create … right now! AWS is trusted by many of the world’s most risk-sensitive organizations precisely because we have demonstrated this unwavering commitment to putting security above all. Still, I believe we are in the early days of innovation and adoption of the cloud, and I look forward to seeing both the gains and use cases that come out of our latest batch of tools and services.



2018 ISO certificates are here, with a 70% increase in in-scope services

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/2018-iso-certificates-are-here-with-a-70-increase-of-in-scope-services/

In just the last year, we’ve increased the number of ISO services in scope by 70%. That makes 114 services in total that have been validated against ISO 9001, 27001, 27017, and 27018.

The following services are new to our ISO program:

  • Amazon AppStream 2.0
  • Amazon Athena
  • Amazon Chime
  • Amazon CloudWatch Events
  • Amazon CloudWatch
  • Amazon Comprehend
  • Amazon Elastic Container Service for Kubernetes
  • Amazon Elasticsearch Service
  • Amazon FreeRTOS
  • Amazon FSx*
  • Amazon GuardDuty
  • Amazon Kinesis Data Analytics
  • Amazon Kinesis Data Firehose
  • Amazon Kinesis Video Streams
  • Amazon MQ
  • Amazon Neptune
  • Amazon Pinpoint
  • Amazon Polly
  • Amazon Rekognition
  • Amazon Transcribe
  • Amazon Translate
  • AWS Amplify*
  • AWS AppSync
  • AWS Artifact
  • AWS Certificate Manager
  • AWS CodeStar
  • AWS DataSync*
  • AWS Device Farm
  • AWS Elemental MediaConnect*
  • AWS Elemental MediaConvert
  • AWS Elemental MediaLive
  • AWS Firewall Manager
  • AWS Global Accelerator*
  • AWS Glue
  • AWS IoT Greengrass
  • AWS IoT 1-Click
  • AWS IoT Analytics
  • AWS License Manager*
  • AWS OpsWorks CM [includes Chef Automate, Puppet Enterprise]
  • AWS Organizations
  • AWS RoboMaker*
  • AWS Secrets Manager
  • AWS Server Migration Service
  • AWS Serverless Application Repository
  • AWS Service Catalog
  • AWS Single Sign-On
  • AWS Transfer for SFTP*
  • AWS Trusted Advisor
  • Amazon Route 53 Resolver*

*New Service

The latest certificates for ISO 9001, 27001, 27017, and 27018 are now available, giving you insight into our information security management system from third-party auditors. They contain the full list of AWS locations in scope and reference the ISO Certified webpage, which includes all services in scope. For convenience, you can also download the certificates in the console via AWS Artifact.

We’re clearly accelerating the pace that we add services in scope, but our ultimate goal is to eliminate your wait for compliant services altogether. To that end, in this latest audit cycle 9 of the 51 services added, launched generally available with the certifications at re:Invent 2018.

Want more AWS Security news? Follow us on Twitter.

New PCI DSS report now available, 31 services added to scope

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/new-pci-dss-report-now-available-31-services-added-to-scope/

In just the last 6 months, we’ve increased the number of Payment Card Industry Data Security Standard (PCI DSS) certified services by 50%. We were evaluated by third-party auditors from Coalfire and the latest report is now available on AWS Artifact.

I would especially like to call out the six new services (marked with asterisks) that just launched as generally available at re:Invent with PCI certification. We're increasing the rate at which we add existing services in scope and are also launching new services PCI certified, enabling you to use them for regulated workloads sooner. The goal is for all of our services to have compliance certifications so you never have to wait to verify their security and compliance posture. Additional work to that end is already underway, and we'll update you about our progress at every significant milestone.

With the addition of the following 31 services, you can now select from a total of 93 PCI-compliant services. To see the full list, go to our Services in Scope by Compliance Program page.

  • Amazon Athena
  • Amazon Comprehend
  • Amazon Elastic Container Service for Kubernetes (EKS)
  • Amazon Elasticsearch Service
  • Amazon FreeRTOS
  • Amazon FSx*
  • Amazon GuardDuty
  • Amazon Kinesis Data Analytics
  • Amazon Kinesis Data Firehose
  • Amazon Kinesis Video Streams
  • Amazon MQ
  • Amazon Neptune
  • Amazon Rekognition
  • Amazon Transcribe
  • Amazon Translate
  • AWS AppSync
  • AWS Certificate Manager (ACM)
  • AWS DataSync*
  • AWS Elemental MediaConnect*
  • AWS Global Accelerator*
  • AWS Glue
  • AWS Greengrass
  • AWS IoT Core {includes Device Management}
  • AWS OpsWorks for Chef Automate {includes Puppet Enterprise}
  • AWS RoboMaker*
  • AWS Secrets Manager
  • AWS Serverless Application Repository
  • AWS Server Migration Service (SMS)
  • AWS Step Functions
  • AWS Transfer for SFTP*
  • VM Import/Export

*New Service

If you want to know more about our compliance programs or provide feedback, please contact us. Your feedback helps us prioritize our decisions and innovate our programs.

Want more AWS Security news? Follow us on Twitter.

Scaling a governance, risk, and compliance program for the cloud, emerging technologies, and innovation

Post Syndicated from Michael South original https://aws.amazon.com/blogs/security/scaling-a-governance-risk-and-compliance-program-for-the-cloud/

Governance, risk, and compliance (GRC) programs are sometimes looked upon as the bureaucracy getting in the way of exciting cybersecurity work. But a good GRC program establishes the foundation for meeting security and compliance objectives. It is the proactive approach to cybersecurity that, if done well, minimizes reactive incident response.

Of the three components of cybersecurity—people, processes, and technology—technology is often viewed as the "easy button" because, in relative terms, it's simpler than drafting a policy with the right balance of flexibility and specificity or managing countless organizational principles and human behaviors. Still, as much as we promote technology and automation at AWS, we also understand that automating a bad process with the latest technology doesn't make the process or outcome better. Cybersecurity must incorporate all three aspects with a programmatic approach that scales. To reach that goal, an effective GRC program is essential because it ensures a holistic view has been taken while tackling the daunting mission of cybersecurity.

Although governance, risk, and compliance are oftentimes viewed as separate functions, there is a symbiotic relationship between them. Governance establishes the strategy and guardrails for meeting specific requirements that align and support the business. Risk management connects specific controls to governance and assessed risks, and provides business leaders with the information they need to prioritize resources and make risk-informed decisions. Compliance is the adherence and monitoring of controls to specific governance requirements. It is also the “ante” to play the game in certain industries and, with continuous monitoring, closes the feedback loop regarding effective governance. Security architecture, engineering, and operations are built upon the GRC foundation.

Without a GRC program, people tend to solely focus on technology and stove-pipe processes. For example, say a security operations employee is faced with four events to research and mitigate. In the absence of a GRC program, the staffer would have no context about the business risk or compliance impact of the events, which could lead them to prioritize the least important issue.

GRC relationship model

GRC has a symbiotic relationship

The breadth and depth of a GRC program varies with each organization. Regardless of its simplicity or complexity, there are opportunities to transform or scale that program for the adoption of cloud services, emerging technologies, and other future innovations.

Below is a checklist of best practices to help you on your journey. The key takeaways of the checklist are: base governance on objectives and capabilities, include risk context in decision-making, and automate monitoring and response.

Governance

Identify compliance requirements

__ Identify required compliance frameworks (such as HIPAA or PCI) and contract/agreement obligations.

__ Identify restrictions/limitations to cloud or emerging technologies.

__ Identify required or chosen standards to implement (for example NIST, ISO, COBIT, CSA, CIS, etc.).

Conduct program assessment

__ Conduct a program assessment based on industry processes such as the NIST Cyber Security Framework (CSF) or ISO/IEC TR 27103:2018 to understand the capability and maturity of your current profile.

__ Determine desired end-state capability and maturity, also known as target profile.

__ Document and prioritize gaps (people, process, and technologies) for resource allocation.

__ Build a Cloud Center of Excellence (CCoE) team.

__ Draft and publish a cloud strategy that includes procurement, DevSecOps, management, and security.

__ Define and assign functions, roles, and responsibilities.

Update and publish policies, processes, procedures

__ Update policies based on objectives and desired capabilities that align to your business.

__ Update processes for modern organization and management techniques such as DevSecOps and Agile, specifying how to upgrade old technologies.

__ Update procedures to integrate cloud services and other emerging technologies.

__ Establish technical governance standards that will be used to select controls and monitor compliance.

Risk management

Conduct a risk assessment*

__ Conduct or update an organizational risk assessment (e.g., market, financial, reputation, legal, etc.).

__ Conduct or update a risk assessment for each business line (such as mission, market, products/services, financial, etc.).

__ Conduct or update a risk assessment for each asset type.

* The use of pre-established threat models can simplify the risk assessment process, both initial and updates.

Draft risk plans

__ Implement plans to mitigate, avoid, transfer, or accept risk at each tier, business line, and asset (for example, a business continuity plan, a continuity of operations plan, a systems security plan).

__ Implement plans for specific risk areas (such as supply chain risk, insider threat).

Authorize systems

__ Use NIST Risk Management Framework (RMF) or other process to authorize and track systems.

__ Use NIST Special Publication 800-53, ISO/IEC 27002:2013, or another control set to select, implement, and assess controls based on risk.

__ Implement continuous monitoring of controls and risk, employing automation to the greatest extent possible.

Incorporate risk information into decisions

__ Link system risk to business and organizational risk

__ Automate translation of continuous system risk monitoring and status to business and org risk

__ Incorporate “What’s the risk?” (financial, cyber, legal, reputation) into leadership decision-making

Compliance

Monitor compliance with policy, standards, and security controls

__ Automate technical control monitoring and reporting (advanced maturity will lead to AI/ML).

__ Implement manual monitoring of non-technical controls (for example periodic review of visitor logs).

__ Link compliance monitoring with security information and event management (SIEM) and other tools.

Continually self-assess

__ Automate application security testing and vulnerability scans.

__ Conduct periodic self-assessments from sampling of controls, entire functional area, and pen-tests.

__ Be overly critical of assumptions, perspectives, and artifacts.

Respond to events and changes to risk

__ Integrate security operations with the compliance team for response management.

__ Establish standard operating procedures to respond to unintentional changes in controls.

__ Mitigate impact and reset affected control(s); automate as much as possible.

Communicate events and changes to risk

__ Establish a reporting tree and thresholds for each type of incident.

__ Include general counsel in reporting.

__ Ensure applicable regulatory authorities are notified when required.

__ Automate where appropriate.

Want more AWS Security news? Follow us on Twitter.

Michael South

Michael joined AWS in 2017 as the Americas Regional Leader for public sector security and compliance business development. He supports customers who want to achieve business objectives and improve their security and compliance in the cloud. His customers span across the public sector, including: federal governments, militaries, state/provincial governments, academic institutions, and non-profits from North to South America. Prior to AWS, Michael was the Deputy Chief Information Security Officer for the city of Washington, DC and the U.S. Navy’s Chief Information Officer for Japan.

How federal agencies can leverage AWS to extend CDM programs and CIO Metric Reporting

Post Syndicated from Darren House original https://aws.amazon.com/blogs/security/how-federal-agencies-can-leverage-aws-to-extend-cdm-programs-and-cio-metric-reporting/

Continuous Diagnostics and Mitigation (CDM), a U.S. Department of Homeland Security cybersecurity program, is gaining new visibility as part of the federal government’s overall focus on securing its information and networks. How an agency performs against CDM will soon be regularly reported in the updated Federal Information Technology Acquisition Reform Act (FITARA) scorecard. That’s in addition to updates in the President’s Management Agenda. Because of this additional scrutiny, there are many questions about how cloud service providers can enable the CDM journey.

This blog will explain how you can implement a CDM program—or extend an existing one—within your AWS environment, and how you can use AWS capabilities to provide real-time CDM compliance and FISMA reporting.

When it comes to compliance for departments and agencies, the AWS Shared Responsibility Model describes how AWS is responsible for security of the cloud and the customer is responsible for security in the cloud. The Shared Responsibility Model is segmented into three categories: (1) infrastructure services (see Figure 1 below), (2) container services, and (3) abstracted services, each having a different level of controls customers can inherit to minimize their effort in managing and maintaining compliance and audit requirements. For example, for workloads that fall under infrastructure services in the Shared Responsibility Model, AWS helps relieve your operational burden by operating, managing, and controlling the components from the host operating system and virtualization layer down to the physical security of the facilities in which the services operate. This also relates to IT controls you can inherit through AWS compliance programs. Before their journey to the AWS Cloud, a customer may have been responsible for the entire control set in their compliance and auditing program. With AWS, you can inherit controls from AWS compliance programs, allowing you to focus on the workloads and data you put in the cloud.

Figure 1: AWS Infrastructure Services

For example, if you deploy infrastructure services such as Amazon Virtual Private Cloud (Amazon VPC) networking and Amazon Elastic Compute Cloud (Amazon EC2) instances, you can base your CDM compliance controls on the AWS controls you inherit for network infrastructure, virtualization, and physical security. You would be responsible for managing things like change management for Amazon EC2 AMIs, operating system and patching, AWS Config management, AWS Identity and Access Management (IAM), and encryption of services at rest and in transit.

If you deploy container services such as Amazon Relational Database Service (Amazon RDS) or Amazon EMR that build on top of infrastructure services, AWS manages the underlying infrastructure virtualization, physical controls, and the OS and associated IT controls, like patching and change management. You inherit the IT security controls from AWS compliance programs and can use them as artifacts in your compliance program. For example, you can request a SOC report for one of our 62 SOC-compliant services available from AWS Artifact. You are still responsible for the controls of what you put in the cloud.

Another example is if you deploy abstracted services such as Amazon Simple Storage Service (S3) or Amazon DynamoDB. AWS is responsible for the IT controls applicable to the services we provide. You are responsible for managing your data, classifying your assets, using IAM to apply permissions to individual resources at the platform level, or managing permissions based on user identity or user responsibility at the IAM user/group level.

For agencies struggling to comply with FISMA requirements and thus CDM reporting, leveraging abstracted and container services means you now have the ability to inherit more controls from AWS, reducing the resource drain and cost of FISMA compliance.

So, what does this all mean for your CDM program on AWS? It means that AWS can help agencies realize CDM’s vision of near-real time FISMA reporting for infrastructure, container, and abstracted services. The following paragraphs explain how you can leverage existing AWS Services and solutions that support CDM.

1.0 Identify

You can meet CDM Asset Management (AM) 1.1 – 1.3 by using AWS Config to track AWS resource configurations, the Resource Groups Tagging API to manage FISMA classifications of your cloud infrastructure, and automation to tag Amazon EC2 resources.

For AM 1.4, AWS Config identifies all cloud assets in your AWS account, which can be stored in a DynamoDB table in the reporting format required. AWS Config rules can enforce and report on compliance, ensuring all Amazon Elastic Block Store (EBS) volumes (block level storage) and Amazon S3 buckets (object storage) are encrypted.
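As a sketch of the enforcement pattern above: a custom AWS Config rule is typically backed by a Lambda function that classifies each configuration item. This minimal example flags unencrypted EBS volumes; for simplicity it returns the verdict directly rather than calling the real `put_evaluations` API, and the field names follow the AWS Config configuration-item shape.

```python
import json

def evaluate_ebs_encryption(configuration_item):
    """Return an AWS Config compliance verdict for an EBS volume.

    COMPLIANT when the volume reports encrypted=True, NON_COMPLIANT otherwise;
    resources that aren't EBS volumes are NOT_APPLICABLE.
    """
    if configuration_item.get("resourceType") != "AWS::EC2::Volume":
        return "NOT_APPLICABLE"
    encrypted = configuration_item.get("configuration", {}).get("encrypted", False)
    return "COMPLIANT" if encrypted else "NON_COMPLIANT"

def lambda_handler(event, context):
    # AWS Config delivers the configuration item inside invokingEvent,
    # which arrives as a JSON string.
    invoking_event = json.loads(event["invokingEvent"])
    item = invoking_event["configurationItem"]
    return {
        "ComplianceResourceType": item["resourceType"],
        "ComplianceResourceId": item["resourceId"],
        "ComplianceType": evaluate_ebs_encryption(item),
    }
```

A production rule would report the verdict back to AWS Config with `put_evaluations` and could feed NON_COMPLIANT results into the reporting table described above.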

2.0 Protect

For CIO metrics 2.1, 2.2, and 2.14, Amazon Inspector reports Common Vulnerabilities and Exposures (CVEs) from the National Vulnerability Database (NVD), scored with the Common Vulnerability Scoring System (CVSS). Using Amazon Inspector, customers can set up continuous golden AMI vulnerability assessments, then store the Inspector findings in a CIO-metrics DynamoDB table for reporting.
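As an illustration of that reporting step, a small normalizer can flatten each finding into an item for the metrics table. The field names on both sides are illustrative, not a fixed Inspector or table schema.

```python
from datetime import datetime, timezone

def finding_to_metric_item(finding, metric_id="2.1"):
    """Flatten a (simplified) Amazon Inspector finding into a DynamoDB-style
    item for a CIO-metrics reporting table.

    `finding` is assumed to carry the finding ARN, a CVE identifier, a numeric
    CVSS severity, and the agent (instance) ID of the assessed asset.
    """
    return {
        "metricId": metric_id,
        "findingArn": finding["arn"],
        "cveId": finding.get("id", "UNKNOWN"),
        "cvssScore": float(finding.get("numericSeverity", 0.0)),
        "assetId": finding["assetAttributes"]["agentId"],
        "reportedAt": datetime.now(timezone.utc).isoformat(),
    }
```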

For metrics 2.3 – 2.7, AWS provides federation services with on-premises Active Directory (AD), allowing customers to map IAM roles to AD groups, report on privileged system access in IAM, and identify which IAM roles have access to particular services.

To provide reporting for metric 2.10.1, AWS offers FIPS 140-2 validated endpoints for US East/West regions and AWS GovCloud (US).

For metric 2.9, AWS Config provides relationship information for all resources, which means you can use AWS Config relationship data to report on CIO metrics focused on the segmentation of resources. AWS Config can identify things like the network interfaces, security groups (firewalls), subnets, and VPCs related to an instance.
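The relationship data AWS Config returns for a configuration item is a list of typed references, so segmentation reporting can be reduced to a simple filter. A minimal sketch (the sample entries below mirror the AWS Config relationship shape but are illustrative):

```python
def related_resources(relationships, resource_type):
    """From the `relationships` list AWS Config returns for a configuration
    item, pull the IDs of all related resources of one type (for example the
    security groups or subnets attached to an EC2 instance)."""
    return [r["resourceId"] for r in relationships
            if r.get("resourceType") == resource_type]
```

For metric 2.9 reporting you might run this over each instance's relationships to list its security groups, subnets, and VPC.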

3.0 Detect

For detection that covers 3.3, 3.6, 3.8, 3.9, 3.11, and 3.12, AWS Web Application Firewall (WAF) and Amazon GuardDuty have native automation through Amazon CloudWatch, AWS Step Functions (orchestration), and AWS Lambda (serverless) to provide adaptive computing capabilities such as automating the application of AWS WAF rules, recovering from incidents, mitigating OWASP top 10 threats, and automating the import of third-party threat intelligence feeds.
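The automation pattern above usually starts with a Lambda function triggered by a CloudWatch Events rule on GuardDuty findings. This sketch triages a finding by severity (GuardDuty rates high findings 7.0 and above); the action names and thresholds are illustrative choices, not part of GuardDuty.

```python
def triage_guardduty_finding(event):
    """Route a GuardDuty finding, delivered via a CloudWatch Events rule,
    to a response action based on severity.

    In GuardDuty's scheme, severities of 7.0+ are High, 4.0-6.9 are Medium,
    and below 4.0 are Low. The actions here are placeholders for downstream
    Step Functions or Lambda remediation targets.
    """
    detail = event["detail"]
    severity = detail["severity"]
    if severity >= 7.0:
        action = "isolate-instance"   # e.g. swap in a quarantine security group
    elif severity >= 4.0:
        action = "notify-soc"
    else:
        action = "log-only"
    return {"findingId": detail["id"], "type": detail["type"], "action": action}
```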

4.0 Respond

The focus is on you to develop policies and procedures to respond to cyber “incidents.” AWS provides capabilities to automate policies and procedures in a policy driven manner to improve detection times, shorten response times, and reduce attack surface. FISMA defines “incident” as “an occurrence that (A) actually or imminently jeopardizes, without lawful authority, the integrity, confidentiality, or availability of information or an information system; or (B) constitutes a violation or imminent threat of violation of law, security policies, security procedures, or acceptable use policies.”

Enabling GuardDuty in your accounts and writing finding results to a reporting database addresses CIO "Respond" metrics 4.1 and 4.2. Requirement 4.3 mandates you "automatically disable the system or relevant asset upon the detection of a given security violation or vulnerability," per NIST SP 800-53r4 IR-4(2). You can comply by enabling automation to remediate instances.
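A hedged sketch of what "automatically disable the relevant asset" can look like for an EC2 instance: an ordered remediation plan whose steps map to real EC2 API calls (`create_snapshot`, `modify_instance_attribute`, `create_tags`), with the quarantine security group ID and tag names as placeholders.

```python
def quarantine_plan(instance_id, quarantine_sg_id="sg-quarantine"):
    """Build an ordered isolation plan for a compromised EC2 instance,
    the kind of automated response NIST SP 800-53 IR-4(2) calls for.

    Each (api_call, params) pair corresponds to an EC2 API action:
    preserve forensic evidence, cut network access by replacing the
    instance's security groups with a deny-all group, then tag the
    instance for the incident-response team.
    """
    return [
        ("create_snapshot",
         {"Description": f"forensics capture for {instance_id}"}),
        ("modify_instance_attribute",
         {"InstanceId": instance_id, "Groups": [quarantine_sg_id]}),
        ("create_tags",
         {"Resources": [instance_id],
          "Tags": [{"Key": "SecurityStatus", "Value": "quarantined"}]}),
    ]
```

An executor (for example a Lambda function invoked by the triage automation above) would walk the plan and issue each call through the EC2 client.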

5.0 Recover

Responding and recovering are closely related processes, and enabling automation to remediate instances also complies with 5.1 and 5.2, which focus on recovering from incidents. While it is your responsibility to develop an Information System Contingency Plan (ISCP), AWS can help you translate that plan into automated, machine-readable policy through AWS CloudFormation, and AWS Config can report on the number of HVA systems for which an ISCP has been developed (5.1). Both 5.1 and 5.2 measure the "mean time for the organization to restore operations following the containment of a system intrusion or compromise." Using multi-region architectures, inter-region VPC peering, S3 cross-region replication, AWS Landing Zones, and the global AWS network (with services like AWS Direct Connect gateway for global networking) allows you to develop an architecture (5.1.1) in which HVA systems have an alternate processing site identified and provisioned, enabling global recovery in minutes.

Conclusion

There are many ways to report and visualize the data for all the solutions identified. A simple way is to write sensor data to a DynamoDB table. You can enhance your basic reporting to provide advanced analytics and visualization by integrating DynamoDB data with Amazon Athena. You can log all your services directly to an Amazon S3 bucket, use Amazon QuickSight for the dashboard and create custom analyses from your dashboards. You could also enrich your CIO metric dashboard data with other normalized datasets.

The end result is that you can build a new CDM program or extend an existing one using AWS. We are relentlessly focused on innovating for you and providing a comprehensive platform of secure IT services to meet your unique needs. Federal customers required to comply with CDM can use these innovations to incorporate services like AWS Config and AWS Systems Manager to provide asset and software management for AWS and on-premises systems to answer the question, "What is on the network?"

For real-time compliance and reporting, you can leverage services like GuardDuty, which analyzes AWS CloudTrail, DNS, and VPC Flow Logs in your account so you really know "who is on your network" and "what is happening on the network." VPC, security groups, Amazon CloudFront, and AWS WAF allow you to protect the boundary of your environment, while services like AWS Key Management Service (KMS), AWS CloudHSM, AWS Certificate Manager (ACM), and HTTPS endpoints enable you to "protect the data on the network."

The services discussed in this blog provide you with the ability to design a cloud architecture that can help you move from ad-hoc to optimized in your CDM reporting. If you want more data, send us feedback and please, let us know how we can help with your CDM journey! If you have feedback about this blog post, submit comments in the Comments section below, or contact AWS Support.

Want more AWS Security news? Follow us on Twitter.

Darren House

Darren is a Senior Solutions Architect at AWS who is passionate about designing business- and mission-aligned cloud solutions that enable government customers to achieve objectives and improve desired outcomes. Outside work, Darren enjoys hiking, camping, motorcycles, and snowboarding.

Fall 2018 SOC reports now available with 73 services in scope

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/fall-2018-soc-reports-now-available-with-73-services-in-scope/

Seventy-three. That's the number of AWS services now available to our customers under our System and Organization Controls (SOC) 1, 2, and 3 audits, with 11 additional services included during this most recent audit cycle. The SOC reports are now available to you on demand in the AWS Management Console, and the SOC 3 report can also be downloaded online as a PDF.

As you can see from the list of new services added below, we're now including services' namespaces in our assessment documentation. We'll include namespaces going forward to establish a standard naming convention for services across our audits. Knowing a service's namespace also helps you identify services when creating IAM policies, working with Amazon Resource Names (ARNs), and reading AWS CloudTrail logs.
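Service namespaces also appear as the third field of every Amazon Resource Name, so a small helper can recover them; the sample ARNs below are illustrative.

```python
def arn_namespace(arn):
    """Extract the service namespace from an ARN.

    ARNs follow the format
    arn:partition:service:region:account-id:resource,
    so the namespace is the third colon-separated field.
    """
    return arn.split(":")[2]
```

For example, `arn_namespace("arn:aws:s3:::my-bucket")` returns the `s3` namespace you would use in an IAM policy's `Action` prefix.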

The 11 services newly added to our SOC scope:

As always, my team strives to include services into the scope of our compliance programs based on your architectural and regulatory needs. Please reach out to your AWS representatives to let us know what additional services you would like to see in scope across any of our compliance programs. To see our current list, go to the Services in Scope page.

Want more AWS Security news? Follow us on Twitter.

New Podcast: Preview the security track at re:Invent, learn what’s new and maximize your time

Post Syndicated from Katie Doptis original https://aws.amazon.com/blogs/security/previewing-the-security-track-at-reinvent-learn-whats-new-and-maximize-your-time/

There are about 60 security-focused sessions and talks at re:Invent this year. That's in addition to more than 2,000 other sessions, activities, chalk talks, and demos planned throughout the week. We want to help you get the most out of the event and maximize your time. That's why we're previewing the security track and highlighting what's new in the latest AWS Security & Compliance podcast.

Staffers developing security track content offer their advice for navigating the learning conference that is expected to draw 50,000 people from around the world. Listen to the podcast and learn about the newest hands-on session, which was designed to give you deep technical insight within a small-group setting. Plus, find out about the event change that is meant to make it easier to attend more of the talks that interest you.

Three key trends in financial services cloud compliance

Post Syndicated from Igor Kleyman original https://aws.amazon.com/blogs/security/three-key-trends-in-financial-services-cloud-compliance/

As financial institutions increasingly move their technology infrastructure to the cloud, financial regulators are tailoring their oversight to the unique features of a cloud environment. Regulators have followed a variety of approaches, sometimes issuing new rules and guidance tailored to the cloud. Other times, they have updated existing guidelines for managing technology providers to be more applicable for emerging technologies. In each case, however, policymakers’ heightened focus on cybersecurity and privacy has led to increased scrutiny on how financial institutions manage security and compliance.

Because we strive to ensure you can use AWS to meet the highest security standards, we also closely monitor regulatory developments and look for trends to help you stay ahead of the curve. Here are three common themes we’ve seen emerge in the regulatory landscape:

Data security and data management

Regulators expect financial institutions to implement controls and safety measures to protect the security and confidentiality of data stored in the cloud. AWS services are content agnostic—we treat all customer data and associated assets as highly confidential. We have implemented sophisticated technical and physical measures against unauthorized access. Encryption is an important step to help protect sensitive information. You can use AWS Key Management Service (KMS), which is integrated into many services, to encrypt data. KMS also makes it easy to create and control your encryption keys.

Cybersecurity

Financial regulators expect financial institutions to maintain a strong cybersecurity posture. In the cloud, security is a shared responsibility between the cloud provider and the customer: AWS manages security of the cloud, and customers are responsible for managing security in the cloud. To manage security of the cloud, AWS has developed and implemented a security control environment designed to protect the confidentiality, integrity, and availability of your systems and content. AWS infrastructure complies with global and regional regulatory requirements and best practices. You can help ensure security in the cloud by leveraging AWS services. Some new services strive to automate security. Amazon Inspector performs automated security assessments to scan cloud environments for vulnerabilities or deviations from best practices. AWS is also on the cutting edge of using automated reasoning to ensure established security protocols are in place. You can leverage automated proofs with a tool called Zelkova, which is integrated within certain AWS services. Zelkova helps you obtain higher levels of security assurance about your most sensitive systems and operations. Financial institutions can also perform vulnerability scans and penetration testing on their AWS environments—another recurring expectation of financial regulators.

Risk management

Regulators expect financial institutions to have robust risk management processes when using the cloud. Continuous monitoring is key to ensuring that you are managing the risk of your cloud environment, and AWS offers financial institutions a number of tools for governance and traceability. You can have complete visibility of your AWS resources by using services such as AWS CloudTrail, Amazon CloudWatch, and AWS Config to monitor, analyze, and audit events that occur in your cloud environment. You can also use AWS CloudTrail to log and retain account activity related to actions across your AWS infrastructure.

We understand how important security and compliance are for financial institutions, and we strive to ensure that you can use AWS to meet the highest regulatory standards. Here is a selection of resources we created to help you make sense of the changing regulatory landscape around the world:

You can go to our security and compliance resources page for additional information. Have more questions? Reach out to your Account Manager or request to be contacted.

Want more AWS Security news? Follow us on Twitter.

AWS completes TISAX high assessment

Post Syndicated from Gerald Boyne original https://aws.amazon.com/blogs/security/aws-completes-tisax-high-assessment/

We have completed the European automotive industry's TISAX high assessment for 43 services. To successfully complete the TISAX high assessment, EY Germany conducted an independent audit and attested that our information security management system meets industry-set standards. This provides automotive industry organizations the assurance needed to build secure applications and services on AWS.

TISAX was established by the German Association of the Automotive Industry (VDA) and is governed by the European Network Exchange (ENX), which is an association of 15 companies within the European automotive industry.

The following AWS services were assessed at the TISAX high level in our Dublin and Frankfurt Regions:

  • Amazon API Gateway
  • Amazon CloudFront
  • Amazon CloudWatch Logs
  • Amazon Cognito
  • Amazon Connect
  • Amazon DynamoDB
  • Amazon ElastiCache
  • Amazon Elastic Block Store (EBS)
  • Amazon Elastic Container Registry (ECR)
  • Amazon Elastic Container Service (ECS)
  • Amazon Elastic Compute Cloud (EC2)
  • Amazon Elastic File System (EFS)
  • Amazon Elastic Load Balancing
  • Amazon Elastic MapReduce (EMR)
  • Amazon Glacier
  • Amazon Kinesis Data Streams
  • Amazon Redshift
  • Amazon Relational Database Service (RDS)
  • Amazon Route 53
  • Amazon S3 Transfer Acceleration
  • Amazon Simple Notification Service (SNS)
  • Amazon Simple Queue Service (SQS)
  • Amazon Simple Storage Service (S3)
  • Amazon Simple Workflow Service (SWF)
  • Amazon Virtual Private Cloud (VPC)
  • Amazon WorkSpaces
  • AWS CloudFormation
  • AWS CloudHSM
  • AWS CloudTrail
  • AWS Database Migration Service (DMS)
  • AWS Direct Connect
  • AWS Directory Service for Microsoft Active Directory
  • AWS Elastic Beanstalk
  • AWS Identity and Access Management (IAM)
  • AWS IoT Core
  • AWS Key Management Service (KMS)
  • AWS Lambda
  • AWS Lambda@Edge
  • AWS Shield
  • AWS Step Functions
  • AWS Storage Gateway
  • AWS Systems Manager
  • VM Import/Export

AWS Compliance Center for financial services now available

Post Syndicated from Frank Fallon original https://aws.amazon.com/blogs/security/aws-compliance-center-financial-services/

On Tuesday, September 4, AWS announced the launch of an AWS Compliance Center for our Financial Services (FS) customers. This addition to our compliance offerings gives you a central location to research cloud-related regulatory requirements that impact the financial services industry. Prior to the launch of the AWS Compliance Center, customers preparing to adopt AWS for their FS workloads typically had to browse multiple in-depth sources to understand the expectations of regulatory agencies in each country.

The AWS Compliance Center is designed to make this process easier. It aggregates any given country’s regulatory position regarding the adoption and operation of cloud services. Key components of the FS industry—including regulatory approvals, data privacy, and data protection—are explained, along with the steps you must take throughout your adoption of AWS services to help satisfy regulatory requirements. You can browse the information in the portal and export it as printable documents.

We expect the AWS Compliance Center to evolve as our customers’ compliance needs change and as regulators begin to address the challenges and opportunities that cloud services create in the FS industry. The AWS Compliance Center covers 13 countries, and we’ll continue to enhance it with additional countries and information based on your needs.

AWS achieves FedRAMP JAB High and Moderate Provisional Authorization across 14 Services in the AWS US East/West and GovCloud Regions

Post Syndicated from Chris Gile original https://aws.amazon.com/blogs/security/aws-achieves-fedramp-jab-high-moderate-provisional-authorization/

Since I launched our FedRAMP program way back in 2013, it has always excited me to talk about how we’re continually expanding the scope of our compliance programs because that means you’re able to use more of our services for sensitive and regulated workloads. Up to this point, we’ve had 22 services in our US East/West Regions under FedRAMP Moderate and 21 services in our GovCloud Region under FedRAMP High.

Today, I’m happy to tell you about the latest expansion of our FedRAMP program, which makes for a 64% overall increase in FedRAMP covered services. We’ve achieved JAB authorizations for an additional 14 FedRAMP Moderate services in our US East/West Regions, and three of those services also received FedRAMP High in our GovCloud Region. Check out the services below. All the services are available in the US East/West Regions, and the services with asterisks are also available in GovCloud.

  • Amazon API Gateway
  • Amazon Cloud Directory
  • Amazon Cognito
  • Amazon ElastiCache*
  • Amazon Inspector
  • Amazon Macie
  • Amazon QuickSight
  • Amazon Route 53
  • AWS WAF
  • AWS Config
  • AWS Database Migration Service*
  • AWS Lambda
  • AWS Shield Advanced
  • AWS Snowball/Snowball Edge*

You can now see our updated list of authorizations on the FedRAMP Marketplace. We also list all of our services in scope by compliance program on our site. As always, our FedRAMP assessment was completed with a third-party assessment partner to ensure an independent validation of our technical, management, and operational security controls against the FedRAMP baselines.

Our customer obsession starts with you. It’s been a personal goal of mine, and a point of direct feedback from you, to accelerate the pace at which we’re onboarding services into all of our compliance programs, not just FedRAMP. So, we’ll continue to work with you and with regulatory and compliance bodies around the world to ensure that we’re raising the bar on your security and compliance needs and continually earning the trust you place in us.

To learn about what other public sector customers are doing on AWS, see our Government, Education, and Nonprofits Case Studies and Customer Success Stories. And certainly, stay tuned for more exciting future FedRAMP updates.

Want more AWS Security news? Follow us on Twitter.

How to use AWS Secrets Manager to rotate credentials for all Amazon RDS database types, including Oracle

Post Syndicated from Apurv Awasthi original https://aws.amazon.com/blogs/security/how-to-use-aws-secrets-manager-rotate-credentials-amazon-rds-database-types-oracle/

You can now use AWS Secrets Manager to rotate credentials for Oracle, Microsoft SQL Server, or MariaDB databases hosted on Amazon Relational Database Service (Amazon RDS) automatically. Previously, I showed how to rotate credentials for a MySQL database hosted on Amazon RDS automatically with AWS Secrets Manager. With today’s launch, you can use Secrets Manager to automatically rotate credentials for all types of databases hosted on Amazon RDS.

In this post, I review the key features of Secrets Manager. You’ll then learn:

  1. How to store the database credential for the superuser of an Oracle database hosted on Amazon RDS
  2. How to store the Oracle database credential used by an application
  3. How to configure Secrets Manager to rotate both Oracle credentials automatically on a schedule that you define

Key features of Secrets Manager

AWS Secrets Manager makes it easier to rotate, manage, and retrieve database credentials, API keys, and other secrets throughout their lifecycle. The key features of this service include the ability to:

  1. Secure and manage secrets centrally. You can store, view, and manage all your secrets centrally. By default, Secrets Manager encrypts these secrets with encryption keys that you own and control. You can use fine-grained IAM policies or resource-based policies to control access to your secrets. You can also tag secrets to help you discover, organize, and control access to secrets used throughout your organization.
  2. Rotate secrets safely. You can configure Secrets Manager to rotate secrets automatically without disrupting your applications. Secrets Manager offers built-in integrations for rotating credentials for all Amazon RDS databases (MySQL, PostgreSQL, Oracle, Microsoft SQL Server, MariaDB, and Amazon Aurora). You can also extend Secrets Manager to meet your custom rotation requirements by creating an AWS Lambda function to rotate other types of secrets.
  3. Transmit securely. Secrets are transmitted securely over Transport Layer Security (TLS) protocol 1.2. You can also use Secrets Manager with Amazon Virtual Private Cloud (Amazon VPC) endpoints powered by AWS PrivateLink to keep this communication within the AWS network and help meet your compliance and regulatory requirements to limit public internet connectivity.
  4. Pay as you go. Pay for the secrets you store in Secrets Manager and for the use of these secrets; there are no long-term contracts, licensing fees, or infrastructure and personnel costs. For example, a typical production-scale web application will generate an estimated monthly bill of $6. If you follow the instructions in this blog post, your estimated monthly bill for Secrets Manager will be $1. Note: you may incur additional charges for using Amazon RDS and AWS Lambda, if you’ve already consumed the free tier for these services.

Now that you’re familiar with Secrets Manager features, I’ll show you how to store and automatically rotate credentials for an Oracle database hosted on Amazon RDS. I divided these instructions into three phases:

  1. Phase 1: Store and configure rotation for the superuser credential
  2. Phase 2: Store and configure rotation for the application credential
  3. Phase 3: Retrieve the credential from Secrets Manager programmatically

Prerequisites

To follow along, your AWS Identity and Access Management (IAM) principal (user or role) requires the SecretsManagerReadWrite AWS managed policy to store the secrets. Your principal also requires the IAMFullAccess AWS managed policy to create and configure permissions for the IAM role used by Lambda for executing rotations. You can use IAM permissions boundaries to grant an employee the ability to configure rotation without also granting them full administrative access to your account.

Phase 1: Store and configure rotation for the superuser credential

From the Secrets Manager console, on the right side, select Store a new secret.

Since I’m storing credentials for a database hosted on Amazon RDS, I select Credentials for RDS database. Next, I input the user name and password for the superuser. I start by securing the superuser because it’s the most powerful database credential and has full access to the database.
 


Figure 1: For “Select secret type,” choose “Credentials for RDS database”

For this example, I choose to use the default encryption settings. Secrets Manager will encrypt this secret using the Secrets Manager DefaultEncryptionKey in this account. Alternatively, I can choose to encrypt using a customer master key (CMK) that I have stored in AWS Key Management Service (AWS KMS). To learn more, read the Using Your AWS KMS CMK documentation.
 


Figure 2: Choose either DefaultEncryptionKey or use a CMK

Next, I view the list of Amazon RDS instances in my account and select the database this credential accesses. For this example, I select the DB instance oracle-rds-database from the list, and then I select Next.

I then specify values for Secret name and Description. For this example, I use Database/Development/Oracle-Superuser as the name and enter a description of this secret, and then select Next.
 


Figure 3: Provide values for “Secret name” and “Description”

Since this database is not yet being used, I choose to enable rotation. To do so, I select Enable automatic rotation, and then set the rotation interval to 60 days. Remember, if this database credential is currently being used, first update the application (see phase 3) to use Secrets Manager APIs to retrieve secrets before enabling rotation.
 


Figure 4: Select “Enable automatic rotation”

Next, Secrets Manager requires permissions to rotate this secret on my behalf. Because I’m storing the credentials for the superuser, Secrets Manager can use this credential to perform rotations. Therefore, on the same screen, I select Use a secret that I have previously stored in AWS Secrets Manager, and then select Next.

Finally, I review the information on the next screen. Everything looks correct, so I select Store. I have now successfully stored a secret in Secrets Manager.

Note: Secrets Manager will now create a Lambda function in the same VPC as my Oracle database and trigger this function periodically to change the password for the superuser. I can view the name of the Lambda function on the Rotation configuration section of the Secret Details page.

The banner on the next screen confirms that I’ve successfully configured rotation and the first rotation is in progress, which enables me to verify that rotation is functioning as expected. Secrets Manager will rotate this credential automatically every 60 days.
 


Figure 5: The confirmation notification
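The console flow above can also be sketched programmatically. The snippet below is a minimal, stdlib-only sketch: it builds the SecretString JSON that Secrets Manager stores for an RDS credential and the 60-day rotation rule chosen above. The database endpoint, password, and Lambda ARN are hypothetical placeholders; with boto3, the corresponding API calls would be `create_secret` and `rotate_secret`.

```python
import json

# Shape of the SecretString that Secrets Manager stores for an RDS credential;
# these field names are what the RDS rotation Lambda functions expect.
superuser_secret = {
    "engine": "oracle",
    "host": "oracle-rds-database.example.us-east-1.rds.amazonaws.com",  # placeholder endpoint
    "username": "admin",
    "password": "example-password",  # never hard-code a real password
    "dbname": "ORCL",
    "port": 1521,
}
secret_string = json.dumps(superuser_secret)

# Rotation schedule equivalent to the 60-day interval selected in the console.
rotation_rules = {"AutomaticallyAfterDays": 60}

# With boto3, the two console steps roughly correspond to:
#   client = boto3.client("secretsmanager")
#   client.create_secret(Name="Database/Development/Oracle-Superuser",
#                        SecretString=secret_string)
#   client.rotate_secret(SecretId="Database/Development/Oracle-Superuser",
#                        RotationLambdaARN="arn:aws:lambda:...",  # placeholder
#                        RotationRules=rotation_rules)
print(secret_string)
```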

Phase 2: Store and configure rotation for the application credential

The superuser is a powerful credential that should be used only for administrative tasks. To enable your applications to access a database, create a unique database credential per application and grant these credentials limited permissions. You can use these database credentials to read or write to database tables required by the application. As a security best practice, deny the ability to perform management actions, such as creating new credentials.
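What “limited permissions” might look like on the database side can be sketched as follows. The schema, table, and user names here are hypothetical, and the exact grants depend on your application; the point is that the application user gets session access and table-level read/write only.

```python
# Hypothetical application user and tables -- adjust to your own schema.
app_user = "APP_USER"
app_schema = "APP_SCHEMA"
tables = ["ORDERS", "CUSTOMERS"]

# Session access plus table-level read/write only.
statements = [
    f'CREATE USER {app_user} IDENTIFIED BY "example-password"',
    f"GRANT CREATE SESSION TO {app_user}",
]
for table in tables:
    statements.append(
        f"GRANT SELECT, INSERT, UPDATE, DELETE ON {app_schema}.{table} TO {app_user}"
    )

# Deliberately absent: CREATE USER, ALTER USER, or any DBA role -- the
# application credential cannot manage other credentials.
for stmt in statements:
    print(stmt + ";")
```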

In this phase, I will store the credential that my application will use to connect to the Oracle database. To get started, from the Secrets Manager console, on the right side, select Store a new secret.

Next, I select Credentials for RDS database, and input the user name and password for the application credential.

I continue to use the default encryption key. I select the DB instance oracle-rds-database, and then select Next.

I specify values for Secret Name and Description. For this example, I use Database/Development/Oracle-Application-User as the name and enter a description of this secret, and then select Next.

I now configure rotation. Once again, since my application is not using this database credential yet, I’ll configure rotation as part of storing this secret. I select Enable automatic rotation, and set the rotation interval to 60 days.

Next, Secrets Manager requires permissions to rotate this secret on behalf of my application. Earlier in the post, I mentioned that application credentials have limited permissions and are unable to change their own password. Therefore, I will use the superuser credential, Database/Development/Oracle-Superuser, that I stored in Phase 1 to rotate the application credential. With this configuration, Secrets Manager creates a clone application user.
 


Figure 6: Select the superuser credential

Note: Creating a clone application user is the preferred mechanism of rotation because the old version of the secret continues to operate and handle service requests while the new version is prepared and tested. There’s no application downtime while changing between versions.
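A toy sketch of that alternation follows; the user names and password generation are illustrative, not the service’s actual implementation. On each rotation, the inactive user receives a fresh password and becomes the new current version, so the other user keeps serving requests throughout.

```python
import secrets as pysecrets  # stdlib module, unrelated to AWS Secrets Manager

# Two database users back one logical secret; rotation alternates between them.
users = ["app_user", "app_user_clone"]

def rotate(current_index: int) -> tuple[int, str]:
    """Return the index of the newly active user and its fresh password."""
    new_index = 1 - current_index  # switch to the other (idle) user
    new_password = pysecrets.token_urlsafe(16)
    # In the real flow, the rotation Lambda would use the superuser credential
    # to set this password on users[new_index], then mark the new secret
    # version as AWSCURRENT. The old user keeps working until then.
    return new_index, new_password

active = 0
for _ in range(3):
    active, pwd = rotate(active)

# After an odd number of rotations, the clone user is the active one.
print(users[active])
```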

I review the information on the next screen. Everything looks correct, so I select Store. I have now successfully stored the application credential in Secrets Manager.

As mentioned in Phase 1, AWS Secrets Manager creates a Lambda function in the same VPC as the database and then triggers this function periodically to rotate the secret. Since I chose to use the existing superuser secret to rotate the application secret, I will grant the rotation Lambda function permissions to retrieve the superuser secret. To grant this permission, I first select role from the confirmation banner.
 


Figure 7: Select the “role” link that’s in the confirmation notification

Next, in the Permissions tab, I select SecretsManagerRDSMySQLRotationMultiUserRolePolicy0. Then I select Edit policy.
 


Figure 8: Edit the policy on the “Permissions” tab

In this step, I update the policy (see below) and select Review policy. When following along, remember to replace the placeholder ARN-OF-SUPERUSER-SECRET with the ARN of the secret you stored in Phase 1.


{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:CreateNetworkInterface",
        "ec2:DeleteNetworkInterface",
        "ec2:DescribeNetworkInterfaces",
        "ec2:DetachNetworkInterface"
      ],
      "Resource": "*"
    },
    {
      "Sid": "GrantPermissionToUse",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetSecretValue"
      ],
      "Resource": "ARN-OF-SUPERUSER-SECRET"
    }
  ]
}

Here’s what it will look like:
 


Figure 9: Edit the policy

Next, I select Save changes. I have now completed all the steps required to configure rotation for the application credential, Database/Development/Oracle-Application-User.
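The policy edit above can be sketched programmatically as well. The snippet below builds the same policy document with the superuser secret ARN substituted (the account number and ARN suffix are placeholders); the resulting JSON could then be applied to the rotation role with IAM’s `PutRolePolicy` call.

```python
import json

# Placeholder -- use the ARN of the secret you stored in Phase 1.
superuser_secret_arn = (
    "arn:aws:secretsmanager:us-east-1:111122223333:"
    "secret:Database/Development/Oracle-Superuser-AbCdEf"
)

policy = {
    "Statement": [
        {   # Network permissions the rotation Lambda needs to run in the VPC.
            "Effect": "Allow",
            "Action": [
                "ec2:CreateNetworkInterface",
                "ec2:DeleteNetworkInterface",
                "ec2:DescribeNetworkInterfaces",
                "ec2:DetachNetworkInterface",
            ],
            "Resource": "*",
        },
        {   # Allow the rotation function to read the superuser secret.
            "Sid": "GrantPermissionToUse",
            "Effect": "Allow",
            "Action": ["secretsmanager:GetSecretValue"],
            "Resource": superuser_secret_arn,
        },
    ],
}

policy_json = json.dumps(policy, indent=2)
print(policy_json)
```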

Phase 3: Retrieve the credential from Secrets Manager programmatically

Now that I have stored the secret in Secrets Manager, I add code to my application to retrieve the database credential from Secrets Manager. I use the same sample code I shared in my earlier post on rotating MySQL credentials. This code sets up the client and retrieves and decrypts the secret Database/Development/Oracle-Application-User.
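That sample code is not reproduced in this archive; below is a minimal sketch of the retrieval step. The response dict is a stubbed stand-in with the same shape a boto3 `get_secret_value` call returns, and the field values are hypothetical.

```python
import json

def parse_rds_secret(response: dict) -> dict:
    """Extract connection fields from a GetSecretValue response."""
    return json.loads(response["SecretString"])

# With boto3, the response would come from:
#   client = boto3.client("secretsmanager")
#   response = client.get_secret_value(
#       SecretId="Database/Development/Oracle-Application-User")
# Stubbed response with the same shape, for illustration:
response = {
    "Name": "Database/Development/Oracle-Application-User",
    "SecretString": json.dumps({
        "engine": "oracle",
        "host": "oracle-rds-database.example.us-east-1.rds.amazonaws.com",
        "username": "app_user",
        "password": "example-password",
        "dbname": "ORCL",
        "port": 1521,
    }),
}

creds = parse_rds_secret(response)
# creds now holds everything needed to open a database connection.
print(creds["username"])
```

Because the application always resolves the credential at connection time, rotations performed by Secrets Manager never require a code change or redeploy.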

Remember, applications require permissions to retrieve the secret, Database/Development/Oracle-Application-User, from Secrets Manager. My application runs on Amazon EC2 and uses an IAM role to obtain access to AWS services. I attach the following policy to my IAM role. This policy uses the GetSecretValue action to grant my application permission to read the secret from Secrets Manager. The policy also uses the resource element to limit my application to reading only the Database/Development/Oracle-Application-User secret. You can refer to the Secrets Manager documentation to understand the minimum IAM permissions required to retrieve a secret.


{
  "Version": "2012-10-17",
  "Statement": {
    "Sid": "RetrieveDbCredentialFromSecretsManager",
    "Effect": "Allow",
    "Action": "secretsmanager:GetSecretValue",
    "Resource": "arn:aws:secretsmanager:<AWS-REGION>:<ACCOUNT-NUMBER>:secret:Database/Development/Oracle-Application-User"
  }
}

In the above policy, remember to replace the placeholder <AWS-REGION> with the AWS region that you’re using and the placeholder <ACCOUNT-NUMBER> with the number of your AWS account.

Summary

I explained the key benefits of Secrets Manager as they relate to RDS and showed you how to help meet your compliance requirements by configuring Secrets Manager to rotate database credentials automatically on your behalf. Secrets Manager helps you protect access to your applications, services, and IT resources without the upfront investment and on-going maintenance costs of operating your own secrets management infrastructure. To get started, visit the Secrets Manager console. To learn more, visit Secrets Manager documentation.

If you have comments about this post, submit them in the Comments section below. If you have questions about anything in this post, start a new thread on the Secrets Manager forum.


Apurv Awasthi

Apurv is the product manager for credentials management services at AWS, including AWS Secrets Manager and IAM Roles. He enjoys the “Day 1” culture at Amazon because it aligns with his experience building startups in the sports and recruiting industries. Outside of work, Apurv enjoys hiking. He holds an MBA from UCLA and an MS in computer science from University of Kentucky.

New guide helps financial services customers in Brazil navigate cloud requirements

Post Syndicated from Leandro Bennaton original https://aws.amazon.com/blogs/security/new-guide-helps-financial-services-customers-in-brazil-navigate-cloud-requirements/

We have a new resource to help our financial services customers in Brazil navigate regulatory requirements for using the cloud. The AWS User Guide to Financial Services Regulations in Brazil is a deep dive into the Brazilian National Monetary Council’s Resolution No. 4,658. The cybersecurity cloud resolution is the first of its kind by regulators in Brazil. The guide details how our services may be able to assist you in achieving these security expectations.

The resolution covers topics such as implementing a cybersecurity policy, incident response, entering into agreements with cloud service providers, subcontracting, business continuity, and notification requirements. Our guide addresses each of these issues and provides specific guidance on how you can use AWS to satisfy requirements.

The AWS User Guide to Financial Services Regulations in Brazil is part of a series of publications that seek to facilitate customer compliance. It’s available in English and Portuguese. We’ll continue to monitor the regulatory environment in Brazil and around the world and to publish additional resources.

If you have any questions, please contact your account executive.

Amazon ElastiCache for Redis now PCI DSS compliant, allowing you to process sensitive payment card data in-memory for faster performance

Post Syndicated from Manan Goel original https://aws.amazon.com/blogs/security/amazon-elasticache-redis-now-pci-dss-compliant-payment-card-data-in-memory/

Amazon ElastiCache for Redis has achieved compliance with the Payment Card Industry Data Security Standard (PCI DSS). This means that you can now use ElastiCache for Redis for low-latency and high-throughput in-memory processing of sensitive payment card data, such as Customer Cardholder Data (CHD). ElastiCache for Redis is a Redis-compatible, fully managed, in-memory data store and caching service in the cloud. It delivers sub-millisecond response times at millions of requests per second.

To create a PCI DSS-compliant ElastiCache for Redis cluster, you must use Redis engine version 4.0.10 or higher and current-generation node types. The service offers various data security controls to store, process, and transmit sensitive financial data. These controls include in-transit encryption (TLS), at-rest encryption, and Redis AUTH. There’s no additional charge for PCI DSS-compliant ElastiCache for Redis.

In addition to PCI, ElastiCache for Redis is a HIPAA eligible service. If you want to use your existing Redis clusters that process healthcare information to also process financial information while meeting PCI requirements, you must upgrade your Redis clusters from 3.2.6 to 4.0.10. For more details, see Upgrading Engine Versions and ElastiCache for Redis Compliance.

Meeting these high bars for security and compliance means ElastiCache for Redis can be used for secure database and application caching, session management, queues, chat/messaging, and streaming analytics in industries as diverse as financial services, gaming, retail, e-commerce, and healthcare. For example, you can use ElastiCache for Redis to build an internet-scale, ride-hailing application and add digital wallets that store customer payment card numbers, thus enabling people to perform financial transactions securely and at industry standards.

To get started, see ElastiCache for Redis Compliance Documentation.
