Tag Archives: Migration and Transfer

AWS DataSync adds support for on-premises object storage

Post Syndicated from Alejandra Quetzalli original https://aws.amazon.com/blogs/aws/aws-datasync-adds-support-for-on-premises-object-storage/

AWS DataSync now supports transfers to and from self-managed object storage! 🎉

This new feature makes it easy for customers to automate copying large amounts of data between on-premises object storage and AWS Cloud Storage.

What is AWS DataSync?

If you’re new to AWS DataSync, you may be wondering how this service works.

AWS DataSync makes it simple and fast to move large amounts of data between on-premises storage and AWS Cloud Storage, automating both the management of data transfer processes and the infrastructure required for a high-performance and secure data transfer. The image below illustrates how you can use AWS DataSync to quickly and efficiently move data between on-premises storage systems and Amazon S3, Amazon EFS, and Amazon FSx for Windows File Server.




Let’s see a demo…

Who’s ready for a demo? I know I am! 😁

Let’s head over to the AWS Console and search for AWS DataSync.

AWS DataSync


Let’s click “Get Started.”

AWS DataSync

For the purpose of this blog post, we will pretend we’ve already Created an Agent. After you have created an agent and configured your source and destination, the next step is to Create a Task.

We want to set up a task that will transfer files from our on-premises object storage system to AWS. To do this, we select “Create Task.”


We select “Create a new location” and specify an Object storage location type for our source.


Now it’s time to select your agent. We add the IP address of our on-premises object storage server, as well as our path to the files we want to move.

Your object storage will likely require credentials to access it, so you may want to require Authentication.

We configure an Access key and Secret key to access our on-premises object storage server.

Now it’s time to configure our destination location. We do so by selecting “Choose an existing location” and by choosing an existing S3 bucket.

Next, we pick a “Task Name.”

We also have the option to schedule what frequency we wish to execute this Task. (You may want to schedule it in order to regularly pick up incremental changes until the migration cut-over.)

Now we can attach a CloudWatch LogGroup to the task and log all transferred objects and files!

The next step is to review our setup and finally create that task.
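Everything we just clicked through maps onto two DataSync API calls, `CreateLocationObjectStorage` and `CreateTask`. Here is a minimal sketch of the request parameters involved; every identifier below (hostname, bucket, ARNs, keys) is a made-up placeholder, not a value from this demo. With boto3, these dicts would be passed to `create_location_object_storage` and `create_task`:

```python
# Sketch of the API parameters behind the console flow above.
# All identifiers (hostname, bucket, ARNs, keys) are placeholders.

source_location = {
    # Maps to the DataSync CreateLocationObjectStorage API
    "ServerHostname": "10.0.0.12",       # on-premises object storage server
    "BucketName": "on-prem-bucket",
    "Subdirectory": "/data-to-migrate",  # path to the files we want to move
    "AgentArns": ["arn:aws:datasync:us-east-1:111122223333:agent/agent-EXAMPLE"],
    "AccessKey": "EXAMPLE-ACCESS-KEY",   # credentials for the object store,
    "SecretKey": "EXAMPLE-SECRET-KEY",   # not AWS credentials
}

task = {
    # Maps to the DataSync CreateTask API
    "SourceLocationArn": "arn:aws:datasync:us-east-1:111122223333:location/loc-src-EXAMPLE",
    "DestinationLocationArn": "arn:aws:datasync:us-east-1:111122223333:location/loc-dst-EXAMPLE",
    "Name": "onprem-to-s3",
    "CloudWatchLogGroupArn": "arn:aws:logs:us-east-1:111122223333:log-group:/datasync/onprem-to-s3",
    # Optional: rerun hourly to pick up incremental changes until cut-over
    "Schedule": {"ScheduleExpression": "rate(1 hour)"},
}

# With boto3 this would be, roughly:
#   client = boto3.client("datasync")
#   src = client.create_location_object_storage(**source_location)
#   task["SourceLocationArn"] = src["LocationArn"]
#   client.create_task(**task)
```

The existing S3 destination we chose in the console would likewise be created with `create_location_s3`.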

And if all goes well, we get our happy green bar. 🙌🏽


Now we head over to the S3 console, because we want to make sure that the objects were actually copied into our S3 bucket.

Voilà! There it is. 😁


Support for on-premises object storage for AWS DataSync is now globally available in 22 AWS Regions for our customers to leverage. 🌎

To learn more about AWS DataSync, visit the DataSync product page and the DataSync developer guide.


You may also enjoy…

Check out this awesome demo video of AWS DataSync made by my fellow teammate, Jerry Hargrove.


¡Gracias por tu tiempo!
~Alejandra 💁🏻‍♀️ y Canela 🐾


New – AWS Transfer for FTP and FTPS, in addition to existing SFTP

Post Syndicated from Harunobu Kameda original https://aws.amazon.com/blogs/aws/new-aws-transfer-for-ftp-and-ftps-in-addition-to-existing-sftp/

AWS Transfer for SFTP was launched in November 2018 as a fully managed service that enables the transfer of files directly into and out of Amazon S3 using the Secure File Transfer Protocol (SFTP).

Today, we are happy to announce the expansion of the service with support for FTPS and FTP, which makes it easy to migrate and securely run File Transfer Protocol over SSL (FTPS) and FTP workloads in AWS, alongside the existing AWS Transfer for SFTP service. Now that SFTP-, FTPS-, and FTP-based transfers to and from Amazon S3 are supported, we are also announcing the “AWS Transfer Family,” the collective name for AWS Transfer for SFTP, FTPS, and FTP.

Some software archiving and scientific research applications use FTP to distribute software artifacts or public datasets, while CRM, ERP, and supply chain applications use FTPS for transferring sensitive data. Many existing applications cannot switch from FTP or FTPS to SFTP because doing so requires changing applications and processes – especially those involving third parties – which is often impractical or infeasible. Customers are looking for an easy and secure way to migrate their file transfers without disrupting existing integrations and end users. For these reasons, we are launching AWS Transfer for FTPS and AWS Transfer for FTP.

Basic difference between SFTP and FTPS/FTP

Let’s talk a bit about the differences between SFTP and FTPS/FTP before we start the walk through. They are distinct protocols, but all serve the same purpose: file transfer.

  • Secure File Transfer Protocol (SFTP) – Defined by the Internet Engineering Task Force (IETF) as an extension of SSH 2.0, transferring files over an encrypted SSH session.
  • File Transfer Protocol (FTP) – Originally defined in RFC 114, later replaced by RFC 765 and RFC 959 for use over TCP/IP.
  • File Transfer Protocol over SSL/TLS (FTPS) – FTP with the control and data channels encrypted by SSL/TLS.

Until now, customers with multiple protocol needs were using the service for SFTP or waiting for this launch. With this announcement, customers who use any of the three protocols can migrate and leverage AWS services for their end-to-end file transfer needs. Availability of these new protocols increases accessibility to your data, and the same options that were available to secure access for SFTP can be used for FTPS and FTP. Available access control features include IAM roles and policies, logical directories for S3, and security groups.

Walk through

This walk through provides a step-by-step guide for creating a fully managed FTP server. FTP servers are accessible only from within your VPC, including over AWS Direct Connect or VPN. You can use FTPS if you need access via the internet.

You will see a new AWS console page when you access the AWS Transfer Family console. Click Create server to begin.

There are now three protocol choices – SFTP, FTPS, and FTP.

For this example, let’s start the walk through by selecting FTP. Check the FTP check box, and uncheck the SFTP check box. We could assign both protocols at the same time, but for this walk through we are creating an FTP-only server to show the new feature.

Click Next.

We now need to assign an identity provider, which is used for authentication when logging in to the FTP server. Only a Custom identity provider, backed by Amazon API Gateway, is supported for FTPS and FTP. To be able to invoke the API, we need to create an Invocation URL, which is an API Gateway endpoint, along with an IAM role. Here are guidelines for creating an Invocation URL using CloudFormation with a YAML template. For servers enabled for SFTP only, you can also choose Service Managed authentication to store and manage identities within the service.
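To make the authentication flow concrete, here is an illustrative sketch of the Lambda handler that could sit behind such an API Gateway endpoint. The username, password, role ARN, and bucket below are made-up placeholders; the CloudFormation template mentioned above wires up something along these lines, with the credentials embedded in the function:

```python
# Illustrative custom identity provider handler for AWS Transfer Family.
# Username, password, role ARN, and bucket are placeholders, not real values.

VALID_USERS = {
    "myuser": "MySuperSecretPassword",
}

def lambda_handler(event, context):
    user = event.get("username", "")
    password = event.get("password", "")

    if VALID_USERS.get(user) != password:
        # Returning an empty body denies the login attempt.
        return {}

    # On success, tell Transfer Family which IAM role and home
    # directory (an S3 path) to use for this session.
    return {
        "Role": "arn:aws:iam::111122223333:role/transfer-user-role",
        "HomeDirectory": "/my-example-bucket/" + user,
    }
```

In a real deployment the credentials would live in a secrets store rather than in the function source.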

Click Next, and an Endpoint configuration dialog comes up.

For FTP, the endpoint can only be VPC hosted. If we need access from the internet, we choose FTPS instead of FTP for security reasons. We then choose an appropriate VPC and subnet to host the endpoint.

Click Next, and the following dialog comes up. This step is optional: we can enable CloudWatch logging by assigning an IAM role. The CloudFormation template above created an IAM role that we can use, or we can use a different role.

We skip the Server Host key section because this is for SFTP.

Assign the appropriate tags and click Next. Then, click Create server. The FTP server is created.

Click Server ID, and we see the details of the FTP server.

It is time to test the FTP server!

From Action, let’s select Test. Type “myuser” as Username and “MySuperSecretPassword” as Password.

An HTTP 200 status code is returned if your FTP server is successfully integrated with your identity provider.

Now that we know the identity provider is integrated correctly, let’s test with an FTP client.
We can now perform cd/ls/put/get/rm operations against existing Amazon S3 buckets using an FTP client. We use Amazon EC2 for this walk through: create an instance in the subnet specified above if you do not already have one, and install the lftp client.

sudo yum install lftp

To connect to the server, we need the endpoint URL of the FTP server, which we obtain from the VPC Endpoint console. If you were using an internet-facing SFTP and/or FTPS server, you could get this information directly from the AWS Transfer Family console. If we access the endpoint from another subnet or another VPC, be sure that the security group allows TCP port 21 and ports 8192-8200.

Then, we can log in to the FTP server with the following command:

lftp -d ftp://{VPC Endpoint of your FTP Server} -u 'myuser,MySuperSecretPassword'


Next Step

The test username and password are specified in the source code of the Lambda function created by CloudFormation, as guided above.

The blog article “Enable password authentication for AWS Transfer for SFTP using AWS Secrets Manager” is a good place to start learning more about managing authentication data; its CloudFormation template creates the API Gateway and Lambda functions backed by AWS Secrets Manager.

Closing Remarks:

  • Only Passive mode is supported. Our service does not make outbound connections.
  • Only Explicit mode for FTPS is supported. Our service does not support implicit mode.
  • Renaming files is supported, but renaming directories (S3 bucket names) is not, and append operations are not supported.
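These constraints map cleanly onto standard clients. As an illustration only (the endpoint and credentials below are placeholders), Python’s `ftplib.FTP_TLS` speaks exactly this dialect: explicit FTPS negotiated on the control connection, with passive-mode data transfers.

```python
# Sketch: uploading one file over explicit FTPS in passive mode.
# Host, user, and password are placeholders, not real values.
from ftplib import FTP_TLS

def upload_via_ftps(host: str, user: str, password: str,
                    local_path: str, remote_name: str) -> None:
    """Upload a single file over explicit FTPS (AUTH TLS), passive mode."""
    ftps = FTP_TLS(host)        # plain control connection on port 21
    ftps.login(user, password)  # ftplib negotiates AUTH TLS before sending credentials
    ftps.prot_p()               # switch the data channel to TLS as well
    ftps.set_pasv(True)         # passive mode: the client opens the data connection
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_name}", f)
    ftps.quit()

# Example call (placeholder endpoint/credentials):
# upload_via_ftps("ftps.example.com", "myuser", "MySuperSecretPassword",
#                 "report.csv", "report.csv")
```

Because the service only makes inbound connections, active-mode clients (which expect the server to dial back) will not work; passive mode is the right default.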

Available Today

AWS Transfer for FTPS and FTP are available in all Regions where AWS Transfer for SFTP is currently available. Take a look at the product page and the documentation to learn more. You can also check this video for a demo.

– Kame;


Use MAP for Windows to Simplify your Migration to AWS

Post Syndicated from Fred Wurden original https://aws.amazon.com/blogs/compute/use-map-for-windows-to-simplify-your-migration-to-aws/

There’s no question that organizations today are being disrupted in their industry. In a previous blog post, I shared that such disruption often accelerates organizations’ decisions to move to the cloud. When these organizations migrate to the cloud, Windows workloads are often critical to their business and these workloads require a performant, reliable, and secure cloud infrastructure. Customers tell us that reducing risk, building cloud expertise, and lowering costs are important factors when choosing that infrastructure.

Today, we are announcing the general availability of the Migration Acceleration Program (MAP) for Windows, a comprehensive program that helps you execute large-scale migrations and modernizations of your Windows workloads on AWS. We have millions of customers on AWS, and have spent the last 11 years helping Windows customers successfully move to our cloud. We’ve built a proven methodology, providing you with AWS services, tools, and expertise to help simplify the migration of your Windows workloads to AWS. MAP for Windows provides prescriptive guidance, consulting support from experts, tools, trainings, and service credits to help reduce the risk and cost of migrating to the cloud as you embark on your migration journey.

MAP for Windows also helps you along the pathways to modernize current and legacy versions of Windows Server and SQL Server to cloud native and open source solutions, enabling you to break free from commercial licensing costs. With the strong price-performance of open-source solutions and the proven reliability of AWS, you can innovate quickly while reducing your risk.

With MAP for Windows, you follow a simple three-step migration process:

  1. Assess Your Readiness: The migration readiness assessment helps you identify gaps along the six dimensions of the AWS Cloud Adoption Framework: business, process, people, platform, operations, and security. This assessment helps customers identify capabilities required in the migration. MAP for Windows also includes an Optimization and Licensing Assessment, which provides recommendations on how to optimize your licenses on AWS.
  2. Mobilize Your Resources: The mobilize phase helps you build an operational foundation for your migration, with the goal of fixing the capability gaps identified in the assessment phase. The mobilize phase accelerates your migration decisions by providing clear guidance on migration plans that improve the success of your migration.
  3. Migrate or Modernize Your Workloads: APN Partners and the AWS ProServe team help customers execute the large-scale migration plan developed during the mobilize phase. MAP for Windows also offers financial incentives to help you offset migration costs such as labor, training, and the expense of sometimes running two environments in parallel.

MAP for Windows includes support from AWS Professional Services and AWS Migration Competency Partners, such as Rackspace, 2nd Watch, Accenture, Cloudreach, Enimbos Global Services, Onica, and Slalom. Our MAP for Windows partners have successfully demonstrated completion of multiple large-scale migrations to AWS. They have received the APN Migration Competency Partner and the Microsoft Workloads Competency designations.

Learn about what MAP for Windows can do for you on this page. Learn also about the migration experiences of AWS customers. And contact us to discuss your Windows migration or modernization initiatives and apply to MAP for Windows.

About the Author

Fred Wurden is the GM of Enterprise Engineering (Windows, VMware, Red Hat, SAP, benchmarking) working to make AWS the most customer-centric cloud platform on Earth. Prior to AWS, Fred worked at Microsoft for 17 years and held positions, including: EU/DOJ engineering compliance for Windows and Azure, interoperability principles and partner engagements, and open source engineering. He lives with his wife and a few four-legged friends since his kids are all in college now.

Migration Complete – Amazon’s Consumer Business Just Turned off its Final Oracle Database

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/migration-complete-amazons-consumer-business-just-turned-off-its-final-oracle-database/

Over my 17 years at Amazon, I have seen that my colleagues on the engineering team are never content to leave good-enough alone. They routinely re-evaluate every internal system to make sure that it is as scalable, efficient, performant, and secure as possible. When they find an avenue for improvement, they will use what they have learned to thoroughly modernize our architectures and implementations, often going so far as to rip apart existing systems and rebuild them from the ground up if necessary.

Today I would like to tell you about an internal database migration effort of this type that just wrapped up after several years of work. Over the years we realized that we were spending too much time managing and scaling thousands of legacy Oracle databases. Instead of focusing on high-value differentiated work, our database administrators (DBAs) spent a lot of time simply keeping the lights on while transaction rates climbed and the overall amount of stored data mounted. This included time spent dealing with complex & inefficient hardware provisioning, license management, and many other issues that are now best handled by modern, managed database services.

More than 100 teams in Amazon’s Consumer business participated in the migration effort. This includes well-known customer-facing brands and sites such as Alexa, Amazon Prime, Amazon Prime Video, Amazon Fresh, Kindle, Amazon Music, Audible, Shopbop, Twitch, and Zappos, as well as internal teams such as AdTech, Amazon Fulfillment Technology, Consumer Payments, Customer Returns, Catalog Systems, Delivery Experience, Digital Devices, External Payments, Finance, InfoSec, Marketplace, Ordering, and Retail Systems.

Migration Complete
I am happy to report that this database migration effort is now complete. Amazon’s Consumer business just turned off its final Oracle database (some third-party applications are tightly bound to Oracle and were not migrated).

We migrated 75 petabytes of internal data stored in nearly 7,500 Oracle databases to multiple AWS database services including Amazon DynamoDB, Amazon Aurora, Amazon Relational Database Service (RDS), and Amazon Redshift. The migrations were accomplished with little or no downtime, and covered 100% of our proprietary systems. This includes complex purchasing, catalog management, order fulfillment, accounting, and video streaming workloads. We kept careful track of the costs and the performance, and realized the following results:

  • Cost Reduction – We reduced our database costs by over 60% on top of the heavily discounted rate we negotiated based on our scale. Customers regularly report cost savings of 90% by switching from Oracle to AWS.
  • Performance Improvements – Latency of our consumer-facing applications was reduced by 40%.
  • Administrative Overhead – The switch to managed services reduced database admin overhead by 70%.

The migration gave each internal team the freedom to choose the purpose-built AWS database service that best fit their needs, and also gave them better control over their budget and their cost model. Low-latency services were migrated to DynamoDB and other highly scalable non-relational databases such as Amazon ElastiCache. Transactional relational workloads with high data consistency requirements were moved to Aurora and RDS; analytics workloads were migrated to Redshift, our cloud data warehouse.

We captured the shutdown of the final Oracle database, and had a quick celebration:

DBA Career Path
As I explained earlier, our DBAs once spent a lot of time managing and scaling our legacy Oracle databases. The migration freed up time that our DBAs now use to do an even better job of performance monitoring and query optimization, all with the goal of letting them deliver a better customer experience.

As part of the migration, we also worked to create a fresh career path for our Oracle DBAs, training them to become database migration specialists and advisors. This training includes education on AWS database technologies, cloud-based architecture, cloud security, and OpEx-style cost management. They now work with both internal and external customers in an advisory role, where they have an opportunity to share their first-hand experience with large-scale migration of mission-critical databases.

Migration Examples
Here are examples drawn from a few of the migrations:

Advertising – After the migration, this team was able to double their database fleet size (and their throughput) in minutes to accommodate peak traffic, courtesy of RDS. This scale-up effort would have taken months.

Buyer Fraud – This team moved 40 TB of data with just one hour of downtime, and realized the same or better performance at half the cost, powered by Amazon Aurora.

Financial Ledger – This team moved 120 TB of data, reduced latency by 40%, cut costs by 70%, and cut overhead by the same 70%, all powered by DynamoDB.

Wallet – This team migrated more than 10 billion records to DynamoDB, reducing latency by 50% and operational costs by 90% in the process. To learn more about this migration, read Amazon Wallet Scales Using Amazon DynamoDB.

My recent Prime Day 2019 post contains more examples of the extreme scale and performance that are possible with AWS.

Migration Resources
If you are ready to migrate from Oracle (or another hand-managed legacy database) to one or more AWS database services, here are some resources to get you started:

AWS Migration Partners – Our slate of AWS Migration Partners have the experience, expertise, and tools to help you to understand, plan, and execute a database migration.

Migration Case Studies – Read How Amazon is Achieving Database Freedom Using AWS to learn more about this effort; read the Prime Video, Advertising, Items & Offers, Amazon Fulfillment, and Analytics case studies to learn more about the examples that I mentioned above.

AWS Professional Services – My colleagues at AWS Professional Services are ready to work alongside you to make your migration a success.

AWS Migration Tools & Services – Check out our Cloud Migration page, read more about Migration Hub, and don’t forget about the Database Migration Service.

AWS Database Freedom – The AWS Database Freedom program is designed to help qualified customers migrate from traditional databases to cloud-native AWS databases.

AWS re:Invent Sessions – We are finalizing an extensive lineup of chalk talks and breakout sessions for AWS re:Invent that will focus on this migration effort, all led by the team members that planned and executed the migrations.




Learn about AWS Services & Solutions – September AWS Online Tech Talks

Post Syndicated from Jenny Hang original https://aws.amazon.com/blogs/aws/learn-about-aws-services-solutions-september-aws-online-tech-talks/


AWS Tech Talks

Join us this September to learn about AWS services and solutions. The AWS Online Tech Talks are live, online presentations that cover a broad range of topics at varying technical levels. These tech talks, led by AWS solutions architects and engineers, feature technical deep dives, live demonstrations, customer examples, and Q&A with AWS experts. Register Now!

Note – All sessions are free and in Pacific Time.

Tech talks this month:



Compute:

September 23, 2019 | 11:00 AM – 12:00 PM PT – Build Your Hybrid Cloud Architecture with AWS – Learn about the extensive range of services AWS offers to help you build a hybrid cloud architecture best suited for your use case.

September 26, 2019 | 1:00 PM – 2:00 PM PT – Self-Hosted WordPress: It’s Easier Than You Think – Learn how you can easily build a fault-tolerant WordPress site using Amazon Lightsail.

October 3, 2019 | 11:00 AM – 12:00 PM PT – Lower Costs by Right Sizing Your Instance with Amazon EC2 T3 General Purpose Burstable Instances – Get an overview of T3 instances, understand what workloads are ideal for them, and understand how the T3 credit system works so that you can lower your EC2 instance costs today.



Containers:

September 26, 2019 | 11:00 AM – 12:00 PM PT – Develop a Web App Using Amazon ECS and AWS Cloud Development Kit (CDK) – Learn how to build your first app using CDK and AWS container services.


Data Lakes & Analytics:

September 26, 2019 | 9:00 AM – 10:00 AM PT – Best Practices for Provisioning Amazon MSK Clusters and Using Popular Apache Kafka-Compatible Tooling – Learn best practices on running Apache Kafka production workloads at a lower cost on Amazon MSK.



Databases:

September 25, 2019 | 1:00 PM – 2:00 PM PT – What’s New in Amazon DocumentDB (with MongoDB compatibility) – Learn what’s new in Amazon DocumentDB, a fully managed MongoDB compatible database service designed from the ground up to be fast, scalable, and highly available.

October 3, 2019 | 9:00 AM – 10:00 AM PT – Best Practices for Enterprise-Class Security, High-Availability, and Scalability with Amazon ElastiCache – Learn about new enterprise-friendly Amazon ElastiCache enhancements like customer managed key and online scaling up or down to make your critical workloads more secure, scalable and available.



DevOps:

October 1, 2019 | 9:00 AM – 10:00 AM PT – CI/CD for Containers: A Way Forward for Your DevOps Pipeline – Learn how to build CI/CD pipelines using AWS services to get the most out of the agility afforded by containers.


Enterprise & Hybrid:

September 24, 2019 | 1:00 PM – 2:30 PM PT – Virtual Workshop: How to Monitor and Manage Your AWS Costs – Learn how to visualize and manage your AWS cost and usage in this virtual hands-on workshop.

October 2, 2019 | 1:00 PM – 2:00 PM PT – Accelerate Cloud Adoption and Reduce Operational Risk with AWS Managed Services – Learn how AMS accelerates your migration to AWS, reduces your operating costs, improves security and compliance, and enables you to focus on your differentiating business priorities.



IoT:

September 25, 2019 | 9:00 AM – 10:00 AM PT – Complex Monitoring for Industrial with AWS IoT Data Services – Learn how to solve your complex event monitoring challenges with AWS IoT Data Services.


Machine Learning:

September 23, 2019 | 9:00 AM – 10:00 AM PT – Training Machine Learning Models Faster – Learn how to train machine learning models quickly and with a single click using Amazon SageMaker.

September 30, 2019 | 11:00 AM – 12:00 PM PT – Using Containers for Deep Learning Workflows – Learn how containers can help address challenges in deploying deep learning environments.

October 3, 2019 | 1:00 PM – 2:30 PM PT – Virtual Workshop: Getting Hands-On with Machine Learning and Ready to Race in the AWS DeepRacer League – Join DeClercq Wentzel, Senior Product Manager for AWS DeepRacer, for a presentation on the basics of machine learning and how to build a reinforcement learning model that you can use to join the AWS DeepRacer League.


AWS Marketplace:

September 30, 2019 | 9:00 AM – 10:00 AM PT – Advancing Software Procurement in a Containerized World – Learn how to deploy applications faster with third-party container products.



Migration:

September 24, 2019 | 11:00 AM – 12:00 PM PT – Application Migrations Using AWS Server Migration Service (SMS) – Learn how to use AWS Server Migration Service (SMS) for automating application migration and scheduling continuous replication, from your on-premises data centers or Microsoft Azure to AWS.


Networking & Content Delivery:

September 25, 2019 | 11:00 AM – 12:00 PM PT – Building Highly Available and Performant Applications using AWS Global Accelerator – Learn how to build highly available and performant architectures for your applications with AWS Global Accelerator, now with source IP preservation.

September 30, 2019 | 1:00 PM – 2:00 PM PT – AWS Office Hours: Amazon CloudFront – Just getting started with Amazon CloudFront and Lambda@Edge? Get answers directly from our experts during AWS Office Hours.



Robotics:

October 1, 2019 | 11:00 AM – 12:00 PM PT – Robots and STEM: AWS RoboMaker and AWS Educate Unite! – Come join members of the AWS RoboMaker and AWS Educate teams as we provide an overview of our education initiatives and walk you through the newly launched RoboMaker Badge.


Security, Identity & Compliance:

October 1, 2019 | 1:00 PM – 2:00 PM PT – Deep Dive on Running Active Directory on AWS – Learn how to deploy Active Directory on AWS and start migrating your Windows workloads.



Serverless:

October 2, 2019 | 9:00 AM – 10:00 AM PT – Deep Dive on Amazon EventBridge – Learn how to optimize event-driven applications, and use rules and policies to route, transform, and control access to these events that react to data from SaaS apps.



Storage:

September 24, 2019 | 9:00 AM – 10:00 AM PT – Optimize Your Amazon S3 Data Lake with S3 Storage Classes and Management Tools – Learn how to use the Amazon S3 Storage Classes and management tools to better manage your data lake at scale and to optimize storage costs and resources.

October 2, 2019 | 11:00 AM – 12:00 PM PT – The Great Migration to Cloud Storage: Choosing the Right Storage Solution for Your Workload – Learn more about AWS storage services and identify which service is the right fit for your business.