Tag Archives: AWS .NET Development

Using NuGet with AWS CodeArtifact

Post Syndicated from John Standish original https://aws.amazon.com/blogs/devops/using-nuget-with-aws-codeartifact/

Managing NuGet packages for .NET development can be a challenge. Initial configuration, ongoing maintenance, and scaling are common pain points for developers and organizations. With the addition of NuGet package support, AWS CodeArtifact now provides easy-to-configure and scalable package management for .NET developers. You can use NuGet packages stored in CodeArtifact in Visual Studio, allowing you to use the tools you already know.

In this post, we show how you can provision NuGet repositories in 5 minutes. Then we demonstrate how to consume packages from your new NuGet repositories, all while using .NET native tooling.

All relevant code for this post is available in the aws-codeartifact-samples GitHub repo.

Prerequisites

For this walkthrough, you should have the following prerequisites:

Architecture overview

Two core resource types make up CodeArtifact: domains and repositories. Domains provide an easy way to manage multiple repositories within an organization. Repositories store packages and their assets. You can connect repositories to other CodeArtifact repositories, or to popular public package repositories such as nuget.org, using upstream and external connections. For more information about these concepts, see AWS CodeArtifact Concepts.

The following diagram illustrates this architecture.

Figure: AWS CodeArtifact core concepts
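
If you prefer the AWS CLI to CloudFormation, you can create the same resource types directly. The following is a minimal sketch that assumes the AWS CLI is configured and uses the example names from the template in the next section:

# Create a domain to group repositories
aws codeartifact create-domain --domain example-domain

# Create a shared repository and connect it to nuget.org
aws codeartifact create-repository --domain example-domain --repository my-external-repository
aws codeartifact associate-external-connection --domain example-domain --repository my-external-repository --external-connection public:nuget-org

# Create a team repository with the shared repository as an upstream
aws codeartifact create-repository --domain example-domain --repository my-team-repository --upstreams repositoryName=my-external-repository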

Creating CodeArtifact resources with AWS CloudFormation

The AWS CloudFormation template provided in this post provisions three CodeArtifact resources: a domain, a team repository, and a shared repository. The team repository is configured to use the shared repository as an upstream repository, and the shared repository has an external connection to nuget.org.

The following diagram illustrates this architecture.

Figure: Example AWS CodeArtifact architecture

The following is the CloudFormation template used in this walkthrough:

AWSTemplateFormatVersion: '2010-09-09'
Description: AWS CodeArtifact resources for dotnet

Resources:
  # Create Domain
  ExampleDomain:
    Type: AWS::CodeArtifact::Domain
    Properties:
      DomainName: example-domain
      PermissionsPolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              AWS: 
              - !Sub arn:aws:iam::${AWS::AccountId}:root
            Resource: "*"
            Action:
              - codeartifact:CreateRepository
              - codeartifact:DescribeDomain
              - codeartifact:GetAuthorizationToken
              - codeartifact:GetDomainPermissionsPolicy
              - codeartifact:ListRepositoriesInDomain

  # Create External Repository
  MyExternalRepository:
    Type: AWS::CodeArtifact::Repository
    Properties:
      DomainName: !GetAtt ExampleDomain.Name
      RepositoryName: my-external-repository       
      ExternalConnections:
        - public:nuget-org
      PermissionsPolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              AWS: 
              - !Sub arn:aws:iam::${AWS::AccountId}:root
            Resource: "*"
            Action:
              - codeartifact:DescribePackageVersion
              - codeartifact:DescribeRepository
              - codeartifact:GetPackageVersionReadme
              - codeartifact:GetRepositoryEndpoint
              - codeartifact:ListPackageVersionAssets
              - codeartifact:ListPackageVersionDependencies
              - codeartifact:ListPackageVersions
              - codeartifact:ListPackages
              - codeartifact:PublishPackageVersion
              - codeartifact:PutPackageMetadata
              - codeartifact:ReadFromRepository

  # Create Repository
  MyTeamRepository:
    Type: AWS::CodeArtifact::Repository
    Properties:
      DomainName: !GetAtt ExampleDomain.Name
      RepositoryName: my-team-repository
      Upstreams:
        - !GetAtt MyExternalRepository.Name
      PermissionsPolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              AWS: 
              - !Sub arn:aws:iam::${AWS::AccountId}:root
            Resource: "*"
            Action:
              - codeartifact:DescribePackageVersion
              - codeartifact:DescribeRepository
              - codeartifact:GetPackageVersionReadme
              - codeartifact:GetRepositoryEndpoint
              - codeartifact:ListPackageVersionAssets
              - codeartifact:ListPackageVersionDependencies
              - codeartifact:ListPackageVersions
              - codeartifact:ListPackages
              - codeartifact:PublishPackageVersion
              - codeartifact:PutPackageMetadata
              - codeartifact:ReadFromRepository

Getting the CloudFormation template

To use the CloudFormation stack, we recommend you clone the following GitHub repo so you also have access to the example projects. See the following code:

git clone https://github.com/aws-samples/aws-codeartifact-samples.git
cd aws-codeartifact-samples/getting-started/dotnet/cloudformation/

Alternatively, you can copy the previous template into a file on your local filesystem named deploy.yml.

Provisioning the CloudFormation stack

Now that you have a local copy of the template, you need to provision the resources using a CloudFormation stack. You can deploy the stack using the AWS CLI or on the AWS CloudFormation console.

To use the AWS CLI, enter the following code:

aws cloudformation deploy \
--template-file deploy.yml \
--region <YOUR_PREFERRED_REGION> \
--stack-name CodeArtifact-GettingStarted-DotNet

To use the AWS CloudFormation console, complete the following steps:

  1. On the AWS CloudFormation console, choose Create stack.
  2. Choose With new resources (standard).
  3. Select Upload a template file.
  4. Choose Choose file.
  5. Name the stack CodeArtifact-GettingStarted-DotNet.
  6. Continue to choose Next until prompted to create the stack.

Configuring your local development experience

We use the CodeArtifact credential provider to connect the Visual Studio IDE to a CodeArtifact repository. You need to download and install the AWS Toolkit for Visual Studio to configure the credential provider. The toolkit is an extension for Microsoft Visual Studio on Microsoft Windows that makes it easy to develop, debug, and deploy .NET applications to AWS. The credential provider automates fetching and refreshing the authentication token required to pull packages from CodeArtifact. For more information about the authentication process, see AWS CodeArtifact authentication and tokens.
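
If you build outside Visual Studio, you can also configure NuGet from the command line instead of using the toolkit. The following is a sketch, assuming the AWS CLI and the example resource names used in this post; the login command fetches an authorization token and registers the repository as a NuGet package source:

aws codeartifact login --tool dotnet --domain example-domain --repository my-team-repository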

To connect to a repository, you complete the following steps:

  1. Configure an account profile in the AWS Toolkit.
  2. Copy the source endpoint from the AWS Explorer.
  3. Set the NuGet package source as the source endpoint.
  4. Add packages for your project via your CodeArtifact repository.

Configuring an account profile in the AWS Toolkit

Before you can use the Toolkit for Visual Studio, you must provide a set of valid AWS credentials. In this step, we set up a profile that has access to interact with CodeArtifact. For instructions, see Providing AWS Credentials.

Figure: Visual Studio Toolkit for AWS Account Profile Setup

Copying the NuGet source endpoint

After you set up your profile, you can see your provisioned repositories.

  1. In the AWS Explorer pane, navigate to the repository you want to connect to.
  2. Choose your repository (right-click).
  3. Choose Copy NuGet Source Endpoint.

Figure: AWS CodeArtifact repositories shown in the AWS Explorer

You use the source endpoint later to configure your NuGet package sources.

Setting the package source using the source endpoint

Now that you have your source endpoint, you can set up the NuGet package source.

  1. In Visual Studio, under Tools, choose Options.
  2. Choose NuGet Package Manager.
  3. Under Options, choose the + icon to add a package source.
  4. For Name, enter codeartifact.
  5. For Source, enter the source endpoint you copied from the previous step.

Figure: Configuring NuGet package sources for AWS CodeArtifact

Adding packages via your CodeArtifact repository

After the package source is configured against your team repository, you can pull packages via the upstream connection to the shared repository.

  1. Choose Manage NuGet Packages for your project.
    • You can now see packages from nuget.org.
  2. Choose any package to add it to your project.

Figure: Exploring packages while connected to an AWS CodeArtifact repository

Viewing packages stored in your CodeArtifact team repository

Packages are stored in the repository you pull from or are referenced via the upstream connection. Because we're pulling packages from nuget.org through an external connection, you can see cached copies of those packages in your repository. To view the packages, navigate to your repository on the CodeArtifact console.

Figure: Packages stored in an AWS CodeArtifact repository
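
You can also list the cached packages from the command line; a sketch, assuming the example names used earlier:

aws codeartifact list-packages --domain example-domain --repository my-team-repository --format nuget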

Cleaning up

When you’re finished with this walkthrough, you may want to remove any provisioned resources. To remove the resources that the CloudFormation template created, navigate to the stack on the AWS CloudFormation console and choose Delete Stack. It may take a few minutes to delete all provisioned resources.
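
If you deployed with the AWS CLI, you can also delete the stack the same way:

aws cloudformation delete-stack --stack-name CodeArtifact-GettingStarted-DotNet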

After the resources are deleted, there are no more cleanup steps.

Conclusion

We have shown you how to set up CodeArtifact in minutes and easily integrate it with NuGet. You can reduce the time to build and push your packages from hours or days to minutes. You can also integrate CodeArtifact directly into your Visual Studio environment in four simple steps. With CodeArtifact repositories, your packages inherit the durability and security posture of CodeArtifact's underlying storage.

As of November 2020, CodeArtifact is available in the following AWS Regions:

  • US: US East (Ohio), US East (N. Virginia), US West (Oregon)
  • AP: Asia Pacific (Mumbai), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo)
  • EU: Europe (Frankfurt), Europe (Ireland), Europe (Stockholm)

For an up-to-date list of Regions where CodeArtifact is available, see AWS CodeArtifact FAQ.

About the Authors

John Standish

John Standish is a Solutions Architect at AWS and spent over 13 years as a Microsoft .NET developer. Outside of work, he enjoys playing video games, cooking, and watching hockey.

Nuatu Tseggai

Nuatu Tseggai is a Cloud Infrastructure Architect at Amazon Web Services. He enjoys working with customers to design and build event-driven distributed systems that span multiple services.

Neha Gupta

Neha Gupta is a Solutions Architect at AWS and has 16 years of experience as a database architect and DBA. Apart from work, she's outdoorsy and loves to dance.

Elijah Batkoski

Elijah is a Technical Writer for Amazon Web Services. Elijah has produced technical documentation and blogs for a variety of tools and services, primarily focused around DevOps.

Modernizing and containerizing a legacy MVC .NET application with Entity Framework to .NET Core with Entity Framework Core: Part 2

Post Syndicated from Pratip Bagchi original https://aws.amazon.com/blogs/devops/modernizing-and-containerizing-a-legacy-mvc-net-application-with-entity-framework-to-net-core-with-entity-framework-core-part-2/

This is the second post in a two-part series in which you migrate and containerize a modernized enterprise application. In Part 1, we walked you through a step-by-step approach to re-architect a legacy ASP.NET MVC application and ported it to .NET Core Framework. In this post, you will deploy the previously re-architected application to Amazon Elastic Container Service (Amazon ECS) and run it as a task with AWS Fargate.

Overview of solution

In the first post, you ported the legacy ASP.NET MVC application to ASP.NET Core. You now modernize the same application as a Docker container and host it in an ECS cluster.

The following diagram illustrates this architecture.

Architecture Diagram


You first launch an Amazon RDS for SQL Server Express instance (1) and create a Cycle Store database on that instance, with tables for different categories and subcategories of bikes. You use the previously re-architected and modernized ASP.NET Core application as a starting point for this post; this app uses AWS Secrets Manager (2) to fetch database credentials for the Amazon RDS instance. In the next step, you build a Docker image of the application and push it to Amazon Elastic Container Registry (Amazon ECR) (3). Finally, you create an ECS cluster (4) to run the Docker image as an AWS Fargate task.

Prerequisites

For this walkthrough, you should have the following prerequisites:

This post implements the solution in the us-east-1 Region.

Source Code

Clone the source code from the GitHub repo. The source code folder contains the re-architected source code, the AWS CloudFormation template to launch the infrastructure, and the Amazon ECS task definition.

Setting up the database server

To make sure that your database works out of the box, you use a CloudFormation template to create an instance of Microsoft SQL Server Express, AWS Secrets Manager secrets to store database credentials, security groups, and IAM roles to access Amazon Relational Database Service (Amazon RDS) and Secrets Manager. This stack takes approximately 15 minutes to complete, most of which is spent provisioning the RDS instance.

  1. On the AWS CloudFormation console, choose Create stack.
  2. For Prepare template, select Template is ready.
  3. For Template source, select Upload a template file.
  4. Upload SqlServerRDSFixedUidPwd.yaml, which is available in the GitHub repo.
  5. Choose Next.
    Create AWS CloudFormation stack
  6. For Stack name, enter SQLRDSEXStack.
  7. Choose Next.
  8. Keep the rest of the options at their default.
  9. Select I acknowledge that AWS CloudFormation might create IAM resources with custom names.
  10. Choose Create stack.
    Add IAM Capabilities
  11. When the status shows as CREATE_COMPLETE, choose the Outputs tab.
  12. Record the value for the SQLDatabaseEndpoint key.
    CloudFormation output
  13. Connect to the database from SQL Server Management Studio with the following credentials: User id: DBUser, Password: DBU$er2020. You can also verify connectivity from the command line, as shown after this list.
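
If you prefer the command line, you can verify connectivity with the sqlcmd utility. This is a sketch; it assumes sqlcmd is installed, and <SQLDatabaseEndpoint> stands for the endpoint you recorded from the stack outputs:

sqlcmd -S <SQLDatabaseEndpoint>,1433 -U DBUser -P 'DBU$er2020' -Q "SELECT @@VERSION"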

Setting up the CYCLE_STORE database

To set up your database, complete the following steps:

  1. In SQL Server Management Studio, connect to the DB instance using the ID and password you defined earlier.
  2. Under File, choose New.
  3. Choose Query with Current Connection. Alternatively, choose New Query from the toolbar.
    Run database restore script
  4. Open CYCLE_STORE_Schema_data.sql from the GitHub repository and run it.

This creates the CYCLE_STORE database with all the tables and data you need.

Setting up the ASP.NET MVC Core application

To set up your ASP.NET application, complete the following steps:

  1. Open the re-architected application code that you cloned from the GitHub repo. The Dockerfile added to the solution enables Docker support.
  2. Open the appsettings.Development.json file and replace the RDS endpoint in the ConnectionStrings section with the SQLDatabaseEndpoint output of the AWS CloudFormation stack, omitting the port number (:1433 for SQL Server).

The ASP.NET application should now load with bike categories and subcategories. See the following screenshot.

Final run

Setting up Amazon ECR

To set up your repository in Amazon ECR, complete the following steps:

  1. On the Amazon ECR console, choose Repositories.
  2. Choose Create repository.
  3. For Repository name, enter coretoecsrepo.
  4. Choose Create repository.
    Create repository
  5. Copy the repository URI to use later.
  6. Select the repository you just created and choose View push commands.
    View Push Commands
  7. In the folder where you cloned the repo, navigate to the AdventureWorksMVCCore.Web folder.
  8. In the View push commands popup window, complete steps 1–4 to push your Docker image to Amazon ECR.

The following screenshot shows the completion of Steps 1 and 2; make sure your working directory is set to AdventureWorksMVCCore.Web as shown.

Login
The following screenshot shows completion of Steps 3 and 4.

Amazon ECR Push
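
For reference, the push commands follow this general pattern. The following is a sketch that assumes the us-east-1 Region and the coretoecsrepo repository; replace <account-id> with your AWS account number:

# Authenticate Docker to your ECR registry
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com
# Build, tag, and push the image from the AdventureWorksMVCCore.Web folder
docker build -t coretoecsrepo .
docker tag coretoecsrepo:latest <account-id>.dkr.ecr.us-east-1.amazonaws.com/coretoecsrepo:latest
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/coretoecsrepo:latest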

Setting up Amazon ECS

To set up your ECS cluster, complete the following steps:

    1. On the Amazon ECS console, choose Clusters.
    2. Choose Create cluster.
    3. Choose the Networking only cluster template.
    4. Name your cluster cycle-store-cluster.
    5. Leave everything else as its default.
    6. Choose Create cluster.
    7. Select your cluster.
    8. Choose Task Definitions and choose Create new Task Definition.
    9. On the Select launch type compatibility page, choose FARGATE and choose Next step.
    10. On the Configure task and container definitions page, scroll to the bottom of the page and choose Configure via JSON.
    11. In the text area, enter the task definition JSON (task-definition.json) provided in the GitHub repo. Make sure to replace [YOUR-AWS-ACCOUNT-NO] in task-definition.json with your AWS account number on lines 44, 68, and 71. The task definition file assumes that you named your repository coretoecsrepo; if you named it something else, modify the file accordingly. It also assumes us-east-1 as your default Region, so replace the Region on lines 15 and 44 if you are using a different one.
    12. Choose Save.
    13. On the Task Definitions page, select cycle-store-td.
    14. From the Actions drop-down menu, choose Run Task.
      Run task
    15. For Launch type, choose FARGATE.
    16. Choose your default VPC as Cluster VPC.
    17. Select at least one Subnet.
    18. Choose Edit Security Groups and select ECSSecurityGroup (created by the AWS CloudFormation stack).
      Select security group
    19. Choose Run Task.

Running your application

Choose the link under the task and find the public IP address. When you navigate to http://your-public-ip, you should see the .NET Core Cycle Store web application user interface running in Amazon ECS.

See the following screenshot.

Final run

Cleaning up

To avoid incurring future charges, delete the stacks you created for this post.

  1. On the AWS CloudFormation console, choose Stacks.
  2. Select SQLRDSEXStack.
  3. In the Stack details pane, choose Delete.

Conclusion

This post concludes your journey of modernizing a legacy enterprise ASP.NET MVC web application using .NET Core and containerizing it on Amazon ECS using the AWS Fargate compute engine on Linux containers. Porting to .NET Core lets you run enterprise workloads without any dependency on a Windows environment, and AWS Fargate gives you a way to run containers directly, without managing any EC2 instances, while keeping full control. AWS has also recently launched additional tools in this area.

About the Author

Saleha Haider is a Senior Partner Solution Architect with Amazon Web Services.
Pratip Bagchi is a Partner Solutions Architect with Amazon Web Services.


Automated CI/CD pipeline for .NET Core Lambda functions using AWS extensions for dotnet CLI

Post Syndicated from Sundar Narasiman original https://aws.amazon.com/blogs/devops/automated-ci-cd-pipeline-for-net-core-lambda-functions-using-aws-extensions-for-dotnet-cli/

The trend of building AWS serverless applications using AWS Lambda is increasing at an ever-rapid pace. Common use cases for Lambda include data processing, real-time file processing, extract, transform, and load (ETL), web backends, internet of things (IoT) backends, and mobile backends. Lambda natively supports languages such as Java, Go, PowerShell, Node.js, C#, Python, and Ruby. It also provides a Runtime API that allows you to use any additional programming language to author your functions.

The .NET Framework occupies a significant footprint in the technology landscape of enterprises. Nowadays, enterprise customers are modernizing .NET Framework applications to .NET Core using AWS serverless services such as Lambda. In this journey, you break down a large monolithic service into multiple smaller, independent, and autonomous microservices built with .NET Core Lambda functions.

When you have several microservices running in production, a change management strategy is key to business agility and time to market. How well you manage changes to .NET Core Lambda functions comes down to how well you implement an automated CI/CD pipeline using AWS CodePipeline. In this post, you see two approaches for implementing CI/CD for .NET Core Lambda functions: creating a pipeline with either two or three stages.

Creating a pipeline with two stages

In this approach, you define the pipeline in CodePipeline with two stages: AWS CodeCommit and AWS CodeBuild. CodeCommit is the fully managed source control repository that stores the source code for .NET Core Lambda functions. It triggers CodeBuild when a new code change is published. CodeBuild defines a compute environment for the build process. It builds the .NET Core Lambda function and creates a deployment package (.zip). Finally, CodeBuild uses the AWS extensions for dotnet CLI to deploy the Lambda package (.zip) to the Lambda environment. The following diagram illustrates this architecture.


CodePipeline with CodeBuild and CodeCommit stages.
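
The deployment step in this approach is a single command that CodeBuild runs with the AWS extensions for dotnet CLI. The following is a sketch with placeholder function and role names:

dotnet lambda deploy-function MyFunction --function-role arn:aws:iam::<account-id>:role/<lambda-execution-role> --region us-east-1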

Creating a pipeline with three stages

In this approach, you define the pipeline with three stages: CodeCommit, CodeBuild, and AWS CodeDeploy.

CodeCommit stores the source code for .NET Core Lambda functions and triggers CodeBuild when a new code change is published. CodeBuild defines a compute environment for the build process and builds the .NET Core Lambda function. Then CodeBuild invokes the CodeDeploy stage. CodeDeploy uses AWS CloudFormation templates to deploy the Lambda function to the Lambda environment. The following diagram illustrates this architecture.

CodePipeline with CodeCommit, CodeBuild and CodeDeploy stages.

Solution Overview

In this post, you learn how to implement an automated CI/CD pipeline using the first approach: CodePipeline with CodeCommit and CodeBuild stages. The CodeBuild stage in this approach implements the build and deploy functionalities. The high-level steps are as follows:

  1. Create the CodeCommit repository.
  2. Create a Lambda execution role.
  3. Create a Lambda project with .NET Core CLI.
  4. Change the Lambda project configuration.
  5. Create a buildspec file.
  6. Commit changes to the CodeCommit repository.
  7. Create your CI/CD pipeline.
  8. Complete and verify pipeline creation.

For the source code and buildspec file, see the GitHub repo.

Prerequisites

Before you get started, you need the following prerequisites:

Creating a CodeCommit repository

You first need a CodeCommit repository to store the Lambda project source code.

  1. In the Repository settings section, for Repository name, enter a name for your repository.
  2. Choose Create.
    Name a repository
  3. Initialize this repository with a markdown file (readme.md). You need this markdown file to create documentation about the repository.
  4. Set up an AWS Identity and Access Management (IAM) credential to CodeCommit. Alternatively, you can set up SSH-based access. For instructions, see Setup for HTTPS users using Git credentials and Setup steps for SSH connections to AWS CodeCommit repositories on Linux, MacOS, or Unix. You need this to work with the CodeCommit repository from the development environment.
  5. Clone the CodeCommit repository to a local folder.

Proceed to the next step to create an IAM role for Lambda execution.

Creating a Lambda execution role

Every Lambda function needs an IAM role for execution. Create an IAM role for Lambda execution with the appropriate IAM policy, if it doesn't exist already.
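
If you want to script this step, the following AWS CLI sketch creates a role that the Lambda service can assume and attaches the basic execution policy. The role name testlambdarole is an example that matches the configuration file shown later:

# Create a role that the Lambda service can assume
aws iam create-role --role-name testlambdarole --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"lambda.amazonaws.com"},"Action":"sts:AssumeRole"}]}'

# Allow the function to write logs to Amazon CloudWatch
aws iam attach-role-policy --role-name testlambdarole --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

You're now ready to create a Lambda function project using the .NET Core command line interface (CLI).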

Creating a Lambda function project

You have multiple options for creating .NET Core Lambda function projects, such as Visual Studio 2019, Visual Studio Code, or the .NET Core CLI. In this post, you use the .NET Core CLI.

By default, the .NET Core CLI doesn't include Lambda project templates. You need the Amazon.Lambda.Templates NuGet package to create your project.

  1. Install the NuGet package Amazon.Lambda.Templates to make all the Lambda project templates available in the development environment. See the following CLI command.
    dotnet new -i Amazon.Lambda.Templates::*
  2. Verify the installation with the following CLI command.
    dotnet new

    You should see the following output reflecting the presence of various Lambda templates in the development environment. You also need to install the AWS extensions for dotnet CLI to deploy and invoke Lambda functions from the terminal or command prompt.
    dotnet cli command listing lambda project templates

  3. To install the extensions, enter the following CLI commands.
    dotnet tool install -g Amazon.Lambda.Tools
    dotnet tool update -g Amazon.Lambda.Tools
    

    You’re now ready to create a Lambda function project in the development environment.

  4. Navigate to the root of the cloned CodeCommit repository (which you created in the previous step).
  5. Create the Lambda function by entering the following CLI command.
    dotnet new lambda.EmptyFunction --name Dotnetlambda4 --profile default --region us-east-1

    After you create your Lambda function project, you need to make some configuration changes.

Changing the Lambda function project configuration

When you create a .NET Core Lambda function project, it adds the configuration file aws-lambda-tools-defaults.json at the root of the project directory. This file holds the various configuration parameters for Lambda execution. You want to make sure that the function role is set to the IAM role you created earlier, and that the profile is set to default.

The updated aws-lambda-tools-defaults.json file should look like the following code:

{
  "Information": [
    "This file provides default values for the deployment wizard inside Visual Studio and the AWS Lambda commands added to the .NET Core CLI.",
    "To learn more about the Lambda commands with the .NET Core CLI execute the following command at the command line in the project root directory.",

    "dotnet lambda help",

    "All the command line options for the Lambda command can be specified in this file."
  ],

  "profile": "default",
  "region": "us-east-1",
  "configuration": "Release",
  "framework": "netcoreapp3.1",
  "function-runtime": "dotnetcore3.1",
  "function-memory-size": 256,
  "function-timeout": 30,
  "function-handler": "Dotnetlambda4::Dotnetlambda4.Function::FunctionHandler",
  "function-role": "arn:aws:iam::awsaccountnumber:role/testlambdarole"
}
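
At this point, you can optionally verify the setup by deploying and invoking the function from the terminal. The following is a minimal sketch using the Amazon.Lambda.Tools extension installed earlier; it assumes valid default credentials and the role configured above:

dotnet lambda deploy-function Dotnetlambda4
dotnet lambda invoke-function Dotnetlambda4 --payload "hello world" --region us-east-1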

After you update your project configuration, you’re ready to create the buildspec.yml file.

Creating a buildspec file

As a prerequisite to configuring the CodeCommit stage, you created a Lambda function project. For the CodeBuild stage, you need to create a buildspec file.


Create a buildspec.yml file with the following definition and save it at the root of the CodeCommit directory:

version: 0.2
env:
  variables:
    DOTNET_ROOT: /root/.dotnet
  secrets-manager:
    AWS_ACCESS_KEY_ID_PARAM: CodeBuild:AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY_PARAM: CodeBuild:AWS_SECRET_ACCESS_KEY
phases:
  install:
    runtime-versions:
      dotnet: 3.1
  pre_build:
    commands:
      - echo Restore started on `date`
      - export PATH="$PATH:/root/.dotnet/tools"
      - pip install --upgrade awscli
      - aws configure set profile $Profile
      - aws configure set region $Region
      - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID_PARAM
      - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY_PARAM
      - cd Dotnetlambda4
      - cd src
      - cd Dotnetlambda4
      - dotnet clean 
      - dotnet restore
  build:
    commands:
      - echo Build started on `date`
      - dotnet new -i Amazon.Lambda.Templates::*
      - dotnet tool install -g Amazon.Lambda.Tools
      - dotnet tool update -g Amazon.Lambda.Tools
      - dotnet lambda deploy-function "Dotnetlambda4" --function-role "arn:aws:iam::yourawsaccount:role/youriamroleforlambda" --region "us-east-1"

You’re now ready to commit your changes to the CodeCommit repository.

Committing changes to the CodeCommit repository

To push changes to your CodeCommit repository, enter the following git commands.

git add --all
git commit -a -m "Initial Comment"
git push

After you commit the changes, you can create your CI/CD pipeline using CodePipeline.

Creating a CI/CD pipeline

To create your pipeline with a CodeCommit and CodeBuild stage, complete the following steps:

  1. In the Pipeline settings section, for Pipeline name, enter a name.
  2. For Service role, select New service role.
  3. For Role name, use the auto-generated name.
  4. Select Allow AWS CodePipeline to create a service role so it can be used with this new pipeline.
  5. Choose Next.
    Choose Pipeline settings
  6. In the Source section, for Source provider, choose AWS CodeCommit.
  7. For Repository name, choose your repository.
  8. For Branch name, choose your branch.
  9. For Change detection options, select Amazon CloudWatch Events.
  10. Choose Next.
    Populating the Source stage
  11. In the Build section, for Build provider, choose AWS CodeBuild.
    Populating the CodeBuild stage
  12. For Environment image, choose Managed image.
  13. For Operating system, choose Ubuntu.
  14. For Image, choose aws/codebuild/standard:4.0.
  15. For Image version, choose Always use the latest image for this runtime version.
    Selecting CodeBuild runtime
  16. CodeBuild needs to assume an IAM service role to get the required privileges for a successful build operation. Create a new service role for the CodeBuild project.
    Selecting the Service role
  17. Attach the following IAM policy to the role:
    
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "SecretManagerRead",
                "Effect": "Allow",
                "Action": [
                    "secretsmanager:GetRandomPassword",
                    "secretsmanager:GetResourcePolicy",
                    "secretsmanager:UntagResource",
                    "secretsmanager:GetSecretValue",
                    "secretsmanager:DescribeSecret",
                    "secretsmanager:ListSecretVersionIds",
                    "secretsmanager:ListSecrets",
                    "secretsmanager:TagResource"
                ],
                "Resource": "*"
            }
        ]
    }
    
  18. You now need to define the compute and environment variables for CodeBuild. For Compute, select your preferred compute.
  19. For Environment variables, enter two variables. For Region, enter your preferred Region. For Profile, enter default as the value. This allows the environment to use the default AWS profile in the build process.
    Selecting CodeBuild env options
  20. To set up an AWS profile, the CodeBuild environment needs an AccessKeyId and SecretAccessKey. As a best practice, configure AccessKeyId and SecretAccessKey as secrets in AWS Secrets Manager and reference them in buildspec.yml (a CLI alternative is sketched after this list). On the Secrets Manager console, choose Store a new secret.
  21. For Select secret type, select Other type of secrets.
    Selecting secret types
  22. Configure the secrets AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
    Configuring secrets
  23. For the encryption key, choose DefaultEncryptionKey.
  24. Choose Next.
  25. For Secret name, enter CodeBuild.
    Secret name
  26. Leave the rest of the selections as default and choose Store.
  27. In the Add deploy stage section, choose Skip deploy stage.
    Add Deploy stage
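
As an alternative to the console steps for creating the secret, you can use the AWS CLI. This is a sketch; replace the placeholder values with the IAM user's actual keys:

# Store both keys as a single JSON secret named CodeBuild
aws secretsmanager create-secret --name CodeBuild --secret-string '{"AWS_ACCESS_KEY_ID":"<your-access-key-id>","AWS_SECRET_ACCESS_KEY":"<your-secret-access-key>"}'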

Completing and verifying your pipeline

After you save your pipeline, push the code changes of the Lambda function from the local repository to the remote CodeCommit repository.

After a few seconds, you should see the CodeCommit stage activate and transition to the CodeBuild stage. Pipeline creation can take up to a few minutes.

CodePipeline

You can verify your pipeline on the CodePipeline console. The pipeline should deploy the Lambda function changes to the Lambda environment.

Cleaning up

If you no longer need the following resources, delete them to avoid incurring further charges:

  • CodeCommit repository
  • CodePipeline project
  • CodeBuild project
  • IAM role for Lambda execution
  • Lambda function

Conclusion

In this post, you implemented an automated CI/CD for .NET Core Lambda functions using two stages of CodePipeline: CodeCommit and CodeBuild. You can apply this solution to your own use cases.

About the author

Sundararajan Narasiman works as a Senior Partner Solutions Architect with Amazon Web Services.

Modernizing and containerizing a legacy MVC .NET application with Entity Framework to .NET Core with Entity Framework Core: Part 1

Post Syndicated from Pratip Bagchi original https://aws.amazon.com/blogs/devops/modernizing-and-containerizing-a-legacy-mvc-net-application-with-entity-framework-to-net-core-with-entity-framework-core-part-1/

Tens of thousands of .NET applications are running across the world, many of which are ASP.NET web applications. This number becomes interesting when you consider that the .NET Framework, as we know it, will be changing significantly. The current release schedule for .NET 5.0 is November 2020, and going forward there will be just one .NET that you can use to target multiple platforms like Windows and Linux. This matters because .NET applications running on version 4.8 and lower can't automatically upgrade to this new version of .NET: .NET 5.0 is based on .NET Core and thus has breaking changes when upgrading from an older version of .NET.

This is an important step in the .NET ecosystem because it enables .NET applications to move beyond the Windows world. However, it also means that active applications need to go through a refactoring before they can take advantage of this new definition. One choice for this refactoring is to wait until the new version of .NET is released and start the refactoring process at that time. The second choice is to get an early start and begin converting your applications to .NET Core 3.1 so that the migration to .NET 5.0 will be smoother. This post demonstrates an approach for migrating an ASP.NET MVC (Model View Controller) web application using Entity Framework 6 to ASP.NET Core with Entity Framework Core.

This post shows the steps to modernize a legacy enterprise ASP.NET MVC web application using .NET Core, along with converting Entity Framework to Entity Framework Core.

Overview of the solution

The first step is to get an ASP.NET MVC application and its required database server up and working in your AWS environment. We take this approach so you can run the application locally to see how it works. You first set up the database, which is SQL Server running in Amazon Relational Database Service (Amazon RDS). Amazon RDS provides a managed SQL Server experience. After you define the database, you set up the schema and data. If you already have your own SQL Server instance running, you can load the data there instead; you simply need to ensure your connection string points to that server rather than the Amazon RDS server you set up in this walkthrough.

Next, you launch a legacy ASP.NET MVC web application that displays lists of bike categories and their subcategories. This legacy application uses Entity Framework 6 to fetch data from the database.

Finally, you take a step-by-step approach to convert the same use case and create a new ASP.NET Core web application. Here you use Entity Framework Core to fetch data from the database. As a best practice, you also use AWS Secrets Manager to store database login information.

Migration blog overview

Prerequisites

For this walkthrough, you should have the following prerequisites:

Setting up the database server

For this walkthrough, we have provided an AWS CloudFormation template inside the GitHub repository to create an instance of Microsoft SQL Server Express, which you can download from this link.

  1. On the AWS CloudFormation console, choose Create stack.
  2. For Prepare template, select Template is ready.
  3. For Template source, select Upload a template file.
  4. Upload SqlServerRDSFixedUidPwd.yaml and choose Next.
    Create AWS CloudFormation stack
  5. For Stack name, enter SQLRDSEXStack and choose Next.
  6. Keep the rest of the options at their default.
  7. Select I acknowledge that AWS CloudFormation might create IAM resources with custom names.
  8. Choose Create stack.
    Add IAM Capabilities
  9. When the status shows as CREATE_COMPLETE, choose the Outputs tab and record the value from the SQLDatabaseEndpoint key.
    AWS CloudFormation output
  10. Connect to the database from SQL Server Management Studio with the following credentials: User id: DBUser, Password: DBU$er2020

Setting up the CYCLE_STORE database

To set up your database, complete the following steps:

  1. In SQL Server Management Studio, connect to the DB instance using the ID and password you defined earlier.
  2. Under File, choose New.
  3. Choose Query with Current Connection.
    Alternatively, choose New Query from the toolbar.
    Run database restore script
  4. Download cycle_store_schema_data.sql and run it.

This creates the CYCLE_STORE database with all the tables and data you need.

Setting up and validating the legacy MVC application

  1. Download the source code from the GitHub repo.
  2. Open AdventureWorksMVC_2013.sln and modify the database connection string in the web.config file by replacing the Data Source property value with the server name from your Amazon RDS setup.

The ASP.NET application should load with bike categories and subcategories. The following screenshot shows the Unicorn Bike Rentals website after configuration.
Legacy code output

Now that you have the legacy application running locally, you can look at what it would take to refactor it so that it’s a .NET Core 3.1 application. Two main approaches are available for this:

  • Update in place – You make all the changes within a single code set
  • Move code – You create a new .NET Core solution and move the code over piece by piece

For this post, we show the second approach because it means that you have to do less scaffolding.

Creating a new MVC Core application

To create your new MVC Core application, complete the following steps:

  1. Open Visual Studio.
  2. From the Get Started page, choose Create a New Project.
  3. Choose ASP.NET Core Web Application.
  4. For Project name, enter AdventureWorksMVCCore.Web.
  5. Add a Location you prefer.
  6. For Solution name, enter AdventureWorksMVCCore.
  7. Choose Web Application (Model-View-Controller).
  8. Choose Create. Make sure that the project is set to use .NET Core 3.1.
  9. Choose Build, Build Solution.
  10. Press CTRL + Shift + B to make sure the current solution is building correctly.

You should get a default ASP.NET Core startup page.

Aligning the projects

ASP.NET Core MVC depends on a well-known folder structure; a lot of the scaffolding expects view source code files to be in the Views folder, controller source code files to be in the Controllers folder, and so on. Some non-.NET-specific folders are also at the same level, such as css and images. In .NET Core, the expectation is that static content lives in a new construct, the wwwroot folder. This includes JavaScript, CSS, and image files. You also need to update the configuration file with the same database connection string that you used earlier.

Your first step is to move the static content.

  1. In the .NET Core solution, delete all the content created during solution creation. This includes the css, js, and lib directories and the favicon.ico file.
  2. Copy over the css, favicon, and Images folders from the legacy solution to the wwwroot folder of the new solution. When completed, your .NET Core wwwroot directory should appear like the following screenshot.
    Static content
  3. Open appsettings.Development.json and add a ConnectionStrings section (replace the server with the Amazon RDS endpoint that you have already been using). See the following code:
{
  "ConnectionStrings": {
    "DefaultConnection": "Server=sqlrdsdb.xxxxx.us-east-1.rds.amazonaws.com; Database=CYCLE_STORE;User Id=DBUser;Password=DBU$er2020;"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  }
}

Setting up Entity Framework Core

One of the changes in .NET Core is around changes to Entity Framework. Entity Framework Core is a lightweight, extensible data access technology. It can act as an object-relational mapper (O/RM) that enables interactions with the database using .NET objects, thus abstracting out much of the database access code. To use Entity Framework Core, you first have to add the packages to your project.
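
If you prefer the command line to the NuGet UI steps that follow, the same packages can be added with the dotnet CLI; this is a sketch, run from the AdventureWorksMVCCore.Web project directory:

dotnet add package Microsoft.EntityFrameworkCore.SqlServer
dotnet add package Microsoft.EntityFrameworkCore.Tools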

  1. In the Solution Explorer window, choose the project (right-click) and choose Manage NuGet Packages…
  2. On the Browse tab, search for the latest stable version of these two NuGet packages. You should see a screen similar to the following screenshot, containing:
    • Microsoft.EntityFrameworkCore.SqlServer
    • Microsoft.EntityFrameworkCore.Tools
      Add Entity Framework Core nuget
      After you add the packages, the next step is to generate database models. This is part of the O/RM functionality; these models map to the database tables and include information about the fields, constraints, and other information necessary to make sure that the generated models match the database. Fortunately, there is an easy way to generate those models. Complete the following steps:
  3. Open Package Manager Console from Visual Studio.
  4. Enter the following code (replace the server endpoint):
    Scaffold-DbContext "Server=sqlrdsdb.xxxxxx.us-east-1.rds.amazonaws.com; Database=CYCLE_STORE;User Id=DBUser;Password=DBU`$er2020;" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models
    The ` in the password right before the $ is the escape character.
    You should now have a Context and several Model classes from the database stored within the Models folder. See the following screenshot.
    Databse model scaffolding
  5. Open the CYCLE_STOREContext.cs file under the Models folder and comment out the following lines of code, as shown in the following screenshot. You instead take advantage of the middleware to read the connection string in appsettings.Development.json that you previously configured.
    if (!optionsBuilder.IsConfigured)
                {
    #warning To protect potentially sensitive information in your connection string, you should move it out of source code. See http://go.microsoft.com/fwlink/?LinkId=723263 for guidance on storing connection strings.
                    optionsBuilder.UseSqlServer(
                        "Server=sqlrdsdb.cph0bnedghnc.us-east-1.rds.amazonaws.com; " +
                        "Database=CYCLE_STORE;User Id= DBUser;Password= DBU$er2020;");
                }
  6. Open the startup.cs file and add the following lines of code in the ConfigureServices method. You need to add references to AdventureWorksMVCCore.Web.Models and Microsoft.EntityFrameworkCore in the using statements. This reads the connection string from the appSettings file and integrates with Entity Framework Core.
    services.AddDbContext<CYCLE_STOREContext>(options => options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

Setting up a service layer

Because you're working with an MVC application, the application control logic belongs in the controller. In the previous step, you created the data access layer. You now create the service layer. This service layer is responsible for mediating communication between the controller and the data access layer. Generally, this is where you put considerations such as business logic and validation.

Setting up the interfaces for these services follows the dependency inversion and interface segregation principles.

    1. Create a folder named Service under the project directory.
    2. Create two subfolders, Interface and Implementation, under the new Service folder.
      Service setup
    3. Add a new interface, ICategoryService, to the Service\Interface folder.
    4. Add the following code to that interface:
      using AdventureWorksMVCCore.Web.Models;
      using System.Collections.Generic;
       
      namespace AdventureWorksMVCCore.Web.Service.Interface
      {
          public interface ICategoryService
          {
            List<ProductCategory> GetCategoriesWithSubCategory();
          }
      }
    5. Add a new service file, CategoryService, to the Service/Implementation folder.
    6. Create a class file CategoryService.cs and implement the interface you just created with the following code:
      using AdventureWorksMVCCore.Web.Models;
      using AdventureWorksMVCCore.Web.Service.Interface;
      using Microsoft.EntityFrameworkCore;
      using System.Collections.Generic;
      using System.Linq;
       
      namespace AdventureWorksMVCCore.Web.Service.Implementation
      {
        
          public class CategoryService : ICategoryService
          {
              private readonly CYCLE_STOREContext _context;
              public CategoryService(CYCLE_STOREContext context)
              {
                  _context = context;
              }
              public List<ProductCategory> GetCategoriesWithSubCategory()
              {
                  return _context.ProductCategory
                          .Include(category => category.ProductSubcategory)
                          .ToList(); 
               }
          }
      }
      

      Now that you have the interface and implementation completed, the next step is to add the dependency resolver. This adds the interface to the application's service collection and acts as a map for how to instantiate that class when it's injected into a class constructor.

    7. To add this mapping, open the startup.cs file and add the following line of code below where you added DbContext:
      services.TryAddScoped<ICategoryService, CategoryService>();

      You may also need to add the following references:

      using AdventureWorksMVCCore.Web.Service.Implementation;
      using AdventureWorksMVCCore.Web.Service.Interface;
      using Microsoft.Extensions.DependencyInjection;
      using Microsoft.Extensions.DependencyInjection.Extensions;
      

Setting up view components

In this section, you move the UI to your new ASP.NET Core project. In ASP.NET Core, the default folder structure for managing views is different than it was in ASP.NET MVC. Also, the formatting of the Razor files is slightly different.

    1. Under Views, Shared, create a folder called Components.
    2. Create a subfolder called Header.
    3. In the Header folder, create a new view called Default.cshtml and enter the following code:
      <div method="post" asp-action="header" asp-controller="home">
          <div id="hd">
              <div id="doc2" class="yui-t3 wrapper">
                  <table>
                      <tr>
                          <td>
                              <h2 class="banner">
                                  <a href="@Url.Action("Default","Home")" id="LnkHome">
                                      <img src="/Images/logo.png" style="width:125px;height:125px" alt="ComponentOne" />
       
                                  </a>
                              </h2>
                          </td>
                          <td class="hd-header"><h2 class="banner">Unicorn Bike Rentals</h2></td>
                      </tr>
                  </table>
              </div>
       
          </div>
      </div>
      
    4. Create a class within the Header folder called HeaderLayout.cs and enter the following code:
      using Microsoft.AspNetCore.Mvc;
       
      namespace AdventureWorksMVCCore.Web
      {
          public class HeaderViewComponent : ViewComponent
          {
              public IViewComponentResult Invoke()
              {
                  return View();
              }
          }
      }
      

      You can now create the content view component, which shows bike categories and subcategories.

    5. Under Views, Shared, Components, create a folder called Content.
    6. Create a class ContentLayoutModel.cs and enter the following code:
      using AdventureWorksMVCCore.Web.Models;
      using System.Collections.Generic;
       
      namespace AdventureWorksMVCCore.Web.Views.Components
      {
          public class ContentLayoutModel
          {
              public List<ProductCategory> ProductCategories { get; set; }
          }
      }
      
    7. In this folder, create a view Default.cshtml and enter the following code:
      @model AdventureWorksMVCCore.Web.Views.Components.ContentLayoutModel
       
      <div method="post" asp-action="footer" asp-controller="home">
          <div class="content">
       
              <div class="footerinner">
                  <div id="PnlExpFooter">
                      <div>
                          @foreach (var category in Model.ProductCategories)
                          {
                            <div asp-for="@category.Name" id=@($"{category.Name}Menu")>
                                  <h1>
                                      <b>@category.Name</b>
                                  </h1>
                                    <ul id=@($"{category.Name}List")>
                                      @foreach (var subCategory in category.ProductSubcategory.ToList())
                                      {
                                          <li>@subCategory.Name</li>
                                      }
                                  </ul>
                              </div>
                          }
                      </div>
                  </div>
              </div>
          </div>
      </div>
    8. Create a class ContentLayout.cs and enter the following code:
      using AdventureWorksMVCCore.Web.Models;
      using AdventureWorksMVCCore.Web.Service.Interface;
      using Microsoft.AspNetCore.Mvc;
       
      namespace AdventureWorksMVCCore.Web.Views.Components
      {
          public class ContentViewComponent : ViewComponent
          {
              private readonly CYCLE_STOREContext _context;
              private readonly ICategoryService _categoryService;
              public ContentViewComponent(CYCLE_STOREContext context,
                  ICategoryService categoryService)
              {
                  _context = context;
                  _categoryService = categoryService;
              }
       
              public IViewComponentResult Invoke()
              {
                  ContentLayoutModel content = new ContentLayoutModel();
                  content.ProductCategories = _categoryService.GetCategoriesWithSubCategory();
                  return View(content);
              }
       
       
          }
      }

      The website layout is driven by the _Layout.cshtml file.

    9. To render the header and portal the way you want, modify _Layout.cshtml and replace the existing code with the following code:
      <!DOCTYPE html>
      <html lang="en">
      <head>
          <meta name="viewport" content="width=device-width" />
          <title>Core Cycles Store</title>
          <link rel="apple-touch-icon" sizes="180x180" href="favicon/apple-touch-icon.png" />
          <link rel="icon" type="image/png" href="favicon/favicon-32x32.png" sizes="32x32" />
          <link rel="icon" type="image/png" href="favicon/favicon-16x16.png" sizes="16x16" />
          <link rel="manifest" href="favicon/manifest.json" />
          <link rel="mask-icon" href="favicon/safari-pinned-tab.svg" color="#503b75" />
          <link href="@Url.Content("~/css/StyleSheet.css")" rel="stylesheet" />
      </head>
      <body class='@ViewBag.BodyClass' id="body1">
          @await Component.InvokeAsync("Header");
       
          <div id="doc2" class="yui-t3 wrapper">
              <div id="bd">
                  <div id="yui-main">
                      <div class="content">
                          <div> 
                              @RenderBody()
                          </div>
                      </div>
                  </div>
              </div>
          </div>
      </body>
      </html>
      

      Upon completion your directory should look like the following screenshot.
      View component setup

Modifying the index file

In this final step, you modify the Home, Index.cshtml file to hold this content ViewComponent:

@{
    ViewBag.Title = "Core Cycles";
    Layout = "~/Views/Shared/_Layout.cshtml";
}
 
<div id="homepage" class="">    
    <div class="content-mid">
        @await Component.InvokeAsync("Content");
    </div>
 
</div>

You can now build the solution. You should have an MVC .NET Core 3.1 application running with data from your database. The following screenshot shows the website view.

Re-architected code output

Securing the database user and password

The CloudFormation stack you launched also created a Secrets Manager entry to store the CYCLE_STORE database user ID and password. As an optional step, you can use that to retrieve the database user ID and password instead of hard-coding it to ConnectionString.

To do so, you can use the AWS Secrets Manager client-side caching library. The dependency package is also available through NuGet. For this post, I use NuGet to add the library to the project.

  1. On the NuGet Package Manager console, browse for AWSSDK.SecretsManager.Caching.
  2. Choose the library and install it.
  3. Follow these steps to also install Newtonsoft.Json.
    Add Aws Secrects Manager nuget
  4. Add a new class ServicesConfiguration to this solution and enter the following code in the class. Make sure all the references are added to the class. This is an extension method so we made the class static:
    using Amazon.SecretsManager;
    using Amazon.SecretsManager.Extensions.Caching;
    using Microsoft.Extensions.DependencyInjection;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    public static class ServicesConfiguration
    {
        public static async Task<Dictionary<string, string>> GetSqlCredential(this IServiceCollection services, string secretId)
        {
            var credential = new Dictionary<string, string>();

            using (var secretsManager = new AmazonSecretsManagerClient(Amazon.RegionEndpoint.USEast1))
            using (var cache = new SecretsManagerCache(secretsManager))
            {
                // Retrieve the JSON secret created by the CloudFormation stack and parse out the values
                var sec = await cache.GetSecretString(secretId);
                var jo = Newtonsoft.Json.Linq.JObject.Parse(sec);

                credential["username"] = jo["username"].ToObject<string>();
                credential["password"] = jo["password"].ToObject<string>();
            }

            return credential;
        }
    }
    
  5. In appsettings.Development.json, replace the DefaultConnection with the following code:
    "Server=sqlrdsdb.cph0bnedghnc.us-east-1.rds.amazonaws.com; Database=CYCLE_STORE;User Id=<UserId>;Password=<Password>;"
  6. Add the following code in the startup.cs, which replaces the placeholder user ID and password with the value retrieved from Secrets Manager:
    // Read the connection string template from appsettings.Development.json
    var connectionString = Configuration.GetConnectionString("DefaultConnection");
    Dictionary<string, string> secrets = services.GetSqlCredential("CycleStoreCredentials").Result;
    connectionString = connectionString.Replace("<UserId>", secrets["username"]);
    connectionString = connectionString.Replace("<Password>", secrets["password"]);
    services.AddDbContext<CYCLE_STOREContext>(options => options.UseSqlServer(connectionString));

    Build the solution again.

Cleaning up

To avoid incurring future charges, on the AWS CloudFormation console, delete the SQLRDSEXStack stack.

Conclusion

This post showed the process to modernize a legacy enterprise MVC ASP.NET web application using .NET Core and convert Entity Framework to Entity Framework Core. In Part 2 of this post, we take this one step further to show you how to host this application in Linux containers.

About the Authors

Saleha Haider is a Senior Partner Solution Architect with Amazon Web Services.
Pratip Bagchi is a Partner Solutions Architect with Amazon Web Services.


Deploying an ASP.NET Core web application to Amazon ECS using an Azure DevOps pipeline

Post Syndicated from John Formento original https://aws.amazon.com/blogs/devops/deploying-a-asp-net-core-web-application-to-amazon-ecs-using-an-azure-devops-pipeline/

For .NET developers, Team Foundation Server (TFS) has been the cornerstone of CI/CD for years. As more .NET developers deploy onto AWS, they ask how to use those same tools to deploy to the AWS Cloud. By configuring a pipeline in Azure DevOps that deploys to AWS, you can keep building applications with the Microsoft development tools you already know.

Solution overview

This blog post demonstrates how to create a simple Azure DevOps project, repository, and pipeline, and use them to deploy an ASP.NET Core web application to Amazon ECS. The following diagram shows the high-level architecture of the pipeline:


Solution Architecture Diagram

In this example, you perform the following steps:

  1. Create an Azure DevOps project, clone the project repo, and push an ASP.NET Core web application.
  2. Create a pipeline in Azure DevOps.
  3. Build an Amazon ECS cluster, task definition, and service.
  4. Kick off a deployment of the ASP.NET Core web application using the newly created Azure DevOps pipeline.


Prerequisites

Ensure you have the following prerequisites set up:

  • An Amazon ECR repository
  • An IAM user with permissions for Amazon ECR and Amazon ECS (the user will need an access key and secret access key)


Create an Azure DevOps Project, clone project repo, and push ASP.NET Core web application

Follow these steps to deploy a .NET Core app onto your Amazon ECS cluster using the Azure DevOps (ADO) repository and pipeline:


  1. Log in to dev.azure.com and navigate to the marketplace.
  2. Under Visual Studio, search for “AWS” and add the AWS Tools for Microsoft Visual Studio Team Services extension.
  3. Create a project in ADO: Provide a project name and choose Create.
  4. On the Project Summary page, choose Project Settings.
  5. In the Project Settings pane, navigate to the Service Connections page.
  6. Choose Create service connection, select AWS, and choose Next.
  7. Input an Access Key ID and Secret Access Key. (You’ll need an IAM user with permissions for Amazon ECR and Amazon ECS in order to deploy via the Azure DevOps pipeline.) Choose Save.
  8. Choose Repos in the left pane, then Clone in Visual Studio under Clone to your computer.
  9. Create an ASP.NET Core web application in Visual Studio, set the location to the locally cloned repository, and check Enable Docker support.
  10. Once you’ve created the new project, perform an initial commit and push to the repository in Azure DevOps (equivalent git commands follow this list).
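
If you prefer the command line to Visual Studio's built-in Git tooling, the initial commit and push in the last step look roughly like the following, run from the cloned repository folder (the branch name may differ in your repository):

    # Stage all files, commit, and push to the Azure DevOps remote.
    git add .
    git commit -m "Initial commit"
    git push origin master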


Creating a pipeline in Azure DevOps

Now that you have synced the repository, create a pipeline in Azure DevOps.

  1. Go to the pipeline page within Azure DevOps and choose Create Pipeline.
  2. Choose Use the classic editor.
    Pipeline configuration with repository
  3. Select Azure Repos Git for the location of your code and select the repository you created earlier.
  4. On the Choose a Template page, select Docker Container and choose Apply.
  5. Remove the Push an image step.
  6. Add an Amazon ECR Push task by choosing the + symbol next to Agent job 1. You can search for “AWS” in the Add tasks pane to filter for all AWS tasks.


Now, configure each task:

  1. Choose the Build an image task and ensure that the action is set to Build an image. Additionally, you can modify the Image Name to your standards.
    Pipeline configuration page Azure DevOps
  2. Choose the Push Image task and provide the following:
    • Enter a name under Display Name.
    • Select the AWS Credentials that you created in Service Connections.
    • Select the AWS Region.
    • Provide the source image name, which you can find in the setting for the Build an image task.
    • Enter the name of the repository in Amazon ECR to which the image is pushed.
    Pipeline configuration page Azure DevOps
  3. Choose Save and queue.

Build Amazon ECS Cluster, Task, and Service

The goal at this point is to test the pipeline up through building the Docker image and pushing it to Amazon ECR. Once the Docker image is in Amazon ECR, you can create the Amazon ECS cluster, task definition, and service that use the newly created Docker image.

  1. Create an Amazon ECS cluster.
  2. Create an Amazon ECS task definition. When you create the task definition and configure the container, use the Amazon ECR URI for the Docker image that was just pushed to Amazon ECR.
  3. Create an Amazon ECS service. (Equivalent AWS CLI commands follow this list.)
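
If you'd rather script these resources than use the console, the following AWS CLI commands are a rough sketch. The cluster, service, and task names are placeholders, and taskdef.json is an assumed task definition file you would author separately:

    # Create the cluster that will run the service.
    aws ecs create-cluster --cluster-name my-cluster

    # Register a task definition that references the image pushed to Amazon ECR.
    aws ecs register-task-definition --cli-input-json file://taskdef.json

    # Create the service from the registered task definition.
    aws ecs create-service \
        --cluster my-cluster \
        --service-name my-service \
        --task-definition my-task \
        --desired-count 1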

Go back and edit the pipeline:

  1. Add the last step by choosing the + symbol next to Agent job 1.
  2. Search for “AWS CLI” in the search bar and add the task.
  3. Choose AWS CLI and configure the task.
  4. Enter a name under Display Name, such as Update ECS Service.
  5. Select the AWS Credentials that you created in Service Connections.
  6. Select the AWS Region.
  7. Input the following command, which updates the Amazon ECS service after a new image is pushed to Amazon ECR. Replace <clustername> and <servicename> with your Amazon ECS cluster and service names. (The assembled command is shown after this list.)
    • Command:ecs
    • Subcommand:update-service
    • Options and parameters: --cluster <clustername> --service <servicename> --force-new-deployment
  8. Now choose the Triggers tab and select Enable continuous integration with the repository you created.
  9. Choose Save and queue.
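
Assembled from the values above, the command the task runs looks like the following (with your cluster and service names substituted):

    aws ecs update-service --cluster <clustername> --service <servicename> --force-new-deployment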


At this point, your build pipeline kicks off and builds a Docker image from the source code in the repository you created, pushes the image to Amazon ECR, and updates the Amazon ECS service with the new image.

You can verify by viewing the build: choose Pipelines in Azure DevOps, select the entry for the latest run, and then choose the icon under the status column. Once the run completes successfully, you can log in to the AWS Management Console and view the updated image in Amazon ECR and the updated service in Amazon ECS.
Pipeline status page Azure DevOps

Every time you commit and push your code through Visual Studio, this pipeline kicks off and builds and deploys your application to Amazon ECS.

Cleanup

Once you’ve completed all the steps and are finished testing, follow these steps to disable or delete resources to avoid incurring further costs:

  1. Go to the Amazon ECS console within the AWS Console.
  2. Navigate to the cluster you created, then choose the Tasks tab.
  3. Choose Stop all to turn off the tasks.

Conclusion

This blog post reviewed how to create a CI/CD pipeline in Azure DevOps that deploys a Docker image to Amazon ECR and a container to Amazon ECS. It provided detailed steps for setting up a basic CI/CD pipeline using tools .NET developers are familiar with, along with the steps needed to integrate with Amazon ECR and Amazon ECS.

I hope this post was informative and has helped you learn the basics of how to integrate Amazon ECR and Amazon ECS with Azure DevOps to create a robust CI/CD pipeline.

About the Authors

John Formento

John Formento is a Solution Architect at Amazon Web Services. He helps large enterprises achieve their goals by architecting secure and scalable solutions on the AWS Cloud.