Continuous Deployment of an ASP.NET Core app to Azure using VSTS

In a previous post I went through how to deploy an ARM template to Azure through Visual Studio Team Services – Continuous Integration and deployment, using the VSTS Build engine and Release Management features. Now I'm going to cover deployment of an ASP.NET Core web app to Azure App Services, including some more detail around parameterisation, tying resources together with ARM templates, and a few of the many changes in VSTS that have arrived in the past few months.

Scenario

I've built a template for a project that I keep reusing, so either I'm just stuck in my ways, or this environment covers the majority of my deployments. I've included the full template here to show what it does and how it works, and you're welcome to use it for your own projects. This is what my environment looks like: a web site hosted in an App Service plan; a storage account that I use for storing blobs (mainly static content like images and other resources) and storage queues; an App Insights instance for recording usage and performance analytics; and a SQL Azure Database (plus the SQL Azure Database server instance that hosts it). There are other components you might also require in some circumstances – commonly a CDN profile and a Redis cache – and although I haven't included those, to keep things simple, the pattern for including them and tying them to your web app is the same as for the storage account and App Insights.

arm-for-dotnetcore-webapp

I'm going to assume you know how to create a Team Project in VSTS, and that you've got one set up using whatever source control or template you like. Next we're going to set up the connection to your Azure subscription and the Environment build.

Environment Build

Linking VSTS to your Azure Subscription’s Azure Resource Manager

For VSTS's release manager to be able to communicate and deploy to your subscription, you need to set up a service connection between VSTS and the subscription, so it can call the Azure Resource Manager API with the right permissions. Since my last post, setting up a service principal in VSTS has become a one-click operation, rather than requiring you to download and execute a PowerShell script. If you haven't already linked your VSTS project to your Azure subscription, you will need to do this:

  • Navigate to your team project’s home page and then click on the settings (gear cog) icon within the Team project.
  • Click on the Services tab
  • Select the New Service Endpoint dropdown menu and choose “Azure Resource Manager”
  • Give the connection a name, select the subscription, click OK – you’re done.

A few things can go wrong while setting this up:
– The subscription you want doesn't appear in the dropdown list: this means the user account you've logged into VSTS with doesn't have access to that subscription. Add the account as an administrator to the Azure subscription via the portal, and it should then appear here.
– You don't have the necessary permissions: ask someone who has Create Application rights in your company's Azure Active Directory to either give your account the permissions it needs, or to go through these steps for you.
There is still a link on the popup dialog to take you through the steps for running the PowerShell script, should you need to revert to the manual process.

Azure Resource Manager (ARM) Template

I used Visual Studio 2015 to create a new Project template for my Azure Resources, using the VS “Azure Resource Group” template.

arm2

I started with the Web+SQL template, then modified it to match my requirements. I prefer to turn some of the standard template's parameters into variables calculated from a single parameter such as "environment" – giving a SQL database called "-tst-db", for example, instead of the randomly generated names the standard template produces – and I've defaulted some of the other parameters to values that better suit my budget.

ARM Template (github)
Parameters file (github)

So this template will accept a parameter named “environment”, and that will cause the various resources to be created with different names, based on “prd”, “tst” or “dev”. You will notice my resources have names like “CSAPWeb-tst”, “csapsql-tst” and so on – you should change any occurrence of CSAP to your own project’s name. Note that anything with a public endpoint (web site, sql database server) needs to have a globally unique name.
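For reference, this environment-based naming can be handled with ARM template variables computed from the "environment" parameter – a simplified sketch (the variable names match those referenced in the snippets below; swap CSAP for your own project name):

```json
{
  "variables": {
    "webSiteName": "[concat('CSAPWeb-', parameters('environment'))]",
    "sqlserverName": "[concat('csapsql-', parameters('environment'))]",
    "databaseName": "[concat('csap-', parameters('environment'), '-db')]",
    "storageName": "[concat('csapstore', parameters('environment'))]"
  }
}
```

Note the storage account name pattern has no hyphen – storage account names must be lowercase letters and numbers only, as well as globally unique.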

Linked resources: I need the web application to know how to talk to the SQL Database, the storage account and the App Insights instance. This is done by including a connection string resource and an app settings resource in the template for the web site; these in turn extract the information from the other resources, so one template can be used for all environments. You should also include here any settings your application needs that are known at this stage, can be calculated per environment, or can be injected per environment later in the release management phase. This is really important: as an example, I've included the Azure AD Secret and ClientId as parameters that I can inject per environment (and manage the password elsewhere), but since the Tenant Id won't change, I've kept that as a constant (obfuscated in this case). If you're not using Azure AD for your project, remove the client ID, client secret and all the "Authentication:" settings below, from this step and future ones.

        {
          "apiVersion": "2015-08-01",
          "type": "config",
          "name": "connectionstrings",
          "dependsOn": [
            "[concat('Microsoft.Web/Sites/', variables('webSiteName'))]",
            "[concat('Microsoft.Storage/storageAccounts/',variables('storageName'))]"
          ],
          "properties": {
            "csapreporting": {
              "value": "[concat('Data Source=tcp:', reference(concat('Microsoft.Sql/servers/', variables('sqlserverName'))).fullyQualifiedDomainName, ',1433;Initial Catalog=', variables('databaseName'), ';User Id=', parameters('administratorLogin'), '@', variables('sqlserverName'), ';Password=', parameters('administratorLoginPassword'), ';')]",
              "type": "SQLServer"
            },
            "storage": {
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageName'), ';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('storageName')), '2015-06-15').key1)]",
              "type": "Custom"
            }
          }
        },
        {
          "name": "appsettings",
          "type": "config",
          "apiVersion": "2015-08-01",
          "dependsOn": [
            "[concat('Microsoft.Web/sites/', variables('webSiteName'))]",
            "[concat('Microsoft.Insights/components/', variables('webSiteName'))]"
          ],
          "tags": {
            "displayName": "appsettings"
          },
          "properties": {
            "environment": "[parameters('environment')]",
            "applicationInsightsInstrumentationKey": "[reference(resourceId('Microsoft.Insights/components', variables('webSiteName')), '2014-04-01').InstrumentationKey]",
            "Authentication:AzureAd:ClientId": "[parameters('clientid')]",
            "Authentication:AzureAd:ClientSecret": "[parameters('clientSecret')]",
            "Authentication:AzureAd:AADInstance": "https://login.microsoftonline.com/",
            "Authentication:AzureAd:CallbackPath": "/signin-oidc",
            "Authentication:AzureAd:TenantId": "72f988bf-86f1-41af-91ab-2d7cd011db47"
          }
        }
Note also the applicationInsightsInstrumentationKey – this will read the key from your environment-based Insights resource and set that up in the config also, ready to be used by your app at runtime.

Creating Continuous Deployment for one environment

This basically follows the steps from the previous post – refer to that for details; here's the procedure:
– Go to your VSTS portal, open your project, go to the Build & Release/Builds page
– Click to add a new Build Definition
– Choose “Empty” as the template type (this is a really simple “Build”)
– Select your repository and branch, and tick the box marked "Continuous Integration". This means every time you change and check in your template or parameters file, a new build will be kicked off automatically
– Click “Add build step” in the Build editor, choose “Copy and Publish Artifacts”, click Add, then click Close
– Select the new step from the left hand pane, then in the right hand build step settings pane, complete the parameters:

  • Copy Root should be the path to your two JSON files (in my case `CSAPReporting/CSAPReporting.AzureResources/`)
  • Contents: `*.json`
  • Artifact Name: “ARM Template”
  • Artifact Type: “Server” – this means it will publish the artifact to VSTS rather than a file share

Name your definition and then save it. All this build is doing is finding the .json files in your repository and then packaging them as an artefact for the release manager to pick up later. Now for the release and the environment. Queue a build and wait for it to finish successfully – this will then allow us to select the artefact and link it from the release management stage.
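Under the hood this "build" is just a file copy; as a local sketch with hypothetical paths:

```shell
# What the "Copy and Publish Artifacts" step effectively does (hypothetical paths).
SRC="CSAPReporting/CSAPReporting.AzureResources"   # Copy Root
DEST="artifacts/ARM Template"                      # Artifact Name (published to VSTS)
mkdir -p "$SRC" "$DEST"
echo '{}' > "$SRC/WebSiteSQLDatabase.json"         # stand-in for the real template file
cp "$SRC"/*.json "$DEST"/
ls "$DEST"
```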

  • Go to the Releases tab.
  • Select add new release definition (the dropdown menu by the + icon).
  • Choose empty template again.
  • In the next tab, choose the build definition you created earlier as the “Source” from the dropdown, and check the Continuous Deployment box – this means every time a new build output is created from the linked build definition, a release will be kicked off and the output will be deployed for you automatically
  • The queue tells VSTS which set of agents to use to carry out the release – leave this as Hosted to have it run in the cloud.
  • Click Create – you will end up with a single environment called Environment 1 and no tasks.
  • Click Add Tasks, click the Add button next to “Azure Resource Group Deployment”, then click close.
  • You will now have some red validation errors and need to fill out the various step settings:
  • Azure Connection Type: “Azure Resource Manager”
  • Azure RM Subscription: If you’ve completed the step above to link your subscription to VSTS, you will see your subscription here, select it
  • Action: “Create or Update Resource Group”
  • Resource Group: Enter the name of the resource group you want to deploy to, relevant to the test environment, eg. “CSAP-test” in my case.
  • Template: Use the “…” to find and select the Template Json File. (mine’s at `$(System.DefaultWorkingDirectory)/App Service Infra/ARM Template/WebSiteSQLDatabase.json`)
  • Template Parameters: Likewise, use the “…” – mine is `$(System.DefaultWorkingDirectory)/App Service Infra/ARM Template/WebSiteSQLDatabase.parameters.json`
  • Override Template Parameters: `-clientSecret $(clientSecret) -clientId $(clientId) -administratorLoginPassword (ConvertTo-SecureString -String '$(sqlPassword)' -AsPlainText -Force)` (I will explain this later)
  • Deployment Mode: “Incremental”

So, those override parameters: each name matches a parameter we exposed in our JSON template earlier. When Resource Manager runs, these settings are all passed in, and will override any default values in the template itself, as well as any values saved in the template parameters file. The `$(<paramname>)` syntax is a placeholder that is substituted at release time by a variable configured in VSTS, hence the next steps…
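For reference, the parameter declarations in the template that these overrides target look something like this (a sketch – note the secrets are declared as securestring so their values are never echoed in logs):

```json
{
  "parameters": {
    "environment": {
      "type": "string",
      "defaultValue": "tst",
      "allowedValues": [ "dev", "tst", "prd" ]
    },
    "clientId": { "type": "string" },
    "clientSecret": { "type": "securestring" },
    "administratorLogin": { "type": "string" },
    "administratorLoginPassword": { "type": "securestring" }
  }
}
```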

Before we can deploy, we need to setup the environment specific variables such as the SQL Password to use, the client secret, etc.

  • Click on “Environment 1” in the environments list, and rename it “Test”
  • Click on the “…” and choose “Configure Variables”
  • Here you can enter the three parameters we’ve overridden in the template.

environmentvariables

Note that in my picture the Secret and Password are just asterisks. Once you type in the password – and by the way, I don't expect you to KNOW what the SQL password is at this point; it's up to you to DECIDE now what you want it to be, as it will be passed to the template and used when creating the SQL database and the connection string in your app – click the padlock button on the right. That tells VSTS this value is a secret: once you click the OK button, no one can see the plain-text version of the variable again, as it's stored encrypted. Click OK to save the variables in the environment, and that environment and release definition is complete.

As an aside, if you have variables that you want to manage in VSTS but that apply to ALL of your environments, use the Variables sub-menu in the release definition tab rather than Configure Variables in the environment definition. These are "global" within the scope of your Release Definition, but are used in exactly the same way. There are also built-in variables, such as $(System.TeamProject), that you can use to further customise your deployment.

Once you’ve saved that environment and release definition, you can create a release manually to give it a test – this should take a few minutes, but you’ll see the deployment happening via the console log (near real-time), and then when it’s complete you can validate the deployment in the Azure portal.

armsuccess

Progress – we now have an end-to-end continuously integrated and deployed test environment. Whenever I change my template and check it in, the Azure resources will be updated to reflect those changes automatically. Now for the application itself.

Application Build

As per the subject of this post, the application I've chosen is a web app built on ASP.NET Core. I've chosen to target .NET Framework 4.6.1, but that's not relevant to the steps you take to build and deploy the application itself. I'll give a little information about the structure and architecture of my web app, though I don't expect this to match everyone's requirements and I'm not going to include the full source code – it's just an example of what a typical app might look like and how you can build and deploy it to your environments using VSTS. For brevity, I've also not included unit tests in the solution or in the build/deployment; I will cover that in a later post. Please don't forget about them though – they are critical for code quality, and for confidence in your ability to make and deploy changes through continuous integration/deployment.

In my solution (called CSAPReporting), I’ve chosen to use EntityFramework Core, and split the data layer from the app, resulting in 4 projects:

  • CSAPReporting.AzureResources – this is the one I created above for the ARM templates.
  • CSAPReporting.Web – my main ASP.NET Core web application, which includes things like Controllers, Views, ViewModels, a wwwroot for static content, and TagHelpers.
  • CSAPReporting.Models – my Data Model (poco classes), Extensions to the Data Model (helper functions or properties that don’t belong in the database), and my common Interfaces which will be used by the Web project and any API projects or Service assemblies I choose to build in future.
  • CSAPReporting.DataAccess – this is my logical data layer and contains my Repository class, the DbContext, Entity Framework Migrations, and a BlobManager service to talk to Azure Storage and issue SAS tokens with limited-time policies.

Digression – getting EF.Core to work against a non-app assembly

This means that if I want to add an API layer later, I can. The only services I have right now are data-related, so they sit in the DataAccess project; but if I created some kind of Calculation or Reporting service, for example, I would put it in its own assembly, add its interface to the Models project, and then consume it from the Web application, or from a WebAPI app if I chose to build one.

I struggled a little with getting some of the above to work nicely. EF.Core tooling didn’t like me splitting the DBContext and Data Model away from the main Web app, and I had to use some preview libraries to get it all working. The project files can be found in the same GitHub repo for reference. With this setup, I can edit my data model classes in the Models folder, then go to the DataAccess folder at the command line and add my migrations, which will get created in the DataAccess project:

dotnet ef migrations add <name>

To do this, I needed to fool dotnet into thinking this project is an application type (see the project.json buildOptions : { emitEntryPoint:"true" }), and create a public IDbContextFactory for my data context so it could be discovered by the ef tooling:
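For reference, the relevant fragment of the DataAccess project.json looks something like this (a sketch – the package versions shown are illustrative of the preview tooling at the time):

```json
{
  "buildOptions": {
    "emitEntryPoint": true
  },
  "dependencies": {
    "Microsoft.EntityFrameworkCore.SqlServer": "1.0.1",
    "Microsoft.EntityFrameworkCore.Design": "1.0.0-preview2-final"
  },
  "tools": {
    "Microsoft.EntityFrameworkCore.Tools": "1.0.0-preview2-final"
  }
}
```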

namespace CSAPReporting.DataAccess
{
  public class TempDbContextFactory : IDbContextFactory<CSAPDataContext>
  {
    public CSAPDataContext Create(DbContextFactoryOptions options)
    {
      var builder = new DbContextOptionsBuilder<CSAPDataContext>();
      builder.UseSqlServer("Server=localhost;Database=csapreporting;Trusted_Connection=True;");
      return new CSAPDataContext(builder.Options);
    }
  }
}

And in my Web app’s startup project I’ve initialised the data context like this to make my app automatically apply all outstanding migrations to the configured database when it first runs. Note this isn’t the same as Automatic Migrations, which will have EF examine your code vs the database at runtime and generate “On-the-fly” migrations for you. I wanted to retain some kind of control 🙂

(ConfigureServices)
      services.AddDbContext<CSAPDataContext>(options => 
        options.UseSqlServer(Configuration.GetConnectionString("csapreporting"), b => b.MigrationsAssembly("CSAPReporting.DataAccess")));

(Configure)
     using (var context = new CSAPDataContext(app.ApplicationServices.GetRequiredService<DbContextOptions<CSAPDataContext>>()))
      {
        context.Database.Migrate();
      }

Digression number 2 – Skipping authentication and authorisation when developing and debugging locally

I like to do some coding on the train to London from time to time, so I work totally disconnected from the web; even when working from an office or home, the authentication process takes some time. So I figured out how to bypass it while keeping the code in place and intact for when it's deployed. There are probably other ways to achieve this, but this works for me.

  • Create a dummy authorization service that just returns TRUE for all requests
using Microsoft.AspNetCore.Authorization;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Security.Claims;

namespace CSAPReporting.Web.Filters
{
  public class DummyAuthorizationService : IAuthorizationService
  {
    // Always authorise; return a completed task rather than declaring async methods with no await
    public Task<bool> AuthorizeAsync(ClaimsPrincipal user, object resource, string policyName)
    {
      return Task.FromResult(true);
    }

    public Task<bool> AuthorizeAsync(ClaimsPrincipal user, object resource, IEnumerable<IAuthorizationRequirement> requirements)
    {
      return Task.FromResult(true);
    }
  }
}

  • In ConfigureServices, add some code to read the hosting environment and, if it's "Development", replace the default AuthorizationService implementation with the dummy one.
      // Swap in the dummy authorisation service when running in Development
      var environmentName = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");

      if (string.Equals(environmentName, "Development", StringComparison.OrdinalIgnoreCase))
      {
        var registrations = services.Where(s => s.ServiceType == typeof(IAuthorizationService)).ToList();
        foreach (var registration in registrations)
          services.Remove(registration);
        services.AddTransient<IAuthorizationService, DummyAuthorizationService>();
      }
  • Bypass the calls to authentication in the pipeline (i.e. Configure()) for the development environment
      if (!env.IsDevelopment())
      {
        app.UseCookieAuthentication();
        app.UseOpenIdConnectAuthentication(new OpenIdConnectOptions
        {
          ClientId = Configuration["Authentication:AzureAd:ClientId"],
          ClientSecret = Configuration["Authentication:AzureAd:ClientSecret"],
          Authority = Configuration["Authentication:AzureAd:AADInstance"] + Configuration["Authentication:AzureAd:TenantId"],
          CallbackPath = Configuration["Authentication:AzureAd:CallbackPath"],
          ResponseType = OpenIdConnectResponseTypes.CodeIdToken
        });
      }

Back to the plot

Now I have my project with some functionality and a database behind it. Locally I'm using a SQL Database instance on localhost, and the Azure Storage emulator for blob and queue storage. The connection strings for those are stored in my UserSecrets file – a handy feature that's come to .NET Core projects: right-click on the main web app project and select "Manage User Secrets". Anything you put in here can be appended to the normal configuration settings (appsettings.json) by including the following lines in your Startup constructor. The user secrets file is stored under your roaming profile folder, so it's unlikely you'll ever check it into source control by accident. Obviously the storage and SQL connection strings in our Azure test environment are already set up and good to go – this is just to let our development environment work locally.

      if (env.IsDevelopment())
        builder.AddUserSecrets();
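The secrets file itself is plain JSON; mine holds the local connection strings, something like this (the values here are illustrative – "csapreporting" matches the connection string name the app reads via GetConnectionString):

```json
{
  "ConnectionStrings": {
    "csapreporting": "Server=localhost;Database=csapreporting;Trusted_Connection=True;",
    "storage": "UseDevelopmentStorage=true"
  }
}
```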

Once the project's building OK (dotnet build), I'm ready to set up the server Build and Release definitions for the project and have continuous integration push my app out to the test environment.

Application Build and Deploy

In these steps we're going to set up a CI build for the solution described above. It will package my web app into a zip file that is published as an artefact to the VSTS release manager, which will then take that package and deploy it to our Azure environment.

App Build definition

The tooling in this area for .NET Core has been improving lately, and I found this process straightforward. The first thing I wanted to do was make sure I don't kick off an environment deployment (as opposed to an application code deployment) every time I check in some code to the same repository. You can approach this either by using separate repos for your infrastructure and application code, or by using the Build Definition Triggers tab to only include specific paths in your repo. To do this, I edited my "App Service Infra" build definition (i.e. the one created earlier to publish the ARM template) to look like this:

infratriggers

Now to create the Application solution build definition.

  • Add New Build Definition
  • Select Empty, click Next
  • Choose the branch and Repo your source code is in, and check the “Continuous integration” box on, click Create.
  • Click on the Add Build step… button, and add the following three steps:
  • .NET Core (PREVIEW) – add this twice
  • Publish Artifact

The .NET Core preview task is pretty simple – it runs the dotnet command using the parameters you provide in the UI.
– edit the first .NET Core build step, set the command to “restore”.
– edit the second .NET Core build step, set the command to "publish", tick ON the Publish Web Projects box, and in the arguments add `-c $(BuildConfiguration) -o $(Build.ArtifactStagingDirectory)` – this tells it which configuration to build and where to put the output
– check the Zip Published projects option on
– Edit the “Publish Artifact” step, set Path to Publish as $(build.artifactstagingdirectory), give the artefact a name (eg. “Webapp”), and type will be Server again.
– finally for this part, set the triggers up to ignore the infrastructure path and just include our app solution subfolder
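As a sanity check before queuing the build, the two .NET Core steps above are roughly equivalent to running the following from the web project folder locally (the output path is illustrative):

```shell
dotnet restore
dotnet publish -c Release -o ./publish-output
```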

apptrigger

And that's it for the build – give it a name, save it, queue a new build and watch it run through. If it worked from your command line in the development environment and you've checked in all the right files, it should work here too, and you'll end up with an artefact containing a zip file. If you want to view the contents of the artefact once a build's complete, go to the build summary page, then click on "Artifacts" and "Explore".

artifactexplore

You should see a single zip file with your web project in it.

App Release Definition

  • in the Releases tab, create a new Release Definition, use the empty template again, and link to the output from the web app build we’ve just created in the previous step, and check on “Continuous Deployment”.
  • add a new task, “Azure App Service Deployment: ARM”, and click close.
  • Rename the Environment 1 to “Test” as per the infrastructure build.
  • Click on the task to select it and configure it:
  • AzureRM Subscription: (choose your subscription)
  • App Service Name: (this should be populated from your subscription, and include the one created in your template for this environment), eg. `”CSAPWeb-tst”`
  • Deploy to Slot: Leave this blank for now
  • Virtual Application: Leave this blank
  • Package or Folder: `$(System.DefaultWorkingDirectory)/WebApp CI/webapp/CSAPReporting.Web.zip` in my case – use the “…” ellipsis to select the Zip file output from your linked build artefact.
  • Additional Deployment Options: Turn ON Publish using Web Deploy and Take App Offline, leave the others OFF

Name and save this release definition, and you’re ready to go – test it by creating a new release using the latest build, and you should end up with a fully working deployed web app in your test environment.

Creating the Production environment

We’ve now got a full end-to-end build/release process for the environment and application code, but only for one environment. The benefit of using ARM templates and VSTS Release manager now becomes apparent. We can create a production environment pretty quickly – the aim is to use the exact same package (build output) that we’ve been testing, so no change is needed to the Build definitions. We will clone the “environment definition” for both Release definitions, and update the settings to reflect our production scenario. If your production environment is to be hosted in a separate Azure subscription (recommended but not essential – for easier segregation of security rights and controls), you will need to configure a service connection to that subscription as well before we start.

  • Open the Infrastructure release definition and click the Edit link

editinfrarelease

  • Click on the “…” in the “Test” environment definition box

cloneenv

  • Since we’re building a production environment here, we want to setup an approval process to control who can authorise the release. Select “Specific Users”, select the users you want to have approval rights. You can later setup how approvals are completed – ie. anyone, everyone, or everyone in a set order.
  • If you check ON the automatic trigger deployment for this release, then every time a release succeeds into Test, a release will be queued to production as well, and if the approvers accept, then it will also be deployed. Typically, companies will only trigger production deployments manually, or on successful deployment of a Pre-Release or staging environment, but this is all down to your own internal processes and controls.
  • Click create, and you will now see “Copy of Test” as an environment. Click on the name and change it to “Prod”, and now set the Deployment task step settings for production:
  • Azure RM Subscription: (choose your production connection here)
  • Resource Group: CSAP-prod (for example) – make sure you change this from Test though
  • Add the environment name setting to the "Override Template Parameters" box: `-environment 'prd'`. We didn't need this in the test environment because the parameter defaults to "tst"; if we don't change it here, the template will attempt to name the resources the same as we named them for test, and end up failing. Note the environment names "tst", "prd" and "dev" are just what I decided to call them in the template – feel free to change or add your own.
  • leave the other settings as per the Test environment
  • Click on the "…" in the Prod environment definition box, and input the passwords and secrets you want to use in your production environment. Note: even if you want to use the same passwords, if you marked them as secrets in the previous environment they will appear here as "*****" but actually be BLANK. You need to re-complete these fields, otherwise your deployment will fail with a missing String parameter error.
  • Close that dialog, Save the release definition
  • Now when you create a new release (not a release definition), you will notice some options around how deployments to Prod are to be triggered. You should also notice that the releases summary page now shows two columns as opposed to one:

newenv

Each box represents an environment:
– Grey means it's not yet been deployed to
– Stacked boxes indicate the release is queued
– Blue indicates a deployment is in progress
– Green indicates a successful deployment to that environment
– Red shows a release failed for some reason

After a short while, if you've enabled approvals, you should see something like this:

approvals

The picture shows that the first environment (Test) was successful, and we're now awaiting approval before moving to Prod. Open the release by clicking the "…" and selecting Open.

approvaldetail

Notice the message asking me to approve or reject the deployment. When you click this, you will be asked whether you want to reassign the approval to someone else, defer the deployment to a specified date/time, or just approve it and go ahead. Usually you will want to configure emails to be sent to the approvers; you can do this via the environment definition's "Edit approvals" menu. There are two options at the bottom of that dialog – one to send an email notification to the approver, and another to prevent the user who created the release from approving it themselves. Once approved, the release will be queued and executed, and you should see a new, fully populated resource group appear in your Azure subscription.

successrelease

Now that's all been done, the Azure environment is ready to deploy to, so we can modify the Release Definition for the application accordingly.

  • Edit the web app Release Definition
  • Clone the Test environment and rename the new one "Prod"
  • Change the Subscription and App Service Name (-prd should now appear in the dropdown list of options)
  • Save the release definition

All done – setting up additional environments is that easy.

Further steps (Homework)

  • Don't forget to add unit tests to your solution, and add a build and release step to your application deployment process to execute those tests. If they fail, the build or release will also fail, giving you confidence in CI/CD.
  • Consider adding a deployment slot to production for staging. This allows you to deploy to the staging slot, start that instance, and automatically carry out a Virtual IP swap with the production slot – ensuring no downtime for the web app during deployment.
  • Consider adding performance and availability tests to your web app endpoints
  • Set up a CDN profile to point to your storage account's static assets as the origin; change your web pages to point to the CDN endpoints instead of the storage account, and benefit from CDN performance and caching
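As a sketch of the staging-slot idea, the swap can also be driven from the Azure CLI once a slot named "staging" exists (the resource names here are the hypothetical ones used throughout this post):

```shell
az webapp deployment slot swap \
  --resource-group CSAP-prod \
  --name CSAPWeb-prd \
  --slot staging \
  --target-slot production
```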