Monday, 8 April 2019

Run PowerShell from Terraform

Introduction

As great as Terraform is, you will find times when you need to fall back to running other scripts or using other tools. One such tool is PowerShell. We'll do a quick demo on how to run PowerShell scripts from Terraform configurations.

We've got three options and we'll run through each one:
  1. Run PowerShell from Terraform the first time only
  2. Run PowerShell from Terraform every time
  3. Run PowerShell from Terraform on a defined trigger
To run PowerShell, we'll be using the null_resource in Terraform. You can find out more about it here. Using the null_resource, we'll be calling the local-exec provisioner, which runs the PowerShell script on the machine running the Terraform configuration. More info on that is here.

All the code for this blog post can be found on GitHub here.

Run PowerShell from Terraform the first time only

In this example, we want to run a PowerShell script only the first time the Terraform configuration runs.

We have a pretty simple PowerShell script which I'll put in a subfolder called helpers.
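
Something like this is all we need (the file name and parameter names here are just examples):

# helpers\WriteGreeting.ps1
param (
    [string] $FirstName,
    [string] $LastName
)

# Write a greeting out to the console
Write-Host "Hello $FirstName $LastName from PowerShell!"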


Our configuration is pretty simple for this one. Basically, we're calling the null_resource and the local-exec provisioner. We then pass in the path to our PowerShell script and its parameters. Note that you need to dot source your PowerShell script by putting a . before it, and you also need to surround the path in single quotes in case it includes a space.
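
Putting that together, the configuration looks something like this (using the example script above):

resource "null_resource" "run_powershell" {
  provisioner "local-exec" {
    # Dot source the script and pass in its parameters
    command     = ". '${path.module}/helpers/WriteGreeting.ps1' -FirstName John -LastName Smith"
    interpreter = ["PowerShell", "-Command"]
  }
}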


We can then run terraform init and terraform apply (no, you don't need to run terraform plan first) and we can see from the output that Terraform ran the PowerShell script.




If we then run terraform apply again, we find that it doesn't run the script a second time.


Run PowerShell from Terraform every time

Note that with this approach we're running the same script every time Terraform runs. Terraform itself is idempotent, meaning it makes changes on the first run and subsequent runs make no changes unless something actually needs to change. Make sure your PowerShell scripts are also idempotent if you're using this approach.

The PowerShell script is the same as before and sits in the helpers folder. It's not idempotent, but as it's just writing information to the console, that's not a problem here.

As for the Terraform configuration, we're now adding a trigger which causes the script to run each time the trigger value changes. To do this, we're assigning a new UUID as the trigger value on every run:
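
A sketch of the configuration (script path and parameters as before):

resource "null_resource" "run_powershell" {
  triggers = {
    # uuid() returns a new value on every run, so the null_resource is
    # replaced - and the provisioner re-run - on every apply
    run_always = "${uuid()}"
  }

  provisioner "local-exec" {
    command     = ". '${path.module}/helpers/WriteGreeting.ps1' -FirstName John -LastName Smith"
    interpreter = ["PowerShell", "-Command"]
  }
}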


Terraform now runs the PowerShell script every time it runs.


Run PowerShell from Terraform on a defined trigger

Rather than using a UUID as the trigger value, we can configure Terraform to run the PowerShell script only when a particular value changes. In this example, we'll only run the PowerShell script if the value of TriggerValue changes. This can be any output or variable in Terraform - e.g. you could specify that IP restrictions are only set when the list of IP restrictions changes.

Again, our PowerShell script is the same as before in the helpers folder.

To demonstrate this, we can add a variable TriggerValue and specify this in a variables.tf file. Terraform will expect us to pass this variable on the command line or via a TFVARS file.
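
The variable definition itself is tiny (the description is optional):

variable "TriggerValue" {
  description = "The PowerShell script re-runs whenever this value changes"
}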


Our Terraform configuration now needs to look like this. See that the triggers option is set to our TriggerValue variable.
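
Something along these lines:

resource "null_resource" "run_powershell" {
  triggers = {
    # The script only re-runs when TriggerValue changes
    trigger_value = "${var.TriggerValue}"
  }

  provisioner "local-exec" {
    command     = ". '${path.module}/helpers/WriteGreeting.ps1' -FirstName John -LastName Smith"
    interpreter = ["PowerShell", "-Command"]
  }
}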


When we run our configuration the first time using terraform apply, we need to specify the value for the variable (terraform apply -var TriggerValue=100) and we see that the script runs.



If we run it again, we can see that Terraform has nothing to do and the script doesn't run.


If we change the value of TriggerValue and run it again, Terraform runs our PowerShell script once more.




Conclusion

In this article, we went through how you can run PowerShell scripts from Terraform on the first run only, on every run or on a defined trigger.

Monday, 28 January 2019

Terraform - Failed to load backend: Initialization required

I came across this issue the other day and thought I'd share the solution.

Issue

When running terraform plan, you may get the error below:

Backend reinitialization required. Please run "terraform init".
Reason: Initial configuration of the requested backend "azurerm"

The "backend" is the interface that Terraform uses to store state,
perform operations, etc. If this message is showing up, it means that the
Terraform configuration you're using is using a custom configuration for
the Terraform backend.

Changes to backend configurations require reinitialization. This allows
Terraform to setup the new configuration, copy existing state, etc. This is
only done during "terraform init". Please run that command now then try again.

If the change reason above is incorrect, please verify your configuration
hasn't changed and try again. At this point, no changes to your existing
configuration or state have been made.

Failed to load backend: Initialization required. Please see the error message above.


Cause

I found that the cause of the problem was that I had used the backend block in my terraform configuration like below:
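
terraform {
  # Partial configuration - the actual settings are passed in
  # via -backend-config arguments at init time
  backend "azurerm" {}
}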


This tells terraform to expect some backend configuration; however, I was just testing and running terraform init without specifying a backend, like below:

terraform init -input=false -backend=false

Solution

The solution is pretty simple. You could remove the backend settings from your terraform block, but that's not the way to go if you want to use remote state. The better option is to specify the backend settings when you run terraform init, like this:

terraform init \
-backend-config="storage_account_name=MyStorageAccount" \
-backend-config="container_name=MyStorageContainer" \
-backend-config="key=Terraform.tfstate" \
-backend-config="access_key=MyStorageAccountAccessKey"

You should now be able to run terraform plan.

Monday, 21 January 2019

Deploy Azure Durable Functions with Terraform and Azure Pipelines - Part 1

Introduction


I recently gave a talk on Terraform and Azure Pipelines at the UK Cloud Infrastructure User Group and a number of people asked if I could write up a blog on Terraform and Azure Pipelines. We're also testing out Azure Durable Functions, so I figured I'd write about deploying .NET Core Azure Durable Functions using Terraform and Azure Pipelines.


Azure Durable Functions


To get more information about Azure Durable Functions, see here.

This series is more about deploying the Azure Durable Function than about the durable function .NET Core app itself, so we won't go into much detail on the app. What I can tell you is that this durable function is triggered by an HTTP request, accepts a name value and responds with "Hello <name>!".

You can find the code on GitHub here. I’ll use this repo to deploy my durable function into Azure.


Terraform Configuration


Terraform is a great alternative to ARM Templates. Compared to JSON, HashiCorp Configuration Language (HCL) is much simpler and pretty much human-readable. You can find out more about Terraform by going to https://learn.hashicorp.com/terraform or my Terraform presentation here.

There is a Terraform configuration in the same GitHub repo as the Azure Durable Function and we’ll use this configuration to deploy a Function App to Azure which will use the Consumption Plan. The Terraform configuration deploys a new resource group, app service plan, storage account and function app (V2).
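
In outline, it looks something like this (the resource and variable names here are illustrative - the exact configuration is in the repo):

# Sketch of the Terraform configuration (azurerm provider 1.x era syntax)
resource "azurerm_resource_group" "rg" {
  name     = "${var.resource_group_name}"
  location = "${var.location}"
}

resource "azurerm_storage_account" "storage" {
  name                     = "${var.storage_account_name}"
  resource_group_name      = "${azurerm_resource_group.rg.name}"
  location                 = "${azurerm_resource_group.rg.location}"
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

# Consumption Plan
resource "azurerm_app_service_plan" "plan" {
  name                = "${var.app_service_plan_name}"
  resource_group_name = "${azurerm_resource_group.rg.name}"
  location            = "${azurerm_resource_group.rg.location}"
  kind                = "FunctionApp"

  sku {
    tier = "Dynamic"
    size = "Y1"
  }
}

# Function App running the V2 runtime
resource "azurerm_function_app" "function" {
  name                      = "${var.function_app_name}"
  resource_group_name       = "${azurerm_resource_group.rg.name}"
  location                  = "${azurerm_resource_group.rg.location}"
  app_service_plan_id       = "${azurerm_app_service_plan.plan.id}"
  storage_connection_string = "${azurerm_storage_account.storage.primary_connection_string}"
  version                   = "~2"
}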




Azure DevOps build definition as YAML


Azure DevOps has a cool feature where you can write out your Azure Pipeline build definition as code using YAML. This has the benefit that the build configuration is now stored in source control with the application code itself so you get all the benefits of versioning and peer review through pull requests and you can keep your application code and the build definition in sync. You can read more about pipelines as YAML here.

You can find the full YAML pipeline file in the repo here but we'll go through each part in turn. To get started, the first part of our YAML file needs to look like this:
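
# First part of azure-pipelines.yml (a sketch - see the repo for the full file)
resources:
- repo: self
  clean: true

pool:
  vmImage: vs2017-win2016

variables:
  buildConfiguration: 'Release'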


The resources section specifies that the code we are running tasks against is in the same repo that the YAML file is stored in (self). We will also clean the build directory before starting our build.

The pool section specifies that we'll use the hosted Visual Studio 2017 on Windows Server 2016 build agent.

In the variables section, we're specifying that the variable buildConfiguration is set to Release. We'll use this variable when running our dotnet CLI commands.


Build a .NET Core app using Azure DevOps Pipeline as YAML


In the repo, there are a few folders, as below.
  • AzureDurableFunctionsDemo1: Contains the .NET Core project that includes the Azure Durable Functions
  • AzureDurableFunctionsDemo1.Tests: Contains a .NET Core project that includes tests for the methods in the Azure Durable Functions Demo 1 project
  • Deployment: Contains a terraform folder which includes the terraform configuration

In order to build and test our app, we need to do as we would for any .NET Core application - i.e. build, test and publish. Using the Azure DevOps pipeline as YAML, this looks like below:
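
In sketch form (the exact task inputs are in the repo's YAML file):

steps:
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    command: build
    projects: '**/*.csproj'
    arguments: '--configuration $(buildConfiguration)'

- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*Test*/*.csproj'
    arguments: '--configuration $(buildConfiguration)'
    publishTestResults: true

- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    projects: '**/AzureDurableFunctionsDemo1.csproj'
    arguments: '--configuration $(buildConfiguration) --output $(build.ArtifactStagingDirectory)'
    publishWebProjects: false
    zipAfterPublish: true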


Notice the steps keyword here - we only need to specify this before the first task in the build pipeline. 

  • Build: This task uses the .NET Core CLI Azure DevOps V2 task template and specifies that we want to build all projects using the configuration parameter, which is set to the buildConfiguration variable, which itself is set to Release.
  • Test: This task uses the same task template but only runs tests in projects which include Test in the project folder name. It also publishes the test results to Azure DevOps as the publishTestResults parameter is set to true.
  • Publish: Again using the same template, this task publishes only the AzureDurableFunctionsDemo1 project and it publishes it to the artifact staging directory on the build agent, which is specified by the built-in Azure DevOps variable build.ArtifactStagingDirectory. We don't have any web projects to publish so publishWebProjects is set to false; however, we do need to publish a zip file for deployment to Azure Functions so we set zipAfterPublish to true.

That concludes the build and test for the .NET Core app, so let's move on to building and testing our terraform configuration.



Build a terraform configuration using Azure DevOps pipeline as YAML


The YAML for the terraform build and test looks like below:
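
A sketch of these tasks (the dummy variable names passed to terraform validate are illustrative):

- script: choco install terraform -y
  displayName: Terraform Install

- script: terraform init -input=false -backend=false
  displayName: Terraform Init
  workingDirectory: deployment/terraform

- script: terraform validate -var-file=envDev.tfvars -var "location=test" -var "resource_group_name=test"
  displayName: Terraform Validate (Dev)
  workingDirectory: deployment/terraform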


  • Terraform Install: As we're using the Visual Studio 2017 hosted build agent, which doesn't have terraform installed on it, the first task is a Terraform Install task which uses Chocolatey to install terraform: choco install terraform. This task uses the Command Line V2 task template in Azure DevOps.
  • Terraform Init: With terraform, we don't need to build/compile as we do for a .NET app, but we can run a basic form of unit testing using terraform validate. Before we run this, we need to initialize terraform by running terraform init. We're setting input to false because this is an unattended command, so we don't want terraform to prompt us for any missing variables - we'd want it to just fail instead. We'd normally use a backend configuration to specify the location of the remote state file when running the configuration; however, as we're just validating our terraform in this case, we set the backend to false.
  • Terraform Validate (Dev): For this task, we need to specify all the variables that terraform would expect when we deploy the infrastructure. These don't need to be the exact values but they do need to be the correct type of variable (string, list or map), so as we just have strings, we can set them all to test. You'll see that we're specifying the envDev.tfvars file so that terraform validate can validate the variables which are set in this file. This task has no output unless there's a problem with the validation, in which case it returns a non-zero exit code which fails the step in the pipeline.

In both the Terraform Init and Terraform Validate (Dev) tasks, you'll see that we're specifying the terraform configuration path as deployment\terraform, which is the relative path to our terraform configuration.



Publish the .NET Core app and terraform configuration


We've now built and tested both our application and terraform code and are ready for the release stage however we first need to output some artifacts to pass through to the release. Below are the tasks to publish these artifacts:
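
A sketch of the two tasks:

- task: PublishBuildArtifacts@1
  displayName: Publish Artifact (app)
  inputs:
    PathtoPublish: $(build.ArtifactStagingDirectory)
    ArtifactName: app

- task: PublishBuildArtifacts@1
  displayName: Publish Artifact (terraform)
  inputs:
    PathtoPublish: deployment/terraform
    ArtifactName: terraform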


Both tasks use the Publish Build Artifacts Azure DevOps task template and simply specify the path to publish and the name for the artifact. When run, this outputs two artifacts - one called app and one called terraform.


Save the YAML pipeline


The last step is to save the YAML pipeline as azure-pipelines.yml and store this in the root of the repo.


Conclusion


In this post, we've gone through the YAML build definition to build and test a .NET Core Azure Durable Function and its terraform configuration. In part 2, we'll go through how to create the build in Azure DevOps.

Thursday, 27 December 2018

Azure Function - The listener for function was unable to start

Issue

When debugging one of my Azure Functions in Visual Studio 2017, I got the error below:

“The listener for function 'MyFunction' was unable to start. Microsoft.WindowsAzure.Storage: No connection could be made because the target machine actively refused it. System.Net.Http: No connection could be made because the target machine actively refused it. System.Private.CoreLib: No connection could be made because the target machine actively refused it.”


After a Google search, I didn’t find much on the error message. I figured I’d write up the solution for the next person who has this problem.

Solution

I figured that this may be a firewall issue or a problem with one of the services that wasn’t started so I tried to eliminate those causes. I found that all services were started and that even with the firewall disabled, I still had the issue. I resorted to the classic turn it off and on again and that still didn’t fix it.

Eventually, it occurred to me to check that the Azure Storage Emulator was started and I found that it wasn’t. To check if it’s started, run the command below:
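
Assuming the default install location, that's:

"C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" status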


If you see IsRunning as False then you're going to need to initialize and start the Azure Storage Emulator by running the commands below:
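
Again assuming the default install location:

"C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" init
"C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" start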

We can then check the status again and we can see IsRunning is now True.


You should now be good to go and can now debug your functions.

Thursday, 20 December 2018

Publish Azure Function with Visual Studio 2017

Introduction

In my last post, I demonstrated how to create a new .NET Core Azure Function. In this post, I’ll demo how to publish your Azure Function using Visual Studio.

This post assumes that you’ve created your Azure Function already. If you need to go back over how to do that, click here.

Publish Azure Function with Visual Studio 2017

1) Right click your project and click Publish


2) If you have an existing Azure Function App already deployed in Azure then you can choose Select Existing; however, I'm creating one from scratch so I'll choose Create New.


3) Log into Azure and fill out all the details then click Create


You may be prompted to upgrade the version of the Azure Function App, as we need V2, not V1, for .NET Core. If so, click Yes.


4) After a few seconds, Visual Studio completes the deployment and you can now see your new Azure Function in the Azure Portal


5) On the overview tab, copy the URL into PostMan


6) Click on Manage and copy down your function key. You can create a new key if you want.


7) In PostMan, add the key and the path to the HttpTrigger function as below

https://httptrigger-markgossa.azurewebsites.net/api/HttpTrigger?code=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

You can now click Send and see your output


There you have it! .NET Core Azure Function published to Azure in very little time. You can find the project code on GitHub here.

Thursday, 29 November 2018

Create .NET Core Azure Function with Visual Studio 2017

Introduction

With all the talk about Serverless and Azure Functions, it's a good time to learn how to use them if you don't already know how. In this post, we'll do a walkthrough of how to create a simple .NET Core 2.1 Azure Function with Visual Studio 2017.

Prerequisites

First of all, let’s set up our environment.

1) Install Visual Studio 2017 (you can download the Visual Studio Community Edition for free here).

2) Install the .NET desktop development workload for Visual Studio.


3) Install the Universal Windows Platform development workload.


4) Install the ASP.NET and web development workload.


5) Install the Azure development workload.


6) Install the .NET Core cross-platform development workload.


7) Install the Azure Functions Core Tools. You can find more info here.
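
One way to do this (assuming you have Node.js and npm installed) is via npm:

npm install -g azure-functions-core-tools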

8) Install PostMan. This is a free tool for making HTTP requests so it’s great for testing out APIs. You can download it here.

Create a new Function App V2 project with Visual Studio 2017:

1) Create a new project in Visual Studio (File > New > Project)


2) Select Cloud > Azure Functions then select a folder and click OK.


3) On the new window that pops up, make sure you click the drop down and select Azure Functions v2 (.NET Core) and Http Trigger.


Click OK when done.

4) Let’s add some code. Rename Function1.cs to HttpTrigger.cs and add the contents below.

This method simply takes an HttpRequest input, extracts the two strings FirstName and LastName and outputs a greeting message.
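
A sketch of the finished HttpTrigger.cs (the exact code is in the GitHub repo; this version is illustrative):

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class HttpTrigger
{
    [FunctionName("HttpTrigger")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("HttpTrigger function received a request.");

        // Read the request body and pull out FirstName and LastName
        var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        string firstName = data?.FirstName;
        string lastName = data?.LastName;

        // Respond with a greeting message
        return new OkObjectResult($"Hello {firstName} {lastName}!");
    }
}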

Debug your Azure Function in Visual Studio

1) At the time of writing, there’s an issue with Visual Studio debugging for .NET Core Azure Functions V2. The workaround for this is here. Go ahead and go through this article then come back to continue at step 2.

2) Hit F5 to start debugging your function and the Azure Functions host console starts up.


In the console output, we can see our function is running and the HttpTrigger is listed in green.

3) Let's open up PostMan and create a new request. You'll be prompted for a request name and collection.


4) Change the request to POST


5) Go to Body, select raw and add the JSON content below
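
Any names will do - for example:

{
    "FirstName": "John",
    "LastName": "Smith"
}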


6) In the console window, copy the HttpTrigger URL into PostMan.


7) Click Send in PostMan

You should get the greeting message back in PostMan.


And you should see a request come through in the Azure Function console window


So, there you have it. Your first Azure Function running .NET Core 2.1. Hit Shift+F5 to stop debugging. You can find the project code on GitHub here.

Monday, 26 November 2018

Debug .NET Core 2.1 Azure Function V2 with Visual Studio 2017

Issue

When you create a new .NET Core 2.1 Azure Function in Visual Studio and then try to debug it, you see the console pop up and disappear, then you get this error:

A fatal error has occurred and debugging needs to be terminated. For more details, please see the Microsoft Help and Support web site. HRESULT=0x8000ffff. ErrorCode=0x0.


Workaround

I'm sure Microsoft will resolve this soon, but to get around it you need to configure the project debug properties so that Visual Studio runs an executable you specify, giving it the path of func.dll. See the steps below.

1) Install the Azure Functions Core Tools. You can find more info here.

2) Right click your project, click on Properties and go to the Debug tab

3) Set the Launch type to Executable 

4) Set Executable to

5) Set the Application arguments

Your settings should look something like this when done.
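
Assuming the Core Tools were installed via npm, the values are along these lines:

Executable:            C:\Program Files\dotnet\dotnet.exe
Application arguments: %APPDATA%\npm\node_modules\azure-functions-core-tools\bin\func.dll host start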


Let's go ahead and try debugging our function and we can see it's working now.


Happy coding!

Sunday, 18 November 2018

Azure Policy - Deny inbound RDP from the internet

Introduction

In this article, I'll do a quick run through of Azure Policy and what you can do with it. Given that we can now deploy infrastructure in Azure quickly and can give multiple teams access, the question is how we control what gets deployed. Specifically, we'll look at how to prevent users opening up inbound RDP from the internet to their Windows VMs.

What is Azure Policy

Azure Policy is a new Azure feature where you can assign policies to your Azure subscriptions or management groups (groups of Azure subscriptions). Using Azure Policy, you can specify which Azure resources should be denied, which should be audited and which should be automatically remediated by deploying an additional ARM template you specify. For example, you can block all storage accounts that don't use encryption.

There are some built in policies however you can create your own using JSON. There are different parts to the JSON policy as code file:

  • Policy definitions: These are individual policies that will be enforced, such as Allowed Resource Types (sets which resources can be deployed) or Allowed Virtual Machine SKUs (sets which VM SKUs can be deployed).
  • Initiative definitions: These are groups of policies that are aimed at achieving a larger goal. For example, you could have an initiative for reducing costs containing one policy which prevents users deploying large virtual machines and another which prevents them deploying databases with high DTUs. You can then assign the initiative definition to a subscription or management group.

Create Network Security Group for testing

For starters, I’ll go ahead and create a new Network Security Group which allows TCP port 3389 from the internet. For more information on Network Security Groups, see here.
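
If you'd rather script this than click through the portal, Azure CLI commands along these lines create an equivalent NSG and rule (the resource group and names are illustrative):

az network nsg create --resource-group MyResourceGroup --name MyNsg
az network nsg rule create --resource-group MyResourceGroup --nsg-name MyNsg --name AllowRdpInbound --priority 100 --direction Inbound --access Allow --protocol Tcp --source-address-prefixes Internet --destination-port-ranges 3389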



Create Azure Policy Definition to deny inbound RDP

Now that we have created our Network Security Group which we want to block, we will go ahead and create an Azure Policy Definition.

1) Log into your Azure Portal and search for Policy.


2) Here you see the Overview pane with a summary of your compliance status. There are no assigned policies so we can see that we’re 100% compliant.


3) Create new policy code

The policy is written in JSON and includes a number of fields:

  • displayName - The name that will appear in the Azure Portal
  • description - The description that will appear in the Azure Portal
  • mode - If set to all then the policy applies to all resource types. If set to indexed then the policy applies only to resource types that support tags and location
  • parameters - Here we can set the parameters for our policy. Rather than create a policy for each inbound port you want to block, you can create a single policy which takes a port parameter - see the parameters section in the full policy below
  • if...then - This is the policy condition and action. It works like most if statements - i.e. if the resource meets certain criteria then an action will be taken. The action can be deny, audit or other options. There's more information here.

The full JSON content looks something like this (a sketch - the field aliases target NSG security rules, and the displayName/description are illustrative):
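
{
    "displayName": "Deny inbound traffic from the internet on a given port",
    "description": "Denies any NSG rule that allows inbound traffic from the internet on the specified port",
    "mode": "all",
    "parameters": {
        "port": {
            "type": "String",
            "metadata": {
                "displayName": "Port",
                "description": "The inbound destination port to deny"
            }
        }
    },
    "policyRule": {
        "if": {
            "allOf": [
                {
                    "field": "type",
                    "equals": "Microsoft.Network/networkSecurityGroups/securityRules"
                },
                {
                    "field": "Microsoft.Network/networkSecurityGroups/securityRules/direction",
                    "equals": "Inbound"
                },
                {
                    "field": "Microsoft.Network/networkSecurityGroups/securityRules/access",
                    "equals": "Allow"
                },
                {
                    "field": "Microsoft.Network/networkSecurityGroups/securityRules/destinationPortRange",
                    "equals": "[parameters('port')]"
                },
                {
                    "field": "Microsoft.Network/networkSecurityGroups/securityRules/sourceAddressPrefix",
                    "in": [ "*", "Internet" ]
                }
            ]
        },
        "then": {
            "effect": "deny"
        }
    }
}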


4) Click on Definitions and add a new Policy Definition, add the JSON content to it and click Save.


5) Assign the policy. Click the policy definition you just created and then click on Assign.


6) Select the subscription or management group you want to assign the policy to and then set the parameters. In this case, we want to block inbound RDP traffic from the internet so you'd need to specify 3389 in the parameters section at the bottom. Click Assign when done.



Testing Azure Policy

Let's test this out. We need to wait a little while for the policy to apply, after which we should see that the policy is not compliant and we can click through to find the offending resource.


If you try to create a new NSG rule which allows inbound port 3389 from the internet, it is denied by policy and you get an error.


You also get blocked if you try to deploy using PowerShell, terraform, the REST API or other methods, as they all go through Azure Resource Manager.

Conclusion

In this article, we went through how you can use Azure Policy to deny the creation of any NSG rule that allows inbound traffic from the internet on specified ports. This is one step towards achieving good Azure governance.