Monday, 15 June 2020

Azure Service Bus First In, First Out (FIFO)


Azure Service Bus gives you some great features, but to get guaranteed FIFO (First-In, First-Out, i.e. ordered delivery) there was a lot of documentation to read through, so I thought I'd show a quick example using a Topic and Subscription.

Service Bus Message Sessions

To ensure FIFO, Service Bus requires that a Session is used, which is fairly simple: the sender needs to set the SessionId property on each Message it sends.

Message sessions must be enabled on the entity receiving the messages; we'll do this when we create the subscription:
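The original snippet isn't reproduced here, but a minimal sketch using the Microsoft.Azure.ServiceBus SDK's ManagementClient might look like this (the topic and subscription names are placeholders):

```csharp
using Microsoft.Azure.ServiceBus.Management;

var managementClient = new ManagementClient(connectionString);

var subscriptionDescription = new SubscriptionDescription("mytopic", "mysubscription")
{
    // This is the setting that enables message sessions on the subscription
    RequiresSession = true
};

await managementClient.CreateSubscriptionAsync(subscriptionDescription);
```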

The method below initialises a TopicClient using a connection string and topic name. It then creates a list of messages to send to the Service Bus topic, setting the same SessionId on each one. If you don't use the same SessionId, you'll find messages are received in the wrong order.
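The sending code isn't shown here; a sketch using the Microsoft.Azure.ServiceBus SDK (the session id value and message count are my own placeholders) might look like:

```csharp
using System.Collections.Generic;
using System.Text;
using Microsoft.Azure.ServiceBus;

var topicClient = new TopicClient(connectionString, topicName);

var messages = new List<Message>();
for (var i = 1; i <= 10; i++)
{
    messages.Add(new Message(Encoding.UTF8.GetBytes($"Message {i}"))
    {
        // The same SessionId on every message is what guarantees ordered delivery
        SessionId = "MySession"
    });
}

await topicClient.SendAsync(messages);
await topicClient.CloseAsync();
```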

The receiver initialises a SubscriptionClient and then calls the RegisterSessionHandler() method, which registers a message handler and takes a SessionHandlerOptions parameter. The SessionHandlerOptions object requires an exception handler, which is called if an exception is thrown while handling a message.
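A sketch of that receiver setup (the option values here are my assumptions, not from the original post):

```csharp
using Microsoft.Azure.ServiceBus;

var subscriptionClient = new SubscriptionClient(connectionString, topicName, subscriptionName);

var sessionHandlerOptions = new SessionHandlerOptions(ProcessMessageException)
{
    // One session at a time preserves ordering within the session
    MaxConcurrentSessions = 1,
    // We'll complete each message explicitly in the handler
    AutoComplete = false
};

subscriptionClient.RegisterSessionHandler(ProcessMessageAsync, sessionHandlerOptions);
```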

This handler references the ProcessMessageAsync() method which takes an IMessageSession, Message and CancellationToken:
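A sketch of that handler (the console output is illustrative):

```csharp
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

private static async Task ProcessMessageAsync(IMessageSession session, Message message, CancellationToken cancellationToken)
{
    Console.WriteLine($"Received: {Encoding.UTF8.GetString(message.Body)} (SessionId: {message.SessionId})");

    // Complete the message so it isn't delivered again
    await session.CompleteAsync(message.SystemProperties.LockToken);
}
```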

The ProcessMessageException() method looks like this:
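Something along these lines, logging the exception and returning a completed task:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

private static Task ProcessMessageException(ExceptionReceivedEventArgs args)
{
    Console.WriteLine($"Exception handling message: {args.Exception.Message}");
    return Task.CompletedTask;
}
```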

That's all you need to do to get this working. Let's test it out.


To test it out, I set up a simple console app which sends the messages to the topic, waits for them to be processed and then checks to make sure all received messages are in the correct order. See below:

Monday, 8 April 2019

Run PowerShell from Terraform


As great as Terraform is, you will find times when you need to fall back to running other scripts or using other tools. One such tool is PowerShell. We'll do a quick demo on how to run PowerShell scripts from Terraform configurations.

We've got three options and we'll run through each one:
  1. Run PowerShell from Terraform the first time only
  2. Run PowerShell from Terraform every time
  3. Run PowerShell from Terraform on a trigger
To run PowerShell, we'll be using the null_resource in Terraform. You can find out more about it here. Using the null_resource, we'll be calling the local-exec provisioner, which runs the PowerShell script on the machine running the Terraform configuration. More info on that is here.

All the code for this blog post can be found on GitHub here.

Run PowerShell from Terraform the first time only

In this example, we want to run a PowerShell script only the first time the Terraform configuration runs.

We have a pretty simple PowerShell script which I'll put in a sub folder called helpers.
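The script itself isn't shown here; a minimal stand-in with a couple of parameters (the file name and parameter names are my own) might look like:

```powershell
# helpers/TestPowerShell.ps1
param (
    [string]$Username,
    [string]$Hostname
)

Write-Host "Hello $Username from $Hostname"
```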

Our configuration is pretty simple for this one. Basically, we're using the null_resource and the local-exec provisioner. We then pass in the path to our PowerShell script and its parameters. Note that you need to dot source your PowerShell script by putting a . before it, and you also need to surround the path in quotes in case it includes a space.
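A sketch of the configuration (the resource name, script name and parameters are placeholders of mine; this uses the Terraform 0.11-era interpolation syntax current at the time):

```hcl
resource "null_resource" "run_powershell" {
  provisioner "local-exec" {
    # Dot source the script, with the path quoted in case it contains spaces
    command     = ". '${path.module}/helpers/TestPowerShell.ps1' -Username World -Hostname Terraform"
    interpreter = ["PowerShell", "-Command"]
  }
}
```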

We can then run terraform init and terraform apply, and in the output below we can see that Terraform ran the PowerShell script (no, you don't need to run terraform plan first).

If we then run terraform apply again, we find that it doesn't run the script:

Run PowerShell from Terraform every time

In this example, we'll run the same script every time Terraform runs. Terraform is idempotent: it only makes changes on the first run, and subsequent runs make no changes unless something needs to change. Make sure your PowerShell scripts are also idempotent if you're using this approach.

The PowerShell script is the same one as before, sitting in the helpers folder. It's not idempotent, but as it just writes information to the console, that's not a problem here.

As for the Terraform configuration, we're now adding a trigger which causes Terraform to run each time the trigger value changes. To do this, we're assigning a new UUID each time:
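A sketch of that configuration (resource name and script details are placeholders; interpolation syntax as per Terraform 0.11):

```hcl
resource "null_resource" "run_powershell_every_time" {
  triggers = {
    # uuid() returns a new value on every run, so the trigger always changes
    always_run = "${uuid()}"
  }

  provisioner "local-exec" {
    command     = ". '${path.module}/helpers/TestPowerShell.ps1' -Username World -Hostname Terraform"
    interpreter = ["PowerShell", "-Command"]
  }
}
```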

Terraform now runs the PowerShell script every time it runs.

Run PowerShell from Terraform on a defined trigger

Rather than using a UUID as the trigger value, we can configure Terraform to run the PowerShell script only when a particular value changes. In this example, we'll only run the PowerShell script if the value of TriggerValue changes. The trigger can be any output or variable in Terraform, e.g. you could set IP restrictions only when the list of IP restrictions changes.

Again, our PowerShell script is the same as before in the helpers folder.

To demonstrate this, we can add a variable TriggerValue and specify this in a file. Terraform will expect us to pass this variable on the command line or via a TFVARS file.
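The variable declaration would be as simple as:

```hcl
variable "TriggerValue" {}
```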

Our Terraform configuration now needs to look like this. See that the trigger option is set to our TriggerValue variable.
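A sketch of what that looks like (resource name and script details are placeholders of mine):

```hcl
resource "null_resource" "run_powershell_on_trigger" {
  triggers = {
    # The script only re-runs when this value changes between applies
    trigger_value = "${var.TriggerValue}"
  }

  provisioner "local-exec" {
    command     = ". '${path.module}/helpers/TestPowerShell.ps1' -Username World -Hostname Terraform"
    interpreter = ["PowerShell", "-Command"]
  }
}
```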

When we run our configuration the first time using terraform apply, we need to specify the value for the variable (terraform apply -var TriggerValue=100) and we see that the script runs:

If we run it again, we can see that Terraform has nothing to do and the script doesn't run:

If we change the value of TriggerValue and run it again, Terraform now runs our PowerShell script:


In this article, we went through how to run PowerShell scripts from Terraform: the first time only, every time, or on a defined trigger.

Monday, 28 January 2019

Terraform - Failed to load backend: Initialization required

I came across this issue the other day and thought I'd share the solution.


When running terraform plan, you may get the error below:

Backend reinitialization required. Please run "terraform init".
Reason: Initial configuration of the requested backend "azurerm"

The "backend" is the interface that Terraform uses to store state,
perform operations, etc. If this message is showing up, it means that the
Terraform configuration you're using is using a custom configuration for
the Terraform backend.

Changes to backend configurations require reinitialization. This allows
Terraform to setup the new configuration, copy existing state, etc. This is
only done during "terraform init". Please run that command now then try again.
If the change reason above is incorrect, please verify your configuration
hasn't changed and try again. At this point, no changes to your existing
configuration or state have been made.

Failed to load backend: Initialization required. Please see the error message above.


I found that the cause of the problem was that I had used the backend block in my terraform configuration like below:
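The block isn't shown here; a partial backend block like this (with the actual settings left to be supplied at init time) is enough to reproduce the issue:

```hcl
terraform {
  backend "azurerm" {}
}
```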

This tells terraform to expect some backend configuration however I was just testing and running terraform init without specifying a backend like below:

terraform init -input=false -backend=false


The solution is pretty simple. You have two options: you can remove the backend settings from your terraform block, but that's not the way to go if you want to use remote state. The better option is to specify the backend settings when you run terraform init, like this:

terraform init \
-backend-config="storage_account_name=MyStorageAccount" \
-backend-config="container_name=MyStorageContainer" \
-backend-config="key=Terraform.tfstate"

You should now be able to run terraform plan.

Monday, 21 January 2019

Deploy Azure Durable Functions with Terraform and Azure Pipelines - Part 1


I recently gave a talk on Terraform and Azure Pipelines at the UK Cloud Infrastructure User Group, and a number of people asked if I could write up a blog post on the subject. We're also testing out Azure Durable Functions, so I figured I'd write about deploying .NET Core Azure Durable Functions using Terraform and Azure Pipelines.

Azure Durable Functions

To get more information about Azure Durable Functions, see here

This series is more about deploying the Azure Durable Function than about the durable function .NET Core app itself, so we won't go into much detail on the code. What I can tell you is that this durable function is triggered by an HTTP request, accepts a name value and responds with "Hello <name>!".

You can find the code on GitHub here. I’ll use this repo to deploy my durable function into Azure.

Terraform Configuration

Terraform is a great alternative to ARM Templates. Compared to JSON, HashiCorp Configuration Language (HCL) is much simpler and pretty much human-readable. You can find out more about Terraform on the Terraform website or in my Terraform presentation here.

There is a Terraform configuration in the same GitHub repo as the Azure Durable Function and we’ll use this configuration to deploy a Function App to Azure which will use the Consumption Plan. The Terraform configuration deploys a new resource group, app service plan, storage account and function app (V2).

Azure DevOps build definition as YAML

Azure DevOps has a cool feature where you can write out your Azure Pipeline build definition as code using YAML. This has the benefit that the build configuration is now stored in source control with the application code itself so you get all the benefits of versioning and peer review through pull requests and you can keep your application code and the build definition in sync. You can read more about pipelines as YAML here.

You can find the full YAML pipeline file in the repo here but we'll go through each part in turn. To get started, the first part of our YAML file needs to look like this:
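The snippet isn't reproduced here; based on the description that follows, it would look something like this (the vmImage name is my assumption for the VS2017 / Server 2016 hosted agent of the time):

```yaml
resources:
- repo: self
  clean: true

pool:
  vmImage: vs2017-win2016

variables:
  buildConfiguration: 'Release'
```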

The resources section specifies that the code we are running tasks against is in the same repo that the YAML file is stored in (self). We will also clean the build directory before starting our build.

The pool section specifies that we'll use the hosted Visual Studio 2017 on Server 2016 hosted build agent.

In the variables section, we're specifying that the variable buildConfiguration is set to Release. We'll use this variable when running our dotnet CLI commands.

Build a .NET Core app using Azure DevOps Pipeline as YAML

In the repo, there's a few folders as below. 
  • AzureDurableFunctionsDemo1: Contains the .NET Core project that includes the Azure Durable Functions
  • AzureDurableFunctionsDemo1.Tests: Contains a .NET Core project that includes tests for the methods in the Azure Durable Functions Demo 1 project
  • Deployment: Contains a terraform folder which includes the terraform configuration

In order to build and test our app, we need to do as we would for any .NET Core application - i.e. build, test and publish. Using the Azure DevOps pipeline as YAML, this looks like below:
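A sketch of those steps (the project glob patterns are my assumptions based on the folder names above):

```yaml
steps:
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    command: build
    projects: '**/*.csproj'
    arguments: '--configuration $(buildConfiguration)'

- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'
    arguments: '--configuration $(buildConfiguration)'
    publishTestResults: true

- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: false
    projects: '**/AzureDurableFunctionsDemo1.csproj'
    arguments: '--configuration $(buildConfiguration) --output $(build.ArtifactStagingDirectory)'
    zipAfterPublish: true
```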

Notice the steps keyword here - we only need to specify this before the first task in the build pipeline. 

  • Build: This task uses the .NET Core CLI Azure DevOps V2 task template and specifies that we want to build all projects, using the configuration parameter which is set to the buildConfiguration variable (itself set to Release).
  • Test: This task is using the same task template but is only running tests in projects which include test in the project folder name. It's also publishing the test results to Azure DevOps as the publishTestResults parameter is set to true.
  • Publish: Again using the same template, this task publishes only the AzureDurableFunctionsDemo1 project and it publishes it to the artifact staging directory on the build agent which is specified by the built-in Azure DevOps variable build.ArtifactStagingDirectory. We don't have any web projects to publish so publishWebProjects is set to false however we do need to publish a zip file for deployment to Azure Functions so we'll set zipAfterPublish to true.

That concludes the build and test for the .NET core app so let's move on to building and testing our terraform configuration.

Build a terraform configuration using Azure DevOps pipeline as YAML

The YAML for the terraform build and test looks like below:
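A sketch of those tasks (the variable names passed to validate are placeholders of mine):

```yaml
- task: CmdLine@2
  displayName: Terraform Install
  inputs:
    script: choco install terraform -y

- task: CmdLine@2
  displayName: Terraform Init
  inputs:
    script: terraform init -input=false -backend=false
    workingDirectory: deployment\terraform

- task: CmdLine@2
  displayName: Terraform Validate (Dev)
  inputs:
    script: terraform validate -var-file=envDev.tfvars -var "app_name=test" -var "location=test"
    workingDirectory: deployment\terraform
```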

  • Terraform Install: The Visual Studio 2017 hosted build agent doesn't have terraform installed, so the first task is a Terraform Install task which uses chocolatey to install terraform: choco install terraform. This task uses the Command Line V2 task template in Azure DevOps.
  • Terraform Init: With terraform, there's nothing to build/compile as there is for a .NET app, but we can run a basic form of unit testing using terraform validate. Before we can run that, we need to initialize terraform by running terraform init. We set input to false because this is an unattended command: we don't want terraform to prompt for any missing variables, we'd rather it just failed. We'd normally use a backend configuration to specify the location of the remote state file, but as we're only validating the terraform here, we set backend to false.
  • Terraform Validate (Dev): For this task, we need to supply all the variables that terraform would expect when deploying the infrastructure. These don't need to be the real values, but they do need to be the correct type (string, list or map); as ours are all strings, we can just set them to test. You'll see that we're specifying the envDev.tfvars file so that terraform validate can validate the variables set in that file. This task produces no output unless validation fails, in which case it returns a non-zero exit code which fails the step in the pipeline.

In both the Terraform Init and Terraform Validate (Dev) tasks, you'll see that we're specifying the terraform configuration path as deployment\terraform which is the relative path to our terraform configuration. 

Publish the .NET Core app and terraform configuration

We've now built and tested both our application and terraform code and are ready for the release stage however we first need to output some artifacts to pass through to the release. Below are the tasks to publish these artifacts:
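A sketch of the two publish tasks (the display names are mine; the paths follow the description below):

```yaml
- task: PublishBuildArtifacts@1
  displayName: Publish App Artifact
  inputs:
    PathtoPublish: $(build.ArtifactStagingDirectory)
    ArtifactName: app

- task: PublishBuildArtifacts@1
  displayName: Publish Terraform Artifact
  inputs:
    PathtoPublish: deployment\terraform
    ArtifactName: terraform
```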

Both tasks are using the Publish Build Artifacts Azure DevOps task templates and are simply specifying the path to publish and the name for the artifact. When run, this would output two artifacts - one called app and one called terraform.

Save the YAML pipeline

The last step is to save the YAML pipeline as azure-pipelines.yml and store this in the root of the repo.


In this post, we've gone through the YAML build definition to build and test a .NET Core Azure Durable Function and terraform configuration. In part 2 we'll go through how to create the Azure Build in Azure DevOps.

Thursday, 27 December 2018

Azure Function - The listener for function was unable to start


When debugging one of my Azure Functions in Visual Studio 2017, I got the error below:

“The listener for function 'MyFunction' was unable to start. Microsoft.WindowsAzure.Storage: No connection could be made because the target machine actively refused it. System.Net.Http: No connection could be made because the target machine actively refused it. System.Private.CoreLib: No connection could be made because the target machine actively refused it.”


After a Google search, I didn’t find much on the error message. I figured I’d write up the solution for the next person who has this problem.


I figured that this may be a firewall issue or a problem with one of the services that wasn’t started so I tried to eliminate those causes. I found that all services were started and that even with the firewall disabled, I still had the issue. I resorted to the classic turn it off and on again and that still didn’t fix it.

Eventually, it occurred to me to check that the Azure Storage Emulator was started and I found that it wasn’t. To check if it’s started, run the command below:
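On a default install, AzureStorageEmulator.exe typically lives under C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator; from there (or with it on your PATH), run:

```
AzureStorageEmulator.exe status
```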


If you see IsRunning reported as False, you'll need to initialize and start the Azure Storage Emulator by running the commands below:
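That is:

```
AzureStorageEmulator.exe init
AzureStorageEmulator.exe start
```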

We can then check that the Azure Storage Emulator is running again by checking the status and we can see IsRunning is now True:


You should now be good to go and can now debug your functions.

Thursday, 20 December 2018

Publish Azure Function with Visual Studio 2017


In my last post, I demonstrated how to create a new .NET Core Azure Function. In this post, I’ll demo how to publish your Azure Function using Visual Studio.

This post assumes that you’ve created your Azure Function already. If you need to go back over how to do that, click here.

Publish Azure Function with Visual Studio 2017

1) Right click your project and click Publish


2) If you have an existing Azure Function App already deployed in Azure then you can choose Select Existing to use an existing one however I’m creating one from scratch so I’ll choose Create New


3) Log into Azure and fill out all the details then click Create


You may be prompted to upgrade the Azure Functions version, as we need V2, not V1, for .NET Core. If prompted, click Yes.


4) After a few seconds, Visual Studio completes the deployment and you can now see your new Azure Function in the Azure Portal


5) On the overview tab, copy the URL into PostMan


6) Click on Manage and copy down your function key. You can create a new key if you want.


7) In PostMan, add the key and the path to the HttpTrigger function as below

You can now click Send and see your output


There you have it! .NET Core Azure Function published to Azure in very little time. You can find the project code on GitHub here.

Thursday, 29 November 2018

Create .NET Core Azure Function with Visual Studio 2017


With all the talk about Serverless and Azure Functions, it’s a good time to learn how they work if you don’t already know. In this post, we’ll walk through creating a simple .NET Core 2.1 Azure Function with Visual Studio 2017.


First of all, let’s set up our environment.

1) Install Visual Studio 2017 (you can download the Visual Studio Community Edition for free here).

2) Install the .NET desktop development workload for Visual Studio. I have these options enabled:



3) Install the Universal Windows Platform development workload:


4) Install the ASP.NET and web development workload:



5) Install the Azure development workload:



6) Install the .NET Core cross-platform development workload:



7) Install the Azure Functions Core Tools:
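At the time, one common way to install the Core Tools was via npm (Node.js required); chocolatey also carries a package:

```
npm install -g azure-functions-core-tools
```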

You can find more info here.

8) Install PostMan. This is a free tool for making HTTP requests so it’s great for testing out APIs. You can download it here.

Create a new Function App V2 project with Visual Studio 2017:

1) Create a new project in Visual Studio (File > New > Project)


2) Select Cloud > Azure Functions then select a folder and click OK.


3) On the new window that pops up, make sure you click the drop down and select Azure Functions v2 (.NET Core) and Http Trigger.


Click OK when done.

4) Let’s add some code. Rename Function1.cs to HttpTrigger.cs and add the contents below.
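The file contents aren't reproduced here, but a sketch of the kind of function described (the exact greeting text is my assumption; FirstName and LastName follow the description below) might look like this:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class HttpTrigger
{
    [FunctionName("HttpTrigger")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Read the request body and pull out the FirstName and LastName values
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        string firstName = data?.FirstName;
        string lastName = data?.LastName;

        return new OkObjectResult($"Hello, {firstName} {lastName}!");
    }
}
```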

This method simply takes an HttpRequest input, extracts the two strings FirstName and LastName and outputs a greeting message.

Debug your Azure Function in Visual Studio

1) At the time of writing, there’s an issue with Visual Studio debugging for .NET Core Azure Functions V2. The workaround for this is here. Go ahead and go through this article then come back to continue at step 2.

2) Hit F5 to start debugging your function and you should be presented with something like this


As we can see, our function is running and the HttpTrigger is listed in green.

3) Let’s open up PostMan and create a new Request. You’ll be prompted for a request name and collection.



4) Change the request to POST


5) Go to Body, select raw and add the JSON content below
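The body from the screenshot isn't shown here; any two values will do (the names below are placeholders):

```json
{
    "FirstName": "John",
    "LastName": "Smith"
}
```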


6) In the console window, copy the HttpTrigger URL into PostMan:



7) Click Send in PostMan

You should get the greeting message below in PostMan


And you should see a request come through in the Azure Function console window


So, there you have it. Your first Azure Function running .NET Core 2.1. Hit Shift+F5 to stop debugging. You can find the project code on GitHub here.