Azure Functions Express: Running Azure Functions locally using Docker Compose

Maarten Merken
GopenSource

--

Every time I join a new project, I try not to rely too much on external environments when building and running the software I'm working on. More often than not, a shared DEV or CI environment is unstable and overrated. My approach is to run all the components locally and remove the external dependencies.

I've already written about running a local SQL Express Docker instance; this article follows the same approach:

If it were up to me, I'd write everything in Azure Functions. Not everything fits the model, though; Maarten Balliauw explains this in more detail here: https://blog.maartenballiauw.be/post/2019/10/02/dont-use-azure-functions-as-a-web-application.html

However, when I do develop an Azure Function, I like to run it locally first, without the interference of anything hosted in Azure or elsewhere. The fastest way of having such an experience, in my opinion, is using Docker and Docker Compose.

The term ‘Docker’ is glorified by many and horrified by some. I consider Docker to be just another tool in my toolbelt, and a great one at that.

I think the reason I like Docker so much is the versatility of the tool:

Do you have an ASP.NET Core Web Application? Docker.

How about an SQL Database? Docker.

Angular Frontend Application? Docker.

An executable that you’d likely run via a Windows Service? Docker.

Database Migrations? Docker.

Azure Storage Emulator? Docker.

Your entire CI/CD Jenkins Pipeline? Docker.

Local Azure Functions with Docker and Docker Compose

Everything I'll mention here is contained in the accompanying Git repo below:

I've created a template C# Azure Functions project for you to scaffold, which includes the following features:

  • An HTTP Triggered function
  • A Blob Triggered function with Blob output binding connected to a local storage account
  • A Queue Triggered function connected to a local queue
  • An easy “start and stop” way of hosting this function locally

To get started, run the following from the root of the repo:

docker-compose up

First, you will see the Azurite image being pulled from the container registry.

Next, you will see the Local.Functions project being built and containerized.

Finally, when both containers are ready, they are spun up inside of one network using their respective names:
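For reference, a docker-compose.yml wiring these two containers together could look roughly like the sketch below. The service names, ports, and build path are assumptions for illustration; the actual file lives in the repo. The account name and key are Azurite's well-known development credentials:

```yaml
version: "3.8"
services:
  # Azurite: local Azure Storage emulator (blobs and queues)
  local.storage.emulator:
    image: mcr.microsoft.com/azure-storage/azurite
    ports:
      - "10000:10000"  # Blob endpoint
      - "10001:10001"  # Queue endpoint

  # The containerized Azure Functions project
  local.functions:
    build: ./Local.Functions
    depends_on:
      - local.storage.emulator
    ports:
      - "7071:80"
    environment:
      # Point the Functions runtime at the emulator by its service name
      AzureWebJobsStorage: "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://local.storage.emulator:10000/devstoreaccount1;QueueEndpoint=http://local.storage.emulator:10001/devstoreaccount1;"
```

Because both services share one Compose network, the function container reaches the emulator via the hostname local.storage.emulator rather than localhost.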

Now, you can open Azure Storage Explorer and browse the local.storage.emulator's Blob storage and Queues:
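Storage Explorer can attach to the emulator out of the box under Local & Attached > Storage Accounts > Emulator. Alternatively, you can attach via Azurite's well-known development connection string (ports shown are Azurite's defaults):

```
DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;
```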

Preparing the environment

Create two blob containers named input-container and output-container.

Create a queue named queue.

Working with the Queue

Add a new message to the queue via Azure Storage Explorer.

In a few moments, you’ll see the message getting picked up by the local.functions container.
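A queue-triggered function in the in-process C# model looks roughly like this sketch; the class name matches the QueueTriggeredFunction mentioned later, but the exact signature in the repo may differ:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class QueueTriggeredFunction
{
    [FunctionName("QueueTriggeredFunction")]
    public static void Run(
        // "queue" matches the queue created above; the connection
        // setting resolves to the emulator's connection string.
        [QueueTrigger("queue", Connection = "AzureWebJobsStorage")] string message,
        ILogger log)
    {
        log.LogInformation($"Queue message received: {message}");
    }
}
```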

Working with the Blob Containers

Add any file to the input-container by simply dragging and dropping through Azure Storage Explorer. The function will pick this up and copy the file to the output-container.
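The copy behavior described above can be expressed with a blob trigger plus a blob output binding; a minimal sketch (names are illustrative, the repo holds the actual implementation):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobTriggeredFunction
{
    [FunctionName("BlobTriggeredFunction")]
    public static async Task Run(
        // Fires when a blob lands in input-container; {name} binds the blob name
        [BlobTrigger("input-container/{name}", Connection = "AzureWebJobsStorage")] Stream input,
        // Output binding writes a blob with the same name to output-container
        [Blob("output-container/{name}", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream output,
        string name,
        ILogger log)
    {
        log.LogInformation($"Copying blob {name} to output-container");
        await input.CopyToAsync(output);
    }
}
```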

Testing the HTTP endpoint

Open the http folder from the cloned repo in VS Code.

Open the HttpTriggerGet.http file and click the ‘Send Request’ context item above the GET request line.

The HTTP function should return a response:
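The request file itself is plain text understood by the VS Code REST Client extension; it looks roughly like this (the route is an assumption based on the file name, with 7071 being the default local Functions port):

```http
GET http://localhost:7071/api/HttpTriggerGet HTTP/1.1
```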

Debugging

Once you’ve run the docker-compose command, you should have a local container named local.storage.emulator.

To debug the Local.Functions application, you simply need to press F5 from inside VS Code. The launch task will start the local.storage.emulator container to ensure the local storage emulation is running. Then you can put a breakpoint in the QueueTriggeredFunction to inspect the message being sent:
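A preLaunchTask along these lines can bring up just the emulator container before the debugger attaches; this is a sketch, and the task label and exact command are assumptions rather than the repo's actual .vscode/tasks.json:

```json
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "start-storage-emulator",
      "type": "shell",
      "command": "docker-compose up -d local.storage.emulator"
    }
  ]
}
```

Starting only the emulator service (not the function container) matters here, because the function itself runs under the VS Code debugger rather than in Docker.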

Key takeaways

This wraps up everything I wanted to share at this point. I wanted to create a starting point for running a containerized local Azure Functions application connected to a local Storage Emulator; I hope you find it useful.
