Tuesday, June 1, 2021

Azure Durable Functions: Change the Storage Provider from Azure Storage to Microsoft SQL Server

During Microsoft Build 2021 last week, there was an important announcement related to Azure Durable Functions. All this time, Azure Durable Functions maintained its state in Azure Storage (mainly blobs, queues and tables), and we didn't have any control over that.

But now we have two other storage options. Note: as of the date I am writing this post, these are available as preview functionality.
  1. Netherite
  2. Microsoft SQL Server
    • This can be both on-premises and cloud-hosted deployments of SQL Server, including Azure SQL Database.
In this post, let's see how we can use Microsoft SQL Server as the state storage provider for an Azure Durable Function App. I am going to be using a .NET Function App written in C#.

I have created an Azure Function App using Visual Studio, choosing Durable Functions Orchestrator as the template (note: Durable Functions are not yet supported on .NET 5, so we still need to target .NET Core 3 (LTS), which is .NET Core 3.1).
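If you haven't used the template before, it scaffolds an orchestrator function, an activity function and an HTTP starter, roughly like the sketch below (the Function1 names are the template defaults; yours will depend on what you named the function).

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class Function1
{
    // Orchestrator: calls the "Hello" activity a few times and collects the results.
    [FunctionName("Function1")]
    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var outputs = new List<string>();
        outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "Tokyo"));
        outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "Seattle"));
        outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "London"));
        return outputs;
    }

    // Activity: does the actual work for each orchestrator call.
    [FunctionName("Function1_Hello")]
    public static string SayHello([ActivityTrigger] string name, ILogger log)
    {
        log.LogInformation($"Saying hello to {name}.");
        return $"Hello {name}!";
    }

    // HTTP starter: kicks off a new orchestration instance and returns the status check URLs.
    [FunctionName("Function1_HttpStart")]
    public static async Task<HttpResponseMessage> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        string instanceId = await starter.StartNewAsync("Function1", null);
        log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}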

Then we need to install the Microsoft.DurableTask.SqlServer.AzureFunctions NuGet package (note: as of the date I am writing this post, it's still a pre-release version). Once that's installed, I am updating host.json as follows.

host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    },
    "logLevel": {
      "DurableTask.SqlServer": "Information",
      "DurableTask.Core": "Warning"
    }
  },
  "extensions": {
    "durableTask": {
      "storageProvider": {
        "type": "mssql",
        "connectionStringName": "SQLDB_Connection"
      }
    }
  }
}
Here the most important change is in the extensions node, where we have set the storageProvider type to mssql and configured a connectionStringName. There is also a slight modification to the logging configuration to reduce unnecessary noise, so we can see what's happening on the SQL side of things.

Now we need to specify the SQLDB_Connection key and its value. Since I am going to be running this function locally, I am going to add that key to local.settings.json. When deployed to Azure, you can maintain this setting in Application Settings or wherever you maintain your app settings.

Before that, let's create an empty database in our Microsoft SQL Server. I am just using my local MSSQL Server.
Empty Database
It's just an empty database; I expanded Tables and Stored Procedures just to be sure. Now I am copying the connection string for this database and updating local.settings.json as follows.

local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "SQLDB_Connection": "Data Source=RAVANA-TPP15\\MSSQLSERVER2019;Initial Catalog=AzDurableFuncApp;Integrated Security=True;"
  }
}
Here we are still going to need the AzureWebJobsStorage key and its value, because there are other things (secrets etc.) that need to be stored.

Now let's run the function app.
First run on the function app
You will see that some SQL scripts are getting executed. Now let's refresh the database.
Updated Database
There are a couple of tables and a bunch of Stored Procedures (amongst other things) that got added in order to manage the state. If you want to have a look at the SQL scripts that are getting executed, you can find them here: https://github.com/microsoft/durabletask-mssql/tree/main/src/DurableTask.SqlServer/Scripts.
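Note that this change is completely transparent to the Durable Functions programming model: orchestrators, activities and the durable client APIs stay exactly the same, only the backing store changes. As a quick sanity check, here is a minimal sketch (the GetOrchestrationStatus function name and route are hypothetical, not part of the template) that looks up an orchestration instance through the durable client; with the mssql provider, that status is being read from the tables we just saw.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class StatusFunctions
{
    // Hypothetical helper: looks up an orchestration instance by ID.
    // The call is provider-agnostic; with the mssql storage provider the
    // state is read from the SQL database instead of Azure Storage.
    [FunctionName("GetOrchestrationStatus")]
    public static async Task<IActionResult> GetStatus(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "orchestrations/{instanceId}")] HttpRequest req,
        string instanceId,
        [DurableClient] IDurableOrchestrationClient client)
    {
        DurableOrchestrationStatus status = await client.GetStatusAsync(instanceId);
        if (status == null)
        {
            return new NotFoundObjectResult($"No orchestration instance found with ID '{instanceId}'.");
        }

        return new OkObjectResult(new
        {
            status.InstanceId,
            status.RuntimeStatus,
            status.CreatedTime,
            status.LastUpdatedTime
        });
    }
}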

So why do we need to consider changing the storage provider for Azure Durable Functions in the first place, and which one should we use? You can find answers to those questions and more information by reading through these.

Hope this helps.

Happy Coding.

Regards,
Jaliya
