Friday, March 15, 2024

Read TLS/SSL Certificate in Azure App Service from C# Code

Recently I was updating an old .NET Core web application to .NET 8 and the code was reading a certificate as follows.

private X509Certificate2 GetCertificateByThumbprint(string thumbprint)
{
    using X509Store store = new(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    X509Certificate2Collection certificateCollection = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, validOnly: true);
    return certificateCollection.OfType<X509Certificate2>().SingleOrDefault();
}

This piece of code wasn't working once the application was deployed to Azure App Service (Windows). The certificate was set up in App Service, but the code wasn't picking it up. As usual, QAs were insisting it used to work.

It turned out I needed to add an app setting WEBSITE_LOAD_CERTIFICATES with a value of comma-separated certificate thumbprints in order for them to be loaded and accessible from the App Service code.

{
  "name": "WEBSITE_LOAD_CERTIFICATES",
  "value": "<comma-separated-certificate-thumbprints>",
  "slotSetting": false
}

You can read more on Use a TLS/SSL certificate in your code in Azure App Service. It contains instructions for other scenarios like loading a certificate from a file and loading a certificate in Linux/Windows containers.
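That article also covers the file-based scenario (for example Linux apps and containers), where the certificate is exposed as a file instead of through the certificate store. A minimal sketch, assuming the default private certificate path described there:

using System.Security.Cryptography.X509Certificates;

// Minimal sketch for the file-based scenario; the /var/ssl/private path is an
// assumption based on the linked article and may differ for your setup.
private X509Certificate2 GetCertificateFromFile(string thumbprint)
{
    string certificatePath = $"/var/ssl/private/{thumbprint}.p12";
    return new X509Certificate2(certificatePath);
}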

Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, March 11, 2024

Azure AD B2C: Call an External API Using Client Credentials in a User Journey

In this post, let's see how to call an external API using Client Credentials in an Azure AD B2C User Journey.

I am assuming the Azure AD B2C App Registration is already set up for the client app with the necessary permission (scope access) to call the protected API, and you have noted down the Client ID, Client Secret, and the Scope.

Note: There are no additional actions needed to enable the client credentials flow for user flows or custom policies. Both Azure AD B2C user flows and custom policies support the client credentials flow by default. But of course, you can create a custom policy to customize the user journey of the OAuth 2.0 client credentials flow and extend the token issuance process.

First, you can test that everything is set up correctly using the following PowerShell script.

$clientId = "<clientId>"
$clientSecret = "<clientSecret>"
$endpoint = "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token"
$scope = "<scope>"
$body = "grant_type=client_credentials&scope=" + $scope + "&client_id=" + $clientId + "&client_secret=" + $clientSecret

$token = Invoke-RestMethod -Method Post -Uri $endpoint -Body $body
$token | ConvertTo-Json

Here the scope is something like the following:

$scope = "https://<tenant-name>.onmicrosoft.com/45a2252d-099a-4c6a-9c57-66eac05e2693/.default"
The script should output something like below.
Test Client Credentials
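For reference, the JSON the token endpoint returns (what the screenshot above shows) is roughly of this shape; the values below are placeholders and the exact set of fields can vary:

{
  "access_token": "eyJhbGciOi...",
  "token_type": "Bearer",
  "expires_in": 3600
}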
Now let's see how we can use this in an Azure AD B2C User Journey.

1. Define a ClaimType for access_token.

<BuildingBlocks>
  <ClaimsSchema>
    ...
    <ClaimType Id="access_token">
      <DisplayName>Access Token</DisplayName>
      <DataType>string</DataType>
    </ClaimType>
    ...
  </ClaimsSchema>
</BuildingBlocks>

2. Define TechnicalProfiles to retrieve access_token and to call the external API using the retrieved access_token.

<ClaimsProvider>
  ...
  <TechnicalProfiles>
    <TechnicalProfile Id="REST-GetClientCredentials">
      <DisplayName>Get Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
      <Metadata>
        <Item Key="ServiceUrl">
          https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token?grant_type=client_credentials&amp;scope=<scope>&amp;client_id=<clientId>&amp;client_secret=<clientSecret>
        </Item>
        <Item Key="SendClaimsIn">Body</Item>
        <Item Key="AuthenticationType">None</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
      </Metadata>
      <OutputClaims>
        <OutputClaim ClaimTypeReferenceId="access_token"/>
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop"/>
    </TechnicalProfile>
    <TechnicalProfile Id="REST-CallApiUsingClientCredentials">
      <DisplayName>Call an External API using Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
      <Metadata>
        <Item Key="ServiceUrl"><Endpoint to call></Item>
        <Item Key="SendClaimsIn">Header</Item>
        <Item Key="AuthenticationType">Bearer</Item>
        <Item Key="UseClaimAsBearerToken">access_token</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
        <Item Key="IncludeClaimResolvingInClaimsHandling">true</Item>
      </Metadata>
      <InputClaims>
        <InputClaim ClaimTypeReferenceId="access_token"/>
      </InputClaims>
      <OutputClaims>
        <!-- Output Claims from Calling the API -->
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop" />
    </TechnicalProfile> ...
  </TechnicalProfiles>
</ClaimsProvider>

3. Finally, introduce additional OrchestrationSteps to your UserJourney to use the above TechnicalProfiles.

<UserJourneys>
  <UserJourney Id="<UserJourneyId>">
    <OrchestrationSteps>
      ...
      <OrchestrationStep Order="7" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTGetClientCredentials" TechnicalProfileReferenceId="REST-GetClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      <OrchestrationStep Order="8" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTCallApiUsingClientCredentials" TechnicalProfileReferenceId="REST-CallApiUsingClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep> ...
      <OrchestrationStep Order="11" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
    </OrchestrationSteps>
  </UserJourney>
</UserJourneys>

Now that should be it.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Friday, March 1, 2024

Creating Integration Tests for Azure Functions

I wanted to have some integration tests for Azure Functions, especially for some complex durable functions. When you have durable functions and you want to make sure the orchestrations are behaving as expected, having integration tests is the only way to ensure that. Another important thing is that I needed to be able to run these tests not just locally, but also in a CI pipeline (GitHub workflows, Azure DevOps pipelines, etc.).

Unfortunately, as of today, there is no proper integration test mechanism for Azure Durable Functions (or Azure Functions in general) like we have for ASP.NET Core applications.

I came up with the following approach after gathering inputs from GitHub issues and other related posts on the subject.

The basic concept is as follows. Note that I am using xUnit.net as my testing framework.

1. Create a fixture class that implements IDisposable. In the constructor, I am spinning up the Function App to test using func start, and doing the cleanup in Dispose().

2. Create an xUnit Collection Fixture using the above fixture. Basically, my single test context (the Function App) will get shared among different tests in several test classes, and it will get cleaned up after all the tests in those classes have finished. A sketch of this wiring is shown after the fixture code below.

My fixture looks something like below.
using Polly;
using System.Diagnostics;
using System.Runtime.InteropServices;

namespace HelloAzureFunctions.Tests.Integration.Fixtures;

public class AzureFunctionFixture : IDisposable
{
    private readonly string _path = Directory.GetCurrentDirectory();
    private readonly string _testOutputPath = Path.Combine(Directory.GetCurrentDirectory(), "integration-test-output.log");
    private readonly int _port = 7071;
    private readonly string _baseUrl;
    private readonly Process _process;

    public readonly HttpClient HttpClient;

    public AzureFunctionFixture()
    {
        _baseUrl = $"http://localhost:{_port}";

        HttpClient = new HttpClient()
        {
            BaseAddress = new Uri(_baseUrl)
        };

        if (File.Exists(_testOutputPath))
        {
            File.Delete(_testOutputPath);
        }

        DirectoryInfo directoryInfo = new(_path);
        _process = StartProcess(_port, directoryInfo);
        _process.OutputDataReceived += (sender, args) =>
        {
            File.AppendAllLines(_testOutputPath, [args.Data]);
        };
        _process.BeginOutputReadLine();
    }

    public void Dispose()
    {
        if (!_process.HasExited)
        {
            _process.Kill(entireProcessTree: true);
        }

        _process.Dispose();
        HttpClient.Dispose();
    }

    public async Task WaitUntilFunctionsAreRunning()
    {
        PolicyResult<HttpResponseMessage> result =
            await Policy.TimeoutAsync(TimeSpan.FromSeconds(30))
                .WrapAsync(Policy.Handle<Exception>().WaitAndRetryForeverAsync(index => TimeSpan.FromMilliseconds(500)))
                .ExecuteAndCaptureAsync(() => HttpClient.GetAsync(""));

        if (result.Outcome != OutcomeType.Successful)
        {
            throw new InvalidOperationException("The Azure Functions project doesn't seem to be running.");
        }
    }

    private static Process StartProcess(int port, DirectoryInfo workingDirectory)
    {
        string fileName = "func";
        string arguments = $"start --port {port} --verbose";

        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            fileName = "powershell.exe";
            arguments = $"func start --port {port} --verbose";
        }

        ProcessStartInfo processStartInfo = new(fileName, arguments)
        {
            UseShellExecute = false,
            CreateNoWindow = true,
            RedirectStandardOutput = true,
            WorkingDirectory = workingDirectory.FullName,
            EnvironmentVariables =
            { 
                // Passing an additional environment variable to the application,
                // So it can control the behavior when running for Integration Tests

                [ApplicationConstants.IsRunningIntegrationTests] = "true"
            }
        };

        Process process = new() { StartInfo = processStartInfo };
        process.Start();

        return process;
    }
}
I can use this fixture for my tests, and it will work fine for running integration tests locally.
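For step 2, the collection fixture wiring and a test class could look something like this minimal sketch (the collection name, test class, and the api/hello route are illustrative; adjust them to your Function App):

using HelloAzureFunctions.Tests.Integration.Fixtures;
using Xunit;

namespace HelloAzureFunctions.Tests.Integration;

// Collection definition: ties the fixture to a named collection so the
// Function App is spun up once and shared across every test class in it.
[CollectionDefinition("AzureFunctionCollection")]
public class AzureFunctionCollection : ICollectionFixture<AzureFunctionFixture>
{
}

[Collection("AzureFunctionCollection")]
public class HelloFunctionTests : IAsyncLifetime
{
    private readonly AzureFunctionFixture _fixture;

    public HelloFunctionTests(AzureFunctionFixture fixture)
    {
        _fixture = fixture;
    }

    // Make sure the Function App is responding before any test in this class runs.
    public Task InitializeAsync() => _fixture.WaitUntilFunctionsAreRunning();

    public Task DisposeAsync() => Task.CompletedTask;

    [Fact]
    public async Task HttpTrigger_Returns_Success()
    {
        // 'api/hello' is a hypothetical HTTP-triggered function in the app under test.
        HttpResponseMessage response = await _fixture.HttpClient.GetAsync("api/hello");

        response.EnsureSuccessStatusCode();
    }
}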

Now we need to be able to run these tests in a CI pipeline. I am using the following GitHub workflow.
name: Run Integration Tests

on:
  push:
    branches: ["main"]
    paths-ignore:
      - '**.md'

env:
  DOTNET_VERSION: '8.0.x'

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    env:
      INTEGRATION_TEST_EXECUTION_DIRECTORY: ./tests/HelloAzureFunctions.Tests.Integration/bin/Debug/net8.0

    steps:
    - name: 'Checkout GitHub Action'
      uses: actions/checkout@v3

    - name: Setup .NET ${{ env.DOTNET_VERSION }} Environment
      uses: actions/setup-dotnet@v3
      with:
        dotnet-version: ${{ env.DOTNET_VERSION }}

    - name: Build
      run: dotnet build

    # Install Azure Functions Core Tools in the runner, 
    # so we have access to 'func.exe' to spin up the Azure Functions app in integration tests
    - name: Install Azure Functions Core Tools 
      run: |
        npm install -g azure-functions-core-tools@4 --unsafe-perm true

    # Setup Azurite in the runner, 
    # so the Azure Functions app we are going to spin up can use Azurite as its storage provider
    - name: Setup Azurite 
      shell: bash
      run: |
        npm install -g azurite
        azurite --silent &

    - name: Run Integration Tests
      # If there are any errors executing integration tests, uncomment the following line to continue the workflow, so you can look at integration-test-output.log
      # continue-on-error: true 
      run: dotnet test ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/HelloAzureFunctions.Tests.Integration.dll

    - name: Upload Integration Tests Execution Log
      uses: actions/upload-artifact@v4
      with:
        name: artifact-${{ matrix.os }}
        path: ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/integration-test-output.log
When the workflow runs, the output is as follows.
Build and Test: ubuntu-latest
Build and Test: windows-latest
You can find the full sample code here on this repo:

Happy Coding.

Regards,
Jaliya

Tuesday, February 20, 2024

.NET 8.0 Isolated Azure Functions: Binding Expressions that Use Azure App Configuration

In this post, let's see how we can use binding expressions in a .NET 8.0 Isolated Azure Function and how to resolve the binding expression values from Azure App Configuration (AAC).

Binding expressions are basically something like this. Let's take a simple Service Bus trigger function.
[Function(nameof(ServiceBusTrigger))]
public static void ServiceBusTrigger(
    [ServiceBusTrigger("%Messaging:Topic%", "%Messaging:Subscription%")] ServiceBusReceivedMessage serviceBusReceivedMessage)
{
    // TODO: Process the received message
}
Here, %Messaging:Topic% and %Messaging:Subscription% are binding expressions, and their values don't have to be compile-time constants.

In In-Process Azure Functions, it's pretty straightforward: you can just add Azure App Configuration as another configuration provider in the Startup, and it will work.

But in Isolated functions, at least as of today (20th February 2024), you can't do that (Support expression resolution from configuration sources registered by the worker #1253). While it's a bit disappointing (after having had Isolated functions available for a couple of years), you can use the following workaround.

Let's say I have the following values in my Azure App Configuration.
Azure App Configuration
Azure App Configuration Values
I can use the following notation to access AAC values.
@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=<key>)
// if you want to choose a particular Label
@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=<key>; Label=<label>)
So I can update Function App settings in Azure as follows. Make sure the identity of the function app (system-assigned managed identity or user-assigned managed identity) can read the configuration from AAC.
Function App Configuration
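In other words, each binding expression name maps to a Function App setting whose value is an App Configuration reference. A minimal sketch of what those settings could look like (assuming the AAC keys are also named Messaging:Topic and Messaging:Subscription):

[
  {
    "name": "Messaging:Topic",
    "value": "@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=Messaging:Topic)",
    "slotSetting": false
  },
  {
    "name": "Messaging:Subscription",
    "value": "@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=Messaging:Subscription)",
    "slotSetting": false
  }
]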
Further reading:
   Use App Configuration references for App Service and Azure Functions (preview)

Friday, February 9, 2024

Azure DevOps Self-hosted Agent: NETSDK1045: The current .NET SDK does not support targeting .NET 8.0

Recently I faced this issue on one of our self-hosted agents in Azure DevOps when a pipeline was trying to build a .NET 8.0 application.
C:\vsts-agent\_work\_tool\dotnet\sdk\5.0.405\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.TargetFrameworkInference.targets(141,5): 

error NETSDK1045: The current .NET SDK does not support targeting .NET 8.0.  
Either target .NET 5.0 or lower, or use a version of the .NET SDK that supports .NET 8.0.  [C:\vsts-agent\_work\67\s\xxxxx.csproj]
The error was happening in a NuGetCommand@2 task while doing a restore. I replaced that with a DotNetCoreCLI@2 task. Then that step succeeded, but the build eventually failed again in a VSBuild@1 task (that was using vsVersion: '17.0', which is the latest) for the same reason.

This was strange because the pipeline was specifically requesting .NET 8.0.
- task: UseDotNet@2
  displayName: Use .NET
  inputs:
    packageType: 'sdk'
    version: '8.0.x'
The pipeline had no reason to use .NET SDK 5.0.405, and I had no idea where this specific version was coming from.

Then I started digging, and after scratching my head for a couple of hours, I noticed the following in the agent worker logs (usually inside C:\vsts-agent\_diag). To my surprise, the pipeline was getting executed with the following.
{
  ...
  "variables": {
    "DOTNET_MSBUILD_SDK_RESOLVER_SDKS_DIR": {
      "value""C:\\vsts-agent\\_work\\_tool\\dotnet\\sdk\\5.0.405\\Sdks"
    },
    "DOTNET_MSBUILD_SDK_RESOLVER_SDKS_VER": {
      "value""5.0.405"
    },
    ...
  }
  ...
}
DOTNET_MSBUILD_SDK_RESOLVER_* are .NET environment variables that are used to force the resolved SDK tasks and targets to come from a given base directory and report a given version to MSBuild.
  • DOTNET_MSBUILD_SDK_RESOLVER_SDKS_DIR: Overrides the .NET SDK directory.
  • DOTNET_MSBUILD_SDK_RESOLVER_SDKS_VER: Overrides the .NET SDK version.
  • DOTNET_MSBUILD_SDK_RESOLVER_CLI_DIR: Overrides the dotnet.exe directory path.
And that kind of answered where .NET SDK 5.0.405 was coming from, but the question remained: why? I submitted Issue #19520: Self hosted agent uses incorrect DOTNET_MSBUILD_SDK_RESOLVER_SDKS_*.

To get past the issue, I had to override these variables. To test the concept, I overrode them by passing the .NET 8.0 counterpart values to the pipeline execution.
Passing variables to the pipeline execution
and that finally worked. But we can't be manually overriding these for each run, so I overrode them in the YAML as follows.
variables:
- name: DOTNET_MSBUILD_SDK_RESOLVER_SDKS_DIR
  value: 'C:\vsts-agent\_work\_tool\dotnet\sdk\8.0.101\Sdks'
- name: DOTNET_MSBUILD_SDK_RESOLVER_SDKS_VER
  value: '8.0.101'
...
Now the pipeline builds and publishes .NET 8 apps successfully, but I still have no idea why the older SDK was being forced.

Hopefully, we will find the answer here soon:

Hope this helps.

Happy Coding.

Regards,
Jaliya

Saturday, February 3, 2024

Azure AD B2C: Validating Output Claim from a Non-Self-Asserted Technical Profile

I had a requirement where I wanted to do additional validation on a boolean claim value in an AAD B2C user journey. If the boolean claim value was true, I wanted to move forward in the user journey. If the value was false, I wanted to short-circuit the user journey and return an error.

I couldn't use Validation Technical Profiles, because the output claim I am validating on came from a non-self-asserted technical profile (the claim was retrieved by calling an external REST endpoint), and Validation Technical Profiles don't support non-self-asserted technical profiles.

In such cases, we can add an additional OrchestrationStep, add a Precondition in that particular step, assert on the claim, and navigate the user to a self-asserted technical profile to display the error there.

So how do we do that? 

1. Define a ClaimType for a self-asserted technical profile.

<BuildingBlocks>
  <ClaimsSchema>
    ...
    <ClaimType Id="errorMessage">
      <DisplayName>Please contact support.</DisplayName>
      <DataType>string</DataType>
      <UserInputType>Paragraph</UserInputType>
    </ClaimType>
  </ClaimsSchema>
  ...
</BuildingBlocks>

2. Define a ClaimsTransformation.

<BuildingBlocks>
  ...
  <ClaimsTransformations> ...
    <ClaimsTransformation Id="CreateApplicationUserNotActiveErrorMessage" TransformationMethod="CreateStringClaim">
      <InputParameters>
        <InputParameter Id="value" DataType="string" Value="Application user is not active." />
      </InputParameters>
      <OutputClaims>
        <OutputClaim ClaimTypeReferenceId="errorMessage" TransformationClaimType="createdClaim" />
      </OutputClaims>
    </ClaimsTransformation>
  </ClaimsTransformations>
</BuildingBlocks>

3. Define a self-asserted TechnicalProfile. Use the above ClaimsTransformation as an InputClaimsTransformation. Reference the ClaimType created in the first step.

<ClaimsProviders>
  <ClaimsProvider>
    <DisplayName>...</DisplayName>
    <TechnicalProfiles> ...
      <TechnicalProfile Id="SelfAsserted-ApplicationUserNotActiveError">
        <DisplayName>Error message</DisplayName>
        <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.SelfAssertedAttributeProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
        <Metadata>
          <Item Key="ContentDefinitionReferenceId">api.selfasserted</Item>
          <Item Key="setting.showContinueButton">false</Item>
          <Item Key="setting.showCancelButton">true</Item>
        </Metadata>
        <InputClaimsTransformations>
          <InputClaimsTransformation ReferenceId="CreateApplicationUserNotActiveErrorMessage" />
        </InputClaimsTransformations>
        <InputClaims>
          <InputClaim ClaimTypeReferenceId="errorMessage"/>
        </InputClaims>
        <OutputClaims>
          <OutputClaim ClaimTypeReferenceId="errorMessage"/>
        </OutputClaims>
      </TechnicalProfile>
    </TechnicalProfiles>
  </ClaimsProvider>
</ClaimsProviders>

4. Introduce an additional OrchestrationStep with a Precondition before the last OrchestrationStep. If the condition is not satisfied, use the created self-asserted TechnicalProfile.

<UserJourneys>
  ...
  <UserJourney Id="...">
    <OrchestrationSteps>
      ...
      <OrchestrationStep Order="9" Type="ClaimsExchange">
        <Preconditions>
          <Precondition Type="ClaimEquals" ExecuteActionsIf="true">
            <Value>isActive</Value> <!-- this claim is forwarded from a previous step -->
            <Value>True</Value>
            <Action>SkipThisOrchestrationStep</Action>
          </Precondition>
        </Preconditions>
        <ClaimsExchanges>
          <ClaimsExchange Id="SelfAssertedApplicationUserNotActiveError" TechnicalProfileReferenceId="SelfAsserted-ApplicationUserNotActiveError" />
        </ClaimsExchanges>
      </OrchestrationStep>
      ...
      <OrchestrationStep Order="11" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
    </OrchestrationSteps>
  </UserJourney>
  ...
</UserJourneys>

And this is what happens when the isActive claim is false. When it's true, the above OrchestrationStep gets skipped and the user journey continues.
Self-Asserted Technical Profile
Hope this helps.

Happy Coding.

Regards,
Jaliya

Thursday, February 1, 2024

.NET 8.0 Isolated Azure Durable Functions: Preserve Stack Order When Passing Between Orchestrators, Activities etc

In this post let's see how we can preserve Stack<T> order when it's getting passed between Orchestrators/Activities in a .NET Isolated Azure Durable Function. 

In Durable Functions in the .NET isolated worker, the default serialization behavior has changed from Newtonsoft.Json to System.Text.Json.

I have already written a post about preserving Stack order in In-Process Azure Durable Functions here. I am using the same code example, just converted to the isolated worker. So I am not going to write down the entire example code describing the issue here; you can have a look at the previous post.
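For context, the shape of that example is roughly as follows; this is just an illustrative sketch of an orchestrator handing a Stack<T> to an activity and getting it back (names are made up, see the previous post for the full code):

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.DurableTask;

public static class StackOrchestration
{
    [Function(nameof(RunOrchestrator))]
    public static async Task<Stack<int>> RunOrchestrator(
        [OrchestrationTrigger] TaskOrchestrationContext context)
    {
        Stack<int> stack = new();
        stack.Push(1);
        stack.Push(2);
        stack.Push(3); // top of the stack

        // The stack is serialized here and deserialized in the activity (and back);
        // without a proper converter the order gets reversed on the way through.
        return await context.CallActivityAsync<Stack<int>>(nameof(EchoStack), stack);
    }

    [Function(nameof(EchoStack))]
    public static Stack<int> EchoStack([ActivityTrigger] Stack<int> stack) => stack;
}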

You can see in the below screenshot that the order of the Stack<T> is not preserved with the default serializer options.

Incorrect Result
With Isolated Durable Functions, we can easily configure the JsonSerializerOptions. We need to add a custom JsonConverter that correctly serializes and deserializes a Stack<T>. There is already JsonConverterFactoryForStackOfT shared by the .NET team that we can use in our Isolated Durable Function as follows.
using DurableFunctions.Isolated.StackSerialization.Converters;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using System.Text.Json;

IHost host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .ConfigureServices(services =>
    {
        services.Configure<JsonSerializerOptions>(options =>
        {
            // Add custom converter to serialize and deserialize a Stack<T>
            options.Converters.Add(new JsonConverterFactoryForStackOfT());
        });
    })
    .Build();

host.Run();
And now once the Serializer options are configured, we can see Stack<T> is getting serialized/deserialized correctly.
Correct Result

Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, January 22, 2024

Monitoring Azure Durable Functions using Durable Functions Monitor

In this post let's have a look at a cool project that you might want to use if you are working on Azure Durable Functions. 

The project is DurableFunctionsMonitor. It provides a UI for monitoring, managing, and debugging orchestration instances in an Azure Durable Functions app.
Durable Functions Monitor (DFM)
Durable Functions Monitor (DFM): Orchestration Sequence Diagram
I have been wanting to use this for several months now but only got to use this recently, and I like it.

This is basically an Azure Function App. The most important thing for Durable Functions Monitor (DFM) is that it needs to know your Durable Function App's storage details, so it can pull the orchestration data and display it. So far I have only used it with Azure Storage, but it seems to support Netherite and Microsoft SQL Server as well.

DFM can run in the following ways:
  • Injected mode: Install a NuGet package and expose the UI through your existing .NET Azure Functions project
  • Standalone: Since this is an Azure Function App, you can run it in the same ways as a typical Azure Function
    • Create a separate Azure Function App and install a NuGet package
    • Docker Container using the image scaletone/durablefunctionsmonitor
    • Deploy to your Kubernetes cluster
  • VS Code Extension
We can also customize things like authentication for the UI, its endpoints, etc.

Do give it a try:
   DurableFunctionsMonitor

Happy Coding.

Regards,
Jaliya