Tuesday, March 19, 2024

App Service Outbound Traffic through VNet Gets 403 When Trying to Access Another App Service That Has Public Network Access Enabled but Also Has a Private Endpoint

In this post, let's go through an interesting scenario related to App Service networking.

- App A: integrated into VNet A.
- App B: Public network access enabled with no access restrictions, but it has a Private Endpoint in VNet B.
- App C: Public network access enabled with no access restrictions. No private endpoints.
Now the interesting part. I am seeing the following access behavior.
- App A -> App B: 403 Forbidden. From anywhere else -> App B: 200.
- App A -> App C: 200
I was scratching my head for a couple of days trying to understand why App A -> App B returns 403 Forbidden, because App B has public network access enabled, and a private endpoint and public access can co-exist on an app.
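To reproduce it, a plain outbound HTTP request from code running in App A to App B's default hostname is enough; here is a minimal sketch (the hostname is a placeholder).

// Minimal repro sketch (hypothetical hostname): an outbound call from code
// running in App A to App B's default (public) endpoint.
using HttpClient client = new();

HttpResponseMessage response = await client.GetAsync("https://app-b.azurewebsites.net");

// 403 Forbidden when called from App A, 200 OK from anywhere else.
Console.WriteLine((int)response.StatusCode);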

I could fix this by peering VNet A and VNet B, but I still needed to figure out why App A wasn't reaching App B on its default endpoint (not the private endpoint), the same way App B receives traffic from any other source.

Finally, I got an explanation from Mads Wolff Damgård (a Principal Product Manager at Microsoft).

This was happening because I had a service endpoint for Microsoft.Web registered on App A's VNet integration subnet in VNet A.
When a service endpoint is registered, the traffic is still sent over the public channel, but as service endpoint traffic. This uses the same protocol as a private endpoint, so the receiving side tries to parse it as a private endpoint call; but since VNet A has no knowledge of the private endpoint, the traffic fails with 403.

Once I removed the service endpoint registration, App A was able to reach App B without any issues.

So hope someone finds this helpful!

Happy Coding.

Regards,
Jaliya

Friday, March 15, 2024

Read TLS/SSL Certificate in Azure App Service from C# Code

Recently I was updating an old .NET Core web application to .NET 8 and the code was reading a certificate as follows.

private X509Certificate2 GetCertificateByThumbprint(string thumbprint)
{
    // On Windows, App Service loads certificates into the CurrentUser\My store.
    using X509Store store = new(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);

    // validOnly: true filters out expired or otherwise invalid certificates.
    X509Certificate2Collection certificateCollection =
        store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, validOnly: true);

    return certificateCollection.OfType<X509Certificate2>().SingleOrDefault();
}

This piece of code wasn't working once the application was deployed to Azure App Service (Windows). The certificate was set up in App Service, but the code wasn't picking it up. As usual, the QAs were insisting it used to work.

It turned out I needed to add an app setting WEBSITE_LOAD_CERTIFICATES with a value of comma-separated certificate thumbprints in order for them to be loaded and accessible from the App Service code.

{
  "name": "WEBSITE_LOAD_CERTIFICATES",
  "value": "<comma-separated-certificate-thumbprints>",
  "slotSetting": false
}
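
Once the setting is in place, a quick way to verify (a minimal sketch, assuming a Windows App Service) is to enumerate the CurrentUser\My store from the app itself:

using System.Security.Cryptography.X509Certificates;

// Verification sketch: certificates loaded via WEBSITE_LOAD_CERTIFICATES
// should show up in the CurrentUser\My store on a Windows App Service.
using X509Store store = new(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);

foreach (X509Certificate2 certificate in store.Certificates)
{
    Console.WriteLine($"{certificate.Thumbprint} - {certificate.Subject}");
}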

You can read more on Use a TLS/SSL certificate in your code in Azure App Service. It contains instructions for other scenarios like loading a certificate from a file and loading a certificate in Linux/Windows containers.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, March 11, 2024

Azure AD B2C: Call an External API Using Client Credentials in a User Journey

In this post, let's see how to call an external API using Client Credentials in an Azure AD B2C User Journey.

I am assuming an Azure AD B2C App Registration is already set up for the client app with the necessary permissions (scope access) to call the protected API, and you have noted down the Client ID, Client Secret, and the Scope.

Note: There are no additional actions needed to enable client credentials for user flows or custom policies. Both Azure AD B2C user flows and custom policies support the client credentials flow by default. But of course, you can create a custom policy to customize the user journey of the OAuth 2.0 client credentials flow and extend the token issuance process.

First, you can test that everything is set up correctly using the following PowerShell script.

$clientId = "<clientId>"
$clientSecret = "<clientSecret>"
$endpoint = "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token"
$scope = "<scope>"
$body = "grant_type=client_credentials&scope=" + $scope + "&client_id=" + $clientId + "&client_secret=" + $clientSecret

$token = Invoke-RestMethod -Method Post -Uri $endpoint -Body $body
$token | ConvertTo-Json

Here, the scope is something like the following:

$scope = "https://<tenant-name>.onmicrosoft.com/45a2252d-099a-4c6a-9c57-66eac05e2693/.default"
The script should output the token response, which contains the access_token.
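If you'd rather test from C#, an equivalent minimal sketch (values are placeholders) would be:

// Requesting a token using client credentials (minimal sketch; values are placeholders).
using HttpClient client = new();

Dictionary<string, string> form = new()
{
    ["grant_type"] = "client_credentials",
    ["scope"] = "<scope>",
    ["client_id"] = "<clientId>",
    ["client_secret"] = "<clientSecret>"
};

HttpResponseMessage response = await client.PostAsync(
    "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token",
    new FormUrlEncodedContent(form));

Console.WriteLine(await response.Content.ReadAsStringAsync());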
Now let's see how we can use this in an Azure AD B2C User Journey.

1. Define a ClaimType for access_token.

<BuildingBlocks>
  <ClaimsSchema>
    ...
    <ClaimType Id="access_token">
      <DisplayName>Access Token</DisplayName>
      <DataType>string</DataType>
    </ClaimType>
    ...
  </ClaimsSchema>
</BuildingBlocks>

2. Define TechnicalProfiles to retrieve access_token and to call the external API using the retrieved access_token.

<ClaimsProvider>
  ...
  <TechnicalProfiles>
    <TechnicalProfile Id="REST-GetClientCredentials">
      <DisplayName>Get Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
      <Metadata>
        <Item Key="ServiceUrl">
          https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token?grant_type=client_credentials&amp;scope=<scope>&amp;client_id=<clientId>&amp;client_secret=<clientSecret>
        </Item>
        <Item Key="SendClaimsIn">Body</Item>
        <Item Key="AuthenticationType">None</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
      </Metadata>
      <OutputClaims>
        <OutputClaim ClaimTypeReferenceId="access_token"/>
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop"/>
    </TechnicalProfile>
    <TechnicalProfile Id="REST-CallApiUsingClientCredentials">
      <DisplayName>Call an External API using Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
      <Metadata>
        <Item Key="ServiceUrl"><Endpoint to call></Item>
        <Item Key="SendClaimsIn">Header</Item>
        <Item Key="AuthenticationType">Bearer</Item>
        <Item Key="UseClaimAsBearerToken">access_token</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
        <Item Key="IncludeClaimResolvingInClaimsHandling">true</Item>
      </Metadata>
      <InputClaims>
        <InputClaim ClaimTypeReferenceId="access_token"/>
      </InputClaims>
      <OutputClaims>
        <!-- Output Claims from Calling the API -->
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop" />
    </TechnicalProfile>
    ...
  </TechnicalProfiles>
</ClaimsProvider>

3. Finally, introduce additional OrchestrationSteps to your UserJourney to use the above TechnicalProfiles.

<UserJourneys>
  <UserJourney Id="<UserJourneyId>">
    <OrchestrationSteps>
      ...
      <OrchestrationStep Order="7" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTGetClientCredentials" TechnicalProfileReferenceId="REST-GetClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      <OrchestrationStep Order="8" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTCallApiUsingClientCredentials" TechnicalProfileReferenceId="REST-CallApiUsingClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      ...
      <OrchestrationStep Order="11" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
    </OrchestrationSteps>
  </UserJourney>
</UserJourneys>

Now that should be it.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Friday, March 1, 2024

Creating Integration Tests for Azure Functions

I wanted to have some integration tests for Azure Functions, especially for some complex durable functions. When you have durable functions and want to make sure the orchestrations are behaving as expected, integration tests are the only way to ensure that. Another important thing: I needed to be able to run these tests not just locally, but in a CI pipeline (GitHub Workflows, Azure DevOps Pipelines, etc.) as well.

Unfortunately as of today, there is no proper integration test mechanism for Azure Durable Functions (or Azure Functions) like we have for ASP.NET Core applications.

I came up with the following approach after gathering inputs from GitHub issues and other related posts on the subject.

The basic concept is as follows. Note: I am using xUnit.net as my testing framework.

1. Create a fixture class that implements IDisposable. In the constructor, I am spinning up the Function App under test using func start, and doing the cleanup in Dispose().

2. Create an xUnit Collection Fixture using the above fixture. Basically, my single test context (the Function App) gets shared among the tests in several test classes, and it gets cleaned up after all the tests in those classes have finished. (A sketch of the collection definition and a sample test class follows the fixture below.)

My fixture looks something like below.
using Polly;
using System.Diagnostics;
using System.Runtime.InteropServices;

namespace HelloAzureFunctions.Tests.Integration.Fixtures;

public class AzureFunctionFixture : IDisposable
{
    private readonly string _path = Directory.GetCurrentDirectory();
    private readonly string _testOutputPath = Path.Combine(Directory.GetCurrentDirectory(), "integration-test-output.log");
    private readonly int _port = 7071;
    private readonly string _baseUrl;
    private readonly Process _process;

    public readonly HttpClient HttpClient;

    public AzureFunctionFixture()
    {
        _baseUrl = $"http://localhost:{_port}";

        HttpClient = new HttpClient()
        {
            BaseAddress = new Uri(_baseUrl)
        };

        if (File.Exists(_testOutputPath))
        {
            File.Delete(_testOutputPath);
        }

        DirectoryInfo directoryInfo = new(_path);
        _process = StartProcess(_port, directoryInfo);
        _process.OutputDataReceived += (sender, args) =>
        {
            File.AppendAllLines(_testOutputPath, [args.Data]);
        };
        _process.BeginOutputReadLine();
    }

    public void Dispose()
    {
        if (!_process.HasExited)
        {
            _process.Kill(entireProcessTree: true);
        }

        _process.Dispose();
        HttpClient.Dispose();
    }

    public async Task WaitUntilFunctionsAreRunning()
    {
        PolicyResult<HttpResponseMessage> result =
            await Policy.TimeoutAsync(TimeSpan.FromSeconds(30))
                .WrapAsync(Policy.Handle<Exception>().WaitAndRetryForeverAsync(index => TimeSpan.FromMilliseconds(500)))
                .ExecuteAndCaptureAsync(() => HttpClient.GetAsync(""));

        if (result.Outcome != OutcomeType.Successful)
        {
            throw new InvalidOperationException("The Azure Functions project doesn't seem to be running.");
        }
    }

    private static Process StartProcess(int port, DirectoryInfo workingDirectory)
    {
        string fileName = "func";
        string arguments = $"start --port {port} --verbose";

        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            fileName = "powershell.exe";
            arguments = $"func start --port {port} --verbose";
        }

        ProcessStartInfo processStartInfo = new(fileName, arguments)
        {
            UseShellExecute = false,
            CreateNoWindow = true,
            RedirectStandardOutput = true,
            WorkingDirectory = workingDirectory.FullName,
            EnvironmentVariables =
            { 
                // Passing an additional environment variable to the application,
                // So it can control the behavior when running for Integration Tests

                [ApplicationConstants.IsRunningIntegrationTests] = "true"
            }
        };

        Process process = new() { StartInfo = processStartInfo };
        process.Start();

        return process;
    }
}
I can use this fixture for my tests and it will work fine for running integration tests locally.
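
Here is a minimal sketch of the collection definition and a sample test class consuming the fixture (the test class name, route, and assertion are hypothetical placeholders):

using HelloAzureFunctions.Tests.Integration.Fixtures;
using Xunit;

namespace HelloAzureFunctions.Tests.Integration;

// Ties the fixture to a named collection, so a single AzureFunctionFixture instance
// is created once and shared across all test classes in the collection.
[CollectionDefinition(nameof(AzureFunctionCollection))]
public class AzureFunctionCollection : ICollectionFixture<AzureFunctionFixture>
{
}

[Collection(nameof(AzureFunctionCollection))]
public class HttpTriggerTests(AzureFunctionFixture fixture)
{
    [Fact]
    public async Task HttpTrigger_Should_Return_Success()
    {
        // Wait until 'func start' has the host up and listening.
        await fixture.WaitUntilFunctionsAreRunning();

        // 'api/hello' is a hypothetical route; use an actual function route here.
        HttpResponseMessage response = await fixture.HttpClient.GetAsync("api/hello");

        Assert.True(response.IsSuccessStatusCode);
    }
}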

Now we need to be able to run these tests in a CI pipeline. I am using the following GitHub workflow.
name: Run Integration Tests

on:
  push:
    branches: ["main"]
    paths-ignore:
      - '**.md'

env:
  DOTNET_VERSION: '8.0.x'

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    env:
      INTEGRATION_TEST_EXECUTION_DIRECTORY: ./tests/HelloAzureFunctions.Tests.Integration/bin/Debug/net8.0

    steps:
    - name: 'Checkout GitHub Action'
      uses: actions/checkout@v3

    - name: Setup .NET ${{ env.DOTNET_VERSION }} Environment
      uses: actions/setup-dotnet@v3
      with:
        dotnet-version: ${{ env.DOTNET_VERSION }}

    - name: Build
      run: dotnet build

    # Install Azure Functions Core Tools in the runner, 
    # so we have access to 'func' to spin up the Azure Functions app in integration tests
    - name: Install Azure Functions Core Tools 
      run: |
        npm install -g azure-functions-core-tools@4 --unsafe-perm true

    # Setup Azurite in the runner, 
    # so the Azure Functions app we are going to spin up can use Azurite as its storage provider
    - name: Setup Azurite 
      shell: bash
      run: |
        npm install -g azurite
        azurite --silent &

    - name: Run Integration Tests
      # If there are any errors executing integration tests, uncomment the following line to continue the workflow, so you can look at integration-test-output.log
      # continue-on-error: true 
      run: dotnet test ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/HelloAzureFunctions.Tests.Integration.dll

    - name: Upload Integration Tests Execution Log
      uses: actions/upload-artifact@v4
      with:
        name: artifact-${{ matrix.os }}
        path: ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/integration-test-output.log
When the workflow runs, the Build and Test job runs on both ubuntu-latest and windows-latest, executing the integration tests on each.
You can find the full sample code in this repo:

Happy Coding.

Regards,
Jaliya