App B: Networking
- App A -> App B: 403 Forbidden. From anywhere else -> App B: 200.
- App A -> App C: 200
Recently I was updating an old .NET Core web application to .NET 8, and the code was reading a certificate as follows.
private X509Certificate2 GetCertificateByThumbprint(string thumbprint)
{
    // Look up the certificate in the current user's personal store by its thumbprint
    using X509Store store = new(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    X509Certificate2Collection certificateCollection = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, validOnly: true);
    return certificateCollection.OfType<X509Certificate2>().SingleOrDefault();
}
This piece of code wasn't working once the application was deployed to Azure App Service (Windows). The certificate was set up in App Service, but the code wasn't picking it up. As usual, the QAs were insisting it used to work.
It turns out I needed to add an app setting WEBSITE_LOAD_CERTIFICATES with comma-separated certificate thumbprints as its value, in order for those certificates to be loaded and accessible from the App Service code.
{
  "name": "WEBSITE_LOAD_CERTIFICATES",
  "value": "<comma-separated-certificate-thumbprints>",
  "slotSetting": false
}
You can read more in Use a TLS/SSL certificate in your code in Azure App Service. It contains instructions for other scenarios, such as loading a certificate from a file and loading a certificate in Linux/Windows containers.
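With the thumbprints listed in WEBSITE_LOAD_CERTIFICATES, the certificate shows up in the CurrentUser\My store and the code above can find it. Just to round things off, here is a minimal sketch (not from the original code) of how the retrieved certificate might then be used, for example for client certificate authentication with HttpClient. The ClientCertificateThumbprint configuration key is a made-up placeholder.
// A minimal sketch: attaching the certificate loaded from the CurrentUser\My store
// to an HttpClient for client certificate authentication.
// "configuration" and "ClientCertificateThumbprint" are hypothetical placeholders.
using System.Security.Cryptography.X509Certificates;

X509Certificate2 certificate = GetCertificateByThumbprint(configuration["ClientCertificateThumbprint"]);

HttpClientHandler handler = new();
handler.ClientCertificates.Add(certificate);

using HttpClient httpClient = new(handler);
// httpClient can now call endpoints that require the client certificate.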
Hope this helps.
Happy Coding.
Regards,
Jaliya
In this post, let's see how to call an external API using Client Credentials in an Azure AD B2C User Journey.
I am assuming the Azure AD B2C App Registration is already set up for the client app with the necessary permission (scope access) to call the protected API, and that you have noted down the Client ID, Client Secret, and the Scope.
Note: There are no additional actions to enable the client credentials for user flows or custom policies. Both Azure AD B2C user flows and custom policies support the client credentials flow by default. But of course, you can create a custom policy to customize the user journey of the OAuth 2.0 Client credentials and extend the token issuance process.
First, you can test that everything is set up correctly using the following PowerShell script.
$clientId = "<clientId>"
$clientSecret = "<clientSecret>"
$endpoint = "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token"
$scope = "<scope>"
$body = "grant_type=client_credentials&scope=" + $scope + "&client_id=" + $clientId + "&client_secret=" + $clientSecret
$token = Invoke-RestMethod -Method Post -Uri $endpoint -Body $body
$token | ConvertTo-Json
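If you prefer doing the same sanity check from C# instead of PowerShell, here is a minimal sketch using HttpClient; it uses the same placeholders as the script above.
// A minimal C# sketch of the same client credentials token request (placeholders assumed).
using System.Text.Json;

string clientId = "<clientId>";
string clientSecret = "<clientSecret>";
string endpoint = "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token";
string scope = "<scope>";

using HttpClient httpClient = new();
using FormUrlEncodedContent content = new(new Dictionary<string, string>
{
    ["grant_type"] = "client_credentials",
    ["scope"] = scope,
    ["client_id"] = clientId,
    ["client_secret"] = clientSecret
});

HttpResponseMessage response = await httpClient.PostAsync(endpoint, content);
response.EnsureSuccessStatusCode();

using JsonDocument tokenResponse = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
string accessToken = tokenResponse.RootElement.GetProperty("access_token").GetString();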
Here, the scope is something like the following:
$scope = "https://<tenant-name>.onmicrosoft.com/45a2252d-099a-4c6a-9c57-66eac05e2693/.default"
Test Client Credentials
With the client credentials flow verified, let's wire it into the custom policy user journey.
1. Define a ClaimType for access_token.
<BuildingBlocks>
  <ClaimsSchema>
    ...
    <ClaimType Id="access_token">
      <DisplayName>Access Token</DisplayName>
      <DataType>string</DataType>
    </ClaimType>
  </ClaimsSchema>
  ...
</BuildingBlocks>
2. Define TechnicalProfiles to retrieve access_token and to call the external API using the retrieved access_token.
<ClaimsProvider>
  ...
  <TechnicalProfiles>
    <TechnicalProfile Id="REST-GetClientCredentials">
      <DisplayName>Get Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
      <Metadata>
        <Item Key="ServiceUrl">
          https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token?grant_type=client_credentials&amp;scope=<scope>&amp;client_id=<clientId>&amp;client_secret=<clientSecret>
        </Item>
        <Item Key="SendClaimsIn">Body</Item>
        <Item Key="AuthenticationType">None</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
      </Metadata>
      <OutputClaims>
        <OutputClaim ClaimTypeReferenceId="access_token" />
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop" />
    </TechnicalProfile>
    <TechnicalProfile Id="REST-CallApiUsingClientCredentials">
      <DisplayName>Call an External API using Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
      <Metadata>
        <Item Key="ServiceUrl"><Endpoint to call></Item>
        <Item Key="SendClaimsIn">Header</Item>
        <Item Key="AuthenticationType">Bearer</Item>
        <Item Key="UseClaimAsBearerToken">access_token</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
        <Item Key="IncludeClaimResolvingInClaimsHandling">true</Item>
      </Metadata>
      <InputClaims>
        <InputClaim ClaimTypeReferenceId="access_token" />
      </InputClaims>
      <OutputClaims>
        <!-- Output Claims from Calling the API -->
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop" />
    </TechnicalProfile>
    ...
  </TechnicalProfiles>
</ClaimsProvider>
3. Finally, introduce additional OrchestrationSteps to your UserJourney to use the above TechnicalProfiles.
<UserJourneys>
  <UserJourney Id="<UserJourneyId>">
    <OrchestrationSteps>
      ...
      <OrchestrationStep Order="7" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTGetClientCredentials" TechnicalProfileReferenceId="REST-GetClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      <OrchestrationStep Order="8" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTCallApiUsingClientCredentials" TechnicalProfileReferenceId="REST-CallApiUsingClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      ...
      <OrchestrationStep Order="11" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
    </OrchestrationSteps>
  </UserJourney>
</UserJourneys>
Now that should be it.
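As a side note, the RESTful provider expects the endpoint configured in REST-CallApiUsingClientCredentials to return a JSON object, and its properties are what you map to OutputClaims. Here is a rough sketch of what such an external API could look like as an ASP.NET Core minimal API; the route, the returned claim, and the token validation values are all assumptions for illustration, not something from the policy above.
// A rough sketch of an external API that validates the B2C-issued client credentials token
// and returns a JSON object whose properties B2C can map to output claims.
// Requires the Microsoft.AspNetCore.Authentication.JwtBearer package.
// The authority, audience, route, and claim below are assumed placeholders.
using Microsoft.AspNetCore.Authentication.JwtBearer;

var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        // Metadata of the policy that issued the token
        options.Authority = "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/v2.0";
        options.TokenValidationParameters.ValidAudience = "<api-application-id>";
    });
builder.Services.AddAuthorization();

var app = builder.Build();

app.UseAuthentication();
app.UseAuthorization();

// The properties of the returned JSON object become the claims you list under OutputClaims
app.MapGet("/api/user-metadata", () => Results.Ok(new
{
    someClaim = "some value"
}))
.RequireAuthorization();

app.Run();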
Hope this helps.
Happy Coding.
using Polly;
using System.Diagnostics;
using System.Runtime.InteropServices;

namespace HelloAzureFunctions.Tests.Integration.Fixtures;

public class AzureFunctionFixture : IDisposable
{
    private readonly string _path = Directory.GetCurrentDirectory();
    private readonly string _testOutputPath = Path.Combine(Directory.GetCurrentDirectory(), "integration-test-output.log");
    private readonly int _port = 7071;
    private readonly string _baseUrl;
    private readonly Process _process;

    public readonly HttpClient HttpClient;

    public AzureFunctionFixture()
    {
        _baseUrl = $"http://localhost:{_port}";
        HttpClient = new HttpClient()
        {
            BaseAddress = new Uri(_baseUrl)
        };

        if (File.Exists(_testOutputPath))
        {
            File.Delete(_testOutputPath);
        }

        DirectoryInfo directoryInfo = new(_path);
        _process = StartProcess(_port, directoryInfo);
        _process.OutputDataReceived += (sender, args) =>
        {
            // args.Data is null once the output stream closes
            if (args.Data is not null)
            {
                File.AppendAllLines(_testOutputPath, [args.Data]);
            }
        };
        _process.BeginOutputReadLine();
    }

    public void Dispose()
    {
        if (!_process.HasExited)
        {
            _process.Kill(entireProcessTree: true);
        }
        _process.Dispose();
        HttpClient.Dispose();
    }

    public async Task WaitUntilFunctionsAreRunning()
    {
        // Poll the root endpoint every 500 ms until it responds, giving up after 30 seconds
        PolicyResult<HttpResponseMessage> result =
            await Policy.TimeoutAsync(TimeSpan.FromSeconds(30))
                .WrapAsync(Policy.Handle<Exception>().WaitAndRetryForeverAsync(index => TimeSpan.FromMilliseconds(500)))
                .ExecuteAndCaptureAsync(() => HttpClient.GetAsync(""));

        if (result.Outcome != OutcomeType.Successful)
        {
            throw new InvalidOperationException("The Azure Functions project doesn't seem to be running.");
        }
    }

    private static Process StartProcess(int port, DirectoryInfo workingDirectory)
    {
        string fileName = "func";
        string arguments = $"start --port {port} --verbose";

        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            // On Windows, launch func through PowerShell so the npm-installed shim resolves correctly
            fileName = "powershell.exe";
            arguments = $"func start --port {port} --verbose";
        }

        ProcessStartInfo processStartInfo = new(fileName, arguments)
        {
            UseShellExecute = false,
            CreateNoWindow = true,
            RedirectStandardOutput = true,
            WorkingDirectory = workingDirectory.FullName,
            EnvironmentVariables =
            {
                // Passing an additional environment variable to the application,
                // so it can adjust its behavior when running integration tests
                [ApplicationConstants.IsRunningIntegrationTests] = "true"
            }
        };

        Process process = new() { StartInfo = processStartInfo };
        process.Start();

        return process;
    }
}
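Here is a rough sketch of how a test class might consume this fixture; it assumes xUnit and a hypothetical api/hello endpoint, so treat those names as placeholders.
// A rough sketch (assumptions: xUnit and a hypothetical "api/hello" endpoint)
// showing how the AzureFunctionFixture above might be consumed from an integration test.
using System.Net;
using HelloAzureFunctions.Tests.Integration.Fixtures;
using Xunit;

namespace HelloAzureFunctions.Tests.Integration;

public class HelloFunctionTests : IClassFixture<AzureFunctionFixture>
{
    private readonly AzureFunctionFixture _fixture;

    public HelloFunctionTests(AzureFunctionFixture fixture)
    {
        _fixture = fixture;
    }

    [Fact]
    public async Task Get_HelloEndpoint_ReturnsOk()
    {
        // Make sure the Azure Functions host started by the fixture is up before calling it
        await _fixture.WaitUntilFunctionsAreRunning();

        HttpResponseMessage response = await _fixture.HttpClient.GetAsync("api/hello");

        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }
}
With that in place, the GitHub Actions workflow below runs the integration tests on both Ubuntu and Windows runners.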
name: Run Integration Tests

on:
  push:
    branches: ["main"]
    paths-ignore:
      - '**.md'

env:
  DOTNET_VERSION: '8.0.x'

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    env:
      INTEGRATION_TEST_EXECUTION_DIRECTORY: ./tests/HelloAzureFunctions.Tests.Integration/bin/Debug/net8.0
    steps:
      - name: 'Checkout GitHub Action'
        uses: actions/checkout@v3
      - name: Setup .NET ${{ env.DOTNET_VERSION }} Environment
        uses: actions/setup-dotnet@v3
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
      - name: Build
        run: dotnet build
      # Install Azure Functions Core Tools in the runner,
      # so we have access to 'func' to spin up the Azure Functions app in integration tests
      - name: Install Azure Functions Core Tools
        run: |
          npm install -g azure-functions-core-tools@4 --unsafe-perm true
      # Set up Azurite in the runner,
      # so the Azure Functions app we are going to spin up can use Azurite as its storage provider
      - name: Setup Azurite
        shell: bash
        run: |
          npm install -g azurite
          azurite --silent &
      - name: Run Integration Tests
        # If there are any errors executing the integration tests, uncomment the following line to continue the workflow, so you can look at integration-test-output.log
        # continue-on-error: true
        run: dotnet test ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/HelloAzureFunctions.Tests.Integration.dll
      - name: Upload Integration Tests Execution Log
        uses: actions/upload-artifact@v4
        with:
          name: artifact-${{ matrix.os }}
          path: ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/integration-test-output.log