Thursday, September 4, 2025

ASP.NET Core 10.0: Custom Validation Support for Minimal APIs

In a previous post, I wrote about ASP.NET Core 10.0: Validation Support for Minimal APIs. In this post, let's go a bit further and see how we can implement custom validations, both by creating ValidationAttribute implementations and by implementing the IValidatableObject interface.

ValidationAttribute 


With ValidationAttribute, we can create a custom attribute containing our own validation logic.
public class CustomEmptyValidationAttribute : ValidationAttribute
{
    protected override ValidationResult? IsValid(object? value, ValidationContext validationContext)
    {
        if (value is string str && string.IsNullOrEmpty(str))
        {
            return new ValidationResult("Value cannot be null or empty.");
        }

        return ValidationResult.Success;
    }
}
And then we can apply the attribute, something like below.
internal record Employee([CustomEmptyValidation] string Name);

IValidatableObject 


A class/record can implement IValidatableObject and add the validation logic. The validation will kick in as part of model binding.
internal record Employee : IValidatableObject
{
    [Range(1, int.MaxValue)]
    public int Id { get; set; }

    public string Name { get; set; }

    public IEnumerable<ValidationResult> Validate(ValidationContext validationContext)
    {
        if (string.IsNullOrEmpty(Name))
        {
            yield return new ValidationResult("Name cannot be null or empty.", [nameof(Name)]);
        }
    }
}
Note: Currently there is a bug where IValidatableObject doesn't trigger validation when there is no validation attribute on a property. (aspnetcore/issues/63394: ASP.NET Core 10.0: Built-in Validation with IValidatableObject)

Hope this helps.

Happy Coding.

Regards,
Jaliya

Saturday, August 30, 2025

ASP.NET Core 10.0: Validation Support for Minimal APIs

With ASP.NET Core 10.0, we now have built-in validation support in Minimal APIs for request data in the following:
  • Route parameters, Query Strings
  • Header
  • Request body
If any validation fails, the runtime returns a 400 Bad Request response with details of the validation errors.
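For example, with a required Name property left empty, the 400 response body looks roughly like the following (the exact type URL and messages may vary):

```json
{
  "title": "One or more validation errors occurred.",
  "status": 400,
  "errors": {
    "Name": [
      "The Name field is required."
    ]
  }
}
```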

Validations are defined using attributes in the System.ComponentModel.DataAnnotations namespace. We can even create our own custom validators using ValidationAttribute and IValidatableObject.
To register validation services and enable validation, we need to call the following method in Program.cs.
builder.Services.AddValidation();
Now we can do something like below to validate route parameters.
app.MapGet("/employees/{employeeId}", ([Range(1, int.MaxValue)] int employeeId) =>
{
    // Omitted
});
And if we try the endpoint with an incorrect route parameter, we will get a validation error.
GET {{WebApplication1_HostAddress}}/employees/0
Accept: application/json
Route parameter validation
We can use the same concept with record types as well.

Say I have the following Employee record that has an annotated property.
internal record Employee([Required] string Name);
And now if I try to make a request to the following endpoint,
app.MapPost("/employees", (Employee employee) =>
{
    // Omitted
});
With an empty value for name,
POST {{WebApplication1_HostAddress}}/employees
Content-Type: application/json
{
    "name": ""
}
I am getting the following 400 Bad Request.
Request Body Validation
You can disable validation at the endpoint level by calling DisableValidation(), something like below:
app.MapPost("/employees", (Employee employee) =>
    {
        // Omitted
    })
    .DisableValidation();
Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, August 27, 2025

Azure Logic Apps (Consumption): HTTP Action to POST multipart/form-data with Files and Fields

In this post, let's see how we can POST multipart/form-data with files and fields using an HTTP action in a Consumption Azure Logic App.

I have the following Test API which I am going to call from the Logic App.
WebApplicationBuilder builder = WebApplication.CreateBuilder(args);

WebApplication app = builder.Build();

app.UseHttpsRedirection();

app.MapPost("/api/Files", async (IFormCollection formCollection) =>
{
    IFormFile? file = formCollection.Files.SingleOrDefault();

    if (file == null || file.Length == 0)
    {
        return Results.BadRequest("No file uploaded.");
    }

    string fileName = file.FileName;
    // Save file and ensure file is good
    string filePath = Path.Combine(@"<some-location>", fileName);
    using FileStream stream = File.Create(filePath);
    await file.CopyToAsync(stream);

    return Results.Ok(new
    {
        fileName,
        fileSize = file.Length,
        someField = formCollection["someField"].ToString()
    });
})
.DisableAntiforgery();

app.Run();
In my Logic App, I have a variable called file of type object and it's populated with data.
{
  "fileName": "<OMITTED>", // some-file.pdf
  "base64Content": "<OMITTED>", // Base64 encoded content
  "contentType": "<OMITTED>" // application/pdf
}
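In case it helps, populating such a variable is just a matter of Base64 encoding the file content. A minimal Python sketch of the shape (the file name and content here are dummies):

```python
import base64
import os
import tempfile

def build_file_variable(file_path, content_type):
    # Build the Logic App 'file' variable: fileName, base64Content, contentType
    with open(file_path, "rb") as f:
        base64_content = base64.b64encode(f.read()).decode("ascii")
    return {
        "fileName": os.path.basename(file_path),
        "base64Content": base64_content,
        "contentType": content_type,
    }

# Demonstrate with a throwaway file standing in for some-file.pdf
with tempfile.NamedTemporaryFile(suffix=".pdf", delete=False) as tmp:
    tmp.write(b"%PDF-1.4 dummy content")
    tmp_path = tmp.name

file_variable = build_file_variable(tmp_path, "application/pdf")
os.remove(tmp_path)
```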
And now let's add the HTTP action as follows:
HTTP Action
Code for Body is below.
{
  "$content-type": "multipart/form-data",
  "$multipart": [
    {
      "headers": {
        "Content-Disposition": "form-data; name=\"file\"; filename=\"@{variables('file')?['fileName']}\""
      },
      "body": {
        "$content": "@{variables('file')?['base64Content']}",
        "$content-type": "@{variables('file')?['contentType']}"
      }
    },
    {
      "headers": {
        "Content-Disposition": "form-data; name=\"someField\""
      },
      "body": "Hello World!"
    }
  ]
}
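For clarity, the $multipart array above maps onto a standard multipart body: each entry becomes a part between boundary lines, with its headers followed by a blank line and the body. A rough Python sketch of the equivalent payload (the boundary and placeholder values are illustrative, not what Logic Apps actually generates):

```python
def build_multipart(parts, boundary="----logic-app-boundary"):
    # Assemble a multipart/form-data body from (headers, body) part pairs
    lines = []
    for headers, body in parts:
        lines.append(f"--{boundary}")
        for name, value in headers.items():
            lines.append(f"{name}: {value}")
        lines.append("")  # blank line separates headers from body
        lines.append(body)
    lines.append(f"--{boundary}--")
    return "\r\n".join(lines)

body = build_multipart([
    ({"Content-Disposition": 'form-data; name="file"; filename="some-file.pdf"',
      "Content-Type": "application/pdf"}, "<decoded file bytes>"),
    ({"Content-Disposition": 'form-data; name="someField"'}, "Hello World!"),
])
```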
And now when the HTTP action is executed, I can see the values are getting passed correctly.
API Endpoint
Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, August 20, 2025

Azure Functions HTTP Orchestration Trigger with multipart/form-data

In this post, let's see how we can invoke an HTTP Orchestration Trigger in Azure Durable Functions with multipart/form-data. There can be scenarios where you want to pass multipart/form-data (mostly files) to an HTTP Orchestration Trigger.

Note: I am using the Azure Functions Isolated worker model.

If you scaffold a Durable Function in Visual Studio, you will see its HTTP Trigger function looks something like below.
[Function("Function_HttpStart")]
public static async Task<HttpResponseData> HttpStart(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestData req,
    [DurableClient] DurableTaskClient client,
    FunctionContext executionContext)
{
    string instanceId = await client.ScheduleNewOrchestrationInstanceAsync(nameof(Function));
        
    // Omitted for brevity.
        
    return await client.CreateCheckStatusResponseAsync(req, instanceId);
}
Notice that it uses HttpRequestData for the request and HttpResponseData for the response, which are available via,
Microsoft.Azure.Functions.Worker.Extensions.Http
Instead of these, we can make use of Azure Functions ASP.NET Core integration and start using ASP.NET Core Request/Response types including HttpRequest, HttpResponse and IActionResult in HTTP Triggers.

Azure Functions ASP.NET Core integration is available via following NuGet package.
Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore
And then in the Program.cs,
FunctionsApplicationBuilder builder = FunctionsApplication.CreateBuilder(args);

// Enable ASP.NET Core features in Azure Functions
builder.ConfigureFunctionsWebApplication();
// Omitted for brevity
And now we can change the HTTP trigger as follows.
[Function("Function_HttpStart")]
public static async Task<IActionResult> HttpStart(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequest req,
    [DurableClient] DurableTaskClient client,
    FunctionContext executionContext)
{
    string instanceId = await client.ScheduleNewOrchestrationInstanceAsync(nameof(Function));

    if (req.HasFormContentType)
    {
        IFormCollection formCollection = await req.ReadFormAsync();
        // TODO: Process form data as needed.
    }

    // Omitted for brevity.

    HttpManagementPayload httpManagementPayload = client.CreateHttpManagementPayload(instanceId);
    return new ObjectResult(httpManagementPayload)
    {
        StatusCode = (int)HttpStatusCode.Accepted
    };
}
By the way, if you want to use multipart/form-data in any of the Azure Functions HTTP Triggers, Azure Functions ASP.NET Core integration is the way to go.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Saturday, August 9, 2025

Azure Automation Runbooks and Azure Cosmos DB for MongoDB

I recently wanted to run a daily job on an Azure Cosmos DB for MongoDB. I thought Logic Apps would be a good fit, but surprisingly there is still no connector that supports Azure Cosmos DB for MongoDB (currently only Azure Cosmos DB for NoSQL is supported), and that's a bummer.

But there are of course other approaches we can take, like Azure Automation Runbooks. 

In this post, let's see how we can create an Azure Automation Python Runbook to query Azure Cosmos DB for MongoDB.

I have an Azure Automation account created and I have created a Python 3.10 Runbook. Now in order to connect to MongoDB, I am going to use pymongo package. 

First let's add the package to the Automation Account. Note: For Python 3.10 packages, only .whl files targeting cp310 on Linux are currently supported.

I am downloading the pymongo package to my local computer.

pip download pymongo `
    --platform manylinux2014_x86_64 `
    --only-binary=:all: `
    --python-version 3.10

Upon completion of the above command, I can see 2 .whl files: pymongo and a dependency.

.whl files
Then I am uploading both these .whl files to Python packages under Automation Account.

Add Python packages

Now I can run some Python code to query my Azure Cosmos DB for MongoDB.

from pymongo import MongoClient

MONGODB_CONNECTIONSTRING = "mongodb://..."
DATABASE_NAME = "<Database_Name>"
COLLECTION_NAME = "<Collection_Name>"

client = MongoClient(MONGODB_CONNECTIONSTRING, ssl=True)
db = client[DATABASE_NAME]
collection = db[COLLECTION_NAME]

documents = collection.find(
    {
        # query to filter documents
    })

documents = list(documents)

client.close()
# TODO: Work with documents
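Since this is a daily job, the query filter would typically limit results to the last day. A small sketch of building such a filter to pass to collection.find() (the createdAt field name is an assumption about the document shape):

```python
from datetime import datetime, timedelta, timezone

def daily_filter(now=None):
    # Match documents created within the last 24 hours;
    # 'createdAt' is a hypothetical field name in the documents
    now = now or datetime.now(timezone.utc)
    return {"createdAt": {"$gte": now - timedelta(hours=24)}}

query = daily_filter()
```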

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, July 23, 2025

.NET Isolated Azure Durable Functions: Distributed Tracing

In this post, let's have a look at the power of Distributed Tracing in .NET Isolated Azure Durable Functions. This is one of the new features that became generally available a few weeks ago.

First let's have a look at a simple Durable Function and see how it's logged in Application Insights.
public static class Function
{
    [Function(nameof(HttpStart))]
    public static async Task<HttpResponseData> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req,
        [DurableClient] DurableTaskClient client,
        FunctionContext executionContext)
    {
        string instanceId =
            await client.ScheduleNewOrchestrationInstanceAsync(nameof(RunOrchestrator));

        return await client.CreateCheckStatusResponseAsync(req, instanceId);
    }

    [Function(nameof(RunOrchestrator))]
    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] TaskOrchestrationContext context)
    {
        EntityInstanceId entityId = new(nameof(HelloHistoryEntity), "helloHistory");

        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Reset));

        string result = await context.CallActivityAsync<string>(nameof(SayHello), "Tokyo");
        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Add),
            result);

        result = await context.CallActivityAsync<string>(nameof(SayHello), "Seattle");
        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Add),
            result);

        result = await context.CallActivityAsync<string>(nameof(SayHello), "London");
        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Add),
            result);

        List<string> outputs = await context.Entities.CallEntityAsync<List<string>>(entityId,
            nameof(HelloHistoryEntity.Get));
        return outputs;
    }

    [Function(nameof(SayHello))]
    public static string SayHello([ActivityTrigger] string name, FunctionContext executionContext)
    {
        return $"Hello {name}!";
    }
}

public class HelloHistoryEntity : TaskEntity<List<string>>
{
    public void Add(string message) => State.Add(message);

    public void Reset() => State = [];

    public List<string> Get() => State;

    [Function(nameof(HelloHistoryEntity))]
    public Task RunEntityAsync([EntityTrigger] TaskEntityDispatcher dispatcher)
    {
        return dispatcher.DispatchAsync(this);
    }
}
Here I have a single Orchestrator that gets triggered by an HTTP function, and the Orchestrator calls an Activity and an Entity a few times.

Once I trigger the HTTP function, the logs for the HTTP request look like below.
Without Distributed Tracing
And this isn't quite helpful.

Now let's enable Distributed Tracing. For .NET Isolated Durable Functions, Distributed Tracing V2 is supported with Microsoft.Azure.Functions.Worker.Extensions.DurableTask >= v1.4.0. Make sure to update your packages before doing the next step.

Now modify the host.json as follows.
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "tracing": {
        "distributedTracingEnabled": true,
        "version": "V2"
      }
    }
  }
}
And that's about it.

Now if I trigger the HTTP function, the logs for the HTTP request look like below.
Distributed Tracing
Now we can see the full execution, the call to the Orchestrator and all related Activity and Entity calls.

Isn't it just nice.

Happy Coding.

Regards,
Jaliya

Tuesday, July 22, 2025

Azure Functions on Azure Container Apps: KEDA Scaling for Service Bus Functions When Using AZURE_CLIENT_ID

In a previous post, I wrote about creating Azure Container Apps using az containerapp create --kind functionapp (az functionapp create VS az containerapp create --kind functionapp).

One of the main advantages of this approach is these function apps are preconfigured with auto scaling rules for triggers like Azure Service Bus, Azure Event Hubs etc.

However, in one of our Function Apps that was running on Azure Container Apps, we noticed the scaling rules weren't created for Service Bus Functions.

Only 1 rule for http-scale-rule
The function app was using a Managed Identity (AZURE_CLIENT_ID), and then we noticed that for the Service Bus connection, we were using a Service Bus connection string (😢).

az containerapp update `
    --name $ContainerAppName `
    --resource-group $ResourceGroup `
    --image $Image `
    --set-env-vars `
        "AzureWebJobsStorage=<AzureWebJobsStorage>" `
        'AZURE_CLIENT_ID="<ManagedIdentityClientId>"' `
        'AzureWebJobsServiceBus="<ServiceBus_ConnectionString>"'

In order for KEDA scaling rules to be configured, we need to use identity-based connections instead of secrets.

Something like,

az containerapp update `
    --name $ContainerAppName `
    --resource-group $ResourceGroup `
    --image $Image `
    --set-env-vars `
        "AzureWebJobsStorage=<AzureWebJobsStorage>" `
        'AZURE_CLIENT_ID="<ManagedIdentityClientId>"' `
        'AzureWebJobsServiceBus__fullyQualifiedNamespace="<ServiceBusName>.servicebus.windows.net"'

Here, for the identity-based connection, we don't need to set <CONNECTION_NAME_PREFIX>__clientId because AZURE_CLIENT_ID is declared and will be used as <CONNECTION_NAME_PREFIX>__clientId. However, we can explicitly set <CONNECTION_NAME_PREFIX>__clientId to override the default.

And now the rules are auto-configured as expected.

Expected Output
Scale Rules
More read:
   Azure Functions on Azure Container Apps overview
   Azure Functions: Connection values
   Tutorial: Use identity-based connections instead of secrets with triggers and bindings

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, July 16, 2025

Expose Secondary Azure Document Intelligence Service through Azure Front Door

In the last couple of months, we had 2 incidents where the Azure Document Intelligence Service in the East US region had degraded performance. Because of that, we were getting a lot of 503s (Service Unavailable) while doing various operations, and our retries didn't help. Microsoft acknowledged the service degradations.


In this post, let's see how we can expose secondary Azure Document Intelligence Services through Azure Front Door.

We can add another Origin to the Origin Group that contains the Document Intelligence Service. But then there is an important factor: from the consumer side, we can't use the Ocp-Apim-Subscription-Key for authentication. That's because we won't know which origin the traffic will get routed to, and different Document Intelligence services will have different keys.

So we need a shared authentication mechanism for all our consumers, and that can be achieved by using Managed Identities. Using keys (Ocp-Apim-Subscription-Key) is not recommended anyway, and we should be using Managed Identities as much as possible.

We can implement the authentication at 2 places: either the consumer authenticates the request, or we can have the AFD Origin Group do the authentication on behalf of the consumer before routing the request to an Origin.

For both these approaches, we need to have a managed identity created, and that identity must be given the Cognitive Services User role on both Document Intelligence services.

Consumer authenticating the request against Document Intelligence Services

Consumer authenticating the request against Document Intelligence Services
Here we are doing the authentication at the Consumer level using the Managed Identity. This is helpful when you are consuming the Document Intelligence service through an SDK.

For an example, if you are using the Azure.AI.DocumentIntelligence package,
var documentIntelligenceClient = 
    new DocumentIntelligenceClient(new Uri("<ENDPOINT>"), new DefaultAzureCredential());
With this, ManagedIdentityCredential will be attempted and a token will be retrieved as long as you have the necessary environment variables set.

Azure Front Door authenticating the request against Document Intelligence Services

AFD authenticating the request against Document Intelligence Services
Here, we really don't care about how a consumer is making the request to AFD; from the Origin Group in AFD, we will be authenticating the request using the Managed Identity against the respective Document Intelligence service prior to routing the request.

For that, first we need to assign the identity to AFD.
AFD Identity
And then update the Origin Group, enabling Origin authentication.
Enabling Origin Group Authentication
Hope this helps.

Happy Coding.

Regards,
Jaliya

Saturday, July 12, 2025

Exposing Azure Document Intelligence Service through Azure Front Door

In this post, let's see how we can expose an Azure Document Intelligence (DI) Service through Azure Front Door (AFD) and consume it via a .NET Client Application that uses the Azure.AI.DocumentIntelligence package.

Say, we have a DI service,

https://{document-intelligence-service-name}.cognitiveservices.azure.com/

And we need this service to be consumed via,

https://{front-door-name}.azurefd.net/di-api/

For an example, 

POST https://{document-intelligence-service-name}.cognitiveservices.azure.com/documentintelligence/documentClassifiers/{modelId}:analyze
will now be consumed via,
POST https://{front-door-name}.azurefd.net/di-api/documentintelligence/documentClassifiers/{modelId}:analyze
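Conceptually, what we are configuring in AFD is just a prefix rewrite: strip the route prefix and forward the remainder to the origin. Something like this illustrative sketch (AFD does this for us; the helper is purely hypothetical):

```python
def to_origin_url(incoming_path,
                  route_prefix="/di-api",
                  origin_host="{document-intelligence-service-name}.cognitiveservices.azure.com"):
    # Strip the AFD route prefix and forward the remainder to the origin
    if not incoming_path.startswith(route_prefix):
        raise ValueError("Path does not match the route pattern")
    return "https://" + origin_host + incoming_path[len(route_prefix):]
```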

To start off, following are already created.

  • An Azure Document Intelligence Service
  • Azure Front Door and CDN profiles (Azure Front Door Standard with Quick Create)
First step is adding Origin Group with Origins in AFD.
Add Origin Group
The added Origin is as follows:
Add Origin
Once these are added, next we need to configure how to route the requests to this Origin group.

We can do it two ways.

1. Using default-route, with a Rule set to Override origin group
2. Creating a new route with Patterns to match and Origin path

Now let's see how we can configure both these ways.

1. Using default-route, with a Rule set to Override origin group

With this approach, first we need to create a Rule set as follows.
Rule set configuration
Now we need to associate this rule set to the default-route.
Update default route

2. Creating a new route with Patterns to match and Origin path

In this approach, we don't need to create a Rule set. Instead, we can create a new Route with Patterns to match and Origin path.
Add new route
Now I have a .NET Client Application that uses the Azure.AI.DocumentIntelligence package, which I am using to test the DI functionality via AFD.
using Azure;
using Azure.AI.DocumentIntelligence;

string endpoint = "https://{front-door-name}.azurefd.net/di-api/";
string apiKey = "{document-intelligence-service-api-key}";
DocumentIntelligenceClient documentIntelligenceClient = new(new Uri(endpoint), new AzureKeyCredential(apiKey));

string classifierId = "{some-classification-model-id}";
string testFilePath = @"path\to\test\file.pdf";

using FileStream fileStream = new FileStream(testFilePath, FileMode.Open, FileAccess.Read);
BinaryData binaryData = BinaryData.FromStream(fileStream);

ClassifyDocumentOptions classifyDocumentOptions = new(classifierId, binaryData);

Operation<AnalyzeResult> operation = 
    await documentIntelligenceClient.ClassifyDocumentAsync(WaitUntil.Completed, classifyDocumentOptions);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    Console.WriteLine($"Found a document of type: '{document.DocumentType}'");
}
And I can see this is working with both approaches.

Hope this helps.

Happy Coding.

Regards,
Jaliya