Tuesday, December 2, 2025

Microsoft Entra External ID: Disable Sign Up in a User Flow

I was setting up an application on Microsoft Entra External ID, and in my User Flow, I didn't want to enable Sign Up. That is, on the Sign Up/Sign In page, I wanted to remove "No account? Create one."

Unfortunately, the Microsoft Entra admin center doesn't seem to have functionality to remove this within the portal.

It can, however, be done using the Microsoft Graph Beta API.

# Install the Microsoft Graph Beta module (required for authentication events flow management)
Install-Module Microsoft.Graph.Beta -Scope CurrentUser -Force
 
# Print version of Microsoft Graph Beta module
$mgBetaModule = Get-Module Microsoft.Graph.Beta -ListAvailable `
    | Sort-Object Version -Descending `
    | Select-Object -First 1
Write-Output "Using Microsoft.Graph.Beta: $($mgBetaModule.Version)" # As of today: 2.32.0
 
# Connect to Azure Account
Write-Output "Connecting to Azure Account..."
Connect-AzAccount
 
$tenantId = "<tenant-id>"
$targetFlowName = "<user-flow-name>"
 
# Connect to Microsoft Graph with required permissions
# Required scopes:
#   - Policy.ReadWrite.AuthenticationFlows: To read and modify authentication flows
#   - EventListener.Read.All/ReadWrite.All: To read and modify event listeners
#   - Application.Read.All/ReadWrite.All: To read and modify applications
Connect-MgGraph `
    -TenantId $tenantId `
    -Scopes "Policy.ReadWrite.AuthenticationFlows", `
        "EventListener.Read.All", `
        "EventListener.ReadWrite.All", `
        "Application.Read.All", `
        "Application.ReadWrite.All"
 
# Verify the connected tenant
$tenantId = (Get-MgContext).TenantId
Write-Output "Successfully connected to tenant: $tenantId"
 
# Retrieve all authentication events flows
$authenticationEventsFlows = Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows"
 
# Find the ID of the target flow
$targetFlowId = ($authenticationEventsFlows.value `
    | Where-Object { $_.displayName -eq $targetFlowName }).id
 
if (-not $targetFlowId) {
    Write-Output "ERROR: Flow '$targetFlowName' not found."
    exit 1
}
 
# Get the target flow
$targetFlow = Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows/$targetFlowId"
  
if ($targetFlow.onInteractiveAuthFlowStart.isSignUpAllowed -eq $false) {
    Write-Output "Sign-up is already disabled for this flow $targetFlowName."
    exit 0
}

Write-Output "Disabling sign-up for flow $targetFlowName..."
 
# Request body to disable sign-up
$body = @{
    "@odata.type" = "#microsoft.graph.externalUsersSelfServiceSignUpEventsFlow"
    "onInteractiveAuthFlowStart" = @{
        "@odata.type" = "#microsoft.graph.onInteractiveAuthFlowStartExternalUsersSelfServiceSignUp"
        "isSignUpAllowed" = $false
    }
} | ConvertTo-Json -Depth 5
 
# PATCH
Invoke-MgGraphRequest -Method PATCH `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows/$targetFlowId" `
    -Body $body `
    -ContentType "application/json"
 
# Verify the update by retrieving the flow again
$updatedFlow = Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows/$targetFlowId"
 
Write-Output "Updated: $($updatedFlow.onInteractiveAuthFlowStart.isSignUpAllowed)"
And that's it. The user flow now shows Sign In only.
Hope this helps.

Happy Coding.

Regards,
Jaliya

Thursday, November 27, 2025

Creating SAS URIs for Azure Storage Blobs using DefaultAzureCredential

When working with Azure Storage Blobs in .NET, you will often need to generate Shared Access Signature (SAS) URIs to provide temporary, secure access to your blob resources. 

However, if you're using DefaultAzureCredential for authentication, you can't simply call GenerateSasUri() on a BlobClient instance.

BlobServiceClient blobServiceClient = new BlobServiceClient(
    new Uri($"https://{storageAccountName}.blob.core.windows.net"),
    new DefaultAzureCredential());

BlobClient blobClient = blobServiceClient
    .GetBlobContainerClient(containerName)
    .GetBlobClient(blobName);

// Throws exception: System.ArgumentNullException: Value cannot be null. (Parameter 'sharedKeyCredential')
Uri sasUri = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(5));

That's because GenerateSasUri() requires a StorageSharedKeyCredential to sign the SAS token, and when using DefaultAzureCredential, you don't have access to the storage account key.

The Quick (But Not Ideal) Workaround

For faster development and testing, many developers (myself included) have resorted to using connection strings with account keys:

BlobServiceClient blobServiceClient = new BlobServiceClient(
    $"DefaultEndpointsProtocol=https;AccountName={storageAccountName};AccountKey={accountKey};EndpointSuffix=core.windows.net");

BlobClient blobClient = blobServiceClient
    .GetBlobContainerClient(containerName)
    .GetBlobClient(blobName);

// Now GenerateSasUri() works
Uri sasUri = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(5));

The Best Approach: User Delegation Keys

The recommended solution is to use User Delegation Keys. This approach allows you to generate SAS tokens using Azure AD credentials instead of storage account keys.

BlobServiceClient blobServiceClient = new BlobServiceClient(
    new Uri($"https://{storageAccountName}.blob.core.windows.net"),
    new DefaultAzureCredential());

BlobClient blobClient = blobServiceClient
    .GetBlobContainerClient(containerName)
    .GetBlobClient(blobName);

// Define the SAS validity period
var startsOn = DateTimeOffset.UtcNow.AddMinutes(-1);
var expiresOn = DateTimeOffset.UtcNow.AddMinutes(5);

// Build the SAS token configuration
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerName,
    BlobName = blobName,
    Resource = "b",
    StartsOn = startsOn,
    ExpiresOn = expiresOn,
};

sasBuilder.SetPermissions(BlobSasPermissions.Read);

// Get user delegation key from Azure AD (uses your Azure AD identity)
Response<UserDelegationKey> userDelegationKey =
    await blobServiceClient.GetUserDelegationKeyAsync(startsOn, expiresOn);

// Generate SAS URI using user delegation key
var blobUriBuilder = new BlobUriBuilder(blobClient.Uri)
{
    Sas = sasBuilder.ToSasQueryParameters(
        userDelegationKey.Value,
        blobServiceClient.AccountName)
};

Uri sasUri = blobUriBuilder.ToUri();

Note: this requires the identity to have the Storage Blob Delegator role on the storage account.
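If the role is missing, it can be granted with the Azure CLI. A minimal sketch; the principal ID and scope values below are placeholders you'd replace with your own:

```shell
# Grant the Storage Blob Delegator role (placeholder principal id and storage account scope)
az role assignment create \
    --assignee "<principal-id>" \
    --role "Storage Blob Delegator" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
```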

Hope this helps.

Happy Coding.

Regards,
Jaliya

Friday, November 14, 2025

Azure DevOps: Azure Functions Core Tools Can't Find .NET 10 Installed by UseDotNet@2 Task on Windows Agents

I was upgrading an Azure Durable Function Application from .NET 9 to .NET 10. Our Azure DevOps pipeline has a job that executes a set of integration tests by spinning up the function app using Azure Functions Core Tools (func.exe). Since we were using MSSQLLocalDB, the agent is Windows.

After the upgrade, the integration tests were failing to spin up func with a frustrating error.

You must install or update .NET to run this application.
App: D:\a\1\s\tests\...\bin\Debug\net10.0\FunctionApp.dll
Architecture: x64
Framework: 'Microsoft.NETCore.App', version '10.0.0' (x64)
.NET location: C:\Program Files\dotnet\
The following frameworks were found:
  8.0.6 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  8.0.21 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  9.0.6 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  9.0.10 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 
Learn more:
https://aka.ms/dotnet/app-launch-failed
To install missing framework, download:

The pipeline uses the UseDotNet@2 task to install .NET 10.

- task: UseDotNet@2
  displayName: Install .NET 10.0.x
  inputs:
    packageType: 'sdk'
    version: '10.0.x'

The pipeline debug logs showed UseDotNet@2 task was setting DOTNET_ROOT and updating PATH correctly:

##[debug]Absolute path for pathSegments: C:\hostedtoolcache\windows\dotnet\sdk
Successfully installed .NET Core sdk version 10.0.100.
##[debug]Processed: ##vso[task.prependpath]C:\hostedtoolcache\windows/dotnet
##[debug]set DOTNET_ROOT=C:\hostedtoolcache\windows/dotnet
And dotnet --info confirmed .NET 10 was installed.

However, func.exe didn't seem to recognize it; it kept looking at C:\Program Files\dotnet.

When starting the worker process, it ignores:

  • The DOTNET_ROOT environment variable
  • The PATH environment variable

Since .NET 10 isn't yet pre-installed on DevOps agents, Azure Functions can't find it.
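To see the mismatch for yourself, a quick diagnostic step (hypothetical, just for illustration) can list the runtimes visible from each dotnet root on the Windows agent:

```yaml
# Hypothetical diagnostic step: compare runtimes in the tool cache vs. the machine-wide install
- script: |
    echo Tool cache dotnet:
    "$(Agent.ToolsDirectory)\dotnet\dotnet.exe" --list-runtimes
    echo Machine-wide dotnet:
    "C:\Program Files\dotnet\dotnet.exe" --list-runtimes
  displayName: List runtimes from both dotnet roots
```

The tool-cache location lists .NET 10, while C:\Program Files\dotnet (the one func.exe resolves) does not.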

After trying different things, the solution turned out to be simple.

When installing .NET 10, override the default installation path, which is $(Agent.ToolsDirectory)/dotnet (C:\hostedtoolcache\windows\dotnet on Windows), to C:\Program Files\dotnet, where Azure Functions expects to find it.

- task: UseDotNet@2
  displayName: Install .NET 10.0.x
  inputs:
    packageType: 'sdk'
    version: '10.0.x'
    installationPath: 'C:\Program Files\dotnet'

And that did it. 

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, November 12, 2025

Uri.TryCreate Cross-Platform Quirk: Windows vs. Linux

Hope everyone’s having fun with .NET 10, C# 14 and Visual Studio 2026 from the .NET Conf 2025 announcements.

I was upgrading a project to .NET 10, and as part of the upgrade, I was doing some refactoring in the pipelines. One of the changes I made was moving the agent that runs tests from windows-latest to ubuntu-latest, and a test started to fail.

After looking at the unit under test, at its core it was checking whether a given string is a valid web URI.

In simple terms, it's something like this.

[Fact]
public void TryCreate_WhenNotAValidWebUri_ShouldNotCreate()
{
    const string uriString = "/somePath";

    bool isValidWebUri = Uri.TryCreate(uriString, UriKind.Absolute, out Uri _);

    Assert.False(isValidWebUri);
}
If we run this on Windows, it passes. Good, because obviously "/somePath" is not a web URI. On Linux, however, it fails.

Apparently on Linux, "/somePath" is being treated as a valid absolute URI: on Unix-like systems a rooted path is a valid absolute file path, so it parses as an implicit file URI (file:///somePath).

Updated the code as follows.

[Fact]
public void TryCreate_WhenNotAValidWebUri_ShouldNotCreate()
{
    const string uriString = "/somePath";

    bool isValidWebUri = Uri.TryCreate(uriString, UriKind.Absolute, out Uri uri)
        && (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps);

    Assert.False(isValidWebUri);
}
Now it's passing on both Windows and Linux.
Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, November 10, 2025

Running ASP.NET Core 3.1 Application Inside .NET 9 Container

Recently, I needed to run a set of applications targeting ASP.NET Core 3.1 inside .NET 9 containers. I know, it’s just a couple of days before .NET Conf 2025, and .NET Core 3.1 feels ancient at this point. But unfortunately, upgrading the applications to a newer .NET version wasn’t an option.

I had a bit of trouble getting things to run locally as well as in Azure DevOps pipelines, so I thought of sharing the experience.

First, to get things started, I installed the ASP.NET Core 3.1 runtime on top of the .NET 9 base image.

FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
WORKDIR /app
 
EXPOSE 8080 2222
 
RUN apt-get update && apt-get install -y \
    curl
 
# Install ASP.NET Core 3.1 runtime
RUN curl -SL --output aspnetcore-runtime-3.1.tar.gz https://dotnetcli.azureedge.net/dotnet/aspnetcore/Runtime/3.1.32/aspnetcore-runtime-3.1.32-linux-x64.tar.gz \
    && mkdir -p /usr/share/dotnet \
    && tar -zxf aspnetcore-runtime-3.1.tar.gz -C /usr/share/dotnet \
    && rm aspnetcore-runtime-3.1.tar.gz

Now I ran the application on this image, expecting more errors. As expected, the container didn't even start.

The first error I got it related to ICU.

Process terminated. 
Couldn't find a valid ICU package installed on the system. 
Set the configuration flag System.Globalization.Invariant to true if you want to run with no globalization support.

I wanted to use globalization, so I installed the ICU package. Note: .NET Core 3.1 requires a specific version: libicu67.

RUN apt-get update && apt-get install -y \
    curl \
    wget

# Download and install libicu67 from Debian Bullseye
RUN wget http://ftp.us.debian.org/debian/pool/main/i/icu/libicu67_67.1-7_amd64.deb \
    && dpkg -i libicu67_67.1-7_amd64.deb \
    && rm libicu67_67.1-7_amd64.deb

Once that was installed, the next error was related to libssl.

No usable version of libssl was found

So I installed that too.

# Download and install libicu67 and libssl1.1 from Debian Bullseye
RUN wget http://ftp.us.debian.org/debian/pool/main/i/icu/libicu67_67.1-7_amd64.deb \
    && curl -fsSL http://ftp.us.debian.org/debian/pool/main/o/openssl/libssl1.1_1.1.1w-0+deb11u1_amd64.deb -o /tmp/libssl1.1.deb \
    && dpkg -i libicu67_67.1-7_amd64.deb \
    && dpkg -i /tmp/libssl1.1.deb \
    && rm libicu67_67.1-7_amd64.deb /tmp/libssl1.1.deb

And finally, I got a container up and running locally.
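Putting the pieces together, the Dockerfile so far looks like this. A sketch: the final COPY and ENTRYPOINT lines depend on your publish output, so the ./publish path and MyApp.dll name are assumptions:

```dockerfile
FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
WORKDIR /app

EXPOSE 8080 2222

RUN apt-get update && apt-get install -y \
    curl \
    wget

# Install ASP.NET Core 3.1 runtime side by side with .NET 9
RUN curl -SL --output aspnetcore-runtime-3.1.tar.gz https://dotnetcli.azureedge.net/dotnet/aspnetcore/Runtime/3.1.32/aspnetcore-runtime-3.1.32-linux-x64.tar.gz \
    && mkdir -p /usr/share/dotnet \
    && tar -zxf aspnetcore-runtime-3.1.tar.gz -C /usr/share/dotnet \
    && rm aspnetcore-runtime-3.1.tar.gz

# Download and install libicu67 and libssl1.1 from Debian Bullseye (.NET Core 3.1 dependencies)
RUN wget http://ftp.us.debian.org/debian/pool/main/i/icu/libicu67_67.1-7_amd64.deb \
    && curl -fsSL http://ftp.us.debian.org/debian/pool/main/o/openssl/libssl1.1_1.1.1w-0+deb11u1_amd64.deb -o /tmp/libssl1.1.deb \
    && dpkg -i libicu67_67.1-7_amd64.deb \
    && dpkg -i /tmp/libssl1.1.deb \
    && rm libicu67_67.1-7_amd64.deb /tmp/libssl1.1.deb

# Copy the published ASP.NET Core 3.1 app (path and dll name are assumptions)
COPY ./publish .
ENTRYPOINT ["/usr/share/dotnet/dotnet", "MyApp.dll"]
```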

Next was updating the DevOps pipeline. In the pipeline, we were also running EF Core migrations.

We were using the ubuntu-latest agent, and installed .NET Core SDK 3.1.x and the dotnet-ef tool version 3.1.x.

- task: UseDotNet@2
  displayName: Install .NET Core SDK 3.1.x
  inputs:
    version: 3.1.200
 
- script: |
    dotnet tool install --global dotnet-ef --version 3.1.32 || dotnet tool update --global dotnet-ef --version 3.1.32
  displayName: Install dotnet-ef tool version 3.1.x

And when installing dotnet-ef version 3.1.32, I got the following error again.

No usable version of libssl was found

So I installed libssl1.1 on the build agent before installing .NET.

# Ubuntu latest does not have libssl1.1 installed by default, which is required for .NET Core 3.1
- script: |
    echo "deb http://security.ubuntu.com/ubuntu focal-security main" | sudo tee /etc/apt/sources.list.d/focal-security.list
    sudo apt-get update
    sudo apt-get install -y libssl1.1
  displayName: 'Install libssl1.1 for .NET Core 3.1'

And now the migrations were executed, and the image got built, pushed, and deployed.

That was quite a pain.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Thursday, October 30, 2025

EF Core 10.0: Global Query Filter Improvements

In this post, let's have a look at some nice improvements in Global Query Filters in EF Core 10.0.

We can use Global Query Filters at an entity level to attach an additional LINQ where operator whenever the entity type is queried.

Consider the following DbContext.
public class Customer
{
    public int Id { get; set; }

    public string TenantId { get; init; }

    public string Name { get; init; }

    public bool IsDeleted { get; set; }
}

public class MyDbContext(string tenantId) : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Customer>()
            .HasQueryFilter(x => x.TenantId == tenantId);
    }
}
And we can do something like this.
string tenantId = "Tenant1";

using var context = new MyDbContext(tenantId);

await context.Customers.AddRangeAsync(
    [
        new Customer
        {
            TenantId = tenantId,
            Name = "John Doe"
        },
        new Customer
        {
            TenantId = tenantId,
            Name = "Jane Doe",
            IsDeleted = true
        },
        new Customer
        {
            TenantId = "Tenant2",
            Name = "Jim Doe"
        }
    ]);

await context.SaveChangesAsync();

foreach (Customer customer in await context.Customers.ToListAsync())
{
    Console.WriteLine($"Customer: {customer.Name}, Tenant: {customer.TenantId}");
}
When we run the above code, the executed query is as follows.
SELECT [c].[Id], [c].[IsDeleted], [c].[Name], [c].[TenantId]
FROM [Customers] AS [c]
WHERE [c].[TenantId] = @__P_0
As you can see, the Query Filter was attached and we are getting the expected result.

Now, prior to EF Core 10.0, if for some reason we add another Query Filter by doing something like below;
modelBuilder.Entity<Customer>()
    .HasQueryFilter(x => x.TenantId == tenantId);

modelBuilder.Entity<Customer>()
    .HasQueryFilter(x => !x.IsDeleted);
And now if we run the above code, note the following executed query.
SELECT [c].[Id], [c].[IsDeleted], [c].[Name], [c].[TenantId]
FROM [Customers] AS [c]
WHERE [c].[IsDeleted] = CAST(0 AS bit)
Only the last Query Filter was used. This would not be the desired output: prior to EF Core 10.0, when multiple filters are configured, earlier filters are overridden.

The workaround is:
modelBuilder.Entity<Customer>()
    .HasQueryFilter(x => x.TenantId == tenantId && !x.IsDeleted);
With EF Core 10.0, we can now define multiple Query Filters, but each filter has to be given a name.
modelBuilder.Entity<Customer>()
    .HasQueryFilter("TenantFilter", x => x.TenantId == tenantId)
    .HasQueryFilter("SoftDeletionFilter", x => !x.IsDeleted);
And this would generate the following query for the above code.
SELECT [c].[Id], [c].[IsDeleted], [c].[Name], [c].[TenantId]
FROM [Customers] AS [c]
WHERE [c].[TenantId] = @P AND [c].[IsDeleted] = CAST(0 AS bit)
And we can also ignore Query Filters by doing something like below.
// Query counts with a specific Query Filter ignored
int tenantCustomersCountIncludingDeleted = await context.Customers
    .IgnoreQueryFilters(["SoftDeletionFilter"])
    .CountAsync(); // 2

// Query counts with all Query Filters ignored
int allCustomersCount = await context.Customers
    .IgnoreQueryFilters()
    .CountAsync(); // 3
More read:
   What's New in EF Core 10
   Global Query Filters

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, October 29, 2025

EF Core 10.0: Support for Partially Updating JSON Columns with ExecuteUpdate/ExecuteUpdateAsync

In this post, let’s explore a great new enhancement available in EF Core 10.0. EF Core 10.0 now supports partially updating JSON columns with ExecuteUpdate/ExecuteUpdateAsync.

Let's consider the following DbContext.

public class Customer
{
    public int Id { get; set; }

    public string Name { get; set; }

    public required Contact Contact { get; set; }
}

public class Contact
{
    public required Address Address { get; set; }
}

public class Address
{
    public required string Street { get; set; }

    public required string City { get; set; }

    public required string State { get; set; }

    public required string PostalCode { get; set; }
}

public class MyDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder
            .UseSqlServer("<ConnectionString>", x =>
            {
                x.UseCompatibilityLevel(170); // Microsoft SQL Server 2025
            });
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder
            .Entity<Customer>(entity =>
            {
                entity.ComplexProperty(x => x.Contact, x => x.ToJson());
            });
    }
}

With the above code, the Customer.Contact column will get created as the json data type (EF Core 10.0: Support for JSON Data Type in Microsoft SQL Server).

Now let's say we need to do a partial update on Customer.Contact.PostalCode.

await context.Customers
    .Where(x => x.Name == "John Doe")
    .ExecuteUpdateAsync(setters =>
        setters
            .SetProperty(c => c.Contact.Address.PostalCode, "98102")
    );

The above will generate the following SQL query.

UPDATE [c]
SET [Contact].modify('$.Address.PostalCode', @p)
FROM [Customers] AS [c]
WHERE [c].[Name] = N'John Doe'

Note the partial update on PostalCode using the modify method. The modify method is currently in preview and only available in Microsoft SQL Server 2025 Preview.

This even works with older versions of Microsoft SQL Server, where the JSON data is stored as nvarchar(max) column.

For example,

optionsBuilder
    .UseSqlServer("<ConnectionString>", x =>
    {
        x.UseCompatibilityLevel(160); // Microsoft SQL Server 2022
    });

This would create the Customer.Contact column as nvarchar(max), and the above ExecuteUpdateAsync would still work. In this case, the generated query would be something like the following.

UPDATE [c]
SET [c].[Contact] = JSON_MODIFY([c].[Contact], '$.Address.PostalCode', @p)
FROM [Customers] AS [c]
WHERE [c].[Name] = N'John Doe'

Note: this only works when mapping JSON data with ComplexProperty, not with owned entities.

More read:
   What's New in EF Core 10
   JSON Data Type

Hope this helps.

Happy Coding.

Regards,
Jaliya

Tuesday, October 28, 2025

.NET Isolated Azure Functions: Missing Worker Logs

I recently saw this issue in a .NET Isolated Azure Function App: it was writing custom logs at Information level to Application Insights, but the logs weren't there.

It was configured correctly with Application Insights in Program.cs, and there were no logging filters configured.
using Microsoft.Azure.Functions.Worker.Builder;
using Microsoft.Extensions.DependencyInjection;

FunctionsApplicationBuilder builder = FunctionsApplication.CreateBuilder(args);

builder.Services
    .AddApplicationInsightsTelemetryWorkerService()
    .ConfigureFunctionsApplicationInsights();
To reproduce the issue locally, I created a simple .NET Isolated Azure Function App with an HTTP trigger that logs something like this.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

namespace FunctionApp1;

public class Function1
{
    private readonly ILogger<Function1> _logger;

    public Function1(ILogger<Function1> logger)
    {
        _logger = logger;
    }

    [Function("Function1")]
    public IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req)
    {
        _logger.LogInformation("C# HTTP trigger function processed a request.");
        return new OkObjectResult("Welcome to Azure Functions!");
    }
}
When the HTTP function was triggered, the custom log was not being written. Reproducing the issue locally is one step closer to resolving it.

After spending some time, I noticed this:

However, by default, the Application Insights SDK adds a logging filter that instructs the logger to capture only warnings and more severe logs.

And to disable that behavior, we can do this.
builder.Logging.Services.Configure<LoggerFilterOptions>(options =>
{
    LoggerFilterRule defaultRule =
        options.Rules.FirstOrDefault(rule => rule.ProviderName == "Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider");
 
    if (defaultRule is not null)
    {
        options.Rules.Remove(defaultRule);
    }
});
And now when the HTTP function is triggered, the logs are being written.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, October 27, 2025

.NET Isolated Azure Functions: Enabling Open API Support

In this post, let's see how to enable Open API support in .NET Isolated Azure Functions. A long time ago I blogged about Introducing In-Process Azure Functions OpenAPI Extension, and that's for .NET In-Process functions.

A lot has changed since then.

I have created a simple .NET Isolated Azure Function with an HTTP Trigger targeting .NET 9.
Now let's add the Open API support.

First step is installing the following NuGet package.

Install-Package Microsoft.Azure.Functions.Worker.Extensions.OpenApi -Version 1.6.0
This version is the latest as of today, it will change as we go on.

Now if you run the project (note: you don't have to make any changes in Program.cs), you will see new endpoints for Open API in the console output.

If you open up the Swagger UI URL in a browser, you can see something like the following.
Now let's decorate the HTTP Function with Open API attributes.
[Function(nameof(Function1))]
[OpenApiOperation(operationId: nameof(Function1))]
[OpenApiResponseWithBody(statusCode: HttpStatusCode.OK,
    contentType: "application/json",
    bodyType: typeof(object),
    Description = "The OK response message.")]
public IActionResult Run([HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req)
{
    return new OkObjectResult(new
    {
        message = "Welcome to Azure Functions!"
    });
}
And to make the Open API spec look nice, override DefaultOpenApiConfigurationOptions with something like this:
internal class OpenApiConfigurationOptions : DefaultOpenApiConfigurationOptions
{
    public override OpenApiInfo Info { get; set; } = new OpenApiInfo
    {
        Version = "1.0.0",
        Title = "Hello World",
        Description = "A sample API for my Azure Function.",
        License = new OpenApiLicense
        {
            Name = "MIT",
            Url = new Uri("http://opensource.org/licenses/MIT"),
        }
    };

    public override OpenApiVersionType OpenApiVersion { get; set; } = OpenApiVersionType.V3;
}
And now after doing these changes, if you look at the Swagger UI, the generated Open API spec is:
{
  "openapi": "3.0.1",
  "info": {
    "title": "Hello World",
    "description": "A sample API for my Azure Function.",
    "license": {
      "name": "MIT",
      "url": "http://opensource.org/licenses/MIT"
    },
    "version": "1.0.0"
  },
  "servers": [
    {
      "url": "http://localhost:7154/api"
    }
  ],
  "paths": {
    "/Function1": {
      "get": {
        "operationId": "Function1",
        "responses": {
          "200": {
            "description": "The OK response message.",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object"
                }
              }
            }
          }
        }
      }
    }
  },
  "components": {}
}
Hope this helps.

Happy Coding.

Regards,
Jaliya