Thursday, December 25, 2025

Content Understanding Studio: Adding Microsoft Foundry Resource: Defaults have not yet been set Error

I had a fresh Microsoft Foundry resource created and when attempting to add the resource into Content Understanding Studio, I made sure to check "Enable auto-deployment for required models if no defaults are available."
Content Understanding Studio: Add new connected resource
But I was still getting the following error.
Content Understanding Studio: Add new connected resource
Upon checking, the checkbox was indeed on, and the underlying error was:
{
  "error": {
    "code""InvalidRequest",
    "message""Invalid request.",
    "innererror": {
      "code""DefaultsNotSet",
      "message""Defaults have not yet been set. Call 'PATCH /contentunderstanding/defaults' first."
    }
  }
}
The fix (as mentioned in the error message) is to manually call the PATCH endpoint with an empty body before adding the resource.
curl --location --request PATCH 'https://{foundry-resource}.cognitiveservices.azure.com/contentunderstanding/defaults?api-version=2025-11-01' `
--header 'ocp-apim-subscription-key: {ocp-apim-subscription-key}' `
--header 'Content-Type: application/json' `
--data '{
    
}'
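If you prefer PowerShell (the backtick line continuations above suggest that's where it's being run anyway), here's a minimal Invoke-RestMethod equivalent; the resource name and key placeholders are the same as above:
$uri = "https://{foundry-resource}.cognitiveservices.azure.com/contentunderstanding/defaults?api-version=2025-11-01"

Invoke-RestMethod -Method Patch -Uri $uri `
    -Headers @{ "ocp-apim-subscription-key" = "{ocp-apim-subscription-key}" } `
    -ContentType "application/json" `
    -Body "{}"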
Once this PATCH request is made, the Foundry resource gets added successfully without any errors.

This appears to be a bug. When "Enable auto-deployment for required models if no defaults are available" is checked, the PATCH call should be handled internally by the Studio.

Happy Coding.

Regards,
Jaliya

Tuesday, December 23, 2025

Studio 3T and Azure Cosmos DB for MongoDB

When working with MongoDB, having a reliable GUI tool makes all the difference. Studio 3T has been my go-to tool for years, and it obviously works seamlessly with MongoDB. And most importantly, it works very well with Azure Cosmos DB for MongoDB.

There are several MongoDB GUI tools available, but Studio 3T stands out for a few reasons:
  • Visual Query Builder: Build queries visually if you prefer not to write JSON
  • IntelliShell: Auto-completion for MongoDB queries with syntax highlighting
  • Aggregation Editor: Step-by-step pipeline builder with stage-by-stage output preview
  • SQL Query: Write SQL and have it translated to MongoDB query language
  • Import/Export: Easily move data between MongoDB, JSON, CSV, and SQL databases
Connecting to Azure Cosmos DB for MongoDB from Studio 3T is just as easy as connecting to any other MongoDB deployment. Just copy the connection string, paste it in, and you are connected.

Note: Azure Cosmos DB for MongoDB requires SSL, which is already included in the connection string.

SQL to MongoDB


If you are coming from a SQL background, the SQL Query feature is a lifesaver. Write a query like:
SELECT * 
FROM employees 
WHERE department = 'IT'
ORDER BY name 
LIMIT 10
And Studio 3T translates it to:
db.employees
    .find({ department: "IT" })
    .sort({ name: 1 })
    .limit(10);
Do try it out.

Happy Coding.

Regards,
Jaliya

Thursday, December 18, 2025

DefaultAzureCredential: Troubleshooting Local Development Issues

DefaultAzureCredential is the recommended approach for authenticating with Azure services; in most cases we rarely rely on access keys anymore, and authentication is typically handled through managed identities.

However, during local development, when authentication falls back to the developer’s user account, this can occasionally introduce unexpected complexity and frustration.

I usually use the following DefaultAzureCredentialOptions:
DefaultAzureCredentialOptions credentialOptions = new()
{
    // Explicitly specify the tenant to avoid cross-tenant issues
    TenantId = "<TenantId>",

    // Prioritize local development credentials
    ExcludeAzureCliCredential = false,          // Azure CLI (az login)
    ExcludeAzureDeveloperCliCredential = false, // Azure Developer CLI (azd auth login)
    ExcludeVisualStudioCredential = true,

    // Exclude irrelevant credentials
    ExcludeInteractiveBrowserCredential = true,
    ExcludeWorkloadIdentityCredential = true,

    // Keep managed identity for production.
    ExcludeManagedIdentityCredential = false,
};

DefaultAzureCredential defaultAzureCredential = new(credentialOptions);
Key points:
  • Always specify TenantId to avoid cross-tenant issues
  • Exclude VisualStudioCredential, relying on Azure CLI and Azure Developer CLI credentials instead
  • Keep ManagedIdentityCredential enabled so the same code works in production
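For completeness, here's how these options plug into a client. A minimal sketch, assuming an Azure Storage account (the account name is a placeholder):
using Azure.Identity;
using Azure.Storage.Blobs;

// <storage-account> is a placeholder for your storage account name
BlobServiceClient blobServiceClient = new(
    new Uri("https://<storage-account>.blob.core.windows.net"),
    new DefaultAzureCredential(credentialOptions));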
If you want to enable logging for any troubleshooting:
using Azure.Core.Diagnostics;
using System.Diagnostics.Tracing;

WebApplicationBuilder builder = WebApplication.CreateBuilder(args);

// Add services to the container.

WebApplication app = builder.Build();

ILoggerFactory loggerFactory = app.Services.GetRequiredService<ILoggerFactory>();
ILogger azureIdentityLogger = loggerFactory.CreateLogger("Azure.Identity");

using var listener = new AzureEventSourceListener((args, message) =>
{
    if (args.EventSource.Name == "Azure-Identity")
    {
        azureIdentityLogger.LogInformation("{Message}", message);
    }
}, EventLevel.Verbose);

// Configure the HTTP request pipeline.
app.Run();
Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, December 15, 2025

Azure Functions: Running with Production App Settings Locally

In this post, let's have a look at how to run Azure Functions locally with Production app settings and why some explicit configuration is required compared to ASP.NET Core applications.

When developing Azure Functions, you might want to test your function app locally using Production configuration. This is a common requirement when you want to verify settings before deployment or troubleshoot production-specific issues.

This is usually how we add appsettings.*.json files to Configuration.
using Microsoft.Azure.Functions.Worker.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;

FunctionsApplicationBuilder builder = FunctionsApplication.CreateBuilder(args);

builder.Configuration
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: false)
    .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional: true, reloadOnChange: false)
    .AddEnvironmentVariables();

// Other configurations

IHost host = builder.Build();

host.Run();
In ASP.NET Core applications, you can simply set the ASPNETCORE_ENVIRONMENT or DOTNET_ENVIRONMENT environment variable, and the application will automatically load appsettings.<EnvironmentName>.json.

For Azure Functions, things work a bit differently.

First, Azure Functions uses AZURE_FUNCTIONS_ENVIRONMENT to determine the current environment. If not set, then it falls back to DOTNET_ENVIRONMENT.

You can set it in your launchSettings.json:
{
  "profiles": {
    "MyProfile": {
      "commandName""Project",
      "commandLineArgs""--port 7052",
      "environmentVariables": {
        "AZURE_FUNCTIONS_ENVIRONMENT""Production"
      }
    }
  }
}
Or in your local.settings.json:
{
  "IsEncrypted"false,
  "Values": {
    "AzureWebJobsStorage""UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME""dotnet-isolated",
    "AZURE_FUNCTIONS_ENVIRONMENT""Production"
  }
}
You can also use DOTNET_ENVIRONMENT instead of AZURE_FUNCTIONS_ENVIRONMENT; both will work.

Even after setting the environment variable, you might notice that your appsettings.<EnvironmentName>.json settings are not being loaded. This is because the file is not being copied to the output directory.

Here's the important difference:
  • Microsoft.NET.Sdk.Web (used by ASP.NET Core apps) automatically copies all appsettings.*.json files to the output directory.
  • Microsoft.NET.Sdk (used by Azure Functions, Class Libraries, Console apps) does not.
Azure Functions projects use Microsoft.NET.Sdk by default, which means you need to explicitly configure the project file to copy your environment-specific settings files.

Something like below:
<ItemGroup>
  <None Update="appsettings.Production.json">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </None>
</ItemGroup>
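If you maintain several environment-specific files, a wildcard should work as well; a sketch (PreserveNewest copies only when the file has changed):
<ItemGroup>
  <None Update="appsettings*.json">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>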
Hope this helps.

Happy Coding.

Regards,
Jaliya

Tuesday, December 2, 2025

Microsoft Entra External ID: Disable Sign Up in a User Flow

I was setting up an application on Microsoft Entra External ID and in my User Flow, I didn't want to enable Sign Up.
Sign Up/Sign In
So I wanted to remove No account? Create one.

Apparently, the Microsoft Entra admin center doesn't seem to have the functionality to remove this within the portal.

It can, however, be done using the Microsoft Graph Beta API.

# Install the Microsoft Graph Beta module (required for authentication events flow management)
Install-Module Microsoft.Graph.Beta -Scope CurrentUser -Force
 
# Print version of Microsoft Graph Beta module
$mgBetaModule = Get-Module Microsoft.Graph.Beta -ListAvailable `
    | Sort-Object Version -Descending `
    | Select-Object -First 1
Write-Output "Using Microsoft.Graph.Beta: $($mgBetaModule.Version)" # As of today: 2.32.0
 
# Connect to Azure Account
Write-Output "Connecting to Azure Account..."
Connect-AzAccount
 
$tenantId = "<tenant-id>"
$targetFlowName = "<user-flow-name>"
 
# Connect to Microsoft Graph with required permissions
# Required scopes:
#   - Policy.ReadWrite.AuthenticationFlows: To read and modify authentication flows
#   - EventListener.Read.All/ReadWrite.All: To read and modify event listeners
#   - Application.Read.All/ReadWrite.All: To read and modify applications
Connect-MgGraph `
    -TenantId $tenantId `
    -Scopes "Policy.ReadWrite.AuthenticationFlows", `
        "EventListener.Read.All", `
        "EventListener.ReadWrite.All", `
        "Application.Read.All", `
        "Application.ReadWrite.All"
 
# Verify the connected tenant
$tenantId = (Get-MgContext).TenantId
Write-Output "Successfully connected to tenant: $tenantId"
 
# Retrieve all authentication events flows
$authenticationEventsFlows = Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows"
 
# Find the ID of the target flow
$targetFlowId = ($authenticationEventsFlows.value `
    | Where-Object { $_.displayName -eq $targetFlowName }).id
 
if (-not $targetFlowId) {
    Write-Output "ERROR: Flow '$targetFlowName' not found."
    exit 1
}
 
# Get the target flow
$targetFlow = Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows/$targetFlowId"
  
if ($targetFlow.onInteractiveAuthFlowStart.isSignUpAllowed -eq $false) {
    Write-Output "Sign-up is already disabled for this flow $targetFlowName."
    exit 0
}

Write-Output "Disabling sign-up for flow $targetFlowName..."
 
# Request body to disable sign-up
$body = @{
    "@odata.type" = "#microsoft.graph.externalUsersSelfServiceSignUpEventsFlow"
    "onInteractiveAuthFlowStart" = @{
        "@odata.type" = "#microsoft.graph.onInteractiveAuthFlowStartExternalUsersSelfServiceSignUp"
        "isSignUpAllowed" = $false
    }
} | ConvertTo-Json -Depth 5
 
# PATCH
Invoke-MgGraphRequest -Method PATCH `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows/$targetFlowId" `
    -Body $body `
    -ContentType "application/json"
 
# Verify the update by retrieving the flow again
$updatedFlow = Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identity/authenticationEventsFlows/$targetFlowId"
 
Write-Output "Updated: $($updatedFlow.onInteractiveAuthFlowStart.isSignUpAllowed)"
And that's it.
Sign In
Hope this helps.

Happy Coding.

Regards,
Jaliya

Thursday, November 27, 2025

Creating SAS URIs for Azure Storage Blobs using DefaultAzureCredential

When working with Azure Storage Blobs in .NET, you will often need to generate Shared Access Signature (SAS) URIs to provide temporary, secure access to your blob resources. 

However, if you're using DefaultAzureCredential for authentication, you cannot simply call GenerateSasUri() on a BlobClient instance.

BlobServiceClient blobServiceClient = new BlobServiceClient(
    new Uri($"https://{storageAccountName}.blob.core.windows.net"),
    new DefaultAzureCredential());

BlobClient blobClient = blobServiceClient
    .GetBlobContainerClient(containerName)
    .GetBlobClient(blobName);

// Throws exception: System.ArgumentNullException: Value cannot be null. (Parameter 'sharedKeyCredential')
Uri sasUri = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(5));

That's because GenerateSasUri() requires a shared key credential to sign the SAS token, and when using DefaultAzureCredential, you don't have access to the storage account key.
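If you'd rather detect this at runtime than hit the exception, BlobClient exposes a CanGenerateSasUri property; a quick sketch:

// True only when the client was constructed with a shared key credential
if (!blobClient.CanGenerateSasUri)
{
    // No shared key available; fall back to a user delegation SAS (see below)
}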

The Quick (But Not Ideal) Workaround

For faster development and testing, many developers (myself included) have resorted to using connection strings with account keys:

BlobServiceClient blobServiceClient = new BlobServiceClient(
    $"DefaultEndpointsProtocol=https;AccountName={storageAccountName};AccountKey={accountKey};EndpointSuffix=core.windows.net");

BlobClient blobClient = blobServiceClient
    .GetBlobContainerClient(containerName)
    .GetBlobClient(blobName);

// Now GenerateSasUri() works
Uri sasUri = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(5));

The Best Approach: User Delegation Keys

The recommended solution is to use User Delegation Keys. This approach allows you to generate SAS tokens using Azure AD credentials instead of storage account keys.

BlobServiceClient blobServiceClient = new BlobServiceClient(
    new Uri($"https://{storageAccountName}.blob.core.windows.net"),
    new DefaultAzureCredential());

BlobClient blobClient = blobServiceClient
    .GetBlobContainerClient(containerName)
    .GetBlobClient(blobName);

// Define the SAS validity period
var startsOn = DateTimeOffset.UtcNow.AddMinutes(-1);
var expiresOn = DateTimeOffset.UtcNow.AddMinutes(5);

// Build the SAS token configuration
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerName,
    BlobName = blobName,
    Resource = "b",
    StartsOn = startsOn,
    ExpiresOn = expiresOn,
};

sasBuilder.SetPermissions(BlobSasPermissions.Read);

// Get user delegation key from Azure AD (uses your Azure AD identity)
Response<UserDelegationKey> userDelegationKey =
    await blobServiceClient.GetUserDelegationKeyAsync(startsOn, expiresOn);

// Generate SAS URI using user delegation key
var blobUriBuilder = new BlobUriBuilder(blobClient.Uri)
{
    Sas = sasBuilder.ToSasQueryParameters(
        userDelegationKey.Value,
        blobServiceClient.AccountName)
};

Uri sasUri = blobUriBuilder.ToUri();

And note: this requires the identity to have permission to generate user delegation keys, for example via the Storage Blob Delegator role.
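If the role is missing, assigning it via Azure CLI looks something like this; a sketch, where the assignee and scope are placeholders:

az role assignment create `
    --role "Storage Blob Delegator" `
    --assignee "<principal-id>" `
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"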

Hope this helps.

Happy Coding.

Regards,
Jaliya

Friday, November 14, 2025

Azure DevOps: Azure Functions Core Tools Can't Find .NET 10 Installed by UseDotNet@2 Task on Windows Agents

I was upgrading an Azure Durable Function application from .NET 9 to .NET 10. Our Azure DevOps pipeline has a job that executes a set of integration tests by spinning up the function app using Azure Functions Core Tools (func.exe). Since we were using MSSQLLocalDB, the agent is Windows.

After the upgrade, the integration tests were failing to spin up func with a frustrating error.

You must install or update .NET to run this application.
App: D:\a\1\s\tests\...\bin\Debug\net10.0\FunctionApp.dll
Architecture: x64
Framework: 'Microsoft.NETCore.App', version '10.0.0' (x64)
.NET location: C:\Program Files\dotnet\
The following frameworks were found:
  8.0.6 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  8.0.21 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  9.0.6 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  9.0.10 at [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 
Learn more:
https://aka.ms/dotnet/app-launch-failed
To install missing framework, download:

The pipeline uses the UseDotNet@2 task to install .NET 10.

- task: UseDotNet@2
  displayName: Install .NET 10.0.x
  inputs:
    packageType: 'sdk'
    version: '10.0.x'

The pipeline debug logs showed UseDotNet@2 task was setting DOTNET_ROOT and updating PATH correctly:

##[debug]Absolute path for pathSegments: C:\hostedtoolcache\windows\dotnet\sdk
Successfully installed .NET Core sdk version 10.0.100.
##[debug]Processed: ##vso[task.prependpath]C:\hostedtoolcache\windows/dotnet
##[debug]set DOTNET_ROOT=C:\hostedtoolcache\windows/dotnet
And dotnet --info confirmed .NET 10 was installed.
dotnet --info
However, func.exe didn't seem to recognize it; it kept looking at C:\Program Files\dotnet.

When starting the worker process, it ignores:

  • The DOTNET_ROOT environment variable
  • The PATH environment variable

Since .NET 10 isn't yet pre-installed on DevOps agents, Azure Functions can't find it.

After trying different things, the solution turned out to be simple.

When installing .NET 10, override the default installation path, which is $(Agent.ToolsDirectory)/dotnet (C:\hostedtoolcache\windows\dotnet on Windows), to C:\Program Files\dotnet, where Azure Functions expects to find it.

- task: UseDotNet@2
  displayName: Install .NET 10.0.x
  inputs:
    packageType: 'sdk'
    version: '10.0.x'
    installationPath: 'C:\Program Files\dotnet'

And that did it. 

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, November 12, 2025

Uri.TryCreate Cross-Platform Quirk: Windows vs. Linux

Hope everyone’s having fun with .NET 10, C# 14 and Visual Studio 2026 from the .NET Conf 2025 announcements.

I was upgrading a project to .NET 10, and as part of the upgrade, I was doing some refactoring in the pipelines. One of the changes I made was moving the agent used to run tests from windows-latest to ubuntu-latest, and a test started to fail.

Looking at the unit under test, at its core it was checking whether a given string is a valid Web Uri.

Simplified, it's something like this.

[Fact]
public void TryCreate_WhenNotAValidWebUri_ShouldNotCreate()
{
    const string uriString = "/somePath";

    bool isValidWebUri = Uri.TryCreate(uriString, UriKind.Absolute, out Uri? _);

    Assert.False(isValidWebUri);
}
If we run this on Windows, it's passing. Good, because obviously "/somePath" is not a Web Uri.
Windows: Pass
And on Linux, it's failing.
Linux: Fail
Apparently on Linux, "/somePath" is treated as a valid absolute Uri. That's because on Unix, a rooted path like /somePath is a valid file path, and .NET parses it as an implicit file:// Uri.

Updated the code as follows.

[Fact]
public void TryCreate_WhenNotAValidWebUri_ShouldNotCreate()
{
    const string uriString = "/somePath";

    bool isValidWebUri = Uri.TryCreate(uriString, UriKind.Absolute, out Uri? uri)
        && (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps);

    Assert.False(isValidWebUri);
}
Now it's passing on both Windows and Linux.
Linux: Pass
Hope this helps.

Happy Coding.

Regards,
Jaliya

Monday, November 10, 2025

Running ASP.NET Core 3.1 Application Inside .NET 9 Container

Recently, I needed to run a set of applications targeting ASP.NET Core 3.1 inside .NET 9 containers. I know, it’s just a couple of days before .NET Conf 2025, and .NET Core 3.1 feels ancient at this point. But unfortunately, upgrading the applications to a newer .NET version wasn’t an option.

Had a bit of trouble getting things to run locally as well as in Azure DevOps Pipelines, so thought of sharing the experience.

First, to get things started, I installed the ASP.NET Core 3.1 runtime.

FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
WORKDIR /app
 
EXPOSE 8080 2222
 
RUN apt-get update && apt-get install -y \
    curl
 
# Install ASP.NET Core 3.1 runtime
RUN curl -SL --output aspnetcore-runtime-3.1.tar.gz https://dotnetcli.azureedge.net/dotnet/aspnetcore/Runtime/3.1.32/aspnetcore-runtime-3.1.32-linux-x64.tar.gz \
    && mkdir -p /usr/share/dotnet \
    && tar -zxf aspnetcore-runtime-3.1.tar.gz -C /usr/share/dotnet \
    && rm aspnetcore-runtime-3.1.tar.gz

Now I ran the application on this image, expecting more errors. As expected, the container didn't even start.

The first error I got it related to ICU.

Process terminated. 
Couldn't find a valid ICU package installed on the system. 
Set the configuration flag System.Globalization.Invariant to true if you want to run with no globalization support.

I wanted to use globalization, so I installed the ICU package. Note: .NET Core 3.1 requires a specific version: libicu67.

RUN apt-get update && apt-get install -y \
    curl \
    wget

# Download and install libicu67 from Debian Bullseye
RUN wget http://ftp.us.debian.org/debian/pool/main/i/icu/libicu67_67.1-7_amd64.deb \
    && dpkg -i libicu67_67.1-7_amd64.deb \
    && rm libicu67_67.1-7_amd64.deb

Once that was installed, the next error was related to libssl.

No usable version of libssl was found

So I installed that as well.

# Download and install libicu67 and libssl1.1 from Debian Bullseye
RUN wget http://ftp.us.debian.org/debian/pool/main/i/icu/libicu67_67.1-7_amd64.deb \
    && curl -fsSL http://ftp.us.debian.org/debian/pool/main/o/openssl/libssl1.1_1.1.1w-0+deb11u1_amd64.deb -o /tmp/libssl1.1.deb \
    && dpkg -i libicu67_67.1-7_amd64.deb \
    && dpkg -i /tmp/libssl1.1.deb \
    && rm libicu67_67.1-7_amd64.deb /tmp/libssl1.1.deb

And finally, I got a container up and running locally.
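For reference, the tail of the Dockerfile is the usual copy-and-entrypoint; a sketch, where the publish output path and MyApp.dll are placeholders:

WORKDIR /app
COPY ./publish .
ENTRYPOINT ["dotnet", "MyApp.dll"]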

Next is to update the DevOps pipeline. In the pipeline, we were also running EF Core migrations.

We were using the ubuntu-latest agent, and installed .NET Core SDK 3.1.x and the dotnet-ef tool version 3.1.x.

- task: UseDotNet@2
  displayName: Install .NET Core SDK 3.1.x
  inputs:
    version: 3.1.200

- script: |
    dotnet tool install --global dotnet-ef --version 3.1.32 || dotnet tool update --global dotnet-ef --version 3.1.32
  displayName: Install dotnet-ef tool version 3.1.x

And when installing dotnet-ef version 3.1.32, I got the following error again.

No usable version of libssl was found

So installed libssl in the build agent before installing .NET.

# Ubuntu latest does not have libssl1.1 installed by default, which is required for .NET Core 3.1
- script: |
    echo "deb http://security.ubuntu.com/ubuntu focal-security main" | sudo tee /etc/apt/sources.list.d/focal-security.list
    sudo apt-get update
    sudo apt-get install -y libssl1.1
  displayName: 'Install libssl1.1 for .NET Core 3.1'

And now migrations were executed and an image got built, pushed and deployed.

That was quite a pain.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Thursday, October 30, 2025

EF Core 10.0: Global Query Filter Improvements

In this post, let's have a look at some nice improvements in Global Query Filters in EF Core 10.0.

We can use Global Query Filters at an entity level to attach an additional LINQ where operator whenever the entity type is queried.

Consider the following DbContext.
public class Customer
{
    public int Id { get; set; }

    public string TenantId { get; init; }

    public string Name { get; init; }

    public bool IsDeleted { get; set; }
}

public class MyDbContext(string tenantId) : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Customer>()
            .HasQueryFilter(x => x.TenantId == tenantId);
    }
}
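Note the context above doesn't configure a database provider. The generated SQL shown below assumes SQL Server; a minimal sketch with a placeholder connection string:

// Inside MyDbContext; <connection-string> is a placeholder
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    => optionsBuilder.UseSqlServer("<connection-string>");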
And we can do something like this.
string tenantId = "Tenant1";

using var context = new MyDbContext(tenantId);

await context.Customers.AddRangeAsync(
    [
        new Customer
        {
            TenantId = tenantId,
            Name = "John Doe"
        },
        new Customer
        {
            TenantId = tenantId,
            Name = "Jane Doe",
            IsDeleted = true
        },
        new Customer
        {
            TenantId = "Tenant2",
            Name = "Jim Doe"
        }
    ]);

await context.SaveChangesAsync();

foreach (Customer customer in await context.Customers.ToListAsync())
{
    Console.WriteLine($"Customer: {customer.Name}, Tenant: {customer.TenantId}");
}
When we run the above code, the executed query is something like below.
SELECT [c].[Id], [c].[IsDeleted], [c].[Name], [c].[TenantId]
FROM [Customers] AS [c]
WHERE [c].[TenantId] = @__P_0
As you can see, the Query Filter was attached and we are getting the expected result.

Now, prior to EF Core 10.0, if for some reason we added another Query Filter by doing something like below:
modelBuilder.Entity<Customer>()
    .HasQueryFilter(x => x.TenantId == tenantId);

modelBuilder.Entity<Customer>()
    .HasQueryFilter(x => !x.IsDeleted);
And now if we run the above code, note the following executed query.
SELECT [c].[Id], [c].[IsDeleted], [c].[Name], [c].[TenantId]
FROM [Customers] AS [c]
WHERE [c].[IsDeleted] = CAST(0 AS bit)
Only the last Query Filter was used.
This would not be the desired output: prior to EF Core 10.0, when multiple filters are configured, each new filter overrides any previously configured one.

The workaround was to combine the predicates into a single filter:
modelBuilder.Entity<Customer>()
    .HasQueryFilter(x => x.TenantId == tenantId && !x.IsDeleted);
With EF Core 10.0, we can now define multiple Query Filters, but each filter has to be given a name.
modelBuilder.Entity<Customer>()
    .HasQueryFilter("TenantFilter", x => x.TenantId == tenantId)
    .HasQueryFilter("SoftDeletionFilter", x => !x.IsDeleted);
And this would generate the following query for the above code.
SELECT [c].[Id], [c].[IsDeleted], [c].[Name], [c].[TenantId]
FROM [Customers] AS [c]
WHERE [c].[TenantId] = @P AND [c].[IsDeleted] = CAST(0 AS bit)
And we can also ignore Query Filters, either a specific one by name or all of them:
// Query counts with a specific Query Filter ignored
int tenantCustomersCountIncludingDeleted = await context.Customers
    .IgnoreQueryFilters(["SoftDeletionFilter"])
    .CountAsync(); // 2

// Query counts with all Query Filters ignored
int allCustomersCount = await context.Customers
    .IgnoreQueryFilters()
    .CountAsync(); // 3
Further reading:
   What's New in EF Core 10
   Global Query Filters

Hope this helps.

Happy Coding.

Regards,
Jaliya