Thursday, February 12, 2026

docker-compose with SQL Server and ASP.NET

This article discusses one approach to making your ASP.NET development environment work with SQL Server (MSSQL) running in a Docker container.

Source code: https://github.com/medhatelmasry/AspMSSQL

It is assumed that the following are installed on your computer:

  1. .NET 10.0 
  2. Docker Desktop 
  3. ‘dotnet-ef’ tool 

Setting up SQL Server docker container

To download a suitable SQL Server image from Docker Hub and run it on your local computer, type the following command from within a terminal window:

docker run --cap-add SYS_PTRACE -e ACCEPT_EULA=1 -e MSSQL_SA_PASSWORD=SqlPassword! -p 1444:1433 --name mssql -d mcr.microsoft.com/mssql/server:2022-latest

This starts a container named mssql that listens on port 1444 on your local computer. The sa password is SqlPassword!.

To ensure that the SQL Server container is running, type the following from within a terminal window:

docker ps

You will see a message like the following:

CONTAINER ID   IMAGE                                        ...... NAMES
e84053717017   mcr.microsoft.com/mssql/server:2022-latest   ...... mssql

Creating our ASP.NET MVC App

Create an ASP.NET MVC app named AspMSSQL with SQL Server support by running the following terminal window commands:

dotnet new mvc --auth individual --use-local-db -o AspMSSQL
cd AspMSSQL

To run the web application and see what it looks like, enter the following command:

dotnet watch

The app starts in your default browser and looks like this:

Let us configure our web application so that the connection string can be constructed from environment variables. Open the Program.cs file in your favourite editor and comment out (or delete) the following statement:

var connectionString = builder.Configuration.GetConnectionString("DefaultConnection") ?? throw new InvalidOperationException("Connection string 'DefaultConnection' not found.");

Replace the above code with the following:

var host = builder.Configuration["DBHOST"] ?? "localhost";
var port = builder.Configuration["DBPORT"] ?? "1444";
var password = builder.Configuration["DBPASSWORD"] ?? "SqlPassword!";
var db = builder.Configuration["DBNAME"] ?? "mydb";
var user = builder.Configuration["DBUSER"] ?? "sa";

string connectionString = $"Server={host},{port};Database={db};UID={user};PWD={password};TrustServerCertificate=True;";

Five environment variables are used in the database connection string: DBHOST, DBPORT, DBPASSWORD, DBNAME and DBUSER. If an environment variable is not set, it takes on a default value: localhost, 1444, SqlPassword!, mydb and sa respectively.
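If you want to preview the connection string that these defaults produce before running the app, the same fallback logic can be mirrored in a shell session with parameter expansion (a quick sketch; the variable names match the ones read by Program.cs):

```shell
# Build the connection string with the same defaults used in Program.cs.
# ${VAR:-default} substitutes the default when VAR is unset or empty.
host="${DBHOST:-localhost}"
port="${DBPORT:-1444}"
password="${DBPASSWORD:-SqlPassword!}"
db="${DBNAME:-mydb}"
user="${DBUSER:-sa}"

connectionString="Server=${host},${port};Database=${db};UID=${user};PWD=${password};TrustServerCertificate=True;"
echo "$connectionString"
```

With none of the five variables set, this prints the exact string the app falls back to: Server=localhost,1444;Database=mydb;UID=sa;PWD=SqlPassword!;TrustServerCertificate=True;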

Go ahead and delete the connection string from appsettings.json as it is not needed anymore:

"ConnectionStrings": {
  "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=aspnet-AspMSSQL; MultipleActiveResultSets=true"
},

Entity Framework Migrations

We can instruct our application to automatically apply any outstanding Entity Framework migrations. This is done by adding the following statement to Program.cs right before the last app.Run() statement (you will also need a "using Microsoft.EntityFrameworkCore;" directive at the top of the file, since Migrate() is an extension method from that namespace):

using (var scope = app.Services.CreateScope()) {
    var services = scope.ServiceProvider;

    var context = services.GetRequiredService<ApplicationDbContext>();    
    context.Database.Migrate();
}

Test app

Now, let's test our web app and see whether it can talk to the containerized MSSQL database server. Run the web application with the following terminal command:

dotnet watch

Click on the Register link on the top right side.

I entered an Email, Password and Confirm password, then clicked on the Register button. The website then displays the following page that requires that you confirm the email address:

Click on the “Click here to confirm your account” link. This leads you to a confirmation page:

Login with the email address and password that you registered with.

The message on the top right side confirms that the user was saved and that communication between the ASP.NET MVC app and SQL Server is working as expected.

Dockerizing the app

We will generate the release version of the application by executing the following command from a terminal window in the root directory of the web app:

dotnet publish -o distrib

The above command instructs dotnet to produce the release version of the application in the distrib directory. When you inspect the distrib directory, you will find AspMSSQL.dll, the main DLL file that is the entry point into the web application. Let us run the DLL. To do this, change to the distrib directory, then run it with:

cd distrib
dotnet AspMSSQL.dll

This displays the familiar messages from the web server that the app is ready to be accessed from a browser. 

Hit CTRL+C to stop the web server.

We now have a good idea about the ASP.NET artifacts that need to be copied into a container.

In a terminal window, stop and remove the MSSQL container with:

docker rm -f mssql

Return to the root directory of your project by typing the following in a terminal window:

cd ..

Docker image for web app

We need to create a docker image that will contain the .NET runtime. At the time of writing this article, the current version of .NET is 10.0.

We can exclude files from being copied into the container image. Add a file named .dockerignore in the root of the web application with this content:

**/.git
**/.gitignore
**/node_modules
**/npm-debug.log
**/.DS_Store
**/bin
**/obj
**/.vs
**/.vscode
**/.env
**/*.user
**/*.suo
**/.idea
**/coverage
**/.nyc_output
**/docker-compose*.yml
**/Dockerfile*
**/.github
**/README.md
**/LICENSE

Create a text file named Dockerfile and add to it the following content:

# Build stage
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
WORKDIR /src

# Copy project file and restore dependencies
COPY ["AspMSSQL.csproj", "."]
RUN dotnet restore "AspMSSQL.csproj"

# Copy the rest of the source code
COPY . .

# Build the application
RUN dotnet build "AspMSSQL.csproj" -c Release -o /app/build

# Publish stage
FROM build AS publish
RUN dotnet publish "AspMSSQL.csproj" -c Release -o /app/publish /p:UseAppHost=false

# Runtime stage
FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS runtime
WORKDIR /app

# Install curl for health checks (optional)
RUN apt-get update && apt-get install -y curl && rm -rf /var/lib/apt/lists/*

# Copy published application from publish stage
COPY --from=publish /app/publish .

# Expose port 8080 (HTTP)
EXPOSE 8080

# Set environment variables
ENV ASPNETCORE_URLS=http://+:8080
ENV ASPNETCORE_ENVIRONMENT=Production

# Run the application
ENTRYPOINT ["dotnet", "AspMSSQL.dll"]

docker-compose.yml

We will next create a Docker Compose file that orchestrates the entire system involving two containers: an MSSQL database server and our web app. In the root folder of your application, create a text file named docker-compose.yml and add to it the following content:

services:
  # SQL Server Service
  mssql:
    image: mcr.microsoft.com/mssql/server:2022-latest
    container_name: aspmsql-mssql
    environment:
      ACCEPT_EULA: 'Y'
      MSSQL_SA_PASSWORD: 'SqlPassword!123'
      MSSQL_PID: 'Developer'
    ports:
      - "1433:1433"
    volumes:
      - sqlserver-data:/var/opt/mssql/data

  # ASP.NET Application Service
  aspmsql-app:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: aspmsql-app
    depends_on:
      - mssql
    environment:
      ASPNETCORE_ENVIRONMENT: Development
      ASPNETCORE_URLS: http://+:8080
      DBHOST: mssql
      DBPORT: 1433
      DBUSER: sa
      DBPASSWORD: SqlPassword!123
      DBNAME: AspMSSQLDb
    ports:
      - "8080:8080"
    restart: unless-stopped

volumes:
  sqlserver-data:
    driver: local
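One caveat about the compose file above: depends_on in this form only waits for the mssql container to start, not for SQL Server inside it to be ready to accept connections, so the web app's automatic migration can fail on a slow cold start (the restart: unless-stopped policy papers over this by retrying). A possible refinement is a health check, sketched below. Note that the healthcheck block and the service_healthy condition are additions, not part of the original file, and the sqlcmd path varies between image versions (older images use /opt/mssql-tools/bin/sqlcmd without the -C flag):

```yaml
services:
  mssql:
    # ... same image, environment, ports and volumes as above ...
    healthcheck:
      # A successful "SELECT 1" means the database engine is accepting connections.
      test: ["CMD-SHELL", "/opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P 'SqlPassword!123' -C -Q 'SELECT 1' || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10

  aspmsql-app:
    # ... same build, environment and ports as above ...
    depends_on:
      mssql:
        condition: service_healthy
```

With this in place, compose delays starting the web app until SQL Server reports healthy, which makes the startup order explicit.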

Running the yml file

To find out if this all works, go to a terminal window and run the following command:

docker-compose up -d --build

Point your browser to http://localhost:8080/ and you should see the main web page. Register a user, confirm the email, and login. It should all work as expected.

Cleanup

Run the following command to shut down docker-compose and clean up:

docker-compose down

Conclusion

We have seen how straightforward it is to containerize an application and its database with docker-compose.

Thursday, January 22, 2026

Using Microsoft Agent Framework with AI models hosted on GitHub

Overview

In this article I will show you how you can use the Microsoft Agent Framework to experiment with AI models hosted on GitHub. GitHub AI Models are intended for learning, experimentation and proof-of-concept activities. The GitHub Models feature is subject to various limits (including requests per minute, requests per day, tokens per request, and concurrent requests) and is not designed for production use cases.

Getting Started

There are many AI models from a variety of vendors that you can choose from. The starting point is to visit https://github.com/marketplace?type=models to work with free GitHub AI Models. At the time of writing, these are a subset of the models available:


For this article, I will use the "OpenAI GPT-4o" model highlighted above. 

Selecting the "OpenAI GPT-4o" model leads you to the page below. Click on the "<> Use this model" button.

On the dialog that pops up, select C#, then click on "Run a basic code sample".

On the next dialog, you will see the signature of the model. In our case it is "openai/gpt-4o".


Click on "1. Configure authentication".

Next, click on the "Create Personal Access Token" button. 

You may need to go through a verification process.

Make your selections.

On the next pop-up, click on "Generate token".

Copy the newly generated token and place it in a safe place, because you cannot view this token again once you leave this page.

Let's use Microsoft Agent Framework (MAF)

In a working directory, create a C# console app named GitHubAiModelMAF inside a terminal window with the following command:

dotnet new console -n GitHubAiModelMAF

Change into the newly created directory GitHubAiModelMAF with:

cd GitHubAiModelMAF

Next, let's add the following packages to our console application:

dotnet add package Azure.AI.OpenAI
dotnet add package Azure.Identity
dotnet add package Microsoft.Agents.AI.OpenAI -v 1.0.0-preview.251114.1
dotnet add package Microsoft.Extensions.Configuration.Json

Create a file named appsettings.json. Add this to appsettings.json:

{
    "GitHub": {
        "Token": "PUT-PERSONAL-ACCESS-TOKEN-HERE",
        "ApiEndpoint": "https://models.github.ai/inference",
        "Model": "openai/gpt-4o"
    }
}

Replace "PUT-PERSONAL-ACCESS-TOKEN-HERE" with the personal access token that you got from GitHub.

Next, open Program.cs in an editor and delete all contents of the file. Add this code to Program.cs:

using System.Text;
using Azure;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Configuration;
using OpenAI;
using OpenAI.Chat;

var config = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .Build();

string? apiKey = config["GitHub:Token"];
string? model = config["GitHub:Model"] ?? "openai/gpt-4o-mini";
string? endpoint = config["GitHub:ApiEndpoint"] ?? "https://models.github.ai/inference";

IChatClient chatClient = new ChatClient(
    model,
    new AzureKeyCredential(apiKey!),
    new OpenAIClientOptions
    {
        Endpoint = new Uri(endpoint)
    }
)
.AsIChatClient();

var instructions = @"You are an AI assistant that helps people find information. 
    The response must be brief and should not exceed one paragraph.
    If you do not know the answer then simply say 'I do not know the answer'.";

var agent = chatClient
    .CreateAIAgent(
        instructions: instructions,
        name: "Assistant"
    );

// Instantiate a StringBuilder
StringBuilder strBuilder = new();

// User question & answer loop
while (true)
{
    Console.Write("Q: ");
    var result = await agent.RunAsync(Console.ReadLine()!);
    Console.WriteLine(result);
}

Run the application:

I asked the question "What is the longest river in the world?" and the AI answered as shown above. 

Conclusion

GitHub AI models are easy to access with the new Microsoft Agent Framework (MAF). I hope you come up with great AI-driven applications that make a difference to our world.


Sunday, October 19, 2025

Small Language Models with AI Toolkit Extension in VS Code

In this article, we will see how we can work with small language models (SLM) from the AI Toolkit extension in VS Code. Though the toolkit can do other things, our focus is to consume an ONNX SLM hosted by Visual Studio Code from a C# application. We will first look at an example that is based on OpenAI packages. We will then look at a similar example based on the Semantic Kernel approach.

Companion Video: https://youtu.be/V_eWAM2fxJg

Prerequisites

You will need:

  • The latest version of VS Code
  • .NET version 9.0 or higher

What are small language models (SLMs)?

Small Language Models (SLMs) are compact versions of large language models (LLMs), designed to deliver strong performance in natural language tasks while using significantly fewer computational resources.

What is the AI Toolkit Extension in VS Code?

The AI Toolkit Extension for Visual Studio Code is a powerful, all-in-one environment for building, testing, and deploying generative AI applications—especially useful for developers working with small language models (SLMs).

Getting Started

Install the following Visual Studio Code extension:


Click on the three dots (...) in the left navigation of VS Code, and choose "AI Toolkit".

Click on "Model Catalog".

Scroll down the list until you find “Local Models” >> ONNX >> Mistral 7B – (CPU – Small, Standard) >> + Add Model.

Once the model is fully downloaded, it will appear under Models >> ONNX.

Right-click on the model and select “Copy Model Name”.

I copied the following name for the "Mistral 7B" model: 

mistral-7b-v02-int4-cpu

Using OpenAI packages

Create a C# console application named AIToolkitOpenAI and add to it required packages with the following terminal window commands:

dotnet new console -n AIToolkitOpenAI
cd AIToolkitOpenAI
dotnet add package OpenAI

Start VS Code with:

code .

Click on the "AI Toolkit" tab in VS Code and make sure that the "Mistral 7B" model is running.

Replace content of Program.cs with this code:

using OpenAI;
using OpenAI.Chat;
using System.ClientModel;
using System.Text;

var model = "mistral-7b-v02-int4-cpu";
var baseUrl = "http://localhost:5272/v1/"; // root URL for local OpenAI-like server
var apikey = "unused";

OpenAIClientOptions options = new OpenAIClientOptions();
options.Endpoint = new Uri(baseUrl);
ApiKeyCredential credential = new ApiKeyCredential(apikey);
ChatClient client = new OpenAIClient(credential, options).GetChatClient(model);

// Build the prompt
StringBuilder prompt = new StringBuilder();
prompt.AppendLine("You will analyze the sentiment of the following product reviews.");
prompt.AppendLine("Each line is its own review. Output the sentiment of each review in");
prompt.AppendLine("a bulleted list and then provide a general sentiment of all reviews.");
prompt.AppendLine();
prompt.AppendLine("I bought this product and it's amazing. I love it!");
prompt.AppendLine("This product is terrible. I hate it.");
prompt.AppendLine("I'm not sure about this product. It's okay.");
prompt.AppendLine("I found this product based on the other reviews. It worked");

// send the prompt to the model and wait for the text completion
var response = await client.CompleteChatAsync(prompt.ToString());
// display the response
Console.WriteLine(response.Value.Content[0].Text);

Run the application with:

dotnet run

The application does sentiment analysis on what customers think of the product.

This is a sample of the output:

* I bought this product and it's amazing. I love it!: Positive sentiment
* This product is terrible. I hate it.: Negative sentiment
* I'm not sure about this product. It's okay.: Neutral sentiment
* I found this product based on the other reviews. It worked for me.: Positive sentiment

General sentiment: The reviews contain both positive and negative sentiments. Some customers expressed their love for the product, while others expressed their dislike. Neutral sentiment was also expressed by one customer. Overall, the reviews suggest that the product has the potential to elicit strong feelings from customers, both positive and negative.

Semantic Kernel packages

Create a C# console application named AIToolkitSK and add to it required packages with the following terminal window commands:

dotnet new console -n AIToolkitSK
cd AIToolkitSK
dotnet add package Microsoft.SemanticKernel

Start VS Code with:

code .

Click on the "AI Toolkit" tab in VS Code and make sure that the "Mistral 7B" model is running.

Replace content of Program.cs with this code:

using System.Text;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var model = "mistral-7b-v02-int4-cpu";
var baseUrl = "http://localhost:5272/v1/";
var apikey = "unused";

// Create a chat completion service
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(modelId: model, apiKey: apikey, endpoint: new Uri(baseUrl))
    .Build();
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddSystemMessage("You are a useful chatbot. Always reply in a funny way with short answers.");
var settings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 500,
    Temperature = 1,
};

while (true)
{
    Console.Write("\nUser: ");
    var userInput = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(userInput)) break;

    history.AddUserMessage(userInput);

    var responseBuilder = new StringBuilder();
    Console.Write("\nAI: ");
    await foreach (var message in chat.GetStreamingChatMessageContentsAsync(userInput, settings, kernel))
    {
        responseBuilder.Append(message);
        Console.Write(message);
    }
}

This is a simple chat completion app.

Run the application with:

dotnet run

My prompt was:

Red or white wine with beef steak?

The response was:

AI:  Both red and white wines can pair well with beef steak, but a red wine is generally the more traditional choice. Red wines, such as Cabernet Sauvignon, Merlot, or Pinot Noir, have flavors that complement the rich and savory flavors of beef. However, if you prefer a lighter taste, a white wine such as Pinot Noir or Chardonnay can also work well with beef steak. Ultimately, it comes down to personal preference.

Conclusion

We have seen how to use SLMs hosted by VS Code through the AI Toolkit extension. We were able to communicate with the model from two C# applications: (1) an app that uses OpenAI packages, and (2) an app that uses Semantic Kernel.

Monday, October 6, 2025

Explore Docker MCP Toolkit and VS Code

To explore the Docker MCP Toolkit, we will use two MCP servers in the toolkit, namely PostgreSQL and Playwright.

Companion Video: https://youtu.be/43oJi_gAucU

What is Docker MCP Toolkit?

The Docker MCP Toolkit enables hosting and managing MCP servers. These servers expose APIs for specific development tasks, such as retrieving GitHub issue data or querying databases using natural language.

Prerequisites

You will need the following before you can continue:

  • Docker Desktop (latest version)
  • Visual Studio Code (latest version)
  • GitHub Copilot extension for VS Code
  • GitHub Copilot with Chat and Agent Mode enabled

1) Explore PostgreSQL MCP Server

We will use natural language to query a PostgreSQL database that is already pre-loaded with the sample Northwind database. To run the PostgreSQL server in a docker container on your computer, execute the following command from any terminal window:


docker run --name psqlnw -e POSTGRES_PASSWORD=VerySecret -p 5433:5432 -d melmasry/nw-psql:1.0.0

Start “Docker Desktop” on your computer and go to the Containers tab on the left navigation. You will see that the psqlnw container is running.


Next, let us use the Docker MCP Toolkit. In Docker Desktop, click on “MCP Toolkit” in the left navigation.

Click on the Catalog tab. This will show you a list of MCP Servers that are ready for you to explore. 

We will start with PostgreSQL. Find the PostgreSQL MCP Server by entering ‘postgr’ in the filter field. Then click on + to add it to your list.

You will be asked to enter a secret. This is simply the connection string, which follows this format:

postgresql://readonly_user:readonly_password@host:port/database_name

In our case, this would be:

postgresql://postgres:VerySecret@host.docker.internal:5433/northwind
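Since the secret is just a URI built from known parts, you can assemble it in a shell session before pasting it into the dialog (a small sketch; the values match the psqlnw container started earlier):

```shell
# Assemble the PostgreSQL connection URI from its components.
pg_user="postgres"
pg_password="VerySecret"
pg_host="host.docker.internal"   # lets a container reach a port published on the host
pg_port="5433"                   # host port mapped to the container's 5432
pg_db="northwind"

pg_uri="postgresql://${pg_user}:${pg_password}@${pg_host}:${pg_port}/${pg_db}"
echo "$pg_uri"
```

This prints postgresql://postgres:VerySecret@host.docker.internal:5433/northwind, the same value shown above.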

Click on the "My Servers" tab to see the MCP servers that you have chosen.

Connect Docker MCP Toolkit to Visual Studio Code

Let us query our northwind database in PostgreSQL from VS Code. Go to a working directory and execute these commands to create an empty folder on your computer:

mkdir mcp-server
cd mcp-server

In the same terminal window, log out of and back in to Docker with:

docker logout  
docker login

Start VS Code in the folder with:

code .

In VS Code, open the Command Palette by pressing Ctrl + Shift + P (or Cmd + Shift + P on macOS).

Select “Add MCP Server”.

Select “Command (stdio) Run a local command that implements the MCP protocol Manual Install”.

Enter the gateway command:

docker mcp gateway run

Give the server an ID named: 

my-mcp-server

Choose “Workspace Available in this workspace, runs locally”.


Click Trust.

A file named mcp.json is created in a .vscode folder with this content:

Note that the server is running. 

Inside the Github Copilot Chat window, choose Agent and any Claude model. Click on the tools icon to see active MCP servers.

You will find MCP servers that are configured in VS Code. Among them will be the one we enabled in the Docker MCP Toolkit.

You will notice that it only has one tool: “query Run a read-only SQL query”. This is all that is needed to query the northwind database.

Click the OK button to close the tools popup.

Enter this prompt in the GitHub Copilot window:

What are the tables in the PostgreSQL northwind database?

You will be asked to click on the Allow button.

Thereafter, it displays a list of database tables in the northwind database.

Try this other prompt:

What are the products supplied by "Exotic Liquids"?

You will get a similar response to this:

2) Explore Playwright MCP Server

Back in Docker Desktop >> MCP Toolkit, add Playwright.

You now have two MCP servers in our list: Playwright and PostgreSQL.

Back in VS Code, restart the MCP Server in the .vscode/mcp.json file.

In the VS Code GitHub Copilot Chat window, enter this prompt:

Using the tools provided from the Docker MCP Server, navigate to https://www.bbc.com/, find the two most important business stories.

I got the following response at the time of writing this article:

Conclusion

I hope you found this article useful. These are early days of MCP servers. I am sure things will evolve much more in this very important space.

Saturday, September 20, 2025

Build web-based MCP server and client with ASP.NET & GitHub Models

Overview

This article demonstrates how to build a basic MCP server and client using ASP.NET and Server Sent Events (SSE). The MCP server exposes tools that can be discovered and used by LLMs, while the client application connects these tools to an AI service. GitHub AI models are used on the client application.

Server Sent Events (SSE)

SSE transport enables server-to-client streaming with HTTP POST requests for client-to-server communication. This approach facilitates communication between frontend microservices and backend business contexts through the MCP server. 

Companion Video: https://youtu.be/vTBtPU7AnT4

Source Code: https://github.com/medhatelmasry/AspMCP

Prerequisites

  • GitHub account
  • Visual Studio Code
  • .NET 9.0 (or later)

Setup

We will create a .NET solution comprising an ASP.NET web server project and a WebAPI client project, then add the required packages with these terminal window commands:

mkdir AspMCP
cd AspMCP
dotnet new sln
dotnet new web -n ServerMCP
dotnet sln add ./ServerMCP/ServerMCP.csproj
cd ServerMCP
dotnet add package Azure.AI.OpenAI
dotnet add package Microsoft.Extensions.AI
dotnet add package Microsoft.Extensions.AI.OpenAI --prerelease
dotnet add package ModelContextProtocol --prerelease
dotnet add package ModelContextProtocol.AspNetCore --prerelease
cd ..
dotnet new webapi --use-controllers -n ClientMCP
dotnet sln add ./ClientMCP/ClientMCP.csproj
cd ClientMCP
dotnet add package Azure.AI.OpenAI
dotnet add package Azure.Identity
dotnet add package Microsoft.Extensions.AI
dotnet add package Microsoft.Extensions.AI.OpenAI --prerelease
dotnet add package ModelContextProtocol --prerelease
dotnet add package Swashbuckle.AspNetCore
cd ..

Open the solution in VS Code. To do that, you can enter this command in a terminal window:

code .

Build ASP.NET Server

In the ServerMCP project, add this service to Program.cs before “var app = builder.Build();”:

builder.Services.AddMcpServer()
.WithHttpTransport()
.WithToolsFromAssembly();

In the same server Program.cs file, add this code just before “app.Run();”:

// Add MCP middleware
app.MapMcp();

Delete (or comment out) this code in the Program.cs file:

app.MapGet("/", () => "Hello World!");

In the server project, create a folder named McpTools and add to it a GreetingTool class with this code:

using System.ComponentModel;
using ModelContextProtocol.Server;

[McpServerToolType]
public sealed class GreetingTool {
  public GreetingTool() { }

  [McpServerTool, Description("Says Hello to a user")]
  public static string Echo(string username) {
    return "Hello " + username;
  }
}

Test Server

In a terminal window inside the server project, run this command to start the server:

dotnet watch

Point your browser to https://localhost:????/sse. 

Note : replace ???? with your port number. 

This is what you will see:


Build Client

In the client project:

1) Add these settings to the appsettings.json file.

"AI": {
  "ModelName": "gpt-4o-mini",
  "Endpoint": "https://models.inference.ai.azure.com",
  "ApiKey": "PUT-GITHUB-TOKEN-HERE",
  "MCPServiceUri": "http://localhost:5053/sse"
}

NOTE: adjust values for ApiKey and MCPServiceUri accordingly.

2) Register Services via Dependency Injection. Add the following to your client Program.cs:
var endpoint = builder.Configuration["AI:Endpoint"];
var apiKey = builder.Configuration["AI:ApiKey"];
var model = builder.Configuration["AI:ModelName"];

builder.Services.AddChatClient(services =>
  new ChatClientBuilder(
    (
      !string.IsNullOrEmpty(apiKey)
        ? new AzureOpenAIClient(new Uri(endpoint!), new AzureKeyCredential(apiKey))
        : new AzureOpenAIClient(new Uri(endpoint!), new DefaultAzureCredential())
    ).GetChatClient(model).AsIChatClient()
  )
  .UseFunctionInvocation()
  .Build());
 

3) To view the swagger UI, add this code right below “app.MapOpenApi();”:
app.UseSwaggerUI(options => {
  options.SwaggerEndpoint("/openapi/v1.json", "MCP Server");
  options.RoutePrefix = "";
});
 

4) Edit Properties/launchSettings.json, change launchBrowser to true under http and https. This is so that the app automatically launches in a browser when you run the project with “dotnet watch”.

In the Controllers folder, create a ChatController class with the following code:

[ApiController]
[Route("[controller]")]
public class ChatController : ControllerBase {
  private readonly ILogger<ChatController> _logger;
  private readonly IChatClient _chatClient;

  private readonly IConfiguration? _configuration;
  public ChatController(
    ILogger<ChatController> logger,
    IChatClient chatClient,
    IConfiguration configuration
  ) {
    _logger = logger;
    _chatClient = chatClient;
    _configuration = configuration;
  }

  [HttpPost(Name = "Chat")]
  public async Task<string> Chat([FromBody] string message) {
    // Create MCP client connecting to our MCP server
    var mcpClient = await McpClientFactory.CreateAsync(
      new SseClientTransport(
          new SseClientTransportOptions {
              Endpoint = new Uri(_configuration?["AI:MCPServiceUri"] ?? throw new InvalidOperationException("MCPServiceUri is not configured"))
          }
      )
    );
    // Get available tools from the MCP server
    var tools = await mcpClient.ListToolsAsync();

    // Set up the chat messages
    var messages = new List<ChatMessage> {
      new ChatMessage(ChatRole.System, "You are a helpful assistant.")
    };
    messages.Add(new(ChatRole.User, message));

    // Get streaming response and collect updates
    List<ChatResponseUpdate> updates = [];
    StringBuilder result = new StringBuilder();

    await foreach (var update in _chatClient.GetStreamingResponseAsync(
      messages,
      new() { Tools = [.. tools] }
    )) {
      result.Append(update);
      updates.Add(update);
    }
    
    // Add the assistant's responses to the message history
    messages.AddMessages(updates);
    return result.ToString();
  }
}

Test server and client

1) Start the server in a terminal window with: dotnet watch

2) Start the client in another terminal window also with: dotnet watch

The following swagger page will load in your browser. Click on POST, then “Try it out”.


3) Enter a string with your name, then click on Execute.


4) You will receive a response like below.


A more realistic solution

To develop a more realistic solution, we will add a database to the server project. Thereafter, our MCP client can query the database using natural language. This is very compelling for line-of-business applications.

In the server project, make these changes.

1) Add these packages:
dotnet add package Microsoft.EntityFrameworkCore
dotnet add package Microsoft.EntityFrameworkCore.Design
dotnet add package Microsoft.EntityFrameworkCore.Sqlite
dotnet add package Microsoft.EntityFrameworkCore.SQLite.Design
dotnet add package Microsoft.EntityFrameworkCore.Tools
dotnet add package CsvHelper
 

2) Add this to appsettings.json:

"ConnectionStrings": {
  "DefaultConnection": "DataSource=beverages.sqlite;Cache=Shared"
}

3) Copy CSV data from this link. Then, save the content in a file named beverages.csv and put that file in a wwwroot folder in your project.

4) In a Models folder, add a class named Beverage with this code:
public class Beverage {
  [Required]
  public int BeverageId { get; set; }

  public string? Name { get; set; }

  public string? Type { get; set; }

  public string? MainIngredient { get; set; }

  public string? Origin { get; set; }

  public int? CaloriesPerServing { get; set; }

  public void DisplayInfo() {
    Console.WriteLine($"{Name} is a {Type} from {Origin} made with {MainIngredient}. It has {CaloriesPerServing} calories per serving.");
  }
}

5) In a Data folder, add this ApplicationDbContext class:
public class ApplicationDbContext : DbContext {
  public DbSet<Beverage> Beverages => Set<Beverage>();

  public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options)
    : base(options) { }

  protected override void OnModelCreating(ModelBuilder modelBuilder) {
    base.OnModelCreating(modelBuilder);
    modelBuilder.Entity<Beverage>().HasData(GetBeverages());
  }

  private static IEnumerable<Beverage> GetBeverages() {
    string[] p = { Directory.GetCurrentDirectory(), "wwwroot", "beverages.csv" };
    var csvFilePath = Path.Combine(p);

    var config = new CsvConfiguration(CultureInfo.InvariantCulture) {
      Encoding = Encoding.UTF8,
      PrepareHeaderForMatch = args => args.Header.ToLower(),
    };

    var data = new List<Beverage>().AsEnumerable();
    using (var reader = new StreamReader(csvFilePath)) {
      using (var csvReader = new CsvReader(reader, config)) {
        data = csvReader.GetRecords<Beverage>().ToList();
      }
    }

    return data;
  }
}
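One detail worth calling out: HasData compiles the seed rows into the migration, and every seeded entity must carry an explicit primary-key value. GetBeverages() therefore relies on the CSV supplying a BeverageId per row; in effect it hands HasData the equivalent of (values hypothetical):

```csharp
modelBuilder.Entity<Beverage>().HasData(
    // Hypothetical row; the real values come from beverages.csv.
    new Beverage { BeverageId = 1, Name = "Green Tea", Type = "Tea",
                   MainIngredient = "Tea leaves", Origin = "China", CaloriesPerServing = 2 }
);
```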

6) Add this service to Program.cs:
var connStr = builder.Configuration.GetConnectionString("DefaultConnection") 
    ?? throw new InvalidOperationException("Connection string 'DefaultConnection' not found.");
    
builder.Services.AddDbContext<ApplicationDbContext>(
    options => options.UseSqlite(connStr)
);

7) In the same Program.cs file, add this code right before "app.Run();":
// Apply database migrations on startup
using (var scope = app.Services.CreateScope()) {
    var services = scope.ServiceProvider;

    var context = services.GetRequiredService<ApplicationDbContext>();    
    context.Database.Migrate();
}

8) Now, we can add and apply database migrations with:
dotnet ef migrations add M1 -o Data/Migrations
dotnet ef database update

At this stage, the beverages.sqlite database is populated with real data:


9) Let us add a class that queries the SQLite database. In a Services folder, add a class named BeverageService with this content:

using Microsoft.EntityFrameworkCore;

public class BeverageService(ApplicationDbContext db) {
  public async Task<string> GetBeveragesJson() {
    var beverages = await db.Beverages.ToListAsync();
    return System.Text.Json.JsonSerializer.Serialize(beverages);
  }

  public async Task<string> GetBeverageByIdJson(int id) {
    var beverage = await db.Beverages.FindAsync(id);
    return System.Text.Json.JsonSerializer.Serialize(beverage);
  }

  public async Task<string> GetBeveragesContainingNameJson(string name) {
    var beverages = await db.Beverages
      .Where(b => b.Name!.Contains(name))
      .ToListAsync();

    return System.Text.Json.JsonSerializer.Serialize(beverages);
  }


  public async Task<string> GetBeveragesContainingTypeJson(string type) {
    var beverages = await db.Beverages
      .Where(b => b.Type!.Contains(type))
      .ToListAsync();

    return System.Text.Json.JsonSerializer.Serialize(beverages);
  }

  public async Task<string> GetBeveragesByIngredientJson(string ingredient) {
    var beverages = await db.Beverages
      .Where(b => b.MainIngredient!.Contains(ingredient))
      .ToListAsync();

    return System.Text.Json.JsonSerializer.Serialize(beverages);
  }

  public async Task<string> GetBeveragesByCaloriesLessThanOrEqualJson(int calories) {
    var beverages = await db.Beverages
      .Where(b => b.CaloriesPerServing <= calories)
      .ToListAsync();

    return System.Text.Json.JsonSerializer.Serialize(beverages);
  }

  public async Task<string> GetBeveragesByOriginJson(string origin) {
    var beverages = await db.Beverages
      .Where(b => b.Origin!.Contains(origin))
      .ToListAsync();

    return System.Text.Json.JsonSerializer.Serialize(beverages);
  }
}
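The tool class in the next step constructs BeverageService by hand from the injected DbContext. If you would rather have it constructor-injected, a one-line registration in Program.cs is enough (optional; not required for the code in this post):

```csharp
// Optional: make BeverageService available through dependency injection.
builder.Services.AddScoped<BeverageService>();
```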

10) Next, expose MCP tooling that interacts with the SQLite data. In the McpTools folder, add a class named BeverageTool with this code:

using System.ComponentModel;
using ModelContextProtocol.Server;

[McpServerToolType]
public class BeverageTool
{
    private readonly BeverageService _beverageService;

    public BeverageTool(ApplicationDbContext db)
    {
        _beverageService = new BeverageService(db);
    }

    [McpServerTool, Description("Get a list of beverages and return as JSON array")]
    public async Task<string> GetBeveragesJson()
        => await _beverageService.GetBeveragesJson();

    [McpServerTool, Description("Get a beverage by ID and return as JSON")]
    public async Task<string> GetBeverageByIdJson(
        [Description("The ID of the beverage to get details for")] int id)
        => await _beverageService.GetBeverageByIdJson(id);

    [McpServerTool, Description("Get beverages by name and return as JSON")]
    public async Task<string> GetBeveragesByNameJson(
        [Description("The name of the beverage to filter by")] string name)
        => await _beverageService.GetBeveragesContainingNameJson(name);

    [McpServerTool, Description("Get beverages by type and return as JSON")]
    public async Task<string> GetBeveragesByTypeJson(
        [Description("The type of the beverage to filter by")] string type)
        => await _beverageService.GetBeveragesContainingTypeJson(type);

    [McpServerTool, Description("Get beverages by ingredient and return as JSON")]
    public async Task<string> GetBeveragesByIngredientJson(
        [Description("The ingredient of the beverage to filter by")] string ingredient)
        => await _beverageService.GetBeveragesByIngredientJson(ingredient);

    [McpServerTool, Description("Get beverages by calories less than or equal to and return as JSON")]
    public async Task<string> GetBeveragesByCaloriesLessThanOrEqualJson(
        [Description("The maximum calories per serving to filter by")] int calories)
        => await _beverageService.GetBeveragesByCaloriesLessThanOrEqualJson(calories);

    [McpServerTool, Description("Get beverages by origin and return as JSON")]
    public async Task<string> GetBeveragesByOriginJson(
        [Description("The origin of the beverage to filter by")] string origin)
        => await _beverageService.GetBeveragesByOriginJson(origin);
}
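For the tool to be discovered, the MCP server registration (done earlier in this series) must scan the assembly for [McpServerToolType] classes. If you are wiring it up fresh, the registration in Program.cs looks roughly like this — a sketch based on the ModelContextProtocol SDK's assembly-scan API:

```csharp
// Register the MCP server and pick up BeverageTool via assembly scan.
builder.Services
    .AddMcpServer()
    .WithHttpTransport()      // assumes the HTTP transport used earlier in the series
    .WithToolsFromAssembly(); // discovers classes marked [McpServerToolType]
```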

You can now test the Beverage MCP service. Start the server project followed by the client project. In the client app, enter the prompt: beverages high in calories.

Here is the output:


Conclusion

You are now able to create your own MCP server using .NET. Of course, you can deploy the server to the cloud and use it from a variety of client applications and devices. Go ahead and create MCP servers that do wonderful things.