Monday, February 16, 2026

Scaffolding Blazor pages with microsoft.dotnet-scaffold

In this tutorial, I will show you how to build a server-side Blazor application that connects directly to a SQLite database using Entity Framework Core. We will then scaffold the CRUD pages with the microsoft.dotnet-scaffold tool.

Source Code: https://github.com/medhatelmasry/BlazorStudents

Pre-requisites

  1. .NET SDK
  2. VS Code (or any other .NET editor)
  3. dotnet-ef tool
  4. microsoft.dotnet-scaffold tool

Getting Started

In a terminal window, go to your working directory. Enter the following command to create a Server-Side Blazor application inside a directory called BlazorStudents:

dotnet new blazor -int server --auth individual -o BlazorStudents
cd BlazorStudents

Run the application by entering the following command:

dotnet watch

The following page will load into your default browser:


Open the BlazorStudents folder in Visual Studio Code (or any other .NET editor).

We will work with a very simple student model. Therefore, add a Student class file in a folder named Models with the following content: 

using System.ComponentModel.DataAnnotations;

namespace BlazorStudents.Models;

public class Student {
    public int StudentId { get; set; }

    [Required(ErrorMessage = "You must enter first name.")]
    public string? FirstName { get; set; }

    [Required(ErrorMessage = "You must enter last name.")]
    public string? LastName { get; set; }

    [Required(ErrorMessage = "You must enter school.")]
    public string? School { get; set; }

    [Required(ErrorMessage = "You must enter date of birth.")]
    public DateTime? DateOfBirth { get; set; }
}

From within a terminal window at the root of your BlazorStudents project, run the following command to add a required package:

dotnet add package CsvHelper

The CsvHelper package will help us read data from a CSV file.

Developers prefer having sample data when building data-driven applications. Therefore, we will create some sample data to ensure that our application behaves as expected. Copy the CSV data at https://gist.github.com/medhatelmasry/e8d4edc2772a538419adda45e8f82685 and save it in a text file named students.csv in the wwwroot folder.
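Whatever data you use, CsvHelper will match CSV header names to the Student property names, and a StudentId column is needed because HasData requires explicit key values. An illustrative first few lines (hypothetical rows, not the actual gist contents):

```csv
StudentId,FirstName,LastName,School,DateOfBirth
1,Jane,Doe,Engineering,2001-05-14
2,John,Smith,Business,2000-11-02
```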

Edit Data/ApplicationDbContext.cs and add to the class the following property and methods:

// This code requires the following usings at the top of ApplicationDbContext.cs:
// using System.Globalization; using CsvHelper; using CsvHelper.Configuration;
public DbSet<Student> Students => Set<Student>();

protected override void OnModelCreating(ModelBuilder modelBuilder) {
  base.OnModelCreating(modelBuilder);
  modelBuilder.Entity<Student>().HasData(GetStudents());
}

private static IEnumerable<Student> GetStudents() {
  string[] p = { Directory.GetCurrentDirectory(), "wwwroot", "students.csv" };
  var csvFilePath = Path.Combine(p);

  var config = new CsvConfiguration(CultureInfo.InvariantCulture) {
    PrepareHeaderForMatch = args => args.Header.ToLower(),
  };

  var data = new List<Student>().AsEnumerable();
  using (var reader = new StreamReader(csvFilePath)) {
    using (var csvReader = new CsvReader(reader, config)) {
      data = csvReader.GetRecords<Student>().ToList();
    }
  }

  return data;
}

Notice that the above code seeds the database with the contents of the wwwroot/students.csv file.

We are now ready to create Entity Framework migrations, create the database, and seed the data. If you have not done so already, install the Entity Framework CLI tool globally by running the following command in a terminal window:

dotnet tool install --global dotnet-ef

To have a clean start with Entity Framework migrations, delete the Data/Migrations folder and the Data/app.db file.

From within a terminal window inside the BlazorStudents root directory, run the following command to create migrations:

dotnet ef migrations add Stu -o Data/Migrations

This creates a migration file whose name ends with ....Stu.cs in the Data/Migrations folder.

The next step is to create the SQLite Data/app.db database file. This is done by adding the following code to Program.cs, right before app.Run():

using (var scope = app.Services.CreateScope()) {
    var services = scope.ServiceProvider;

    var context = services.GetRequiredService<ApplicationDbContext>();    
    context.Database.Migrate();
}

Run the application. This will create the database and seed the student data.

Scaffolding Blazor Components

If you have not done so already, install the dotnet scaffold tool with this terminal window command:

dotnet tool install -g microsoft.dotnet-scaffold

Let us scaffold the CRUD pages for students. Run the scaffold tool from the root directory of your application with this command:

dotnet scaffold

Follow these steps....

The scaffolding process adds the following pages to your application:

Edit Components/Layout/NavMenu.razor to add this menu item:

<div class="nav-item px-3">
    <NavLink class="nav-link" href="students">
        <span class="bi bi-lock-nav-menu" aria-hidden="true"></span> Students
    </NavLink>
</div>

Find the following code in Program.cs and delete it, because the scaffold tool has already registered ApplicationDbContext using a DbContextFactory:

builder.Services.AddDbContext<ApplicationDbContext>(options => options.UseSqlite(connectionString));
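For reference, the scaffolder's replacement registration looks roughly like the sketch below (check your generated Program.cs for the exact code, which may differ):

```csharp
// Registration added by the scaffolder (illustrative; exact options may vary).
// A DbContextFactory is used because Blazor components may need to create
// short-lived context instances rather than share one scoped context.
builder.Services.AddDbContextFactory<ApplicationDbContext>(options =>
    options.UseSqlite(connectionString));
```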

Run the application with:

dotnet watch

The app will launch in your default browser.

You should have the full CRUD experience once you click on Students in the left navigation.

QuickGrid

If you open StudentPages/Index.razor in your editor, you will notice that the QuickGrid Blazor component is being used to display students data. This is the code that uses QuickGrid:

<QuickGrid Class="table" Items="context.Students">
  <PropertyColumn Property="student => student.FirstName" />
  <PropertyColumn Property="student => student.LastName" />
  <PropertyColumn Property="student => student.School" />
  <PropertyColumn Property="student => student.DateOfBirth" />

  <TemplateColumn Context="student">
    <a href="@($"students/edit?studentid={student.StudentId}")">Edit</a> |
    <a href="@($"students/details?studentid={student.StudentId}")">Details</a> |
    <a href="@($"students/delete?studentid={student.StudentId}")">Delete</a>
  </TemplateColumn>
</QuickGrid>

Visit the QuickGrid for Blazor site for more information on this freely available component.

Component CSS

With Blazor components, it is easy to create CSS that targets an individual component. Simply create a file with the same name as the component file and append a .css extension.

For example:

Create a file named StudentPages/Index.razor.css in the same folder as the component itself with this styling:

p.create-new {
    background-color: orange;
}

In the StudentPages/Index.razor component, add the create-new class to the <p> tag that contains the "Create New" link:

<p class="create-new">
    <a href="students/create">Create New</a>
</p>

You will notice that styling is successfully applied to the StudentPages/Index.razor component.

The .NET scaffold tool can be used for more than creating pages for a Blazor app. Among other things, it can scaffold Aspire, API, MVC controllers, and Identity code.

Thursday, February 12, 2026

docker-compose with SQL Server and ASP.NET

This article discusses one approach to having your ASP.NET development environment work with SQL Server (MSSQL) running in a Docker container.

Source code: https://github.com/medhatelmasry/AspMSSQL

It is assumed that you have the following installed on your computer:

  1. .NET 10.0 
  2. Docker Desktop 
  3. ‘dotnet-ef’ tool 

Setting up SQL Server docker container

To download a suitable SQL Server image from Docker Hub and run it on your local computer, type the following command from within a terminal window:

docker run --cap-add SYS_PTRACE -e ACCEPT_EULA=Y -e MSSQL_SA_PASSWORD=SqlPassword! -p 1444:1433 --name mssql -d mcr.microsoft.com/mssql/server:2022-latest

This starts a container named mssql that listens on port 1444 on your local computer. The sa password is SqlPassword!.

To ensure that the SQL Server container is running, type the following from within a terminal window:

docker ps

You will see a message like the following:

CONTAINER ID   IMAGE                                        ...... NAMES
e84053717017   mcr.microsoft.com/mssql/server:2022-latest   ...... mssql

Creating our ASP.NET MVC App

Create an ASP.NET MVC app named AspMSSQL with SQL Server support by running the following terminal window commands:

dotnet new mvc --auth individual --use-local-db -o AspMSSQL
cd AspMSSQL

To run the web application and see what it looks like, enter the following command:

dotnet watch

The app starts in your default browser and looks like this:

Let us configure our web application so that the connection string can be constructed from environment variables. Open the Program.cs file in your favourite editor and comment out (or delete) the following statement:

var connectionString = builder.Configuration.GetConnectionString("DefaultConnection") ?? throw new InvalidOperationException("Connection string 'DefaultConnection' not found.");

Replace the above code with the following:

var host = builder.Configuration["DBHOST"] ?? "localhost";
var port = builder.Configuration["DBPORT"] ?? "1444";
var password = builder.Configuration["DBPASSWORD"] ?? "SqlPassword!";
var db = builder.Configuration["DBNAME"] ?? "mydb";
var user = builder.Configuration["DBUSER"] ?? "sa";

string connectionString = $"Server={host},{port};Database={db};UID={user};PWD={password};TrustServerCertificate=True;";

Five environment variables are used in the database connection string: DBHOST, DBPORT, DBPASSWORD, DBNAME and DBUSER. If an environment variable is not found, it takes on a default value: localhost, 1444, SqlPassword!, mydb and sa respectively.
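As a quick sanity check, the same defaulting logic can be sketched in shell to see the exact string the app will produce (illustrative only; the app itself does this in C# in Program.cs):

```shell
# Compose the connection string from environment variables, falling back
# to the same defaults used in Program.cs (illustrative only).
host=${DBHOST:-localhost}
port=${DBPORT:-1444}
password=${DBPASSWORD:-SqlPassword!}
db=${DBNAME:-mydb}
user=${DBUSER:-sa}
echo "Server=${host},${port};Database=${db};UID=${user};PWD=${password};TrustServerCertificate=True;"
```

With none of the variables set, this prints the default connection string that targets the mssql container mapped to local port 1444.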

Go ahead and delete the connection string from appsettings.json, as it is no longer needed:

"ConnectionStrings": {
  "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=aspnet-AspMSSQL; MultipleActiveResultSets=true"
},

Entity Framework Migrations

We can instruct our application to automatically process any outstanding Entity Framework migrations. This is done by adding the following statement to Program.cs right before the last app.Run() statement:

using (var scope = app.Services.CreateScope()) {
    var services = scope.ServiceProvider;

    var context = services.GetRequiredService<ApplicationDbContext>();    
    context.Database.Migrate();
}

Test app

Now, let's test our web app and see whether it can talk to the containerized MSSQL database server. Run the web application with the following terminal command:

dotnet watch

Click on the Register link on the top right side.

I entered an Email, Password and Confirm password, then clicked on the Register button. The website then displays the following page that requires that you confirm the email address:

Click on the “Click here to confirm your account” link. This leads you to a confirmation page:

Login with the email address and password that you registered with.

The message on the top right side confirms that the user was saved and that communication between the ASP.NET MVC app and SQL Server is working as expected.

Dockerizing the app

We will generate the release version of the application by executing the following command from a terminal window in the root directory of the web app:

dotnet publish -o distrib

The above command instructs dotnet to produce the release version of the application in the distrib directory. When you inspect the distrib directory, you will see files like the following:

The main DLL file, AspMSSQL.dll, is the entry point into the web application. Let us run it. To do this, change to the distrib directory, then run the DLL with:

cd distrib
dotnet AspMSSQL.dll

This displays the familiar messages from the web server that the app is ready to be accessed from a browser. 

Hit CTRL+C to stop the web server.

We now have a good idea about the ASP.NET artifacts that need to be copied into a container.

In a terminal window, stop and remove the MSSQL container with:

docker rm -f mssql

Return to the root directory of your project by typing the following in a terminal window:

cd ..

Docker image for web app

We need to create a docker image that will contain the .NET runtime. At the time of writing this article, the current version of .NET is 10.0.

We can exclude files from being copied into the container image. Add a file named .dockerignore in the root of the web application with this content:

**/.git
**/.gitignore
**/node_modules
**/npm-debug.log
**/.DS_Store
**/bin
**/obj
**/.vs
**/.vscode
**/.env
**/*.user
**/*.suo
**/.idea
**/coverage
**/.nyc_output
**/docker-compose*.yml
**/Dockerfile*
**/.github
**/README.md
**/LICENSE

Create a text file named Dockerfile and add to it the following content:

# Build stage
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
WORKDIR /src

# Copy project file and restore dependencies
COPY ["AspMSSQL.csproj", "."]
RUN dotnet restore "AspMSSQL.csproj"

# Copy the rest of the source code
COPY . .

# Build the application
RUN dotnet build "AspMSSQL.csproj" -c Release -o /app/build

# Publish stage
FROM build AS publish
RUN dotnet publish "AspMSSQL.csproj" -c Release -o /app/publish /p:UseAppHost=false

# Runtime stage
FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS runtime
WORKDIR /app

# Install curl for health checks (optional)
RUN apt-get update && apt-get install -y curl && rm -rf /var/lib/apt/lists/*

# Copy published application from publish stage
COPY --from=publish /app/publish .

# Expose port 8080 (HTTP)
EXPOSE 8080

# Set environment variables
ENV ASPNETCORE_URLS=http://+:8080
ENV ASPNETCORE_ENVIRONMENT=Production

# Run the application
ENTRYPOINT ["dotnet", "AspMSSQL.dll"]

docker-compose.yml

We will next create a docker-compose file that orchestrates the entire system involving two containers: an MSSQL database server and our web app. In the root folder of your application, create a text file named docker-compose.yml and add to it the following content:

services:
  # SQL Server Service
  mssql:
    image: mcr.microsoft.com/mssql/server:2022-latest
    container_name: aspmsql-mssql
    environment:
      ACCEPT_EULA: 'Y'
      MSSQL_SA_PASSWORD: 'SqlPassword!123'
      MSSQL_PID: 'Developer'
    ports:
      - "1433:1433"
    volumes:
      - ./mssql-data:/var/opt/mssql/data

  # ASP.NET Application Service
  aspmsql-app:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: aspmsql-app
    depends_on:
      - mssql
    environment:
      ASPNETCORE_ENVIRONMENT: Development
      ASPNETCORE_URLS: http://+:8080
      DBHOST: mssql
      DBPORT: 1433
      DBUSER: sa
      DBPASSWORD: SqlPassword!123
      DBNAME: AspMSSQLDb
    ports:
      - "8080:8080"
    restart: unless-stopped


Running the yml file

To find out if this all works, go to a terminal window and run the following command:

docker-compose up -d --build

Point your browser to http://localhost:8080/ and you should see the main web page. Register a user, confirm the email, and login. It should all work as expected.

Cleanup

Run the following command to shut down docker-compose and clean up:

docker-compose down

Conclusion

We have seen how straightforward it is to containerize an application and its database with docker-compose.

Thursday, January 22, 2026

Using Microsoft Agent Framework with AI models hosted on GitHub

Overview

In this article I will show you how you can use the Microsoft Agent Framework to experiment with AI models hosted on GitHub. GitHub AI Models are intended for learning, experimentation and proof-of-concept activities. The GitHub Models feature is subject to various limits (including requests per minute, requests per day, tokens per request, and concurrent requests) and is not designed for production use cases.

Getting Started

There are many AI models from a variety of vendors that you can choose from. The starting point is to visit https://github.com/marketplace?type=models to work with free GitHub AI Models. At the time of writing, these are a subset of the models available:


For this article, I will use the "OpenAI GPT-4o" model highlighted above. 

Selecting the "OpenAI GPT-4o" model leads you to the page below. Click on the "<> Use this model" button.

On the dialog that pops up, select C#, then click on "Run a basic code sample".

On the next dialog, you will see the signature of the model. In our case it is "openai/gpt-4o".


Click on "1. Configure authentication".

Next, click on the "Create Personal Access Token" button. 

You may need to go through a verification process.

Make your selections.

On the next pop-up, click on "Generate token".

Copy the newly generated token and keep it in a safe place, because you cannot view this token again once you leave this page.

Let's use Microsoft Agent Framework (MAF)

In a working directory, create a C# console app named GitHubAiModelMAF inside a terminal window with the following command:

dotnet new console -n GitHubAiModelMAF

Change into the newly created directory GitHubAiModelMAF with:

cd GitHubAiModelMAF

Next, let's add the required packages to our console application with:

dotnet add package Azure.AI.OpenAI
dotnet add package Azure.Identity
dotnet add package Microsoft.Agents.AI.OpenAI -v 1.0.0-preview.251114.1
dotnet add package Microsoft.Extensions.Configuration.Json

Create a file named appsettings.json. Add this to appsettings.json:

{
    "GitHub": {
        "Token": "PUT-PERSONAL-ACCESS-TOKEN-HERE",
        "ApiEndpoint": "https://models.github.ai/inference",
        "Model": "openai/gpt-4o"
    }
}

Replace "PUT-PERSONAL-ACCESS-TOKEN-HERE" with the personal access token that you got from GitHub.

Next, open Program.cs in an editor and delete all contents of the file. Add this code to Program.cs:

using System.Text;
using Azure;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Configuration;
using OpenAI;
using OpenAI.Chat;

var config = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .Build();

string? apiKey = config["GitHub:Token"];
string? model = config["GitHub:Model"] ?? "openai/gpt-4o-mini";
string? endpoint = config["GitHub:ApiEndpoint"] ?? "https://models.github.ai/inference";

IChatClient chatClient = new ChatClient(
    model,
    new AzureKeyCredential(apiKey!),
    new OpenAIClientOptions
    {
        Endpoint = new Uri(endpoint)
    }
)
.AsIChatClient();

var instructions = @"You are an AI assistant that helps people find information. 
    The response must be brief and should not exceed one paragraph.
    If you do not know the answer then simply say 'I do not know the answer'.";

var agent = chatClient
    .CreateAIAgent(
        instructions: instructions,
        name: "Assistant"
    );

// User question & answer loop
while (true)
{
    Console.Write("Q: ");
    var result = await agent.RunAsync(Console.ReadLine()!);
    Console.WriteLine(result);
}

Run the application:

I asked the question "What is the longest river in the world?" and the AI answered it.

Conclusion

GitHub AI models are easy to access with the new Microsoft Agent Framework (MAF). I hope you come up with great AI-driven applications that make a difference in our world.


Sunday, October 19, 2025

Small Language Models with AI Toolkit Extension in VS Code

In this article, we will see how we can work with small language models (SLMs) using the AI Toolkit extension in VS Code. Though the toolkit can do other things, our focus is on consuming an ONNX SLM hosted in Visual Studio Code from a C# application. We will first look at an example that is based on the OpenAI packages. We will then use a similar example based on the Semantic Kernel approach.

Companion Video: https://youtu.be/V_eWAM2fxJg

Prerequisites

You will need:

  • The latest version of VS Code
  • .NET version 9.0 or higher

What are small language models (SLMs)?

Small Language Models (SLMs) are compact versions of large language models (LLMs), designed to deliver strong performance in natural language tasks while using significantly fewer computational resources.

What is the AI Toolkit Extension in VS Code?

The AI Toolkit Extension for Visual Studio Code is a powerful, all-in-one environment for building, testing, and deploying generative AI applications—especially useful for developers working with small language models (SLMs).

Getting Started

Install the following Visual Studio Code extension:


Click on the three dots (...) in the left navigation of VS Code, and choose "AI Toolkit".

Click on "Model Catalog".

Scroll down the list until you find “Local Models” >> ONNX >> Mistral 7B – (CPU – Small, Standard) >> + Add Model.

Once the model is fully downloaded, it will appear under Models >> ONNX.

Right-click on the model and select “Copy Model Name”.

I copied the following name for the "Mistral 7B" model: 

mistral-7b-v02-int4-cpu

Using OpenAI packages

Create a C# console application named AIToolkitOpenAI and add to it required packages with the following terminal window commands:

dotnet new console -n AIToolkitOpenAI
cd AIToolkitOpenAI
dotnet add package OpenAI

Start VS Code with:

code .

Click on the "AI Toolkit" tab in VS Code and make sure that the "Mistral 7B" model is running.

Replace content of Program.cs with this code:

using OpenAI;
using OpenAI.Chat;
using System.ClientModel;
using System.Text;

var model = "mistral-7b-v02-int4-cpu";
var baseUrl = "http://localhost:5272/v1/"; // root URL for local OpenAI-like server
var apikey = "unused";

OpenAIClientOptions options = new OpenAIClientOptions();
options.Endpoint = new Uri(baseUrl);
ApiKeyCredential credential = new ApiKeyCredential(apikey);
ChatClient client = new OpenAIClient(credential, options).GetChatClient(model);

// Build the prompt
StringBuilder prompt = new StringBuilder();
prompt.AppendLine("You will analyze the sentiment of the following product reviews.");
prompt.AppendLine("Each line is its own review. Output the sentiment of each review in");
prompt.AppendLine("a bulleted list and then provide a general sentiment of all reviews.");
prompt.AppendLine();
prompt.AppendLine("I bought this product and it's amazing. I love it!");
prompt.AppendLine("This product is terrible. I hate it.");
prompt.AppendLine("I'm not sure about this product. It's okay.");
prompt.AppendLine("I found this product based on the other reviews. It worked");

// send the prompt to the model and wait for the text completion
var response = await client.CompleteChatAsync(prompt.ToString());
// display the response
Console.WriteLine(response.Value.Content[0].Text);

Run the application with:

dotnet run

The application does sentiment analysis on what customers think of the product.

This is a sample of the output:

* I bought this product and it's amazing. I love it!: Positive sentiment
* This product is terrible. I hate it.: Negative sentiment
* I'm not sure about this product. It's okay.: Neutral sentiment
* I found this product based on the other reviews. It worked for me.: Positive sentiment

General sentiment: The reviews contain both positive and negative sentiments. Some customers expressed their love for the product, while others expressed their dislike. Neutral sentiment was also expressed by one customer. Overall, the reviews suggest that the product has the potential to elicit strong feelings from customers, both positive and negative.

Semantic Kernel packages

Create a C# console application named AIToolkitSK and add to it required packages with the following terminal window commands:

dotnet new console -n AIToolkitSK
cd AIToolkitSK
dotnet add package Microsoft.SemanticKernel

Start VS Code with:

code .

Click on the "AI Toolkit" tab in VS Code and make sure that the "Mistral 7B" model is running.

Replace content of Program.cs with this code:

using System.Text;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var model = "mistral-7b-v02-int4-cpu";
var baseUrl = "http://localhost:5272/v1/";
var apikey = "unused";

// Create a chat completion service
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(modelId: model, apiKey: apikey, endpoint: new Uri(baseUrl))
    .Build();
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddSystemMessage("You are a useful chatbot. Always reply in a funny way with short answers.");
var settings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 500,
    Temperature = 1,
};

while (true)
{
    Console.Write("\nUser: ");
    var userInput = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(userInput)) break;

    history.AddUserMessage(userInput);

    var responseBuilder = new StringBuilder();
    Console.Write("\nAI: ");
    await foreach (var message in chat.GetStreamingChatMessageContentsAsync(history, settings, kernel))
    {
        responseBuilder.Append(message);
        Console.Write(message);
    }

    // Add the assistant's reply to the history so context is kept across turns
    history.AddAssistantMessage(responseBuilder.ToString());
}

This is a simple chat completion app.

Run the application with:

dotnet run

My prompt was:

Red or white wine with beef steak?

The response was:

AI:  Both red and white wines can pair well with beef steak, but a red wine is generally the more traditional choice. Red wines, such as Cabernet Sauvignon, Merlot, or Pinot Noir, have flavors that complement the rich and savory flavors of beef. However, if you prefer a lighter taste, a white wine such as Pinot Noir or Chardonnay can also work well with beef steak. Ultimately, it comes down to personal preference.

Conclusion

We have seen how to use SLMs hosted by VS Code through the AI Toolkit extension. We were able to communicate with the model from two C# applications: (1) an app that uses the OpenAI packages, and (2) an app that uses Semantic Kernel.

Monday, October 6, 2025

Explore Docker MCP Toolkit and VS Code

To explore the Docker MCP Toolkit, we will use two MCP servers from the toolkit, namely PostgreSQL and Playwright.

Companion Video: https://youtu.be/43oJi_gAucU

What is Docker MCP Toolkit?

The Docker MCP Toolkit enables hosting and managing MCP servers. These servers expose APIs for specific development tasks, such as retrieving GitHub issue data or querying databases using natural language.

Prerequisites

You will need the following before you can continue:

  • Docker Desktop (latest version)
  • Visual Studio Code (latest version)
  • GitHub Copilot extension for VS Code
  • GitHub Copilot with Chat and Agent Mode enabled

1) Explore PostgreSQL MCP Server

We will use natural language to query a PostgreSQL database that is already pre-loaded with the sample Northwind database. To run the PostgreSQL server in a docker container on your computer, execute the following command from any terminal window:


docker run --name psqlnw -e POSTGRES_PASSWORD=VerySecret -p 5433:5432 -d melmasry/nw-psql:1.0.0

Start “Docker Desktop” on your computer and go to the Containers tab on the left navigation. You will see that the psqlnw container is running.


Next, let us use the Docker MCP Toolkit. In Docker Desktop, click on “MCP Toolkit” in the left navigation.

Click on the Catalog tab. This will show you a list of MCP Servers that are ready for you to explore. 

We will start with PostgreSQL. Find the PostgreSQL MCP Server by entering ‘postgr’ in the filter field. Then click on + to add it to your list.

You will be asked to enter a secret. This is simply the connection string, which follows this format:

postgresql://readonly_user:readonly_password@host:port/database_name

In our case, this would be:

postgresql://postgres:VerySecret@host.docker.internal:5433/northwind

Click on the "My Servers" tab to see the MCP servers that you have chosen.

Connect Docker MCP Toolkit to Visual Studio Code

Let us query our northwind database in PostgreSQL from VS Code. Go to a working directory and execute these commands to create an empty folder on your computer:

mkdir mcp-server
cd mcp-server

In the same terminal window, log out of Docker and log back in with:

docker logout  
docker login

Start VS Code in the folder with:

code .

In VS Code, open the Command Palette by pressing Ctrl + Shift + P (or Cmd + Shift + P on macOS).

Select “Add MCP Server”.

Select “Command (stdio) Run a local command that implements the MCP protocol Manual Install”.

Enter the gateway command:

docker mcp gateway run

Give the server an ID named: 

my-mcp-server

Choose “Workspace Available in this workspace, runs locally”.


Click Trust.

A file named mcp.json is created in a .vscode folder.
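The generated file typically contains something like the sketch below (the exact schema may vary with your VS Code version; the server ID matches the one entered earlier):

```json
{
  "servers": {
    "my-mcp-server": {
      "type": "stdio",
      "command": "docker",
      "args": ["mcp", "gateway", "run"]
    }
  }
}
```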

Note that the server is running. 

Inside the GitHub Copilot Chat window, choose Agent and any Claude model. Click on the tools icon to see active MCP servers.

You will find MCP servers that are configured in VS Code. Among them will be the one we enabled in the Docker MCP Toolkit.

You will notice that it only has one tool: “query Run a read-only SQL query”. This is all that is needed to query the northwind database.

Click the OK button to close the tools popup.

Enter this prompt in the GitHub Copilot window:

What are the tables in the PostgreSQL northwind database?

You will be asked to click on the Allow button.

Thereafter, it displays a list of database tables in the northwind database.

Try this other prompt:

What are the products supplied by "Exotic Liquids"?

You will get a similar response to this:

2) Explore Playwright MCP Server

Back in Docker Desktop >> MCP Toolkit, add Playwright.

You now have two MCP servers in your list: Playwright and PostgreSQL.

Back in VS Code, restart the MCP Server in the .vscode/mcp.json file.

In the VS Code GitHub Copilot Chat window, enter this prompt:

Using the tools provided from the Docker MCP Server, navigate to https://www.bbc.com/, find the two most important business stories.

I got the following response at the time of writing this article:

Conclusion

I hope you found this article useful. These are early days of MCP servers. I am sure things will evolve much more in this very important space.