Monday, September 30, 2024

Using Semantic Kernel with SLM AI models downloaded to your computer

In this walkthrough, I will demonstrate how you can download the Phi-3 AI SLM (small language model) from Hugging Face and use it in a C# application. 

What is Hugging Face?

Hugging Face (https://huggingface.co/) provides AI/ML researchers & developers with access to thousands of curated datasets, machine learning models, and AI-powered demo apps. We will download the Phi-3 SLM model in ONNX format onto our computers from https://huggingface.co/models.

What is ONNX?

ONNX is an open format built to represent machine learning models. Visit https://onnx.ai/ for more information.

Getting Started

We will download the Phi-3 Mini SLM for the ONNX runtime from Hugging Face. Run the following command in a terminal window, replacing the destination with a location of your choice. In the example below, the destination is a folder named phi-3-mini on the Windows C: drive.

git clone https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx C:/phi-3-mini
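
Note: Hugging Face stores the large model weights with Git LFS. If the .onnx.data file comes down as a tiny pointer file instead of several gigabytes, you likely need Git LFS installed (an assumption about your setup; many Git installations already include it). Install it and clone again:

git lfs install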

Be patient as the download could take some time. On my Windows computer the size of the download is 30.1 GB comprising 97 files and 48 folders.

We will be using the files in the cpu_and_mobile folder. Inside that folder, navigate into the cpu-int4-rtn-block-32 folder where you will find this pair of files that contain the AI ONNX model:

phi3-mini-4k-instruct-cpu-int4-rtn-block-32.onnx

phi3-mini-4k-instruct-cpu-int4-rtn-block-32.onnx.data

Creating the console application

In a working directory, create a C# console app named LocalAiModelSK inside a terminal window with the following command:

dotnet new console -n LocalAiModelSK 

Change into the newly created directory LocalAiModelSK with:

cd LocalAiModelSK

Next, let's add two packages to our console application with:

dotnet add package Microsoft.SemanticKernel -v 1.16.2

dotnet add package Microsoft.SemanticKernel.Connectors.Onnx -v 1.16.2-alpha

Open the project in VS Code and add this directive to the .csproj file, right below the <Nullable>enable</Nullable> line:

<NoWarn>SKEXP0070</NoWarn>
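
For reference, the relevant PropertyGroup in the .csproj might then look something like this (the net8.0 target framework is an assumption; keep whatever your project file already contains):

<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>net8.0</TargetFramework>
  <ImplicitUsings>enable</ImplicitUsings>
  <Nullable>enable</Nullable>
  <NoWarn>SKEXP0070</NoWarn>
</PropertyGroup>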

Replace the contents of Program.cs with the following C# code:

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI; 
 
// PHI-3 local model location 
var modelPath = @"C:\phi-3-mini\cpu_and_mobile\cpu-int4-rtn-block-32"; 
 
// Load the model and services
var builder = Kernel.CreateBuilder();
builder.AddOnnxRuntimeGenAIChatCompletion("phi-3", modelPath); 
 
// Build Kernel
var kernel = builder.Build(); 
 
// Create the chat completion service
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>(); 
 
// Start the conversation
while (true) {
    // Get user input
    Console.ForegroundColor = ConsoleColor.Yellow;
    Console.Write("User : ");
    var question = Console.ReadLine()!; 
    
    OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new() {
        MaxTokens = 200
    }; 
 
    var response = kernel.InvokePromptStreamingAsync(
        promptTemplate: @"{{$input}}",
        arguments: new KernelArguments(openAIPromptExecutionSettings){
            { "input", question }
        });
    Console.ForegroundColor = ConsoleColor.Green;
    Console.Write("\nAssistant : ");
    string combinedResponse = string.Empty;
    await foreach (var message in response) {
        // Write the response to the console
        Console.Write(message);
        combinedResponse += message;
    }
    Console.WriteLine();
}

In the above code, make sure that modelPath points to the proper location of the model on your computer.
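
Run the application from the LocalAiModelSK folder with:

dotnet run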

I asked the question: How long do mosquitoes live?

This is the response I received:


Conclusion

You can choose from a variety of SLMs on Hugging Face. The trade-off, of course, is that the ONNX model downloads are significant in size, which in some circumstances makes it more desirable to use a model that resides online.


Saturday, September 28, 2024

Using Semantic Kernel with AI models hosted on GitHub

Overview

In this article I will show you how you can experiment with AI models hosted on GitHub. GitHub AI Models are intended for learning, experimentation and proof-of-concept activities. The feature is subject to various limits (including requests per minute, requests per day, tokens per request, and concurrent requests) and is not designed for production use cases.

Getting Started

There are many AI models from a variety of vendors that you can choose from. The starting point is to visit https://github.com/marketplace/models. At the time of writing, these are a subset of the models available:


For this article, I will use the "Phi-3.5-mini instruct (128k)" model highlighted above. If you click on that model you will be taken to the model's landing page:


Click on the green "Get started" button.


The first thing we need to do is get a 'personal access token' by clicking on the indicated button above.


Choose 'Generate new token', which happens to be in beta at the time of writing.


Give your token a name, set the expiration, and optionally describe the purpose of the token. Thereafter, click on the green 'Generate token' button at the bottom of the page.


Copy the newly generated token and keep it in a safe place because you cannot view it again once you leave the above page. 

You should be able to return to the "Get Started" page by choosing the "Marketplace" tab in your browser.


On the "Get Started" page, note that you have examples using different programming languages.



If you choose C#, you can easily copy the basic sample code to your computer and try it out. We will, however, use alternative code with Semantic Kernel.

Scroll down the page to the "3. Run a basic code sample" section and note the values of endpoint & model variables. These values will later be used in our application.

Let's use Semantic Kernel

In a working directory, create a C# console app named GitHubAiModelSK inside a terminal window with the following command:

dotnet new console -n GitHubAiModelSK

Change into the newly created directory GitHubAiModelSK with:

cd GitHubAiModelSK

Next, let's add two packages to our console application with:

dotnet add package Microsoft.SemanticKernel -v 1.19.0

dotnet add package Microsoft.Extensions.Configuration.Json

Open the project in VS Code and add this directive to the .csproj file, right below the <Nullable>enable</Nullable> line:

<NoWarn>SKEXP0010</NoWarn>

Create a file named appsettings.json and add the following to it:

{
    "AI": {
      "Endpoint": "https://models.inference.ai.azure.com",
      "Model": "Phi-3.5-mini-instruct",
      "PAT": "fake-token"
    }
}

Replace "fake-token" with the personal access token that you got from GitHub.

Next, open Program.cs in an editor and delete all contents of the file. Add this code to Program.cs:

using Microsoft.SemanticKernel;
using System.Text;
using Microsoft.SemanticKernel.ChatCompletion;
using OpenAI;
using System.ClientModel;
using Microsoft.Extensions.Configuration;

var config = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .Build();

var modelId = config["AI:Model"]!;
var uri = config["AI:Endpoint"]!;
var githubPAT = config["AI:PAT"]!;

var client = new OpenAIClient(new ApiKeyCredential(githubPAT), new OpenAIClientOptions { Endpoint = new Uri(uri) });

// Initialize the Semantic kernel
var builder = Kernel.CreateBuilder();

builder.AddOpenAIChatCompletion(modelId, client);
var kernel = builder.Build();

// get a chat completion service
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

// Create a new chat by specifying the assistant
ChatHistory chat = new(@"
    You are an AI assistant that helps people find information. 
    The response must be brief and should not exceed one paragraph.
    If you do not know the answer then simply say 'I do not know the answer'."
);

// Instantiate a StringBuilder
StringBuilder strBuilder = new();

// User question & answer loop
while (true)
{
    // Get the user's question
    Console.Write("Q: ");
    chat.AddUserMessage(Console.ReadLine()!);

    // Clear contents of the StringBuilder
    strBuilder.Clear();

    // Get the AI response streamed back to the console
    await foreach (var message in chatCompletionService.GetStreamingChatMessageContentsAsync(chat, kernel: kernel))
    {
        Console.Write(message);
        strBuilder.Append(message.Content);
    }
    Console.WriteLine();
    chat.AddAssistantMessage(strBuilder.ToString());

    Console.WriteLine();
}

Run the application:
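
From the GitHubAiModelSK folder:

dotnet run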


I asked the question "How many pyramids are there in Egypt?" and the AI answered as shown above. 

Using a different model

How about we use a different AI model? For example, I will try the 'Meta-Llama-3.1-405B-Instruct' model. We need to get its model ID. Click on the model on the https://github.com/marketplace/models page.

Then, click on the "Get started" button. In the "3. Run a basic code sample" section, grab the modelName (meta-llama-3.1-405b-instruct) and paste it in our appsettings.json file instead of Phi-3.5-mini-instruct.


Run the application again. This is what I experienced with the AI model meta-llama-3.1-405b-instruct:


Conclusion

GitHub AI models are easy to access. I hope you come up with great AI-driven applications that make a difference to our world.


Thursday, September 19, 2024

Phi-3 Small Language Model (SLM) in a C# console application with Ollama and Semantic Kernel

What is a small language model (SLM)?

A small language model (SLM) is a machine learning model typically based on a large language model (LLM) but of greatly reduced size. An SLM retains much of the functionality of the LLM from which it is built but with far less complexity and computing resource demand.

What is Ollama?

Ollama is an application you can download onto your computer or server to run open source generative AI large-language-models (LLMs) such as Meta's Llama 3 and Microsoft's Phi-3. You can see the many models available at https://www.ollama.com/library.

What is Semantic Kernel?

Semantic Kernel is a lightweight, open-source development kit that lets you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase.

Overview

In this tutorial, we will see how easy it is to use the Phi-3 small language model in a C# application. The best part is that it is free and runs entirely on your local device. Ollama will be used to serve the Phi-3 small language model and Semantic Kernel will be the C# development kit we will use for code development. 

Getting Started

Download Ollama installer from https://www.ollama.com/download.

Once you have installed Ollama, run these commands from a terminal window:

ollama pull phi3:latest
ollama list
ollama show phi3:latest
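
Ollama serves its API on http://localhost:11434 by default, which is the same endpoint we will point Semantic Kernel at below. As an optional sanity check that the server is running, you can hit that endpoint from a terminal:

curl http://localhost:11434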

In a working directory, create a C# console app named SlmSK inside a terminal window with the following command:

dotnet new console -n SlmSK

Change into the newly created directory SlmSK with:

cd SlmSK

Next, let's add the Semantic Kernel package to our console application with:

dotnet add package Microsoft.SemanticKernel -v 1.19.0

Open the project in VS Code and add this directive to the .csproj file, right below the <Nullable>enable</Nullable> line:

<NoWarn>SKEXP0010</NoWarn> 

The Code

Replace the contents of Program.cs with the following code:

using Microsoft.SemanticKernel;
using System.Text;
using Microsoft.SemanticKernel.ChatCompletion;

// Initialize the Semantic kernel
var builder = Kernel.CreateBuilder();

// We will use Semantic Kernel OpenAI API
builder.Services.AddOpenAIChatCompletion(
        modelId: "phi3",
        apiKey: null,
        endpoint: new Uri("http://localhost:11434"));

var kernel = builder.Build();

// get a chat completion service
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>(); 

// Create a new chat by specifying the assistant
ChatHistory chat = new(@"
    You are an AI assistant that helps people find information. 
    The response must be brief and should not exceed one paragraph.
    If you do not know the answer then simply say 'I do not know the answer'."
);
 
// Instantiate a StringBuilder
StringBuilder strBuilder = new();

// User question & answer loop
while (true) {
    // Get the user's question
    Console.Write("Q: ");
    chat.AddUserMessage(Console.ReadLine()!);

    // Clear contents of the StringBuilder
    strBuilder.Clear();

    // Get the AI response streamed back to the console
    await foreach (var message in chatCompletionService.GetStreamingChatMessageContentsAsync(chat, kernel: kernel))
    {
        Console.Write(message);
        strBuilder.Append(message.Content);
    }
    Console.WriteLine();
    chat.AddAssistantMessage(strBuilder.ToString());

    Console.WriteLine();
}

Running the app

Run the application with:

dotnet run

I entered this question: How long does a direct flight take from Los Angeles to Frankfurt?

Then, I got the following response:

Use another SLM

How about we use another SLM? Let's try llama3.1 (https://www.ollama.com/library/llama3.1). Pull the model with:

ollama pull llama3.1:latest

In the code, replace modelId: "phi3" with modelId: "llama3.1", as shown below.
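
For reference, only the modelId changes; the endpoint still points at the local Ollama server:

builder.Services.AddOpenAIChatCompletion(
        modelId: "llama3.1",   // was "phi3"
        apiKey: null,
        endpoint: new Uri("http://localhost:11434"));

Run the application again with the same question. This is what I got: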


The response is quite similar to the one received from Phi-3, although it suggests that the flight to Frankfurt takes longer.

Conclusion

We can package our applications with a local SLM. This makes our applications cheaper, faster, connection-free, and self-contained.


Thursday, April 25, 2024

.NET Aspire with VS Code, SQLite & SQL Server

In this tutorial, we will explore .NET Aspire. First, we will use it with SQLite. Thereafter, we will modify the solution so that it uses a SQL Server Docker image instead of SQLite. All this is done in a terminal window with the help of VS Code. The objective is to serve those who do not use the higher-end Visual Studio 2022 IDE.

Start source code: https://github.com/medhatelmasry/SoccerFIFA
End source code: https://github.com/medhatelmasry/SoccerAspire
Companion video: https://youtu.be/FDF04Lner5k

Prerequisites

In order to continue with this tutorial, you will need the following:

  • .NET 8.0
  • dotnet-ef tool - If you do not have this tool, you can install it with the terminal window command "dotnet tool install --global dotnet-ef"
  • Visual Studio Code
  • Docker Desktop
  • Azure Data Studio

.NET Aspire Setup

In any terminal window folder, run the following command before you install .NET Aspire:

dotnet workload update 

To install the .NET Aspire workload from the .NET CLI, execute this command:

dotnet workload install aspire

Check your version of .NET Aspire with this command:

dotnet workload list

Startup Application

We will start with a .NET 8.0 application that involves a Minimal API backend and a Blazor frontend. Clone the code from this GitHub site with:

git clone https://github.com/medhatelmasry/SoccerFIFA

To get a good sense of what the application does, follow these steps:

1) Inside the WebApiFIFA folder, run the following command in a terminal window:

dotnet watch


Try out the GET /api/games endpoint; you should see the following output:


2) Next, let us try the frontend. Inside a terminal window in the BlazorFIFA folder, run this command:

dotnet watch


We know that the application works. However, it is a pain to have to start both projects to get the solution to work. This is where .NET Aspire will come to the rescue.

Converting solution to .NET Aspire

Close both terminal windows by pressing CTRL+C in each.

To add the basic .NET Aspire projects to our solution, run the following command inside the root SoccerFIFA folder:

dotnet new aspire

This adds these artifacts:

  • SoccerFIFA.sln file
  • SoccerFIFA.AppHost folder
  • SoccerFIFA.ServiceDefaults folder

We will add our previous API & Blazor projects to the newly created .sln file by executing the following commands inside the root SoccerFIFA folder:

dotnet sln add ./BlazorFIFA/BlazorFIFA.csproj
dotnet sln add ./WebApiFIFA/WebApiFIFA.csproj
dotnet sln add ./LibraryFIFA/LibraryFIFA.csproj

Next, we need to add references in the SoccerFIFA.AppHost project to the BlazorFIFA and WebApiFIFA projects with these commands:

dotnet add ./SoccerFIFA.AppHost/SoccerFIFA.AppHost.csproj reference ./BlazorFIFA/BlazorFIFA.csproj
dotnet add ./SoccerFIFA.AppHost/SoccerFIFA.AppHost.csproj reference ./WebApiFIFA/WebApiFIFA.csproj

Also, both BlazorFIFA and WebApiFIFA projects need to have references into SoccerFIFA.ServiceDefaults with:

dotnet add ./BlazorFIFA/BlazorFIFA.csproj reference ./SoccerFIFA.ServiceDefaults/SoccerFIFA.ServiceDefaults.csproj

dotnet add ./WebApiFIFA/WebApiFIFA.csproj reference ./SoccerFIFA.ServiceDefaults/SoccerFIFA.ServiceDefaults.csproj

Inside the SoccerFIFA root folder, start VS Code with:

code .

Then, in the Program.cs files of both BlazorFIFA and WebApiFIFA, add this code before "var app = builder.Build();":

// Add service defaults & Aspire components.
builder.AddServiceDefaults();

In the Program.cs file in SoccerFIFA.AppHost, add this code right before “builder.Build().Run();”:

var api = builder.AddProject<Projects.WebApiFIFA>("backend");
builder.AddProject<Projects.BlazorFIFA>("frontend")
    .WithReference(api);

Now, the relative name for the API app is “backend”. Therefore, we can change the base address to http://backend. Change Constants.cs file in BlazorFIFA to:

public const string ApiBaseUrl = "http://backend/";
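
For clarity, here is a minimal sketch of what Constants.cs could look like after the change; the namespace and class shape are assumptions, so match whatever the repository already uses:

namespace BlazorFIFA;

public static class Constants
{
    // "backend" is the logical name given to the API project in AppHost;
    // Aspire service discovery resolves it to the real WebApiFIFA address.
    public const string ApiBaseUrl = "http://backend/";
}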

Test .NET Aspire Solution

To test the solution, in the SoccerFIFA.AppHost folder, start the application with:

dotnet watch

NOTE: If you are asked to enter a token, copy and paste the value shown in your terminal window:



This is what you should see in your browser:


Click on the app represented by the frontend link on the second row. You should experience the Blazor app:


At this stage we get a sense of what .NET Aspire can do for us. It essentially orchestrates the connection between multiple projects and produces a single starting point in the Host project. Let us take this journey one step further by converting our backend API so it uses a SQL Server container instead of SQLite.

Using SQL Server instead of SQLite

Stop the running application by pressing CTRL+C.

IMPORTANT: You will need to ensure that Docker Desktop is running on your computer because Aspire will start SQL Server in a container. Also, update your Docker Desktop to the latest version.

Add this package to the WebApiFIFA project:

dotnet add package Aspire.Microsoft.EntityFrameworkCore.SqlServer -v 8.0.2

Also, in the WebApiFIFA project's Program.cs file, comment out (or delete) this code:

var connectionString = builder.Configuration.GetConnectionString("DefaultConnection");
builder.Services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlite(connectionString));

Place the code below just before builder.AddServiceDefaults():

builder.AddSqlServerDbContext<ApplicationDbContext>("sqldata");

In the same Program.cs file, add the following code right before app.Run() so that the database schema is created automatically during startup (EnsureCreated builds the schema directly from the model, rather than applying migrations):

if (app.Environment.IsDevelopment()) {
    using (var scope = app.Services.CreateScope()) {
        var context = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
        context.Database.EnsureCreated();
    }
} else {
    app.UseExceptionHandler("/Error", createScopeForErrors: true);
    // The default HSTS value is 30 days.
    // You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
    app.UseHsts();
}

Clean up the backend API project (WebApiFIFA) by removing all traces of SQLite as follows:

  1. Delete SQLite files college.db, college.db-shm, and college.db-wal.
  2. In WebApiFIFA.csproj, delete: <PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="8.0.4" />
  3. Delete the Data/Migrations folder.
  4. Delete the ConnectionStrings section in appsettings.json.

We will create new migrations that work with SQL Server instead of SQLite. Therefore, run the following command from within a terminal window inside the WebApiFIFA folder:

dotnet ef migrations add M1 -o Data/Migrations

Configure AppHost to use SQL Server

The SoccerFIFA.AppHost project is the orchestrator for your app. It's responsible for connecting and configuring the different projects and services of your app. Add the .NET Aspire SQL Server hosting package to your SoccerFIFA.AppHost project with:

dotnet add package Aspire.Hosting.SqlServer -v 8.0.2

Replace the contents of the Program.cs file in AppHost project with the following code:

var builder = DistributedApplication.CreateBuilder(args);

var sql = builder.AddSqlServer("sql")
                 .AddDatabase("sqldata"); 

var api = builder.AddProject<Projects.WebApiFIFA>("backend")
    .WithReference(sql);
    
builder.AddProject<Projects.BlazorFIFA>("frontend")
    .WithReference(api);
 
builder.Build().Run();

The preceding code adds a SQL Server Container resource to your app and configures a connection to a database called sqldata. The Entity Framework classes you configured earlier will automatically use this connection when migrating and connecting to the database.

Run the solution

After ensuring that your Docker Desktop is running, execute this terminal window command inside the SoccerFIFA.AppHost folder:

dotnet watch

Your browser will look like this:


Click on the highlighted link above. Our application works just as it did before. The only difference is that this time we are using SQL Server running in a container:


I hope you found this useful and are able to see the possibilities of this new addition to .NET.

Thursday, February 29, 2024

OpenAI Function Calling with Semantic Kernel, C#, & Entity Framework

In this article, we will create a Semantic Kernel plugin that contains four functions that interact with live SQLite data. Entity Framework will be used to access the SQLite database. The end result is that we can use the power of the OpenAI natural language models to ask questions and get answers about our custom data.

Source code: https://github.com/medhatelmasry/EfFuncCallSK

Companion Video: https://youtu.be/4sKRwflEyHk

Getting Started

Let’s start by creating an ASP.NET Razor Pages web application. Select a suitable working folder on your computer, then enter the following terminal window commands:

dotnet new razor --auth individual -o EfFuncCallSK
cd EfFuncCallSK

The above command creates a Razor Pages app with support for Entity Framework and SQLite.

Add these packages:

dotnet add package CsvHelper
dotnet add package Microsoft.SemanticKernel 
dotnet add package Microsoft.EntityFrameworkCore.Design
dotnet add package Microsoft.EntityFrameworkCore.Tools
dotnet add package Microsoft.EntityFrameworkCore
dotnet add package Microsoft.EntityFrameworkCore.SQLite.Design

The CsvHelper package will help us load a list of students from a CSV file named students.csv and hydrate a list of Student objects. The second package is needed to work with Semantic Kernel. The rest of the packages support Entity Framework and SQLite.

Let’s Code

appsettings.json

Add these to appsettings.json:

"AIService": "OpenAI", /* Azure or OpenAI */
"AzureOpenAiSettings": {
   "Endpoint": "https://YOUR_RESOURCE_NAME.openai.azure.com/",
   "Model": "gpt-35-turbo",
   "ApiKey": "fake-key-fake-key-fake-key-fake-key"
},
"OpenAiSettings": {
  "ModelType": "gpt-3.5-turbo",
  "ApiKey": "fake-key-fake-key-fake-key-fake-key"
}

The first setting allows you to choose between using OpenAI or Azure OpenAI.

Of course, you need to adjust the Endpoint setting with the appropriate value for your Azure OpenAI resource. Also, enter the correct ApiKey value for the service you are using.

NOTE: You can use OpenAI or Azure OpenAI, or both.

Data

Create a folder named Models. Inside the Models folder, add the following Student class: 

public class Student {
   public int StudentId { get; set; }
   public string? FirstName { get; set; }
   public string? LastName { get; set; }
   public string? School { get; set; }
 
   public override string ToString() {
      return $"Student ID: {StudentId}, First Name: {FirstName}, Last Name: {LastName}, School: {School}";
   }
}

Developers like having sample data when building data-driven applications. Therefore, we will create sample data to ensure that our application behaves as expected. Copy the following data and save it in a text file wwwroot/students.csv:

StudentId,FirstName,LastName,School
1,Tom,Max,Nursing
2,Ann,Fay,Mining
3,Joe,Sun,Nursing
4,Sue,Fox,Computing
5,Ben,Ray,Mining
6,Zoe,Cox,Business
7,Sam,Ray,Mining
8,Dan,Ash,Medicine
9,Pat,Lee,Computing
10,Kim,Day,Nursing
11,Tim,Rex,Computing
12,Rob,Ram,Nursing
13,Jan,Fry,Mining
14,Jim,Tex,Nursing
15,Ben,Kid,Business
16,Mia,Chu,Medicine
17,Ted,Tao,Computing
18,Amy,Day,Nursing
19,Ian,Roy,Nursing
20,Liz,Kit,Nursing
21,Mat,Tan,Medicine
22,Deb,Roy,Medicine
23,Ana,Ray,Mining
24,Lyn,Poe,Computing
25,Amy,Raj,Nursing
26,Kim,Ash,Mining
27,Bec,Kid,Nursing
28,Eva,Fry,Computing
29,Eli,Lap,Business
30,Sam,Yim,Nursing
31,Joe,Hui,Mining
32,Liz,Jin,Nursing
33,Ric,Kuo,Business
34,Pam,Mak,Computing
35,Cat,Yao,Medicine
36,Lou,Zhu,Mining
37,Tom,Dag,Business
38,Stu,Day,Business
39,Tom,Gad,Mining
40,Bob,Bee,Business
41,Jim,Ots,Business
42,Tom,Mag,Business
43,Hal,Doe,Mining
44,Roy,Kim,Mining
45,Vis,Cox,Nursing
46,Kay,Aga,Nursing
47,Reo,Hui,Nursing
48,Bob,Roe,Mining
49,Jay,Eff,Computing
50,Eva,Chu,Business
51,Lex,Rae,Nursing
52,Lin,Dex,Mining
53,Tom,Dag,Business
54,Ben,Shy,Computing
55,Rob,Bos,Nursing
56,Ali,Mac,Business
57,Edi,Gee,Computing
58,Eva,Cao,Mining
59,Jun,Lam,Computing
60,Eli,Tao,Computing
61,Ana,Bay,Computing
62,Gil,Tal,Mining
63,Wes,Dey,Nursing
64,Nea,Tan,Computing
65,Ava,Day,Nursing
66,Rie,Ray,Business
67,Ken,Sim,Nursing

Add the following code inside the ApplicationDbContext class located in the Data folder. You will also need using CsvHelper; and using System.Globalization; directives at the top of the file:

public DbSet<Student> Students => Set<Student>();    
 
protected override void OnModelCreating(ModelBuilder modelBuilder) {
    base.OnModelCreating(modelBuilder);
    modelBuilder.Entity<Student>().HasData(LoadStudents());
}  
 
// Load students from a csv file named students.csv in the wwwroot folder
public static List<Student> LoadStudents() {
    var students = new List<Student>();
    using (var reader = new StreamReader(Path.Combine("wwwroot", "students.csv"))) {
        using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);
        students = csv.GetRecords<Student>().ToList();
    }
    return students;
}

Let us add a migration and subsequently update the database. Execute the following CLI commands in a terminal window.

dotnet ef migrations add Students -o Data/Migrations
dotnet ef database update

At this point the database and tables are created in a SQLite database named app.db.
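
If you want to confirm that the seed data made it into the database, you can optionally inspect app.db with the sqlite3 command-line tool, assuming you have it installed (Students is EF Core's default table name for the Students DbSet):

sqlite3 app.db "SELECT COUNT(*) FROM Students;"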

Helper Methods

We need a couple of static helper methods to assist us along the way. In the Models folder, add a class named Utils and add to it the following class definition:

public class Utils {
  public static string GetConfigValue(string config) {
    IConfigurationBuilder builder = new ConfigurationBuilder();
    if (System.IO.File.Exists("appsettings.json"))
      builder.AddJsonFile("appsettings.json", false, true);
    if (System.IO.File.Exists("appsettings.Development.json"))
      builder.AddJsonFile("appsettings.Development.json", false, true);
    IConfigurationRoot root = builder.Build();
    return root[config]!;
  }
 
  public static ApplicationDbContext GetDbContext() {
    var optionsBuilder = new DbContextOptionsBuilder<ApplicationDbContext>();
    var connStr = Utils.GetConfigValue("ConnectionStrings:DefaultConnection");
    optionsBuilder.UseSqlite(connStr);
    ApplicationDbContext db = new ApplicationDbContext(optionsBuilder.Options);
    return db;
  }
}

The GetConfigValue() method reads values from appsettings.json and can be called from any static method. The second method, GetDbContext(), returns an instance of the ApplicationDbContext class, also from any static method.
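
As a quick illustration, the helpers can be called from any static context like this (hypothetical call sites, not part of the project):

// Read a nested key from appsettings.json
string modelType = Utils.GetConfigValue("OpenAiSettings:ModelType");

// Query the SQLite database without constructor injection
using var db = Utils.GetDbContext();
int nursingCount = db.Students.Count(s => s.School == "Nursing");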

Plugins

Create a folder named Plugins and add to it a class file named StudentPlugin.cs with the code below. Note that you will need using Microsoft.SemanticKernel;, using System.ComponentModel;, and using System.Text.Json; directives at the top of the file:

public class StudentPlugin {
  [KernelFunction, Description("Get student details by first name and last name")]
  public static string? GetStudentDetails(
  [Description("student first name, e.g. Kim")]
  string firstName,
  [Description("student last name, e.g. Ash")]
  string lastName
  ) {
    var db = Utils.GetDbContext();
    var studentDetails = db.Students
      .Where(s => s.FirstName == firstName && s.LastName == lastName).FirstOrDefault();
    if (studentDetails == null)
      return null;
    return studentDetails.ToString();
  }

  [KernelFunction, Description("Get students in a school given the school name")]
  public static string? GetStudentsBySchool(
    [Description("The school name, e.g. Nursing")]
    string school
  ) {
    var studentsBySchool = Utils.GetDbContext().Students
      .Where(s => s.School == school).ToList();
    if (studentsBySchool.Count == 0)
      return null;
    return JsonSerializer.Serialize(studentsBySchool);
  }


  [KernelFunction, Description("Get the school with most or least students. Takes boolean argument with true for most and false for least.")]
  static public string? GetSchoolWithMostOrLeastStudents(
    [Description("isMost is a boolean argument with true for most and false for least. Default is true.")]
    bool isMost = true
  ) {
    var students = Utils.GetDbContext().Students.ToList();
    IGrouping<string, Student>? schoolGroup = null;
    if (isMost)
      schoolGroup = students.GroupBy(s => s.School)
          .OrderByDescending(g => g.Count()).FirstOrDefault()!;
    else
        schoolGroup = students.GroupBy(s => s.School)
            .OrderBy(g => g.Count()).FirstOrDefault()!;
    if (schoolGroup != null)
      return $"{schoolGroup.Key} has {schoolGroup.Count()} students";
    else
      return null;
  }

  [KernelFunction, Description("Get students grouped by school.")]
  static public string? GetStudentsInSchool() {
    var students = Utils.GetDbContext().Students.ToList().GroupBy(s => s.School)
      .OrderByDescending(g => g.Count());
    if (students == null)
      return null;
    else
      return JsonSerializer.Serialize(students);
  }
}

In the above code, there are four methods with these purposes:

  • GetStudentDetails() – Gets student details given first and last names.
  • GetStudentsBySchool() – Gets students in a school given the name of the school.
  • GetSchoolWithMostOrLeastStudents() – Takes a Boolean value isMost; true returns the school with the most students and false returns the school with the least students.
  • GetStudentsInSchool() – Takes no arguments and returns a count of students by school.

The User Interface

We will re-purpose the Index.cshtml and Index.cshtml.cs files so the user can enter a prompt in natural language and receive a response that comes from the OpenAI model working with our Semantic Kernel plugin. 

Index.cshtml

Replace the content of Pages/Index.cshtml with:

@page
@model IndexModel
@{
    ViewData["Title"] = Model.Service + " Function Calling with Semantic Kernel";
}
<div class="text-center">
    <h3 class="display-6">@ViewData["Title"]</h3>
    <form method="post">
        <input type="text" name="prompt" size="80" required />
        <input type="submit" value="Submit" />
    </form>
    <div style="text-align: left">
        <h5>Example prompts:</h5>
        <p>Which school does Mat Tan go to?</p>
        <p>Which school has the most students?</p>
        <p>Which school has the least students?</p>
        <p>Get the count of students in each school.</p>
        <p>How many students are there in the school of Mining?</p>
        <p>What is the ID of Jan Fry and which school does she go to?</p>
        <p>Which students belong to the school of Business? Respond only in JSON format.</p>
        <p>Which students in the school of Nursing have their first or last name start with the letter 'J'?</p>
    </div>
    @if (Model.Reply != null)
    {
        <p class="alert alert-success">@Model.Reply</p>
    }
</div>

The above markup displays an HTML form that accepts a prompt from a user. The prompt is then submitted to the server and the response is displayed in a paragraph (<p> tag) with a green background (Bootstrap class alert-success).

Meanwhile, at the bottom of the page there are some suggested prompts, namely:

Which school does Mat Tan go to?
Which school has the most students?
Which school has the least students?
Get the count of students in each school.
How many students are there in the school of Mining?
What is the ID of Jan Fry and which school does she go to?
Which students belong to the school of Business? Respond only in JSON format.
Which students in the school of Nursing have their first or last name start with the letter 'J'?

Index.cshtml.cs

Replace the IndexModel class definition in Pages/Index.cshtml.cs with:

public class IndexModel : PageModel {
  private readonly ILogger<IndexModel> _logger;
  private readonly IConfiguration _config;
 
  [BindProperty]
  public string? Reply { get; set; }
 
  [BindProperty]
  public string? Service { get; set; }
 
  public IndexModel(ILogger<IndexModel> logger, IConfiguration config) {
    _logger = logger;
    _config = config;
    Service = _config["AIService"]!;
  }
  public void OnGet() { }
  // action method that receives prompt from the form
  public async Task<IActionResult> OnPostAsync(string prompt) {
    // call the AI service (with function calling) via CallFunction()
    var response = await CallFunction(prompt);
    Reply = response;
    return Page();
  }
 
  private async Task<string> CallFunction(string question) {
    string azEndpoint = _config["AzureOpenAiSettings:Endpoint"]!;
    string azApiKey = _config["AzureOpenAiSettings:ApiKey"]!;
    string azModel = _config["AzureOpenAiSettings:Model"]!;
    string oaiModelType = _config["OpenAiSettings:ModelType"]!;
    string oaiApiKey = _config["OpenAiSettings:ApiKey"]!;
    var builder = Kernel.CreateBuilder();
    if (Service!.ToLower() == "openai")
      builder.Services.AddOpenAIChatCompletion(oaiModelType, oaiApiKey);
    else
      builder.Services.AddAzureOpenAIChatCompletion(azModel, azEndpoint, azApiKey);
    builder.Services.AddLogging(c => c.AddDebug().SetMinimumLevel(LogLevel.Trace));
    builder.Plugins.AddFromType<StudentPlugin>();
    var kernel = builder.Build();
    // Create chat history
    ChatHistory history = [];
    // Get chat completion service
    var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();
    // Get user input
    history.AddUserMessage(question);
    // Enable auto function calling
    OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new() {
      ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
    };
    // Get the response from the AI
    var result = chatCompletionService.GetStreamingChatMessageContentsAsync(
      history,
      executionSettings: openAIPromptExecutionSettings,
      kernel: kernel);
    string fullMessage = "";
    await foreach (var content in result) {
      fullMessage += content.Content;
    }
    // Add the message to the chat history
    history.AddAssistantMessage(fullMessage);
    return fullMessage;
  }
}

In the above code, the prompt entered by the user is posted to the OnPostAsync() method. The prompt is then passed to the CallFunction() method, which returns the final response from the OpenAI (or Azure OpenAI) service.

The CallFunction() method reads the OpenAI or Azure OpenAI settings from appsettings.json, depending on the AIService key.

A builder object is created from Semantic Kernel. If we are using OpenAI, then the AddOpenAIChatCompletion service is added. Otherwise, the AddAzureOpenAIChatCompletion service is added.

The StudentPlugin is then added to the builder object Plugins collection.

The builder Build() method is then called returning a kernel object. From the kernel object we then get a chatCompletionService object by calling the GetRequiredService() method.

Thereafter:

  • Add the prompt to the history
  • Make a call to the chat message service and receive a response
  • Concatenate response into a single string
  • Return the concatenated message

Trying the application

In a terminal window, at the root of the Razor Pages web application, enter the following command:

dotnet watch

The following page will display in your default browser:


You can enter any of the suggested prompts to ensure we are getting the proper results. I entered the last prompt and got these results:




Conclusion

We have seen how Semantic Kernel and function calling can be used with data coming from a database. In this example we used SQLite; however, any other database source can be used with the same technique.