Monday, January 20, 2025

Phi-3 Small Language Model (SLM) in a PHP app with Ollama and LLPhant framework

Overview

In this tutorial, we will see how easy it is to use the Phi-3 small language model in a PHP application. The best part is that it is free and runs entirely on your local device. Ollama serves the Phi-3 model, and LLPhant is the PHP framework we use to communicate with it.

Prerequisites

To proceed, you will need the following:

A recent version of PHP installed on your computer
Composer (https://getcomposer.org), the PHP dependency manager
A computer that can run Ollama (Windows, macOS, or Linux)

What is a small language model (SLM)?

A small language model (SLM) is a machine learning model typically based on a large language model (LLM) but of greatly reduced size. An SLM retains much of the functionality of the LLM from which it is built but with far less complexity and computing resource demand.

What is Ollama?

Ollama is an application you can download onto your computer or server to run open-source generative AI models, including small language models (SLMs) such as Meta's Llama 3 and Microsoft's Phi-3. You can browse the available models at https://www.ollama.com/library.

What is LLPhant?

LLPhant is an open-source generative AI framework for PHP, available at https://github.com/LLPhant/LLPhant.

Getting Started

Download the Ollama installer from https://www.ollama.com/download.

Once you have installed Ollama, run these commands from a terminal window to download the Phi-3 model, list your locally installed models, and show details about Phi-3:

ollama pull phi3:latest
ollama list
ollama show phi3:latest

In a suitable working directory, create a folder named PhpAI with the following terminal window command:

mkdir PhpAI

Change into the newly created folder with:

cd PhpAI

Using Composer, install the theodo-group/llphant package by running this command:

composer require theodo-group/llphant
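
You can confirm that the package was installed, and see which version you received, with:

composer show theodo-group/llphant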

Let's get coding

Create a file named index.php with the following content:

<?php

require_once 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;

// Point LLPhant at the Phi-3 model served by the local Ollama instance.
$config = new OllamaConfig();
$config->model = 'phi3';

$chat = new OllamaChat($config);

// The system message sets the assistant's behavior for the conversation.
$chat->setSystemMessage('You are a helpful assistant who knows about world geography.');

// Send the prompt and receive the model's reply as a string.
$response = $chat->generateText('What is the capital of France?');

echo $response;
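
Under the hood, OllamaChat calls Ollama's local REST API. If you are curious about what LLPhant is doing for us, here is a minimal sketch, using only PHP's built-in streams, that posts the same kind of request straight to Ollama's documented /api/generate endpoint (setting stream to false asks for one complete JSON reply instead of a token stream):

<?php

// Minimal sketch: call Ollama's REST API directly, which is
// roughly what LLPhant's OllamaChat wraps for us.
$payload = json_encode([
    'model'  => 'phi3',
    'prompt' => 'What is the capital of France?',
    'stream' => false, // ask for one complete JSON response
]);

$context = stream_context_create([
    'http' => [
        'method'  => 'POST',
        'header'  => 'Content-Type: application/json',
        'content' => $payload,
    ],
]);

// Ollama listens on http://localhost:11434 by default.
$raw = file_get_contents('http://localhost:11434/api/generate', false, $context);
$data = json_decode($raw, true);

echo $data['response'] ?? 'No response';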

Running the app

To run the app, start PHP's built-in web server listening on port 8888 by running the following command in the PhpAI folder:

php -S localhost:8888

You can view the output by pointing your browser to the following URL:

http://localhost:8888/
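
Since index.php simply echoes the response, you can also skip the web server and run the script directly from the terminal:

php index.php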

In my case, after a short wait (the model runs locally, so response time depends on your hardware), the browser displayed an answer naming Paris as the capital of France. The exact wording varies from run to run.
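
One thing you will notice with local models is that a full answer can take a few seconds to arrive. LLPhant also offers a streaming variant. As a sketch, assuming the generateStreamOfText method found in recent LLPhant releases (which returns a PSR-7 StreamInterface), you can print the reply as it is generated:

<?php

require_once 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;

$config = new OllamaConfig();
$config->model = 'phi3';
$chat = new OllamaChat($config);

// Assumption: generateStreamOfText() returns a PSR-7 StreamInterface.
$stream = $chat->generateStreamOfText('What is the capital of France?');

while (!$stream->eof()) {
    echo $stream->read(64); // print each chunk as soon as it arrives
    flush();                // push buffered output to the browser
}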

Conclusion

We can package our applications with a local SLM. This makes them cheaper, faster, and self-contained, and lets them work without an internet connection.
