Overview
In this tutorial, we will see how easy it is to use the Phi-3 small language model in a PHP application. The best part is that it is free and runs entirely on your local device. Ollama is used to serve the Phi-3 small language model, and LLPhant is the PHP framework used to communicate with the model.
Prerequisites
To proceed, you will need the following:
- PHP - You need to have PHP version 8.3 (or higher) installed on your computer. You can download the latest version from https://www.php.net/downloads.php.
- Composer - If you do not have Composer yet, download and install it for your operating system from https://getcomposer.org/download/.
What is a small language model (SLM)?
A small language model (SLM) is a machine learning model typically based on a large language model (LLM) but of greatly reduced size. An SLM retains much of the functionality of the LLM from which it is built but with far less complexity and computing resource demand.
What is Ollama?
Ollama is an application you can download onto your computer or server to run open-source generative AI small language models (SLMs) such as Meta's Llama 3 and Microsoft's Phi-3. You can browse the many available models at https://www.ollama.com/library.
What is LLPhant?
LLPhant is an open-source PHP generative AI framework, available on GitHub at https://github.com/LLPhant/LLPhant.
Getting Started
Download the Ollama installer from https://www.ollama.com/download.
Once you have installed Ollama, run these commands from a terminal window:
ollama pull phi3:latest
ollama list
ollama show phi3:latest
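These commands download the Phi-3 model, list the models installed locally, and display details about the model. Optionally, you can confirm the model responds before writing any PHP by running it directly from the terminal with a one-off prompt:
ollama run phi3:latest "What is the capital of France?"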
In a suitable working directory, create a folder named PhpAI with the following terminal window command:
mkdir PhpAI
Change into the newly created folder with:
cd PhpAI
Using Composer, install the theodo-group/llphant package by running this command:
composer require theodo-group/llphant
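Optionally, you can verify that the package was added to your project:
composer show theodo-group/llphant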
Let's get coding
Create a file named index.php with the following content:
<?php
require_once 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;

// Configure LLPhant to use the phi3 model served by Ollama.
$config = new OllamaConfig();
$config->model = 'phi3';

// Create the chat client, set a system message, and ask a question.
$chat = new OllamaChat($config);
$chat->setSystemMessage('You are a helpful assistant who knows about world geography.');
$response = $chat->generateText('What is the capital of France?');
echo $response;
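By default, LLPhant talks to Ollama on its standard local endpoint (port 11434), so no extra configuration should be needed when both run on the same machine. If your Ollama instance listens on a different host or port, LLPhant's OllamaConfig should let you point it there via its url property. The snippet below is a sketch based on the LLPhant documentation at the time of writing; the exact property name and endpoint path are assumptions, so check the project's README for your version:
$config = new OllamaConfig();
$config->model = 'phi3';
$config->url = 'http://localhost:11434/api/'; // assumed default; adjust if Ollama runs elsewhere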
Running the app
To run the app, start PHP's built-in web server on port 8888 by running the following command in the PhpAI folder:
php -S localhost:8888
You can view the output by pointing your browser to the following URL:
http://localhost:8888/
This is what I experienced:
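If you want to try different questions without editing index.php each time, one small variation is to read the prompt from a query-string parameter. The snippet below is a sketch that reuses the same LLPhant calls as above; the q parameter name is just an illustrative choice:
<?php
require_once 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;

// Read the question from a ?q= query parameter, with a fallback default.
$question = $_GET['q'] ?? 'What is the capital of France?';

$config = new OllamaConfig();
$config->model = 'phi3';

$chat = new OllamaChat($config);
$chat->setSystemMessage('You are a helpful assistant who knows about world geography.');
echo $chat->generateText($question);
With the PHP server still running, you could then visit a URL such as http://localhost:8888/?q=What+is+the+capital+of+Japan to ask a different question.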