tredmann / php-ollama
v1.0.1
2025-01-10 13:42 UTC
Requires
- php: ^8.3
- guzzlehttp/guzzle: ^7.0
Requires (Dev)
- laravel/pint: ^1.19
- phpstan/phpstan: ^2.1
- phpunit/phpunit: ^11
README
PHP Client Library to interact with the Ollama API.
The intention is to work with models on your existing Ollama setup, not to create or delete them. Therefore, this library will not implement any APIs to create, move, or delete models. Here is a list of the APIs we intend to implement and the status of the implementation:
- Completion (without streaming support)
- Chat Completion (without streaming support)
- List local models
- Show Model Information
- List Running Models
- Version
Check out the Ollama API Docs for more information and for any APIs we might have missed.
This package contains some low-level API libraries as well as a convenient API wrapper for all APIs.
Installation
composer require tredmann/php-ollama
Convenience Wrapper
The easiest way to ask the LLM things is to use the convenience wrapper:
use Ollama\Ollama;

$ollama = new Ollama(model: 'gemma2:latest');

echo $ollama->completion(prompt: 'What is the capital of Germany?');
// The capital of Germany is **Berlin**.
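Chat completion is listed above as supported as well. Here is a minimal sketch of what that could look like through the wrapper; the chat() method name and the message array format are assumptions, so check the package source for the actual signature:

use Ollama\Ollama;

$ollama = new Ollama(model: 'gemma2:latest');

// Hypothetical: assumes the wrapper exposes a chat() helper that
// accepts an array of role/content messages, mirroring the
// chat-completion API listed above.
echo $ollama->chat(messages: [
    ['role' => 'user', 'content' => 'What is the capital of Germany?'],
]);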
The wrapper does have a ton of limitations, but it is easy to use for quick results. I would highly encourage you to look into the low-level library.
General way of working with the low-level library
Creating the Client
use Ollama\Client\OllamaClient;

$client = new OllamaClient(
    baseUrl: 'http://localhost:11434' // default
);
Inject the client into the respective API
use Ollama\Api\Completion;

$completionApi = new Completion(client: $client);
Use the API by creating an API request
use Ollama\Requests\CompletionRequest;

$request = new CompletionRequest(
    model: 'phi3.5:latest',
    prompt: 'What is the capital of Germany?'
);
Use a request to query the API
$response = $completionApi->getCompletion(request: $request);
Use the response
echo $response->response; // 'The capital of Germany is Berlin.'
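Putting the steps together, here is a minimal end-to-end sketch assembled from the snippets above (nothing new beyond composing them into one script):

use Ollama\Client\OllamaClient;
use Ollama\Api\Completion;
use Ollama\Requests\CompletionRequest;

// Create the client (baseUrl defaults to http://localhost:11434)
$client = new OllamaClient(baseUrl: 'http://localhost:11434');

// Inject the client into the Completion API
$completionApi = new Completion(client: $client);

// Build the request and query the API
$request = new CompletionRequest(
    model: 'phi3.5:latest',
    prompt: 'What is the capital of Germany?'
);
$response = $completionApi->getCompletion(request: $request);

echo $response->response;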