Ollama PHP Client
This is a PHP library for Ollama. Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine. It acts as a bridge between the complexities of LLM technology and the desire for an accessible and customizable AI experience.
Table of Contents
- Installation
- Usage
  - Completions Resource
    - create
    - createStreamed
  - Chat Resource
    - create
    - createStreamed
  - Models Resource
    - list
    - show
    - create
    - createStreamed
    - copy
    - delete
    - pull
    - pullStreamed
    - push
    - pushStreamed
    - runningList
  - Blobs Resource
    - exists
    - create
  - Embed Resource
    - create
- Testing
- License
Installation
You can install the package via Composer:
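For example (the package name below is taken from this page; verify the exact name and version constraints on Packagist):

```bash
composer require ollama/ollama-php-client
```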
Ensure your project meets the following requirements:
- PHP 8.1+
- Ollama
You can find the official Ollama documentation here.
Usage
Then, you can create a new Ollama client instance:
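A minimal sketch of what that could look like, assuming the package exposes a client factory that takes the Ollama server URL; the namespace, class, and factory method below are illustrative and should be checked against the package source:

```php
<?php

require 'vendor/autoload.php';

// Hypothetical entry point; defaults shown for a local Ollama server.
$client = \Ollama\Client::factory('http://localhost:11434');
```

The `$client` variable created here is reused in the examples below.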
Completions Resource
create
Generate a response for a given prompt with a provided model.
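A sketch of the assumed call shape; the `completions()` resource and `create()` method follow the section names above, and the array keys mirror the Ollama `/api/generate` parameters:

```php
// Parameter names mirror the Ollama /api/generate endpoint; call shape is assumed.
$completion = $client->completions()->create([
    'model'  => 'llama3.1',
    'prompt' => 'Why is the sky blue?',
]);

var_dump($completion); // inspect the generated response
```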
createStreamed
Generate a response for a given prompt with a provided model and stream the response.
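Streamed variants are assumed to return an iterable of partial responses, for example:

```php
// Assumed call shape; each yielded chunk carries a partial response.
$stream = $client->completions()->createStreamed([
    'model'  => 'llama3.1',
    'prompt' => 'Why is the sky blue?',
]);

foreach ($stream as $chunk) {
    var_dump($chunk);
}
```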
Chat Resource
create
Generate a chat response for a given list of messages with a provided model.
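A sketch under the same assumptions as above; the keys mirror the Ollama `/api/chat` parameters:

```php
// Assumed call shape; messages follow the Ollama /api/chat role/content format.
$response = $client->chat()->create([
    'model'    => 'llama3.1',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a helpful assistant.'],
        ['role' => 'user',   'content' => 'Hello!'],
    ],
]);

var_dump($response);
```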
You can also use the `tools` parameter to provide custom functions to the chat. The `tools` parameter cannot be used with the `createStreamed` method.
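A sketch of passing a tool definition; the structure mirrors the `tools` format of the Ollama `/api/chat` endpoint, and the function name and parameters are made up for illustration:

```php
// Assumed call shape; the tool schema follows the Ollama /api/chat "tools" format.
$response = $client->chat()->create([
    'model'    => 'llama3.1',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the weather in Paris?'],
    ],
    'tools' => [
        [
            'type'     => 'function',
            'function' => [
                'name'        => 'get_current_weather', // hypothetical function
                'description' => 'Get the current weather for a city',
                'parameters'  => [
                    'type'       => 'object',
                    'properties' => [
                        'city' => ['type' => 'string', 'description' => 'The city name'],
                    ],
                    'required'   => ['city'],
                ],
            ],
        ],
    ],
]);

var_dump($response); // inspect any tool calls returned by the model
```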
createStreamed
Generate a chat response for a given list of messages with a provided model and stream the response.
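For example, under the same assumptions:

```php
// Assumed call shape; each yielded chunk carries a partial message.
$stream = $client->chat()->createStreamed([
    'model'    => 'llama3.1',
    'messages' => [
        ['role' => 'user', 'content' => 'Write a haiku about the sea.'],
    ],
]);

foreach ($stream as $chunk) {
    var_dump($chunk);
}
```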
Models Resource
list
List all available models.
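A sketch, assuming the `models()` resource exposes the listed methods:

```php
// Assumed call shape; mirrors the Ollama /api/tags endpoint.
$models = $client->models()->list();

var_dump($models);
```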
show
Show details of a specific model.
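For example (whether the model name is passed as a plain string or an options array is an assumption):

```php
// Assumed call shape; mirrors the Ollama /api/show endpoint.
$details = $client->models()->show('llama3.1');

var_dump($details);
```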
create
Create a new model.
createStreamed
Create a new model and stream the response.
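A sketch covering both `create` and `createStreamed`; the `modelfile` key mirrors the Ollama `/api/create` endpoint and the exact parameter names are assumptions:

```php
// Assumed call shape; creates a model from a Modelfile definition.
$client->models()->create([
    'model'     => 'mario',
    'modelfile' => "FROM llama3.1\nSYSTEM You are Mario from Super Mario Bros.",
]);

// Streamed variant: progress updates are assumed to be yielded as the model is built.
$stream = $client->models()->createStreamed([
    'model'     => 'mario',
    'modelfile' => "FROM llama3.1\nSYSTEM You are Mario from Super Mario Bros.",
]);

foreach ($stream as $status) {
    var_dump($status);
}
```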
copy
Copy an existing model.
delete
Delete a model.
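A sketch covering both `copy` and `delete`; the argument shapes mirror the Ollama `/api/copy` and `/api/delete` endpoints and are assumptions:

```php
// Assumed call shapes.
$client->models()->copy('llama3.1', 'llama3.1-backup'); // source, destination
$client->models()->delete('llama3.1-backup');
```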
pull
Pull a model from the Ollama library.
pullStreamed
Pull a model from the Ollama library and stream the response.
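A sketch covering both `pull` and `pullStreamed`, under the same assumptions:

```php
// Assumed call shapes; mirror the Ollama /api/pull endpoint.
$result = $client->models()->pull('llama3.1');

// Streamed variant: download progress is assumed to be yielded chunk by chunk.
$stream = $client->models()->pullStreamed('llama3.1');
foreach ($stream as $progress) {
    var_dump($progress);
}
```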
push
Push a model to the Ollama library.
pushStreamed
Push a model to the Ollama library and stream the response.
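A sketch covering both `push` and `pushStreamed`; pushing requires a model tagged with your namespace, and the call shapes are assumptions:

```php
// Assumed call shapes; mirror the Ollama /api/push endpoint.
$result = $client->models()->push('yourname/llama3.1:latest');

$stream = $client->models()->pushStreamed('yourname/llama3.1:latest');
foreach ($stream as $progress) {
    var_dump($progress);
}
```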
runningList
List all running models.
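For example:

```php
// Assumed call shape; mirrors the Ollama /api/ps endpoint.
$running = $client->models()->runningList();

var_dump($running);
```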
Blobs Resource
exists
Check if a blob exists.
create
Create a new blob.
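A sketch covering both `exists` and `create`; blobs are addressed by their SHA-256 digest as in the Ollama `/api/blobs` endpoints, and the argument order is an assumption:

```php
// Assumed call shapes; the file path is illustrative.
$path   = '/path/to/model.gguf';
$digest = 'sha256:' . hash_file('sha256', $path);

if (! $client->blobs()->exists($digest)) {
    $client->blobs()->create($path, $digest); // argument order assumed
}
```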
Embed Resource
create
Generate an embedding for a given text with a provided model.
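A sketch under the same assumptions; the keys mirror the Ollama `/api/embed` parameters:

```php
// Assumed call shape; "input" may also accept an array of strings per the Ollama API.
$embedding = $client->embed()->create([
    'model' => 'nomic-embed-text',
    'input' => 'The quick brown fox jumps over the lazy dog.',
]);

var_dump($embedding);
```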
Testing
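Assuming the package defines the usual Composer test script (not confirmed on this page), the test suite would be run with:

```bash
composer test
```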
License
The MIT License (MIT). Please see License File for more information.