Download the PHP package lewenbraun/ollama-php-sdk without Composer

On this page you can find all versions of the PHP package lewenbraun/ollama-php-sdk. These versions can be downloaded and installed without Composer, and possible dependencies are resolved automatically.

FAQ

After the download, you only have to add one include, require_once('vendor/autoload.php');, and then import the classes with use statements.

Example:
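(The namespace below is only illustrative; substitute the actual classes from the package you downloaded.)

    <?php

    require_once('vendor/autoload.php');

    // Import a class from the downloaded package (illustrative name, adjust to the real namespace).
    use LewenBraun\Ollama\Ollama;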
If you use only one package, a project is not needed. But if you use more than one package, it is not possible to import the classes with use statements without a project.

In general, it is recommended to always use a project to download your libraries, since an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories. In that case, credentials are needed to access them. If a package comes from a private repository, please enter the credentials in the auth.json textarea. You can look here for more information.

  • Some hosting environments are not accessible via a terminal or SSH, so Composer cannot be used there.
  • Using Composer is sometimes complicated, especially for beginners.
  • Composer needs a lot of resources, which are sometimes not available on a simple webspace.
  • If you are using private repositories, you don't need to share your credentials. You can set up everything on our site and then provide a simple download link to your team members.
  • Simplify your Composer build process: use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose your credentials for private repositories.

Information about the package ollama-php-sdk

Ollama PHP Client Library

A lightweight PHP client library to interact with the Ollama API. This library provides a simple, fluent interface to work with text and chat completions, manage models, handle blobs, and generate embeddings.

Installation

Install the library via Composer:
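The standard Composer command, using the package name shown on this page:

    composer require lewenbraun/ollama-php-sdk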

Usage

Initialize the Client

Create a new client instance (default host is http://localhost:11434):
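A minimal sketch is shown below. The namespace, class name, and constructor signature are assumptions for illustration and may differ from the actual SDK; check the package source for the real names. The later examples reuse this $client instance and the same assumed method names.

    <?php

    require_once 'vendor/autoload.php';

    // Class and namespace are assumed for illustration; see the package source for the real ones.
    use LewenBraun\Ollama\Ollama;

    // Host argument assumed optional, defaulting to http://localhost:11434 as stated above.
    $client = new Ollama('http://localhost:11434');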

Completion

Generate a text completion using a model. Under the hood, this wraps the /api/generate endpoint.

Non-Streaming Completion

Returns a single response object:
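For example (method and response field names are assumed, as noted above):

    // Assumed fluent API wrapping /api/generate.
    $response = $client->completion()->create([
        'model'  => 'llama3.2',
        'prompt' => 'Why is the sky blue?',
    ]);

    echo $response->response;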

Streaming Completion

Enables streaming to receive the response in chunks:
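For example, assuming 'stream' => true makes the call return an iterable of chunks:

    $stream = $client->completion()->create([
        'model'  => 'llama3.2',
        'prompt' => 'Why is the sky blue?',
        'stream' => true,
    ]);

    // Each chunk is assumed to carry a partial 'response' field.
    foreach ($stream as $chunk) {
        echo $chunk->response;
    }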

Chat Completion

Generate chat responses using a conversational context. This wraps the /api/chat endpoint.

Non-Streaming Chat Completion

Returns the final chat response as a single object:
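For example (assumed API, continuing with the $client from above):

    $response = $client->chat()->create([
        'model'    => 'llama3.2',
        'messages' => [
            ['role' => 'user', 'content' => 'Hello! How are you?'],
        ],
    ]);

    echo $response->message->content;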

Streaming Chat Completion

Stream the chat response as it is generated:
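For example (assumed method names, with 'stream' => true yielding chunks):

    $stream = $client->chat()->create([
        'model'    => 'llama3.2',
        'messages' => [
            ['role' => 'user', 'content' => 'Tell me a short story.'],
        ],
        'stream'   => true,
    ]);

    foreach ($stream as $chunk) {
        echo $chunk->message->content;
    }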

Model Management

Manage models through various endpoints (create, list, show, copy, delete, pull, push).

Create a Model

Create a new model from an existing model (or other sources):
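A sketch using parameter names from the /api/create endpoint (the models() accessor and create() method are assumptions):

    $client->models()->create([
        'model'  => 'mario',
        'from'   => 'llama3.2',
        'system' => 'You are Mario from Super Mario Bros.',
    ]);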

List Models

Retrieve a list of available models:
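For example (method name and response shape assumed; wraps /api/tags):

    $list = $client->models()->list();

    // The response is assumed to expose the models array from the API.
    foreach ($list->models as $model) {
        echo $model->name . PHP_EOL;
    }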

List Running Models

List models currently loaded into memory:
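For example (method name assumed; this wraps the /api/ps endpoint):

    $running = $client->models()->running();

    var_dump($running);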

Show Model Information

Retrieve details about a specific model:
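For example (assumed API):

    $info = $client->models()->show([
        'model' => 'llama3.2',
    ]);

    var_dump($info);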

Copy a Model

Create a copy of an existing model:
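For example (assumed API, with the source/destination names used by /api/copy):

    $client->models()->copy([
        'source'      => 'llama3.2',
        'destination' => 'llama3.2-backup',
    ]);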

Delete a Model

Delete a model by name:
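For example (assumed API):

    $client->models()->delete([
        'model' => 'llama3.2-backup',
    ]);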

Pull a Model

Download a model from the Ollama library:
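For example (assumed API):

    $client->models()->pull([
        'model' => 'llama3.2',
    ]);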

Push a Model

Upload a model to your model library:
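For example (assumed API; pushing to ollama.com requires a name that includes your namespace):

    $client->models()->push([
        'model' => 'your-namespace/llama3.2:latest',
    ]);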

Blobs

Work with binary large objects (blobs) used for model files.

Check if a Blob Exists

Verify if a blob (by its SHA256 digest) exists on the server:
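For example (method name assumed; the digest string is a placeholder):

    $exists = $client->blobs()->exists('sha256:<digest>');

    var_dump($exists);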

Push a Blob

Upload a blob to the Ollama server:
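For example (method name and signature are assumptions):

    // Upload the raw file contents under the given digest (placeholder values).
    $client->blobs()->push('sha256:<digest>', file_get_contents('model.gguf'));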

Embeddings

Generate embeddings from a model using the /api/embed endpoint.
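For example (assumed API, mirroring the /api/embed request fields):

    $response = $client->embed()->create([
        'model' => 'all-minilm',
        'input' => 'Why is the sky blue?',
    ]);

    print_r($response->embeddings);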

Version

Retrieve the version of the Ollama server.
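For example (assumed API):

    // Assumed to return the server version; the exact return shape is not verified.
    var_dump($client->version());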

Advanced Parameters

Each method accepts additional parameters (e.g., suffix, format, options, keep_alive, etc.) that can be used to fine-tune the request. For example, you can pass advanced options in the request arrays for structured outputs, reproducible responses, and more. For full details, refer to the Ollama API documentation.
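A sketch of passing such parameters through the request array (key names follow the Ollama API; whether this SDK forwards them unchanged is an assumption):

    $response = $client->completion()->create([
        'model'      => 'llama3.2',
        'prompt'     => 'Respond with a JSON object describing the weather.',
        'format'     => 'json',          // structured output
        'options'    => [
            'temperature' => 0.2,
            'seed'        => 42,         // reproducible responses
        ],
        'keep_alive' => '5m',
    ]);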

Contributing

Contributions, issues, and feature requests are welcome. Please check the issues page for more details.

License

This project is licensed under the MIT License.


All versions of ollama-php-sdk with dependencies

Requires:
  • php: >=8.1
  • guzzlehttp/guzzle: ^7.0
Composer command for our command line client (download client): this client runs in any environment, so you don't need a specific PHP version, and the first 20 API calls are free. The standard Composer command is composer require lewenbraun/ollama-php-sdk.
