Mistral PHP Client
The Mistral PHP Client is a comprehensive PHP library designed to interface with the Mistral AI API.
This client supports Mistral's La plateforme as well as Llama.cpp for local testing and development purposes.
The client API is the same as the main Mistral API.
Features
- Chat Completions: Generate conversational responses and complete dialogue prompts using Mistral's language models.
- Chat Completions Streaming: Establish a real-time stream of chat completions, ideal for applications requiring continuous interaction.
- Embeddings: Obtain numerical vector representations of text, enabling semantic search, clustering, and other machine learning applications.
- Fill in the Middle: Automatically generate code by setting a starting prompt and an optional suffix, allowing the model to complete the code in between. Ideal for creating specific code segments within predefined boundaries.
Getting Started
To begin using the Mistral PHP Client in your project, ensure you have PHP installed on your system. This client library is compatible with PHP 8.3 and higher.
Installation
To install the Mistral PHP Client, run the following command:
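```bash
composer require partitech/php-mistral
```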
Usage
To use the client in your PHP application, you need to import the package and initialize a new client instance with your API key.
You can find full examples in the examples directory.
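For example, a minimal setup might look like this (a sketch; the exact MistralClient constructor signature is an assumption):

```php
<?php
require_once 'vendor/autoload.php';

use Partitech\PhpMistral\MistralClient;

// Assumption: the constructor takes your La plateforme API key.
$apiKey = getenv('MISTRAL_API_KEY');
$client = new MistralClient($apiKey);
```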
Chat message with La plateforme
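A minimal sketch, assuming a Messages container with an addUserMessage() helper; the parameter names mirror Mistral's chat completion API:

```php
<?php
use Partitech\PhpMistral\MistralClient;
use Partitech\PhpMistral\Messages;

$client = new MistralClient(getenv('MISTRAL_API_KEY'));

// Build the conversation (helper name assumed).
$messages = new Messages();
$messages->addUserMessage('What are the best French cheeses?');

// Parameters mirror Mistral's chat completion API.
$params = [
    'model'       => 'mistral-large-latest',
    'temperature' => 0.7,
    'max_tokens'  => 256,
];

$response = $client->chat($messages, $params);
echo $response->getMessage(); // last message returned by the server
```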
Note that you can populate the chat discussion with system, user, and assistant messages.
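A sketch, assuming per-role helper methods on the Messages class:

```php
<?php
// Helper names are assumptions, one per chat role.
$messages->addSystemMessage('You are a helpful assistant.');
$messages->addUserMessage('What are the best French cheeses?');
$messages->addAssistantMessage('Comté, Roquefort and Camembert are good picks.');
```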
To get the same with Llama.cpp local inference:
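Presumably only the client construction changes; the second constructor argument (the local inference URL) is an assumption:

```php
<?php
use Partitech\PhpMistral\MistralClient;
use Partitech\PhpMistral\Messages;

// Assumption: the client accepts the URL of a local Llama.cpp server.
$client = new MistralClient('no-key-needed', 'http://localhost:8080');

$messages = new Messages();
$messages->addUserMessage('What are the best French cheeses?');

$response = $client->chat($messages, ['max_tokens' => 256]);
echo $response->getMessage();
```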
Chat with streamed response with La plateforme
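A sketch, assuming chat() accepts a stream flag and then yields Response chunks:

```php
<?php
// $client, $messages and $params as in the chat example above.
// Assumption: a third boolean argument switches chat() to streaming mode.
foreach ($client->chat($messages, $params, true) as $chunk) {
    echo $chunk->getChunk(); // last chunk yielded by the server
}
```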
Chat with streamed response with Llama.cpp
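The streaming loop is identical to the La plateforme example above; only the client construction, pointing at your local Llama.cpp server, changes.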
Embeddings with La plateforme
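A sketch, assuming an embeddings() method that wraps Mistral's embeddings endpoint and accepts an array of input strings:

```php
<?php
// Assumption: embeddings() takes an array of strings to embed.
$inputs = ['What are the best French cheeses?'];
$response = $client->embeddings($inputs);

// One numerical vector per input string.
print_r($response);
```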
Embeddings with Llama.cpp server
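The call itself is the same against a local Llama.cpp server; just make sure the server was started with its embeddings support enabled so the endpoint is available.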
Fill in the middle
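A sketch, assuming a fim() method mirroring Mistral's fill-in-the-middle endpoint (Codestral); the method name and parameter keys are assumptions:

```php
<?php
// Assumption: fim() forwards prompt/suffix to the fill-in-the-middle endpoint.
$params = [
    'model'  => 'codestral-latest',
    'prompt' => 'function isPrime(int $number): bool {',
    'suffix' => '}',
];

$response = $client->fim($params);
echo $response->getMessage(); // the generated code between prompt and suffix
```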
Fill in the middle in stream mode
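Presumably the same call with the stream flag, iterating over chunks as in the chat streaming example:

```php
<?php
// Assumption: the same boolean stream flag applies to fim().
foreach ($client->fim($params, true) as $chunk) {
    echo $chunk->getChunk();
}
```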
Pixtral request
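A sketch of a multimodal request; the message-building helper is an assumption, but the payload shape (a text part plus an image_url part) follows Mistral's chat API for Pixtral:

```php
<?php
use Partitech\PhpMistral\Messages;

// Assumption: a user message can carry mixed text/image content,
// matching Mistral's chat API payload for Pixtral.
$messages = new Messages();
$messages->addUserMessage([
    ['type' => 'text', 'text' => 'Describe this image.'],
    ['type' => 'image_url', 'image_url' => 'https://example.com/image.jpg'],
]);

$response = $client->chat($messages, ['model' => 'pixtral-12b-2409']);
echo $response->getMessage();
```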
Llama.cpp inference
MistralAi La plateforme is really cheap, so you should consider subscribing to it instead of running a local Llama.cpp instance; this bundle cost us only €0.02 during our tests. If you really feel you need a local server, here is an example docker-compose you can use:
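A minimal sketch using the official llama.cpp server image; the model file is a placeholder supplied through .env:

```yaml
services:
  llamacpp:
    image: ghcr.io/ggerganov/llama.cpp:server
    ports:
      - "8080:8080"
    volumes:
      - ./models:/models
    # MODEL_FILE comes from the .env file below.
    command: -m /models/${MODEL_FILE} --host 0.0.0.0 --port 8080
```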
with a .env file
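For example, with the model file name as a placeholder:

```env
# Placeholder: any GGUF model you have downloaded into ./models
MODEL_FILE=mistral-7b-instruct-v0.2.Q4_K_M.gguf
```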
vLLM inference
vLLM is the official inference alternative to MistralAi La plateforme. Starting with v0.6.0, vLLM is fully compatible with the tools/tool_choice options from Mistral's La plateforme. To get an identical instance locally, you will need to specify the template examples/tool_chat_template_mistral.jinja. Here is an example docker-compose you can use:
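A sketch using the official vllm/vllm-openai image; the model name is an example and the chat template path refers to the file shipped in the vLLM repository:

```yaml
services:
  vllm:
    image: vllm/vllm-openai:latest
    ports:
      - "8000:8000"
    environment:
      - HUGGING_FACE_HUB_TOKEN=${HUGGING_FACE_HUB_TOKEN}
    # Model name is an example; the template is required for tools/tool_choice.
    command: >
      --model mistralai/Mistral-7B-Instruct-v0.3
      --chat-template examples/tool_chat_template_mistral.jinja
```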
vLLM inference for tool_call
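A sketch of a tool call, assuming tools and tool_choice are forwarded through the chat params as in Mistral's API and read back with the tool_calls accessor described below:

```php
<?php
use Partitech\PhpMistral\Messages;

// Tool definition follows Mistral's function-calling schema.
$tools = [[
    'type' => 'function',
    'function' => [
        'name'        => 'get_weather', // hypothetical tool
        'description' => 'Get the current weather for a city',
        'parameters'  => [
            'type'       => 'object',
            'properties' => ['city' => ['type' => 'string']],
            'required'   => ['city'],
        ],
    ],
]];

$messages = new Messages();
$messages->addUserMessage('What is the weather in Paris?');

// Assumption: tools/tool_choice are passed straight through the params array.
$response = $client->chat($messages, [
    'model'       => 'mistral-large-latest',
    'tools'       => $tools,
    'tool_choice' => 'auto',
]);

print_r($response->getToolCalls()); // json_decoded tool_calls as an associative array
```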
vLLM inference for Pixtral
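A sketch; Pixtral on vLLM needs the mistral tokenizer mode, and the model name below is an example:

```yaml
services:
  vllm:
    image: vllm/vllm-openai:latest
    ports:
      - "8000:8000"
    # Pixtral requires vLLM's mistral tokenizer mode.
    command: >
      --model mistralai/Pixtral-12B-2409
      --tokenizer-mode mistral
```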
About the Response object
The `$client->chat()` method returns a `Partitech\PhpMistral\Response` object.
The available methods are accessors to the raw response returned by the server. The method names and the response JSON are self-explanatory, but here is a basic explanation:
- Get the last message response from the server. In practice, this returns the last message from the Messages class, which is a list of messages.
- When using a streamed response, get the last chunk of the message yielded by the server.
- Get the response id from the server.
- Get an array of the choice objects returned by the server.
- Get the created date of the response (integer timestamp).
- Get the guided message object. This is the json_decoded message returned by the vLLM server. Only used with a vLLM server.
- Get the model used, from the server response.
- Get the object index, from the server response.
- Get the tool_calls value from the server's response. This is an associative array built from the json_decoded server response.
- Get the usage section of the server response.
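As a sketch, the accessors presumably map one-to-one onto the response fields listed above; the names below are inferred from those fields and should be checked against the Response class:

```php
<?php
$response = $client->chat($messages, $params);

// Presumed accessor names, inferred from the response fields above.
echo $response->getMessage();        // last message
echo $response->getId();             // response id
print_r($response->getChoices());    // choices array
echo $response->getCreated();        // integer timestamp
echo $response->getModel();          // model name
print_r($response->getToolCalls());  // json_decoded tool_calls
print_r($response->getUsage());      // token usage
```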
Documentation
For detailed documentation on the Mistral AI API and the available endpoints, please refer to the Mistral AI API Documentation.
Contributing
Contributions are welcome! If you would like to contribute to the project, please fork the repository and submit a pull request with your changes.
License
The Mistral PHP Client is open-sourced software licensed under the MIT license.
Support
If you encounter any issues or require assistance, please file an issue on the GitHub repository issue tracker.
Thanks
This README.md is mostly copied from https://github.com/Gage-Technologies/mistral-go. Thanks to them :)