LLMIntegrationBundle
LLMIntegrationBundle is a powerful Symfony bundle that seamlessly integrates Large Language Models (LLMs) into your Symfony applications. With support for multiple AI providers and a flexible architecture, it's designed for easy extension and customization.
Table of Contents
- Features
- Installation
- Configuration
- Usage
- Available AI Clients
- CLI Commands
- Extending the Bundle
- Exception Handling
- Testing
- License
- Author
- Contributing
- Documentation
- Acknowledgements
Features
- Support for multiple AI providers
- Flexible configuration
- Exception handling with custom exceptions
- CLI integration for generating new AI service classes
- Extensible architecture
- Comprehensive unit testing
Installation
Install the bundle using Composer:
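Using the package name given above, the install command is:

```shell
composer require saqqal/llm-integration-bundle
```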
Configuration
1. Register the bundle in `config/bundles.php`:
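A sketch of the registration entry. The bundle class name `Saqqal\LlmIntegrationBundle\LlmIntegrationBundle` is an assumption inferred from the package name; check the vendor namespace in your installation.

```php
<?php
// config/bundles.php
return [
    Symfony\Bundle\FrameworkBundle\FrameworkBundle::class => ['all' => true],
    // ... other bundles ...
    Saqqal\LlmIntegrationBundle\LlmIntegrationBundle::class => ['all' => true], // class name assumed
];
```

Note that Symfony Flex usually adds this entry automatically when you run `composer require`.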
2. Create `config/packages/llm_integration.yaml`:
3. Set the API key in your `.env` file:
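A minimal sketch of the configuration file and the `.env` entry. Only `llm_provider` is named elsewhere in this README; the other option names (`llm_api_key`, `llm_model`) and the environment variable name are assumptions, so check the bundle's documentation for the real schema.

```yaml
# config/packages/llm_integration.yaml
llm_integration:
    llm_provider: 'api_together'                      # provider key; see "Available AI Clients"
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'        # option name assumed
    llm_model: 'mistralai/Mixtral-8x7B-Instruct-v0.1' # option name and model are illustrative
```

```shell
# .env
LLM_PROVIDER_API_KEY=your_api_key_here
```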
Usage
Injecting the AI Service
Inject `AiServiceInterface` into your services or controllers:
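A sketch of constructor injection; the interface's namespace is an assumption based on the vendor name, not confirmed by this excerpt.

```php
<?php

namespace App\Controller;

use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface; // namespace assumed

class ChatController
{
    // Symfony's autowiring will supply the configured AI service here.
    public function __construct(
        private AiServiceInterface $aiService,
    ) {
    }
}
```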
Generating Responses
Use the `generate` method to send prompts and receive responses:
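A hedged sketch of calling `generate()`. The signature (a prompt string in, a response object out) is inferred from the surrounding text, and the accessor on the response is illustrative only.

```php
<?php
// Inside a service or controller that received AiServiceInterface:
$response = $this->aiService->generate('Explain dependency injection in one sentence.');

// Accessing the generated text; the accessor name is an assumption:
$text = $response->getData()['content'] ?? '';
```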
Changing Output Type
You can change the output type to `DynamicAiResponse` for more flexible access to API responses:
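A sketch of how switching to `DynamicAiResponse` might look. The mechanism for selecting the output type (shown here as a second argument) and the enum name are assumptions; only the class name `DynamicAiResponse` comes from the text above.

```php
<?php

use Saqqal\LlmIntegrationBundle\Enum\AiResponseType; // hypothetical enum, namespace assumed

// DynamicAiResponse is described as giving flexible access to the raw API payload;
// the way to request it is not shown in this excerpt, so this call shape is illustrative:
$response = $this->aiService->generate($prompt, AiResponseType::Dynamic);

// Dynamic property access into the raw provider payload (assumed):
$text = $response->choices[0]->message->content;
```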
Available AI Clients
LLMIntegrationBundle supports the following AI clients:
- API Together (ApiTogetherClient)
- OpenAI (OpenAiClient)
- Anthropic (AnthropicClient)
- Arliai (ArliaiClient)
- Deepinfra (DeepinfraClient)
- Groq (GroqClient)
- HuggingFace (HuggingFaceClient)
- Mistral (MistralClient)
- OpenRouter (OpenRouterClient)
- Tavily (TavilyClient)
To use a specific client, set `llm_provider` in your configuration to the corresponding provider name.
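For example, to switch to Groq (the exact spelling of each provider key is an assumption derived from the client names above):

```yaml
# config/packages/llm_integration.yaml
llm_integration:
    llm_provider: 'groq'   # selects GroqClient; key spelling assumed
```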
CLI Commands
Generate a new AI service class
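The command name itself is not shown in this excerpt; based on the description, an invocation would look something like this (the command name is an assumption):

```shell
php bin/console llm:create-ai-service   # command name assumed
```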
Follow the prompts to enter the provider name and API endpoint.
List available AI clients
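As above, the command name is not shown here and is assumed:

```shell
php bin/console llm:list-ai-services   # command name assumed
```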
This command lists all available AI clients tagged with the `#[AiClient]` attribute.
Extending the Bundle
To add a new AI provider:
1. Create a new client class extending `AbstractAiClient`:
2. Update your configuration to use the new provider:
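A sketch of both steps. The attribute argument, the overridden method, and the namespaces are assumptions based on the names mentioned above, not confirmed signatures.

```php
<?php

namespace App\Client;

use Saqqal\LlmIntegrationBundle\Attribute\AiClient;      // namespace assumed
use Saqqal\LlmIntegrationBundle\Client\AbstractAiClient; // namespace assumed

#[AiClient('my_provider')] // tags the class so the bundle can discover it
class MyProviderClient extends AbstractAiClient
{
    // Override whatever AbstractAiClient requires; an API-URL getter is a
    // plausible example, not a confirmed method name:
    protected function getApiUrl(): string
    {
        return 'https://api.my-provider.example/v1/chat/completions';
    }
}
```

```yaml
# config/packages/llm_integration.yaml
llm_integration:
    llm_provider: 'my_provider'
```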
Exception Handling
Create an event subscriber to handle `LlmIntegrationExceptionEvent`:
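A sketch of such a subscriber. The event class name comes from the text above; its namespace and the `getException()` accessor are assumptions.

```php
<?php

namespace App\EventSubscriber;

use Psr\Log\LoggerInterface;
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent; // namespace assumed
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LlmExceptionSubscriber implements EventSubscriberInterface
{
    public function __construct(private LoggerInterface $logger)
    {
    }

    public static function getSubscribedEvents(): array
    {
        return [LlmIntegrationExceptionEvent::class => 'onLlmException'];
    }

    public function onLlmException(LlmIntegrationExceptionEvent $event): void
    {
        // getException() is a plausible accessor, not confirmed by this excerpt:
        $this->logger->error('LLM call failed', ['exception' => $event->getException()]);
    }
}
```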
Testing
Run the test suite:
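Assuming a standard PHPUnit setup (the bundle's actual test script is not shown here):

```shell
composer install
vendor/bin/phpunit   # assumes a standard PHPUnit configuration
```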
License
This bundle is released under the MIT License. See the LICENSE file for details.
Author
Abdelaziz Saqqal - LinkedIn - Portfolio
Contributing
Contributions are welcome! Please fork the repository and submit a pull request with your changes.
Documentation
For more detailed documentation, please visit our [Wiki]().
Dependencies
- symfony/framework-bundle: ^6.0 || ^7.0
- symfony/http-client: ^6.0 || ^7.0
- symfony/monolog-bundle: ^3.10
- symfony/yaml: ^6.0 || ^7.0