Laravel Prompt Manager
A Laravel package built for the next generation of LLM engineers: seamlessly manage, version, and test your AI prompts with the same rigor you bring to your code. Designed for developers who want to apply real software engineering principles to prompt engineering and build AI features that scale with confidence.
Table of Contents
- Features
- Installation
- Basic Usage
- Generating and Using Prompts
- Advanced Usage
- Model-Specific Versions
- Working with Existing Template Storage
- A/B Testing Made Easy
- Context-Aware Version Selection
- Custom Token Counting
- Best Practices
- License
Features
- Prompt versioning
- Easy A/B testing with random selection
- Built-in token counting
- Support for model-specific prompt variants
- Simple testing and maintenance
- Fluent and intuitive API
Installation
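The install command was stripped from this page; the standard Composer command for this package name is:

```bash
composer require prismaticoder/laravel-prompt-manager
```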
Basic Usage
Generate a new prompt class:
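The original snippet is missing here. Assuming the package registers an Artisan generator following Laravel's `make:` conventions (the command name below is a guess, not confirmed by this page):

```bash
php artisan make:prompt ProductDescriptionPrompt
```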
This will create a new `ProductDescriptionPrompt` class in the `App/Prompts` directory.
Generating and Using Prompts
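The code examples for this section did not survive on this page. The sketch below shows the general shape such a prompt class might take; the `BasePrompt` parent, the `versions()` method, and the `make()`/`generate()` calls are assumptions for illustration, not the package's confirmed API:

```php
use Prismaticoder\LaravelPromptManager\BasePrompt; // hypothetical base class

class ProductDescriptionPrompt extends BasePrompt
{
    // Assumed structure: each version maps a name to a template-producing closure.
    public function versions(): array
    {
        return [
            'v1' => fn () => "Write a short product description for: {$this->product}",
            'v2' => fn () => "You are a copywriter. Describe this product: {$this->product}",
        ];
    }
}

// Hypothetical usage: build the prompt with its inputs and render the selected version.
$prompt = ProductDescriptionPrompt::make(['product' => 'ergonomic chair'])->generate();
```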
Advanced Usage
Model-Specific Versions
Create model-specific prompts and dynamically select versions:
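The example code is missing from this page; one plausible shape, reusing the `versionSelector()` hook this README mentions later (the version names and `$this->model` context property are assumptions):

```php
// Hypothetical: keep one prompt variant per target model and pick by name.
public function versions(): array
{
    return [
        'v1-gpt4'   => fn () => $this->detailedInstructionTemplate(),
        'v1-claude' => fn () => $this->xmlStructuredTemplate(),
    ];
}

public function versionSelector(): string
{
    // Select the variant that matches the model being called.
    return $this->model === 'gpt-4' ? 'v1-gpt4' : 'v1-claude';
}
```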
Working with Existing Template Storage
If you already store prompt templates in a database or registry, you can easily integrate them:
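The integration example was stripped from this page. As an illustrative sketch, a version closure can load its template body from wherever you already keep it; the `PromptTemplate` Eloquent model below is hypothetical:

```php
// Hypothetical: resolve the template text from a database table at render time.
public function versions(): array
{
    return [
        'db-latest' => fn () => PromptTemplate::query()
            ->where('name', 'product_description')
            ->latest()
            ->value('body'),
    ];
}
```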
A/B Testing Made Easy
Easily run A/B tests on your prompts by enabling random version selection:
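The original snippet is not shown on this page. The package advertises built-in random selection; one plausible way to express it via the `versionSelector()` hook mentioned below (the exact mechanism in the package may differ):

```php
// Hypothetical: return a random version name so traffic is split across variants.
public function versionSelector(): string
{
    return collect(array_keys($this->versions()))->random();
}
```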
Context-Aware Version Selection
You can override the default `versionSelector()` method to select prompt versions based on relevant context:
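The accompanying example is missing here; a minimal sketch of a context-aware override (the version names and `$this->user` property are assumptions):

```php
public function versionSelector(): string
{
    // Hypothetical context check: premium users get the more elaborate prompt.
    return $this->user->isPremium() ? 'v2-detailed' : 'v1-basic';
}
```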
Custom Token Counting
The default token counting strategy is a simple estimation (text length divided by 4, based on OpenAI's GPT tokenizer approximation). To improve accuracy, you can override it with a custom token counter like Tiktoken:
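The override example was stripped from this page. The default described above is simply `strlen($text) / 4`; the sketch below swaps in the community `yethee/tiktoken` PHP port for a real BPE count (the `countTokens()` method name on the prompt class is an assumption):

```php
use Yethee\Tiktoken\EncoderProvider;

public function countTokens(string $text): int
{
    // Encode with the model's actual tokenizer instead of the length/4 estimate.
    $encoder = (new EncoderProvider())->getForModel('gpt-4');

    return count($encoder->encode($text));
}
```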
Best Practices
- Abstract Complex Prompts: Break down large prompts into smaller methods.
- Semantic Versioning: Use clear naming conventions (e.g., `v1-gpt4`, `v1-formal`).
- Testing: Use built-in random selection for effective A/B testing.
License
The MIT License (MIT). See License File.