LLM Agents Prompt Generator
This package provides a flexible and extensible system for generating chat prompts with all required system and user messages for LLM agents. It uses an interceptor-based approach to expand generator abilities.
Installation
You can install the package via Composer:
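```bash
composer require llm-agents/prompt-generator
```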
Setup in Spiral Framework
To get the prompt generator up and running in your Spiral Framework project, you need to register its bootloader.
Here's how:
- Open up your `app/src/Application/Kernel.php` file and add the bootloader like this:
- Create a bootloader for the prompt generator: a file named `PromptGeneratorBootloader.php` in your `app/src/Application/Bootloader` directory:
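The bootloader wires the pipeline and its interceptors into the container. Here is a sketch following Spiral's bootloader conventions; the interceptor namespaces and the exact `PromptGeneratorPipeline` wiring are assumptions based on the classes this package documents, so check the package source for the real API:

```php
<?php
// app/src/Application/Bootloader/PromptGeneratorBootloader.php
// Sketch only: the namespaces below are assumptions based on the package docs.

namespace App\Application\Bootloader;

use Spiral\Boot\Bootloader\Bootloader;
use LLM\Agents\PromptGenerator\PromptGeneratorPipeline;
use LLM\Agents\PromptGenerator\Interceptors\InstructionGenerator;
use LLM\Agents\PromptGenerator\Interceptors\AgentMemoryInjector;
use LLM\Agents\PromptGenerator\Interceptors\LinkedAgentsInjector;
use LLM\Agents\PromptGenerator\Interceptors\UserPromptInjector;

final class PromptGeneratorBootloader extends Bootloader
{
    protected const SINGLETONS = [
        PromptGeneratorPipeline::class => [self::class, 'promptGenerator'],
    ];

    protected function promptGenerator(
        LinkedAgentsInjector $linkedAgentsInjector,
    ): PromptGeneratorPipeline {
        $pipeline = new PromptGeneratorPipeline();

        return $pipeline->withInterceptor(
            new InstructionGenerator(),
            new AgentMemoryInjector(),
            $linkedAgentsInjector,
            new UserPromptInjector(),
        );
    }
}
```

Then register the class in the `LOAD` constant of `app/src/Application/Kernel.php`:

```php
protected const LOAD = [
    // ...
    \App\Application\Bootloader\PromptGeneratorBootloader::class,
];
```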
And that's it! Your Spiral app is now ready to use the prompt generator.
Usage
Here's an example of how to initialize the prompt generator and generate a prompt:
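A sketch of typical usage follows; the `generate()` signature and its named arguments are assumptions about the package's API, so treat this as illustrative rather than authoritative:

```php
<?php
// Illustrative only: the generate() argument names are assumptions;
// consult the package source for the exact signature.

use LLM\Agents\PromptGenerator\PromptGeneratorPipeline;
use LLM\Agents\PromptGenerator\Interceptors\InstructionGenerator;
use LLM\Agents\PromptGenerator\Interceptors\AgentMemoryInjector;
use LLM\Agents\PromptGenerator\Interceptors\UserPromptInjector;

$pipeline = (new PromptGeneratorPipeline())->withInterceptor(
    new InstructionGenerator(),
    new AgentMemoryInjector(),
    new UserPromptInjector(),
);

$prompt = $pipeline->generate(
    prompt: $basePrompt,   // your initial prompt instance
    userPrompt: 'How do I reset my password?',
    agent: $agent,         // the agent the prompt is generated for
    context: $context,     // your PromptContextInterface implementation
);
```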
Interceptors
The package comes with several built-in interceptors:
InstructionGenerator
This interceptor adds the agent's instruction to the prompt. It includes important rules like responding in markdown format and thinking before responding to the user.
AgentMemoryInjector
This interceptor adds the agent's memory to the prompt. It includes both static memory (defined when creating the agent) and dynamic memory (which can be updated during the conversation).
LinkedAgentsInjector
This interceptor adds information about linked agents to the prompt. It provides details about other agents that the current agent can call for help, including their keys, descriptions, and output schemas.
UserPromptInjector
This interceptor adds the user's input to the prompt as a user message.
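Under the hood, the interceptors above form a chain: each one appends its messages to the prompt-in-progress and delegates to the next. The following is a minimal, self-contained PHP sketch of that pattern, using simplified stand-ins rather than the package's real classes:

```php
<?php
// Minimal, self-contained sketch of the interceptor pattern — simplified
// stand-ins, NOT the package's real classes. Each interceptor appends its
// messages and delegates to $next.

interface Interceptor
{
    /** @param callable(array): array $next */
    public function generate(array $messages, callable $next): array;
}

final class InstructionStep implements Interceptor
{
    public function generate(array $messages, callable $next): array
    {
        $messages[] = ['role' => 'system', 'content' => 'Respond in markdown.'];
        return $next($messages);
    }
}

final class UserPromptStep implements Interceptor
{
    public function __construct(private string $prompt)
    {
    }

    public function generate(array $messages, callable $next): array
    {
        $messages[] = ['role' => 'user', 'content' => $this->prompt];
        return $next($messages);
    }
}

final class Pipeline
{
    /** @var Interceptor[] */
    private array $interceptors = [];

    public function withInterceptor(Interceptor ...$interceptors): self
    {
        $clone = clone $this;
        $clone->interceptors = [...$clone->interceptors, ...$interceptors];
        return $clone;
    }

    public function generate(array $messages = []): array
    {
        // Fold the interceptors into one nested callable, outermost first.
        $next = static fn(array $m): array => $m;
        foreach (array_reverse($this->interceptors) as $interceptor) {
            $next = static fn(array $m): array => $interceptor->generate($m, $next);
        }
        return $next($messages);
    }
}

$messages = (new Pipeline())
    ->withInterceptor(new InstructionStep(), new UserPromptStep('Hello!'))
    ->generate();
// $messages: the system instruction first, then the user message
```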
Creating Custom Interceptors
You can create custom interceptors by implementing `LLM\Agents\PromptGenerator\PromptInterceptorInterface`:
Let's create a `ContextAwarePromptInjector` that adds relevant context to the prompt based on the current time of day and user preferences. This example demonstrates how to build a more sophisticated interceptor that interacts with external services and modifies the prompt accordingly.
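A sketch of such an interceptor; the `generate()` signature, the `InterceptorHandler` type, the `MessagePrompt` helper, and the `UserPreferencesService` are all assumptions made for illustration:

```php
<?php
// Hypothetical example: the interface signature, InterceptorHandler,
// MessagePrompt helper, and UserPreferencesService are assumptions.

use LLM\Agents\PromptGenerator\InterceptorHandler;
use LLM\Agents\PromptGenerator\PromptGeneratorInput;
use LLM\Agents\PromptGenerator\PromptInterceptorInterface;
use LLM\Agents\LLM\Prompt\Chat\MessagePrompt;
use LLM\Agents\LLM\Prompt\Chat\PromptInterface;

final class ContextAwarePromptInjector implements PromptInterceptorInterface
{
    public function __construct(
        private UserPreferencesService $preferences, // hypothetical external service
    ) {}

    public function generate(PromptGeneratorInput $input, InterceptorHandler $next): PromptInterface
    {
        // Determine the current part of the day.
        $hour = (int) (new \DateTimeImmutable())->format('G');
        $timeOfDay = match (true) {
            $hour < 12 => 'morning',
            $hour < 18 => 'afternoon',
            default => 'evening',
        };

        // Pull per-user preferences from the external service (hypothetical call).
        $tone = $this->preferences->toneFor($input->context);

        // Append a system message with the extra context, then delegate.
        $input = $input->withPrompt(
            $input->prompt->withAddedMessage(
                MessagePrompt::system("It is currently {$timeOfDay}. Preferred tone: {$tone}."),
            ),
        );

        return $next($input);
    }
}
```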
Then, add your custom interceptor to the pipeline:
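For example (assuming `withInterceptor` returns a new pipeline instance, as an immutable builder typically does):

```php
$pipeline = $pipeline->withInterceptor(
    new ContextAwarePromptInjector($preferences),
);
```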
Check out this UML sequence diagram to see how the prompt generation process works with the interceptors:
Implementing PromptContextInterface
The `PromptGeneratorInput` includes a `context` property of type `PromptContextInterface`. This interface allows you to pass custom context data to your interceptors. To use it effectively, you need to create your own implementation of this interface.
Here's an example of how you might implement the `PromptContextInterface`:
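This sketch assumes `PromptContextInterface` is a marker interface with no required methods; if it declares methods, implement those as well:

```php
<?php
use LLM\Agents\PromptGenerator\PromptContextInterface;

// Example context object; property names are illustrative.
final class UserContext implements PromptContextInterface
{
    public function __construct(
        public readonly string $userId,
        public readonly string $locale,
        public readonly array $preferences = [],
    ) {}
}
```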
Then, when generating a prompt, you would pass an instance of your custom context:
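For example (the `generate()` argument names are assumptions about the package's API):

```php
$prompt = $pipeline->generate(
    prompt: $basePrompt,
    userPrompt: 'What can you do?',
    agent: $agent,
    context: new UserContext(userId: 'u-123', locale: 'en_US'),
);
```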
In your custom interceptors, you can then access this context data:
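Inside an interceptor, the context arrives on the input object (assuming a public `context` property, per the description above):

```php
public function generate(PromptGeneratorInput $input, InterceptorHandler $next): PromptInterface
{
    $context = $input->context;

    if ($context instanceof UserContext) {
        // Use $context->locale, $context->preferences, etc. to shape messages.
    }

    return $next($input);
}
```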
By implementing your own `PromptContextInterface`, you can pass any necessary data from your application to your interceptors, allowing for highly customized and context-aware prompt generation.
Want to help out? 🤝
We love contributions! If you've got ideas to make this package even cooler, here's how you can chip in:
- Fork the repo
- Make your changes
- Create a new Pull Request
Just make sure your code is clean, well-commented, and follows PSR-12 coding standards.
License 📄
This project is licensed under the MIT License - see the LICENSE file for details.
That's all, folks! If you've got any questions or run into any trouble, don't hesitate to open an issue.