Download the PHP package assistant-engine/open-functions-core without Composer

On this page you can find all versions of the PHP package assistant-engine/open-functions-core. You can download or install these versions without Composer; dependencies are resolved automatically.

FAQ

After the download, you only have to add a single include, require_once('vendor/autoload.php');, and then import the classes with use statements.

Example:
If you use only one package, a project is not needed. But if you use more than one package, you cannot import the classes with use statements without a project.

In general, it is recommended to always use a project when downloading your libraries, since an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories. In that case, credentials are needed to access them. If a package comes from a private repository, please use the auth.json textarea to insert your credentials. You can look here for more information.

  • Some hosting environments are not accessible via a terminal or SSH, so Composer cannot be used there.
  • Using Composer can be complicated, especially for beginners.
  • Composer needs a lot of resources, which are sometimes not available on a simple webspace.
  • If you are using private repositories, you don't need to share your credentials. You can set everything up on our site and then provide a simple download link to your team members.
  • Simplify your Composer build process. Use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose your credentials for private repositories.

Information about the package open-functions-core

![Preview](media/preview.png)

OpenFunctions Core

This library provides a set of primitives that simplify LLM calling. It offers an easy way to define messages and message lists, create tool definitions, and execute tool calls. By abstracting these core functionalities, OpenFunctions Core helps you reduce boilerplate code and quickly integrate advanced tool-calling capabilities into your LLM-powered applications.

Key Features

Installation

Install the package via Composer:
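Based on the package name shown on this page, that is:

```bash
composer require assistant-engine/open-functions-core
```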

Usage

You can use this library for the following challenges:

Messages

Based on the OpenAI schema, the available message types are exposed as primitives. You can use these as building blocks to structure your conversation. The following example shows how to define an array of messages, add them to a MessageList, and convert that list to an array for use in an LLM call.
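A minimal sketch of that flow; the class names, namespaces, and the addMessage()/toArray() methods below are assumptions for illustration and may differ in the package:

```php
<?php

require_once 'vendor/autoload.php';

// NOTE: the namespaces and message primitive names below are assumptions;
// check the package for the exact classes it exposes.
use AssistantEngine\OpenFunctions\Core\Models\Messages\DeveloperMessage;
use AssistantEngine\OpenFunctions\Core\Models\Messages\UserMessage;
use AssistantEngine\OpenFunctions\Core\Models\Messages\MessageList;

// Define an array of messages based on the OpenAI message schema.
$messages = [
    new DeveloperMessage('You are a helpful weather assistant.'),
    new UserMessage('What is the weather like in Berlin?'),
];

// Add them to a MessageList ...
$messageList = new MessageList();
foreach ($messages as $message) {
    $messageList->addMessage($message);
}

// ... and convert the list to a plain array for use in an LLM call.
$payload = $messageList->toArray();
```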

Tool Calling

In order to enable tool calling, the common challenge is to both define the function definitions and expose the methods to make tool calling possible. To address this, the concept of an OpenFunction is introduced. Inside an OpenFunction, you generate the function definitions and implement the methods that the LLM can invoke. The AbstractOpenFunction class provides convenience methods such as callMethod() to wrap the output in a standardized response and handle errors consistently.

Function Definitions

Each class that extends the abstract open function must implement the generateFunctionDefinitions() method. This method is responsible for describing the functions that your tool exposes. To build these descriptions, you can use the provided helper classes:

For example, here’s a simple implementation:
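A minimal sketch; the AbstractOpenFunction namespace is an assumption, and the definition is shown as a raw OpenAI-style tool schema where you would normally use the package's helper classes:

```php
<?php

use AssistantEngine\OpenFunctions\Core\Contracts\AbstractOpenFunction; // namespace is an assumption

class WeatherOpenFunction extends AbstractOpenFunction
{
    public function __construct(private string $unit = 'celsius')
    {
    }

    /**
     * Describe the functions this tool exposes to the LLM.
     * Shown here as a raw OpenAI-style tool definition; the package's
     * helper classes can be used to build the same structure.
     */
    public function generateFunctionDefinitions(): array
    {
        return [[
            'type'     => 'function',
            'function' => [
                'name'        => 'getWeather',
                'description' => 'Returns the current weather for a given city in ' . $this->unit . '.',
                'parameters'  => [
                    'type'       => 'object',
                    'properties' => [
                        'city' => ['type' => 'string', 'description' => 'Name of the city.'],
                    ],
                    'required'   => ['city'],
                ],
            ],
        ]];
    }

    // The callable getWeather() method is shown in the "Function Calling" section below.
}
```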

Once you have implemented your open functions (such as the WeatherOpenFunction), you can generate their function definitions and pass them to the OpenAI client as the tools parameter. For example:
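A sketch assuming the openai-php/client package is used as the OpenAI client; the toArray() conversion of the message list is an assumption as well:

```php
$weather = new WeatherOpenFunction();

$client = OpenAI::client(getenv('OPENAI_API_KEY'));

$response = $client->chat()->create([
    'model'    => 'gpt-4o',
    'messages' => $messageList->toArray(),
    'tools'    => $weather->generateFunctionDefinitions(),
]);
```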

Open Function Registry

In some scenarios, you may want to use the same OpenFunction more than once with different configurations, or you might have different OpenFunctions that define methods with the same name. To handle these cases, the library provides an OpenFunction Registry, which itself is also an OpenFunction. This registry allows you to register each function under a unique namespace, ensuring that even if functions share the same underlying method name, they remain distinct.

For example, suppose you want one WeatherOpenFunction instance to operate in Celsius (the default) and another in Fahrenheit. You could register them as follows:
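A sketch of that registration; the registry class name OpenFunctionRegistry and the registerOpenFunction() signature (namespace, description, instance) are assumptions and may differ in the package:

```php
$registry = new OpenFunctionRegistry();

// The same OpenFunction, registered twice under different namespaces.
$registry->registerOpenFunction(
    'celsius',                         // namespace
    'Weather lookups in Celsius.',     // description for the LLM
    new WeatherOpenFunction('celsius')
);

$registry->registerOpenFunction(
    'fahrenheit',
    'Weather lookups in Fahrenheit.',
    new WeatherOpenFunction('fahrenheit')
);

// The registry is itself an OpenFunction, so its definitions can be
// passed to the LLM as the tools parameter like any other OpenFunction.
$tools = $registry->generateFunctionDefinitions();
```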

In this example, the registry ensures that even though both WeatherOpenFunction instances share the same method names (like getWeather), they are uniquely identified by their namespaces (celsius and fahrenheit). This separation allows you to call the appropriate function based on the desired temperature unit without any naming collisions.

Meta Mode

The above example ensures that even if multiple functions are registered under similar method names, each one is uniquely namespaced and easily distinguishable. In some scenarios, however, the total number of methods you want to provide to the LLM may exceed the maximum number it can ingest in a single call. To address this, the OpenFunction Registry offers a meta-mode.

In meta-mode, only three core registry functions are initially registered to the LLM. These functions give the LLM the ability to activate and deactivate additional methods on the fly. This approach keeps the number of tool definitions sent to the LLM both limited and understandable. The following image demonstrates an activated meta-mode within the Filament Assistant Plugin:

The three registry functions provided in meta-mode are:

| Method | Description | Parameters |
| --- | --- | --- |
| listFunctions | Lists all available functions grouped by namespace, including their names, descriptions, and namespace details. | None |
| activateFunction | Activates one or more functions from the registry. Duplicate activations are ignored. | functionNames: array of strings (required). An array of valid function names to activate. |
| deactivateFunction | Deactivates one or more functions from the registry. | functionNames: array of strings (required). An array of valid function names to deactivate. |

Registry Presenter

It can be a good idea to inform the LLM about the different namespaces and give a little more context. In order to do this, you might want to add a dedicated developer message to the message list that explains the namespaces. To achieve this dynamically, the concept of a Message List Extension is implemented. Extensions implement a specific interface and are invoked when the message list is built, giving you the opportunity to modify or extend the messages.

Message List Extensions

Sometimes it’s useful to add a message that explains context details to the LLM. For instance, you might want to inform the model about the namespaces registered by your tool. To achieve this dynamically, you can create an extension that implements the MessageListExtensionInterface. Once added to the message list, the extension is invoked automatically during the conversion process, allowing you to inject extra messages.

For example, the RegistryPresenter class implements this interface and in its extend() method it prepends a developer message that lists the namespaces:
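Conceptually (this is not the package's actual code), the extension does something like the following; MessageListExtensionInterface and extend() are named above, while prependMessage() and the constructor signature are assumptions:

```php
class RegistryPresenter implements MessageListExtensionInterface
{
    public function __construct(private OpenFunctionRegistry $registry)
    {
    }

    public function extend(MessageList $messageList): void
    {
        // Prepend a developer message that lists the registered namespaces.
        $overview = "Available tool namespaces:\n"
            . "- celsius: Weather lookups in Celsius.\n"
            . "- fahrenheit: Weather lookups in Fahrenheit.";

        $messageList->prependMessage(new DeveloperMessage($overview));
    }
}
```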

To add an extension to your message list, simply use the addExtension() method on your MessageList instance. Here’s how you can register the RegistryPresenter as an extension:
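A minimal sketch (the RegistryPresenter constructor argument and the toArray() conversion are assumptions):

```php
$messageList = new MessageList();
$messageList->addExtension(new RegistryPresenter($registry));

// During conversion the extension is invoked automatically and prepends
// the developer message describing the registered namespaces.
$payload = $messageList->toArray();
```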

Function Calling

Implement all callable methods within your OpenFunction class. Each method should return a string, a text response, a binary response, or a list of responses. The callMethod in the abstract class ensures the output is consistently wrapped.

For example, in WeatherOpenFunction:
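A sketch of the callable method inside the class (dummy data; a real implementation would query a weather service):

```php
class WeatherOpenFunction extends AbstractOpenFunction
{
    // ... constructor and generateFunctionDefinitions() as sketched earlier ...

    /**
     * The method the LLM can invoke via tool calling. Returning a plain
     * string is enough; callMethod() wraps it in a standardized response.
     */
    public function getWeather(string $city): string
    {
        return sprintf('The weather in %s is 22 degrees (%s) and sunny.', $city, $this->unit);
    }
}
```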

To invoke the function:
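A sketch (the shape of the arguments array passed to callMethod() is an assumption):

```php
$weather = new WeatherOpenFunction();

// callMethod() wraps the raw return value in a standardized response
// and handles errors consistently.
$response = $weather->callMethod('getWeather', ['city' => 'Berlin']);
```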

And if you used the registry:
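The call then goes through the registry; the namespaced function name shown below (namespace prefix plus method name) is an assumption about the registry's naming scheme:

```php
// The registry resolves the namespaced name to the registered
// WeatherOpenFunction instance configured for Celsius.
$response = $registry->callMethod('celsius_getWeather', ['city' => 'Berlin']);
```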

If you use it within the LLM loop, it could look something like this:
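A sketch of such a loop, again assuming openai-php/client as the OpenAI client; how the tool result is appended back to the MessageList (addToolResult()) is also an assumption:

```php
$client = OpenAI::client(getenv('OPENAI_API_KEY'));
$tools  = $registry->generateFunctionDefinitions();

do {
    $response = $client->chat()->create([
        'model'    => 'gpt-4o',
        'messages' => $messageList->toArray(),
        'tools'    => $tools,
    ]);

    $choice    = $response->choices[0];
    $toolCalls = $choice->message->toolCalls ?? [];

    foreach ($toolCalls as $toolCall) {
        // Dispatch the tool call through the registry; it routes the namespaced
        // name to the matching OpenFunction and wraps the result.
        $result = $registry->callMethod(
            $toolCall->function->name,
            json_decode($toolCall->function->arguments, true)
        );

        // Feed the result back into the conversation (assumed helper; appending
        // the assistant's tool-call message itself is omitted for brevity).
        $messageList->addToolResult($toolCall->id, $result);
    }
} while (!empty($toolCalls));

// Once no further tool calls are requested, the final answer is in:
$answer = $response->choices[0]->message->content;
```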

Available Open Function Implementations

In addition to creating your own OpenFunction, there are several ready-to-use implementations available. Here’s a quick overview:

More Repositories

We’ve created more repositories to make AI integration even simpler and more powerful! Check them out:

We are a young startup aiming to make it easy for developers to add AI to their applications. We welcome feedback, questions, comments, and contributions. Feel free to contact us at [email protected].

Consultancy & Support

Do you need assistance integrating Filament Assistant into your Laravel Filament application, or help setting it up?
We offer consultancy services to help you get the most out of our package, whether you’re just getting started or looking to optimize an existing setup.

Reach out to us at [email protected].

Contributing

We welcome contributions from the community! Feel free to submit pull requests, open issues, and help us improve the package.

License

This project is licensed under the MIT License. Please see License File for more information.


All versions of open-functions-core with dependencies

Requires PHP version: ^8.2
