Download the PHP package codingwisely/taskallama without Composer
On this page you can find all versions of the PHP package codingwisely/taskallama. It is possible to download/install these versions without Composer. Possible dependencies are resolved automatically.
Package: taskallama
Short Description: Taskallama is a Laravel package that seamlessly integrates with Ollama’s LLM API to empower your applications with AI-driven text generation, task management assistance, and more. Designed for simplicity and scalability, Taskallama brings the power of language models to your Laravel projects.
License: MIT
Homepage: https://github.com/codingwisely/taskallama
Information about the package taskallama
Taskallama: Laravel Integration with Ollama LLM API
Taskallama is a Laravel package that provides seamless integration with Ollama's LLM API. It simplifies generating AI-powered content, from professional task writing to conversational agents, with minimal effort. Whether you're building a task management system, an HR assistant for job postings, or a blog content generator, Taskallama has you covered.
Why did I build it? Simple reason: I wanted an AI helper in our project and task management system at Taskavel.com to help me quickly scaffold tasks. We are also going to use it in another of our SaaS projects, the advanced ATS at Bagel.blue, to make it easy to create job postings.
Features
- Simple API for generating AI responses via the Ollama LLM.
- Supports task creation, conversational AI, embeddings, and more.
- Customizable agent personalities for tailored responses.
- Integration with Laravel Livewire for real-time interactions.
- Configurable options like streaming, model selection, and temperature.
Prerequisites
- Ollama Installation: Taskallama requires Ollama to be installed and running locally on your machine. You can download and install Ollama from their official website.
- Ollama Configuration: By default, Taskallama connects to Ollama at `http://127.0.0.1:11434`. Ensure that Ollama is running and accessible at this address. You can update the `OLLAMA_URL` in the config file if it's hosted elsewhere.
- System Requirements: PHP `^8.3` or higher and Laravel `^11.0` or higher.
Installation
You can install the package via Composer:
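```bash
composer require codingwisely/taskallama
```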
Next, you should publish the package's configuration file:
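The exact command isn't shown on this page; assuming the package follows the spatie/laravel-package-tools defaults (it is listed as a dependency below), it would look like this, though the tag name may differ:

```bash
php artisan vendor:publish --tag="taskallama-config"
```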
This will publish a `taskallama.php` file in your `config` directory, where you can configure your Ollama API key and other settings.
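As a rough idea of what the published file might contain, here is a sketch; the keys (`url`, `model`, `options`) and env variable names are assumptions for illustration, not the actual shipped file:

```php
<?php

// config/taskallama.php (illustrative sketch only, not the real shipped file)
return [
    // Base URL of your Ollama instance (the OLLAMA_URL mentioned in the Prerequisites).
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),

    // Hypothetical defaults for model selection and generation options.
    'model' => env('OLLAMA_MODEL', 'llama3.2'),
    'options' => [
        'temperature' => 0.8,
        'stream' => false,
    ],
];
```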
Usage
Basic Example (non-stream)
Generate a response using a prompt:
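The original snippet isn't included on this page. Here is a minimal sketch, assuming the facade is exposed as `CodingWisely\Taskallama\Facades\Taskallama` and offers a fluent `agent()/prompt()/model()/options()/stream()/ask()` chain; check the package README for the exact API:

```php
<?php

// Illustrative sketch only: the facade namespace and the fluent methods are
// assumptions modelled on similar Ollama client packages.
use CodingWisely\Taskallama\Facades\Taskallama;

$response = Taskallama::agent('You are a concise, professional task writer.')
    ->prompt('Write a task description for adding user avatars to our Laravel app.')
    ->model('llama3.2')
    ->options(['temperature' => 0.5])
    ->stream(false)
    ->ask();

// Assumes the result is an array containing a "response" key, as in Ollama's API.
echo $response['response'] ?? '';
```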
Basic Example (stream)
Generate a streamed response using a prompt:
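Again a sketch under the same assumptions; only `stream(true)` changes, and how the chunks are actually delivered (callback, generator, raw HTTP stream) depends on the package:

```php
<?php

use CodingWisely\Taskallama\Facades\Taskallama;

// Same assumed fluent API as above, with streaming enabled. Treat the
// handling of the streamed output as a placeholder.
$response = Taskallama::agent('You are a concise, professional task writer.')
    ->prompt('Write a detailed task for migrating our CI pipeline.')
    ->model('llama3.2')
    ->stream(true)
    ->ask();
```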
Chat Example
Create a conversational agent:
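A hedged sketch: the message format mirrors Ollama's `/api/chat` endpoint, while the `chat()` method on the facade is an assumption:

```php
<?php

use CodingWisely\Taskallama\Facades\Taskallama;

// Conversation history in the role/content format used by Ollama's chat API.
$messages = [
    ['role' => 'user', 'content' => 'Hi, can you help me plan a sprint?'],
    ['role' => 'assistant', 'content' => 'Sure. What is the sprint goal?'],
    ['role' => 'user', 'content' => 'Ship the new job posting form.'],
];

// Assumption: a chat() method that accepts the message history.
$response = Taskallama::agent('You are a friendly project management assistant.')
    ->model('llama3.2')
    ->chat($messages);
```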
Livewire Integration Example
Integrate Taskallama into a Livewire component for real-time task generation:
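The component below is illustrative, reusing the assumed fluent API from the earlier examples inside a Livewire action:

```php
<?php

namespace App\Livewire;

use CodingWisely\Taskallama\Facades\Taskallama;
use Livewire\Component;

// Illustrative component: the Taskallama calls use the same assumed API
// as the earlier sketches.
class TaskGenerator extends Component
{
    public string $title = '';
    public string $generatedTask = '';

    public function generate(): void
    {
        $result = Taskallama::agent('You are a professional task writer.')
            ->prompt("Write a task description for: {$this->title}")
            ->model('llama3.2')
            ->stream(false)
            ->ask();

        $this->generatedTask = $result['response'] ?? '';
    }

    public function render()
    {
        return view('livewire.task-generator');
    }
}
```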
Embeddings Example
Generate embeddings for advanced search or semantic analysis:
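Another sketch; the `embeddings()` call mirrors Ollama's `/api/embeddings` endpoint and the model name is just an example of an embedding-capable model, both assumptions:

```php
<?php

use CodingWisely\Taskallama\Facades\Taskallama;

// Assumption: an embeddings() helper that returns the embedding vector(s).
$embeddings = Taskallama::model('nomic-embed-text')
    ->embeddings('Taskallama brings Ollama-powered text generation to Laravel.');
```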
Additional Methods
List Local Models
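Assuming a small helper that wraps Ollama's local model listing:

```php
// Assumption: a models() helper that lists locally installed Ollama models.
$models = Taskallama::models();
```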
Retrieve Model Information
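Presumably along these lines, with the method name a guess:

```php
// Assumption: show() returns a model's details (modelfile, parameters, template).
$info = Taskallama::model('llama3.2')->show();
```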
Retrieve Model Settings
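A guess at the corresponding call for settings; the method name is hypothetical:

```php
// Assumption: a dedicated call exposing the model's runtime settings.
$settings = Taskallama::model('llama3.2')->settings();
```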
Pull or Delete a Model
If you're pulling a model, make sure you run this in a background job, as it may take a while to download the model.
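A sketch of both operations, with the method names assumed:

```php
// Assumptions: pull() downloads a model and delete() removes it. Dispatch the
// pull from a queued job, since downloading a model can take a long time.
Taskallama::model('llama3.2')->pull();
Taskallama::model('llama3.2')->delete();
```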
Testing
Run the tests with:
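The command isn't shown here; packages built on the Spatie package skeleton usually expose a Composer script, so it is likely:

```bash
composer test
```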
License
This package is open-source software licensed under the MIT License. Please see the LICENSE.md file for more information.
All versions of taskallama with dependencies
- spatie/laravel-package-tools: ^1.16
- illuminate/contracts: ^10.0 || ^11.0