kauffinger / livewire-chat-kit
A clean streamed chat kit for Laravel Livewire. Built with FluxUI & Prism.
Language: Blade
Type: project
Requires
- php: ^8.2
- laravel/framework: ^12.0
- laravel/tinker: ^2.10.1
- livewire/flux: ^2.1.1
- livewire/volt: ^1.7.0
- prism-php/prism: ^0.68.0
Requires (Dev)
- fakerphp/faker: ^1.23
- laravel/pail: ^1.2.2
- laravel/pint: ^1.18
- laravel/sail: ^1.41
- mockery/mockery: ^1.6
- nunomaduro/collision: ^8.6
- pestphp/pest: ^3.8
- pestphp/pest-plugin-laravel: ^3.2
This package is auto-updated.
Last update: 2025-06-10 20:47:50 UTC
README
A Laravel Starter Kit for building LLM-powered chat interfaces with Livewire, FluxUI, and Prism. Jump right into the LLM party without touching (much) JavaScript, while still getting a chat interface that looks good and feels effortless.
This starter kit provides a clean, simple foundation for creating chat applications. It's designed to get you sending your first chat messages to an LLM instantly.
Features
- Livewire-Powered: Build dynamic interfaces with PHP.
- Streamed Responses: Real-time message streaming from LLMs for a smooth UX.
- FluxUI Components: Beautiful, pre-built UI components for a polished look and feel.
- Prism Integration: The Laravel-way of speaking to LLMs. Easy to use, test, and switch between providers (e.g., OpenAI, Anthropic).
- Minimal JavaScript: Focus on your PHP backend.
- TailwindCSS Styled: Includes a TailwindCSS setup with a typography plugin for rendering markdown.
Installation
You can install this starter kit into a new Laravel application using Composer:
laravel new my-chat-app --using=kauffinger/livewire-chat-kit
After installation, make sure to:
- Run migrations:
php artisan migrate
- Install NPM dependencies:
npm install
- Build assets:
npm run dev
(or npm run build for production)
Getting Started
1. Configure LLM Provider
This starter kit uses Prism to interact with LLMs. By default, it's configured to use OpenAI's gpt-4o-mini. You'll need to add your API key to your .env file:
OPENAI_API_KEY=your-openai-api-key
# OPENAI_ORGANIZATION_ID= (optional)
You can easily switch to other providers supported by Prism (like Anthropic) by modifying the Chat.php component. For example, to use Claude:

```php
// In app/Livewire/Chat.php

use Prism\Prism\Enums\Provider;

// ...

public function runChatToolLoop(): void
{
    $generator = Prism::text()
        ->using(Provider::Anthropic, 'claude-3-opus-20240229') // Example for Claude
        // ->using(Provider::OpenAI, 'gpt-4o-mini') // Default
        ->withSystemPrompt('You are a helpful assistant.')
        ->withMessages(collect($this->messages)->map->toPrism()->all())
        ->asStream();

    // ...
}
```

Remember to add the corresponding API key to your .env file (e.g., ANTHROPIC_API_KEY).
2. Explore the Chat Interface
Navigate to your application's /dashboard route (or wherever you've set up the chat component) to start interacting with the chat interface.
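If you'd rather mount the chat somewhere other than /dashboard, a full-page Livewire component can be routed directly. This is a hedged sketch assuming the component class is App\Livewire\Chat and standard Livewire 3 routing; the path and middleware are illustrative:

```php
// routes/web.php (illustrative; adjust path and middleware to taste)

use App\Livewire\Chat;
use Illuminate\Support\Facades\Route;

Route::get('/chat', Chat::class)
    ->middleware(['auth'])
    ->name('chat');
```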
Core Components
app/Livewire/Chat.php
This is the heart of the chat functionality.
resources/views/livewire/chat.blade.php
This Blade view renders the chat interface.
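To picture how the view and the component connect, here is a minimal sketch of the streaming hookup. The markup and component names are illustrative, not the kit's actual template (which lives in resources/views/livewire/chat.blade.php); wire:stream is the Livewire 3 mechanism that receives chunks sent by stream() on the server:

```blade
<div class="flex flex-col gap-4">
    @foreach ($messages as $message)
        {{-- persisted chat history --}}
        <x-chat.assistant-message :message="$message" />
    @endforeach

    {{-- Livewire swaps this node's content as stream() emits chunks --}}
    <div wire:stream="streamed-message"></div>

    <x-chat.message-input wire:submit="sendMessage" />
</div>
```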
How it Works
1. The user types a message in x-chat.message-input and hits send, which triggers sendMessage() in Chat.php.
2. The user's message is added to the $messages array.
3. The input field is cleared.
4. $this->js('$wire.runChatToolLoop()') is called, which immediately invokes the runChatToolLoop() method. This allows the UI to update with the user's message before the LLM call.
5. runChatToolLoop():
   - Constructs a request to the LLM using Prism, including the system prompt and the current chat history.
   - Sends the request as a stream.
   - As tokens arrive from the LLM:
     - They are appended to a local $message variable.
     - The stream() method sends the accumulated $message (converted to markdown) to the frontend, updating the part of the view listening to streamed-message. This is typically handled by the x-chat.assistant-message component.
   - Once the LLM finishes generating its response, the complete assistant message is added to the $messages array. The temporary streamed display is effectively replaced by the final message in the loop.
Contributing
Contributions are welcome! If you'd like to improve the Livewire Chat Kit, please feel free to:
- Report a bug.
- Suggest a new feature.
- Submit a pull request.
Please visit the GitHub repository to contribute.
License
This project is open-sourced software licensed under the MIT license.