jonathanbossenger / wp-ollama-model-provider
Ollama AI model provider for WordPress AI Client. Enables WordPress to use Ollama models locally or in the cloud.
Type: wordpress-plugin
pkg:composer/jonathanbossenger/wp-ollama-model-provider
Requires
- php: >=8.0
- wordpress/wp-ai-client: ^0.2.1
Requires (Dev): none
This package is auto-updated.
Last update: 2026-02-07 20:18:45 UTC
README
A WordPress plugin that provides local and cloud AI model support (Ollama) for the WordPress AI Client.
Description
WP Ollama Model Provider enables WordPress to use AI models through Ollama, supporting both local installations and Ollama Cloud. This plugin acts as a provider for the WordPress AI Client, making Ollama models accessible to any WordPress plugin that uses the AI Client.
Features
- Flexible Deployment: Support for both local Ollama installations and Ollama Cloud
- Local AI Models: Run AI models locally with Ollama - no cloud API keys required
- Cloud Integration: Use Ollama Cloud for easy access without local installation
- Automatic Model Detection: Discovers all available Ollama models on your system or cloud account
- Simple Configuration: Easy settings page at Settings > Ollama AI Models
- Model Selection: Choose which Ollama model to use from a dropdown
- Model Caching: Efficient 5-minute cache for model discovery
- Public API: Other plugins can easily check for and use your selected model
- Privacy-First: Local mode ensures text generation happens entirely on your machine
Supported Providers
- Ollama (current)
- Local installation
- Ollama Cloud
- Future: LocalAI, LM Studio, and other local providers
Requirements
- PHP: 8.0 or higher
- WordPress: 6.0 or higher
- Dependencies: wordpress/wp-ai-client ^0.2.1
- For local mode: Ollama installed and running locally
- For cloud mode: Ollama Cloud API key
Installation
Via Composer (Recommended)
# Install in your WordPress project
composer require jonathanbossenger/wp-ollama-model-provider
The plugin will be installed to wp-content/plugins/wp-ollama-model-provider/ (or web/app/plugins/ for Bedrock).
Activate it through the WordPress admin.
For Bedrock/Roots.io users: The package will automatically install to the correct plugins directory.
For traditional WordPress installations:
Run composer from your WordPress root with proper installer-paths configured, or run it directly in wp-content/plugins/.
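For reference, a minimal installer-paths mapping in your root composer.json might look like the sketch below. It uses the standard composer/installers extra key; the wp-content/plugins/ path is an assumption for a typical root-level WordPress install and should be adjusted to your layout:

```json
{
    "extra": {
        "installer-paths": {
            "wp-content/plugins/{$name}/": ["type:wordpress-plugin"]
        }
    }
}
```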
Manual Installation
- Download the latest release from GitHub Releases
- Upload to /wp-content/plugins/wp-ollama-model-provider/
- Run composer install --no-dev in the plugin directory
- Activate the plugin through the WordPress admin
Setup Ollama
Local Setup
- Install Ollama from https://ollama.com
- Pull at least one model:
ollama pull llama3.2
- Verify Ollama is running:
curl http://localhost:11434/api/tags
Ollama Cloud Setup
- Sign up for Ollama Cloud at https://ollama.com/cloud
- Get your API key from the Ollama Cloud dashboard
Configuration
For Local Ollama:
- Navigate to Settings > Ollama AI Models in WordPress admin
- Select Local as the deployment mode
- Select your preferred Ollama model from the dropdown
- Click Save Settings
For Ollama Cloud:
- Navigate to Settings > Ollama AI Models in WordPress admin
- Select Ollama Cloud as the deployment mode
- Enter your Ollama Cloud API key
- Select your preferred model from the dropdown
- Click Save Settings
Your selected model is now available to all WordPress plugins that use the AI Client.
Usage
For End Users
Once configured, the plugin runs automatically in the background. Any WordPress plugin that uses the WordPress AI Client can detect and use your selected Ollama model.
For Plugin Developers
Check for Selected Model
if ( function_exists( 'wp_ollama_model_provider_get_selected_model' ) ) {
	$selected_model = wp_ollama_model_provider_get_selected_model( 'ollama' );
	if ( ! empty( $selected_model ) ) {
		// Use the selected Ollama model.
	}
}
Use the Selected Model
// Check for selected Ollama model from wp-ollama-model-provider.
if ( function_exists( 'wp_ollama_model_provider_get_selected_model' ) ) {
	$selected_ollama_model = wp_ollama_model_provider_get_selected_model( 'ollama' );

	if ( ! empty( $selected_ollama_model ) && class_exists( 'WpOllamaModelProvider\Providers\Ollama\OllamaProvider' ) ) {
		$model    = \WpOllamaModelProvider\Providers\Ollama\OllamaProvider::model( $selected_ollama_model );
		$registry = \WordPress\AiClient\AiClient::defaultRegistry();
		$registry->bindModelDependencies( $model );

		return \WordPress\AI_Client\AI_Client::prompt( $prompt )
			->using_model( $model )
			->generate_text();
	}
}

// Fall back to automatic provider/model selection.
return \WordPress\AI_Client\AI_Client::prompt( $prompt )->generate_text();
Public API Functions
wp_ollama_model_provider_is_provider_registered( $provider_slug )
- Check if a provider (e.g., 'ollama') is registered
- Returns: bool

wp_ollama_model_provider_has_settings_page()
- Check if the settings page is available
- Returns: bool

wp_ollama_model_provider_get_selected_model( $provider_slug )
- Get the selected model for a provider (e.g., 'ollama')
- Returns: string (model ID), or an empty string if none is selected
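Taken together, a consuming plugin might guard its integration with these checks. The helper name below is hypothetical, shown only as a sketch of how the public API functions could be combined:

```php
<?php
// Hypothetical helper in a consuming plugin (the name is illustrative).
function my_plugin_ollama_ready() {
	// Bail early if the provider plugin is not active.
	if ( ! function_exists( 'wp_ollama_model_provider_is_provider_registered' ) ) {
		return false;
	}

	// Confirm the Ollama provider is registered and a model is selected.
	return wp_ollama_model_provider_is_provider_registered( 'ollama' )
		&& '' !== wp_ollama_model_provider_get_selected_model( 'ollama' );
}

if ( my_plugin_ollama_ready() ) {
	$model_id = wp_ollama_model_provider_get_selected_model( 'ollama' );
	// Use $model_id with the AI Client, as in the usage examples above.
}
```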
Filters
wp_ai_client_ollama_base_url
Change the Ollama base URL (default: http://localhost:11434).
add_filter( 'wp_ai_client_ollama_base_url', function () {
	return 'http://192.168.1.100:11434'; // Remote Ollama server.
} );
wp_ai_client_default_request_timeout
Adjust timeout for slower models (default: varies by plugin).
add_filter( 'wp_ai_client_default_request_timeout', function () {
	return 120; // 2 minutes.
} );
Documentation
- Quick Start Guide - Get up and running in 5 minutes
- Ollama Models Reference - Model recommendations and comparisons
- Integration Guide - Technical integration details
Troubleshooting
No models appearing in dropdown
- Verify Ollama is running: ollama list
- Pull a model if needed: ollama pull llama3.2
- Use the "Refresh Model List" button on the settings page
Cannot connect to Ollama
- Check if Ollama is running: curl http://localhost:11434/api/tags
- Start Ollama if needed: ollama serve
Slow generation times
- Use a smaller model (e.g., llama3.2:1b instead of llama2:13b)
- Increase the timeout using the wp_ai_client_default_request_timeout filter
- See OLLAMA-MODELS.md for model recommendations
Enable debug logging
// In wp-config.php
define( 'WP_DEBUG', true );
define( 'WP_DEBUG_LOG', true );
define( 'WP_DEBUG_DISPLAY', false );
Check wp-content/debug.log for error messages.
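With logging enabled, a consuming plugin can also write its own diagnostic lines to the same log using the standard error_log() pattern (a sketch, assuming the provider plugin is active):

```php
<?php
if ( defined( 'WP_DEBUG' ) && WP_DEBUG && function_exists( 'wp_ollama_model_provider_get_selected_model' ) ) {
	// Record which Ollama model is currently selected.
	error_log( 'wp-ollama-model-provider: selected model is ' . wp_ollama_model_provider_get_selected_model( 'ollama' ) );
}
```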
Limitations
- Text Generation Only: Ollama only supports text generation, not image generation
- Local Models: Requires Ollama to be installed and running locally (or on your network)
- Resource Intensive: Larger models require significant RAM and CPU/GPU resources
Contributing
Contributions are welcome! Please feel free to submit issues or pull requests.
Development Setup
- Clone the repository
- Run composer install
- Ensure Ollama is installed and running
- Activate the plugin in a WordPress development environment
Coding Standards
This plugin follows WordPress Coding Standards. Check your code:
vendor/bin/phpcs --standard=WordPress wp-ollama-model-provider.php includes/
License
GPL-2.0-or-later
Support
- Issues: GitHub Issues
- Documentation: See the docs/ directory
- Ollama: https://ollama.com
Credits
Created by Jonathan Bossenger
Built on top of: