bluefly/llm_platform

Enterprise AI coordination platform with comprehensive AI provider support, vector databases, and intelligent workflows built on Drupal 11 standards.

Installs: 42

Dependents: 0

Suggesters: 0

Security: 0

Type: drupal-recipe

0.1.0 2025-07-21 11:32 UTC

This package is auto-updated.

Last update: 2025-07-21 11:33:34 UTC


README

A comprehensive Drupal 10.3+/11 recipe that creates an enterprise-grade AI coordination platform with multi-provider support, vector databases, intelligent workflows, and enterprise security.

Features

🤖 Multi-Provider AI Integration

  • LiteLLM (unified provider interface) - DEFAULT
  • Ollama (local, privacy-first)
  • OpenAI (GPT-4, GPT-3.5)
  • Anthropic (Claude 3)
  • LangChain (orchestration)
  • CrewAI (multi-agent systems)
  • HuggingFace (model hub)

LiteLLM Integration

The recipe now uses LiteLLM as the default AI provider, offering unified access to multiple AI services:

# Default configuration
default_provider: 'litellm'
base_url: 'http://localhost:4000'
default_model: 'ollama/llama3.2'

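On the LiteLLM side, the proxy itself reads a config.yaml describing which models it fronts. A minimal sketch (the model names and the Ollama port are assumptions matching the defaults above; the fallback syntax varies slightly between LiteLLM versions):

```yaml
# LiteLLM proxy config.yaml (sketch)
model_list:
  - model_name: llama3.2
    litellm_params:
      model: ollama/llama3.2        # route to the local Ollama server
      api_base: http://localhost:11434
  - model_name: gpt-4
    litellm_params:
      model: openai/gpt-4
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  # If llama3.2 fails, retry the request against gpt-4.
  fallbacks: [{"llama3.2": ["gpt-4"]}]
```

With this in place, Drupal only ever talks to the proxy at base_url, and provider selection and failover happen inside LiteLLM.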
Benefits:

  • Single interface for all AI providers
  • Automatic failover between providers
  • Unified authentication and error handling
  • Reduced custom code maintenance

🔐 Enterprise Security

  • Built on secure_drupal foundation
  • FedRAMP, HIPAA, PCI DSS, SOC2 compliance ready
  • Advanced encryption and key management
  • Comprehensive audit logging

🔍 Intelligent Search

  • Vector embeddings with semantic search
  • Multiple vector database support
  • Search API integration
  • Faceted search capabilities

🤝 AI Agents & Automation

  • Multi-agent orchestration with CrewAI
  • Workflow automation with ECA
  • Content generation and moderation
  • Research and analysis agents

🔗 Model Context Protocol (MCP)

  • WebSocket, HTTP, and stdio transports
  • Community registry browser integration
  • Tool integration for enhanced capabilities
  • LangChain tools compatibility

Requirements

  • Drupal: 10.3+ or 11.0+
  • PHP: 8.1+
  • Node.js: 20.0.0+ (for npm packages)
  • Composer: 2.0+

Optional Requirements

  • Ollama: For local AI execution (recommended)
  • Redis: For enhanced caching and performance
  • Solr: For advanced search capabilities

Installation

Method 1: Using the Recipe (Recommended)

# Install the recipe and its dependencies
composer require bluefly/llm_platform

# Apply the recipe to your Drupal site
php core/scripts/drupal recipe path/to/llm_platform

Method 2: With Input Variables

# Install with custom configuration
php core/scripts/drupal recipe path/to/llm_platform \
  --site_name="My AI Platform" \
  --admin_email="admin@mycompany.com" \
  --default_provider="ollama"

Method 3: Development Installation

# Clone the repository
git clone https://github.com/bluefly/llm-platform-recipe.git

# Install dependencies
cd llm-platform-recipe
composer install

# Apply the recipe
php core/scripts/drupal recipe . --site_name="Dev Platform"

Post-Installation Setup

1. Configure AI Providers

Ollama (Recommended for privacy)

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull recommended models (llama3.2 ships as 1b/3b tags)
ollama pull llama3.2:3b
ollama pull nomic-embed-text

# Verify in Drupal: visit /admin/config/ai/providers/test

Cloud Providers

  1. Get API keys from providers (OpenAI, Anthropic, etc.)
  2. Navigate to /admin/config/system/keys
  3. Add your API keys securely
  4. Configure providers at /admin/config/ai/providers

2. Complete Onboarding

Visit /admin/config/system/recipe-onboarding to:

  • Configure AI providers
  • Set up vector databases
  • Enable desired features
  • Test integrations

3. Security Configuration

  1. Review Security Settings: /admin/config/security
  2. Configure Encryption: /admin/config/system/encrypt
  3. Set Up Key Management: /admin/config/system/keys
  4. Run Security Review: /admin/reports/security-review

Configuration

Input Variables

The recipe accepts these input variables:

  • site_name (string, default "LLM Platform"): Name of the AI platform site
  • admin_email (email, default system.site.mail): Administrator email address
  • default_provider (string, default "ollama"): Default AI provider (ollama, openai, anthropic, langchain)
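Inside the recipe, these inputs are declared in recipe.yml using Drupal core's recipe input schema. A sketch for two of them (the exact keys follow core's schema; the admin_email entry is omitted because its default comes from existing site config):

```yaml
# recipe.yml (excerpt, sketch)
input:
  site_name:
    data_type: string
    description: 'Name of the AI platform site'
    default:
      source: value
      value: 'LLM Platform'
  default_provider:
    data_type: string
    description: 'Default AI provider'
    default:
      source: value
      value: 'ollama'
```

Values passed on the command line override these defaults at apply time.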

Module Configuration

Core AI Modules:

  • ai: Core AI framework
  • ai_logging: Request/response logging
  • llm: LLM Platform core functionality

Provider Modules:

  • ai_provider_ollama: Local Ollama integration
  • ai_provider_openai: OpenAI GPT models
  • ai_provider_anthropic: Anthropic Claude models
  • ai_provider_langchain: Multi-provider orchestration
  • ai_agent_crewai: Multi-agent systems
  • ai_agent_huggingface: HuggingFace model hub

Platform Modules:

  • mcp_registry: MCP server registry browser
  • alternative_services: Service discovery and failover
  • recipe_onboarding: Guided setup wizard

Usage Examples

Basic AI Chat

// Request a chat completion through the platform's chat service.
$ai_service = \Drupal::service('llm.chat');
// Pin the provider and model for this request.
$response = $ai_service->chat('Explain quantum computing', [
  'provider' => 'ollama',
  'model' => 'llama3.2:3b',
]);

Multi-Agent Workflow

// Load the CrewAI manager from the ai_agent_crewai module.
$crew_service = \Drupal::service('ai_agent_crewai.manager');
// Run a multi-agent workflow: each named agent handles one stage.
$result = $crew_service->executeCrewWorkflow([
  'task' => 'research_and_write_article',
  'topic' => 'AI in Healthcare',
  'agents' => ['researcher', 'writer', 'editor'],
]);

Vector Search

// Semantic search over vector embeddings.
$search_service = \Drupal::service('llm.vector_search');
// Only results with a similarity score of at least 0.8 are returned.
$results = $search_service->semanticSearch('machine learning algorithms', [
  'provider' => 'ollama',
  'model' => 'nomic-embed-text',
  'similarity_threshold' => 0.8,
]);

Architecture

Provider Hierarchy

  1. Ollama (Primary) - Local execution, privacy-first
  2. OpenAI (Secondary) - Cloud fallback for advanced features
  3. Anthropic (Tertiary) - Specialized tasks requiring Claude models
  4. LangChain - Multi-provider orchestration layer

Security Foundation

Built on the secure_drupal recipe providing:

  • AES-256-GCM encryption
  • Multi-factor authentication
  • Comprehensive audit logging
  • Compliance frameworks (FedRAMP, HIPAA, PCI DSS, SOC2)

Performance Optimization

  • Redis caching for AI responses
  • Connection pooling for providers
  • Request batching and streaming
  • Tiered cache strategy
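When Redis is available, Drupal's cache bins can be routed to it in settings.php. A standard sketch for the contrib redis module (host, port, and the PhpRedis interface are assumptions for a local setup):

```php
// settings.php: route Drupal's default cache backend to Redis.
// Requires the contrib "redis" module and the PhpRedis extension.
$settings['redis.connection']['interface'] = 'PhpRedis';
$settings['redis.connection']['host'] = '127.0.0.1';
$settings['redis.connection']['port'] = 6379;
$settings['cache']['default'] = 'cache.backend.redis';
```

Cached AI responses and entity caches then share the same Redis instance, which is what makes the tiered cache strategy above effective.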

Variants

The recipe includes three variants for different use cases:

Minimal (variants/minimal/)

  • ~34 modules
  • Essential AI features only
  • Single provider (Ollama)
  • Basic security

Standard (variants/standard/)

  • ~76 modules
  • Full-featured AI platform
  • Multi-provider support
  • Enhanced security

Enterprise (variants/enterprise/)

  • ~119 modules
  • Enterprise-grade with compliance
  • All providers and features
  • Maximum security

API Documentation

The platform provides comprehensive APIs:

  • REST API: /api/ai/* endpoints
  • GraphQL: /graphql with AI schema
  • OpenAPI: /admin/config/services/openapi
  • MCP Protocol: WebSocket on :3001/mcp

Monitoring & Health

Health Dashboard

Visit /service-health for real-time monitoring:

  • Provider availability
  • Response latency
  • Error rates
  • Token usage
  • Cost tracking

Logs & Debugging

  • AI Logs: /admin/reports/ai-logging
  • System Status: /admin/reports/status
  • Security Review: /admin/reports/security-review

Troubleshooting

Common Issues

Ollama Connection Failed

# Check Ollama is running
curl http://localhost:11434/api/version

# Start the Ollama server if it is not already running
ollama serve

Module Dependencies Missing

# Update Composer dependencies
composer update

# Clear Drupal cache
drush cr

Permission Denied

  • Ensure user has 'administer ai configuration' permission
  • Check security module restrictions
  • Verify API key access

Performance Issues

  • Enable Redis caching
  • Configure connection pooling
  • Adjust rate limiting settings
  • Monitor token usage

Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/new-feature
  3. Follow Drupal coding standards
  4. Add tests for new functionality
  5. Submit a pull request

Development Setup

# Install development dependencies
composer install --dev

# Run tests
vendor/bin/phpunit

# Code quality checks
vendor/bin/phpcs
vendor/bin/phpstan

License

GPL-2.0-or-later - see LICENSE file.

Changelog

Version 1.0.0 (2025-01-09)

  • Initial release with Drupal 11 support
  • Multi-provider AI integration
  • Enterprise security foundation
  • MCP protocol support
  • Vector search capabilities
  • Multi-agent orchestration

Built with ❤️ for the Drupal AI community