inspector-apm / neuron-ai
PHP AI Framework with built-in observability.
Installs: 905
Dependents: 2
Suggesters: 0
Security: 0
Stars: 182
Watchers: 6
Forks: 16
Open Issues: 1
Requires
- php: ^8.0
- guzzlehttp/guzzle: ^7.0
- psr/http-message: ^1.0|^2.0
- psr/log: ^1.0|^2.0|^3.0
Requires (Dev)
- elasticsearch/elasticsearch: ^8.0
- php-http/curl-client: ^2.3
- phpunit/phpunit: ^9.0
- symfony/process: ^5.0|^6.0|^7.0
- typesense/typesense-php: ^5.0
Suggests
- elasticsearch/elasticsearch: ^7.0|^8.0
- inspector-apm/inspector-php: ^3.9
- php-http/curl-client: ^2.3
- symfony/process: ^4.0|^5.0|^6.0|^7.0
- typesense/typesense-php: ^5.0
- dev-main
- 1.8.15
- 1.8.14
- 1.8.13
- 1.8.12
- 1.8.11
- 1.8.10
- 1.8.9
- 1.8.8
- 1.8.7
- 1.8.6
- 1.8.5
- 1.8.4
- 1.8.3
- 1.8.2
- 1.8.1
- 1.8.0
- 1.7.2
- 1.7.1
- 1.7.0
- 1.6.0
- 1.5.4
- 1.5.3
- 1.5.2
- 1.5.1
- 1.5.0
- 1.4.2
- 1.4.1
- 1.4.0
- 1.3.2
- 1.3.1
- 1.3.0
- 1.2.25
- 1.2.24
- 1.2.23
- 1.2.22
- 1.2.21
- 1.2.20
- 1.2.19
- 1.2.18
- 1.2.17
- 1.2.16
- 1.2.15
- 1.2.14
- 1.2.13
- 1.2.12
- 1.2.11
- 1.2.10
- 1.2.9
- 1.2.8
- 1.2.7
- 1.2.6
- 1.2.5
- 1.2.4
- 1.2.3
- 1.2.2
- 1.2.1
- 1.2.0
- 1.1.3
- 1.1.2
- 1.1.1
- 1.1.0
- 1.0.1
- 1.0.0
- dev-develop
- dev-qdrant
- dev-structure
This package is auto-updated.
Last update: 2025-04-05 08:05:52 UTC
README
Before moving on, support the community by giving the project a GitHub star ⭐️. Thank you!
Requirements
- PHP: ^8.0
Official documentation
Go to the official documentation
Install
Install the latest version of the package:
composer require inspector-apm/neuron-ai
Create an Agent
Neuron provides the Agent class you can extend to inherit the main features of the framework and create fully functional agents. This class automatically manages advanced mechanisms for you, such as memory, tools and function calls, and RAG systems. You can go deeper into these aspects in the documentation.
In the meantime, let's create the first agent by extending the NeuronAI\Agent class:
use NeuronAI\Agent;
use NeuronAI\SystemPrompt;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;

class YouTubeAgent extends Agent
{
    public function provider(): AIProviderInterface
    {
        return new Anthropic(
            key: 'ANTHROPIC_API_KEY',
            model: 'ANTHROPIC_MODEL',
        );
    }

    public function instructions(): string
    {
        return new SystemPrompt(
            background: ["You are an AI Agent specialized in writing YouTube video summaries."],
            steps: [
                "Get the url of a YouTube video, or ask the user to provide one.",
                "Use the tools you have available to retrieve the transcription of the video.",
                "Write the summary.",
            ],
            output: [
                "Write a summary in a paragraph without using lists. Use just fluent text.",
                "After the summary add a list of three sentences as the three most important takeaways from the video.",
            ]
        );
    }
}
The SystemPrompt class is designed to take your base instructions and build a consistent prompt for the underlying model, reducing your prompt-engineering effort.
Talk to the Agent
Send a prompt to the agent to get a response from the underlying LLM:
use NeuronAI\Chat\Messages\UserMessage;

$agent = YouTubeAgent::make();

$response = $agent->run(new UserMessage("Hi, I'm Valerio. Who are you?"));
echo $response->getContent();
// I'm a friendly YouTube assistant to help you summarize videos.

$response = $agent->run(new UserMessage("Do you know my name?"));
echo $response->getContent();
// Your name is Valerio, as you said in your introduction.
As you can see in the example above, the Agent automatically has memory of the ongoing conversation. Learn more about memory in the documentation.
Supported LLM Providers
With Neuron you can switch between LLM providers with just one line of code, without any impact on your agent implementation. Supported providers:
- Anthropic
- Ollama (also available as an embeddings provider)
- OpenAI
- Mistral
- Deepseek
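As a sketch of what the one-line switch looks like, here is the YouTubeAgent from above moved to Ollama. The Ollama constructor parameters (url, model) and the endpoint below are assumptions; check the documentation for the exact signature:

use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Ollama\Ollama;

class YouTubeAgent extends Agent
{
    // Only the provider changes; instructions, tools and memory stay untouched.
    public function provider(): AIProviderInterface
    {
        return new Ollama(
            url: 'http://localhost:11434/api', // assumed local Ollama endpoint
            model: 'llama3.1',                 // any model you have pulled locally
        );
    }

    // ... instructions() as before
}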
Tools & Function Calls
You can add the ability to perform concrete tasks to your Agent with an array of Tool objects:
use NeuronAI\Agent;
use NeuronAI\SystemPrompt;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
use NeuronAI\Tools\Tool;
use NeuronAI\Tools\ToolProperty;

class YouTubeAgent extends Agent
{
    public function provider(): AIProviderInterface
    {
        return new Anthropic(
            key: 'ANTHROPIC_API_KEY',
            model: 'ANTHROPIC_MODEL',
        );
    }

    public function instructions(): string
    {
        return new SystemPrompt(
            background: ["You are an AI Agent specialized in writing YouTube video summaries."],
            steps: [
                "Get the url of a YouTube video, or ask the user to provide one.",
                "Use the tools you have available to retrieve the transcription of the video.",
                "Write the summary.",
            ],
            output: [
                "Write a summary in a paragraph without using lists. Use just fluent text.",
                "After the summary add a list of three sentences as the three most important takeaways from the video.",
            ]
        );
    }

    public function tools(): array
    {
        return [
            Tool::make(
                'get_transcription',
                'Retrieve the transcription of a youtube video.',
            )->addProperty(
                new ToolProperty(
                    name: 'video_url',
                    type: 'string',
                    description: 'The URL of the YouTube video.',
                    required: true
                )
            )->setCallable(function (string $video_url) {
                // ... retrieve the video transcription
            })
        ];
    }
}
Learn more about Tools on the documentation.
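Running the tool-equipped agent is the same run() call shown earlier; the model decides on its own when to invoke get_transcription. The video URL below is a placeholder:

use NeuronAI\Chat\Messages\UserMessage;

$agent = YouTubeAgent::make();

// The agent calls get_transcription itself when it needs the video text.
$response = $agent->run(
    new UserMessage("Summarize this video: https://www.youtube.com/watch?v=XXXXXXXXXXX")
);

echo $response->getContent();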
MCP server connector
Instead of implementing tools manually, you can connect tools exposed by an MCP server with the McpConnector component:
use NeuronAI\Agent;
use NeuronAI\MCP\McpConnector;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
use NeuronAI\SystemPrompt;
use NeuronAI\Tools\Tool;
use NeuronAI\Tools\ToolProperty;

class SEOAgent extends Agent
{
    public function provider(): AIProviderInterface
    {
        return new Anthropic(
            key: 'ANTHROPIC_API_KEY',
            model: 'ANTHROPIC_MODEL',
        );
    }

    public function instructions(): string
    {
        return new SystemPrompt(
            background: ["Act as an expert of SEO (Search Engine Optimization)."],
            steps: [
                "Analyze a text of an article.",
                "Provide suggestions on how the content can be improved to get a better rank on Google search."
            ],
            output: ["Structure your analysis in sections. One for each suggestion."]
        );
    }

    public function tools(): array
    {
        return [
            // Connect an MCP server
            ...McpConnector::make([
                'command' => 'npx',
                'args' => ['-y', '@modelcontextprotocol/server-everything'],
            ])->tools(),

            // Implement your custom tools
            Tool::make(
                'get_transcription',
                'Retrieve the transcription of a youtube video.',
            )->addProperty(
                new ToolProperty(
                    name: 'video_url',
                    type: 'string',
                    description: 'The URL of the YouTube video.',
                    required: true
                )
            )->setCallable(function (string $video_url) {
                // ... retrieve the video transcription
            })
        ];
    }
}
Learn more about MCP connector on the documentation.
Implement RAG systems
For RAG use cases, you must extend the NeuronAI\RAG\RAG class instead of the default Agent class.
To create a RAG you need to attach additional components beyond the AI provider, such as a vector store and an embeddings provider.
Here is an example of a RAG implementation:
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
use NeuronAI\RAG\Embeddings\EmbeddingsProviderInterface;
use NeuronAI\RAG\Embeddings\VoyageEmbeddingProvider;
use NeuronAI\RAG\RAG;
use NeuronAI\RAG\VectorStore\PineconeVectorStore;
use NeuronAI\RAG\VectorStore\VectorStoreInterface;

class MyChatBot extends RAG
{
    public function provider(): AIProviderInterface
    {
        return new Anthropic(
            key: 'ANTHROPIC_API_KEY',
            model: 'ANTHROPIC_MODEL',
        );
    }

    public function embeddings(): EmbeddingsProviderInterface
    {
        return new VoyageEmbeddingProvider(
            key: 'VOYAGE_API_KEY',
            model: 'VOYAGE_MODEL'
        );
    }

    public function vectorStore(): VectorStoreInterface
    {
        return new PineconeVectorStore(
            key: 'PINECONE_API_KEY',
            indexUrl: 'PINECONE_INDEX_URL'
        );
    }
}
Learn more about RAG on the documentation.
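Querying the RAG follows the same pattern as a plain agent. This sketch assumes the RAG class exposes an answer() method that embeds the question, retrieves matching documents from the vector store, and passes them to the LLM; see the documentation for the exact entry point:

use NeuronAI\Chat\Messages\UserMessage;

$chatBot = MyChatBot::make();

// The question is embedded, similar documents are retrieved from Pinecone,
// and the LLM responds with that context attached.
$response = $chatBot->answer(new UserMessage("What is Inspector?"));

echo $response->getContent();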
Official documentation
Go to the official documentation
Contributing
We encourage you to contribute to the development of the Neuron AI Framework! Please check out the Contribution Guidelines to learn how to proceed. Join us!
LICENSE
This package is licensed under the MIT license.