# AI Chat with Netdata
Chat with your infrastructure using natural language through two distinct integration architectures.
## Integration Architecture
### Method 1: Client-Controlled Communication (Available Now)

**How it works:**

1. You ask your AI client a question
2. The LLM responds with tool execution commands
3. Your AI client executes the tools against the Netdata Agent MCP server (locally)
4. Your AI client sends the tool responses back to the LLM
5. The LLM provides the final answer

**Key characteristics:**

- Your AI client orchestrates all communication
- The Netdata Agent MCP server runs locally on your infrastructure
- No internet access is required for the Netdata Agent
- Full control over data flow and privacy
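The client-controlled loop above can be sketched as a minimal tool-calling orchestrator. Note that `call_llm` and `run_netdata_tool` below are hypothetical stubs standing in for a real LLM API and a local Netdata Agent MCP tool call; they are not actual Netdata or LLM-provider APIs:

```python
# Sketch of the Method 1 (client-controlled) loop, with stubbed dependencies.

def call_llm(messages):
    """Stub LLM: requests a tool on the first turn, answers on the second."""
    if not any(m["role"] == "tool" for m in messages):
        return {"type": "tool_call", "tool": "query_metrics",
                "args": {"chart": "system.cpu", "after": -60}}
    return {"type": "answer",
            "text": "CPU usage averaged 12% over the last minute."}

def run_netdata_tool(tool, args):
    """Stub for executing an MCP tool against the local Netdata Agent."""
    return {"chart": args["chart"], "average": 12.0}

def chat(question):
    messages = [{"role": "user", "content": question}]
    while True:
        reply = call_llm(messages)                   # step 2: LLM responds
        if reply["type"] == "tool_call":
            # step 3: the client, not the LLM, executes the tool locally
            result = run_netdata_tool(reply["tool"], reply["args"])
            # step 4: the client relays the tool result back to the LLM
            messages.append({"role": "tool", "content": result})
        else:
            return reply["text"]                     # step 5: final answer

print(chat("How busy was the CPU in the last minute?"))
# → CPU usage averaged 12% over the last minute.
```

The key point the sketch illustrates: the LLM never touches your infrastructure directly; every tool execution passes through the client, which is why the Netdata Agent needs no internet access.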
### Method 2: LLM-Direct Communication (Coming Soon)

**How it works:**

1. You ask your AI client a question
2. The LLM accesses the Netdata Cloud MCP tools directly
3. The LLM provides the final answer with the data already integrated

**Key characteristics:**

- The LLM provider manages the MCP integration
- Direct connection between the LLM and the MCP tools
- Netdata Cloud MCP is accessible via the internet
- Simplified setup; no local MCP configuration needed
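Under Method 2, the client's job shrinks to a single request that tells the provider-hosted LLM where the Netdata Cloud MCP server lives; the provider then performs all tool calls itself. The payload shape below is illustrative only: the field names and the MCP URL are assumptions for the sketch, not a published Netdata or provider API:

```python
# Illustrative request payload for Method 2 (LLM-direct) integration.
# Field names and the MCP URL are hypothetical, for illustration only.

def build_request(question, mcp_url, token):
    return {
        "messages": [{"role": "user", "content": question}],
        # The LLM provider connects to this MCP server itself; the client
        # never executes tools or relays their results.
        "mcp_servers": [{"url": mcp_url, "authorization": token}],
    }

req = build_request("Which nodes are low on disk?",
                    "https://mcp.example.com",  # hypothetical endpoint
                    "API_TOKEN")
```

Compared with Method 1, there is no orchestration loop on the client side at all, which is what makes the setup simpler but requires the MCP server to be reachable over the internet.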
## Quick Comparison
| Aspect | Method 1: Client-Controlled | Method 2: LLM-Direct |
|---|---|---|
| Availability | ✅ Available now | 🚧 Coming soon |
| Setup Complexity | Moderate (configure AI client + MCP) | Simple (just the AI client) |
| Data Privacy | Depends on LLM provider | Depends on LLM provider |
| Internet Requirements | AI client needs internet; MCP stays local | Both AI client and MCP need internet |
| Supported AI Clients | Any MCP-aware client (including those using LLM APIs) | Only clients from providers that support MCP on the LLM side |
| Infrastructure Access | Limited to one Parent's scope | Complete visibility across all infrastructure |