AI Chat with Netdata

Chat with your infrastructure using natural language through two distinct integration architectures.

Integration Architecture

Method 1: Client-Controlled Communication (Available Now)

How it works:

  1. You ask your AI client a question
  2. The LLM responds with tool-call requests
  3. Your AI client executes those tools against the Netdata Agent MCP server (locally)
  4. Your AI client sends the tool results back to the LLM
  5. The LLM provides the final answer

Key characteristics:

  • Your AI client orchestrates all communication
  • The Netdata Agent MCP server runs locally on your infrastructure
  • No internet access is required for the Netdata Agent
  • Full control over data flow and privacy
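
To make this flow concrete, here is a minimal Python sketch of the client-side orchestration loop. It is illustrative only: `ask_llm`, `run_mcp_tool`, and the `list_alerts` tool name are placeholders, not the real Netdata MCP tools or any particular LLM API; an actual MCP-aware client speaks the MCP protocol to the local Netdata Agent and a chat API to the LLM provider.

```python
# Conceptual sketch of Method 1: the client orchestrates everything.
# ask_llm and run_mcp_tool are placeholders with canned data, not real APIs.

def ask_llm(messages):
    """Stand-in for the LLM call; returns tool requests or a final answer."""
    if not any(m["role"] == "tool" for m in messages):
        # Step 2: the LLM asks the client to execute a tool.
        return {"tool_calls": [{"name": "list_alerts", "arguments": {}}]}
    # Step 5: with tool results in the conversation, the LLM answers.
    return {"answer": "2 warning alerts are active on web-01."}

def run_mcp_tool(name, arguments):
    """Stand-in for executing a tool on the local Netdata Agent MCP server."""
    return {"alerts": [{"host": "web-01", "status": "WARNING"}]}

def chat(question):
    messages = [{"role": "user", "content": question}]              # Step 1
    while True:
        reply = ask_llm(messages)                                   # Steps 2 / 5
        if "answer" in reply:
            return reply["answer"]
        for call in reply["tool_calls"]:
            result = run_mcp_tool(call["name"], call["arguments"])  # Step 3: local execution
            messages.append({"role": "tool", "content": str(result)})  # Step 4

print(chat("Are there any active alerts?"))
```

The important property is that every tool call is executed by your client against your local Agent; the LLM only ever sees the question and whatever tool results your client sends back.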

Method 2: LLM-Direct Communication (Coming Soon)

How it works:

  1. You ask your AI client a question
  2. The LLM accesses Netdata Cloud MCP tools directly
  3. The LLM provides the final answer with the data already integrated

Key characteristics:

  • The LLM provider manages the MCP integration
  • Direct connection between the LLM and the MCP tools
  • Netdata Cloud MCP is accessible over the internet
  • Simplified setup; no local MCP configuration needed
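
Because this method is not yet available, the snippet below only illustrates the request shape you can expect: the client hands the LLM provider a pointer to the remote MCP endpoint and the provider performs the tool calls itself. The endpoint URL, parameter names, and token placeholder are assumptions chosen for clarity, not a documented Netdata Cloud or LLM-provider API.

```python
# Illustrative request shape for Method 2 (hypothetical; parameter names and
# the MCP URL are assumptions, not a documented API).
import json

request_body = {
    "model": "example-model",
    "input": "Are there any active alerts?",
    # The provider, not your client, connects to the remote MCP server
    # and executes whatever tools it needs to answer.
    "tools": [
        {
            "type": "mcp",
            "server_url": "https://mcp.netdata.cloud/",       # hypothetical URL
            "authorization": "Bearer <netdata-cloud-token>",  # hypothetical token
        }
    ],
}

print(json.dumps(request_body, indent=2))
```

Since the provider talks to the MCP server directly, your client never relays tool results, which is also why both the AI client and the MCP endpoint need internet access.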

Quick Comparison

| Aspect | Method 1: Client-Controlled | Method 2: LLM-Direct |
|---|---|---|
| Availability | ✅ Available now | 🚧 Coming soon |
| Setup Complexity | Moderate (configure AI client + MCP) | Simple (just AI client) |
| Data Privacy | Depends on LLM provider | Depends on LLM provider |
| Internet Requirements | AI client needs internet, MCP is local | Both AI client and MCP need internet |
| Supported AI Clients | Any MCP-aware client (including those using LLM APIs) | Only clients from providers that support MCP on the LLM side |
| Infrastructure Access | Limited to one Parent's scope | Complete visibility across all infrastructure |
