MCP Server
As AI assistants and large language models (LLMs) become integral to modern workflows, the ability to connect them directly to live business data is transformative. The CrateDB MCP Server bridges that gap.
Built on the Model Context Protocol (MCP), it allows LLMs and AI agents to securely query your CrateDB clusters, understand their structure, and access relevant documentation, all in real time.
In other words, it turns your data into something that AI can converse with, enabling natural-language analytics, diagnostics, and decision-making powered by CrateDB’s distributed SQL engine.
What is the CrateDB MCP Server?
The CrateDB MCP Server is a connector that exposes your database and documentation to AI assistants via the open Model Context Protocol. It acts as an intelligent interface between LLMs and CrateDB, making your data understandable and queryable through natural language.
It enables AI tools to:
- Translate natural language into SQL (Text-to-SQL).
- Retrieve documentation to ground their answers in verified knowledge.
- Access cluster metadata, table schemas, and query results in real time.
Why it matters
- Natural-language access to real-time data: Anyone can ask business questions in plain English and get answers instantly.
- LLM-ready data infrastructure: Bring CrateDB into your AI ecosystem as a high-speed, structured data source for reasoning and analysis.
- Faster innovation: Build AI copilots, analytics assistants, and conversational dashboards without custom connectors.
- Context-aware insights: The MCP Server enables LLMs to retrieve not just raw data, but the knowledge behind it: schema, relationships, and documentation.
Key features
- Text-to-SQL tools: Translate prompts into safe, optimized SQL queries executed on CrateDB.
- Documentation lookup: Integrate official CrateDB docs directly into LLM reasoning.
- Cluster & metadata access: Inspect tables, columns, and performance metrics conversationally.
- Flexible transport: Supports stdio, HTTP, and SSE for integration with diverse LLM clients.
- Security-first design: Read-only by default, with configurable access scopes and credentials.
- Lightweight deployment: Install via the `cratedb-mcp` Python package or deploy as a Docker container.
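To make "read-only by default" concrete, here is a minimal illustration of such a policy: only statements that begin with an allow-listed keyword are forwarded to the cluster. This is a sketch of the idea, not the server's actual implementation; a production guard would parse the statement rather than inspect its first word.

```python
# Illustrative allow-list for a read-only access scope (an assumption,
# not the MCP Server's real policy mechanism).
READ_ONLY_PREFIXES = ("select", "show", "explain")

def is_read_only(stmt: str) -> bool:
    """Return True if the statement starts with an allow-listed keyword."""
    stripped = stmt.strip()
    if not stripped:
        return False
    first_word = stripped.split(None, 1)[0].lower()
    return first_word in READ_ONLY_PREFIXES

# Queries pass; mutations are rejected before reaching the cluster.
assert is_read_only("SELECT * FROM sys.nodes")
assert not is_read_only("DROP TABLE metrics")
```

Pairing a guard like this with a dedicated read-only database user gives two independent layers of protection.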
Use cases
- Conversational analytics: “Show me average sensor uptime this week.”
- AI-assisted monitoring: “List partitions nearing storage limits.”
- Chat-driven documentation: “How do I create a partitioned table in CrateDB?”
- AI copilots for data teams: Enable generative agents that analyze metrics or automate queries in real time.
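To make the first use case concrete, this is the kind of SQL an assistant might generate for that question. The `sensor_readings` table and its columns are illustrative assumptions, not part of any real schema.

```python
# Hypothetical Text-to-SQL output for the conversational-analytics use case.
# Table and column names (`sensor_readings`, `sensor_id`, `ts`, `uptime_pct`)
# are assumptions for illustration only.
question = "Show me average sensor uptime this week."

generated_sql = """
SELECT sensor_id, AVG(uptime_pct) AS avg_uptime
FROM sensor_readings
WHERE ts >= DATE_TRUNC('week', CURRENT_TIMESTAMP)
GROUP BY sensor_id
ORDER BY avg_uptime DESC;
""".strip()
```

The model grounds "this week" in CrateDB's `DATE_TRUNC` function and maps "sensor uptime" onto the schema it discovered via the metadata tools.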
Best practices
- Restrict access using dedicated, read-only CrateDB users.
- Keep schema metadata clean and well-described for better AI understanding.
- Use structured prompts to guide accurate SQL generation.
- Validate outputs for critical workloads; treat AI as an assistant, not an oracle.
- Monitor query performance and scale your cluster accordingly for AI workloads.
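The second and third practices reinforce each other: clean schema metadata is only useful if the prompt actually carries it. Below is a sketch of a structured prompt that embeds table and column names so the model generates SQL against real identifiers instead of guessing. The wording and schema format are assumptions, not a fixed template.

```python
# Sketch of a structured prompt grounding SQL generation in schema metadata.
# The instruction wording and dict-based schema format are illustrative choices.
def build_prompt(question: str, schema: dict[str, list[str]]) -> str:
    """Embed table/column metadata so generated SQL uses real identifiers."""
    schema_lines = "\n".join(
        f"- {table}({', '.join(columns)})" for table, columns in schema.items()
    )
    return (
        "You are a CrateDB analyst. Generate one read-only SQL statement.\n"
        f"Available tables:\n{schema_lines}\n"
        f"Question: {question}\n"
        "Answer with SQL only."
    )

prompt = build_prompt(
    "Which sensors reported this week?",
    {"sensor_readings": ["sensor_id", "ts", "uptime_pct"]},
)
```

Keeping the schema section generated from live metadata, rather than hand-written, is what keeps prompts accurate as tables evolve.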