MCP Server Directory

Apache Doris MCP Server

#doris #warehouse

Description

Doris MCP (Model Context Protocol) Server is a backend service built with Python and FastAPI. It implements the MCP protocol, allowing clients to interact with it through defined "Tools". It is primarily designed to connect to Apache Doris databases, potentially leveraging Large Language Models (LLMs) for tasks such as converting natural language queries to SQL (NL2SQL), executing queries, and performing metadata management and analysis.

Features

  • MCP Protocol Implementation: Provides standard MCP interfaces, supporting tool calls, resource management, and prompt interactions.
  • Multiple Communication Modes:
      • SSE (Server-Sent Events): Served via the /sse (initialization) and /mcp/messages (communication) endpoints (src/sse_server.py).
      • Streamable HTTP: Served via the unified /mcp endpoint, supporting both request/response and streaming (src/streamable_server.py).
      • Stdio (optional): Interaction via standard input/output (src/stdio_server.py); requires specific startup configuration.
  • Tool-Based Interface: Core functionality is encapsulated as MCP tools that clients can call as needed. The currently available tools focus on direct database operations (see the client sketch after this list):
      • SQL execution (mcp_doris_exec_query)
      • Database and table listing (mcp_doris_get_db_list, mcp_doris_get_db_table_list)
      • Metadata retrieval (mcp_doris_get_table_schema, mcp_doris_get_table_comment, mcp_doris_get_table_column_comments, mcp_doris_get_table_indexes)
      • Audit log retrieval (mcp_doris_get_recent_audit_logs)
  • Database Interaction: Connects to Apache Doris (or other compatible databases) and executes queries (src/utils/db.py).
  • Flexible Configuration: Configured via a .env file, with settings for database connections, LLM providers/models, API keys, logging levels, etc. (see the configuration sketch below).
  • Metadata Extraction: Extracts database metadata information (src/utils/schema_extractor.py).
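
For example, a client can connect over the SSE transport and call one of the tools listed above. The snippet below is a minimal sketch rather than project documentation: it assumes the official MCP Python SDK, a server reachable at localhost:3000, and a "sql" argument for mcp_doris_exec_query; check the server's actual address and the tool schemas it reports before relying on any of these details.

    import asyncio

    from mcp import ClientSession
    from mcp.client.sse import sse_client


    async def main() -> None:
        # Connect to the SSE transport; the message endpoint (/mcp/messages)
        # is advertised by the server over the SSE stream. Host and port are placeholders.
        async with sse_client("http://localhost:3000/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Discover the tools the server exposes (mcp_doris_get_db_list, ...).
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

                # Call the SQL execution tool; the "sql" argument name is an assumption.
                result = await session.call_tool(
                    "mcp_doris_exec_query",
                    {"sql": "SELECT 1"},
                )
                print(result.content)


    asyncio.run(main())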

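The .env-based configuration might look roughly like the sketch below. The variable names are assumptions for illustration, not the project's documented keys, so consult the repository's example configuration for the actual names; only the Doris FE query-port default (9030) is a standard value.

    # Hypothetical .env sketch; key names are assumptions, not the project's documented settings.
    DB_HOST=127.0.0.1
    DB_PORT=9030          # Doris FE MySQL-protocol (query) port
    DB_USER=root
    DB_PASSWORD=
    DB_DATABASE=demo
    LOG_LEVEL=INFO
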
Endpoint URL

https://doris.apache.org/

Links & Contact

GitHub: apache/doris-mcp-server
dev@doris.apache.org
Added on 5/7/2025

Recommended Servers

Fast LLM & Agents & MCPs is a comprehensive repository that explores the concepts of Large Language Models (LLMs), Agents, and Model Context Protocols (MCPs) both theoretically and practically. It includes various tools, frameworks, and sample codes for developing intelligent agents and applications using LLMs.

#ai.mcp-server

Flomo MCP is a tool designed for writing notes to the Flomo application, allowing users to easily manage and organize their notes.

#flomo-mcp #mcp-server

MCPollinations is a multimodal Model Context Protocol (MCP) server that allows AI assistants to generate images, text, and audio through the Pollinations APIs.

#ai-assistant #mcpollinations

The Deep Research MCP Server is a Model Context Protocol (MCP) compliant server for comprehensive, up-to-date web research. It leverages Tavily's Search & Crawl APIs to gather, aggregate, and structure information for LLM-powered documentation generation.

#information #knowledge

NPM Helper is a Model Context Protocol (MCP) server designed to assist with NPM package management and dependency updates, enabling LLMs like Claude to interact with npm packages, search the npm registry, and maintain up-to-date dependencies.

#nodejs #npm

AgentMode is an all-in-one Model Context Protocol (MCP) server that connects your coding AI to various databases, data warehouses, data pipelines, and cloud services, streamlining your development workflow.

#agentmode #agent

This project provides a Model Context Protocol (MCP) server for NATS, enabling AI models and applications to interact with NATS messaging systems through a standardized interface. It exposes a comprehensive set of tools for interacting with NATS servers, making it ideal for AI-powered applications that need to work with messaging systems.

#nats #natscli

Kollektiv MCP enables you to build a personal LLM knowledge base in seconds and use it from your favorite editor or client. No more infrastructure setup, chunking, or syncing: just upload your data and start chatting. It supports all major MCP clients out of the box, including Cursor, Windsurf, and Claude Desktop.

#RAG #semantic-search

Podbean MCP Server is a management tool that connects any MCP-compatible AI assistant to the Podbean API, allowing users to manage their podcasts, episodes, and analytics through natural conversation.

#podbean-mcp-server