databricks-mcp-server

0.0.10
License: MIT

Model Context Protocol (MCP) server for interacting with Databricks

Package Exports

  • databricks-mcp-server
  • databricks-mcp-server/bin/index.js

This package does not declare an exports field, so the exports above were automatically detected and optimized by JSPM. If a package subpath is missing, consider filing an issue with the original package (databricks-mcp-server) asking it to add an "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

Databricks MCP Server

A Model Context Protocol (MCP) server for interacting with Databricks.

Installation

You can download the latest release for your platform from the Releases page.

VS Code

Install the Databricks MCP Server in VS Code by clicking the following link:

Install in VS Code

Alternatively, you can register the server manually by running one of the following commands:

# For VS Code
code --add-mcp '{"name":"databricks","command":"npx","args":["databricks-mcp-server@latest"]}'
# For VS Code Insiders
code-insiders --add-mcp '{"name":"databricks","command":"npx","args":["databricks-mcp-server@latest"]}'

Tools

The Databricks MCP Server provides a Model Context Protocol (MCP) interface to interact with Databricks workspaces. It offers the following functionalities:

List Catalogs

Lists all catalogs available in the Databricks workspace.

Tool name: list_catalogs

Parameters: None

Returns: JSON array of catalog objects
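As with any MCP tool, a client invokes list_catalogs with a JSON-RPC 2.0 tools/call request. A minimal sketch of the request payload (the id value is an arbitrary example):

```python
import json

# JSON-RPC 2.0 request an MCP client would send to invoke list_catalogs.
# The tool takes no parameters, so "arguments" is an empty object.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "list_catalogs", "arguments": {}},
}

# MCP stdio transports frame each message as a single line of JSON.
wire_message = json.dumps(request) + "\n"
print(wire_message, end="")
```

The same tools/call shape applies to every tool below; only the name and arguments change.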

List Schemas

Lists all schemas in a specified Databricks catalog.

Tool name: list_schemas

Parameters:

  • catalog (string, required): Name of the catalog to list schemas from

Returns: JSON array of schema objects

List Tables

Lists all tables in a specified Databricks schema with optional filtering.

Tool name: list_tables

Parameters:

  • catalog (string, required): Name of the catalog containing the schema
  • schema (string, required): Name of the schema to list tables from
  • filter_pattern (string, optional, default: ".*"): Regular expression pattern to filter table names

Returns: JSON array of table objects
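To illustrate what filter_pattern does, the sketch below applies a regular expression to a list of hypothetical table names. The exact matching semantics used by the server (search vs. full match) are an assumption here; the default ".*" matches every name either way:

```python
import re

# Hypothetical table names; the server would return real ones from Databricks.
tables = ["sales_2023", "sales_2024", "customers"]

# A filter_pattern of "^sales_" keeps only the sales tables.
# Assumes search-style matching, which the README does not specify.
pattern = re.compile(r"^sales_")
filtered = [t for t in tables if pattern.search(t)]
print(filtered)  # ['sales_2023', 'sales_2024']
```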

Execute SQL

Executes SQL statements on a Databricks SQL warehouse and returns the results.

Tool name: execute_sql

Parameters:

  • statement (string, required): SQL statement to execute
  • timeout_seconds (number, optional, default: 60): Timeout in seconds for the statement execution
  • row_limit (number, optional, default: 100): Maximum number of rows to return in the result

Returns: JSON object containing the columns and rows from the query result, along with information about the SQL warehouse used to execute the statement.
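A sketch of the tools/call payload for execute_sql; the parameter names come from the list above, while the id and SQL text are arbitrary examples:

```python
import json

# tools/call request invoking execute_sql with all three parameters set.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "execute_sql",
        "arguments": {
            "statement": "SELECT 1 AS one",  # example query
            "timeout_seconds": 30,           # overrides the default of 60
            "row_limit": 10,                 # overrides the default of 100
        },
    },
}
print(json.dumps(request))
```

Omitting timeout_seconds or row_limit falls back to the defaults listed above.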

List SQL Warehouses

Lists all SQL warehouses available in the Databricks workspace.

Tool name: list_warehouses

Parameters: None

Returns: JSON array of SQL warehouse objects

Supported Platforms

  • Linux (amd64)
  • Windows (amd64)
  • macOS (Intel/amd64)
  • macOS (Apple Silicon/arm64)

Usage

Authentication

The application uses Databricks unified authentication. For details on how to configure authentication, please refer to the Databricks Authentication documentation.

Running the Server

Start the MCP server:

./databricks-mcp-server

The server will start and listen for MCP protocol commands on standard input/output.
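Since the server speaks MCP over stdin/stdout, a client begins with an initialize request followed by an initialized notification, each framed as one newline-delimited JSON-RPC message. A minimal sketch of that handshake framing (the protocolVersion and clientInfo values are illustrative assumptions):

```python
import json

# First message: initialize request. A client writes this to the server's
# stdin and reads the response from its stdout.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # illustrative MCP version string
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Second message: sent only after the server's initialize response arrives.
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

# Stdio transport framing: one JSON object per line.
handshake = json.dumps(initialize) + "\n" + json.dumps(initialized) + "\n"
print(handshake, end="")
```

After the handshake, the client can issue tools/list and tools/call requests for the tools described above.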

Development

Prerequisites

  • Go 1.24 or later