SQL MCP: Local to Fabric Lakehouse

Let’s be honest: you’ve thought about doing this yourself. Connecting two tools that seem to serve very different purposes. But since I had a day to fool around, I thought to myself: let’s see if I can make this work.

In the previous blog post on MCP Local, I described how to run your MCP on your laptop and connect to your local SQL Server database.

With this setup, a few new challenges arise:

  • Connecting to a Fabric SQL Endpoint
  • Using Entra ID
  • Reading Lakehouse data

Getting the Fabric SQL Endpoint

This was the easiest part; the trick is understanding how it works. Your connection string needs to define both the server and the database. In SQL Server, that’s straightforward. But when you retrieve the SQL Endpoint from your Fabric Lakehouse properties, it only shows the server: the name of your Lakehouse is the actual database you’re connecting to.

And, as I found out, firing a query against the SQL Endpoint is just like querying a regular SQL Server; it behaves the same, which is a good job on Microsoft’s side.
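To make that server/database mapping concrete, here’s a minimal Python sketch that assembles the connection string from the two pieces Fabric gives you. The endpoint and Lakehouse names are made-up placeholders, not real values:

```python
def fabric_connection_string(sql_endpoint: str, lakehouse: str) -> str:
    """Build a connection string for a Fabric Lakehouse SQL Endpoint.

    sql_endpoint: the server shown in the Lakehouse properties,
                  e.g. 'abc123.datawarehouse.fabric.microsoft.com'
    lakehouse:    the Lakehouse name, which acts as the database.
    """
    return (
        f"Server={sql_endpoint};"
        f"Database={lakehouse};"
        "Authentication=Active Directory Interactive;"
        "Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;"
    )

# Hypothetical endpoint and Lakehouse, for illustration only.
conn = fabric_connection_string(
    "abc123.datawarehouse.fabric.microsoft.com", "Sales_LH"
)
print(conn)
```

The point is simply that the Lakehouse name slots into the `Database=` position, where SQL Server habits would have you looking for a database name in the endpoint itself.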

Using Entra ID to authenticate

Here, the problems arose. Mostly because I had decided to use a User Assigned Managed Identity (UAMI) to connect my MCP to Fabric. Adding a UAMI to Fabric with permissions isn’t hard, but when you try to use it from a laptop that’s unknown to your Azure tenant, it doesn’t work. In my case, I had to use Active Directory Interactive to connect. And as my dab-config.json file had too many remnants of my trials, it kept failing.

Working code

To save you (a lot of) time, here’s some working code for you (and “you” may very well be future me ;))

mcp.json

{
  "servers": {
    "sql-mcp-server": {
      "type": "stdio",
      "command": "dab",
      "args": [
        "start",
        "--mcp-stdio",
        "role:authenticated",
        "--config",
        "${workspaceFolder}/dab-config.json"
      ]
    }
  }
}

In this mcp.json file, the key is the authenticated role; without it (at least when I was testing), it threw an authentication error.

.env

MSSQL_CONNECTION_STRING=Server=[Your Fabric Server].datawarehouse.fabric.microsoft.com;Database=[Your Lakehouse Name];Authentication=Active Directory Interactive;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;

The .env file contains your connection string. Set Authentication=Active Directory Interactive. This ensures that when you run your code, a new browser window will open to help you authenticate to Azure.
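If you want to sanity-check the .env file before handing it to DAB, a tiny parser like this will do. This is only a sketch for linting your own file, not how DAB itself reads it:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines; ignore blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Split on the FIRST '=' only: connection strings contain many '='.
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Hypothetical contents, mirroring the .env shown above.
sample = "MSSQL_CONNECTION_STRING=Server=myserver;Database=MyLakehouse;"
env = parse_env(sample)
print(env["MSSQL_CONNECTION_STRING"])
```

Splitting on the first `=` matters here, because the value itself is full of `=` signs.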

dab-config.json

{
  "$schema": "https://github.com/Azure/data-api-builder/releases/download/v1.7.90/dab.draft.schema.json",
  "data-source": {
    "database-type": "mssql",
    "connection-string": "@env('MSSQL_CONNECTION_STRING')"
  },
  "runtime": {
    "rest": {
      "enabled": true,
      "path": "/api",
      "request-body-strict": true
    },
    "graphql": {
      "enabled": true,
      "path": "/graphql",
      "allow-introspection": true
    },
    "mcp": {
      "enabled": true,
      "path": "/mcp"
    },
    "host": {
      "cors": {
        "origins": [],
        "allow-credentials": false
      },
      "authentication": null,
      "mode": "production"
    }
  },
  "entities": {
    "active_items": {
      "source": {
        "object": "dbo.active_items",
        "type": "table"
      },
      "fields": [
        {
          "name": "id",
          "primary-key": true
        }
      ],
      "graphql": {
        "enabled": true,
        "type": {
          "singular": "active_items",
          "plural": "active_items"
        }
      },
      "rest": {
        "enabled": true
      },
      "permissions": [
        {
          "role": "anonymous",
          "actions": [
            {
              "action": "read"
            }
          ]
        }
      ]
    }
  }
}

In the dab-config.json file, I found that the authentication section had to be removed for it to work. In this code, it is null, which apparently works too!

Testing your config

Whilst trying to figure out why my authentication went off the rails (and yes, this was my bad), I ran into the following command:

dab validate --config dab-config.json

This command not only checks the technical validity of your config, it also runs functional tests: can it connect to your endpoint, are your entities defined correctly, and does every entity have a field with a primary key? Those working with Fabric will quickly recognise that a Lakehouse has no concept of a SQL primary key constraint, so you’ll need to define one in your dab-config.json. After that, it’s plain sailing.
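That missing-primary-key check is easy to replicate yourself before running DAB. Here’s a hedged Python sketch of the same idea, linting a dab-config.json for entities whose fields list lacks a primary-key flag (the sample entity mirrors the config above):

```python
import json

def entities_missing_primary_key(config: dict) -> list:
    """Return names of entities whose 'fields' lack a primary-key field."""
    missing = []
    for name, entity in config.get("entities", {}).items():
        fields = entity.get("fields", [])
        if not any(f.get("primary-key") for f in fields):
            missing.append(name)
    return missing

config = json.loads("""
{
  "entities": {
    "active_items": {
      "source": {"object": "dbo.active_items", "type": "table"},
      "fields": [{"name": "id", "primary-key": true}]
    }
  }
}
""")
print(entities_missing_primary_key(config))
```

An empty list means every entity has its primary key defined, which is exactly what a Lakehouse table won’t give you for free.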

Thanks for reading!
