SQL MCP: Local to Fabric Lakehouse

Let's be honest: you've thought about doing this yourself, connecting two tools that seem to serve very different purposes. But as I had a day to fool around, I thought to myself: let's see if I can make this work. In the previous blog post on MCP Local, I described how to run your MCP …

SQL and MCP, the local edition

In previous blog posts, I've written about installing MCP and how it works, followed by how to use it with an Azure SQL Database through an Azure Container Instance and Microsoft Foundry. But you can also use your favourite local database (SQL Server ;)) and Visual Studio Code. The reason Visual Studio Code is …

SQL and MCP, Azure SQL Edition

In part one, I covered the basics of your Data API Builder setup. Now it's time to get into the real stuff :). For the demo, I'm using the Stack Overflow 2013 database on an Azure SQL General Purpose Serverless 8-core instance. It's not a very large dataset (making it less expensive to host on …

SQL, Azure SQL and MCP, the Introduction

This three-part series of blog posts will take you through my first experiments, trials and errors using the MCP Service for SQL. This first blog will focus on the technology behind it; part 2 will focus on Azure SQL options; and the last part will dig a bit deeper into running this on your local …

DP-700 training: implement database projects

Some of you might recognise database projects from either Synapse Analytics or SQL Server. Yay for the latter ;). Long story very short: database projects are a way to develop and deploy your databases using a CI/CD (Continuous Integration/Continuous Deployment) approach. For each database, the code you've written is saved in a specific project …

Loadtesting SQL, the sequel

Some time ago, I wrote a number of blog posts comparing the different Azure SQL options to give you some idea about performance, differences between tiers and differences between the Stock Keeping Units (SKUs). This was done by creating data in the database itself and reviewing the metrics. That worked fine and gave a good overview …

Creating Azure Synapse Link for Dynamics: Step-by-Step Guide

I had an interesting morning trying to connect Dynamics to a data lake. As you might know, there was a feature where Dynamics would create a data lake for you. That option will cease to exist quite soon, and as an alternative you can create an Azure Synapse Link or a Fabric Link. This blog will …

Using GitHub Actions to deploy Azure resources with Terraform

Now, that's a title with a lot of terms. GitHub Actions: you'll probably know GitHub as the website where you can store your code, be agile with all sorts of branches, merges, etc., and keep track of your issues. There are also Actions, which are roughly the same as Azure DevOps pipelines. Written in YAML …

Azure Data Factory and Soap, an opera?

Getting data from an API can be hard, especially when you're trying to get data from a so-called SOAP interface. This is a bit of an antique way to distribute data to online applications and has a lot of challenges. I've read a few blogs on the subject, but funnily enough, they're all using the …
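The excerpt above is truncated, but the core of any SOAP call is the same: you POST an XML envelope to the endpoint with a `SOAPAction` header. A minimal sketch using only Python's standard library, for comparison with the Azure Data Factory approach; the service URL, namespace and `GetWeather` operation are hypothetical placeholders, not taken from the post:

```python
# Sketch: building and sending a SOAP 1.1 request with the standard library.
# The namespace and GetWeather operation below are made-up examples.
import urllib.request
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"


def build_envelope(operation: str, namespace: str, params: dict) -> bytes:
    """Wrap an operation and its parameters in a SOAP 1.1 envelope."""
    ET.register_namespace("soap", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    op = ET.SubElement(body, f"{{{namespace}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{namespace}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="utf-8", xml_declaration=True)


def call_soap(url: str, soap_action: str, envelope: bytes) -> bytes:
    """POST the envelope; SOAP 1.1 expects text/xml plus a SOAPAction header."""
    request = urllib.request.Request(
        url,
        data=envelope,
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": soap_action,
        },
    )
    with urllib.request.urlopen(request) as response:
        return response.read()


envelope = build_envelope(
    "GetWeather", "http://example.com/weather", {"City": "Utrecht"}
)
```

In Azure Data Factory, the equivalent is a Web or Copy activity doing the same POST with those same two headers; the hard part is usually shaping the envelope body correctly.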