Welcome to a series of three blogs where I’m going to explore Azure SQL Copilot. This series covers my experiences trying out Azure SQL Copilot. The first blog features a short introduction and a look at Copilot security. The following blogs will focus on a lot of testing. Remember, AI is constantly learning and my results may or may not ever show up on your screen. So don’t take them as absolutes but as responses at a point in time.
Having said that, let’s get cracking!
Copilot Introduction
These days, AI is becoming more and more prevalent in every Azure offering. You don’t have to be a magician or a visionary to predict that every resource in Azure will be connected to a form of Copilot. AI is storming in and we all have to, sort of, accept that. It will change our jobs, though I refuse to believe we’ll be out of a job any time soon.
In recent months, Microsoft has been working on a new offering, Copilot in Azure SQL, and to be honest, I couldn’t wait to get my hands on this one. As someone who regularly has to check out databases, analyse performance and do health checks, it’s very important for me to explore the capabilities of this Copilot. It might go off the rails at some point, but you never know, it might be useful for some things as well.

I don’t want to spend too much time on what Copilot is, but to start off, I’d like to show you a diagram of the different Copilots. There might be more by the time you see this, but it already shows that there are many different Copilots. If you can control all these Copilots (the One Copilot to rule them all), you’ll be amazing. The good thing about this diagram is that there is no general Copilot but every single one of these has its own speciality regarding the resource it’s connected to.
Copilot on Azure SQL
Coming to SQL Server in Azure (there’s no Copilot offering for SQL Server on-premises), there are two slightly different flavours. The one most people will use is the Copilot in the Azure portal. It’s quite easy to find and you can ask it a lot of questions, some of which it can actually answer. The other one is embedded in the query editor (strangely enough, still in preview).
So let’s start with some basics about Copilot in Azure and especially in Azure SQL. The following link will take you to the official documentation. As you can see, at the time of writing, it’s in preview. I don’t know when it will be Generally Available (GA).
Microsoft Copilot in Azure (preview) | Microsoft Learn
The main goals of Copilot are understanding your Azure environment, working smarter with your Azure resources and, finally, writing and optimising code. This sounds very basic and simple, but on the other hand, these are the main things we humans working with Azure and its resources are trying to achieve. Well, that and minimising cost, of course.
Use cases
I’ve been working with SQL Server since 2012 and, like you, I might call myself proficient in writing queries and managing SQL Server.
Azure has taken some management burden away from us, but it still needs a lot of love and care to perform. And, queries do not write themselves.
When ChatGPT became hot and happening, a number of people tested whether it could understand anything to do with SQL, and most of the findings were negative. It can do simple stuff, but when it gets complicated, it fails, hallucinates or sticks adamantly to its wrong answer. Those people moved away from AI for the difficult stuff, for now.
So why would this be different? And why would we, the experts, rely on someone else’s algorithm to do our work?
And my response is: don’t. Because Copilot isn’t there to do your work. It’s there to help you out, support you with the simple things or guide you towards your goal. You still need to check the output of Copilot. In the end, you’re the expert. You decide if the output meets your quality standards, if the output is good enough.
As a side note, the company I work for has colleges and universities as clients, and some of them offer IT as part of their curriculum. They’ve told us that part of their classes is shifting from core programming to asking AI how to program a certain part and focusing on interpreting the results. The students still need the creativity to create something new, but the tedious work of typing everything out can be left to AI.
But you still need to check the results. Most DBAs I know are control freaks and want to know exactly what’s going on. I might be one of them and you might be as well. And that’s perfectly fine.
The thing is, Copilot is here and I don’t think it will go away. So embrace it for what it can do and mutter about the things it can’t. We can still say no, even to Copilot!
Security
We care about our data. A lot! So what about security? Now there’s an interesting question. Because what happens with the data Copilot is processing?
First, I haven’t seen Copilot actually select data from the database, other than metadata. I will show proof of that in the following blogs.
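To give a sense of what “metadata only” means in practice, here’s a hedged T-SQL sketch of the difference. The first query reads the catalog views, which is the kind of schema information Copilot appears to reason about; the second reads actual rows, which is what I haven’t seen it do (dbo.Orders is a made-up table name, purely for illustration):

```sql
-- Metadata: table, column and type names from the catalog views.
-- This is the kind of information Copilot appears to work from.
SELECT t.name  AS table_name,
       c.name  AS column_name,
       ty.name AS data_type
FROM sys.tables AS t
JOIN sys.columns AS c ON c.object_id = t.object_id
JOIN sys.types  AS ty ON ty.user_type_id = c.user_type_id
ORDER BY t.name, c.column_id;

-- Actual row data: this is what I have NOT seen Copilot read.
-- (dbo.Orders is a hypothetical table for this example.)
SELECT TOP (10) * FROM dbo.Orders;
```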
Second, when you dig into the docs, you’ll see that data from prompts is stored in a separate subscription to which only employees with a secure admin workstation have access. Now, these docs are hard to digest, so I asked Microsoft if I could get some bullet points on this.
- All data/metadata/logs used for the service are stored within the originating geography.
- Prompts and responses are not stored or shared unless the customer decides to share them as part of the feedback process.
- Prompts and responses are also not used to train underlying foundation models.
- As for the Azure OpenAI endpoints, Azure OpenAI is not deployed in all regions yet. Data residency requirements get enforced in General Availability.
If you want to read more, click here: Frequently asked questions about Microsoft Copilot skills in Azure SQL Database (preview) – Azure SQL | Microsoft Learn
Then there’s your responsibility: if you grant a person access to the database and let them use Copilot, they can do all sorts of cool things. Deny access to the database and they won’t be able to do a thing.
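In other words, Copilot respects the normal SQL permission model; it can’t do anything on a user’s behalf that the user couldn’t do themselves. A hedged sketch of what that looks like in T-SQL for an Azure SQL database (the account name is a made-up example):

```sql
-- Create a database user for an Entra ID account (hypothetical name).
CREATE USER [jane@contoso.com] FROM EXTERNAL PROVIDER;

-- Grant read access: Copilot-assisted queries for this user
-- run under exactly these permissions, nothing more.
ALTER ROLE db_datareader ADD MEMBER [jane@contoso.com];

-- Remove the user again and neither they, nor Copilot on their
-- behalf, can read anything from this database.
DROP USER [jane@contoso.com];
```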
Concluding
Copilot for Azure SQL is in preview. It’s still learning about SQL, but on the other hand, it has a vast general Copilot knowledge base to connect to.
It isn’t storing your prompts for learning unless you explicitly allow Copilot to do this.
Copilot doesn’t read your data, it only reads your metadata.
Humans interacting with Copilot can’t do anything beyond their given database permissions.
Now that the basics have been established, the next blog will focus on the Azure portal experience where I’ll show you what you can do and where the limits are. The final blog will focus on the inline query editor.