The main reason people use KQL is usually as part of Azure Monitor Logs. And did you know that Azure Data Studio has another extension available specifically for Azure Monitor Logs? This allows you to connect to a Log Analytics workspace.
Both extensions are in preview at the time of writing; however, having used both, they seem fairly stable to me.
So, after a few seconds (probably less) the Azure Monitor Logs extension was fully installed and ready to go.
I confirmed this by creating a brand new notebook and seeing that the ‘Log Analytics’ kernel was available for use.
Now we need to connect to our Log Analytics Workspace.
So, let’s set up a new connection in Azure Data Studio. You’ll now have a new connection type – “Azure Monitor Logs”.
If you’re struggling to find the Workspace ID of your Log Analytics resource, just head to the overview page of the resource in the Azure Portal
Create a new Notebook.
Now we can simply pick the connection in the dropdown and start writing some Kusto Query Language (KQL)
Select the Kernel (LogAnalytics) and the connection (I called my connection “Log Analytics”)
Create a new code cell and enter some Kusto, here’s a starter.
// Error and exception count
// Show a column chart of the number of logs containing warnings or errors in the last 24 hours, per resource.
// Note: AzureDiagnostics is used here as an example source table – substitute a table from your own workspace that has a Level column.
AzureDiagnostics
| where TimeGenerated > ago(24h)
| where Level == "Warning" or Level == "Error"
| summarize count_per_app = count() by _ResourceId
| sort by count_per_app desc
| render columnchart
So, that’s the Azure Monitor Logs extension in Azure Data Studio.
It’s a great topic, and if you’ve ever been involved in a user group in any capacity then you’ll probably appreciate that it takes a fair bit of ingenuity, knowledge, connections, manners, begging, borrowing and getting help from others to eventually become a successful group.
The first thing to note is that you don’t necessarily need all of those things to start off with – but manners and the ability to persuade others to help will certainly be a great start.
And, with a few of those things in hand, “The Christchurch SQL Server User Group” was founded, run by myself, Rob Douglas [@rob_douglasNZ] and Nick Draper. Fortunately, Rob’s company lent us their office space, on the condition we tidied up after ourselves – a reasonable request, although I never envisaged carrying so many beer bottles and pizza boxes to the bin. I soon learnt that moving the bin closer to the attendees was a good idea.
The group was mainly founded around social events. Christchurch is a small(ish) place and we managed to get the word out pretty well – it wasn’t uncommon to have 20-30 people turn up even in the early days. We usually carried on the meeting in a local pub afterwards.
Soon we were contributing to other local events – such as the annual Christchurch Code Camp (thanks for the opportunity Steve Knutson (@nztechtweet)) – and we managed to get a SQL Saturday off the ground after Greg Low (@greglow) helped us get affiliated to PASS. The more community involvement, the better – but the local population is the lifeblood of your group. While national and international events are great and interesting, local is what’s going to build your group (at least where physical meetings are involved – that may be shifting with virtual). Think local, act global – and remember, every local community has some sort of individualism and uniqueness that you can probably leverage somehow.
We then managed to make another community connection with the local community college, which was quite fortunate as Rob was moving away from the city and the group needed a new home – so Ara stepped up and stepped in. The deal was the same: tidy up after yourself. This actually proved a little more challenging as they didn’t have big bins and the now growing number of pizza boxes wouldn’t fit. So, I needed to transport all the rubbish home myself – and I put rubbish bags on my (now growing) group night shopping list.
The group’s still based at Ara, and I know that members of the group (including myself) still do education sessions for their students from time-to-time. A mutually beneficial arrangement.
So, what did I learn? Well, you need to change as your audience changes. Initially, I knew nearly all of the members and I assumed they all liked the food and drink on offer – beer and pizza – and they did. But I should have been more accommodating as new members joined, catering more for people who didn’t want to drink beer or eat pizza.
I left Christchurch in early 2016; the group had grown significantly and it was time to hand it over. One of our ever-present members, Hamish Watson (@TheHybridDBA), took it over, and it’s gone from strength to strength ever since. Most members are keen to get involved in any capacity – while speaking may seem the obvious one, that’s not for everyone. Simply having your regular members help to make new members feel welcome and involved will go a long way.
Since leaving Christchurch, I’ve been involved in a few other groups. “SQL Social Melbourne”, run by Martin Cairney (@martin_cairney), is a great concept. Built around a social gathering in a local pub, it encourages short talks from various members and it was a great way of getting new speakers interested in the local / national / international communities. Unfortunately, the pandemic has curtailed that – at least temporarily, but I reckon we’ll see it make a comeback soon. If you’re in Melbourne, keep your eye out.
More recently, I moved to Brisbane and became involved in virtual user groups with Warwick Rudd (@Warwick_Rudd). We run Data Platform Down Under, which meets at 12:15pm AEST (UTC+10) on a Tuesday – usually the first Tuesday of the month, but this can vary and we often have additional sessions (still on a Tuesday). Check out our times – they are great for most parts of the world (although maybe not Europe – sorry Europe!)
Which brings me to another point – “Never turn down the opportunity of free publicity for your group”
And with that I’ll say thanks, Rie – and to everybody reading this, best of luck with all your user groups, whether you’re a current group host, a future group host or a valued (or prospective) community member.
Hopefully, you’re aware that Azure Data Explorer can be accessed by anybody with a valid Microsoft Account identity – or MSA for short. If you need one, then hop along to Outlook.com and pick up a free MSA.
After that, head along to dataexplorer.azure.com and log in with your MSA. You’ll find that Azure Data Explorer is attached to a cluster that you did not create. This is the ‘help’ cluster that Microsoft uses to get folks up to speed on Azure Data Explorer and KQL.
Your screen (browser) should look something like this.
You’ll notice that you’re attached to the ‘help’ cluster, which has 4 databases, and that an initial script is in the main window.
Notice too that right at the top of the main window (the one that contains the script) you can see the name of the ADX cluster and database that you are attached to.
The script that’s open in the main window is designed to run on the “@help/Samples” cluster / database.
Make sure you’re attached to that and highlight the first query (if you’re not on the correct database then you’ll just get an error)
This is Kusto Query Language or KQL
This is basically the equivalent of a T-SQL query of the form
SELECT TOP(10) * FROM table
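For context, the KQL side of that comparison looks something like this – I’m assuming the Samples database’s StormEvents table here, so substitute whatever table your copy of the script actually references:

```kusto
// return 10 rows from the StormEvents table (no ordering is implied)
StormEvents
| take 10
```

Note that, like TOP without an ORDER BY, take makes no promise about which 10 rows you get back.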
You’ll also notice that the next query in the script looks something like this
And if you were to translate this into T-SQL, you’re basically asking for a computed / derived column and specifying a list of columns that you’d like back.
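As a sketch of what such a query can look like – again assuming the Samples StormEvents table, with illustrative column choices:

```kusto
// extend adds a computed / derived column; project specifies the columns to return
StormEvents
| extend Duration = EndTime - StartTime
| project StartTime, EndTime, Duration, EventType, State
| take 10
```

The extend / project pair maps fairly naturally onto a derived column plus a SELECT column list in T-SQL.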
If you’ve worked with T-SQL for some time, you’ll probably see some similarities but a lot of differences.
ADX and KQL have been built from the ground up with Data Analytics in mind.
Like many things at Microsoft, it was developed internally and consequently saw great internal adoption – hence it’s now being made available as a product to the rest of the world.
KQL is used heavily in Log Analytics, where we can join different tables of immense size in order to interrogate performance issues with our Azure systems – and get the answers really quickly, even when we have terabytes (or even petabytes) of data involved.
This is the type of interrogation that people might have used products such as Splunk for previously.
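To give a flavour of that, here’s a sketch of the sort of cross-table query you might run in a Log Analytics workspace – Perf and Heartbeat are standard Log Analytics tables, although your workspace may not have data in them:

```kusto
// average CPU per computer over the last hour, but only for
// computers that have sent a heartbeat in that window
Perf
| where TimeGenerated > ago(1h)
| where CounterName == "% Processor Time"
| join kind=inner (Heartbeat | where TimeGenerated > ago(1h) | distinct Computer) on Computer
| summarize AvgCpu = avg(CounterValue) by Computer
| sort by AvgCpu desc
```

Filtering both sides on TimeGenerated before the join keeps the amount of data being joined down, which matters at these data volumes.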
It’s worthwhile playing with this ‘help’ cluster and getting to know the KQL language, it’s looking like the adoption rate could be huge – and it’s always good to know your options, even if you find out that this is not for you.
Also, have a look at the connection string. Press the ‘Edit’ button in the bar that states the name of the ‘help’ cluster
And we see the connection string, and have the opportunity to change the display name of the cluster – if we like.
If you wish to query the cluster inside Azure Data Studio using the KQL extension then that’s possible too – see this post for details.
While KQL might be a little different to T-SQL, most database professionals will easily ‘acclimatise’ to the new language and be up and running quite quickly.
Note that this new email address has to be an MSA, although you do not have to sign up for any Azure account – free or otherwise.
After that, there’s a big button in the middle of the screen that will allow you to create a brand new free cluster – and database as it turns out.
Fill in a few details in the subsequent dialog box
Use your own cluster and database names above. Accept the terms and conditions and enjoy.
As you might expect, you get all of the core features – but you’ll get some more if you upgrade at some stage. The table on this page will outline exactly what you are getting. The free cluster lasts for one year – but the docs seem to indicate that this could be extended. However, you’ll probably have a good idea of functionality after one year.
So, what are you waiting for? Try out the new free cluster.
We’ll look at getting some data in there next time, but if you can’t wait for that there are some more details here.
In the not too distant past, I’ve done quite a lot of migrations to Azure SQL Managed Instance, so I thought about writing about that. But that seems more of an ‘uplift’ than an upgrade. I guess that’s an arguable point – depending on your perspective.
Anyway, just recently, I upgraded a SQL 2008 instance to SQL 2019. Quite straightforward, I thought. And it was. But I ran into some unexpected events along the way that I thought might make an interesting post.
One of the first things that I like to do is talk to the person who wants the job doing – the client 🙂
I like to find out their motivations and reasons for doing an upgrade, which would seem obvious in this case as SQL 2008 is out of extended support. This was a fully on-premises upgrade, but I also like to check the facts.
There were a number of servers to upgrade, as you might expect – DEV, TEST, UAT and PROD.
Interestingly enough, the server in the TEST environment had previously been upgraded to SQL 2016 with the user databases left in 2008 compatibility mode. Circumstances like this are always intriguing, and a few questions spring to mind – such as why only one environment was tackled, why it was left, and how this affects any code that makes its way through the various environments and eventually lands in the PROD environment.
Anyway, that aside, one of the things I like to reference is Compatibility Certification – in recent years Microsoft has started to improve guidance and make certain guarantees around SQL Server upgrades, helping organisations to better manage risk.
It’s not a short article, but also not the longest one that you’ll come across. If you’re doing upgrades regularly (or even if you are not) then I’d bookmark it, as the words certification and compliance will most likely lead search engines down other routes.
After investigating the environments, I’d normally put a plan together of how each environment will be upgraded and the testing that will be done, both by me and by the client. I’d also be keen to know their Change Advisory Board (CAB) processes and get a rough indication of how long these would take.
Oh, and is this an in-place upgrade or a side-by-side upgrade (which some might refer to as a migration)? In this case it was an in-place upgrade. There are pros and cons to both, and it depends on many factors such as risk, downtime and complexity – to name just a few.
I always favour downloading and running Database Migration Assistant (DMA).
DMA works fine for on-premises migrations / upgrades – it’s not just for the cloud. DMA produces both JSON and XLSX files that list out issues. I’ve seen a few open source solutions that pull the results into a data warehouse and then use a Power BI solution to visually represent any issues. Microsoft used to have one, but it seems to have disappeared at the moment (please let me know if you see it).
Then there’s the issue of upgrading instances that are out of support. You might require a ‘double hop’ – basically where there are multiple upgrades involved, for instance upgrade to 2016 and then upgrade to 2019.
It’s all fun and games till somebody loses an eye 🙂 – so plan well.
Sometimes simple upgrades turn out to be way more time consuming and problematic than you imagine.
Over the last few years I’ve had the privilege of helping people start on their Azure journey. Some come along with Azure Subscriptions, others just want to get started.
While there are multiple ways of getting Azure subscriptions, I’m going to concentrate on how to get a free subscription and how I would walk through this with somebody who was doing this for the very first time.
On the way, there are a few interesting points that come up, which I feel are beneficial to introduce at this point – and drill into in more details once more experience has been gained. After all, whenever we start something new there’s always that element of culture shock and finding your feet – at least there is with me.
The very first thing that I would advise somebody to do is to get a brand new Microsoft account (known as an MSA) – the simplest thing to do is head to outlook.com and open a brand new account. These accounts have gone by various different terms over the years (such as Passport) – but nowadays it’s just an MSA.
Once that has been set up, head to https://portal.azure.com/ and log in using the MSA credentials that you have just created.
At this point, the question normally arises as to why a new MSA is recommended. Well, my answer to that is that a brand new (outlook.com) account will have never been used with any Azure infrastructure previously.
I’d avoid a company account, even a new one, as you’ll likely end up being linked to their Azure Tenant. (I’ll blog more on this later)
At the portal we should see a screen like the following. We’ll be hitting the “Free Trial” button soon – but we’ll have a look around first.
Before we provision our free trial, let’s take a look around Azure Active Directory. As we progress on our journey, it’s important to understand the relationship between an Azure Tenant and an Azure Subscription. An Azure Tenant is used by Azure Active Directory – at present we do not have an Azure Tenant, but we do have the ability to look around Azure Active Directory. Type “Azure Active Directory” in the blue search box at the top of the screen and let’s look around.
Upon choosing the above option, you’ll see a menu appear on the left-hand side of the screen. Have a look around this. You won’t be able to create anything (if a button does appear that seems to let you, chances are you’ll get an error when you press it, as at this point you do not have an Azure Tenant).
You’ll see that here you can create Users, Groups, Administrative units and various other things. Just have a look, browse around and get curious about what’s on offer – but remember, at this stage you DO NOT have an Azure Active Directory Tenant (Azure AD). After you have finished looking around then head back to the home page by clicking the “Home” link in the top left hand corner of the browser.
This will return you to the home screen, where you were originally – now select the Free Trial button.
At this point, you’re going to need to verify your identity. You’ll do this using your credit card.
This usually is the point at which people are hesitant – and I totally understand.
However, the free trial gives $200 worth of free credit and there is a spending limit that WILL NOT allow you to go over it. Yes, you can take steps to remove it, but these are fairly involved – removing the spending limit is unlikely to be something that you will do by accident. Also note that you can only use a credit card for one free trial.
After filling out your details you’ll end up back at the home screen, it’ll look different now.
Notice in the top left hand corner, your account appears.
You’ve also now got a tenant. If you used an outlook.com account like I did, the Azure Tenant will take the form:
emailname + outlook.onmicrosoft.com
For example, if your email address is myazureaddress@outlook.com, then your Azure Tenant is myazureaddressoutlook.onmicrosoft.com.
More on this another time.
Type “subscription” into the search bar and check out your new subscription.
You can rename the subscription. Just click on the subscription name. This will take you to the Subscription overview page. From there, locate the name, click it and rename – remember to hit “Save”
So now we are all set up.
At this point we have an Azure Tenant and an Azure Subscription.
We’ll use the Azure Tenant for Authentication.
We’ll use the Subscription for Authorization, with Azure Role Based Access Control – or RBAC as it is commonly known.
We’ll use the Tenant and Subscription for other things too, but Authentication and Authorization are big ticket items. Identity is a huge concept.
One other thing to realize is that Azure Active Directory (Azure AD) is NOT Windows Active Directory in the cloud. While there are similarities, there are also huge differences.
One of them is Authentication – for example, Windows Active Directory uses Kerberos while Azure Active Directory uses open standards such as OpenID Connect.
I recently renewed one of my Azure certifications, extending it by another year from the original expiry date.
You can extend any certification that is about to expire within the next 6 (six) months. You can’t extend them if the expiry date is any longer than 6 months in the future.
The process was quite straightforward and I was surprised at the ease with which I was able to walk through the whole thing (not the actual questions – those required a bit of thought 🙂 ) – especially as I had previously taken many online exams at home and had to verify the integrity of the temporary exam room that I had set up.
The only difficulty that I had with the process was actually starting it.
I remember that I got an email a few weeks ago stating that I needed to renew within 6 months – well, I now could not find that email and I wanted to get the renewal done.
After a bit of research, I found that I needed to associate my certification profile with a Microsoft docs profile.
After linking my account, I found that the certifications that need renewing are listed under “Certifications” on your docs home page.
Just hit “Certifications” in the left-hand menu and then any exams that need renewing will appear in the middle pane with a “Renew” button next to them.
Then you are straight into the exam, a series of untimed questions that will cover some of the latest enhancements and just ensure your knowledge is still current.
I’ve based my career around Microsoft products on the Data Platform.
The first product that I seriously learnt was Microsoft SQL Server back around 2000.
I was lucky enough to be employed as a database developer at the time and I decided to go for the MCDBA certification.
I remember that I felt OK with the development stuff, but there was an exam on administration – a topic that I was a little shaky on. I’d also opted to take an exam on Windows 2000 as part of the certification set, and this was the one that I was most uncomfortable with.
So, I relied on advice from colleagues (I was in a team of network engineers at the time) – but that would only get me part of the way.
Reading books and Microsoft articles (Books Online was not as good as the equivalent Microsoft docs of today) is OK, but I personally find that the information does not stick. I began working through scenarios from the docs, but I found these basic.
I discovered two Microsoft-endorsed companies who produced practice tests – Self-Test Software (who unfortunately are no more) and MeasureUp, who still provide a great number of practice tests. These provided me with more than a few problems to work through, and I found the knowledge stuck a lot more than it did when I simply read.
When it was time to take the test, I wondered how quickly I would pick up how to navigate the exam – not the content and questions, but the user interface and question types – and whether I’d waste time on the question format rather than the question itself.
As it turned out, this was quite straightforward.
If I was in the same position today, I could have used the new Microsoft Exam Simulator that can be found here. And I encourage anybody who has never taken a Microsoft exam – but intends to – to have a look and get familiar with the question types.
Even as a regular test taker, there are a few parts of the exam engine that I’ve never explored – such as changing the contrast and suchlike.
Apart from getting to know the engine, there are a few strategies that I normally use myself.
I always fully answer every question the first time around – even if I don’t know, I attempt to work it out and give it my best shot (in the extreme worst case, I guess). However, I’ll always mark that question for review. Occasionally I’ll also mark it for feedback. Note that there is a section at the end of the exam (after you’ve completed it) where you can separately go back to questions and give feedback – this helps Microsoft make questions better for others.
After you complete the last question you see a dashboard letting you know such things as questions you have not (fully) answered (it occasionally happens) and questions that you have marked for review – I always concentrate on those categories.
The main point of this post is to have an exam taking strategy – this might include the things I’ve described above.
One of my previous blogs looked at using the free resources that Microsoft provides to link to an Azure Data Explorer Cluster and start using Kusto Query Language.
In this post, I want to look at doing a similar thing, but this time using Azure Data Studio (ADS). The cluster that we will link to will be the same one that we previously used when we linked to an ADX cluster via the Azure Portal.
Open Azure Data Studio (ADS) and install the KQL extension.
Now select “New Notebook” – then select the Python Kernel
If you don’t have a Python installation then don’t worry – as soon as we execute our first command, ADS will prompt you to install one.
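For what it’s worth, the Python kernel talks to the cluster via the Kqlmagic package (pip install Kqlmagic). As a sketch – assuming the public ‘help’ cluster and its Samples database – the first few notebook cells might look like this:

```
%reload_ext Kqlmagic
%kql azureDataExplorer://code;cluster='help';database='Samples'
%kql StormEvents | take 10
```

The code part of the connection string triggers a device-code sign-in with your MSA, so you’ll be prompted to authenticate in the browser the first time.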
I’ve worked with Log Analytics quite a bit over the years and that’s taken me down the road of the Kusto Query Language – or KQL
So, using Azure Data Explorer Clusters seemed the next logical step on that journey.
Log Analytics is great for storing and querying diagnostic data from Azure (or any other infrastructure) – but I decided that I might want to investigate my own data.
So, with that in mind – I decided to set up my own Azure Data Explorer Cluster and start looking around.
It’s easy enough to find in the Azure Portal – I chose to favorite it as I expect to be using it a fair bit over the next little while
You can also get to it via a shortcut
Next, I selected “Create” and then filled in the minimum amount of information needed to get up and running.
Just a few things to focus on here – firstly, the combination of the cluster name and the region is going to form part of a URI – so this has to be unique, and there are certain formats that the name has to meet, but you’ll soon know if you have not met them as the screen will call them out instantly.
Additionally, I’m just creating a Dev cluster here – note that there is no SLA.
Also, the region that I chose has Availability Zones – yours might not, or you might choose not to take advantage of them. I chose to include all three.
You’ll then notice that you can step through all of the remaining setup.
I simply accepted the defaults, but it’s great to know you can define additional settings at this stage.
It’ll take a little time to set the cluster up – but you’ll soon be up and running.
And once we are up and running we can see that the URI is made up of the cluster name and region
In order to create a database on the cluster head to the resource, click on “Databases” in the left hand side menu (scrolling may be needed), click “Add database” near the top of the screen and then enter a name for the database in the fly-out pane.
That’s it – we now have our first Azure Data Explorer Cluster created with a new shiny database.
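As a quick sanity check – assuming you connect to the new cluster via the web UI at dataexplorer.azure.com or the ADS KQL extension – the built-in .show databases management command will list what’s there:

```kusto
// list the databases on the cluster you're currently connected to
.show databases
```

Your new database should appear in the results, ready for you to start ingesting data.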