Data Thoughts – T-SQL Tuesday #148 – Advice on running a user group

G’day,

This month’s T-SQL Tuesday (March 2022) invite is brought to you by Rie Merritt (@IrishSQL)

Rie has asked us to write about “Advice on running a user group”.

It’s a great topic, and if you’ve ever been involved in a user group in any capacity then you’ll probably appreciate that it takes a fair bit of ingenuity, knowledge, connections, manners, begging, borrowing and getting help from others to eventually become a successful group.

The first thing to note is that you don’t necessarily need all of those things to start off with – but manners and the ability to persuade others to help will certainly be a great start.

And, with a few of those things in hand, “The Christchurch SQL Server User Group” was founded, run by myself, Rob Douglas (@rob_douglasNZ) and Nick Draper. Fortunately, Rob’s company lent us their office space, on the condition we tidied up after ourselves – a reasonable request, although I never envisaged carrying so many beer bottles and pizza boxes to the bin. I soon learnt that moving the bin closer to the attendees was a good idea.

The group was mainly founded around social events. Christchurch is a small(ish) place and we managed to get the word out pretty well – it wasn’t uncommon to have 20-30 people turn up even in the early days. We usually carried the meeting on in a local pub afterwards.

Soon we were contributing to other local events – such as the annual Christchurch Code Camp (thanks for the opportunity, Steve Knutson (@nztechtweet)) – and we managed to get a SQL Saturday off the ground after Greg Low (@greglow) helped us get affiliated to PASS. The more community involvement, the better – but the local population is the lifeblood of your group. While national and international connections are great and interesting, local is what’s going to build your group (at least where physical meetings are involved; that may be shifting with virtual ones) – think local, act global. Remember, every local community has some sort of individualism and uniqueness that you can probably leverage somehow.

We then managed to make another community connection with the local community college, which was quite fortunate as Rob was moving away from the city and the group needed a new home – so Ara stepped up and stepped in. The deal was the same: tidy up after yourselves. This actually proved a little more challenging, as they didn’t have big bins and the now-growing pile of pizza boxes wouldn’t fit. So I needed to transport all the rubbish home myself – and rubbish bags went onto my (now growing) group-night shopping list.

The group’s still based at Ara, and I know that members of the group (including myself) still do education sessions for their students from time to time. A mutually beneficial arrangement.

So, what did I learn? Well, you need to change as your audience changes. Initially, I knew nearly all of the members and I assumed they all liked the food and drink on offer – beer and pizza – and they did. But I should have been more accommodating as new members joined, catering more for people who didn’t want to drink beer or eat pizza.

By the time I left Christchurch in early 2016, the group had grown significantly and it was time to hand it over. One of our ever-present members, Hamish Watson (@TheHybridDBA), took it over, and it’s gone from strength to strength ever since. Most members are keen to get involved in some capacity; while speaking may seem the obvious one, it’s not for everyone. Simply having your regular members help to make new members feel welcome and involved will go a long way.

Since leaving Christchurch, I’ve been involved in a few other groups. “SQL Social Melbourne”, run by Martin Cairney (@martin_cairney), is a great concept. Built around a social gathering in a local pub, it encourages short talks from various members, and it was a great way of getting new speakers interested in the local / national / international communities. Unfortunately, the pandemic has curtailed that – at least temporarily – but I reckon we’ll see it make a comeback soon. If you’re in Melbourne, keep your eye out.

More recently, I moved to Brisbane and became involved in virtual user groups with Warwick Rudd (@Warwick_Rudd). We run Data Platform Down Under, which meets at 12:15pm AEST (UTC+10) on a Tuesday – usually the first Tuesday of the month, but this can vary, and we often have additional sessions (still on a Tuesday). Check out our times – they are great for most parts of the world (although maybe not Europe – sorry, Europe!).

Data Platform Down Under – worldwide meeting times (always Tuesdays)

Which brings me to another point: “Never turn down the opportunity of free publicity for your group”.

And with that, I’ll say thanks, Rie – and to everybody reading this, best of luck with all your user groups, whether you’re a current group host, a future group host or a valued (or prospective) community member.

Have a great day

Cheers

Marty.

Data Thoughts – T-SQL Tuesday #147 – Upgrade Strategies

G’day,

This month’s T-SQL Tuesday (February 2022) invite is brought to you by Steve Jones (@way0utwest)

Steve’s asked us to write about “Planning for upgrades”.


In the not-too-distant past, I’ve done quite a lot of migrations to Azure SQL Managed Instance, so I thought about writing about that. But that seems more of an ‘uplift’ than an upgrade. I guess that’s an arguable point – depending on your perspective.

Anyway, just recently, I upgraded a SQL 2008 instance to SQL 2019. Quite straightforward, I thought. And it was. But I ran into some unexpected events along the way that I thought might make an interesting post.

One of the first things that I like to do is talk to the person who wants the job doing – the client 🙂

I like to find out their motivations and reasons for doing an upgrade, which might seem obvious in this case, as SQL 2008 is out of extended support. This was a fully on-premises upgrade – but I always like to check the facts anyway.

There were a number of servers to upgrade, as you might expect – DEV, TEST, UAT and PROD.

Interestingly enough, the server in the TEST environment had previously been upgraded to SQL 2016 with the user databases left in 2008 compatibility mode. Circumstances like this are always intriguing, and a few questions spring to mind, such as why only one environment was tackled, why it was left, and how this affects any code that makes its way through the various environments and eventually lands in the PROD environment.
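If you ever need to check where each database actually sits, sys.databases shows the compatibility level directly. A quick sanity-check query:

```sql
-- Which compatibility level is each database actually running at?
SELECT name, compatibility_level
FROM   sys.databases
ORDER BY name;
```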

Anyway, that aside, one of the things I like to reference is Compatibility Certification – in recent years Microsoft has started to improve guidance and make certain guarantees around SQL Server upgrades, helping organisations to better manage risk.

It’s not a short article, but also not the longest one that you’ll come across. If you’re doing upgrades regularly (or even if you’re not) then I’d bookmark it, as the words ‘certification’ and ‘compliance’ will most likely lead search engines down other routes.
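For what it’s worth, the workflow that guidance encourages is roughly: upgrade the engine, keep each database at its old compatibility level, capture a workload baseline with Query Store, then raise the compatibility level and fix any regressions with forced plans. A minimal sketch of those steps (the database name is hypothetical):

```sql
-- 1. After the engine upgrade, leave the database at its old compatibility
--    level and turn on Query Store to capture a performance baseline.
ALTER DATABASE [AppDb] SET QUERY_STORE = ON;
ALTER DATABASE [AppDb] SET QUERY_STORE (OPERATION_MODE = READ_WRITE);

-- 2. Once a representative workload has been captured, move up to the
--    new compatibility level.
ALTER DATABASE [AppDb] SET COMPATIBILITY_LEVEL = 150;  -- SQL Server 2019

-- 3. Any queries that regress can then be pinned to their previously
--    known-good plans (sp_query_store_force_plan or the Query Store reports).
```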

After investigating the environments, I’d normally put together a plan of how each environment will be upgraded and the testing that will be done, both by me and by the client. I’d also be keen to know their Change Advisory Board (CAB) processes and get a rough indication of how long the whole exercise would take.

Oh, and is this an in-place upgrade or a side-by-side upgrade (which some might refer to as a migration)? In this case it was an in-place upgrade. There are pros and cons to both, and it depends on many factors such as risk, downtime and complexity – to name just a few.

I always favour downloading and running the Data Migration Assistant (DMA).

DMA works fine for on-premises migrations / upgrades – it’s not just for the cloud. DMA produces both JSON and XLSX files that list out issues. I’ve seen a few open-source solutions that pull the results into a data warehouse and then use a Power BI solution to visually represent any issues. Microsoft used to have one, but it seems to have disappeared at the moment (please let me know if you see it).
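If you’d rather roll your own reporting, the JSON output can be shredded with OPENROWSET and OPENJSON. A rough sketch – the file path is made up and the JSON property names are illustrative rather than the documented schema, so check them against a real DMA output file:

```sql
-- Load a DMA assessment file and shred it into rows
-- (path and property names are illustrative, not the documented schema).
SELECT dbs.DatabaseName, dbs.Recommendations
FROM OPENROWSET(BULK 'C:\DMA\AssessmentReport.json', SINGLE_CLOB) AS jsonfile
CROSS APPLY OPENJSON(jsonfile.BulkColumn, '$.Databases')
WITH (
    DatabaseName    NVARCHAR(256) '$.Name',
    Recommendations NVARCHAR(MAX) '$.AssessmentRecommendations' AS JSON
) AS dbs;
```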

Then there’s the issue of upgrading instances that are out of support. You might require a ‘double hop’ – basically, where there are multiple upgrades involved; for instance, upgrading 2008 to 2016 and then 2016 to 2019.
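Before planning the hops, it’s worth confirming exactly what build you’re starting from:

```sql
-- Confirm the exact build, service pack and edition before planning the path.
SELECT SERVERPROPERTY('ProductVersion') AS Build,       -- 10.x means 2008 / 2008 R2
       SERVERPROPERTY('ProductLevel')   AS ServicePack,
       SERVERPROPERTY('Edition')        AS Edition;
```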

It’s all fun and games till somebody loses an eye 🙂 – so plan well.

Sometimes simple upgrades turn out to be way more time-consuming and problematic than you imagine.

Take care out there.

Have a great day.

Cheers

Martin.

Data Thoughts – Azure and Exchange rates

G’day,

I noticed something in the Azure portal the other day that had escaped my attention previously. Normally when I’m in Azure I’m using a corporate subscription, but on this particular day I was helping somebody who had just opened a “Free Trial” account.

The “Free Trial” account gives US$200 of credit. But this account was based in Australia, so it is billed in Australian dollars.

Upon provisioning the Free Trial account, we immediately headed to “Cost Management and Billing”. Notice that there are a few links to educational content on how to interpret things.

Next, we headed to “Payment Methods” in the left-hand menu and selected the “Azure Credit” tab.

Azure Credits – Free Account

I wondered if this was a good exchange rate.

So I compared the exchange rate that Google showed today with the exchange rate shown in the Azure portal (for the date of this post).

Not bad, Microsoft.

Hopefully, your exchange rates are just as good – I’m sure they are.

Have a great day.

Cheers

Martin.

Data Thoughts – Renewing Microsoft Certifications

G’day,

I recently renewed one of my Azure certifications, extending it by another year from the original expiry date.

You can renew any certification that is due to expire within the next six months. You can’t renew one whose expiry date is more than six months away.

The process was quite straightforward and I was surprised at the ease with which I was able to walk through the whole thing (not the actual questions – those required a bit of thought 🙂 ) – especially as I had previously taken many online exams at home and had to verify the integrity of the temporary exam room that I had set up.

The only difficulty that I had with the process was actually starting it.

I remember that I got an email a few weeks ago stating that I needed to renew within six months. Well, I now could not find that email, and I wanted to get the renewal done.

After a bit of research, I found that I needed to associate my certification profile with a Microsoft Docs profile. Once the accounts were linked, any certifications due for renewal were listed under “Certifications” on my Docs profile page.

Just hit “Certifications” in the left-hand menu and any exams that need renewing will appear in the middle pane with a “Renew” button next to them.

Then you are straight into the renewal assessment – a series of untimed questions that cover some of the latest enhancements and simply ensure your knowledge is still current.

Good luck.

Have a great day.

Cheers

Martin.

Data Thoughts – Microsoft Exams and Simulations

G’day,

I’ve based my career around Microsoft products on the Data Platform.

The first product that I seriously learnt was Microsoft SQL Server back around 2000.

I was lucky enough to be employed as a database developer at the time and I decided to go for the MCDBA certification.

I remember that I felt OK with the development stuff, but there was an exam on administration – a topic that I was a little shaky on. I’d also opted to take an exam on Windows 2000 as part of the certification set; this was the one that I was most uncomfortable with.

So, I relied on advice from colleagues (I was in a team of network engineers at the time) – but that would only get me part of the way.

Reading books and Microsoft articles (Books Online was not as good as today’s equivalent Microsoft Docs) is OK, but I personally find that the information does not stick. I began working through scenarios from the docs, but I found these too basic.

I discovered two Microsoft-endorsed companies who produced practice tests – Self-Test Software (who unfortunately are no more) and MeasureUp, who still provide a great many practice tests. These gave me more than a few problems to work through, and I found the knowledge stuck a lot better than it did when I simply read.

When it was time to take the test, I wondered how quickly I would pick up how to navigate the exam – not the content and questions, but the user interface and question types – and whether I’d waste time on the question formats rather than the questions themselves.

As it turned out, this was quite straightforward.

If I was in the same position today, I could have used the new Microsoft exam simulator that can be found here. And I encourage anybody who has never taken a Microsoft exam – but intends to – to have a look and get familiar with the question types.

Even as a regular test taker, there are a few parts of the exam engine that I’ve never explored – such as changing the contrast and other such things.

Apart from getting to know the engine, there are a few strategies that I normally follow myself.

I always fully answer every question first time around; even if I don’t know the answer, I attempt to work it out and give it my best shot – an educated guess in the extreme worst case. However, I’ll always mark that question for review. Occasionally I’ll also mark it for feedback. Note that there is a section at the end of the exam (after you’ve completed it) where you can separately go back through questions and give feedback – this helps Microsoft make the questions better for others.

After you complete the last question you see a dashboard letting you know such things as questions you have not (fully) answered (it occasionally happens) and questions that you have marked for review – I always concentrate on those categories.

The main point of this post is to have an exam-taking strategy; this might include:

  • Visit the exam simulator
  • Answer every question first time around – never miss one.
  • Mark a question for review if you are not confident with the answer you gave.
  • Give feedback on questions at the end of the exam if appropriate.
  • Don’t rush – plan your time carefully and use the remaining time to check answers.
  • Relax – have fun and pass.

And after you’ve successfully passed the exam, remember to claim your badge and let people know about your achievement.

Have a great day

Cheers

Marty

What exactly is a modern data warehouse?

G’day,

I was asked recently what a modern data warehouse is – and that’s a very thought-provoking question.

First, there’s the tag “modern” – I think what’s modern now may not be quite so modern in a year or two’s time – maybe sooner, judging by the speed of some technology changes at present.

My initial response – jokingly – was ‘it depends who you ask’, which is a pretty useless – but true – answer.

Firstly, let me put some context around my answer. I’ve built my career around Microsoft technologies, so my answer is going to centre around their latest offerings – which are going to be all cloud-based and essentially incorporate Platform-as-a-Service (PaaS) functionality. Your experience may be different: you might be in a different cloud, you might have different ideas – but for me it’s PaaS services in the Azure cloud. I’ll expand on this shortly.

But let’s (quickly) look back on how I’ve seen data warehouses develop – in my Microsoft world – over the years. They used to be built simply on SQL Server, incorporating fact and dimension tables. There’d be debate about star and snowflake schemas, about Kimball or Inmon. More recently people talk about Data Vault. Essentially, people pulled data from their operational (transactional) systems into their data warehouse (decision support system, enterprise data warehouse, or whatever they chose to call it).
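For anyone who hasn’t seen one, a star schema boils down to a central fact table keyed to a handful of dimension tables – a minimal sketch, with hypothetical names:

```sql
-- A tiny star schema: one fact table surrounded by its dimensions.
CREATE TABLE dbo.DimDate (
    DateKey   INT         PRIMARY KEY,  -- e.g. 20220308
    FullDate  DATE        NOT NULL,
    MonthName VARCHAR(10) NOT NULL
);

CREATE TABLE dbo.DimProduct (
    ProductKey  INT          IDENTITY PRIMARY KEY,  -- surrogate key
    ProductCode VARCHAR(20)  NOT NULL,              -- business key
    ProductName VARCHAR(100) NOT NULL
);

CREATE TABLE dbo.FactSales (
    DateKey    INT            NOT NULL REFERENCES dbo.DimDate (DateKey),
    ProductKey INT            NOT NULL REFERENCES dbo.DimProduct (ProductKey),
    Quantity   INT            NOT NULL,
    Amount     DECIMAL(18, 2) NOT NULL
);
```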

They did this using some Extract, Transform and Load (ETL) tool (commonly called a data integration tool), like SQL Server Integration Services.

They might have had a single data source (one transactional system) or they may have had several – so staging the data first made sense, then cleaning it, and then putting it into the main data warehouse.

We’d have to deal with things like slowly changing dimensions, and very often we may not have modelled the data exactly as it is in the real world.
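As a concrete example of the slowly-changing-dimension problem, here’s a minimal Type 2 sketch: expire the current row and insert a new version whenever a tracked attribute changes. The table and column names are hypothetical:

```sql
-- Expire the current dimension row when a tracked attribute (City) changes...
UPDATE dim
SET    dim.IsCurrent = 0,
       dim.ValidTo   = SYSDATETIME()
FROM   dbo.DimCustomer AS dim
JOIN   stg.Customer    AS src ON src.CustomerCode = dim.CustomerCode
WHERE  dim.IsCurrent = 1
  AND  src.City <> dim.City;

-- ...then insert a fresh current row for new and changed customers alike
-- (after the update above, neither group has a row flagged as current).
INSERT INTO dbo.DimCustomer (CustomerCode, Name, City, ValidFrom, ValidTo, IsCurrent)
SELECT src.CustomerCode, src.Name, src.City, SYSDATETIME(), NULL, 1
FROM   stg.Customer AS src
LEFT JOIN dbo.DimCustomer AS dim
       ON  dim.CustomerCode = src.CustomerCode
       AND dim.IsCurrent    = 1
WHERE  dim.CustomerCode IS NULL;
```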

We might put analytical models over the data warehouse; initially these were multidimensional (queried with MDX), later Tabular. Then we’d report either straight off the data warehouse or off the analytical model – using tools like SQL Server Reporting Services or, lately, Power BI.

For me, things have moved to the cloud and our options have expanded (or should I say exploded).

We can take data from literally any data source and land it in our Azure data lake. This area can be segregated, as it likely contains our raw data. We can now keep this – untouched – as space is no longer an impediment (OK, it still costs money, but it’s cheap, and we can have different tiers of storage – hot / cool / archive – trading off access times against cost). We also don’t have to worry about upgrading and migrating data at any time – this is all part of the service provided by a modern cloud.

We can then start segregating the data into different areas (of the data lake), say a staging area, a clean area, and maybe an area full of curated data sets.

We can pull in different types of data in different formats – CSV, JSON, Avro, Parquet, and more.

Once we have the data in our desired area we can then pull it into a traditional data warehouse – say in Azure SQL Database, for instance (although there may be better cloud solutions). We can use PolyBase to virtualise the data.

We could also use Azure Synapse, Cosmos DB or maybe Databricks.
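Taking Synapse as an example, its serverless SQL pools can query files sitting in the lake in place – no load step required. A minimal sketch (the storage account and path are hypothetical):

```sql
-- Query Parquet files in the lake directly from a Synapse serverless SQL pool.
SELECT TOP 100 *
FROM OPENROWSET(
         BULK 'https://mydatalake.dfs.core.windows.net/curated/sales/*.parquet',
         FORMAT = 'PARQUET'
     ) AS sales;
```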

We can build Machine Learning Models and enable Artificial Intelligence services.

Basically, I think the ‘modern’ data warehouse is about cloud based data services and simply adds to the traditional models of data warehousing.

Whereas the traditional data warehouse is about historical reporting – descriptive and diagnostic analytics – the new tools enable predictive and prescriptive analytics, along with cognitive capabilities.

So, basically, the answer to ‘what is the modern data warehouse?’ could be a lot of things, pieced together in many different ways – and we’re only going to get more of them as time marches on.

So maybe we should start talking about the changing data warehouse, or the evolving data warehouse or the cloud data warehouse – but not the modern data warehouse, as what’s modern today is sure to change tomorrow.

Anyway, these are just my thoughts. Some will agree, some won’t, and there’ll be plenty of diverse ideas – that’s great for innovation. Um, maybe the innovative data warehouse is a better name 🙂

Anyway, have a great day

Cheers

Martin.

96 GB lost (or so I thought)

The other day I was looking to store some files, so I went and found myself a 128 GB USB 3.1 flash drive.

Upon inserting this into my computer (initially the wrong way up, and then correcting myself, as is the accepted method), I was amazed to see it registered as only 32 GB.


Whatever, I thought. I’ll just reformat this to the correct size.

Only 32 GB of space, yet the drive had 128 GB firmly stamped on the back.

But no matter what file system I chose to format the drive with, I could not get it to recognise more than 32 GB of space.

I soon realised that perhaps the FAT32 file system was the reason. But how could I reclaim the apparently lost space?

Over to Disk Management, where I found the ‘lost’ space – the rest of the drive was sitting there as unallocated space.

I quickly deleted the 32 GB volume to reveal the full quota of unallocated space, then allocated that as a simple volume and re-formatted the drive as NTFS.

I found a few things out on that journey.

I could create multiple volumes on the same flash drive, and each volume registers as a separate drive in Windows Explorer.

But if you want to use the flash drive for something like re-imaging your Surface, then it’ll have to be formatted as FAT32 – and you’ll be back to square one again, wondering where your space disappeared to.

I hope this saves somebody some time at some point.

Have a great day.

Cheers

Marty