JSON, SQL and NoSQL

Over the last year, I’ve been presenting on CosmosDB at SQL Saturdays and various user groups.

This presentation has morphed somewhat over time and is now more of a comparison between functionality in CosmosDB and functionality in SQL Server, along with a discussion of where one technology may be more relevant than the other.

In case you’re not aware of CosmosDB, it’s Microsoft’s cloud-based, planet-scale, fully managed NoSQL (that’s Not only SQL) database, and it’s heavily JSON based.

While a lot of the knowledge gained as a SQL Server professional will be relevant to the world of CosmosDB, there are elements that can lead the SQL Server professional down the wrong path and cause a substantial degree of pain at a later date.

However, lately, I’ve taken to opening the comparison with what I think is a good starting point: JavaScript Object Notation, or JSON.

It has somewhat amazed me how many SQL professionals turn up to my presentation having not explored JSON at all. Sure, they are aware of what it is, but the mechanics of it are mostly left as an exercise ‘to be explored’ at a later date.

That ‘later date’ may be when some sort of data transfer has to be done involving SSIS, ADF (v2), CosmosDB or some other technology or situation that lends itself well to the JSON format.

One of those uses, I’ve found, is setting up tables with configuration information. Sure, there are lots of ways to achieve this, but the JSON format is one of the easier, and best, that I’ve found to date.

SQL Server 2016 introduced JSON support within the database engine, and that support is available in every edition from SQL Server 2016 onwards.

One myth that I see regularly is that the JSON support in SQL Server is analogous to the XML support offered by the same product. While there are some similarities, I’ve personally found dealing with JSON in SQL Server much more straightforward than dealing with XML, and if you have never dealt with JSON in SQL Server before then you might be surprised by how flexible it is.
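
As a quick taste of that flexibility, here’s a minimal sketch using a made-up configuration document (the document, and the property names in it, are purely for illustration):

DECLARE @config NVARCHAR(MAX) = N'
{
	"appName": "MyApp",
	"settings": { "retryCount": 3, "timeoutSeconds": 30 },
	"features": [ "export", "audit" ]
}';

-- Confirm the string is valid JSON
SELECT ISJSON(@config) AS [IsValidJson];

-- Pull individual values out of the document
SELECT
	JSON_VALUE(@config, '$.appName')                 AS [AppName],
	JSON_VALUE(@config, '$.settings.retryCount')     AS [RetryCount],
	JSON_VALUE(@config, '$.settings.timeoutSeconds') AS [TimeoutSeconds];

-- Shred the array into rows
SELECT [value] AS [FeatureName]
FROM OPENJSON(@config, '$.features');

ISJSON, JSON_VALUE and OPENJSON are all available from SQL Server 2016 onwards (OPENJSON does need the database compatibility level to be 130 or higher), and it’s this sort of document that makes JSON such a handy format for the configuration tables mentioned earlier.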

For that reason, I’ve decided to invest some time in producing some posts that are specifically dedicated to the JSON functionality in SQL Server. There’s a lot of good material already out there, and I’ll reference it when it’s relevant, but one of my goals will be to build on each post and eventually explore the many uses of JSON and the opportunities that it offers for SQL Server professionals.

I hope you enjoy it and I’d love to get feedback from you of any variety.

Have a great day

Cheers

Marty

A trusty old friend – The BCP utility

One of the requirements that springs up around data stores, regardless of whether they are relational, non-relational, cloud based or some other variety, is (unsurprisingly) the need to access the data for purposes such as reporting or analysis. Often, this means moving (off-loading) the data to another environment or system where that analysis or reporting can take place without impacting the main system.

Very often we use tools such as SQL Server Integration Services or Azure Data Factory for this sort of thing. But occasionally we might wish to fall back to a utility that’s been in SQL Server (as well as Sybase) for a long time – and that is the Bulk Copy Program, or BCP for short.

The BCP utility (bcp.exe) can be found in the SQL Server installation directory structure. On my machine it can be found at

C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn\bcp.exe

Your system may be slightly different. Regardless, the directory should be accessible via the PATH variable and so simply typing bcp or bcp -? should display a list of arguments that the bcp utility accepts.

As an aside, did you also know that you can look at the same display using the PowerShell ISE, which in some cases may be a little easier to read?
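
If you want to try that yourself, it’s as simple as:

# Display the list of arguments that the bcp utility accepts
bcp -?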

Let’s see how easy it is. We’ll use the sys.configurations catalog view, as everybody has that.

Clear-Host;
# -S server, -d database, -T trusted (Windows) authentication, -c character data, -t field terminator
bcp.exe "SELECT * FROM sys.configurations" queryout "C:\temp\configurations.txt" -Slocalhost -dtempdb -T -c -t',';

If you do run the above code, you will notice that the text names and descriptions contain spaces, which might be a problem. We could handle this in the bcp utility itself, but I’d be more inclined to do it in the T-SQL query; that’s just a personal choice and others may prefer another method. Here’s the query I’d use – notice the use of QUOTENAME to enclose the text in double quotation marks.

USE tempdb;
GO

SELECT
	[configuration_id], 
	[name] = QUOTENAME([name], '"'),
	[value], 
	[minimum], 
	[maximum], 
	[value_in_use], 
	[description] = QUOTENAME([description] , '"'), 
	[is_dynamic], 
	[is_advanced]
FROM 
	sys.configurations AS C
ORDER BY 
	C.configuration_id ASC;
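
Putting the query and the bcp call together gives something like the line below. It’s only a sketch – as before, the server name and output path are placeholders for your own environment – and I’ve swapped the literal double quote passed to QUOTENAME for CHAR(34) so that it doesn’t clash with the quoting on the command line:

Clear-Host;
bcp.exe "SELECT [configuration_id], [name] = QUOTENAME([name], CHAR(34)), [value], [minimum], [maximum], [value_in_use], [description] = QUOTENAME([description], CHAR(34)), [is_dynamic], [is_advanced] FROM sys.configurations AS C ORDER BY C.configuration_id ASC" queryout "C:\temp\configurations.txt" -Slocalhost -dtempdb -T -c -t',';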

Although this is a relatively old utility, I still find it remarkably useful and use it reasonably often – hence this post.

There are other ways to accomplish the same requirement (maybe even without moving the data – but we’ll leave that for another day).

And, by the way, the data that has been extracted is now on disk – so be sure to have a plan that deals with security.

BCP can also be used to load data into a table. I’ll leave that example for the moment but will return to it in a later post.

Have a great day

Cheers

Marty.

Data Events in Australia and New Zealand in early 2020

Over the course of the next few weeks there are some significant events taking place that reflect technologies on the Microsoft data platform and beyond.

Thursday 13th and Friday 14th February will see Microsoft Ignite – The Tour in Sydney.

Saturday 15th February 2020 will see two events focusing on Microsoft Business Applications:

Power Platform Saturday Sydney

Power Platform Saturday Melbourne.

Both Power Platform events are entirely free.

Monday 17th February 2020 sees the start of the Difinity conference in Auckland, New Zealand.

There are quite a few high-profile speakers who have traveled quite some distance to speak at this event, including members of the Microsoft product teams and MVPs.

The Monday (17th February 2020) and Thursday (20th February 2020) are both dedicated to several workshops, with the Tuesday and Wednesday being the main conference sessions.

The conference is hosted by Reza Rad and Leila Etaati of Radacad – this is the fourth time they have run it in #Auckland.

Saturday 22nd February 2020 will see SQL Saturday Wellington taking place. This is a 100% free event and will see a lot of the speakers from the Difinity Conference traveling to give their time to benefit the community.

The day before (Friday 21st February 2020) will see SQL Saturday Wellington pre-conference events – and while these are paid training days they are substantially discounted and offer amazing value.

Later in the year will see SQL Saturday Brisbane taking place on Saturday 30th May 2020. This year they have an amazing venue that is right in the middle of the city and really convenient for public transport.

There’s a call for speakers currently open for the Brisbane event as well, and some great sponsors lined up. You’ll need a PASS membership in order to submit, but that’s free and, as an added bonus, will give you access to some great educational content.

Later in the year we should see additional SQL Saturdays at

  • Auckland
  • Christchurch
  • Melbourne
  • Sydney
  • Perth

Keep your eye on the SQL Saturday website for these dates and if you are so inclined then submit a session in the call for speakers.

I’m looking forward to meeting many old friends at these events and hopefully making a few new ones too.

Have a great day.

Cheers

Martin.

Data Privacy – Playing your part

Tuesday 28th January 2020 was “Data Privacy Day 2020”

Some may refer to this as Data Protection Day, but it is really just a day to draw attention to privacy issues that exist around the use of digital data.

You can read more about this day here and here.

Ironically, on that day, I just happened to be investigating some privacy and security issues, and I was reminded that some companies, individuals and even industry sectors that should be acutely aware of the issues surrounding data privacy are:

  • Ignorant of these issues
  • Playing ignorant
  • Thinking ‘it’ll never happen to me’
  • Not caring
  • Putting profits above the protection of their clients
  • All (or some) of the above

And there are probably many more reasons why people employ less than desirable practices when looking after their data.

If you need more evidence you only have to look at the list of big global companies that have suffered data breaches of some kind in the last few years.

The damage caused by a data breach can be irreparable. Lawsuits may follow, reputations may be damaged, goodwill may be lost, and that will soon be reflected in the financial figures.

There are plenty of simple things that people can do to help keep their company’s (and their own) data safe. I’ll go into more details on these in later posts but initially I just wanted to start by saying (or rather reiterating) that security is everybody’s job.

Just because the word security is not in your job title does not mean you can get away with complacency.

If you see an issue, speak up. If you can suggest an improvement, speak up too. I get that there may be cultural barriers in some workplaces around these sorts of things, but good employers will remove those barriers and provide a psychologically safe workplace.

This is an amazingly cost effective way to help secure your environment.

A lack of data protection is not only causing embarrassment to the IT industry as a whole but is costing individuals – who have trusted banks and other large institutions with their data – both time and money.

Nowadays, a person can switch banks, insurance companies, airlines, phone / internet / TV providers, or almost any other service provider really easily, and that may have consequences for your employment.

So, whatever industry you are in I’d encourage you to play your own part – however small – in helping your company keep the data of their customers safe and secure.

Have a great day.

Cheers

Marty

Simply forgot – What’s in my path?

Simple things we often forget

I was recently doing some work that required me to look at the PATH system variable in Windows.

Nothing too hard about that, just open a command prompt and type path

and we get all the directories available in the PATH listed.

That’s sometimes a little hard to read, so say we wanted each directory on a separate line – simple, PowerShell can do that for us.

In fact, I prefer using the PowerShell ISE to the plain command-line window.

If you’re looking for the PowerShell ISE, just search for it directly from the Start menu in Windows.

Then it’s just a simple matter of adding a few lines of code
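
A couple of lines like these will do the trick (one approach of many, of course):

# Split the PATH environment variable on the semicolon separator,
# giving one directory per line
$env:Path -split ';'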

and we’ll see the output listed (truncated here)

Hope this helps somebody out.

Cheers

Marty

Is NoSQL still a thing?

G’day,

Looking back, perhaps one of the (many) mistakes that I’ve made in my career was to initially ignore the NoSQL movement.

I’ve focused my career on Relational Database systems – predominantly Microsoft SQL Server – and so using a NoSQL database felt ‘wrong’, for a number of reasons.

Firstly, the name itself – NoSQL – felt like the message was that the SQL language was in some way being ‘devalued’ and ‘kicked to the kerb’.

Secondly, the lack of transaction support in NoSQL databases also felt wrong. Back then, I assumed that SQL implied relational.

I think nowadays it’s probably better to think of SQL as simply a language for manipulating data (SELECT, INSERT, UPDATE, DELETE and so on), even if that is a deviation from the technical ANSI definition. Nearly all vendors have their own proprietary extensions (Microsoft has T-SQL, Oracle has PL/SQL, and Sybase also has T-SQL, though it is diverging from the Microsoft language of the same name where it originally has its roots), and pure ANSI-standard SQL is hard to find.

We see lots of databases (relational or not) using derivatives of SQL.

We also see relational databases using non-relational functionality and non-relational databases starting to use relational functionality.

Both relational and non-relational databases have their place in today’s world, and so I wonder how long the term NoSQL will continue to be used.

Quite some time ago I started to look at CosmosDB – which a lot of people still talk about in 2020 as the ‘new kid on the block’ – but it has been evolving for 10 years now and is gaining a good foothold in the market.

If you’ve not seen it before, it is worth a look. Although it shares some familiar terms with relational databases, it is not one, and approaching it with the same mindset may land you in a mess. But once you start to get to grips with how it operates and the different models it employs, I think it’ll start to grow on you.

Have a great day.

Cheers

Martin.

Moving to a Modern Microsoft Platform

Melbourne Event – Monday 17th June 2019 (Free – including lunch and prizes)

PASS, in conjunction with Microsoft, Intel and community leaders, is offering free training on Moving to a Modern Microsoft Platform.

In case you’ve never heard of PASS, they are a not-for-profit organisation run by, and for, a diverse data community.

PASS (formerly known as The Professional Association of SQL Server) focuses on supporting data professionals throughout the world who use the Microsoft Data Platform.

You can read more about PASS on their site.

Microsoft SQL Server has been around for some 25+ years and there is a major release coming up later this year – SQL Server 2019.

Another thing that is happening this year – in fact very soon – is that Microsoft SQL Server 2008 (which includes SQL Server 2008 R2) is moving out of extended support.

So, if you are using Microsoft SQL Server 2008 / 2008 R2, you’ll need to prepare for SQL Server 2008 end of support.

It’s 10 years since that version was released, and it was one of the most popular, with a lot of organisations still keeping one of their prized business assets on the 2008 platform.

To help organisations stay current and understand the options available to them, PASS, Microsoft and Intel have teamed up to bring users around the globe a free in-person event in many cities, providing advice and support to ensure that they are fully aware of all the options for keeping their users’ data safe, secure and highly available in an ever-changing, data-driven world.

The events throughout the world are listed here.

And if you would like to register for the Melbourne event – that will run in the CBD on Monday 17th June 2019 at

Telstra Customer Insights Centre
Level 1, 242 Exhibition Street
Melbourne, VIC 3000
Australia

then please head to the EventBrite page and register for FREE – we’d love to see you there, learn about your pain points and priorities, and talk through the next steps on your data-driven journey to keep your clients’ data safe, secure and highly available, providing the insight that a modern business needs.

There’ll be information sessions, hands-on labs (if that’s your thing), expert instruction and of course prizes.

There’s no charge and lunch will be provided.

We look forward to meeting you!

Speaking at Events

G’day,

I recently got the opportunity to speak at the “PASS Marathon, The New World of Data Privacy”.

I had not spoken at a virtual event for quite some time and I’d forgotten how different it is to a live presentation.

There are pros and cons to both live and virtual presentations, but for me, the main difference is audience interaction.

At a live event you are able to see the faces in the audience and constantly gauge how you feel things are going – and, if appropriate, change on the fly. You can also take questions ‘in real time’.

With a virtual event, however, you tend to ‘stay on script’ the entire time and just carry on, without really getting a gauge of how your presentation is ‘going down’.

I’d like to thank everybody who gave me feedback, and to take the opportunity to encourage viewers of such events to provide feedback – in as much detail as you feel comfortable with. For example, if you didn’t like an element of the presentation then please say why (not just that you didn’t like it) – this helps people improve.

I’d additionally like to encourage audience members who have not spoken previously to consider doing so. If a virtual presentation is not the place for you to start, then consider heading along to a local SQL Saturday event – there’s always a call for speakers beforehand.

There’s plenty of people to help you along on your presentation journey at these events and also in the wider SQL community.

There are also a lot of online courses aimed at presenting, and finding one of those might be a good starting point for you.

I hope to be viewing your presentation soon.

Have a great day.

Cheers

Martin

Generating Insert Statements

G’day,

I’ve been asked a few times lately about the best way to generate INSERT statements.

I’d always replied to use SSMS.

I know SSMS has this feature; it’s just not something that I need to use on a regular basis. And – as we all know – when you don’t use something regularly you tend to forget where to find it, or – as happens reasonably regularly nowadays – the feature, or tool, gets an update and you find yourself re-learning something you thought you were already familiar with.

Fortunately, the feature to generate INSERT statements was pretty much where I expected – but it did take a few minutes to find the exact spot, so I thought that I’d mention it here – just in case anybody else is looking too, or you just need a reminder that SSMS is very capable of performing this task.

Simply right-click on the database that contains the table – or tables – that you need to generate INSERT statements for and select TASKS.

Select TASKS

From the resulting menu choose GENERATE SCRIPTS

Select Generate Scripts

Then it’s just a simple case of following the Wizard through.

If you’ve not started this Wizard before then you’ll see the Introduction page – you can safely tick “Don’t show this page again” so that it is skipped the next time you start the same Wizard.

You probably don’t want to see this page every time you start the Wizard.

Then, on the next page, select either all objects to script, or just the specific one(s) that you want. In my case I’m interested in one particular table.

Select the object(s) that you want to script.

On the next page select the location where you want the file to go. I’m selecting a new query editor window, but you could select a file location or the clipboard.

Then hit the “Advanced” button

Selecting the Advanced button is important

Clicking the Advanced button will show the “Advanced Scripting Options” – scroll down until you find “Types of data to script” and select your choice from the drop-down. I’ve chosen to script “Schema and data”, but you can also select data only or schema only.

Select the option that best suits your need.

The next page will give you a summary

Hit Next on the summary page

Then it’s just a case of hitting Next, and your script is generated. Simply hit Finish on the next screen.

You’ll see a summary and hit Finish

And you should see your script. Save it to a location of your choice.

Finally – you see your nice INSERT script
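
Just to give a feel for the shape of the output, the generated script will look something like this – the table and the values here are made up purely for illustration, and because I chose “Schema and data” both the CREATE TABLE and the INSERT statements are included:

CREATE TABLE [dbo].[Colours](
	[ColourId] [int] NOT NULL,
	[ColourName] [nvarchar](50) NOT NULL
);
GO
INSERT [dbo].[Colours] ([ColourId], [ColourName]) VALUES (1, N'Red');
INSERT [dbo].[Colours] ([ColourId], [ColourName]) VALUES (2, N'Green');
INSERT [dbo].[Colours] ([ColourId], [ColourName]) VALUES (3, N'Blue');
GO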

I hope this was useful.

Have a great day

Cheers

Marty.

Azure Notebooks – a nice little tool.

G’day,

I’ve been playing with Azure Notebooks lately and have found them invaluable as:

  • A teaching aid
  • A test sandbox

I’ve been using Python code, and this means that I don’t have to install Python on the machine that I am working at.

It also means that anybody I am demonstrating my code to has only to open a web page that shows my Azure Notebook.

So, how do you get to Azure Notebooks, I hear you ask?

Head along to https://notebooks.azure.com and log in with your Microsoft account.

Once there you’ll be able to start a new project

Give it a name and a URL, and decide whether or not you want to make it public.

Create a Project and Share (if you want)

Start the Azure Project up

Start the Project up

Add a Notebook to the project

Add a Notebook

Choose the language – R and Python are available, but also F#.

Choose the language for the Azure Notebook.

And then have fun!

I’d really love to see more languages added in the future. These notebooks have potential – on many levels – and they are also FREE!

Have a great day!

Cheers

Marty.