Press "Enter" to skip to content

Category: JSON

Managing SQL Server Documentation with JSON

Phil Factor gives us the gloop:

Metadata extract files are handy for documentation, study, cataloguing and change-tracking. This type of file supplements source code because it can record configuration, permissions, dependencies and documentation much more clearly. It is a good way of making a start with documenting your database.

Here is a sample of a JSON metadata file (from AdventureWorks 2016). It was generated using GloopCollectionOfObjects.sql, which is here on GitHub, and is being viewed in JSONBuddy. I use this format of JSON, a collection of documents representing SQL Server base objects (no parent objects), when I need to read the contents into MongoDB. The term ‘Gloop’ refers to a large query that, you’d have thought, would be better off as a procedure. Here is a typical sample of the output.
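
For a flavor of what such a query produces, here is a much-simplified sketch along the same lines (not Phil's actual GloopCollectionOfObjects.sql): one JSON document per base table, with its columns nested as an array.

-- A simplified sketch, not the real Gloop: one JSON document per table.
SELECT s.name AS [schema], t.name AS [table],
       (SELECT c.name AS [column], ty.name AS [type]
        FROM sys.columns AS c
        JOIN sys.types AS ty ON ty.user_type_id = c.user_type_id
        WHERE c.object_id = t.object_id
        FOR JSON PATH) AS [columns]
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
FOR JSON PATH;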

This is an interesting approach to documentation. I’m not totally buying into it, but that might just be due to my not having tried it.

Use SQL for XML and JSON Creation

Lukas Eder argues that if you’re storing the data in SQL and you need to get data from a database into JSON or XML format, just use SQL for that:

In English: We need a list of actors, and the film categories they played in, and grouped in each category, the individual films they played in.

Let me show you how easy this is with SQL Server SQL (all other database dialects can do it these days; I just happen to have a SQL Server example ready):
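
The full query is in the linked post; as a rough sketch of the nesting technique, assuming Sakila-style tables (actor, film_actor, film, film_category, category), correlated FOR JSON PATH subqueries build the per-category film lists:

-- A sketch of nested FOR JSON PATH, assuming Sakila-style tables.
SELECT a.first_name, a.last_name,
       (SELECT c.name AS category,
               (SELECT f.title
                FROM film AS f
                JOIN film_actor AS fa2 ON fa2.film_id = f.film_id
                JOIN film_category AS fc2 ON fc2.film_id = f.film_id
                WHERE fa2.actor_id = a.actor_id
                  AND fc2.category_id = c.category_id
                FOR JSON PATH) AS films
        FROM category AS c
        WHERE EXISTS (SELECT 1
                      FROM film_actor AS fa
                      JOIN film_category AS fc ON fc.film_id = fa.film_id
                      WHERE fa.actor_id = a.actor_id
                        AND fc.category_id = c.category_id)
        FOR JSON PATH) AS categories
FROM actor AS a
FOR JSON PATH;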

Lukas makes a great point and has a FAQ to follow up on it. If there’s a reason for mapping at a higher layer—if you’re actually adding value rather than building out a set of converters—that’s one thing, but if you’re just accepting a data set and returning a JSON blob…well, your database product can do that too.

Storing Database Deployment Metadata with JSON

Phil Factor combines a couple of SQL Server features to track database deployment history:

We maintain the current record where it is easy to get to and simply add an array to hold the history information. Our only headache is that we can only hold an NVARCHAR of 3750 characters (7500 VARCHAR characters) because extended properties are held as SQL_Variants. They need careful handling! This means that if our JSON data is larger, we have to trim off array elements that would make the JSON exceed that number.
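
As a hedged sketch of the pattern (the property name and record shape here are hypothetical, not Phil's exact code): JSON_MODIFY appends the newest record, and OPENJSON rebuilds the array without its oldest element until the JSON fits.

-- A sketch, not production code: JSON history in an extended property.
DECLARE @history nvarchar(max) =
  (SELECT CONVERT(nvarchar(4000), value)
   FROM sys.extended_properties
   WHERE class = 0 AND name = N'DeploymentHistory');  -- hypothetical name

IF @history IS NULL SET @history = N'[]';

-- Append the newest deployment record (shape is hypothetical).
SET @history = JSON_MODIFY(@history, 'append $',
  JSON_QUERY(N'{"version":"1.0.42","deployed":"'
             + CONVERT(nvarchar(30), SYSUTCDATETIME(), 126) + N'"}'));

-- Drop the oldest element until the JSON fits in a SQL_Variant-sized NVARCHAR
-- (assumes more than one element remains).
WHILE LEN(@history) > 3750
  SELECT @history = N'[' + STRING_AGG(j.value, N',')
                      WITHIN GROUP (ORDER BY CAST(j.[key] AS int)) + N']'
  FROM OPENJSON(@history) AS j
  WHERE j.[key] <> N'0';

-- Requires sp_addextendedproperty the first time the property is created.
EXEC sys.sp_updateextendedproperty @name = N'DeploymentHistory', @value = @history;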

The combination of JSON and extended properties is not one that I’ve seen before—typically, there’s a deployment log table.

Benchmarking JSON Query Times

Silvano Coriani compares different options for loading and querying JSON data in Azure SQL Database:

Storing and retrieving data from JSON fragments is a common need in many application scenarios, like IoT solutions or microservice-based architectures. These fragments can be persisted in a variety of data stores, from blob or file shares to relational and non-relational databases, and there’s a long-standing debate in the industry about which database technology fits “better” for this task.

Azure SQL Database offers several options for parsing, transforming and querying JSON data, and this article doesn’t pretend to provide a definitive answer to that debate, but rather explores these options for common scenarios like data loading and retrieval, benchmarking the results to give a clear indication of how Azure SQL Database performs when manipulating JSON data.
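
To give a sense of the patterns such comparisons cover, here is a hypothetical sketch (table name and JSON paths are invented) contrasting ad hoc JSON_VALUE extraction against a computed column that an index can support:

-- Hypothetical table; the paths and names are illustrative only.
CREATE TABLE dbo.DeviceTelemetry (
  id        int IDENTITY PRIMARY KEY,
  payload   nvarchar(max) CHECK (ISJSON(payload) = 1),
  -- Promote a frequently-filtered attribute to a computed column.
  device_id AS CAST(JSON_VALUE(payload, '$.deviceId') AS int)
);
CREATE INDEX IX_DeviceTelemetry_device ON dbo.DeviceTelemetry (device_id);

-- Ad hoc: parses the JSON for every row it examines.
SELECT JSON_VALUE(payload, '$.temperature') AS temperature
FROM dbo.DeviceTelemetry
WHERE JSON_VALUE(payload, '$.deviceId') = N'42';

-- Via the computed column: the optimizer can seek on the index instead.
SELECT JSON_VALUE(payload, '$.temperature') AS temperature
FROM dbo.DeviceTelemetry
WHERE device_id = 42;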

Read on for the results.

Converting JSON to Result Sets

Jack Vamvas shows how you can import data in JSON format and get tabular data in SQL Server:

It is possible to read a JSON file using T-SQL, and there are a number of different methods. By using OPENROWSET along with the ISJSON and OPENJSON functions, you can quickly read the file, check that the JSON is valid, and then unpack the JSON into a SQL table.
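
A minimal sketch of that pattern, assuming a file at C:\data\sample.json with hypothetical name/value attributes, looks like this:

-- Load the file into a variable, validate it, then shred it into rows.
DECLARE @json nvarchar(max);

SELECT @json = BulkColumn
FROM OPENROWSET(BULK 'C:\data\sample.json', SINGLE_CLOB) AS j;

IF ISJSON(@json) = 1
  SELECT *
  FROM OPENJSON(@json)
  WITH ([name]  nvarchar(100) '$.name',   -- hypothetical attributes
        [value] int           '$.value');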

Read on for an example. This also performs reasonably well in practice, at least in my experience.

Processing JSON in Biml

Bill Fellows takes us through a library which (seemingly by law) must be in every .NET project:

#sqlhelp #biml I would have the metadata in a Json structure. How would you parse the json in the C# BIML Script? I was thinking use Newtonsoft.Json but I don’t know how to add the reference to it

Adding external assemblies is a snap, but here I’ll show how to use the Newtonsoft.Json library to parse a JSON-based metadata structure and then use that in our Biml.

Click through to learn how.

PowerShell and Windows Terminal Profiles

Jeffery Hicks shows how you can modify your Windows Terminal profile using PowerShell:

I recently updated my Windows 10 systems to the 1903 release. One of the reasons is that I wanted to try out the new Windows Terminal preview. You can find it in the Windows Store. This is bleeding edge stuff and far from complete, but it promises to be a great addition. Now you will be able to have all of your command terminals in one tabbed application and easily switch between them. As I said, this is far from being a finished and polished product. Right now, if you want to add a new profile, that is, another terminal, you have to manually edit a JSON file. If you have VS Code installed, the file will open in that. Otherwise, I’m assuming you’ll get whatever application is associated with the .json extension.

Read on for a PowerShell one-liner which lets you create a terminal profile.

Splitting Arrays with OPENJSON

Dave Mason continues a journey into parsing JSON with T-SQL:

Starting with SQL Server 2016, Microsoft provided a STRING_SPLIT function. It is a table-valued function that splits a string into rows of substrings, based on a specified separator character. It’s been a welcome addition that we waited a long time for. It has one shortcoming, though: the order of the output rows is not guaranteed to match the order of the substrings in the input string.

Microsoft also provided support for parsing JSON data starting with SQL Server 2016. I discovered the OPENJSON function can be used to split strings, and it can also return the ordinal position of each substring from the original input string.
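
The trick, in miniature: wrap the delimited string as a JSON array and let OPENJSON’s [key] column supply the ordinal. This sketch assumes the input contains no characters needing JSON escaping (STRING_ESCAPE can cover that case):

-- Split a delimited list and keep the original ordinal position.
DECLARE @list nvarchar(max) = N'alpha,bravo,charlie';

SELECT CAST([key] AS int) AS ordinal, value AS item
FROM OPENJSON(N'["' + REPLACE(@list, N',', N'","') + N'"]')
ORDER BY CAST([key] AS int);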

There are some limitations which you’d expect, namely around requirements for valid JSON.

Parsing JSON with T-SQL

Dave Mason has a primer on JSON parsing using T-SQL:

Microsoft added support for JSON data beginning with SQL Server 2016. JSON is an open-standard file format consisting of attribute–value pairs and array data types. It is commonly used to transmit data objects for asynchronous browser–server communication. But it is also used for storing unstructured data in files or NoSQL databases such as Microsoft Azure Cosmos DB. For most of us, SQL Server’s support for JSON probably means two things: we can convert relational data to JSON and vice versa. In this post, I’ll focus on converting JSON to relational data and share what I’ve learned from a recent experience.
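
As a small taste of what that conversion looks like, here is a minimal sketch (with made-up data) using OPENJSON ... WITH, including a nested array handled via AS JSON and CROSS APPLY:

-- Shred JSON into rows and columns, including a nested array.
DECLARE @json nvarchar(max) =
  N'[{"name":"Anna","age":34,"pets":[{"kind":"cat"},{"kind":"dog"}]}]';

SELECT p.name, p.age, pet.kind
FROM OPENJSON(@json)
     WITH (name nvarchar(50)  '$.name',
           age  int           '$.age',
           pets nvarchar(max) '$.pets' AS JSON) AS p
CROSS APPLY OPENJSON(p.pets)
     WITH (kind nvarchar(20) '$.kind') AS pet;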

I’ve been pleasantly surprised with the way JSON support works in SQL Server. It’s supported every complicated scenario I’ve had to deal with so far, including nesting, deciding with or without arrays for the outer element, quotes or no quotes around numbers, etc.

Matrix Operations with JSON

Phil Factor takes a look at using JSON to perform memoization:

For the SQL Server developer, matrices are probably most valuable for solving more complex string-searching problems, using dynamic programming. Once you get into the mindset of this sort of technique, a number of seemingly-intractable problems become easier. Here are fifty common data structure problems that can be solved using dynamic programming. Until SQL Server 2017, these were hard to do in SQL because of the lack of support for this style of programming. Memoization, one of the principles behind the technique, is easy to do in SQL, but it is very tricky to convert existing procedural algorithms to use table variables. It is usually easier and quicker to use strings as pseudo-variables, as I did with edit distance and the Levenshtein algorithm, the longest common subsequence, and the longest common substring. The problem with doing this is that the code to fetch the array values can be very difficult to decipher or debug. JSON can do it very easily with path array references.
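
A toy illustration of those path array references: store the matrix as nested JSON arrays, read cells with JSON_VALUE, and write them with JSON_MODIFY.

-- A JSON string standing in for a 3x3 matrix.
DECLARE @matrix nvarchar(max) = N'[[1,2,3],[4,5,6],[7,8,9]]';

-- Read the cell at row 1, column 2 (zero-based): returns 6.
SELECT JSON_VALUE(@matrix, '$[1][2]') AS cell;

-- Write a cell in place.
SET @matrix = JSON_MODIFY(@matrix, '$[0][0]', 42);
SELECT @matrix AS updated_matrix;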

The results aren’t fantastic, but the code is at least easier.
