Using JSON_MODIFY To Modify Existing JSON

Jovan Popovic shows off the JSON_MODIFY function in SQL Server:

Recently I found this question on Stack Overflow. The problem was appending a new JSON object to an existing JSON array:

UPDATE TheTable
SET TheJSON = JSON_MODIFY(TheJSON, 'append $', N'{"id": 3, "name": "Three"}')
WHERE Condition = 1;

The JSON_MODIFY function should take the array value from the TheJSON column (the first argument), append the third argument to that array, and write the updated array back to the TheJSON column.

However, the unexpected result in this case is that JSON_MODIFY didn’t append the JSON object {"id": 3, "name": "Three"} to the array. Instead, JSON_MODIFY appended a new JSON string literal "{\"id\": 3, \"name\": \"Three\"}" to the end of the array.

This might be a surprising result if you don’t know how the JSON_MODIFY function works.

Read on to see how JSON_MODIFY works and why this doesn’t quite do what the poster thought.
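
For reference, one standard fix is to wrap the third argument in JSON_QUERY(), which returns its input typed as a JSON fragment so that JSON_MODIFY appends an object instead of an escaped string:

-- JSON_QUERY with no path argument returns its input as a JSON fragment,
-- so JSON_MODIFY appends an object rather than a string literal
UPDATE TheTable
SET TheJSON = JSON_MODIFY(TheJSON, 'append $', JSON_QUERY(N'{"id": 3, "name": "Three"}'))
WHERE Condition = 1;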

Multi-Object JSON Arrays In SQL Server

Bert Wagner shows how to build JSON arrays in SQL Server:

When using FOR JSON PATH, ALL rows and columns from that result set will get converted to a single JSON string.

This creates a problem if, for example, you want to have a column for your JSON string and a separate column for something like a foreign key (in our case, HomeId). Or if you want to generate multiple JSON strings filtered on a foreign key.

The way I chose to get around this is to use CROSS APPLY with a join back to our Home table — this way we get our JSON string for either Cars or Toys created but then output it along with some additional columns.

Impedance mismatch? What impedance mismatch?
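
A minimal sketch of the pattern, using the Home and Cars tables from Bert’s example (the column names here are my guesses):

-- one JSON string per home, plus the foreign key as a regular column
SELECT h.HomeId, ca.CarsJson
FROM dbo.Home AS h
CROSS APPLY (
    SELECT (SELECT c.CarId, c.Model
            FROM dbo.Cars AS c
            WHERE c.HomeId = h.HomeId
            FOR JSON PATH) AS CarsJson
) AS ca;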

DataRow To JSON With PowerShell

Rob Sewell shows how to convert a .NET DataRow into its JSON form using PowerShell:

I wanted to be able to Mock $variable. I wrapped the code above in a function; let’s call it Run-Query.

This meant that I could easily separate it for mocking in my test. I ran the code, investigated the $variable variable to ensure it had what I wanted for my test, and then decided to convert it into JSON using ConvertTo-Json.

Read on to see the fun mess that ConvertTo-Json made and then Rob’s simplification.

Matrix Transposition In T-SQL

Phil Factor has some fun transposing a matrix using T-SQL:

What I’m doing is simply converting the table into its JSON form, and then using this to create a table using the multi-row VALUES syntax, which paradoxically allows expressions. The expression I’m using is JSON_Value, which allows me to effectively dictate the source within the table, via that JSON path expression, and the destination. As it is an expression, I can do all sorts of manipulation as well as a transpose. I could, if I wanted, (in SQL Server 2017) provide that path parameter as a variable. This sort of technique can be used for several other reporting purposes, and it is well worth experimenting with because it is so versatile.

That is not at all what I would have thought up; very interesting approach.  I’d probably just be lazy and shell out to R Services.
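
Here’s a minimal sketch of the idea, with the VALUES list written out by hand for a hypothetical 2x3 matrix:

-- serialize the matrix to JSON: [{"c1":1,"c2":2,"c3":3},{"c1":4,"c2":5,"c3":6}]
DECLARE @json nvarchar(max) = (
    SELECT * FROM (VALUES (1, 2, 3), (4, 5, 6)) AS M(c1, c2, c3)
    FOR JSON PATH
);

-- each VALUES row reads one source column across both source rows
SELECT *
FROM (VALUES
    (JSON_VALUE(@json, '$[0].c1'), JSON_VALUE(@json, '$[1].c1')),
    (JSON_VALUE(@json, '$[0].c2'), JSON_VALUE(@json, '$[1].c2')),
    (JSON_VALUE(@json, '$[0].c3'), JSON_VALUE(@json, '$[1].c3'))
) AS T(r1, r2);  -- a 3x2 result: the transpose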

JSON Parsing Performance

Jovan Popovic has a couple of posts showing JSON parsing performance in SQL Server 2017. First up, he makes the claim that JSON parsing can be 10x faster than XML:

I’m running 10000 iterations and getting the average time spent.

The average time to parse JSON and get the value using the JSON_VALUE() function is around 3 microseconds, while the equivalent action with an XML typed variable and the value() method takes between 30 and 40 microseconds.

This shows that parsing JSON “plain text” is 10x faster than parsing a strongly typed XML variable.
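
A rough sketch of that style of micro-benchmark (the document shapes and loop mechanics here are my own, not Jovan’s exact harness):

-- time 10000 JSON_VALUE() extractions against 10000 XML value() calls
DECLARE @json nvarchar(max) = N'{"x": 3.14}';
DECLARE @xml  xml = N'<point><x>3.14</x></point>';
DECLARE @i int, @v float, @start datetime2;

SET @i = 0; SET @start = SYSDATETIME();
WHILE @i < 10000
BEGIN
    SET @v = CAST(JSON_VALUE(@json, '$.x') AS float);
    SET @i += 1;
END;
SELECT DATEDIFF(MICROSECOND, @start, SYSDATETIME()) / 10000.0 AS avg_json_us;

SET @i = 0; SET @start = SYSDATETIME();
WHILE @i < 10000
BEGIN
    SET @v = @xml.value('(/point/x)[1]', 'float');
    SET @i += 1;
END;
SELECT DATEDIFF(MICROSECOND, @start, SYSDATETIME()) / 10000.0 AS avg_xml_us;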

He also compares JSON parsing to spatial type parsing:

I’m running 100,000 iterations and getting the average time spent in both cases. The results are:

  1. The spatial type needs around 11-14 microseconds to get the X coordinate
  2. The JSON parser needs around 1-2 microseconds to get the value from the X property (including a CAST to float)

We can see that parsing JSON text is much faster than the equivalent operation on the spatial type.

JSON uses the NVARCHAR type, which is the most optimized type in SQL Server because it is used in most queries. The JSON parser is based on the T-SQL parser, which is also highly optimized. Spatial is its own type, but it is still a binary serialized object that must be deserialized whenever we need to access its fields. In this case, scanning NVARCHAR is much faster than deserializing binary data into a spatial object.

On the flip side, Jovan does note that there are benefits to using spatial types, like performing spatial calculations.
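
To make the comparison concrete, here is a hypothetical point expressed both ways; the spatial access path deserializes a binary value, while the JSON access path just scans text:

DECLARE @point geometry = geometry::STGeomFromText('POINT(3.14 2.71)', 0);
DECLARE @json  nvarchar(max) = N'{"x": 3.14, "y": 2.71}';

SELECT @point.STX AS spatial_x,                              -- deserializes the binary spatial value
       CAST(JSON_VALUE(@json, '$.x') AS float) AS json_x;    -- scans the NVARCHAR text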

Multi-Structured Data In U-SQL

Melissa Coates shows us how to use U-SQL to normalize JSON files in which different rows may have differing structures:

Handling the varying formats in U-SQL involves a few steps if it’s the first time you’ve done this:

  1. Upload custom JSON assemblies [one-time setup]
  2. Create a database [one-time setup]
  3. Register custom JSON assemblies [one-time setup]
  4. Upload the JSON file to Azure Data Lake Store [manual step as an example; usually automated]
  5. Run a U-SQL script to “standardize” the JSON file(s) into a consistent CSV column/row format

Melissa then shows us how to do this step-by-step.

Convert SSAS Tabular Processing Scripts Into Tables

Chris Koester shows how to take an Analysis Services Tabular processing script in TMSL format and turn it into a table using OPENJSON:

The previous post looked at how to process SSAS Tabular models with TMSL. Since SQL Server 2016 adds new JSON capabilities, let’s look at how to convert TMSL JSON to a table with OPENJSON. OPENJSON is a new function in SQL Server 2016 that, per Microsoft:

OPENJSON is a table-valued function that parses JSON text and returns objects and properties from the JSON input as rows and columns.

In short, OPENJSON converts JSON text to a table. Since TMSL is JSON, this function can convert an SSAS Tabular processing script into a table. This could be useful if you wanted to document an SSAS processing schedule.

That’s an interesting use of OPENJSON.
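
A minimal sketch of the shape of it, using a hypothetical (and heavily trimmed) TMSL refresh command rather than a real processing script:

DECLARE @tmsl nvarchar(max) = N'{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "SalesModel", "table": "FactSales" },
      { "database": "SalesModel", "table": "DimDate" }
    ]
  }
}';

-- shred the objects array into one row per table being processed
SELECT t.[database], t.[table]
FROM OPENJSON(@tmsl, '$.refresh.objects')
WITH ([database] nvarchar(128) '$.database',
      [table]    nvarchar(128) '$.table') AS t;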

JSON In PowerShell

Adam Bertram shows an easy way of dealing with JSON data inside PowerShell:

As the world continually becomes “eaten by software,” more and more services are being replaced by software. IT pros have most likely seen this in the form of software-defined everything. One of the premier components of this focus on software, and of the continuing adoption of DevOps, is the application programming interface (API). All of these services need to talk to each other and must provide a way for programs and users to interact with them. This is where APIs come in handy. But what does this have to do with PowerShell and JSON, you ask?

APIs, more specifically REST APIs, return data when queried. This data is typically in JSON format. JSON is a way of structuring data that makes it easy for software to consume. For working with JSON in PowerShell, Microsoft has provided some helpful tools: the ConvertTo-Json and ConvertFrom-Json commands. These commands allow you to quickly work with REST APIs or any other service that returns or accepts JSON as input.

Read on for more details on how to use these commands.

JSON Data Sources In SSIS

Chris Koester shows how to read JSON data sources in SQL Server Integration Services:

Once the Script Component has been defined as a source, the output columns can be defined. For this post, the same USGS Earthquake data that was used in the “Download JSON data with PowerShell” post will serve as an example. Be careful to choose the correct data types here. This can be tedious because you have to choose the correct data types in the C# code as well and ensure that they correspond with the SSIS types. It’s helpful to bookmark an SSIS data type translation table for reference.

It does involve creating a script component, but aside from the tedium that Chris mentions, it’s not too bad.

Using WITH With OPENJSON

Jovan Popovic points out the performance difference from using the WITH clause in an OPENJSON query:

Here are the results of the queries:

SQL Server Execution Times: CPU time = 656 ms, elapsed time = 651 ms.
SQL Server Execution Times: CPU time = 204 ms, elapsed time = 197 ms.

As you can see, the WITH clause specifies that OPENJSON should immediately return properties from the JSON array without a second parsing pass. Query performance might improve by 3x if you avoid the double parsing.

That’s a pretty big difference when you specify the relevant data model elements.
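
For reference, here are the two query shapes side by side over a tiny hypothetical document (Jovan’s timings above come from much larger test data):

DECLARE @json nvarchar(max) = N'[{"id":1,"name":"One"},{"id":2,"name":"Two"}]';

-- without WITH: generic key/value rows, so each value gets parsed a second time
SELECT JSON_VALUE(j.[value], '$.id')   AS id,
       JSON_VALUE(j.[value], '$.name') AS name
FROM OPENJSON(@json) AS j;

-- with WITH: typed columns come straight out of a single parsing pass
SELECT j.id, j.name
FROM OPENJSON(@json)
WITH (id int '$.id', name nvarchar(50) '$.name') AS j;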
