Press "Enter" to skip to content

Category: JSON

Azure SQL Database Supports JSON

Jovan Popovic reports that Azure SQL Database now has full JSON support:

JSON is available in all service tiers (basic, standard, and premium), but only in the new SQL Database V12. You can see a quick introduction here or more details on the Getting Started page. You can also find code samples that use JSON functions in Azure SQL Database on the official SQL Server/Azure SQL Database GitHub repository.

Note that the OPENJSON function requires database compatibility level 130. If all functions work except OPENJSON, you will need to set the latest compatibility level in the database.
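
Setting the compatibility level is a one-liner; the database name below is a placeholder for your own:

-- Set compatibility level 130 (SQL Server 2016 / Azure SQL Database V12)
-- so that OPENJSON becomes available. "MyDatabase" is a placeholder.
ALTER DATABASE MyDatabase SET COMPATIBILITY_LEVEL = 130;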

It will be interesting to see adoption of JSON within Azure SQL Database.  I could see it being a bit more likely due to DocumentDB.


Learning JSON

Jason Brimhall wants to learn a bit of JSON:

Let’s just get this out there right now – I suck at JSON. I suck at XML. The idea of querying a non-normalized document to get the data is not very endearing to me. It is for that reason that I have written utilities or scripts to help generate my XML shredding scripts – as can be seen here.

Knowing that I have this allergy to features similar to XML, I need to build up some resistance to the allergy through a little learning and a little practice. Based on that, my plan is pretty simple:

  1. Read up on JSON

  2. Find some tutorials on JSON

  3. Practice using the feature

  4. Potentially do something destructive with JSON

I’m not particularly excited about JSON support in SQL Server 2016, but the fact that it is there, combined with the fact that so many developers love JSON, means that it’s a good idea to learn how to integrate it, if only to figure out when it’s a bad idea to parse JSON within your very expensive SQL Server instances.


JSON Parsing Performance

Jovan Popovic answers a question I’ve had on my mind:

One of the first questions people asked once we announced JSON support in SQL Server 2016 was “Would it be slow?” and “How fast can you parse JSON text?” In this post, I will compare the performance of JSON parsing with the JSON_VALUE function against the XML and string functions.

The short answer is, JSON parsing should be faster than XML but slower than our historical T-SQL parsing functions.
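
To make the comparison concrete, here is a rough sketch of the two extraction styles being compared; the sample values are mine, not Jovan’s test data:

-- Extract a scalar from JSON text with JSON_VALUE...
DECLARE @json NVARCHAR(MAX) = N'{"name":"Widget","price":9.99}';
SELECT JSON_VALUE(@json, '$.name') AS NameFromJson;

-- ...versus extracting the equivalent scalar from XML with XQuery.
DECLARE @xml XML = N'<product><name>Widget</name><price>9.99</price></product>';
SELECT @xml.value('(/product/name)[1]', 'NVARCHAR(100)') AS NameFromXml;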


Using GeoJSON Data

Jovan Popovic shows how to use data in GeoJSON format.

First, building data in GeoJSON format from a spatial type:

The geometry object holds the type of the spatial data and its coordinates. The “properties” object can hold various custom properties, such as address line, town, postcode, and other information that describes the object. SQL Server stores spatial information as geometry or geography types, and also stores additional properties in standard table columns.

Since GeoJSON is JSON, it can be formatted using the new FOR JSON clause in SQL Server.

In this example, we are going to format the content of the Person.Address table, which has the spatial column SpatialLocation, as GeoJSON using the FOR JSON clause.
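
A minimal sketch of that kind of query, assuming AdventureWorks’ Person.Address table; the GeoJSON property names are my own choices, and float-to-string precision handling is simplified:

-- Build one GeoJSON Feature per address, nesting a point geometry and a
-- properties object. FOR JSON PATH turns the dotted aliases into nested objects.
SELECT TOP (10)
    'Feature' AS [type],
    JSON_QUERY('{"type":"Point","coordinates":['
        + CONVERT(VARCHAR(32), SpatialLocation.Long) + ','
        + CONVERT(VARCHAR(32), SpatialLocation.Lat) + ']}') AS [geometry],
    AddressLine1 AS [properties.address],
    City AS [properties.town],
    PostalCode AS [properties.postcode]
FROM Person.Address
FOR JSON PATH;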

Then, converting GeoJSON to Geography types:

The new OPENJSON function in SQL Server 2016 enables you to parse and load GeoJSON text into SQL Server spatial types.

In this example, I will load GeoJSON text that contains a set of bike share locations in Washington, DC. The GeoJSON sample is provided by Esri and can be found at https://github.com/Esri/geojson-layer-js/blob/master/data/dc-bike-share.json
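
A minimal sketch of the shredding step, with a tiny inline FeatureCollection standing in for the real file; the property name is made up:

-- Shred the features array with OPENJSON, then build geography points from
-- the [longitude, latitude] coordinate pairs.
DECLARE @geojson NVARCHAR(MAX) = N'{"type":"FeatureCollection","features":[
  {"type":"Feature",
   "geometry":{"type":"Point","coordinates":[-77.0369,38.9072]},
   "properties":{"name":"Sample station"}}]}';

SELECT
    JSON_VALUE(feature.value, '$.properties.name') AS StationName,
    geography::Point(
        CAST(JSON_VALUE(feature.value, '$.geometry.coordinates[1]') AS FLOAT),  -- latitude
        CAST(JSON_VALUE(feature.value, '$.geometry.coordinates[0]') AS FLOAT),  -- longitude
        4326) AS Location
FROM OPENJSON(@geojson, '$.features') AS feature;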

Check them out.


Indexing JSON

Jovan Popovic answers a question which has been on my mind:  how are we supposed to index JSON data in SQL Server 2016?

In this post I will show how you can add indexes on JSON properties in a product catalog. In SQL Server 2016, you can use two types of indexes on JSON text:

  1. An index on a computed column that indexes specific properties in the JSON.
  2. A full-text search index that can index all key:value pairs in JSON objects.
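
The computed-column approach (the first option above) looks roughly like this; the ProductCatalog table and ProductInfo JSON column are assumptions on my part, not necessarily Jovan’s schema:

-- Expose a JSON property as a computed column, then index that column.
ALTER TABLE ProductCatalog
    ADD Color AS JSON_VALUE(ProductInfo, '$.Color');

CREATE INDEX IX_ProductCatalog_Color
    ON ProductCatalog (Color);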

This is the downside to JSON not being an official type:  indexing is somewhat limited.  In comparison, you could create XML indexes which were specially-designed to do the job of searching for text within an XML field.


FOR JSON WITHOUT_ARRAY_WRAPPER

Jovan Popovic introduces us to the WITHOUT_ARRAY_WRAPPER clause:

This option enables you to remove the square brackets [ and ] that surround the JSON text generated by the FOR JSON clause. I will use the following example:

SELECT 2015 as year, 12 as month, 15 as day
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER

This query will return:

{ "year":2015, "month":12, "day":15 }

However, without this option, following text would be returned:

[{ "year":2015, "month":12, "day":15 }]

Jovan also points out important changes in FOR JSON output between CTPs 3.1 and 3.2.


JSON Leads To New Wave Of 1NF Failures

Jovan Popovic talks about storing JSON in SQL Server:

Instead of a single JSON object, you can organize your data in this “collection”. If you do not want to explicitly check the structure of each JSON column, you don’t need to add a JSON check constraint on every column (in this example, I have added a CHECK constraint only on the EmailAddresses column).

If you compare this structure to a standard NoSQL collection, you might notice that you will have faster access to strongly typed data (FirstName and LastName). Therefore, this solution is a good choice for hybrid models, where you can identify some information that is repeated across all objects and store the other, variable information as JSON. This way, you can combine flexibility and performance.
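
A minimal sketch of that kind of hybrid table, borrowing the EmailAddresses column from the quoted example; the other names are mine:

-- Strongly typed columns for data repeated across all objects, plus a JSON
-- column (guarded by ISJSON) for the variable part.
CREATE TABLE Person (
    PersonID INT IDENTITY PRIMARY KEY,
    FirstName NVARCHAR(100) NOT NULL,
    LastName NVARCHAR(100) NOT NULL,
    EmailAddresses NVARCHAR(MAX)
        CONSTRAINT CK_Person_EmailAddresses CHECK (ISJSON(EmailAddresses) > 0)
);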

Okay, we’ve hit my first major problem with JSON support:  rampant violation of first normal form.  You can create check constraints on JSON code, and that’s pretty snazzy I guess, but I know a better way to store relational data in a relational database system.  JSON support is great when you ask SQL Server to be a holder of text blobs, but this is begging for bad design decisions.


JSON In SQL 2016

Jovan Popovic has a couple of posts on JSON.  First, using OPENJSON to generate a tally table:

Problem: I want to dynamically generate a table of numbers (e.g. from 0 to N). Unfortunately we don’t have this kind of function in SQL Server 2016, but we can use OPENJSON as a workaround.

OPENJSON can parse an array of numbers [1,25,3,5,32334,54,24,3] and return a table with [key, value] pairs. The values will be the elements of the array, and the keys will be their indexes (i.e., numbers from 0 to the length of the array minus 1). In this example I don’t care about the values; I just need the indexes.
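
Something along these lines, as a sketch; building the array with REPLICATE is my assumption, not necessarily Jovan’s exact approach:

-- Build a JSON array with @N + 1 elements; OPENJSON assigns keys 0..@N,
-- which become the number sequence.
DECLARE @N INT = 10;

SELECT CAST([key] AS INT) AS Number
FROM OPENJSON('[1' + REPLICATE(',1', @N) + ']');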

Well, that’s one way to do it.

Also, Jovan talks about performance of FOR JSON PATH:

You might notice that the table scans take the majority of the query cost. The cost of the FOR JSON (JSON SELECT) operator is 0% compared to the others. Also, since we are joining small tables (one sales order and a few details), the cost of the JOIN is minor. Therefore, if you are processing small requests, there will be no performance difference between formatting JSON on the client side and in the database layer.
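
For context, the kind of query being measured is a header-plus-details join formatted with FOR JSON PATH; here is a rough sketch, with table and column names assuming AdventureWorks rather than Jovan’s exact test:

-- Format one sales order as JSON, with its detail lines as a nested array.
SELECT h.SalesOrderID, h.OrderDate,
       (SELECT d.ProductID, d.OrderQty, d.LineTotal
        FROM Sales.SalesOrderDetail AS d
        WHERE d.SalesOrderID = h.SalesOrderID
        FOR JSON PATH) AS Details
FROM Sales.SalesOrderHeader AS h
WHERE h.SalesOrderID = 43659
FOR JSON PATH;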

Jovan’s comment was actually due to a bug in the AdventureWorks CTP 3 database.  The good news is that JSON isn’t obviously causing performance problems, but I’d like to see some more thorough performance tests.

Both posts via Database Weekly.
