Press "Enter" to skip to content

Category: JSON

Iterating over JSON and XML Data in SQL Server

Steve Stedman explains how you can iterate through XML and JSON data using the APPLY operator:

The results are what we are looking for in this specific example, but where they break down is when there are more employees represented in the XML: for each employee, we need to add another UNION to bring the results together. That is not very iterative, and since the title of this post includes the word iterating, we need to focus on how to do that.

Now we introduce the CROSS APPLY functionality, which can be used like a JOIN to take a value from one result set (table) and apply it to a function that gets called once for each row. You can reference my JOIN TYPES poster for using CROSS APPLY.
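
As a quick taste of the technique (my own minimal sketch, with an invented table and XML document, not Steve’s exact example), CROSS APPLY plus the nodes() method returns one row per employee with no UNION in sight:

    -- Hypothetical table holding XML documents; nodes() + CROSS APPLY
    -- produces one row per <employee> element in each document.
    DECLARE @docs TABLE (id INT, doc XML);
    INSERT @docs VALUES
    (1, N'<employees>
            <employee><name>Ann</name><title>DBA</title></employee>
            <employee><name>Bob</name><title>Developer</title></employee>
          </employees>');

    SELECT d.id,
           e.value('(name/text())[1]',  'NVARCHAR(50)') AS EmployeeName,
           e.value('(title/text())[1]', 'NVARCHAR(50)') AS Title
    FROM @docs AS d
    CROSS APPLY d.doc.nodes('/employees/employee') AS t(e);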

Click through for the full set of examples.

Comparing JSON Documents with SQL Server

Phil Factor has a new function for us to try out:

It is often necessary to compare objects, particularly when you are working on a routine that generates an object and you want to check that an improvement you’ve made actually returns the same result. An XML or JSON document is the nearest we get to an object in SQL Server. This is probably as near as you’d want.

Although this routine is designed to compare JSON documents, and takes them as input, it can be used for comparing any results.

The output style borrows from the Unix diff command.
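
Phil’s function goes much further, but a minimal sketch of the underlying idea (with invented sample documents, and handling top-level properties only) is to shred both documents with OPENJSON and full-outer-join the key/value pairs, flagging differences with diff-style markers:

    DECLARE @a NVARCHAR(MAX) = N'{"name":"Ann","role":"DBA"}';
    DECLARE @b NVARCHAR(MAX) = N'{"name":"Ann","role":"Developer"}';

    SELECT COALESCE(a.[key], b.[key]) AS [key],
           a.[value] AS value_a,
           b.[value] AS value_b,
           CASE WHEN a.[key] IS NULL THEN '>'   -- only in the second document
                WHEN b.[key] IS NULL THEN '<'   -- only in the first document
                ELSE '<>' END AS diff           -- in both, but values differ
    FROM OPENJSON(@a) AS a
    FULL OUTER JOIN OPENJSON(@b) AS b ON a.[key] = b.[key]
    WHERE a.[key] IS NULL OR b.[key] IS NULL OR a.[value] <> b.[value];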

Passing Around JSON Arrays in Azure Data Factory

Rayis Imayev continues a series on JSON in Azure Data Factory:

It’s nothing new that we can reference nested elements of an ADF activity’s output, since it’s represented in JSON format, or pass JSON file content to other tasks/components that can process this format.

But what if you need to pass the complete output of your ADF activity task further down your pipeline? Or pass JSON array elements to another ADF activity or sub-pipeline as a parameter value? Let’s explore what other options are available in Azure Data Factory for this very interesting use case.

Read on for the demo.

Mongo Shell Preview for Azure Cosmos DB

Hasan Savran takes a look at the preview for a native Mongo shell in Cosmos DB:

The native Mongo Shell became available in preview mode in Azure Cosmos DB in March. I had a chance to check it out this week and decided to write about it. Mongo Shell lets you execute Mongo database commands in Cosmos DB Data Explorer! Currently, it is not available in all Azure regions. If you don’t see this option, your database might be in a region that does not support it yet.

Click on Data Explorer to see the Mongo Shell button. If you have never used it before, you will need to activate the Mongo Shell by clicking the Complete Setup button. This box will open up when you click on Open Mongo Shell.

It sounds like it’s a little bit limited at the moment, but Hasan takes you through the things you can do today.

XML and JSON Creation in SQL Server vs Elsewhere

Lukas Eder walks us through converting result sets to XML and JSON using different platforms:

SQL Server supports transforming flat tabular SQL result sets into hierarchical structures by convention using the convenient FOR XML or FOR JSON syntaxes. This is really convenient and less verbose than the standard SQL/XML or SQL/JSON APIs – although the standard ones are more powerful.

In this blog post, I’d like to show a few core features of the SQL Server syntax, and what they correspond to in standard SQL. jOOQ 3.14 will support both SQL Server’s syntax and the standard syntax, and will be able to translate from one to the other, such that you can use SQL Server syntax also on Db2, MariaDB, MySQL, Oracle, and PostgreSQL. You can play around with the current state of development on our website.
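
As a small taste of the SQL Server side (the table variables are invented for the demo), FOR JSON PATH with a correlated subquery builds the kind of nested document that the standard SQL/JSON APIs express with constructor functions:

    DECLARE @departments TABLE (id INT, name NVARCHAR(50));
    DECLARE @employees   TABLE (department_id INT, name NVARCHAR(50));
    INSERT @departments VALUES (1, N'Engineering');
    INSERT @employees   VALUES (1, N'Ann'), (1, N'Bob');

    -- Nested FOR JSON subqueries are embedded as JSON, not as escaped strings.
    SELECT d.name AS [name],
           (SELECT e.name AS [name]
            FROM @employees AS e
            WHERE e.department_id = d.id
            FOR JSON PATH) AS [employees]
    FROM @departments AS d
    FOR JSON PATH, ROOT('departments');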

Click through for a series of examples.

Documenting SQL Server Tables

Phil Factor has a way to create table documentation in source control and propagate it to the actual database:

It has always been a problem that documentation in the source, where it should be, is not then passed into the live database when the build script is executed. In a table, you have columns, constraints and indexes that you are likely to document using line-ending comments and block comments. You probably have a big block comment at the start, explaining the table. This information should be available in the live database. Microsoft don’t have a good answer and vaguely go on about adding comments in extended properties. Well, that’s fine, but it hasn’t happened, unsurprisingly: have you ever tried to do it? It is an almost impossible task, even with SQL Doc.

My solution is to execute my finely-documented build script as usual to create the latest version of the database, and then process the same script in PowerShell to add all the comments and documentation as extended properties in the right place in the live database.
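
For reference, this is the manual version of what Phil’s PowerShell automates; the dbo.Customer table and column names here are hypothetical:

    -- Attach a description to a table, then to one of its columns,
    -- as MS_Description extended properties.
    EXEC sys.sp_addextendedproperty
         @name = N'MS_Description',
         @value = N'Customers who have placed at least one order.',
         @level0type = N'SCHEMA', @level0name = N'dbo',
         @level1type = N'TABLE',  @level1name = N'Customer';

    EXEC sys.sp_addextendedproperty
         @name = N'MS_Description',
         @value = N'Natural key: the customer account number.',
         @level0type = N'SCHEMA', @level0name = N'dbo',
         @level1type = N'TABLE',  @level1name = N'Customer',
         @level2type = N'COLUMN', @level2name = N'AccountNumber';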

It’s an interesting approach to a classic problem.

Reading JSON in .NET from a DataTable

Hasan Savran ran into an issue parsing JSON data from SQL Server via .NET:

FOR JSON lets you return the data in JSON format. As you might know, SQL Server can return and query JSON documents, but it doesn’t have a special data type for them. You must store the data as a string in SQL Server. You can make SQL Server work like a NoSQL database. Your web application can retrieve data as a JSON document, and you can use dynamic objects to make things flexible.

Let’s see an example first. In the following example, I retrieve data as a JSON document and send it directly to my front-end as a string. JavaScript parses it and generates a grid from it. It’s very flexible because there is no schema; the front-end will display whatever SQL Server returns. You can change the query and, without changing any code in the middle, your grid will display the new data.

There are limits on how much JSON gets generated on the buffer at a time, so click through to see how you can rebuild the entire JSON output for a large document.
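
As I understand the workaround on the T-SQL side (the chunk size is approximate and version-dependent), a bare FOR JSON query streams a long document as several short rows, while assigning the result to an NVARCHAR(MAX) variable through a scalar subquery keeps it in one piece:

    -- Run as-is, a large FOR JSON result arrives as several ~2 KB rows
    -- that the client (e.g. a DataTable) must concatenate back together.
    SELECT o.name, o.object_id, o.type_desc
    FROM sys.objects AS o
    FOR JSON PATH;

    -- Wrapped in a scalar subquery, it comes back as one NVARCHAR(MAX) value.
    DECLARE @json NVARCHAR(MAX) =
        (SELECT o.name, o.object_id, o.type_desc
         FROM sys.objects AS o
         FOR JSON PATH);
    SELECT @json AS whole_document;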

Checking JSON Structure with ADF

Rayis Imayev takes us through the solution of a tricky problem in Azure Data Factory:

Within my “ForEach” container I have also placed a Stored Procedure task and set 4 data elements from my incoming data stream as values for corresponding parameters.

However, this approach will not work for all my incoming JSON events; it actually failed for the last one, since it didn’t have both the “stop_time” and “last_update” data elements.

An easy way to fix this problem is to add the missing data elements with empty values for the last event record; however, when we don’t have control over incoming data, we need to adjust our data processing steps.
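
Rayis’s fix uses ADF expressions; for comparison, here is how the same missing-element situation plays out in T-SQL (with invented sample events), where OPENJSON with an explicit schema simply returns NULL for absent properties:

    DECLARE @events NVARCHAR(MAX) = N'[
      {"id": 1, "stop_time": "2020-04-01T10:00:00", "last_update": "2020-04-01T10:05:00"},
      {"id": 2}
    ]';

    -- The second event lacks stop_time and last_update; both come back as NULL.
    SELECT j.id, j.stop_time, j.last_update
    FROM OPENJSON(@events)
    WITH (id INT, stop_time DATETIME2, last_update DATETIME2) AS j;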

Read on to see how Rayis solves this problem.

Transforming JSON to CSV with Azure Data Factory

Rayis Imayev shows how you can use the Flatten task in Azure Data Factory to convert JSON text to CSV:

What this new task does is help transform/transpose/flatten your JSON structure into a denormalized, flattened dataset that you can upload into a new or existing flat database table.

I like the analogy of the Transpose function in Excel, which helps rotate your vertical set of data pairs (name : value) into a table with column names and values for corresponding objects. And when this vertical JSON structure contains several similar sets (an array), the ADF Mapping Data Flows Flatten task does a really good job of transforming it into a table with several rows (records).
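
For readers who think in T-SQL rather than data flows, a rough analog of what Flatten does (with an invented sample document): OPENJSON over the array path emits one row per array element, with the parent’s scalar values repeated on each row:

    DECLARE @doc NVARCHAR(MAX) = N'{
      "batch": "2020-04-18",
      "readings": [
        {"sensor": "A", "value": 10},
        {"sensor": "B", "value": 12}
      ]
    }';

    -- One output row per element of $.readings, parent value repeated.
    SELECT JSON_VALUE(@doc, '$.batch') AS batch,
           r.sensor, r.[value]
    FROM OPENJSON(@doc, '$.readings')
    WITH (sensor NVARCHAR(10), [value] INT) AS r;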

Click through for a demonstration.
