Category: JSON

Passing Around JSON Arrays in Azure Data Factory

Rayis Imayev continues a series on JSON in Azure Data Factory:

It’s nothing new that we can reference nested elements of an ADF activity’s output, since it’s represented in JSON format, or pass JSON file content to other tasks/components that can process this format.

But what if you need to pass the complete output of your ADF activity task further down your pipeline? Or pass JSON array elements to another ADF activity or sub-pipeline as a parameter value? Let’s explore what other options are available in Azure Data Factory for this very interesting use case.

Read on for the demo.
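
To give a flavor of the technique (a minimal sketch, not Rayis’s exact pipeline; the activity name ‘Get Events’ and the parameter name EventsArray are made up), ADF’s expression language can serialize an activity’s output array to a string with string(), pass it to a child pipeline as a parameter, and turn it back into an object with json():

    Parent pipeline, Execute Pipeline activity, parameter EventsArray:
        @string(activity('Get Events').output.value)

    Child pipeline, wherever the array is needed:
        @json(pipeline().parameters.EventsArray)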

Mongo Shell Preview for Azure Cosmos DB

Hasan Savran takes a look at the preview for a native Mongo shell in Cosmos DB:

Native Mongo Shell became available in preview mode in Azure Cosmos DB in March. I had a chance to check it out this week and decided to write about it. Mongo Shell lets you execute Mongo database commands in the Cosmos DB Data Explorer! Currently, it is not available in all Azure regions. If you don’t see this option, your database might be in a region that does not support it yet.

Click on Data Explorer to see the Mongo Shell button. If you have never used it before, you will need to activate Mongo Shell by clicking the Complete Setup button. This box will open up when you click Open Mongo Shell.

It sounds like it’s a little bit limited at the moment, but Hasan takes you through the things you can do today.
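
Once the shell is activated, ordinary Mongo shell commands run against your Cosmos DB collections. For example (a generic illustration, not from Hasan’s post; the database and collection names are made up):

    use storedb
    db.products.find({ "category": "books" }).limit(5)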

XML and JSON Creation in SQL Server vs Elsewhere

Lukas Eder walks us through converting result sets to XML and JSON using different platforms:

SQL Server supports transforming flat tabular SQL result sets into hierarchical structures by convention using the convenient FOR XML or FOR JSON syntaxes. This is really convenient and less verbose than the standard SQL/XML or SQL/JSON APIs – although the standard ones are more powerful.

In this blog post, I’d like to show a few core features of the SQL Server syntax and what they correspond to in standard SQL. jOOQ 3.14 will support both SQL Server’s syntax and the standard syntax, and will be able to translate from one to the other, such that you can also use SQL Server syntax on Db2, MariaDB, MySQL, Oracle, and PostgreSQL. You can play around with the current state of development on our website here.

Click through for a series of examples.
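
To give a taste of the contrast (a minimal sketch with assumed table and column names, not one of Lukas’s examples): SQL Server appends a FOR JSON clause by convention, while standard SQL/JSON builds the document explicitly with constructor functions.

    -- SQL Server's convention-based syntax
    SELECT c.CustomerID, c.CompanyName
    FROM dbo.Customers AS c
    FOR JSON PATH, ROOT('customers');

    -- Standard SQL/JSON, which jOOQ can translate to for other dialects
    SELECT JSON_OBJECT(
             KEY 'customers' VALUE JSON_ARRAYAGG(
               JSON_OBJECT(KEY 'CustomerID' VALUE c.CustomerID,
                           KEY 'CompanyName' VALUE c.CompanyName)))
    FROM dbo.Customers AS c;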

Documenting SQL Server Tables

Phil Factor has a way to create table documentation in source control and propagate it to the actual database:

It has always been a problem that documentation in the source, where it should be, is not then passed into the live database when the build script is executed. In a table, you have columns, constraints and indexes that you are likely to document using line-ending comments and block comments. You probably have a big block comment at the start, explaining the table. This information should be available in the live database. Microsoft don’t have a good answer and vaguely go on about adding comments in extended properties. Well, that’s fine, but it hasn’t happened, unsurprisingly: have you ever tried to do it? It is an almost impossible task, even with SQL Doc.

My solution is to execute my finely-documented build script as usual to create the latest version of the database, and then process the same script in PowerShell to add all the comments and documentation as extended properties in the right place in the live database.

It’s an interesting approach to a classic problem.
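
For context, the calls that end up being generated look roughly like this (a hand-written sketch with made-up names, not Phil’s actual output): sys.sp_addextendedproperty attaches a documentation string to a schema-scoped object.

    EXEC sys.sp_addextendedproperty
        @name = N'MS_Description',
        @value = N'One row per customer; includes billing contact details.',
        @level0type = N'SCHEMA', @level0name = N'dbo',
        @level1type = N'TABLE',  @level1name = N'Customer';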

Reading JSON in .NET from a DataTable

Hasan Savran ran into an issue parsing JSON data from SQL Server via .NET:

FOR JSON lets you return the data in JSON format. As you might know, SQL Server can return and query JSON documents, but it doesn’t have a special data type for JSON documents. You must store the data as a string in SQL Server. You can make SQL Server work like a NoSQL database. Your web application can retrieve data as a JSON document, and you can use dynamic objects to make things flexible.

Let’s see an example first. In the following example, I retrieve data as a JSON document and send it directly to my front-end as a string. JavaScript parses it and generates a grid from it. It’s very flexible because there is no schema. The front-end will display whatever SQL Server returns. You can change the query and, without changing any code in the middle, your grid will display the data.

There are limits to how much JSON gets generated in the buffer at a time, so click through to see how you can rebuild the entire JSON output for a large file.
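
Hasan’s fix reassembles the chunks on the .NET side. For comparison, a server-side alternative (a minimal sketch with assumed table and column names) is to assign the FOR JSON result to a variable, which returns one complete nvarchar(max) value instead of a stream of roughly 2,033-character rows:

    DECLARE @json nvarchar(max);
    SET @json = (SELECT CustomerID, CompanyName
                 FROM dbo.Customers
                 FOR JSON PATH);
    SELECT @json AS JsonDocument;  -- one row, one whole document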

Checking JSON Structure with ADF

Rayis Imayev takes us through the solution of a tricky problem in Azure Data Factory:

Within my “ForEach” container I have also placed a Stored Procedure task and set 4 data elements from my incoming data stream as values for corresponding parameters.

However, this approach will not work for all of my incoming JSON events; it actually failed for the last one, since it didn’t have both the “stop_time” and “last_update” data elements.

An easy way to fix this problem is to add the missing data elements with empty values to the last event record. However, when we don’t have control over the incoming data, we need to adjust our data processing steps instead.

Read on to see how Rayis solves this problem.
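
One way to guard a parameter expression against a missing element (a sketch of the general idea, not necessarily Rayis’s exact fix) is ADF’s contains() function, which can test whether the current ForEach item has a given key before referencing it:

    @if(contains(item(), 'stop_time'), item().stop_time, '')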

Transforming JSON to CSV with Azure Data Factory

Rayis Imayev shows how you can use the Flatten task in Azure Data Factory to convert JSON text to CSV:

What this new task does is help transform/transpose/flatten your JSON structure into a denormalized, flattened dataset that you can upload into a new or existing flat database table.

I like the analogy of the Transpose function in Excel, which helps to rotate your vertical set of data pairs (name : value) into a table with column names and values for the corresponding objects. And when this vertical JSON structure contains several similar sets (an array), the ADF Mapping Data Flows Flatten task does a really good job of transforming it into a table with several rows (records).

Click through for a demonstration.
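
A tiny made-up illustration of what Flatten produces: each element of the nested array becomes its own output row, with the parent’s fields repeated alongside it.

    Input JSON:
        { "store": "A12", "sales": [ { "item": "pen", "qty": 3 },
                                     { "item": "ink", "qty": 1 } ] }

    Flattened output:
        store, item, qty
        A12,   pen,  3
        A12,   ink,  1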

Parsing ADF ARM Templates with T-SQL

Paul Andrew shows how you can use T-SQL to read an Azure Data Factory ARM template:

While documenting a customer’s data platform solution, I decided it would be far easier if we could summarise the contents of a fairly complex Data Factory using its ARM Template. So, this is what I’ve done using T-SQL to parse the ARM Template JSON and output a series of tables containing details about the factory components.

That is quite the clever solution.
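
The general shape of the approach (a minimal sketch with an assumed file path, not Paul’s actual queries): load the exported template as text, then let OPENJSON walk its resources array.

    DECLARE @arm nvarchar(max) =
        (SELECT BulkColumn
         FROM OPENROWSET(BULK 'C:\temp\ARMTemplateForFactory.json', SINGLE_CLOB) AS t);

    SELECT JSON_VALUE(r.value, '$.name') AS ResourceName,
           JSON_VALUE(r.value, '$.type') AS ResourceType
    FROM OPENJSON(@arm, '$.resources') AS r;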

Automating JSON to Tables

Dave Mason has a stored procedure for us:

The code for my stored procedure is below. It has a single input parameter for a string of JSON data. The name/value pairs of the first “row” of JSON data are parsed to obtain column names and types for the result set. A query string for the OPENJSON function is constructed and executed via EXEC. JSON’s support for data types is pretty sparse compared to SQL Server’s (notice the big CASE expression where I attempt to do some mapping). If you’re using SQL 2016, you’ll have to make an edit for the STRING_AGG function.

Click through for the code and demos.
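
A heavily condensed sketch of the idea (not Dave’s actual procedure, which handles types and edge cases far more carefully): read the keys and JSON types of the first array element, build a WITH clause from them, and run the OPENJSON query dynamically.

    DECLARE @json nvarchar(max) = N'[{"Id":1,"Name":"Anna","Active":true}]';

    -- derive column definitions from the first element's keys and JSON types
    DECLARE @cols nvarchar(max) =
        (SELECT STRING_AGG(QUOTENAME([key]) + N' ' +
                    CASE [type] WHEN 2 THEN N'numeric(38,10)'  -- JSON number
                                WHEN 3 THEN N'bit'             -- JSON true/false
                                ELSE N'nvarchar(4000)' END, N', ')
         FROM OPENJSON(@json, '$[0]'));

    DECLARE @sql nvarchar(max) =
        N'SELECT * FROM OPENJSON(@json) WITH (' + @cols + N');';
    EXEC sys.sp_executesql @sql, N'@json nvarchar(max)', @json = @json;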

When FOR JSON PATH Isn’t Enough

Dave Mason walks us through some options when working with JSON data in SQL Server:

In both situations, we need to know something about the JSON schema to query it in a meaningful way: in the first example, column names and types are hard-coded; in the second example, column names are hard-coded as path parameter values for the JSON_VALUE function. Even though JSON data is self-describing, SQL Server doesn’t have a way to infer schema. (I would be quite happy to be wrong about this–please add a comment if you know something I don’t!) About the time I came to this realization, I commented on Twitter that JSON might be fool’s gold. You don’t need to know schema to store JSON data in SQL Server. But you do if you want to query it. “It’s pay me now or pay me later.”

It’s schema on read or schema on write. I’m not sure there is ever a truly schema-free scenario in a business application.
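
To make the trade-off concrete (a small sketch, not from Dave’s post): without a schema, OPENJSON can only report which keys and JSON types are present at runtime; to get a typed result set, you still hard-code the schema in a WITH clause.

    DECLARE @json nvarchar(max) = N'{"Id":7,"Name":"Widget","Price":9.99}';

    -- schema-free: keys, values, and JSON types come back as data
    SELECT [key], [value], [type] FROM OPENJSON(@json);

    -- typed: the schema is hard-coded, pay-me-now style
    SELECT * FROM OPENJSON(@json)
    WITH (Id int, Name nvarchar(50), Price decimal(9,2));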
