Parsing Gigantic JSON Text

Kevin Feasel



Jovan Popovic has created a 4.35 GB JSON array:

SQL Server 2016 and Azure SQL Database enable you to parse JSON text and transform it into tabular format. In this post, you will see that JSON functions can handle very large JSON text – up to 4 GB.

First, I would need a very large JSON document. I'm using the TPC-H database, so I will export the content of the lineitem table to a file. The JSON can be exported using the bcp.exe program:
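A minimal sketch of that kind of export follows; the instance name, output path, and FOR JSON AUTO shaping are illustrative assumptions rather than details taken from the original post:

bcp "SELECT * FROM tpch.dbo.lineitem FOR JSON AUTO" queryout "C:\temp\lineitem.json" -c -S localhost -T

On the parsing side, OPENJSON with a WITH clause is what turns JSON text into rows and typed columns. A small self-contained example against a hypothetical inline fragment of lineitem-style JSON (not the 4 GB file):

DECLARE @json NVARCHAR(MAX) = N'[
  {"L_ORDERKEY": 1, "L_QUANTITY": 17.00, "L_SHIPDATE": "1996-03-13"},
  {"L_ORDERKEY": 1, "L_QUANTITY": 36.00, "L_SHIPDATE": "1996-04-12"}
]';

-- Shred the array into one row per element, with typed columns.
SELECT L_ORDERKEY, L_QUANTITY, L_SHIPDATE
FROM OPENJSON(@json)
WITH (
    L_ORDERKEY INT           '$.L_ORDERKEY',
    L_QUANTITY DECIMAL(15,2) '$.L_QUANTITY',
    L_SHIPDATE DATE          '$.L_SHIPDATE'
);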

My first draft read “Jovan Popovic has created a monster.”  I might go back to that one.  On the plus side, the operation took a lot less time than I had expected, though I’d have to imagine that his SQL Express instance had some decent specs.


