Splitting A Large Script With PowerShell

Jana Sattainathan shows how to split a file containing a large number of independent operations and have PowerShell run the job in batches:

Every day is an interesting day in the life of a DBA. Today, I received a request to run in a HUGE script. This script had 125k INSERT statements, each on a separate line. Just opening the script was an issue, let alone running it into a database. You get this error in SSMS just trying to open it: “The operation could not be completed. Not enough storage is available to complete this operation”

This is how I handled it:

  1. I split the file into manageable 1,000-line files (125 files in total)
  2. Looped through each split file and ran it in!

Yes, it was that simple. Thanks to PowerShell!

Read on to see how simple it is.
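
Jana's full script is in the linked post, but here is a minimal sketch of the approach. The file paths, server name, and database name below are placeholders, and the sketch assumes the SqlServer module's Invoke-Sqlcmd is available; the actual implementation in the post may differ.

```powershell
# Minimal sketch of the split-and-run approach; paths and names are hypothetical.
$source    = 'C:\temp\huge_script.sql'   # the 125k-line script
$outFolder = 'C:\temp\chunks'
$chunkSize = 1000                        # lines per split file

New-Item -ItemType Directory -Path $outFolder -Force | Out-Null

# Step 1: split the file into 1,000-line chunks.
# -ReadCount streams the file in batches instead of loading it all at once.
$chunk = 0
Get-Content $source -ReadCount $chunkSize | ForEach-Object {
    $chunk++
    $file = Join-Path $outFolder ("chunk_{0:D3}.sql" -f $chunk)
    $_ | Set-Content -Path $file
}

# Step 2: loop through the split files and run each one in.
Get-ChildItem $outFolder -Filter 'chunk_*.sql' | Sort-Object Name | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'TargetDb' -InputFile $_.FullName
}
```

Because Get-Content -ReadCount hands the pipeline 1,000 lines at a time, the huge script is never held in memory in full, which is exactly why PowerShell succeeds where SSMS fails to even open the file.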
