Ok, so before I get started I will caveat this quick post by saying that there is an easier (or preferred) way to perform compression on files in SSIS: the ZipFile class in recent versions of the .NET Framework. Sadly, this is not available if you are using any legacy deployments.
While I have not spent ages adapting the following code to perform what should be a simple task, I have spent enough time to justify sharing what I have done, so hopefully you will find some benefit from the code below as well.
Click through to see Mark’s code and explanation.
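For comparison, the easier route on .NET Framework 4.5 and later that the post alludes to is essentially a one-liner. Here is a minimal sketch; the folder and file names are made up for illustration:

```csharp
// Minimal sketch of the "easier way" using the built-in ZipFile class
// (.NET Framework 4.5+). Requires a reference to the
// System.IO.Compression.FileSystem assembly. Paths are placeholders.
using System;
using System.IO;
using System.IO.Compression;

class ZipSketch
{
    static void Main()
    {
        string sourceFolder = Path.Combine(Path.GetTempPath(), "ssis_staging");
        string zipPath = Path.Combine(Path.GetTempPath(), "ssis_staging.zip");

        // Stage a sample file so there is something to compress.
        Directory.CreateDirectory(sourceFolder);
        File.WriteAllText(Path.Combine(sourceFolder, "extract.csv"), "id,value\n1,42");

        if (File.Exists(zipPath))
            File.Delete(zipPath);

        // One call compresses the entire folder into an archive.
        ZipFile.CreateFromDirectory(sourceFolder, zipPath);

        Console.WriteLine(File.Exists(zipPath));
    }
}
```

On legacy deployments where `System.IO.Compression` is unavailable, Mark's approach in the linked post is the workaround.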
The article will include the following topics:
Get the list of services using PowerShell in C#.
How to send SSIS Parameters to PowerShell using the script task.
How to use the PowerShell.AddScript function.
How to invoke a PowerShell script in C#.
I think this is a fairly limited scenario: if you're going to have to write C# code anyhow, you can do this same work in C#. I suppose it would be most useful in cases where you have to call common PowerShell cmdlets rather than writing your own .NET code.
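The hosting pattern the article covers boils down to a few calls. This is a hedged sketch, not the article's own code: it requires a reference to the System.Management.Automation assembly, and the script text is an invented example of the Get-Service scenario from the topic list:

```csharp
// Sketch of hosting PowerShell inside C# via AddScript/Invoke.
// Requires a reference to System.Management.Automation; the script
// string below is illustrative, not taken from the linked article.
using System;
using System.Management.Automation;

class PowerShellHostSketch
{
    static void Main()
    {
        using (PowerShell ps = PowerShell.Create())
        {
            // In a real SSIS script task you would build this string from
            // package parameters (Dts.Variables) rather than hard-coding it.
            ps.AddScript("Get-Service | Where-Object { $_.Status -eq 'Running' } | Select-Object -ExpandProperty Name");

            // Invoke() runs the pipeline and returns a collection of PSObjects.
            foreach (PSObject result in ps.Invoke())
            {
                Console.WriteLine(result);
            }

            if (ps.HadErrors)
            {
                throw new InvalidOperationException("PowerShell script reported errors.");
            }
        }
    }
}
```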
There is an awesome set of PowerShell cmdlets out there written by MVP Chrissy LeMaire. This method is my personal choice. It works great and is easy to automate. You can run it with SQL Agent or you can just use Scheduled Tasks in the OS. The Scheduled Tasks method is a little cleaner, but you don't get to see it in SQL Server. Also, if you are on a cluster running Windows Server 2012, you can cluster the Task Scheduler as an added benefit.
Chrissy wrote this with the intent of making migrations easier, and she succeeded. In fact, I made it a point to thank her at MVP Summit last year because it made my life insanely easier. The advantage here is that you can automate a lot more than just logins. In fact, you can migrate and automate pretty much anything at the server level. Here is the link that I guarantee you are going to bookmark, followed by a video demo where I show how to install and automate the syncing of logins using both the SQL Agent method and the Scheduled Tasks method.
dbatools would be my preference in this situation as well, but click through to see four other methods, as well as code.
This post talks about the issue I ran into with SSIS Mapping Files.
We currently run DB2 on an IBM iSeries AS400 for our ERP system. I was tasked with copying data from the AS400 to a SQL Server database for some consultants to use. The C-Suite didn't want to give the consultants access to our AS400, so this was the workaround that was put forth and accepted (and no, no one asked me before I was "voluntold" for the task). Since this would essentially be a "one-time" thing, I chose to use the Import Export Wizard, but I would save the package just in case they wanted this process repeated.
Sounds like it was a painful experience, but it does have a happy ending.
Once the Script Component has been defined as a source, the output columns can be defined. For this post, the same USGS Earthquake data that was used in the "Download JSON data with PowerShell" post will serve as an example. Be careful to choose the correct data types here. This can be tedious because you have to choose the correct data types in the C# code as well, and ensure that they correspond with the SSIS types. It's helpful to bookmark an SSIS data type translation table for reference.
It does involve creating a script component, but aside from the tedium that Chris mentions, it’s not too bad.
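To make the type-matching chore concrete, here is a small self-contained sketch. The sample earthquake-style values are invented, but the SSIS-to-.NET pairings in the comments are the standard ones you would verify against a translation table:

```csharp
// Illustration of keeping SSIS output-column types and C# types in sync.
// Sample values are made up; the pairings in the comments are the
// standard SSIS-to-.NET data type mappings.
using System;

class TypeMappingSketch
{
    static void Main()
    {
        string place = "34km NW of Somewhere";    // DT_WSTR        -> System.String
        double magnitude = 4.7;                   // DT_R8          -> System.Double
        int significance = 340;                   // DT_I4          -> System.Int32
        DateTime eventTime = DateTime.UtcNow;     // DT_DBTIMESTAMP -> System.DateTime

        // If an output column were declared DT_I4 but the C# value were a
        // long, the buffer assignment would fail at runtime; hence the
        // tedium of checking both sides.
        Console.WriteLine($"{eventTime:o} {place} M{magnitude} sig={significance}");
    }
}
```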
I prefer to passively gather performance metrics – even if it's a little convoluted and more work (for me). I don't often need these metrics immediately available, so I execute queries similar to the query below and store the performance metrics in a table. I can (and do) build dashboards to track SSIS performance (perhaps I should blog about them…) using these passively collected performance metrics.
As with Lookup Transformation messages, OLE DB Destinations in Data Flow Tasks record specific messages using the same format every time. Because of this (hard-coded) consistency, you and I can passively collect the number of rows written while executing packages in the SSIS Catalog using the (default) Basic logging level. We can use the following Transact-SQL query to collect this execution metadata post-execution:
Click through for the script.
While recently examining an existing SSIS package before implementing some needed changes I noticed the connection manager for multiple flat files.
Normally, when processing multiple files, I use a sequence container and loop through them. It's usually a cumbersome process to set up, and you must configure variables. I was not aware the multiple flat file type of connection manager existed. No wonder, because it's not within the normal list. This handy thing could only be found when I looked under “New Connection”.
Click through for more details.
The “Stage EDW Data” Framework Application is identified by ApplicationID 2. If you recall, ApplicationID 2 is mapped to PackageIDs 4, 5, and 6 (LoadWidgetsFlatFile.dtsx, LoadSalesFlatFile.dtsx, and ArchiveFile.dtsx) in the ApplicationPackages table shown above.
The cardinality between Framework Applications and Framework Packages is many-to-many. We see an Application can contain many Packages. The less-obvious part of the relationship is represented in this example: a single Package can participate in multiple Applications or even in the same Application more than once. Hence the need for a table that resolves this many-to-many relationship. I hope this helps explain why Application Package is the core object of our SSIS Frameworks.
Read on for the rest of the story.
Why This Complexity?
My argument for embracing (or ignoring) the complexity of SSIS Catalog configuration is straightforward: Externalization – the act of storing execution-time values outside of the object to be executed – is one way to achieve portability, a software development best practice.
As an architect I loathe complexity. Whenever I encounter (or opt to develop) complexity in a software project, I question it. I ask others to question it. I don’t always arrive at the right conclusion, but I challenge complexity at every turn.
I’ve helped several organizations implement SSIS Catalog portability using Environments and References. We all agree that the solution is complex, but most also agree that the complexity is worth the flexibility achieved thereby.
This is a fairly lengthy blog post, full of good information.
One of the first Data Integration Lifecycle Management (DILM) Suite solutions I built was Catalog Reports. Catalog Reports is a relatively simple and straightforward version of some of the SSIS Catalog Reports embedded in SSMS. The main difference is Catalog Reports is a SQL Server Reporting Services (SSRS) solution.
And it's open source. Here's a screenshot of the Overview Report for the same execution above.
Check it out.