
Category: Integration Services

Emailing SSIS Errors

Peter Schott improves upon Kevin Hill’s script:


Recently, Kevin Hill (b | t) posted on getting package errors from the SSIS catalog in a single query, as opposed to clicking through the SSIS reports and digging through pages. I took that and ran with it a little bit. The first pass needed an additional index on the catalog to increase performance. Kevin’s included that at the bottom of his query on the post above. (You probably don’t need the included “message” column, though.)
I wanted to take this and run with it a little bit more to report on all errors for a given folder within the last day, then e-mail that in an HTML-formatted e-mail. To that end, I wrote up a quick stored procedure that takes a Folder, Package, or Project name and a “to” e-mail address, and sends an e-mail through DBMail.

Click through for the script.
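As a rough sketch of the general approach (not Peter's actual procedure), something like the following pulls the last day's errors for one folder from the catalog.event_messages and catalog.executions views and mails them as an HTML table; the folder name, Database Mail profile, and recipient are placeholders:

-- Sketch only: errors in one folder from the last day, e-mailed via DBMail
DECLARE @FolderName sysname       = N'MyFolder',        -- placeholder folder
        @Recipient  varchar(255)  = 'dba@example.com',  -- placeholder recipient
        @Body       nvarchar(max);

SELECT @Body =
    N'<table border="1"><tr><th>Package</th><th>Message Time</th><th>Message</th></tr>'
    + CAST((SELECT td = em.package_name, '',
                   td = CONVERT(varchar(30), em.message_time, 120), '',
                   td = em.message, ''
            FROM SSISDB.catalog.event_messages AS em
            JOIN SSISDB.catalog.executions AS ex
                ON ex.execution_id = em.operation_id
            WHERE ex.folder_name = @FolderName
              AND em.message_type = 120                          -- 120 = error
              AND em.message_time > DATEADD(DAY, -1, SYSDATETIMEOFFSET())
            ORDER BY em.message_time
            FOR XML PATH('tr'), TYPE) AS nvarchar(max))
    + N'</table>';

IF @Body IS NOT NULL                                             -- nothing to send if no errors
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Default',                               -- placeholder DBMail profile
        @recipients   = @Recipient,
        @subject      = 'SSIS errors in the last day',
        @body         = @Body,
        @body_format  = 'HTML';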


Optimizing SSIS Catalog Cleanup

Tim Mitchell has a script which replaces [internal].[cleanup_server_retention_window] in the SSISDB database:

Earlier this week, I blogged about the automatic cleanup process that purges old data from the SSIS catalog logging tables. This nightly process removes data for operations that are older than 365 days. While this is useful, many SSIS admins have complained that this process is very slow and contentious on large or busy SSISDB databases.
In this post, I’ll show you one of the main reasons this purge process is slow and share a more efficient way of performing this delete operation.

Click through for the script and explanation.
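To illustrate the batched-delete idea only (this is not Tim's script), the pattern is to remove expired child rows in small chunks rather than deleting from internal.operations and letting cascading deletes do all of the work. The table names, cutoff, and batch size below are assumptions; see the linked post for the real implementation:

-- Sketch: delete expired child rows in batches before touching internal.operations
USE SSISDB;

DECLARE @CutoffDate datetime = DATEADD(DAY, -365, GETDATE()),   -- retention window
        @BatchSize  int      = 5000,
        @RowCount   int      = 1;

WHILE @RowCount > 0
BEGIN
    DELETE TOP (@BatchSize) em
    FROM internal.event_messages AS em
    JOIN internal.operations AS op
        ON op.operation_id = em.operation_id
    WHERE op.created_time < @CutoffDate;

    SET @RowCount = @@ROWCOUNT;                                  -- stop when nothing left to delete
END;

-- Repeat the same pattern for the other child tables (operation_messages,
-- executable_statistics, execution_component_phases, ...) before finally
-- removing the expired rows from internal.operations itself.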


Looping Over Files In SSIS

Tim Mitchell shows us how to use the foreach loop component to iterate over a set of files:

The SSIS foreach loop is configured to allow you to easily ingest multiple data files in a single data flow task. For this to work, all of the files would need to reside in the same directory structure (either locally or on the network), and they must all have the same structure and metadata.
In this design, the data flow is contained within the foreach container, which will execute the contents of that data flow task once for each file found in the specified directory.

This gives us a good pattern for loading a bunch of text files, such as monthly extracts from a different system.


Tips For Migrating SSISDB

Kenneth Fisher shares some thoughts on SSISDB:

We’ve been doing a lot of upgrading recently and at one point had to move an instance from one 2016 server to another. In the process, we found out (the hard way) that it’s not that easy to move SSISDB (the SSIS Catalog that may or may not be named SSISDB). I mean it’s not hard, but it’s definitely not a basic backup/restore. The full BOL instructions on how to do this are here. That said, here are the elements that are involved.

Read on for the list as well as an order of operations.
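The piece that most often trips people up is the database master key. As a minimal sketch of just that piece, with placeholder paths and passwords (the full, supported steps are in the BOL article Kenneth links):

-- On the source server: back up the database and its master key
USE SSISDB;
BACKUP MASTER KEY TO FILE = 'C:\Backup\SSISDB_key'               -- placeholder path
    ENCRYPTION BY PASSWORD = '<StrongPassword>';
BACKUP DATABASE SSISDB TO DISK = 'C:\Backup\SSISDB.bak';         -- placeholder path

-- On the destination server: restore the database, then open the master key
-- with the original catalog password and re-encrypt it with the destination's
-- service master key
RESTORE DATABASE SSISDB FROM DISK = 'C:\Backup\SSISDB.bak' WITH REPLACE;
GO
USE SSISDB;
OPEN MASTER KEY DECRYPTION BY PASSWORD = '<OriginalCatalogPassword>';
ALTER MASTER KEY ADD ENCRYPTION BY SERVICE MASTER KEY;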


Testing Package Properties With ssisUnit

Bartosz Ratajczyk shows how you can test certain properties on an Integration Services package using ssisUnit:

The command is simple. You can get or set the property using the value for a given property path. As usual, when you get the value, you leave the value blank. The path – well – is the path to the element in the package or the project. You use backslashes to separate elements in the package tree, and at the end, you use .Properties[PropertyName] to read the property. If you use an elements collection – like connection managers – you can pick a single element using square brackets and the name of that element.

Read on for more, including limitations and useful testing scenarios.


Validating SSIS Packages Using T-SQL

Annie Xu shows us how to validate SSIS packages in the SSISDB catalog using T-SQL:

Recently, I needed to do a data warehouse migration for a client. Since there might be some differences between the Dev environment source databases and the Prod environment source databases, the migrated SSIS packages for building the data warehouse might have some failures because of those changes. So the challenge is: how can I validate all of my DW packages (100+) at once?

Click through for the script.
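As a rough sketch of the idea (not Annie's exact script), you could loop over catalog.projects for a folder and call catalog.validate_project on each one; the folder name below is a placeholder:

-- Sketch: validate every project in one SSIS catalog folder
DECLARE @FolderName   nvarchar(128) = N'DW',    -- placeholder folder
        @ProjectName  nvarchar(128),
        @ValidationId bigint;

DECLARE project_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT p.name
    FROM SSISDB.catalog.projects AS p
    JOIN SSISDB.catalog.folders AS f
        ON f.folder_id = p.folder_id
    WHERE f.name = @FolderName;

OPEN project_cursor;
FETCH NEXT FROM project_cursor INTO @ProjectName;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC SSISDB.catalog.validate_project
        @folder_name   = @FolderName,
        @project_name  = @ProjectName,
        @validate_type = 'F',                    -- full validation
        @validation_id = @ValidationId OUTPUT;

    FETCH NEXT FROM project_cursor INTO @ProjectName;
END;

CLOSE project_cursor;
DEALLOCATE project_cursor;

-- Validation results show up in catalog.validations and the standard catalog reports.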


Contrasting Integration Services And Pentaho Data Integration

Koen Verbeeck contrasts SQL Server Integration Services with Pentaho Data Integration:

For generating SSIS packages, you need to rely on Biml (much about that can be found on this blog or on the net), or older frameworks such as ezApi. Or you need 3rd party tools such as BimlStudio or TimeXtender. Using Biml means writing XML and .NET. Don’t get me wrong, I love Biml and I use it a lot in my SSIS projects.

But generating transformations in PDI is so much easier. First, you create a template (you create a transformation, but you leave certain fields empty, such as the source SQL statement and the destination table). Then you have another transformation reading metadata. This metadata is pushed to the template using the Metadata Injection Transformation. In this transformation, you point to the template and you map those empty fields to your metadata fields.

It’s interesting to see where each product stands out or falls flat compared to the other, and Koen’s comparison is definitely not a one-sided bout.


Using Biml To Read Excel Files Without Excel

Bill Fellows follows up on his prior post and shows how you can write BimlScript to parse an Excel file without having Microsoft Office installed:

My resources are quite simple: an Excel spreadsheet containing metadata, a driver program, and a package template.

The template is your standard truncate-and-reload pattern with the target table specified by a parameter. The client validates data by running processes in parallel, so the existing mainframe process delivers data to the Billing table while ours delivers to a Billing_NEW table. Once they accept the new process, the target table becomes Billing and the NEW table is dropped. I decided the most native SSIS route would be to specify the target table as a parameter. We originally had a boolean parameter indicating whether we were loading the new table or the production one, but that was more logic and overhead than just specifying which table to load. I force their queries to be dirty reads, as some of these queries can be rather messy.

Click through for the script and explanation.


Azure Data Factory Or Integration Services?

Teo Lachev contrasts use cases for Integration Services versus Azure Data Factory V2:

So, ADF was incorrectly positioned as “SSIS for the Cloud” and unfortunately, once that message made it out, there was a messaging problem that Microsoft has been fighting ever since. Like Azure ML, on the glory road to the cloud, things that were difficult with SSIS (installation, projects, deployment) became simple, and things that were simple became difficult. Naturally, Microsoft took a lot of criticism from the customers and community, including from your humble correspondent. ADF, of course, has nothing to do with SSIS, thus leaving many data integration practitioners with a difficult choice: should you take the risk and take the road less traveled with ADF, or continue with the tried-and-true SSIS for data integration on Azure?

To Microsoft’s credit, ADF v2 has made significant enhancements in features, usability, and maintainability. There is also a “lift and shift” option to run SSIS inside ADF, but since this architecture requires a VM, I consider it a narrow-case scenario, such as when you need to extend ADF with SSIS features that it doesn’t have. Otherwise, why would you start new development with SSIS hosted under ADF, if you could provision and license the VM yourself and have full control over it?

All in all, Teo is not the biggest fan of ADF at this point and leans heavily toward SSIS; read on for the reasoning.
