Incremental Loading Using Datetime Columns

Reza Rad shows a pattern for implementing incremental loads using modified dates:

The idea behind this method is to store the latest ETL run time in a config or log table, and then in the next ETL run load only the records from the source table whose modified date is greater than or equal to that latest ETL run datetime. This creates the change set for the data table. The change set might contain inserted, updated, or deleted records. To identify which change happened to each record, you need to compare the change set with the existing records and separate the inserted, updated, and deleted records. This change set, with the action for each record, can be inserted into staging tables and then used to apply the appropriate action to the fact table.

In my experience, the hardest part about this is making sure people update ModifiedTime when they update rows in the table.
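
If it helps to see the moving parts, here is a rough T-SQL sketch of the pattern as I read it. All table and column names (EtlRunLog, SourceSales, FactSales, the stg.SalesChangeSet staging table, the IsDeleted soft-delete flag) are stand-ins of my own, not anything from Reza's post:

```sql
-- A minimal sketch of incremental loading off a modified-date column.
-- Names are illustrative; adjust to your own schema.

-- 1. Get the datetime of the last successful ETL run from the log table.
DECLARE @LastRunTime datetime2 =
    (SELECT MAX(RunEndTime) FROM dbo.EtlRunLog WHERE Succeeded = 1);

DECLARE @CurrentRunTime datetime2 = SYSUTCDATETIME();

-- 2. Build the change set: source rows modified since the last run,
--    classified by comparing them against the existing fact rows.
INSERT INTO stg.SalesChangeSet (SalesKey, Amount, ModifiedTime, ChangeAction)
SELECT
    s.SalesKey,
    s.Amount,
    s.ModifiedTime,
    CASE
        WHEN f.SalesKey IS NULL THEN 'INSERT'
        WHEN s.IsDeleted = 1    THEN 'DELETE'   -- assumes soft deletes in the source
        ELSE 'UPDATE'
    END AS ChangeAction
FROM dbo.SourceSales AS s
LEFT JOIN dbo.FactSales AS f
    ON f.SalesKey = s.SalesKey
WHERE s.ModifiedTime >= @LastRunTime;

-- 3. Apply the staged changes to the fact table based on each row's action.
MERGE dbo.FactSales AS f
USING stg.SalesChangeSet AS c
    ON f.SalesKey = c.SalesKey
WHEN MATCHED AND c.ChangeAction = 'DELETE' THEN
    DELETE
WHEN MATCHED AND c.ChangeAction = 'UPDATE' THEN
    UPDATE SET f.Amount = c.Amount, f.ModifiedTime = c.ModifiedTime
WHEN NOT MATCHED AND c.ChangeAction = 'INSERT' THEN
    INSERT (SalesKey, Amount, ModifiedTime)
    VALUES (c.SalesKey, c.Amount, c.ModifiedTime);

-- 4. Record this run so the next load picks up from here.
INSERT INTO dbo.EtlRunLog (RunEndTime, Succeeded)
VALUES (@CurrentRunTime, 1);
```

One caveat: a modified-date filter only catches deletes if the source soft-deletes rows (sets a flag and bumps the modified date). Hard deletes never show up in the change set, so you would need a separate key comparison against the full source to find them.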
