Using Azure Data Factory With Biml

Meagan Longoria has a multi-part series on using Biml to script Azure Data Factory tasks that migrate data from an on-premises SQL Server instance to Azure Data Lake Store. Here's part 1:

My Azure Data Factory is made up of the following components:

  • Gateway – Allows ADF to retrieve data from an on-premises data source

  • Linked Services – Define the connection string and other connection properties for each source and destination

  • Datasets – Define a pointer to the data you want to process, sometimes including the schema of the input and output data

  • Pipelines – Combine the datasets and activities and define an execution schedule
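As a rough illustration (this is not Meagan's BimlScript, which is what the post itself walks through), the sketch below shows the kind of ADF (v1) JSON those components boil down to: a linked service that references the gateway, plus one dataset per source table. The gateway name, connection string, and table list are placeholder values.

import json

# Placeholder inputs: in the real series these would come from Biml
# metadata, not hard-coded values.
TABLES = ["dbo.Customer", "dbo.SalesOrder"]
GATEWAY = "MyGateway"  # Data Management Gateway registered with ADF
CONN = "Server=myserver;Database=mydb;Integrated Security=True"

def linked_service(name):
    # ADF v1 linked service for an on-premises SQL Server,
    # reached through the gateway
    return {
        "name": name,
        "properties": {
            "type": "OnPremisesSqlServer",
            "typeProperties": {
                "connectionString": CONN,
                "gatewayName": GATEWAY,
            },
        },
    }

def sql_dataset(table, linked_service_name):
    # Dataset: a pointer to one source table, with a daily
    # availability window
    return {
        "name": table.replace(".", "_"),
        "properties": {
            "type": "SqlServerTable",
            "linkedServiceName": linked_service_name,
            "typeProperties": {"tableName": table},
            "availability": {"frequency": "Day", "interval": 1},
        },
    }

if __name__ == "__main__":
    with open("OnPremSqlServer.json", "w") as f:
        json.dump(linked_service("OnPremSqlServer"), f, indent=2)
    for t in TABLES:
        with open(t.replace(".", "_") + ".json", "w") as f:
            json.dump(sql_dataset(t, "OnPremSqlServer"), f, indent=2)

In the series, Biml plays the role of this script: it loops over metadata about the source tables and writes out one JSON file per linked service, dataset, and pipeline.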

Click through for the Biml.

