Cathrine Wilhelmsen continues a series on Azure Data Factory. First, we get an overview of the available components:
Pipelines are the things you execute or run in Azure Data Factory, similar to packages in SQL Server Integration Services (SSIS). This is where you define your workflow: what you want to do and in which order. For example, a pipeline can first copy data from an on-premises data center to Azure Data Lake Storage, and then transform the data from Azure Data Lake Storage into Azure Synapse Analytics (previously Azure SQL Data Warehouse).
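Azure Data Factory pipelines are authored as JSON under the hood, so the workflow described above can be sketched roughly like this — a pipeline with a single Copy activity moving a CSV dataset into Azure Data Lake Storage. The pipeline, activity, and dataset names here are hypothetical placeholders, not from the original post; the overall shape follows ADF's pipeline JSON format:

```json
{
  "name": "CopyLegoDataPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySetsToDataLake",
        "type": "Copy",
        "inputs": [
          { "referenceName": "RebrickableSetsCsv", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "DataLakeSetsCsv", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```

A transformation step into Azure Synapse Analytics would then be a second activity in the same `activities` array, chained after the copy via a dependency condition.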
Then, Cathrine looks at the Copy Data wizard:
LEGO! Yay! I love LEGO. Rebrickable is an online service that will show you which LEGO sets you can build from the sets and parts you already own. Fun! 🙂
They also have a database of all official LEGO sets and parts (including themes and colors) that you can download for free as CSV files or JSON files.
The CSV files are automatically generated at the start of each month and can be found on rebrickable.com/downloads.
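Once downloaded, those CSV files are straightforward to work with in plain Python. A minimal sketch, assuming the sets file has the column layout of Rebrickable's `sets.csv` (column names assumed from the downloadable file; the sample rows below are illustrative, not real download data):

```python
import csv
import io

# Inline sample in the shape of Rebrickable's sets.csv
# (columns assumed: set_num, name, year, theme_id, num_parts).
SAMPLE = """set_num,name,year,theme_id,num_parts
8860-1,Car Chassis,1980,5,668
375-2,Castle,1978,186,471
"""

def parse_sets(text):
    """Parse a sets.csv-style string into a list of dicts with typed fields."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        row["year"] = int(row["year"])
        row["num_parts"] = int(row["num_parts"])
        rows.append(row)
    return rows

sets = parse_sets(SAMPLE)
print(len(sets), sets[0]["name"])  # → 2 Car Chassis
```

In practice you would point this at the monthly files from rebrickable.com/downloads rather than an inline string; the parsing logic is the same.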
Cathrine takes this LEGO data and feeds it into Azure Data Lake Storage.