Leo Furlong shows how we can load an Azure SQL Data Warehouse dimension with Databricks:
Ingesting data into the Data Lake occurs in steps 1 and 2 of our architecture. Azure Data Factory (ADF) provides an excellent mechanism for loading data from source applications into a Data Lake stored in Azure Data Lake Storage Gen2. In fact, Microsoft offers a template in the ADF Template gallery which provides a metadata-driven approach for doing so. The template comes with a control table example in a SQL Server database, a data source dataset, and a data destination dataset. More on this template can be found in the official documentation.
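To make the metadata-driven idea concrete, the control table that drives such a pipeline might look something like the sketch below. This is purely illustrative: the gallery template ships its own schema, and every table and column name here is a hypothetical stand-in, not the template's actual definition.

```sql
-- Hypothetical control table for a metadata-driven ADF copy pipeline.
-- Each row describes one source table to land in the Data Lake;
-- the pipeline loops over enabled rows and copies each one.
CREATE TABLE dbo.PipelineControl (
    ControlId       INT IDENTITY(1,1) PRIMARY KEY,
    SourceSchema    NVARCHAR(128) NOT NULL,  -- source table schema
    SourceTable     NVARCHAR(128) NOT NULL,  -- source table name
    SinkFolderPath  NVARCHAR(512) NOT NULL,  -- target folder in ADLS Gen2
    WatermarkColumn NVARCHAR(128) NULL,      -- column used for incremental loads
    LastWatermark   DATETIME2     NULL,      -- high-water mark from the last run
    IsEnabled       BIT NOT NULL DEFAULT 1   -- toggle the table in/out of the load
);
```

Adding a new table to the ingestion process then becomes an INSERT into the control table rather than a change to the pipeline itself, which is the main appeal of the metadata-driven approach.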
I appreciate that this is a full walkthrough of the process, not just one step.