While this may seem like a lot of parts just to copy a few files, it’s important to note that I’ve only scratched the surface of what ADF can do. Think of ADF as an airline that manages and enables cargo (data) movement between cities (data sources). A pipeline represents the overall goal of moving certain cargo from one city to another. The linked service is the airport, which provides a landing point and access control for each city. The dataset is the manifest of cargo to move, the activity is the flight itself, and the integration runtime is the airport infrastructure that makes the movement possible. A single pipeline requires all of these objects to run successfully, but many pipelines can reuse the same objects to complete different tasks. Once you’ve created these data factory objects, it is straightforward to layer on additional functionality or more pipelines. ADF’s visual tools also make building these objects a breeze: to create my pipeline, all I had to do was click “Copy data” in the visual tools start menu and follow the steps provided.
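To make the analogy concrete, here is a minimal sketch of how these objects reference one another, written as plain Python dicts that mirror the general shape of ADF’s JSON authoring format. All the names here (`SourceStorage`, `SourceFiles`, `CopyBlobFiles`, and so on) are hypothetical, and the structure is illustrative rather than a complete, deployable definition:

```python
# Conceptual sketch (not the real ADF SDK): how a pipeline's objects chain
# together by name. Names and exact JSON shape are illustrative.

# Linked service: the "airport" granting access to a data store.
linked_service = {
    "name": "SourceStorage",
    "properties": {"type": "AzureBlobStorage"},
}

# Dataset: the "cargo manifest", defined against a linked service.
dataset = {
    "name": "SourceFiles",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": linked_service["name"],
            "type": "LinkedServiceReference",
        },
    },
}

# Pipeline: the overall goal, built from activities ("flights") that
# reference datasets as inputs and outputs.
pipeline = {
    "name": "CopyBlobFiles",
    "properties": {
        "activities": [
            {
                "name": "CopyFiles",
                "type": "Copy",
                "inputs": [
                    {"referenceName": dataset["name"], "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "DestinationFiles", "type": "DatasetReference"}
                ],
            }
        ]
    },
}

# The chain of references: pipeline -> activity -> dataset -> linked service.
activity = pipeline["properties"]["activities"][0]
print(activity["inputs"][0]["referenceName"])  # SourceFiles
```

This name-based chaining is why many pipelines can share the same linked services and datasets: each object is defined once and referenced wherever it is needed.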
Matt has a video demonstrating the process as well.