Azure Data Factory V2 Pricing

Kevin Feasel

2018-05-29

Cloud, ETL

Chris Seferlis gives us the details on how Azure Data Factory V2 pricing works:

2. Volume of data moved – this is measured in DMUs (data movement units). This is one you should be aware of, as it will default to auto, which basically uses all the DMUs it can, and you pay by the hour. Let's say you specify and use 2 DMUs and it takes an hour to move that data. The other option is you could use 8 DMUs and it takes 15 minutes; the price is going to end up the same. You're using 4X the DMUs, but it's happening in a quarter of the time.

This is good to look at and do some comparisons, since how many DMUs you're using is where the bulk of your spend is going to be.

There are a few moving parts here, so the calculation is not trivial. But Chris makes good sense of it all.
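A quick back-of-the-envelope sketch makes the arithmetic concrete: the charge scales with DMU-hours, so 2 DMUs for an hour and 8 DMUs for fifteen minutes bill out the same. The per-DMU-hour rate below is purely illustrative, not an official Azure figure; check the current pricing page for real numbers.

```python
# Illustrative sketch: ADF data movement cost is proportional to DMU-hours.
RATE_PER_DMU_HOUR = 0.25  # assumed placeholder rate, not an official figure

def movement_cost(dmus: int, hours: float, rate: float = RATE_PER_DMU_HOUR) -> float:
    """Cost of a copy activity given the DMUs used and its wall-clock duration."""
    return dmus * hours * rate

# 2 DMUs for a full hour vs. 8 DMUs for a quarter hour: same DMU-hours, same price.
print(movement_cost(2, 1.0))   # 2 DMU-hours
print(movement_cost(8, 0.25))  # 2 DMU-hours
```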
