SSAS Tabular RAM Requirements

Bill Anton looks at ensuring your Tabular server has enough RAM:

In addition to being an “in-memory” technology, Analysis Services Tabular is also a “column-store” technology which means all the values in a table for a single column are stored together. As a result – and this is especially true for dimensional models – we are able to achieve very high compression ratios. On average, you can expect to see compression ratios anywhere from 10x-100x depending on the model, data types, and value distribution.

What this ultimately means is that your 2 TB data mart will likely only require between 20 GB (low-end) and 200 GB (high-end) of memory. That’s pretty amazing – but still leaves us with a fairly wide margin of uncertainty. In order to further reduce the level of uncertainty, you will want to take a representative sample from your source database, load it into a model on a server in your DEV environment, and calculate the compression factor.

Read the whole thing; Bill has several factors he considers when sizing a machine.
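To make that extrapolation concrete, here is a minimal sizing sketch in Python. The numbers and the 2x processing-headroom multiplier are illustrative assumptions on my part, not figures from Bill's post: measure how much memory a representative sample consumes once loaded into a DEV model, derive the compression factor, and scale it up to the full source size.

```python
def estimate_tabular_ram_gb(sample_source_gb, sample_model_gb, full_source_gb,
                            processing_headroom=2.0):
    """Rough RAM estimate for a full Tabular model.

    sample_source_gb    -- size of the sample's source data
    sample_model_gb     -- memory the sample model consumes once processed in DEV
    full_source_gb      -- size of the full data mart
    processing_headroom -- assumed multiplier for processing, since a full process
                           can briefly hold old and new copies of the data in memory
    """
    compression_factor = sample_source_gb / sample_model_gb
    full_model_gb = full_source_gb / compression_factor
    return full_model_gb * processing_headroom

# Hypothetical example: a 50 GB sample compresses to 2 GB in the model (25x),
# so a 2,000 GB (2 TB) mart lands around 80 GB for the model itself,
# roughly 160 GB once processing headroom is included.
print(estimate_tabular_ram_gb(50, 2, 2000))
```

The compression factor you measure on a representative sample is what narrows Bill's 10x-100x range down to something you can actually size hardware against.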

