I have a post showing off some of the value of compressing R models:
So right now, we’re burning roughly 200 KB per model. My stated goal is to be able to store several years’ worth of data for 10 million products. Let’s say that I need 10 million products in ProductModel and 1 billion rows in ProductModelHistory. That means we’d end up with 1.86 TB of data in the ProductModel table and 186 TB in ProductModelHistory. This seems…excessive.
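As a quick sanity check on those numbers (assuming “200 KB” means 204,800 bytes per model and a terabyte is 2^40 bytes), the arithmetic works out like this:

```sql
-- Back-of-the-envelope math; 204,800 bytes per model is an assumption.
SELECT
    (204800.0 * 10000000)   / 1099511627776 AS ProductModelTB,        -- ~1.86
    (204800.0 * 1000000000) / 1099511627776 AS ProductModelHistoryTB; -- ~186
```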
As a result, I decided to try using the COMPRESS() function in SQL Server 2016. The COMPRESS function simply uses GZip compression. Yes, there are compression algorithms which tend to be more compact (e.g., bz2 or 7z), but GZip is relatively CPU-efficient, and I can wrap my SQL statements with COMPRESS() and DECOMPRESS() without having to change any calling code; I just need to update the two stored procedures I use to insert and then retrieve product models.
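To make that concrete, here is a minimal sketch of what those two procedures could look like. The table definition, column names, and procedure names are stand-ins rather than the actual schema from the post; the point is simply that COMPRESS() on the insert path and DECOMPRESS() on the retrieval path keep the calling code unchanged.

```sql
-- Sketch only: table, columns, and procedure names are assumptions.
CREATE TABLE dbo.ProductModel
(
    ProductID int NOT NULL PRIMARY KEY,
    Model varbinary(max) NOT NULL   -- GZip-compressed output of COMPRESS()
);
GO

-- Insert path: COMPRESS() GZips the serialized model on the way in.
CREATE PROCEDURE dbo.SaveProductModel
    @ProductID int,
    @Model varbinary(max)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.ProductModel (ProductID, Model)
    VALUES (@ProductID, COMPRESS(@Model));
END
GO

-- Retrieval path: DECOMPRESS() hands back the original varbinary,
-- so callers never see that the stored bytes were compressed.
CREATE PROCEDURE dbo.GetProductModel
    @ProductID int
AS
BEGIN
    SET NOCOUNT ON;

    SELECT pm.ProductID,
           DECOMPRESS(pm.Model) AS Model
    FROM dbo.ProductModel AS pm
    WHERE pm.ProductID = @ProductID;
END
GO
```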
Most of the time, it’s not a big deal. But once you start talking hundreds of gigabytes, or, in my case, a couple hundred terabytes, it’s definitely worth compressing this data.