
Category: Storage

Things to Know about Storage

Monica Rathbun gives us a primer on storage concepts:

“One Gerbil, Two Gerbils or Three Gerbils?” is a common DBA joke about server and storage performance. No matter how many gerbils power your storage, you need to know what type they are and the power that they provide. Storage is not about gerbils; it is about IOPS, bandwidth, latency, and tiers.

As a DBA it is important for you to understand and know what kind of storage is attached to your servers and how it is handling your data. It is not important to master everything about it, but it is very advantageous to be able to talk to your storage admins or “Gerbil CoLo, LLC” provider intelligently, especially when you experience performance issues. Here is a list of things I encourage you to know and ask.

Click through for the cheat sheet.
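As a rough aide-mémoire for how a couple of those terms fit together (my own back-of-the-envelope sketch, not something from Monica's post), throughput is roughly IOPS multiplied by the I/O size:

```python
# Rough illustration: throughput (MB/s) is roughly IOPS times the I/O size.
# The numbers are made-up examples, not benchmarks.

def throughput_mb_per_sec(iops: float, io_size_kb: float) -> float:
    return iops * io_size_kb / 1024

# 5,000 IOPS at 8 KB (OLTP-style random reads)...
print(throughput_mb_per_sec(5_000, 8))    # ~39 MB/s
# ...versus 5,000 IOPS at 64 KB (larger, scan-style reads).
print(throughput_mb_per_sec(5_000, 64))   # ~312 MB/s
```

The same IOPS figure can mean very different bandwidth depending on the I/O size, which is one reason to ask your storage admin which workload a quoted number was measured with.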

Comments closed

Saving Money on Backups to Azure Blob Storage

John McCormack has a few tips for saving some cash:

You have 5 databases on a SQL Server Instance. You take daily full backups of each database on your instance. You also take log backups every 15 minutes as each database is in full recovery mode. This means in 1 week, you will have 35 full backups and 3,360 transaction log backups. This multiplies to 1,820 full and 174,720 t-log backups over 52 weeks. Multiply this for 7 years or more and the costs can get very expensive.

Click through to see how you can save a considerable amount with a bit of planning.
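To put John's arithmetic in one place, here is a quick sketch mirroring the numbers in the quote:

```python
# Sanity-checking the backup counts from the quoted example.
databases = 5
fulls_per_day = 1                 # one full backup per database per day
logs_per_day = 24 * 60 // 15      # a log backup every 15 minutes = 96 per day

weekly_fulls = databases * fulls_per_day * 7    # 35
weekly_logs = databases * logs_per_day * 7      # 3,360
yearly_fulls = weekly_fulls * 52                # 1,820
yearly_logs = weekly_logs * 52                  # 174,720

print(yearly_fulls * 7, yearly_logs * 7)        # 12,740 and 1,223,040 over 7 years
```

Over seven years that is well over a million backup files, so retention policies and storage tiers make a real difference to the bill.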

Comments closed

Disk Performance Testing in 2020

Glenn Berry gives us some CrystalDiskMark results:

Recently, I built a new AMD mainstream desktop system with some existing parts that I had available. This system has six storage drives, with various levels of technology and performance. I thought it would be interesting to run CrystalDiskMark 7.0.0 on each of these drives. So, here are some quick comparative CrystalDiskMark results in 2020 from those six drives.

This system has a Gigabyte B550 AORUS MASTER motherboard, which is actually a great choice for a B550 motherboard, especially if you want extra storage flexibility. AMD B550 motherboards only have PCIe 4.0 support from the CPU, not from the B550 chipset.

Glenn gets some outstanding performance from one drive and reminds us once more of how beautiful SSDs and M.2 drives are.

Comments closed

AzureTableStor: Table Storage in R

Hong Ooi announces a new package on CRAN:

I’m pleased to announce that the AzureTableStor package, providing a simple yet powerful interface to the Azure table storage service, is now on CRAN. This is something that many people have requested since the initial release of the AzureR packages nearly two years ago.

Azure table storage is a service that stores structured NoSQL data in the cloud, providing a key/attribute store with a schemaless design. Because table storage is schemaless, it’s easy to adapt your data as the needs of your application evolve. Access to table storage data is fast and cost-effective for many types of applications, and is typically lower in cost than traditional SQL for similar volumes of data.

If that sounds like a fit for you, check out the package.
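AzureTableStor is an R package, but the underlying key/attribute model is the same from any client. Purely as an illustration (using the azure-data-tables Python SDK rather than Hong's package, and with a placeholder connection string), the schemaless design looks like this:

```python
# Illustration of Azure table storage's key/attribute model, using the
# azure-data-tables Python SDK (not the AzureTableStor R package).
from azure.data.tables import TableServiceClient

# Placeholder connection string; substitute your storage account's own.
service = TableServiceClient.from_connection_string("<connection-string>")
table = service.create_table_if_not_exists("Inventory")

# Entities are schemaless: each one only needs a PartitionKey and RowKey,
# and the remaining attributes can differ from entity to entity.
table.create_entity({"PartitionKey": "widgets", "RowKey": "001", "Qty": 42})
table.create_entity({"PartitionKey": "widgets", "RowKey": "002", "Colour": "red"})

for entity in table.query_entities("PartitionKey eq 'widgets'"):
    print(dict(entity))
```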

Comments closed

Monitoring Storage Metrics

Robert Sheldon continues a series on storage concepts for the DBA:

When monitoring storage systems, engineers should track a variety of metrics to ensure the systems continue to meet application requirements. Three of the most important and commonly cited metrics are latency, I/O operations per second (IOPS), and throughput. In addition to these three, queue length and I/O splitting can also provide valuable insights into storage performance.

In this article, I discuss all five of these metrics and demonstrate them in action. Despite my focus on these five, they’re not the only important metrics to monitor. For example, engineers should also track storage capacity, device cache usage, controller operations, and storage networks. Even seemingly unrelated components can be a factor, such as low CPU utilization, which can indicate that the processor is waiting on storage to complete requests from the application.

Robert also shows off the new perfmon, which may or may not be better than Perfmon Classic.
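As a side note of my own (not from Robert's article), three of those metrics are tied together by Little's law: the average number of outstanding I/Os is roughly IOPS multiplied by average latency.

```python
# Little's law applied to storage: average queue depth is roughly
# IOPS multiplied by average latency in seconds. Made-up numbers below.

def avg_queue_depth(iops: float, latency_ms: float) -> float:
    return iops * (latency_ms / 1000)

# 2,000 IOPS at 1 ms latency keeps about 2 I/Os in flight...
print(avg_queue_depth(2_000, 1))    # 2.0
# ...while 2,000 IOPS at 20 ms latency means roughly 40 outstanding I/Os,
# the sort of queue length that stands out in perfmon.
print(avg_queue_depth(2_000, 20))   # 40.0
```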

Comments closed

Storing SQL Server Database Files in Blob Storage

Tomaz Kastrun has a wacky idea:

Storing SQL Server database files in Azure blob storage is a great solution for all the databases that are often migrated between instances, servers, or virtual machines, or that would otherwise be divided between instances. This scenario also has a positive aspect to it: the ability to create snapshot backups to Azure is seamless.

Following the steps, we will create an Azure Blob storage account, where MSSQL Server database files will reside with MSSQL Server running on-prem. Assuming that you already have an Azure account (if not, you can get a free Azure account), let’s proceed by opening the Windows Terminal in PowerShell mode.

I’m impressed that it worked and could see it being an option for small demo databases, but I can’t imagine performance would be good enough for a production scenario.
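Tomaz does the setup in PowerShell; purely as an illustration of the prerequisite step (creating the container and the SAS token that the SQL Server credential needs), a rough Python equivalent with the azure-storage-blob package might look like this, with the account name and key as placeholders:

```python
# Rough sketch of the container + SAS setup step (the original uses PowerShell).
# Account name and key are placeholders, not real values.
from datetime import datetime, timedelta
from azure.storage.blob import (
    BlobServiceClient,
    ContainerSasPermissions,
    generate_container_sas,
)

account_name = "<storage-account>"
account_key = "<account-key>"

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)
service.create_container("sqldatafiles")

# The SQL Server credential for the container needs a SAS token with
# read/write/list/delete rights.
sas = generate_container_sas(
    account_name=account_name,
    container_name="sqldatafiles",
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, write=True, list=True, delete=True),
    expiry=datetime.utcnow() + timedelta(days=365),
)
print(sas)  # goes into CREATE CREDENTIAL ... WITH IDENTITY = 'SHARED ACCESS SIGNATURE'
```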

Comments closed

Loading Data from S3 into Power BI

Gilbert Quevauvilliers loves a challenge:

I really enjoy a good challenge, and my customer has all their data stored in AWS S3. Whilst there is no native connector, I thought there must be a way for me to get the data from AWS S3 into Power BI.

I did a bit of Googling and could not find any suitable solution. I also learnt that I could use AWS Athena to query the data living in S3. (I am definitely not an expert, nor do I have a lot of knowledge, in the AWS space. I am fortunate that I have other people who know AWS and were able to set up, configure, and give me the details to connect to S3 via AWS Athena.)

Below are the steps on how I got this working.

Why they don’t have a proper connector is a bit of a head-scratcher to me given the sheer amount of data stored in S3 and the sheer number of connectors in Power BI.
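For the curious, the Athena layer Gilbert describes is just SQL over the files sitting in S3; Power BI then connects to Athena (via ODBC) rather than to S3 itself. A rough sketch of what Athena is doing under the covers, using boto3 with placeholder region, bucket, database, and table names:

```python
# Sketch of the Athena step: run SQL over data in S3 and read the results.
# Region, database, table, and result bucket are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-1")

execution = athena.start_query_execution(
    QueryString="SELECT * FROM sales LIMIT 10",
    QueryExecutionContext={"Database": "s3_demo_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results-bucket/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then fetch the result set.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```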

Comments closed

Disk Caching with SQL Server VM Disks in Azure

Niko Neugebauer performs some tests:

Microsoft has been extremely clear in its best practices recommendations for SQL Server workloads on Azure VMs:
– use read caching for the data drives/storage pools
– use no caching for the log drives/storage pools
– use read caching for the temp db drives/storage pools

Sounds simple and direct, doesn’t it?
Let me borrow your attention for the next couple of minutes to point out some situations where you might want to reconsider the best practices.

But do read on for some important notes.

Comments closed

Backing Up Databases to Azure Blob Storage

David Fowler shows how you can back up databases to Azure Blob Storage:

SQL Server has given us the option to back up our databases directly to Azure BLOB storage for a while now, but it’s not something that I’ve had all that much call to use until recently.

So this is just going to be a quick walk-through of how you can back up your on-premises SQL Servers to Azure BLOB storage. I’m going to assume that you’ve already got an Azure account; if you haven’t, you can set up a free trial which will see you good for this demo.

Performance typically won’t be as good as backing up locally to disk, so if you need the fastest backup performance and cloud storage, the best route would be to write backups to disk and have a separate process which migrates them to Blob Storage, S3, or wherever. But in many cases, doing this directly can work out just fine, especially if you are already using an Azure-based VM.
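As a minimal sketch of that separate migration process (assuming the azure-storage-blob package, with the connection string, container name, and backup folder as placeholders):

```python
# Minimal sketch of the "write locally, then migrate" approach: push any .bak
# files from a local backup folder into a blob container. All names are placeholders.
from pathlib import Path
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("sqlbackups")

for bak in Path(r"D:\Backups").glob("*.bak"):
    with bak.open("rb") as data:
        container.get_blob_client(bak.name).upload_blob(data, overwrite=True)
    print(f"Uploaded {bak.name}")
    # A real process would also verify the upload and apply the retention
    # policy to the local copy.
```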

Comments closed