Ed Elliott continues a series on SQL syntax concepts in Spark:
The next example is how to do a CTE (Common Table Expression). When creating the CTE I will also rename one of the columns from “dataType” to “x”.
Read on for the answer.
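As a hedged sketch of the idea in the post: the WITH … AS syntax below is standard SQL that Spark SQL also accepts, with the column rename from "dataType" to "x" happening in the CTE's SELECT list. It runs here via sqlite3 so no Spark cluster is needed; the table name "types" and its contents are made up for illustration.

```python
import sqlite3

# Minimal sketch of a CTE that renames "dataType" to "x" in its SELECT
# list. The WITH ... AS syntax is standard SQL (Spark SQL accepts it
# too); sqlite3 is used only so the example runs without Spark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE types (name TEXT, dataType TEXT)")
conn.execute(
    "INSERT INTO types VALUES ('id', 'int'), ('created', 'timestamp')"
)

rows = conn.execute("""
    WITH renamed AS (
        SELECT name, dataType AS x   -- the rename happens here
        FROM types
    )
    SELECT name, x FROM renamed ORDER BY name
""").fetchall()

print(rows)  # [('created', 'timestamp'), ('id', 'int')]
```

In Spark you could run the same statement through spark.sql(), or do the rename on a DataFrame directly; see the post for Ed's approach.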
Joey D’Antoni shows off some of the benefits of using Azure Cloud Shell:
One of the challenges of being a consultant is having to work with a number of clients, and having different login credentials and accounts. In the early days of Azure, this was exceptionally painful, but over time the experience of using the portal with multiple identities and connecting to Azure tenants has gotten much easier. However, when writing PowerShell or Azure CLI code, switching accounts and contexts is slightly more painful. Also, when you are doing automation, you may be touching a lot of resources at one time, so you want to be extra careful that you are in the right subscription and tenant.
Enter cloud shell.
Read on to see how to use it, get an idea of its cost, and see some of the benefits.
Andy Levy has a new reason for us:
I got a merge a while back that included a change I wasn’t expecting from one of my developers. All they were doing was renaming a column on a table. We had names settled a while ago. What’s up with that?
Turns out, when I created the table, I named one of the fields BrithYear. This merge request corrected it to BirthYear. The misspelling slipped past me and past everyone else; it was only discovered because when this developer was building a new query into their code, they spelled the field as one would expect, and the query threw an error.
There’s many a reason not to use SELECT * in application code; this is one I don’t think I’d heard before.
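The failure mode above can be sketched in a few lines. This uses sqlite3 and made-up table and column names rather than the author's actual schema: a column accidentally created as BrithYear stays invisible to SELECT * queries, while an explicit reference to the intended name fails immediately.

```python
import sqlite3

# Sketch of the bug described above, with hypothetical names: the typo
# "BrithYear" ships in the schema and SELECT * never notices.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Person (Name TEXT, BrithYear INTEGER)")
conn.execute("INSERT INTO Person VALUES ('Ada', 1815)")

# SELECT * works fine, so the misspelling stays hidden.
star_rows = conn.execute("SELECT * FROM Person").fetchall()
print(star_rows)

# Spelling the column as one would expect throws an error,
# which is how the typo was finally discovered.
error_raised = False
try:
    conn.execute("SELECT Name, BirthYear FROM Person")
except sqlite3.OperationalError as e:
    error_raised = True
    print("query threw an error:", e)
```

Listing columns explicitly surfaces schema mistakes at the first query instead of letting them hide behind the wildcard.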
Sheldon Hull has a hoarding problem to solve:
In my case, I’ve run into 50TB of old backups due to tooling issues that prevented cleanup from being successful. The backup tooling stored a SQLite database in one subdirectory and the actual backups in another.
I preferred at this point to only perform the lifecycle cleanup on the backup files, while leaving the sqlite file alone.
Click through to see how to do this using PowerShell.
Jess Pomfret shows how we can copy a database using dbatools:
We’re working hard on the AdventureWorks2017 database, perhaps getting it ready for an upgrade – since it’s now 3+ years out of date.
dbatools has so many functions, and I know I’ve mentioned it before, but Find-DbaCommand is a great way of looking for what we need. I want to know what the default backup path is set to, and since I’m just backing up and restoring to the same server, we already know that the instance has the required permissions here. If only there was an easy button for this…
Spoiler alert: there is.
Instant File Initialization is especially helpful when large file growths are occurring. Without Instant File Initialization, SQL Server has to write zeros to the disk in order to initialize it before SQL Server can use it for new data. As with autogrowth, the amount of time this process takes depends on the size. The more disk space that is needed, the longer it’s going to take for the disk to be zeroed for use. Fortunately, IFI exists to skip the need to write out zeroes. Instead, disk space can immediately be used as needed.
IFI can also make a noticeable difference when restoring databases. If you’re working on getting your recovery time objective (RTO) down, check the status of IFI. Enabling IFI may help cut that number down to an acceptable value.
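The difference described above can be sketched with ordinary files. This is a rough analogy, not SQL Server itself, and the file names and sizes are made up: one file is grown by physically writing every zero byte, the other is simply extended to the target size without writing its contents, which is roughly what IFI lets SQL Server do.

```python
import os
import tempfile

# Rough analogy for zeroed vs. instant file growth (sizes are made up).
size = 4 * 1024 * 1024  # 4 MB

tmp = tempfile.mkdtemp()

# "Zero-initialized" growth: every byte is physically written first,
# so the cost scales with the amount of space being claimed.
zeroed = os.path.join(tmp, "zeroed.dat")
with open(zeroed, "wb") as f:
    f.write(b"\x00" * size)

# "Instant" growth: the file is extended to the target size without
# writing its contents, analogous to IFI skipping the zeroing pass.
instant = os.path.join(tmp, "instant.dat")
with open(instant, "wb") as f:
    f.truncate(size)

# Both files report the same logical size.
print(os.path.getsize(zeroed), os.path.getsize(instant))
```

The larger the growth or restore, the more the zero-writing pass costs, which is why IFI shows up in RTO numbers.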
This is particularly interesting because, for the most part, I’ve only seen posts showing how to enable it at installation time.
Monica Rathbun gives us a primer on storage concepts:
“One Gerbil, Two Gerbils or Three Gerbils?” is a common DBA joke about server and storage performance. No matter how many gerbils power your storage, you need to know what type they are and the power that they provide. Storage is not about gerbils; it is about IOPS, bandwidth, latency, and tiers.
As a DBA, it is important for you to understand what kind of storage is attached to your servers and how it is handling your data. It is not important to master everything about it, but it is very advantageous to be able to talk to your storage admins or “Gerbil CoLo, LLC” provider intelligently, especially when you experience performance issues. Here is a list of things I encourage you to know and ask.
Click through for the cheat sheet.
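The terms in the excerpt relate to each other with simple arithmetic. The numbers below are illustrative, not from the post: a device rated at 10,000 IOPS with a 64 KB I/O size.

```python
# Back-of-the-envelope storage math. The figures are hypothetical:
# a device rated at 10,000 IOPS with a 64 KB I/O size.
iops = 10_000
block_size_bytes = 64 * 1024

# Bandwidth is roughly IOPS times I/O size: about 655 MB/s here.
bandwidth_mb_per_s = iops * block_size_bytes / 1_000_000
print(bandwidth_mb_per_s)

# If the device served one I/O at a time at its rated IOPS, each
# operation would take 1/10,000 of a second: 0.1 ms.
latency_ms = 1000 / iops
print(latency_ms)
```

The same rated IOPS can mean very different bandwidth depending on I/O size, which is one reason it helps to know both numbers when talking with your storage admins.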