Day: January 27, 2022

Finding Performance Regressions

Angus Croll embraces the power of control charts:

Technically, “performance” metrics are those relating to the responsiveness or latency of the app, including startup time.

But TV devices also tend to be more memory constrained than other devices, and as such are more liable to crash during a memory spike — so for Netflix TV we actually care about memory at least as much as performance, maybe more so.

At Netflix the term “performance” usually encompasses both performance metrics (in the strict meaning) and memory metrics, and that’s how we’re using the term here.

Click through for a primer on how they use changepoint and anomaly detection to do this.


Temporal Tables and Azure DevOps Deployments

Rayis Imayev notes a problem with Azure DevOps deployments:

Here is one thing that still doesn’t work well when you try to alter an existing temporal table and run this change through the [SqlAzureDacpacDeployment@1] DevOps task, whether this change is to add a new column or modify existing attributes within the table. Your deployment will fail with the “This deployment may encounter errors during execution because changes to … are blocked by …’s dependency in the target database” error message.

Read on to see what causes this problem and what we can do to work around it.
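Rayis’s exact diagnosis and fix are behind the link, but for context, the usual manual pattern for altering a system-versioned table is to suspend system versioning, change the current and history tables in step, and then re-enable versioning. A sketch, with hypothetical table names:

```sql
-- Suspend system versioning so both tables can be altered independently.
ALTER TABLE dbo.Customer SET (SYSTEM_VERSIONING = OFF);

-- Apply the same schema change to the current table and its history table.
ALTER TABLE dbo.Customer ADD LoyaltyTier tinyint NULL;
ALTER TABLE dbo.CustomerHistory ADD LoyaltyTier tinyint NULL;

-- Re-link the pair and resume versioning.
ALTER TABLE dbo.Customer SET
(
    SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory)
);
```

The friction Rayis describes comes from a dacpac deployment trying to manage this dependency chain for you.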


Checking a SQL Server Database’s Master Key

Steve Jones is trying to find where he left his keys:

I’ve been working with encryption in SQL Server for a long time, and have delivered quite a few presentations on the topic. Recently I was updating some code and wanted to check if a database had a master key created in it. This post shows how to do that.

Click through to see how you can create a database master key as well as how to drop it if necessary.
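Steve’s post has the full walkthrough; the short version is that a database master key, when present, appears in sys.symmetric_keys under a well-known name:

```sql
-- Run in the database you want to check.
-- Returns one row if a database master key exists, zero rows otherwise.
SELECT name, algorithm_desc, create_date
FROM sys.symmetric_keys
WHERE name = N'##MS_DatabaseMasterKey##';
```

Creating one is `CREATE MASTER KEY ENCRYPTION BY PASSWORD = N'...';`, and once nothing depends on it, `DROP MASTER KEY;` removes it.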


Combining Transparent Data Encryption and Instant File Initialization

Tom Collins performs a test:

Instant File Initialization (IFI), enabled on SQL Server for data files only, leads to faster execution of certain file operations, as it bypasses the zeroing procedure when claiming disk space. Instead, disk content is overwritten as new data is written to the files.

But do these benefits continue when Transparent Data Encryption (TDE) is enabled?

Click through for the answer.
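As a quick pre-check before running a test like Tom’s, SQL Server 2016 SP1 and later report whether IFI is active for the engine’s service account:

```sql
-- instant_file_initialization_enabled is Y when the service account
-- holds the Perform Volume Maintenance Tasks privilege.
SELECT servicename, service_account, instant_file_initialization_enabled
FROM sys.dm_server_services
WHERE servicename LIKE N'SQL Server (%';
```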


January 2022 Updates for Azure Synapse Analytics

Saveen Reddy has an update for us:

You can now easily add data quality, data validation, and schema validation to your Synapse ETL jobs by leveraging Assert transformation in Synapse data flows. Add expectations to your data streams that will execute from the pipeline data flow activity to evaluate whether each row or column in your data meets your assertion. Tag the rows as pass or fail and add row-level details about how a constraint has been breached. This is a critical new feature to an already effective ETL framework to ensure that you are loading and processing quality data for your analytical solutions.

Read on for the full list of changes.


Using Proprietary Database Features

Lee Markum does some thinking:

I was recently thinking of SQL Server temporal tables and how there is a perspective that you shouldn’t use proprietary features of a product because it locks you into that product. I want to be your guide on this matter.

I agree completely with Lee’s take on this. I can count on one hand the number of platform migrations I’ve been a part of through the years. These are painful enough experiences even if you use 100% “portable” code, because optimization patterns change between platforms. And then you get to the utterly absurd: indexes in Oracle and SQL Server behave differently. Clustered indexes are a practical must in SQL Server, while their nearest Oracle equivalent (the index-organized table) is so rarely used that it’s not even worth talking about to most Oracle people. So are you going to avoid indexes so you can have “truly” portable code? If the answer to that question is “yes,” you are wrong.

I do understand that there are companies which create code bases which need to be installable on multiple platforms. In that case, I do understand trying to keep things as common as possible, especially for fairly simple apps. But this is also part of why vendor databases are, as a general rule, awful.


Go/No-Go Indicators for Oracle Migrations to Azure

Kellyn Pot’vin-Gorman lays out some guidance on Oracle to Azure migrations:

When migrating an Oracle database to another platform, there are the common indicators and discussion topics around PL/SQL conversions, data types, application rewrites, etc., as roadblocks to refactoring, but being successful also has to do with the SIZE of the workload coming from Oracle. I find this is often dismissed, even though it is one of the quickest ways to identify whether an ENTIRE Oracle database (not even a schema or a subset of the database) can run on a Platform as a Service (PaaS) solution.

Click through for more information on PaaS limits for Oracle databases in Azure.
