
Curated SQL Posts

SQL Server and Golang

Shane O’Neill goes long:

It’s all very well and good to open up Golang and write a FizzBuzz (note to self: write a FizzBuzz), but I still work with databases.

So before I do anything with Golang, I’d like to know: can it interact with databases?

Granted, everything can, but how easy is it? And does trying to interact with databases kill any interest I have in the language?

So, let’s give it a go.

Click through to see what Shane came up with.


Azure Synapse Analytics July 2022 Updates

Ryan Majidimehr notes that the Azure Synapse Analytics team has been busy:

Azure Synapse Link for SQL is an automated system for replicating data from your transactional databases into a dedicated SQL pool in Azure Synapse Analytics. Starting this month, you can make trade-offs between cost and latency in Synapse Link for SQL by selecting the continuous or batch mode to replicate your data.  

By selecting “continuous mode”, the runtime will be running continuously so that any changes applied to the SQL database or SQL Server will be replicated to Synapse with low latency. Alternatively, when you select “batch mode” with a specified interval, the changes applied to the SQL database or SQL Server will be accumulated and replicated to Synapse in batch mode with the specified interval. This can save cost as you are only charged for the time the runtime is required to replicate data. After each batch of data is replicated, the runtime will be shut down automatically. 

Click through for the complete list.


Bidirectional Transactional Replication and Managed Instances

Holger Linke builds a transactional replication topology with a couple of twists:

Bidirectional transactional replication is a specific Transactional Replication topology that allows two SQL Server instances or databases to replicate changes to each other. Each of the two databases publishes data and then subscribes to a publication with the same data from the other database. The “@loopback_detection” feature ensures that changes are only sent to the Subscriber and do not result in the changes being sent back to the Publisher.

The databases that are providing the publication/subscription pairs can be hosted either on the same SQL instance or on two different SQL instances. The SQL instances can either be SQL Server on-premise, SQL Server hosted in a Virtual Machine, SQL Managed Instance on Azure, or a combination of each. You just have to make sure that the instances can connect to each other. If you add a subscription by using the fully-qualified domain name (FQDN), verify that the server name (@@SERVERNAME) of the Subscriber returns the FQDN. If the Subscriber server name does not return the FQDN, changes that originate from that Subscriber may cause primary key violations.
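The loopback detection setting Holger mentions is a parameter of sp_addsubscription, supplied when each side subscribes to the other's publication. A minimal sketch, with hypothetical server, database, and publication names:

```sql
-- Run on the Publisher side of each direction: subscribe ServerB to
-- ServerA's publication with loopback detection turned on, so changes
-- replicated to ServerB are not sent back to ServerA.
EXEC sp_addsubscription
    @publication        = N'BiDirPub_A',   -- hypothetical publication
    @subscriber         = N'ServerB',      -- hypothetical subscriber
    @destination_db     = N'SalesDB',      -- hypothetical database
    @subscription_type  = N'push',
    @loopback_detection = N'true';

-- Per the FQDN caveat above: confirm the Subscriber's reported server
-- name matches the name used when the subscription was created.
SELECT @@SERVERNAME;
```

The same call is then repeated in the reverse direction for the second publication, completing the bidirectional pair.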

Read on for the scripts.


An Overview of Power BI Datamarts

Melissa Coates has a new diagram for us:

A new diagram of Power BI Datamarts is now available. It includes the technical components of a Power BI datamart.

Want to download a copy? Head on over to the Diagrams page, where the latest version will always be available as a PDF.

This blog post includes a brief summary of each technical component of a Power BI datamart. Datamarts are really new, and they’ll be changing as (1) the functionality matures over time and (2) customers provide feedback. Therefore, I’ve kept the descriptions below brief.

Despite their brevity, the descriptions are worth the read.


A Primer on Contrast and CVD

Tim Brock covers color vision deficiency:

In this blog post and a follow up I’m going to describe why and how we used theming to make diffify.com more accessible to users who suffer from some common visual impairments. Here in Part 1 I’ll cover some of the science and the terminology. Part 2 will look at the actual changes we made.

Click through for a brief discussion of contrast sensitivity and a longer one on color vision deficiency, including why color choice matters.


“Warming Up” Databricks Clusters

Ust Oldfield needs that cluster to be up:

Interactive and SQL Warehouse (formerly known as SQL Endpoint) clusters take time to become active. This can range from around 5 mins through to almost 10 mins. For some workloads and users, this waiting time can be frustrating if not unacceptable.
For this use case, we had streaming clusters that needed to be available for when streams started at 07:00 and to be turned off when streams stopped being sent at 21:00. Similarly, there was also a need from business users for their SQL Warehouse clusters to be available when business started trading, so that their BI reports didn’t time out waiting for the clusters to start.

Read on to see one way to solve this problem without having a cluster run 24/7.


Executing as User or Login

Kenneth Fisher puts on a mask:

I use impersonation a lot. It’s a really easy way to check if someone has the permissions they are supposed to have. That said, a co-worker recently had an interesting question. They were testing permissions on a synonym.

Msg 916, Level 14, State 1, Line 3
The server principal “Domain/NetworkName” is not able to access the database “OtherDB” under the current security context.

Read on to see what caused the issue and how you can fix it.
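A sketch of the distinction, assuming the issue is the scope of the impersonated principal (the user and synonym names here are hypothetical):

```sql
-- EXECUTE AS USER impersonates a database user: the security context
-- is scoped to the current database, so a synonym pointing at another
-- database can fail with Msg 916 under this context.
EXECUTE AS USER = N'Domain\SomeUser';
SELECT * FROM dbo.OtherDbSynonym;   -- may raise Msg 916
REVERT;

-- EXECUTE AS LOGIN impersonates the server principal, which carries
-- across databases, so the same query can succeed if the login has
-- the needed permissions in the other database.
EXECUTE AS LOGIN = N'Domain\SomeUser';
SELECT * FROM dbo.OtherDbSynonym;
REVERT;
```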


Detecting Data Changes in Power BI Incremental Refresh

Chris Webb writes some M:

One feature of Power BI incremental refresh I’ve always been meaning to test out is the ability to create your own M queries to work with the “detect data changes” feature, and last week I finally had the chance to do it. The documentation is reasonably detailed but I thought it would be a good idea to show a worked example of how to use it to get direct control over what data is refreshed during an incremental refresh.

Read on to see how it works, including a couple of gotchas around things like the shape of query results.


Standard Edition and Memory Recommendations

Erik Darling has some recommendations:

This is a short post to warn you about the memory recommendation tab in the SQL Server installer.

Let’s say you’re doing the smart thing and giving your Standard Edition install 192 GB of RAM:

Read on to understand why a max memory size of 128 GB isn’t necessarily the right answer for Standard Edition but also how you might set it that way by Next-Next-Nexting.
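Standard Edition’s 128 GB cap applies to the buffer pool, not to everything SQL Server allocates, which is why max server memory can sensibly be set higher on a 192 GB box. A sketch of the setting itself (the figure is illustrative, not a recommendation):

```sql
-- "max server memory" governs more than just the buffer pool, so it
-- can exceed Standard Edition's 128 GB buffer pool limit. The value
-- below is illustrative only; size it for your own workload.
EXEC sp_configure N'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure N'max server memory (MB)', 180224;  -- ~176 GB
RECONFIGURE;
```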
