Curated SQL Posts

Using the Develop Hub in Azure Synapse Analytics

Charles Feddersen shows off one of the Azure Synapse Analytics hubs:

The Develop Hub in Azure Synapse Analytics enables you to write code and define business logic using a combination of notebooks, SQL scripts, and data flows. This gives us a development experience for querying, analyzing, and modeling data in multiple languages, along with IntelliSense support for those languages. It provides a rich interface for authoring code, and in this post we will see how we can use the Knowledge Center to jump-start our development experience.

Click through to see two demos, one of notebooks and one of SQL scripts.

Comments closed

The Merge Interval Operator

Hugo Kornelis looks at another execution plan operator:

The Merge Interval operator reads dynamic seek range specifications, checks to see if their specified ranges overlap, and if so combines the overlapping ranges into one new range.

One typical use case is a query that uses multiple BETWEEN specifications, connected with OR. When these ranges overlap, they must be combined into a single range. This improves performance, but more importantly it prevents rows that satisfy both range specifications from being returned multiple times. When the boundaries of the BETWEEN are given as constants, the optimizer analyzes for overlaps and combines ranges if needed when compiling the query. But when the boundaries of the BETWEEN specifications are only known at run time (variables, column references), the Merge Interval operator is used for this task.
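
Conceptually, the operator's job can be sketched in a few lines. This is a hypothetical illustration of the interval-merging logic, not SQL Server's actual implementation:

```python
def merge_intervals(ranges):
    """Combine overlapping (low, high) ranges into disjoint ranges,
    mimicking what the Merge Interval operator does with dynamic
    seek ranges whose boundaries are only known at run time."""
    merged = []
    for low, high in sorted(ranges):
        if merged and low <= merged[-1][1]:
            # Overlap with the previous range: widen it instead of
            # emitting a second range (which would return duplicate rows)
            merged[-1] = (merged[-1][0], max(merged[-1][1], high))
        else:
            merged.append((low, high))
    return merged
```

With input ranges (1, 5), (4, 10), and (20, 30), the first two overlap and collapse into (1, 10), while (20, 30) stays separate, so a row with a value of 4 is returned once rather than twice.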

Click through to see how it works.

Measure Your DBA Skills

Lee Markum has just wrapped up an interesting series:

Over the last 9 weeks I took you on a journey of skills and career topics related to being a SQL Server DBA. We looked at the Production DBA. We saw skills and career topics from early career through mid-career to Senior DBA. Then we looked at the Development DBA and their skills and career development needs. Finally, there was a wrap-up post.

To make it easier for everyone to get to these posts, I decided to bring them all together on a single page.

Click through to get a feeling for where you’re at on the DBA and database developer sides of the house.

New Azure Announcements

Eitan Blumin has a roundup of Azure-related announcements:

During the week of December 7th (especially on December 9th), Microsoft sent us a whole bag of goodies, announcing the general availability of features that until now were only in preview, as well as newer features that have just entered public preview.

There’s quite a lot to cover here, so let’s try to break it down by category and provide links for more details.

Click through for the list.

Using Koalas with Azure Databricks

Tomaz Kastrun continues a series on Azure Databricks:

So far, we have looked into SQL, R, and Python, and this post will be about the Koalas package, an implementation of the pandas DataFrame API on Apache Spark. Data engineers and data scientists love Python’s pandas, since it makes data preparation easier, faster, and more productive, and Koalas is a direct “response” that makes writing code on Spark easier and more familiar. Also see the official documentation for a full description of the package.

Click through for a quick demo.

Automating Python Data Pipelines with SQL Agent

Joshua Higginbotham shows an old scheduling dog a new trick:

First off, we need to figure out which server we are going to run these from. For me, it was our SQL Servers dedicated to SSIS. Once this is figured out, we then need to do a custom install of Python. The key here is to make sure that when you install Python, you install it across the server itself and not at the user level. Once it’s installed, we can then move to SQL Agent to complete the rest of the work. You’ll need to make sure the service account that you are running SQL Agent with has both permission to install libraries with Python and permission to the directory where your Python scripts live. Once permissions are set, we can start building out our SQL Agent job.
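
As a hedged sketch of what such a scheduled script might look like (the file name, paths, and pipeline steps here are illustrative, not from the post), the main thing to get right is the exit code, since a SQL Agent CmdExec step reports failure when the process exits nonzero:

```python
# pipeline.py -- hypothetical skeleton for a script run by a SQL Agent
# CmdExec job step; a real version would do actual extract/transform/load
# work inside run_pipeline().
import logging
import sys

def run_pipeline():
    """Placeholder for the real pipeline steps."""
    logging.info("running pipeline steps")
    return True  # return False (or raise) to signal failure

def main():
    logging.basicConfig(level=logging.INFO)
    try:
        ok = run_pipeline()
    except Exception:
        logging.exception("pipeline failed")
        ok = False
    # SQL Agent marks a CmdExec step as failed when the exit code is nonzero,
    # so a real script would end with: sys.exit(main())
    return 0 if ok else 1
```

The job step’s command line would then just be the server-wide Python interpreter followed by the script path, with the Agent service account holding permissions on that directory as the post describes.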

Click through for the answer.

Disk Performance Testing in 2020

Glenn Berry gives us some CrystalDiskMark results:

Recently, I built a new AMD mainstream desktop system with some existing parts that I had available. This system has six storage drives, with various levels of technology and performance. I thought it would be interesting to run CrystalDiskMark 7.0.0 on each of these drives. So, here are some quick comparative CrystalDiskMark results in 2020 from those six drives.

This system has a Gigabyte B550 AORUS MASTER motherboard, which is actually a great choice for a B550 motherboard, especially if you want extra storage flexibility. AMD B550 motherboards only have PCIe 4.0 support from the CPU, not from the B550 chipset.

Glenn gets some outstanding performance from one drive and reminds us once more of how beautiful SSDs and M.2 drives are.

On-Premises SQL Server is Still Relevant

John Morehouse does not abide by Betteridge’s Law of Headlines:

I’m a firm believer that the cloud is not a fad and is not going away; it’s just an extension of a tool that we are already familiar with. The Microsoft marketing slogan is “It’s just SQL,” and for the most part that is indeed true. However, that does not mean that every workload will benefit from being in the cloud. There are scenarios where it does not make sense to move things to the cloud, so let’s take a look at a few of them.

Read on for several reasons why the cloud might not be right for you.

Running Kafka on Windows (via WSL2)

Jim Galasyn shows how you can try out Apache Kafka on Windows:

Is Windows your favorite development environment? Do you want to run Apache Kafka® on Windows? Thanks to the Windows Subsystem for Linux 2 (WSL 2), now you can, and with fewer tears than in the past. Windows still isn’t the recommended platform for running Kafka with production workloads, but for trying out Kafka, it works just fine. Let’s take a look at how it’s done.

You can also get Kafka to run natively on Windows, though there are bugs around file handling, to the point where if you restart your machine while the Kafka service is running, data in partitions may become permanently inaccessible and force you to delete it before you can start Kafka again. So yeah, it’s better to use WSL or Docker containers for trying out Kafka on Windows machines.
