Author: Kevin Feasel

The apply() Functions in R

Steven Sanderson talks about a series of functions:

Welcome, fellow R warriors! Today, we delve into the heart of vectorized operations with R’s “apply” family: apply(), lapply(), sapply(), and tapply(). These functions are your secret weapons for efficiency and elegance, so buckle up and prepare to be amazed!

But first, the “why”: Loops are great, but for repetitive tasks on data structures, vectorization reigns supreme. It’s faster, cleaner, and lets you focus on the “what” instead of the “how” of your analysis. Enter the apply family, each member offering a unique twist on applying functions to your data.

The trickiest part about the apply() series is remembering which one does what. This is where purrr’s map() function does a better job, I think.
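
If it helps keep them straight, here is a rough sketch of what each one does, plus the purrr analogue (the data is invented purely for illustration):

m <- matrix(1:6, nrow = 2)

apply(m, 1, sum)                 # apply: works over matrix margins; 1 = rows, 2 = columns -> 9 12
lapply(1:3, function(x) x^2)     # lapply: always returns a list
sapply(1:3, function(x) x^2)     # sapply: like lapply, but simplifies to a vector -> 1 4 9

scores <- c(10, 20, 30, 40)
group  <- c("a", "b", "a", "b")
tapply(scores, group, mean)      # tapply: applies a function within groups -> a = 20, b = 30

# purrr makes the return type explicit in the function name:
# purrr::map(1:3, ~ .x^2)        # list
# purrr::map_dbl(1:3, ~ .x^2)    # numeric vector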

Comments closed

Enabling SQL Agent after Upgrading from Express Edition

Warwick Rudd ponies up the cash:

With the above limitations, this product can provide all of the functionality for your application to operate as required. Because there is no SQL Agent service in this edition this makes it more difficult for you to implement scheduled database maintenance. We are not going to discuss ways to implement database maintenance on an Express edition of SQL Server in this post.

But at some point in time you may find yourself in the position that you need to perform an upgrade of editions. This may or may not include an upgrade of versions. An upgrade to the version can be achieved and can bring benefits of its own. Normally, I would not be writing about and recommending performing in-place upgrades. However, sometimes due to restrictions of vendor applications it is a requirement to perform an in-place upgrade.

Read on to see what you’d need to do, knowing that Express Edition does not come with a SQL Server Agent and does not perform the appropriate setup for one.
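
As a rough sanity check after that sort of upgrade, something like the following sketch confirms the edition and whether the Agent service is now present and running (it only assumes you can query the instance):

-- Confirm the edition and version after the upgrade
SELECT SERVERPROPERTY('Edition') AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion;

-- Check whether the SQL Server Agent service exists, how it starts, and its current state
SELECT servicename, startup_type_desc, status_desc
FROM sys.dm_server_services
WHERE servicename LIKE N'SQL Server Agent%';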

Comments closed

Indexing for Sorted Data

Erik Darling digs into indexes and sorts:

Without things sorted the way you’re looking for them, it’s a lot like hitting shuffle until you get to the song you want. Who knows when you’ll find it, or how many clicks it will take to get there.

The longer your playlist is, well, you get the idea. And people get all excited about Skip Scans. Sheesh.

Anyway, let’s look at poor optimizer choices, and save the poor playlist choices for another day.

Click through for an interesting problem and solution.
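
The core idea, sketched against an invented table: if the index key order matches the query’s ORDER BY, the optimizer can read rows in order rather than adding a Sort operator to the plan.

-- Hypothetical playlist-ish table: queries filter by ArtistId and sort by ReleaseDate
CREATE INDEX ix_Albums_ArtistId_ReleaseDate
    ON dbo.Albums (ArtistId, ReleaseDate)
    INCLUDE (Title);

-- The seek returns this artist's rows already ordered by ReleaseDate,
-- so the plan needs no Sort operator.
SELECT AlbumId, Title, ReleaseDate
FROM dbo.Albums
WHERE ArtistId = 42
ORDER BY ReleaseDate;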

Comments closed

Selective Column Replication in Postgres

Semab Tariq shows how to replicate a limited number of columns:

Selective column replication, a feature of logical replication in PostgreSQL, enables the selective transfer of data changes from one database to another. This feature offers flexibility by addressing bandwidth and storage optimization concerns. It allows users to choose specific columns to replicate rather than replicating entire tables, ensuring that only essential data is transferred. This selective approach reduces network traffic and storage space required for replication while maintaining data consistency across databases.

In this blog, I will demonstrate the process of replicating specific columns from a table using logical replication. Additionally, we will see how you can create multiple subscribers connected to a single publisher to replicate data across multiple locations.

Replication seems to be a lot easier to set up in Postgres than SQL Server, though check out the FAQ at the end for a few kicks in the pants around Postgres replication.
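
For reference, the column-list syntax (available as of PostgreSQL 15) looks roughly like this; the table, column, and connection details are placeholders:

-- On the publisher: publish only the columns that should leave this server
CREATE PUBLICATION customer_pub
    FOR TABLE public.customers (customer_id, full_name, region);

-- On each subscriber: create a matching table containing at least those columns, then
CREATE SUBSCRIPTION customer_sub
    CONNECTION 'host=publisher.example.com dbname=appdb user=replicator password=secret'
    PUBLICATION customer_pub;

-- Additional subscribers simply create their own subscriptions against the same publication.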

Comments closed

Upcoming SSMS 20 Connection Security Changes

Erin Stellato announces an “above-the-fold” change:

In SSMS 20 we have made changes to the connection dialog.  In the screenshot below, you will see the Login and Connection Properties pages for the connection dialogs for SSMS 19 and 20 compared side-by-side.

Moving the Connection Security section to the Login tab (and presumably having it show up without needing to select Options) is a good change. It’s also interesting to see SSMS support Mandatory encryption, a part of Tabular Data Stream (TDS) 8.0 that Azure Data Studio supports but SSMS (and most other SQL Server-related tools) haven’t until now.

Comments closed

The Importance of Source Control for DBAs

Steve Jones explains that Git isn’t just for developers:

Git has become a fantastic tool for me, and many other technologists, over the last ten years. It’s almost ubiquitous in most of my clients, and so many people are comfortable with it. Many others aren’t, which is why I started a Git series for DBAs (and other Ops people) on my blog.

Quite a few people asked me why I recommend git over a file share for storing code that a team of Ops people or DBAs might use. Why isn’t a global file share a better choice in an organization? I think I have a few good reasons, but if you disagree, let me know in the discussion for this piece.

There are some annoyances around Git, but good UI tools minimize a fair amount of the pain, and the benefits are huge.

Comments closed

Cannot Create Property ‘groupid’ On String

Barney Lawrence fixes a problem in Azure Data Studio:

This is a solution to a bug I’ve encountered in Azure Data Studio when trying to create new connections, one that leads to an error message of “cannot create property ‘groupid’ on string”.

File this post under “things I couldn’t find a neat answer to in Google so I figure I’d best create a page for it”.

Read on for Barney’s answer as I lament how far downhill Google has gone as a search engine.

Comments closed

Checking SQL Server Connectivity with Powershell in Parallel

Rod Edwards builds a script:

The chances are that you have other systems monitoring your SQL servers already, so this task isn’t required at all. However, sometimes a quick ‘knock on the door of SQL’ to confirm a response isn’t a bad thing as a sanity check.

So, building on that, we’re going to use the same technique to essentially perform a sweep of our estate (with a few bells and whistles added) to give us a colour coded quick view of service status like below, with a little bit of additional info.

Click through for the script and explanation.
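
As a rough illustration of the underlying technique (not Rod’s script), PowerShell 7’s ForEach-Object -Parallel can knock on each instance’s door concurrently; the instance names below are placeholders, and it assumes the System.Data.SqlClient assembly is available in your session:

# Requires PowerShell 7+ for ForEach-Object -Parallel
$instances = 'SQL01', 'SQL02\PROD', 'SQL03'    # placeholder instance names

$results = $instances | ForEach-Object -Parallel {
    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=$_;Integrated Security=True;Connect Timeout=5"
    try {
        $conn.Open()
        [pscustomobject]@{ Instance = $_; Status = 'Online' }
    }
    catch {
        [pscustomobject]@{ Instance = $_; Status = 'Unreachable' }
    }
    finally {
        $conn.Dispose()
    }
} -ThrottleLimit 8

$results | Format-Table -AutoSize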

Comments closed

Avoiding Time Intelligence DAX Functions in DirectQuery Mode

Marco Russo and Alberto Ferrari skip the slow stuff:

Calculations that use the DAX time intelligence functions mostly retrieve data at the day level, performing the required aggregations in the formula engine. By avoiding time intelligence DAX functions, you can force DAX to produce more optimized queries for your specific calculations.

DirectQuery over SQL and VertiPaq require the same patterns to optimize time intelligence calculations, even though the reasons are different. In VertiPaq, we try to stay away from DAX time intelligence functions to avoid large materialization at the day level. With SQL, materialization does not always happen because Tabular tries to push the grouping down to SQL. Still, time intelligence calculations often result in complex queries, and it is better to avoid the complexity by using simpler DAX code.

Check out the performance difference.
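
The flavor of the rewrite, as a sketch: swap the time intelligence function for plain filters over the date table. It assumes a 'Date' table with Date and Year columns and an existing [Sales Amount] measure.

-- Typical time intelligence version
Sales YTD (TI) :=
    CALCULATE ( [Sales Amount], DATESYTD ( 'Date'[Date] ) )

-- Same idea without time intelligence functions: plain filters on Year and Date
Sales YTD (no TI) :=
    VAR LastVisibleDate = MAX ( 'Date'[Date] )
    VAR CurrentYear = MAX ( 'Date'[Year] )
    RETURN
        CALCULATE (
            [Sales Amount],
            REMOVEFILTERS ( 'Date' ),
            'Date'[Year] = CurrentYear,
            'Date'[Date] <= LastVisibleDate
        )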

2 Comments

Troubleshooting a Problem with sp_send_dbmail

Shane O’Neill applies the Sgt. Schultz defense:

I don’t like posting about issues unless I fundamentally understand the root cause. That’s not the case here. A lot of the explanation here will be hand-waving while spouting “here be dragons, and giants, and three-headed dogs”, but I know enough to give you the gist of the issue.

Click through for a fun story about Kerberos and behavioral changes after absolutely nothing happened.

Comments closed