Press "Enter" to skip to content

Author: Kevin Feasel

Scripting Multiple SQL Agent Jobs With SSMS

Steve Stedman shows how to script multiple SQL Agent jobs from SQL Server Management Studio:

While doing a recent server move, I came across the need to script all agent jobs and save them to a file so that if we needed to reference what was there after the move we could get at the agent jobs without having to restart the old SQL Server.

It made me realize that sometimes what you want to do is available in SQL Server Management Studio, but that perhaps it is not always entirely obvious.

Click through for the demo.  I’d probably use PowerShell over this, but if you just need to do a one-time job move, this gets you going quickly.
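
For instance, a rough sketch of the PowerShell route using the dbatools module (the instance name and output path are hypothetical, and these cmdlet and parameter names come from current dbatools, so they may vary by version):

    # Script out every Agent job on the old instance into a single .sql file
    Import-Module dbatools
    Get-DbaAgentJob -SqlInstance "OldServer" |
        Export-DbaScript -FilePath "C:\Temp\AgentJobs.sql"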


Using Biml With Oracle

Shannon Lowder shows us that you can write Biml to connect to an Oracle database as well:

Next, I wanted to build a package from Biml: a package that uses a connection to my Oracle instance.  Defining connections to Oracle in Biml is similar to every other connection you’ve ever defined in Biml (lines 3-12). The difference here is the Provider (line 10).  I took the Provider attribute for my Oracle connection from the manual package I created in my previous test.

Simply right-click on your Oracle connection in the package and choose Properties.  Then look at the Qualifier value. I take the whole value and copy and paste it into the Provider attribute in my BimlScript.

Once you have that defined, set up a Package with our two connections, and a data flow (lines 14 – 33).  This script builds a simple data flow to move data from a single table on Oracle over to SQL Server.  The only difference is, this time, the package is named Automatic, rather than Manual.

You could get as complex as you’d like in your example.

As far as Integration Services goes, Oracle is just another source or sink and as long as you have an appropriate driver, the flows work the same way as with any other database.
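
To make the connection definition concrete, a generic fragment (my sketch, not Shannon’s actual script) might look like the following; the ConnectionString and Provider values are placeholders, with the real Provider string copied from the Qualifier property of a manually created Oracle connection, as described above:

    # Write a skeletal Biml file to disk; normally you would just add this
    # markup as a .biml file via BimlExpress. The ConnectionString and
    # Provider values are placeholders to fill in from your environment.
    $biml = '
    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <Connections>
        <Connection Name="OracleSource" ConnectionString="..." Provider="..." />
      </Connections>
    </Biml>
    '
    Set-Content -Path ".\OracleConnection.biml" -Value $biml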


Resumable Online Index Rebuild

Arun Sirpal shows off a SQL Server 2017 and Azure SQL Database feature:

Clearly I am in a paused state, with 5.48% of the operation complete. In the original query window where I started the index rebuild, I receive:

Msg 1219, Level 16, State 1, Line 4
Your session has been disconnected because of a high priority DDL operation.
Msg 0, Level 20, State 0, Line 3
A severe error occurred on the current command.  The results, if any, should be discarded.

As for DTUs: when in a paused state, consumption drops back down to pretty much 0.

If you have a hard nightly maintenance window to hit, being able to pause index rebuilds is a pretty nice feature.
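
For reference, the syntax in play looks something like this; it is a minimal sketch run through dbatools’ Invoke-DbaQuery, with hypothetical instance, table, and index names:

    Import-Module dbatools
    $target = @{ SqlInstance = "MyServer"; Database = "MyDb" }

    # Start a resumable online rebuild (SQL Server 2017 or Azure SQL Database)
    Invoke-DbaQuery @target -Query "ALTER INDEX IX_Orders_Customer ON dbo.Orders
        REBUILD WITH (ONLINE = ON, RESUMABLE = ON);"

    # From another session: pause the rebuild (this is what disconnects the
    # original session), check progress, and later pick it back up
    Invoke-DbaQuery @target -Query "ALTER INDEX IX_Orders_Customer ON dbo.Orders PAUSE;"
    Invoke-DbaQuery @target -Query "SELECT name, percent_complete, state_desc
        FROM sys.index_resumable_operations;"
    Invoke-DbaQuery @target -Query "ALTER INDEX IX_Orders_Customer ON dbo.Orders RESUME;"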


Synonyms And Availability Groups

Allen McGuire shows how to maintain synonyms between instances in an Availability Group:

I’m a big fan of using synonyms to point to objects in other databases, but if you’ve tried to use them in conjunction with Availability Groups you may have encountered a slight issue: upon failover, the synonyms are no longer valid because the instance name is part of the synonym definition.  In order for the synonyms to work again, they have to point to the instance they now reside in – assuming the databases were in the same instance, respectively.

The next challenge was to automate the detection of when the primary has changed, therefore triggering an update of the synonyms.  I put the following T-SQL into a SQL Agent job and set it to run every five seconds.

Click through for the SQL Agent job details.  I have only created a couple of synonyms and don’t really like them that much, but if they’re your bag, then at least you know you can use them safely with Availability Groups.
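
The general shape of the trick, as a rough sketch rather than Allen’s actual job (object and instance names are hypothetical, and sys.fn_hadr_is_primary_replica requires SQL Server 2014 or later):

    Import-Module dbatools

    # If this replica is now the primary for the referenced database, repoint
    # the synonym locally. Synonyms cannot be altered in place, so drop and
    # recreate. Schedule this to run frequently (Allen uses every five seconds).
    Invoke-DbaQuery -SqlInstance "MyServer" -Database "AppDb" -Query "
        IF sys.fn_hadr_is_primary_replica('OtherDb') = 1
        BEGIN
            IF OBJECT_ID('dbo.Customers', 'SN') IS NOT NULL
                DROP SYNONYM dbo.Customers;
            CREATE SYNONYM dbo.Customers FOR OtherDb.dbo.Customers;
        END;"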


Analyzing Clickstream Data With Markov Chains

Eleni Markou shows one method of analyzing clickstream data:

We chose to use the third-order Markov Chain on the above-produced data, as:

  • The number of parameters needed for the chain’s representation remains manageable. As the order increases, the parameters necessary for the representation increase exponentially and thus managing them requires significant computational power.
  • As a rule of thumb, we would like at least half of the clickstreams to consist of as many clicks as the order of the Markov Chain that should be fitted. There is no point in selecting a third-order chain if the majority of the clickstream consists of two states and so there is no state three steps behind to take into consideration.

Fitting the Markov Chain model gives us transition probability matrices and the lambda parameters of the chain for each one of the three lags, along with the start and end probabilities.

This particular analysis is trying to understand which page (if any) a user will go to next when on a particular page.  Eleni uses additional techniques like k-means clustering to segment out particular groups of users.  Very interesting analysis.
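
To put those lambda parameters in context: higher-order chains of this sort are commonly fit as a mixture transition distribution (MTD) model, in which each of the k lags contributes its own transition matrix, weighted by a lambda. In rough form (my notation, not Eleni’s):

    P(X_t = x \mid X_{t-1}, \ldots, X_{t-k}) = \sum_{i=1}^{k} \lambda_i \, Q_i(X_t = x \mid X_{t-i}),
    \qquad \sum_{i=1}^{k} \lambda_i = 1, \quad \lambda_i \ge 0

Here k = 3, each Q_i is the transition probability matrix for lag i, and the fitted lambdas indicate how much each lag contributes to predicting the next click.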


Basics Of Survival Analysis

Subhasree Chatterjee explains the basics of survival analysis:

Survival analysis is a set of methods to analyze the ‘time to occurrence’ of an event. The response is often referred to as a failure time, survival time, or event time. These methods are widely used in clinical experiments to analyze the ‘time to death’, but nowadays these methods are being used to predict the ‘when’ and ‘why’ of customer churn or employee turnover as well.

The dependent variables for the analysis are generally two functions: the survival function and the hazard function.
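
For reference, those two functions have standard definitions (my summary, not from the excerpt):

    S(t) = \Pr(T > t)
    h(t) = \lim_{\Delta t \to 0} \frac{\Pr(t \le T < t + \Delta t \mid T \ge t)}{\Delta t} = \frac{f(t)}{S(t)}

where T is the time to the event and f is its density: S(t) is the probability of surviving past time t, while h(t) is the instantaneous rate of the event at time t given survival up to t.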

Read the whole thing.  H/T R-Bloggers


Log Shipping With dbatools

Sander Stad shows off a few log shipping functions he created for dbatools:

The entire log shipping functionality is now separated into five functions. Four of them are used internally and are not visible as public functions, because you can easily break stuff if they’re not used correctly.

The main function, Invoke-DbaLogShipping, is available in the dbatools module for anyone to use.

If you open the GUI for log shipping, you have lots of choices, but most of them are already supplied in the GUI itself and you can decide whether to use them or not.
The whole idea behind the functionality was that it would allow you to quickly set up log shipping using a lot of defaults, like you can in the GUI, but if you’re more experienced, you can change any setting to your preferences.

Read on for an example of how to use this.  It looks pretty snazzy.
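
A minimal invocation might look like this; instance, database, and share names are hypothetical, and the parameter names here may differ by module version (newer dbatools releases rename the cmdlet to Invoke-DbaDbLogShipping), so check Get-Help before running:

    Import-Module dbatools

    # Stand up log shipping from production to DR, letting the defaults
    # handle the schedules, and seed the secondary with a fresh full backup
    Invoke-DbaLogShipping -SourceSqlInstance "ProdSql" -DestinationSqlInstance "DrSql" `
        -Database "AppDb" -BackupNetworkPath "\\fileshare\logshipping" `
        -GenerateFullBackup -CompressBackup -Force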


Loading From Excel Into SQL Server With Powershell

Shane O’Neill tackles one of the all-time important questions, how to get data from Excel into something else:

Now let’s say that we are working in an environment that does not allow us to change the server configurations, meaning that OPENROWSET() is closed for us.

Being completely honest, my spreadsheet is only 8 rows, so I would just manually insert the data, but what happens if it’s a few thousand rows big? Not a viable option.

So, with most of my T-SQL ideas exhausted, let’s look at PowerShell!

The other all-time important question, of course, is how to get data from something else into Excel.
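
One way to do it, as a sketch rather than necessarily Shane’s approach (it assumes the community ImportExcel and dbatools modules are installed; file, instance, and table names are hypothetical):

    Import-Module ImportExcel, dbatools

    # Read the spreadsheet without needing Excel or OPENROWSET(), then
    # bulk-load it, letting dbatools create the staging table for us
    $data = Import-Excel -Path "C:\Temp\People.xlsx"
    $data | Write-DbaDbTableData -SqlInstance "MyServer" -Database "Staging" -Table "dbo.People" -AutoCreateTable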


Automatically Restoring Databases To Another Server

Bjorn Peters needs to migrate a few databases over to a test server nightly:

My first thought was to create a SQL Server Agent job with the following steps:

  1. check the availability of Shared-Destination-Folder
  2. delete/clear Destination-Folder-Content
  3. Shrink all Transaction-Logfiles
  4. Backup all Databases from given list
  5. Restore each Backup-File from folder
  6. Check all orphaned users
  7. delete/clear Destination-Folder-Content

A year or two ago, I had built this with a lot of normal T-SQL Agent jobs; now I’m doing this with dbatools, which makes it very easy and fast (and in one step).

It’s only a few lines of PowerShell code, which you can see upon clicking through.
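
As a simplified sketch of the nightly idea (not Bjorn’s actual script, and with hypothetical instance, database, and path names; cmdlet names are from current dbatools): back up from production, pipe the backup straight into a restore on the test server, and clean up orphaned users afterward.

    Import-Module dbatools

    # Back up production and restore over the test copy in one pipeline
    Backup-DbaDatabase -SqlInstance "ProdSql" -Database "AppDb" `
        -Path "\\fileshare\NightlyCopy" -Type Full |
        Restore-DbaDatabase -SqlInstance "TestSql" -WithReplace

    # Step 6 from the list above: fix any orphaned users on the test side
    Repair-DbaDbOrphanUser -SqlInstance "TestSql" -Database "AppDb"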


Homoglyphs And SQL Injection

Bert Wagner gives us one more reason why blacklists are bad:

Homoglyphs can exist within a character set (like the Latin character set examples above) or they can exist between character sets. For example, you may have the unicode apostrophe ʼ, which is a homoglyph to the Latin single quote character '.

How does SQL Server handle unicode homoglyphs?

Funny you should ask. If you pass in a unicode character to a non-unicode datatype (like char), SQL implicitly converts the unicode character to its closest resembling non-unicode homoglyph.

Bert’s examples show failure when converting Unicode data to VARCHAR, so this attack vector may not work if the input parameter is NVARCHAR, but even if that’s the case, it’s still one of many reasons why blacklists are awful for defending against injection attacks.
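
You can see the conversion for yourself with a quick query; here it is wrapped in dbatools’ Invoke-DbaQuery with a hypothetical instance name, and the exact result depends on your collation’s best-fit mapping:

    Import-Module dbatools

    # NCHAR(700) is U+02BC, the modifier letter apostrophe; casting it to a
    # non-Unicode type converts it to a plain Latin single quote under a
    # typical Latin collation's best-fit mapping
    Invoke-DbaQuery -SqlInstance "MyServer" -Query "
        SELECT NCHAR(700) AS UnicodeApostrophe,
               CAST(NCHAR(700) AS CHAR(1)) AS BestFitResult;"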
