Press "Enter" to skip to content

Author: Kevin Feasel

Getting A Handle On Isolation Levels

Robert Bishop explains some of the basics behind transaction isolation levels:

Lost Updates

It is possible for two or more transactions to modify the same row. While a subsequent transaction is reading the data, the first transaction commits its changes. The subsequent transaction then makes its own changes and commits, potentially overwriting the changes committed by the first transaction. In this scenario, the last transaction always wins.

Dirty Read

A dirty read happens when a SELECT statement reads from a table whose modifications have not yet been committed to the database. By default, SQL Server protects against this at every isolation level except Read Uncommitted, or when you use the NOLOCK hint in your queries.

This is the first in a series.
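If you want to see a dirty read for yourself, here is a minimal T-SQL sketch (dbo.Accounts and the values are placeholders, not from the post), run from two separate sessions:

-- Session 1: modify a row but leave the transaction open.
-- dbo.Accounts is a stand-in table for this demo.
BEGIN TRANSACTION;
UPDATE dbo.Accounts
SET    Balance = Balance - 500
WHERE  AccountID = 42;
-- No COMMIT yet.

-- Session 2: read the uncommitted change.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT Balance
FROM   dbo.Accounts
WHERE  AccountID = 42;
-- The same effect comes from a table hint:
-- SELECT Balance FROM dbo.Accounts WITH (NOLOCK) WHERE AccountID = 42;

-- Session 1: roll back; session 2 has already read data that never officially existed.
ROLLBACK TRANSACTION;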

Password Expiration

Artemakis Artemiou hits a controversial topic in security today:

As you can see from the above screenshot, the maximum password age is set to 90 days. This means that if I have a login that uses SQL Server Authentication, or a contained database user with a password, and that login's or user's password was last changed 90 or more days ago, then SQL Server will prompt the login or user to change the password.

Even though this practice was used for many years, not only in SQL Server but also at the Windows level and in other systems and applications, recent studies argue that it should no longer be a recommended practice. On the contrary, these studies suggest that user passwords should not be changed on a regular schedule, but rather only when there is a specific and justified reason to do so. Furthermore, new security standards are being formulated that contain new recommendations on password changes.

Security is all about trade-offs.  If you make people change their passwords every 90 days, they’ll write the password down somewhere.
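For reference, expiration only applies to SQL logins that have CHECK_EXPIRATION turned on; a minimal T-SQL sketch (the login name and password are made up) of creating such a login and checking where it stands looks like this:

-- Create a SQL authentication login that inherits the Windows password policy,
-- including the machine's maximum password age (90 days in the post's example).
-- AppUser and the password are placeholders.
CREATE LOGIN AppUser
WITH PASSWORD = N'S0me$trongP@ssw0rd!',
     CHECK_POLICY = ON,
     CHECK_EXPIRATION = ON;

-- See how the policy currently applies to that login.
SELECT LOGINPROPERTY('AppUser', 'PasswordLastSetTime') AS PasswordLastSetTime,
       LOGINPROPERTY('AppUser', 'DaysUntilExpiration') AS DaysUntilExpiration,
       LOGINPROPERTY('AppUser', 'IsExpired')           AS IsExpired;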

Reading Extended Events

Dave Mason writes a custom app to read extended events data:

It took me a while to make the transition from SQL Profiler to Extended Events. Eventually I got comfortable enough with it to use it 100% of the time. As I read more about the XEvents architecture (as opposed to just “using” XEvents), I gained a deeper appreciation of just how great the feature is. My only gripe is that there isn’t a way to handle the related events from within SQL Server using T-SQL. DDL triggers can’t be created for XEvents. And they can’t be targeted to Service Broker for Event Notifications (not yet, anyway). For now, the one way I know of to handle an XEvent is by accessing the event_stream target via the .NET framework. I’ll demonstrate with C#.

9/10, would have preferred F# but would read again.
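If you want to experiment along with the post, the .NET client needs a running event session to stream from; here is a minimal server-side sketch (the session name and event choice are mine, not Dave's):

-- TrackErrors is an example session name; pick events relevant to your workload.
-- A session does not need an explicit target for a client to read its live stream.
CREATE EVENT SESSION TrackErrors ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.sql_text, sqlserver.username)
    WHERE severity >= 16
);

ALTER EVENT SESSION TrackErrors ON SERVER STATE = START;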

Pipelearner

Simon Jackson introduces pipelearner, a tool to help with creating machine learning pipelines:

This post will demonstrate some examples of what pipelearner can currently do. For example, the Figure below plots the results of a model fitted to 10% to 100% (in 10% increments) of training data in 50 cross-validation pairs. Fitting all of these models takes about four lines of code in pipelearner.

Click through for some very interesting examples.

Monitoring Car Data With Spark And Kafka

Carol McDonald builds a model to determine where Uber cars are clustered:

Uber trip data is published to a MapR Streams topic using the Kafka API. A Spark streaming application, subscribed to the topic, enriches the data with the cluster Id corresponding to the location using a k-means model, and publishes the results in JSON format to another topic. A Spark streaming application subscribed to the second topic analyzes the JSON messages in real time.

This is a fairly detailed post, well worth the read.

Web App Security

Vishwas Parameshwarappa has an article on securing web applications:

The Cross-site request forgery (CSRF) exploit uses cross-site scripting (mentioned above), browser insecurities, and other techniques to cause a user to unwittingly perform an action within their current authenticated context that allows the attacker to access the user’s account. This type of attack usually occurs when a malicious email, blog, or a message causes a user’s Web browser to perform an unwanted action on a trusted site for which the user is currently authenticated.

This is a nice overview of the most common attack vectors for web applications.

Deleting SSAS Partitions

Chris Koester shows how to use TMSL and Powershell to delete an Analysis Services tabular model partition:

The sample script below shows how this is done. The sequence command is used to delete multiple partitions in a single transaction. This is similar to the batch command in XMLA. In this example we're only performing delete operations, but many different operations can be performed in sequence (and some in parallel).

Click through for a description of the process as well as a script to do the job.

Always Encrypted With Powershell

Jakub Szymaszek shows how to configure Always Encrypted support from Powershell:

Note: In a production environment, you should always run tools (such as PowerShell or SSMS) that provision and use Always Encrypted keys on a machine different from the one hosting your database. The primary purpose of Always Encrypted is to protect your data in case the environment hosting your database gets compromised. If your keys are revealed to the machine hosting the database, an attacker can obtain them and the benefit of Always Encrypted will be defeated.

That’s a good warning.

Azure Resource Explorer

Kenneth Fisher discusses the Azure Resource Explorer:

Now this is just the default tab, the GET, PUT tab, which basically shows you the GET command of the resource manager API that retrieves this information; if you hit the Edit button, you can actually change information in the JSON output and issue a PUT command to send it back. I'll admit up front that this is a bit beyond me, as I don't do API calls and I'm new enough to Azure that I don't know what I can and can't change (everything I've tried so far hasn't worked). There are several other tabs, though, including a Powershell one, and I'm a bit more familiar with Powershell. In it, you can see some of the Powershell commands associated with the resource manager and this particular object.

Read on for more information.

Columnstore Partitioning

Niko Neugebauer warns against partitioning small tables with clustered columnstore indexes:

Needless to say, looking at the execution plans, you will notice that the actual execution plan shows a 10 times difference between them, even though both tables contain the very same data!
The query cost for the partitioned table is staggering: it is around 10 times bigger (~8.8) versus (~0.81) for the first query.
The execution times reflect this situation in part: 12 ms vs 91 ms. The non-partitioned table performs almost 9 times faster overall, and the CPU time spent reflects it: 15 ms vs 94 ms. Remember that both tables are based on columnstore indexes! Partitioning your table the wrong way carries a huge penalty that might not be directly detectable through the execution plans of complex queries. Well, you might want to use the CISL, just saying.

If you can’t fill a single rowgroup, your partition is too granular.  Even then, I’d like to see double-digit rowgroups per partition, though that’s just me.
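To check that on an existing table, a quick query against the rowgroup DMV (dbo.FactSales is a placeholder table name) shows how many rowgroups each partition holds and how full they are:

-- dbo.FactSales is a stand-in for your clustered columnstore table.
SELECT partition_number,
       COUNT(*)        AS row_groups,
       SUM(total_rows) AS total_rows,
       AVG(total_rows) AS avg_rows_per_row_group
FROM sys.dm_db_column_store_row_group_physical_stats
WHERE object_id = OBJECT_ID(N'dbo.FactSales')
GROUP BY partition_number
ORDER BY partition_number;

Rowgroups cap out at 1,048,576 rows, so partitions whose rowgroups sit far below that across the board are one sign the partitioning scheme is slicing the data too thin.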
