Press "Enter" to skip to content

Author: Kevin Feasel

Using Diagnostics Trace With Power BI

Chris Webb uses Diagnostics.Trace to track process runtime in Power BI:

To sum up, the workflow for tuning your query is:

  • Make some changes to the LongQuery query that hopefully make it faster

  • Update the Trace Message parameter with some notes about which version of the LongQuery query it is that you’ll be testing

  • Click the Refresh Preview button for the Diagnostics query to test how long LongQuery now runs for

  • Refresh, or load, the query that reads the data from the trace logs so you can see how all of your changes have affected query execution times

I give it two months before the Power BI team releases a change to make this easier…
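
If you want to experiment with the pattern in the meantime, here is a minimal sketch in Power Query M, assuming a slow query named LongQuery and a text parameter named TraceMessage (stand-ins for the names in Chris’s post); note that M’s lazy evaluation makes simple timings like this approximate:

    let
        StartTime = DateTime.LocalNow(),
        // Counting rows forces full evaluation of the query under test.
        RowCount = Table.RowCount(LongQuery),
        EndTime = DateTime.LocalNow(),
        Seconds = Duration.TotalSeconds(EndTime - StartTime),
        // Diagnostics.Trace writes a line to the trace log and returns its
        // third argument; passing a function with delayed = true keeps the
        // trace call itself from skewing the measurement.
        Output = Diagnostics.Trace(
            TraceLevel.Information,
            TraceMessage & ": " & Text.From(RowCount) & " rows in " & Text.From(Seconds) & "s",
            () => Seconds,
            true
        )
    in
        Output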

SQL Server Backup Buffers

SQLsasquatch (whose real name I now know, but I like the handle too much to drop it) has a few questions about backup buffers:

I wonder:

  • If ‘max server memory’ wasn’t being overridden by a more reasonable target (because max server memory is 120 GB on a 32 GB VM), would the behavior still be the same before reaching target?  I bet it would be.

  • Is this behavior specific to not having reached target?  Or when reaching target would backup buffers be allocated, potentially triggering a shrink of the bpool, then returned to the OS afterward requiring the bpool to grow again?

  • What other allocations are returned directly to the OS rather than given back to the memory manager to add to free memory?  I bet CLR does this, too.  Really large query plans?  Nah, I bet they go back into the memory manager’s free memory.

  • Does this make a big deal?  I bet it could.  Especially if a system is prone to develop persistent foreign memory among its NUMA nodes.  In most cases, it probably wouldn’t matter.

Good questions, to which I have zero answers.
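
If you want to poke at the first question yourself, sys.dm_os_memory_clerks is the place to look while a backup runs. A rough sketch, with the caveat that it is my assumption that backup buffers surface under the MEMORYCLERK_SQLUTILITIES clerk; verify on your build:

    -- Snapshot the clerk by NUMA node before, during, and after a backup to
    -- see whether the memory returns to the OS or to the memory manager.
    SELECT
        type,
        memory_node_id,
        SUM(pages_kb) AS pages_kb,
        SUM(virtual_memory_committed_kb) AS vm_committed_kb
    FROM sys.dm_os_memory_clerks
    WHERE type = N'MEMORYCLERK_SQLUTILITIES'
    GROUP BY type, memory_node_id
    ORDER BY memory_node_id;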

Launch Event Thoughts

Patrick Keisler sums up his thoughts on the Raleigh-Durham SQL Server 2016 Launch Discovery Day:

Your solution must be completed in 3 hours.

On paper this all sounds pretty easy, but in practice it was quite hard. I am no BI developer and the other members of my team did not have any expertise in that area either, but we still managed to create a solution and have fun doing so.

The first issue we had was how to combine our development work on the same database. This one was easy…just use Azure. In the span of about 30 minutes, I spun up a new Azure VM with SQL Server 2016 pre-installed, uploaded the database, set up logins, and opened the appropriate ports. I then gave my team members the URL and credentials so they each could connect from their laptops.

These are good thoughts, and I completely agree with the point that better data definition would have made for a better event.  Each one of the teams had to spend a lot of time cleaning up data, and I think that limited the teams’ ability to do really cool things.  I’d love to put something like this on again, but if that happens, I’m going to make sure we start with a good data model so people can do fun things on top of that rather than spend all their time scrubbing data (unless that’s the point of the exercise).

Row-Level Security With Reporting Services

Paul Turley discusses combining row-level security, SQL Server Reporting Services, and SQL Server Analysis Services:

In every data source connection string, you can add a simple expression that maps the current Windows username to the CUSTOMDATA property of the data source provider.  This works in SSRS embedded data sources, shared data sources, in a SharePoint Office Data Connection (ODC) file and in a SharePoint BISM connection file.  In each case, the syntax should be similar.  Here is my shared data source on the SSRS 2016 report server.

This is pretty snazzy.  Paul goes into good detail on the topic, so read the whole thing.
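
To make the two halves of the pattern concrete, here is a sketch with hypothetical server, model, and table names: an expression-based connection string on the SSRS data source, and a DAX row filter in the SSAS tabular role that reads the value back:

    ' SSRS embedded data source: pass the current report user as CUSTOMDATA.
    ="Data Source=MySsasServer;Initial Catalog=SalesModel;CustomData=" & User!UserID

    // SSAS tabular role, row filter on a hypothetical Users table:
    =Users[LoginName] = CUSTOMDATA()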

Share Your Knowledge

Ginger Grant wants you to share what you know:

Recently I have been working with some new features of SQL Server 2016 and have had questions for which blogs, TechNet, and Stack Overflow provided no answers on the internet. Fortunately, I have found people to help me resolve them. If you go searching for the same errors I had, you will find answers now, as I have posted them. If you have had a problem unique to the latest release of SQL Server, I hope you will take the time to post the question and the answer if you have it. I’m going to try to be better at answering forum questions, especially now that I have learned a few interesting factoids. I am looking forward to the fact that next time when I go looking for an answer, thanks to all of us who have done the same, we can all help each other out. The next person who finds themselves in the same jam will thank you for talking them out of the tree.

And if you don’t already do so, blog.  And if I don’t know about your blog, tell me about it.

Warehouse History

Kennie Pontoppidan delves into various aspects of collecting and storing history in warehouses:

In T2 history we have the two attributes ValidFromDate and ValidToDate. We can choose two different strategies for updating the values of these: using system time (load time) or business time. If we use system time for the T2 splits, the data warehouse history is dependent on when we load data. This makes it impossible to reload data in the data warehouse without messing up the data history. If we allow our load ETL procedures to use timestamps for business time (when data was really valid) for T2 history, we get the opportunity to reload data. But the cost of this flexibility is a much more complicated design for T2 splits. We also need to keep track of this metadata on the source system attributes.

Part of a warehouse’s value is its ability to replay historical data, but you can only do that if you store the data correctly (and query it correctly!).
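
As a sketch of the business-time variant Kennie describes, assuming a hypothetical DimCustomer dimension with ValidFromDate/ValidToDate columns and a 9999-12-31 sentinel for the open row:

    DECLARE @CustomerBK int = 42,                    -- business key of the changed customer
            @NewName nvarchar(100) = N'Contoso Ltd (renamed)',
            @BusinessValidFrom date = '2016-06-01';  -- when the change was really valid

    -- Close the open version as of the business-valid time, not the load
    -- time, so a reload reproduces the same history whenever the ETL runs.
    UPDATE dbo.DimCustomer
    SET ValidToDate = @BusinessValidFrom
    WHERE CustomerBK = @CustomerBK
      AND ValidToDate = '9999-12-31';

    -- Open the new version with the same business-valid timestamp.
    INSERT dbo.DimCustomer (CustomerBK, CustomerName, ValidFromDate, ValidToDate)
    VALUES (@CustomerBK, @NewName, @BusinessValidFrom, '9999-12-31');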

Azure Key Vault Connector Available

Rebecca Zhang notes that Azure Key Vault is now available to all:

When using these SQL encryption technologies, your data is encrypted with a symmetric key (called the database encryption key) stored in the database. Traditionally (without Azure Key Vault), a certificate that SQL Server manages would protect this data encryption key (DEK). With Azure Key Vault integration for SQL Server through the SQL Server Connector, you can protect the DEK with an asymmetric key that is stored in Azure Key Vault. This way, you can assume control over the key management, and have it be in a separate key management service outside of SQL Server.

The SQL Server Connector is especially useful for those using SQL Server-in-a-VM (IaaS) who want to leverage Azure Key Vault for managing their encryption keys. SQL IaaS is the simplest way to deploy and run SQL Server, and it is optimized for extending existing on-premises SQL Server applications to the cloud in a hybrid IaaS scenario, or supporting a migration scenario.

Read on for more details.
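
The T-SQL side of the setup looks roughly like this. This is abridged and hedged: the provider path, credential secret, and key names are placeholders, and the full recipe in the documentation also maps a dedicated login to the key before enabling TDE:

    USE master;

    -- Register the SQL Server Connector as an EKM provider.
    CREATE CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM
    FROM FILE = 'C:\Program Files\SQL Server Connector for Microsoft Azure Key Vault\Microsoft.AzureKeyVaultService.EKM.dll';

    -- Credential holding the Azure AD client ID and secret for the vault.
    CREATE CREDENTIAL AKV_Cred
        WITH IDENTITY = 'ContosoKeyVault',
        SECRET = '<client ID and client secret, formatted per the connector docs>'
    FOR CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM;

    ALTER LOGIN [DOMAIN\dba] ADD CREDENTIAL AKV_Cred;

    -- Surface the Key Vault key inside SQL Server as an asymmetric key.
    CREATE ASYMMETRIC KEY AKV_TDE_Key
    FROM PROVIDER AzureKeyVault_EKM
    WITH PROVIDER_KEY_NAME = 'ContosoTDEKey',
         CREATION_DISPOSITION = OPEN_EXISTING;

    -- Protect the DEK with that key instead of a server-managed certificate.
    USE MyDatabase;
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER ASYMMETRIC KEY AKV_TDE_Key;
    ALTER DATABASE MyDatabase SET ENCRYPTION ON;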

HBase PaaS On AWS

Qubole now offers HBase on AWS as a Platform as a Service product:

  • Scheduled Backups and Restore to any cluster: HBase data is backed up to S3 regularly, and you can use Qubole’s cluster management UI to easily configure schedules for your backups. All the data or specific tables can be restored to any other HBase cluster. This feature is useful not only for disaster recovery but also for bringing up test or dev clusters, as well as clusters for one-off analysis.

  • Cluster Management: HBase clusters in QDS are elastic! Our users regularly increase the capacity of their clusters. This is a boon for capacity planners, as they can be more reactive to workload patterns, saving time and money.

  • Zero impact node rotations: By carefully orchestrating data compaction and HDFS block transfers prior to reassigning region servers, QDS minimizes the impact of adding/deleting/replacing nodes in an HBase cluster.

  • High-Availability Configurations: Putting together a high availability HBase cluster in the cloud can be daunting. QDS provides prebuilt templates for deploying HA HBase clusters through a single click.

The value proposition here is very similar to Azure SQL Database’s:  eliminate some of the administrative tasks by accepting the provider’s defaults.  Doing so removes some of the complexity of running HBase.

Learning By Doing

Matt Cushing gives us some notes on learning SSIS:

Send Mail tasks won’t work unless you have things configured properly.  I was trying to find things on Google, and all I kept coming across was how to configure the task or how to install SSIS and configure it to run, not how to configure the server to send mail properly.  Thankfully John took pity on me and helped me realize that using an Execute SQL Task and sp_send_dbmail works more easily and cleanly – SQL Server Central

I’ve used Send Mail a few times, but have always had somebody else around to configure Exchange or whatever other mail server we were using at the time.
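
For reference, the Execute SQL Task version really is about this small, assuming Database Mail is already configured (the profile name here is made up):

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'DefaultMailProfile',   -- hypothetical Database Mail profile
        @recipients   = N'team@example.com',
        @subject      = N'SSIS package finished',
        @body         = N'The nightly load completed successfully.';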

Power BI Groups

Reza Rad looks at using Power BI groups and integrating with Office 365:

When you share content with an individual in the organization, if that person leaves the company or is replaced by someone else from another team, then you have to remove sharing from the previous user account and assign it to the new user account. Best practice is to share content with groups, whose members can then easily be managed by an administrator. Power BI groups are fully synchronized with Office 365 groups; once you use a group in Power BI, it is only an admin’s task to add or remove members from it.

I like this group-based approach a lot, as it makes dashboard security a lot easier.
