
Author: Kevin Feasel

Power BI Build Permissions

Chris Webb explains what the Build permission actually does:

If there is sensitive data in your Power BI semantic model that you don’t want some users to see then you need to use row-level security or object-level security to control access to that data. You’re an experienced Power BI developer – you know that, right? But what about Build permissions? If an end-user only has access to a report you’ve built and doesn’t have Build permissions on the underlying semantic model, and if there’s no other security on the semantic model, can they access data in the semantic model that isn’t visible in the report? The answer is potentially yes: you can’t rely on Build permissions for security.

I’ve found that the best method of security is never deploying products to begin with. Also, never using a computer.


Column-Level Encryption in SQL Server

Greg Low provides a primer:

Way back in SQL Server 2005, Microsoft introduced an amazing array of technologies. I suspect that many SQL Server professionals aren’t aware of just how many technologies were introduced in that release. It’s not surprising that professionals are still learning about a number of these technologies, even 20 years later. In particular, questions regarding certificates, keys, and column-based encryption still draw blank responses from most SQL Server professionals.

What’s interesting is that, as far as technologies like Always Encrypted have taken us, one of the first things to get support in new twists on SQL Server (e.g., Synapse, Fabric) is column-level security.
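For readers who have never touched this corner of SQL Server, here is a minimal sketch of the SQL Server 2005-era encryption hierarchy Greg covers: a database master key protects a certificate, the certificate protects a symmetric key, and the symmetric key encrypts the column. All object and column names here (CustomerCert, CustomerKey, dbo.Customers, TaxId) are hypothetical.

```sql
-- 1. A database master key protects certificates in this database.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';

-- 2. A certificate protects the symmetric key.
CREATE CERTIFICATE CustomerCert WITH SUBJECT = 'Customer column encryption';

-- 3. A symmetric key performs the actual column encryption.
CREATE SYMMETRIC KEY CustomerKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE CustomerCert;

-- Encrypt on write...
OPEN SYMMETRIC KEY CustomerKey DECRYPTION BY CERTIFICATE CustomerCert;
UPDATE dbo.Customers
SET TaxIdEncrypted = ENCRYPTBYKEY(KEY_GUID('CustomerKey'), TaxId);

-- ...and decrypt on read.
SELECT CONVERT(varchar(20), DECRYPTBYKEY(TaxIdEncrypted)) AS TaxId
FROM dbo.Customers;
CLOSE SYMMETRIC KEY CustomerKey;
```

Note that the key must be explicitly opened in each session before `ENCRYPTBYKEY()` or `DECRYPTBYKEY()` will return anything other than NULL, which is one of the questions that tends to draw those blank responses.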


Tips for Building a Data Warehouse

James Serra gets back to foundations:

I had a great question asked of me the other day and thought I would turn the answer into a blog post. The question is “I’m an experienced DBA in SQL Server/SQL DB, and my company is looking to build their first data warehouse using Microsoft Fabric. What are the best resources to learn how to do your first data warehouse project?”. So, below are my favorite books, videos, blogs, and learning modules to help answer that question:

Click through for James’s recommendations. I strongly agree with his advice to start with Ralph Kimball’s The Data Warehouse Toolkit, and frankly, I think a lot of James’s advice here is sound. The person asking focuses on Fabric, and there are plenty of Fabric-specific things to learn, but at the end of the day, modern data warehouses are still data warehouses.


Functions and WHERE Clauses

Brent Ozar digs into some of the nuance:

SQL Server brought back 1856 of an estimated 1856 rows. So far, so good. But what happens when we start running functions on the parameter we’re searching for, like if it’s a parameter that we need to clean up. How will this affect our estimated number of rows:

Your knee-jerk reaction is probably to say, “FUNCTIONS BAD!” especially given that this is a case-insensitive database. But let’s test that hypothesis across time, across different database compatibility levels:

Brent gives some of the nuance behind this, which, in fairness, aligns with what some of the best performance tuning practitioners have frequently stated. But in contra-fairness, Brent shows how that advice should also change depending on the version of SQL Server, and how the first few versions of SQL Server after the no-longer-new cardinality estimator (introduced in 2014) behaved radically differently from pre-2014 or post-2019 versions.
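The distinction at play is worth spelling out. A function applied to the parameter runs once, so an index seek is still possible, though it can change the row estimate; a function applied to the column runs per row and generally forces a scan. A sketch, with a hypothetical Users table:

```sql
-- Sargable: SQL Server can seek an index on DisplayName.
SELECT Id
FROM dbo.Users
WHERE DisplayName = @DisplayName;

-- Function on the *parameter*: evaluated once, so a seek is still
-- possible, but the cardinality estimate may shift by version.
SELECT Id
FROM dbo.Users
WHERE DisplayName = LTRIM(RTRIM(@DisplayName));

-- Function on the *column*: evaluated per row, which generally
-- forces a scan and a much rougher estimate.
SELECT Id
FROM dbo.Users
WHERE LTRIM(RTRIM(DisplayName)) = @DisplayName;
```

Brent's tests cover the middle case, where the answer turns out to be version-dependent rather than a blanket "functions bad."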


Accessing a Former Employee’s Power BI Workspace

Gilbert Quevauvilliers says it’s MY workspace now:

One of the common challenges I’ve seen in organizations is when a team member leaves and their Power BI reports are stored in their personal My Workspace. These reports often contain valuable datasets and dashboards that are still in use or need to be maintained. So, how do you access and recover these reports?

In this blog post, I’ll walk you through the steps to access a former employee’s My Workspace, assign it to a supported capacity, and download the reports using Fabric Studio.

Read on for the instructions, and be sure to do the “Look at me. I am the captain now” meme when it works.


Invoking REST API Endpoints in SQL Server 2025

Hristo Hristov makes a call:

One highly anticipated new feature in SQL Server 2025 is the ability to call an external REST API endpoint from the database server itself. This new feature opens the door to new data integration scenarios and delivers on the promise to “bring AI closer to data.” What are the steps to follow if you want to use this new feature?

I expect to see two things from this. First, some percentage of developers will abuse it and cause performance problems in the database. Second, some percentage of database administrators will panic about this and try to prevent its use even when it makes sense.

But hey, at least this time, they didn’t use the term “unsafe” to describe something DBAs don’t understand and thus cause a widescale panic.
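For the curious, the rough shape of a call looks like this, based on how the procedure works in Azure SQL Database today; the URL is a placeholder, and details may differ in the final SQL Server 2025 release.

```sql
DECLARE @response nvarchar(max);

EXEC sp_invoke_external_rest_endpoint
    @url      = N'https://example.com/api/data',
    @method   = N'GET',
    @response = @response OUTPUT;

-- The response comes back as a JSON document containing status,
-- headers, and body.
SELECT JSON_VALUE(@response, '$.response.status.http.code') AS HttpStatus,
       @response AS FullResponse;
```

The endpoint must typically be on an allowed list before calls succeed, which is also the natural control point for those panicking database administrators.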


Power BI Projects and PBIR Format Admin Settings

Koen Verbeeck notes a new default:

The Power BI Enhanced Report Format (PBIR) will soon become the default, and that’s a good thing because it significantly makes git integration easier. You can already enable it in the preview features of Power BI Desktop (also enable PBIP and TMDL to make git integration of the model itself much easier).

Read on to see the administrative setting associated with this, as well as reasons why you should keep it on.


Eventhouse Endpoint for Fabric Data Warehouse

Tzvia Gitlin Troyna announces a new feature:

The new Eventhouse Endpoint for Fabric Data Warehouse extends this same architecture to structured data sources, allowing users to:

  • Query Fabric Data Warehouse tables in real-time using KQL.
  • Leverage schema mirroring for warehouse tables.
  • Unify analytics across Lakehouse and Fabric Data Warehouse without duplicating data.

Even if I don’t expect many data platform practitioners to use KQL and even though I’m morally opposed to the Fabric Data Warehouse (short version: Lakehouses and Warehouses in Fabric should be the same thing, not two separate things), I’d still consider this a step forward. It does provide a new integration point between two services that have been annoyingly isolated.
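To give a flavor of what this looks like in practice, here is a hypothetical KQL query against a warehouse table surfaced through the Eventhouse endpoint; the table and column names (FactSales, OrderDate, SalesAmount, ProductCategory) are illustrative, not from the announcement.

```kusto
FactSales
| where OrderDate >= ago(7d)
| summarize Revenue = sum(SalesAmount) by ProductCategory
| top 10 by Revenue desc
```

The point of the feature is that this runs against the warehouse table directly, with no copy of the data landing in the Eventhouse.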


Database and Fabric Ignite Roundup

Victoria Holt lays out the news:

Real-Time Analytics with Synapse Real-Time Intelligence

  • The new Synapse Real-Time Intelligence in Fabric allows streaming data from operational databases to be analyzed instantly.
  • This supports use cases like fraud detection, predictive maintenance, and personalized recommendations.

Ah, thank goodness they’re re-using the name Synapse for something that has nothing to do with Azure Synapse Analytics. Makes sense when you’re running Microsoft Service Fab–, err, App Fab–, err, Fabric.


Slimming Down Batch Deletion in SQL Server

Matt Gantz deletes a batch at a time:

In previous articles I showed patterns for working with large amounts of data on big tables while keeping locking at a minimum. These processes can allow migrations and maintenance without requiring downtime but, in environments with unpredictable database workloads, there is a risk of heavy traffic starting at any time and disrupting a once smooth operation. In this article, I’ll demonstrate how to augment these processes to allow dynamic adjustment of the configuration.

For most systems, the main limitation these techniques run into is the speed and throughput of I/O (input/output). During periods of low traffic, a large batch size may perform great with no impact to production, but as traffic increases, the storage subsystem may not be able to keep up.

Read on for two mechanisms to make batch operations a little less stressful on the server.

A consulting customer of mine has a fairly clever mechanism for this as well: track the number of non-trivial active processes before each batch begins. If that number is above a certain threshold (say, 10 or 15), pause for a pre-defined period of time before running. That way, if the server isn’t very active, batches can keep processing willy-nilly; but once things get busy, the process backs off and reduces its load on the server.
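That throttling idea can be sketched in a few lines of T-SQL. Everything here is hypothetical and illustrative: the table, the retention rule, the 10-session threshold, and the 30-second pause would all need tuning for a real workload.

```sql
DECLARE @BatchSize int = 5000,
        @BusyThreshold int = 10,
        @RowsDeleted int = 1;

WHILE @RowsDeleted > 0
BEGIN
    -- Count other active requests, ignoring this session and
    -- (as a rough heuristic) system sessions.
    IF (SELECT COUNT(*)
        FROM sys.dm_exec_requests
        WHERE session_id <> @@SPID
          AND session_id > 50) >= @BusyThreshold
    BEGIN
        WAITFOR DELAY '00:00:30';  -- back off before the next batch
    END;

    DELETE TOP (@BatchSize)
    FROM dbo.EventLog
    WHERE EventDate < DATEADD(DAY, -90, GETDATE());

    SET @RowsDeleted = @@ROWCOUNT;
END;
```

The `session_id > 50` filter is the old rule of thumb for excluding system sessions; a production version would want a sharper definition of "non-trivial," such as filtering on wait types or request duration.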
