
Auditing Within Power BI

Adam Saxton has a video on how to use Power BI Auditing:

In this video, I look at the Power BI Auditing feature that was made available a few weeks ago. I show how to turn it on and how to search. This can be helpful with understanding who is doing what within your organization.

You can read more about Power BI Auditing in the official documentation: Auditing Power BI in your organization.

The ability to audit data access is important enough in regulated environments that its absence was probably a deal-killer for Power BI until a few weeks ago.


Migrating Data To SQL Server Using Data Factory

Ginger Grant moves data from Azure Blob Storage into Azure SQL Database using Data Factory:

There are instances where data resides in Azure Blob Storage and the data is needed in a SQL database. For example, if one ran a Machine Learning experiment in Data Factory, the results would be stored in Azure Blob Storage, and for analysis purposes, it may make a lot more sense to move the data to a SQL database. Moving data around in Data Factory means writing JSON. In this example we will be using an Azure SQL DB, but it is not essential that the data be stored in Azure; an on-premises SQL Server could also be used, and as long as a gateway is added for the connection, the other steps are the same. There are five different Data Factory elements required to move data from an Azure blob to a database:

  • a pipeline for the data,
  • a data set containing the definition for the blob,
  • a linked service for the blob,
  • a data set containing a definition for the SQL data, and
  • a linked service to connect to the SQL database.

There’s a lot of JSON ahead.
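
To give a feel for the JSON involved, here is a minimal sketch of two of the five elements (the blob linked service and the copy pipeline), built as Python dictionaries and serialized to JSON. The names and account details are placeholders, and the shapes follow the general Data Factory (v1) copy pattern rather than Ginger's exact definitions.

    import json

    # Linked service telling Data Factory how to reach the blob storage
    # account (connection string is a placeholder).
    blob_linked_service = {
        "name": "BlobStorageLinkedService",
        "properties": {
            "type": "AzureStorage",
            "typeProperties": {
                "connectionString": "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
            },
        },
    }

    # Pipeline with a single Copy activity reading from the blob data set
    # and writing to the SQL data set (both defined separately).
    pipeline = {
        "name": "BlobToSqlPipeline",
        "properties": {
            "activities": [
                {
                    "name": "CopyBlobToSql",
                    "type": "Copy",
                    "inputs": [{"name": "BlobInputDataset"}],
                    "outputs": [{"name": "SqlOutputDataset"}],
                    "typeProperties": {
                        "source": {"type": "BlobSource"},
                        "sink": {"type": "SqlSink"},
                    },
                }
            ]
        },
    }

    print(json.dumps(blob_linked_service, indent=2))
    print(json.dumps(pipeline, indent=2))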


Calculating DTU

John Sterrett gives us a query for calculating DTU percentage in Azure SQL Database:

The whole query is below. Right now, let’s just focus on the secret sauce: how the DTU percentage gets calculated. In a nutshell, the maximum of the CPU, Data IO, and Log Write percentages determines your DTU percentage. What does this mean to you? Your max consumer limits you, so you can be using 1% of your IO but still be slowed down because CPU could be your max consumer resource.

That’s a rather interesting finding. I think the next step (which may be so context-dependent that it’s not possible to generalize) is to figure out what various workloads do to these metrics, and whether there’s a way to predict the expected DTU load with reasonable accuracy given an anticipated change in workload, rather than watching the value spike and reacting to it after the fact.
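
As a minimal sketch of the calculation John describes (the function name and sample values here are mine; the real query reads avg_cpu_percent, avg_data_io_percent, and avg_log_write_percent from sys.dm_db_resource_stats):

    # DTU percentage is the maximum of the three resource consumers,
    # so whichever resource is busiest sets the number.
    def dtu_percent(cpu: float, data_io: float, log_write: float) -> float:
        return max(cpu, data_io, log_write)

    # Example: IO is nearly idle, but CPU alone pins DTU at 85%.
    print(dtu_percent(cpu=85.0, data_io=1.0, log_write=12.0))  # 85.0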


Releasing An Azure Page/Blob Lease

Denny Cherry has VB.NET code to release an Azure page or blob storage lease:

Sometimes when firing up or moving VMs from the page or blob store, you’ll get an error that there is still a lease on the file. To solve this you need to release the lease, and waiting won’t do the trick, as the leases don’t have an expiration date.

I found some VB.NET code online that, with some tweaking (and the help of Eli Weinstock-Herman and Christiaan Baes), I was able to get to release the lease.

Click through for the code.
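
Denny’s code is VB.NET against the older storage SDK. As a rough modern equivalent (not his code), here is a sketch using the azure-storage-blob Python SDK v12, with all connection details as placeholders:

    from azure.storage.blob import BlobClient, BlobLeaseClient

    # Placeholders: point this at the VHD blob with the stuck lease.
    blob = BlobClient.from_connection_string(
        conn_str="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
        container_name="vhds",
        blob_name="stuck-vm-disk.vhd",
    )

    # break_lease frees the blob even without the original lease ID;
    # a break period of 0 ends the lease immediately.
    BlobLeaseClient(client=blob).break_lease(lease_break_period=0)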


Copying On-Prem Databases To Azure SQL Database

Kenneth Fisher walks us through migrating a database to Azure SQL Database:

It turns out it’s pretty easy (even if it takes some time). So where to start? Well, the first thing we need is a place to put our database: an Azure SQL Database server. If you don’t already have one, creating a new one is fairly easy.

First, start at portal.azure.com. Log in and follow these steps.

This is the longer, manual process. It’s good to walk through it this way at least once before writing a PowerShell script, just to see what the script is doing.
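
Once you do move on to scripting it, the export/import steps map to a couple of SqlPackage calls. Here is a sketch (server names, database names, and credentials are all placeholders), driven from Python purely for illustration:

    import subprocess

    # Export the on-premises database to a bacpac file.
    subprocess.run([
        "sqlpackage", "/Action:Export",
        "/SourceServerName:localhost",
        "/SourceDatabaseName:MyDatabase",
        "/TargetFile:MyDatabase.bacpac",
    ], check=True)

    # Import the bacpac into the Azure SQL Database server.
    subprocess.run([
        "sqlpackage", "/Action:Import",
        "/SourceFile:MyDatabase.bacpac",
        "/TargetServerName:myserver.database.windows.net",
        "/TargetDatabaseName:MyDatabase",
        "/TargetUser:serveradmin",
        "/TargetPassword:...",
    ], check=True)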


SSMS In 4K

Aaron Bertrand compares versions of SQL Server Management Studio now that there’s better support for high-resolution monitors:

At first I used Gianluca’s solution (“SSMS in High-DPI Displays: How to Stop the Madness“), but it wasn’t perfect – fonts in some places were really blurry, and some dialogs were totally unusable. He has several examples in his post if you’re curious. But I have several too.

This is the previous version of SSMS (13.0.15600.2), out of the box, which now forgoes any type of DPI scaling at all, using the old-fashioned jaggy type we’ve been suffering through for decades (except check out the smooth text on the About dialog title bar!).

Cf. Gianluca Sartori’s post. There’s still some work to do, but more and more of us are moving to high-resolution and 4K monitors; 1080p isn’t cutting it anymore.


Power BI Dataset Refresh Errors

Chris Webb has a workaround for dataset refresh errors that crop up when writing certain kinds of M code in Power BI:

This is just a variation on the widely-used M pattern for using functions to iterate over and combine data from multiple data sources; Matt Masson has a good blog describing this pattern here. In this case I’m doing the following:

  • Defining a table using #table() with three rows containing three search terms.
  • Defining a function that calls the metadata API. It takes one parameter, a search term, and returns a value indicating whether the search was successful or not from the JSON document returned. What the API actually returns isn’t relevant here, though, just the fact that I’m calling it. Note the highlighted lines in the code above that show how I’m constructing the URL passed to Web.Contents() by simply concatenating the base URL with the string passed in via the custom function’s Term parameter.
  • Adding a custom column to the table returned by the first step, and calling the function defined in the second step using the search term given in each row.

This query refreshes with no problems in Power BI Desktop. However, when you publish a report that uses this code to PowerBI.com and try to refresh the dataset, you’ll see that refresh fails and returns a rather unhelpful error message:

Data source error: Unable to refresh the model (id=1264553) because it references an unsupported data source.

The nature of the problem makes sense, and Chris provides one method of getting around this error.
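
For readers who don’t write M, here is a loose Python analogy of the pattern (the API URL and terms are placeholders): the part the Power BI service objects to is the per-row, dynamically concatenated URL.

    import urllib.request

    # Stands in for the #table() step: three rows of search terms.
    terms = ["apple", "orange", "banana"]

    BASE_URL = "https://example.com/api/search?term="  # placeholder API

    def search_succeeded(term: str) -> bool:
        # Analogous to the M function wrapping Web.Contents(): the URL
        # is built by concatenating the base URL with the term, which
        # is what the scheduled refresh cannot validate.
        with urllib.request.urlopen(BASE_URL + term) as response:
            return response.status == 200

    # Analogous to the custom column calling the function on each row.
    results = [(term, search_succeeded(term)) for term in terms]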


Data Access And Streaming

Kartik Paramasivam discusses data access problems and solutions within a streaming architecture:

Using a remote store: This is the traditional model for building applications. Here, when an application needs to process an event, it makes a remote call to a separate SQL or NoSQL database. In this model, write operations are always remote calls, but reads can be performed on a local cache in certain scenarios. There are a large number of applications at LinkedIn that fall into this category.

Another pattern is to use a remote cache (e.g., Couchbase) that is fronting a remote database (e.g., Oracle). If the remote cache is used primarily for reading adjunct data, then applications use an Oracle change capture stream (using Databus) to populate the remote cache.

This is a must-read if you’re looking at implementing a streaming architecture and need to do any kind of data enrichment.
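
As a minimal sketch of the read path for the cache-fronting-a-database pattern Kartik mentions (the cache and database here are stand-in dictionaries, not Couchbase or Oracle client APIs):

    # Stand-ins for a remote cache (e.g., Couchbase) and the system of
    # record behind it (e.g., Oracle).
    remote_cache = {}
    database = {"member:42": "profile-for-member-42"}

    def read(key):
        # Reads hit the cache first and fall back to the database.
        if key in remote_cache:
            return remote_cache[key]
        value = database.get(key)
        if value is not None:
            # In the pattern described, the cache is actually populated
            # from a change capture stream (Databus) rather than on
            # read, but the read path looks the same to the caller.
            remote_cache[key] = value
        return value

    print(read("member:42"))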


Dacpac Deployment Models

Ed Elliott discusses publish profiles as part of dacpac deployment scenarios:

For a while I meandered between the two approaches, until the SSDT team announced that they had released a NuGet package with the DacFx in it, and I decided that I would move over to that, as it meant that I no longer had to check the DLLs into source control, which in itself is a big win. I also decided to fix the extensions problem, and so figured out a (slightly hacky) way to get the DacFx DLLs in the NuGet package to behave like sqlpackage and allow a sub-directory to be used to load DLLs. I fixed that using a PowerShell module that wraps a .NET DLL (https://the.agilesql.club/blogs/Ed-Elliott/DacFxed-Nugetized-DacFx-Power…). Now I had the problem of not having to check in DLLs, and still being able to load contributors without having to install into Program Files, sorted. But I still had the problem of lots of command-line arguments, which I was sharing in PowerShell scripts and passing in some custom bits like server/db names, etc.

I’m not very familiar with dacpacs, so this was an interesting read for me.
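
The publish-profile approach Ed lands on replaces that pile of command-line arguments with a single XML file. Invoking a deployment then looks roughly like this (file names are placeholders; shown via Python’s subprocess purely for illustration):

    import subprocess

    # The .publish.xml profile bundles the target connection string and
    # deployment options that would otherwise be individual arguments.
    subprocess.run([
        "sqlpackage", "/Action:Publish",
        "/SourceFile:MyDatabase.dacpac",
        "/Profile:MyDatabase.prod.publish.xml",
    ], check=True)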


Spatial Fragmentation Viewer

Slava Murygin writes a spatial query which shows database fragmentation:

As you can see, I definitely have a lot of free space, but my data are so spread across the file, and especially up to its border, that there is no way to make the file size smaller.

If we zoom in at the very tail, we can figure out the names of the tables at the very end of the file, which prevent the file from shrinking:

This looks quite a bit like the old Windows 95 defrag tool. I like it.
