
Category: Power BI

Replacing Error Values on All Columns in Power BI

Kristyna Hughes needs to clear out some errors:

End users that are not trained in data governance but are actively involved in maintaining a data set can easily make data entry mistakes or create inconsistent data types within columns. For example, you may have a column in the dataset called “Sales” and instead of 0, someone may type “None” or “NA”. When this gets loaded into Power BI, Power BI will not know how to convert the text value “None” to a number, and it will throw an error on the refresh of the report.

One way to mitigate the impact of user-entered data is to replace errors with null values. This is not ideal since it doesn’t fix the data entry issues, but it does enable reports to still be refreshed and used while the data issues are addressed. In Power Query, you can manually replace the errors with null by going to the “Transform” tab then selecting the drop down for “Replace Values” and choosing “Replace Errors”.

Click through for a script which does this for all columns in Power Query.
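
If you just want the flavor of the technique before clicking through, here is a rough sketch of the same idea in Python and pandas rather than Power Query M (the column names and data are made up): anything that cannot be parsed as a number becomes null instead of breaking the load.

```python
import pandas as pd

# Toy data: someone typed "None" into what should be a numeric Sales column.
df = pd.DataFrame({
    "Region": ["East", "West", "North"],
    "Sales": ["100", "None", "250"],
})

# errors="coerce" turns anything that can't be parsed as a number into
# null (NaN) instead of raising -- the same spirit as replacing errors
# with null in Power Query so the refresh can still complete.
df["Sales"] = pd.to_numeric(df["Sales"], errors="coerce")
print(df)
```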


The Basics of Slowly Changing Dimensions

Soheil Bakhshi explains what slowly changing dimensions are:

Another example is when a customer’s address changes in a sales system. Again, the customer is the same, but their address is now different. From a data warehousing standpoint, we have different options to deal with the data depending on the business requirements, leading us to different types of SCDs. It is crucial to note that the data changes in the transactional source systems (in our examples, the HR system or a sales system). We move and transform the data from the transactional systems via extract, transform, and load (ETL) processes and land it in a data warehouse, where the SCD concept kicks in. SCD is about how changes in the source systems are reflected in the data warehouse. These kinds of changes in the source system do not happen very often, hence the term slowly changing. Many SCD types have been developed over the years, which is out of the scope of this post, but for your reference, we cover the first three types as follows.

Click through for depictions of the first three types as well as implementation details and pains.
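
To make the idea concrete, here is a small hypothetical sketch of the Type 2 pattern in Python and pandas — expire the current row and insert a new one — with invented column names and dates; the post itself covers the types conceptually rather than in code.

```python
from datetime import date
import pandas as pd

# A tiny Type 2 customer dimension: every row carries a validity window.
dim_customer = pd.DataFrame([
    {"CustomerKey": 1, "CustomerID": "C001", "Address": "1 Old Street",
     "ValidFrom": date(2020, 1, 1), "ValidTo": None, "IsCurrent": True},
])

def scd2_address_change(dim, customer_id, new_address, change_date):
    """Expire the current row and add a new one (SCD Type 2)."""
    current = (dim["CustomerID"] == customer_id) & dim["IsCurrent"]
    dim.loc[current, "ValidTo"] = change_date
    dim.loc[current, "IsCurrent"] = False
    new_row = {
        "CustomerKey": int(dim["CustomerKey"].max()) + 1,
        "CustomerID": customer_id,
        "Address": new_address,
        "ValidFrom": change_date,
        "ValidTo": None,
        "IsCurrent": True,
    }
    return pd.concat([dim, pd.DataFrame([new_row])], ignore_index=True)

# The customer moves; history is preserved instead of overwritten (Type 1).
dim_customer = scd2_address_change(dim_customer, "C001", "2 New Avenue", date(2024, 6, 1))
print(dim_customer)
```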


Rolling Average and Working Days in DAX

Marco Russo and Alberto Ferrari combine two common business requests:

In a previous article, Rolling 12 Months Average in DAX, we showed you how to compute a rolling average over a time period. In this new article, we want to take you one step further and show how to compute a moving average over a certain timeframe that takes into account only the working days. We present two variations of the same solution: one that is optimized, relying on a calculated column, and one that – despite being somewhat slower – works without requiring a calculated column. The latter can be useful in case you need to define the formula in a live-connected report, where calculated columns are not an option.

Because the formula needs to account for working versus non-working days, it cannot rely on standard time intelligence functions. Indeed, DAX time intelligence functions have no knowledge about what it means for a day to be either a working day or a rest day. The NETWORKDAYS DAX function would not be very useful in this case, because it would introduce a slow filter to compute the range of dates that includes the number of working days desired.

Read on to see how they solve this one.
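
As a rough illustration of the core idea — not the authors’ DAX — here is a pandas sketch in which working days are numbered sequentially, so that a window of the last N working days becomes a simple range; the Monday-to-Friday rule, the 20-day window, and the sales figures are placeholders.

```python
import pandas as pd

# Toy calendar: a date table with a working-day flag and daily sales.
calendar = pd.DataFrame({"Date": pd.date_range("2024-01-01", "2024-03-31", freq="D")})
calendar["IsWorkingDay"] = calendar["Date"].dt.dayofweek < 5  # crude Mon-Fri rule
calendar["Sales"] = 100.0  # placeholder amounts

# The trick the article optimizes with a calculated column: number the
# working days sequentially, so "the last 20 working days" is a range over
# that number rather than a date-arithmetic problem.
calendar["WorkingDayNumber"] = calendar["IsWorkingDay"].cumsum().where(calendar["IsWorkingDay"])

# Average over the last 20 working days, ignoring rest days entirely.
working = calendar[calendar["IsWorkingDay"]].copy()
working["RollingAvg20WD"] = working["Sales"].rolling(window=20, min_periods=1).mean()
print(working.tail())
```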


Power BI Field Parameters and Measures

Roland Szirmai has fun with field parameters in Power BI:

Meaning that report users can switch between “dimensions” of the data. This is great and already provides a much better UI and UX, but there was no information about the limitations on what “fields” you can add to the parameter table.

To be more specific, I couldn’t find any limitation about adding measures (Explicit Measures) to the Field Parameter.

I think you can see where my mind wandered after that…

Read on for the result of Roland’s wanderings.


The Importance of a Proper Datamart / Data Warehouse

Teo Lachev explains why you want a datamart (or a data warehouse) for BI solutions:

I sent a proposal for implementing a classic BI solution: Azure SQL-based datamart (not Power BI datamart please), ETL, semantic model, and reports. The client had sticker shock. Return to sender … as other BI companies that quoted can do it for half! Upon digging, it turned out the other companies would build the semantic model (aka Power BI dataset) directly on top of the data source. On a T&M basis, of course, what else? By contrast, I give fixed-price milestone-driven proposals and I don’t get paid unless I deliver and meet written and agreed upon success criteria, but that’s a different story.

So, let me count the ways, as the poet would say. It’s certainly technically possible to slap a dataset on top of the data source(s). That’s what self-service BI is all about, right … until it doesn’t serve anymore

Read on for more detail.


Quality Checks for Power BI Visuals

Meagan Longoria has a checklist:

For more formal enterprise Power BI development, many people have a checklist to ensure data acquisition and data modeling quality and performance. Fewer people have a checklist for their data visualization. I’d like to offer some ideas for quality checks on the visual design of your Power BI report. I’ll update this list as I get feedback or new ideas.

Read on for the list, as it’s a good one. For the most part, these also apply to visuals created in other tools.


Power Query’s Equivalent of IN

Gilbert Quevauvilliers is making a list:

In the example I did not want to specify all the country names one row at a time. This not only takes a long time, but if I then had to make updates, it could be painful too.

The requirement was for certain countries to keep their names and the rest to be grouped into “Other Countries”.

Read on to see how Gilbert was able to combine the set of “Other Countries” together.
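
As a quick analogue of the idea in Python and pandas (the country list and data here are invented), isin plays the role of an IN list, and anything not on the list falls into the “Other Countries” bucket.

```python
import pandas as pd

sales = pd.DataFrame({
    "Country": ["Australia", "New Zealand", "France", "Brazil", "Australia"],
    "Amount": [100, 80, 60, 40, 120],
})

# Countries that keep their own names; everything else is lumped together,
# which is the "IN"-style behavior the post builds in Power Query.
named_countries = ["Australia", "New Zealand"]

sales["CountryGroup"] = sales["Country"].where(
    sales["Country"].isin(named_countries), "Other Countries"
)
print(sales)
```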


Gateways and the CPU Cost of Power BI Dataset Refresh

Chris Webb continues experimenting:

After last week’s post on measuring Power Query CPU usage during dataset refresh, someone asked an obvious question that I should have addressed: does using a gateway change anything? After all, if you’re using a gateway to connect to an on-premises data source then all the Power Query queries transforming the data from that source will be executed on the gateway machine and not in the Power BI Service.

Let’s do a quick test to find out. 

Read on to see what Chris found out.


Uploading Multiple Reports to Power BI

Jon Fletcher doesn’t have time to upload reports one by one with the UI:

In this blog post, I will be sharing a PowerShell script that allows multiple Power BI reports to be uploaded at once. In a previous blog post, I shared a PowerShell script that allowed users to download multiple Power BI reports. Combined, you could move several reports from one workspace to another in a few seconds.

The script is downloadable at the bottom of the page as a txt file. To use the script there are three steps to take.

Click through to see how it all works.
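
Jon’s script uses PowerShell; if you would rather hit the REST API directly, a rough Python sketch against the Imports endpoint might look like the following — the token handling, folder path, and conflict option are placeholders, so check the Imports API documentation before relying on it.

```python
from pathlib import Path
import requests

# Assumptions: you already have an Azure AD access token with rights to the
# target workspace, plus a local folder of .pbix files; all values below are
# placeholders.
ACCESS_TOKEN = "<access token>"
WORKSPACE_ID = "<workspace id>"
REPORT_FOLDER = Path("./pbix_to_upload")

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for pbix in REPORT_FOLDER.glob("*.pbix"):
    # Post each file to the Imports endpoint as multipart form data,
    # naming the dataset after the file.
    with pbix.open("rb") as f:
        response = requests.post(
            f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports",
            headers=headers,
            params={"datasetDisplayName": pbix.stem,
                    "nameConflict": "CreateOrOverwrite"},
            files={"file": f},
        )
    response.raise_for_status()
    print(f"Uploaded {pbix.name}")
```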


Refreshing Power BI Usage Metrics Report Credentials

Stephanie Bruno fixes a credentials issue:

Have you ever had a Usage Metrics Report just stop refreshing? At the bottom you’ll see a little message that tells you to check the credentials. But you don’t have access to the Usage Metrics Report dataset in the list of datasets in the workspace because it’s hidden! So even if you wanted to delete it and nicely ask it to start again, you can’t do it from the workspace. What can you do? Well, you can delete the Usage Metrics Report dataset with the Power BI Rest API.

Read on to see how.
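
As a very rough sketch of the same idea from Python (the token, workspace ID, and the “Usage Metrics” name filter are assumptions — confirm what the hidden dataset is actually called in your workspace before deleting anything):

```python
import requests

# Assumptions: a valid access token and workspace ID. The premise of the
# post is that the REST API can still reach the dataset even though the
# workspace UI hides it.
ACCESS_TOKEN = "<access token>"
WORKSPACE_ID = "<workspace id>"

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
base = f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"

# List all datasets in the workspace, then delete the usage metrics one
# so it gets recreated with fresh credentials on next use.
datasets = requests.get(f"{base}/datasets", headers=headers).json()["value"]

for ds in datasets:
    if "Usage Metrics" in ds["name"]:  # name filter is an assumption
        resp = requests.delete(f"{base}/datasets/{ds['id']}", headers=headers)
        resp.raise_for_status()
        print(f"Deleted {ds['name']} ({ds['id']})")
```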
