
Category: Power BI

Creating Custom Power BI Themes

Cathrine Wilhelmsen shows us how to create a custom theme for Power BI dashboards:

Power BI comes with several built-in themes and a whole gallery full of custom themes available for download. But what if you still can’t find the perfect look for your reports? No problem! Just create your own custom Power BI themes 🙂

…sounds simple enough, right? It only takes a few minutes to create a custom Power BI theme with a color palette of your choice. Whoosh – instant custom branding!

But if you are like me, simple color changes might not be enough. Maybe you want finer control of borders, fonts, labels, or other visual elements. Or maybe you just don’t want to keep changing the same settings over and over and over again in multiple visualizations and reports. (Please don’t do that.)

You can control all of these things in custom Power BI themes. It is, however, not quite as simple as creating a color palette… yet. (You never know when the Power BI product team will blow your mind with a new update!) But for now, we need to define custom themes in JSON files.

Click through to learn how to do some of these changes through the power of editing JSON files.
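For a sense of what the JSON looks like, here is a minimal sketch of a theme file (the name, colors, and font sizes are arbitrary examples; the "*" entries under visualStyles are wildcards that apply a setting to every visual):

    {
      "name": "Example Custom Theme",
      "dataColors": [ "#01B8AA", "#374649", "#FD625E", "#F2C80F" ],
      "background": "#FFFFFF",
      "foreground": "#252423",
      "tableAccent": "#01B8AA",
      "visualStyles": {
        "*": {
          "*": {
            "title": [ { "fontSize": 12 } ],
            "border": [ { "show": true } ]
          }
        }
      }
    }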


Tracking Errors In Power BI

Reza Rad has a lengthy post covering how you can track errors in Power Query:

To build a robust BI system, you need to cater for errors and handle them carefully. If you build a reporting solution whose refresh fails every time an error occurs, it is not a robust system. Errors can happen for many reasons. In this post, I’ll show you a way to catch potential errors in Power Query and how to build an exception report page to visualize the error rows for further investigation. The method you learn here will save your model from failing at refresh time: you get the dataset updated, and you can catch any rows that caused the error in an exception report page. To learn more about Power BI, read the Power BI from Rookie to Rock Star book.

There’s a lot of work, but also a lot of value in doing that work.
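As a rough sketch of the general pattern (the file, column, and step names here are hypothetical, not Reza's exact code), M's try expression traps errors row by row, so a few bad values don't fail the whole refresh:

    let
        // hypothetical source; any step that can raise row-level errors works the same way
        Source = Csv.Document(File.Contents("C:\Data\Sales.csv"), [Delimiter = ","]),
        Promoted = Table.PromoteHeaders(Source),
        // try returns a record per row: HasError, plus either Value or Error
        Attempted = Table.AddColumn(Promoted, "AmountAttempt", each try Number.FromText([Amount])),
        Flagged = Table.AddColumn(Attempted, "HasError", each [AmountAttempt][HasError], type logical),
        // error rows feed the exception report page; clean rows feed the model
        ErrorRows = Table.SelectRows(Flagged, each [HasError]),
        CleanRows = Table.SelectRows(Flagged, each not [HasError])
    in
        CleanRows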


Using Power BI DMVs

Kasper de Jonge shows off a few Power BI dynamic management views:

Today, a quick one that I came across while writing a different blog post that I will publish later. I know we have talked about it again and again, but a good best practice is to remove any high cardinality columns you don’t necessarily need. This trick is not new and has been blogged about before in different places, but I wanted to emphasize it again because of its importance.

I was finishing up my next blog post and wanted to upload the sample file. While doing that, I noticed the file was 150MB. That is rather large for such a simple file. The largest table has 500,000 rows and none of them are unique. What is going on?

There are some interesting DMVs in Kasper’s post, including one which shows cardinality by column.
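If you want to poke at this yourself, a hedged sketch of that kind of DMV query (run from DAX Studio or SSMS connected to the model) returns per-column dictionary sizes, which is where high-cardinality columns show up:

    // DISCOVER_STORAGE_TABLE_COLUMNS reports per-column storage;
    // DICTIONARY_SIZE is in bytes, so sort on it to find the worst offenders
    SELECT DIMENSION_NAME, COLUMN_ID, DICTIONARY_SIZE
    FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
    WHERE COLUMN_TYPE = 'BASIC_DATA'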


Using Power BI Dataflows To Create Common Data Sets

Alexander Arvidsson shares an interesting use case for Power BI Dataflows:

There are several more use cases for a dataflow, but one that is very useful is the ability to share a dataset between apps. Previously we had to duplicate the dataset to each and every app that needed to use it, increasing the risk that one dataset was ignored, not refreshed properly or otherwise out of sync with reality. By using dataflows we can have several apps rely on the same dataflow (via a dataset), and thus it is quite possible to have a “master dataset”.

Click through for a walkthrough, as well as an understanding of the process’s limitations.
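On the consumption side, a dataflow entity shows up in Power BI Desktop through the dataflows connector. As a loose sketch (all of the workspace, dataflow, and entity names below are made up, and the exact navigation fields can differ between connector versions):

    let
        // navigate: workspace -> dataflow -> entity; names are hypothetical
        Source = PowerBI.Dataflows(null),
        Workspace = Source{[workspaceName = "Shared Data"]}[Data],
        Dataflow = Workspace{[dataflowName = "Common Entities"]}[Data],
        Customer = Dataflow{[entity = "Customer"]}[Data]
    in
        Customer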


The Good And Bad Of Dataflows

Teo Lachev gives us the lowdown on Dataflows in Power BI:

There is a lot to like about dataflows. I can think of two primary self-service scenarios that can benefit from dataflows:

  • Data staging – Many organizations implement operational data stores (ODS) and staging databases before the data is processed and loaded in a data warehouse. As a business user, you can use dataflows for a similar purpose. For example, one of our clients is a large insurance company that uses Microsoft Dynamics 365 for customer relationship management. Various data analysts create data models from the same CRM data, but they find that refreshing the CRM data is time consuming. Instead, they can create a dataflow to stage some CRM entities before importing them in Power BI Desktop. Even better, you could import the staged CRM data into a single dataset or an organizational semantic model, to avoid multiple data copies and duplicated business logic.

  • Certified datasets – One way to improve data quality and promote better self-service BI is to prepare a set of certified common entities, such as Organization, Product, and Vendor. A data steward can be responsible for designing and managing these entities. Once in place, data analysts can import the certified entities in their data models.

Read on for some more positives and negatives.


What Power BI Costs

Eugene Meidinger delineates what is free versus paid with respect to Power BI:

The Power BI Service (think powerbi.com) allows for free users. These free users can create reports and upload them, but with a significant number of limitations. The biggest is that you have only one way of sharing content with others: Publish to Web, which essentially makes your entire report available to the general public.

You also have only one way of privately consuming other people’s reports, and that’s when someone places content in Power BI Premium. Otherwise, other users can’t share their reports directly with you. Power BI Free users are truly an island unto themselves.

One other thing worth noting is that you can’t sign up with a personal email address. David Eldersveld has a good blog post on the issue. As of this writing, the UserVoice request to change this has 2,800 votes.

See here for some more limitations of the free version of Power BI.

Read the whole thing.


Running R Scripts In Power BI’s Query Editor

Brad Lewellyn walks us through the process of executing an R script against a table in Power Query:

If you aren’t able to open the R Script Editor, check out our previous post, Getting Started with R Scripts.  While it’s possible to develop and test code using the built-in R Script Editor, it’s not great.  Unfortunately, there doesn’t seem to be a way to develop this script using an external IDE like RStudio.  So, we typically export files to csv for development in RStudio.  This is obviously not optimal and should be done with caution when data is extremely large or sensitive in some way.  Fortunately, the write.csv() function is pretty easy to use.  You can read more about it here.

It’s not a perfect experience, but Brad does show us how to get it done.
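As a rough sketch of the round trip Brad describes (the path and column name here are invented), the Run R Script step exposes the incoming table as a data frame named dataset, and any data frame left at the end of the script is offered back to Power Query:

    # export a copy for development in RStudio (example path);
    # be cautious with large or sensitive data
    write.csv(dataset, "C:/Temp/dataset.csv", row.names = FALSE)

    # transform and hand back; assumes a numeric Amount column exists
    output <- dataset
    output$AmountScaled <- as.numeric(scale(output$Amount))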


Inactive Relationships In Power BI

Reza Rad explains the value of inactive relationships and shows how you can implement this in Power BI:

As you can see, this new type of relationship is different: it is a dashed line, compared to the active relationship, which was a solid line. This is an inactive relationship. You can have only one active relationship between two tables; any other relationships become inactive.

An inactive relationship doesn’t pass filtering. It doesn’t do anything by itself. I still see many people creating inactive relationships in their models, thinking that the inactive relationship by itself will do some filtering. It doesn’t. If I use the FullDateAlternateKey from the DimDate table to slice and dice the SalesAmount from the FactInternetSales table, which field am I filtering on? The field that is related through an active relationship, of course. Here is the result (which is apparently the same as what you saw in the previous example, because the inactive relationship doesn’t do anything; it is just the active relationship that passes the filter).

Read the whole thing.
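To actually make an inactive relationship do something, you activate it inside a specific calculation with USERELATIONSHIP. Here is a minimal sketch using the AdventureWorks-style names from Reza's example (assuming an inactive relationship exists on the ship date):

    Sales by Ship Date :=
    CALCULATE (
        SUM ( FactInternetSales[SalesAmount] ),
        -- activate the otherwise-inactive relationship for this calculation only
        USERELATIONSHIP ( FactInternetSales[ShipDateKey], DimDate[DateKey] )
    )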


Working With Temporal Line Charts In Power BI

Marco Russo shows off a few things you can do with Power BI to make displaying temporal data in line charts better:

The first line is related to the week ending on February 2nd, so Sales Amount includes only 2 days (February 1st and 2nd), excluding the amount of the other 5 days in the same week (January 27th to 31st). The same happens in the last week, which includes June 29th and 30th but does not include sales for the remaining 5 days in the same week (July 1st to 5th). This also explains why the report includes a week ending in July 2018 even though the Month slicer only includes dates up to June 2018.

We can create a measure that removes incomplete weeks from the calculation, as shown in the following code. A similar technique could be used for incomplete months and quarters.

There are some interesting techniques that Marco shows off, including hiding incomplete weeks.
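Marco's article has the real measure; as a hedged sketch of the idea (assuming a 'Date' table with a Week Ending column and an existing [Sales Amount] base measure, both of which are stand-in names), the pattern is to blank out any week whose days do not all fall inside the current selection:

    Sales Amount (complete weeks) :=
    VAR DaysShown = COUNTROWS ( 'Date' )        -- days of this week visible in the selection
    VAR DaysInWeek =
        CALCULATE (
            COUNTROWS ( 'Date' ),
            ALL ( 'Date' ),                     -- lift the slicer filter...
            VALUES ( 'Date'[Week Ending] )      -- ...but keep the current week
        )
    RETURN
        IF ( DaysShown = DaysInWeek, [Sales Amount] )   -- blank for partial weeks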


Optimizing M Function Calls With Function.ScalarVector()

Chris Webb shows us how we can batch calls to M-driven web services:

One of the most common issues faced when calling web services in M is that the easiest way of doing so – creating a function that calls the web service, then calling the function once per row in a table using the Invoke Custom Function button – is very inefficient. It’s a much better idea to batch up calls to a web service, if the web service supports this, but doing this forces you to write more complex M code. It’s a problem I wrestled with last year in my custom connector for the Cognitive Services API, and in that case I opted to create functions that can be passed lists instead (see here for more information on how functions with parameters of type list work); I’m sure the developers working on the new AI features in dataflows had to deal with the same problem. This problem is not limited to web services either: calculations such as running totals also need to be optimised in the same way if they are to perform well in M. The good news is that there is a new M function Function.ScalarVector() that allows you to create functions that combine the ease-of-use of row-by-row requests with the efficiency of batch requests.

As Chris notes in the post and in the comments, this is mostly useful when you can batch together individual calls to improve performance.  For functions which operate serially (like opening Excel workbooks), you won’t see much (if any) gain.
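Chris's post has real web-service examples; as a toy sketch of the mechanics (the addition "service" below is made up, standing in for a batched web service call), Function.ScalarVector pairs a scalar-looking function type with a batch implementation that receives all the rows as a single table and returns one result per row:

    let
        // batch implementation: gets a table with one column per scalar
        // parameter (x, y) and returns a list with one result per row
        AddBatch = (input as table) as list =>
            List.Transform(Table.ToRecords(input), each [x] + [y]),
        // callers invoke Add one row at a time, but evaluation is batched
        Add = Function.ScalarVector(
            type function (x as number, y as number) as number,
            AddBatch
        ),
        Source = #table(type table [x = number, y = number], {{1, 2}, {3, 4}}),
        Result = Table.AddColumn(Source, "Sum", each Add([x], [y]), type number)
    in
        Result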
