Press "Enter" to skip to content

Day: May 6, 2020

Running Postman Collections via Newman and Jenkins

Ankur Thakur takes us through configuring Postman automation:

We can write an automation test suite for any service and use it for regression testing. But we can also integrate our automation test scripts, using Newman, into a Jenkins CI/CD pipeline, which can be triggered automatically whenever a pull request gets merged.

We need the following software installed locally:

– Jenkins
– Node
– NPM
– Postman (For writing the collection)

We’ve done this in our environment and I’m reasonably happy with the results. If you have a REST API, it’s a fair sight easier to work with than something like SpecFlow.
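As a rough sketch of what the CI step might look like (collection and file names here are placeholders, not from Ankur’s post), a small Python wrapper around the Newman CLI can produce a JUnit report for Jenkins to publish, assuming Newman was installed globally via npm install -g newman:

import subprocess

# Run the collection; file names below are placeholders.
result = subprocess.run(
    [
        "newman", "run", "my_collection.json",
        "--environment", "my_environment.json",
        "--reporters", "cli,junit",                # JUnit XML for Jenkins
        "--reporter-junit-export", "results.xml",
    ],
    check=False,
)
# Newman exits non-zero when any assertion fails, which in turn
# fails the Jenkins build step.
raise SystemExit(result.returncode)

Jenkins’ JUnit plugin can then pick up results.xml and track test results across builds.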


Dynamic File Pruning on Delta Lake

Ali Afroozeh, et al., take us through Dynamic File Pruning in Databricks Runtime 6.1:

In addition to eliminating data at partition granularity, Delta Lake on Databricks dynamically skips unnecessary files when possible. This can be achieved because Delta Lake automatically collects metadata about the data files it manages, and so data can be skipped without accessing the data files themselves. Prior to Dynamic File Pruning, file pruning only took place when queries contained a literal value in the predicate, but now this works for both literal filters and join filters. This means that Dynamic File Pruning now allows star schema queries to take advantage of data skipping at file granularity.

There are some interesting performance results here. I’d also be curious to see how robust the results are as queries get more complicated.
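To picture the query shape involved (table and column names below are invented, and this assumes a Databricks Runtime environment with Delta tables), a star-schema join whose only filter sits on the dimension table is exactly the case that literal-based pruning could not handle:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The only predicate is a join filter on the dimension table, so static,
# literal-based pruning can't help the fact-table scan; Dynamic File
# Pruning can skip unneeded fact files at run time instead.
result = spark.sql("""
    SELECT f.store_id, SUM(f.amount) AS total_sales
    FROM sales_fact AS f
    JOIN item_dim AS d ON f.item_id = d.item_id
    WHERE d.category = 'Toys'
    GROUP BY f.store_id
""")
result.show()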


Settings for Tuning Backup Performance

Stuart Moore takes us through backup tuning options in dbatools:

Backup-DbaDatabase produces a lot of output, but the one you’re really going to be interested in is Duration. Your job is going to be to minimise this as much as possible without impacting database performance.

To test changes in isolation you’ll want to remove the biggest variables in backup performance: network and storage speed. To do this, there’s a magical backup device with near-infinite speed and storage you can use. This is the black hole known as the bit bucket, /dev/null or NUL:; everything thrown into this pit disappears as fast as it’s put in.

Read the whole thing. And if you’re interested in a systematic method of testing and understanding the impact of these settings on your system, I have a presentation on the topic.
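If you’d rather drive the timing loop from Python than PowerShell (a sketch of my own, not Stuart’s method; the instance, database, and parameter grid are placeholders), the same NUL-device trick works from plain T-SQL via pyodbc:

import itertools
import time
import pyodbc

# BACKUP cannot run inside a transaction, hence autocommit
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,
)

for buffers, transfer in itertools.product([7, 64, 128], [65536, 1048576, 4194304]):
    sql = (
        "BACKUP DATABASE [AdventureWorks] TO DISK = 'NUL' "
        f"WITH COPY_ONLY, BUFFERCOUNT = {buffers}, MAXTRANSFERSIZE = {transfer};"
    )
    start = time.perf_counter()
    cursor = conn.cursor()
    cursor.execute(sql)
    while cursor.nextset():  # drain informational messages so the backup completes
        pass
    print(f"BUFFERCOUNT={buffers}, MAXTRANSFERSIZE={transfer}: "
          f"{time.perf_counter() - start:.1f}s")

COPY_ONLY keeps these throwaway test backups from affecting your differential base.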


Optimizing Slow Card Visuals in Power BI

Marco Russo helps us tune Power BI reports containing a large number of card visuals:

Every visual element in a Power BI report must complete a number of tasks to provide the expected result. Visuals showing data must generate one or more DAX queries to retrieve the required measures, applying the correct filters. The execution of these queries increases the waiting time for the end user and increases the workload on the server, especially when multiple users access a published report at the same time. In order to improve the performance and scalability of a report, the best practice is to reduce the number of data-consuming visuals on a page of a report.

The focus is on a single page of the report. Power BI only gets data and builds the visualizations required for the active page of a report. When the user switches the focus to a different page, the waiting time only depends on the visuals of the new page; the content of other pages of the same report is not relevant to performance. The goal is to reduce the number of visuals on a single page of a report. This can be challenging if you want to keep the same report layout, but we can look for the right visualization once we realize that the number of visuals on the same page is negatively affecting the user experience.

Less is more here.


Creating Currency Formatting Strings with Power BI

Gilbert Quevauvilliers walks us through formatting currencies via calculation groups in Power BI and Analysis Services:

When I first started looking at the calculation groups and changing the currency formats, I thought that my existing currency format was correct. Boy was I wrong and once I found that out and corrected it, my Currency Format Strings started working.

As per the Microsoft documentation found here (Dynamic format strings for currency conversion), I had to make sure that my currency format followed the required pattern.

Read on for an example and demonstration.
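The DAX specifics are in Gilbert’s post; purely as a toy illustration of the concept (the patterns below are my own placeholders, not the ones from the Microsoft documentation), the idea is to look up a format string by currency and then apply it:

# Not the DAX calculation group itself -- just the lookup-and-apply idea.
FORMAT_STRINGS = {
    "USD": "${:,.2f}",
    "EUR": "€{:,.2f}",
    "JPY": "¥{:,.0f}",  # yen is conventionally shown without decimal places
}

def format_currency(amount: float, code: str) -> str:
    # Fall back to a plain numeric format for unknown currency codes
    return FORMAT_STRINGS.get(code, "{:,.2f}").format(amount)

print(format_currency(1234.56, "USD"))  # $1,234.56
print(format_currency(1234.56, "JPY"))  # ¥1,235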


Issues with Window Functions and Views / Derived Tables

Paul White takes us through some of the difficulties you are liable to see when using window functions in views and derived tables:

Our expectation is that the execution plan for this new query will be exactly the same as before we created the view. The query optimizer should be able to push the product filter specified in the WHERE clause down into the view, resulting in an index seek.

We need to stop and think a bit at this point, however. The query optimizer can only produce execution plans that are guaranteed to produce the same results as the logical query specification. Is it safe to push our WHERE clause into the view?

Read the whole thing.
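To make the safety question concrete, here is a toy re-enactment in Python (not Paul’s T-SQL): when the window function has no PARTITION BY matching the predicate, filtering before and after the window function produces different results.

rows = [
    {"product": "A", "qty": 5},
    {"product": "A", "qty": 9},
    {"product": "B", "qty": 7},
]

def with_row_numbers(data):
    # Mimics ROW_NUMBER() OVER (ORDER BY qty DESC) with no PARTITION BY
    ranked = sorted(data, key=lambda r: -r["qty"])
    return [{**r, "rn": i + 1} for i, r in enumerate(ranked)]

# The view's semantics: number all rows first, then filter.
after = [r for r in with_row_numbers(rows) if r["product"] == "A"]
# The pushed-down plan: filter first, then number the survivors.
before = with_row_numbers([r for r in rows if r["product"] == "A"])

print(after)   # product A keeps its global row numbers: rn 1 and rn 3
print(before)  # product A is renumbered: rn 1 and rn 2 -- different results

Pushing the predicate below the window function is only safe when it filters on the window’s partitioning columns, which is exactly the question Paul explores.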


Translating Datasets in Power BI Premium

Kasper de Jonge shows how you can translate data in Power BI datasets if you’re using Premium:

One request that comes up a lot is to be able to provide translations for the tables and columns of your model (especially here in Europe). Up until now this was not possible in Power BI, as there was no UI for it. The new XMLA read/write feature changed all of this. As you probably know, in SSAS Tabular this is already possible. The new XMLA endpoint makes any Power BI dataset into an SSAS Tabular model with most of its functionality. So how do you use it in Power BI and add translations? In this blog post I will walk you through it.

Click through for a demo.


Getting Started with MySQL in Azure

Chris Hyde tries out Azure’s MySQL Platform-as-a-Service offering:

I started out by setting up a dedicated resource group to use for my instance, and then used the Azure Portal GUI to create a new instance named mysql-20200505. I made sure to downgrade from the default General Purpose configuration to Basic, so it will only cost me about $67 a month if I leave it running instead of around $350. After the instance was created successfully I then added some connection security rules to ensure that only my IP was able to connect to it.

I then opened up MySQL Workbench to connect to the server as pictured below. Of course it took me two tries to connect, as I made my usual error of not including the machine name in the username field the first time around.

Click through for Chris’s early tests.
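If you’re scripting the same smoke test from Python (connection details below are placeholders patterned after Chris’s naming, not his actual setup), note the user@servername form that tripped him up:

# A connection sketch with placeholder credentials -- not Chris's real setup.
import pymysql

conn = pymysql.connect(
    host="mysql-20200505.mysql.database.azure.com",
    user="myadmin@mysql-20200505",  # Azure requires user@servername
    password="...",                 # placeholder
    database="mysql",
    ssl={"ca": "BaltimoreCyberTrustRoot.crt.pem"},  # Azure enforces SSL by default
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print(cur.fetchone())
finally:
    conn.close()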
