Press "Enter" to skip to content

Curated SQL Posts

Managing Azure DevOps via Azure Logic Apps

Stuart Ainsworth has a process:

A big part of my job these days is looking for opportunities to improve workflow. Automation of software is great, but identifying areas to speed up human processes can be incredibly beneficial to value delivery to customers. Here’s the situation I recently figured out how to do:

1. My SRE team uses a different Azure DevOps project than our development team. This protects the “separation of duties” concept that auditors love, while still letting us transfer items back and forth.
2. The two projects are in the same organization.
3. The two projects use different templates, with different required fields.
4. Our workflow process requires two phases of triage for bugs in the wild: a technical phase (provided by my team), and a business prioritization (provided by our Business Analyst).
5. Moving a card between projects is simple, but there were several manual changes that had to be made:
– Assigning to a Business Analyst (BA)
– Changing the status to Proposed from Active
– Changing the Iteration and Area
– Moving the card.

To automate this, I decided to use Azure Logic Apps.

Read on to see how Stuart did this.
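Under the hood, each of those manual changes is a field update against the Azure DevOps work item REST API, which is also what the Logic App connectors end up calling for you. As a rough sketch only (the organization, PAT, work item ID, and field values below are placeholders rather than Stuart's actual setup), the whole move could be expressed as a single PATCH from PowerShell:

# Rough sketch: org, PAT, work item ID, and field values are all placeholders.
$org = "https://dev.azure.com/YourOrg"
$pat = "your-personal-access-token"
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
}

# One JSON Patch document covers the BA assignment, the state change, and the move:
# rewriting System.TeamProject plus the Area and Iteration paths is what relocates the card.
$patch = @(
    @{ op = "add"; path = "/fields/System.AssignedTo";    value = "ba@yourorg.com" },
    @{ op = "add"; path = "/fields/System.State";         value = "Proposed" },
    @{ op = "add"; path = "/fields/System.TeamProject";   value = "DevProject" },
    @{ op = "add"; path = "/fields/System.AreaPath";      value = "DevProject" },
    @{ op = "add"; path = "/fields/System.IterationPath"; value = "DevProject\Backlog" }
) | ConvertTo-Json

Invoke-RestMethod -Method Patch -Headers $headers `
    -ContentType "application/json-patch+json" -Body $patch `
    -Uri "$org/_apis/wit/workitems/12345?api-version=6.0"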


Cost Management Updates in Azure

Michael Flanakin gives us a few updates on Azure billing:

Understanding your cost patterns over time and investigating specific charges often requires drilling into and selecting specific dates. You’ve always been able to select from one day up to one year in cost analysis, but you’ve told us that selecting those dates isn’t as easy as it could be. As we started building out a new platform for analytics and insights, we took this feedback to heart and completely redesigned the date selection. What you see today is an early peek at that.

This month, you’ll find a new option to select a custom date range in the cost analysis preview. You can pick a single month, a range of months, or start and end dates for a range of days, making it easier than ever to fine-tune your reporting to the dates you need. 

The virtue and downfall of cloud systems like AWS and Azure is that they’re very clear about how much things cost, but only if you know exactly the resources something uses. It’s not as simple as “I want to use a database,” but there are all of those other charges around data egress, networking, log management, etc. which can add up. Many of those costs are negligible (fortunately), but try walking through a pricing scenario for Azure Synapse Analytics sometime with someone new to the product and figure out at what point that person gives up trying to calculate the cost. My money says right around the time you get to the integration runtime costs.


The Equals Sign in PowerShell

Kenneth Fisher avoids overloading:

In SQL Server both the set and equality functions are handled by the equals sign (=). 

[…]

However, in some other languages that’s not how it works. In PowerShell for example the equals sign is always a set operation. 

Click through to learn the difference, as well as what PowerShell uses for equality operations. I think my favorite language for this is Scala, where I jokingly say = means "is equal to," == means "is really equal to," and === means "is equal to and I totally mean it."
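To make the distinction concrete, here's a small illustrative snippet (the variable is made up, nothing here is specific to Kenneth's post): in PowerShell, = always assigns, and the -eq family does the comparing.

# In T-SQL, WHERE Price = 5 compares and SET @Price = 5 assigns; context decides.
# PowerShell splits the two: = is always assignment, -eq is the equality test.
$x = 5            # assignment: $x now holds 5

$x -eq 5          # comparison: True
$x -eq 6          # comparison: False

if ($x = 6) {     # classic bug: this ASSIGNS 6 to $x, and the condition is truthy
    "Oops: `$x was just overwritten and is now $x"
}

# The rest of the comparison family follows the same pattern: -ne, -gt, -ge, -lt, -le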


Quick Insertion into SQL Server from a Spreadsheet

Kevin Wilkie gives a quick way to load data from Excel (or any other spreadsheet):

One of the items I do before creating the table in the database is to review all of the data that is in the spreadsheet to make sure that:

1. I understand the data that is going into the database table.
2. Nothing that is just obviously wrong is trying to be pushed into the database. For example, the data I was talking about earlier that was one column over from what it should have been. If you see data that is all 0’s and 1’s up until a certain row, then you have descriptions or names – you probably have some bad data.

The other important part of pushing the data into the database from a spreadsheet is working with the CONCATENATE function of Excel. Let’s go into that now.

Click through for the process, as well as additional explanation.
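Kevin builds the INSERT statements inside Excel with CONCATENATE; as a loose equivalent, if the sheet has already been saved off as a CSV, a few lines of PowerShell can stamp out the same kind of statements. The file path, table, and column names below are made up purely for illustration:

# Assumption: the sheet has been exported to C:\temp\widgets.csv with columns
# Id, Name, IsActive; the table and column names are illustrative only.
$rows = Import-Csv -Path "C:\temp\widgets.csv"

$inserts = foreach ($r in $rows) {
    # Double up single quotes so values like O'Brien don't break the statement
    $name = $r.Name -replace "'", "''"
    "INSERT INTO dbo.Widgets (Id, Name, IsActive) VALUES ($($r.Id), '$name', $($r.IsActive));"
}

$inserts | Set-Content -Path "C:\temp\widgets_inserts.sql"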


Pre-Loading SSAS Databases into Memory Post-Restart

Nigel Foulkes-Nock explains why that first query after restarting SSAS can be slow:

When the SQL Server Analysis Services (SSAS) Tabular Service is started, it can take a long time before it is ready to be queried. This can cause delays to Service, not to mention confusion.

This blog post will explain what is happening during this time and a method that can be used to improve it. It's worth mentioning that the SSAS Tabular databases this has been used on are quite large (> 100 GB).

Click through for the answer, as well as a technique to warm up those servers so an end user doesn’t wind up being the one to pay for this wait.
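The usual shape of a warm-up is to fire a trivial query at each Tabular database right after the service starts, so the model gets paged into memory before a real user has to pay that cost. Here's a minimal sketch using Invoke-ASCmd from the SqlServer module; the server and database names are placeholders, and this isn't necessarily the exact method Nigel describes:

# Requires the SqlServer PowerShell module (Install-Module SqlServer)
Import-Module SqlServer

$server    = "ssas-tabular-01"                  # placeholder instance name
$databases = @("SalesModel", "FinanceModel")    # placeholder database names

foreach ($db in $databases) {
    # A trivial DAX query is enough to force the model to load into memory
    Invoke-ASCmd -Server $server -Database $db -Query "EVALUATE ROW(""Warmup"", 1)"
}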


sp_QuickieStore in Action

Erik Darling has a new stored procedure. First up, an introduction:

If you’ve ever tried to write a query to find stuff in Query Store and quickly gotten frustrated, you’re not alone.

Whoever designed that schema hates you. It's probably the same person who designed Extended Events.

I know, there’s a GUI. The stuff you can find in there is definitely useful, but I often find myself wishing it did a little bit more. Asking for additional features often feels like porridge-begging, and even if something new did get added eventually, it wouldn’t help you in the moment.

With all that in mind, I wrote sp_QuickieStore to try to take some of that pain away. As of this post, it’s V1. There will likely be changes and fixes and all that good stuff, but right now it’s good enough for an introduction.

Erik then shows off the results:

Under the more concise mode, you get one set of results back that combines metrics from query_store_runtime_stats, along with query text, query plan, and context settings. Some previews below. There’s a lot more in the actual results, but the screen caps would get very repetitive.

This looks really interesting, so go check it out.
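If you want to kick the tires, installation is the same as for any other community stored procedure: run the script from Erik's repository against your server, then execute the procedure against the database whose Query Store you care about. A minimal, hedged call from PowerShell might look like this (server and database names are placeholders; check the procedure's own documentation for its parameters and defaults):

# Assumption: sp_QuickieStore has already been created from Erik's install script
# and is being run with its defaults; server and database names are placeholders.
Import-Module SqlServer

Invoke-Sqlcmd -ServerInstance "sql-prod-01" -Database "StackOverflow" `
    -Query "EXEC dbo.sp_QuickieStore;" -QueryTimeout 300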


DirectQuery on REST APIs

Chris Webb illuminates us:

One of the most common questions I get asked is “How can I use Power BI in DirectQuery mode on top of a REST API?”. This seems like a reasonable thing to do but almost everyone who tries it will fail, and in this post I will explain why.

To answer this question we first of all have to review the two main ways of working with data in Power BI: Import mode and DirectQuery mode. In Import mode data is cached in Power BI’s own internal database and all the DAX queries that are generated by your reports are answered from there. In DirectQuery mode no data is stored inside Power BI; instead, when a report is run and DAX queries are fired off against your dataset, Power BI in turn generates queries against the data source to get the data needed. Most of the data sources that can be used with DirectQuery mode in Power BI are relational databases and so that means Power BI will generate SQL queries to get data from them, but Power BI can also generate queries in other languages too.

Read on for the bad news, although there are some third-party products which can make it work in specific cases.


The Power Automate Custom Visual in Power BI

Imran Burki tries out a new custom visual:

Using the Power Automate Custom Visual in Power BI is the same process as using any custom visual. We’ll use the Defects Dashboard I created from my last blog post as an example. We want to send a Teams message when we notice defects in a plant require the attention of the plant supervisor. After that, we want to create a meeting in Outlook to discuss findings from our dashboard. Previously, there wasn’t a straightforward way to do this directly in Power BI. However, with the Power Automate Custom Visual, we can create flows directly in Power BI without ever having to leave Power BI! Now that’s cool! Let’s get started.

This is really interesting for setting up rules-based alerting.


Storing dbatools as a Package in Azure DevOps

Kevin Chant has a process for us:

In this post I want to cover how you can store the dbatools PowerShell module as a package in Azure DevOps, using the Azure Artifacts service.

I want to share some knowledge about this because I did a demo of it at Malta Data Saturday. By the end of this post you will have a better understanding of Azure Artifacts and a workaround if you encounter a problem publishing a package.

Read on for the process.
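For a bit of context, the heart of that workflow is treating an Azure Artifacts feed as a PowerShell repository: register the feed, publish dbatools into it, and let your pipelines install the module from your own feed instead of the public gallery. A rough sketch follows; the organization and feed names are placeholders, authentication via a personal access token is glossed over, and Kevin's post covers the publishing workaround he needed:

# Placeholders throughout: YourOrg / YourFeed; a personal access token is needed
# for authenticated feeds and is not shown here.
$feedUrl = "https://pkgs.dev.azure.com/YourOrg/_packaging/YourFeed/nuget/v2"

# Register the Azure Artifacts feed as a PowerShell repository
Register-PSRepository -Name "YourFeed" -SourceLocation $feedUrl `
    -PublishLocation $feedUrl -InstallationPolicy Trusted

# Publish a locally installed copy of dbatools into the feed
Publish-Module -Name dbatools -Repository "YourFeed" -NuGetApiKey "<your PAT>"

# Later, build agents can install it from the private feed
Install-Module -Name dbatools -Repository "YourFeed"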
