Day: March 19, 2026

Creating a Power BI Semantic Model Online

Gilbert Quevauvilliers doesn’t need Power BI Desktop:

It has been in the service for quite a while, so I thought I would blog about how you can create a Power BI semantic model using just the web interface. This means you no longer need Power BI Desktop, or even a Windows PC, to get going.

This is quite a significant change: at times you need a lot of resources on your Windows PC, or you're working on a Mac and previously could not do this at all.

So, I will give an overview below of how you can create the semantic model just by using your browser.

Click through to see how.

Materialized Lake Views now GA in Microsoft Fabric

Balaji Sankaran makes an announcement:

Since introducing MLVs (Preview) at Build 2025, data engineers have used them to replace hand-built ETL pipelines with a few declarative Spark SQL statements, and their feedback directly shaped this release.

This update closes the most important gaps since reaching preview and makes MLVs production-ready at scale. With multi-schedule support, broader incremental refresh, PySpark authoring, in-place updates, and stronger data quality controls, teams can now build, run, and evolve medallion pipelines with far less operational overhead.
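As a hedged illustration of the "few declarative Spark SQL statements" the announcement mentions, a materialized lake view definition looks roughly like this (the schema and table names `bronze.sales` and `silver.daily_sales` are hypothetical, and the available options vary by release):

```sql
-- Sketch of a declarative MLV in Fabric Spark SQL; all names are hypothetical.
CREATE MATERIALIZED LAKE VIEW IF NOT EXISTS silver.daily_sales
AS
SELECT CAST(order_ts AS DATE) AS order_date,
       SUM(amount)            AS total_amount
FROM bronze.sales
GROUP BY CAST(order_ts AS DATE);
```

Fabric then keeps the view refreshed on whatever schedule you attach, rather than you maintaining a hand-built pipeline for the same transformation.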

Click through to see what’s changed since the preview.

Configuring a New Windows Device via Chocolatey

Reitse Eskens writes a script:

Last month, my company got me a new laptop. I have very little choice in which one I get, and I have to reinstall a lot of software. So, what to do next?

Click through for an example of such an installation script. Having one or more Chocolatey scripts does help, especially for fairly stable corporate machines. I’ve found it to be moderately useful for home devices, where I tend to cycle through software a bit more frequently. But it does make the process of rebuilding a Windows-based machine less painful.

On the Linux side, this is more built-in. For anything via repo managers (such as apt or yum), you can list your installed packages and turn that into a script. It's a similar story for snap or flatpak packages, though maybe a little less convenient. For AppImage files or deb/rpm files you installed separately from the repo, it's a bit more of a mixed bag.
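The apt workflow above can be sketched as follows. `apt-mark showmanual` is the real Debian/Ubuntu command for listing explicitly installed packages, but its output is simulated here with a fixed list so the pipeline runs anywhere; the package names are placeholders.

```shell
# Simulate `apt-mark showmanual`, which lists explicitly installed packages.
# On a real machine you would instead run:
#   apt-mark showmanual > /tmp/manual-packages.txt
printf 'git\ncurl\nvim\n' > /tmp/manual-packages.txt

# Turn the package list into a one-line reinstall script.
{
  echo '#!/bin/sh'
  printf 'sudo apt-get install -y'
  while read -r pkg; do
    printf ' %s' "$pkg"
  done < /tmp/manual-packages.txt
  echo
} > /tmp/reinstall.sh

cat /tmp/reinstall.sh
```

Carry `reinstall.sh` to the new machine and you have most of a rebuild in one command.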

Idempotence in SQL Scripts

Jared Westover lays out some solid advice:

Imagine you’ve spent weeks preparing a software release for production, only to deploy it late one night and receive an error that the table or column already exists. This occurs in production environments, even when you use migration-based deployment methods such as DbUp. How can you ensure or at least reduce the likelihood of an error like this in the future?

At a prior job, we needed to write idempotent scripts because the deploy-to-QA process would run every script for the sprint each time someone checked in a new one. This prevented a few classes of release error, and I've carried the practice with me to subsequent engagements.

SQL Server 2016 then gave us several helpers, like CREATE OR ALTER for stored procedures and views (as of SP1) and DROP ... IF EXISTS for views and tables. It doesn't cover everything I'd like, but it's a lot more convenient than what we had to do in prior versions.
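Those helpers combine into scripts that can safely run any number of times. A minimal sketch, with hypothetical object names (`dbo.Widget`, `dbo.GetWidget`):

```sql
-- Guard table creation, since CREATE OR ALTER does not apply to tables.
IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE object_id = OBJECT_ID(N'dbo.Widget'))
BEGIN
    CREATE TABLE dbo.Widget
    (
        WidgetID int NOT NULL CONSTRAINT PK_Widget PRIMARY KEY,
        WidgetName nvarchar(100) NOT NULL
    );
END;
GO

-- CREATE OR ALTER is idempotent on its own for procedures, views, and functions.
CREATE OR ALTER PROCEDURE dbo.GetWidget
    @WidgetID int
AS
BEGIN
    SELECT WidgetID, WidgetName
    FROM dbo.Widget
    WHERE WidgetID = @WidgetID;
END;
GO

-- DROP ... IF EXISTS no longer errors when the object is already gone.
DROP TABLE IF EXISTS dbo.Widget_Staging;
```

Run it once or ten times: the end state of the database is the same, which is the whole point.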

Finding Missing Data-Driven Subscriptions after an SSRS Upgrade

Sandra Delany has misplaced some subscriptions:

After migrating SSRS from SQL Server 2016 Enterprise Edition to SSRS on SQL Server 2022 Standard Edition, data-driven subscriptions disappeared from the SSRS web portal. However, I could still see the subscriptions in the ReportServer.dbo.Subscriptions table.

SSRS was migrated from an EC2 instance where a DBA had installed SQL Server, SSRS, and the rest, to an EC2 instance built from a template with all components pre-installed. When this was originally built out, we asked that they test. They said they did some testing, but they did not look at subscriptions in the portal, nor did they create one.

Click through to see how Sandra was able to troubleshoot and resolve the issue, how that led to the next issue, how Sandra resolved that one, and so on. This is what I refer to as an IT shaggy dog story. I don't mean it in a negative sense for Sandra (or any author); it's more along the lines of, "I want to solve problem X, which should take about 5-10 minutes. As I start to solve X, I find I need to solve problem Y first. But as I start on Y, now I need to fix Z. Oh, and then here come problems A, B, and C to make my life a pain. Three days later, I finally got X done." It seems like the life of your average IT professional is one shaggy dog story after another.

Tracking the Last Sequence Value in SQL Server

Greg Low shares some queries and some history:

Sequences allow us to create a schema-bound object that is not associated with any specific table.

For example, if I have a Sales.HotelBookings table, a Sales.FlightBookings table, and a Sales.VehicleBookings table, I might want to have a common BookingID used as the key for each table. If more than the BookingID was involved, you could argue that there is a normalization problem with the tables, but we’ll leave that discussion for another day.

Another reason I like sequences is that they make it much easier to override the auto-generated value, without the need for code like SET IDENTITY_INSERT that we need with IDENTITY columns. This is particularly powerful if you ever need to do this across linked servers, as you’ll quickly find out that it doesn’t work.

Sequences let me avoid these types of issues: they perform identically to IDENTITY columns, and they also give me more control over the cache for available values.
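The override case described above can be sketched like so, reusing the Sales.HotelBookings example from the excerpt (the sequence name, column names, and values are hypothetical):

```sql
-- A sequence feeding a table's key via a default constraint.
CREATE SEQUENCE Sales.BookingID AS bigint START WITH 1 INCREMENT BY 1 CACHE 50;

CREATE TABLE Sales.HotelBookings
(
    BookingID bigint NOT NULL
        CONSTRAINT DF_HotelBookings_BookingID DEFAULT (NEXT VALUE FOR Sales.BookingID)
        CONSTRAINT PK_HotelBookings PRIMARY KEY,
    HotelName nvarchar(100) NOT NULL
);

-- Normal insert: the sequence supplies the key.
INSERT INTO Sales.HotelBookings (HotelName)
VALUES (N'Grand Hotel');

-- Override: just supply the value yourself; no SET IDENTITY_INSERT dance.
INSERT INTO Sales.HotelBookings (BookingID, HotelName)
VALUES (1000000, N'Migrated Hotel');
```

The same default could back Sales.FlightBookings and Sales.VehicleBookings, giving all three tables one shared BookingID space.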

Click through for some queries to find the latest value of a sequence, as well as how this functionality has changed over the years. One thing that I would point out is that, on busy systems, you might find that the value has changed between the time you run this query and the time you use the results.
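One such query reads sys.sequences; the sequence name here (`BookingID`) is a hypothetical stand-in for whatever you have defined:

```sql
-- current_value is the most recently issued value (or the start value if the
-- sequence has never been used). With caching, other sessions may already
-- hold higher values by the time this result comes back.
SELECT SCHEMA_NAME(sq.schema_id) AS SchemaName,
       sq.name                   AS SequenceName,
       sq.current_value,
       sq.cache_size
FROM sys.sequences AS sq
WHERE sq.name = N'BookingID';
```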
