Press "Enter" to skip to content

Category: Error Handling

Temporal Tables and Azure DevOps Deployments

Rayis Imayev notes a problem with Azure DevOps deployments:

Here is one thing that still doesn’t work well when you try to alter an existing temporal table and run this change through the [SqlAzureDacpacDeployment@1] DevOps task, whether this change is to add a new column or modify existing attributes within the table. Your deployment will fail with the “This deployment may encounter errors during execution because changes to … are blocked by …’s dependency in the target database” error message.

Read on to see what causes this problem and what we can do to work around it.
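As context for the error, here is a minimal T-SQL sketch (table and column names are hypothetical) of the kind of change involved: a system-versioned table whose linked history table is the dependency the dacpac comparison complains about, plus the sort of ALTER that trips it.

```sql
-- Hypothetical system-versioned (temporal) table; the history table is the
-- dependency that the dacpac deployment flags
CREATE TABLE dbo.Customer
(
    CustomerId   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL,
    ValidFrom    datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo      datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));

-- A change like this succeeds when run directly, but can produce the
-- "blocked by ... dependency" warning when pushed through a dacpac deployment
ALTER TABLE dbo.Customer
    ADD MiddleName nvarchar(50) NULL;
```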

Bidirectional Transactional Replication and Server Names

Mousa Janini points out a requirement of bidirectional transactional replication:

The steps to create bi-directional replication are simple, and similar to the steps for configuring transactional replication, with an extra step to enable the @loopback_detection parameter of sp_addsubscription to ensure that changes are only sent to the Subscriber and do not result in the change being sent back to the Publisher.

The most common issue with bi-directional replication is loopback detection not working as expected, which results in data conflicts and primary key violations.

Read on to see what causes this problem and what you can do to solve it.
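For reference, here is a hedged sketch of what enabling loopback detection looks like when creating the subscription on each side; the server, publication, and database names below are made up.

```sql
-- Illustrative only: push subscription created at each publisher, with
-- loopback detection on so changes don't echo back to their origin
EXEC sp_addsubscription
    @publication        = N'BiDirPublication',
    @subscriber         = N'OtherServer',
    @destination_db     = N'SalesDb',
    @subscription_type  = N'push',
    @sync_type          = N'replication support only',
    @loopback_detection = N'true';
```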

Reasons Azure SQL Databases Cannot Move to Serverless

Ahmed Mahmoud troubleshoots an Azure SQL Database migration issue:

We sometimes see that customers cannot move their SQL database from the provisioned compute tier to serverless, with the scaling operation failing with an error signature like:

Failed to scale from General Purpose: Gen5, 2 vCores, 32 GB storage, zone redundant disabled to General Purpose: Serverless, Gen5, 2 vCores, 32 GB storage, zone redundant disabled for database: .
Error code: .
Error message: An unexpected error occured while processing the request. Tracking ID: ‘xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx’

Click through for several possible reasons.
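As a point of reference, the scaling request itself can be as simple as an ALTER DATABASE asking for a serverless service objective (the database name and objective below are illustrative); the post is about why that request sometimes fails.

```sql
-- Illustrative only: move an Azure SQL database to the serverless compute tier
ALTER DATABASE [MyDatabase]
    MODIFY (EDITION = 'GeneralPurpose', SERVICE_OBJECTIVE = 'GP_S_Gen5_2');
```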

Flexible File Components with SSIS

Bill Fellows hides SSIS DNA in a can of Barbasol shave cream:

The Azure Feature Pack for SSIS is something I had not worked with before today. I have a client that wants to use the Flexible File Task/Flexible File Source/Flexible File Destination but they were having issues. The Flexible File tools allow you to work with Azure Blob storage. We were dealing with ADLS Gen2 but the feature pack can work with classic blob storage as well. In my hubris, I said no problem, I know SSIS. Dear reader, I did not know as much as I thought I did…

Click through for a whopper of a story. But be sure to read to the very end, as you don’t want to stop at using TLS 1.0.

Operation Requires Server to be a Registered Server

Garland MacNeill finds one way to solve a problem:

Anyway, it’s been a while since I worked on this AG and I need to get the migration/upgrade done. As I was working on configuring jobs, I ran into a problem where the AG node (as a target node) wasn’t downloading jobs from the Master node, in fact, the last poll was in July.

When I tried to force a poll, I was met with an error message that the server wasn't registered, never mind that it was clearly listed as a target server. Google didn't find anything useful, other than some questions from 2013. I did come across the syntax to forcefully eject the server as a target with SQL.

Read on to see how and what to do in the aftermath.
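For reference, the documented T-SQL for ejecting a target server looks roughly like this; the server name is a placeholder, and this is not necessarily the exact sequence Garland used.

```sql
-- Run on the target (TSX) instance to force it to defect from its master server
EXEC msdb.dbo.sp_msx_defect @forced_defection = 1;

-- Or run on the master (MSX) instance to drop the target without contacting it
EXEC msdb.dbo.sp_delete_targetserver
    @server_name    = N'TargetServerName',
    @post_defection = 0;
```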

Az PowerShell Modules and Users

Rayis Imayev troubleshoots an Azure Data Factory deployment:

To run these scripts, you will need to have Azure PowerShell installed on your DevOps agents (Azure Pipeline Agents). If your pipeline agents are Microsoft-hosted, then you’re good and all maintenance and software updates are taken care of for you. However, when you implement and install self-hosted agents, then additional software and component installation is solely your responsibility to maintain.

Recently, while I was configuring those pre- and post-deployment scripts for my Azure Data Factory deployment, I received the following error message, “Could not find the modules: ‘Az.Accounts’ with Version: ””. 

Rayis does have a working solution, but I recommend against installing modules in System32 because that directory is supposed to be reserved for Windows. Instead, for multi-user PowerShell modules, I'd drop them in %ProgramFiles%\WindowsPowerShell\Modules, following the general PowerShell guidance.
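Something along these lines would do it on a self-hosted agent. This is a sketch only: the module name comes from the error message above, and the path is the standard all-users module location.

```powershell
# Save the Az.Accounts module into the all-users module path instead of System32
# (run from an elevated session on the self-hosted agent)
$allUsersModules = Join-Path $env:ProgramFiles 'WindowsPowerShell\Modules'
Save-Module -Name Az.Accounts -Path $allUsersModules -Repository PSGallery
```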

Troubleshooting External Command Timeouts in Power BI

Chris Webb continues a series on troubleshooting timeouts:

In the first post in this series I showed how any Power BI dataset refresh started via the Power BI portal or API is limited to 2 hours in Shared capacity and 5 hours in Premium capacity, and how you could work around that by running a refresh via Premium’s XMLA endpoint feature. In the second post in this series I showed how some M functions allow you to set timeouts. However, even if you initiate a refresh via the XMLA endpoint you may still get a timeout error and in this post I’ll discuss another reason why: the External Command Timeout.

Read on to see what the external command timeout is and when it might strike.
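As an aside, the M-level timeouts from the earlier post in the series look like the sketch below (server and database names are made up); they are separate from the external command timeout this post covers.

```m
let
    // Illustrative only: Sql.Database accepts a CommandTimeout option
    Source = Sql.Database(
        "myserver.database.windows.net",
        "SalesDb",
        [CommandTimeout = #duration(0, 2, 0, 0)]  // two hours
    )
in
    Source
```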

Using the Fail Activity in Azure Data Factory

Rayis Imayev thinks about failure:

Recently, Microsoft introduced a new Fail activity (https://docs.microsoft.com/en-us/azure/data-factory/control-flow-fail-activity) in the Azure Data Factory (ADF) and I wondered about a reason to fail a pipeline in ADF when my internal being tries very hard to make the pipelines successful once and for all. Yes, I understand a documented explanation that this activity can help to “customize both its error message and error code”, but why?

Click through for Rayis’s take. I’ll just be here cracking jokes about how Fail activities are banned in my code because I expect it to have a positive outlook on life.
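For anyone who hasn't seen it, a Fail activity definition is a small thing. The sketch below (activity name, message expression, and parameter are invented) shows the two properties the documentation calls out, the error message and the error code.

```json
{
    "name": "Fail - no rows copied",
    "type": "Fail",
    "typeProperties": {
        "message": "@concat('No rows found for slice ', pipeline().parameters.SliceDate)",
        "errorCode": "500"
    }
}
```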

Clear out Those Old Container Images

Joy George Kunjikkur has a public service announcement for us:

When we use self-hosted Azure pipeline agents, we may encounter the below issue during the build process. This is not a hard issue to troubleshoot. The reason is there in the error message.

Error processing tar file(exit status 1): open /root/.local/share/NuGet/v3-cache/670c1461c29885f9aa22c281d8b7da90845b38e4$ps:_api.nuget.org_v3_index.json/nupkg_system.reflection.metadata.1.4.2.dat: no space left on device

This is known in the industry as a whoopsie-doops. Click through to see what you can do to resolve the problem.
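If you hit the same wall, a couple of commands along these lines are a common way to check and then reclaim the space on the agent; this is a general sketch, not necessarily the exact fix in the post.

```bash
# See what Docker is consuming on the agent
docker system df

# Remove stopped containers, unused images, and build cache
docker system prune --all --force
```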
