Category: Error Handling

Operation Requires Server to be a Registered Server

Garland MacNeill finds one way to solve a problem:

Anyway, it’s been a while since I worked on this AG and I need to get the migration/upgrade done. As I was working on configuring jobs, I ran into a problem where the AG node (as a target node) wasn’t downloading jobs from the Master node, in fact, the last poll was in July.

When I tried to force a poll, I was met with an error message that the server wasn’t registered, never mind it was clearly listed as a target server. Google didn’t find anything useful, other than some questions from 2013. I did come across the syntax to forcefully eject the server as a target with SQL. 

Read on to see how and what to do in the aftermath.
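
For reference, the forced removal Garland describes usually comes down to msdb's multi-server administration procedures. A rough sketch with the SqlServer PowerShell module, where the server names are placeholders: force the target (TSX) side to defect, then clear the stale registration on the master (MSX) side.

# On the target server: force it to defect from its master, even if the master disagrees about the registration.
Invoke-Sqlcmd -ServerInstance 'TargetServer' -Database 'msdb' -Query 'EXEC dbo.sp_msx_defect @forced_defection = 1;'

# On the master server: drop the stale target entry so the server can be re-enlisted cleanly.
Invoke-Sqlcmd -ServerInstance 'MasterServer' -Database 'msdb' -Query 'EXEC dbo.sp_delete_targetserver @server_name = N''TargetServer'', @post_defection = 0;'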

Az PowerShell Modules and Users

Rayis Imayev troubleshoots an Azure Data Factory deployment:

To run these scripts, you will need to have Azure PowerShell installed on your DevOps agents (Azure Pipeline Agents). If your pipeline agents are Microsoft-hosted, then you’re good and all maintenance and software updates are taken care of for you. However, when you implement and install self-hosted agents, then additional software and component installation is solely your responsibility to maintain.

Recently, while I was configuring those pre- and post-deployment scripts for my Azure Data Factory deployment, I received the following error message, “Could not find the modules: ‘Az.Accounts’ with Version: ””. 

Rayis does have a working solution, but I do recommend against installing modules in System32 because that directory is supposed to be reserved for Windows. Instead, for multi-user PowerShell modules, I’d drop them in %ProgramFiles%\WindowsPowerShell\Modules, following the general PowerShell guidance.
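
If you want to follow that guidance on a self-hosted agent, something along these lines should do it; on Windows PowerShell, -Scope AllUsers resolves to %ProgramFiles%\WindowsPowerShell\Modules rather than anything under System32.

# Install the module machine-wide so every pipeline job on the agent can load it.
Install-Module -Name Az.Accounts -Scope AllUsers -Repository PSGallery -Force

# Confirm where the module actually resolved from.
Get-Module -ListAvailable Az.Accounts | Select-Object Name, Version, ModuleBase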

Troubleshooting External Command Timeouts in Power BI

Chris Webb continues a series on troubleshooting timeouts:

In the first post in this series I showed how any Power BI dataset refresh started via the Power BI portal or API is limited to 2 hours in Shared capacity and 5 hours in Premium capacity, and how you could work around that by running a refresh via Premium’s XMLA endpoint feature. In the second post in this series I showed how some M functions allow you to set timeouts. However, even if you initiate a refresh via the XMLA endpoint you may still get a timeout error and in this post I’ll discuss another reason why: the External Command Timeout.

Read on to see what the external command timeout is and when it might strike.
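
For context, the XMLA endpoint workaround Chris mentions boils down to sending a TMSL refresh command to the Premium workspace's endpoint, for example with Invoke-ASCmd from the SqlServer module. The workspace and dataset names below are placeholders, and you'll need to authenticate as an account (or service principal) with access to the workspace.

# TMSL refresh command for the dataset you want to process.
$tmsl = '{ "refresh": { "type": "full", "objects": [ { "database": "MyDataset" } ] } }'

# Send it to the workspace's XMLA endpoint; this route isn't subject to the 2-hour/5-hour portal and API limits Chris describes, though as this post shows, other timeouts can still apply.
Invoke-ASCmd -Server 'powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace' -Query $tmsl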

Using the Fail Activity in Azure Data Factory

Rayis Imayev thinks about failure:

Recently, Microsoft introduced a new Fail activity (https://docs.microsoft.com/en-us/azure/data-factory/control-flow-fail-activity) in the Azure Data Factory (ADF) and I wondered about a reason to fail a pipeline in ADF when my internal being tries very hard to make the pipelines successful once and for all. Yes, I understand a documented explanation that this activity can help to “customize both its error message and error code”, but why?

Click through for Rayis’s take. I’ll just be here cracking jokes about how Fail activities are banned in my code because I expect it to have a positive outlook on life.

Clear out Those Old Container Images

Joy George Kunjikkur has a public service announcement for us:

When we use self-hosted Azure pipeline agents, we may encounter the below issue during the build process. This is not a hard issue to troubleshoot. The reason is there in the error message.

Error processing tar file(exit status 1): open /root/.local/share/NuGet/v3-cache/670c1461c29885f9aa22c281d8b7da90845b38e4$ps:_api.nuget.org_v3_index.json/nupkg_system.reflection.metadata.1.4.2.dat: no space left on device

This is known in the industry as a whoopsie-doops. Click through to see what you can do to resolve the problem.
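
On a self-hosted agent the usual remedy is to reclaim Docker's disk space, roughly like this; note that the prune flags below remove every unused image plus the build cache, so make sure nothing on the agent still needs them.

# Check what is consuming the space before deleting anything.
docker system df

# Remove stopped containers, unused networks, the build cache, and all unreferenced images.
docker system prune --all --force

# Volumes are not touched by the command above; prune them separately if that is safe.
docker volume prune --force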

Non-Yielding IO Completion Ports

Sean Gallardy is here to demystify a concept:

IO Completion Ports are a set of Windows APIs which allow for efficient, fast, multithreaded asynchronous IO. Great, that pretty much tells you nothing.

SQL Server uses IO Completion Ports not for disk-based IO but for general network IO when it comes into SQL Server for TDS level items. This means it’s used for things such as connecting to an instance of SQL Server, sending batch and rpc information, etc., and is used to properly take actions on the incoming items. These actions should be extremely short and quick, the name of the game is low latency and high throughput which means not doing things like reading or writing from disk, allocating memory, calling functions that may block, etc., to keep things flowing.

Read on to see what happens when there is a problem and what might cause that problem.
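
If you suspect you're hitting this, the symptom typically shows up in the SQL Server error log as messages about the IO Completion Listener appearing to be non-yielding. One way to scan for them with the SqlServer PowerShell module, the instance name being a placeholder:

# Search the error log for non-yielding IO Completion Listener messages.
Get-SqlErrorLog -ServerInstance 'MyServer' |
    Where-Object { $_.Text -like '*IO Completion Listener*' -or $_.Text -like '*non-yielding*' } |
    Select-Object Date, Text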

Backing Up Power BI Premium—Couldn’t Connect to Azure

Gilbert Quevauvilliers troubleshoots an error:

What I did learn when working through the blog post is that I ran into some errors when trying to re-connect or trying to connect to the Azure Storage in my Premium App Workspace and it failed.

The errors that I got were, “We couldn’t connect to Azure, but it’s likely temporary. Try again later or see details.”

Read on for the cause and the solution.

Data Source Name Not Found with Postgres Driver

Rayis Imayev troubleshoots a problem:

A very short blog post, just a reminder to myself, but if you have ever tried to connect to a PostgreSQL database using ODBC interface (I know, it already sounds like a very interesting challenge :- ), then you might have experienced this error message: “ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified.”

Read on to see the cause of and solution to this problem.
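
For the impatient: the usual culprit is a Driver name in the connection string that doesn't match what's actually registered, or a 32-bit/64-bit mismatch. A quick way to check from PowerShell, with placeholder connection details:

# List the ODBC drivers registered on this machine (note the Platform column).
Get-OdbcDriver | Select-Object Name, Platform

# Build a DSN-less connection string whose Driver value matches one of the names above exactly.
$connectionString = 'Driver={PostgreSQL Unicode(x64)};Server=myserver;Port=5432;Database=mydb;Uid=myuser;Pwd=mypassword;'
$connection = New-Object System.Data.Odbc.OdbcConnection($connectionString)
$connection.Open()
$connection.State
$connection.Close()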
