Press "Enter" to skip to content

Category: Security

Configuring the Fabric Service Principal to Support Storage APIs

Gilbert Quevauvilliers grants some permissions:

This blog post explains how to configure access for my Service Principal to interact with the Azure Storage API, which I then use to get details for Microsoft Fabric storage.

This is part of a blog post series where I am going to show you how to “View Total Storage consumed in Microsoft Fabric”

When I started this blog post, I realized that I first need to explain how to configure Service Principal authentication and the Azure Storage API permissions, because these steps are required for my notebook to run successfully.

Read on to find out how.


Simplifying Azure Authorization with Managed Identities

Mika Sutinen doesn’t need a password:


Managed Identities in Azure provide an efficient and secure way of managing credentials when accessing databases or other Azure resources. In this post, we build on my recent post about using a Service Principal for database authentication, further exploring how we can leverage Managed Identities to simplify the process.

While Service Principals can make some parts of the access management simpler in Azure, you’re still left with several responsibilities in their management.

These responsibilities include rotating secrets and managing the lifecycle of the Service Principal, to name maybe the two most crucial ones. And if you have multiple applications, each with a separate Service Principal, this can be complex to manage.

Click through to learn more about managed identities and service principals, and how they work to link together Azure resources behind the scenes.
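As a rough idea of what the database side looks like once a managed identity exists, here is a minimal T-SQL sketch; the identity name is a placeholder, not anything from Mika's post.

  -- Run inside the target Azure SQL database.
  -- [my-app-identity] is a hypothetical name; use the identity of your
  -- App Service, VM, or whichever resource is connecting.
  CREATE USER [my-app-identity] FROM EXTERNAL PROVIDER;
  ALTER ROLE db_datareader ADD MEMBER [my-app-identity];
  ALTER ROLE db_datawriter ADD MEMBER [my-app-identity];

From there, the application requests a token from the identity endpoint at runtime instead of presenting a stored secret, which is exactly the part that takes secret rotation off your plate.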


Querying Audit Log (.xel) Files in Azure SQL DB

Tanayankar Chakraborty reads an audit log:

A recent issue was brought to our attention: customers could not query .xel log files in an Azure SQL DB using a T-SQL command. The customers complained that when they ran the command, they received column headers but no content, even though they knew there was content in the logs because they were able to open them in SSMS using Merge Extended Event Files. Here is the T-SQL command the customer used:

SELECT * FROM sys.fn_get_audit_file('https://mydbastorage.blob.core.windows.net/sqldbauditlogs/servername/dbname/SqlDbAuditing_Audit_NoRetention/*.xel', NULL, NULL);

Click through for the solution, which came down to two separate issues.


Permissions and BACPAC Files in Azure SQL DB

Roberto Yonekawa diagnoses an error:

We had a support request where the customer was getting an error when trying to export his Azure SQL Database to a bacpac file using the SqlPackage command-line utility.

Error message:

Microsoft.Data.Tools.Diagnostics.Tracer Error: 19 : 2024-08-21T16:10:56 : Microsoft.SqlServer.Dac.DacServicesException: One or more unsupported elements were found in the schema used as part of a data package.

Error SQL71627: The element Permission has property Permission set to a value that is not supported in Microsoft Azure SQL Database v12.

Click through for the specific issue Roberto found. I’d imagine that there are other permission sets that are incompatible with Azure SQL Database and would cause this error message to pop up as well.
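If you want to survey what is granted before kicking off an export, a rough query like the following (my own sketch, not something from Roberto's post) lists the explicit permission grants in the database so anything Azure SQL Database does not support stands out:

  -- List explicit permission grants in the current database, along with the
  -- grantee, to review before running SqlPackage.
  SELECT pr.name AS grantee,
         pe.class_desc,
         pe.permission_name,
         pe.state_desc
  FROM sys.database_permissions AS pe
  JOIN sys.database_principals AS pr
      ON pe.grantee_principal_id = pr.principal_id
  ORDER BY pr.name, pe.permission_name;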


Data Masking in Snowflake

Kevin Wilkie puts on the mask:

The premise behind data masking is fairly simple – make it so that only certain people can see the data in a certain column. For example, social security numbers or credit card numbers probably shouldn’t be seen by everyone who has access to your database.

I like to set it up as follows: (Obviously, this can change depending on your business’ policies.)

  1. Admins have full access.
  2. Call Center Representatives have access to the last 4 digits of the social.
  3. All others see nothing useful.

Read on to see how. I’ll even save my normal “data masking isn’t really a security feature” rant. This time.
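For a sense of what that tiered setup looks like in practice, here is a rough Snowflake sketch; the role, table, and column names are placeholders rather than Kevin's exact code.

  -- Masking policy for a social security number column.
  CREATE OR REPLACE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
      CASE
          WHEN CURRENT_ROLE() IN ('ADMIN') THEN val                              -- full access
          WHEN CURRENT_ROLE() IN ('CALL_CENTER') THEN 'XXX-XX-' || RIGHT(val, 4) -- last 4 digits
          ELSE '*****'                                                           -- nothing useful
      END;

  -- Attach the policy to the column it protects.
  ALTER TABLE customers MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;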


Constrained Kerberos Delegation with SSRS and Power BI Gateways

Rod Edwards doesn’t want just anyone to double-hop:

Ok, many of you will already be aware that, in order to use Integrated Authentication successfully with SSRS in particular, you have to configure Kerberos Authentication. At a very basic level, this allows the credentials of the user running the report to be passed to the report server (hop 1) and then along to the target of the SSRS datasource (hop 2), also known as “Double hop” authentication. The delegation part of this signifies where the service (PBIG or SSRS) is allowed to pass these credentials along to, either:

  • anywhere, i.e. unconstrained delegation, or
  • to a restricted set of targets, i.e. constrained delegation.

Read on to see how you can set up constrained delegation.
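Once it is configured, a quick sanity check (my addition, not part of Rod's post) is to look at the authentication scheme of the connections arriving at the data source: KERBEROS means the double hop worked, while NTLM means the credentials did not survive the second hop.

  -- Run on the SQL Server that the SSRS or gateway data source points at,
  -- and look for the sessions coming from the report server or gateway host.
  SELECT c.session_id, s.login_name, s.host_name, c.auth_scheme
  FROM sys.dm_exec_connections AS c
  JOIN sys.dm_exec_sessions AS s
      ON c.session_id = s.session_id
  WHERE c.auth_scheme IN ('KERBEROS', 'NTLM');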


Using the Log Analytics Ingestion API with Sentinel

Abhishek Sharan explains how the Log Ingestion API works when feeding data into a Log Analytics workspace:

In this blog, I’m going to delve into ingesting and transforming application logs into a Log Analytics workspace using the Log Ingestion API technique. Before we jump into the details, let’s explore the different types of ingestion techniques based on the application log types.

  • If applications and services log information to text files instead of standard logging services such as Windows Event Log or Syslog, Custom Text Logs can be leveraged to ingest those text file logs into a Log Analytics workspace.
  • If applications and services log information to JSON files instead of standard logging services such as Windows Event Log or Syslog, Custom JSON Logs can be leveraged to ingest those JSON file logs into a Log Analytics workspace.
  • If an application understands how to send data to an API, then you can leverage the Log Ingestion API to send the data to a Log Analytics workspace.

Custom Text/JSON Logs are out of scope for this blog; I might write a separate blog dedicated to these techniques later. Here, my focus will be on streaming data to a Log Analytics workspace using the Log Ingestion API and transforming the data for optimal usage.

Note: This blog aims to demonstrate how to ingest logs using the log ingestion API. To keep things straightforward, I’ll refer to our public documentation.

Click through for details on the process.


An Overview of Differential Privacy

Zachary Amos covers a topic of note:

Data analytics tools allow users to quickly and thoroughly analyze large quantities of material, accelerating important processes. However, individuals must take care to maintain privacy while doing so, especially when working with personally identifiable information (PII).

One possibility is to apply de-identification methods that remove pertinent details. However, evidence has suggested such options are not as effective as once believed. People may still be able to extract enough information from what remains to identify particular parties.

Read on to learn a bit more about the impetus behind differential privacy and a few of the techniques you can use to get there. The real trick with differential privacy is adding the right kind of noise: enough that an end user cannot unearth the information needed to identify a specific individual, but not so much that it distorts the distribution of the data.
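For reference, the standard formal statement behind that trick: a randomized mechanism M is ε-differentially private if, for any two datasets D and D′ that differ in one individual's record, and for any set of possible outputs S,

  Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]

In other words, adding or removing any single person changes the probability of any particular result by at most a factor of e^ε, which is what keeps individuals from being singled out while leaving aggregate patterns largely intact.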


Managing Orphaned Users in SQL Server

Jordan Boich asks for more gruel:

Addressing orphaned users is an important piece of SQL Server security management. If left unchecked, you can accumulate an overwhelming number of users that exist in a database but do not have a corresponding login in the master database, leaving them unable to access the server or the database at all. There are dbatools PowerShell modules out there that can help you accomplish what sp_FindOrphanedUser does. However, there are some situations where PowerShell may not be available to you for a multitude of reasons, and having an extra tool in the toolbelt never hurt anyone, am I right?

Read on to learn about orphaned users and how the sp_FindOrphanedUser procedure works.
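If you just want the raw check that tools like this build on, the classic pattern (a minimal sketch, not the sp_FindOrphanedUser code itself) is to look for database users whose SID has no matching server login:

  -- Run in the user database. Contained database users are excluded because
  -- they are not supposed to have a matching login in the first place.
  SELECT dp.name AS orphaned_user,
         dp.type_desc
  FROM sys.database_principals AS dp
  LEFT JOIN sys.server_principals AS sp
      ON dp.sid = sp.sid
  WHERE sp.sid IS NULL
    AND dp.type IN ('S', 'U', 'G')        -- SQL, Windows, and Windows group users
    AND dp.authentication_type <> 2       -- 2 = contained database user
    AND dp.name NOT IN ('guest', 'INFORMATION_SCHEMA', 'sys', 'dbo');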
