Press "Enter" to skip to content

Category: Notebooks

Checking Key Vault Access in Microsoft Fabric Spark Notebooks

Marc Lelijveld has clearance:

Working with sensitive data in Microsoft Fabric requires careful handling of secrets, especially when collaborating externally. In a recent customer engagement, I needed to validate access to Azure Key Vault from within a Fabric Notebook, without ever exposing the actual secret values. With only read access granted and no need to manage or update secrets, I focused on confirming that the connection was working as expected.

In this blog, I’ll walk you through the approach, including the setup, code snippets, and logic behind this quick but crucial verification step.

Click through for the full story.
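For a sense of what such a check can look like, here is a minimal sketch (not Marc's exact code) using mssparkutils in a Fabric Spark notebook; the vault URI and secret name are placeholders:

    # A minimal sketch (not Marc's exact code): confirm the notebook can read a
    # secret from Azure Key Vault without ever printing the secret's value.
    from notebookutils import mssparkutils

    vault_uri = "https://my-key-vault.vault.azure.net/"   # placeholder vault
    secret_name = "my-secret"                             # placeholder secret

    try:
        secret = mssparkutils.credentials.getSecret(vault_uri, secret_name)
        # Report only that something came back -- never the secret itself.
        print(f"Access OK: retrieved '{secret_name}' ({len(secret)} characters).")
    except Exception as e:
        print(f"Access check failed for '{secret_name}': {e}")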


Writing to Microsoft Fabric Delta Tables in Python via DuckDB

Gilbert Quevauvilliers does a bit of writing:

When I was exploring how to easily write to Delta Tables with a Python notebook, it took me a considerable amount of time to find out how to do this.

These are my learnings below; from my point of view, this makes it easy to write to a Lakehouse table, much like what is done with a PySpark notebook.

Click through for one very important note, as well as the process.
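The general pattern looks something like this hedged sketch (not necessarily Gilbert's exact code), assuming the duckdb and deltalake packages are available in the Python notebook and the Lakehouse paths are illustrative:

    # A rough sketch of the pattern: query with DuckDB, then hand the result to the
    # deltalake writer as an Arrow table. Paths and file names are placeholders.
    import duckdb
    from deltalake import write_deltalake

    arrow_table = duckdb.sql(
        "SELECT * FROM read_csv_auto('/lakehouse/default/Files/sales.csv')"
    ).arrow()

    write_deltalake(
        "/lakehouse/default/Tables/sales",   # Lakehouse Tables area, illustrative path
        arrow_table,
        mode="overwrite",
    )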


Transmitting Printed Data in Notebooks

Marc Lelijveld provides a public service announcement:

When working with Notebooks in Microsoft Fabric, exporting and reusing them across environments or tenants might seem like a harmless, even convenient, task. Whether you’re sharing a template with a colleague, moving assets between workspaces, or contributing to the community — the last thing you’d expect is to accidentally include data along with your code.

But that’s exactly what can happen.

For people who have worked with Jupyter notebooks in the past, this is a fairly obvious result. But if you aren’t familiar with the platform, that idea may seem weird. Marc does provide some options for exporting notebook contents, and you can also clear the cell outputs before exporting.
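If you would rather do that programmatically than by hand, something along these lines (a generic sketch using nbformat, not code from Marc's post) strips every cell's outputs before you share the .ipynb file; jupyter nbconvert --clear-output --inplace should, I believe, do the same from the command line:

    # A generic sketch (not from Marc's post): remove all code cell outputs from a
    # notebook file before sharing it, using the nbformat package.
    import nbformat

    nb = nbformat.read("my_notebook.ipynb", as_version=4)   # placeholder file name
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []
            cell.execution_count = None
    nbformat.write(nb, "my_notebook_clean.ipynb")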


Deploying and Using Custom Python Libraries in Microsoft Fabric

Miles Cole picks up from part one:

This is part 2 of my prior post that continues where I left off. I previously showed how you can use Resource folders in either the Notebook or Environment in Microsoft Fabric to do some pretty agile development of Python modules/libraries.

Now, how exactly can you package up your code to distribute and leverage it across multiple Workspaces or Environment items? How could we accomplish something like the below?

Read on for the answer.
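Miles walks through the full packaging workflow; as a rough sketch of where you end up (package name and paths are placeholders, not his), you build a wheel, put it somewhere your workspaces can reach, and install it at the top of a session:

    # A rough sketch of the end state, with placeholder names and paths.
    # Build the library into a wheel locally, e.g.:
    #   python -m build      # produces dist/my_utils-0.1.0-py3-none-any.whl
    # Upload the wheel to a Lakehouse Files folder (or attach it to an Environment
    # item), then install and import it at the top of the notebook session:

    %pip install /lakehouse/default/Files/wheels/my_utils-0.1.0-py3-none-any.whl

    import my_utils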


Querying a Microsoft Fabric SQL Endpoint from a Notebook

Dennes Torres wants to hit a SQL endpoint:

Let’s analyse why we would like to query an SQL Endpoint. Once we understand why, we can dig into how to make a query to a SQL Endpoint.

We use notebooks to connect directly to a lakehouse. Except for the T-SQL notebook, notebooks have a default lakehouse and work directly with it from Spark. However, accessing other data objects may be more complex.

Specifically, this is a Spark notebook in Microsoft Fabric running Scala rather than a pure Python notebook, and is hitting the data warehouse SQL endpoint.
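Dennes works in Scala, but the shape translates. Here is a hedged PySpark equivalent (server name, database, query, and the token audience are all placeholders or assumptions on my part, not details from his post):

    # A hedged PySpark sketch: grab an Entra token and pass it to the SQL Server
    # JDBC driver to query the warehouse's SQL endpoint. The connection details
    # and the token audience are assumptions, not values from Dennes's post.
    from notebookutils import mssparkutils

    token = mssparkutils.credentials.getToken("https://database.windows.net/")

    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://<endpoint>.datawarehouse.fabric.microsoft.com:1433;database=MyWarehouse")
        .option("query", "SELECT TOP 10 * FROM dbo.SomeTable")
        .option("accessToken", token)
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .load()
    )
    display(df)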


Writing Data into a Microsoft Fabric Lakehouse via Notebook

Stepan Resl writes some code:

Since Lakehouse is one of the key items within Microsoft Fabric, it is important to know how to write data into it in various formats and using different tools. One of the most common tools is notebooks, as they provide great flexibility and speed for development and testing with graphical outputs. In this article, I want to focus primarily on the following types of notebooks:

  • PySpark
  • Python

Click through to see how it works in both notebook types.
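As a quick taste of the difference (a generic sketch, not Stepan's code): the PySpark notebook writes straight to a managed Delta table, while the pure Python notebook has no Spark session and typically leans on the deltalake writer instead.

    # A generic PySpark sketch: build a small DataFrame and write it to the attached
    # Lakehouse as a managed Delta table.
    df = spark.createDataFrame(
        [(1, "alpha"), (2, "beta")],
        schema="id INT, label STRING",
    )
    df.write.format("delta").mode("overwrite").saveAsTable("demo_table")

    # In a pure Python notebook, the deltalake package's write_deltalake (as in the
    # DuckDB example earlier on this page) is a common substitute.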


Retrieving Microsoft Fabric Items using a Python-Only Notebook

Gilbert Quevauvilliers doesn’t need Spark for this:

The blog below explains how to use a Python-only notebook to get all the Fabric items using the Fabric REST API.

NOTE: At the time of this blog post Feb 2025, Dataflow Gen2 is not included in the Fabric items, I am sure it will be there in the future.

NOTE II: This only gets the Fabric Items, which does not include the Power BI Items.

Despite the notes, Gilbert leads off with the main reason why you might want to use this: it takes up approximately 5% of the capacity units that a Spark-based notebook does to perform the same operation.
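Gilbert's notebook does quite a bit more, but the core call looks roughly like this hedged sketch (the workspace ID is a placeholder and the token audience is my assumption):

    # A hedged sketch of the core call: list the items in one workspace via the
    # Fabric REST API from a pure Python notebook. The workspace ID is a placeholder
    # and the token audience is an assumption.
    import requests
    import notebookutils

    token = notebookutils.credentials.getToken("https://api.fabric.microsoft.com")
    workspace_id = "00000000-0000-0000-0000-000000000000"   # placeholder

    resp = requests.get(
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    for item in resp.json().get("value", []):
        print(item["type"], "-", item["displayName"])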


Receiving Notification when a Microsoft Fabric Notebook Fails

Gilbert Quevauvilliers gets an e-mail:

What I have found is that when I create a pipeline in Microsoft Fabric that uses a notebook and there is an error with the notebook, I do not get an alert that the notebook has failed.

This has happened to me in the past and I have found this pattern below to work consistently to notify me of errors.

In this blog post I will show you how I get notified when a notebook fails in a pipeline.

Read on to learn how.
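One generic piece of the puzzle, which may or may not match Gilbert's exact pattern: make sure an error in the notebook actually fails the pipeline activity, so the activity's "On failure" path can fire whatever sends the e-mail (an Office 365 Outlook or Teams activity, for instance), rather than swallowing the exception inside the notebook.

    # A generic sketch, not necessarily Gilbert's pattern: log the error for the run
    # output, then re-raise so the notebook run -- and the pipeline activity -- is
    # marked as failed. run_load() is a hypothetical function doing the real work.
    try:
        run_load()
    except Exception as e:
        print(f"Notebook failed: {e}")
        raise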


Running a Microsoft Fabric Notebook from ADO via Service Principal

Kevin Chant needs a service principal to help:

In this post I want to share one way that you can authenticate as a service principal to run a Microsoft Fabric notebook from Azure DevOps.

Some of you may recall that I previously covered how to run a Microsoft Fabric notebook from Azure DevOps.

I decided to publish a newer version of the aforementioned post to amplify the fact that the REST API that runs a notebook on demand now supports service principals.

Service principals are the way to go for this, so long as you have one Azure-based service communicating with another Azure-based service. No passwords, no API keys, nothing you need to remember or change every 90 days.

The problem is, this works beautifully for assets inside of Azure, but not so much outside of Azure. But that’s a story for a different day.
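For the curious, the moving parts look roughly like this hedged sketch (IDs and tenant details are placeholders, and the endpoint reflects my reading of the on-demand job-run API rather than Kevin's exact script); in Azure DevOps it would run inside a pipeline step with the client secret supplied as a secret variable:

    # A hedged sketch of the service-principal flow. All IDs and secrets below are
    # placeholders; the job-instances endpoint is my reading of the on-demand run
    # API, not necessarily what Kevin's post uses.
    import requests
    from azure.identity import ClientSecretCredential

    credential = ClientSecretCredential(
        tenant_id="<tenant-id>",
        client_id="<service-principal-app-id>",
        client_secret="<service-principal-secret>",
    )
    token = credential.get_token("https://api.fabric.microsoft.com/.default").token

    workspace_id = "<workspace-id>"
    notebook_id = "<notebook-item-id>"

    resp = requests.post(
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
        f"/items/{notebook_id}/jobs/instances?jobType=RunNotebook",
        headers={"Authorization": f"Bearer {token}"},
        json={},
    )
    resp.raise_for_status()
    print("Run accepted; job instance URL:", resp.headers.get("Location"))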


Geospatial Data Exploration in Microsoft Fabric

Sandeep Pawar goes on a journey:

Simon Willison is one of my favorite bloggers. In fact, what I blog, how I blog & test, is inspired by him. He wrote a blog a couple of weeks ago about FourSquare Places data that has been open-sourced. I was exploring this dataset and ended up creating a few maps. I love OrgApps in Fabric and I truly believe as it matures, it will be THE way for analysts & data scientists to provide rich insights + traditional reports to business users. Notebooks can augment the Power BI reports to provide insights that are otherwise not possible. I have submitted a session on this topic to FabCon ‘25, let’s see. If it is selected, I hope to show how transformational it is and how businesses can use it.

Click through for a video and the notebook that Sandeep demonstrated.
