Press "Enter" to skip to content

Category: Notebooks

Move Data between Lakehouses and Workspaces in Microsoft Fabric

Gilbert Quevauvilliers performs an exfiltration:

With the new schemas in a Lakehouse, it is now possible to read from Lakehouse A (in Workspace A) and write to Lakehouse B (in Workspace B).

Here are more details about the Schema preview: Lakehouse schemas (Preview) – Microsoft Fabric | Microsoft Learn

This opens a whole new world of possibilities.

I also really like the fact that I can simply use the names, and I do not need to get the actual GUIDs!

For example, I can reference tables using the four-part format WorkspaceName.LakehouseName.SchemaName.TableName, as shown below.
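To make that concrete, here is a minimal sketch of what the four-part naming looks like in a Fabric Spark notebook, assuming schema-enabled (preview) lakehouses on both sides. All workspace, lakehouse, schema, and table names below are hypothetical placeholders, and `spark` is the SparkSession the Fabric runtime provides:

```python
# Read from Lakehouse A in Workspace A by name; no GUIDs required.
df = spark.sql("SELECT * FROM WorkspaceA.LakehouseA.dbo.SourceTable")

# Write the result to Lakehouse B in Workspace B, again purely by name.
df.write.mode("overwrite").saveAsTable("WorkspaceB.LakehouseB.dbo.TargetTable")
```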

Click through to see it in action.


T-SQL Notebooks in Microsoft Fabric

Dennes Torres tries out T-SQL notebooks:

T-SQL Notebooks is one of the new features announced during FabCon Europe.

A distracted reader could miss the fact that this is a new feature at all. Yes, it is. Notebooks were already capable of supporting Spark SQL, but T-SQL is something new.

The main examples being announced are built with data warehouses, but let me confirm and highlight this:

T-SQL Notebooks support lakehouses as well.

There is at least one limitation: DML is not supported with lakehouses.

Saving my rant about lakehouses vs warehouses in Fabric, do read what Dennes has to say about T-SQL notebooks as they exist today.


Dynamically Running Notebooks across Fabric Lakehouse Environments

Ayman El-Ghazali solves a problem:

A few months ago, an ISV customer approached us with a request to have notebooks run across Microsoft Fabric Lakehouse environments dynamically. The initial request was to allow pipelines in Fabric to pass parameters for file paths to help with data ingestion. This would allow the customer to use the same notebook across Lakehouse environments for the customers they are serving. After resolving this, the scope increased to include notebook execution: the notebooks should be able to run across workspace environments without having to be attached to a Lakehouse at the time of execution. The solution presented below allows the customer to run notebooks across environments, but it also allows them to run SQL queries against existing Lakehouse tables; additionally, it allows access to tables created during the notebook execution run without the notebook being attached to the Lakehouse.
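As a rough sketch of the parameter-passing half of that idea (the full solution is in the linked post), a Fabric notebook can take environment names as parameters and read by OneLake path, so no default lakehouse attachment is needed. All names below are hypothetical:

```python
# Hypothetical parameters cell: tag this cell as a "parameters" cell in Fabric so a
# Data Pipeline's notebook activity can override these values per environment.
workspace_name = "WorkspaceA"
lakehouse_name = "LakehouseA"
table_path = "Tables/SourceTable"

# Build an ABFSS path to OneLake so the notebook needs no attached default lakehouse.
base_path = f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse"

# Read the Delta table directly by path; `spark` is provided by the Fabric runtime.
df = spark.read.format("delta").load(f"{base_path}/{table_path}")
```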

Read on to learn how.


KQLMagic in Fabric Runtime 1.3

Sandeep Pawar spreads the news:

I wrote a blog last year on the usefulness of the KQLMagic command in Fabric notebooks and suggested that it should be part of the default runtime. Well, guess what: it’s now in Fabric Runtime 1.3. No installation is necessary, and authentication is handled automatically.
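As a minimal sketch of what that looks like (run each magic in its own notebook cell; the cluster URI and database name are hypothetical placeholders):

```python
# Cell 1: load the extension; it ships preinstalled in Fabric Runtime 1.3,
# so there is no pip install step.
%load_ext Kqlmagic

# Cell 2: connect to an Eventhouse KQL database. In Runtime 1.3,
# authentication is handled automatically.
%kql azureDataExplorer://code;cluster='https://myeventhouse.kusto.fabric.microsoft.com';database='MyKqlDb'

# Cell 3: run a KQL query against the connected database.
%%kql
MyEvents | take 10
```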

Read on to learn more about how you can use KQLMagic in a Microsoft Fabric notebook to read from an Eventhouse.


Writing Data to an Unattached Lakehouse via Fabric Notebook

Prathy Kamasani does a bit of movement:

Regardless of which architecture we follow, during the stages of data integration and transformation there’s always a step to move data from one location to another. And we work with multiple tables, schemas, and even lakehouses. The same goes for Fabric Notebooks. I often find myself in scenarios where I don’t want to attach a Lakehouse to my notebook, but I do want to read or write data from various lakehouses.

I recently blogged about a way to achieve this as part of documenting your workspaces. In that post, I described how to write data to a workspace that was not attached to the notebook. I used MsSparkUtils (renamed to NotebookUtils) to mount and then write data in the Lakehouse as Delta tables.
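A minimal sketch of that mount-and-write pattern, assuming the NotebookUtils API available in Fabric notebooks; the workspace, lakehouse, and table names are hypothetical:

```python
import notebookutils  # preinstalled in Fabric notebooks (formerly mssparkutils)

# Mount a lakehouse the notebook is not attached to, via its OneLake ABFSS URI.
notebookutils.fs.mount(
    "abfss://WorkspaceB@onelake.dfs.fabric.microsoft.com/LakehouseB.Lakehouse",
    "/lakehouseB",
)
mount_path = notebookutils.fs.getMountPath("/lakehouseB")

# Write a DataFrame as a Delta table under the lakehouse's Tables folder;
# `spark` is the SparkSession the Fabric runtime provides.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.write.format("delta").mode("overwrite").save(f"file://{mount_path}/Tables/TargetTable")
```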

Read on for the answer.


Tips for Orchestrating Fabric Notebooks

Stepan Resl talks orchestration:

Let’s start by introducing what orchestration is and why it’s important to talk about shared resources. Orchestration is a discipline focused on managing and coordinating individual items or control elements to collectively manage the flow of our data operations. In the context of Fabric, this involves managing notebooks, dataflows, pipelines, stored procedures, semantic model updates, and many other items, activities, and services that may even be outside of Fabric.

Read on for some of the options, how they work in Microsoft Fabric, and tips for success.


Exploring Semantic Model Relationships with Sempy

Prathy Kamasani builds a graph:

Understanding the relationships between datasets is crucial in data analytics, especially in the world of self-service BI. Sempy, a Python library unique to Microsoft Fabric, allows users to visualise these relationships seamlessly. This post explores using Sempy to visualise semantic model relationships and view them in a Power BI report. Viewing them in a notebook is easy and has been documented on MS Docs.
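As a quick taste of the notebook side, here is a minimal sketch using Sempy’s built-in helpers; the semantic model name is a hypothetical placeholder, and Prathy’s post goes further by surfacing the output in a Power BI report:

```python
import sempy.fabric as fabric
from sempy.relationships import plot_relationship_metadata

# List the relationships defined in a semantic model as a pandas DataFrame.
relationships = fabric.list_relationships("MySemanticModel")

# Render those relationships as a graph, right in the notebook.
plot_relationship_metadata(relationships)
```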

Click through for a notebook and explanation of the underlying code.


Tracking Microsoft Fabric Notebook Progress

Gilbert Quevauvilliers asks, “are we there yet? are we there yet?”

How to view or track the progress of a Notebook while it is running in Microsoft Fabric

I was recently working with a Notebook in Microsoft Fabric that was started via a Data Pipeline.

The challenge I had was that I had no idea how far the notebook had gone (as there were quite a lot of cells in this particular notebook).

In this blog post I am going to show you how I can use Microsoft Fabric to identify exactly which cell my notebook is currently on.

Click through for the answer. And so help me, if you ask that question one more time, I’m turning this thing around and we’re going back home.


Updating the Default Lakehouse of a Notebook

Sandeep Pawar makes a change:

I have written about the default lakehouse of a Fabric notebook before, here and here. However, unless you used the notebook API, there was no easy or quick way of removing all (or selected) lakehouses or updating the default lakehouse of a notebook. But thanks to a tip from Yi Lin from the Notebooks product team, notebookutils.notebook.updateDefinition has two extra parameters, defaultLakehouse and defaultLakehouseWorkspace, which can be used to update the default lakehouse of a notebook. You can also use it to update the environment attached to a notebook. Below are some scenarios showing how it can be used.
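For instance, a minimal sketch based on the two parameters Sandeep names; the notebook, lakehouse, and workspace names are hypothetical, and any argument beyond the two he cites is an assumption for illustration:

```python
import notebookutils  # preinstalled in Fabric notebooks

# Point an existing notebook at a new default lakehouse in another workspace.
# defaultLakehouse / defaultLakehouseWorkspace are the two new parameters;
# the name argument is assumed here for illustration.
notebookutils.notebook.updateDefinition(
    name="MyNotebook",
    defaultLakehouse="LakehouseB",
    defaultLakehouseWorkspace="WorkspaceB",
)
```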

Click through for those scenarios.
