Emanuele Meazzo has updated the SQL Server Diagnostic Notebook:
Good news everyone!
As always, I’ve updated the SQL Server Diagnostic Notebook to include the latest updates to the source scripts
Click through for the link.
Tomaz Kastrun continues opening doors in the advent calendar:
Yesterday we introduced the Databricks CLI and how to upload a file from “anywhere” to Databricks. Today we will look at how to use Azure Blob Storage for storing files and accessing the data using Azure Databricks notebooks.
Click through to see how.
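As a hedged sketch of the kind of thing the post covers: mounting a blob container into DBFS with `dbutils` from inside a Databricks notebook. The storage account, container, secret scope, mount point, and file names below are illustrative placeholders, not values from the post.

```python
# Runs inside a Databricks notebook, where `dbutils` and `spark` are predefined.
storage_account = "mystorageaccount"   # placeholder
container = "mycontainer"              # placeholder

# Fetch the storage access key from a Databricks secret scope
# rather than hard-coding it in the notebook.
access_key = dbutils.secrets.get(scope="blob-keys", key=storage_account)

# Mount the blob container into DBFS so notebooks can read it like a path.
dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point="/mnt/demo",
    extra_configs={
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net": access_key
    },
)

# Read a CSV from the mounted container with Spark.
df = spark.read.option("header", "true").csv("/mnt/demo/my_data.csv")
```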
Tomaz Kastrun is putting together an Advent of Azure Databricks:
Yesterday we started working towards data import and how to use the drop zone to import data to DBFS. We have also created our first Notebook, and this is where I would like to start today, with a light introduction to notebooks.
Read on for an introduction to notebooks, as well as an example which loads data into the Databricks File System (DBFS).
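For a feel of that first-notebook workflow, here is a minimal sketch of browsing and reading a file that was uploaded to DBFS; the file path is a placeholder, not one from the post.

```python
# Inside a Databricks notebook; `dbutils`, `spark`, and `display` are
# provided by the runtime.

# List what landed in the default upload location (the "drop zone" target).
display(dbutils.fs.ls("/FileStore/tables/"))

# Read an uploaded CSV into a Spark DataFrame; the filename is a placeholder.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/FileStore/tables/my_data.csv"))

df.show(5)
```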
Tomaz Kastrun shows us how to include multiple languages in Azure Data Studio notebooks:
Using multiple languages is a huge advantage when people choose notebooks over standard code files. And with notebooks in Azure Data Studio, you can stay on one kernel without switching and work your code by using the following functions to switch languages.
Read on to learn more.
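The exact switching functions are in Tomaz's post; as general context, Azure Data Studio notebooks are Jupyter-compatible, so on a Python kernel the standard IPython magics give the same per-cell language switching. A sketch of the common ones:

```python
# In a Python-kernel notebook, IPython magics change the language of a
# single cell without changing the kernel:
#
#   %lsmagic    - list all available line and cell magics
#   %%html      - render the cell body as HTML
#   %%bash      - run the cell body as a bash script
#   %%latex     - render the cell body as LaTeX
#
# For example, a cell beginning with %%html:
#
#   %%html
#   <b>This renders as HTML, not Python</b>
```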
Julie Koesmarno shows off the Kusto Query Language magic in Azure Data Studio notebooks:
To do this, you’ll need to ensure that you have Kqlmagic installed. See Install and set up Kqlmagic in a notebook. Then in a notebook, you can load Kqlmagic with
%reload_ext Kqlmagic
in a code cell. The next step, in a new code cell, is to start connecting to a Log Analytics workspace. There are three ways to do so (roughly – as I’m also learning in this space too):
1. Using Azure Active Directory Device Login authentication
2. Using Az CLI login
3. Using Client Secret
Read on for one example using Azure AD authentication.
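As a rough sketch of the device-login path, following the connection-string shape in the Kqlmagic documentation (the workspace ID, alias, and query below are placeholders):

```
# Load the extension once per session
%reload_ext Kqlmagic

# Connect using Azure AD device-login; you will be prompted with a device
# code to enter in the browser. <workspace-id> is the Log Analytics
# workspace GUID.
%kql loganalytics://code;workspace='<workspace-id>';alias='myworkspace'

# Then run KQL directly against the connected workspace
%kql Heartbeat | summarize count() by Computer | take 10
```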
Drew Skwiers-Koballa takes us through creating and deploying Jupyter Books:
The notebook experience in Azure Data Studio allows users to create and share documents containing live code, execution results, and narrative text. Potential usage includes data cleaning and transformation, statistical modeling, troubleshooting guides, data visualization, and machine learning. Jupyter books compile a collection of notebooks into a richer experience with more structure and a table of contents. In Azure Data Studio we are able not only to use Jupyter books but also to create and share them. Learn the basics of notebooks in Azure Data Studio from the documentation and read on to learn how to leverage a GitHub Action to publish and share remote Jupyter books.
Click through for the process of creating, opening, and distributing Jupyter Books.
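The structural piece that turns a folder of notebooks into a book is the table of contents. As a heavily hedged sketch (the layout below assumes the legacy Jupyter Book format that Azure Data Studio consumes; treat the exact file locations and entries as assumptions, and follow Drew's post and the documentation for the authoritative structure):

```yaml
# _data/toc.yml - table of contents for the book; titles and URLs
# are placeholders pointing at notebooks under the content folder.
- title: Introduction
  url: /intro
- title: Troubleshooting
  url: /troubleshooting
  sections:
    - title: Blocking
      url: /troubleshooting/blocking
    - title: High CPU
      url: /troubleshooting/high-cpu
```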
Alan Yu announces the November 2020 release of Azure Data Studio:
Another feature request was to provide support for parameters in a notebook. Parameterization is the ability to execute the same notebook with different parameters.
With this release of Azure Data Studio, users will now be able to utilize Papermill’s ability to parameterize, execute, and store notebooks. By stating the parameters cell as the first code cell in your notebook, it ensures that the injected parameters in the outputted parameterized notebook will be placed directly after the original parameters cell. That way the parameterized notebook will utilize the newly injected parameters instead of the original parameters cell.
Users can utilize Papermill CLI as well as the Python API to pass in a new set of parameters quickly and efficiently as shown below.
That does look interesting.
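For context on what the Papermill flow looks like, here is a minimal sketch using Papermill's Python API (it requires `pip install papermill`; the notebook filenames and parameter names are placeholders):

```python
import papermill as pm

# Execute input.ipynb, injecting new values for the cell tagged
# "parameters", and write the executed copy (with outputs) to output.ipynb.
pm.execute_notebook(
    "input.ipynb",
    "output.ipynb",
    parameters={"max_rows": 1000, "server": "localhost"},
)
```

The equivalent CLI invocation would be along the lines of `papermill input.ipynb output.ipynb -p max_rows 1000 -p server localhost`.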
Unmesha Sreeveni shows how you can create a widget in a Databricks notebook:
In order to get some inputs from the user, we will require widgets in our Azure Databricks notebook.
This blog helps you to create a text-based widget in your Python notebook.
The syntax is rather similar for Scala as well.
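A minimal sketch of the text-widget pattern using the `dbutils.widgets` API (the widget name, default value, and label are placeholders):

```python
# Inside a Databricks notebook (Python); `dbutils` is provided by the runtime.

# Create a text widget: (name, default value, label shown above the input box).
dbutils.widgets.text("table_name", "sales", "Table to query")

# Read the widget's current value and use it in the notebook.
table = dbutils.widgets.get("table_name")
print(f"Querying table: {table}")

# Remove the widget when it is no longer needed.
dbutils.widgets.remove("table_name")
```

In Scala the calls look the same, e.g. `dbutils.widgets.text("table_name", "sales", "Table to query")`.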
David Eldersveld continues a series on Power BI external tools:
Many people use Python with notebooks, so let’s take a look at one possible way to enable a Jupyter external tool for Power BI Desktop. The following stepwise approach begins with simply opening Jupyter. It then progresses to creating and opening a notebook that includes Power BI’s server and database arguments. Finally, it works its way toward downloading a notebook definition contained in a GitHub gist and connects to Power BI’s tabular model to start to make this approach more useful.
This post continues a series of posts related to Python and Power BI. The first three parts of this blog series introduced some possible uses for Python connected to a Power BI model, how to set up a basic Python external tool, and how to both use it with a virtual environment and connect to the Tabular Object Model.
This was a cool usage of Power BI’s external tool functionality and starts to give you an idea of how powerful it can be.
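The glue that makes this work is a `*.pbitool.json` file in Power BI Desktop's External Tools folder, where `"%server%"` and `"%database%"` are tokens Power BI substitutes with the local model's connection details at launch. A hedged sketch (the Python path and launcher script are placeholders, not from the post):

```json
{
  "version": "1.0",
  "name": "Jupyter Notebook",
  "description": "Open a Jupyter notebook connected to the current model",
  "path": "C:\\Python39\\python.exe",
  "arguments": "C:\\tools\\launch_notebook.py \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,..."
}
```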
Aveek Das takes us through the most popular name in notebooks:
In this article, I am going to explain what Jupyter Notebooks are and how to install them on your machine. Further, I will demonstrate how to use these notebooks using Visual Studio Code and perform data analysis and other development activities. It is an open-source platform using which you can create and share documents that contain live code, equations, and visualizations along with formatted text. Most importantly, these notebooks can be run on the web browser by just starting a server and using it. This open-source project is maintained by the team at Project Jupyter.
This is a fairly basic introduction to the topic, good if you have heard about notebooks but don’t know where to begin.
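If you want to try it before reading, getting a local notebook server running is a couple of commands (assuming Python and pip are already installed):

```shell
# Install Jupyter Notebook into the current Python environment
pip install notebook

# Start the notebook server; it opens in the default browser,
# typically at http://localhost:8888
jupyter notebook
```

For the Visual Studio Code route the article describes, the Python and Jupyter extensions let you open or create `.ipynb` files directly in the editor.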