
February 1, 2019

Using AWS Lambda To Get Into Nice Restaurants

Stephane Maarek gives us the best use of AWS Lambda I’ve seen yet:

An attentive eye would have noticed that the booking platform is not hosted on the restaurant website at http://www.septime-charonne.fr/en/ but instead on https://module.lafourchette.com.

Upon using the Chrome Web Developer Tools to analyze the network calls being made between my browser and the booking service, I stumbled upon an easy-to-use and completely unprotected REST API:
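To give you a feel for the technique (none of this is the article's actual code — the endpoint and parameters are invented stand-ins for what you'd read out of the network trace), a scheduled Lambda polling a booking API might look like this:

```python
# A rough sketch: a scheduled Lambda polls the booking API and reports
# whether a table has opened up. The URL and query parameters below are
# hypothetical; the real ones come from watching the network tab.
import json
import urllib.request

AVAILABILITY_URL = "https://module.lafourchette.com/api/availability?restaurant=1234&covers=2"

def lambda_handler(event, context):
    with urllib.request.urlopen(AVAILABILITY_URL) as resp:
        slots = json.loads(resp.read())
    # Hook in a notification (SNS, e-mail, ...) when slots show up.
    return {"slots_found": len(slots) > 0}
```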

I love the bonus hack at the end.


Enabling Cloudera Manager Debug Mode

Guy Shilo has a quick tip around debugging in Cloudera Manager:

This is a short post but it can save you some wandering and searching.

Sometimes when you try to find and fix issues with Cloudera Manager you will want to increase the log level to debug so you can see what’s wrong.

The procedure cannot be found in the documentation (or at least cannot be found easily), so here’s how it’s done:
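Guy has the exact steps. As a general sketch — this is the generic log4j approach, not necessarily the post's precise procedure, and paths and service names come from a typical install, so they may differ by version:

```bash
# Cloudera Manager Server logs through log4j, so the general shape of it
# is raising the root logger to DEBUG and restarting the server.
sudo vi /etc/cloudera-scm-server/log4j.properties
#   change:  log4j.rootLogger=INFO, ...
#   to:      log4j.rootLogger=DEBUG, ...

sudo systemctl restart cloudera-scm-server

# Revert to INFO when you're done -- DEBUG fills the logs quickly.
```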

As you’d expect, going into debug mode generates a lot of data on a real cluster, so use sparingly.


Things To Know About Databricks UAP

Kara Annanie has five things you should know about the Databricks Unified Analytics Platform:

4. A Spark Dataframe is not the same as a Pandas/R Dataframe
Spark Dataframes are specifically designed to use distributed memory to perform operations across a cluster whereas Pandas/R Dataframes can only run on one computer. This means that you need to use a Spark Dataframe to realize the benefits of the cluster when coding in Python or R within Databricks.
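To make that concrete, here's a quick sketch as it might run in a Databricks Python notebook, where spark is the SparkSession the platform provides:

```python
import pandas as pd

# A pandas DataFrame lives in the memory of a single machine.
pdf = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

# A Spark DataFrame is distributed across the cluster's executors;
# `spark` is the SparkSession Databricks provides in every notebook.
sdf = spark.createDataFrame(pdf)

# Operations on sdf (filter, groupBy, agg, ...) execute on the cluster.
total = sdf.groupBy().sum("amount").collect()[0][0]

# toPandas() pulls everything back to the driver -- fine for small
# results, a bad idea for big ones.
small_pdf = sdf.toPandas()
```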

This is a nice set of tips for people getting started with Spark and Databricks.


When A Procedure Has Multiple Plan Cache Entries

Arthur Daniels shows that multi-statement stored procedures can have multiple entries in the plan cache:

So we have two entries for this stored procedure. I included the statement sql handle to show that each statement handle has its own text. Let’s parse that text to see each statement. I copied the parsing SQL from this Plan Cache article.

This is a good thing to keep in mind if you’re trying to figure out how often a procedure gets called: SUM on the execution counts grouped only by text might not give you the results you expect.
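If you want to see this on your own system, a query along these lines against the standard DMVs (the procedure name is a stand-in) returns one row per cached statement rather than one per procedure:

```sql
-- One row per cached statement, not per procedure: multi-statement
-- procedures can show up several times in the plan cache.
SELECT
    OBJECT_NAME(st.objectid, st.dbid) AS procedure_name,
    qs.statement_start_offset,
    qs.statement_end_offset,
    qs.execution_count,
    SUBSTRING(st.text,
              (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset
                    WHEN -1 THEN DATALENGTH(st.text)
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE OBJECT_NAME(st.objectid, st.dbid) = N'YourProcedureName';
```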


Azure Data Studio Code Snippets

Rich Brenner shows us how we can create code snippets in Azure Data Studio:

A great thing about these snippets is that you can add your own and they can be exactly how you want them.

To get started with this, open the Command Palette with Ctrl+Shift+P and type in ‘snippets’.

Scroll down and find the SQL option. Open it and it will bring you to the SQL.json file in which we’ll be storing our SQL Snippets.
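Azure Data Studio uses the same snippet format as Visual Studio Code, so an entry in SQL.json looks something like this (the name, prefix, and body are whatever you like; `${1:...}` marks a tab-stop placeholder):

```json
{
    "Select row count": {
        "prefix": "sqlRowCount",
        "body": [
            "SELECT COUNT(*) AS row_count",
            "FROM ${1:dbo.MyTable};"
        ],
        "description": "Count the rows in a table"
    }
}
```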

I had to migrate a bunch of SSMS snippets to Azure Data Studio and was not that happy with the experience, especially for some of the more complicated snippets.


Digging Into DBCC CHECKIDENT

Solomon Rutzky covers the four primary scenarios when running DBCC CHECKIDENT and specifying a new reseed value:

So, when specifying a “new_reseed_value”, the possible scenarios covered are:
1. Rows exist
2. No rows due to none inserted since the table was created
3. No rows due to TRUNCATE TABLE operation

What’s missing? The following scenario:
No rows due to DELETE operation!!
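If you want to try the scenarios yourself before clicking through, a quick harness like this (my table and values, not Solomon's) shows the difference:

```sql
-- Table name, seed, and values are arbitrary; IDENTITY(1, 1) throughout.
CREATE TABLE dbo.ReseedDemo (id INT IDENTITY(1, 1), val CHAR(1));

-- Scenario 2: no rows ever inserted -> next insert uses the reseed value.
DBCC CHECKIDENT (N'dbo.ReseedDemo', RESEED, 100);
INSERT INTO dbo.ReseedDemo (val) VALUES ('a');  -- gets id 100

-- Scenario 1: rows exist -> next insert uses reseed value + increment.
DBCC CHECKIDENT (N'dbo.ReseedDemo', RESEED, 200);
INSERT INTO dbo.ReseedDemo (val) VALUES ('b');  -- gets id 201

-- The missing scenario: all rows DELETEd -> still reseed + increment.
DELETE FROM dbo.ReseedDemo;
DBCC CHECKIDENT (N'dbo.ReseedDemo', RESEED, 300);
INSERT INTO dbo.ReseedDemo (val) VALUES ('c');  -- gets id 301

-- Scenario 3: TRUNCATE -> back to the reseed value itself.
TRUNCATE TABLE dbo.ReseedDemo;
DBCC CHECKIDENT (N'dbo.ReseedDemo', RESEED, 400);
INSERT INTO dbo.ReseedDemo (val) VALUES ('d');  -- gets id 400
```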

Click through to see how DBCC CHECKIDENT behaves differently depending upon the scenario.


Database Integration Testing With Pester

Ust Oldfield shows us an example of how we can perform database integration testing using Pester:

In a previous post, I gave an overview to integration tests and documenting integration points. In this post, I will give a practical example of developing and performing integration tests with the Pester framework for PowerShell. With a data platform, especially one hosted in Azure, it’s important to test that the Azure resources in your environment have been deployed and configured correctly. After we’ve done this, we can test the integration points on the platform, confident that all the components have been deployed.

The code for performing integration tests is written in PowerShell using the Pester Framework. The tests are run through Azure DevOps pipelines and are designed to test documented integration points. The PowerShell scripts, which contain the mechanism for executing tests, rely upon receiving the actual test definitions from a metadata database.
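For a flavor of what such a test looks like in Pester (the resource names below are invented, not Ust's, and his framework pulls its real test definitions from a metadata database):

```powershell
# A minimal sketch of two integration tests: one for deployment state,
# one for an integration point. Run it with Invoke-Pester.
Describe "Data platform integration points" {

    It "deployed the storage account successfully" {
        $sa = Get-AzStorageAccount -ResourceGroupName "rg-dataplatform" -Name "stdataplatform"
        $sa.ProvisioningState | Should -Be "Succeeded"
    }

    It "can open a connection to the metadata database" {
        $connString = "Server=tcp:myserver.database.windows.net;Database=Metadata;User ID=tester;Password=$($env:SQL_PASSWORD)"
        $conn = New-Object System.Data.SqlClient.SqlConnection($connString)
        { $conn.Open(); $conn.Close() } | Should -Not -Throw
    }
}
```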

Click through for the script.


Orphaned Workspaces In Power BI

David Eldersveld explains what orphaned workspaces are in Power BI:

One of the newer features in the Power BI Admin Portal is the ability to view all of a tenant’s Workspaces. As I was browsing through the collection of workspaces, I noticed several marked as Orphaned. What is an orphaned workspace, and how does it occur?

I was expecting orphaned workspaces to be a new thing where you pay for an Azure service using a distributed blockchain technology called Gruel (or maybe Grool).
