Press "Enter" to skip to content


Using the Cosmos DB Integrated Cache

Hasan Savran makes use of a cache:

We are ready to write some code now. The Integrated Cache works only with Eventual Consistency for now, so we need to send requests with Eventual consistency to test it. To do that, we need to use the requestOptions parameter in the SDK. You can also change your database’s consistency level to Eventual for testing if you like. Don’t forget to change it back later!

Hopefully that limitation changes later, but in the meantime, click through to see how to use the integrated cache in Cosmos DB.
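
For reference, here is a minimal sketch of the idea using the azure-cosmos Python SDK rather than the .NET requestOptions approach the post demonstrates; the account, key, database, and container names are placeholders, and the integrated cache additionally requires connecting through the account’s dedicated gateway endpoint:

```python
# Minimal sketch using the azure-cosmos Python SDK (pip install azure-cosmos).
# Account, key, database, and container names are placeholders; the integrated
# cache is only reachable through the account's dedicated gateway endpoint.
from azure.cosmos import CosmosClient

ENDPOINT = "https://<your-account>.sqlx.cosmos.azure.com/"  # dedicated gateway endpoint
KEY = "<your-key>"

# Use the Eventual consistency the integrated cache currently requires.
client = CosmosClient(ENDPOINT, credential=KEY, consistency_level="Eventual")
container = client.get_database_client("SalesDb").get_container_client("Orders")

# Repeating the same point read or query within the cache staleness window
# should then be served from the integrated cache instead of the backend.
item = container.read_item(item="order-1", partition_key="order-1")
results = list(container.query_items(
    query="SELECT * FROM c WHERE c.customerId = @id",
    parameters=[{"name": "@id", "value": "42"}],
    enable_cross_partition_query=True,
))
```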


Moving Artifacts between Folders in Synapse Studio

Wolfgang Strasser looks at a recent update:

Another small but very powerful usability extension in Azure Synapse Studio was added at the beginning of June: Move artifacts across folders in Synapse Studio (without extra clicks but with drag&drop)

Once again, the release notes list contained the short sentence that made me curious… hmm… that sounds nice… In one of my previous posts, I described the “old” way of moving artifacts around in Synapse Studio.

Click through for a demonstration.


Connecting to Cosmos DB via Dedicated Gateway

Hasan Savran introduces us to the Cosmos DB Dedicated Gateway:

The Cosmos DB team announced a new way to connect to Azure Cosmos DB, named the Dedicated Gateway. As you might know, there is already a standard gateway to connect to Cosmos DB. Dedicated or Standard, a gateway means that there is a computer sitting between the Cosmos DB replica set and your application: your application’s request goes to the gateway server and then on to the Cosmos DB database. The biggest difference between the Standard Gateway and the Dedicated Gateway is that you do not share the dedicated gateway server with other Cosmos DB customers.

The Dedicated Gateway is totally yours and you are responsible for its costs. Depending on your application’s size, you can select different sizes of gateway servers.

Read on to learn how expensive it is and the benefits it brings.
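
To give a feel for what changes on the client side, here is a minimal Python sketch (placeholder names throughout): the dedicated gateway gets its own endpoint, with an sqlx host name, and you simply point the SDK at that endpoint instead of the standard one.

```python
# Sketch: the client-side change is pointing at the dedicated gateway endpoint
# (note the "sqlx" host) instead of the standard account endpoint.
from azure.cosmos import CosmosClient

STANDARD_ENDPOINT = "https://<your-account>.documents.azure.com:443/"
DEDICATED_ENDPOINT = "https://<your-account>.sqlx.cosmos.azure.com/"
KEY = "<your-key>"

# Requests made through this client travel via your provisioned gateway nodes
# rather than the shared (standard) gateway fleet.
client = CosmosClient(DEDICATED_ENDPOINT, credential=KEY)
container = client.get_database_client("SalesDb").get_container_client("Orders")
print(container.read_item(item="order-1", partition_key="order-1"))
```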


Using Ola’s Maintenance Solution on RDS

Jack Vamvas takes us through a couple of nuances around using Ola Hallengren’s SQL Server Maintenance Solution on Amazon RDS:

I’ve used the Ola Hallengren Maintenance Solution across various SQL Server environments. I was recently asked by a colleague how adaptable it is to the AWS RDS SQL Server environment.

I checked the Ola Hallengren FAQ and there is a comment:

Read on to learn the details.


Loading Data into Power BI Premium Per User vs Azure Analysis Services

Gilbert Quevauvilliers continues a series on moving from Azure Analysis Services to Power BI Premium Per User:

I have been working with a customer where I have got data in AAS and in PPU for the same dataset.

What I have found is that the data load takes a very similar amount of time in both.

With one of my customers as an example, the data was being curated in Asia, whilst the business was running things from Australia. Hosting AAS/PPU where the data was curated meant that the data loading was significantly faster. Yes, the reports have to access the data across the ocean, but only the results are sent back, so the performance of the reports was and still is blazingly fast!

Click through for the full story.


Drain Mode in Azure Functions

Rayis Imayev pulls the plug:

As requests to execute Azure Functions increase, the demand for compute resources is met, but only for as long as it is needed (scale-out). As requests fall, any extra resources and application instances drop off automatically (scale-in).

Recently, Microsoft enabled a new drain mode in Azure Functions that allows for a graceful shutdown of the Azure Functions host by completing in-flight invocations and no longer listening for new events from triggering sources.

Read on for the set of steps it performs, as well as the benefit it provides.
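
As a hedged illustration only (the admin routes and header below are my assumptions, not taken from the post), putting a function app into drain mode via the host’s admin API might look roughly like this in Python:

```python
# Hypothetical sketch of enabling drain mode through the Functions host admin
# API; the /admin/host/drain route and the x-functions-key header are assumed
# here, so verify them against the linked post and the Azure documentation.
import requests

APP = "https://<your-function-app>.azurewebsites.net"
MASTER_KEY = "<host-master-key>"  # placeholder; taken from the app's keys
headers = {"x-functions-key": MASTER_KEY}

# Ask the host to stop taking new events and finish in-flight invocations.
resp = requests.post(f"{APP}/admin/host/drain", headers=headers)
resp.raise_for_status()

# Poll the drain status until the host reports it has finished draining.
status = requests.get(f"{APP}/admin/host/drain/status", headers=headers)
print(status.json())  # response shape not shown here; see the post for details
```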


Querying AWS Athena via Powershell

Michael Bourgon needs to get some data out of S3:

I was running into issues with the Linked Server lopping off long JSON that I’m having to pull out from the raw files. I can’t explain it – it doesn’t appear to be SSMS. See previous post.

But I needed to automate this, rather than use SQL Workbench, save to “Excel” (it was XML), then open it again and re-save it so that instead of 250 MB it’s 30 MB. The script runs against the previous month, one day at a time (walking the partitions), and then saves to a file. You’ve got your Athena, your ODBC, your Export-Excel…

Incidentally, that previous post was around trying to use a linked server to pull the data in via SQL Server.
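
The post’s script is PowerShell over the Athena ODBC driver plus Export-Excel; purely as an illustration of the same partition-walking pattern, here is a rough Python/boto3 sketch with made-up database, table, partition column, and S3 locations:

```python
# Sketch of the same idea in Python/boto3 rather than PowerShell + ODBC:
# walk the previous month's daily partitions and dump each day's rows to disk.
# Database, table, partition column, and S3 locations below are made up.
import boto3
import csv
import time
from datetime import date, timedelta

athena = boto3.client("athena")

def run_query(sql):
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "my_database"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    # get_query_results returns at most 1,000 rows per call; a real script
    # would page through with NextToken. The first row holds column headers.
    return athena.get_query_results(QueryExecutionId=qid)

first_of_this_month = date.today().replace(day=1)
day = (first_of_this_month - timedelta(days=1)).replace(day=1)  # start of last month
while day < first_of_this_month:
    results = run_query(f"SELECT * FROM my_table WHERE dt = '{day:%Y-%m-%d}'")
    with open(f"my_table_{day:%Y%m%d}.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for row in results["ResultSet"]["Rows"]:
            writer.writerow([col.get("VarCharValue", "") for col in row["Data"]])
    day += timedelta(days=1)
```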


Integrating Power BI Deployment Pipelines with Azure DevOps

Marc Lelijveld shows how you can combine Power BI deployment pipelines with Azure DevOps:

Looking at the Power BI release plan, dataflow support for Deployment Pipelines is coming up shortly! Currently it is scheduled to reach public preview in June 2021. Versioning and DevOps integration go hand-in-hand, in our opinion. With Azure DevOps Git integration, we can overcome the versioning challenge while integrating with Azure DevOps at the same time, as described in the previous blog in 2019. Today, we release a new version of the DevOps implementation which uses native Power BI functionality. Stay tuned!

As we really like the metadata deployment and the ease of setting up a pipeline in the Power BI Service, Ton and I decided to build an Azure DevOps extension based on the recently released Power BI REST APIs for Deployment Pipelines. Although Microsoft has promised to come up with a native DevOps extension over time, we decided to go for it. Time to bridge the gap!

Read on for more details.
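
For context on what such an extension wraps, here is a hedged Python sketch of calling the Deployment Pipelines REST API directly; the deployAll route and request-body fields reflect my reading of the Power BI REST API reference rather than anything in the post, and the pipeline ID and access token are placeholders:

```python
# Hedged sketch of a "deploy everything to the next stage" call against the
# Power BI Deployment Pipelines REST API. Route and body fields are my reading
# of the API reference; PIPELINE_ID and TOKEN are placeholders.
import requests

TOKEN = "<azure-ad-access-token-for-power-bi>"   # e.g. acquired via MSAL
PIPELINE_ID = "<deployment-pipeline-id>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "sourceStageOrder": 0,  # assumed: 0 deploys Development -> Test
        "options": {
            "allowCreateArtifact": True,
            "allowOverwriteArtifact": True,
        },
    },
)
resp.raise_for_status()
print(resp.status_code)  # the deployment itself completes asynchronously
```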
