Press "Enter" to skip to content

Author: Kevin Feasel

Moving To The Cloud

Denny Cherry explains some of the important indicators that you might benefit from moving to a cloud provider:

The reality is that not all workloads are a right fit for the cloud. If you are running highly sustained workloads, then the cloud probably isn’t the right solution for your environment. The systems which work best in the cloud are the ones which can be converted to use the various PaaS services, not just the IaaS services. The PaaS services will usually provide a more cost-effective hosting solution, especially for workloads which vary over time; for example, ramping up during the day and becoming almost idle overnight.

Even if running in a PaaS environment isn’t an option, it may still be cost-effective to run in an IaaS environment. It all depends on how bursty the workload is that you plan on moving to the cloud.

There are some good points here; check it out.

Comments closed

R-Hub

David Smith discusses a new service to test packages on multiple platforms:

If you’re developing a package for R to share with others — on CRAN, say — you’ll want to make sure it works for others. That means testing it on various platforms (Windows, Mac, Linux, and all the versions thereof), and on various versions of R (current, past, and future). But it’s likely you only have access to one platform, and installing and managing multiple R versions can be a pain.

R-hub, the online package-building service now in public beta, aims to solve this problem by making it easy to build and test your package on a variety of platforms and R versions. Using the rhub R package, you can, with a single command, upload your package to the cloud-based R-hub service and build and test your package on the current, prior, and in-development versions of R, using any or all of these platforms.

This looks like an interesting service for package developers and companies with a broad distribution of R installations.

Comments closed

Handling MDS Configuration Errors

Koen Verbeeck looks at setting up MDS and conquers some configuration file permission issues:

The error seemed quite clear: Cannot read configuration file due to insufficient permissions. Just to be sure, I added the user MDSAppPool – created in the MDS Configuration Manager for the MDS Application Pool – to the Administrators group on the machine. A brute-force solution, but since it’s on my own machine for demo purposes, I didn’t really care. Of course it didn’t work. Then I assigned full control permissions for the MDSAppPool user on the folder C:\Program Files\Microsoft SQL Server\130\Master Data Services. Didn’t work. Used the browser in Administrator mode. Also didn’t work. Checked IIS settings and discovered that Windows Authentication was not enabled. So I enabled it, but the error persisted. This is the point where it all starts to get frustrating. Adding MDSAppPool to the IIS_IUSRS group doesn’t work. Neither does giving that group full control on the MDS directory.

Read on for the solution.

Comments closed

Restoring An Azure SQL Database

I have a post on restoring a database in Azure SQL Database:

You will need to select your restore point as well.  In this case, I decided to restore back to midnight UTC on a particular date.  Note that the dates are UTC rather than your local timezone!

After selecting your restore point, you pick the target server and can decide a couple of things.  First, you can put this database into an elastic database pool, which makes cross-database connections a lot easier.  Second, you can choose a different pricing tier.  Because I only needed this database for a few minutes, keeping it at P2 Premium was fine; the total restore time meant that we spent less than a dollar bringing the data back to its pristine condition.

Be aware of the time for restoration; it can be very slow.
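The post walks through the Azure portal blade, but the same point-in-time restore can be driven programmatically.  Below is a rough Python sketch of the underlying Azure Resource Manager REST call; the subscription, resource group, server and database names, API version, and token are all placeholders, and the property names reflect my reading of the ARM schema rather than anything in the post.  Treat it as an outline of the request shape, not production code.

```python
# Rough sketch of a point-in-time restore through the Azure Resource Manager REST API.
# All identifiers, the API version, and the token below are placeholders.
import requests

subscription = "00000000-0000-0000-0000-000000000000"  # placeholder subscription ID
resource_group = "my-resource-group"
server = "my-sql-server"
source_db = "MyDatabase"
restored_db = "MyDatabase_Restored"

url = (
    f"https://management.azure.com/subscriptions/{subscription}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.Sql"
    f"/servers/{server}/databases/{restored_db}?api-version=2014-04-01"
)

body = {
    "location": "East US",
    "properties": {
        "createMode": "PointInTimeRestore",
        "sourceDatabaseId": (
            f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Sql/servers/{server}/databases/{source_db}"
        ),
        # Restore points are expressed in UTC, just like the portal's picker.
        "restorePointInTime": "2016-11-01T00:00:00Z",
    },
}

token = "<azure-ad-bearer-token>"  # placeholder
resp = requests.put(url, headers={"Authorization": f"Bearer {token}"}, json=body)
print(resp.status_code, resp.json())
```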

Comments closed

SQL Server ODBC For Linux 13.0

Meet Bhagdev reports that a new version of the SQL Server ODBC driver for Linux is available:

What’s new

  • Native Linux Install Experience: The driver can now be installed with apt-get (Ubuntu), yum (RedHat/CentOS), and zypper (SUSE). Instructions on how to do this are below.

  • AlwaysOn Availability Groups (AG): The driver now supports transparent connections to AlwaysOn Availability Groups. The driver quickly discovers the current AlwaysOn topology of your server infrastructure and connects to the current active server transparently.

  • TLS 1.2 support: The driver now supports TLS 1.2 connections to SQL Server.

These are some nice additions.  None of them are groundbreaking, but they add up to a nice release.  Click through for instructions on how to install the driver; it got a lot easier for supported platforms.
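As a rough illustration of what connecting through the new driver looks like, here is a minimal Python sketch using pyodbc.  The server, database, and credentials are placeholders, and I’m assuming the Linux driver accepts the usual MultiSubnetFailover keyword for Availability Group listeners, in line with the announcement; check the official documentation for your exact configuration.

```python
# Minimal sketch: connect to SQL Server from Linux via the ODBC driver and pyodbc.
# Server, database, and credentials below are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=tcp:my-ag-listener.example.com,1433;"  # AG listener name (placeholder)
    "DATABASE=MyDatabase;"
    "UID=my_user;PWD=my_password;"
    "MultiSubnetFailover=Yes;"  # assuming the Linux driver honors this AG keyword
    "Encrypt=Yes;TrustServerCertificate=No;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT @@VERSION;")
print(cursor.fetchone()[0])
conn.close()
```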

Comments closed

Power BI Integrates With SSRS

Paul Turley discusses a brand new announcement:

What am I most excited about as I prepare for the PASS Summit here in Seattle this week?  A lot of things.  In preparing for my session, which will be on Thursday at 1:30, I’ve found that by far the most popular and interesting topics are integration and tool choice.  Today’s public announcement on the SSRS product team blog about on-premises Power BI integration with Reporting Services is really big news.  It’s great to see two of the technologies I love working together.  Whether in the cloud or on-premises, Power BI and Reporting Services can be used together.

It’s hard to overstate how huge this is.  Plenty of companies want the reporting that Power BI offers, but have security or software policies in place which prevent Power BI adoption.  Having it render through Reporting Services means that end users don’t need to have Power BI Desktop and that the data and reports remain entirely on-prem.

Comments closed

A Better Get-SQLErrorLog

Drew Furgiuele steps up with an improved version of Get-SQLErrorLog:

It doesn’t have to be so bad, though, because we can make it better. In my mind, my perfect error log cmdlet should:

  1. Be usable when a SQL Server instance is down, and
  2. Be relatively quick, and
  3. Parse error messages by number, severity, and state, and
  4. Incorporate date/time range filtering, and
  5. Return an object

What’s that, you say? You want to solve each of these and write our own, better cmdlet? I got you, fam. At the bottom of this blog post you’ll find my code, and you can skip ahead if you don’t care about the “hows” or “whys” of what you’ll be looking at.

Drew did very nice work in putting this together.
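Drew’s cmdlet is written in PowerShell, but to make the requirements above concrete, here is a rough Python sketch of the same idea: read an ERRORLOG file straight from disk (so it works even when the instance is down), extract error number, severity, and state, and filter by a date range.  The file path, encoding, and log-line format are my assumptions and vary by SQL Server version.

```python
# Rough sketch: parse a SQL Server ERRORLOG file offline into structured records.
# The path, encoding, and line format below are assumptions; adjust for your version.
import re
from datetime import datetime

LINE_RE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{2})\s+(\S+)\s+(.*)$")
ERROR_RE = re.compile(r"Error:\s*(\d+),\s*Severity:\s*(\d+),\s*State:\s*(\d+)")

def parse_errorlog(path, start=None, end=None):
    entries = []
    # Recent versions write the log as UTF-16; older builds may be ANSI.
    with open(path, encoding="utf-16", errors="replace") as f:
        for line in f:
            m = LINE_RE.match(line.rstrip("\n"))
            if not m:
                continue  # continuation lines, startup banner, etc.
            stamp = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S.%f")
            if (start and stamp < start) or (end and stamp > end):
                continue
            entry = {"timestamp": stamp, "source": m.group(2), "message": m.group(3)}
            err = ERROR_RE.search(m.group(3))
            if err:  # only "Error: N, Severity: S, State: T" lines get these fields
                entry["error_number"] = int(err.group(1))
                entry["severity"] = int(err.group(2))
                entry["state"] = int(err.group(3))
            entries.append(entry)
    return entries

# Example usage: entries logged since November 1st in an offline log (placeholder path).
logs = parse_errorlog(
    r"C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Log\ERRORLOG",
    start=datetime(2016, 11, 1),
)
errors = [e for e in logs if "error_number" in e]
```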

Comments closed

Regular Expressions In Lucene

Kendra Little looks at regular expression searches in Azure Search:

I wanted to be able to find all architect jobs using something like ‘%rchit%’ as well, because there aren’t a lot of great ways to do this in SQL Server.

In SQL Server, you can use a traditional B-Tree index to seek, but only based on the letters at the beginning of a character column.  If I want to know every business title that contains ‘%rchit%’, I’m going to have to scan an entire index.

SQL Server fulltext indexes don’t solve the double-wildcard problem, either. Fulltext indexes support word prefix searches, so a fulltext index would be great at finding all job titles that contain a word that starts with ‘Arch%’.

Sometimes that’s enough. But a lot of times, you do need to find a substring anywhere in a word. And sometimes you do want to offload that from your database.

This is the kind of problem Lucene (and the search platforms built on top of it, like Elasticsearch) was designed to solve.  Read on for more details as Kendra solves the problem in Azure Search.
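To give a sense of what the Lucene side looks like in Azure Search, here is a small Python sketch using the full Lucene query syntax, where a regular expression is wrapped in forward slashes.  The service name, index, field name, API version, and key are all placeholders; see Kendra’s post for the real walkthrough.

```python
# Rough sketch: a substring-style search against Azure Search using the full
# Lucene query syntax (queryType=full), where /.../ denotes a regular expression.
# Service name, index, field name, API version, and key are placeholders.
import requests

service = "my-search-service"
index = "jobs"
url = (
    f"https://{service}.search.windows.net/indexes/{index}/docs/search"
    "?api-version=2016-09-01"
)

payload = {
    "search": "/.*rchit.*/",           # matches 'Architect', 'architecture', ...
    "queryType": "full",               # enable full Lucene syntax, including regex
    "searchFields": "business_title",  # hypothetical field name
    "select": "business_title",
}

resp = requests.post(
    url,
    headers={"Content-Type": "application/json", "api-key": "<query-key>"},
    json=payload,
)
for doc in resp.json().get("value", []):
    print(doc["business_title"])
```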

Comments closed

Testing Analysis Services Cubes

Jens Vestergaard shows how to test Analysis Services cubes using a Visual Studio test project:

Unit testing in Visual Studio is actually not that hard and can save you a lot of pain down the road. The testing framework in Visual Studio offers extensive ways of executing batches of tests. You can group tests by Class, Duration, Outcome, Trait, or Project.

When you right-click a test, you get the option to select how you want the tests in the Test Explorer to be grouped.

If you have an Analysis Services cube, definitely read this—testing is a vital part of software development, and automating tests can save you significant time later.

Comments closed