Press "Enter" to skip to content

May 17, 2017

Apache Solr Backup And Recovery

Hrishikesh Gadre shows how to back up indexes in Apache Solr:

The backup mechanism allows an administrator to create a physically separate copy of index files and configuration metadata for a Solr collection. Any subsequent change to a Solr collection state (e.g. removing documents, deleting index files or changing collection configuration) has no impact on the state of this backup. As part of disaster recovery, the restore operation creates a new Solr collection and initializes it to the state represented by a Solr collection backup.

It’s probably safest to treat data in Solr as secondary data, in the sense that you should be able to rebuild the entire data set from scratch, rather than treating Solr as a primary data store.  I’m not a big fan of the author using the term “disaster recovery” instead of just “recovery” or “backup restoration” (as they’re different concepts), but it’s worth the read.
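For a flavor of what those operations look like, here’s a hedged sketch of the two Collections API calls (the collection name, backup name, and location are placeholders; the location must be a path visible to every Solr node):

    # Back up a collection to a shared filesystem location.
    Invoke-RestMethod 'http://localhost:8983/solr/admin/collections?action=BACKUP&name=nightly&collection=techproducts&location=/backups/solr'

    # Restore that backup into a brand-new collection.
    Invoke-RestMethod 'http://localhost:8983/solr/admin/collections?action=RESTORE&name=nightly&collection=techproducts_restored&location=/backups/solr'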


Getting Get-Help Help

Shane O’Neill troubleshoots a problem and explains how helpful Get-Help can be in the process:

Why does help exist?

When you think about it, why is there even a function called help?
As far as I’m aware, it’s basically the same as Get-Help, except it automatically pipes the output to | more so we get pages rather than a wall of text.

Is there more that we can do with Get-Help though? Is there a way that we can return the examples only? Syntax only? Parameters only?

Is there not a way that we can do such things?!

Read on to find out if there is.
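For reference, a few illustrative possibilities (using Get-ChildItem purely as an example target, and not necessarily the exact switches Shane lands on):

    Get-Help Get-ChildItem -Examples          # examples only
    Get-Help Get-ChildItem -Parameter Path    # help for a single parameter
    Get-Help Get-ChildItem -Full              # everything at once
    Get-Command Get-ChildItem -Syntax         # syntax only
    help Get-ChildItem                        # Get-Help, paged a screen at a time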


SQL Server Updates For macOS

Meet Bhagdev has a couple of macOS-related announcements for SQL Server.  First, the SQL Server team has released command-line tools:

We are delighted to share the production-ready release of the SQL Server Command Line Tools (sqlcmd and bcp) on macOS El Capitan and Sierra.

The sqlcmd utility is a command-line tool that lets you submit T-SQL statements or batches to local and remote instances of SQL Server. The utility is extremely useful for repetitive database tasks such as batch processing or unit testing.

The bulk copy program utility (bcp) bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files.
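To make that concrete, here’s a hedged sketch of typical invocations from a Mac terminal (server, credentials, database, and file paths are all placeholders):

    # Run an ad hoc T-SQL batch against a remote instance.
    sqlcmd -S myserver.example.com -U sa -P 'Str0ngP@ssw0rd' -Q 'SELECT TOP (5) name FROM sys.databases;'

    # Export a table to a character-format data file, then load it back in.
    bcp SalesDb.dbo.Sales out /tmp/sales.dat -S myserver.example.com -U sa -P 'Str0ngP@ssw0rd' -c
    bcp SalesDb.dbo.Sales in /tmp/sales.dat -S myserver.example.com -U sa -P 'Str0ngP@ssw0rd' -c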

Second, there’s a new ODBC driver available:

  • Azure AD support – You can now use Azure AD authentication (username/password) to centrally manage identities of database users and as an alternative to SQL Server authentication.

  • Always Encrypted support – You can now use Always Encrypted. Always Encrypted lets you transparently encrypt the data in the application, so that SQL Server will only handle the encrypted data and not plaintext values. Even if the SQL instance or the host machine is compromised, an attacker gets only ciphertext of the sensitive data.

  • Table Valued Parameters (TVP) support – TVP support allows a client application to send parameterized data to the server more efficiently by sending multiple rows to the server with a single call. You can use the ODBC Driver 13.1 to encapsulate rows of data in a client application and send the data to the server in a single parameterized command.
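As a rough illustration, the first two of those features surface as connection-string keywords in the new driver (a sketch; the server, database, and credentials are placeholders):

    # Azure AD username/password authentication plus Always Encrypted.
    $connStr = 'Driver={ODBC Driver 13 for SQL Server};' +
               'Server=myserver.database.windows.net;Database=MyDb;' +
               'Authentication=ActiveDirectoryPassword;UID=someuser@contoso.com;PWD=Str0ngP@ssw0rd;' +
               'ColumnEncryption=Enabled'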

Multi-platform is the catchword of the day.  If you’re a macOS user, this might be a portent of things to come.


Pester For Presentations

Rob Sewell takes Pester to the edge:

If you have PowerShell version 5, then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.

What can you test? Everything. Well, specifically everything that you can write a PowerShell command to check. So when I am setting up for my presentation I check the following things. I add new things to my tests as I think of them or as I observe things that may break my presentations. Most recently that was ensuring that my Visual Studio Code session was running under the correct user. I did that like this

Rob’s scenario is around giving presentations, but while reading this, think about those services which should be running on your SQL Server instance—the same concept applies.
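As a minimal sketch of that idea (the service and process names are placeholders, and the Should syntax below follows Pester 3.x, the version that ships with PowerShell 5):

    Describe 'Presentation prerequisites' {
        It 'SQL Server service is running' {
            (Get-Service -Name 'MSSQLSERVER').Status | Should Be 'Running'
        }
        It 'Visual Studio Code is running' {
            Get-Process -Name 'Code' -ErrorAction SilentlyContinue | Should Not BeNullOrEmpty
        }
    }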


An Introduction To “Modern Excel”

Rob Collie explains how he talks about “Modern Excel,” by which he means DAX + Power Query/Power Pivot:

So the magic of Power Query is instantly apparent and tangible to basically any Excel Pro.  They can immediately see how PQ will save them oodles of time and anguish.

The benefits of DAX and relationships, by contrast, are less readily-apparent on first glance.  Portable/re-useable formulas that enable rapid iteration, the answering of “emergent” questions in near real-time, as well as a “subdivide and segment” capability?  Or how about multi-data-table capabilities that provide an integrated and convenient view across many different sources of formerly-siloed data?  These concepts are simply alien to the longtime Excel user, even though they are MONSTERS in terms of their biz value (as well as time-savers and anguish-reducers).  None of the impact “lands” up front because it can’t adequately be contemplated until you’ve started DOING it.

Rob’s looking at this from the standpoint of an educator helping train people with Excel expertise.


Against Shrinking Database Log Files

Kenneth Fisher is wary of shrinking your database log file:

“It’s too big”
I find that people who say this frequently don’t have a firm idea of what is too big or even why it might be as big as it is. This also goes in with the “I need to free up disk space” excuse, with no good reason why the space needs to be freed up.

There are good reasons to shrink the log and they do revolve around space. For example:

  • I had a one-time explosive growth of the log due to a large data load.

  • The usage of the database has changed and we aren’t using as much of the log as we used to.

  • We are billed at 2 am based on space used. We will grow the log back again after the billing period.

  • I need to clean up a large number of VLFs. (Of course, then you are going to grow it back manually, right?)

I quoted the caveats, but Kenneth makes a solid case against shrinking log files without a good counterbalancing reason.
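If one of those good reasons does apply, the shrink-and-regrow dance might look something like this hedged sketch (database and logical file names are placeholders; Invoke-Sqlcmd comes with the SqlServer module):

    # DBCC LOGINFO returns one row per VLF, so the row count is the VLF count.
    $vlfs = Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'MyDb' -Query 'DBCC LOGINFO;'
    "VLF count: $($vlfs.Count)"

    # Shrink the log, then immediately grow it back to a deliberate size.
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'MyDb' -Query "DBCC SHRINKFILE (N'MyDb_log', 1024);"
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "ALTER DATABASE [MyDb] MODIFY FILE (NAME = N'MyDb_log', SIZE = 8GB);"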


Which Write- Cmdlet Should I Use?

Jana Sattainathan has some thoughts on when to use each of the Write- cmdlets in PowerShell:

PowerShell has matured as the automation tool of choice on the Microsoft platform, be it on Windows or Azure. However, there is no official guidance on best practices and standards around some things. At times, bloggers do things incorrectly in their examples, thereby reinforcing bad practices. Hopefully, this small post will help connect some dots for you! Please comment if I am stating something that is not a generally accepted best practice. Specifically, we are going to glance at the following cmdlets:

  • Write-Host

  • Write-Output

  • Write-Debug

  • Write-Warning

  • Write-Error/Throw

  • Write-Verbose

  • Write-Progress

Write-Host is the easiest and probably most controversial of the set (because its messages are outside the pipeline), but there’s a place for each of these.
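To see how several of these differ in practice, here’s a small, entirely hypothetical function:

    function Get-Widget {
        [CmdletBinding()]
        param([string]$Name = 'default')

        Write-Verbose "Looking up widget '$Name'"        # shown only with -Verbose
        Write-Progress -Activity 'Widget lookup' -Status $Name
        if ($Name -eq 'default') {
            Write-Warning 'No name supplied; using the default widget.'
        }
        Write-Output $Name                               # the actual pipeline output
    }

    Get-Widget -Verbose    # only 'default' flows down the pipeline; the rest is side-channel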


Automatic Plan Correction

Jovan Popovic introduces a new tuning feature in SQL Server 2017:

How to identify plans that should be corrected?

SQL Server 2017 provides a new system view called sys.dm_db_tuning_recommendations that shows all identified plan regressions. You can select data from this view, find the difference between the last known good plan and the regressed plan, and get the script that you can execute to manually fix the regression.

Automatic plan correction

As a next step, you can let SQL Server 2017 automatically correct any plan that regressed.

I like it when the database engine gets smarter, but I get the feeling I’d like there to be some finer-grained options around what gets considered a regression and when a sub-optimal plan gets swapped out.
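A hedged sketch of both halves, run from PowerShell (instance and database names are placeholders; the JSON path follows the documented shape of the details column):

    # List identified regressions along with the suggested fix script.
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'SELECT name, reason, score, JSON_VALUE(details, ''$.implementationDetails.script'') AS fix_script FROM sys.dm_db_tuning_recommendations;'

    # Or let SQL Server 2017 force the last known good plan on its own.
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'ALTER DATABASE [MyDb] SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);'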


Ship It

James Anderson compiled the latest T-SQL Tuesday results and shipped it:

I’d like to say a huge thank you to everyone who read or published a post for T-SQL Tuesday #90. I had a great time reading through all the posts and I learnt a lot!

I feel that the real takeaway here is that Continuous Integration and DevOps are not just about putting the right tools in place; it’s all about putting the right working practices in place.

Read on for the wrap-up.
