The original post for this topic garnered the attention of a commenter, who pointed out that the same result could be achieved using a couple of UNION ALLs and those lovely set-based operations.
I personally think that both options work; use whichever you feel comfortable with.
It did play on my mind, though: what would the performance differences be? What would the difference in STATISTICS IO, TIME be? What would the difference in execution plans be? Would there even be any difference between the two, or are they the same thing? How come it’s always the things I tell myself not to forget that I end up forgetting?
This may not be the most important thing to test, but it does show you a technique.
With a database restore, the users come along inside the database, and if their SIDs match the SQL logins, you are ready to go. But with the logins it is a different story!
If you have to reinstall the engine because your master database backup is corrupt, or someone has changed a login’s password and you want to put it back, or even (maybe the most common scenario) you simply want to keep track of login permissions, you need to have them saved somewhere.
Imagine that you have to re-create a login and grant all the permissions that it had; imagine that this login has multiple roles on different databases. Did you know that beforehand? Do you keep track of it? How much time would it take to gather all that information from the application owner? How much time will you have to spend resolving permission issues until everything is running smoothly? How many tickets will need to be raised? How many users will this affect?
Read on for Claudio’s easy solution.
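Claudio’s own script isn’t reproduced here, but if you already use dbatools, a minimal sketch of the same idea (scripting logins out on a schedule so they’re saved somewhere) might look like this. The instance name and output folder are placeholders:

```powershell
# Assumes the dbatools module is installed (Install-Module dbatools).
Import-Module dbatools

# Script out all logins -- including hashed passwords and server roles --
# so they can be replayed after a reinstall or a restore to a new server.
# 'SQL01' and the export path are placeholders for your environment.
Export-DbaLogin -SqlInstance 'SQL01' -Path 'C:\DBA\Exports'
```

Scheduling something like this daily means the re-create-a-login fire drill above becomes a matter of re-running a saved script.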
Today, I received a really simple request. There was an UPDATE statement that had to be run on quite a few databases in multiple instances. This was for a vendor software we use and the statement was sent to me by a Helpdesk professional who manages the product.
You need to have the SQLPS (older) or SqlServer (newer) PowerShell module installed.
Click through for the code. If you want to extend this further, I’d look at using a Central Management Server to pull the instances and maybe looking at parallel processing if you have a large number of servers.
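The original code isn’t shown here, but a minimal sketch of the pattern, assuming the SqlServer module and placeholder instance, database, and statement, might be:

```powershell
Import-Module SqlServer   # or the older SQLPS module

# Placeholder list of instances; a Central Management Server query
# could supply this list instead of hard-coding it.
$instances = 'SQL01', 'SQL02\INST1'
$query = "UPDATE dbo.AppConfig SET Setting = N'NewValue' WHERE Setting = N'OldValue';"

foreach ($instance in $instances) {
    # Run the vendor's UPDATE against the target database on each instance.
    Invoke-Sqlcmd -ServerInstance $instance -Database 'VendorDb' -Query $query
}
```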
As the world continually becomes “eaten by software,” more and more services are being replaced by software. IT pros have most likely seen this in the form of software-defined everything. One of the premier components of this focus on software, along with the continuing adoption of DevOps, is the application programming interface (API). All of these services need to talk to each other and must provide a way for programs and users to interact with them. This is where APIs come in handy. But what does this have to do with PowerShell and JSON, you ask?
APIs, more specifically REST APIs, return data when queried. This data is typically in the JSON format. JSON is a way of structuring data that makes it easy for software to consume. When working with PowerShell, Microsoft has provided some helpful tools to work with JSON called the ConvertTo-Json and ConvertFrom-Json commands. These commands allow you to quickly work with REST APIs or any other service that returns or accepts JSON as an input.
Read on for more details on how to use these commands.
Nothing special here, simple syntax, but the seasoned PowerShell remoting pro will notice that we’re using a new parameter here -HostName. Normally on Windows PowerShell you have the -ComputerName parameter. Now, I don’t know exactly why this is different, but perhaps the PowerShell team needed a way to differentiate between OpenSSH and WinRM based remoting. Further, Enter-PSSession now has a new parameter -SSHTransport which at the moment doesn’t seem to do much since remoting cmdlets currently use OpenSSH by default. But if you read the code comments here, it looks like WinRM will be the default and we can use this switch parameter to specify SSH as the transport.
Once we execute this command, you’ll have a command prompt to the system you passed as a parameter to -HostName. The prompt below indicates you’re on a remote system by putting the server name you’re connected to in square brackets before your normal PowerShell prompt. That’s it, you now have a remote shell. Time to get some work done on that server, eh? Want to get out of the session? Just type exit.
It’s interesting to see how well Microsoft is integrating Linux support into Powershell (and vice versa, but that’s a different post).
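For reference, the SSH-based flavor looks like this; the hostname and username are placeholders, and this assumes OpenSSH is configured on both ends:

```powershell
# WinRM-style remoting uses -ComputerName; the OpenSSH transport uses -HostName.
Enter-PSSession -HostName 'ubuntu01' -UserName 'admin' -SSHTransport

# The prompt then shows the remote host in square brackets, e.g.:
# [ubuntu01]: PS /home/admin>
# Type 'exit' to drop back to your local session.
```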
Environment variables are exposed with a PowerShell drive known as “$env:”. It’s possible to browse through all of the environment variables by typing $env: at the console and hitting the tab key. This will allow you to see the names of each environment variable in alphabetical order.
The $env: drive is the recommended place to refer to any environment variables with PowerShell. However, it’s also possible to read the variables via .NET in PowerShell by using the GetEnvironmentVariable static method on the Environment class. This is essentially the same task.
Read on to see how you can use these in your scripts.
Recently I got a request from a user who wanted to copy a specific set of tables and their indexes into a new database to ship to the vendor for analysis. The problem was that the DB had thousands of tables (8,748, to be precise). Hunting and pecking for specific tables among those is possible but tedious. Even if I managed that, I would still have to manually script out the indexes and run them on the target, as the native “Import/Export Wizard” does not do indexes. It only copies the table structure and data! I am not a big fan of point and click anyway.
My first thought was to see if dbatools had something similar, though a quick glance at the command list says perhaps not yet.
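One way to attack the scripting half of the problem is SMO’s Scripter class, which, unlike the Import/Export Wizard, can include indexes. This is a sketch of the general technique, not the author’s actual script; the instance, database, and table names are placeholders:

```powershell
Import-Module SqlServer   # loads the SMO assemblies

$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'SQL01'
$db     = $server.Databases['VendorDb']

# Configure the scripter to emit indexes and declarative integrity
# (keys, constraints) along with the table definitions.
$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
$scripter.Options.Indexes = $true
$scripter.Options.DriAll  = $true

foreach ($name in 'dbo.Orders', 'dbo.OrderLines') {
    $schema, $table = $name.Split('.')
    $scripter.Script($db.Tables[$table, $schema])
}
```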
Now you can simply type <# and your help will be dynamically created. You will still have to fill in some of the blanks but it is a lot easier.
Here it is in action in its simplest form.
But it gets better than that. When you add parameters to your function code, they are added to the help as well. Also, all you have to do is tab between the different entries in the help to move between them.
Looks like a nice time-saver.
In my case, I’ve got various systems set up: Windows and Ubuntu 16.04. So I make sure I download the correct *.zip or *.tar.gz file.
As a prerequisite, you will need to have “.NET Core 2.0 Preview 1” already installed for the SQL Server Tools to work, and remember this needs to be installed on all systems.
Just in case, here’s the link to download “.NET Core 2.0 Preview 1”: https://www.microsoft.com/net/core/preview#windowscmd
Now, because we are working with PowerShell Core, don’t forget to install the latest build found at:
Read the whole thing.
There is an awesome set of PowerShell cmdlets out there written by MVP Chrissy LeMaire. This method is my personal choice. It works great and is easy to automate. You can run it with SQL Agent or you can just use Scheduled Tasks in the OS. The Scheduled Tasks method is a little cleaner, but you don’t get to see it in SQL Server. Also, if you are on a cluster running Windows 2012, you can cluster the Task Scheduler as an added benefit.
Chrissy wrote this with the intent of making migrations easier, and she succeeded. In fact, I made it a point to thank her at MVP Summit last year because it made my life insanely easier. The advantage here is that you can automate a lot more than just logins. In fact, you can migrate and automate pretty much anything at the server level. Here is the link that I guarantee you are going to bookmark, followed by a video demo where I show how to install and automate the syncing of logins using both the SQL Agent method and the Scheduled Tasks method.
DBATools would be my preference in this situation as well, but click through to see four other methods, as well as code.
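For context, the dbatools piece of this is essentially a one-liner; this is a hedged sketch with placeholder instance names (in older dbatools releases the cmdlet was named Copy-SqlLogin):

```powershell
Import-Module dbatools

# Sync logins (passwords, SIDs, server roles) from one instance to another.
# Schedule this via SQL Agent or Task Scheduler to keep the two in step.
Copy-DbaLogin -Source 'SQL01' -Destination 'SQL02'
```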