If you’re developing a package for R to share with others — on CRAN, say — you’ll want to make sure it works for others. That means testing it on various platforms (Windows, Mac, Linux, and all the versions thereof), and on various versions of R (current, past, and future). But it’s likely you only have access to one platform, and installing and managing multiple R versions can be a pain.
R-hub, the online package-building service now in public beta, aims to solve this problem by making it easy to build and test your package on a variety of platforms and R versions. Using the rhub R package, a single command uploads your package to the cloud-based R-hub service and builds and tests it against the current, prior, and in-development versions of R, on any or all of the supported platforms.
This looks like an interesting service for package developers and companies with a broad distribution of R installations.
Unit testing in Visual Studio is actually not that hard and can save you a lot of pain down the road. The testing framework in Visual Studio offers extensive ways of executing batches of tests. You can group tests by Class, Duration, Outcome, Trait, or Project.
When you right-click a test, you get the option to select how you want the tests in the Test Explorer to be grouped.
If you have an Analysis Services cube, definitely read this—testing is a vital part of software development, and automating tests can save you significant time later.
We should have 4 backup jobs
Definitely 4, not 3, and 5 is right out.
Unless of course someone builds a 500GB database with SQL CLR UDTs which slow down DBCC to a crawl, so you move the backups and maintenance for that database into their own jobs or steps with different flags, and possibly start staggering some stuff so it finishes on the weekend without dripping into Monday.
Then you’ve got more. But just kind of sweep them under the carpet. We have file system backup checks so that covers most of it…
These are just administrative test examples; when you have to start testing queries and procedures, it’s a whole new level of fun, as it feels like you’re building a castle on sand.
Whilst I won’t be showing code in this, there is a companion sample database project. This is on GitHub and each key stage is shown by a branch of work. This means you can jump in at most stages and work from there. If you need some git advice, check out my quick git in 5 presentation.
The core DB is a superhero-themed database.
Database unit testing involves a lot of compromises (because most of what we care about as database professionals tends to be integration tests rather than unit tests), but having imperfect tests generally beats having no tests, so check this out.
There have been a couple of fixes in SQLCover this week, kindly submitted by John Mclusky (https://github.com/jmclusky).
Go check out SQLCover.
First I thought about what I would look for in SSMS after installing the maintenance solution, and I made a list of the things I would check. This is the checklist you would create (or have already created) for yourself or for a junior following this install. Here is how easily you can turn that checklist into a Pester test, removing the human element and opening your install to automated testing:
SQL Server Agent is running – Otherwise the jobs won’t run🙂
We should have 4 backup jobs, named:
DatabaseBackup – SYSTEM_DATABASES – FULL
DatabaseBackup – USER_DATABASES – FULL
DatabaseBackup – USER_DATABASES – DIFF
DatabaseBackup – USER_DATABASES – LOG
We should have Integrity Check and Index Optimisation Jobs
We should have the clean up jobs
All jobs should be scheduled
All jobs should be enabled
The jobs should have succeeded
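The linked post turns this checklist into Pester (PowerShell) tests run against a live server. The same checklist-to-test pattern can be sketched in Python for illustration, with the job inventory stubbed in place of a real query against msdb (the function and field names here are hypothetical stand-ins, not the post's code):

```python
# Sketch of the checklist-to-automated-test pattern. In the post this is
# Pester querying a live SQL Server; here the server state is stubbed so
# the example is self-contained.

EXPECTED_BACKUP_JOBS = {
    "DatabaseBackup - SYSTEM_DATABASES - FULL",
    "DatabaseBackup - USER_DATABASES - FULL",
    "DatabaseBackup - USER_DATABASES - DIFF",
    "DatabaseBackup - USER_DATABASES - LOG",
}

def fetch_agent_jobs():
    """Stand-in for querying msdb.dbo.sysjobs and its schedule/history views."""
    return [
        {"name": n, "enabled": True, "scheduled": True, "last_run_ok": True}
        for n in EXPECTED_BACKUP_JOBS
    ]

def check_installation(jobs):
    """Run the checklist; return a list of failures (empty means all checks pass)."""
    failures = []
    names = {j["name"] for j in jobs}
    missing = EXPECTED_BACKUP_JOBS - names
    if missing:
        failures.append(f"missing backup jobs: {sorted(missing)}")
    for j in jobs:
        if not j["enabled"]:
            failures.append(f"{j['name']} is disabled")
        if not j["scheduled"]:
            failures.append(f"{j['name']} has no schedule")
        if not j["last_run_ok"]:
            failures.append(f"{j['name']} last run failed")
    return failures

if __name__ == "__main__":
    print(check_installation(fetch_agent_jobs()))  # [] when everything passes
```

The point is the shape, not the language: each checklist item becomes one assertion, so the whole install can be re-verified in seconds on any server.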
There’s a very nice script and walkthrough of the process if you click through.
Case: we’ve integrated two sources of customer data. We want to add a third source.
Q: How do we at the same time know that our current integration and solutions will continue to work while at the same time integrating the new sources?
A: Test it.
Q: How do we get faster deployments and more stability?
A: Automate the tests, so they can run continuously.
This is an interesting concept; do read the whole thing.
The load test utility had maxed out the compute resources available to it, due to internal limitations within iperf itself. It’s a shame that this utility is not multi-threaded, because I think we could have seen a much greater improvement on this system.
Monitor the utilities that you’re using to do load testing, because limitations like this might skew your results!
Everything eventually hits a bottleneck. In David’s case, the bottleneck was in the testing tool itself.
What is this?
This lets you use Visual Studio to run tSQLt tests easily. Visual Studio has a built-in framework for discovering and executing tests, so if you have tSQLt tests in an SSDT project, for example (though this only requires that the .sql files be in source control; SSDT itself is not required), you can easily see and execute your tests via the Visual Studio Test Explorer window. It is also available for use on your favorite build server via the vstest.console.exe tool, and if you have VSTS you can use the default test task.
Combined with Visual Studio database tests for integration testing, this gives you a pretty fair start on database testing technology.
One of the more time-consuming pieces in this process is step 4 – setting up a solution that can generate a concurrent query workload. Bob Duffy (b | t) has written blog posts and delivered presentations on the topic of Analysis Services load testing. However, when it comes to tooling (unless something has changed recently), I believe he’s still using a custom .NET Visual Studio Test solution for the test harness. And unless you know .NET, you’re going to have a pretty difficult time getting set up… which is why I was so happy when, earlier this year, Chris Schmidt (b | t) wrote a post over on MSDN demonstrating a method for load testing an Analysis Services database using PowerShell.
This weekend I finally had some time to expand upon Chris’ code sample and add several new features, e.g. parameters (target server/database, level of concurrency, queries per batch, etc.) and the ability to retrieve MDX (or DAX) queries from a SQL table. In my experience, it’s quite a bit easier to generate a table of MDX/DAX queries than a bunch of .txt files with a query in each file.
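The actual harness in the linked post is PowerShell against Analysis Services; the general shape of the load generator (pull queries from a table, fan them out at a chosen concurrency level, collect timings) can be sketched like this in Python, with the query executor stubbed since no live SSAS instance is assumed:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def load_queries_from_table():
    """Stand-in for reading MDX/DAX query text out of a SQL table."""
    return ["QUERY 1", "QUERY 2", "QUERY 3", "QUERY 4"]

def run_query(query):
    """Stand-in for executing one MDX/DAX query; returns (query, elapsed seconds)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate a server round trip
    return query, time.perf_counter() - start

def run_load_test(queries, concurrency=4, batches=2):
    """Fan the queries out across worker threads, one batch at a time."""
    results = []
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(batches):
            results.extend(pool.map(run_query, queries))
    return results

if __name__ == "__main__":
    timings = run_load_test(load_queries_from_table(), concurrency=4, batches=2)
    print(len(timings))  # 8: every query ran in every batch
```

The concurrency level and batch count map directly onto the parameters described above; in the real harness the executor would hold an ADOMD.NET connection per worker rather than a sleep.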
My first thought was “Well, that doesn’t seem too complicated.” Which means that Bill did a great job.