July 13, 2022

Mapping Functions in R with purrr

Ronan Harrington executes a function for each row in a dataframe:

In this section, we want to tidy the different flight types in the data set by increasing the number of rows and decreasing the number of columns. For a given airport on a given day, instead of having multiple columns/variables for arrivals, departures and the total number of flights, we want one column describing the flight type (e.g. arrival or departure) and one column holding the number of flights for that type. This will give the data set a tidy structure.

Click through for the process and the script. H/T R-Bloggers.
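
The post itself works in R with the tidyverse, so what follows is not the author's code. As a rough sketch of the same wide-to-long reshaping in T-SQL terms, UNPIVOT does the equivalent job (the table and column names here are hypothetical):

    -- One row per airport per day, with separate arrivals and departures
    -- columns. UNPIVOT melts those columns into (flight_type, flights) pairs.
    SELECT airport, flight_date, flight_type, flights
    FROM dbo.AirportDays
    UNPIVOT
    (
        flights FOR flight_type IN (arrivals, departures)
    ) AS up;

The total-flights column can be left out of the tidy version entirely: it is derivable by summing the two flight types, so keeping it would just duplicate information.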

The Benefits of CNAMEs

Deepthi Goguri wants a CNAME record:

If you are using the server name and database name in the connection strings of all the applications connecting to your database being migrated, updating them all once the database moves to a different server gets tough and tedious. It is always advisable to use a database DNS alias in the connection strings instead of the IP address or the server name.

This applies to report server data sources as well.

I know it is hard to make the change, but if the change is good, it is worth the effort.

I think this starts to get tricky as the number of servers and instances increases, especially if your DNS records differ considerably from your server names. Still, I completely agree: using CNAMEs makes life a lot easier.
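
To make the payoff concrete, suppose applications connect through a hypothetical alias sales-db.contoso.com instead of the physical server name. After a migration, you repoint the CNAME and no connection string changes; a quick check from any session confirms which physical instance the alias currently lands on:

    -- The connection string keeps Server=sales-db.contoso.com forever;
    -- the CNAME decides which physical host that actually is.
    -- After connecting through the alias, ask the instance who it really is:
    SELECT @@SERVERNAME AS instance_name,
           SERVERPROPERTY('MachineName') AS physical_host;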

Sources of Data Structure Truth

Deb Melkin performs database epistemology:

The “source of truth” is my newly made-up phrase for whatever you are using to say “this is my database schema and the initial data needed to start up the application.” This can be your script directory; this can be a dacpac or bacpac; this can be your data model; this can be a combination of these things. My go-to “source of truth” right now is my source control repository. I've got both the schema and the default data needed in the same location. In the past, I would probably have included the data model as a way to help me make sure whatever database table changes I have in my source control are there, especially for that one database which only had tables and views. (A different rant for a different time.) Whatever you use, it absolutely CANNOT be an actual database. There are two main reasons for this:

Read on for those reasons.
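
As an illustration of the repository-as-source-of-truth idea, a script like the following (all names hypothetical) keeps the schema and the required startup data together in source control, and is idempotent so it can stand up an empty database from nothing:

    -- 0001_OrderStatus.sql: table definition plus the default rows the
    -- application needs on day one, rebuildable from an empty database.
    IF OBJECT_ID(N'dbo.OrderStatus', N'U') IS NULL
        CREATE TABLE dbo.OrderStatus
        (
            StatusCode  char(3)       NOT NULL PRIMARY KEY,
            Description nvarchar(100) NOT NULL
        );

    -- Seed data, written so re-running the script changes nothing.
    INSERT dbo.OrderStatus (StatusCode, Description)
    SELECT v.StatusCode, v.Description
    FROM (VALUES ('NEW', N'Newly placed'),
                 ('SHP', N'Shipped')) AS v (StatusCode, Description)
    WHERE NOT EXISTS (SELECT 1 FROM dbo.OrderStatus AS o
                      WHERE o.StatusCode = v.StatusCode);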

Finding Sample Data Online

Mara Pereira goes searching for data:

Have you ever struggled to find sample data to play with in Power BI?

Did you spend hours (sounds crazy, but it happened to me too!) just looking for a dataset with insurance data? Healthcare data? Housing prices data?

Did you ever wonder “where are people finding the data to create those Netflix and Amazon reports that seem to be everywhere these days? Seriously, tell me your secret!”?

Click through for three good sites. Another one I've taken to is the US Bureau of Labor Statistics, the federal agency responsible for tracking things like employment, consumer prices, and compensation. The plus side to these datasets is that you get the whole range of data cleanup, warehousing, querying, and analytics over data which is both real and fairly interesting. For a broader take, data.gov hosts open data from the US federal government, though I haven't spent much time working with it.

SQL Login of the Week Club

Mala Mahadevan recounts a horror story:

It's been a few years since I left consulting, but at the last gig I was at, we encountered something like this. We had a big client who had outsourced all their database development and manual update work (no, not to us; to some third-party contracting company). These were contractors paid by the hour, and the turnover was really high. Our client did not want to issue Windows-authenticated logins to these people for some reason (I do not recall what). So at the start of every week, the contractor working on a particular server would get a SQL Server-authenticated login. It was valid just for that week and would expire the next. And every weekend, it was our job, as the remote DBA company, to set up those logins.

But wait, it gets worse.
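
For flavor, the weekly ritual presumably looked something like this sketch (login names, database, and password are all placeholders; SQL Server has no built-in "expires next Friday" option for logins, hence the recurring weekend work):

    -- Monday: stand up this week's login.
    CREATE LOGIN contractor_wk29
        WITH PASSWORD = N'Placeholder!ChangeMe123',  -- generated fresh each week
        CHECK_POLICY = ON;

    USE ClientDB;
    CREATE USER contractor_wk29 FOR LOGIN contractor_wk29;
    ALTER ROLE db_datareader ADD MEMBER contractor_wk29;
    ALTER ROLE db_datawriter ADD MEMBER contractor_wk29;

    -- The next weekend: retire last week's login and repeat.
    DROP USER contractor_wk28;
    DROP LOGIN contractor_wk28;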

The Risks of “Unused” Databases

Chad Callihan reminds us about things in use:

You may have been updating data someone needed for validating a fix. The "_OLD" table that you assumed could be dropped may still have been useful to the person who created it. There might be a database covered in cobwebs that should have been dropped years ago, but it could also be a database that's used for some type of reporting every few months.

Yeah, that's a pretty common problem. A few things which help mitigate this issue:

  • Check wherever you can to see if the database (or database object) is in use: cached plans, stored procedure calls, application calling code, SQL Agent jobs, SSIS packages, etc.
  • Take (and test!) backups of databases before you drop tables or get rid of them.
  • Keep those database backups around for quite a while.
  • Take databases offline for a while before dropping them. That way, if somebody really does use one on occasion, it's easy to bring back online rather than needing to restore from a backup. (There's a sketch of this below the list.)
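
The promised sketch, covering the first and last bullets (the database name is hypothetical, and note that index usage stats reset on instance restart, so a lack of activity here is suggestive rather than conclusive):

    -- Any reads or writes against OldReporting since the last restart?
    SELECT OBJECT_NAME(s.object_id, s.database_id) AS table_name,
           s.last_user_seek, s.last_user_scan,
           s.last_user_lookup, s.last_user_update
    FROM sys.dm_db_index_usage_stats AS s
    WHERE s.database_id = DB_ID(N'OldReporting');

    -- Controlled demolition, step one: offline rather than dropped.
    ALTER DATABASE OldReporting SET OFFLINE WITH ROLLBACK IMMEDIATE;

    -- If someone complains next quarter, recovery is immediate:
    -- ALTER DATABASE OldReporting SET ONLINE;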

At the end of the day, however, you shouldn’t be afraid to drop things. Do the appropriate amount of diligence and make it a controlled demolition.
