Tracy Boggiano shares work on centralizing Query Store results across a number of databases:
I’ve worked for SaaS companies for the last six years or so, and our queries are largely the same across the system, but by default Query Store is per-database. So it would be handy to have a central repository to help you determine which queries across your whole server are your worst-performing ones. Hence my idea to build a central repository. I believe I put in a Connect item for this before it got moved to the new platform, but I never filed a new ticket. So this is the beginning of building something along those lines, and it will be a work in progress, so to speak.

At my current company I care about queries that take a long time to run, so I’m going to store the top 50 queries by total duration in a database handily called DBA, because that’s where I store all the DBA stuff. I have some non-client-related databases I don’t care about, so I create a table that tells which databases to collect the data from, then a table to put the information into, and a job that runs every day at midnight and sums up the data. Note that the data is stored in UTC time, so it will be off by whatever your timezone difference is; but with most SaaS companies being 24×7 shops, that shouldn’t matter, and if it does you can edit the query.
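A minimal sketch of the approach described above: a table listing the databases to collect from, a central results table, and a collection query pulling the previous UTC day's top 50 queries by total duration from the Query Store catalog views. All object names here are hypothetical (Tracy's actual scripts may differ), and the collection query is assumed to be run inside each target database by the nightly Agent job, typically via dynamic SQL looping over the database list.

```sql
USE DBA;
GO

-- Which databases to collect Query Store data from (hypothetical name)
CREATE TABLE dbo.QueryStoreDatabases (
    DatabaseName sysname NOT NULL PRIMARY KEY
);

-- Central repository for the nightly top-50 snapshot (hypothetical name)
CREATE TABLE dbo.QueryStoreTopQueries (
    CaptureDate     date          NOT NULL,
    DatabaseName    sysname       NOT NULL,
    QueryId         bigint        NOT NULL,
    QueryText       nvarchar(max) NULL,
    TotalDurationMs decimal(18,2) NOT NULL,
    ExecutionCount  bigint        NOT NULL,
    CONSTRAINT PK_QueryStoreTopQueries
        PRIMARY KEY (CaptureDate, DatabaseName, QueryId)
);
GO

-- Collection query: run in the context of each database listed in
-- dbo.QueryStoreDatabases. Query Store intervals are stored in UTC,
-- so this sums the previous UTC day. avg_duration is in microseconds.
INSERT INTO DBA.dbo.QueryStoreTopQueries
    (CaptureDate, DatabaseName, QueryId, QueryText,
     TotalDurationMs, ExecutionCount)
SELECT TOP (50)
    CAST(GETUTCDATE() - 1 AS date),
    DB_NAME(),
    q.query_id,
    qt.query_sql_text,
    SUM(rs.avg_duration * rs.count_executions) / 1000.0,  -- µs -> ms
    SUM(rs.count_executions)
FROM sys.query_store_query AS q
JOIN sys.query_store_query_text AS qt
    ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
JOIN sys.query_store_runtime_stats_interval AS rsi
    ON rsi.runtime_stats_interval_id = rs.runtime_stats_interval_id
WHERE rsi.start_time >= CAST(GETUTCDATE() - 1 AS date)
  AND rsi.start_time <  CAST(GETUTCDATE() AS date)
GROUP BY q.query_id, qt.query_sql_text
ORDER BY SUM(rs.avg_duration * rs.count_executions) DESC;
```

Keying the snapshot on capture date, database, and query ID keeps repeated runs idempotent per day while letting you trend a query's total duration across databases over time.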
This addresses a real pain point in Query Store: all of that data is per-database, so if you have a federated system with a large number of equivalent databases, getting cross-system statistics is painful.