John Mount has an interesting take on hyperparameter tuning:
The hyper dance is the venial trick of pushing user facing technical debt and flaws as user controllable features. These controls are usually named “hyper parameters” and they are parameters or arguments that control the behavior of an algorithm. Users think “hyper parameters” must be even better than “regular parameters”, just like “hyper drive” is better than “sub-light drive.” However the etymology of the name isn’t from science fiction, it is just the need in statistical contexts to have a name for controls other than parameter, as parameter is often used to name the fit coefficients of a model (i.e. to name an output, not an input!).
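Mount's input/output distinction maps neatly onto how scikit-learn names things, and a short sketch (my own illustration, not from his post) makes it concrete: the hyperparameter is a constructor argument you choose up front, while the fitted "parameters" in the statistical sense come out the other end as attributes with a trailing underscore.

```python
# A quick illustration (mine, not Mount's) of the naming point in
# scikit-learn terms: hyperparameters are inputs you choose, while the
# "parameters" statisticians speak of are fitted outputs.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

model = LogisticRegression(C=0.5)  # C is a hyperparameter: an input we pick
model.fit(X, y)

print(model.get_params()["C"])  # the input control, unchanged by fitting
print(model.coef_)              # the fit coefficients: outputs, not inputs
```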
Beyond the naming issue, I'd also be concerned that heavy hyperparameter tuning can lead to a "garden of forking paths" problem, where we accidentally end up doing the equivalent of p-hacking: tweaking hyperparameters until we arrive at the "right" answer.
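This risk is easy to demonstrate with a simulation (my own sketch, not from the original post): give a model pure noise, scan a few dozen hyperparameter combinations, and score each one against the same evaluation set. The best-looking setting will appear to beat chance, and the illusion disappears on data the search never touched.

```python
# A minimal sketch of how reusing one evaluation set while scanning
# hyperparameters mimics p-hacking. The data are pure noise, so no
# setting can truly beat 50% accuracy.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 100, 20

# Labels are independent of the features: any apparent skill is luck.
X_train, y_train = rng.normal(size=(n, p)), rng.integers(0, 2, size=n)
X_eval, y_eval = rng.normal(size=(n, p)), rng.integers(0, 2, size=n)
X_fresh, y_fresh = rng.normal(size=(n, p)), rng.integers(0, 2, size=n)

best_score, best_model = -np.inf, None
for max_depth in (1, 2, 3, 5, 8, None):
    for n_estimators in (10, 25, 50, 100):
        for max_features in ("sqrt", 0.5, None):
            model = RandomForestClassifier(
                max_depth=max_depth,
                n_estimators=n_estimators,
                max_features=max_features,
                random_state=0,
            ).fit(X_train, y_train)
            score = model.score(X_eval, y_eval)  # same eval set, every time
            if score > best_score:
                best_score, best_model = score, model

# The "winning" setting typically looks clearly better than chance on the
# reused eval set, yet falls back to ~50% on data it never influenced.
print(f"best accuracy on reused eval set: {best_score:.2f}")
print(f"same model on fresh data:         {best_model.score(X_fresh, y_fresh):.2f}")
```

The usual defenses are the same as for p-hacking: decide the search space in advance, tune against a validation split or with cross-validation, and reserve a final test set that gets scored exactly once.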