Do you ever worry about how you declare NUMERIC or DECIMAL data types in SQL?
Do you sometimes “add a bit more” precision—just to be safe?
Have you considered how that small decision could actually change your arithmetic results?

I ran into this recently when comparing data between two environments that should have produced identical results. One calculated field was slightly off, and the culprit turned out to be a difference in numeric data type declarations.
Read on to see what happened. The differences weren’t massive, but if you were expecting an exact match, a discrepancy even in the 7th or 8th decimal place can be jarring.
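For a quick taste of the kind of surprise involved, here’s a minimal sketch in T-SQL (assuming SQL Server, where the declared precision and scale of the operands feed into the engine’s rules for the result type of a division; other engines have their own rules). The variable names and values are purely illustrative:

```sql
-- A minimal sketch (T-SQL / SQL Server assumed): the same division,
-- performed on operands declared at different precisions, comes back
-- with a different scale because of the engine's result-type rules.
DECLARE @a1 DECIMAL(18,2) = 1.00,
        @b1 DECIMAL(18,2) = 3.00,
        @a2 DECIMAL(38,2) = 1.00,  -- "a bit more" precision, just to be safe
        @b2 DECIMAL(38,2) = 3.00;

SELECT @a1 / @b1 AS result_18_2,   -- 0.33333333333333333333 (scale 20)
       @a2 / @b2 AS result_38_2;   -- 0.333333               (scale 6)
```

The exact scales you get depend on the engine’s formula for division results, but the pattern is the point: padding the declared precision “just to be safe” can quietly cost you digits after the decimal.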