Suzana Herculano-Houzel

Before you get fancy, have you looked at the easiest explanation already?

[Image: an illustrative image for a neuroscience blog about the principle that the simplest explanation is usually the correct one, applied to data analysis.]

If a metric goes down by 50% and you want it back to where it was, what do you do? I say first make sure that you understand the source of the decrease, instead of rushing to fix something that maybe didn’t need fixing – while you miss out on what actually did need fixing.

I just got contacted by technical support concerned that one metric of something that I oversee was going down. The point of the call was to make sure that I was aware of the decrease, then to point me to an AI tool that could change how the metric is enforced and bring it back up to where it was.

Technical support, in charge of calculating said metric, made an assumption: that the metric was going down because it was not being enforced (as in, I’m not enforcing it). But that didn’t seem right to me. I know the care that all of those involved take to enforce what the metric measures (sorry I have to be so cryptic). So I asked back: do you know that the metric is not going down simply because the total number of cases to which the metric applies has been going down, too? Have you looked at that total?

He had not. It took all of two minutes to pull up the data. The metric had gone down by 50%… and so had the total number of cases to which the metric applied. Once total numbers were taken into consideration, there had been exactly zero change in the metric. Fifty percent of a smaller and smaller total is a smaller and smaller number, but… it still is fifty percent.
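
A minimal sketch of the arithmetic, with made-up numbers (the counts below are assumptions for illustration, not the actual data from the call):

```python
# Hypothetical counts: both the flagged cases and the totals are invented
# to illustrate the point, not taken from the real metric.
periods = {
    "before": {"flagged": 100, "total": 200},
    "after":  {"flagged": 50,  "total": 100},
}

for label, counts in periods.items():
    # Normalize the raw count by the total it applies to.
    rate = counts["flagged"] / counts["total"]
    print(f"{label}: {counts['flagged']} of {counts['total']} = {rate:.0%}")

# Output:
# before: 100 of 200 = 50%
# after: 50 of 100 = 50%
# The raw count dropped by half, but the proportion did not move.
```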

So: did the metric in question decrease? Yes, it did. But only because the total number of cases it applied to had decreased; proportionally, the metric stayed put the whole time.

Which still means that we have a problem, but the problem is not what technical support thought it was. The problem is not that this metric has dropped, but that the total that it applies to has been decreasing. Trying to fix the metric would have been solving a problem that didn’t exist, and ignoring the real problem.

I understand the rush to get fancy and throw new tools at problems, but sometimes the problems are not what they seemed at first sight. So, before you get fancy: have you considered the simpler alternative explanations, and checked the data to rule them out first?
