04-21-2026
Think back a couple of weeks and imagine your favorite coworker strolling into the office with a March Madness bracket sitting in the 99th percentile of your company pool. You’ve followed college basketball closely. They have not. In fact, they tell you they picked winners based on mascots.
That result would feel ridiculous. It might even make expertise seem overrated, and all those hours of “watching film” feel wasted. But in a tournament as volatile as March Madness, one surprising outcome does not suddenly prove the mascot method works. In noisy environments, strange results happen all the time. We laugh at the mascot method in a bracket. We're less likely to notice when we're doing the same thing at work.
The same mistake shows up constantly in the professional world. In many business settings, analysis doesn’t start with a question but instead with an answer. A company branch outperforms another by 2%. A campaign beats expectations for one quarter. A team stumbles into a good outcome with a questionable process. The temptation is always the same: see the result first, then build the story around it.
Good analysis asks a harder question. Before deciding what a result means, it asks whether the setup was sound enough to mean much at all. A surprising result is not necessarily a meaningful one.
While business is not a science, it can benefit from scientific thinking in ways that may not be intuitive: not because scientific thinking requires complexity, but because it forces clarity about what question is being asked, how it is being tested, and what the result can actually tell you.
The real question, before even looking at the data, is simple: is this a valid test?
If a method is biased in one direction, is it still useful? Sometimes yes, sometimes no.
It depends entirely on the needs of the business and the questions you are trying to answer. The important piece is to understand the purpose of the test before going in so that you can design it for what you actually want to learn.
Let’s say you are working to gauge customer satisfaction, but you decide that only the positive comments are relevant because the 1-star reviews tend to come from more emotionally driven reactions. That approach might be useful if your goal is to understand what people like about your product. But if you want a realistic view of performance, ignoring negative feedback creates a distorted picture. In that case, you may need a more structured and representative way of collecting input, such as targeted and incentivized surveys. The method depends on the question.
This isn't a new problem. Organizations have been wrestling with it for as long as there have been decisions to make.
As the Guinness Brewery was scaling production at the turn of the twentieth century, its brewers noticed that no two batches were exactly the same. They asked a simple but powerful question: how can we ensure consistent quality across every pint?
For Guinness, answering this question wasn’t just an academic exercise; it was a path to becoming one of the world’s most reliable and recognized beer brands.
To answer it, they brought in a statistician, William Sealy Gosset, who developed methods to measure variation and determine how much difference was acceptable before a product became noticeably inconsistent. Because Guinness did not allow employees to publish under their own names, Gosset wrote under the pseudonym "Student," and his work ultimately led to the Student's t-test, a statistical method still widely used today. More importantly, it reflected a mindset: before trusting the outcome, design a way to measure it properly.
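For readers curious what that test actually does, the core of a two-sample t statistic fits in a few lines. This is a minimal sketch, not Guinness's actual procedure; the fill volumes and the `two_sample_t` helper below are invented for illustration:

```python
import math

def two_sample_t(a, b):
    """Student's two-sample t statistic with pooled variance
    (assumes the two groups have roughly equal variance)."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    # Sample variances of each group
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    # Pooled standard deviation across both groups
    sp = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / (sp * math.sqrt(1 / na + 1 / nb))

# Hypothetical fill volumes (ml) from two bottling lines
line_a = [568.2, 567.9, 568.4, 568.1, 567.8]
line_b = [567.6, 567.9, 567.5, 568.0, 567.4]
print(round(two_sample_t(line_a, line_b), 2))
```

A large t value suggests the gap between the two lines is big relative to their ordinary batch-to-batch variation; a small one suggests the gap could easily be noise, which is exactly the question the essay keeps returning to.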
I've felt this pull myself. There have been moments mid-analysis where I've noticed myself gravitating toward a cut of the data that confirmed what I was hoping to find — I had to stop and ask whether I was building a test or building a case. Those are very different things. The honest answer isn't always the one you went in expecting.
A clean number from a messy question is still a messy answer. The spreadsheet isn't where analysis goes wrong. It's usually long before that.
Colin Myers (BS ECON ’18) is a media analytics manager with Circana, a market research and technology company.