Eating My Dog Food

James Titchener / @mistertitchener

After months of research, I finally got to put a project of mine to the test this week. I've been on a deep dive into the science of decision making since I came to realize that I wasn't quite the intuitive savant I thought I was. There's a laundry list of human biases that hinder our decision making, along with quantitative methods that can help counteract them. The result of my efforts was a set of processes and models, built from some of these lessons, designed for making effective forecasts and decisions.

While mentally I had been on the quantitative bandwagon for some time, this was my first chance to put these tools into practice. Part of this methodology requires that you make and track your forecasts with clear, objective numbers that can be later reassessed for their accuracy. This part, to my surprise, really fucking sucked.
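To make that concrete, here's a minimal sketch of what tracking a forecast with clear, objective numbers might look like. The fields and the example prediction are my own illustration, not the actual process I used.

```python
# A minimal sketch of logging a forecast so it can be checked later.
# The structure and the example prediction are illustrative, not my real setup.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Forecast:
    claim: str                        # exactly what is being predicted
    probability: float                # stated chance it happens, 0.0 to 1.0
    resolve_by: date                  # when we'll know whether it happened
    came_true: Optional[bool] = None  # filled in at resolution, then scored

forecast_log = [
    Forecast("We sign at least 3 new clients this quarter", 0.60, date(2020, 9, 30)),
]
```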

It's generally pretty easy to throw out some prediction, be it in life or business, and just move on with our days. The reason is that we use fluffy language and half-assed gut intuitions that give us a margin of safety if we end up being wrong. And basically nobody records their forecasts to be checked later for accuracy.

Let's take an example that you might hear on TV today, "Trump is gonna lose. He's butchered the COVID response."

Here we've got a hypothesis: Trump will lose the upcoming election because of his reaction to the COVID pandemic. It sounds exactly like something a political pundit on CNN or MSNBC would say. But what do these words actually mean? Are we saying that not only is Trump certain to lose, but that the deciding factor will certainly be his poor handling of the COVID pandemic? If so, that's a bold assessment. With that level of certainty, we'd expect this hypothetical pundit to literally bet his house on Joe Biden, who sits at -130 odds at the sportsbooks (meaning for every $130 the pundit bets, he wins $100 in return). Yet despite the certitude spouted by the talking heads on cable news, somehow I don't think many of them are printing money playing political futures at casinos.
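For a sense of how far even "confident" market odds are from certainty, here's a quick sketch that converts those American odds into an implied probability. The -130 figure is the one quoted above; the function itself is just standard odds arithmetic.

```python
# Convert American sportsbook odds into the probability implied by the payout.
# -130 means risking $130 to win $100, which works out to roughly a 56.5% chance,
# nowhere near the 100% certainty the pundit's statement implies.

def implied_probability(american_odds: float) -> float:
    if american_odds < 0:
        # Favorite: risk |odds| to win 100.
        return -american_odds / (-american_odds + 100)
    # Underdog: risk 100 to win the odds amount.
    return 100 / (american_odds + 100)

print(f"{implied_probability(-130):.1%}")  # 56.5%
```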

Researcher and political scientist Philip Tetlock set out to put the expertise of political pundits (among other experts) to the test. In a study spanning 20 years, he took hundreds of public commentators and asked them to assess the quantitative probability of a set of future events, in areas both inside and outside their expertise. For each question with three possible outcomes, the experts were asked to assign a percentage chance to every outcome. The results? The experts performed worse than if they had simply given each outcome an equal 33% chance of occurring. Or in Tetlock's words, the experts performed worse than a group of dart-throwing chimps.
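To show what "worse than 33% across the board" means in practice, here's a sketch using a Brier-style score, the kind of squared-error scoring rule Tetlock's work relies on. The example probabilities are made up purely to illustrate the baseline, not numbers from the study.

```python
# Score a probabilistic forecast against what actually happened.
# Lower is better; a perfect forecast scores 0. The example numbers are invented.

def brier_score(probs, outcome_index):
    """Mean squared error between stated probabilities and the actual outcome."""
    actual = [1.0 if i == outcome_index else 0.0 for i in range(len(probs))]
    return sum((p - a) ** 2 for p, a in zip(probs, actual)) / len(probs)

# A hypothetical three-outcome question where outcome 2 is what happened.
confident_expert = [0.70, 0.20, 0.10]  # confidently wrong
uniform_baseline = [1/3, 1/3, 1/3]     # the dart-throwing chimp

print(brier_score(confident_expert, 2))  # ~0.447
print(brier_score(uniform_baseline, 2))  # ~0.222 -- the chimp wins
```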

Findings like these motivated me to bring tools into my work and general life that avoid the pitfalls of expert intuition. What I underestimated, however, was just how hard it would be to put those lessons into practice.

Tetlock didn't have an easy time finding pundits who would sign up for his experiment, and those who agreed did so anonymously. Getting clear about your forecasts and tracking the results removes the safety net that comes with making vague predictions soon to be forgotten. And putting real work into quantifying your uncertainty and clarifying what it is you're actually predicting is not only hard, it's scary. Effort removes the excuse of not trying, and despite what I thought was full buy-in on my part to the ideas of researchers like Tetlock, I found myself struggling to put the work in this week as I tried to implement quantitative decision making methods for myself.

The difficulty I had putting ideas I've come to believe into practice was surprising. For one, I'd put a lot of work into simplifying the recommended tools, but even still I found the process mentally grueling and time consuming. While I have the capacity for slow, deliberate, conscious thinking (what the psychologist Daniel Kahneman would call System 2 thinking), the process I used required a hell of a lot of it. For another, I didn't expect to be so uncomfortable making my forecasts and my thinking around the decision explicit. The decision I evaluated is a potentially risky one for my business, and while only me and the guys on my team would know about it, I could feel my ego begging me to spare myself the potential embarrassment of being wrong.

The final thing that surprised me was just how cocky I'd gotten about the whole thing. I thought for sure I'd figured out a way to combine the lessons from researchers like Tetlock, Kahneman, and Douglas Hubbard, among others, into a process and simulation model that would prove to be an easy way to make effective decisions for my company and beyond. I thought I might have taken the first steps toward solving the puzzle of effective decision making in a way that didn't sacrifice the speed and ease of use that organizations have come to expect.
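I won't share the model itself here, but to give a sense of the shape of the thing: a Hubbard-style approach typically runs a Monte Carlo simulation over a handful of uncertain inputs expressed as ranges rather than point estimates. The sketch below is a generic illustration of that idea; the variables and numbers are hypothetical, not my actual model.

```python
# A generic Monte Carlo sketch in the spirit of Hubbard's approach: replace
# point estimates with rough ranges, simulate thousands of scenarios, and
# read off the risk. Every input name and number here is hypothetical.

import random

def simulate_once():
    # Each uncertain input is a range instead of a single best guess.
    new_customers = random.triangular(50, 400, 150)
    revenue_per_customer = random.triangular(20, 60, 35)
    upfront_cost = random.triangular(5_000, 15_000, 9_000)
    return new_customers * revenue_per_customer - upfront_cost

results = sorted(simulate_once() for _ in range(10_000))
chance_of_loss = sum(r < 0 for r in results) / len(results)
print(f"Chance the decision loses money: {chance_of_loss:.0%}")
print(f"Median outcome: ${results[len(results) // 2]:,.0f}")
```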

What I didn't expect was the true difficulty of the problem. Even with great software and better online training, the learning curve is steep and the execution for decision makers won't be easy. The human brain prefers comfort, and conscious, deliberate thinking with the chance of failure isn't comfortable. Neither is learning all the material necessary to even start making quantitative forecasts. And once you've started, the learning and discomfort don't stop. One of the qualities that Tetlock recognizes in Superforecasters, the people who are effective at making quantitative forecasts, is the ability to iterate and learn from past results. It requires what the psychologist Carol Dweck calls a growth mindset. Rather than fearing evidence of past mistakes, those with a growth mindset embrace failure as an opportunity to learn. And with some forecasts taking years to resolve before you get any feedback on your assessments, becoming a Superforecaster requires not just a growth mindset but a hell of a lot of patience, two qualities that I'm painfully short on.

Up until this last week, my ideas on decision making were safe. They were theories. It wasn't until I put my hypotheses to the test that I met the harsh bitch that is reality. The truth hurts—but it also gives. I just need to be willing to eat my own dog food.
