Philip Tetlock and Dan Gardner give advice to aspiring superforecasters in "Superforecasting: The Art and Science of Prediction." While the book is a valuable resource to any forecaster, machine learning is an even bigger one.
If you haven't read the book, at least listen to the Freakonomics podcast interview with Tetlock. Below we highlight Tetlock's characteristics of bad forecasters and his "Ten Commandments for Aspiring Superforecasters," but let's face it: not everyone has the time or patience to practice them all. Machine learning is a practical shortcut because it complements forecasters and empowers them to become "superforecasters."
Characteristics of "bad" forecasters
- Dogmatism - Unwillingness to change one's mind in response to new evidence.
- Lack of understanding of probability - Bad forecasters rely on gut feel; good forecasters think in well-calibrated probabilities.
- Reliance on vague verbiage - You shouldn't base forecasts on vague descriptions. Example: a "fair chance of success" could really mean a 1-in-3 chance.
Ten commandments for aspiring superforecasters
- Triage - "Focus on questions where your hard work is likely to pay off."
- Break seemingly intractable problems into tractable sub-problems - "The surprise is how often remarkably good probability estimates arise from a remarkably crude series of assumptions and guesstimates."
- Strike the right balance between inside and outside views - "Superforecasters are in the habit of posing the outside-view question: How often do things of this sort happen in situations of this sort?"
- Strike the right balance between under- and overreacting to evidence - "Belief updating is to good forecasting as brushing and flossing are to good dental hygiene."
- Look for the clashing causal forces at work in each problem - "For every good policy argument, there is typically a counterargument that is at least worth acknowledging."
- Strive to distinguish as many degrees of doubt as the problem permits but no more - "Translating vague-verbiage hunches into numeric probabilities feels unnatural at first but it can be done. It just requires patience and practice."
- Strike the right balance between under- and overconfidence, between prudence and decisiveness - "They have to find creative ways to tamp down both types of forecasting errors—misses and false alarms—to the degree a fickle world permits such uncontroversial improvements in accuracy."
- Look for the errors behind your mistakes but beware of rearview-mirror hindsight biases - "You want to learn where you went wrong and determine ways to get better. And don't just look at failures. Evaluate successes as well so you can determine when you were just plain lucky."
- Bring out the best in others and let others bring out the best in you - "Wise leaders know how fine the line can be between a helpful suggestion and micromanagerial meddling or between a rigid group and a decisive one or between a scatterbrained group and an open-minded one."
- Master the error-balancing bicycle - "You can't become a superforecaster by reading training manuals. Learning requires doing, with good feedback that leaves no ambiguity about whether you are succeeding ... or ... failing."
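The fourth commandment's "belief updating" is, at heart, Bayes' rule: revise a prior probability when new evidence arrives. Here is a minimal sketch in Python; the scenario, function name, and all numbers are hypothetical, chosen only to illustrate the arithmetic.

```python
def update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Bayes' rule: revise a prior probability in light of new evidence.

    prior: initial probability that the hypothesis is true.
    likelihood_if_true: P(evidence | hypothesis true).
    likelihood_if_false: P(evidence | hypothesis false).
    """
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Hypothetical: you start at 30% that a product launch will slip. A key
# supplier then misses a milestone -- something that (in this made-up
# scenario) happens in 80% of late launches but only 20% of on-time ones.
posterior = update(prior=0.30, likelihood_if_true=0.80, likelihood_if_false=0.20)
print(round(posterior, 3))  # 0.632
```

The evidence moves the forecast from 30% to about 63% rather than to certainty, which is the point of the commandment: update in proportion to the strength of the evidence, neither ignoring it nor overreacting to it.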
Add machine learning to the equation
In a perfect world we would all read Tetlock's book and become superforecasters. In reality, bias and uncertainty get in the way, which makes an even stronger case for machine learning as the new superforecaster in business. Our API uses machine learning for forecasting and impact analysis, and it was able to predict future sales with 94.4% accuracy using historical time series data. Our robust suite of time series algorithms makes any professional or developer a better forecaster.
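To make the idea concrete, here is a minimal sketch of trend-based time series forecasting: fit a straight line to historical observations by ordinary least squares, then extrapolate it forward. This is only an illustration of the general approach; the sales figures are invented, and the function names are ours, not part of any API.

```python
def fit_trend(series):
    """Ordinary least squares fit of y = a + b*t, with t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
         / sum((t - t_mean) ** 2 for t in range(n)))
    a = y_mean - b * t_mean
    return a, b

def forecast(series, steps):
    """Extrapolate the fitted trend `steps` periods beyond the data."""
    a, b = fit_trend(series)
    n = len(series)
    return [a + b * (n + h) for h in range(steps)]

# Hypothetical monthly sales with a steady upward trend.
sales = [100, 104, 109, 113, 118, 122]
print([round(x, 1) for x in forecast(sales, 3)])  # [126.6, 131.1, 135.5]
```

Production forecasting models add seasonality, holidays, and uncertainty intervals on top of this basic fit-then-extrapolate loop, but the core workflow is the same: learn a pattern from history, then project it forward.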