Fighting the Odds

Spring/Summer 2014

What kind of specialists might one need to recruit in order to outmaneuver some of the best government intelligence analysts? Computer programmers, financial investors, and pharmacists, according to Barbara Mellers, I. George Heyman University Professor, Professor of Psychology, and Professor of Marketing in the Wharton School; and Philip Tetlock, Leonore Annenberg University Professor, Professor of Psychology, and Professor of Management at Wharton. As part of their long-term Good Judgment Project (GJP), the husband-and-wife team has recruited a wide swath of volunteers from all walks of life. The objective: test participants’ talent for predicting future outcomes and then pit them against the experts in the intelligence community.

In 2006, Tetlock published the book Expert Political Judgment: How Good Is It? How Can We Know? One of its major findings was that experts thought they knew a lot more about the future than they actually did, and that the farther out the forecasting horizon, the more difficult it was for them to make accurate predictions. “I think the U.S. intelligence community saw the book as a challenge,” says Tetlock. “They decided they wanted to run a big forecasting tournament to test the effectiveness of various techniques for improving the quality of probabilistic judgment, with the hope that some of those techniques might be importable into the intelligence community at some point.”

That’s where GJP comes in. The initial tournament, funded by the U.S. government’s Intelligence Advanced Research Projects Activity (IARPA), pitted five university-based teams against one another and asked: What’s the best way to ask people about the future, and what are the properties and characteristics of the people who make the most accurate forecasts? Penn faced off against teams fielded by other prestigious universities and won two years in a row. “That’s when IARPA fired the other teams and funded us,” says Mellers.

Questions range from political to military to economic, and are developed by subject matter experts like Associate Professor of Political Science Michael Horowitz. Sometimes questions have close-call endings: One asked participants to anticipate whether there would be a violent confrontation in the South China Sea before year’s end in 2011. The forecasting community predicted a very low chance of such an occurrence, but as the deadline neared, a member of the South Korean Coast Guard was attacked during an arrest. “There’s a huge component of irreducible uncertainty, so one of the big questions is how low you should set your forecasts to counteract the possibility that some fluky event could occur,” says Mellers.
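
Mellers’s question has a crisp answer under the scoring rules used in forecasting tournaments. As a small numerical illustration (the framing and numbers below are ours, not the project’s): under the Brier score, a forecaster’s expected penalty is minimized by reporting his or her honest probability, so the right hedge against a fluky event is exactly how likely you believe it to be.

    # Illustrative only: the expected Brier loss for a binary forecast p, when
    # the event's true probability is q, is q*(1-p)^2 + (1-q)*p^2 -- minimized at p = q.
    def expected_brier(p: float, q: float) -> float:
        """Expected Brier loss of forecasting p when the event occurs with probability q."""
        return q * (1 - p) ** 2 + (1 - q) * p ** 2

    q = 0.05  # hypothetical chance of a surprise confrontation
    for p in (0.0, 0.02, 0.05, 0.10):
        print(f"forecast {p:.2f}: expected loss {expected_brier(p, q):.4f}")
    # The minimum falls at p = 0.05: honest probabilities win on average,
    # so forecasts should be set low against flukes, but not at zero.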

Forecasters have a number of tools at their disposal when it comes to researching predictions, including Gallup polls and intelligent news-aggregating systems that provide daily alerts of any relevant goings-on. The GJP pulls from other fields to aid in information gathering as well. Lyle Ungar, a professor of computer and information science in Penn Engineering, helps the team develop aggregation models, testing algorithms over a wide range of possibilities for combining individual forecasts into a single crowd prediction. The most successful forecasters, however, will go so far as to directly contact some of the organizations named in forecasting questions. One volunteer even contacted the United Nations for information on a question about world food price inflation.
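
The article only gestures at what those aggregation models do, but GJP researchers have published on combining individual forecasts by averaging them in log-odds and then “extremizing” the pooled probability, on the theory that a crowd holding partly independent information is collectively underconfident. A minimal sketch of that idea (the weights, exponent, and example numbers here are invented for illustration):

    import math

    def aggregate(probs, weights, a=2.0):
        """Weighted log-odds average of individual forecasts, extremized by exponent a."""
        clipped = [min(max(p, 1e-6), 1 - 1e-6) for p in probs]
        mean_logodds = sum(w * math.log(p / (1 - p))
                           for w, p in zip(weights, clipped)) / sum(weights)
        p_bar = 1 / (1 + math.exp(-mean_logodds))             # pooled probability
        return p_bar ** a / (p_bar ** a + (1 - p_bar) ** a)   # push away from 0.5

    forecasts = [0.60, 0.70, 0.55, 0.65]  # hypothetical individual forecasts
    weights = [1.0, 2.0, 1.0, 1.5]        # hypothetical accuracy-based weights
    print(f"aggregated: {aggregate(forecasts, weights):.3f}")
    # ~0.76: more extreme than the ~0.64 pooled mean of the inputs.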

It may seem like pure guesswork, but Mellers and Tetlock say the numbers don’t lie. “If you were a dart-throwing chimpanzee, the long-term average of your forecast for binary problems would be 50/50,” says Tetlock, who along with Mellers has recruited other psychologists, like Professor of Psychology Jonathan Baron, an expert in judgment and decision-making, to analyze project results. “Our best forecasters are beating the intelligence agency analysts by as much as 30 percent, and fall somewhere between the chimpanzee and the omniscient being. The question is, how close can we get them?”
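
The article doesn’t name the metric behind that 30 percent, but GJP’s published results score forecasters with the Brier score: the mean squared error between probability forecasts and 0/1 outcomes. A toy simulation (all numbers invented) makes Tetlock’s chimp-to-omniscience scale concrete: the dart-throwing chimp, saying 0.5 to everything, scores 0.25; an omniscient being scores 0; real forecasters land in between.

    import random

    def brier(forecasts, outcomes):
        """Mean squared error between probability forecasts and 0/1 outcomes."""
        return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)

    random.seed(0)
    outcomes = [int(random.random() < 0.3) for _ in range(1000)]  # hypothetical resolutions
    chimp = [0.5] * len(outcomes)                     # always 50/50
    skilled = [0.7 if o else 0.15 for o in outcomes]  # stylized stand-in for a good forecaster
    omniscient = [float(o) for o in outcomes]         # perfect foresight

    for name, fs in [("chimp", chimp), ("skilled", skilled), ("omniscient", omniscient)]:
        print(f"{name:10s} Brier = {brier(fs, outcomes):.3f}")
    # chimp = 0.250, omniscient = 0.000; the skilled forecaster falls in between.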

The GJP depends on fresh volunteers to compile the most accurate data. Interested people are encouraged to visit the website, http://www.goodjudgmentproject.com, and sign up to try their hand at becoming superforecasters.

By Blake Cole