Top Ten Behavioral Biases in Project Management: An Overview

Bent Flyvbjerg
University of Oxford, Oxford, UK; IT University of Copenhagen, Copenhagen, Denmark
Corresponding author email: flyvbjerg@[Link]

Project Management Journal, 2021, Vol. 52(6), 531–546
© 2021 Project Management Institute, Inc.
DOI: 10.1177/87569728211049046

Abstract

Behavioral science has witnessed an explosion in the number of biases identified by behavioral scientists, to more than 200 at present. This article identifies the 10 most important behavioral biases for project management. First, we argue it is a mistake to equate behavioral bias with cognitive bias, as is common. Cognitive bias is half the story; political bias the other half. Second, we list the top 10 behavioral biases in project management: (1) strategic misrepresentation, (2) optimism bias, (3) uniqueness bias, (4) the planning fallacy, (5) overconfidence bias, (6) hindsight bias, (7) availability bias, (8) the base rate fallacy, (9) anchoring, and (10) escalation of commitment. Each bias is defined, and its impacts on project management are explained, with examples. Third, base rate neglect is identified as a primary reason that projects underperform. This is supported by presentation of the most comprehensive set of base rates that exist in project management scholarship, from 2,062 projects. Finally, recent findings of power law outcomes in project performance are identified as a possible first stage in discovering a general theory of project management, with more fundamental and more scientific explanations of project outcomes than found in conventional theory.

Keywords

behavioral economics, project management, cognitive bias, political bias, strategic misrepresentation, optimism bias, uniqueness bias, planning fallacy, overconfidence bias, hindsight bias, availability bias, base rate fallacy, anchoring, escalation of commitment

Introduction

Since the early work of Tversky and Kahneman (1974), the number of biases identified by behavioral scientists has exploded in what has been termed a behavioral revolution in economics, management, and across the social and human sciences. Today, Wikipedia's list of cognitive biases contains more than 200 items ("List of cognitive biases," 2021). The present article gives an overview of the most important behavioral biases in project planning and management, summarized in Table 1. They are the biases most likely to trip up project planners and managers and negatively impact project outcomes, if the biases are not identified and dealt with up front and during delivery.

Many would agree with Kahneman (2011, p. 255) that optimism bias "may well be the most significant of the cognitive biases." However, behavioral biases are not limited to cognitive biases, though behavioral scientists, and especially behavioral economists, often seem to think so. For instance, in his history of behavioral economics, Nobel laureate Richard Thaler (2015, p. 261) defines what he calls "the real point of behavioral economics" as "to highlight behaviors that are in conflict with the standard rational model." But, nothing in this definition limits the object of behavioral economics to cognitive bias. Other types of bias, for example, political bias, also conflict with the standard rational model, although you would never know this from reading Thaler's (2015) history of the field. Thaler (2015, p. 357) speaks of "the unrealism of hyperrational models," and we agree. But, behavioral economics itself suffers from such unrealism, because it ignores that many behavioral phenomena are better explained by political bias than by cognitive bias.

In short, behavioral economics in its present form suffers from an overfocus on cognitive psychology: Economic decisions get overaccounted for in psychological terms, when other perspectives—for instance political, sociological, and organizational—may be more pertinent. If all you have is a hammer, everything looks like a nail. Similarly, if all you have is psychology, everything gets diagnosed as a psychological problem, even when it is not. Behavioral economics suffers from a "psychology bias," in this sense. Cognitive bias is only half the story in behavioral science. Political bias is the other half.
Table 1. Top 10 Behavioral Biases in Project Planning and Management

1. Strategic misrepresentation: The tendency to deliberately and systematically distort or misstate information for strategic purposes. Aka political bias, strategic bias, or power bias.
2. Optimism bias: The tendency to be overly optimistic about the outcome of planned actions, including overestimation of the frequency and size of positive events and underestimation of the frequency and size of negative ones.
3. Uniqueness bias: The tendency to see one's project as more singular than it actually is.
4. Planning fallacy (writ large): The tendency to underestimate costs, schedule, and risk and overestimate benefits and opportunities.
5. Overconfidence bias: The tendency to have excessive confidence in one's own answers to questions.
6. Hindsight bias: The tendency to see past events as being predictable at the time those events happened. Also known as the I-knew-it-all-along effect.
7. Availability bias: The tendency to overestimate the likelihood of events with greater ease of retrieval (availability) in memory.
8. Base rate fallacy: The tendency to ignore generic base rate information and focus on specific information pertaining to a certain case or small sample.
9. Anchoring: The tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions, typically the first piece of information acquired on the relevant subject.
10. Escalation of commitment: The tendency to justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting the decision may be wrong. Also known as the sunk cost fallacy.

Political bias—understood as deliberate strategic distortions—arises from power relations, instead of from cognition, and has long been the object of study in political economy. Political bias is particularly important for big, consequential decisions and projects, which are often subject to high political–organizational pressures. In fact, for very large projects—so-called megaprojects—the most significant behavioral bias is arguably political bias, more specifically, strategic misrepresentation (Flyvbjerg et al., 2002; Flyvbjerg et al., 2018; Wachs, 2013). Cognitive bias may account well for outcomes in the simple lab experiments done by behavioral scientists. But for real-world decision-making—in big hierarchical organizations, involving office politics, salesmanship, jockeying for position and funds, including in the C-suite and ministerial offices, with millions and sometimes billions of dollars at stake—political bias is pervasive and must be taken into account. Or so I argue.

It should be emphasized again that many other behavioral biases exist than those mentioned in Table 1, which are relevant to project planning and management, for example, illusion of control, conservatism bias, normalcy bias, recency bias, probability neglect, the cost–benefit fallacy, the ostrich effect, and more. But, the 10 mentioned here may be considered the most important, and, in this sense, they are deemed to be the most common biases with the most direct impact on project outcomes.

Discussions With Kahneman

My first opportunity to reflect systematically on the relationship between political and cognitive bias was an invitation in 2003 from the editor of Harvard Business Review (HBR) to comment on an article by Lovallo and Kahneman (2003). The year before, Kahneman had won the Nobel Prize in Economics for his path-breaking work with Amos Tversky (who died in 1996) on heuristics and biases in decision-making, including optimism bias, which was the topic of the HBR article. The editor explained to me that he saw Kahneman and me as explaining the same phenomena—cost overruns, delays, and benefit shortfalls in investment decisions—but with fundamentally different theories. As a psychologist, Kahneman explained outcomes in terms of cognitive bias, especially optimism bias and the planning fallacy. As an economic geographer, I explained the same phenomena in terms of political economic bias, specifically strategic misrepresentation. So which of the two theories is right, asked the HBR editor?

The editor's question resulted in a spirited debate in the pages of HBR. I commented on the article by Kahneman and Lovallo (2003) that they,

"underrate one source of bias in forecasting—the deliberate 'cooking' of forecasts to get ventures started. My colleagues and I call this the Machiavelli factor. The authors [Kahneman and Lovallo] mention the organizational pressures forecasters face to exaggerate potential business results. But adjusting forecasts because of such pressures can hardly be called optimism or a fallacy; deliberate deception is a more accurate term. Consequently, Lovallo and Kahneman's analysis of the planning fallacy seems valid mainly when political pressures are insignificant. When organizational pressures are significant, both the causes and cures for rosy forecasts will be different from those described by the authors" (Flyvbjerg, 2003, p. 121).

Kahneman and Lovallo (2003, p. 122) responded:
"Flyvbjerg and his colleagues reject optimism as a primary cause of cost overruns because of the consistency of the overruns over a significant time period. They assume that people, particularly experts, should learn not only from their mistakes but also from others' mistakes. This assumption can be challenged on a number of grounds."

Ultimately, the HBR debate did not so much resolve the question as clarify it and demonstrate its relevance. Kahneman and I therefore continued the discussion offline. Others have commented on Kahneman's generosity in academic interactions. He invited me to visit him at home, first in Paris and later in New York, to develop the thinking on political and cognitive bias and how they may be interrelated. He was more rigorous than anyone I'd discussed bias with before, and I found the discussions highly productive.

In addition to being generous, Kahneman is deeply curious and empirical. Based on our discussions, he decided he wanted to investigate political bias firsthand and asked if I could arrange for him to meet people exposed to such bias. I facilitated an interview with senior officials I knew at the Regional Plan Association of the New York-New Jersey-Connecticut metropolitan (tristate) area, with offices near Kahneman's home in New York. Their work includes forecasting and decision-making for major infrastructure investments in the tristate region, which are among the largest, the most expensive, and most complex in the world. They were the types of projects I studied to develop my theories of strategic misrepresentation. Decision-making on such projects is a far cry from the lab experiments used by Kahneman and other behavioral scientists to document classic cognitive biases like loss aversion, anchoring, optimism, and the planning fallacy.

When Kahneman and I compared notes again, we agreed the balanced position regarding real-world decision-making is that both cognitive and political biases influence outcomes. Sometimes one dominates, sometimes the other, depending on what the stakes are and the degree of political-organizational pressures on individuals. If the stakes are low and political-organizational pressures are absent, which is typical for lab experiments in behavioral science, then cognitive bias will dominate, and such bias will be what you find. But if the stakes and pressures are high—for instance, when deciding whether to spend billions of dollars on a new subway line in Manhattan—political bias and strategic misrepresentation are likely to dominate and will be what you uncover, together with cognitive bias, which is hardwired and therefore present in most, if not all, situations.

Imagine a scale for measuring political-organizational pressures, from weak to strong. At the lower end of the scale, one would expect optimism bias to have more explanatory power of outcomes relative to strategic misrepresentation. But with more political-organizational pressures, outcomes would increasingly be explained in terms of strategic misrepresentation. Optimism bias would not be absent when political-organizational pressures increase, but optimism bias would be supplemented and reinforced by bias caused by strategic misrepresentation. Finally, at the upper end of the scale, with strong political-organizational pressures—for example, the situation where a chief executive officer or minister must have a certain project—one would expect strategic misrepresentation to have more explanatory power relative to optimism bias, again without optimism bias being absent. Big projects, whether in business or government, are typically at the upper end of the scale, with high political-organizational pressures and strategic misrepresentation. The typical project in the typical organization is somewhere in the middle of the scale, exposed to a mix of strategic misrepresentation and optimism bias, where it is not always clear which one is stronger.

The discussions with Kahneman taught me that although I had fully acknowledged the existence of cognitive bias in my initial work on bias (Flyvbjerg et al., 2002), I needed to emphasize cognition more to get the balance right between political and psychological biases in real-life decision-making. This was the object of later publications (Flyvbjerg, 2006; Flyvbjerg, 2013; Flyvbjerg et al., 2004; Flyvbjerg et al., 2009; Flyvbjerg et al., 2016). More importantly, however, in our discussions and in a relatively obscure article by Kahneman and Tversky (1979a), I found an idea for how to eliminate or reduce both cognitive and political biases in decision-making. I developed this into a practical tool called "reference class forecasting" (Flyvbjerg, 2006). In Thinking, Fast and Slow, Kahneman (2011, p. 251) was kind enough to endorse the method as an effective tool for bringing the outside view to bear on projects in order to debias them.

Finally, it has been encouraging to see Kahneman begin to mention political bias in his writings, including in his seminal book, Thinking, Fast and Slow, where he explicitly points out that,

"Errors in the initial budget are not always innocent. The authors of unrealistic plans are often driven by the desire to get the plan approved—whether by their superiors or by a client—supported by the knowledge that projects are rarely abandoned unfinished merely because of overruns in costs or completion times" (Kahneman, 2011, pp. 250–251).

That is clearly not a description of cognitive bias, which is innocent per definition, but of political bias, specifically strategic misrepresentation aimed at getting projects underway. As such, it contrasts with other behavioral economists, for instance, Thaler (2015) who leaves political bias unmentioned in his best-selling history of behavioral economics.

Most likely, none of the above would have happened without the HBR editor's simple question, "Strategic misrepresentation or optimism bias, which is it?" The discussions with Kahneman proved the answer to be: "Both."

We use this insight below to describe the most important behavioral biases in project planning and management, starting with strategic misrepresentation, followed by optimism bias and eight other biases.
Strategic Misrepresentation

Strategic misrepresentation is the tendency to deliberately and systematically distort or misstate information for strategic purposes (Jones & Euske, 1991; Steinel & De Dreu, 2004). This bias is sometimes also called political bias, strategic bias, power bias, or the Machiavelli factor (Guinote & Vescio, 2010). The bias is a rationalization where the ends justify the means. The strategy (e.g., achieve funding) dictates the bias (e.g., make projects look good on paper). Strategic misrepresentation can be traced to agency problems and political-organizational pressures, for instance, competition for scarce funds or jockeying for position. Strategic misrepresentation is deliberate deception, and as such, it is lying, per definition (Bok, 1999; Carson, 2006; Fallis, 2009).

Here, a senior Big-Four consultant explains how strategic misrepresentation works in practice:

"In the early days of building my transport economics and policy group at [name of company omitted], I carried out a lot of feasibility studies in a subcontractor role to engineers. In virtually all cases it was clear that the engineers simply wanted to justify the project and were looking to the traffic forecasts to help in the process … I once asked an engineer why their cost estimates were invariably underestimated and he simply answered 'if we gave the true expected outcome costs nothing would be built'" (personal communication, author's archives, italics added).

Signature architecture is notorious for large cost overruns. A leading signature architect, France's Jean Nouvel, winner of the Pritzker Architecture Prize, explains how it works:

"I don't know of buildings that cost less when they were completed than they did at the outset. In France, there is often a theoretical budget that is given because it is the sum that politically has been released to do something. In three out of four cases this sum does not correspond to anything in technical terms. This is a budget that was made because it could be accepted politically. The real price comes later. The politicians make the real price public where they want and when they want" (Nouvel, 2009, p. 4, italics added).

This is strategic misrepresentation. Following its playbook, a strategic cost or schedule estimate will be low, because it is more easily accepted, leading to cost and schedule overruns later. Similarly, a strategic benefit estimate will be high, leading to benefit shortfalls. Strategic misrepresentation therefore produces a systematic bias in outcomes. And, this is precisely what the data show (see Table 2). We see the theory of strategic misrepresentation fits the data well. Explanations of project outcomes in terms of strategic misrepresentation have been set forth by Wachs (1989, 1990, 2013), Kain (1990), Pickrell (1992), Flyvbjerg et al. (2002, 2004, 2005, 2009), and Feynman (2007a, 2007b), among others.

Strategic misrepresentation will be particularly strong where political-organizational pressures are high, as argued above, and such pressures are especially high for big, strategic projects. The bigger and more expensive the project, the more strategic import it is likely to have with more attention from top management and with more opportunities for political-organizational pressures to develop, other things being equal. For project planning and management, the following propositions apply:

Proposition 1: For small projects, with low strategic import and no attention from top management, bias, if present, is likely to originate mainly with cognitive bias, for example, optimism bias.

Proposition 2: For big projects, with high strategic import and ample attention from top management, bias, if present, is likely to originate mainly with political bias, for example, strategic misrepresentation, although cognitive bias is also likely to be present.

Strategic misrepresentation has proved especially important in explaining megaproject outcomes. For megaproject management, strategic misrepresentation may be expected to be the dominant bias (Flyvbjerg, 2014).

Professor Martin Wachs of UC Berkeley and UCLA, who pioneered research on strategic misrepresentation in transportation infrastructure forecasting, recently looked back at more than 25 years of scholarship in the area. After carefully weighing the evidence for and against different types of explanations of forecasting inaccuracy, Wachs summarized his findings in the following manner:

"While some scholars believe this [misleading forecasting] is a simple technical matter involving the tools and techniques of cost estimation and patronage forecasting, there is growing evidence that the gaps between forecasts and outcomes are the results of deliberate misrepresentation and thus amount to a collective failure of professional ethics … Often … firms making the forecasts stand to benefit if a decision is made to proceed with the project" (Wachs, 2013, p. 112).

Wachs found a general incentive to misrepresent forecasts for infrastructure projects and that this incentive drives forecasting outcomes. Wachs's review and the studies cited above falsify the notion that optimism and other cognitive biases may serve as a stand-alone explanation of cost underestimation and benefit overestimation, which has been the common view in behavioral economics. Explanations in terms of cognitive bias are especially wanting in situations with high political and organizational pressures. In such situations, forecasters, planners, and decision makers intentionally use the following Machiavellian formula to make their projects look good on paper, with a view to securing their approval and funding:

Underestimated costs + Overestimated benefits = Funding
Table 2. Base Rates for Cost and Benefit Overrun in 2,062 Capital Investment Projects Across Eight Types

                   Cost Overrun (A/E)                Benefit Overrun (A/E)
Investment Type    n      Average        p(a)        n     Average        p(a)
Dams               243    1.96           < 0.0001    84    0.89           < 0.0001
BRT(b)             6      1.41           0.031       4     0.42           0.12
Rail               264    1.40           < 0.0001    74    0.66           < 0.0001
Tunnels            48     1.36           < 0.0001    23    0.81           0.03
Power plants       100    1.36           0.0076      23    0.94           0.11
Buildings          24     1.36           0.00087     20    0.99           0.77
Bridges            49     1.32           0.00012     26    0.96           0.099
Roads              869    1.24           < 0.0001    532   0.96           < 0.0001
Total              1,603  1.39/1.43(c)   < 0.0001    786   0.94/0.83(c)   < 0.0001

Note. Project planners and managers clearly do not get base rates right. The data show strong biases for (1) cost underestimation and overrun and (2) benefit overestimation and shortfall. Overrun is measured as actual divided by estimated costs and benefits (A/E), respectively, in real terms, baselined at the final investment decision. See Flyvbjerg (2016, pp. 181–182) for a description of the dataset used in the table.
(a) The p-value of a Wilcoxon test with the null hypothesis that the distribution is symmetrically centered around 1.
(b) BRT: bus rapid transit.
(c) Weighted and unweighted average, respectively.
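To make the table's statistics concrete, here is a minimal Python sketch of how a base rate and the Wilcoxon p-value of note (a) can be computed. The (actual, estimated) cost pairs are hypothetical placeholders, not the dataset behind Table 2:

    from statistics import mean
    from scipy.stats import wilcoxon

    # Hypothetical (actual, estimated) costs for one project type,
    # in constant prices, baselined at the final investment decision.
    projects = [(130.0, 100.0), (95.0, 100.0), (160.0, 100.0),
                (210.0, 100.0), (118.0, 100.0), (140.0, 100.0)]

    # Overrun is measured as actual divided by estimated cost (A/E).
    ratios = [actual / estimated for actual, estimated in projects]

    # The base rate is the average A/E; e.g., 1.39 means 39% average overrun.
    base_rate = mean(ratios)

    # Null hypothesis of note (a): the A/E distribution is symmetrically
    # centered around 1, i.e., estimates are unbiased on average.
    statistic, p_value = wilcoxon([r - 1.0 for r in ratios])

    print(f"base rate (mean A/E) = {base_rate:.2f}, Wilcoxon p = {p_value:.4f}")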

Finally, recent research has found that not only do political and cognitive biases compound each other in the manner described above. Experimental psychologists have shown that political bias directly amplifies cognitive bias in the sense that people who are powerful are affected more strongly by various cognitive biases—for example, availability bias and recency bias—than people who are not (Weick & Guinote, 2008). A heightened sense of power also increases individuals' optimism in viewing risks and their propensity to engage in risky behavior (Anderson & Galinsky, 2006, p. 529). This is because people in power tend to disregard the rigors of deliberate rationality, which are too slow and cumbersome for their purposes. They prefer—consciously or not—subjective experience and intuitive judgment as the basis for their decisions, as documented by Flyvbjerg (1998, p. 69 ff.), who found that people in power will deliberately exclude experts from meetings when much is at stake, in order to avoid clashes in high-level negotiations between people in power's intuitive decisions and experts' deliberative rationality. Guinote and Vescio (2010) similarly found that people in power rely on ease of retrieval more than people without power. In consequence, total bias—political plus cognitive—escalates, but not in a simple linear manner where total bias equals the sum of political and cognitive biases but instead in a complex, convex way where political bias amplifies cognitive bias, leading to convex risk. This, undoubtedly, is one reason we find strong convexities in the planning and management of big projects. Decisions about big projects are typically made by highly powerful people, and such individuals are convexity generators, with political bias driving their cognitive biases, which are larger for powerful individuals than for nonpowerful ones.

Optimism Bias

Optimism bias is a cognitive bias, and it is the tendency for individuals to be overly bullish about the outcomes of planned actions (Kahneman, 2011, p. 255). Sharot (2011, p. xv) calls it "one of the greatest deceptions of which the human mind is capable." Where strategic misrepresentation is deliberate, optimism bias is nondeliberate. In the grip of optimism, people—including experts—are unaware that they are optimistic. They make decisions based on an ideal vision of the future rather than on a rational weighing of realistic gains, losses, and probabilities. They overestimate benefits and underestimate costs. They involuntarily spin scenarios of success and overlook the potential for mistakes and miscalculations. As a result, plans are unlikely to deliver as expected in terms of benefits and costs.

Almost 100 years ago, when Geoffrey Faber founded what would become Faber & Faber, the renowned London publishing house, he was so certain of his project that he bet his mother's, his own, and a few friends' fortunes on it, concluding, "everybody would benefit … with a substantial income" (Faber, 2019, p. 6, underline in the original). A year later, the new publishing house was in its first of several near-bankruptcies, and Faber wrote in his diary:

"I find it hard to justify my buoyant self-confidence of last year … I ought, I think, to have foreseen trouble and gone more cautiously" (Faber, 2019, pp. 27–28).

That's optimism bias and what it does to individuals. Geoffrey Faber is not the only entrepreneur to have been tripped up like this. It's typical. What's less typical is that Faber & Faber survived to tell the story. Most companies fail and are forgotten.
Optimism bias can be traced to cognitive biases, in other words, systematic deviations from rationality in the way the mind processes information (O'Sullivan, 2015; Sharot et al., 2007; Shepperd et al., 2002). These biases are thought to be ubiquitous. In project planning and management, an optimistic cost or schedule estimate will be low, leading to cost and schedule overruns. An optimistic benefit estimate will be high, leading to benefit shortfalls. Optimism therefore produces a systematic bias in project outcomes, which is what the data show (see Table 2). The theory of optimism bias thus fits the data well, which lends support to its validity.

Interestingly, however, when researchers ask forecasters about causes of inaccuracies in their forecasts, they do not state optimism bias as a main cause, whereas they do mention strategic misrepresentation and the usual suspects: scope changes, complexity, price changes, unexpected underground conditions, bad weather, and so on (Flyvbjerg et al., 2005, pp. 138–140). Psychologists would argue this is because optimism bias is a true cognitive bias. As such it is unreflected by forecasters, including when they participate in surveys of stated causes of forecasting inaccuracy, which is why such surveys cannot be trusted. Psychologists would further argue there is a large body of experimental evidence for the existence of optimism bias (Buehler et al., 1994, 1997; Newby-Clark et al., 2000). However, the experimental data are mostly from simple laboratory experiments with students. This is a problem, because it is an open question to what extent the results apply outside the laboratory, in real-life situations like project planning and management.

Optimism bias can be both a blessing and a curse. Optimism and a "can-do" attitude are obviously necessary to get projects done. Kahneman (2011, p. 255) calls optimism "the engine of capitalism." I would go further and call it the engine of life. But, optimism can seriously trip us up if we are unaware of its pitfalls and therefore take on risks we would have avoided had we known the real, nonoptimistic, odds. This has been known and reflected since at least the ancient Greeks. More than two millennia ago, the Greek historian Thucydides (2009, p. 220) said about the Athenians that "they expected no reverses" to "their current good fortune"—in other words, they were optimistic, specifically overconfident—and this caused the fall of Athens in the Peloponnesian War, according to Thucydides.

No upside can compensate for the ultimate downside: death. This is a fundamental asymmetry between upside and downside in human existence and is probably why humans are predisposed to loss aversion, as documented by prospect theory (Kahneman & Tversky, 1979b). Quite simply, it is rational in evolutionary terms to be more concerned about downside than upside. "Death" does not have to be of an individual, needless to say. It can be of a nation, a city, a business, or a project.

In my research, I have found that successful leaders have a rare combination of hyperrealism and can-do optimism (Flyvbjerg & Gardner, 2022). I call such individuals "realistic optimists." Risto Siilasmaa, chairman of Nokia during its recent successful turnaround, goes one step further in highlighting the two disparate dispositions, when he emphasizes "paranoid optimism" as the key to success in leading projects and businesses, always planning for the worst-case scenario: "The more paranoid we are, the harder we will continue to labor to shift the probability curve in our favor and the more optimistic we can afford to be" (Siilasmaa, 2018, p. xvi). If you are looking for someone to successfully lead a project, this is the type of person you want: a realistic optimist, if not a paranoid one. You would never get on a plane if you overheard the pilot say to the copilot, "I'm optimistic about the fuel situation." Similarly, one should not trust a project leader who is optimistic about the budget or schedule, which is the fuel of projects.

During the Apollo program (1961–1972), the NASA administration criticized its cost engineers for being optimistic with a US$10 billion estimate for the program (approximately US$90 billion in 2021 dollars). The administration told the engineers that their assumption "that everything's going to work" was wrong (Bizony, 2006, p. 41). The engineers then increased their estimate to US$13 billion, which the administration adjusted to US$20 billion and got approved by Congress, to the shock of the engineers. Today, the NASA administration's US$7 billion increase has a technical name: "optimism bias uplift." NASA jokingly called it the "administrator's discount." But they were serious when they advised that all senior executives in charge of large, complex projects must apply such a discount to make allowance for the unknown. Whatever the name, it is the single most important reason Apollo has gone down in history as that rare species of multi-billion-dollar project: one delivered on budget. The NASA administration "knew exactly what [it] was doing" for Apollo, as rightly observed by space historian Piers Bizony (ibid.).

Explanations of project outcomes in terms of optimism bias originate with Kahneman and Tversky (1979a) and have been further developed by Kahneman and Lovallo (1993), Lovallo and Kahneman (2003), Flyvbjerg (2009a), and Flyvbjerg et al. (2004, 2009).

We saw above that strategic project planners and managers sometimes underestimate cost and overestimate benefit to achieve approval for their projects. Optimistic planners and managers also do this, albeit unintentionally. The result is the same, however, namely cost overruns and benefit shortfalls. Thus, optimism bias and strategic misrepresentation reinforce each other, when both are present in a project. An interviewee in our research described this strategy as "showing the project at its best" (Flyvbjerg et al., 2004, p. 50). It results in an inverted Darwinism, "survival of the unfittest" (Flyvbjerg, 2009b). It is not the best projects that get implemented like this, but the projects that look best on paper. And, the projects that look best on paper are the projects with the largest cost underestimates and benefit overestimates, other things being equal. But, the larger the cost underestimate on paper, the greater the cost overrun in reality. And, the larger the overestimate of benefits, the greater the benefit shortfall. Therefore, the projects that have been made to look best on paper become the worst, or unfittest, projects in reality.
Uniqueness Bias

Uniqueness bias was originally identified by psychologists as the tendency of individuals to see themselves as more singular than they actually are, for example, singularly healthy, clever, or attractive (Goethals et al., 1991; Suls et al., 1988; Suls & Wan, 1987). In project planning and management, the term was first used by Flyvbjerg (2014, p. 9), who defined uniqueness bias as the tendency of planners and managers to see their projects as singular. It is a general bias, but it turns out to be particularly rewarding as an object of study in project management, because project planners and managers are systematically primed to see their projects as unique.

The standard definition of a project, according to the biggest professional organization in the field, the U.S.-based Project Management Institute (PMI, 2017, p. 4), directly emphasizes uniqueness as one of two defining features of what a project is: "A project is a temporary endeavor undertaken to create a unique product, service, or result" (italics added). Similarly, the U.K.-based Association for Project Management (APM, 2012) stresses uniqueness as the very first characteristic of what a project is in their official definition: "A project is a unique, transient endeavour, undertaken to achieve planned objectives" (italics added). Academics, too, define projects in terms of uniqueness, here Turner and Müller (2003, p. 7, italics added): "A project is a temporary organization to which resources are assigned to undertake a unique, novel and transient endeavour managing the inherent uncertainty and need for integration in order to deliver beneficial objectives of change." Similar views of uniqueness as key to the nature of projects may be found in Grün (2004, p. 3, p. 245), Fox and Miller (2006, p. 3, p. 109), and Merrow (2011, p. 161).

We maintain that the understanding of projects as unique is unfortunate, because it contributes to uniqueness bias with project planners and managers. In the grip of uniqueness bias, project managers see their projects as more singular than they actually are. This is reinforced by the fact that new projects often use nonstandard technologies and designs.

Uniqueness bias tends to impede managers' learning, because they think they have little to learn from other projects as their own project is unique. Uniqueness bias may also feed overconfidence bias (see below) and optimism bias (see above), because planners subject to uniqueness bias tend to underestimate risks. This interpretation is supported by research on IT project management reported in Flyvbjerg and Budzier (2011), Budzier and Flyvbjerg (2013), and Budzier (2014). The research found that managers who see their projects as unique perform significantly worse than other managers. If you are a project leader and you overhear team members speak of your project as unique, you therefore need to react.

It is self-evidently true, of course, that a project may be unique in its own specific geography and time. For instance, California has never built a high-speed rail line before, so in this sense, the California High-Speed Rail Authority is managing a unique project. But, the project is only unique to California and therefore not truly unique. Dozens of similar projects have been built around the world, with data and lessons learned that would be highly valuable to California. In that sense, projects are no different from people. A quote, often ascribed to the anthropologist Margaret Mead, captures the point well: "Always remember that you are absolutely unique. Just like everyone else." Each person not only is obviously unique but also has a lot in common with other people. The uniqueness of people has not stopped the medical profession from making progress based on what humans have in common. The problem with project management is that uniqueness bias hinders such learning across projects, because project managers and scholars are prone to "localism bias," which we define as the tendency to see the local as global, due to availability bias for the local. Localism bias explains why local uniqueness is easily and often confused with global uniqueness. In many projects, it does not even occur to project planners and managers to look outside their local project, because "our project is unique," which is a mantra one hears over and over in projects and that is surprisingly easy to get project managers to admit to.

Uniqueness bias feeds what Kahneman (2011, p. 247) calls the "inside view." Seeing things from this perspective, planners focus on the specific circumstances and components of the project they are planning and seek evidence in their own experience. Estimates of budget, schedule, and so forth are based on this information, typically built "from the inside and out," or bottom-up, as in conventional cost engineering. The alternative is the "outside view," which consists of viewing the project you are planning from the perspective of similar projects that have already been completed, basing your estimates for the planned project on the actual outcomes of these projects. But if your project is truly unique, then similar projects clearly do not exist, and the outside view becomes irrelevant and impossible. This leaves you with the inside view as the only option for planning your project. Even if a project is not truly unique, if the project team thinks it is, then the outside view will be left by the wayside, and the inside view will reign supreme, which is typical. "In the competition with the inside view, the outside view does not stand a chance," as pithily observed by Kahneman (2011, p. 249). The inside view is the perspective people spontaneously adopt when they plan, reinforced by uniqueness bias for project planners and managers. The inside view is therefore typical of project planning and management. The consequences are dire, because only the outside view effectively takes into account all risks, including the so-called "unknown unknowns." These are impossible to predict from the inside, because there are too many ways a project can go wrong. However, the unknown unknowns are included in the outside view, because anything that went wrong with the completed projects that constitute the outside view is included in their outcome data (Flyvbjerg, 2006). Using these data for planning and managing a new project therefore leaves you with a measure of all risk, including unknown unknowns. Uniqueness bias makes you blind to unknown unknowns.
The outside view is an antidote to uniqueness bias. Project managers, in addition to being predisposed, like everyone else, to the inside view and uniqueness, have been indoctrinated by their professional organizations to believe projects are unique, as we saw above. Thus it's no surprise it takes substantial experience to cut loose from the conventional view. Patrick O'Connell, an experienced megaproject manager and ex-Practitioner Director of Oxford's BT Centre for Major Programme Management, told me, "The first 20 years as a megaproject manager I saw uniqueness in each project; the next 20 years similarities." The NASA administration, mentioned above, balked when people insisted the Apollo program, with its aim of landing the first humans on the moon, was unique. How could it not be, as putting people on the moon had never been done before, people argued. The administration would have none of it. They deplored those who saw the program "as so special—as so exceptional," because such people did not understand the reality of the project. The administration insisted, in contrast, that "the basic knowledge and technology and the human and material resources necessary for the job already existed," so there was no reason to reinvent the wheel (Webb, 1969, p. 11, p. 61). The NASA-Apollo view of uniqueness bias saw this bias for what it is: a fallacy.

In sum, uniqueness bias feeds the inside view and optimism, which feed underestimation of risk, which makes project teams take on risks they would likely not have accepted had they known the real odds. Good project leaders do not let themselves be fooled like this. They accept that projects may be unique locally, yes. But they understand that to be locally unique is an oxymoron. Local uniqueness is, however, the typical meaning of the term "unique," when used in project management. It is a misnomer that undermines project performance and thus the project management profession. Truly unique projects are rare. We have lots to learn from other projects. And if we don't learn, we will not succeed with our projects.

The Planning Fallacy (Writ Large)

The planning fallacy is a subcategory of optimism bias that arises from individuals producing plans and estimates that are unrealistically close to best-case scenarios. The term was originally coined by Kahneman and Tversky (1979a, p. 315) to describe the tendency for people to underestimate task completion times. Buehler et al. (1994, 1997) continued work following this definition. Later, the concept was broadened to cover the tendency for people to, on the one hand, underestimate costs, schedules, and risks for planned actions and, on the other, overestimate benefits and opportunities for those actions. Because the original narrow and later broader concepts are so fundamentally different in the scope they cover, Flyvbjerg and Sunstein (2017) suggested the term "planning fallacy writ large" for the broader concept, to avoid confusing the two.

Flyvbjerg et al. (2003, p. 80) call the tendency to plan according to best-case scenarios the "EGAP principle," for Everything Goes According to Plan. The planning fallacy and the EGAP principle are similar in the sense that both result in a lack of realism, because of their overreliance on best-case scenarios, as with the NASA cost engineers above. Both lead to base rate neglect, illusion of control, and overconfidence. In this manner, both feed into optimism bias.

At the most fundamental level, Kahneman and Tversky (1979a) identified the planning fallacy as arising from a tendency with people to neglect distributional information when they plan. People who plan would adopt what Kahneman and Tversky (1979a, p. 315) first called an "internal approach to prediction" and later renamed the "inside view," under the influence of which people would focus on "the constituents of the specific problem rather than on the distribution of outcomes in similar cases." Kahneman and Tversky (1979a) emphasized that "The internal approach to the evaluation of plans is likely to produce underestimation [of schedules]." For the planning fallacy writ large, such underestimation applies to costs, schedules, and risk, whereas overestimation applies to benefits and opportunities.

Interestingly, Guinote (2017, pp. 365–366) found in an experiment that subjects who had been made to feel in power were more likely to underestimate the time needed to complete a task than those not in power, demonstrating a higher degree of planning fallacy for people in power. Again, this is an example of how power bias and cognitive bias interact, resulting in amplification and convexity.

The planning fallacy's combination of underestimated costs and overestimated benefits generates risks to the second degree. Instead of cost risk and benefit risk canceling out one another—as other theories predict, for example, Hirschman's (2014) principle of the Hiding Hand—under the planning fallacy, the two types of risk reinforce each other, creating convex (accelerated) risks for projects from the get-go. The planning fallacy goes a long way in explaining the Iron Law of project management: "Over budget, over time, under benefits, over and over again" (Flyvbjerg, 2017). As a project leader, you want to avoid convex risks because such risks are particularly damaging. You want to avoid committing the planning fallacy, especially for people in power.

Overconfidence Bias, Hindsight Bias, and Availability Bias

Overconfidence bias is the tendency to have excessive confidence in one's own answers to questions and to not fully recognize the uncertainty of the world and one's ignorance of it. People have been shown to be prone to what is called the "illusion of certainty" in (a) overestimating how much they understand and (b) underestimating the role of chance events and lack of knowledge, in effect underestimating the variability of events they are exposed to in their lives (Moore & Healy, 2008; Pallier et al., 2002; Proeger & Meub, 2014). Overconfidence bias is found with both laypeople and experts, including project planners and managers (Fabricius & Büttgen, 2015).
Overconfidence bias is fed by illusions of certainty, which are fed by hindsight bias, also known as the "I-knew-it-all-along effect." Availability bias—the tendency to overweigh whatever comes to mind—similarly feeds overconfidence bias. Availability is influenced by the recency of memories and by how unusual or emotionally charged they may be, with more recent, more unusual, and more emotional memories being more easily recalled. Overconfidence bias is a type of optimism, and it feeds overall optimism bias.
A simple way to illustrate overconfidence bias is to ask people to estimate confidence intervals for statistical outcomes. In one experiment, the chief financial officers (CFOs) of large U.S. corporations were asked to estimate the return next year on shares in the relevant Standard & Poor's index (Kahneman, 2011, p. 261). In addition, the CFOs were asked to give their best guess of the 80% confidence interval for the estimated returns by estimating a value for returns they were 90% sure would be too low (the lower decile, or P10) and a second value they were 90% sure would be too high (the upper decile, or P90), with 80% of returns estimated to fall between these two values (and 20% outside). Comparing actual returns with the estimated confidence interval, it was found that 67% of actual returns fell outside the estimated 80% confidence interval, or 3.35 times as many as estimated. The actual variance of outcomes was grossly underestimated by these financial experts, which is the same as saying they grossly underestimated risk. It is a typical finding. The human brain, including the brains of experts, spontaneously underestimates variance. For whatever reason, humans seem hardwired for this.
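The arithmetic behind "3.35 times as many as estimated" is simple: a well-calibrated 80% interval should be missed 20% of the time, and 67/20 = 3.35. Below is a minimal Python sketch of such a calibration check; the forecasts and outcomes are hypothetical, not the study's data:

    # Each tuple: (P10 forecast, P90 forecast, actual outcome), so the
    # [P10, P90] range is a stated 80% confidence interval.
    forecasts = [
        (0.02, 0.12, 0.18),   # actual above P90: a surprise
        (0.01, 0.10, -0.25),  # actual below P10: a surprise
        (0.03, 0.15, 0.08),   # actual inside the interval
        (0.00, 0.09, 0.21),   # actual above P90: a surprise
    ]

    surprises = sum(1 for p10, p90, actual in forecasts
                    if actual < p10 or actual > p90)
    surprise_rate = surprises / len(forecasts)

    # Calibrated target for an 80% interval is a 20% surprise rate;
    # Kahneman (2011) reports 67% for the CFOs: 0.67 / 0.20 = 3.35.
    print(f"surprise rate = {surprise_rate:.0%} (calibrated target: 20%)")
    print(f"overconfidence factor = {surprise_rate / 0.20:.2f}")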
In project management, overconfidence bias is built into the tools experts use for risk management. The tools, which are typically based on computer models using so-called Monte Carlo simulations, or similar, look scientific and objective but are anything but. Again, this is easy to document. You simply compare assumed variance in a specific, planned project with actual, historic variance for its project type, and you find the same result as for the CFOs above (Batselier & Vanhoucke, 2016). The bias is generated by experts assuming thin-tailed distributions of risk (normal or near-normal), when the real distributions are fat-tailed (lognormal, power law, or similar probability distribution) (Taleb, 2004). The error is not with Monte Carlo models as such, but with erroneous input into the models. Garbage in, garbage out, as always. To eliminate overconfidence bias, you want a more objective method that takes all distributional information into account, not just the distributional information experts can think of, which is subject to availability bias. The method needs to run on historical data from projects that have actually been completed. Flyvbjerg (2006) describes such a method.
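The effect of assuming thin tails where fat tails apply is easy to demonstrate by simulation. The Python sketch below compares a normal and a lognormal model of the cost overrun ratio with the same median; the parameters are illustrative, not calibrated to any real project type:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Two models of the cost overrun ratio A/E with the same median (1.2):
    # a thin-tailed normal and a fat-tailed lognormal (cf. Taleb, 2004).
    thin = rng.normal(loc=1.2, scale=0.15, size=n)
    fat = rng.lognormal(mean=np.log(1.2), sigma=0.5, size=n)

    for name, sample in (("thin-tailed", thin), ("fat-tailed", fat)):
        p90 = np.percentile(sample, 90)
        tail = (sample > 2.0).mean()  # probability of >100% overrun
        print(f"{name}: P90 A/E = {p90:.2f}, P(A/E > 2) = {tail:.2%}")

    # With identical medians, the fat-tailed model puts roughly 15% of
    # projects above a doubling of cost; the thin-tailed model, almost none.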
In the thrall of overconfidence bias, project planners and decision makers underestimate risk by overrating their level of knowledge and ignoring or underrating the role of chance events in deciding the fate of projects. Hiring experts will generally not help, because experts are just as susceptible to overconfidence bias as laypeople and therefore tend to underestimate risk, too. There is even evidence that the experts who are most in demand are the most overconfident. In other words, people are attracted to, and willing to pay for, confidence, more than expertise (Kahneman, 2011, p. 263; Tetlock, 2005). Risk underestimation feeds the Iron Law of project management and is the most common cause of project downfall. Good project leaders must know how to avoid this.

Individuals produce confidence by storytelling. The more coherent a story we can tell about what we see, the more confident we feel. But, coherence does not necessarily equal validity. People tend to assume "what you see is all there is," called WYSIATI by Kahneman (2011, pp. 87–88), who gives this concept pride of place in explaining a long list of biases, including overconfidence bias. People spin a story based on what they see. Under the influence of WYSIATI, they spontaneously impose a coherent pattern on reality, while they suppress doubt and ambiguity and fail to allow for missing evidence, says Kahneman. The human brain excels at inferring patterns and generating meaning based on skimpy, or even nonexistent, evidence. But, coherence based on faulty or insufficient data is not true coherence, needless to say. If we are not careful, our brains quickly settle for anything that looks like coherence and use it as a proxy for validity. This may not be a big problem most of the time, and may even be effective, on average, in evolutionary terms, which could be why the brain works like this. But for big consequential decisions, typical of project planning and management, it is not an advisable strategy. Nevertheless, project leaders and their teams often have a very coherent—and very wrong—story about their project, for instance that the project is unique, as we saw above under uniqueness bias, or that the project may be completed faster and cheaper than the average project or that everything will go according to plan. The antidote is better, more carefully curated stories, based on better data.

Gigerenzer (2018, p. 324) has rightly observed that overconfidence, presented by psychologists as a nondeliberate cognitive bias, is in fact often a deliberate strategic bias used to achieve predefined objectives; in other words, it is strategic misrepresentation. Financial analysts, for instance, "who earn their money by mostly incorrect predictions such as forecasting exchange rates or the stock market had better be overconfident; otherwise few would buy their advice," argues Gigerenzer, who further observes about this fundamental confusion of one type of bias for a completely different one that, "[c]onceptual clarity is desperately needed" (Gigerenzer, 2018, p. 324).

Finally, regarding the relationship between power bias and cognitive bias mentioned above, powerful individuals have been shown to be more susceptible to availability bias than individuals who are not powerful. The causal mechanism seems to be that powerful individuals are affected more strongly by ease of retrieval than by the content they retrieve, because they are more likely to "go with the flow" and trust their intuition than individuals who are not powerful (Weick & Guinote, 2008). This finding has been largely ignored by behavioral economists, including Thaler (2015) in his history of the field. This is unfortunate, because the finding documents convexity to the second degree for situations with power. By overlooking this, behavioral economists make the same mistake they criticize conventional economists for, namely overlooking and underestimating variance and risk. Conventional economists make the mistake by disregarding cognitive bias; behavioral economists by ignoring power bias and its effect on cognitive bias. Underestimating convexity is a very human mistake, to be sure. We all do it. But, it needs to be accounted for if we want to understand all relevant risks and protect ourselves against them in project planning and management.
The Base Rate Fallacy

The base rate fallacy—sometimes also called base rate bias or base rate neglect—is the tendency to ignore base rate information (general data pertaining to a statistical population or a large sample, e.g., its average) and focus on specific information (data only pertaining to a certain case or a small number of cases) (Bar-Hillel, 1980; Tversky & Kahneman, 1982). If you play poker and assume different odds than those that apply, you are subject to the base rate fallacy and likely to lose. The objective odds are the base rate.

People often think the information they have is more relevant than it actually is or they are blind to relevant information they do not have. Both situations result in the base rate fallacy. "Probability neglect"—a term coined by Sunstein (2002, pp. 62–63) to denote the situation where people overfocus on bad outcomes with small likelihoods, for instance terrorist attacks—is a special case of the base rate fallacy.

The base rate fallacy is fed by other biases, for instance, uniqueness bias, described above, which results in extreme base rate neglect, because the case at hand is believed to be singular, wherefore information about other cases is deemed irrelevant. The inside view, hindsight bias, availability bias, recency bias, WYSIATI bias, overconfidence bias, and framing bias also feed the base rate fallacy. Base rate neglect is particularly pronounced when there is a good, strong story. Big, monumental projects typically have such a story, contributing to extra base rate neglect for those. Finally, we saw above that people, including experts, underestimate variance. In the typical project, base rate neglect therefore combines with variance neglect, following this formula:

Base Rate Neglect + Variance Neglect = Strong Convexity

Preliminary results from our research indicate that variance neglect receives less attention in project management than base rate neglect, which is unfortunate, because the research also indicates that variance neglect is typically larger and has even more drastic impact on project outcomes than base rate neglect.

The base rate fallacy runs rampant in project planning and management, as documented by the Iron Law described earlier. Table 2 shows the most comprehensive overview that exists of base rates for costs and benefits in project management, based on data from 2,062 projects covering eight project types. Most projects do not get base rates right, as documented by averages that are different from one (1.0 ≈ correct base rate) at a level of statistical significance so high (p < 0.0001) it is rarely found in studies of human behavior. The base rate fallacy is deeply entrenched in project management, as the data show. Flyvbjerg and Bester (2021) argue that base rate neglect results in a new behavioral bias, which they call the "cost–benefit fallacy," which routinely derails cost–benefit analyses of projects to a degree where such analyses cannot be trusted.

As pointed out by Kahneman (2011, p. 150), "anyone who ignores base rates and the quality of evidence in probability assessments will certainly make mistakes." The cure for the base rate fallacy, in and out of project management, is to get the base rate right by taking an outside view, for instance through reference class forecasting, carrying out premortems, or doing decision hygiene (Flyvbjerg, 2006; Klein, 2007; Kahneman et al., 2011, 2021, pp. 312–324, 371–372).
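As a rough sketch of what reference class forecasting involves in practice: place the project in a class of comparable completed projects, take the empirical distribution of their outcomes (e.g., A/E ratios), and uplift the inside-view estimate at a chosen level of risk acceptance. The numbers below are hypothetical, and the percentile-uplift step is a simplified reading of the method described in Flyvbjerg (2006):

    import numpy as np

    # Hypothetical reference class: A/E cost ratios of comparable,
    # completed projects (the outside view).
    reference_class = np.array([1.05, 1.10, 1.20, 1.25, 1.40, 1.55, 1.80, 2.10])

    inside_view_estimate = 500.0  # inside-view budget, e.g., in $ millions

    # Risk acceptance of 80%: accept a 20% chance that the uplifted
    # budget is still exceeded.
    uplift = np.percentile(reference_class, 80)
    debiased_budget = inside_view_estimate * uplift

    print(f"P80 uplift = {uplift:.2f} -> debiased budget = {debiased_budget:.0f}")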
cases) (Bar-Hillel, 1980; Tversky & Kahneman, 1982). If you effective way to get started with curbing behavioral biases in
play poker and assume different odds than those that apply, your work is getting your base rates right, for the projects
you are subject to the base rate fallacy and likely to lose. The you are working on. Hopefully, most can see that if you do
objective odds are the base rate. not understand the real odds of a game, you are unlikely to
People often think the information they have is more rele- succeed at it. But that is the situation for most project planners
vant than it actually is or they are blind to relevant information and managers: they do not get the odds right for the game they
they do not have. Both situations result in the base rate fallacy. are playing: project management. Table 2 documents this
“Probability neglect”—a term coined by Sunstein (2002, beyond reasonable doubt and establishes realistic base rates
pp. 62–63) to denote the situation where people overfocus on for a number of important areas in project management that
bad outcomes with small likelihoods, for instance terrorist planners can use as a starting point for getting their projects
attacks—is a special case of the base rate fallacy. right. Data for other project types were not included for
The base rate fallacy is fed by other biases, for instance, reasons of space but show similar results.
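To make the formula concrete, consider a minimal numerical sketch (with invented parameters, not the article's data) that contrasts a planner's implicitly assumed outcome distribution, which gets both the mean and the spread wrong, with a more realistic one:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# What the planner implicitly assumes: outcomes cluster tightly around
# the estimate (illustrative parameters only, not the article's data).
assumed = rng.normal(loc=1.00, scale=0.05, size=n)

# A more realistic world: mean above 1.0 (base rate neglect) and a much
# wider, skewed spread (variance neglect).
actual = rng.lognormal(mean=np.log(1.3), sigma=0.5, size=n)

for name, x in (("assumed", assumed), ("actual", actual)):
    print(f"{name}: P(cost ratio > 1.5) = {np.mean(x > 1.5):.4f}")
# The assumed distribution makes a +50% overrun a practical impossibility;
# the realistic one makes it a routine event. The two errors compound
# nonlinearly, which is what "strong convexity" captures.
```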
The base rate fallacy runs rampant in project planning and management, as documented by the Iron Law described earlier. Table 2 shows the most comprehensive overview that exists of base rates for costs and benefits in project management, based on data from 2,062 projects covering eight project types. Most projects do not get base rates right, as documented by averages that differ from one (1.0 ≈ correct base rate) at a level of statistical significance so high (p < 0.0001) it is rarely found in studies of human behavior. The base rate fallacy is deeply entrenched in project management, as the data show. Flyvbjerg and Bester (2021) argue that base rate neglect results in a new behavioral bias, which they call the "cost–benefit fallacy," and which routinely derails cost–benefit analyses of projects to a degree where such analyses cannot be trusted.

As pointed out by Kahneman (2011, p. 150), "anyone who ignores base rates and the quality of evidence in probability assessments will certainly make mistakes." The cure for the base rate fallacy, in and out of project management, is to get the base rate right by taking an outside view, for instance through reference class forecasting, carrying out premortems, or doing decision hygiene (Flyvbjerg, 2006; Klein, 2007; Kahneman et al., 2011, 2021, pp. 312–324, 371–372).

If you are a project planner or manager, the easiest and most effective way to get started with curbing behavioral biases in your work is getting your base rates right for the projects you are working on. Hopefully, most can see that if you do not understand the real odds of a game, you are unlikely to succeed at it. But that is the situation for most project planners and managers: they do not get the odds right for the game they are playing, namely project management. Table 2 documents this beyond reasonable doubt and establishes realistic base rates for a number of important areas in project management that planners can use as a starting point for getting their projects right. Data for other project types were not included for reasons of space but show similar results.
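In practice, a base rate of the kind reported in Table 2 is simply the mean ratio of actual to estimated outcomes in a sample of completed projects, tested against 1.0. A minimal sketch in Python, using invented ratios rather than the article's data:

```python
import numpy as np
from scipy import stats

# Invented actual/estimated cost ratios for completed projects of one type
# (1.0 means the estimate was exactly right; the article's 2,062-project
# dataset is not reproduced here).
ratios = np.array([1.45, 0.98, 1.20, 2.10, 1.05, 1.33, 1.62, 1.15,
                   0.92, 1.80, 1.27, 1.51, 1.09, 1.95, 1.38])

base_rate = ratios.mean()
t_stat, p_value = stats.ttest_1samp(ratios, popmean=1.0)

print(f"estimated base rate (mean cost ratio): {base_rate:.2f}")
print(f"H0: mean ratio = 1.0 -> t = {t_stat:.2f}, p = {p_value:.4f}")
# A mean well above 1.0 with a small p-value signals systematic
# underestimation: the base rate fallacy at work.
```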
Anchoring

Anchoring is the tendency to rely too heavily, or "anchor," on one piece of information when making decisions. Anchoring was originally demonstrated and theorized by Tversky and Kahneman (1974). In their perhaps most famous experiment, subjects were asked to estimate the percentage of African countries in the United Nations. First, a number between 0 and 100 was determined by spinning a wheel of fortune in the subjects' presence. Second, the subjects were instructed to indicate whether that number was higher or lower than the percentage of African countries in the United Nations. Third, the subjects were asked to estimate this percentage by moving upward or downward from the given number. The median estimate was 25% for subjects who received the number 10 from the wheel of fortune as their starting point, whereas it was 45% for subjects who started with 65. A random anchor significantly influenced the outcome.
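The standard account of this result is anchoring with insufficient adjustment (Epley & Gilovich, 2006). The toy simulation below, with parameters invented for illustration, reproduces the pattern qualitatively: starting from different anchors and adjusting only partway toward one's own belief yields systematically different medians.

```python
import numpy as np

rng = np.random.default_rng(7)

def estimates(anchor, n=10_000, true_belief=35.0, adjustment=0.55):
    """Anchor-and-adjust: move from the anchor toward one's own noisy
    belief, but stop short (insufficient adjustment)."""
    beliefs = rng.normal(true_belief, 10.0, size=n)
    return anchor + adjustment * (beliefs - anchor)

for anchor in (10, 65):
    print(f"anchor {anchor}: median estimate = {np.median(estimates(anchor)):.1f}")
# Partial adjustment leaves the two groups with systematically different
# medians, qualitatively reproducing the wheel-of-fortune result.
```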
Similar results have been found in other experiments for a wide variety of different subjects of estimation (Chapman & Johnson, 1999; Fudenberg et al., 2012). Anchoring is pervasive. The human brain will anchor in most anything, whether random numbers, previous experience, or false information. It has proven difficult to avoid this (Epley & Gilovich, 2006; Simmons et al., 2010; Wilson et al., 1996). The most effective way of dealing with anchoring is therefore to make sure the brain anchors in relevant information before making decisions. An obvious choice would be to anchor in base rates that are pertinent to the decision at hand, as proposed by Flyvbjerg (2006). This advice is similar to recommending that gamblers must know the objective odds of each game they play. It is sound advice but often goes unheeded in project management.

Project planners and managers tend to err by anchoring their decisions in plans that are best-case, instead of most likely, scenarios, as mentioned above. Planners and organizations also frequently anchor in their own limited experience, instead of seeking out a broader scope of histories, which would be more representative of the wider range of possible outcomes that actually apply to the project they are planning.

This happened to Hong Kong's MTR Corporation when it was tasked with building the first high-speed rail line in the territory. MTR anchored in its own experience with urban and conventional rail instead of throwing the net wider and looking at high-speed rail around the world. High-speed rail is significantly more difficult to build than urban and conventional rail, and MTR had never built a high-speed rail line before. Despite—or perhaps because of—MTR's proven competence in building urban and conventional rail, the anchor for the high-speed rail line proved optimistic, resulting in significant cost and schedule overruns for the new venture (Flyvbjerg et al., 2014).

Ansar et al. (2014, p. 48) similarly found that planners of large dams around the world have generally anchored in the North American experience with building dams, for no better reason than that North America built its dams first. By choosing this anchor, planners ended up allowing insufficient adjustments to fully reflect local risks, for example, exchange rate risks, corruption, logistics, and the quality of local project management teams. This resulted in optimistic budgets and higher cost overruns for dams built outside North America.

Anchoring is fed by other biases, including availability bias and recency bias, which induce people to anchor in the most available or most recent information, respectively. Anchoring results in base rate neglect, in other words, underestimation of the probabilities, and thus the risks, that face a project (see previous section). Smart project leaders avoid this by anchoring their project in the base rate for projects similar to the one they are planning, for instance, by benchmarking their project against outcomes for a representative class of similar, completed projects. Flyvbjerg (2013) explains how to do this, and Kahneman (2011, p. 154) explicitly identifies anchoring in the base rate as the cure for the WYSIATI bias mentioned above. Anchoring in the base rate is similar to taking an outside view, and the outside view is "an anchor that is meaningful," as rightly observed by Tetlock and Gardner (2015, pp. 117–120), whereas spontaneous anchors typically are less meaningful and lead to biased decisions with hidden risks.
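Benchmarking against a reference class can be reduced to a simple computation: collect actual/estimated ratios for a class of completed projects similar to yours, then read off the uplift needed at your acceptable level of overrun risk. A hedged sketch with invented reference-class data follows; a real application would use audited outcome data (Flyvbjerg, 2006):

```python
import numpy as np

# Invented reference class: actual/estimated cost ratios of similar,
# completed projects. A real forecast would use audited outcome data.
reference_class = np.array([1.12, 1.45, 0.97, 1.30, 1.85, 1.22, 1.60,
                            1.05, 1.38, 2.20, 1.18, 1.50, 1.02, 1.70])

def uplift(ratios, acceptable_overrun_risk=0.2):
    """Uplift on the base estimate so that the chance of exceeding the
    uplifted budget is at most the acceptable risk (here 20%)."""
    return np.percentile(ratios, 100 * (1 - acceptable_overrun_risk)) - 1.0

print(f"uplift for 80% budget certainty: {uplift(reference_class):.0%}")
```

The only inputs are the reference-class outcomes and the decision maker's risk appetite; the project's own "unique" story deliberately plays no role.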
Escalation of Commitment

Last, but not least, escalation of commitment (sometimes also called commitment bias) is the tendency to justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting the decision may be wrong and additional costs will not be offset by benefits. Consider the example of two friends with tickets for a professional basketball game a long drive from where they live. On the day of the game, there is a big snowstorm. The higher the price the friends paid for the tickets, the more likely they are to brave the blizzard and attempt driving to the game, investing more time, money, and risk (Thaler, 2015, p. 20). That is escalation of commitment. In contrast, the rational approach when deciding whether to invest further in a venture would be to disregard what you have already invested.
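The rational rule is forward-looking: compare only the costs and benefits still to come, leaving sunk amounts out of the ledger entirely. A minimal sketch, with hypothetical numbers:

```python
def should_continue(remaining_cost: float, expected_benefit: float) -> bool:
    """Forward-looking test: continue only if the benefits still to come
    exceed the costs still to come. Sunk cost is deliberately absent."""
    return expected_benefit > remaining_cost

# The tickets are sunk either way; only the drive through the blizzard
# (remaining cost) and the value of seeing the game (benefit) count.
print(should_continue(remaining_cost=200.0, expected_benefit=80.0))  # False
```

Escalation of commitment amounts to smuggling the sunk amount back into this comparison.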
the “Yule process” (Barabási, 2014; Barabási & Albert, 1999; not unknown to project planners and managers, just as it is
Gabaix, 2009). Preferential attachment is a procedure in not unknown that such mechanisms may be mitigated.
which some quantity, for example, money or connections in a However, project planners and managers often underestimate
network, is distributed among a number of individuals or these mechanisms and mitigation measures, due to optimism
units according to how much they already have, so that those bias, overconfidence bias, the planning fallacy, and strategic
who have much receive more than those who have little, misrepresentation. In behavioral terms, unaccounted for scope
known also as the “Matthew effect.” changes are manifestations of such underestimation on the
In project planning and management, Flyvbjerg (2009b) part of project planners, and it is in this sense bias and underes-
argued that the investments that look best on paper get timation are root causes and scope changes are just causes. But
funded and that these are the investments with the largest cost because scope changes are more visible than the underlying root
underestimates and therefore the largest need for additional causes, they are often mistaken for the cause of outcomes, for
funding during delivery, resulting in preferential attachment example, cost overrun.
of funds to these investments, once they have their initial In behavioral terms, the causal chain starts with human bias
funding. After an investment has been approved and funded, (political and cognitive), which leads to underestimation of
typically there is lock-in and a point of no return, after which scope during planning, which leads to unaccounted for scope
escalation of commitment follows, with more and more funds changes during delivery, which leads to cost overrun. Scope
allocated to the original investment to close the gap between changes are an intermediate stage in this causal chain through
the original cost underestimate and actual outturn cost which the root causes manifest themselves. Behavioral science
(Cantarelli et al., 2010b; Drummond, 2017). tells project planners and managers, “Your biggest risk is you.”
Interestingly, preferential attachment has been identified as a It is not scope changes, complexity, and so forth in themselves
causal mechanism that generates outcome distributions with a that are the main problem; it is how human beings misconceive
fat upper tail, specifically power law distributions (Barabási, and underestimate these phenomena, through optimism bias,
2014; Krapivsky & Krioukov, 2008). In the case of cost, this overconfidence bias, and strategic misrepresentation. This is a
would predict an overincidence (compared with the Gaussian) profound and proven insight that behavioral science brings to
of extreme cost overruns. So far, we have tested the thesis for project planning and management. You can disregard it, of
cost and cost overrun with the Olympic Games, where the course. But if you do, project performance would likely suffer.
thesis found strong support in the data (Flyvbjerg et al., You would be the gambler not knowing the odds of their game.
2021). Currently, we are further testing the thesis for informa- Behavioral science is not perfect. We saw above how behav-
tion technology projects, while tests of other project types are ioral economics suffers from a “psychology bias,” in the sense it
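The mechanism is easy to simulate. In the sketch below (arbitrary parameters, for illustration only), each new unit of funding goes to a unit with probability proportional to what it already holds, and holdings quickly concentrate in a fat upper tail:

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, n_rounds = 200, 50_000
funds = np.ones(n_units)  # every unit starts with one unit of money

for _ in range(n_rounds):
    # Each new unit of money goes to a unit with probability proportional
    # to its current holdings: "to those who have, more shall be given."
    winner = rng.choice(n_units, p=funds / funds.sum())
    funds[winner] += 1.0

print(f"share held by the top 10 of {n_units} units: "
      f"{np.sort(funds)[-10:].sum() / funds.sum():.0%}")
print(f"largest holding vs. median: {funds.max() / np.median(funds):.0f}x")
# The rich-get-richer dynamic concentrates funds in a fat upper tail,
# the signature of power-law-like outcome distributions.
```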
In project planning and management, Flyvbjerg (2009b) argued that the investments that look best on paper get funded, and that these are the investments with the largest cost underestimates and therefore the largest need for additional funding during delivery, resulting in preferential attachment of funds to these investments once they have their initial funding. After an investment has been approved and funded, typically there is lock-in and a point of no return, after which escalation of commitment follows, with more and more funds allocated to the original investment to close the gap between the original cost underestimate and actual outturn cost (Cantarelli et al., 2010b; Drummond, 2017).

Interestingly, preferential attachment has been identified as a causal mechanism that generates outcome distributions with a fat upper tail, specifically power law distributions (Barabási, 2014; Krapivsky & Krioukov, 2008). In the case of cost, this would predict an overincidence (compared with the Gaussian) of extreme cost overruns. So far, we have tested the thesis for cost and cost overrun with the Olympic Games, where the thesis found strong support in the data (Flyvbjerg et al., 2021). Currently, we are further testing the thesis for information technology projects, while tests of other project types are in the pipeline. Should the thesis hold across project types, we may be in the first stages of discovering a general theory of project management, with more fundamental and more scientific explanations of project outcomes than those found in conventional theory.

Discussion

Scientific revolutions rarely happen without friction. So, too, for the behavioral revolution. It has been met with skepticism, including from parts of the project management community (Flyvbjerg et al., 2018). Some members prefer to stick with conventional explanations of project underperformance in terms of errors of scope, complexity, labor and materials prices, archaeology, geology, bad weather, ramp-up problems, demand fluctuations, and so forth (Cantarelli et al., 2010a).

Behavioral scientists would agree with the skeptics that scope changes, complexity, and so forth are relevant for understanding what goes on in projects, but would not see them as root causes of outcomes. According to behavioral science, the root cause of, say, cost overrun is the well-documented fact that project planners and managers keep underestimating scope changes, complexity, and so forth in project after project.

From the point of view of behavioral science, the mechanisms of scope changes, complex interfaces, price changes, archaeology, geology, bad weather, and business cycles are not unknown to project planners and managers, just as it is not unknown that such mechanisms may be mitigated. However, project planners and managers often underestimate these mechanisms and mitigation measures, due to optimism bias, overconfidence bias, the planning fallacy, and strategic misrepresentation. In behavioral terms, unaccounted-for scope changes are manifestations of such underestimation on the part of project planners, and it is in this sense that bias and underestimation are root causes and scope changes are just causes. But because scope changes are more visible than the underlying root causes, they are often mistaken for the cause of outcomes, for example, cost overrun.

In behavioral terms, the causal chain starts with human bias (political and cognitive), which leads to underestimation of scope during planning, which leads to unaccounted-for scope changes during delivery, which leads to cost overrun. Scope changes are an intermediate stage in this causal chain through which the root causes manifest themselves. Behavioral science tells project planners and managers, "Your biggest risk is you." It is not scope changes, complexity, and so forth in themselves that are the main problem; it is how human beings misconceive and underestimate these phenomena, through optimism bias, overconfidence bias, and strategic misrepresentation. This is a profound and proven insight that behavioral science brings to project planning and management. You can disregard it, of course. But if you do, project performance would likely suffer. You would be the gambler who does not know the odds of the game.

Behavioral science is not perfect. We saw above how behavioral economics suffers from a "psychology bias," in the sense that it tends to reduce behavioral biases to cognitive biases, ignoring political bias in the process, thus committing the very sin it accuses conventional economics of, namely theory-induced blindness resulting in limited rationality. Gigerenzer (2018) goes further and criticizes behavioral economics for "bias bias," and he is right when he calls for conceptual clarification. Not all behavioral biases are well defined, or even well delineated: many and large overlaps exist among different biases that need clarification, including for the 10 described above. Just as seriously, many biases have only been documented in simplified lab experiments but are tacitly assumed to hold in real-life situations outside the lab, without sound demonstration that the assumption holds. Finally, the psychology used by behavioral economists is not considered cutting-edge by psychologists, a fact openly acknowledged by Thaler (2015, p. 180), who further admits it is often difficult to pin down which specific behavioral bias is causing outcomes in a given situation or to rule out alternative explanations (Thaler, 2015, p. 295).

Nevertheless, the behavioral revolution seems to be here to stay, and it entails an important change of perspective for project management: The problem with project cost overruns and benefit shortfalls is not error but bias, and as long as we try to solve the problem as something it is not (error), we will not succeed. Estimates and decisions need to be debiased, which is fundamentally different from eliminating error. Furthermore, the problem is not even cost overruns or benefit shortfalls; it is cost underestimation and benefit overestimation. Overrun, for instance, is mainly a consequence of underestimation, with the latter happening upstream from overrun, for big projects often years before overruns manifest. Again, if we try to solve the problem as something it is not (cost overrun), we will fail. We need to solve the problem of upstream cost underestimation in order to solve the problem of downstream cost overrun. Once we understand these straightforward insights, we understand that we and our projects are better off with an understanding of behavioral science and behavioral bias than without it.

References

Anderson, C., & Galinsky, A. D. (2006). Power, optimism, and risk-taking. European Journal of Social Psychology, 36, 511–536.

Ansar, A., Flyvbjerg, B., Budzier, A., & Lunn, D. (2014). Should we build more large dams? The actual costs of hydropower megaproject development. Energy Policy, 69, 43–56.

Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.

Association for Project Management (APM). (2012). APM body of knowledge (6th ed.). Retrieved from [Link]/body-of-knowledge/context/governance/project-management/

Bar-Hillel, M. (1980). The base-rate fallacy in probability judgments. Acta Psychologica, 44(3), 211–233.

Barabási, A.-L. (2014). Linked: How everything is connected to everything else and what it means for business, science, and everyday life. Basic Books.

Barabási, A.-L., & Albert, R. (1999). Emergence of scaling in random networks. Science, 286(5439), 509–512.

Batselier, J., & Vanhoucke, M. (2016). Practical application and empirical evaluation of reference class forecasting for project management. Project Management Journal, 47(5), 36–51.

Bizony, P. (2006). The man who ran the moon: James Webb, JFK, and the secret history of Project Apollo. Icon Books.

Bok, S. (1999). Lying: Moral choice in public and private life. Vintage. (First published 1979)

Brockner, J. (1992). The escalation of commitment to a failing course of action: Toward theoretical progress. Academy of Management Review, 17(1), 39–61.

Budzier, A. (2014). Theorizing outliers: Explaining variation in IT project performance (DPhil thesis). Green-Templeton College.

Budzier, A., & Flyvbjerg, B. (2013). Making sense of the impact and importance of outliers in project management through the use of power laws. Proceedings of IRNOP (International Research Network on Organizing by Projects), Volume 11, June, pp. 1–28.

Buehler, R., Griffin, D., & MacDonald, H. (1997). The role of motivated reasoning in optimistic time predictions. Personality and Social Psychology Bulletin, 23(3), 238–247.

Buehler, R., Griffin, D., & Ross, M. (1994). Exploring the "planning fallacy": Why people underestimate their task completion times. Journal of Personality and Social Psychology, 67, 366–381.

Cantarelli, C. C., Flyvbjerg, B., Molin, E. J. E., & van Wee, B. (2010a). Cost overruns in large-scale transportation infrastructure projects: Explanations and their theoretical embeddedness. European Journal of Transport and Infrastructure Research, 10(1), 5–18.

Cantarelli, C. C., Flyvbjerg, B., van Wee, B., & Molin, E. J. E. (2010b). Lock-in and its influence on the project performance of large-scale transportation infrastructure projects: Investigating the way in which lock-in can emerge and affect cost overruns. Environment and Planning B: Planning and Design, 37, 792–807.

Carson, T. L. (2006). The definition of lying. Noûs, 40, 284–306.

Chapman, G. B., & Johnson, E. J. (1999). Anchoring, activation, and the construction of values. Organizational Behavior and Human Decision Processes, 79(2), 115–153.

Drummond, H. (2014). Is escalation always irrational? Originally published in Organization Studies, 19(6), 1998; here from B. Flyvbjerg (Ed.), Megaproject planning and management: Essential readings (Vol. II, pp. 291–309). Edward Elgar.

Drummond, H. (2017). Megaproject escalation of commitment: An update and appraisal. In B. Flyvbjerg (Ed.), The Oxford handbook of megaproject management (pp. 194–216). Oxford University Press.

Epley, N., & Gilovich, T. (2006). The anchoring-and-adjustment heuristic: Why the adjustments are insufficient. Psychological Science, 17(4), 311–318.

Faber, T. (2019). Faber & Faber: The untold story. Faber & Faber.

Fabricius, G., & Büttgen, M. (2015). Project managers' overconfidence: How is risk reflected in anticipated project success? Business Research, 8, 239–263.
Fallis, D. (2009). What is lying? The Journal of Philosophy, 106(1), 29–56.

Feynman, R. P. (2007a). Richard P. Feynman's minority report to the space shuttle Challenger inquiry. In The pleasure of finding things out (pp. 151–169). Penguin. (First published 1999)

Feynman, R. P. (2007b). Mr. Feynman goes to Washington: Investigating the space shuttle Challenger disaster. In What do you care what other people think? Further adventures of a curious character (pp. 113–237). Penguin. (First published 1988)

Flyvbjerg, B. (1998). Rationality and power: Democracy in practice. The University of Chicago Press.

Flyvbjerg, B. (2003). Delusions of success: Comment on Dan Lovallo and Daniel Kahneman. Harvard Business Review, December, pp. 121–122.

Flyvbjerg, B. (2006). From Nobel Prize to project management: Getting risks right. Project Management Journal, 37(3), 5–15.

Flyvbjerg, B. (2009a). Optimism and misrepresentation in early project development. In T. Williams, K. Samset, & K. Sunnevag (Eds.), Making essential choices with scant information: Front-end decision making in major projects (pp. 147–168). Palgrave Macmillan.

Flyvbjerg, B. (2009b). Survival of the unfittest: Why the worst infrastructure gets built, and what we can do about it. Oxford Review of Economic Policy, 25(3), 344–367.

Flyvbjerg, B. (2013). Quality control and due diligence in project management: Getting decisions right by taking the outside view. International Journal of Project Management, 31(5), 760–774.

Flyvbjerg, B. (2014). What you should know about megaprojects and why: An overview. Project Management Journal, 45(2), 6–19.

Flyvbjerg, B. (2016). The fallacy of beneficial ignorance: A test of Hirschman's hiding hand. World Development, 84, 176–189.

Flyvbjerg, B. (2017). Introduction: The iron law of megaproject management. In B. Flyvbjerg (Ed.), The Oxford handbook of megaproject management (pp. 1–18). Oxford University Press.

Flyvbjerg, B., Ansar, A., Budzier, A., Buhl, S., Cantarelli, C., Garbuio, M., Glenting, C., Holm, M. S., Lovallo, D., Lunn, D., Molin, E., Rønnest, A., Stewart, A., & van Wee, B. (2018). Five things you should know about cost overrun. Transportation Research Part A: Policy and Practice, 118, 174–190.

Flyvbjerg, B., & Bester, D. W. (2021). The cost-benefit fallacy: Why cost-benefit analysis is broken and how to fix it. Journal of Benefit-Cost Analysis.

Flyvbjerg, B., Bruzelius, N., & Rothengatter, W. (2003). Megaprojects and risk: An anatomy of ambition. Cambridge University Press.

Flyvbjerg, B., & Budzier, A. (2011). Why your IT project may be riskier than you think. Harvard Business Review, 89(9), 23–25.

Flyvbjerg, B., Budzier, A., & Lunn, D. (2021). Regression to the tail: Why the Olympics blow up. Environment and Planning A: Economy and Space, 53(2), 233–260.

Flyvbjerg, B., Garbuio, M., & Lovallo, D. (2009). Delusion and deception in large infrastructure projects: Two models for explaining and preventing executive disaster. California Management Review, 51(2), 170–193.

Flyvbjerg, B., & Gardner, D. (2022). Big plans: Why most fail, how some succeed. Penguin Random House.

Flyvbjerg, B., Glenting, C., & Rønnest, A. (2004). Procedures for dealing with optimism bias in transport planning: Guidance document. UK Department for Transport, London, June.

Flyvbjerg, B., Holm, M. K. S., & Buhl, S. L. (2005). How (in)accurate are demand forecasts in public works projects? The case of transportation. Journal of the American Planning Association, 71(2), 131–146.

Flyvbjerg, B., Holm, M. K. S., & Buhl, S. L. (2002). Underestimating costs in public works projects: Error or lie? Journal of the American Planning Association, 68(3), 279–295.

Flyvbjerg, B., Hon, C.-k., & Fok, W. H. (2016). Reference class forecasting for Hong Kong's major roadworks projects. Proceedings of the Institution of Civil Engineers, 169(CE6), 17–24.

Flyvbjerg, B., Kao, T. C., & Budzier, A. (2014). Report to the Independent Board Committee on the Hong Kong Express Rail Link Project. In MTR Independent Board Committee, Second Report by the Independent Board Committee on the Express Rail Link Project (Hong Kong: MTR), pp. A1–A122.

Flyvbjerg, B., & Sunstein, C. R. (2017). The principle of the malevolent hiding hand; or, the planning fallacy writ large. Social Research, 83(4), 979–1004.

Fox, J. R., & Miller, D. B. (2006). Challenges in managing large projects. Defense Acquisition University Press.

Fudenberg, D., Levine, D. K., & Maniadis, Z. (2012). On the robustness of anchoring effects in WTP and WTA experiments. American Economic Journal: Microeconomics, 4(2), 131–145.

Gabaix, X. (2009). Power laws in economics and finance. Annual Review of Economics, 1, 255–293.

Gigerenzer, G. (2018). The bias bias in behavioral economics. Review of Behavioral Economics, 5, 303–336.

Goethals, G. R., Messick, D. M., & Allison, S. (1991). The uniqueness bias: Studies in constructive social comparison. In J. Suls & T. A. Wills (Eds.), Social comparison: Contemporary theory and research (pp. 149–176). Erlbaum.

Grün, O. (2004). Taming giant projects: Management of multi-organization enterprises. Springer.

Guinote, A. (2017). How power affects people: Activating, wanting, and goal seeking. Annual Review of Psychology, 68, 353–381.

Guinote, A., & Vescio, T. K. (2010). The social psychology of power. Guilford Press.

Hirschman, A. O. (2014). The principle of the hiding hand. Originally published in The Public Interest, Winter 1967, pp. 10–23; here from B. Flyvbjerg (Ed.), Megaproject planning and management: Essential readings (Vol. I, pp. 149–162). Edward Elgar.

Jones, L. R., & Euske, K. J. (1991). Strategic misrepresentation in budgeting. Journal of Public Administration Research and Theory, 1(4), 437–460.

Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.

Kahneman, D., & Lovallo, D. (1993). Timid choices and bold forecasts: A cognitive perspective on risk taking. Management Science, 39, 17–31.

Kahneman, D., & Lovallo, D. (2003). Response to Bent Flyvbjerg. Harvard Business Review, December, p. 122.
Kahneman, D., Lovallo, D., & Sibony, O. (2011). Before you make that big decision. Harvard Business Review, June, pp. 51–60.

Kahneman, D., Sibony, O., & Sunstein, C. R. (2021). Noise: A flaw in human judgment. William Collins.

Kahneman, D., & Tversky, A. (1979a). Intuitive prediction: Biases and corrective procedures. In S. Makridakis & S. C. Wheelwright (Eds.), Studies in the management sciences: Forecasting (Vol. 12, pp. 313–327). North Holland.

Kahneman, D., & Tversky, A. (1979b). Prospect theory: An analysis of decisions under risk. Econometrica, 47, 313–327.

Kain, J. F. (1990). Deception in Dallas: Strategic misrepresentation in rail transit promotion and evaluation. Journal of the American Planning Association, 56(2), 184–196.

Kazan, E. (1997). A life. Da Capo Press. (First published 1988)

Klein, G. (2007). Performing a project premortem. Harvard Business Review, September, pp. 1–2.

Krapivsky, P., & Krioukov, D. (2008). Scale-free networks as preasymptotic regimes of superlinear preferential attachment. Physical Review E, 78(026114), 1–11.

List of cognitive biases. (2021). In Wikipedia. [Link]/wiki/List_of_cognitive_biases

Lovallo, D., & Kahneman, D. (2003). Delusions of success: How optimism undermines executives' decisions. Harvard Business Review, July, 56–63.

Merrow, E. W. (2011). Industrial megaprojects: Concepts, strategies, and practices for success. Wiley.

Montealegre, R., & Keil, M. (2000). De-escalating information technology projects: Lessons from the Denver International Airport. MIS Quarterly, 24, 417–447.

Moore, D. A., & Healy, P. J. (2008). The trouble with overconfidence. Psychological Review, 115(2), 502–517.

Newby-Clark, I. R., Ross, M., Buehler, R., Koehler, D. J., & Griffin, D. (2000). People focus on optimistic scenarios and disregard pessimistic scenarios while predicting task completion times. Journal of Experimental Psychology: Applied, 6(3), 171–182.

Nouvel, J. (2009). Interview in Weekendavisen, Copenhagen, January 16, p. 4 (DR-Byen).

O'Sullivan, P. (2015). The neural basis of always looking on the bright side. Dialogues in Philosophy, Mental and Neuro Sciences, 8(1), 11–15.

Pallier, G., Wilkinson, R., Danthiir, V., Kleitman, S., Knezevic, G., Stankov, L., & Roberts, R. D. (2002). The role of individual differences in the accuracy of confidence judgments. The Journal of General Psychology, 129(3), 257–299.

Pickrell, D. (1992). A desire named streetcar: Fantasy and fact in rail transit planning. Journal of the American Planning Association, 58(2), 158–176.

Proeger, T., & Meub, L. (2014). Overconfidence as a social bias: Experimental evidence. Economics Letters, 122(2), 203–207.

Project Management Institute (PMI). (2017). A guide to the project management body of knowledge (PMBOK® guide) – Sixth edition. Author.

Ross, J., & Staw, B. M. (1986). Expo 86: An escalation prototype. Administrative Science Quarterly, 31(2), 274–297.

Ross, J., & Staw, B. M. (1993). Organizational escalation and exit: The case of the Shoreham Nuclear Power Plant. Academy of Management Journal, 36(4), 701–732.

Sharot, T. (2011). The optimism bias: A tour of the irrationally positive brain. Pantheon.

Sharot, T., Riccardi, A. M., Raio, C. M., & Phelps, E. A. (2007). Neural mechanisms mediating optimism bias. Nature, 450, 102–105.

Shepperd, J. A., Carroll, P., Grace, J., & Terry, M. (2002). Exploring the causes of comparative optimism. Psychologica Belgica, 42, 65–98.

Siilasmaa, R. (2018). Transforming Nokia: The power of paranoid optimism to lead through colossal change. McGraw Hill.

Simmons, J. P., LeBoeuf, R. A., & Nelson, L. D. (2010). The effect of accuracy motivation on anchoring and adjustment: Do people adjust from provided anchors? Journal of Personality and Social Psychology, 99(6), 917–932.

Sleesman, D. J., Conlon, D. E., McNamara, G., & Miles, J. E. (2012). Cleaning up the big muddy: A meta-analytic review of the determinants of escalation of commitment. Academy of Management Journal, 55(3), 541–562.

Staw, B. M. (1976). Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action. Organizational Behavior and Human Performance, 16(1), 27–44.

Staw, B. M. (1997). The escalation of commitment: An update and appraisal. In Z. Shapira (Ed.), Organizational decision making (pp. 191–215). Cambridge University Press.

Steinel, W., & De Dreu, C. K. W. (2004). Social motives and strategic misrepresentation in social decision making. Journal of Personality and Social Psychology, 86(3), 419–434.

Suls, J., & Wan, C. K. (1987). In search of the false uniqueness phenomenon: Fear and estimates of social consensus. Journal of Personality and Social Psychology, 52, 211–217.

Suls, J., Wan, C. K., & Sanders, G. S. (1988). False consensus and false uniqueness in estimating the prevalence of health-protective behaviors. Journal of Applied Social Psychology, 18, 66–79.

Sunstein, C. R. (2002). Probability neglect: Emotions, worst cases, and law. Yale Law Journal, 112, 61–107.

Taleb, N. N. (2004). Fooled by randomness: The hidden role of chance in life and in the markets. Penguin.

Tetlock, P. E. (2005). Expert political judgment: How good is it? How can we know? Princeton University Press.

Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The art and science of prediction. Random House.

Thaler, R. H. (2015). Misbehaving: How economics became behavioural. Allen Lane.

Thucydides. (2009). The Peloponnesian war (translated by Martin Hammond). Oxford University Press.

Turner, J. R., & Müller, R. (2003). On the nature of the project as a temporary organization. International Journal of Project Management, 21, 1–8.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Tversky, A., & Kahneman, D. (1982). Evidential impact of base rates. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 153–162). Cambridge University Press.
Wachs, M. (1989). When planners lie with numbers. Journal of the American Planning Association, 55(4), 476–479.

Wachs, M. (1990). Ethics and advocacy in forecasting for public policy. Business and Professional Ethics Journal, 9(1 and 2), 141–157.

Wachs, M. (2013). The past, present, and future of professional ethics in planning. In N. Carmon & S. S. Fainstein (Eds.), Policy, planning, and people: Promoting justice in urban development (pp. 101–119). University of Pennsylvania Press.

Webb, J. (1969). Space-age management: The large-scale approach. McGraw-Hill.

Weick, M., & Guinote, A. (2008). When subjective experiences matter: Power increases reliance on the ease of retrieval. Journal of Personality and Social Psychology, 94, 956–970.

Wilson, T. D., Houston, C. E., Etling, K. M., & Brekke, N. (1996). A new look at anchoring effects: Basic anchoring and its antecedents. Journal of Experimental Psychology: General, 125(4), 387–402.

Author Biography

Bent Flyvbjerg is the first BT Professor and inaugural Chair of Major Programme Management at the University of Oxford and the Villum Kann Rasmussen Professor and Chair at the IT University of Copenhagen. He is the most cited scholar in the world in project management. His books and articles have been translated into 20 languages. He has received numerous honors and awards, including the Project Management Institute Research Achievement Award, two Fulbright Scholarships, and a knighthood. He is a frequent commentator in the news, including The New York Times, The Economist, the Wall Street Journal, the Financial Times, the BBC, and CNN. He serves as an advisor to 10 Downing Street and government and business around the world. His most recent book is Big Plans: Why Most Fail, How Some Succeed (with Dan Gardner, Penguin Random House, 2022). He can be contacted at flyvbjerg@[Link]
