Project Management: Logic Fallacies
Failure to apply critical thinking can have a negative impact on our projects. A logical fallacy is an error in reasoning that renders an argument invalid. A key element of critical thinking is the ability to spot logical fallacies, which reduces risk and improves our chances of completing projects successfully. As project managers we must be aware of common logical fallacies such as these:
- Sunk Costs. “Sunk cost” is an economic term for any expense that can no longer be recovered. Sometimes we invest ourselves so thoroughly in a project that we’re reluctant to abandon it, even when it turns out to be fruitless. It’s natural, and usually not a fallacy, to want to carry on with something we find important, not least because of all the resources we’ve put into it. However, this thinking becomes a fallacy when we continue with a project solely because of what we’ve already spent on it.
- Positivism. This fallacy holds that if we believe we can accomplish something, we’re more likely to accomplish it; inversely, if we express doubts about accomplishing something, we’re less likely to achieve it. It is especially tempting to project managers who want to motivate reluctant teams to attempt (or keep trying to do) the impossible. While a positive attitude is helpful, the truth is more important: misplaced positivity and misplaced doubt can both lead to catastrophe.
- Hasty Generalisation. A hasty generalisation is a decision made without all the facts, when we neglect to perform our due diligence. For example, say we plan to exploit what we see as a need for some kind of new product. To support the project’s viability, we sample a very small and unrepresentative group to gauge their interest in the product. They seem to like it, so we produce large quantities, only to find that hardly anyone wants it.
- Ad Hominem. The classic ad hominem fallacy is attacking the person advancing an argument while ignoring what they’re actually saying. There may be team members we don’t personally connect with, even though they excel at their jobs. If such a person argues, say, that we must be aware of a certain project risk, and we dismiss them because “they’re always complaining”, we’re committing a logical fallacy. It’s okay not to like somebody, but do give their argument its due. A further example: during a project meeting to discuss solutions to an issue, one suggested solution is disregarded purely because it came from a less experienced project team member.
- Bandwagon. This fallacy assumes something is true (or right, or good) because other people agree with it. Related is the status appeal fallacy, in which something is considered true, right, or good because it has the reputation of lending status, or of making us look popular, important or successful. Yet people can be mistaken, confused, deceived, or even wilfully irrational. And when people act together, sometimes they become even more foolish, as in groupthink. People can be gullible, and this doesn’t change merely because they form a large group.
- Slippery Slope. This fallacy argues that a position is untenable because accepting it means the extreme version of that position must eventually be accepted too. For example, a business analyst may push back on a newly submitted project requirement because it will ‘open the flood gates’ to further requirement changes. Yet a ‘last minute’ requirement could be critical to project success.
- Appeal to Ignorance. Ignorance isn’t proof. Ignorance merely shows that we don’t know something. If someone argues that our organisation shouldn’t pursue a project because no one has ever been able to achieve this goal previously, that is not a solid argument.
- Argument from Authority. We’re told to respect authority, which is not inherently bad, but it can lead to the logical fallacy of an argument from authority. For example, if our project sponsor is making the argument, we’re more likely to listen and believe it to be true. But just because it’s coming from our sponsor doesn’t necessarily make the argument correct. It’s important to trust a person in authority only if they’ve earned that trust by being knowledgeable, experienced and skilled.
- Appeal to Tradition. Another logical fallacy is when we think, “We’ve always done things this way, so it must be right.” That’s called an appeal to tradition. But often an adherence to tradition means a reluctance to try new things. That means a retreat from innovation, which is often bad business. Projects invariably mean change.
- Appeal to Hypocrisy. This is a diversionary tactic that deflects criticism away from oneself by accusing the critic of the same problem or something comparable. It neither solves the problem nor proves one’s point, because even hypocrites can tell the truth.
- Red Herrings. A red herring is something irrelevant raised to deflect attention. For example, let’s say someone is trying to get us to construct a house on soft ground that is unlikely to sustain the weight of the building. To distract us from this fact, they move the conversation to the advantages of having great shopping nearby. Obvious deficits can be hidden behind benefits that are in fact red herrings, corrupting our decision-making.
- Straw Man. This is arguing against a position we invent because it is easy to argue against, rather than the position actually held by those who oppose our point of view. For example, we update a project sponsor with the bad news that a requirements/scope change means the original deadline can’t be met, and they respond by asking why we always deliver projects late.
- Tautology. A tautology is an argument that applies circular reasoning: the conclusion is also its own premise. For example, a project manager may claim that a project is too complex for its plan to be maintained accurately, and therefore that complex projects don’t need project plans.
- False Dilemma. This common fallacy misleads by presenting complex issues in terms of two inherently opposed sides. Instead of acknowledging that most (if not all) issues can be thought of as a spectrum of stances, the false dilemma fallacy asserts that there are only two mutually exclusive outcomes. For example: “We can either agree with this plan or let the project fail; there is no other option.”
- Tu Quoque Fallacy. The tu quoque fallacy (Latin for “you also”) is an invalid attempt to discredit an opponent by answering criticism with criticism, without ever presenting a counter-argument to the original disputed claim. Colleague: “I don’t think you would be a good fit to manage this project, John, because you don’t have a lot of experience with project management.” John: “But you don’t have a lot of experience in project management either!”
- Causation Fallacy. If two things appear to be correlated, this doesn’t necessarily mean that one caused the other. This might seem an obvious fallacy to spot, but it can be hard to catch in practice, particularly when we really want to find a correlation between two data points to prove our point. For example: our blog views were down in April; we also changed the colour of our blog header in April; therefore the new header colour caused the drop in views. Another example: if people buy our product after we place an advertisement, that doesn’t necessarily mean the ad is working; most likely there are many other variables to consider before we can substantiate that claim. In other words, one event following another does not mean there is a causal link.
- Anecdotal Evidence. This is evidence based solely on the personal experience of one person or a small number of people: a personal experience or an isolated example used in place of a sound argument or compelling evidence. Arguments that rely heavily on anecdotal evidence overlook the fact that one (possibly isolated) example can’t stand alone as definitive proof of a greater premise. Because such arguments lack tested empirical evidence, we need to be careful when drawing conclusions from them.
- Sharpshooter Fallacy. This fallacy gets its colourful name from an anecdote about a Texan who fires his gun at a barn wall and then paints a target around the closest cluster of bullet holes, pointing at the bullet-riddled target as evidence of his expert marksmanship. Those who rely on the sharpshooter fallacy cherry-pick data to support predetermined conclusions.
- Middle Ground. This fallacy assumes that a compromise between two extreme conflicting positions must be true. Arguments of this style ignore the possibility that one or both of the extremes could be completely true or completely false, rendering any compromise between them invalid. A related fallacy is false balance, which assumes that because two extreme positions exist, they must be equally valid and deserve equal weight.
- Burden of Proof Fallacy. This fallacy shifts the obligation to prove a claim onto the wrong party: a person makes a claim, another person refutes it, and the first asks them to prove that the claim is not true. An absence of evidence against something does not automatically make it true. Indeed, what is asserted without evidence can also be dismissed without evidence.
The fallacies above are best avoided through critical thinking, an essential skill for project managers. We need to wade through what everyone is saying and separate truth from nonsense. Critical thinking isn’t easy: it may require letting go of what we want to believe and embracing new information. Indeed, one of the biggest challenges in critical thinking is that we may fail to properly update our beliefs in the face of such new evidence, clinging stubbornly to old beliefs or the status quo.