Cognitive Moderators - Filters of Thinking that Change the Accuracy of Our Decisions

  • Time Pressure (brain misses information).

    Rushing – Imagine driving at about 200 miles per hour past a road sign. What did the sign say? You probably could not read it. This is what time pressure can do to decision-making. When you are rushed to make decisions, the brain does not have enough time to consider all the alternatives, risks, resources needed, and so on. It’s like driving past all the potential exits at 200 miles per hour and never seeing the other options.

    Automatic Thinking (System 1) – Time pressure triggers automatic thinking, known to cognitive scientists as System 1 thinking. Because projects are time-constrained, project personnel experience higher degrees of automatic thinking. The resulting increase in System 1 thinking causes thinking errors and cognitive biases, reduces creativity, increases reliance on heuristics (see description below), and degrades decision-making in risk and safety.

  • Cognitive Dissonance (brain avoids information).

    This term describes the mental discomfort, and resulting psychological stress, experienced when someone holds two or more contradictory beliefs, ideas, or values at the same time. When two actions or ideas are not mentally consistent with each other, people try to revise them until they become consistent. To illustrate dissonance, imagine this scenario: you made a plan for a project and handed it off to the project manager to deliver. After the project was delivered, you found that major errors in the plan caused it to finish behind schedule. Because you believed you were good at planning, you now hold two conflicting pieces of information that are causing cognitive dissonance: 1) you are a good planner, and 2) you are not as good at planning as you thought. When people experience dissonance, they decide how to resolve it, which results in one of the following actions:

    • Accept the new information (not a common disposition)

    • Reject the new information

    • Discredit the new information

    • Minimize the new information

    The mental discomfort causes people to make decisions that may not be purely logical or rational, because people often choose the option that reduces mental discomfort over the one that is correct. Cognitive dissonance is the cause of many cognitive biases.

    Optimism bias is an example of avoiding mental discomfort, where a person holds an unrealistic positive view about the future. Optimism reduces negative views about the future, avoiding the associated mental discomfort. Deliberate ignorance has a similar theme: avoidance of information that causes cognitive dissonance.

  • Inertia (brain bypasses information).

    In human behavior, inertia describes the tendency for people to maintain a stable state, whether through inaction or through persistence in a certain direction. Consider a car in motion: once the car starts moving forward, inertia keeps it going, and any steering to the left or right introduces friction and slows it down. The brain operates in a similar fashion. As people start to move in a certain direction in their decisions, actions, or mental state, any change in direction introduces friction and discomfort. Because it takes more mental energy to deal with that friction, the brain resists the change. Inertia is associated with status quo bias and can be one of the causes of resistance to change. Inertia can also be harnessed to improve decision-making: by setting defaults (as in choice architecture, or nudge theory), the right decision is placed in the path of movement, physical or mental, so that inertia carries people toward better choices.

  • Heuristics (brain defaults to wrong information).

    Heuristics are the brain’s way of automatically referencing information in a split second without having to thoroughly think through a situation. A heuristic is a mental rule of thumb, somewhat like a search engine that suggests phrases as soon as you start typing in the search bar: the computer constantly makes comparisons as you enter more information. Heuristics do the same thing; your brain constantly indexes what it sees and hears against what it thinks it knows, giving you split-second feedback on which to base a decision. The problem is that heuristics lead to more automatic decisions, which can result in errors. The two most commonly referenced heuristics are representativeness and availability.

  • Psychological Safety (brain fears certain information).

    One of the most basic moderators of cognition, and probably the most popular, is the brain’s response to threat; most of us have heard of “fight, flight, or freeze.” All humans and other mammals are constantly evaluating the environment for threats. Before we lived in civilized towns with a relative degree of safety, humans were more exposed to the elements, to predators, and to other dangerous situations. In a dangerous situation the brain is on high alert, and if there is an immediate threat, we respond by fighting the threat, fleeing the danger, or in some cases freezing and not responding (a natural reaction if one did not want to be seen by a predator).

    Above all, the brain is trying to survive in every situation. The fact that we now live in more civilized environments with reduced levels of threat does not mean the brain has shut off its threat-detection function. It is simply looking for other, more subtle threats: in the office, in a conversation with the boss, or in a project team meeting.

    Psychological safety can generally be defined as the belief that one is safe within the organization to take interpersonal risks. If a team or an entire organization does not feel safe, performance, learning, innovation, and risk identification will suffer, among many other issues. A lack of psychological safety reduces confidence and leaves people afraid of being rejected, punished, embarrassed, or socially ostracized. It is also a direct contributor to decreased trust in an organization, and, as most people know, trust plays a big part in a high-performing culture. In addition to social pressure, a lack of psychological safety can be a cause of strategic misrepresentation: as trust decreases, individuals may not feel safe communicating the realities of duration or cost estimates.

  • Cognitive Load (brain under-processes information).

    Imagine running your computer all day long, opening more and more programs as you go. You now have MS Word, Excel, and your email open, a YouTube video is running, and you are editing photos. Meanwhile, your computer is running all the background programs that keep it functioning: automatic updates, the controls for your mouse, battery monitoring, and so on. The computer is bogged down and slow because its Random Access Memory (RAM) is almost completely full; it can no longer run at full capacity, and its performance is compromised. Cognitive load in the brain works the same way: the more information you put into it throughout the day, the slower and poorer its performance becomes.

  • Social Pressure (brain changes information).

    This phenomenon is the cause of many human decision errors. Social pressure is the pressure people experience from others to make decisions that correspond with those others’ will or desires. The pressure can be real or perceived, and can stem from the social expectations of the culture, the organization, or a small group within the organization. It can also occur in temporary groups, such as a business meeting. Pressure from other people often causes humans to make decisions that are not completely logical.

    For example, in a safety or planning meeting a subject matter expert (SME) may introduce a risk that is considered uncomfortable to discuss. Because of the shared sense of discomfort around the risk, the SME decides not to push the issue and the risk is dropped from the discussion. Not discussing the risk did not make it go away, however; it just kept it from being mitigated. In this case, social pressure increased risk to the project because it suppressed the logical decision. Social pressure is also associated with strategic misrepresentation, one of the most common causes of optimistic project planning.

  • Framing (brain sees the wrong information).

    A frame can influence your choice or decision to a great extent. The same thing can be framed positively (“We’re halfway through our work, people, good job!”) or negatively (“We’re only halfway; there is still so much work to do”). Framing is often used by politicians. They (or their speechwriters) are experts at giving the truth a twist so that the frame matches their own and their constituents’ viewpoints.

    Framing is especially important when we have to make decisions under uncertainty or conditions of risk. This is the case within project management: a certain level of risk and uncertainty is present in every phase. Positive framing can lead to the identification of opportunities, while negative framing can lead to perceived threats. Loss aversion can also be used as a reframing device to help identify more risks during planning.

  • Decision Fatigue (brain loses energy to find information).

    Similar to cognitive load, decision fatigue occurs when the computer between your ears (your brain) loses energy as it makes decisions. While cognitive load represents the memory being consumed throughout the day, decision fatigue is like the computer using that memory to take action.

    Each decision, large or small, builds up throughout the day, and every decision burns calories and consumes oxygen (your brain uses about 20% of your body’s oxygen). As the day goes on, your energy for decisions decreases, much as your arms or legs tire when you use your muscles. And just like a muscle, even small actions have a cumulative draining effect. As decision fatigue increases, decision quality drops: the brain’s outputs become less rational and logical, automatic thinking increases, and reliance on cognitive biases grows.