The stellar Stephen Hawking was often credited with a quote he borrowed from the brilliant former Librarian of Congress Daniel Boorstin:
The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.
Illusions have no place in a technology project—they allow project leaders and team members to make too many important decisions based on what they think they know.
What we think we know is often only the illusion of knowledge. Our confirmation bias prevents us from seeing the real facts.
What is confirmation bias?
Confirmation bias is the natural tendency to seek, interpret, and favor information in a way that confirms your existing beliefs.
This bias affects how we form opinions, react in meetings, and make decisions. For example, if you’re discussing association management system (AMS) options with your colleagues and someone brings up AMS Brand X as a possible option, Joe immediately interjects: “No way. That one’s no good. It’s dated technology and no longer relevant.”
What Joe doesn’t tell anyone is that he hasn’t given Brand X a good look in ages. He did, however, hear someone complaining about her Brand X system at a conference a few months ago. What she didn’t tell Joe is that her Brand X system is a customized version that hasn’t been upgraded since they bought it 10 years ago!
No matter, her incomplete version of the truth supported Joe’s preconceived notion of Brand X and now he’s sharing his incomplete version of the truth with everyone else in the room. They all think, “Joe should know,” so they take Brand X off the list without learning anything more about it. “Facts” are getting the best of this group.
How confirmation bias rears its ugly head during a technology project
People see what they want to see and hear what they want to hear based on a set of beliefs and assumptions that may have nothing to do with reality.
We all do it. Confirmation bias becomes a problem when we don’t realize we’re treating these beliefs and assumptions as facts—when maybe we shouldn’t.
Planning a technology project
Take some time to anticipate where confirmation bias is likely to influence your project so you’re more likely to spot it when it arises. Doing so will give you more clarity for project planning and assessing risk.
Confirmation bias can influence assumptions about your team’s project management abilities. Don’t let your association’s healthy ‘self-image’ get in the way when a technology vendor or consultant brings up a potential concern or risk. Don’t say, “That will never happen to us.” As with phishing, everyone is susceptible.
Listen to what your technology partners are saying—they’ve been down this road many times before. And never underestimate the power of Murphy’s presence—it’s not called the Law for nothing.
A common planning fallacy is believing everything will take less time than it really will. Yes, your team is sharp and tech savvy, but maybe you shouldn’t promise to get it done by the next board meeting.
Confirmation bias also creates an exclusionary tendency when selecting project team members: “Don’t involve Mason, he knows nothing about this type of thing.” You think you know all about Mason, but do you? Did you know he helped with a system selection like this in his last job?
Famous last words: “We know everything we need to know to move forward.” The moment you catch yourself thinking that, it’s your cue to take another look at what you ‘know.’
Along with their personal agendas, complaints, and hopes, guess what else everyone brings to requirements gathering meetings? That’s right, confirmation bias. Here are some ways it may show up:
People and positions
Preconceived notions about employees in certain positions can cause others to diminish or denigrate their input—or not even give them the opportunity to provide input—during the requirements phase of a technology project.
For example, front-line staff—the people who talk to members and customers every day—aren’t always invited to requirements gathering meetings. Department heads or directors think they have a handle on what’s needed. But I’ve seen requirements epiphanies take place when front-line folks enlighten others with what they know.
Approach and solutions
Another example of bias at work: you believe an AMS that does it all—the data hub model—is the only way to go. Or perhaps you think a mix of best-of-breed systems—the distributed data model—is the smart choice.
It’s a mistake to go into a selection project saying you need Y without knowing what else is out there or understanding the benefits of other options. All you can think about is the need for an apple when an orange might be better, and now you’re blinded to the whole world of citrus!
Traditions and processes
Everyone knows and talks about this one, yet it’s still an issue for associations. When reviewing business processes, the fallback position is “because we’ve always done it that way” or “because it can only be done this way.” The blind force of tradition and weight of inertia empower confirmation bias. Remember, the devil you know is still the devil.
Another way bias comes into play is with the Snowflake Effect: “We have unique needs. Off-the-shelf software isn’t going to cut it. We need a customized product.” If you can’t keep an open mind about changing processes, you’re going to paint yourself into corners. You’ll incur much bigger bills now and later at upgrade time thanks to avoidable customization.
Get an unbiased view
In all these scenarios, it helps to have someone around who asks why—someone who forces you to think beyond the accepted norms reinforced by your team’s collective confirmation bias. A consultant who has an outsider’s perspective but an insider’s knowledge about associations can be indispensable. A business analyst who’s ‘been there, done that’ can point out where biases (opinions) don’t hold weight.
Confirmation bias isn’t limited to your internal processes and team members.
‘Halos and horns’
Biases influence opinions about people outside your organization, including technology salespeople. When you feel a positive connection with a salesperson, you’re more likely to favor their product too because of the ‘halo effect.’
At the other end of the spectrum, the ‘horns effect’ comes into play when a salesperson rubs you the wrong way. You can’t separate the person from the product, and you all lose. If you don’t take your blinders off, you’ll bring that bias into the decision-making process.
Stick to the script
How can you avoid the influence of biases during demonstrations? Focus on the features and functionality your association needs, not the fancy bells and whistles the sales team wants to show off.
Consultants come in handy here as they’ve seen hundreds of demos and know how to rein things in. They develop a demo script that keeps the sales team focused on how the product meets the requirements spelled out in your RFP and ensures all vendors are reviewed on the same criteria.
Reference check, please!
When it’s time to check vendor references, be ready to minimize the confirmation biases of your team and the vendor’s team. By now, you may have fallen hard for one of your finalists. Don’t ask references only the questions that confirm what you want to hear. Stay objective. Develop a list of questions ahead of time and ask every reference the same set of tough questions.
Vendors have their own biases and will lead you to clients who only say nice things. When possible, do your own research and find other clients who can talk candidly about the vendor. If you hear something disturbing, ask the vendor about it—there might be a good reason for the problem.
“Everyone’s going to love this new system—just like I do!” If only.
No system will be 100% perfect for everyone’s needs, so don’t let your rosy perception cloud reality. Put yourself in your colleagues’ shoes—fire up those powers of empathy—and imagine how they will feel about the transition to a new system and new processes.
Confirmation bias affects how we view others and their potential for change. When we make assumptions, we might alienate people. For example, don’t assume that Boss Bob will never learn how to use the new AMS because he’s steeped in the old way of doing things. The next thing you know, he’s excitedly talking to others about the fantastic new dashboard and reports.
Preventing confirmation bias
Confirmation bias is the invisible elephant in the room. No one admits it’s there, most don’t even know it’s there, but you have to bring it up.
Put safeguards in place so you can ward off its effects. Discuss it with your project team. Talk about how it might show up by sharing possible scenarios. Acknowledge that no one is immune. Set ground rules that allow considerate fact-checking of each other’s statements. If fact-checking sounds intimidating, remind everyone that self-awareness is the best defense against bias.
For a light approach, bring in some yellow flags and, like referees, throw the yellow confirmation bias flag when you hear someone say:
- That’s the best way to do it.
- We’ve always done it this way.
- It’s no good.
- It’s great.
Keep flags on hand for any occasion when errant assumptions may arise. These yellow flag moments are your team’s cue to ask why and to request evidence.
Some parting wisdom from the Talking Heads about the danger of confirmation bias and the illusion of knowledge:
Facts are simple and facts are straight
Facts are lazy and facts are late
Facts all come with points of view
Facts don't do what I want them to
Facts just twist the truth around
Facts are living turned inside out
Facts are getting the best of them
Facts are nothing on the face of things
Facts don't stain the furniture
Facts go out and slam the door
Facts are written all over your face
Facts continue to change their shape
Confirmation bias in action
A common misconception of associations and other organizations is that they aren’t likely to be victims of cyberattacks because they’re too small or they’re a niche industry. But that’s not the case—anyone and everyone is a potential victim. Download our cybersecurity watchlist to get the facts and increase your awareness.