Load this file when you need to challenge assumptions, detect blind spots, or validate decision quality.
Research from McKinsey (2019) shows that removing bias from decisions improves ROI by up to 7%. Nobel laureate Daniel Kahneman demonstrated that cognitive biases affect even the most experienced executives. Debiasing is not about being smarter — it's about having a structured process to catch what your brain naturally hides from you.
Jeff Bezos' insight: "What we need to do is come up with frameworks for making good decisions... When you're 90 and looking back on your life, you want to minimize regret."
## The 12-Bias Scan

Run this scan before any major decision. For each bias, ask the diagnostic question; if the answer triggers concern, apply the debiasing technique.
### 1. Anchoring

**What it is:** Over-relying on the first piece of information encountered.

**Diagnostic:** "Was there a number, price, or estimate mentioned early that we keep referencing?"

**Example:** A competitor's acquisition price anchors your own valuation expectations.

**Debiasing techniques:**
- Generate your own estimate BEFORE looking at external anchors
- Use multiple independent estimation methods (bottom-up, top-down, comparable)
- Ask: "If the anchor number didn't exist, what would we estimate?"
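The independent-estimate step above can be sketched in a few lines of Python. Every figure below is a hypothetical illustration, not real data.

```python
def independent_estimate(estimates):
    """Average estimates produced by independent methods, BEFORE seeing the anchor."""
    return sum(estimates) / len(estimates)

# Three methods, run before looking at the competitor's price (hypothetical $M):
bottom_up = 28.0    # sum of asset and unit values
top_down = 34.0     # market size x assumed share
comparable = 31.0   # median of similar deals

our_estimate = independent_estimate([bottom_up, top_down, comparable])
anchor = 50.0       # competitor's acquisition price, consulted only afterwards

print(f"Independent estimate: ${our_estimate:.1f}M vs anchor: ${anchor:.1f}M")
```

The point of the ordering is mechanical: the anchor variable is only read after `our_estimate` is already computed.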
### 2. Confirmation Bias

**What it is:** Seeking only information that supports your existing belief.

**Diagnostic:** "Have we actively searched for evidence AGAINST our preferred option?"

**Example:** The CEO wants to enter a new market and only reads bullish analyst reports.

**Debiasing techniques:**
- Assign a "Red Team" member whose job is to argue against the preferred option
- Before searching, write down what evidence would CHANGE YOUR MIND
- Deliberately search for counter-evidence and failure cases using available search or browsing tools
- Ask: "What would our smartest critic say about this plan?"
### 3. Sunk Cost Fallacy

**What it is:** Continuing a failing course of action because of past investment.

**Diagnostic:** "Would we start this project today if we hadn't already invested $X?"

**Example:** "We've spent $2M on this product — we can't stop now."

**Debiasing techniques:**
- Frame the decision as: "We have $Y remaining budget. Is THIS the best use of that $Y?"
- Ignore all past investment in the analysis — only forward-looking costs and benefits matter
- Set "kill criteria" at the start of every project — pre-commit to stopping conditions
### 4. Survivorship Bias

**What it is:** Focusing on successful examples while ignoring failures.

**Diagnostic:** "Are we only looking at companies that succeeded with this strategy?"

**Example:** "Airbnb pivoted and succeeded, so we should pivot too."

**Debiasing techniques:**
- Actively search for companies that tried the same approach and FAILED
- Calculate base rates: "Of all companies that attempted X, what % succeeded?"
- Ask: "What's different about the failures? Are we more like the successes or failures?"
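The base-rate question can be made concrete with a quick calculation; the counts below are invented for illustration.

```python
def base_rate(successes, attempts):
    """Fraction of ALL attempts (survivors and failures alike) that succeeded."""
    if attempts == 0:
        raise ValueError("no attempts recorded")
    return successes / attempts

# Suppose research surfaces 120 companies that tried the same pivot,
# of which 18 succeeded -- the visible successes hide 102 failures:
rate = base_rate(18, 120)
print(f"Base rate of success: {rate:.0%}")
```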
### 5. Availability Heuristic

**What it is:** Overweighting information that comes to mind easily (recent, vivid, emotional).

**Diagnostic:** "Is a recent event or dramatic story driving our risk assessment?"

**Example:** After a competitor's data breach, overinvesting in security at the expense of growth.

**Debiasing techniques:**
- Use actual data and base rates instead of anecdotes
- Create a structured risk register with historical frequency data
- Ask: "If this event hadn't happened last month, would we still prioritize this?"
### 6. Overconfidence

**What it is:** Overestimating the accuracy of your own predictions.

**Diagnostic:** "How often have our past predictions been correct?"

**Example:** "I'm 90% sure this product will hit $10M ARR in year one."

**Debiasing techniques:**
- Widen confidence intervals by 2x (if you think 80-120, use 60-140)
- Track prediction accuracy over time in a decision journal
- Ask 3 independent people for their estimates, then average
- Use Reference Class Forecasting: compare to similar past projects
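The interval-widening and averaging rules above can be sketched as follows, with illustrative numbers only:

```python
def widen_interval(low, high, factor=2.0):
    """Widen a confidence interval about its midpoint by `factor`."""
    mid = (low + high) / 2
    half = (high - low) / 2 * factor
    return (mid - half, mid + half)

def averaged_estimate(estimates):
    """Average independent estimates gathered before any group discussion."""
    return sum(estimates) / len(estimates)

print(widen_interval(80, 120))              # the 80-120 -> 60-140 example
print(averaged_estimate([8.0, 11.0, 5.0]))  # three independent ARR guesses ($M)
```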
### 7. Groupthink

**What it is:** Conforming to group consensus without critical evaluation.

**Diagnostic:** "Has anyone in the room voiced strong disagreement?"

**Example:** The entire leadership team agrees on a strategy, but no one played devil's advocate.

**Debiasing techniques:**
- Use "Brainwriting" — everyone writes their opinion BEFORE group discussion
- Assign a rotating "Devil's Advocate" role in every decision meeting
- CEO speaks LAST in discussions (to avoid HiPPO effect — Highest Paid Person's Opinion)
- Amazon's practice: Start meetings with silent reading of a written memo
### 8. Status Quo Bias

**What it is:** Preferring the current state of affairs even when change is beneficial.

**Diagnostic:** "If we were starting from scratch today, would we choose our current approach?"

**Example:** Keeping an underperforming sales team structure because "it's always been this way."

**Debiasing techniques:**
- Apply the "Clean Sheet" test: Design the ideal state ignoring current reality
- Calculate the COST of inaction — not acting is itself a decision with consequences
- Ask: "A new CEO with no history here — what would they do differently?"
### 9. Planning Fallacy

**What it is:** Underestimating the time, cost, and risk of future actions.

**Diagnostic:** "Are we basing estimates on best-case scenarios?"

**Example:** "This integration will take 3 months" → Actually takes 9 months.

**Debiasing techniques:**
- Use "Reference Class Forecasting" — how long did SIMILAR projects take?
- Add a "planning buffer": 1.5x for familiar tasks, 2-3x for novel ones
- Ask the team implementing it (not the person proposing it) for estimates
- Pre-mortem: "Assume it took 3x longer than planned — what went wrong?"
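The buffer rule can be applied mechanically. The 2.5x multiplier for novel work is an assumption taken from the middle of the 2-3x range above.

```python
def buffered_estimate(optimistic_weeks, novel=False):
    """Apply the planning buffer: 1.5x for familiar tasks, ~2.5x for novel ones."""
    multiplier = 2.5 if novel else 1.5
    return optimistic_weeks * multiplier

print(buffered_estimate(6))              # familiar task: 9.0 weeks
print(buffered_estimate(3, novel=True))  # novel integration: 7.5 weeks
```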
### 10. Framing Effect

**What it is:** Being influenced by HOW information is presented rather than WHAT it says.

**Diagnostic:** "Would my decision change if the same data were presented differently?"

**Example:** "90% survival rate" feels different from "10% mortality rate", yet the data are the same.

**Debiasing techniques:**
- Reframe the data in multiple ways (gains vs losses, absolute vs percentage)
- Ask: "How would this look from our competitor's / customer's perspective?"
- Strip emotional language — present raw numbers first, narrative second
### 11. Curse of Knowledge / Dunning-Kruger

**What it is:** Experts overlook fundamentals they take for granted; non-experts overestimate their own understanding.

**Diagnostic:** "Are we relying on a domain expert who hasn't been challenged by outsiders?"

**Example:** The CTO says "AI will solve this" without being questioned by non-technical leaders.

**Debiasing techniques:**
- Have the expert explain to a smart non-expert (forces clarity)
- Ask: "What would a beginner ask about this that we're overlooking?"
- Seek external advisors who don't share your team's assumptions
### 12. Loss Aversion

**What it is:** Fearing losses more than valuing equivalent gains (roughly a 2:1 ratio).

**Diagnostic:** "Are we avoiding a good opportunity because we're afraid of losing what we have?"

**Example:** Not launching a disruptive product because it might cannibalize existing revenue.

**Debiasing techniques:**
- Quantify both the loss AND the opportunity cost of not acting
- Ask: "If a competitor does this instead of us, what do we lose?"
- Reframe losses as "investments" with expected returns
- Bezos' Regret Minimization: "Will I regret NOT doing this in 10 years?"
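Quantifying both sides of the ledger forces the comparison. The helper name and all dollar figures below are hypothetical.

```python
def case_for_acting(new_revenue, cannibalized, competitor_loss):
    """Net case for launching ($M): new revenue, minus cannibalized revenue,
    plus the revenue at risk if a competitor moves first while we wait."""
    return new_revenue - cannibalized + competitor_loss

# Launch may cannibalize $3M of existing revenue, earn $5M new, and
# pre-empt a $4M hit from a competitor launching the same product:
print(case_for_acting(5.0, 3.0, 4.0))  # 6.0, i.e. $6M in favor of acting
```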
## Debiasing Techniques

### Pre-Mortem Analysis

**Origin:** Gary Klein, psychologist

**Process:**
- "It's 18 months from now. This decision was a catastrophic failure."
- Each person INDEPENDENTLY writes down why it failed (3 minutes, no talking)
- Share and compile all failure modes
- Rate probability × impact for each
- Create mitigation plans for top-5 risks
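Steps 4-5 above can be sketched as a small scoring pass; the failure modes and scores are invented examples.

```python
def top_risks(failure_modes, n=5):
    """Rank failure modes by probability x impact, highest first."""
    return sorted(failure_modes,
                  key=lambda f: f["prob"] * f["impact"],
                  reverse=True)[:n]

modes = [
    {"name": "key hire falls through", "prob": 0.3, "impact": 8},
    {"name": "integration slips",      "prob": 0.6, "impact": 6},
    {"name": "regulatory block",       "prob": 0.1, "impact": 10},
]

for risk in top_risks(modes):
    print(risk["name"], round(risk["prob"] * risk["impact"], 2))
```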
### Red Team

**Process:**
- Assign 1-2 smart people to build the STRONGEST POSSIBLE case AGAINST the decision
- They must argue as if they genuinely believe the opposite — not strawman it
- The Red Team presents their case formally
- The decision maker must address every point before proceeding
- If the Red Team's argument can't be convincingly refuted, reconsider the decision
### First Principles Thinking

**Origin:** Aristotle; popularized by Elon Musk

**Process:**
- Identify the conventional wisdom / assumption ("Batteries will always be expensive")
- Break it down to fundamental truths ("What are batteries made of? What do those materials cost?")
- Rebuild reasoning from the ground up, ignoring analogies and conventions
- Ask at each step: "Is this NECESSARILY true, or just commonly assumed?"
- Often reveals that 80% of "constraints" are actually assumptions
**Elon Musk's example:**
- Convention: "Battery packs cost $600/kWh — that's just what they cost"
- First Principles: "What are the raw material costs? Carbon, nickel, aluminum, polymers = $80/kWh"
- Conclusion: "We need to find clever ways to combine those materials — the cost SHOULD be much lower"
- Result: Tesla drove battery costs below $100/kWh
### Reference Class Forecasting

**Origin:** Daniel Kahneman & Amos Tversky

**Process:**
- Identify a reference class (similar projects/decisions in the past)
- Determine the distribution of outcomes for that reference class
- Use the reference class BASE RATE as your starting estimate
- Adjust for the specific factors of your situation (but conservatively)
**Example:**
- "Only 10% of SaaS acquisitions achieve projected synergies within 2 years"
- "Our situation is better because X" → Adjust to 20%, not 80%
- "Therefore our financial model should use 20% synergy realization, not 100%"
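The conservative adjustment can be sketched as follows. The cap on upward adjustment is an illustrative assumption, not part of the original method.

```python
def reference_class_forecast(base_rate, adjustment=0.0, max_adjustment=0.10):
    """Start from the reference-class base rate; allow only a small,
    capped upward adjustment for 'our situation is better' arguments."""
    return min(base_rate + min(adjustment, max_adjustment), 1.0)

# 10% base rate; we believe we're better, but cap the optimism:
estimate = reference_class_forecast(0.10, adjustment=0.25)
print(f"Model synergy realization at {estimate:.0%}, not 100%")
```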
### The 10/10/10 Rule

**Process:**
- "How will I feel about this decision in 10 minutes?"
- "How will I feel about this decision in 10 months?"
- "How will I feel about this decision in 10 years?"
- Decisions that feel scary at 10 minutes but right at 10 years → probably good
- Decisions that feel great at 10 minutes but risky at 10 years → proceed with caution
### Regret Minimization Framework

**Origin:** Jeff Bezos, used in his decision to start Amazon

**Process:**
- Project yourself to age 80
- Ask: "Will I regret NOT trying this?"
- Minimize lifetime regret, not short-term risk
- Especially powerful for Type 1 decisions that involve leaving a safe position
- Bezos: "I knew that when I was 80, I would never regret having tried this... but I would regret not trying."
## Technique Selection by Decision Type

| Decision Type | Minimum Debiasing | Recommended Deep Dive |
|---|---|---|
| Type 2 (reversible) | Quick bias scan (2 min) | Not usually needed |
| Type 1 (irreversible) | Full 12-bias scan + Pre-Mortem | Red Team + First Principles |
| Crisis | Skip — use OODA Loop | Post-crisis debrief with bias review |
| Strategic Bet | Full 12-bias scan + Reference Class | All techniques |
| Stakeholder Navigation | Framing Effect + Groupthink check | Full scan if high stakes |
## Bias Check Report
**Decision:** [What's being decided]
**Biases Detected:**
| # | Bias | Evidence | Severity | Mitigation Applied |
|---|------|----------|----------|---------------------|
| 1 | Anchoring | CEO referenced competitor's $50M raise | Medium | Generated independent valuation |
| 2 | Confirmation | Only bullish market data presented | High | Searched for bear cases, found 3 |
| 3 | Planning Fallacy | Engineering estimates "6 weeks" | High | Reference class says 12-16 weeks |
**Post-Debiasing Adjustment:**
- Original confidence: 85% → Adjusted confidence: 60%
- Original timeline: 6 weeks → Adjusted timeline: 12-16 weeks
- Original expected outcome: $5M → Adjusted: $2-4M range
**Key Insight:** [The single most important thing debiasing revealed]