Recognizing Bias in Decision-Making
Unconscious biases are automatic assumptions—formed outside our awareness—that influence how we perceive, interact with, and evaluate others. Everyone has them. They are not a sign of personal failure; they’re a byproduct of how our brains filter vast amounts of information quickly.
While bias can help us make rapid decisions, it can also lead to unintended inequity—especially in moments like hiring, performance reviews, or promotion discussions.
🌍 Why It Matters at GitLab
GitLab is a global, remote-first company with team members across geographies, cultures, time zones, and identities. In this environment:
- Visibility varies widely
- Communication styles differ
- Cultural norms are not universal
We must actively recognize and navigate bias to uphold our values of transparency, inclusion, and collaboration—especially when assessing performance, assigning stretch opportunities, or recommending promotions.
🔍 Biases to Watch For in Performance & Promotion
| Bias | What It Is | Example |
|---|---|---|
| Recency Bias | Overemphasizing recent performance or incidents | During a promotion review, most praise for Alina centers on a successful product launch two weeks ago. Jordan, who contributed consistently all year, is overlooked because their work wasn’t recent or high-profile. |
| Horns Effect | A single early misstep colors the entire evaluation | Taylor missed a deliverable in their second month. Since then, they’ve improved documentation and delivered reliably—but are still labeled “not dependable.” Others with similar early errors aren’t held to the same standard. |
| Proximity Bias | Favoring those more visible or geographically close | Sasha attends real-time syncs with their manager and receives frequent praise. Deji, who works async in a different region, contributes just as much—but gets fewer shout-outs and less visibility during reviews. |
| Affinity Bias | Favoring people similar in background or style | A reviewer strongly supports Alex, saying “we collaborate well.” Meanwhile, Samira—who communicates differently but leads across cultures—is called “solid, but not a strong fit for the next level.” |
| Halo Effect | Letting one strength overshadow other concerns | Linh led a big launch recently, earning strong praise. But earlier collaboration gaps go unmentioned. Mateo, who showed inclusive leadership all year, is overlooked without a visibility “spike.” |
🛠️ How to Interrupt Bias
Bias can’t be eliminated—but it can be recognized and managed. Try the following practices:
- Slow down decision-making for performance and promotion reviews
- Review full-cycle evidence, not just recent activity
- Ask: “Would I make the same judgment if this person had a different background or style?”
- Include feedback from cross-functional and async collaborators
- Use rubrics and artifacts (1:1 notes, design docs, MR comments) to supplement visibility gaps
💡 Tip: Promotion reviewers should apply the same rigor we use in code review—checking for bugs, bias, and missed opportunities.
✋ More Examples of Bias Types
Other common biases include:
- Confirmation Bias – Seeking only evidence that confirms a belief
- Authority Bias – Giving undue weight to someone in power
- Name Bias – Making assumptions based on perceived origin of a name
- Gender, Age, or Height Bias – Associating leadership or competence with physical or demographic traits
- Cultural Assumption Bias – Mistaking differences in style for lack of engagement
See additional definitions in the Bias Glossary - Link coming soon
🧪 Want to Test Your Bias?
The Implicit Association Test (IAT) is a free research-based tool that can help you explore unconscious associations around race, gender, age, and more.
Using these tools doesn’t make you biased. It makes you aware—and awareness is the first step toward equitable decisions.
📚 Related Resources
- DIB Examples in Promotion Packets - Link TBC
- The SPACE2 Model of Inclusion - Six evidence-based techniques for managing bias in oneself and others
- GitLab Performance Review Guide
- Bias in Interviewing