
Dual-Process Theory

Released
theory

Apply dual-process theory to diagnose whether judgments arise from fast intuitive (System 1) or slow analytical (System 2) processing and identify resulting cognitive biases. Use this skill when the user needs to explain why quick decisions go wrong, design choice architectures that account for cognitive defaults, audit decision processes for heuristic errors, or when they ask 'why do people misjudge probability', 'how to reduce snap-judgment errors', or 'when does intuition fail'.

Academic research skill: analysis and application of Dual-Process Theory.


Overview

Dual-process theory (Kahneman, 2011; Stanovich & West, 2000) distinguishes two modes of cognitive processing: System 1 (fast, automatic, heuristic-driven) and System 2 (slow, deliberate, rule-based). Most judgments default to System 1, which is efficient but prone to systematic biases when heuristics misfire.

When to Use

  • Explaining why stakeholders make predictable judgment errors under time pressure or complexity
  • Designing decision environments (nudges, checklists) that compensate for System 1 defaults
  • Auditing existing processes to identify where heuristic shortcuts introduce risk
  • Evaluating when intuitive expertise is reliable vs. when it is misleading

When NOT to Use

  • When decisions are already well-structured with algorithmic procedures (bias is engineered out)
  • As an excuse to dismiss all intuitive judgment — expert intuition can be accurate in high-validity environments
  • When the problem is motivational rather than cognitive (people know the right answer but choose otherwise)

Assumptions

IRON LAW: System 1 operates by DEFAULT — System 2 engagement requires cognitive effort and is easily depleted. Under time pressure, cognitive load, or ego depletion, System 1 dominates and heuristic biases amplify.

Key assumptions:

  1. System 1 and System 2 are metaphors for processing modes, not discrete brain systems
  2. Heuristics are generally adaptive — biases emerge at the boundary conditions
  3. System 2 can override System 1, but only when cued and when cognitive resources are available

Framework

Step 1 — Identify the Judgment or Decision Context

Characterize the decision: time pressure, complexity, familiarity, stakes, emotional involvement.

Step 2 — Classify Processing Mode

Feature       System 1                  System 2
Speed         Fast, automatic           Slow, effortful
Awareness     Unconscious               Conscious
Capacity      High (parallel)           Low (serial)
Basis         Heuristics, associations  Rules, logic
Error type    Systematic biases         Computational mistakes
Triggered by  Default, familiarity      Novelty, conflict detection
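The classification in Steps 1 and 2 can be sketched as a simple rule of thumb. This is an illustrative sketch, not part of the skill: the function name, features, and rules below are hypothetical, chosen to mirror the table's "Triggered by" row and the IRON LAW above.

```python
# Hypothetical sketch: estimate which processing mode will likely
# dominate a judgment, given the context features from Step 1.
# The rules mirror the table above: System 2 engages only when cued
# (novelty/conflict) AND cognitive resources are not already taxed.

def likely_mode(time_pressure: bool, high_load: bool,
                familiar: bool, conflict_detected: bool) -> str:
    """Return 'System 1' or 'System 2' for a decision context."""
    if conflict_detected and not (time_pressure or high_load):
        return "System 2"
    # Familiar, pressured, or loaded contexts default to System 1.
    return "System 1"

# A familiar judgment made under deadline pressure defaults to System 1.
print(likely_mode(time_pressure=True, high_load=False,
                  familiar=True, conflict_detected=False))  # System 1
```

The point of the sketch is the asymmetry: System 1 is the default branch, and System 2 needs both a trigger and spare capacity.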

Step 3 — Map Relevant Heuristics and Biases

Common System 1 heuristics and their failure modes:

  • Availability: judge frequency by ease of recall — biased by salience and recency
  • Representativeness: judge probability by similarity — ignores base rates
  • Anchoring: estimate by adjusting from initial value — insufficient adjustment
  • Affect: judge risk/benefit by emotional reaction — neglects statistical evidence
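The base-rate neglect driven by representativeness can be made concrete with Bayes' rule. The numbers below are illustrative (a 90%-accurate screening test for a condition with a 1% base rate), not from the source:

```python
# Illustrative worked example of base-rate neglect. System 1 judges by
# representativeness ("the test is positive, so the condition is
# likely"); Bayes' rule shows why that intuition fails for rare events.

def posterior(prior: float, sensitivity: float,
              false_positive_rate: float) -> float:
    """P(condition | positive test) via Bayes' rule."""
    true_pos = prior * sensitivity                 # has it, tests positive
    false_pos = (1 - prior) * false_positive_rate  # lacks it, tests positive
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.10)
print(f"{p:.2%}")  # 8.33% — far below the intuitive ~90%
```

With a 1% base rate, most positive results come from the much larger healthy population, which is exactly the information the representativeness heuristic discards.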

Step 4 — Design Intervention

  • De-bias: slow down decisions, require explicit justification, use pre-mortems
  • Nudge: restructure choice architecture to align System 1 defaults with desired outcomes
  • Leverage: use System 1 strengths (pattern recognition) in high-validity, rapid-feedback domains
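A de-biasing intervention of the "require explicit justification" kind can be sketched as a decision gate. This is a hypothetical illustration: the field names below are invented for the example, not prescribed by the skill.

```python
# Hypothetical sketch of a "slow down" de-biasing gate: a decision
# record is accepted only when its justification fields are filled in,
# forcing System 2 engagement before sign-off.

REQUIRED_FIELDS = ("evidence", "base_rate_considered", "premortem_risks")

def gate(decision: dict) -> bool:
    """Reject any decision record missing an explicit justification."""
    return all(decision.get(field) for field in REQUIRED_FIELDS)

draft = {"evidence": "Q3 pilot data", "base_rate_considered": ""}
print(gate(draft))  # False: base rate and pre-mortem still missing
```

Note that this changes the environment rather than the individual, which (per the Gotchas below) is the more reliable lever.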

Output Format

Gotchas

  • System 1/System 2 is a useful metaphor, not a literal brain architecture — avoid reifying the distinction
  • Expert intuition (System 1) is highly accurate in domains with clear feedback and regular patterns (e.g., chess, firefighting)
  • De-biasing training has poor transfer — changing the environment is more effective than training individuals
  • Cognitive depletion effects are debated; do not assume a simple "willpower battery" model
  • System 2 is not inherently "better" — it is slower, more costly, and still subject to motivated reasoning
  • People often confuse confidence with accuracy; high System 1 confidence does not indicate correctness

References

  • Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
  • Stanovich, K. E. & West, R. F. (2000). Individual differences in reasoning: implications for the rationality debate. Behavioral and Brain Sciences, 23(5), 645-665.
  • Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: heuristics and biases. Science, 185(4157), 1124-1131.

Tags

dual-process, system-1, system-2, heuristics