Hallucination Risk and Escalation

When to trust AI and when to escalate to a human.

AI can invent facts, citations, and legal conclusions. Know when to trust it and when to escalate.

What is hallucination?

Models sometimes generate plausible-sounding but false information. They may invent case names, statute sections, or dates. They do not "know" the law; they predict likely text, one token at a time, with no built-in check against reality.
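
As a toy illustration, the sketch below samples a continuation from a hand-written probability table. The prompt, the "case" names, and the probabilities are all invented for this example; no real model or real citations are involved.

```python
# Toy sketch of next-token prediction (invented probabilities, not a real
# model): the output is whatever is statistically likely, and no step
# checks it against reality.
import random

# Made-up continuations for the prompt "See Smith v."; none of these
# cases exist, and reading plausibly is all the model optimizes for.
candidates = {
    "Jones, 123 F.3d 456 (9th Cir. 1997).": 0.5,
    "Allen, 789 N.E.2d 12 (Ohio 2003).": 0.3,
    "Baker, 45 P.2d 901 (1935).": 0.2,
}

tokens, weights = zip(*candidates.items())
print("See Smith v.", random.choices(tokens, weights=weights)[0])
# Every possible output reads like a real citation; none was verified.
```

Whatever this prints looks like a citation, because looking plausible is the only criterion. That is why citations need independent verification.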

When to verify everything

  • Legal conclusions: Never rely on AI for final legal advice.
  • Citations: Always look up statutes and cases yourself.
  • Deadlines and procedures: Confirm against court rules and local practice.
  • Client-specific facts: AI has no access to your matter, so verify every matter-specific detail. (A minimal sketch of this checklist follows the list.)
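
If your team routes AI drafts through tooling, the checklist above can be expressed as a simple gate. This is a minimal sketch under assumed category labels (the names and the helper are hypothetical, not part of any product); it encodes only the rule that output touching these categories must go to a human.

```python
# Minimal sketch (hypothetical helper and category labels): flag AI
# output that falls into the always-verify categories listed above.
MUST_VERIFY = {
    "legal_conclusion",  # never rely on AI for final legal advice
    "citation",          # look up statutes and cases yourself
    "deadline",          # confirm against court rules and local practice
    "client_fact",       # AI has no access to your matter
}

def requires_human_verification(categories: set[str]) -> bool:
    """Return True if the draft touches any always-verify category."""
    return not MUST_VERIFY.isdisjoint(categories)

# A draft containing a citation must be checked by a person.
assert requires_human_verification({"citation", "summary"})
# A plain summary does not trip the mandatory-verification rule.
assert not requires_human_verification({"summary"})
```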

When to escalate

  • Unclear or conflicting outputs
  • High-stakes decisions (litigation, transactions, regulatory)
  • Jurisdiction-specific questions
  • Anything you would normally run by a senior attorney

Use AI to speed up research and drafting. Use humans to validate and decide.