Documented AI Failure Archive

Verified reports of ChatGPT and other AI chatbot incidents, with sources

This archive documents reported AI failures drawn from credible sources, including court records, news outlets, academic research, and verified user accounts. Each entry is categorized, dated, and linked to primary sources where available. We archive the reports; we do not make the underlying claims ourselves.

600+ Legal Hallucination Cases
1,314+ Service Outages Tracked
351+ User Testimonials

Legal Hallucination Cases

Court Case

Noland v. Land of the Free, L.P.

September 2025 | California Court of Appeal | CalMatters

Attorney sanctioned $10,000 after 21 of 23 case quotations in a brief were found to have been fabricated by ChatGPT. Reported as the largest AI-related sanction in California state court history.

Court Case

Chicago Housing Authority Case

Summer 2025 | Illinois Court Records

Attorney cited the non-existent Illinois Supreme Court case "Mack v. Anderson," later stating she had not believed ChatGPT was capable of inventing false precedent.

Court Case

Johnson v. Dunn

July 2025 | N.D. Alabama Federal Court

Court distinguished between attorneys who took remedial steps and those facing potential disbarment proceedings over AI-generated fake citations.

Court Case

ByoPlanet International Case

2025 | S.D. Florida Federal Court

An attorney's paralegal drafted pleadings with ChatGPT, and they were filed without proper review. The judge dismissed all four matters, ordered payment of fees, and referred the attorney to the Florida Bar.

Court Case

Arizona Social Security Disability Case

August 2025 | Arizona Federal Court | AZ PBS

12 of 19 cited cases were "fabricated, misleading, or unsupported." The attorney's temporary permission to appear was revoked, and the attorney is required to disclose the sanctions to all judges.

Mental Health Incidents

Lawsuit

Raine Family v. OpenAI

August 2025 | California State Court | Senate Testimony

Parents allege ChatGPT discouraged their teenage son from discussing suicidal thoughts with them. The father testified before the Senate Judiciary Committee.

Settlement

Character.AI Setzer Settlement

January 7, 2026 | Settlement | AI Incident Database

Google and Character.AI reached a mediated settlement with the family of a 14-year-old who died after a reported dependency on an AI chatbot.

Lawsuit

Character.AI Texas Family Lawsuit

2025 | Texas Court Filing

A Texas family alleges that a Character.AI chatbot sexually exploited their child and encouraged self-harm.

Government & Enterprise Failures

Government

NYC MyCity Chatbot Failures

2024 | New York City | News Reports

The city's MyCity chatbot gave incorrect information about Section 8 housing vouchers, worker pay regulations, and industry-specific requirements.

Government

Deloitte Australia GPT Report

2025 | Australian Government Review

Deloitte used a GPT model to help prepare a 237-page report for the Australian government. Reviewers discovered fabricated references and citations to non-existent sources.

Enterprise

Replit AI Database Deletion

July 2025 | SaaStr Incident | Tech News

An AI coding assistant wiped a production database despite explicit instructions not to modify production code.

AI Chatbot Safety Incidents

Safety

Grok Provided Home Invasion Instructions

July 2025 | xAI | Verified Report

xAI's Grok provided detailed instructions for breaking into a politician's home, including lock-picking guidance and an analysis of the target's sleep schedule.

Safety

Grok "MechaHitler" Incident

July 2025 | xAI | X Platform

Grok repeatedly posted antisemitic content and declared itself "MechaHitler," forcing X to temporarily take the chatbot offline.

Safety

Florida School AI False Alarm

2025 | Florida | News Report

A school entered a code-red lockdown after a $250,000-per-year AI weapon detection system mistook a clarinet for a firearm.

Safety

Meta AI Persona "Billie" Incident

2025 | Meta | Incident Database

A 76-year-old man who believed Meta's AI persona "Big sis Billie" was a real person traveled to New York to meet "her," suffered a fall on the way, and died.

About This Archive: We document reported incidents from verifiable sources. Inclusion in this archive does not constitute a legal judgment. We encourage readers to consult primary sources and draw their own conclusions. Submit your own experience or contact us with corrections.