Wednesday, September 24

HURDLES FOR PLAINTIFFS TO OVERCOME

Though the door to liability for chatbot providers is now open, other issues could keep families from recovering any damages from the bot providers. Even if ChatGPT and its competitors are not immune from lawsuits, and even if courts accept a product liability framework for chatbots, lack of immunity does not equal victory for plaintiffs.

Product liability cases require the plaintiff to show that the defendant caused the harm at issue. This is particularly difficult in suicide cases, as courts tend to find that, regardless of what came before, the only person responsible for a suicide is the person who died.

Whether it’s an angry argument with a significant other that ends in a cry of “why don’t you just kill yourself,” or a gun design that makes self-harm easier, courts tend to assign blame to the person alone, not to the people and devices the person interacted with along the way.

But without the protection of immunity that digital platforms have enjoyed for decades, tech defendants face much higher costs to secure the same victory they used to receive automatically. In the end, the story of the chatbot suicide cases may be one of more settlements on confidential, but lucrative, terms for the families.

Meanwhile, bot providers are likely to add more content warnings and trigger bot shutdowns more readily when users enter territory the bot is set to consider dangerous. The result could be a safer, but less dynamic and useful, world of bot “products.”

Brian Downing is Assistant Professor of Law at the University of Mississippi. This commentary first appeared on The Conversation.


© 2025 The News Singapore. All Rights Reserved.