If I'm understanding this correctly, you can be held liable for whatever ChatGPT produces in response to your queries, if any of that content turns out to be damaging.
It makes sense, right?
They produced a language model. It does nothing more than predict the next word. It will lie all the time; that's part of how it works. It makes stuff up from the input it gets.
If you post that stuff online and it contains lies about people and you didn't check it, you absolutely should be liable for that. I don't see a problem with that.
Right, but what about the case where you post something that doesn't contain lies at all?
What if ChatGPT outputs something that a certain former president gets offended by and he decides to sue OpenAI?
According to their ToS it doesn't matter if it's a "frivolous lawsuit". If OpenAI had to pay any attorney fees just to respond to some ridiculous lawsuit, they could still bill you for those costs.
I don't think it makes sense at that point at all.
Of course the vast majority of users would never have to worry about this, but it's still something to be aware of.
It's a tool. You can't sue the manufacturer if you injure someone with it.
This isn't true in the least. Buy a tool and look through the manual: every section marked "danger", "warning", or "caution" is in there because someone sued a company after a user or a bystander got hurt.
You are right. Seems I confused common sense with reality.
You ever heard of a product recall?
You can if the tool is defective.
That's gotta be more to cover their ass than to come after you. Unless you use its generated text to sue the company, I don't think they would ever try to sue their users; otherwise everyone would stop using the platform, Microsoft would have a huge PR problem, and their stock price would drop. It just doesn't logically make sense for them to do that, unless you sued them over the content produced by your inputs.