this post was submitted on 25 Aug 2023
Technology
you are viewing a single comment's thread
I've read some of his books too.
Just as long as you don't absorb the information into your brain in any way, it's okay.
I understand what the authors are trying to say, but legally I don't quite see how they can claim that an AI is doing anything other than reading the books, which they're obviously allowed to do.
It's not as if the books are being made publicly available for free, and writing in the style of another author isn't illegal, so I'm not quite sure what law has been broken here.
LLMs have been caught plagiarising works, by the simple nature of how they function: they predict the next word based on the context of the previous words. They're very good at constructing sentences, but the issue is often "where is the information coming from?" Authors never consented to their works being fed into an optimisation algorithm, and neither did artists when DALL·E was created.
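To make the "predict the next word from the previous words" idea concrete, here is a toy bigram model. This is only a sketch of the general principle: real LLMs use neural networks over subword tokens and much longer contexts, and the corpus and function names below are made up for illustration. It also shows why verbatim reproduction can happen: if a phrase appears in the training text with only one continuation, the model regurgitates it exactly.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows each word in the training text."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

# Toy "training data" standing in for scraped text.
corpus = "the cat sat on the mat and the cat slept"
model = train_bigrams(corpus)

print(predict_next(model, "the"))   # "cat" is the most common follower of "the"
print(predict_next(model, "on"))    # "on" was only ever followed by "the",
                                    # so the model reproduces it verbatim
```

A real model smooths these counts with a neural network, which is what lets it generalize to word sequences it never saw, but the memorization failure mode is the same in kind.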
For books, you buy a copy and thus the author is paid, but that's not what happened with ChatGPT.
Any source for this? I have never seen that.
I'm highly skeptical that GPT-4 was directly trained on copyrighted material by Stephen King. Given the sheer amount of publicly available information about his works (summaries, themes, characters, and critical analyses), a good LLM can appear to be able to plagiarize these works while it doesn't. If I'm right, there is no leverage for creators to complain. Just accept that that's the world we're living in now. I don't see why this world would stop the sales of books, or of movie rights to books, etc.
Especially since copyright only protects human authored works. Meaning anything created by an LLM is in the public domain, and the publisher using it loses control of the work.
Of course, this has the potential to be a significant issue: I can take a copyrighted work, train an LLM on it, and then get it to generate a similar but distinct work that is in the public domain. This new work will likely impact the original author's ability to profit from their original work, thus decreasing the supply of human-created works in the long run.
But it’s currently all legal and above board.
I had heard some mentions of this before too, but didn't recall the exact references. I went searching and found this recent study.
Sure, it can plagiarize works it has been trained on. The study didn't show, however, that this has occurred for copyright-protected material like fiction books.
I saw a comment, probably on Mastodon, from an author saying that (I believe) ChatGPT had plagiarized some of his work verbatim. I don't recall if it was a work of fiction or not, although for the purpose of copyright it doesn't matter.
I wouldn't be surprised if it's trained on works of fiction just as much as non-fiction, though. From what I've heard, you can ask ChatGPT to write something in the style of a particular writer? If it's possible to give a very specific prompt for it to write something with the same plot points as a Stephen King story, in the style of Stephen King, I wonder just how close it would come to the original.