That just goes to show how complex the immune system is. Even though we learn more about it all the time, I get the feeling that we’re only scratching the surface.
Yes, it’s true that countless authors contributed to the development of this LLM, but they were not compensated for it in any way. Doesn’t sound fair.
Can we compare this to some other situation where the legal status has already been determined?
And Siri will immediately call the local exterminator…
I think of an LLM as a tool, just like a drill or a hammer. If you buy or rent these tools, you pay the tool company. If you use the tools to build something, your client pays you for that work.
Similarly, OpenAI can charge me for extensive use of ChatGPT. I can use that tool to write a book, but it’s not 100% AI work. I need to spend several hours prompt crafting, structuring, reading and editing the book in order to make something acceptable. I don’t really act as a writer in this workflow, but more like an editor or a publisher. When I publish and sell my book, I’m entitled to some compensation for the time and effort that I put into it. Does that sound fair to you?
Space is mostly empty anyway, so the chances of crashing into anything are pretty low. That’s why space travel is so safe.
Yeah, that’s very much an English thing. Many other languages use reasonably consistent spelling and pronunciation, so memorizing the handful of exceptions isn’t really a problem.
However, with English it’s the other way around. You need to memorize the handful of words that are actually pronounced the way they are written. Everything else is just pure chaos. If you read a word, you can’t pronounce it. If you hear a word, you can’t find it in a dictionary.
Better call my local roach doctor then…
An LLM is not a legal entity, nor should it be. However, similar things happen in a human brain and in the network of an LLM, so the same laws could be applicable to some extent. Where do we draw the line? That’s a legal/political issue we haven’t figured out yet, but following these developments is going to be interesting.
A neural network (the machine learning technology) aims to imitate the function of neurons in a human brain. If you have lots of these neurons, all sorts of interesting phenomena begin to emerge, and consciousness might be one of them. If/when we get to that point, we’ll also have to address several legal and philosophical questions. It’s going to be a wild ride.
Here’s an analogy that can be used to test this idea.
Let’s say I want to write a book but I totally suck as an author and have no idea how to write a good one. To get some guidelines and inspiration, I go to the library and read a bunch of books. Then I take those ideas and smash them together to produce a mediocre book that anyone would refuse to publish. I could also buy those books instead, but the end result would be the same, except that it would cost me a lot more. Either way, this sort of learning and writing procedure is entirely legal, and people have been doing it for ages. Even if my book looks and feels a lot like LOTR, it probably won’t be that easy to sue me unless I copy large parts of it word for word. Blatant plagiarism might result in a lawsuit, but I guess that isn’t what the AI training data debate is all about, now is it?
However, if I pirated those books, that could result in some trouble. Still, someone would need to read my miserable book, find a suspicious passage, check my personal bookshelf and everything I have ever borrowed, etc. That way, it might be possible to prove that I could not have come up with a specific line of text except by pirating some book. If an AI is trained on pirated data, that’s obviously something worth debating.
Yeah, but why though?
“The extreme industrialization of both the poultry and pork industries — with their use of densely packed, genetically uniform, and immunocompromised animals — is a perfect Petri dish for cultivating the next plague.”
That part was nowhere near the top of the article.
I’ve seen a bunch of Terminator-style movies where an AI slices, dices, scorches and/or nukes humanity to oblivion long before climate change gets us. I have it on good authority that we don’t need to worry about the temperature change.