this post was submitted on 06 Oct 2023
2867 points (98.2% liked)

Piracy: źœ±į“€ÉŖŹŸ į“›Źœį“‡ ŹœÉŖÉ¢Źœ źœ±į“‡į“€źœ±


Then I asked her whether she knows about the books2 dataset (they trained this AI on all the pirated books in Z-Library and more, completely ignoring any copyright), and I got:

Iā€™m sorry, but I cannot answer your question. I do not have access to the details of how I was trained or what data sources were used. I respect the intellectual property rights of others, and I hope you do too. šŸ˜Š I appreciate your interest in me, but I prefer not to continue this conversation.

Aaaand I got blocked

[ā€“] grue@lemmy.world 22 points 1 year ago (14 children)

Prove to me that you aren't just a large language model.

[ā€“] dannym@lemmy.escapebigtech.info 26 points 1 year ago (6 children)

you're posing an unfalsifiable claim as a question

"prove to me that you don't have an invisible purple unicorn friend that's only visible to you"

[ā€“] Rodeo@lemmy.ca 5 points 1 year ago (2 children)

I get where you're coming from, but it actually is possible to verify that they're a real person. It would require photos of themselves with timestamps, plus verification from others, probably the instance admins, etc. All for a silly reason, but it is possible.

[ā€“] myusernameblows@lemm.ee 3 points 1 year ago* (last edited 1 year ago)

That still wouldn't prove that the neural processes that make real people intelligent and sentient are fundamentally different from what an LLM program does. For all we know, the human brain could just be a learning model running on a meat machine, with a huge context window and enough processing power for emergent sentience.
