this post was submitted on 09 Jun 2023
Lemmy Support
Support / questions about Lemmy.
I don't have a major problem with this. I take an interest in copyright and technology, and in broad strokes knew what the fediverse was before I joined. If I felt like I needed to agree with or endorse the culture/policies/admin-personal-views of every instance that federates my posts I wouldn't be here.
I also feel like there isn't really a policy framework that "works" for this kind of narrow view of what's allowed in the Fediverse.
I realize that not everyone in the Fediverse agrees with this kind of thinking, but I think this kind of approach consistently handles a variety of stresses and bad actors much more successfully than most alternatives.
There is a big difference between posting to your home instance (with rules you agreed to and admins you trust), where some content is temporarily cached on other instances to lower server load, and migrating a community, including all its content, to a totally different service with different rules, owners, and monetization strategies.
There is also a big difference between the service you agreed to host your content archiving it, and third parties scraping that content for archiving or other purposes. The latter can't be prevented on the public web, but it really isn't the same thing at all.
Edit: Archiving content (as important as it can sometimes be) should really be opt-in. The problem you describe exists mainly because so much content these days is hosted by bad actors that only try to monetize and exploit their users, and only under that framework does scraping those bad actors without permission for archiving purposes sound like the morally right thing to do. What we are trying to do here is get away from these bad actors and actually respect human beings in their choices, whatever those might be.
I understand why you say those things have big differences, but when one tries to articulate those differences in a legal and policy framework that allows the things one wants but not the things one doesn't want, I think the lines separating them become grayer and grayer until they are in danger of disappearing altogether. I personally support tooling to migrate communities, policies that allow it under appropriate circumstances, and a culture that embraces it "when necessary". The details of appropriateness and necessity are complicated, but for me there's a bright line well short of "ask everyone before preserving anything" where preservation/migration projects are allowable.
But I don't have a lot more to say about this in the absence of a concrete real-world context. If the fediverse continues to thrive, I'm sure we'll see those contexts arise at some point and can discuss how people view the situation and whether they're able to encode those views into rules and enforce them. It will be interesting to watch it develop.
Edit: Your edit came in as my post was landing. I couldn't disagree more that archiving should be opt-in. The most important preservation is the preservation of content that someone wants to destroy. And bad actors cannot be avoided; rather, it's bad actions that must be limited... through the consistent application of good policy, equally to people whose intent you trust and people whose intent you distrust.
Let's please not mix up whistleblowing and archiving. There might be a small overlap, in that documents shared by whistleblowers etc. should be archived somehow, but this is really more similar to other privacy questions where, in rare cases, the public interest overrules privacy concerns.
And really, your arguments sound defeatist. Let's just upload all our private data to cloud services run by bad actors because it "cannot be avoided" anyway, right? Sorry for exaggerating a bit to drive the point home. IMHO, trying to "limit" bad actors while still embracing them is a fool's errand. We really need to take concrete steps to prevent bad actors from arising in the first place, which is very much possible in my opinion.