this post was submitted on 24 Sep 2023
22 points (75.0% liked)

Privacy


A place to discuss privacy and freedom in the digital world.

[–] jet@hackertalks.com 24 points 1 year ago (2 children)

Is there a typo in the question? By definition censorship is already censorship. It's a tautology.

If you mean when does small scale individual censorship become systematic censorship and oppression?

  • on today's internet, where the public square is privately owned (Twitter, Instagram, Facebook, etc.), small-scale censorship by those private corporations can effectively become systematic censorship, because they dominate the digital public square.

This is why I, and many of my friends, actively support the fediverse, so that people's voices can be democratically supported and not subject to corporate censorship.

[–] sxan@midwest.social 17 points 1 year ago (1 children)

This is at the root of the paradox of tolerance. If you tolerate intolerance, eventually intolerance dominates.

Robert Anton Wilson wrote about Big Truths and Little Truths. Similarly, we can talk about Little Censorship and Big Censorship. I don't know what those definitions would be, but I'm sure it's not just a matter of scale, because the Paradox of Tolerance applies at all scales. I think the difference lies in what's being censored: things that promote intolerance. And then there are things outside of intolerance that most of us agree should be squashed -- child porn, hate speech, incitement to crimes against individuals, doxxing. But it's a fine line, and you could argue that it's better not to censor, and instead just make the sharing a crime.

Personally, I don't have clear definitions around this stuff, but I do think the Paradox of Tolerance is a real thing that's been demonstrated countless times, and which should be heeded.

[–] jet@hackertalks.com 5 points 1 year ago (3 children)

The tolerance intolerance discussion is interesting, and very sticky.

If speech is criminally intolerable, then it should be up to the criminal justice system to prevent that speech, not up to digital platform providers to enforce their opinions. Or at least that's why I support the fediverse.

"If there be any among us who would wish to dissolve this Union or to change its republican form, let them stand undisturbed as monuments of the safety with which error of opinion may be tolerated where reason is left free to combat it. I know, indeed, that some honest men fear that a republican government can not be strong, that this Government is not strong enough." -- Thomas Jefferson, First Inaugural Address

Personally I fall on the side of free and open discourse; we cannot be fearful of evil ideas, we must expose them to sunlight so that they shrink away in the minds of conscientious people.

Rhetorically, I've seen many internet arguments use the paradox of tolerance to shut down any idea the writer doesn't agree with. They wield it as a shield to prevent open debate. I think that hurts discourse and the search for common ground; it polarizes people in a discussion.

[–] sxan@midwest.social 5 points 1 year ago (1 children)

We completely agree that it's a difficult question, and a slippery slope. And also on the point of government's role.

Do you then believe that privately run platforms shouldn't have the right to choose what gets put on their platform? Or is it a matter of scale, like, Sxan's GoToSocial server can do what it wants, but The-Platform-Formerly-Known-As-Twitter shouldn't?

I always think of the brigading that happens on "open" platforms. The Masses will effectively censor any real debate, especially if they know there are no rules. How are we to deal with that?

[–] jet@hackertalks.com 4 points 1 year ago* (last edited 1 year ago) (3 children)

You bring up some excellent points. Right now, there are private organizations acting as de facto public squares. I think when they're the only option, it gets muddy: if they're going to be that essential to society, they have to operate like utilities, with no opinions beyond legal or illegal.

For all the platforms that are just options, not de facto public squares, I'm perfectly happy for them to have opinions about what can or cannot be said.

Let's take the fediverse as an example: any individual server can have its own opinions and enforce them through moderation - perhaps very heavy moderation. And that's totally fine, because any group of people can run their own instance with their own moderation policies, and it's all an equitable playing field.

The brigading we're seeing on these open platforms is an early-adopter phenomenon: groups tend to move together. The only real solution is heavy moderation on the individual instances. So if you have a community that's talking about fishing, the moderator should prevent brigading that has nothing to do with fishing -- "Person I hate was caught fishing, we should ban them from fishing... oh, you support fish killing..." -- the moderator should stop that.


The litmus test I would use to determine whether a social media company is a public space, and should act by utility rules rather than private-club rules, would be: does the government use that platform to communicate with citizens?

X/Twitter, Facebook... both have governments using them to communicate directly with their citizens, sometimes as the only means of communication. So they are de facto public squares and utilities.

[–] partizan@lemm.ee 2 points 1 year ago

But governments prefer the current situation: they have channels to request removals, yet carry zero liability, and the company is covered because it's their private platform and they're allowing the government on it. So I don't see why a government would declare social media a public square...

[–] Facebones@reddthat.com 2 points 1 year ago

I think a publicly funded platform would be beneficial in today's world. Nobody could be banned, but you could still block people individually. (Criminal content would still be criminal, and you could potentially be muted by government entities, though.) All government communication would go through this platform, so nobody could be "walled off" from government comms. It would still function as social media, and people would still be free to twit/fb/whatevs - there would just no longer be government entities there.

It would also lay the framework to potentially move our voting systems into the 21st century IMO.

[–] sxan@midwest.social 2 points 1 year ago (1 children)

So, I've left this on "unread" for so long only because until now I only used Lemmy on my phone, and I really hate typing long replies on my phone. I wanted to give your reply due consideration, though. Anyway, I'm embarrassed to have taken this long to respond.

I agree with you about the public square, and I think you bring up an excellent point about these systems becoming "essential to society." I think it's a thing that is obvious to younger people, and almost completely invisible to older people. Even those of us who grew up during the IT boom decades and lived through the change may find it difficult to grok just how much of an impact this is having. I do think that people are generally well aware of how slow legislation is in adapting to rapid changes in society, but the impact you talk about has happened at such an accelerated rate, useful precedents are lacking. So we see legislators thrashing about more than usual, over or under-reacting, and mostly in extreme ignorance.

I see brigading in the fediverse as a worse problem than you do. It's mob rule, and it goes largely unchecked -- I feel -- because moderators hesitate for fear of being accused of censorship. I haven't yet seen much of what Reddit suffers from -- moderator affinity, where mods have a heavier hand with posters they disagree with -- but the result is unchecked herd mentality cowing dissenters.

But, maybe mob rule is good? I vacillate on this one. A well-functioning, healthy society has laws controlling gross offenses, and social censure is used to moderate destructive elements. We don't want a society with laws for every little infraction; in that society, every citizen is a criminal by default, and the government always has a legal justification to persecute anyone it wants to (and let slip those it doesn't). OTOH, we have what happened in the US in the '50s, with mobs of white people harassing black integration students. I don't know what the right answer is, honestly, but it is an issue in meatspace, and it's as much or more of an issue online.

Your litmus test is good, I think, but risks being based largely on our current clueless government. As the generations age out and younger generations take control, the government will become increasingly social-media savvy. I can easily see a future government having a communications department competent enough to hit nearly every social media platform, regardless of popularity. And what about cross-posting? If we use that litmus test, then if I were the government and wanted to control a platform, all I'd need to do is start posting to it, and now it qualifies as subject to regulation?

I think I've said before, but I'll repeat it: I don't have answers to any of these issues. I wish we could have a censorship-free internet; there was a time in the early history when most users were well-behaved and followed established etiquette. I think a lot of that may have been due to the lack of anonymity, but whatever the reason, we've been past that for decades, and we haven't yet adapted.

[–] jet@hackertalks.com 2 points 1 year ago (1 children)

Thank you for the very thoughtful reply.

The brigading is a huge problem and discourages people from joining Lemmy; we need highly opinionated, moderated communities to create "safe spaces" for niche communities and viewpoints. The inclusion of "user participation requirements" - like account age, interaction with the community, or in-community karma scores - is necessary to help Lemmy grow.
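A minimal sketch of the kind of participation gate described above -- the field names and thresholds are hypothetical illustrations, not anything Lemmy actually exposes:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical participation gate: the fields and thresholds below are
# invented for illustration; they are not part of any Lemmy API.

@dataclass
class Account:
    created_at: datetime
    community_comments: int  # prior interactions in this community
    community_karma: int     # net votes earned in this community

def may_post(account: Account, now: datetime,
             min_age_days: int = 30,
             min_comments: int = 5,
             min_karma: int = 10) -> bool:
    """Return True only if the account clears every threshold."""
    old_enough = now - account.created_at >= timedelta(days=min_age_days)
    active_enough = account.community_comments >= min_comments
    trusted_enough = account.community_karma >= min_karma
    return old_enough and active_enough and trusted_enough

# A day-old account with no history in the community is gated out:
newbie = Account(created_at=datetime(2024, 6, 1),
                 community_comments=0, community_karma=0)
print(may_post(newbie, now=datetime(2024, 6, 2)))  # False
```

The point of requiring all three signals together is that each one alone is cheap for a brigade to fake (old sleeper accounts, drive-by comments), but accumulating them inside the target community takes sustained, visible participation.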

From a long-term societal-stability perspective, absolute free speech is the only path forward. Yes, people we hate will have voices, and people who are criminals will have voices, but that is the price of giving everyone a voice. We only have to look at the diversity of "governments" globally to realize that having a community-focused, respectful government is a temporary thing. Governments change with time, along with those enforcing the rules. Just as a thought experiment, imagine you lived your entire life in every country, and imagine you wanted to advocate for 1. human rights, 2. a political opposition party. In many countries, that is aggressively stamped out ("don't rock the boat"). In many global communities, doing 1 and 2 is a great way to embarrass powerful people and have a short life.

I know many people will think, "yes, but... what about thing I don't like X"... If we create the digital tooling to ban X, whatever X is, then those in power will use that tooling to target everything else. Tools in the toolbox get used. It's a difficult stance to be a free speech absolutist, and it's unpopular, but I think it's necessary. I'm not saying communities have to suffer outsider speech intruding on their spaces, but platforms cannot be opinionated as a whole.

You bring up very thoughtful points, and I agree censorship is necessary to grow communities, but censorship should never get larger than the community level. Platform-level censorship is bad for society in the long term.

[–] sxan@midwest.social 1 points 1 year ago (1 children)

Okay; you're making a distinction between "moderation" and "censorship" that I don't understand. Does it go back to your litmus test of an "important public space"?

[–] jet@hackertalks.com 1 points 1 year ago (1 children)

Moderation: not deplatforming, but putting rails on a specific discussion

Censorship: deplatforming, total limits on a topic in all places.

E.g., anyone can send mail through the post office, while a newsletter editor moderates the received letters for inclusion in their publication.

So in a Lemmy context, it's not censorship to have rules on an instance, but it would be censorship to deny people the ability to run an instance. Lemmy is very censorship-resistant.

[–] sxan@midwest.social 2 points 1 year ago (1 children)

Are you suggesting that there are no topics, no content, that should be censored? I'm not trying to walk you into Godwin's law; I just don't see how you address issues like CP, snuff porn, or hateful and inciting speech. I personally would rather err on the conservative side of the Paradox of Tolerance than allow intolerance to take hold and take over. With total and complete freedom of expression, how do you prevent the emergence of populist oppressive movements like the Khmer Rouge, or the Nazi party? Or do you think the Paradox of Tolerance is flawed?

[–] jet@hackertalks.com 1 points 1 year ago

First, let me take the opposite position: with restricted and curated freedom, how do you prevent people from being oppressed?

The speech itself should not be censored, and that includes the objectionable things you mentioned. If a country or a government wishes to make some speech illegal, then it should be up to the courts to remove somebody's speech, through due process and public discourse.

I take a different position on the paradox of tolerance: the issue is sitting idly by while groups are being excluded. Open debate and rational thinking are required of all the countries, and all the citizens, of the world to prevent terrible abuses from happening again. My takeaway is that everyone should fight tooth and nail to prevent any group from being excluded - including groups we don't like.

I've seen the paradox of tolerance used as rhetorical ammunition to silence opponents online, and that just turns into another form of tyranny of the current in-group.

To prevent another oppressive government from taking hold (like your examples), we have to trust in people's engagement and wisdom, and in the open, healthy debate of ideas. We can, of course, help people through economic stability, critical-thinking education, etc...

If we say some thoughts are too dangerous to be spoken, for fear people are too easily led astray... then we are trusting that those who choose which voices are worth hearing will always be benevolent dictators. The one lesson I take away from history is that power rarely stays in the hands of the benevolent. Open communication, organization, and free thought are the most effective way to protect a population.

TLDR: thought crime and wrongthink shouldn't ever exist, legally at least.

[–] HumbertTetere@feddit.de 2 points 1 year ago (1 children)

Being able to criminally prosecute someone requires knowing their identity. If this is the only approach, the pressure to prevent anonymous internet usage will increase.

[–] LinkOpensChest_wav@lemmy.one 1 points 1 year ago

Not to mention, in most communities I choose to be part of, I trust the judgment of the admins and moderators far more than the state's "justice" system.

What’s bothering me is all the editing of old books and remaking movies to fit current political views

[–] milicent_bystandr@lemm.ee 4 points 1 year ago

As well as small/large, I think there's a difference between legal/effective/practical censorship.

With legal censorship but no practical censorship, I can still tell my friends things, or maybe pay anonymously, but at the risk of legal prosecution and with worries about my ethics as a law-abiding citizen.

Media bias (for example) gives effective censorship for many, but if I care enough I can even start my own media and promote it as best as I can - and some people can be reached.

To some extent I think the three can balance each other out: for instance I wouldn't want anti-vaccination rhetoric to be the main thing people hear, but I do want freedom and opportunity for people to question scientific and medical consensus.

Personally I think social media is a fantastic tool and also a problem - but not a good place for a solution: so I tend not to worry about social media 'censorship'. Maybe I'm just out of touch!

[–] rodbiren@midwest.social 17 points 1 year ago (1 children)

To me it is blocking expression that presents no plausible harm to anyone. Yelling fire in a crowd to start a panic, making a specific threat, and intentionally spreading lies so as to defame someone all strike me as harmful speech that should be curtailed somehow. All expression of any kind that does not plausibly cause harm should be allowed, and treated as equal in the marketplace of ideas, despite my personal opinion of it - which is a bitter pill to swallow when neo-Nazis appear all too common. That is my opinion.

[–] rikudou@lemmings.world 20 points 1 year ago

That really depends on who's in power at the moment. I'm pretty sure almost everyone agrees with you; it's just that people disagree on what's considered harmful to society.

There are people who think saying two men can kiss and love each other is harmful.

[–] trippingonthewire@lemmy.ml 9 points 1 year ago

The first time it happens, at step one.

[–] zwekihoyy@lemmy.ml 8 points 1 year ago

A governmental body. Private entities, whether people or organizations, are not bound by the concept of censorship the way a governing body with real power over the people is.

[–] ExtremeDullard@lemmy.sdf.org 4 points 1 year ago

at what point does censorship become censorship

When you can't ask at what point censorship becomes censorship without consequences.

[–] putoelquelolea@lemmy.ml 2 points 1 year ago

Censorship is censorship

[–] possiblylinux127@lemmy.zip 2 points 1 year ago

I find all the rewriting of books and the remaking of movies and TV shows I watched as a kid, just so they're inclusive enough, to be strange. I know it is not a government effort. But why is the public trying to teach kids now that if they do not have a culturally diverse friend group, something is wrong with them? We are trying to achieve equality, not make race or gender something people constantly think about. Sorry, I know this is a bit off topic, but it is what I want to talk about.

[–] DavidGarcia@feddit.nl -1 points 1 year ago

I'd say if you are unfairly depriving someone of an audience that would have wanted to listen to you.

Individually blocking someone you don't personally want to hear from obviously isn't censorship.

But if you have a monopoly on a platform and block everyone who would be interested in listening to someone, just because of your personal preferences, that is censorship.

But if virtually no one wants to listen to something and you block it, I would argue that's not censorship. E.g., no one should have to listen to spam or look at porn.

Of course those lines are blurry, but so is all of moral judgement.

It's more clear cut if you 'unrightfully' ban someone from YouTube, since it's a monopoly. Banning someone from lemmy.world who would have had an audience there is trickier, since ideally this would eventually lead to them and their audience moving to an instance where they are welcome.

That's why you would want your government to protect speech, since it is the biggest and most powerful monopoly. But in my opinion the same should extend to any large institution, like social media.

And I'm talking about censorship as a morally neutral term, since I would argue there is some good censorship, e.g. banning CSAM. I don't think it makes sense to call it anything other than what it is.