this post was submitted on 08 Jul 2023
157 points (100.0% liked)

top 50 comments
[–] HandsHurtLoL@kbin.social 87 points 1 year ago (5 children)

Hm, yeah I guess no one has been speculating about this part of the de/federate Threads reality. Everyone's worried about Meta and EEE, but what we should really have been discussing is the history of Meta moderation and community guidelines, which have often cited "free speech" when people use white supremacist dog whistles but cite "calls to violence" when people of color actively complain about white supremacy.

There's a reason why we have seen news articles about large LEO Facebook groups trading racist memes and making joking comments on them...

We were worried about the technology, but we should have been worried about cultural infiltration.

[–] MiscreantMouse@kbin.social 55 points 1 year ago (2 children)

Exactly. What happens when a far-right troll like libsoftiktok sics thousands of rabid followers on a fediverse account? I get the feeling our small, volunteer group of moderators just doesn't have the resources to cover that kind of brigading.

[–] HandsHurtLoL@kbin.social 28 points 1 year ago (1 children)

Also, I don't think moderation can even stop brigading or the avalanche of downvotes to hell. It could only stop thread and comment creation on just your one community/magazine on your instance.

Nothing could stop a bad-faith actor from finding my comments on a different instance and harassing or brigading me there if that instance federated with Threads, even if my instance defederated from Threads.

This Fediverse stuff is... complex.

[–] sab@kbin.social 11 points 1 year ago* (last edited 1 year ago) (1 children)

Well, at least downvotes aren't going to be much of a problem, as Threads users will only be capable of upvoting stuff they see here. They don't have a downvote button. :)

[–] Ragnell@kbin.social 11 points 1 year ago (4 children)

They will be able to send swarms of trolls to harass. If Threads ever does federate, I suspect even admins who didn't sign the fedipact will defederate quite fast.

[–] HeinousTugboat@kbin.social 12 points 1 year ago (1 children)

For what it's worth, LibsOfTikTok's already getting slapped by Threads's moderation.

[–] MiscreantMouse@kbin.social 26 points 1 year ago* (last edited 1 year ago) (1 children)

Nope, she has repeatedly had posts reinstated after being initially flagged for hate speech, including that one. Meta knows their audience.

[–] HeinousTugboat@kbin.social 11 points 1 year ago

Ah, damn. Should've figured it was too good to be true if she was posting it.

[–] ZILtoid1991@kbin.social 16 points 1 year ago

Facebook's moderation only covers the bare minimum. A simple mention of Hitler can get you banned (even if you're criticizing him), while calling all LGBTQ people pedophiles and the like is de facto allowed there. Threads' moderation is pretty much the same from what I've heard.

[–] Kichae@kbin.social 11 points 1 year ago (1 children)

Oh, we haven't been speculating about moderation because that's a known quantity. A major driver of defederation discussion on the microblogging side of the fedi has been the moderation issues people would have to deal with if federated with Threads. And especially bad actors on Threads getting posts from users on defederated instances via intermediary sites, and then spotlighting vulnerable people to trolls on other instances.

It's why many niche Mastodon instances are talking about defederating from any other instance that isn't blocking Threads. It's a significant mental safety risk for vulnerable people in the alt-right's sights.

[–] HandsHurtLoL@kbin.social 11 points 1 year ago* (last edited 1 year ago) (3 children)

I'm not an "early adopter" of the Fediverse per se, but I came over on the reddit migration on June 11. I feel like I've been an information sponge trying to wrap my head around the organization of the Fediverse and seeing the benefits. I think I'm pretty up to speed, at least enough to discuss it with people offline and explain it in a way that does it some justice.

But I don't think I've seen a lot of discussion about the drawbacks of the Fediverse. I've seen a few threads about major privacy concerns related to the Fediverse, but most of the comments responding just kind of hand wave the issue.

Seeing a possibly larger issue here regarding moderation, I can't see anything other than a total containment of Threads away from other instances. Like, great - use ActivityPub, but don't talk to me (kbin.social) or my child (literally everything else that wants to interact together in the Fediverse with kbin) again. Lol

[–] Kichae@kbin.social 9 points 1 year ago (1 children)

The thing is, because minority-targeting trolls aren't taken seriously by any corporate social media platform, there's no big downside here compared to those platforms. It's just that the trolls showing up here effectively takes away the safer space these communities have built, returning things to basically how they were just before people fled those other spaces.

They were made safe not due to the tools, but due to obscurity, and they're about to lose that obscurity.

This is... I don't want to call it a "good thing", because people who have already suffered plenty of assholes having to suffer them all over again is in no way, shape, or form good, but it's highlighting an issue that's been clear to these communities, but not to developers on the Fediverse: the moderation tools here are hot, sweaty garbage.

Hopefully we can see serious movement on making useful tools now.

[–] HandsHurtLoL@kbin.social 10 points 1 year ago

I don't know if you have history on reddit, but the "safety because of obscurity" and having that taken away by increased visibility is absolutely what I lived through as a member of a subreddit called TwoXChromosomes. TwoX was a really welcoming space for women-identifying people to get a breath of fresh air from the constant "equal rights means equal lefts" kind of casual misogyny on the rest of reddit. And then corporate created the "default sub" designation and put TwoX on the list.

I remember the moderators at the time making it very clear to the community that they voiced their dissent but it was happening anyway (wow, what does that sound like?), and now a lot of the posts there get inundated with "not all men" apologists and all the OPs have Reddit Cares alerts filed on them.

[–] ArchmageAzor@lemmy.world 33 points 1 year ago (9 children)

𝓓𝓮𝓯𝓮𝓭𝓮𝓻𝓪𝓽𝓮

[–] a_name_needs_no_name@kbin.social 8 points 1 year ago (1 children)

Threads, while built on ActivityPub, is not federated.

[–] MiscreantMouse@kbin.social 20 points 1 year ago (1 children)
[–] Mane25@feddit.uk 6 points 1 year ago

I mean if it gets a bad enough reputation it might not be that much of a problem. If this turns out to be the next Voat rather than the next Twitter then job done.

[–] CarrierLost@lemmy.one 23 points 1 year ago (4 children)

New Nazi bar over there, apparently.

[–] dumptruckdan@kbin.social 14 points 1 year ago (2 children)

White supremacists are like that guy nobody ever wants at their party but who always invites himself anyway. It's hard enough to keep him from washing his balls in the punch bowl when you're actively trying to keep him out. Meta doesn't even try except to the meager extent required by law.

[–] gravitas_deficiency@sh.itjust.works 12 points 1 year ago (5 children)

I mean… I wasn’t expecting this to not happen eventually… I’m just surprised it happened so quickly, and that Meta has done nothing in terms of mitigation - and moreover, didn’t see this as a thing they’d need to guard against out of the gates (unless, I suppose, this isn’t intended to be a Twitter clone, and it’s more shooting for being a Parler clone).

There’s probably a lesson somewhere in there about the benefits of growing your userbase organically instead of trying to force-march users over by creating shadow accounts, but applying that lesson would be unprofitable, so Meta definitely won’t care.

[–] skellener@kbin.social 8 points 1 year ago (2 children)

Meta is about ad dollars. That’s all.

Meta is about user data monetization which includes - but is absolutely not limited to - ad sales and targeting.

[–] gk99@kbin.social 10 points 1 year ago (5 children)

Literally why

They already have Truth Social and Twitter.

[–] ozen@kbin.social 13 points 1 year ago

there aren't minorities in those places for them to attack, which is what they want to do

[–] spriteblood@kbin.social 7 points 1 year ago

Facebook is also a big gathering place for white supremacists, anti-LGBT, and other conservative extremists. It's largely where the US Capitol insurrection was organized. Meta is no stranger to fascism.

[–] ArugulaZ@kbin.social 6 points 1 year ago

Hitler already had Germany. Why'd he want Czechoslovakia and Poland?

[–] Rottcodd@kbin.social 8 points 1 year ago (3 children)

That's what I expected from the start.

I guess I just assumed that that was commonly understood. As soon as I saw that it was going to be run according to Facebook's moderation standards, I took that to mean that it was going to be tailored to suit white supremacists and Christian nationalists, like Facebook.

[–] EmperorHenry@kbin.social 7 points 1 year ago* (last edited 1 year ago) (16 children)

Supporting free speech means allowing people you hate to talk too. Censor a Nazi one day, then the next day it's something your weird friend likes, then the next day it's something you like.

Everyone deserves a platform online, but they have to earn their audience. Censoring them is only going to make more people want to go to other platforms to hear and see what they have to say.

[–] skulblaka@kbin.social 24 points 1 year ago (1 children)

I am not required to respect "free speech" when it comes from a place of fundamental dishonesty. Slander is not protected speech. They are within their rights to bitch and complain about whatever non-issue they're up in arms about today and I'm within my rights to ban and ignore them.

They are, notably, NOT within their rights to call for violence and death against LGBTQ+ folks, which many are doing, because that constitutes hate speech, assault, or even inciting a riot, depending on which particular situation you find yourself being a bigot in. All three of these are illegal and are not protected speech.

Tolerance of intolerance is not a paradox, it is a failing of the people who are supposed to be protecting their communities. Tolerance of Nazis and racism is not required by the tenets of the Constitution or by the tenets of democracy, and instead actively erodes the protections enshrined within each.

In short, Nazi punks, fuck off.

[–] elscallr@kbin.social 16 points 1 year ago (1 children)

It doesn't mean you have to give them the platform, though. If they want to create their own Nazi federation that's entirely on them, but you don't have to integrate their content.

[–] bedrooms@kbin.social 10 points 1 year ago* (last edited 1 year ago) (2 children)

That's just a common misconception. Free speech is there to protect people from the government, not from businesses. If my anti-racism voice gets suppressed on Threads (assuming I ever make an account there) I'd just move to another platform.

And really, there's no good reason for a well-intentioned internet community to let racism expand.

[–] ondoyant@beehaw.org 5 points 1 year ago* (last edited 1 year ago) (1 children)

such a slippery slope! supporting free speech means allowing people to talk about how much they want queer people dead, too. tell the people calling for violence against queer people to fuck off, and maybe one day your very own calls for violence might get told to fuck off!

everybody deserves a platform to call for the extermination of people groups, but they have to earn their audience 😏. i think we should do absolutely nothing to stop them, because doing anything just makes them stronger anyways. /s

[–] gentleman@kbin.social 5 points 1 year ago

@MiscreantMouse This is why I’m of the opinion that defederating from anything that smacks of Meta or Threads should be done immediately. Zsuck has supported Russian bots, alt-right insurrectionists, and hate speech since 2015; in other words, longer than Elon. It should be walled off and removed like a cancerous tumor. In my view, that should include any instance that signed an NDA with them.

I saw a survey of instances that indicated many are taking a “wait and see” approach, which is mystifying. What do people think they are going to find that they don’t already know about Meta?

[–] Jumpinship@lemmy.world 4 points 1 year ago* (last edited 1 year ago) (1 children)

It's just free speech that nobody has to listen to, right? Lemmy has no ads anyway, so what if there's some nonsense mixed in? I doubt it would outnumber the people who want good content to prevail.

[–] EmperorHenry@kbin.social 4 points 1 year ago (1 children)

EXACTLY! All of these people complaining about bigots "everywhere": where are they? I don't know, because I've never gone looking for them and I've never clicked on any of their profiles. The only time I ever hear what bigots say is through the filter of people making fun of them and debunking their arguments.

For all the people who downvoted me: CLICK AWAY FROM THE THINGS YOU DON'T LIKE. No one's making you look at it!

[–] GataZapata@kbin.social 4 points 1 year ago

The genocide of the Rohingya people was largely organized on Facebook. Meta is not to be trusted with any of this shit.

https://en.m.wikipedia.org/wiki/Rohingya_genocide
