this post was submitted on 14 Jun 2023
77 points (100.0% liked)


I recently made the jump from Reddit for the same immediate reasons as everyone else. But, to be honest, if it were just the Reddit API cost changes I wouldn't be looking to jump ship. I would just weather the protest and stay off Reddit for a few days. Heck, I'd probably be fine paying a few bucks a month if it helped my favorite Reddit app (Joey) stay up and running.

No, the real reason I am taking this opportunity to completely switch platforms is because for a couple years now Reddit has been unbearably swamped by bots. Bot comments are common and bot up/downvotes are so rampant that it's becoming impossible to judge the genuine community interest in any post or comment. It's just Reddit (and maybe some other nefarious interests) manufacturing trends and pushing the content of their choice.

So, what does Lemmy do differently? Is there anything in Lemmy code or rules that is designed to prevent this from happening here?

[โ€“] kadu@lemmy.world 27 points 1 year ago (2 children)

That's the thing though - what system? Reddit, YouTube, Twitter, Facebook, you name it, nobody managed to prevent bots. How would Lemmy be more successful at this? It's an extremely challenging battle, unfortunately.

[โ€“] czech@kbin.social 6 points 1 year ago (2 children)

Do those for-profit social media companies actually want to drive down the traffic that makes them look more valuable to advertisers? I get that it's still insanely difficult, and we can't realistically implement a captcha on every upvote, but it seems like there's a conflict of interest between moderators and site owners when it comes to bot activity.

[โ€“] kadu@lemmy.world 8 points 1 year ago (1 children)

Arguably, some of the platforms I mentioned have even more of an interest in preventing bots. If I want to place ads on your website, but you can't tell me whether 10 out of 100 impressions are bots or 90 are... I'm not wasting my money, or at the very least, I'll expect rates significantly lower than your competitors'.
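The pricing logic here can be made concrete with a toy calculation (the numbers and function are mine, not from the thread): if a fraction of impressions are bots, the advertiser's effective cost per genuine human view scales up accordingly.

```python
# Toy sketch: effective cost per 1000 *human* impressions,
# given a quoted CPM and the fraction of impressions that are bots.

def effective_cpm(quoted_cpm: float, bot_fraction: float) -> float:
    """Divide the quoted rate by the human share of traffic."""
    human_fraction = 1.0 - bot_fraction
    if human_fraction <= 0:
        raise ValueError("all impressions are bots")
    return quoted_cpm / human_fraction

# At a $5 CPM: 10% bots -> ~$5.56 per 1000 humans;
# 90% bots -> $50 per 1000 humans, a 10x effective price hike.
print(round(effective_cpm(5.0, 0.10), 2))  # 5.56
print(round(effective_cpm(5.0, 0.90), 2))  # 50.0
```

That uncertainty band (10% vs. 90%) is exactly why an advertiser who can't get a bot estimate will discount the rate heavily.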

[โ€“] voiceofchris@lemmy.world 0 points 1 year ago

I don't know. Wouldn't their motivation be to know exactly how many bots there are (so they could disclose the number if/when asked) but continue to let them proliferate?

[โ€“] HQC@beehaw.org 6 points 1 year ago* (last edited 1 year ago) (1 children)

Social media companies generally benefit from high traffic for advertiser appeal, but combating bots is crucial for maintaining user trust and engagement. Implementing CAPTCHAs for every upvote may not be feasible, but addressing bot activity is generally in the long-term interest of social media companies.

This message was generated by ChatGPT.

Not sure if you bought that, but if I were applying for an account on Beehaw using an LLM assistant, I bet the odds of passing a human review are better than 50%.

[โ€“] aeternum@kbin.social 8 points 1 year ago

Oh god. Could you imagine doing a captcha every time you upvoted? Please DO NOT do this, Ernest.

[โ€“] voiceofchris@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Well, what about the system I mentioned? Just have the up and down arrows be little bot detection boxes. My understanding is that all those "I am not a robot" check boxes detect mouse speed, precise click locations, hesitation times, etc. and do a quick calculation on the odds that your clicking behavior was human or robot. I'm probably underestimating what it takes to implement that but on the user side it's just a click just like any other click.
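The idea described above can be sketched in a few lines. This is a toy heuristic with invented features and thresholds, purely for illustration; real systems like reCAPTCHA use far richer telemetry and trained models, not hand-picked cutoffs.

```python
# Toy sketch of behavioral click scoring: estimate how bot-like a click
# is from simple mouse telemetry. Features and thresholds are invented.

from dataclasses import dataclass

@dataclass
class ClickTelemetry:
    hover_ms: float      # time the cursor hovered before clicking
    path_points: int     # mouse-path samples recorded on the way to the button
    offset_px: float     # distance of the click from the button's exact center

def bot_score(t: ClickTelemetry) -> float:
    """Return a 0..1 score; higher means more bot-like."""
    score = 0.0
    if t.hover_ms < 50:      # near-instant clicks suggest scripted input
        score += 0.4
    if t.path_points < 3:    # a "teleporting" cursor has no travel path
        score += 0.4
    if t.offset_px < 1.0:    # pixel-perfect center clicks are suspicious
        score += 0.2
    return score

human = ClickTelemetry(hover_ms=320, path_points=40, offset_px=6.5)
bot = ClickTelemetry(hover_ms=5, path_points=1, offset_px=0.0)
print(bot_score(human))  # 0.0
print(bot_score(bot))    # 1.0
```

On the user side this is invisible: the upvote is just a click, and the scoring happens behind it, which is the appeal of the proposal.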