this post was submitted on 10 Jun 2024
Sinuousity@lemmy.world 22 points 4 months ago

So you're telling me all we have to do is beg the bots in multiple ways not to read the page, and only the malicious bots will get away with it? Win-win-win, I think.

technom@programming.dev 9 points 4 months ago (last edited 4 months ago)

We need ~~three~~ four things:

  1. A way to poison the data that will throw off the training without causing any perceptible difference to humans. As I remember it, many image models were shown to be sensitive to adversarial noise that is imperceptible to humans.
  2. A skiplist of AI data stealers, so that their IPs/domains can be blocked in bulk.
  3. Eventually, the above technique will become useless, as AI data stealers will start using dynamic IPs and botnets to bypass the skiplists. We'll need to throttle or block visitors based on pattern recognition instead. For example, if the visitor requests linked pages in rapid succession, or if the request intervals are uniform or pseudo-random rather than genuinely irregular like a human's.
  4. If the pattern recognition above is triggered, we could even feed the bots with data from AI models, instead of blocking or throttling. Let the AI eat its own s**t.
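The interval check in step 3 could be sketched as follows. This is a minimal illustration, not a production detector; the coefficient-of-variation threshold and the minimum sample size are made-up values, and a real system would combine this signal with others.

```python
import statistics

def looks_automated(timestamps, cv_threshold=0.15):
    """Flag a client whose inter-request intervals are suspiciously
    uniform. Human browsing produces highly irregular gaps between
    requests; a naive crawler often fetches pages on a near-fixed
    schedule (or with thin pseudo-random jitter)."""
    if len(timestamps) < 5:
        return False  # not enough data to judge
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = statistics.mean(intervals)
    if mean == 0:
        return True  # burst of effectively simultaneous requests
    # Coefficient of variation: spread of the intervals relative to
    # their mean. Near zero means a metronome-like request pattern.
    cv = statistics.pstdev(intervals) / mean
    return cv < cv_threshold
```

For example, a client requesting a page exactly every two seconds is flagged, while one with human-like irregular gaps is not.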
BrianTheeBiscuiteer@lemmy.world 9 points 4 months ago (last edited 4 months ago)

I think it could be easier to thwart malicious bots than "honest" bots. I figure a bot that doesn't care about robots.txt and whatnot would try to gobble up as many pages as it could find. You could easily place links into HTML that aren't visible to regular users, and a "greedy" bot would follow them anyway. From there you could probably have a website within a website that's being generated by AI on the fly. To keep the bots from running up your bills, you probably want it to be mostly static.
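The hidden-link trap could be sketched like this. Everything concrete here is an assumption layered on the comment's idea: the trap path is hypothetical (it would also be listed under `Disallow:` in robots.txt so honest crawlers skip it), and the decoy is a cheap static string standing in for the pre-generated content the comment suggests.

```python
# Hypothetical honeypot path; also listed as "Disallow:" in robots.txt
# so that polite crawlers never touch it.
TRAP_PATH = "/wiki-archive-full"

def page_with_trap(body_html):
    """Embed an invisible honeypot link into a normal page. Humans
    never see or click it; a greedy bot parsing raw HTML will."""
    trap = ('<a href="%s" style="display:none" '
            'aria-hidden="true" tabindex="-1">archive</a>' % TRAP_PATH)
    return body_html + trap

flagged = set()  # client IPs that took the bait

def handle_request(client_ip, path):
    """Serve cheap, pre-generated decoy content to any client that
    ever followed the trap link; serve real content otherwise."""
    if path == TRAP_PATH:
        flagged.add(client_ip)
    if client_ip in flagged:
        return "static decoy content"  # pre-generated, not live AI
    return page_with_trap("<p>real article</p>")
```

Keeping the decoy static (or batch-generated offline) is what keeps the bot's crawling from running up inference bills on your side.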

technom@programming.dev 3 points 4 months ago (last edited 4 months ago)

Nice idea!

In addition, we could have an allowlist for honest bots (like search crawlers).
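Major search engines publish a reverse-then-forward DNS procedure for verifying their crawlers, which an allowlist could build on. A sketch, where the domain list is only an example (consult each engine's documentation for the authoritative domains):

```python
import socket

# Domains that legitimate crawlers reverse-resolve to (examples only).
CRAWLER_DOMAINS = ("googlebot.com", "google.com", "search.msn.com")

def hostname_is_allowed(hostname, allowed=CRAWLER_DOMAINS):
    """True if hostname is exactly an allowed domain or a subdomain of
    one. Suffix matching alone would wrongly accept names like
    'evilgooglebot.com', so we require a dot boundary."""
    return any(hostname == d or hostname.endswith("." + d)
               for d in allowed)

def verify_crawler(ip):
    """Reverse-DNS the IP, check the resulting hostname against the
    allowlist, then forward-resolve that hostname and confirm it maps
    back to the same IP -- a bot can fake its User-Agent and even its
    PTR record, but not the forward lookup on the real domain."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname_is_allowed(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False  # no PTR record or lookup failure: not verified
```

Anything that claims to be a search crawler but fails this check can then be dropped into the skiplist or throttling logic from the earlier comment.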