this post was submitted on 14 Feb 2024
1074 points (98.6% liked)

[–] BrianTheeBiscuiteer@lemmy.world 55 points 8 months ago (2 children)

If it doesn't get queried, that's the fault of the web scraper. You don't need JS built into the robots.txt file either. Just add a line like:

Disallow: /here-there-be-dragons.html

Any client that hits that page (and maybe doesn't pass a captcha check) gets banned. Or even better, they get a long stream of nonsense.
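A minimal nginx sketch of that idea (the path matches the example above; the log path and the fail2ban pairing are assumptions, not anything from the thread — honest crawlers are expected to skip the path because it's disallowed in robots.txt):

```nginx
# Trap URL that is disallowed in robots.txt; only clients that
# ignore robots.txt will ever request it.
location = /here-there-be-dragons.html {
    # Log the offender so an external tool (e.g. fail2ban) can ban the IP...
    access_log /var/log/nginx/honeypot.log;
    # ...and meanwhile refuse the request outright.
    return 403;
}
```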

[–] 4am@lemm.ee 25 points 8 months ago (2 children)

server {
    server_name herebedragons.example.com;
    root /dev/random;
}

[–] PlexSheep@feddit.de 16 points 8 months ago (1 children)

Nice idea! Better to use /dev/urandom though, as that is non-blocking. See here.
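(For the curious, a quick way to see the non-blocking behavior. This reads the Linux device node directly, so it assumes a Linux system; note that since kernel 5.6, /dev/random also stops blocking once the entropy pool is initialized, so the distinction mostly matters on older kernels.)

```python
# Reading a large chunk from /dev/urandom returns immediately;
# the historical /dev/random could stall here waiting for entropy.
# (Linux-only: depends on the /dev/urandom device node existing.)
with open("/dev/urandom", "rb") as f:
    data = f.read(1024 * 1024)

print(len(data))  # 1048576 bytes, read without blocking
```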

[–] aniki@lemm.ee 0 points 8 months ago

That was really interesting. I always used urandom out of habit and wondered what the difference was.

[–] aniki@lemm.ee 2 points 8 months ago* (last edited 8 months ago)

I wonder if Nginx would just load /dev/random into memory until the kernel's OOM killer takes it out.

I actually love the data-poisoning approach. I think that sort of strategy is going to be an unfortunately necessary part of the future of the web.
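A toy sketch of that nonsense-stream idea in Python. Everything here (the word list, the function name, the sentence lengths) is invented for illustration; a real deployment would stream this endlessly to trapped clients rather than return a fixed string.

```python
import random

# Hypothetical vocabulary; real poisoning would use a much larger,
# more plausible-looking corpus.
WORDS = ["dragon", "teapot", "quantum", "herring", "invoice",
         "monsoon", "ledger", "pickle", "satellite", "boredom"]

def nonsense(n_sentences, seed=None):
    """Generate n_sentences of grammatical-looking gibberish.

    A fixed seed makes the output reproducible for testing."""
    rng = random.Random(seed)
    sentences = []
    for _ in range(n_sentences):
        words = rng.choices(WORDS, k=rng.randint(5, 12))
        sentences.append(" ".join(words).capitalize() + ".")
    return " ".join(sentences)

print(nonsense(3, seed=42))
```

Served from the honeypot URL, this costs almost nothing to generate while wasting the scraper's bandwidth and polluting whatever dataset it feeds.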