this post was submitted on 15 Jun 2023
9 points (100.0% liked)
FREEMEDIAHECKYEAH
If you log in using a browser, it'll most likely give you a "session cookie" that you should be able to see in the developer tools. (If you're using Firefox's developer tools, it'd be under the "Storage" section.) The name of the cookie will generally have the word "session" in it. After logging in, that cookie identifies you to the server, letting the server know that "this particular request is from CucumberSalad" (or whatever your user is named on that service). Wget probably hasn't been working because the requests from wget don't include that cookie the way the requests from your browser do.
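Concretely, the browser attaches the cookie to every request as an HTTP header. A stripped-down sketch of what such a request looks like on the wire (the host and cookie value here are made up):

```
GET /members-only/page.html HTTP/1.1
Host: example.com
Cookie: JSESSIONID=ksdjflasjdfaskdjfaosidhwe
```

That `Cookie:` line is the only thing the server needs to tie the request to your logged-in session, which is why adding it to wget's requests is usually enough.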
(Just looking at my developer tools while using Lemmy, it seems the Lemmy web UI doesn't use session cookies but rather a JSON web token in a cookie named "jwt". I think that cookie would suffice if I were trying to scrape the Lemmy web UI.)
Once you have the proper cookie name and value, you can have wget send the cookie to the server with each request by adding the flag
--header 'Cookie: <cookie name>=<cookie value>'
(but replace the values in angle brackets — for example: --header 'Cookie: JSESSIONID=ksdjflasjdfaskdjfaosidhwe').
Also, if you can provide more info about what you're trying to scrape, folks can probably help more. Hopefully what I've given you is enough to get you going, but it's possible there are more hurdles to jump to get it working.
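Putting it together, here's a minimal shell sketch. The cookie name and value are placeholders (copy the real ones from your browser's dev tools after logging in), and the URL is hypothetical; the actual wget call is shown commented out since it only makes sense against the site you're scraping:

```shell
# Placeholders: substitute the cookie name/value from your browser's dev tools.
COOKIE_NAME='JSESSIONID'
COOKIE_VALUE='ksdjflasjdfaskdjfaosidhwe'

# Build the header once so it's easy to reuse across wget invocations.
COOKIE_HEADER="Cookie: ${COOKIE_NAME}=${COOKIE_VALUE}"
echo "$COOKIE_HEADER"

# Then every request carries your login cookie (hypothetical URL):
# wget --header "$COOKIE_HEADER" 'https://example.com/members-only/page.html'
```

Keeping the header in a variable also makes it easy to add to a recursive mirror (`wget -r --header "$COOKIE_HEADER" ...`) without retyping the cookie each time.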
Lovely comment, thanks for the help.