You could copy a different GET request from the same domain (e.g. right-click it in the network tab and "Copy as cURL"), then grab the download URL and swap it in for the one in the copied request.
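Something roughly like this; the URL, headers, and cookie values below are all placeholders standing in for whatever your browser actually sent:

```
# "Copy as cURL" output from another request on the same domain,
# with the URL replaced by the actual download link.
# All header and cookie values here are placeholders.
curl 'https://example.com/path/to/the/actual/download' \
  -H 'User-Agent: Mozilla/5.0 ...' \
  -H 'Accept: */*' \
  -H 'Cookie: session=...' \
  --output download.zip
```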
That's a neat trick, good call! I ended up using an extension called cliget which seems functional so far.
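For anyone curious, cliget generates a ready-made curl or wget command for a download, with the request's headers and cookies baked in. The exact output depends on the site; a rough sketch with placeholder values:

```
# Representative cliget-style output (placeholder values); the real
# command includes whatever headers your browser actually sent.
curl --header 'User-Agent: Mozilla/5.0 ...' \
     --header 'Cookie: session=...' \
     --header 'Referer: https://example.com/downloads' \
     'https://example.com/files/export.zip' \
     --output 'export.zip'
```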
I used to use cliget some years back. Quite a bit, IIRC. Long story short: company customer support portal, downloads would sometimes time out. Installing and using cliget saved me from many fistfuls of hair ripped from my head.
Yep, cliget was the answer. Honestly an extension I wish I knew about sooner.
Found this in a stackexchange post from 2018 (not sure if you can do this on macOS as well):

1. Initiate the download via the takeout page in your browser
2. Go to "Window -> Downloads"
3. Locate the download which is in progress right now
4. Right click + Copy link address
5. From your terminal: `wget {url}` (see the sketch below)
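A rough sketch of that last step, with a placeholder URL pasted from the browser's download list; wget's `--continue` flag is handy because it resumes a partial file instead of starting over if the connection drops:

```
# URL is a placeholder for the in-progress download's link address.
# --continue resumes a partially downloaded file.
wget --continue 'https://takeout.google.com/.../takeout-archive.zip'
```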
That's good for most websites, but it only takes the URL, and sometimes websites are tricksy when you're trying to use CLI tools, so I figured I needed all the appropriate headers too. I got the cliget extension, which seems to have worked well enough. In the end I think this would have worked, but takeout gives you limited attempts to download a file before it cuts you off entirely, and I wanted to be sure.
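For completeness, if you'd rather not install anything, wget can also take headers by hand; the cookie and user-agent values below are placeholders you'd copy out of the dev tools request:

```
# Manually passing the browser's cookie and user agent to wget;
# values are placeholders copied from the dev tools request.
wget --header='Cookie: session=...' \
     --user-agent='Mozilla/5.0 ...' \
     'https://example.com/files/export.zip'
```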