
I’ve been looking online for ways to download entire websites (mostly game wikis) so I can keep them in my collection and make sure they don’t get taken down or changed.

After trying Linkwarden, which is fine for individual web pages, I found that you have to manually add every single page of a wiki in order to produce a PDF.

With this in mind, the only other option I’ve discovered is using wget recursively. Do any of you have experience with this, or can you recommend alternatives? Any and all help is appreciated.

PS: I will most likely download official game guides, which will cover most of my games, but I’m looking for something that covers my whole game library.

[email protected] 11 points 10 months ago (last edited 10 months ago)

I’ve used wget to mirror websites. It works very well.
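A minimal sketch of the sort of invocation that works for wiki mirroring (the URL is a placeholder; tune the wait flags to whatever the target site tolerates):

```bash
# Mirror a site for offline browsing.
#   --mirror            recursion + timestamping (shorthand for -r -N -l inf --no-remove-listing)
#   --convert-links     rewrite links so the local copy browses offline
#   --adjust-extension  save pages with .html extensions
#   --page-requisites   also fetch the CSS, images, and scripts each page needs
#   --no-parent         never ascend above the starting directory
#   --wait/--random-wait  throttle requests to be polite to the server
wget --mirror --convert-links --adjust-extension --page-requisites \
     --no-parent --wait=1 --random-wait \
     https://wiki.example.com/
```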

[email protected] 4 points 10 months ago

Wget2 can also mirror websites, and it has several features that wget lacks (sample invocation after the list):

  • Downloads multiple files in parallel, which speeds things up a lot
  • Brotli and zstd compression support
  • Can use multiple proxies for parallel downloads
  • Sitemap index support
  • HTTP/2 support
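
A rough wget2 equivalent, assuming its wget-compatible flags (the URL is again a placeholder, and --max-threads is wget2’s knob for the parallel downloads mentioned above):

```bash
# Same mirroring flags as wget, plus wget2's parallelism.
# --max-threads sets how many files are fetched concurrently.
wget2 --mirror --convert-links --adjust-extension --page-requisites \
      --no-parent --max-threads=8 \
      https://wiki.example.com/
```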
[email protected] 0 points 10 months ago

This is the way