I’ve been looking online for ways to download websites (game wikis mostly), so I can keep them in my collection and make sure they don’t get taken down or changed.
After trying Linkwarden, which is fine for individual web pages, I found that you have to manually add every page of the wiki one by one to turn it into a PDF.
With that in mind, the only other option I’ve found is running wget recursively, something like the command sketched below. Do any of you have experience with this, or can you recommend alternatives? Any and all help is appreciated.
PS: I will most likely download the official game guides, which will cover most of the games, but I’m looking for something that covers my whole games library.
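For reference, the kind of recursive wget invocation I had in mind looks roughly like this (the wiki domain is just a placeholder, and I haven’t settled on the exact flags):

# recursively mirror one wiki domain for offline browsing
wget --mirror --page-requisites --convert-links --adjust-extension \
     --no-parent --wait=1 --random-wait \
     --domains=wiki.example.com https://wiki.example.com/

--convert-links rewrites pages to point at the local copies, and --wait/--random-wait keeps the crawl from hammering the wiki’s server.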
I use grab-site (unmaintained) for full-site archival, and
wget -p -k
for simple non-JavaScript single pages. I’ve heard good things about HTTrack, SingleFile and ArchiveBox but don’t have any experience with them.
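To give a rough idea (the URL is a placeholder, and the grab-site flag is from memory, so check its README): a whole-site crawl is basically just the URL, and the single-page wget call looks like this:

# whole-site crawl with grab-site (produces a WARC in the current directory)
grab-site --no-offsite-links https://wiki.example.com/

# single page plus its images/CSS, with links rewritten for offline use
wget -p -k https://wiki.example.com/SomePage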
ArchiveBox looks the most modern and intuitive, but it runs in Docker.
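For anyone curious, the Docker route seems to boil down to something like this (image name and subcommands are from their README as I remember it, I haven’t run it myself):

# create an archive collection in the current directory
docker run -v "$PWD:/data" -it archivebox/archivebox init

# add a URL to the archive
docker run -v "$PWD:/data" -it archivebox/archivebox add 'https://wiki.example.com/'

# browse the archive at http://localhost:8000
docker run -v "$PWD:/data" -p 8000:8000 archivebox/archivebox server 0.0.0.0:8000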