Archive entire website permanently online
I have several side projects, and each time I shut down a project I have to archive every page of its website individually at the Wayback Machine, since the ability to archive an entire website seems to be available only to large organizations through the Archive-It program.
I feel there is a need gap for a service that saves entire websites permanently on the Internet, available to small website owners as well.
Here's a script that recursively adds each page of a website to archive.org: https://gist.github.com/alexyorke/8e97660e5dbcdc72e3ba9f8703eea4c4. I can't test it because the archiver seems to be down or is taking a long time to work through its jobs (I tried to archive a page manually; it said it was archived, but I didn't see any snapshots). I couldn't post the comment here at first because it said the URLs were invalid.
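For anyone who wants the general idea without opening the gist, here is a minimal sketch of that kind of recursive archiver in Python. The crawl logic below and the use of the Wayback Machine's Save Page Now endpoint (`https://web.archive.org/save/<url>`) are my own assumptions about the approach, not the gist's exact code:

```python
import time
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect href targets from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def internal_links(html, base_url):
    """Return absolute same-host URLs linked from an HTML page."""
    parser = LinkParser()
    parser.feed(html)
    host = urlparse(base_url).netloc
    urls = set()
    for href in parser.links:
        absolute = urljoin(base_url, href)
        if urlparse(absolute).netloc == host:
            urls.add(absolute.split("#")[0])  # drop fragments to avoid duplicates
    return urls


def archive_site(start_url):
    """Crawl start_url recursively, submitting each page to the Wayback Machine."""
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        page = urlopen(url).read().decode("utf-8", errors="replace")
        # Requesting the /save/ URL asks archive.org to snapshot the page.
        urlopen("https://web.archive.org/save/" + url)
        time.sleep(5)  # be polite; archive.org may rate-limit rapid requests
        queue.extend(internal_links(page, url) - seen)
```

Note that archive.org may rate-limit or queue save requests (which could explain 503 responses and delayed snapshots), so a real version should back off and retry on failure.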
Hey, thanks, but archive.org shows 'Fail with status: 503' for the website I tried with the script, so I can't tell whether it works or not.
I apologize for the URL parsing bug; you should be able to post URLs now as long as they are valid (http(s), with a space before and after the URL).
I recently found a self-hosted, open-source archiving tool, https://archivebox.io/, which can save pages to the Internet Archive and so should be able to do what you're asking.
I'm a bit confused about whether you want this for websites that are still online, or only for those that now exist solely on archive.org. And do you want a service that hosts the content for you, in addition to making it downloadable?
Sorry for not making it clear: I would like to back up my entire website online for posterity, but services like archive.org currently allow general users to archive only individual web pages.