Archive all the URLs of a website to the Wayback Machine (https://web.archive.org/)

archive.org

The Oxford English Dictionary website serves its entries at sequentially numbered URLs, so you can reach every entry just by incrementing the number at the end of the URL: https://www.oed.com/oed2/00000001, https://www.oed.com/oed2/00000002 … with the last URL being https://www.oed.com/oed2/00291601

Surprisingly, https://web.archive.org has not yet archived them all, and I'd love to know how I can do so myself or request it.
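Since the entry pages differ only by a zero-padded counter, the complete list of URLs can be generated mechanically. A minimal sketch, assuming the eight-digit zero padding and the last entry number shown above:

```python
# Generate every OED entry URL, assuming the numbering runs
# from 00000001 to 00291601 with eight-digit zero padding.
BASE = "https://www.oed.com/oed2/"
LAST = 291601

def oed_urls():
    for n in range(1, LAST + 1):
        yield f"{BASE}{n:08d}"

urls = oed_urls()
print(next(urls))           # https://www.oed.com/oed2/00000001
print(f"{BASE}{LAST:08d}")  # https://www.oed.com/oed2/00291601
```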

Best Answer

If you have the time, you can save the pages (one by one) yourself. This page describes how to do it.

  1. Save Page Now

Put a URL into the form, press the button, and we save the page. You will instantly have a permanent URL for your page. Please note, this method only saves a single page, not the whole site.
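Save Page Now can also be driven without the form: at the time of writing, a plain HTTP request to https://web.archive.org/save/ followed by the target URL triggers a capture. A rough sketch in Python, where the endpoint behaviour and the Content-Location header are assumptions based on how the public form responds rather than a documented API contract:

```python
from typing import Optional

import requests

def save_page_now(url: str, timeout: int = 120) -> Optional[str]:
    """Ask the Wayback Machine to capture a single URL.

    Returns the archived URL (taken from the Content-Location header)
    if the service reports one, otherwise None.
    """
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=timeout)
    resp.raise_for_status()
    # The capture location is usually echoed back in a response header;
    # fall back to None if it is absent.
    location = resp.headers.get("Content-Location")
    return f"https://web.archive.org{location}" if location else None

print(save_page_now("https://www.oed.com/oed2/00000001"))
```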

OR

  1. Browser extensions and add-ons

Install the Wayback Machine Chrome extension in your browser. Go to a page you want to archive, click the icon in your toolbar, and select Save Page Now. We will save the page and give you a permanent URL.

OR if you want more pages, do as indicated here:

You can now save all the “outlinks” of a web page with a single click. By selecting the “save outlinks” checkbox you can save the requested page (and all the embedded resources that make up that page) and also all linked pages (and all the embedded resources that make up those pages). Often, a request to archive a single web page, with outlinks, will cause us to archive hundreds of URLs. Every one of which is shown via the SPN interface as it is archived.
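For a fixed numeric range like the OED entries in the question, the same Save Page Now endpoint can also be driven in a loop. A hedged sketch, in which the endpoint, the pause length, and the error handling are assumptions; archive.org rate-limits captures, so a real run over roughly 291,601 pages would take a long time and should respect whatever limits they publish:

```python
import time

import requests

BASE = "https://www.oed.com/oed2/"
SAVE = "https://web.archive.org/save/"
LAST = 291601          # last entry number given in the question
DELAY_SECONDS = 10     # assumed polite pause between capture requests

session = requests.Session()

for n in range(1, LAST + 1):
    target = f"{BASE}{n:08d}"
    try:
        resp = session.get(SAVE + target, timeout=120)
        resp.raise_for_status()
        print(f"saved {target}")
    except requests.RequestException as exc:
        # Log the failure and keep going; failed entries can be retried later.
        print(f"failed {target}: {exc}")
    time.sleep(DELAY_SECONDS)
```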

Edit: I have found this question on Web Applications which provides some answers, and which may render my answer a bit pointless since another user there has already provided it. If there are any mods reading this, should I delete this answer?
