Long Live Open Source Software!!

9 September

[Tutorial] Download an Entire Website

Want to download an entire website, together with all of its contents, in one go? Not sure which tool to use? Use wget, a free open source application for retrieving files over HTTP, HTTPS, and FTP. It doesn't need a graphical user interface to run. Simply type the lines below in a terminal:

wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains website.org \
     --no-parent \
     http://website.org/tutorials/html/

The options used above are explained as follows:

  • --recursive: download the entire website recursively.
  • --domains website.org: don't follow links outside the domain website.org.
  • --no-parent: don't download anything from directories above tutorials/html/.
  • --page-requisites: get all the elements that make up a page (images, CSS, etc.).
  • --html-extension: save files with the .html extension.
  • --convert-links: convert links so the site can be viewed offline.
  • --restrict-file-names=windows: modify file names so they can also be opened on Windows.
  • --no-clobber: don't overwrite existing files (useful when a download is stopped and resumed).
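Most of the long options above also have short forms (--restrict-file-names does not), so once you are familiar with them the same command can be written more compactly. A sketch, assuming the same example URL on website.org:

```shell
# Short-flag equivalent of the command above:
#   -r  = --recursive         -nc = --no-clobber
#   -p  = --page-requisites   -E  = --html-extension
#   -k  = --convert-links     -np = --no-parent
#   -D  = --domains (takes a comma-separated list of domains)
wget -r -nc -p -E -k -np \
     --restrict-file-names=windows \
     -D website.org \
     http://website.org/tutorials/html/
```

In newer versions of wget, -E is documented as --adjust-extension; --html-extension is kept as a deprecated alias, so both spellings work.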
