Download website files using wget

Ko Takagi ・1 min read

If you want to create a local mirror of a website, you can download the whole site with wget:


wget -P /path/to/download -E -k -m -nH -np -p -c https://example.com
Option Overview
-P Set the directory to save files to.
-E Append a suitable suffix (e.g. .html) to the local filename.
-k After the download is complete, convert the links in the documents to make them suitable for local viewing.
-m Turn on options suitable for mirroring (recursion, timestamping, infinite depth).
-nH Disable generation of host-prefixed directories.
-np Never ascend to the parent directory when retrieving recursively.
-p Download all the files that are necessary to properly display a given HTML page (such as images and stylesheets).
-c Continue getting a partially-downloaded file.
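As a sketch, the command above can be wrapped in a small shell function. The --wait and --limit-rate flags are additions not in the original command (they are standard wget options) that throttle the crawl so the server is not hammered; the URL and destination are placeholders.

```shell
#!/bin/sh
# Sketch of a wrapper around the mirror command above. --wait and
# --limit-rate are optional throttling additions; the URL and
# destination passed in are placeholders.
mirror_site() {
    url="$1"
    dest="$2"
    wget -P "$dest" -E -k -m -nH -np -p -c \
         --wait=1 --limit-rate=500k \
         "$url"
}

# Example: mirror_site https://example.com /path/to/download
```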

With basic authentication

wget -P /path/to/download -E -k -m -nH -np -p -c --http-user=username --http-password=password https://example.com
Option Overview
--http-user Set the username for HTTP basic authentication.
--http-password Set the password for HTTP basic authentication.
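Note that putting --http-password on the command line leaves the password in shell history and visible in ps output. wget also supports --ask-password (an interactive prompt) and reads credentials from ~/.netrc. A minimal sketch of the netrc approach, writing to a temporary file here purely for illustration (in real use the file would be ~/.netrc):

```shell
#!/bin/sh
# Sketch: store basic-auth credentials in a netrc-format file instead
# of on the command line. Written to a temp file for illustration; in
# real use it belongs at ~/.netrc. "username"/"password" are the same
# placeholders used in the article.
netrc_file=$(mktemp)
cat > "$netrc_file" <<'EOF'
machine example.com
login username
password password
EOF
chmod 600 "$netrc_file"    # keep it readable only by the owner

# wget picks up ~/.netrc automatically, so the command reduces to:
# wget -P /path/to/download -E -k -m -nH -np -p -c https://example.com
```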




There is an important option, -c, which continues getting a partially-downloaded file.


Thanks! I updated the article to add the -c option.