Mehrad Rousta
Have an offline version of a website

Originally posted at my blog.

At the beginning of 2017, Shamim, Maryam, and Parastoo met at university while taking a master's course in tourism management. Years of friendship and traveling brought them closer, and one day they decided to share that wonderful feeling with others. The result was the Irandestina group. Irandestina was a space to experience the joy of traveling: friendly, creative, and adventurous trips with respect for the environment and the lives of host communities. Later they also pivoted to accessible tourism, but with the collapse of Iran's economy, two of them have since emigrated and Irandestina came to a stop.

Then, around July 2019, Parastoo asked me if I could find a way to keep the Irandestina website's content up without paying for hosting, and I said "Yes!". My plan was to download the whole website, host it on GitHub/GitLab Pages for free, and point the domain there.

I remembered that as a child there was software on Windows to do exactly that: we had a dial-up connection, and I would sometimes download an entire website so I could browse it without Internet access. But now I was on Linux and knew there had to be an easier way.

And there is! It's called wget! With example.com standing in for the site's domain, the command looks like this:

$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --domains example.com \
     --no-parent \
     https://example.com/

That was all I had to do.

The options are:

  • --recursive: the most important flag; recursively download the pages linked from the given address.
  • --domains: only follow links within the listed domains, so you don't wander off-site.
  • --no-parent: don't follow links above the provided URL; useful for downloading only one section of a site.
  • --page-requisites: get all the elements (images, CSS, JS, and so on) each page needs.
  • --html-extension: save files with the .html extension.
  • --convert-links: convert links in the HTML to local ones so the site is navigable offline.
  • --no-clobber: don't overwrite already-downloaded files (useful when an interrupted download is resumed).
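After the mirror finishes, one quick sanity check is to grep for absolute links that --convert-links left pointing at the original domain. This is only a sketch: it fakes a tiny mirror directory (mirror/ and example.com are placeholders) so it runs as-is; in practice you'd run the grep inside the directory wget created.

```shell
# Fake a tiny mirror so the check below is runnable as-is; in practice,
# run the grep inside the directory wget created for your domain.
mkdir -p mirror
printf '%s\n' \
  '<a href="page.html">converted</a>' \
  '<a href="https://example.com/missed.html">not converted</a>' \
  > mirror/index.html

# Any absolute URLs still pointing at the original domain show up here:
grep -ro 'https://example\.com[^"]*' mirror
```

If this prints nothing, every internal link was rewritten to a local one.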

For more options, check out its manual page (man wget).
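Before pushing anywhere, you can also sanity-check the mirror in a browser by serving it locally. A minimal sketch, assuming Python 3 is installed; the "site" directory here is a stand-in for whatever directory wget created:

```shell
# Stand-in for the directory wget created; normally you'd cd into that.
mkdir -p site && echo '<h1>offline copy</h1>' > site/index.html
cd site

# Serve the mirror on http://localhost:8123 with Python's built-in server.
python3 -m http.server 8123 >/dev/null 2>&1 &
PID=$!
sleep 1

# Fetch the front page back to confirm the server is up.
RESP=$(curl -s http://localhost:8123/index.html)
echo "$RESP"
kill "$PID"
```

Open the same address in a browser and click around: if everything works without a network connection, the mirror is good.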

So now let's ship this static website we just downloaded. I'll assume you know how to work with Git. Create a new repository, go into the downloaded directory, and run:

  1. git init
  2. git add .
  3. git commit -m "add static content"
  4. git remote add origin [REPOSITORY ADDRESS]
  5. git push -u origin master

Great! Now open the repository settings and make the master branch the Source for GitHub Pages. You can also add a CNAME file to the repo to use a custom domain; read more at GitHub Help.
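The CNAME file is just a one-line text file at the repository root containing your domain (www.example.com below is a placeholder):

```shell
# Create the CNAME file GitHub Pages looks for at the repo root
# (www.example.com is a placeholder for your own domain).
echo "www.example.com" > CNAME
cat CNAME
```

Commit and push it along with the rest of the site, then point the domain's DNS at GitHub Pages.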
