Download Whole Site Using WGET

Wget is a C-based program for retrieving web content, widely used on platforms such as Linux. It is part of the GNU project and is distributed under the GNU General Public License. Among its robust features is recursive page downloading, much like a crawler. With the proper arguments, it can download a whole site using the following command.

wget -E -H -k -K -p https://example.com/

The -E flag tells the program to save HTML and CSS documents with their proper extensions. The -H flag enables host spanning, so that required files are downloaded even when they are hosted on a foreign host or domain.
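Because -H follows resources onto other hosts, it is often paired with -D to limit which domains may be spanned. A hedged sketch, where example.com and cdn.example.com are placeholder domains standing in for a site and the CDN serving its assets:

```shell
# Span hosts (-H), but only onto the domains listed with -D,
# e.g. the site itself and the CDN hosting its images and CSS.
wget -E -H -Dexample.com,cdn.example.com -k -K -p https://example.com/
```

Without -D, host spanning can pull in content from any domain the page links to, which is rarely what you want.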

The -k and -K flags convert the links in the downloaded HTML to point at the locally downloaded files, with -K keeping a backup of each original file before conversion. The -p flag downloads all page requisites, including images and other embedded files, to make sure the page renders properly offline.
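Note that the flags above, on their own, fetch a single page and its requisites. To crawl an entire site, wget's recursive options can be added. A hedged sketch, with example.com as a placeholder URL (the --wait delay is an optional courtesy to the server, not required):

```shell
# -r enables recursion, -l inf removes the depth limit,
# -np ("no parent") keeps the crawl from climbing above the start URL,
# and --wait=1 pauses a second between requests to be polite.
wget -E -H -k -K -p -r -l inf -np --wait=1 https://example.com/
```

The equivalent shorthand --mirror bundles recursion, infinite depth, and timestamping, and is commonly used for the same purpose.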
