Download a Whole Website via the Command Line in Linux (Like httrack on Windows)

Sometimes you need to download an entire website. On Windows you would install an extra application like httrack, which raises the question: how do you do it on Linux, and what application do you need? The answer is a simple solution that many people miss.



Yes, we can do it from the command line alone, and the command we need is one that most people already know: wget. So how does wget download a whole website? Here is the command you need, with http://example.com/ standing in for the site you want to mirror:



$ wget -r --level=0 --convert-links --page-requisites --no-parent http://example.com/



The wget options explained:

-r / --recursive          Download recursively.

-l / --level=N            Recursion depth; use 0 for infinite depth, or a number greater than 0 to limit it.

-k / --convert-links      Rewrite links inside the downloaded files so they point to the local copies.

-p / --page-requisites    Download all the images, CSS, and JS files needed to display each page.

-np / --no-parent         Never ascend to the parent directory, so only the given directory tree is fetched.
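
wget also has a --mirror shortcut, which is currently equivalent to -r -N -l inf --no-remove-listing, so the same job can be written more compactly. A minimal sketch, again using the example.com placeholder:

$ wget --mirror --convert-links --page-requisites --no-parent http://example.com/

If you only want part of a site, limit the depth instead, for example --level=2 to stop two links away from the start page.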



That's the command you need to download a whole website, just like httrack. Very simple, isn't it?
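
Once the download finishes, wget stores the mirror in a directory named after the host, so with the example.com placeholder above you can check the offline copy straight away:

$ xdg-open example.com/index.html

Because of --convert-links, the pages should browse correctly from the local disk without an internet connection.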


more on: http://gregorygavin.wordpress.com/
 