There are a few ways to download an entire web site. A browser’s “Save Page As” command only captures the single page you’re looking at, so for a complete copy you’ll want a dedicated site-mirroring tool: a graphical application such as HTTRACK, or a command-line utility such as wget. Either one crawls the site and leaves you with a local folder of pages you can browse offline.


You don’t just want an article or an individual image, you want the whole web site. What’s the easiest way to siphon it all?

Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

Image available as wallpaper at GoodFon.

The Question

SuperUser reader Joe has a simple request:

How can I download all the pages from a website? Any platform is fine.

Every page, no exception. Joe’s on a mission.

The Answer

SuperUser contributor Axxmasterr offers an application recommendation:

HTTRACK works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.

This program will do all you require of it.
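If you would rather work from the terminal, HTTrack also ships as a command-line program. A minimal sketch, assuming the httrack binary is installed and using http://example.com/ and ./example-mirror purely as placeholders for the site and the output folder:

httrack "http://example.com/" -O "./example-mirror"

The -O option sets where the local copy is written; httrack --help lists the filtering and depth options.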

We can heartily recommend HTTRACK. It’s a mature application that gets the job done. What about archivists on non-Windows platforms? Another contributor, Jonik, suggests another mature and powerful tool: wget, the classic command-line downloader that ships with most Unix-like systems and is available for Windows too.

You’d do something like:
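A typical recursive mirror looks something like the following sketch, where http://example.com/ stands in for the site you want and ./local-copy is a placeholder output directory:

wget --mirror --convert-links --page-requisites --no-parent -P ./local-copy http://example.com/

Here --mirror turns on recursion with timestamping, --convert-links rewrites links so the copy browses offline, --page-requisites pulls in the images and stylesheets each page needs, and --no-parent keeps the crawl from wandering above the starting URL.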

For more details, see the Wget manual and its examples, or take a look at these:

http://linuxreviews.org/quicktips/wget/
http://www.krazyworks.com/?p=591

Happy hunting!

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.