Download and Save Entire Websites for Offline Viewing in Windows using HTTrack


Have you ever wanted to access one of the websites you frequent, or an online guide you often refer to, while you were unplugged from the internet, or on the go with a slow connection? You won't have to worry about that anymore, once you have HTTrack to handle an otherwise tedious job neatly for you.

HTTrack Website Copier is a software utility that lets you access websites from your computer even when you are offline. It can download and save entire websites (or parts of them) from the internet to a local directory on your computer. If you are unsure about the usefulness of the application, head straight to the “Pros” section before you read on.

How to Use HTTrack

  1. Download HTTrack Website Copier.
  2. If you believe that you are good enough with software and websites, you may skip the following steps, as you will most probably be able to sort everything out yourself. Or, if you would still prefer some guidance, keep reading.
  3. Open the file to start the installation process (duh!). The process is quite neat and simple. You may leave all settings as they are, accept the terms and conditions, and keep clicking ‘Next’ until the installation finishes.
  4. In the last step, untick ‘View history.txt file’, and then click on ‘Finish’ to automatically launch the program.
  5. Once HTTrack opens up, click ‘Next’ to get started with what we really want to do.
  6. In the next screen, type in a name for your project and the category you would like to add it to (for the sake of organisation). Additionally, you may also change the location (Base path) to which the website files will be downloaded.
  7. The next window allows you to choose what you want to do. You may start fresh with the default action, ‘Download website(s)’ (if a previous download was only partially complete, you may also choose to continue it).
  8. Click on ‘Add URL’ and enter the base URL of the website (or the part of it) you would like to download. For example, enter ‘www.technohash.com/internet’ (without quotes) to download all pages linked under the ‘internet’ sub-directory, or ‘www.technohash.com’ to download the entire website. If a website requires you to log in to access certain pages, you may also enter your ‘Login’ and ‘Password’ in the given fields. You may also type URLs in directly if you prefer.
  9. Advanced users may use ‘Set options’ to manually change ‘Preferences and mirror options’. Most users don’t need to change anything there. But, if you have trouble downloading or accessing websites, you may come back to this step, go to ‘Set options’ -> ‘Browser ID’, and try selecting another ‘Browser Identity’ such as “Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.1) Gecko/20020826”.
  10. Select ‘Next’ and then click on ‘Finish’ on the next screen. Wait till the download finishes.
  11. When the download finishes, you may view the downloaded website by clicking on ‘Browse Mirrored Website’, which will open it in your default web browser.
  12. You can now browse the downloaded website anytime without an internet connection. All you need is a web browser.
  13. If you need any help or have any suggestions, you can reach us through the comments below.
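For readers who prefer the command line, HTTrack also ships with an `httrack` console program that performs the same mirroring in one call. Below is a minimal sketch of the GUI steps above; the URL and destination directory are just the examples from this guide, and the `-F` browser identity is only needed if a site rejects the default one (see step 9).

```shell
#!/bin/sh
# Example values only; substitute your own site and base path.
URL="https://www.technohash.com/internet/"
DEST="$HOME/mirrors/technohash"      # the "Base path" from step 6

# -O  : base path where the mirror is written (step 6)
# +…  : filter that keeps the crawl inside the /internet/ sub-directory (step 8)
# -F  : alternative "Browser Identity", as suggested in step 9
httrack "$URL" -O "$DEST" \
        "+*.technohash.com/internet/*" \
        -F "Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.1) Gecko/20020826"
```

When it finishes, open the `index.html` inside the destination directory in any browser, just as with ‘Browse Mirrored Website’.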

Pros

  • It is free and open-source, yet it performs better than most of its proprietary counterparts.
  • Websites downloaded using HTTrack are portable. Once you have downloaded the website, you don’t need the software anymore to browse the website.
  • It can also be used to ‘update’ websites you have already downloaded. It can be set to incrementally update the offline copy by downloading just the new or changed files.
  • You can choose to avoid certain type(s) of files, or download just the specified type(s) of files.
  • Deep layers of control are exposed to the user, like restricting the layers of the page to be downloaded and the maximum number of parallel connections to be made while downloading.
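For the curious, these same controls are exposed as flags in the `httrack` command-line program. The sketch below is illustrative only; the path and patterns are example values, not fixed ones.

```shell
#!/bin/sh
DEST="$HOME/mirrors/technohash"   # an existing mirror made earlier (example path)

# Incremental update: re-fetch only the new or changed files of an existing mirror.
( cd "$DEST" && httrack --update )

# File-type filters and limits on a fresh mirror:
#   "-*.zip" skips zip archives, "+*.pdf" still allows PDFs,
#   -r3 limits the link depth to 3 levels, -c4 caps parallel connections at 4.
httrack "https://www.technohash.com/" -O "$DEST" \
        "+*.pdf" "-*.zip" -r3 -c4
```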

Cons

  • HTTrack is known to have some problems with PHP links. But, that’s an unavoidable part of the package as of now, and most of the other offline browsers also have the same problem.
  • Common users may find it a bit difficult to get the hang of all the options they are presented with. But, most things work out of the box with the default settings, and the documentation and support forums should be able to help out almost everyone.
  • Images and other files from external websites are not downloaded by default. But, they may be made available for offline use by just browsing around the downloaded website and clicking on those external images. Actually, this can’t be classified as a drawback.

Summary

Wrapping it up, HTTrack is one of the best tools available to download and save websites for offline viewing, with a wide range of features and options. If you can find your way through, it might soon turn out to be one of your favorite applications.

Note: Though the title says that it’s for Windows, HTTrack is also available for other platforms like Linux, OS X and Android.

Download HTTrack

You can download HTTrack Website Copier from any of the links below:

  • Official Website: Download (if you are unsure which file to download, just download the first file ~ httrack-3.47.23.exe)
  • CNET Download.com: 32-bit, 64-bit

Alternatives

If you are unhappy with what HTTrack has to offer, or if you would like to check out some alternatives, you might want to try the following applications:

Please leave your suggestions and queries in the comments below.

  • Jishnu J Nair

    Oh my god! Then we can download wikipedia ???

    • Cherian Eapen

      how about google??

      • Dhanesh V

        You are funny. :D :P

        • ankush

          I’m unable to download any dynamic website with HTTrack; it’s showing an error. Please help me.

    • Dhanesh V

      You ‘CAN’!
      But you would probably have to setup a data center with quite some processing power at your home to get that done.
      The total size of Wikipedia is not something that can be easily estimated. It’s something that grows every second.

      As of August 5, 2013, just the textual articles in Wikipedia would account for more than 10 GB of data. So, when the images and other materials add up, that would be some massive thing.

      In short, downloading Wikipedia might not be a good idea. ;)

  • Cherian Eapen

    Nice piece of work, I really admire the way you presented it. But don’t you think it would be a little extravagant to download some websites like Google? Even if I download a website, it would be practically impossible to see the recent updates without an internet connection. So it turns out that after you waste a great deal of memory, all you can do is browse through some pages that don’t even update themselves..?

    • Dhanesh V

      Really, trying to “download Google” would not be a good idea at all!

      Talking about updating websites, I had already mentioned in the “Pros” section above that the software allows you to update websites by downloading just the new or changed files automatically.

      Coming to the usefulness of downloading a website, let’s say you are currently studying Python; then you might need to refer to docs.python.org frequently. Downloading just docs.python.org/2/ (for Python 2.x) or docs.python.org/3/ (for Python 3.x) using the software would be more than enough for looking things up while you remain offline.
      The uses of the software would actually depend on the needs of the user. Some might not even find this useful, while this might be a lifesaver for some. :)

  • Gdh

    Hi

    Does this allow a user to interact with the site in such a way that data would be stored and communicated back to the server? I have a WordPress site with a lot of video files. I want to give people a version that they can use straight off their machine. But when they go online I want their usage data to be sent back to me so that I can get the analytics I would have if they were using it online. Is that possible with this, or any software?

    Thanks