Retrieving Webpages Using wget, curl and lynx - LinuxConfig.org

Whether you are an IT professional who needs to download 2000 online bug reports into a flat text file and parse them to see which ones need attention, or a mum who wants to download 20 recipes from a public-domain website, you can benefit from knowing the tools that help you download webpages into a text-based file, as the short sketch below shows. If you are interested in learning more about how to parse the pages you download, have a look at our Big Data Manipulation for Fun and Profit Part 1 article.
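
As a minimal sketch of what the article covers, here is how each of the three tools can fetch a single page into a local file; https://example.com/ and the output filenames are placeholders, not values from the article:

```bash
#!/bin/bash
# Fetch the same page three ways; https://example.com/ is a placeholder URL.

# wget: save the raw HTML to a file (-q: quiet, -O: output file)
wget -q -O page_wget.html https://example.com/

# curl: same idea (-s: silent, -o: output file)
curl -s -o page_curl.html https://example.com/

# lynx: render the page as readable plain text, ready for grep/awk
# (-dump: print the formatted page to stdout, -nolist: omit the link list)
lynx -dump -nolist https://example.com/ > page_lynx.txt
```

Note the difference: wget and curl save the page's raw HTML, while lynx -dump gives you the rendered text, which is usually the easier starting point for text-based parsing.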


This is a companion discussion topic for the original entry at https://linuxconfig.org/retrieving-webpages-using-wget-curl-and-lynx