I found myself in need of something to let me download some webpages (tutorials + images) for offline use. The problem was that the pages linked to each other and to images, so simply copying and pasting the text wouldn't cut it.
Now, I could have just downloaded the source, saved all the images, and sifted through the HTML files changing the paths to local ones, but, being the lazy person that I am, I Googled.
...and found this application:
http://www.httrack.com/page/2/en/index.html
Once it's installed, all you have to do is paste in the URLs and it starts working. It downloads everything from the page, including images and other assets, and then rewrites all the links to point to their local copies. If a page links to another page you also downloaded (say a.com/a.html links to a.com/b.html, and you grabbed both), it rewrites those links to point locally too.
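To give a sense of what that link-rewriting step involves, here's a minimal sketch in Python using BeautifulSoup. The URLs and file names are made-up examples, and this is just the general idea, not how HTTrack actually implements it:

```python
# Rough sketch of the link-rewriting step HTTrack automates
# (illustrative only, not HTTrack's actual code).
from pathlib import Path

from bs4 import BeautifulSoup

# Hypothetical example: pages we've already saved, mapped from
# their original URLs to the local file names we saved them under.
downloaded = {
    "http://a.com/a.html": "a.html",
    "http://a.com/b.html": "b.html",
}

def localize_links(html_path):
    """Rewrite any links that point to pages we also downloaded."""
    html = Path(html_path).read_text(encoding="utf-8")
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all("a", href=True):
        if tag["href"] in downloaded:
            tag["href"] = downloaded[tag["href"]]  # point at the local copy
    Path(html_path).write_text(str(soup), encoding="utf-8")

localize_links("a.html")  # a.html's link to b.html now points locally
```

HTTrack does all of this for you, on top of fetching the pages and assets in the first place, which is exactly the tedium I was trying to avoid.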
It's really, really cool, easy to use, and it works well. I hope I'll save someone some Googling time by posting this. And to think I was seriously considering doing everything by hand!
Also, this would work really well for downloading Wikipedia articles or something for school (a page with a lot of links to other articles, plus pictures).