A spidering tool that checks Web sites for broken links. Link verification is done on "normal" links, images, frames, backgrounds, local image maps, style sheets, scripts and Java applets. It displays a continuously updated list of URLs which you can sort by different criteria. A report can be produced at any time.
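For illustration, here is a minimal sketch of such a spider in Python. It is not the tool's actual implementation; it assumes the third-party requests and beautifulsoup4 packages and a placeholder start URL, and it only covers the links, images, frames and backgrounds mentioned above.

    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def check_site(start_url):
        """Breadth-first crawl that prints the HTTP status of every URL found."""
        seen = {start_url}
        queue = deque([start_url])
        start_host = urlparse(start_url).hostname
        while queue:
            url = queue.popleft()
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException as exc:
                print("BROKEN", url, exc)
                continue
            print(resp.status_code, url)
            # Only pages on the start host are parsed for further links;
            # external URLs are merely checked.
            if urlparse(url).hostname != start_host:
                continue
            if "html" not in resp.headers.get("Content-Type", ""):
                continue
            soup = BeautifulSoup(resp.text, "html.parser")
            # "Normal" links, images, frames and backgrounds, as above.
            for tag, attr in (("a", "href"), ("img", "src"),
                              ("frame", "src"), ("body", "background")):
                for element in soup.find_all(tag):
                    value = element.get(attr)
                    if value:
                        target = urljoin(url, value)
                        if target.startswith("http") and target not in seen:
                            seen.add(target)
                            queue.append(target)

    check_site("https://example.com/")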
Additional features:
Robots.txt support (see the first sketch after this list)
Detect remote loading of images (GeoCities sabotages this; see the second sketch after this list)
Solution for leftover TGH*.* files in the temp directory
Command-line parameters (actually, this has already been done, for a client who agreed to pay my development time to two people I support. If you need something similar, e-mail me; the price is a $300 DONATION to be sent to a person I support)
Names of the last checked URLs also in the File menu
Automatic saving every minute
A correctly working "Update" feature that rechecks changed sites (tricky, so I will never do it)
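Robots.txt support could look like the following minimal sketch, using only Python's standard library; the "Xenu" user-agent string and the example.com URLs are assumptions for illustration, not taken from the tool.

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the file

    # Before queueing a URL, ask whether the crawler may fetch it.
    if rp.can_fetch("Xenu", "https://example.com/private/page.html"):
        print("allowed: safe to check this link")
    else:
        print("disallowed by robots.txt: skip it")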
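Detecting remote loading of images comes down to comparing hosts: an image counts as remote when its resolved URL points to a different host than the page that embeds it. A sketch, with an illustrative function name and example URLs:

    from urllib.parse import urljoin, urlparse

    def is_remote_image(page_url, img_src):
        """Return True if img_src resolves to a host other than the page's."""
        resolved = urljoin(page_url, img_src)  # handles relative src values
        return urlparse(resolved).hostname != urlparse(page_url).hostname

    print(is_remote_image("http://www.geocities.com/~user/index.html",
                          "http://other.example.com/banner.gif"))  # True
    print(is_remote_image("http://www.geocities.com/~user/index.html",
                          "images/banner.gif"))                    # False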
Ideas from Chris:
What about identifying how many steps it takes to reach a particular page from the home page, and how many kilobytes have to be downloaded before one can get there? (See the sketch below.)
[TH: useful, e.g., to see which steps a user must take to reach the page for a particular product]
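A sketch of Chris's idea: a breadth-first search from the home page yields, for every page, the minimum number of clicks and the bytes downloaded along that path. The links and sizes maps are assumed to come from a finished crawl; their names are illustrative, not part of the tool.

    from collections import deque

    def reachability(links, sizes, home):
        """links: page URL -> URLs it links to; sizes: page URL -> bytes.
        Returns, per URL, (minimum clicks from home, bytes along that path)."""
        stats = {home: (0, sizes.get(home, 0))}
        queue = deque([home])
        while queue:
            page = queue.popleft()
            clicks, cost = stats[page]
            for target in links.get(page, ()):
                if target not in stats:
                    stats[target] = (clicks + 1, cost + sizes.get(target, 0))
                    queue.append(target)
        return stats

    links = {"/": ["/products", "/about"], "/products": ["/products/widget"]}
    sizes = {"/": 12000, "/products": 30000, "/products/widget": 8000}
    print(reachability(links, sizes, "/"))
    # {'/': (0, 12000), '/products': (1, 42000), '/about': (1, 12000),
    #  '/products/widget': (2, 50000)}

Note that breadth-first search minimizes the number of clicks; the byte total belongs to the first shortest path found, which is not necessarily the cheapest one in kilobytes.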