mirror of
https://github.com/Hopiu/linkchecker.git
synced 2026-04-11 10:00:58 +00:00
check links in web documents or full websites
While this flag can be abused, it seems to me like a legitimate use case: you want to check a fairly small document for mistakes, and it includes references to a website whose robots.txt denies all robots. It turns out that most websites do *not* add a permission for LinkChecker to use their site, and some sites, like the Debian BTS for example, are very hostile to bots in general. Between me using linkchecker and me using my web browser to check those links one by one, there is not a big difference. In fact, using linkchecker may be *better* for the website because it will use HEAD requests instead of GET, and will not fetch all page elements (JavaScript, images, etc.), which can often be fairly big.

Besides, hostile users will patch the software themselves: it took me only a few minutes to disable the check, and a few more to turn that into a proper patch. By forcing robots.txt without any other option, we are hurting our good users and not keeping hostile users from doing harm.

The patch is still incomplete, but works. It lacks documentation and unit tests.

Closes: #508
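For context, the robots.txt check that this patch makes optional boils down to asking a parsed robots.txt whether a given user agent may fetch a URL. A minimal sketch using Python 3's standard `urllib.robotparser` (called `robotparser` in the Python 2 era this code base targets); the user-agent string and URL here are illustrative, not LinkChecker's actual values:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that denies all robots, like the sites described above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A robots.txt-honoring checker would refuse this URL because the site
# denies all user agents; the patch adds a way to skip this check.
allowed = parser.can_fetch("ExampleBot", "http://www.example.com/page.html")
print(allowed)  # False: a deny-all robots.txt blocks every robot
```

The same deny-all result is what you get on sites that never listed LinkChecker explicitly, which is why the check blocks so many otherwise harmless runs.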
| Path |
|---|
| cgi-bin |
| config |
| doc |
| linkcheck |
| po |
| scripts |
| tests |
| third_party |
| windows |
| .gitattributes |
| .gitignore |
| .project |
| .pydevproject |
| .travis.yml |
| COPYING |
| install-rpm.sh |
| linkchecker |
| linkchecker.freecode |
| Makefile |
| MANIFEST.in |
| README.rst |
| requirements.txt |
| robots.txt |
| setup.cfg |
| setup.py |
LinkChecker
===========

|Build Status|_ |Latest Version|_ |License|_

.. |Build Status| image:: https://travis-ci.org/wummel/linkchecker.svg?branch=master
.. _Build Status: https://travis-ci.org/wummel/linkchecker
.. |Latest Version| image:: http://img.shields.io/pypi/v/LinkChecker.svg
.. _Latest Version: https://pypi.python.org/pypi/LinkChecker
.. |License| image:: http://img.shields.io/badge/license-GPL2-d49a6a.svg
.. _License: http://opensource.org/licenses/GPL-2.0

Check for broken links in web sites.

Features
--------

- recursive and multithreaded checking and site crawling
- output in colored or normal text, HTML, SQL, CSV, XML or a sitemap graph in different formats
- HTTP/1.1, HTTPS, FTP, mailto:, news:, nntp:, Telnet and local file links support
- restrict link checking with regular expression filters for URLs
- proxy support
- username/password authorization for HTTP, FTP and Telnet
- honors robots.txt exclusion protocol
- Cookie support
- HTML5 support
- a command line and web interface
- various check plugins available, e.g. HTML syntax and antivirus checks

Installation
------------

See doc/install.txt in the source code archive. Python 2.7.2 or later is needed.

Usage
-----

Execute ``linkchecker http://www.example.com``. For other options see ``linkchecker --help``.