Add missing option and settings documentation

Chris Mayo 2020-08-08 17:07:27 +01:00
parent 8e8f7a1668
commit d62490d17a
8 changed files with 1047 additions and 871 deletions

File diff suppressed because it is too large

File diff suppressed because it is too large


@ -1,6 +1,6 @@
.\" Man page generated from reStructuredText.
.
.TH "LINKCHECKER" "1" "August 06, 2020" "" "LinkChecker"
.TH "LINKCHECKER" "1" "August 08, 2020" "" "LinkChecker"
.SH NAME
linkchecker \- command line client to check HTML documents and websites for broken links
.
@ -250,6 +250,11 @@ Check URLs that match the regular expression, but do not recurse into them
.UNINDENT
.INDENT 0.0
.TP
.B \-\-no\-robots
Check URLs regardless of any robots.txt files.
.UNINDENT
.INDENT 0.0
.TP
.B \-p, \-\-password
Read a password from the command line and use it for HTTP and FTP authorization. For FTP the default password is anonymous@. For HTTP there is no default password. See also \fI\%\-u\fP\&.
.UNINDENT


@ -1,6 +1,6 @@
.\" Man page generated from reStructuredText.
.
.TH "LINKCHECKERRC" "5" "August 06, 2020" "" "LinkChecker"
.TH "LINKCHECKERRC" "5" "August 08, 2020" "" "LinkChecker"
.SH NAME
linkcheckerrc \- configuration file for LinkChecker
.
@ -40,6 +40,11 @@ level margin: \\n[rst2man-indent\\n[rst2man-indent-level]]
\fBcookiefile=\fP\fIFILENAME\fP
Read a file with cookie data. The cookie data format is explained in \fBlinkchecker(1)\fP\&. Command line option: \fB\-\-cookiefile\fP
.TP
\fBdebugmemory=\fP[\fB0\fP|\fB1\fP]
Write memory allocation statistics to a file on exit, requires \fI\%meliae\fP\&.
The default is not to write the file.
Command line option: none
.TP
\fBlocalwebroot=\fP\fISTRING\fP
Note that the given directory must be in URL syntax, i.e. it must use a forward slash instead of a backslash to join directories, and it must end with a slash. Command line option: none
.TP
@ -70,14 +75,35 @@ If the value is zero SSL certificates are not verified. If it is set to
\fBmaxrunseconds=\fP\fINUMBER\fP
Stop checking new URLs after the given number of seconds. This is the same as if the user stops after the given number of seconds (by hitting Ctrl\-C). Command line option: none
.TP
\fBmaxfilesizedownload=\fP\fINUMBER\fP
Files larger than NUMBER bytes will be ignored (without downloading
anything, if accessed over HTTP and an accurate Content\-Length header
was returned). No more than this amount of a file will be downloaded.
The default is 5242880 (5 MB).
Command line option: none
.TP
\fBmaxfilesizeparse=\fP\fINUMBER\fP
Files larger than NUMBER bytes will not be parsed for links.
The default is 1048576 (1 MB).
Command line option: none
.TP
\fBmaxnumurls=\fP\fINUMBER\fP
Maximum number of URLs to check. New URLs will not be queued after the given number of URLs is checked. Command line option: none
.TP
\fBmaxrequestspersecond=\fP\fINUMBER\fP
Limit the maximum number of requests per second to one host.
The default is 10.
Command line option: none
.TP
\fBrobotstxt=\fP[\fB0\fP|\fB1\fP]
When using HTTP, fetch robots.txt and confirm whether each URL may be
fetched before checking it.
The default is to use robots.txt files.
Command line option: \fB\-\-no\-robots\fP
.TP
\fBallowedschemes=\fP\fINAME\fP[\fB,\fP\fINAME\fP\&...]
Allowed URL schemes as comma\-separated list.
Command line option: none
.UNINDENT
.SS filtering
.INDENT 0.0

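The maxfilesizedownload behaviour documented above (skipping a file outright when an accurate Content-Length header already exceeds the limit) can be sketched in Python. This is a hypothetical helper for illustration, not LinkChecker's actual implementation:

```python
# Hypothetical sketch of the maxfilesizedownload check: if the server
# reports an accurate Content-Length above the limit, skip the download.
MAX_FILE_SIZE_DOWNLOAD = 5242880  # documented default: 5 MB

def should_download(headers: dict) -> bool:
    """Return False when Content-Length reports a file over the limit."""
    length = headers.get("Content-Length")
    if length is not None and int(length) > MAX_FILE_SIZE_DOWNLOAD:
        return False
    return True

print(should_download({"Content-Length": "1024"}))      # True
print(should_download({"Content-Length": "10485760"}))  # False
```

When no Content-Length header is present, a checker must fall back to capping the number of bytes it actually reads, which is the "no more than this amount will be downloaded" half of the setting.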

@ -1,6 +1,6 @@
.\" Man page generated from reStructuredText.
.
.TH "LINKCHECKER" "1" "August 06, 2020" "" "LinkChecker"
.TH "LINKCHECKER" "1" "August 08, 2020" "" "LinkChecker"
.SH NAME
linkchecker \- command line client to check HTML documents and websites for broken links
.
@ -297,6 +297,11 @@ See section \fI\%REGULAR EXPRESSIONS\fP for more info.
.UNINDENT
.INDENT 0.0
.TP
.B \-\-no\-robots
Check URLs regardless of any robots.txt files.
.UNINDENT
.INDENT 0.0
.TP
.B \-p, \-\-password
Read a password from console and use it for HTTP and FTP
authorization. For FTP the default password is anonymous@. For

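The robots.txt gate that `--no-robots` bypasses can be illustrated with Python's standard library. This shows the general mechanism only, not LinkChecker's code, and the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; a checker would normally fetch these
# from the host's /robots.txt before crawling it.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /private/"])

# With robots.txt handling enabled (the default), a disallowed URL is
# never checked; --no-robots skips this gate entirely.
print(rp.can_fetch("LinkChecker", "https://example.com/index.html"))  # True
print(rp.can_fetch("LinkChecker", "https://example.com/private/x"))   # False
```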

@ -1,6 +1,6 @@
.\" Man page generated from reStructuredText.
.
.TH "LINKCHECKERRC" "5" "August 06, 2020" "" "LinkChecker"
.TH "LINKCHECKERRC" "5" "August 08, 2020" "" "LinkChecker"
.SH NAME
linkcheckerrc \- configuration file for LinkChecker
.
@ -45,6 +45,11 @@ Read a file with initial cookie data. The cookie data format is
explained in \fBlinkchecker(1)\fP\&.
Command line option: \fB\-\-cookiefile\fP
.TP
\fBdebugmemory=\fP[\fB0\fP|\fB1\fP]
Write memory allocation statistics to a file on exit, requires \fI\%meliae\fP\&.
The default is not to write the file.
Command line option: none
.TP
\fBlocalwebroot=\fP\fISTRING\fP
When checking absolute URLs inside local files, the given root
directory is used as base URL.
@ -100,6 +105,18 @@ seconds.
The default is not to stop until all URLs are checked.
Command line option: none
.TP
\fBmaxfilesizedownload=\fP\fINUMBER\fP
Files larger than NUMBER bytes will be ignored (without downloading
anything, if accessed over HTTP and an accurate Content\-Length header
was returned). No more than this amount of a file will be downloaded.
The default is 5242880 (5 MB).
Command line option: none
.TP
\fBmaxfilesizeparse=\fP\fINUMBER\fP
Files larger than NUMBER bytes will not be parsed for links.
The default is 1048576 (1 MB).
Command line option: none
.TP
\fBmaxnumurls=\fP\fINUMBER\fP
Maximum number of URLs to check. New URLs will not be queued after
the given number of URLs is checked.
@ -108,9 +125,18 @@ Command line option: none
.TP
\fBmaxrequestspersecond=\fP\fINUMBER\fP
Limit the maximum number of requests per second to one host.
The default is 10.
Command line option: none
.TP
\fBrobotstxt=\fP[\fB0\fP|\fB1\fP]
When using HTTP, fetch robots.txt and confirm whether each URL may be
fetched before checking it.
The default is to use robots.txt files.
Command line option: \fB\-\-no\-robots\fP
.TP
\fBallowedschemes=\fP\fINAME\fP[\fB,\fP\fINAME\fP\&...]
Allowed URL schemes as comma\-separated list.
Command line option: none
.UNINDENT
.SS filtering
.INDENT 0.0


@ -208,6 +208,10 @@ Checking options
   This option can be given multiple times.
   See section `REGULAR EXPRESSIONS`_ for more info.

.. option:: --no-robots

   Check URLs regardless of any robots.txt files.

.. option:: -p, --password

   Read a password from console and use it for HTTP and FTP


@ -21,6 +21,10 @@ checking
    Read a file with initial cookie data. The cookie data format is
    explained in :manpage:`linkchecker(1)`.
    Command line option: :option:`--cookiefile`
**debugmemory=**\ [**0**\ \|\ **1**]
    Write memory allocation statistics to a file on exit, requires :pypi:`meliae`.
    The default is not to write the file.
    Command line option: none
**localwebroot=**\ *STRING*
    When checking absolute URLs inside local files, the given root
    directory is used as base URL.
@ -67,6 +71,16 @@ checking
    seconds.
    The default is not to stop until all URLs are checked.
    Command line option: none
**maxfilesizedownload=**\ *NUMBER*
    Files larger than NUMBER bytes will be ignored (without downloading
    anything, if accessed over HTTP and an accurate Content-Length header
    was returned). No more than this amount of a file will be downloaded.
    The default is 5242880 (5 MB).
    Command line option: none
**maxfilesizeparse=**\ *NUMBER*
    Files larger than NUMBER bytes will not be parsed for links.
    The default is 1048576 (1 MB).
    Command line option: none
**maxnumurls=**\ *NUMBER*
    Maximum number of URLs to check. New URLs will not be queued after
    the given number of URLs is checked.
@ -74,8 +88,16 @@ checking
    Command line option: none
**maxrequestspersecond=**\ *NUMBER*
    Limit the maximum number of requests per second to one host.
    The default is 10.
    Command line option: none
**robotstxt=**\ [**0**\ \|\ **1**]
    When using HTTP, fetch robots.txt and confirm whether each URL may be
    fetched before checking it.
    The default is to use robots.txt files.
    Command line option: :option:`--no-robots`
**allowedschemes=**\ *NAME*\ [**,**\ *NAME*...]
    Allowed URL schemes as comma-separated list.
    Command line option: none
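Taken together, the settings documented in this section live in the checking section of a linkcheckerrc file. A sketch of such a fragment, assuming the documented defaults (the robotstxt and allowedschemes values are example choices, not recommendations):

```ini
# Illustrative linkcheckerrc fragment for the settings documented above.
# Numeric values are the documented defaults.
[checking]
maxfilesizedownload=5242880
maxfilesizeparse=1048576
maxrequestspersecond=10
# Set to 0 to ignore robots.txt files (equivalent to --no-robots)
robotstxt=1
allowedschemes=http,https
debugmemory=0
```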
filtering
^^^^^^^^^