mirror of
https://github.com/Hopiu/linkchecker.git
synced 2026-03-17 06:20:27 +00:00
80 lines
3.4 KiB
Diff
--- linkchecker.1.html.orig	2011-06-14 21:14:55.016011206 +0200
+++ linkchecker.1.html	2011-06-14 21:17:07.108913849 +0200
@@ -38,7 +38,7 @@
 
 The most common use checks the given domain recursively, plus any
 URL pointing outside of the domain:
-<BR> <B>linkchecker <A HREF="http://www.example.net/">http://www.example.net/</A></B>
+<BR> <B>linkchecker http://www.example.net/</B>
 <BR>
 
 Beware that this checks the whole site which can have thousands of URLs.
@@ -59,15 +59,15 @@
 <BR>
 
 You can skip the <B>http://</B> url part if the domain starts with <B>www.</B>:
-<BR> <B>linkchecker <A HREF="http://www.example.com">www.example.com</A></B>
+<BR> <B>linkchecker www.example.com</B>
 <BR>
 
 You can skip the <B>ftp://</B> url part if the domain starts with <B>ftp.</B>:
-<BR> <B>linkchecker -r0 <A HREF="ftp://ftp.example.org">ftp.example.org</A></B>
+<BR> <B>linkchecker -r0 ftp.example.org</B>
 <BR>
 
 Generate a sitemap graph and convert it with the graphviz dot utility:
-<BR> <B>linkchecker -odot -v <A HREF="http://www.example.com">www.example.com</A> | dot -Tps > sitemap.ps</B>
+<BR> <B>linkchecker -odot -v www.example.com | dot -Tps > sitemap.ps</B>
 <A NAME="lbAF"> </A>
 <H2>OPTIONS</H2>
 
@@ -302,8 +302,8 @@
 
 Multiple entries are separated by a blank line.
 The example below will send two cookies to all URLs starting with
-<B><A HREF="http://example.com/hello/">http://example.com/hello/</A></B> and one to all URLs starting
-with <B><A HREF="https://example.org/">https://example.org/</A></B>:
+<B>http://example.com/hello/</B> and one to all URLs starting
+with <B>https://example.org/</B>:
 <P>
 <BR> Host: example.com
 <BR> Path: /hello
@@ -326,15 +326,15 @@
 variables to ignore any proxy settings for these domains.
 Setting a HTTP proxy on Unix for example looks like this:
 <P>
-<BR> export http_proxy="<A HREF="http://proxy.example.com:8080">http://proxy.example.com:8080</A>"
+<BR> export http_proxy="http://proxy.example.com:8080"
 <P>
 Proxy authentication is also supported:
 <P>
-<BR> export http_proxy="<A HREF="http://user1:mypass@proxy.example.org:8081">http://user1:mypass@proxy.example.org:8081</A>"
+<BR> export http_proxy="http://user1:mypass@proxy.example.org:8081"
 <P>
 Setting a proxy on the Windows command prompt:
 <P>
-<BR> set http_proxy=<A HREF="http://proxy.example.com:8080">http://proxy.example.com:8080</A>
+<BR> set http_proxy=http://proxy.example.com:8080
 <P>
 <A NAME="lbAO"> </A>
 <H2>PERFORMED CHECKS</H2>
@@ -470,8 +470,8 @@
 <H2>NOTES</H2>
 
 URLs on the commandline starting with <B>ftp.</B> are treated like
-<B><A HREF="ftp://ftp.">ftp://ftp.</A></B>, URLs starting with <B>www.</B> are treated like
-<B><A HREF="http://www.">http://www.</A></B>.
+<B>ftp://ftp.</B>, URLs starting with <B>www.</B> are treated like
+<B>http://www.</B>.
 You can also give local files as arguments.
 <P>
 If you have your system configured to automatically establish a
@@ -584,7 +584,7 @@
 </DL>
 <HR>
 This document was created by
-<A HREF="/cgi-bin/man/man2html">man2html</A>,
+man2html,
 using the manual pages.<BR>
 </BODY>
 </HTML>