*** empty log message ***

git-svn-id: https://linkchecker.svn.sourceforge.net/svnroot/linkchecker/trunk/linkchecker@374 e7d03fd6-7b0d-0410-9947-9c21f3af8025
calvin 2002-02-24 12:29:35 +00:00
parent 62e27e905d
commit 11a13a7d58
38 changed files with 338 additions and 312 deletions

@@ -10,4 +10,5 @@ VERSION
_linkchecker_configdata.py
js
locale
share
Packages.gz

@@ -1,5 +1,5 @@
# $Id$
import sys, re, getopt, socket
import os, sys, re, getopt, socket
import DNS,DNS.Lib,DNS.Type,DNS.Class,DNS.Opcode
#import asyncore
@@ -24,7 +24,6 @@ def init_dns_resolver():
defaults['server'] = []
defaults['search_domains'] = []
# platform specific config
import os
if os.name=="posix":
init_dns_resolver_posix()
elif os.name=="nt":
@@ -40,6 +39,8 @@ def init_dns_resolver():
def init_dns_resolver_posix():
"parses the /etc/resolv.conf file and sets defaults for name servers"
if not os.path.exists('/etc/resolv.conf'):
return
# XXX this needs to be dynamic?
global defaults
for line in open('/etc/resolv.conf', 'r').readlines():
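The hunk above touches init_dns_resolver_posix(), which scans /etc/resolv.conf and, per the changelog, no longer bails out when the file is missing. A minimal sketch of that kind of parser (function name and return shape are illustrative, not the DNS module's actual API):

```python
import os

def parse_resolv_conf(path="/etc/resolv.conf"):
    """Collect nameserver and search entries; return empty defaults
    when the file is absent instead of raising."""
    servers, search = [], []
    if not os.path.exists(path):
        return servers, search  # mirror the "don't bail out" fix
    with open(path) as fd:
        for line in fd:
            line = line.strip()
            if not line or line.startswith(("#", ";")):
                continue  # skip comments and blanks
            parts = line.split()
            if parts[0] == "nameserver" and len(parts) > 1:
                servers.append(parts[1])
            elif parts[0] in ("search", "domain"):
                search.extend(parts[1:])
    return servers, search
```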

@@ -44,6 +44,9 @@ deb_local: cleandeb
# standard for local use
fakeroot debian/rules binary
deb_localsigned:
debuild -sgpg -pgpg -k32EC6F3E -rfakeroot
deb_signed: cleandeb
# ready for upload, signed with my GPG key
env CVSROOT=:pserver:anonymous@cvs.linkchecker.sourceforge.net:/cvsroot/linkchecker cvs-buildpackage -Mlinkchecker -W/home/calvin/projects/cvs-build -sgpg -pgpg -k32EC6F3E -rfakeroot
@@ -78,5 +81,8 @@ locale:
timeouttest:
$(PYTHON) $(PACKAGE) -v --timeout=0 mailto:root@aol.com
tar: distclean
cd .. && tar cjvf linkchecker.tar.bz2 linkchecker
.PHONY: all clean cleandeb distclean files upload test timeouttest locale
.PHONY: onlinetest config dist deb_local deb_signed deb_unsigned
.PHONY: onlinetest config dist deb_local deb_signed deb_unsigned tar

README
@@ -27,8 +27,8 @@ Read the file INSTALL.
License and Credits
-------------------
LinkChecker is licensed under the GNU Public License.
Credits go to Guido van Rossum for making Python. His hovercraft is
full of eels!
Credits go to Guido van Rossum and his team for making Python.
His hovercraft is full of eels!
As this program is directly derived from my Java link checker, additional
credits go to Robert Forsman (the author of JCheckLinks) and his
robots.txt parse algorithm.
@@ -52,9 +52,9 @@ Included packages
-----------------
DNS from http://pydns.sourceforge.net/
fcgi.py and sz_fcgi.py from http://saarland.sz-sb.de/~ajung/sz_fcgi/
fintl.py from http://sourceforge.net/snippet/detail.php?type=snippet&id=100059
CSV from http://eh.org/~laurie/comp/python/csv/index.html
Note that all included packages are modified by me.
Note that included packages are modified by me.
Internationalization
@@ -75,7 +75,7 @@ commandline options and stores them in a Config object.
(2) Which leads us directly to the Config class. This class stores all
options and works a little magic: it tries to find out if your platform
supports threads. If so, threading is enabled. If not, it is disabled.
Several functions are replaced with their threaded equivalents if
Several functions are replaced with their threaded equivalents if
threading is enabled.
Another thing are config files. A Config object reads config file options
on initialization so they get handled before any commandline options.
@@ -83,7 +83,7 @@ on initialization so they get handled before any commandline options.
(3) The linkchecker script finally calls linkcheck.checkUrls(), which
calls linkcheck.Config.checkUrl(), which calls linkcheck.UrlData.check().
An UrlData object represents a single URL with all attached data like
validity, check time and so on. These values are filled by the
validity, check time and so on. These values are filled by the
UrlData.check() function.
Derived from the base class UrlData are the different URL types:
HttpUrlData for http:// links, MailtoUrlData for mailto: links, etc.
@@ -94,6 +94,6 @@ the subclasses define functions needed for their URL type.
(4) Lets look at the output. Every output is defined in a Logger class.
Each logger has functions init(), newUrl() and endOfOutput().
We call init() once to initialize the Logger. UrlData.check() calls
newUrl() (through UrlData.logMe()) for each new URL and after all
newUrl() (through UrlData.logMe()) for each new URL and after all
checking is done we call endOfOutput(). Easy.
New loggers are created with the Config.newLogger function.
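The logger protocol described in point (4) of the README can be sketched like this (Python 3 here for brevity; the real classes live in Logging.py and this attribute layout is an assumption, not the module's exact shape):

```python
import sys
import time

class Logger:
    """Minimal sketch of the described interface: init() once,
    newUrl() for every checked URL, endOfOutput() at the end."""
    def __init__(self, fd=sys.stdout):
        self.fd = fd
        self.errors = 0

    def init(self):
        # called once before checking starts
        self.starttime = time.time()
        self.fd.write("Start checking\n")

    def newUrl(self, urldata):
        # urldata is assumed to carry url/valid attributes like UrlData
        if not urldata.valid:
            self.errors += 1
        self.fd.write("URL %s: %s\n" %
                      (urldata.url, "ok" if urldata.valid else "error"))

    def endOfOutput(self):
        # called after all checking is done
        self.fd.write("errors: %d\n" % self.errors)
```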

debian/README.Debian
@@ -0,0 +1,4 @@
On Debian systems, you have a simple CGI script located at
http://localhost/doc/linkchecker/lconline/
For installation of other CGI scripts, see README

debian/changelog
@@ -1,3 +1,18 @@
linkchecker (1.3.22) unstable; urgency=low
* last release before 1.4.0
* CGI scripts don't raise Exception when called with missing form
parameters.
* The CGI form checking function checks also the language parameter.
* Updated documentation.
* Don't bail out if /etc/resolv.conf could not be found on DNS init.
* corrected installation of locale files
* install linkchecker.1 in share/man/man1, not man/man1
* Add support for multiple init_gettext calls, this is needed by CGI
scripts
-- Bastian Kleineidam <calvin@debian.org> Sat, 23 Feb 2002 21:35:52 +0100
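The "multiple init_gettext calls" changelog item matters because a long-running CGI/FastCGI process may switch output languages per request. A hedged sketch of an idempotent initializer (the function name comes from the changelog; the caching scheme and defaults are illustrative assumptions):

```python
import gettext

# cache of the currently installed language (illustrative, not the real module state)
_current = {"lang": None, "func": lambda s: s}

def init_gettext(lang, localedir="locale", domain="linkchecker"):
    """Install a translation; safe to call repeatedly, including
    with a different language on each call."""
    if lang == _current["lang"]:
        return _current["func"]  # repeated call for same language is a no-op
    # fallback=True yields a pass-through NullTranslations when no catalog exists
    trans = gettext.translation(domain, localedir, languages=[lang], fallback=True)
    _current["lang"] = lang
    _current["func"] = trans.gettext
    return _current["func"]
```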
linkchecker (1.3.21) unstable; urgency=low
* really fix the keyboardinterrupt thing

@@ -1,4 +1,5 @@
README
debian/README.Debian
TODO
FAQ
draft-gilman-news-url-00.txt

@@ -20,6 +20,8 @@ case "$1" in
/bin/sh -c "$PYTHON -O -c $COMMAND $SITEPACKAGES/$i"
/bin/sh -c "$PYTHON -c $COMMAND $SITEPACKAGES/$i"
done
# for later use of python-central
#/usr/sbin/register-python-package module configure linkchecker ">=2.0"
;;
*)
echo "postinst called with unknown argument \`$1'" >&2

@@ -15,4 +15,7 @@ dpkg --listfiles $PACKAGE |
rmdir /usr/lib/$PYTHON/site-packages/linkcheck 2>/dev/null || true
rmdir /usr/lib/$PYTHON/site-packages/DNS 2>/dev/null || true
# for later use of python-central
#/usr/sbin/register-python-package module remove linkchecker ">=2.0"
exit 0

debian/rules
@@ -39,9 +39,7 @@ install: build
dh_installdirs
$(MAKE) locale
$(PYTHON) setup.py install --root=$(CURDIR)/debian/$(PACKAGE) --no-compile
# remove man pages, we install them with dh_installman
rm -r debian/$(PACKAGE)/usr/man
# remove files, we install them below
# remove example files, we install them below
rm -r debian/$(PACKAGE)/usr/share/linkchecker/examples
# install additional doc files
install -c -m 644 DNS/README $(DOCDIR)/README_DNS.txt
@@ -60,7 +58,7 @@ binary-indep: build install
dh_testdir
dh_testroot
dh_installdocs
dh_installman linkchecker.1
#dh_installman linkchecker.1
dh_installchangelogs
dh_link
dh_strip

lc.cgi
@@ -15,26 +15,23 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import re,cgi,sys,urlparse,time,os
import re, cgi, sys, urlparse, time, os
sys.stderr = sys.stdout
def testit():
cgi.test()
sys.exit(0)
# main
print "Content-type: text/html"
print "Cache-Control: no-cache"
print
# uncomment the following line to test your CGI values
#testit()
# main
print "Content-type: text/html"
# http/1.1
print "Cache-Control: no-cache"
# http/1.0
print "Pragma: no-cache"
print
form = cgi.FieldStorage()
if form['language'].value == 'de':
os.environ['LC_MESSAGES'] = 'de'
elif form['language'].value == 'fr':
os.environ['LC_MESSAGES'] = 'fr'
else:
os.environ['LC_MESSAGES'] = 'C'
import linkcheck
if not linkcheck.lc_cgi.checkform(form):
linkcheck.lc_cgi.logit(form, form)
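The hunk above maps the form's language field to LC_MESSAGES with explicit if/elif branches, and the changelog notes the CGI scripts no longer raise on missing form parameters. That defensive pattern, reduced to a pure function over a plain dict of form values (helper name and whitelist are illustrative):

```python
SUPPORTED_LANGUAGES = {"de", "fr"}

def select_locale(form):
    """Return an LC_MESSAGES value for a form dict, defaulting to "C"
    when the language parameter is missing or unsupported."""
    lang = form.get("language", "C")
    return lang if lang in SUPPORTED_LANGUAGES else "C"
```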

@@ -16,8 +16,6 @@
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import sys, re, os
dist_dir = "/home/calvin/projects/linkchecker"
sys.path.insert(0,dist_dir)
import fcgi
# main
@@ -26,14 +24,9 @@ try:
req = fcgi.FCGI()
req.out.write("Content-type: text/html\r\n"
"Cache-Control: no-cache\r\n"
"Pragma: no-cache\r\n"
"\r\n")
form = req.getFieldStorage()
if form['language'].value == 'de':
os.environ['LC_MESSAGES'] = 'de'
elif form['language'].value == 'fr':
os.environ['LC_MESSAGES'] = 'fr'
else:
os.environ['LC_MESSAGES'] = 'C'
import linkcheck
if not linkcheck.lc_cgi.checkform(form):
linkcheck.lc_cgi.logit(form, req.env)

@@ -16,13 +16,12 @@
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import sys,re,thread
dist_dir = "/home/calvin/projects/linkchecker"
sys.path.insert(0,dist_dir)
import sz_fcgi, linkcheck
def func(fcg, req):
req.out.write("Content-type: text/html\r\n"
"Cache-Control: no-cache\r\n"
"Pragma: no-cache\r\n"
"\r\n")
form = req.getFieldStorage()
if not linkcheck.lc_cgi.checkform(form):

@@ -1,9 +1,9 @@
<html><head>
<title>LinkChecker Online</title>
</head>
<frameset rows="40%,60%" border="1" frameborder="0" framespacing="0">
<frame name="formular" src="lc_cgi.html" noresize frameborder="0">
<frame name="links" src="leer.html" noresize frameborder="0">
<frameset rows="40%,60%" border=1 frameborder=0 framespacing=0>
<frame name=formular src="lc_cgi.html" noresize frameborder=0>
<frame name=links src="leer.html" noresize frameborder=0>
<noframes>
Please use a frame capable browser.
</noframes>

@@ -1,52 +1,51 @@
<html><head>
<title>LinkChecker Online</title>
</head>
<body text="#192c83" bgcolor="#fff7e5" link="#191c83" vlink="#191c83"
alink="#191c83" >
<body text=#192c83 bgcolor=#fff7e5 link=#191c83 vlink=#191c83 alink=#191c83>
<font face="Lucida,Verdana,Arial,sans-serif">
<center><h2>LinkChecker Online</h2>
(powered by <a href="http://linkchecker.sourceforge.net/"
target="_top">LinkChecker</a>)
target=_top>LinkChecker</a>)
</center>
<blockquote>
<form method="POST" action="http://localhost/cgi-bin/lconline/lc.cgi"
target="links">
<table border=0>
<form method=POST action="http://localhost/cgi-bin/lconline/lc.cgi"
target=links>
<table border=0 cellpadding=2 cellspacing=0 summary=''>
<tr><td colspan=2 bgcolor="#fff7e5">
<font face="Lucida,Verdana,Arial,sans-serif">URL:
<input size=70 name="url" value="http://">
<input type="submit" value="Go!"></font></td></tr>
<font face="Lucida,Verdana,Arial,sans-serif">URL:
<input size=70 name=url value="http://">
<input type=submit value="Go!"></font></td></tr>
<td><font face="Lucida,Verdana,Arial,sans-serif">Recursion Level:
<select name="level">
<option> 0
<option selected> 1
<option> 2
<option> 3
<select name=level>
<option value=0>0</option>
<option value=1 selected>1</option>
<option value=2>2</option>
<option value=3>3</option>
</select>
</font></td>
<td><font face="Lucida,Verdana,Arial,sans-serif">
Check anchors in HTML: <input type="checkbox" name="anchors" checked>
Check anchors in HTML: <input type=checkbox name=anchors checked>
</font></td></tr>
<tr><td>
<font face="Lucida,Verdana,Arial,sans-serif">
Log only errors: <input type="checkbox" name="errors">
Log only errors: <input type=checkbox name=errors>
</font></td>
<td><font face="Lucida,Verdana,Arial,sans-serif">
Check only intern links: <input type="checkbox" name="intern" checked>
Check only intern links: <input type=checkbox name=intern checked>
</font></td>
</tr>
<tr>
<td><font face="Lucida,Verdana,Arial,sans-serif">
Output language: <select name="language">
<option selected> English
<option value="de"> Deutsch
<option value="fr"> Fran&ccedil;ais
<option value=C selected>English</option>
<option value=de>Deutsch</option>
<option value=fr>Fran&ccedil;ais</option>
</select>
</font></td>
<td><font face="Lucida,Verdana,Arial,sans-serif">
Check strict intern links: <input type="checkbox" name="strict">
Check strict intern links: <input type=checkbox name=strict>
</font></td>
</tr>
</table>

@@ -1,7 +1,6 @@
<html><head>
<title>Empty</title>
</head>
<body text="#192c83" bgcolor="#fff7e5" link="#191c83" vlink="#191c83"
alink="#191c83" >
<body text=#192c83 bgcolor=#fff7e5 link=#191c83 vlink=#191c83 alink=#191c83>
No links checked, dude!
</body></html>

@@ -16,11 +16,10 @@
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import ConfigParser, sys, os, re, UserDict, string, time
import Logging, _linkchecker_configdata
import Logging, _linkchecker_configdata, linkcheck
from os.path import expanduser,normpath,normcase,join,isfile
from types import StringType
from urllib import getproxies
from linkcheck import _
from debuglevels import *
Version = _linkchecker_configdata.version
@@ -384,10 +383,10 @@ class Configuration(UserDict.UserDict):
self.readConfig(files)
def warn(self, msg):
self.message(_("warning: %s")%msg)
self.message(linkcheck._("warning: %s")%msg)
def error(self, msg):
self.message(_("error: %s")%msg)
self.message(linkcheck._("error: %s")%msg)
def message(self, msg):
print >> sys.stderr, msg
@@ -419,7 +418,7 @@ class Configuration(UserDict.UserDict):
if Loggers.has_key(log):
self['log'] = self.newLogger(log)
else:
self.warn(_("invalid log option '%s'") % log)
self.warn(linkcheck._("invalid log option '%s'") % log)
except ConfigParser.Error, msg:
debug(HURT_ME_PLENTY, msg)
try:
@@ -455,7 +454,7 @@ class Configuration(UserDict.UserDict):
try:
num = cfgparser.getint(section, "recursionlevel")
if num<0:
self.error(_("illegal recursionlevel number %d") % num)
self.error(linkcheck._("illegal recursionlevel number %d") % num)
self["recursionlevel"] = num
except ConfigParser.Error, msg: debug(HURT_ME_PLENTY, msg)
try:

@@ -15,9 +15,8 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import re, os, urlparse, urllib
import re, os, urlparse, urllib, linkcheck
from UrlData import UrlData, ExcList
from linkcheck import _
# OSError is thrown on Windows when a file is not found
ExcList.append(OSError)

@@ -15,9 +15,8 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import ftplib,linkcheck
import ftplib, linkcheck
from UrlData import UrlData,ExcList
from linkcheck import _
ExcList.extend([
ftplib.error_reply,
@@ -34,15 +33,15 @@ class FtpUrlData(UrlData):
def checkConnection(self, config):
_user, _password = self._getUserPassword(config)
if _user is None or _password is None:
raise linkcheck.error, _("No user or password found")
raise linkcheck.error, linkcheck._("No user or password found")
try:
self.urlConnection = ftplib.FTP(self.urlTuple[1], _user, _password)
except EOFError:
raise linkcheck.error, _("Remote host has closed connection")
raise linkcheck.error, linkcheck._("Remote host has closed connection")
info = self.urlConnection.getwelcome()
if not info:
self.closeConnection()
raise linkcheck.error, _("Got no answer from FTP server")
raise linkcheck.error, linkcheck._("Got no answer from FTP server")
self.setInfo(info)

@@ -16,7 +16,6 @@
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
from UrlData import UrlData
from linkcheck import _
class GopherUrlData(UrlData):
"Url link with gopher scheme"

@@ -15,9 +15,8 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import socket
import socket, linkcheck
from UrlData import UrlData
from linkcheck import _
class HostCheckingUrlData(UrlData):
"Url link for which we have to connect to a specific host"
@@ -42,4 +41,4 @@ class HostCheckingUrlData(UrlData):
def checkConnection(self, config):
ip = socket.gethostbyname(self.host)
self.setValid(self.host+"("+ip+") "+_("found"))
self.setValid(self.host+"("+ip+") "+linkcheck._("found"))

@@ -16,14 +16,14 @@
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import httplib, urlparse, sys, time, re
import Config, StringUtil, robotparser
import Config, StringUtil, robotparser, linkcheck
if Config.DebugLevel > 0:
robotparser.debug = 1
from UrlData import UrlData
from urllib import splittype, splithost, splituser, splitpasswd
from linkcheck import _
from debuglevels import *
class HttpUrlData(UrlData):
"Url link with http scheme"
netscape_re = re.compile("Netscape-Enterprise/")
@@ -75,9 +75,9 @@ class HttpUrlData(UrlData):
self.auth = None
self.proxyauth = None
if not self.urlTuple[2]:
self.setWarning(_("Missing '/' at end of URL"))
self.setWarning(linkcheck._("Missing '/' at end of URL"))
if config["robotstxt"] and not self.robotsTxtAllowsUrl(config):
self.setWarning(_("Access denied by robots.txt, checked only syntax"))
self.setWarning(linkcheck._("Access denied by robots.txt, checked only syntax"))
return
# first try
@@ -113,7 +113,7 @@ class HttpUrlData(UrlData):
Config.debug(BRING_IT_ON, "Redirected", self.mime)
tries += 1
if tries >= 5:
self.setError(_("too much redirections (>= 5)"))
self.setError(linkcheck._("too much redirections (>= 5)"))
return
# user authentication
@@ -136,13 +136,13 @@ class HttpUrlData(UrlData):
# content-type
elif status in [405,501,500]:
# HEAD method not allowed ==> try get
self.setWarning(_("Server does not support HEAD request (got "
self.setWarning(linkcheck._("Server does not support HEAD request (got "
"%d status), falling back to GET")%status)
status, statusText, self.mime = self._getHttpRequest("GET")
elif status>=400 and self.mime:
server = self.mime.getheader("Server")
if server and self.netscape_re.search(server):
self.setWarning(_("Netscape Enterprise Server with no "
self.setWarning(linkcheck._("Netscape Enterprise Server with no "
"HEAD support, falling back to GET"))
status,statusText,self.mime = self._getHttpRequest("GET")
elif self.mime:
@@ -152,7 +152,7 @@ class HttpUrlData(UrlData):
if type=='application/octet-stream' and \
((poweredby and poweredby[:4]=='Zope') or \
(server and server[:4]=='Zope')):
self.setWarning(_("Zope Server cannot determine MIME type"
self.setWarning(linkcheck._("Zope Server cannot determine MIME type"
" with HEAD, falling back to GET"))
status,statusText,self.mime = self._getHttpRequest("GET")
@@ -160,15 +160,15 @@ class HttpUrlData(UrlData):
effectiveurl = urlparse.urlunparse(self.urlTuple)
if self.url != effectiveurl:
self.setWarning(_("Effective URL %s") % effectiveurl)
self.setWarning(linkcheck._("Effective URL %s") % effectiveurl)
self.url = effectiveurl
if has301status:
self.setWarning(_("HTTP 301 (moved permanent) encountered: you "
self.setWarning(linkcheck._("HTTP 301 (moved permanent) encountered: you "
"should update this link"))
if self.url[-1]!='/':
self.setWarning(
_("A HTTP 301 redirection occured and the url has no "
linkcheck._("A HTTP 301 redirection occured and the url has no "
"trailing / at the end. All urls which point to (home) "
"directories should end with a / to avoid redirection"))
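The warnings in this hunk belong to HttpUrlData's HEAD-to-GET fallback: retry with GET when HEAD is rejected (status 405/501/500) or when a Netscape Enterprise server mishandles it. The decision logic, separated from the networking, can be sketched as follows (the helper name and exact matching are assumptions drawn from the diff, not the class's real method):

```python
# statuses the diff treats as "HEAD method not allowed"
FALLBACK_STATUSES = {405, 500, 501}

def needs_get_fallback(status, server_header=""):
    """True when a HEAD response should be retried with GET:
    either the method is unsupported, or a Netscape-Enterprise
    server returned an error for it."""
    if status in FALLBACK_STATUSES:
        return True
    return status >= 400 and server_header.startswith("Netscape-Enterprise/")
```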

@@ -18,7 +18,7 @@
import httplib
from UrlData import UrlData
from HttpUrlData import HttpUrlData
from linkcheck import _, Config
import linkcheck, Config
_supportHttps = hasattr(httplib, "HTTPS")
@@ -36,5 +36,5 @@ class HttpsUrlData(HttpUrlData):
if _supportHttps:
HttpUrlData._check(self, config)
else:
self.setWarning(_("HTTPS url ignored"))
self.setWarning(linkcheck._("HTTPS url ignored"))
self.logMe(config)

@@ -15,9 +15,8 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import re
import re, linkcheck
from UrlData import UrlData
from linkcheck import _
ignored_schemes_re = re.compile(r"""^(
acap # application configuration access protocol
@@ -57,5 +56,5 @@ class IgnoredUrlData(UrlData):
"""Some schemes are defined in http://www.w3.org/Addressing/schemes"""
def _check(self, config):
self.setWarning(_("%s url ignored")%self.scheme.capitalize())
self.setWarning(linkcheck._("%s url ignored")%self.scheme.capitalize())
self.logMe(config)

@@ -19,7 +19,6 @@ import sys, time
from types import ListType
import Config, StringUtil
import linkcheck
_ = linkcheck._
LogFields = {
"realurl": "Real URL",
@@ -34,10 +33,10 @@ LogFields = {
"checktime": "Check Time",
"url": "URL",
}
MaxIndent = max(map(lambda x: len(_(x)), LogFields.values()))+1
MaxIndent = max(map(lambda x: len(linkcheck._(x)), LogFields.values()))+1
Spaces = {}
for key,value in LogFields.items():
Spaces[key] = " "*(MaxIndent - len(_(value)))
Spaces[key] = " "*(MaxIndent - len(linkcheck._(value)))
EntityTable = {
'<': '&lt;',
@@ -134,51 +133,51 @@ __init__(self, **args)
self.starttime = time.time()
if self.logfield('intro'):
self.fd.write("%s\n%s\n" % (Config.AppInfo, Config.Freeware))
self.fd.write(_("Get the newest version at %s\n") % Config.Url)
self.fd.write(_("Write comments and bugs to %s\n\n") % Config.Email)
self.fd.write(_("Start checking at %s\n") % _strtime(self.starttime))
self.fd.write(linkcheck._("Get the newest version at %s\n") % Config.Url)
self.fd.write(linkcheck._("Write comments and bugs to %s\n\n") % Config.Email)
self.fd.write(linkcheck._("Start checking at %s\n") % _strtime(self.starttime))
self.fd.flush()
def newUrl(self, urlData):
if self.fd is None: return
if self.logfield('url'):
self.fd.write("\n"+_(LogFields['url'])+Spaces['url']+urlData.urlName)
self.fd.write("\n"+linkcheck._(LogFields['url'])+Spaces['url']+urlData.urlName)
if urlData.cached:
self.fd.write(_(" (cached)\n"))
self.fd.write(linkcheck._(" (cached)\n"))
else:
self.fd.write("\n")
if urlData.name and self.logfield('name'):
self.fd.write(_(LogFields["name"])+Spaces["name"]+urlData.name+"\n")
self.fd.write(linkcheck._(LogFields["name"])+Spaces["name"]+urlData.name+"\n")
if urlData.parentName and self.logfield('parenturl'):
self.fd.write(_(LogFields['parenturl'])+Spaces["parenturl"]+
urlData.parentName+_(", line ")+
self.fd.write(linkcheck._(LogFields['parenturl'])+Spaces["parenturl"]+
urlData.parentName+linkcheck._(", line ")+
str(urlData.line)+"\n")
if urlData.baseRef and self.logfield('base'):
self.fd.write(_(LogFields["base"])+Spaces["base"]+urlData.baseRef+"\n")
self.fd.write(linkcheck._(LogFields["base"])+Spaces["base"]+urlData.baseRef+"\n")
if urlData.url and self.logfield('realurl'):
self.fd.write(_(LogFields["realurl"])+Spaces["realurl"]+urlData.url+"\n")
self.fd.write(linkcheck._(LogFields["realurl"])+Spaces["realurl"]+urlData.url+"\n")
if urlData.downloadtime and self.logfield('dltime'):
self.fd.write(_(LogFields["dltime"])+Spaces["dltime"]+
_("%.3f seconds\n") % urlData.downloadtime)
self.fd.write(linkcheck._(LogFields["dltime"])+Spaces["dltime"]+
linkcheck._("%.3f seconds\n") % urlData.downloadtime)
if urlData.checktime and self.logfield('checktime'):
self.fd.write(_(LogFields["checktime"])+Spaces["checktime"]+
_("%.3f seconds\n") % urlData.checktime)
self.fd.write(linkcheck._(LogFields["checktime"])+Spaces["checktime"]+
linkcheck._("%.3f seconds\n") % urlData.checktime)
if urlData.infoString and self.logfield('info'):
self.fd.write(_(LogFields["info"])+Spaces["info"]+
self.fd.write(linkcheck._(LogFields["info"])+Spaces["info"]+
StringUtil.indent(
StringUtil.blocktext(urlData.infoString, 65),
MaxIndent)+"\n")
if urlData.warningString:
#self.warnings += 1
if self.logfield('warning'):
self.fd.write(_(LogFields["warning"])+Spaces["warning"]+
self.fd.write(linkcheck._(LogFields["warning"])+Spaces["warning"]+
StringUtil.indent(
StringUtil.blocktext(urlData.warningString, 65),
MaxIndent)+"\n")
if self.logfield('result'):
self.fd.write(_(LogFields["result"])+Spaces["result"])
self.fd.write(linkcheck._(LogFields["result"])+Spaces["result"])
if urlData.valid:
self.fd.write(urlData.validString+"\n")
else:
@@ -190,36 +189,48 @@ __init__(self, **args)
def endOfOutput(self, linknumber=-1):
if self.fd is None: return
if self.logfield('outro'):
self.fd.write(_("\nThats it. "))
self.fd.write(linkcheck._("\nThats it. "))
#if self.warnings==1:
# self.fd.write(_("1 warning, "))
# self.fd.write(linkcheck._("1 warning, "))
#else:
# self.fd.write(str(self.warnings)+_(" warnings, "))
# self.fd.write(str(self.warnings)+linkcheck._(" warnings, "))
if self.errors==1:
self.fd.write(_("1 error"))
self.fd.write(linkcheck._("1 error"))
else:
self.fd.write(str(self.errors)+_(" errors"))
self.fd.write(str(self.errors)+linkcheck._(" errors"))
if linknumber >= 0:
if linknumber == 1:
self.fd.write(_(" in 1 link"))
self.fd.write(linkcheck._(" in 1 link"))
else:
self.fd.write(_(" in %d links") % linknumber)
self.fd.write(_(" found\n"))
self.fd.write(linkcheck._(" in %d links") % linknumber)
self.fd.write(linkcheck._(" found\n"))
self.stoptime = time.time()
duration = self.stoptime - self.starttime
name = _("seconds")
self.fd.write(_("Stopped checking at %s") % _strtime(self.stoptime))
name = linkcheck._("seconds")
self.fd.write(linkcheck._("Stopped checking at %s") % _strtime(self.stoptime))
if duration > 60:
duration = duration / 60
name = _("minutes")
name = linkcheck._("minutes")
if duration > 60:
duration = duration / 60
name = _("hours")
name = linkcheck._("hours")
self.fd.write(" (%.3f %s)\n" % (duration, name))
self.fd.flush()
self.fd = None
HTML_HEADER = """<!DOCTYPE html PUBLIC "-//W3C//DTD html 4.0//EN">
<html><head><title>%s</title>
<style type="text/css">\n<!--
h2 { font-family: Verdana,sans-serif; font-size: 22pt;
font-style: bold; font-weight: bold }
body { font-family: Arial,sans-serif; font-size: 11pt }
td { font-family: Arial,sans-serif; font-size: 11pt }
code { font-family: Courier }
a:hover { color: #34a4ef }
//-->
</style></head>
<body bgcolor=%s link=%s vlink=%s alink=%s>
"""
class HtmlLogger(StandardLogger):
"""Logger with HTML output"""
@@ -237,82 +248,72 @@ class HtmlLogger(StandardLogger):
def init(self):
if self.fd is None: return
self.starttime = time.time()
self.fd.write('<!DOCTYPE html PUBLIC "-//W3C//DTD html 4.0//en">\n'+
'<html><head><title>'+Config.App+"</title>\n"
'<style type="text/css">\n<!--\n'
"h2 { font-family: Verdana,sans-serif; font-size: 22pt; \n"
" font-style: bold; font-weight: bold }\n"
"body { font-family: Arial,sans-serif; font-size: 11pt }\n"
"td { font-family: Arial,sans-serif; font-size: 11pt }\n"
"code { font-family: Courier }\n"
"a:hover { color: #34a4ef }\n"
"//-->\n</style>\n</head>\n"+
"<body bgcolor="+self.colorbackground+" link="+self.colorlink+
" vlink="+self.colorlink+" alink="+self.colorlink+">")
self.fd.write(HTML_HEADER%(Config.App, self.colorbackground,
self.colorlink, self.colorlink, self.colorlink))
if self.logfield('intro'):
self.fd.write("<center><h2>"+Config.App+"</h2></center>"+
"<br><blockquote>"+Config.Freeware+"<br><br>"+
(_("Start checking at %s\n") % _strtime(self.starttime))+
(linkcheck._("Start checking at %s\n") % _strtime(self.starttime))+
"<br><br>")
self.fd.flush()
def newUrl(self, urlData):
if self.fd is None: return
self.fd.write('<table align=left border="0" cellspacing="0"'
' cellpadding="1" bgcolor='+self.colorborder+' summary="Border"'
'><tr><td><table align="left" border="0" cellspacing="0"'
' cellpadding="3" summary="checked link" bgcolor='+
self.fd.write('<table align=left border=0 cellspacing=0'
' cellpadding=1 bgcolor='+self.colorborder+' summary=Border'
'><tr><td><table align=left border=0 cellspacing=0'
' cellpadding=3 summary="checked link" bgcolor='+
self.colorbackground+
">")
if self.logfield("url"):
self.fd.write("<tr><td bgcolor="+self.colorurl+">"+_("URL")+
self.fd.write("<tr><td bgcolor="+self.colorurl+">"+linkcheck._("URL")+
"</td><td bgcolor="+self.colorurl+">"+urlData.urlName)
if urlData.cached:
self.fd.write(_(" (cached)\n"))
self.fd.write(linkcheck._(" (cached)\n"))
self.fd.write("</td></tr>\n")
if urlData.name and self.logfield("name"):
self.fd.write("<tr><td>"+_("Name")+"</td><td>"+
self.fd.write("<tr><td>"+linkcheck._("Name")+"</td><td>"+
urlData.name+"</td></tr>\n")
if urlData.parentName and self.logfield("parenturl"):
self.fd.write("<tr><td>"+_("Parent URL")+"</td><td>"+
self.fd.write("<tr><td>"+linkcheck._("Parent URL")+"</td><td>"+
'<a href="'+urlData.parentName+'">'+
urlData.parentName+"</a> line "+str(urlData.line)+
"</td></tr>\n")
if urlData.baseRef and self.logfield("base"):
self.fd.write("<tr><td>"+_("Base")+"</td><td>"+
self.fd.write("<tr><td>"+linkcheck._("Base")+"</td><td>"+
urlData.baseRef+"</td></tr>\n")
if urlData.url and self.logfield("realurl"):
self.fd.write("<tr><td>"+_("Real URL")+"</td><td>"+
self.fd.write("<tr><td>"+linkcheck._("Real URL")+"</td><td>"+
"<a href=\""+urlData.url+
'">'+urlData.url+"</a></td></tr>\n")
if urlData.downloadtime and self.logfield("dltime"):
self.fd.write("<tr><td>"+_("D/L Time")+"</td><td>"+
(_("%.3f seconds") % urlData.downloadtime)+
self.fd.write("<tr><td>"+linkcheck._("D/L Time")+"</td><td>"+
(linkcheck._("%.3f seconds") % urlData.downloadtime)+
"</td></tr>\n")
if urlData.checktime and self.logfield("checktime"):
self.fd.write("<tr><td>"+_("Check Time")+
self.fd.write("<tr><td>"+linkcheck._("Check Time")+
"</td><td>"+
(_("%.3f seconds") % urlData.checktime)+
(linkcheck._("%.3f seconds") % urlData.checktime)+
"</td></tr>\n")
if urlData.infoString and self.logfield("info"):
self.fd.write("<tr><td>"+_("Info")+"</td><td>"+
self.fd.write("<tr><td>"+linkcheck._("Info")+"</td><td>"+
StringUtil.htmlify(urlData.infoString)+
"</td></tr>\n")
if urlData.warningString:
#self.warnings += 1
if self.logfield("warning"):
self.fd.write("<tr>"+self.tablewarning+_("Warning")+
self.fd.write("<tr>"+self.tablewarning+linkcheck._("Warning")+
"</td>"+self.tablewarning+
urlData.warningString.replace("\n", "<br>")+
"</td></tr>\n")
if self.logfield("result"):
if urlData.valid:
self.fd.write("<tr>"+self.tableok+_("Result")+"</td>"+
self.fd.write("<tr>"+self.tableok+linkcheck._("Result")+"</td>"+
self.tableok+urlData.validString+"</td></tr>\n")
else:
self.errors += 1
self.fd.write("<tr>"+self.tableerror+_("Result")+
self.fd.write("<tr>"+self.tableerror+linkcheck._("Result")+
"</td>"+self.tableerror+
urlData.errorString+"</td></tr>\n")
@@ -323,38 +324,38 @@ class HtmlLogger(StandardLogger):
def endOfOutput(self, linknumber=-1):
if self.fd is None: return
if self.logfield("outro"):
self.fd.write(_("\nThats it. "))
self.fd.write(linkcheck._("\nThats it. "))
#if self.warnings==1:
# self.fd.write(_("1 warning, "))
# self.fd.write(linkcheck._("1 warning, "))
#else:
# self.fd.write(str(self.warnings)+_(" warnings, "))
# self.fd.write(str(self.warnings)+linkcheck._(" warnings, "))
if self.errors==1:
self.fd.write(_("1 error"))
self.fd.write(linkcheck._("1 error"))
else:
self.fd.write(str(self.errors)+_(" errors"))
self.fd.write(str(self.errors)+linkcheck._(" errors"))
if linknumber >= 0:
if linknumber == 1:
self.fd.write(_(" in 1 link"))
self.fd.write(linkcheck._(" in 1 link"))
else:
self.fd.write(_(" in %d links") % linknumber)
self.fd.write(_(" found\n")+"<br>")
self.fd.write(linkcheck._(" in %d links") % linknumber)
self.fd.write(linkcheck._(" found\n")+"<br>")
self.stoptime = time.time()
duration = self.stoptime - self.starttime
name = _("seconds")
self.fd.write(_("Stopped checking at %s") % _strtime(self.stoptime))
name = linkcheck._("seconds")
self.fd.write(linkcheck._("Stopped checking at %s") % _strtime(self.stoptime))
if duration > 60:
duration = duration / 60
name = _("minutes")
name = linkcheck._("minutes")
if duration > 60:
duration = duration / 60
name = _("hours")
name = linkcheck._("hours")
self.fd.write(" (%.3f %s)\n" % (duration, name))
self.fd.write("</blockquote><br><hr noshade size=1><small>"+
Config.HtmlAppInfo+"<br>")
self.fd.write(_("Get the newest version at %s\n") %\
self.fd.write(linkcheck._("Get the newest version at %s\n") %\
('<a href="'+Config.Url+'" target="_top">'+Config.Url+
"</a>.<br>"))
self.fd.write(_("Write comments and bugs to %s\n\n") %\
self.fd.write(linkcheck._("Write comments and bugs to %s\n\n") %\
("<a href=\"mailto:"+Config.Email+"\">"+Config.Email+"</a>."))
self.fd.write("</small></body></html>")
self.fd.flush()
@@ -388,7 +389,7 @@ class ColoredLogger(StandardLogger):
if self.currentPage != urlData.parentName:
if self.prefix:
self.fd.write("o\n")
self.fd.write("\n"+_("Parent URL")+Spaces["parenturl"]+
self.fd.write("\n"+linkcheck._("Parent URL")+Spaces["parenturl"]+
self.colorparent+urlData.parentName+
self.colorreset+"\n")
self.currentPage = urlData.parentName
@@ -403,49 +404,49 @@ class ColoredLogger(StandardLogger):
self.fd.write("|\n+- ")
else:
self.fd.write("\n")
self.fd.write(_("URL")+Spaces["url"]+self.colorurl+
self.fd.write(linkcheck._("URL")+Spaces["url"]+self.colorurl+
urlData.urlName+self.colorreset)
if urlData.line: self.fd.write(_(", line ")+`urlData.line`+"")
if urlData.line: self.fd.write(linkcheck._(", line ")+`urlData.line`+"")
if urlData.cached:
self.fd.write(_(" (cached)\n"))
self.fd.write(linkcheck._(" (cached)\n"))
else:
self.fd.write("\n")
if urlData.name and self.logfield("name"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("Name")+Spaces["name"]+self.colorname+
self.fd.write(linkcheck._("Name")+Spaces["name"]+self.colorname+
urlData.name+self.colorreset+"\n")
if urlData.baseRef and self.logfield("base"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("Base")+Spaces["base"]+self.colorbase+
self.fd.write(linkcheck._("Base")+Spaces["base"]+self.colorbase+
urlData.baseRef+self.colorreset+"\n")
if urlData.url and self.logfield("realurl"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("Real URL")+Spaces["realurl"]+self.colorreal+
self.fd.write(linkcheck._("Real URL")+Spaces["realurl"]+self.colorreal+
urlData.url+self.colorreset+"\n")
if urlData.downloadtime and self.logfield("dltime"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("D/L Time")+Spaces["dltime"]+self.colordltime+
(_("%.3f seconds") % urlData.downloadtime)+self.colorreset+"\n")
self.fd.write(linkcheck._("D/L Time")+Spaces["dltime"]+self.colordltime+
(linkcheck._("%.3f seconds") % urlData.downloadtime)+self.colorreset+"\n")
if urlData.checktime and self.logfield("checktime"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("Check Time")+Spaces["checktime"]+
self.fd.write(linkcheck._("Check Time")+Spaces["checktime"]+
self.colordltime+
(_("%.3f seconds") % urlData.checktime)+self.colorreset+"\n")
(linkcheck._("%.3f seconds") % urlData.checktime)+self.colorreset+"\n")
if urlData.infoString and self.logfield("info"):
if self.prefix:
self.fd.write("| "+_("Info")+Spaces["info"]+
self.fd.write("| "+linkcheck._("Info")+Spaces["info"]+
StringUtil.indentWith(StringUtil.blocktext(
urlData.infoString, 65), "| "+Spaces["info"]))
else:
self.fd.write(_("Info")+Spaces["info"]+
self.fd.write(linkcheck._("Info")+Spaces["info"]+
StringUtil.indentWith(StringUtil.blocktext(
urlData.infoString, 65), " "+Spaces["info"]))
self.fd.write(self.colorreset+"\n")
@@ -455,14 +456,14 @@ class ColoredLogger(StandardLogger):
if self.logfield("warning"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("Warning")+Spaces["warning"]+
self.fd.write(linkcheck._("Warning")+Spaces["warning"]+
self.colorwarning+
urlData.warningString+self.colorreset+"\n")
if self.logfield("result"):
if self.prefix:
self.fd.write("| ")
self.fd.write(_("Result")+Spaces["result"])
self.fd.write(linkcheck._("Result")+Spaces["result"])
if urlData.valid:
self.fd.write(self.colorvalid+urlData.validString+
self.colorreset+"\n")
@@ -495,10 +496,10 @@ class GMLLogger(StandardLogger):
if self.fd is None: return
self.starttime = time.time()
if self.logfield("intro"):
self.fd.write("# "+(_("created by %s at %s\n") % (Config.AppName,
self.fd.write("# "+(linkcheck._("created by %s at %s\n") % (Config.AppName,
_strtime(self.starttime))))
self.fd.write("# "+(_("Get the newest version at %s\n") % Config.Url))
self.fd.write("# "+(_("Write comments and bugs to %s\n\n") % \
self.fd.write("# "+(linkcheck._("Get the newest version at %s\n") % Config.Url))
self.fd.write("# "+(linkcheck._("Write comments and bugs to %s\n\n") % \
Config.Email))
self.fd.write("graph [\n directed 1\n")
self.fd.flush()
@@ -550,15 +551,15 @@ class GMLLogger(StandardLogger):
if self.logfield("outro"):
self.stoptime = time.time()
duration = self.stoptime - self.starttime
name = _("seconds")
self.fd.write("# "+_("Stopped checking at %s") % \
name = linkcheck._("seconds")
self.fd.write("# "+linkcheck._("Stopped checking at %s") % \
_strtime(self.stoptime))
if duration > 60:
duration = duration / 60
name = _("minutes")
name = linkcheck._("minutes")
if duration > 60:
duration = duration / 60
name = _("hours")
name = linkcheck._("hours")
self.fd.write(" (%.3f %s)\n" % (duration, name))
self.fd.flush()
self.fd = None
@@ -579,10 +580,10 @@ class XMLLogger(StandardLogger):
self.fd.write('<?xml version="1.0"?>\n')
if self.logfield("intro"):
self.fd.write("<!--\n")
self.fd.write(" "+_("created by %s at %s\n") % \
self.fd.write(" "+linkcheck._("created by %s at %s\n") % \
(Config.AppName, _strtime(self.starttime)))
self.fd.write(" "+_("Get the newest version at %s\n") % Config.Url)
self.fd.write(" "+_("Write comments and bugs to %s\n\n") % \
self.fd.write(" "+linkcheck._("Get the newest version at %s\n") % Config.Url)
self.fd.write(" "+linkcheck._("Write comments and bugs to %s\n\n") % \
Config.Email)
self.fd.write("-->\n\n")
self.fd.write('<GraphXML>\n<graph isDirected="true">\n')
@@ -641,15 +642,15 @@ class XMLLogger(StandardLogger):
if self.logfield("outro"):
self.stoptime = time.time()
duration = self.stoptime - self.starttime
name = _("seconds")
name = linkcheck._("seconds")
self.fd.write("<!-- ")
self.fd.write(_("Stopped checking at %s") % _strtime(self.stoptime))
self.fd.write(linkcheck._("Stopped checking at %s") % _strtime(self.stoptime))
if duration > 60:
duration = duration / 60
name = _("minutes")
name = linkcheck._("minutes")
if duration > 60:
duration = duration / 60
name = _("hours")
name = linkcheck._("hours")
self.fd.write(" (%.3f %s)\n" % (duration, name))
self.fd.write("-->")
self.fd.flush()
@@ -669,10 +670,10 @@ class SQLLogger(StandardLogger):
if self.fd is None: return
self.starttime = time.time()
if self.logfield("intro"):
self.fd.write("-- "+(_("created by %s at %s\n") % (Config.AppName,
self.fd.write("-- "+(linkcheck._("created by %s at %s\n") % (Config.AppName,
_strtime(self.starttime))))
self.fd.write("-- "+(_("Get the newest version at %s\n") % Config.Url))
self.fd.write("-- "+(_("Write comments and bugs to %s\n\n") % \
self.fd.write("-- "+(linkcheck._("Get the newest version at %s\n") % Config.Url))
self.fd.write("-- "+(linkcheck._("Write comments and bugs to %s\n\n") % \
Config.Email))
self.fd.flush()
@@ -706,15 +707,15 @@ class SQLLogger(StandardLogger):
if self.logfield("outro"):
self.stoptime = time.time()
duration = self.stoptime - self.starttime
name = _("seconds")
self.fd.write("-- "+_("Stopped checking at %s") % \
name = linkcheck._("seconds")
self.fd.write("-- "+linkcheck._("Stopped checking at %s") % \
_strtime(self.stoptime))
if duration > 60:
duration = duration / 60
name = _("minutes")
name = linkcheck._("minutes")
if duration > 60:
duration = duration / 60
name = _("hours")
name = linkcheck._("hours")
self.fd.write(" (%.3f %s)\n" % (duration, name))
self.fd.flush()
self.fd = None
@@ -761,12 +762,12 @@ class CSVLogger(StandardLogger):
if self.fd is None: return
self.starttime = time.time()
if self.logfield("intro"):
self.fd.write("# "+(_("created by %s at %s\n") % (Config.AppName,
self.fd.write("# "+(linkcheck._("created by %s at %s\n") % (Config.AppName,
_strtime(self.starttime))))
self.fd.write("# "+(_("Get the newest version at %s\n") % Config.Url))
self.fd.write("# "+(_("Write comments and bugs to %s\n\n") % \
self.fd.write("# "+(linkcheck._("Get the newest version at %s\n") % Config.Url))
self.fd.write("# "+(linkcheck._("Write comments and bugs to %s\n\n") % \
Config.Email))
self.fd.write(_("# Format of the entries:\n")+\
self.fd.write(linkcheck._("# Format of the entries:\n")+\
"# urlname;\n"
"# recursionlevel;\n"
"# parentname;\n"
@@ -811,14 +812,14 @@ class CSVLogger(StandardLogger):
self.stoptime = time.time()
if self.logfield("outro"):
duration = self.stoptime - self.starttime
name = _("seconds")
self.fd.write("# "+_("Stopped checking at %s") % _strtime(self.stoptime))
name = linkcheck._("seconds")
self.fd.write("# "+linkcheck._("Stopped checking at %s") % _strtime(self.stoptime))
if duration > 60:
duration = duration / 60
name = _("minutes")
name = linkcheck._("minutes")
if duration > 60:
duration = duration / 60
name = _("hours")
name = linkcheck._("hours")
self.fd.write(" (%.3f %s)\n" % (duration, name))
self.fd.flush()
self.fd = None
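Every logger outro in the hunks above repeats the same seconds-to-minutes-to-hours cascade when printing the elapsed check time. The duplication could be factored into one helper; a minimal sketch (the `format_duration` name is hypothetical, not part of the codebase):

```python
def format_duration(duration):
    """Convert a duration in seconds to a (value, unit-name) pair,
    mirroring the sequential seconds/minutes/hours checks used by
    the logger outro methods."""
    name = "seconds"
    if duration > 60:
        duration = duration / 60.0
        name = "minutes"
    if duration > 60:
        duration = duration / 60.0
        name = "hours"
    return duration, name
```

Each logger could then emit `" (%.3f %s)\n" % format_duration(self.stoptime - self.starttime)` instead of repeating the cascade.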


@@ -19,7 +19,6 @@ import os, re, DNS, sys, Config, cgi, urllib, linkcheck
from rfc822 import AddressList
from HostCheckingUrlData import HostCheckingUrlData
from smtplib import SMTP
from linkcheck import _
from debuglevels import *
# regular expression for RFC2368 compliant mailto: scanning
@@ -65,7 +64,7 @@ class MailtoUrlData(HostCheckingUrlData):
an answer, print the verified adress as an info.
"""
if not self.adresses:
self.setWarning(_("No adresses found"))
self.setWarning(linkcheck._("No adresses found"))
return
value = "unknown reason"
@@ -77,7 +76,7 @@ class MailtoUrlData(HostCheckingUrlData):
mxrecords = DNS.mxlookup(host, protocol="tcp")
Config.debug(HURT_ME_PLENTY, "found mailhosts", mxrecords)
if not len(mxrecords):
self.setError(_("No mail host for %s found")%host)
self.setError(linkcheck._("No mail host for %s found")%host)
return
smtpconnect = 0
for mxrecord in mxrecords:
@@ -90,19 +89,19 @@ class MailtoUrlData(HostCheckingUrlData):
info = self.urlConnection.verify(user)
Config.debug(HURT_ME_PLENTY, "SMTP user info", info)
if info[0]==250:
self.setInfo(_("Verified adress: %s")%str(info[1]))
self.setInfo(linkcheck._("Verified adress: %s")%str(info[1]))
except:
type, value = sys.exc_info()[:2]
#print type,value
if smtpconnect: break
if not smtpconnect:
self.setWarning(_("None of the mail hosts for %s accepts an "
self.setWarning(linkcheck._("None of the mail hosts for %s accepts an "
"SMTP connection: %s") % (host, str(value)))
mxrecord = mxrecords[0][1]
else:
mxrecord = mxrecord[1]
self.setValid(_("found mail host %s") % mxrecord)
self.setValid(linkcheck._("found mail host %s") % mxrecord)
def _split_adress(self, adress):
@@ -113,7 +112,7 @@ class MailtoUrlData(HostCheckingUrlData):
return tuple(split)
if len(split)==1:
return (split[0], "localhost")
raise linkcheck.error, _("could not split the mail adress")
raise linkcheck.error, linkcheck._("could not split the mail adress")
def closeConnection(self):
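The `_split_adress` hunk above splits a `user@host` pair and defaults the host part to `localhost` when no domain is given. A standalone sketch of the same rule (the function name and the `ValueError` are substitutions; the real method raises `linkcheck.error`):

```python
def split_address(address):
    """Split a mail address into (user, host), mirroring
    MailtoUrlData._split_adress: exactly one '@' yields the pair,
    no '@' defaults the host to localhost, anything else is an error."""
    parts = address.split("@")
    if len(parts) == 2:
        return tuple(parts)
    if len(parts) == 1:
        return (parts[0], "localhost")
    raise ValueError("could not split the mail address")
```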


@@ -16,10 +16,9 @@
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import re, time, sys, nntplib, urlparse, linkcheck
from linkcheck import _
from UrlData import ExcList,UrlData
debug = linkcheck.Config.debug
from debuglevels import *
debug = linkcheck.Config.debug
ExcList.extend([nntplib.error_reply,
nntplib.error_temp,
@@ -45,7 +44,7 @@ class NntpUrlData(UrlData):
def checkConnection(self, config):
nntpserver = self.urlTuple[1] or config["nntpserver"]
if not nntpserver:
self.setWarning(_("No NNTP server specified, skipping this URL"))
self.setWarning(linkcheck._("No NNTP server specified, skipping this URL"))
return
nntp = self._connectNntp(nntpserver)
group = self.urlTuple[2]
@@ -54,18 +53,18 @@ class NntpUrlData(UrlData):
if '@' in group:
# request article
resp,number,id = nntp.stat("<"+group+">")
self.setInfo(_('Articel number %s found' % number))
self.setInfo(linkcheck._('Articel number %s found' % number))
else:
# split off trailing articel span
group = group.split('/',1)[0]
if group:
# request group info
resp,count,first,last,name = nntp.group(group)
self.setInfo(_("Group %s has %s articles, range %s to %s") %\
self.setInfo(linkcheck._("Group %s has %s articles, range %s to %s") %\
(name, count, first, last))
else:
# group name is the empty string
self.setWarning(_("No newsgroup specified in NNTP URL"))
self.setWarning(linkcheck._("No newsgroup specified in NNTP URL"))
def _connectNntp(self, nntpserver):
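The NNTP hunk above distinguishes message-IDs (group part containing `@`) from newsgroup names, and strips a trailing article span from the latter. A testable sketch of that branch with the connection object stubbed out (function and variable names are hypothetical; `nntp` is anything exposing `stat()`/`group()` like `nntplib.NNTP`):

```python
def check_news_group(nntp, group):
    """Mirror the dispatch in NntpUrlData.checkConnection: stat an
    article when the group part looks like a message-ID, otherwise
    query the newsgroup itself."""
    if "@" in group:
        resp, number, msgid = nntp.stat("<" + group + ">")
        return "article %s found" % number
    group = group.split("/", 1)[0]  # drop a trailing article span
    if not group:
        return "no newsgroup specified"
    resp, count, first, last, name = nntp.group(group)
    return "group %s has %s articles (%s-%s)" % (name, count, first, last)
```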


@@ -18,7 +18,6 @@
import re, sys, htmlentitydefs
markup_re = re.compile("<.*?>", re.DOTALL)
entities = htmlentitydefs.entitydefs.items()
HtmlTable = map(lambda x: (x[1], "&"+x[0]+";"), entities)
UnHtmlTable = map(lambda x: ("&"+x[0]+";", x[1]), entities)


@@ -17,7 +17,6 @@
import telnetlib, re, linkcheck
from HostCheckingUrlData import HostCheckingUrlData
from linkcheck import _
# regular expression for syntax checking
@@ -38,7 +37,7 @@ class TelnetUrlData(HostCheckingUrlData):
HostCheckingUrlData.buildUrl(self)
mo = telnet_re.match(self.urlName)
if not mo:
raise linkcheck.error, _("Illegal telnet link syntax")
raise linkcheck.error, linkcheck._("Illegal telnet link syntax")
self.user = mo.group("user")
self.password = mo.group("password")
self.host = mo.group("host")
@@ -46,7 +45,6 @@ class TelnetUrlData(HostCheckingUrlData):
if not self.port:
self.port = 23
def checkConnection(self, config):
HostCheckingUrlData.checkConnection(self, config)
self.urlConnection = telnetlib.Telnet()


@@ -17,13 +17,12 @@
import sys, re, urlparse, urllib, time, traceback, socket, select
import DNS, Config, StringUtil, linkcheck, linkname
debug = Config.debug
from linkcheck import _
from debuglevels import *
debug = Config.debug
# helper function for internal errors
def internal_error ():
print >> sys.stderr, _("""\n********** Oops, I did it again. *************
print >> sys.stderr, linkcheck._("""\n********** Oops, I did it again. *************
You have found an internal error in LinkChecker.
Please write a bug report to %s and include
@@ -37,13 +36,13 @@ I can work with ;).
import traceback
traceback.print_exc()
print_app_info()
print >> sys.stderr, _("\n******** LinkChecker internal error, bailing out ********")
print >> sys.stderr, linkcheck._("\n******** LinkChecker internal error, bailing out ********")
sys.exit(1)
def print_app_info ():
import os
print >> sys.stderr, _("System info:")
print >> sys.stderr, linkcheck._("System info:")
print >> sys.stderr, Config.App
print >> sys.stderr, "Python %s on %s" % (sys.version, sys.platform)
for key in ("LC_ALL", "LC_MESSAGES", "http_proxy", "ftp_proxy"):
@@ -155,8 +154,8 @@ class UrlData:
self.recursionLevel = recursionLevel
self.parentName = parentName
self.baseRef = baseRef
self.errorString = _("Error")
self.validString = _("Valid")
self.errorString = linkcheck._("Error")
self.validString = linkcheck._("Valid")
self.warningString = None
self.infoString = None
self.valid = 1
@@ -177,11 +176,11 @@ class UrlData:
def setError(self, s):
self.valid=0
self.errorString = _("Error")+": "+s
self.errorString = linkcheck._("Error")+": "+s
def setValid(self, s):
self.valid=1
self.validString = _("Valid")+": "+s
self.validString = linkcheck._("Valid")+": "+s
def isHtml(self):
return 0
@@ -258,7 +257,7 @@ class UrlData:
# check syntax
debug(BRING_IT_ON, "checking syntax")
if not self.urlName or self.urlName=="":
self.setError(_("URL is null or empty"))
self.setError(linkcheck._("URL is null or empty"))
self.logMe(config)
return
try:
@@ -283,7 +282,7 @@ class UrlData:
debug(BRING_IT_ON, "extern =", self.extern)
if self.extern and (config["strict"] or self.extern[1]):
self.setWarning(
_("outside of domain filter, checked only syntax"))
linkcheck._("outside of domain filter, checked only syntax"))
self.logMe(config)
return


@@ -19,18 +19,21 @@ class error(Exception):
pass
# i18n suppport
import os, _linkchecker_configdata
try:
import gettext
domain = 'linkcheck'
localedir = os.path.join(_linkchecker_configdata.install_data, 'locale')
t = gettext.translation(domain, localedir)
_ = t.gettext
except IOError:
_ = lambda s: s
import sys, os, _linkchecker_configdata
def init_gettext ():
global _
try:
import gettext
domain = 'linkcheck'
localedir = os.path.join(_linkchecker_configdata.install_data,
'share/locale')
_ = gettext.translation(domain, localedir).gettext
except (IOError, ImportError):
_ = lambda s: s
init_gettext()
import timeoutsocket
import Config, UrlData, sys, lc_cgi
import Config, UrlData, lc_cgi
from debuglevels import *
debug = Config.debug
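Wrapping the translation setup in `init_gettext` above makes it re-runnable, so the CGI front end can switch catalogs after changing `LC_MESSAGES`. A minimal sketch of the same fallback pattern, assuming no message catalog is installed (domain name and path are illustrative):

```python
import gettext

def init_gettext(localedir):
    """Return a gettext lookup for the 'linkcheck' domain, falling
    back to the identity function when no catalog can be found."""
    try:
        return gettext.translation("linkcheck", localedir).gettext
    except IOError:
        return lambda s: s
```

Re-invoking `init_gettext()` after updating the locale environment is what lets the online checker honor the form's `language` field.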
@@ -45,7 +48,6 @@ def checkUrls(config):
In the config object there are functions to get a new URL (getUrl) and
to check it (checkUrl).
"""
debug(HURT_ME_PLENTY, "threads", config['threads'])
config.log_init()
try:
while not config.finished():


@@ -15,24 +15,39 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import re,time,urlparse
from linkcheck import _
import os, re, time, urlparse
import linkcheck
from types import StringType
_logfile = None
_supported_langs = ('de', 'fr', 'C')
def checkform(form):
if form.has_key("language"):
lang = form['language'].value
if lang in _supported_langs:
os.environ['LC_MESSAGES'] = lang
linkcheck.init_gettext()
else:
return 0
for key in ["level","url"]:
if not form.has_key(key) or form[key].value == "": return 0
if not re.match(r"^https?://[-\w./=%?~]+$", form["url"].value): return 0
if not re.match(r"\d", form["level"].value): return 0
if int(form["level"].value) > 3: return 0
if not form.has_key(key) or form[key].value == "":
return 0
if not re.match(r"^https?://[-\w./=%?~]+$", form["url"].value):
return 0
if not re.match(r"\d", form["level"].value):
return 0
if int(form["level"].value) > 3:
return 0
if form.has_key("anchors"):
if not form["anchors"].value=="on": return 0
if not form["anchors"].value=="on":
return 0
if form.has_key("errors"):
if not form["errors"].value=="on": return 0
if not form["errors"].value=="on":
return 0
if form.has_key("intern"):
if not form["intern"].value=="on": return 0
if not form["intern"].value=="on":
return 0
return 1
def getHostName(form):
@@ -50,20 +65,20 @@ def logit(form, env):
"REMOTE_HOST", "REMOTE_PORT"]:
if env.has_key(var):
_logfile.write(var+"="+env[var]+"\n")
for key in ["level", "url", "anchors", "errors", "intern"]:
for key in ["level", "url", "anchors", "errors", "intern", "language"]:
if form.has_key(key):
_logfile.write(str(form[key])+"\n")
def printError(out):
out.write(_("<html><head><title>LinkChecker Online Error</title></head>"
"<body text=\"#192c83\" bgcolor=\"#fff7e5\" link=\"#191c83\" vlink=\"#191c83\""
"alink=\"#191c83\">"
"<blockquote>"
"<b>Error</b><br>"
"The LinkChecker Online script has encountered an error. Please ensure "
"that your provided URL link begins with <code>http://</code> and "
"contains only these characters: <code>A-Za-z0-9./_~-</code><br><br>"
"Errors are logged."
"</blockquote>"
"</body>"
"</html>"))
out.write(linkcheck._("""<html><head>
<title>LinkChecker Online Error</title></head>
<body text=#192c83 bgcolor=#fff7e5 link=#191c83 vlink=#191c83 alink=#191c83>
<blockquote>
<b>Error</b><br>
The LinkChecker Online script has encountered an error. Please ensure
that your provided URL link begins with <code>http://</code> and
contains only these characters: <code>A-Za-z0-9./_~-</code><br><br>
Errors are logged.
</blockquote>
</body>
</html>"""))
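The tightened `checkform` above validates the CGI fields with anchored regular expressions before any checking starts. A sketch of the url/level validation over a plain dict (the real code reads `cgi.FieldStorage` values and also handles the checkbox and language fields):

```python
import re

def check_form(form):
    """Validate the 'url' and 'level' form fields the way
    lc_cgi.checkform does: both present, url restricted to a safe
    http(s) character set, recursion level a digit no greater than 3."""
    for key in ("level", "url"):
        if not form.get(key):
            return 0
    if not re.match(r"^https?://[-\w./=%?~]+$", form["url"]):
        return 0
    if not re.match(r"\d", form["level"]):
        return 0
    if int(form["level"]) > 3:
        return 0
    return 1
```

The anchored URL pattern is what keeps shell metacharacters and non-HTTP schemes out of the online checker.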


@@ -14,7 +14,7 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import re,StringUtil
import re, StringUtil
imgtag_re = re.compile("(?i)\s+alt\s*=\s*(?P<name>(\".*?\"|'.*?'|[^\s>]+))", re.DOTALL)
img_re = re.compile("(?i)<\s*img\s+.*>", re.DOTALL)


@@ -15,7 +15,7 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
import sys,time,rotor,types
import sys, time, rotor, types
_curses = None
_color = 0


@@ -23,9 +23,10 @@ if sys.version[:5] < "2.0":
import getopt, re, os, urlparse, pprint, linkcheck
from linkcheck.debuglevels import *
from linkcheck import _,StringUtil
from linkcheck import StringUtil
debug = linkcheck.debug
Usage = _("""USAGE\tlinkchecker [options] file-or-url...
Usage = linkcheck._("""USAGE\tlinkchecker [options] file-or-url...
OPTIONS
For single-letter option arguments the space is not a necessity. So
@@ -107,7 +108,7 @@ For single-letter option arguments the space is not a necessity. So
This option implies -w.\n") % linkcheck.Config.LoggerKeys
""")
Notes = _("""NOTES
Notes = linkcheck._("""NOTES
o LinkChecker assumes an http:// resp. ftp:// link when a commandline URL
starts with 'www.' resp. 'ftp.'
You can also give local files as arguments.
@@ -125,7 +126,7 @@ o When checking 'news:' links the given NNTP host doesn't need to be the
same as the host of the user browsing your pages!
""")
Examples = _("""EXAMPLES
Examples = linkcheck._("""EXAMPLES
o linkchecker -v -ohtml -r2 -s -itreasure.calvinsplayground.de \\
http://treasure.calvinsplayground.de/~calvin/ > sample.html
o Local files and syntactic sugar on the command line:
@@ -149,8 +150,8 @@ def printHelp():
sys.exit(0)
def printUsage(msg):
sys.stderr.write(_("Error: %s\n") % msg)
sys.stderr.write(_("Execute 'linkchecker -h' for help\n"))
sys.stderr.write(linkcheck._("Error: %s\n") % msg)
sys.stderr.write(linkcheck._("Execute 'linkchecker -h' for help\n"))
sys.exit(1)
@@ -192,7 +193,7 @@ except getopt.error:
for opt,arg in options:
if opt=="-D" or opt=="--debug":
linkcheck.Config.DebugLevel += 1
linkcheck.Config.debug(BRING_IT_ON, "Python", sys.version, "on", sys.platform)
debug(BRING_IT_ON, "Python", sys.version, "on", sys.platform)
# apply configuration
config = linkcheck.Config.Configuration()
configfiles = []
@@ -221,7 +222,7 @@ for opt,arg in options:
if linkcheck.Config.Loggers.has_key(arg):
config['log'] = config.newLogger(arg)
else:
printUsage((_("Illegal argument '%s' for option ") % arg) +\
printUsage((linkcheck._("Illegal argument '%s' for option ") % arg) +\
"'-o, --output'")
elif opt=="-F" or opt=="--file-output":
@@ -233,7 +234,7 @@ for opt,arg in options:
if linkcheck.Config.Loggers.has_key(type) and type != "blacklist":
config['fileoutput'].append(config.newLogger(type, ns))
else:
printUsage((_("Illegal argument '%s' for option ") % arg) +\
printUsage((linkcheck._("Illegal argument '%s' for option ") % arg) +\
"'-F, --file-output'")
elif opt=="-I" or opt=="--interactive":
@@ -256,7 +257,7 @@ for opt,arg in options:
if int(arg) >= 0:
config["wait"] = int(arg)
else:
printUsage((_("Illegal argument '%s' for option ") % arg) +
printUsage((linkcheck._("Illegal argument '%s' for option ") % arg) +
"'-P, --pause'")
elif opt=="-q" or opt=="--quiet":
@@ -266,7 +267,7 @@ for opt,arg in options:
if int(arg) >= 0:
config["recursionlevel"] = int(arg)
else:
printUsage((_("Illegal argument '%s' for option ") % arg) +
printUsage((linkcheck._("Illegal argument '%s' for option ") % arg) +
"'-r, --recursion-level'")
elif opt=="-R" or opt=="--robots-txt":
@@ -319,14 +320,14 @@ if config["log"].__class__ == linkcheck.Logging.BlacklistLogger and \
os.path.exists(config['log'].filename):
args = open(config['log'].filename).readlines()
linkcheck.Config.debug(HURT_ME_PLENTY, pprint.pformat(config.data))
debug(HURT_ME_PLENTY, pprint.pformat(config.data))
if len(args)==0:
if config['interactive']:
urls = raw_input(_("enter one or more urls, separated by white-space\n--> "))
urls = raw_input(linkcheck._("enter one or more urls, separated by white-space\n--> "))
args = urls.split()
else:
config.warn(_("no files or urls given"))
config.warn(linkcheck._("no files or urls given"))
for url in args:
url = url.strip()
@@ -340,4 +341,4 @@ for url in args:
# check the urls
linkcheck.checkUrls(config)
if config['interactive']:
raw_input(_("Hit RETURN to finish"))
raw_input(linkcheck._("Hit RETURN to finish"))


@@ -23,7 +23,7 @@ SOURCES=\
../linkcheck/__init__.py \
../linkcheck/lc_cgi.py \
../linkchecker
LDIR=../locale
LDIR=../share/locale
LFILE=LC_MESSAGES/$(PACKAGE).mo
MOS=$(LDIR)/de/$(LFILE) $(LDIR)/fr/$(LFILE)
PACKAGE=linkcheck


@@ -33,8 +33,8 @@ msgstr "gefunden"
msgid "outside of domain filter, checked only syntax"
msgstr "außerhalb des Domain Filters; prüfe lediglich Syntax"
msgid "warning: no files or urls given"
msgstr "Warnung: keine Dateien oder URLs angegeben"
msgid "no files or urls given"
msgstr "keine Dateien oder URLs angegeben"
msgid "Result"
msgstr "Ergebnis"
@@ -45,8 +45,8 @@ msgstr "Warnung"
msgid "Illegal argument '%s' for option "
msgstr "Ungültiges Argument '%s' für Option "
msgid "warning: timeoutsocket is not support on this system"
msgstr "Warnung: timeoutsocket wird auf diesem System nicht unterstützt"
msgid "timeoutsocket is not support on this system"
msgstr "timeoutsocket wird auf diesem System nicht unterstützt"
msgid "Verified adress: %s"
msgstr "Gültige Adresse: %s"
@@ -479,7 +479,7 @@ msgid ""
"Write comments and bugs to %s\n"
"\n"
msgstr ""
"Kommentare und Fehler schreiben Sie bitte an %s\n"
"Schreiben Sie Kommentare und Fehler an %s\n"
"\n"
msgid "Zope Server cannot determine MIME type with HEAD, falling back to GET"


@@ -129,7 +129,7 @@ myname = "Bastian Kleineidam"
myemail = "calvin@users.sourceforge.net"
setup (name = "linkchecker",
version = "1.3.21",
version = "1.3.22",
description = "check HTML documents for broken links",
author = myname,
author_email = myemail,
@@ -156,15 +156,16 @@ o a (Fast)CGI web interface (requires HTTP server)
cmdclass = {'install': MyInstall},
packages = ['','DNS','linkcheck'],
scripts = ['linkchecker'],
data_files = [('share/locale/de/LC_MESSAGES',
['locale/de/LC_MESSAGES/linkcheck.mo']),
('share/locale/fr/LC_MESSAGES',
['locale/fr/LC_MESSAGES/linkcheck.mo']),
('share/linkchecker', ['linkcheckerrc']),
('share/linkchecker/examples',
['lconline/leer.html',
'lconline/index.html', 'lconline/lc_cgi.html',
'lc.cgi','lc.fcgi','lc.sz_fcgi','linkchecker.bat']),
('man/man1', ['linkchecker.1']),
],
data_files = [
('share/locale/de/LC_MESSAGES',
['share/locale/de/LC_MESSAGES/linkcheck.mo']),
('share/locale/fr/LC_MESSAGES',
['share/locale/fr/LC_MESSAGES/linkcheck.mo']),
('share/linkchecker', ['linkcheckerrc']),
('share/linkchecker/examples',
['lconline/leer.html',
'lconline/index.html', 'lconline/lc_cgi.html',
'lc.cgi','lc.fcgi','lc.sz_fcgi','linkchecker.bat']),
('share/man/man1', ['linkchecker.1']),
],
)