linkchecker/TODO

- Improve print_status output.
- Ctrl-C does not work reliably.
- To limit memory usage, put a maximum size on the URL queue
  (e.g. 20000 URLs). When the limit is reached, a worker calling
  queue.put() waits for another worker to call queue.get() before
  continuing. Problem: this deadlocks when all workers have called
  queue.put().
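A minimal sketch of one way around the deadlock, assuming the standard library queue module (the helper name enqueue_url and the timeout-then-drop policy are hypothetical, not linkchecker's actual design):

```python
import queue

# Bounded URL queue: put() blocks once 20000 URLs are pending.
url_queue = queue.Queue(maxsize=20000)

def enqueue_url(url, timeout=1.0):
    """Try to enqueue a URL without risking a permanent block.

    Using put() with a timeout means a worker gives up after a short
    wait instead of blocking forever, so the all-workers-in-put()
    deadlock cannot occur; the caller can retry or defer the URL.
    """
    try:
        url_queue.put(url, timeout=timeout)
        return True
    except queue.Full:
        return False
```

Deferring (rather than silently dropping) rejected URLs would preserve crawl completeness at the cost of extra bookkeeping.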
- [FEATURE] postmortem debugging with pdb.pm()
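One hedged sketch of how this feature could be wired up: pdb.pm() itself needs sys.last_traceback, which is only set in interactive sessions, so a non-interactive program would typically install an excepthook that calls the equivalent pdb.post_mortem() on the traceback (the hook name and the idea of gating it behind a debug flag are assumptions, not existing linkchecker code):

```python
import pdb
import sys
import traceback

def debug_hook(exc_type, exc_value, tb):
    """Print the traceback, then drop into the post-mortem debugger
    at the frame where the uncaught exception was raised."""
    traceback.print_exception(exc_type, exc_value, tb)
    pdb.post_mortem(tb)

# Hypothetical: install only when a --debug option is given, since
# the hook blocks waiting for debugger input.
# sys.excepthook = debug_hook
```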
- [BUGFIX] The URL in the log output is normalized twice.
- [BUGFIX] When a URL is found in the cache and it has a broken anchor,
  the broken anchor name is not displayed as a warning.
- [USAGE] make a nice GUI for linkchecker
- [FEATURE] Warn if the overall size of a page (including images, Flash,
  etc.) is too big. Right now, the page size only counts the HTML content.
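A rough sketch of the first half of that feature, assuming the standard html.parser module: collect the URLs of embedded resources so their fetched sizes (e.g. via Content-Length) could be added to the HTML size. The class name and the tag list are illustrative assumptions, not linkchecker internals:

```python
from html.parser import HTMLParser

class ResourceCollector(HTMLParser):
    """Collect src URLs of embedded resources from an HTML page."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        # Tags whose src contributes to the total page weight.
        if tag in ("img", "script", "embed", "source", "iframe"):
            for name, value in attrs:
                if name == "src" and value:
                    self.resources.append(value)

page = '<html><body><img src="logo.png"><script src="app.js"></script></body></html>'
collector = ResourceCollector()
collector.feed(page)
# Total size would be len(page) plus the fetched size of each resource.
```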
- [FEATURE] Option to save downloaded pages. This could also be used to
  build an internal cache, although a plethora of caching proxies already
  exists that we could use for that.
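The save half of that option could look roughly like this; the function name, hash-based file naming, and directory layout are all hypothetical choices for illustration:

```python
import hashlib
from pathlib import Path

def save_page(url, content, directory="saved-pages"):
    """Write downloaded page content to disk.

    The file name is derived from a hash of the URL so that any URL
    maps to a valid, unique file name.
    """
    Path(directory).mkdir(parents=True, exist_ok=True)
    name = hashlib.sha256(url.encode("utf-8")).hexdigest()[:16] + ".html"
    path = Path(directory) / name
    path.write_bytes(content)
    return path
```

An internal cache would additionally need an index from URL to file and some expiry policy, which is exactly the work a caching proxy already does.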