Commit Graph

3 Commits

c6ee3d79ad Fix syntax-check 'sc_prohibit_have_config_h'
* cfg.mk: Remove sc_prohibit_have_config_h from local-checks-to-skip
* libwget/*.c: Include <config.h> unconditionally
* src/*.c: Likewise
* tests/*.c: Likewise
2017-04-30 22:01:34 +02:00
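
The 'sc_prohibit_have_config_h' syntax check (a gnulib maint.mk rule) flags sources
that guard the include with '#ifdef HAVE_CONFIG_H'; the fix above is to include
<config.h> unconditionally, before any other header. A minimal sketch of the resulting
pattern, assuming an Autoconf-generated config.h (the file name and the printed macro
are illustrative, not taken from the commit):

    /* example.c - illustrative sketch of the include pattern applied across
     * libwget/, src/ and tests/: <config.h> comes first and unconditionally,
     * instead of being wrapped in #ifdef HAVE_CONFIG_H ... #endif. */
    #include <config.h>

    #include <stdio.h>

    int main(void)
    {
    #ifdef PACKAGE_STRING   /* defined by an Autoconf-generated config.h */
        puts(PACKAGE_STRING);
    #endif
        return 0;
    }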
ec396c577f Fix URLs to HTTPS where possible 2017-02-28 15:31:30 +01:00
36b095fd64 Fix Robots Exclusion Standard support
* include/libwget.h.in: Add function wget_list_getnext().
* libwget/list.c: Add function wget_list_getnext() (a generic iterator sketch follows this list).
* libwget/robots.c: Fix memory leak.
* src/host.c (host_remove_job): Clean up the queue after downloading and
  scanning robots.txt.
* src/job.h (struct JOB): Add flag 'requested_by_user'.
* src/wget.c (add_url_to_queue): Set 'requested_by_user',
  (add_url): Fix checking for disallowed paths.
* tests/Makefile.am: Add test 'test-robots'.
* tests/test-robots.c: New test to verify the robots.txt handling.
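
The exact signature of wget_list_getnext() is not shown in this log, so the following
is only a generic sketch of a getnext-style iterator over a singly linked list; every
name, type and signature here is an assumption for illustration, not libwget's actual
wget_list API:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Illustrative only: a minimal list with getfirst/getnext accessors so a
     * caller can walk the elements without touching the list internals. */
    typedef struct node {
        struct node *next;
        char data[32];
    } node;

    static node *list_append(node **head, const char *data)
    {
        node *n = calloc(1, sizeof(*n));
        if (!n)
            return NULL;
        snprintf(n->data, sizeof(n->data), "%s", data);
        while (*head)                    /* walk to the tail ... */
            head = &(*head)->next;
        *head = n;                       /* ... and link the new node */
        return n;
    }

    static node *list_getfirst(node *head) { return head; }
    static node *list_getnext(node *elem)  { return elem ? elem->next : NULL; }

    int main(void)
    {
        node *disallowed = NULL;
        list_append(&disallowed, "/private/");
        list_append(&disallowed, "/tmp/");

        /* iterate without knowing anything about the node layout */
        for (node *e = list_getfirst(disallowed); e; e = list_getnext(e))
            printf("disallow: %s\n", e->data);

        for (node *e = disallowed; e; ) {
            node *next = e->next;
            free(e);
            e = next;
        }
        return 0;
    }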

Special handling for automatic robots.txt jobs
==============================================
What can happen with --recursive and --span-hosts is that a document from hostA
has links to hostB. All these links might go into the hostB queue before robots.txt
is downloaded and parsed. To avoid downloading 'disallowed' documents, the queue
for hostB has to be cleaned up right after downloading and parsing robots.txt.
Any links that have been explicitly requested by the user are still downloaded
(a code sketch of this cleanup follows below).
2016-09-19 15:23:48 +02:00
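
A minimal sketch of the queue cleanup described above: once robots.txt for a host has
been downloaded and parsed, walk that host's queue and drop every job whose path hits
a Disallow rule, unless the job carries the 'requested_by_user' flag. All types and
helpers here (job_t, queue_t, robots_path_disallowed, cleanup_queue) are hypothetical
stand-ins that only model the behavior, not the actual code in src/host.c and src/job.h:

    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    typedef struct job {
        struct job *next;
        const char *path;
        bool requested_by_user;   /* set in add_url_to_queue() per this commit */
    } job_t;

    typedef struct {
        job_t *head;
    } queue_t;

    /* Hypothetical check: does any Disallow rule prefix-match the path? */
    static bool robots_path_disallowed(const char *path,
                                       const char *const *disallow, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            if (strncmp(path, disallow[i], strlen(disallow[i])) == 0)
                return true;
        return false;
    }

    /* After robots.txt has been parsed, unlink every queued job that hits a
     * Disallow rule -- unless the user asked for that URL explicitly. */
    static void cleanup_queue(queue_t *q, const char *const *disallow, size_t n)
    {
        for (job_t **cur = &q->head; *cur; ) {
            job_t *j = *cur;
            if (!j->requested_by_user
                && robots_path_disallowed(j->path, disallow, n)) {
                *cur = j->next;   /* drop the disallowed job */
                printf("dropped %s\n", j->path);
            } else {
                cur = &j->next;
            }
        }
    }

    int main(void)
    {
        job_t a = { .path = "/private/page.html",   .requested_by_user = false };
        job_t b = { .path = "/private/wanted.html", .requested_by_user = true  };
        job_t c = { .path = "/public/index.html",   .requested_by_user = false };
        a.next = &b; b.next = &c; c.next = NULL;
        queue_t q = { .head = &a };

        const char *disallow[] = { "/private/" };
        cleanup_queue(&q, disallow, 1);

        for (job_t *j = q.head; j; j = j->next)
            printf("kept %s\n", j->path);
        return 0;
    }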