Item530: Add Crawl-delay to robots.txt
Priority: Enhancement
Current State: Closed
Released In: 1.0.4
Target Release: patch
Applies To: Engine
Component: robots.txt
Branches:
There is a non-standard robots.txt directive, Crawl-delay, that limits the rate at which some bots crawl a site. At least Yahoo and Microsoft honour it. We should set this in the robots.txt that ships with Foswiki to prevent resource-usage problems caused by crawlers hitting a Foswiki site.
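A minimal sketch of what the shipped robots.txt could look like (the delay value of 10 seconds is an illustrative assumption, not a decided default):

```
User-agent: *
Crawl-delay: 10
```

Since Crawl-delay is non-standard, crawlers that do not recognise the directive simply ignore it, so adding it is safe for bots that only honour the standard fields.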
Is this completed for trunk?
--
SvenDowideit - 10 May 2009
Any reason not to put this in 1.0.6?
--
WillNorris - 10 May 2009
It is in trunk and was there before we created the branch. Closing.
--
KennethLavrsen - 10 May 2009