r845630 | gstein | 2003-04-04 23:02:07 +0000 (Fri, 04 Apr 2003)

Some servlets are very expensive, and search engines which crawl the site
can trigger them (this has caused at least some of the recent 500 errors).
Use the standard robots.txt mechanism to prevent well-behaved robots from
causing trouble.

* www/robots.txt: new file.

Patch by: Ed Korthof <edk@collab.net>
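The contents of the new file are not shown in the log. A minimal sketch of what a robots.txt like this typically contains, per the Robots Exclusion Protocol (the `/servlets/` path here is a hypothetical placeholder, not taken from the commit):

```
# Hypothetical sketch: ask all well-behaved crawlers to skip the
# expensive servlet URLs. The path below is an assumption for
# illustration; the actual commit's paths are not in the log.
User-agent: *
Disallow: /servlets/
```

Note that robots.txt is purely advisory: it only stops crawlers that choose to honor it, which is why the log says "well-behaved robots".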