WWW::RobotRules::Extended - database of robots.txt-derived permissions. This is a fork of WWW::RobotRules.
This module parses /robots.txt files as specified in "A Standard for Robot Exclusion" (<http://www.robotstxt.org/wc/norobots.html>). It also parses rules that contain wildcards ('*') and Allow directives, as Google does. Webmasters can use the /robo...
YSIMONX/WWW-RobotRules-Extended-0.02 - 14 Jan 2012 10:23:47 UTC
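The wildcard and Allow handling described above can be sketched as follows. This is a hypothetical Python illustration of Google-style matching (longest matching pattern wins, ties go to Allow), not the module's actual Perl implementation:

```python
import re

def rule_to_regex(path_pattern):
    # Translate a robots.txt path pattern into a regex:
    # '*' matches any character sequence, a trailing '$' anchors the end,
    # everything else is literal.
    regex = ""
    for ch in path_pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.compile(regex)

def is_allowed(rules, path):
    # rules: list of (directive, pattern) tuples, directive is
    # "allow" or "disallow". The longest matching pattern decides;
    # on a length tie, Allow wins. No matching rule means allowed.
    best = ("allow", "")
    for directive, pattern in rules:
        if rule_to_regex(pattern).match(path):
            if len(pattern) > len(best[1]) or (
                len(pattern) == len(best[1]) and directive == "allow"
            ):
                best = (directive, pattern)
    return best[0] == "allow"
```

For example, with `[("disallow", "/private/"), ("allow", "/private/help")]`, the path `/private/help/faq` is allowed because the Allow pattern is the longer match, while `/private/data` remains disallowed; a wildcard rule like `("disallow", "/*.pdf$")` blocks any path ending in `.pdf`.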