WWW::RobotRules::AnyDBM_File - Persistent RobotRules

This is a subclass of *WWW::RobotRules* that uses the AnyDBM_File package to implement persistent disk caching of robots.txt and host visit information. The constructor (the new() method) takes an extra argument specifying the name of the DBM file to ...

GAAS/WWW-RobotRules-6.02 - 18 Feb 2012 13:09:13 GMT
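A minimal sketch of the two-argument constructor described above; the robot name 'MyBot/1.0' and the DBM filename 'robot-cache' are placeholder values, and the robots.txt body is supplied inline rather than fetched over HTTP:

```perl
use strict;
use warnings;
use WWW::RobotRules::AnyDBM_File;

# The second argument names the DBM file used for the persistent cache.
my $rules = WWW::RobotRules::AnyDBM_File->new('MyBot/1.0', 'robot-cache');

# Feed the parser a robots.txt body together with the URL it came from.
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /tmp
EOT
$rules->parse('http://example.com/robots.txt', $robots_txt);

print $rules->allowed('http://example.com/index.html') ? "allowed\n" : "denied\n";
print $rules->allowed('http://example.com/tmp/x')      ? "allowed\n" : "denied\n";
```

Because the rules live in the DBM file, a second process constructing the object with the same filename sees the cached permissions without re-fetching robots.txt.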

WWW::RobotRules - database of robots.txt-derived permissions

This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The pars...

GAAS/WWW-RobotRules-6.02 - 18 Feb 2012 13:09:13 GMT
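The module's basic cycle is: construct a rules object for your robot's user-agent name, parse a robots.txt body, then query allowed() per URL. A minimal sketch, with 'MyBot/1.0' as a placeholder agent name and an inline robots.txt instead of an LWP fetch:

```perl
use strict;
use warnings;
use WWW::RobotRules;

my $rules = WWW::RobotRules->new('MyBot/1.0');

# parse() takes the URL the robots.txt was retrieved from plus its content;
# the URL lets the object associate the rules with the right host.
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /private
EOT
$rules->parse('http://example.com/robots.txt', $robots_txt);

print $rules->allowed('http://example.com/index.html')          ? "allowed\n" : "denied\n";  # allowed
print $rules->allowed('http://example.com/private/secret.html') ? "allowed\n" : "denied\n";  # denied
```

In a real crawler the robots.txt content would be fetched first (e.g. with LWP::Simple's get) and allowed() consulted before every request to that host.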

WWW::RobotRules::Extended - database of robots.txt-derived permissions. This is a fork of WWW::RobotRules

This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. It also parses rules that contain '*' wildcards and Allow directives, as Google does. Webmasters can use the /robo...

YSIMONX/WWW-RobotRules-Extended-0.02 - 14 Jan 2012 10:23:47 GMT
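A sketch of the extended syntax the snippet mentions, assuming the fork keeps the parent's new/parse/allowed interface; the agent name and rules below are illustrative, and no expected output is asserted since exact wildcard matching is up to the module:

```perl
use strict;
use warnings;
use WWW::RobotRules::Extended;

my $rules = WWW::RobotRules::Extended->new('MyBot/1.0');

# Google-style rules: a '*' wildcard pattern plus an Allow directive,
# which stock WWW::RobotRules would not interpret.
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /*.gif
Allow: /public
Disallow: /admin
EOT
$rules->parse('http://example.com/robots.txt', $robots_txt);

for my $url (qw(
    http://example.com/public/page.html
    http://example.com/images/pic.gif
    http://example.com/admin
)) {
    print "$url: ", ($rules->allowed($url) ? "allowed" : "denied"), "\n";
}
```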
