WWW::Webrobot::pod::Recur - Interface for traversing a site / generating multiple urls
Usage in a test plan:
    <request>
        <method value='GET'/>
        <url value='http://server.org/start.html'/>
        <recurse>
            <WWW.Webrobot.Recur.LinkChecker>
                <and>
                    <url value="http://server.org/"/>
                    <scheme value="http"/>
                    <not><url value="logout\.jsp"/></not>
                    <not><url value="logout\.do"/></not>
                    <not><url value="setUserLocale.do"/></not>
                </and>
            </WWW.Webrobot.Recur.LinkChecker>
        </recurse>
        <description value='Check site'/>
    </request>
To write your own class, please study the source of WWW::Webrobot::Recur::Browser or WWW::Webrobot::Recur::LinkChecker.
This interface allows you to visit new URLs starting from a URL given in a test plan.
If you want to write a recursion class you must implement the following methods.

next

Synopsis:

    ($newurl, $caller_pages) = $recurse -> next($r);
$newurl is the next URL to visit and $caller_pages is a list of values indicating where $newurl has been found. If you want to stop traversing, return (undef, undef).
allowed

It takes a string URL as its argument and must return 1 if this URL is allowed and 0 if it is not.

If you wonder why this method is needed, here is an explanation: returning only allowed URLs from "next" is not sufficient, because URLs may be redirected via 3xx status codes. This redirection is done automatically by WWW::Webrobot via HTTP::UserAgent and forbidden URLs must therefore be safely excluded there as well. That is where the "allowed" method is used.
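The two-method contract above can be sketched as a minimal recursion class. This is a sketch only: the class name MyRecur, the "prefix" option and the "start" queue are hypothetical illustrations, not part of WWW::Webrobot; a real implementation would extract new links from the HTTP response $r passed to "next".

```perl
package MyRecur;
use strict;
use warnings;

sub new {
    my ($class, %opt) = @_;
    my $self = {
        prefix => $opt{prefix} // '',        # only URLs with this prefix are allowed (assumed option)
        queue  => [ @{ $opt{start} || [] } ],# URLs still to visit (assumed option)
        seen   => {},                        # URLs already returned
    };
    return bless $self, $class;
}

# Must return 1 if $url may be visited, 0 otherwise.  WWW::Webrobot
# also calls this to veto 3xx redirect targets.
sub allowed {
    my ($self, $url) = @_;
    return index($url, $self->{prefix}) == 0 ? 1 : 0;
}

# Return the next URL and a list of pages where it was found,
# or (undef, undef) to stop traversing.  $r would be the last
# response; this sketch ignores it and just drains its queue.
sub next {
    my ($self, $r) = @_;
    while (my $url = shift @{ $self->{queue} }) {
        next if $self->{seen}{$url}++;     # skip already-visited URLs
        next if !$self->allowed($url);     # skip disallowed URLs
        return ($url, [ 'start' ]);        # caller page is a placeholder here
    }
    return (undef, undef);                 # queue exhausted: stop
}

1;
```

A caller would then loop on "next" until it returns (undef, undef), checking each candidate with "allowed" first.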
Existing implementations:

WWW::Webrobot::Recur::Browser - Follow all frames and images.

WWW::Webrobot::Recur::LinkChecker - Follow all frames, images and links you can get.

A further class follows links randomly.
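A custom class is selected in the test plan the same way as the built-in ones: the element inside <recurse> names the class, with dots instead of "::", as in the LinkChecker example above. MyRecur is a hypothetical class name used for illustration only:

    <request>
        <method value='GET'/>
        <url value='http://server.org/start.html'/>
        <recurse>
            <MyRecur/>
        </recurse>
        <description value='Check site with a custom recursion class'/>
    </request>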
See also: WWW::Webrobot::pod::Testplan
Stefan Trcek
Copyright(c) 2004 ABAS Software AG
This software is licensed under the Perl license; see the LICENSE file.
To install WWW::Webrobot, copy and paste the appropriate command into your terminal.

With cpanm:

    cpanm WWW::Webrobot

With the CPAN shell:

    perl -MCPAN -e shell
    install WWW::Webrobot

For more information on module installation, please visit the detailed CPAN module installation guide.