NAME

WWW::Webrobot::pod::Recur - Interface for traversing a site / generating multiple urls

SYNOPSIS

Usage in a test plan:

    <request>
        <method value='GET'/>
        <url value='http://server.org/start.html'/>
        <recurse>
            <WWW.Webrobot.Recur.LinkChecker>
                <and>
                    <url value="http://server.org/"/>
                    <scheme value="http"/>
                    <not><url value="logout\.jsp"/></not>
                    <not><url value="logout\.do"/></not>
                    <not><url value="setUserLocale.do"/></not>
                </and>
            </WWW.Webrobot.Recur.LinkChecker>
        </recurse>
        <description value='Check site'/>
    </request>

For writing your own class please study the source of WWW::Webrobot::Recur::Browser or WWW::Webrobot::Recur::LinkChecker.

DESCRIPTION

This interface allows you to visit new URLs, starting from a URL given in a test plan.

METHODS

If you want to write a recursion class, you must implement the following methods:

$obj -> next ($url)

Synopsis:

 ($newurl, $caller_pages) = $recurse -> next($r);

$newurl is the next URL to visit, and $caller_pages is a list of values indicating where $newurl has been found.

To stop traversing, return (undef, undef).
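As an illustration, a minimal queue-based next method might look like the following sketch. The package, the queue layout, and the seen hash are all hypothetical and not part of WWW::Webrobot; only the return contract, ($newurl, $caller_pages) or (undef, undef), comes from this interface:

    package My::Recur::QueueWalker;
    use strict;
    use warnings;

    sub new {
        my ($class) = @_;
        # queue holds [$url, $found_on_page] pairs still to visit
        return bless { queue => [], seen => {} }, $class;
    }

    # Return the next URL to visit plus the page it was found on,
    # or (undef, undef) to stop the traversal.
    sub next {
        my ($self, $r) = @_;
        while (my $entry = shift @{ $self->{queue} }) {
            my ($url, $caller) = @$entry;
            next if $self->{seen}{$url}++;   # skip URLs already visited
            return ($url, [$caller]);
        }
        return (undef, undef);               # queue empty: stop traversing
    }

    1;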

$obj -> allowed ($url)

It takes a URL string as its argument and must return 1 if this URL is allowed and 0 if it is not.

If you wonder why this method is needed, here is the explanation: returning only allowed URLs from next is not sufficient, because URLs may be redirected via 3xx status codes. This redirection is done automatically by WWW::Webrobot via HTTP::UserAgent, so disallowed URLs must be safely excluded there as well. That is where the allowed method is used.
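A matching allowed method, mirroring the filter from the SYNOPSIS above, could be sketched like this. The host and the patterns are illustrative assumptions, not prescribed by the interface:

    # Admit only http URLs below the start site and reject logout pages.
    sub allowed {
        my ($self, $url) = @_;
        return 0 if $url =~ m{logout\.(?:jsp|do)};    # never follow logouts
        return 1 if $url =~ m{^http://server\.org/};  # stay within the site
        return 0;                                     # deny everything else
    }

Because redirects are resolved inside HTTP::UserAgent, this check runs for redirect targets too, not only for URLs returned by next.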

IMPLEMENTING CLASSES

WWW::Webrobot::Recur::Browser

Follow all frames and images.

WWW::Webrobot::Recur::LinkChecker

Follow all frames, images and links you can get.

WWW::Webrobot::Recur::RandomBrowser [EXPERIMENTAL]

Follow links randomly.

SEE ALSO

WWW::Webrobot::pod::Testplan

AUTHOR

Stefan Trcek

COPYRIGHT

Copyright(c) 2004 ABAS Software AG

This software is licensed under the Perl license; see the LICENSE file.