NAME

wikicrawl - crawl Wikipedia to generate a graph from the article links found

SYNOPSIS

Crawl Wikipedia and create a Graph::Easy text description of the inter-article links found during the crawl.

At least one argument must be given to start:

        perl examples/wikicrawl.pl --language=fr

ARGUMENTS

Here are the options:

--help

Print the full documentation, not just this short overview.

--version

Write version info and exit.

--language

Select the language edition of Wikipedia to crawl. Currently supported are 'de', 'en', and 'fr'. Default is 'en'.

--root

Set the root node where the crawl should start. Default is of course 'Xkcd'.

--maxdepth

The maximum depth to which the crawl should go. Please use small values (under 10). Default is 4.

--maxspread

The maximum number of links to follow per article. Please use small values (under 10). Default is 5.

--maxnodes

The maximum number of nodes to crawl. Set to -1 (the default) to disable this limit.
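Putting these options together, a typical invocation might look like the following. This is an illustrative sketch only; the exact path to the script depends on where the Graph::Easy distribution's examples/ directory lives on your system, and the option values are arbitrary examples:

```shell
# Crawl the French Wikipedia starting at the 'Xkcd' article,
# going at most 3 links deep, following at most 4 links per article,
# and stopping after 50 nodes. The Graph::Easy text is written to stdout,
# so redirect it to a file for later rendering:
perl examples/wikicrawl.pl --language=fr --root=Xkcd \
    --maxdepth=3 --maxspread=4 --maxnodes=50 > crawl.txt
```

Keeping --maxdepth and --maxspread small matters: the number of fetched articles can grow roughly as maxspread**maxdepth, so even modest values produce large crawls.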

SEE ALSO

http://forums.xkcd.com/viewtopic.php?f=2&t=21300&p=672184 and Graph::Easy.

LICENSE

This library is free software; you can redistribute it and/or modify it under the terms of the GPL.

See the LICENSE file of Graph::Easy for a copy of the GPL.

AUTHOR

Copyright (C) 2008 by integral, forum.xkcd.com

Copyright (C) 2008 by Tels, http://bloodgate.com