Usage: ./ [ -concurrency <concurrency level> ] [ -times <no. times> ] [ -depth <depth> ] [ -logdir <log dir> ] <base url>

DESCRIPTION

This is a Perl script that stress tests a website. Given a URL, it will "spider" from that URL, requesting all pages linked from it, and all images on each page. It will only follow links on the same site (from the same host). It can be configured, using command line options, to traverse links to a particular depth (default 1), to repeat the traversal a number of times (default 1), and to fork a number of concurrent clients doing separate traversals (default 1).
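The same-host spidering step described above can be sketched with LWP's HTML::LinkExtor and the URI module. This is an illustrative sketch, not the script's actual code; the function name same_host_links is hypothetical.

```perl
#!/usr/bin/perl
# Hypothetical sketch of same-host link collection, using HTML::LinkExtor
# and URI (both shipped with libwww-perl / core CPAN). Names are illustrative.
use strict;
use warnings;
use URI;
use HTML::LinkExtor;

# Collect links (<a href>) and images (<img src>) from an HTML page,
# keeping only those whose host matches the base URL's host.
sub same_host_links {
    my ( $html, $base_url ) = @_;
    my $base_host = URI->new( $base_url )->host;
    my @found;
    my $extor = HTML::LinkExtor->new(
        sub {
            my ( $tag, %attr ) = @_;
            return unless $tag eq 'a' || $tag eq 'img';
            my $link = $attr{href} || $attr{src} or return;
            my $uri = URI->new_abs( $link, $base_url );
            push @found, $uri->as_string
                if $uri->can( 'host' ) && $uri->host eq $base_host;
        }
    );
    $extor->parse( $html );
    return @found;
}
```

A depth-limited traversal would call this on each fetched page and recurse on the returned page links until the configured depth is reached.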

Each forked client will log its activity in a logfile called "log.n", where n is the number of the client, in a logging directory (default: the directory the script is installed in). The log lists all requests, with time of request, bytes transferred, total bytes transferred, and total elapsed time, in a tab separated format; e.g.:

Mon May 14 12:36:13 2001 51691 51691 1.342589 02072906
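Emitting a line in that tab-separated format might look like the sketch below. It covers only the fields named in the prose (timestamp, bytes, total bytes, elapsed time); the script's actual formatting, including the trailing field visible in the example above, may differ.

```perl
#!/usr/bin/perl
# Hypothetical sketch of formatting one log line as described above.
# The function name log_line is illustrative, not from the script.
use strict;
use warnings;

sub log_line {
    my ( $bytes, $total_bytes, $elapsed ) = @_;
    # scalar localtime yields e.g. "Mon May 14 12:36:13 2001"
    return join( "\t",
        scalar localtime,
        $bytes,
        $total_bytes,
        sprintf( '%.6f', $elapsed ),
    ) . "\n";
}
```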

The stress tester tries to mimic a browser; i.e. it "caches" images, requesting each one only once.
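The request-once behaviour amounts to a seen-URL cache. A minimal sketch, with hypothetical names (%seen, fetch_once) not taken from the script:

```perl
#!/usr/bin/perl
# Hypothetical sketch of the browser-like image "cache": each URL is
# fetched only the first time it is seen. Names are illustrative.
use strict;
use warnings;

my %seen;    # URLs already requested in this traversal

sub fetch_once {
    my ( $url, $fetch ) = @_;    # $fetch: coderef that performs the request
    return 0 if $seen{$url}++;   # already requested: skip
    $fetch->( $url );
    return 1;
}
```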

Copyright (c) 2001 Ave Wrigley. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.