NAME

Scraperwiki - Scraperwiki library

SYNOPSIS

  use Scraperwiki;

  Scraperwiki::save_sqlite({unique_keys => ['country'], data => {country => 'CZ', heavy => 'metal'}});
  Scraperwiki::attach ('scraper');
  print Scraperwiki::select ("* from swdata limit 10");
  Scraperwiki::commit;
  print Scraperwiki::show_tables;
  Scraperwiki::table_info ({name => 'swdata'});
  Scraperwiki::table_info ('swdata');
  Scraperwiki::save_var ('Hello', {value => 8086, verbose => 2});
  Scraperwiki::get_var ('Hello');
  Scraperwiki::get_var ({name => 'Hello', default => 666});
  Scraperwiki::httpresponseheader ({headerkey => 'Content-Type', headervalue => 'text/plain'});
  Scraperwiki::gb_postcode_to_latlng ({postcode => 'L17AY'});
  Scraperwiki::gb_postcode_to_latlng ('L17AY');

DESCRIPTION

The Perl environment in ScraperWiki comes with the Scraperwiki module loaded.

METHODS

scrape (url[, params])

Returns the downloaded string from the given url. If params is set, the request is sent as a POST with those parameters.
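
A minimal sketch of both forms. The exact shape of params is an assumption here (a hash reference, matching the hash-style arguments used elsewhere in this module), and example.com stands in for a real target:

  use Scraperwiki;

  # Plain GET of a page
  my $html = Scraperwiki::scrape ('http://example.com/');

  # With params set, the request goes out as a POST
  my $reply = Scraperwiki::scrape ('http://example.com/search',
      {q => 'perl', page => 1});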

save_sqlite (unique_keys, data[, table_name="swdata", verbose=2])

Saves a data record into the datastore, into the table given by table_name. data is a hash reference with field names as keys; unique_keys is an array reference holding a subset of the keys of data, which determines when an existing record is overwritten. For large numbers of records, data can be a reference to a list of such hashes. verbose alters what is shown in the Data tab of the editor.
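
A sketch of a single save and a bulk save. The table and field names are illustrative only:

  use Scraperwiki;

  # Single record; 'country' decides when an existing row is overwritten
  Scraperwiki::save_sqlite ({unique_keys => ['country'],
      data => {country => 'CZ', heavy => 'metal'}});

  # Many records at once: data as a reference to a list of hashes
  Scraperwiki::save_sqlite ({unique_keys => ['country'],
      data => [
          {country => 'SK', heavy => 'water'},
          {country => 'PL', heavy => 'industry'},
      ],
      table_name => 'swdata'});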

attach (name[, asname])

Attaches to the datastore of the scraper named name. asname is an optional alias for the attached datastore.
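
A sketch of attaching under an alias and querying across it; the scraper name is hypothetical, and the alias is assumed to work as an SQLite database prefix:

  use Scraperwiki;

  # Attach another scraper's datastore under the alias 'other'
  Scraperwiki::attach ('some_other_scraper', 'other');
  print Scraperwiki::select ('* from other.swdata limit 5');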

select (val1[, val2])

Executes a select command on the datastore, e.g. select("* from swdata limit 10"). Returns an array of hashes for the records that have been selected. val2 is an optional array reference of bind parameters, used when the select command contains '?' placeholders.
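
A sketch of a parameterised query; the column names are illustrative, and the returned records are assumed to be hash references keyed by column name, as the description above suggests:

  use Scraperwiki;

  # '?' placeholders keep values out of the SQL string
  my @rows = Scraperwiki::select ('* from swdata where country = ?', ['CZ']);
  foreach my $row (@rows) {
      print "$row->{country}: $row->{heavy}\n";
  }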

sqliteexecute (val1[, val2])

Executes any arbitrary SQLite command (except attach), e.g. create, delete, insert or drop. val2 is an optional array reference of bind parameters, used when the command in val1 contains question marks (e.g. "insert into swdata values (?,?,?)").
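
A sketch of creating a table and inserting into it with bind parameters, followed by a commit; the table and column names are made up for illustration:

  use Scraperwiki;

  Scraperwiki::sqliteexecute (
      'create table if not exists prices (item text, pence integer)');
  Scraperwiki::sqliteexecute ('insert into prices values (?, ?)',
      ['tea', 120]);
  Scraperwiki::commit;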

commit ()

Commits changes to the datastore after a series of execute commands. (save_sqlite() auto-commits after every action.)

show_tables ([dbname])

Returns an array of tables and their schemas in either the current or an attached database.

table_info (name)

Returns an array of attributes for each column of the given table.

save_var (key, value)

Saves an arbitrary single value into an SQLite table called "swvariables". This can be used, for example, to let a scraper continue after an interruption.

get_var (key[, default])

Retrieves a single value that was saved by save_var, or default if the key has not been set.
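
A sketch of the checkpointing pattern the save_var description mentions: remember the last completed page so an interrupted run can resume. The variable name and page range are illustrative:

  use Scraperwiki;

  # Resume from the last completed page, defaulting to 1 on a first run
  my $page = Scraperwiki::get_var ({name => 'last_page', default => 1});
  while ($page <= 10) {
      # ... scrape and save page $page here ...
      Scraperwiki::save_var ('last_page', $page);
      $page++;
  }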

httpresponseheader (headerkey, headervalue)

Sets an HTTP response header, e.g. to change the Content-Type to something other than HTML when using a ScraperWiki "view" (e.g. "Content-Type", "image/png").
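
A sketch of a view emitting CSV instead of HTML, using the positional form from the heading above (the SYNOPSIS also shows a hash-style form):

  use Scraperwiki;

  # Tell the browser this view returns CSV, then print the body
  Scraperwiki::httpresponseheader ('Content-Type', 'text/csv');
  print "country,heavy\n";
  print "CZ,metal\n";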

gb_postcode_to_latlng (postcode)

Returns an array [lat, lng] in WGS84 coordinates representing the central point of a UK postcode area.
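
A sketch of looking up a postcode centre point, assuming the [lat, lng] notation above means an array reference is returned:

  use Scraperwiki;

  my ($lat, $lng) = @{ Scraperwiki::gb_postcode_to_latlng ('L17AY') };
  print "Centre of L1 7AY: $lat, $lng\n";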

SEE ALSO

COPYRIGHT

This program is free software; you can redistribute it and/or modify it under the same terms as Scraperwiki itself.

AUTHOR

Lubomir Rintel <lkundrak@v3.sk>