NAME

Elastic::Model - A NoSQL document store with full text search for Moose objects using Elasticsearch as a backend.

VERSION

version 0.52

SYNOPSIS

    package MyApp;

    use Elastic::Model;

    has_namespace 'myapp' => {
        user => 'MyApp::User',
        post => 'MyApp::Post'
    };

    has_typemap 'MyApp::TypeMap';

    # Setup custom analyzers

    has_filter 'edge_ngrams' => (
        type     => 'edge_ngram',
        min_gram => 2,
        max_gram => 10
    );

    has_analyzer 'edge_ngrams' => (
        tokenizer => 'standard',
        filter    => [ 'standard', 'lowercase', 'edge_ngrams' ]
    );

    no Elastic::Model;

DESCRIPTION

Elastic::Model is a framework to store your Moose objects, which uses Elasticsearch as a NoSQL document store and flexible search engine.

It is designed to make it easy to start using Elasticsearch with minimal extra code, but allows you full access to the rich feature set available in Elasticsearch as soon as you are ready to use it.

FUTURE OF ELASTIC::MODEL - PLEASE READ AND COMMENT

Hi to all users of Elastic::Model.

Elasticsearch 2.0.0 is out, and Elastic::Model doesn't support it. In fact, Elastic::Model doesn't support a number of things from Elasticsearch 1.x either. I apologise for neglecting this module.

My feeling is that Elastic::Model tries to do way too much. Like many frameworks, it ties you into doing things in a particular way, which may or may not make sense for your use case. Most people who use Elastic::Model seem to use a subset of the functionality, and then talk to Elasticsearch directly the rest of the time.

I don't think it makes sense to just update the code for 2.x; it needs a complete rethink.

TELL ME HOW YOU USE IT

Please could you add comments to this issue explaining which bits you find useful, which bits you never use, and which bits you find annoying. Perhaps the code can be split out into smaller, more useful chunks.

INTRODUCTION TO Elastic::Model

If you are not familiar with Elastic::Model, you should start by reading Elastic::Manual::Intro.

The rest of the documentation on this page explains how to use the Elastic::Model module itself.

BACKWARDS COMPATIBILITY BREAK

NOTE: This version of Elastic::Model uses Search::Elasticsearch and is intended for Elasticsearch 1.x. However, it can be used with Elasticsearch 0.90.x in "compatibility mode". Elasticsearch 2.x is not supported.

You can no longer use the old Search::Elasticsearch::Compat. See Elastic::Manual::Delta for instructions.

For a version of Elastic::Model which uses Search::Elasticsearch::Compat please see https://metacpan.org/release/DRTECH/Elastic-Model-0.28.

USING ELASTIC::MODEL

Your application needs a model class to handle the relationship between your object classes and the Elasticsearch cluster.

Your model class is most easily defined as follows:

    package MyApp;

    use Elastic::Model;

    has_namespace 'myapp' => {
        user => 'MyApp::User',
        post => 'MyApp::Post'
    };

    no Elastic::Model;

This applies Elastic::Model::Role::Model to your MyApp class, applies Elastic::Model::Meta::Class::Model to MyApp's metaclass, and exports functions which help you to configure your model.

Your model must define at least one namespace, which tells Elastic::Model which type (like a table in a DB) should be handled by which of your classes. So the above declaration says:

"For all indices which belong to namespace myapp, objects of class MyApp::User will be stored under the type user in Elasticsearch."

Custom TypeMap

Elastic::Model uses a TypeMap to figure out how to inflate and deflate your objects, and how to configure them in Elasticsearch.

You can specify your own TypeMap using:

    has_typemap 'MyApp::TypeMap';

See Elastic::Model::TypeMap::Base for instructions on how to define your own type-map classes.
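
As a rough sketch only (see Elastic::Model::TypeMap::Base for the authoritative syntax), a custom type-map class which teaches Elastic::Model how to deflate, inflate and map a hypothetical custom type might look like this:

    package MyApp::TypeMap;

    use Elastic::Model::TypeMap::Base qw(:all);

    # Hypothetical type, stored as a plain string in Elasticsearch
    has_type 'MyApp::Types::PhoneNumber',
        deflate_via { sub { $_[0]->as_string } },                  # object -> scalar
        inflate_via { sub { MyApp::PhoneNumber->new( $_[0] ) } },  # scalar -> object
        map_via     { type => 'string', index => 'not_analyzed' };

    1;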

Custom unique key index

If you have attributes whose values are unique, then you can customize the index where these unique values are stored.

    has_unique_index 'myapp_unique';

The default value is unique_key.
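
Unique values themselves are declared per attribute in your doc classes. A minimal sketch, assuming a MyApp::User class with an email attribute:

    package MyApp::User;

    use Elastic::Doc;

    # Each user must have a unique email address; the uniqueness
    # constraint is stored in the index configured above.
    has 'email' => (
        is         => 'ro',
        isa        => 'Str',
        unique_key => 'email',
    );

    no Elastic::Doc;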

Custom analyzers

Analysis is the process of converting full text into terms or tokens and is one of the things that gives full text search its power. When storing text in the Elasticsearch index, the text is first analyzed into terms/tokens. Then, when searching, search keywords go through the same analysis process to produce the terms/tokens which are then searched for in the index.

Choosing the right analyzer for each field gives you enormous control over how your data can be queried.

There are a large number of built-in analyzers available, but frequently you will want to define custom analyzers, which consist of:

  • zero or more character filters

  • a tokenizer

  • zero or more token filters

Elastic::Model provides sugar to make it easy to specify custom analyzers:

has_char_filter

Character filters can change the text before it gets tokenized, for instance:

    has_char_filter 'my_mapping' => (
        type        => 'mapping',
        mappings    => ['ph=>f','qu=>q']
    );

See "Default character filters" in Elastic::Model::Meta::Class::Model for a list of the built-in character filters.

has_tokenizer

A tokenizer breaks up the text into individual tokens or terms. For instance, the pattern tokenizer could be used to split text using a regex:

    has_tokenizer 'my_word_tokenizer' => (
        type        => 'pattern',
        pattern     => '\W+',          # splits on non-word chars
    );

See "Default tokenizers" in Elastic::Model::Meta::Class::Model for a list of the built-in tokenizers.

has_filter

Any terms/tokens produced by the "has_tokenizer" can then be passed through multiple token filters. For instance, each term could be broken down into "edge ngrams" (e.g. 'foo' => 'f', 'fo', 'foo') for partial matching.

    has_filter 'my_ngrams' => (
        type        => 'edge_ngram',
        min_gram    => 1,
        max_gram    => 10,
    );

See "Default token filters" in Elastic::Model::Meta::Class::Model for a list of the built-in character token filters.

has_analyzer

Custom analyzers can be defined by combining character filters, a tokenizer, and token filters, some of which may be built-in and some defined with the keywords above.

For instance:

    has_analyzer 'partial_word_analyzer' => (
        type        => 'custom',
        char_filter => ['my_mapping'],
        tokenizer   => 'my_word_tokenizer',
        filter      => ['lowercase','stop','my_ngrams']
    );

See "Default analyzers" in Elastic::Model::Meta::Class::Model for a list of the built-in analyzers.

Overriding Core Classes

If you would like to override any of the core classes used by Elastic::Model, then you can do so as follows:

    override_classes (
        domain  => 'MyApp::Domain',
        store   => 'MyApp::Store'
    );

The defaults are:

SEE ALSO

AUTHOR

Clinton Gormley <drtech@cpan.org>

COPYRIGHT AND LICENSE

This software is copyright (c) 2015 by Clinton Gormley.

This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.