# cld2 token filter

A bleve token filter which passes the text of each token to the cld2 library. The library determines the language the text is most likely written in, and the corresponding ISO-639 language code replaces the token term.

In normal usage, you use this filter with the "single" tokenizer, so there is only one input token. Further, you should precede it with the "to_lower" filter so that the input term consists entirely of lower-case Unicode characters.
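
As a concrete illustration, here is a minimal sketch of running the filter directly on a token stream. The package and constructor names (`cld2`, `NewCld2Filter`) are assumptions here, so check `cld2_filter.go` for the exact API:

    package main

    import (
        "fmt"

        "github.com/couchbaselabs/bleve/analysis"
        "github.com/couchbaselabs/bleve/analysis/token_filters/cld2"
    )

    func main() {
        // a single, already lower-cased token, as the "single" tokenizer
        // followed by the "to_lower" filter would produce
        input := analysis.TokenStream{
            &analysis.Token{Term: []byte("le chat est noir")},
        }

        filter := cld2.NewCld2Filter() // assumed constructor name
        output := filter.Filter(input)

        // the token term now holds the detected ISO-639 code, e.g. "fr"
        fmt.Println(string(output[0].Term))
    }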

## Building

  1. Acquire the source to cld2 in this directory.

    $ svn checkout http://cld2.googlecode.com/svn/trunk/ cld2-read-only
    
  2. Build cld2

    $ cd cld2-read-only/internal/
    $ ./compile_libs.sh
    
  3. Put the resulting libraries somewhere your dynamic linker can find them.

    $ cp *.so /usr/local/lib
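
     On most Linux systems you may also need to refresh the dynamic linker cache so the new libraries are picked up; one common way, assuming ldconfig is available, is:

    $ sudo ldconfig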
    
  4. Run the unit tests

    $ cd ../..
    $ go test -v
    === RUN TestCld2Filter
    --- PASS: TestCld2Filter (0.00 seconds)
    PASS
    ok      github.com/couchbaselabs/bleve/analysis/token_filters/cld2      0.033s