# cld2 token filter
A bleve token filter which takes the text of each token and passes it to the cld2 library. The library determines the most likely language, and the ISO-639 language code replaces the token term.
In normal usage, combine this filter with the "single" tokenizer, so that there is only one input token. Further, precede it with the "to_lower" filter so that the input term consists of all lower-case Unicode characters.
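The pipeline described above (single token, lower-cased, then language detection) can be sketched in Go. This is illustrative only: the `Token` struct, `LangFilter`, and the `detectLanguage` stub are hypothetical stand-ins for the real bleve analysis types and the cld2 library, showing only the shape of a term-replacing token filter.

```go
package main

import (
	"fmt"
	"strings"
)

// Token mirrors the shape of a token in an analysis pipeline:
// a term plus its byte offsets in the original text.
type Token struct {
	Term  []byte
	Start int
	End   int
}

// detectLanguage is a trivial stub standing in for the cld2 library;
// it returns an ISO-639 code for the input text.
func detectLanguage(text string) string {
	if strings.Contains(text, "bonjour") {
		return "fr"
	}
	return "en"
}

// LangFilter replaces each token's term with the detected language
// code, mimicking the behavior this README describes.
func LangFilter(input []*Token) []*Token {
	for _, tok := range input {
		tok.Term = []byte(detectLanguage(string(tok.Term)))
	}
	return input
}

func main() {
	// "single" tokenizer: the whole text becomes one token,
	// lower-cased first as the "to_lower" filter would do.
	text := strings.ToLower("Bonjour tout le monde")
	stream := []*Token{{Term: []byte(text), Start: 0, End: len(text)}}

	out := LangFilter(stream)
	fmt.Println(string(out[0].Term)) // prints "fr"
}
```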
# Building
1. Acquire the source to cld2 in this directory.
        $ svn checkout http://cld2.googlecode.com/svn/trunk/ cld2-read-only
2. Build cld2
        $ cd cld2-read-only/internal/
        $ ./compile_libs.sh
3. Put the resulting libraries somewhere your dynamic linker can find.

        $ cp *.so /usr/local/lib
4. Run the unit tests
        $ cd ../..
        $ go test -v
        === RUN TestCld2Filter
        --- PASS: TestCld2Filter (0.00 seconds)
        PASS
        ok      github.com/couchbaselabs/bleve/analysis/token_filters/cld2      0.033s