Struct elasticlunr::Index
pub struct Index {
    pub fields: Vec<String>,
    pub pipeline: Pipeline,
    pub ref_field: String,
    pub version: &'static str,
    pub document_store: DocumentStore,
    // some fields omitted
}
An elasticlunr search index.
Fields
fields: Vec<String>
pipeline: Pipeline
ref_field: String
version: &'static str
document_store: DocumentStore
Implementations
pub fn with_language<I>(lang: Language, fields: I) -> Self where
    I: IntoIterator,
    I::Item: AsRef<str>,

pub fn add_doc<I>(&mut self, doc_ref: &str, data: I) where
    I: IntoIterator,
    I::Item: AsRef<str>,
Add the data from a document to the index.

NOTE: The elements of data should be provided in the same order as the fields used to create the index.
Example
let mut index = Index::new(&["title", "body"]);
index.add_doc("1", &["this is a title", "this is body text"]);
pub fn add_doc_with_tokenizer<I>(
    &mut self,
    doc_ref: &str,
    data: I,
    tokenizer: TokenizerFn
) where
    I: IntoIterator,
    I::Item: AsRef<str>,
Add the data from a document to the index.

NOTE: The elements of data should be provided in the same order as the fields used to create the index.
Example
fn css_tokenizer(text: &str) -> Vec<String> {
    text.split(|c: char| c.is_whitespace())
        .filter(|s| !s.is_empty())
        .map(|s| s.trim().to_lowercase())
        .collect()
}

let mut index = Index::new(&["title", "body"]);
index.add_doc_with_tokenizer("1", &["this is a title", "this is body text"], css_tokenizer);
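Because css_tokenizer above uses only the standard library, its behavior can be checked in isolation, without any elasticlunr types. A minimal sketch:

```rust
// Standalone check of the whitespace-splitting tokenizer from the example above.
fn css_tokenizer(text: &str) -> Vec<String> {
    text.split(|c: char| c.is_whitespace())
        .filter(|s| !s.is_empty())
        .map(|s| s.trim().to_lowercase())
        .collect()
}

fn main() {
    // Consecutive whitespace produces empty splits, which the filter drops.
    let tokens = css_tokenizer("This IS  body\ttext");
    println!("{:?}", tokens);
}
```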
pub fn add_doc_with_tokenizers<I, T>(
    &mut self,
    doc_ref: &str,
    data: I,
    tokenizers: T
) where
    I: IntoIterator,
    I::Item: AsRef<str>,
    T: IntoIterator<Item = TokenizerFn>,
Add the data from a document to the index.

NOTE: The elements of data and tokenizers should be provided in the same order as the fields used to create the index.
Example
use elasticlunr::pipeline::{tokenize, TokenizerFn};

fn css_tokenizer(text: &str) -> Vec<String> {
    text.split(|c: char| c.is_whitespace())
        .filter(|s| !s.is_empty())
        .map(|s| s.trim().to_lowercase())
        .collect()
}

let mut index = Index::new(&["title", "body"]);
let tokenizers: Vec<TokenizerFn> = vec![tokenize, css_tokenizer];
index.add_doc_with_tokenizers("1", &["this is a title", "this is body text"], tokenizers);
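The per-field dispatch that add_doc_with_tokenizers performs can be sketched without the crate itself. The TokenizerFn alias below is reproduced here as an assumption (a plain function pointer, fn(&str) -> Vec<String>); only the zip-in-field-order pattern is the point:

```rust
// Assumed to match elasticlunr's TokenizerFn alias: a plain function pointer.
type TokenizerFn = fn(&str) -> Vec<String>;

// Two illustrative tokenizers with different behavior per field.
fn lowercase_words(text: &str) -> Vec<String> {
    text.split_whitespace().map(|w| w.to_lowercase()).collect()
}

fn keep_raw(text: &str) -> Vec<String> {
    text.split_whitespace().map(String::from).collect()
}

fn main() {
    // One tokenizer per field, in the same order as the fields,
    // mirroring how add_doc_with_tokenizers pairs them up.
    let fields = ["title", "body"];
    let data = ["A Title", "Body TEXT"];
    let tokenizers: Vec<TokenizerFn> = vec![lowercase_words, keep_raw];

    for ((field, text), tok) in fields.iter().zip(&data).zip(&tokenizers) {
        println!("{}: {:?}", field, tok(text));
    }
}
```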
pub fn to_json_pretty(&self) -> String

Returns the index, serialized to pretty-printed JSON.
Trait Implementations
fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error> where
    __D: Deserializer<'de>,
Deserialize this value from the given Serde deserializer. Read more
Auto Trait Implementations
impl RefUnwindSafe for Index
impl UnwindSafe for Index