
Length
Public Preview

The length filter removes tokens that do not meet specified length requirements, allowing you to control the length of tokens retained during text processing.

Configuration

The length filter is a custom filter in Zilliz Cloud, specified by setting "type": "length" in the filter configuration. You configure it as a dictionary within the filter list of analyzer_params to define the length limit.

analyzer_params = {
    "tokenizer": "standard",
    "filter": [{
        "type": "length",  # Specifies the filter type as length
        "max": 10,  # Sets the maximum token length to 10 characters
    }],
}

The length filter accepts the following configurable parameters.

| Parameter | Description |
| --- | --- |
| max | Sets the maximum token length. Tokens longer than this length are removed. |

The length filter operates on the terms generated by the tokenizer, so it must be used in combination with a tokenizer. For a list of tokenizers available in Zilliz Cloud, refer to Tokenizer Reference.

After defining analyzer_params, you can apply them to a VARCHAR field when defining a collection schema. This allows Zilliz Cloud to process the text in that field using the specified analyzer for efficient tokenization and filtering. For details, refer to Example use.
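For reference, a minimal sketch of attaching this configuration to a VARCHAR field might look like the following (the endpoint, token, and field names are placeholders, and a complete schema would also need a vector field before the collection can be created):

from pymilvus import MilvusClient, DataType

client = MilvusClient(
    uri="YOUR_CLUSTER_ENDPOINT",   # Placeholder: your cluster endpoint
    token="YOUR_CLUSTER_TOKEN"     # Placeholder: your cluster credentials
)

schema = client.create_schema()
schema.add_field(field_name="id", datatype=DataType.INT64, is_primary=True)
schema.add_field(
    field_name="text",                   # Placeholder field name
    datatype=DataType.VARCHAR,
    max_length=1000,
    enable_analyzer=True,                # Process this field with the configured analyzer
    analyzer_params=analyzer_params      # The length filter configuration defined above
)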

Examples

Before applying the analyzer configuration to your collection schema, verify its behavior using the run_analyzer method.

Analyzer configuration:

analyzer_params = {
    "tokenizer": "standard",
    "filter": [{
        "type": "length",  # Specifies the filter type as length
        "max": 10,  # Sets the maximum token length to 10 characters
    }],
}

Verification using run_analyzer:

from pymilvus import MilvusClient

# Connect to your cluster (replace the placeholders with your own endpoint and token)
client = MilvusClient(uri="YOUR_CLUSTER_ENDPOINT", token="YOUR_CLUSTER_TOKEN")

# Sample text to analyze
sample_text = "The length filter allows control over token length requirements for text processing."

# Run the analyzer with the defined configuration
result = client.run_analyzer(sample_text, analyzer_params)
print(result)

Expected output (with max: 10):

['The', 'length', 'filter', 'allows', 'control', 'over', 'token', 'length', 'for', 'text', 'processing']
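
Here, the token requirements (12 characters) exceeds the 10-character limit and is removed, while the remaining tokens pass through unchanged.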