# gtars-tokenizers
Wrapper around gtars-overlaprs for producing tokens for machine learning models.
## Purpose
This module wraps the core overlap infrastructure from gtars-overlaprs to convert genomic regions into vocabulary tokens for machine learning pipelines. It is specifically designed for ML applications that need to represent genomic intervals as discrete tokens.
## Design Philosophy
All overlap computation is delegated to gtars-overlaprs. This module focuses on:
- Token vocabulary management
- Encoding/decoding strategies
- Integration with ML frameworks (HuggingFace, etc.)
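Vocabulary management and encoding/decoding can be pictured as a small bookkeeping layer on top of the overlap machinery. The sketch below is illustrative, not this crate's API: a hypothetical `Vocab` type that encodes region tokens (e.g. `"chr1:100-200"`) as integer IDs and decodes them back, reserving ID 0 for an `<unk>` fallback.

```rust
use std::collections::HashMap;

// Hypothetical sketch of token vocabulary management: region strings
// map to integer IDs, with ID 0 reserved for the unknown token.
struct Vocab {
    token_to_id: HashMap<String, u32>,
    id_to_token: Vec<String>,
}

impl Vocab {
    fn new(tokens: &[&str]) -> Self {
        let mut token_to_id = HashMap::new();
        let mut id_to_token = vec!["<unk>".to_string()]; // ID 0 reserved
        token_to_id.insert("<unk>".to_string(), 0);
        for t in tokens {
            token_to_id.insert(t.to_string(), id_to_token.len() as u32);
            id_to_token.push(t.to_string());
        }
        Self { token_to_id, id_to_token }
    }

    // Unknown regions fall back to the <unk> ID.
    fn encode(&self, token: &str) -> u32 {
        *self.token_to_id.get(token).unwrap_or(&0)
    }

    fn decode(&self, id: u32) -> &str {
        &self.id_to_token[id as usize]
    }
}

fn main() {
    let vocab = Vocab::new(&["chr1:100-200", "chr1:300-400"]);
    assert_eq!(vocab.encode("chr1:100-200"), 1);
    assert_eq!(vocab.decode(2), "chr1:300-400");
    assert_eq!(vocab.encode("chrX:0-50"), 0); // unknown region -> <unk>
    println!("ok");
}
```

ML frameworks such as HuggingFace expect exactly this ID-based interface, which is why the module separates vocabulary bookkeeping from overlap computation.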
## Use Cases
- Transformer Models: Convert genomic regions to token sequences
- Feature Extraction: Represent intervals as discrete features for ML
- Language Model Input: Prepare genomic data for NLP-based models
## Main Components
- Tokenizer: Maps regions to vocabulary tokens using overlap detection
- Universe: Vocabulary of genomic regions (peaks/intervals)
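The relationship between the two components can be sketched in a few lines. This is a toy model, not this crate's implementation: a hypothetical `Universe` holds the vocabulary intervals, and tokenization returns the IDs of every vocabulary entry that overlaps a query region (half-open intervals).

```rust
// Toy sketch of overlap-based tokenization (hypothetical types):
// token IDs are simply indices into the vocabulary of intervals.
struct Universe {
    regions: Vec<(u32, u32)>, // (start, end) per vocabulary token
}

impl Universe {
    fn tokenize(&self, start: u32, end: u32) -> Vec<usize> {
        let mut ids = Vec::new();
        for (id, &(s, e)) in self.regions.iter().enumerate() {
            // Half-open interval overlap test.
            if s < end && start < e {
                ids.push(id);
            }
        }
        ids
    }
}

fn main() {
    let universe = Universe { regions: vec![(0, 100), (50, 150), (200, 300)] };
    let tokens = universe.tokenize(60, 120);
    assert_eq!(tokens, vec![0, 1]); // query overlaps the first two entries
    println!("{:?}", tokens);
}
```

In the real module, this linear scan is replaced by the interval-tree machinery in gtars-overlaprs; only the vocabulary/ID layer lives here.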
## Example

A sketch of typical usage (module paths and `Region` field names are illustrative and may differ in your version):

```rust
use std::path::Path;
// Import paths below are illustrative.
use gtars_tokenizers::Tokenizer;
use gtars_tokenizers::Region;

// Build a tokenizer whose vocabulary (universe) comes from a BED file.
let tokenizer = Tokenizer::from_bed(Path::new("peaks.bed")).unwrap();

// Tokenize query regions by overlap against the universe.
let regions = vec![Region { chr: "chr1".to_string(), start: 100, end: 200 }];
let tokens = tokenizer.tokenize(&regions);
```