Information-theoretic features and transformations
This module provides feature engineering and selection based on information theory, including:
- Entropy measures (Shannon, Rényi, permutation entropy)
- Mutual information and conditional mutual information
- Information gain for feature selection
- Transfer entropy for causality detection
- Complexity measures (Lempel-Ziv, approximate entropy)
- Information bottleneck features
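As a point of reference for the measures listed above, the most basic of them — Shannon entropy over a discrete probability distribution — can be sketched in a few lines of plain Rust. This is an illustrative sketch, not this module's `shannon_entropy` implementation or signature:

```rust
/// Shannon entropy H(X) = -Σ p_i · log2(p_i) of a discrete
/// probability distribution, in bits. Zero-probability bins are
/// skipped, following the convention 0 · log 0 = 0.
fn shannon_entropy(probs: &[f64]) -> f64 {
    probs
        .iter()
        .filter(|&&p| p > 0.0)
        .map(|&p| -p * p.log2())
        .sum()
}

fn main() {
    // A fair coin carries exactly one bit of uncertainty.
    println!("{}", shannon_entropy(&[0.5, 0.5])); // prints 1
    // A certain outcome carries none.
    println!("{}", shannon_entropy(&[1.0]));
}
```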
Structs

- InformationFeatureSelector - Information-based feature selector
- InformationFeatureSelectorConfig - Configuration for information-based feature selection
- InformationFeatureSelectorFitted - Fitted information-based feature selector

Enums

- InformationMetric - Information-theoretic metric for feature selection
Functions

- approximate_entropy - Calculate approximate entropy (ApEn), a measure of regularity
- conditional_entropy - Calculate conditional entropy H(Y|X)
- joint_entropy - Calculate joint entropy H(X,Y)
- lempel_ziv_complexity - Calculate Lempel-Ziv complexity (normalized)
- mutual_information - Calculate mutual information between two variables
- normalized_mutual_information - Calculate normalized mutual information (ranges from 0 to 1)
- permutation_entropy - Calculate permutation entropy (ordinal-pattern-based entropy)
- renyi_entropy - Calculate Rényi entropy of order alpha
- sample_entropy - Calculate sample entropy (SampEn), an improved variant of approximate entropy
- shannon_entropy - Calculate Shannon entropy of a discrete distribution
- transfer_entropy - Calculate transfer entropy from X to Y (directional information flow)
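The entropy-based quantities in this list are tied together by the identity I(X;Y) = H(X) + H(Y) − H(X,Y), which expresses mutual information in terms of marginal and joint entropies. The sketch below estimates all three terms from symbol counts; the helper names and signatures are illustrative assumptions, not this module's API:

```rust
use std::collections::HashMap;
use std::hash::Hash;

/// Empirical Shannon entropy (bits) of a sample of discrete symbols.
fn empirical_entropy<T: Hash + Eq>(xs: &[T]) -> f64 {
    let mut counts: HashMap<&T, usize> = HashMap::new();
    for x in xs {
        *counts.entry(x).or_insert(0) += 1;
    }
    let n = xs.len() as f64;
    counts
        .values()
        .map(|&c| {
            let p = c as f64 / n;
            -p * p.log2()
        })
        .sum()
}

/// Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
fn mutual_information(xs: &[u32], ys: &[u32]) -> f64 {
    let joint: Vec<(u32, u32)> = xs.iter().copied().zip(ys.iter().copied()).collect();
    empirical_entropy(xs) + empirical_entropy(ys) - empirical_entropy(&joint)
}

fn main() {
    // Y is a copy of X: all of H(X) = 1 bit is shared.
    println!("{}", mutual_information(&[0, 0, 1, 1], &[0, 0, 1, 1])); // prints 1
    // Y is independent of X: no information is shared.
    println!("{}", mutual_information(&[0, 0, 1, 1], &[0, 1, 0, 1])); // prints 0
}
```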
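Permutation entropy differs from the value-based measures above in that it looks only at ordinal (rank) patterns within sliding windows, which makes it invariant under monotone distortions of a time series. A minimal sketch of the idea, using order-`m` patterns over overlapping windows (an assumed signature, not necessarily this module's):

```rust
use std::collections::HashMap;

/// Permutation entropy of order `m`, in bits: the Shannon entropy of
/// the distribution of ordinal (rank) patterns over all overlapping
/// windows of length `m`.
fn permutation_entropy(series: &[f64], m: usize) -> f64 {
    let mut counts: HashMap<Vec<usize>, usize> = HashMap::new();
    for window in series.windows(m) {
        // The ordinal pattern is the argsort of the window's values.
        let mut pattern: Vec<usize> = (0..m).collect();
        pattern.sort_by(|&a, &b| window[a].partial_cmp(&window[b]).unwrap());
        *counts.entry(pattern).or_insert(0) += 1;
    }
    let total: usize = counts.values().sum();
    counts
        .values()
        .map(|&c| {
            let p = c as f64 / total as f64;
            -p * p.log2()
        })
        .sum()
}

fn main() {
    // A monotone series has a single ordinal pattern, hence zero entropy.
    println!("{}", permutation_entropy(&[1.0, 2.0, 3.0, 4.0, 5.0], 3)); // prints 0
    // A zig-zag series uses both order-2 patterns equally: 1 bit.
    println!("{}", permutation_entropy(&[1.0, 3.0, 2.0, 4.0, 3.0], 2)); // prints 1
}
```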
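The complexity measures take a different angle from the entropies: Lempel-Ziv complexity counts how many distinct phrases a dictionary parse needs to cover a sequence, so repetitive data parses into few phrases and varied data into many. The sketch below uses a simple left-to-right dictionary parse and one common normalization, c · log2(n) / n; both the parse variant and the normalization are assumptions for illustration, not necessarily the ones this module uses:

```rust
use std::collections::HashSet;

/// Number of phrases in a simple left-to-right dictionary parse:
/// each phrase is the shortest prefix of the remaining input that
/// has not been produced before (the final phrase may repeat one).
fn lz_phrase_count(s: &[u8]) -> usize {
    let mut seen: HashSet<&[u8]> = HashSet::new();
    let mut i = 0;
    let mut phrases = 0;
    while i < s.len() {
        let mut j = i + 1;
        // Grow the candidate phrase while it is already in the dictionary.
        while j < s.len() && seen.contains(&s[i..j]) {
            j += 1;
        }
        seen.insert(&s[i..j]);
        phrases += 1;
        i = j;
    }
    phrases
}

/// Phrase count normalized by n / log2(n), so that for long inputs
/// repetitive sequences score lower than varied ones.
fn lempel_ziv_complexity(s: &[u8]) -> f64 {
    let n = s.len() as f64;
    lz_phrase_count(s) as f64 * n.log2() / n
}

fn main() {
    // "0001" parses as 0 | 00 | 1 -> 3 phrases.
    println!("{}", lz_phrase_count(b"0001")); // prints 3
    // A long constant run needs few phrases relative to its length.
    println!("{}", lempel_ziv_complexity(&vec![0u8; 4096]));
}
```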