Module information_theory


Information-theoretic features and transformations

This module provides feature engineering and selection based on information theory, including:

  • Entropy measures (Shannon, Rényi, permutation entropy)
  • Mutual information and conditional mutual information
  • Information gain for feature selection
  • Transfer entropy for causality detection
  • Complexity measures (Lempel-Ziv, approximate entropy)
  • Information bottleneck features
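To make the discrete measures above concrete, here is a minimal standalone sketch of Shannon entropy and mutual information over integer-coded samples. The signatures and the joint-symbol encoding are illustrative assumptions, not this module's actual API:

```rust
use std::collections::HashMap;

/// Shannon entropy H(X) in bits, estimated from integer-coded samples.
fn shannon_entropy(xs: &[u32]) -> f64 {
    let mut counts: HashMap<u32, usize> = HashMap::new();
    for &x in xs {
        *counts.entry(x).or_insert(0) += 1;
    }
    let n = xs.len() as f64;
    counts
        .values()
        .map(|&c| {
            let p = c as f64 / n;
            -p * p.log2()
        })
        .sum()
}

/// Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y), in bits.
fn mutual_information(xs: &[u32], ys: &[u32]) -> f64 {
    assert_eq!(xs.len(), ys.len());
    // Encode each (x, y) pair as a single integer so the joint entropy
    // can reuse shannon_entropy (assumes y < 1000; illustrative only).
    let joint: Vec<u32> = xs.iter().zip(ys).map(|(&x, &y)| x * 1000 + y).collect();
    shannon_entropy(xs) + shannon_entropy(ys) - shannon_entropy(&joint)
}

fn main() {
    let x = [0, 0, 1, 1];
    let y = [0, 0, 1, 1]; // y is fully determined by x
    println!("H(X)   = {:.3} bits", shannon_entropy(&x)); // 1.000
    println!("I(X;Y) = {:.3} bits", mutual_information(&x, &y)); // 1.000
}
```

The same identity underlies the `joint_entropy` and `conditional_entropy` helpers listed below, since H(Y|X) = H(X,Y) − H(X).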

Structs

InformationFeatureSelector
Information-based feature selector
InformationFeatureSelectorConfig
Configuration for information-based feature selection
InformationFeatureSelectorFitted
Fitted information-based feature selector

Enums

InformationMetric
Information-theoretic metric for feature selection

Functions

approximate_entropy
Calculate approximate entropy (ApEn), a measure of time-series regularity
conditional_entropy
Calculate conditional entropy H(Y|X)
joint_entropy
Calculate joint entropy H(X,Y)
lempel_ziv_complexity
Calculate Lempel-Ziv complexity (normalized)
mutual_information
Calculate mutual information between two variables
normalized_mutual_information
Calculate normalized mutual information (0 to 1)
permutation_entropy
Calculate permutation entropy (ordinal pattern-based entropy)
renyi_entropy
Calculate Rényi entropy of order alpha
sample_entropy
Calculate sample entropy (SampEn), an improved version of approximate entropy
shannon_entropy
Calculate Shannon entropy for a discrete distribution
transfer_entropy
Calculate transfer entropy from X to Y (directional information flow)
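As a rough illustration of the ordinal-pattern approach behind `permutation_entropy`, the following standalone sketch counts rank patterns over sliding windows and normalizes the resulting entropy by log2(m!). The signature and normalization choice are assumptions for illustration, not necessarily what this module implements:

```rust
use std::collections::HashMap;

/// Permutation entropy of a series for embedding dimension `m`,
/// normalized to [0, 1] by log2(m!).
fn permutation_entropy(xs: &[f64], m: usize) -> f64 {
    let mut counts: HashMap<Vec<usize>, usize> = HashMap::new();
    // Map each length-m window to its ordinal (rank) pattern,
    // i.e. the index permutation that sorts the window.
    for w in xs.windows(m) {
        let mut idx: Vec<usize> = (0..m).collect();
        idx.sort_by(|&a, &b| w[a].partial_cmp(&w[b]).unwrap());
        *counts.entry(idx).or_insert(0) += 1;
    }
    let n = (xs.len() - m + 1) as f64;
    // Shannon entropy of the pattern distribution, in bits.
    let h: f64 = counts
        .values()
        .map(|&c| {
            let p = c as f64 / n;
            -p * p.log2()
        })
        .sum();
    let max_h = ((1..=m).product::<usize>() as f64).log2(); // log2(m!)
    h / max_h
}

fn main() {
    // A strictly increasing series has a single ordinal pattern,
    // so its normalized permutation entropy is 0.
    let monotonic: Vec<f64> = (0..100).map(|i| i as f64).collect();
    println!("{:.3}", permutation_entropy(&monotonic, 3)); // 0.000
}
```

An alternating series such as 0, 1, 0, 1, … with m = 2 instead yields two near-equally frequent patterns, so its normalized permutation entropy approaches 1.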