# LogosQ Optimizer

Classical optimization algorithms for variational quantum algorithms, providing stable and fast parameter optimization.

## Features

- **Adam**: Adaptive moment estimation with momentum
- **L-BFGS**: Quasi-Newton method for smooth objectives
- **SPSA**: Gradient-free simultaneous perturbation stochastic approximation (see the sketch after this list)
- **Natural Gradient**: Fisher information-aware optimization
- **GPU acceleration**: Optional CUDA support
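
SPSA is particularly suited to noisy VQA objectives because it estimates a descent direction from only two cost evaluations per step, independent of the parameter count. The following is a minimal, self-contained sketch of the technique itself, not of this crate's API; the quadratic objective `f` and the tiny xorshift RNG are stand-ins chosen to keep the example dependency-free.

```rust
/// One SPSA step: estimate a descent direction from two evaluations of `f`
/// along a random +/-1 perturbation, then move against it.
fn spsa_step(f: impl Fn(&[f64]) -> f64, params: &mut [f64], a: f64, c: f64, rng: &mut u64) {
    // Rademacher (+/-1) perturbation, drawn from a tiny xorshift RNG so the
    // sketch needs no external crates.
    let delta: Vec<f64> = params
        .iter()
        .map(|_| {
            *rng ^= *rng << 13;
            *rng ^= *rng >> 7;
            *rng ^= *rng << 17;
            if *rng & 1 == 0 { 1.0 } else { -1.0 }
        })
        .collect();

    // Two cost evaluations bracket the current point along `delta`.
    let plus: Vec<f64> = params.iter().zip(&delta).map(|(p, d)| p + c * d).collect();
    let minus: Vec<f64> = params.iter().zip(&delta).map(|(p, d)| p - c * d).collect();
    let diff = (f(&plus) - f(&minus)) / (2.0 * c);

    // diff / delta_i approximates the i-th partial derivative.
    for (p, d) in params.iter_mut().zip(&delta) {
        *p -= a * diff / d;
    }
}

fn main() {
    // Stand-in objective; in a real VQA this would be a measured circuit cost.
    let f = |p: &[f64]| p.iter().map(|x| x * x).sum::<f64>();
    let mut params = vec![0.5; 4];
    let mut rng = 0x9e3779b97f4a7c15u64;
    for _ in 0..200 {
        spsa_step(&f, &mut params, 0.05, 0.01, &mut rng);
    }
    println!("params after 200 SPSA steps: {params:?}");
}
```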

## Quick Start

```rust
use logosq_optimizer::{Adam, Optimizer};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Adam keeps running moment estimates between steps, so the
    // optimizer itself must be mutable.
    let mut optimizer = Adam::new()
        .with_learning_rate(0.01)
        .with_beta1(0.9);

    // 16 circuit parameters and their gradients for one update.
    let mut params = vec![0.1; 16];
    let gradients = vec![0.01; 16];

    // Perform a single optimization step, updating `params` in place.
    optimizer.step(&mut params, &gradients, 0)?;
    Ok(())
}
```
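
In practice, `step` runs inside an optimization loop. Below is a minimal sketch of such a loop using only the calls shown above; the `cost_gradient` closure is a hypothetical placeholder for however gradients are obtained in a real VQA (for example, via the parameter-shift rule on a quantum backend), and passing the loop counter as the final argument mirrors the `0` in the Quick Start:

```rust
use logosq_optimizer::{Adam, Optimizer};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut optimizer = Adam::new().with_learning_rate(0.01);
    let mut params = vec![0.1; 16];

    // Hypothetical gradient source. A real VQA would estimate this from
    // circuit measurements, e.g. with the parameter-shift rule.
    let cost_gradient = |p: &[f64]| -> Vec<f64> { p.iter().map(|x| 2.0 * x).collect() };

    for step in 0..100 {
        let gradients = cost_gradient(&params);
        optimizer.step(&mut params, &gradients, step)?;
    }
    Ok(())
}
```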

## Installation

```toml
[dependencies]
logosq-optimizer = "0.1"
```
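
GPU acceleration is optional. If it is exposed as a Cargo feature, enabling it would look like the snippet below; the feature name `cuda` is an assumption here, so check the crate's documentation for the actual flag:

```toml
[dependencies]
# Hypothetical feature name -- verify against the crate's documentation.
logosq-optimizer = { version = "0.1", features = ["cuda"] }
```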

## License

MIT OR Apache-2.0