# llm-stack-openai 0.6.0

OpenAI GPT provider for the llm-stack SDK.
# Changelog

All notable changes to this project will be documented in this file.

## [0.6.0] - 2026-02-08


## [0.5.0] - 2026-02-07


## [0.4.0] - 2026-02-07

### Miscellaneous

- Release v0.3.0 ([#14](https://github.com/nazq-org/llm-stack/pull/14))


## [0.3.0] - 2026-02-06

### Bug Fixes

- Update README resumable loop example and add Usage::total_tokens() ([#13](https://github.com/nazq-org/llm-stack/pull/13))
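
The `Usage::total_tokens()` accessor mentioned above is a convenience over the separate token counters. A minimal sketch of the idea, assuming a `Usage` struct with prompt and completion counters (field names here are hypothetical, not taken from the crate):

```rust
/// Hypothetical stand-in for the crate's token usage report; the real
/// struct's fields may differ.
#[derive(Debug, Default, Clone, Copy)]
pub struct Usage {
    pub prompt_tokens: u32,
    pub completion_tokens: u32,
}

impl Usage {
    /// Sum of prompt and completion tokens, saturating to avoid overflow.
    pub fn total_tokens(&self) -> u32 {
        self.prompt_tokens.saturating_add(self.completion_tokens)
    }
}

fn main() {
    let usage = Usage { prompt_tokens: 120, completion_tokens: 48 };
    assert_eq!(usage.total_tokens(), 168);
    println!("total tokens: {}", usage.total_tokens());
}
```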

### Features

- *(registry)* Add shared HTTP client to ProviderConfig ([#10](https://github.com/nazq-org/llm-stack/pull/10)); see the client-sharing sketch after this list
- *(tool)* Add resumable tool loop and optimize hot-path allocations ([#9](https://github.com/nazq-org/llm-stack/pull/9))
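
The point of the shared HTTP client change is to reuse one connection pool across provider configurations instead of building a client per provider. A minimal sketch under assumed names (the real `ProviderConfig` fields and constructors may differ); `reqwest::Client` is cheap to clone because it wraps an internal `Arc`:

```rust
use reqwest::Client;

/// Hypothetical illustration of a provider config that accepts an
/// injected HTTP client; field and method names are assumptions.
pub struct ProviderConfig {
    pub api_key: String,
    pub http_client: Client,
}

impl ProviderConfig {
    /// Build a config that reuses an existing client (and its pool).
    pub fn with_client(api_key: impl Into<String>, client: Client) -> Self {
        Self { api_key: api_key.into(), http_client: client }
    }
}

fn main() {
    // One client, shared by every provider configuration.
    let shared = Client::new();
    let openai = ProviderConfig::with_client("sk-placeholder", shared.clone());
    let other = ProviderConfig::with_client("key-placeholder", shared);
    // Both configs now hold handles to the same connection pool.
    let _ = (openai, other);
}
```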


## [0.2.2] - 2026-02-05


## [0.2.1] - 2026-02-05

### Miscellaneous

- Add docs.rs metadata configuration
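
The docs.rs metadata entry above refers to the standard `[package.metadata.docs.rs]` table in `Cargo.toml`. A generic example of what such a configuration typically looks like (the keys this crate actually sets are not shown here, so treat this as illustrative only):

```toml
# Cargo.toml: generic docs.rs build settings (illustrative, not the
# crate's actual configuration).
[package.metadata.docs.rs]
all-features = true
rustdoc-args = ["--cfg", "docsrs"]
```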


## [0.2.0] - 2026-02-05

### Features

- Rename llm-stack-core to llm-stack ([#4](https://github.com/nazq-org/llm-stack/pull/4))


## [0.1.1] - 2026-02-05