LLM inference engine, tool dispatch, and streaming execution loop.

- [engine]: context window management, token estimation, and tool execution helpers.
- [runtime]: the core agent loop, streaming, and run lifecycle coordination.
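The division of labor above can be sketched as follows. This is a minimal, hypothetical illustration (the `Step`, `Engine`, and `run` names are invented for this example, not the crate's actual API): the engine yields either streamed text or a tool call, and the runtime loop dispatches tools and accumulates output until the run completes.

```rust
// Illustrative sketch only: all names are assumptions, not the real API.
#[derive(Debug, PartialEq)]
enum Step {
    Text(String),     // a chunk of streamed assistant text
    ToolCall(String), // a tool the model asked to run (name only, for brevity)
    Done,             // the run has finished
}

// Stand-in for the inference engine: replays a scripted sequence of steps.
// The real engine would also manage the context window and token budget.
struct Engine {
    script: Vec<Step>,
}

impl Engine {
    fn next_step(&mut self) -> Step {
        if self.script.is_empty() {
            Step::Done
        } else {
            self.script.remove(0)
        }
    }
}

// The runtime's core loop: stream text, dispatch tool calls, stop on Done.
fn run(engine: &mut Engine) -> (String, Vec<String>) {
    let mut transcript = String::new();
    let mut tools_run = Vec::new();
    loop {
        match engine.next_step() {
            Step::Text(t) => transcript.push_str(&t),
            Step::ToolCall(name) => {
                // A real runtime would execute the tool and append its
                // result back into the engine's context window.
                tools_run.push(name);
            }
            Step::Done => break,
        }
    }
    (transcript, tools_run)
}

fn main() {
    let mut engine = Engine {
        script: vec![
            Step::Text("Checking weather... ".into()),
            Step::ToolCall("get_weather".into()),
            Step::Text("It is sunny.".into()),
        ],
    };
    let (transcript, tools) = run(&mut engine);
    assert_eq!(transcript, "Checking weather... It is sunny.");
    assert_eq!(tools, vec!["get_weather".to_string()]);
}
```

In a real implementation the loop body would be async and the text chunks would be forwarded to the caller as they arrive, but the shape of the loop stays the same: poll the engine, dispatch side effects, repeat until done.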