indusagi/ai/README
# AI Module Overview
The AI module provides a unified streaming interface across multiple LLM providers. It exposes model definitions, provider adapters, message types, and utility helpers.
Primary entrypoint: `indusagi/ai`.
## Directory Map
- `indusagi/src/ai/index.ts` re-exports most public APIs.
- `indusagi/src/ai/stream.ts` provides the main streaming and completion functions.
- `indusagi/src/ai/models.ts` loads models and computes costs.
- `indusagi/src/ai/types.ts` defines message and tool types.
- `indusagi/src/ai/providers/*` contains provider adapters.
- `indusagi/src/ai/utils/*` includes parsing, validation, and overflow helpers.
- `indusagi/src/ai/cli.ts` is a small OAuth helper CLI.
## Conceptual Flow
- Build a `Context` with a system prompt, messages, and optional tool definitions.
- Select a `Model` from the registry or custom providers.
- Call `stream` or `streamSimple` to get an async stream of events.
- Consume events to render incremental text, tool calls, and final usage.
- Optionally call `complete` or `completeSimple` to await the final assistant message.
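The flow above can be sketched as follows. The `Context` and `stream` names come from this module, but every field shape, the event union, and the mock generator below are illustrative assumptions standing in for the real API, not its actual definitions.

```typescript
// Illustrative sketch only: field names and event shapes are assumptions.
type Message = { role: "system" | "user" | "assistant"; content: string };
type Context = { systemPrompt: string; messages: Message[] };

// Stand-in for the module's unified stream of incremental events.
type StreamEvent =
  | { type: "text"; delta: string }
  | { type: "done"; usage: { inputTokens: number; outputTokens: number } };

// Mock in place of the real `stream`, which would route the context through
// a provider adapter; here it just yields a fixed sequence of events.
async function* stream(context: Context): AsyncGenerator<StreamEvent> {
  yield { type: "text", delta: "Hello" };
  yield { type: "text", delta: ", world" };
  yield { type: "done", usage: { inputTokens: 12, outputTokens: 3 } };
}

// Consume events: accumulate incremental text, then read the final usage.
async function run(context: Context): Promise<string> {
  let text = "";
  for await (const event of stream(context)) {
    if (event.type === "text") text += event.delta;
    else console.log("usage:", event.usage);
  }
  return text;
}
```

The same consumption loop works for `streamSimple`; `complete`/`completeSimple` would instead resolve once with the final assistant message.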
The same `Context` and `Message` structures are used across all providers; provider adapters handle conversion to each vendor's API format.
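As a minimal sketch of what an adapter's conversion step looks like, assuming a made-up vendor wire format (the `author`/`text` field names are invented for illustration and do not belong to any real provider):

```typescript
// Unified message shape (illustrative; the real types live in types.ts).
type Message = { role: "user" | "assistant"; content: string };

// Hypothetical vendor wire format that uses different field names.
type VendorMessage = { author: string; text: string };

// An adapter's core job: map the unified shape onto the vendor's shape.
function toVendorMessages(messages: Message[]): VendorMessage[] {
  return messages.map((m) => ({ author: m.role, text: m.content }));
}
```

Real adapters additionally translate tool definitions, tool results, and image parts, and map the vendor's streaming chunks back into the unified event types.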
## What This Module Does
- Normalizes message formats across providers.
- Handles tool calls and tool results.
- Supports text and image inputs where allowed.
- Emits a unified stream of incremental events.
- Calculates token usage cost per model.
- Supports reasoning or thinking controls when the provider allows it.
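Per-model cost calculation reduces to multiplying token counts by the model's rates. A minimal sketch, with made-up rates expressed in USD per million tokens (the real rates come from the model registry in `models.ts`):

```typescript
// Illustrative cost shape; rate values below are invented for the example.
type ModelCost = { inputPerMTok: number; outputPerMTok: number }; // USD per 1M tokens

function usageCost(cost: ModelCost, inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * cost.inputPerMTok +
    (outputTokens / 1_000_000) * cost.outputPerMTok
  );
}

// Example: 1M input tokens at $3/MTok plus 200k output tokens at $15/MTok.
console.log(usageCost({ inputPerMTok: 3, outputPerMTok: 15 }, 1_000_000, 200_000)); // 6
```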
For API details, see:
- `indusagi/docs/ai/api-reference.md`
- `indusagi/docs/ai/streaming.md`
- `indusagi/docs/ai/models.md`
- `indusagi/docs/ai/providers.md`
- `indusagi/docs/ai/utils.md`
