
FAQ

What providers are supported?

ContextAgent ships adapters for OpenAI, Anthropic, Google, DeepSeek, and Azure OpenAI. Implement the LLMClient interface to add support for other providers.
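
For illustration, a custom adapter might look roughly like the sketch below. The import path, base class location, and complete() signature are assumptions, not the documented contract; use the built-in adapters as the reference for the real LLMClient interface.

```python
# Hypothetical sketch -- the import path and method signature are assumptions,
# not the documented LLMClient contract.
from contextagent.llm import LLMClient  # assumed location of the base class


class EchoClient(LLMClient):
    """Toy adapter that echoes the prompt instead of calling a real provider."""

    def complete(self, prompt: str, **kwargs) -> str:
        # A real adapter would call the provider's SDK here and return its text.
        return f"echo: {prompt}"
```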

Does ContextAgent store data?

By default, artifacts and state live in memory. Configure a disk, S3, or database adapter to persist data between runs.
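
As a rough sketch, wiring up a disk-backed store could look like this. The DiskStore name, import path, and constructor arguments are hypothetical; the storage adapter docs describe the actual configuration options.

```python
# Hypothetical sketch -- the store class, import path, and keyword arguments
# are assumptions, not the documented storage API.
from contextagent.stores import DiskStore  # assumed import path

# Persist artifacts and state under ./data so they survive process restarts.
store = DiskStore(root="./data")
# The store is then passed wherever the pipeline or agent is constructed.
```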

Can I use my own prompts?

Yes. Author profiles in contextagent/profiles/ and reference them in pipeline configs. Version prompts with the @vN suffix.
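
Purely as an illustration, a versioned profile layout and a config reference might look like the following; the file names, extension, and profile key are hypothetical.

```
contextagent/profiles/
    summarize@v1.md    # original prompt profile (hypothetical name)
    summarize@v2.md    # revised prompt, selected explicitly by version

# referenced from a pipeline config, for example:
#   profile: summarize@v2
```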

Where are logs stored?

Logs stream to stdout. Enable telemetry exporters to forward traces and logs to your observability stack.
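
If your observability stack speaks OpenTelemetry, exporter setup might look roughly like this. The FAQ above does not name ContextAgent's telemetry backend, so treating it as OpenTelemetry-based, and the collector endpoint shown, are assumptions; see the telemetry docs for the exporters that are actually supported.

```python
# Hypothetical sketch -- assumes an OpenTelemetry-based telemetry stack and a
# collector at the endpoint below; how ContextAgent hooks into this provider
# is not specified here.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://otel-collector:4317"))
)
trace.set_tracer_provider(provider)  # spans emitted afterwards are exported via OTLP
```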

How do I deploy to production?

Follow the deployment guide to containerize workers, configure stores, and expose an API.