How does Restate help?
The benefits of using Restate here are:

- Automatic retries of failed tasks: LLM API downtime, timeouts, infrastructure failures, etc.
- Recovery of previous progress: after a failure, Restate restores the progress the execution made before the crash, instead of restarting from scratch.
- Works with any LLM SDK (Vercel AI, LangChain, LiteLLM, etc.) and any programming language supported by Restate (TypeScript, Python, Go, etc.).
Example
Wrap each step in the chain with `ctx.run()` to ensure fault tolerance and automatic recovery. Restate uses durable execution to persist the result of each step as it completes, so if any step fails, Restate retries from that exact point without losing previous progress or re-executing completed steps.
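The replay behavior can be sketched with a toy model (illustrative only, not the Restate SDK): each step's result is journaled the first time it runs, and a retry after a crash reuses the journaled results instead of re-executing completed steps. The step names and the `callChain` workflow below are hypothetical.

```typescript
// Toy model of durable-execution journaling (illustrative only, not the Restate SDK).
// Each run() result is stored in a journal; on replay, completed steps are skipped.

type Journal = Map<string, string>;

class Context {
  constructor(private journal: Journal, public executed: string[] = []) {}

  async run(name: string, action: () => Promise<string>): Promise<string> {
    const cached = this.journal.get(name);
    if (cached !== undefined) return cached; // replay: reuse the persisted result
    const result = await action();           // first run: execute the step...
    this.journal.set(name, result);          // ...and persist its result
    this.executed.push(name);
    return result;
  }
}

// A chained workflow: each step feeds the next, and each is journaled.
async function callChain(ctx: Context, input: string): Promise<string> {
  const outline = await ctx.run("outline", async () => `outline(${input})`);
  const draft = await ctx.run("draft", async () => `draft(${outline})`);
  return ctx.run("polish", async () => `polish(${draft})`);
}

async function main() {
  const journal: Journal = new Map();

  // First attempt "crashes" after two steps: run them, then stop.
  const first = new Context(journal);
  await first.run("outline", async () => "outline(topic)");
  await first.run("draft", async () => "draft(outline(topic))");
  // ...crash here; the journal already holds the two completed steps.

  // On retry, only the remaining step actually executes.
  const retry = new Context(journal);
  const result = await callChain(retry, "topic");
  console.log(result);         // polish(draft(outline(topic)))
  console.log(retry.executed); // [ 'polish' ] — earlier steps were replayed from the journal
}

main();
```

With the real SDK, the journal lives in Restate rather than in memory, so recovery works across process crashes and restarts, not just within one process.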

Run the example
Requirements
- AI SDK of your choice (e.g., OpenAI, LangChain, Pydantic AI, LiteLLM, etc.) to make LLM calls.
- API key for your model provider.
Start the Service
Export the API key of your model provider as an environment variable and then start the service. For example, for OpenAI:
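As a sketch (the key value is a placeholder, and the start command is a hypothetical dev script; substitute whatever your project actually defines):

```shell
# Placeholder key: replace with your real OpenAI API key
export OPENAI_API_KEY="sk-..."

# Hypothetical start command; use your project's actual dev script
npm run app-dev
```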
Send a request
In the UI (http://localhost:9070), click the process handler of the CallChainingService to open the playground and send a default request:
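Alternatively, you can send the request from the command line. This sketch assumes Restate's default ingress port (8080) and a plain JSON string as input; the actual request shape depends on the handler's signature and the payload below is made up:

```shell
# Restate's ingress routes requests as /ServiceName/handlerName;
# the example prompt here is an assumption
curl localhost:8080/CallChainingService/process \
  --json '"Write a short story about a robot learning to paint"'
```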
