How does Restate help?
The benefits of using Restate here are:
- Durable promises: Create promises that survive process restarts and failures
- Automatic recovery: Resume exactly where you left off after any interruption
- SDK and language flexibility: Works with any LLM SDK (Vercel AI, LangChain, LiteLLM, etc.) and any programming language supported by Restate (TypeScript, Python, Go, etc.)
- Serverless-friendly waiting: Suspend execution while waiting for approval; pay for active work, not idle time

Example
Use ctx.awakeable() to create durable promises that can be resolved externally. The workflow suspends during the wait and resumes automatically when the promise is resolved.
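A minimal sketch of such a handler in TypeScript, assuming the @restatedev/restate-sdk service API; the needsHumanReview helper and the handler's exact logic are placeholders for whichever LLM SDK you use:

```typescript
import * as restate from "@restatedev/restate-sdk";

// Hypothetical helper: call your LLM SDK of choice to flag content for review.
async function needsHumanReview(post: string): Promise<boolean> {
  // e.g., an OpenAI / Vercel AI / LangChain call goes here
  return true;
}

const callChainingService = restate.service({
  name: "CallChainingService",
  handlers: {
    moderate: async (ctx: restate.Context, post: string) => {
      // Run the LLM call as a durable step so it isn't repeated on retries.
      const flagged = await ctx.run("classify", () => needsHumanReview(post));
      if (!flagged) {
        return "published";
      }

      // Durable promise that a human can resolve from outside the workflow.
      const approval = ctx.awakeable<boolean>();

      // Hand the awakeable ID to the reviewer (log, email, Slack, ...).
      console.log(
        `To approve: curl localhost:8080/restate/awakeables/${approval.id}/resolve --json 'true'`
      );

      // Suspends here until the promise is resolved, surviving restarts.
      const approved = await approval.promise;
      return approved ? "published" : "rejected";
    },
  },
});

restate.endpoint().bind(callChainingService).listen(9080);
```

The awakeable ID is the handle an external system uses to resolve or reject the promise later.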
Run the example
Requirements
- An AI SDK of your choice (e.g., OpenAI, LangChain, Pydantic AI, LiteLLM) to make LLM calls.
- API key for your model provider.
Start the Service
Export the API key of your model provider as an environment variable and then start the agent. For example, for OpenAI:
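For instance, assuming the TypeScript version of the example with an npm dev script (the script name app-dev is hypothetical; use the start command for the language you chose):

```shell
export OPENAI_API_KEY=<your-openai-api-key>
npm run app-dev   # hypothetical script name; e.g., `python app.py` for the Python example
```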
Send a request
You can send the request via the UI or with curl.

In the UI (http://localhost:9070), click on the moderate handler of the CallChainingService to open the playground and send a default request:
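With curl, a request against Restate's ingress (default port 8080) could look like the following; the request body is an assumption and depends on the handler's input type:

```shell
curl localhost:8080/CallChainingService/moderate \
  --json '"Draft blog post text to moderate..."'
```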

Resolve the approval
Check the service logs to find the curl command to resolve the approval. It will look similar to the sketch below; replace {awakeable_id} with the actual ID from the logs.
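This sketch uses Restate's HTTP API for resolving awakeables; the payload ('true' here) is an assumption and depends on the value type the example expects:

```shell
curl localhost:8080/restate/awakeables/{awakeable_id}/resolve --json 'true'
```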
