How does Restate help?
The benefits of using Restate here are:
- Durable promises: Create promises that survive process restarts and failures.
- Automatic recovery: Resume exactly where you left off after any interruption.
- Broad compatibility: Works with any LLM SDK (Vercel AI, LangChain, LiteLLM, etc.) and any programming language supported by Restate (TypeScript, Python, Go, etc.).
- Cost-efficient waiting: Suspend execution during approval wait times; pay for active work, not idle time.

Example
Use `ctx.awakeable()` to create durable promises that can be resolved externally. The workflow suspends during the wait and resumes automatically when the promise is resolved.
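As an illustration, here is a minimal sketch of such a handler with the TypeScript SDK. The service and handler names match the example used below; the request/response shapes and log message are assumptions, not the example's exact code:

```typescript
import * as restate from "@restatedev/restate-sdk";

const callChainingService = restate.service({
  name: "CallChainingService",
  handlers: {
    moderate: async (ctx: restate.Context, message: string) => {
      // Create a durable promise with an ID that external callers can resolve.
      const { id, promise } = ctx.awakeable<boolean>();

      // Surface the ID (e.g., in logs) so a human can approve it via HTTP.
      console.log(`Awaiting approval for awakeable ${id}`);

      // The invocation suspends here; no compute is consumed while waiting.
      const approved = await promise;

      return approved ? `Approved: ${message}` : `Rejected: ${message}`;
    },
  },
});

restate.endpoint().bind(callChainingService).listen(9080);
```

The `await promise` line is where Restate suspends the invocation; the awakeable ID is what an external caller later uses to resolve it.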
Run the example
1. Requirements
- An AI SDK of your choice (e.g., OpenAI, LangChain, Pydantic AI, LiteLLM) to make LLM calls.
- An API key for your model provider.
2. Download the example
3. Start the Restate Server
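For reference, two common ways to run the server locally, assuming the default ports (8080 for ingress, 9070 for the UI):

```shell
# Option 1: run the single binary (installed e.g. via Homebrew or a release download)
restate-server

# Option 2: run via Docker with the default ports exposed
docker run --rm -p 8080:8080 -p 9070:9070 -p 9071:9071 \
  docker.restate.dev/restatedev/restate:latest
```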
4. Start the Service
Export the API key of your model provider as an environment variable and then start the agent. For example, for OpenAI:
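A sketch of what this looks like (the start script name is an assumption; check the example's README or package.json):

```shell
# Substitute your real key; the variable name assumes OpenAI as the provider
export OPENAI_API_KEY="sk-..."

# Start the service (script name is an assumption for this example)
npm run app-dev
```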
5. Register the services
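With the CLI, registration is a single command (9080 is the SDK's default service port; adjust if yours differs). Alternatively, register the deployment in the UI at http://localhost:9070.

```shell
# Tell the Restate Server where the service endpoint is listening
restate deployments register http://localhost:9080
```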

6. Send a request
In the UI (http://localhost:9070), click on the moderate handler of the CallChainingService to open the playground and send a default request.
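Alternatively, with curl the request looks roughly like this (the ingress routes requests as `/<ServiceName>/<handler>`; the payload is an assumption, so check the example for the handler's actual input type):

```shell
# Invoke the moderate handler through the Restate ingress (default port 8080)
curl localhost:8080/CallChainingService/moderate \
  --json '"Please review this message"'
```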

7. Resolve the approval
Check the service logs to find the curl command to resolve the approval. Replace {awakeable_id} with the actual ID from the logs.
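As a sketch, the logged command generally has this shape (assuming the default ingress port 8080):

```shell
# Resolve the durable promise with a JSON payload (here: approval = true).
# Replace {awakeable_id} with the ID printed in the service logs.
curl localhost:8080/restate/awakeables/{awakeable_id}/resolve --json 'true'
```

Once the promise is resolved, the suspended invocation resumes with this payload as the result of the awaited awakeable.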
8. Check the Restate UI
In the Invocations tab of the UI, you can see how the workflow suspends during the human approval step and resumes once the promise is resolved.
