Core Concepts
AgentLift is built around a few core concepts. Understanding these will help you get the most out of the platform.
Agents
An agent is a deployed AI service defined by a YAML configuration file (agent.yaml). Each agent has a model, system prompt, optional tools, and evaluation criteria. Agents are versioned — every push creates a new version, and you can roll back to any previous version.
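To make this concrete, here is a minimal agent.yaml sketch. The field names are illustrative assumptions, not AgentLift's documented schema:

```yaml
# Hypothetical agent.yaml sketch; field names are illustrative, not the real schema.
name: support-bot
model: gpt-4o
system_prompt: |
  You are a helpful support assistant.
tools:
  - name: lookup_order
    description: Look up an order by ID
evals:
  - type: llm-judge
    criteria: "Response is polite and answers the question."
```

Each push of a file like this would create a new agent version.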
Agents expose an OpenAI-compatible API endpoint (/v1/chat/completions), so you can use any existing OpenAI SDK or HTTP client to interact with them.
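For example, a plain HTTP client can talk to the endpoint directly. The URL and API key below are hypothetical placeholders, not real AgentLift values:

```python
# Sketch: calling an AgentLift agent over its OpenAI-compatible endpoint
# using only the Python standard library. URL and key are hypothetical placeholders.
import json
import urllib.request

AGENT_URL = "https://example-agent.agentlift.example/v1/chat/completions"  # hypothetical
API_KEY = "YOUR_AGENTLIFT_API_KEY"  # hypothetical

def build_request(messages):
    """Build an OpenAI-style chat completions request."""
    body = json.dumps({"messages": messages}).encode()
    return urllib.request.Request(
        AGENT_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_request([{"role": "user", "content": "Hello!"}])
# resp = urllib.request.urlopen(req)  # uncomment to actually send the request
print(req.full_url)
```

Any OpenAI SDK works the same way: point its base URL at the agent's endpoint and call the chat completions method as usual.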
Runtime
The AgentLift runtime is the execution environment where your agents run. Built on Cloudflare Workers, it runs agents at the edge with minimal latency and no cold starts.
When a request arrives, the runtime:
- Loads the agent config from KV storage
- Executes the agent loop (LLM calls + tool use)
- Streams the response back to the caller
- Emits traces for observability
There are no per-agent deployments. Deploying an agent writes its config to Cloudflare KV, and the shared runtime picks it up on the next request.
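The request flow above can be sketched roughly as follows. Everything here is illustrative; the real runtime is a Cloudflare Worker, and these names are not its actual internals:

```python
# Rough sketch of the runtime's per-request flow described above.
# All names and shapes here are illustrative assumptions.

KV = {"my-agent": {"model": "some-model", "system_prompt": "Be helpful."}}

def fake_llm(model, system_prompt, messages):
    # Stand-in for the real model call.
    return {"role": "assistant", "content": "Hi there!", "tool_call": None}

def emit_trace(trace):
    # Stand-in for trace emission to the control plane.
    print(f"trace: {len(trace['messages'])} messages")

def handle_request(agent_id, messages):
    config = KV[agent_id]                      # 1. load agent config from KV
    trace = {"messages": list(messages), "tool_calls": []}
    while True:                                # 2. agent loop (LLM calls + tool use)
        reply = fake_llm(config["model"], config["system_prompt"], trace["messages"])
        trace["messages"].append(reply)
        if reply.get("tool_call") is None:
            break
        trace["tool_calls"].append(reply["tool_call"])
    emit_trace(trace)                          # 4. emit trace for observability
    return reply["content"]                    # 3. response back to the caller

print(handle_request("my-agent", [{"role": "user", "content": "Hello"}]))
```

Because the shared runtime reads config on each request, a deploy is just a KV write rather than a code rollout.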
Traces
Every agent interaction is automatically captured as a trace. Traces include:
- The full conversation (messages in and out)
- Tool calls and their results
- Model latency and token usage
- Cost breakdown per request
Traces are stored in the control plane and can be viewed in the dashboard or queried via the API. Use the agentlift logs command to stream traces from the CLI.
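Putting the fields above together, a single trace record might look like this. The exact shape is an assumption for illustration, not AgentLift's documented schema:

```python
# Illustrative trace record covering the fields listed above.
# Field names are an assumption, not AgentLift's documented schema.
trace = {
    "messages": [
        {"role": "user", "content": "Where is my order?"},
        {"role": "assistant", "content": "It ships tomorrow."},
    ],
    "tool_calls": [
        {"name": "lookup_order", "args": {"id": "A123"}, "result": "ships tomorrow"},
    ],
    "latency_ms": 412,
    "tokens": {"prompt": 58, "completion": 12},
    "cost_usd": 0.0009,
}

total_tokens = trace["tokens"]["prompt"] + trace["tokens"]["completion"]
print(total_tokens)
```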
Evals
Evals let you measure agent quality systematically. Define evaluation criteria in your agent.yaml and AgentLift runs them automatically against agent responses.
Supported eval types:
- llm-judge — uses an LLM to score responses against criteria you define
- exact-match — checks for exact string matches (coming soon)
- custom — run your own evaluation function via webhook (coming soon)
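A minimal sketch of how an llm-judge style eval could score a response, with the judge stubbed out; this is illustrative, not AgentLift's actual implementation:

```python
# Sketch of an llm-judge style eval: an LLM scores a response against
# criteria from agent.yaml. The judge here is a stub, not a real model call.

def judge(criteria, response):
    # Stand-in for a real LLM judge; returns a score between 0.0 and 1.0.
    return 1.0 if response.strip() else 0.0

def run_eval(criteria, response, threshold=0.7):
    score = judge(criteria, response)
    return {"score": score, "passed": score >= threshold}

result = run_eval("Response is polite and on-topic.", "Happy to help!")
print(result)
```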
Environments
AgentLift supports multiple deployment environments per agent. Use environments to separate staging from production, run A/B tests, or manage different configurations.
Each environment gets its own endpoint, secrets, and deployment history. Promote a version from staging to production when you’re confident it’s ready.
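Since each environment has its own endpoint, clients typically just swap the base URL. The URL pattern below is a hypothetical example, not AgentLift's actual scheme:

```python
# Hypothetical per-environment endpoint URLs; the pattern is illustrative only.
def endpoint(agent: str, env: str) -> str:
    return f"https://{agent}-{env}.agentlift.example/v1/chat/completions"

staging = endpoint("support-bot", "staging")
production = endpoint("support-bot", "production")
print(staging)
print(production)
```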