# Quick Reference
A quick reference guide to Turn's core primitives, syntax, expressions, and built-in tools.
## Core Primitives
- **`turn`** — First-class unit of behaviour. `turn { ... }` defines a control cycle representing one step of agent execution. Closures are `turn(args) -> Type { ... }`.
- **`context`** — Context window management. `context.append(str)` adds to the LLM's working input. `context.system(str)` sets the system prompt. Both are per-process and isolated.
- **`remember` / `recall`** — Persistent process memory. `remember("key", value)` writes; `recall("key")` reads, returning `null` when the key is missing. Persists across `suspend` and VM restarts.
- **`call`** — Tool invocation. `call("tool_name", arg)`, or `call("tool_name", arg1, arg2, ...)` for multi-arg tools. Serves as the single suspension boundary enabling deterministic replay.
- **`infer`** — Probabilistic effect returning a typed value. `infer Type { "prompt"; }` invokes an LLM with cognitive type safety. The VM validates the response against the struct schema.
- **`confidence`** — Extracts the confidence from an Uncertain value. `if confidence result < 0.85 { ... }` gates execution on the model's certainty. A first-class VM instruction.
- **`suspend`** — Explicit durable checkpoint. `suspend;` writes the entire VM state to disk. Execution can be resumed externally from exactly this point.
- **`spawn`** — Actor-style concurrency. `spawn turn() { ... }` creates an isolated process. `spawn_link` adds bidirectional exit-signal propagation. `spawn_each(list, closure)` is the concurrent scatter/gather primitive.
- **`map` / `filter`** — Native higher-order list operations. `map(list, turn(x: T) -> R { ... })` and `filter(list, turn(x: T) -> Bool { ... })` are compiler-expanded to inline bytecode. No imports are required.
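A minimal sketch of how these primitives compose into one control cycle (the struct, prompt text, and threshold below are illustrative, not part of any real program):

```turn
struct Verdict { summary: Str, risky: Bool };

turn {
    // Shape the LLM's working input for this process.
    context.system("You are a cautious release auditor.");
    context.append("Review the latest deploy log.");

    // Typed inference: the VM validates the response against Verdict.
    let result = infer Verdict { "Summarize the log and flag any risk."; };

    // Gate on model certainty, checkpointing durably when unsure.
    if confidence result < 0.85 {
        suspend;
    }

    // Persist across suspend and VM restarts.
    remember("last_verdict", result);
}
```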
## Types
- **`Identity`** — An opaque cryptographic capability handle for Zero-Trust authentication. Cannot be coerced to string.
## Expressions & Statements
- Literals: `42`, `"hello"`, `true`, `false`, `null`, `[1, 2, 3]`, `{ "key": val }`
- Variables: `let id = expr;`
- Struct init: `Name { field: val, ..base }` (spread copies from `base`)
- Operators: `+`, `==`, `!=`, `<`, `>`, `<=`, `>=`, `and`, `or`, `!`, `~>` (Vec similarity)
- Indexing: `expr[index]` for lists and maps
- Member access: `expr.field` for structs and maps
- Control flow: `if expr { ... } else { ... }` (also usable as an expression), `while expr { ... }`
- Structs: `struct Name { field: Type };`
- Error handling: `try { ... } catch(e) { ... }` and `throw expr;`
- Message passing: `send pid, value;` and `let msg = receive;`
- Module import: `let mod = use "path/to/module";`
- Identity grant: `grant identity::oauth("name")` requests a secure Identity capability from the Turn VM host. The raw token never enters Turn memory.
- Schema absorption: `use schema::openapi("url")` performs compile-time schema absorption: it fetches an API schema and synthesizes native Turn closures at compile time.
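A short sketch exercising several of these forms together (the `Job` struct and field names are illustrative; whether an explicit field or the spread wins on overlap is not specified here):

```turn
struct Job { id: Num, status: Str };

turn {
    let base = Job { id: 1, status: "new" };
    // ..base copies the remaining fields from base.
    let done = Job { status: "done", ..base };

    try {
        if done.status != "done" {
            throw "unexpected status";
        }
    } catch(e) {
        call("echo", e);
    }
}
```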
## Built-in Tools
These tools are available directly via `call(...)`, with no import required:
| Tool | Input | Purpose |
|---|---|---|
| `call("echo", val)` | any value | Prints to stdout, returns the value |
| `call("len", val)` | Str/List/Map/Vec | Returns length as Num |
| `call("list_push", [list, item])` | list + item | Returns a new list with the item appended |
| `call("list_contains", [list, item])` | list + item | Returns Bool |
| `call("llm_generate", map)` | `{"messages": List, "model": Str?}` | Raw LLM generation via WASM driver |
> **Note:** `list_push` and `list_contains` take a two-element list `[list, item]` as their single argument.
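A quick sketch of the list tools and their single-argument calling convention, following the semantics in the table above:

```turn
turn {
    let xs = [1, 2];
    // Both tools take one two-element [list, item] argument.
    let ys = call("list_push", [xs, 3]);      // [1, 2, 3]
    let has = call("list_contains", [ys, 3]); // true
    call("echo", call("len", ys));            // 3
}
```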
## Standard Library
System I/O is provided by Turn's Standard Library: pure Turn modules embedded in the binary. Import them with `use "std/<name>"`. Capability-sensitive operations require a `grant identity` token.
```turn
let net = use "std/net";
let fs = use "std/fs";
let json = use "std/json";
let time = use "std/time";
let env = use "std/env";
let re = use "std/regex";
let math = use "std/math";
```
| Module | Function | Identity | Purpose |
|---|---|---|---|
| `std/net` | `net.get({ url, identity })` | `network(...)` | HTTP GET |
| `std/net` | `net.post({ url, body, identity })` | `network(...)` | HTTP POST |
| `std/fs` | `fs.read({ path, identity })` | `filesystem(...)` | Read file |
| `std/fs` | `fs.write({ path, content, identity })` | `filesystem(...)` | Write file |
| `std/json` | `json.parse(str)` | none | Parse JSON string |
| `std/json` | `json.stringify(val)` | none | Serialize to JSON string |
| `std/time` | `time.now()` | none | Unix epoch as Num |
| `std/time` | `time.sleep(seconds)` | none | Pause execution |
| `std/env` | `env.get({ key, identity })` | `environment(...)` | Read env variable |
| `std/env` | `env.set({ key, value, identity })` | `environment(...)` | Set env variable |
| `std/regex` | `re.matches(pattern, text)` | none | Returns Bool |
| `std/regex` | `re.replace(pattern, text, replacement)` | none | Returns Str |
| `std/math` | `math.max(a, b)` | none | Larger of two Num |
| `std/math` | `math.min(a, b)` | none | Smaller of two Num |
| `std/math` | `math.abs(n)` | none | Absolute value |
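A sketch of a capability-gated HTTP fetch. The URL, grant argument, and response field are illustrative, and the `identity::network(...)` grant form is an assumption inferred from the Identity column above; only `identity::oauth("name")` appears explicitly in this reference:

```turn
let net = use "std/net";
let json = use "std/json";

turn {
    // Capability-sensitive calls need a grant identity token;
    // the raw credential never enters Turn memory.
    let id = grant identity::network("api.example.com");

    let body = net.get({ "url": "https://api.example.com/status", "identity": id });
    let data = json.parse(body);
    call("echo", data.status);
}
```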
See Standard Library for full documentation.
## LLM Providers
Set the `TURN_LLM_PROVIDER` environment variable to route inference to the appropriate WASM driver:
| Provider | `TURN_LLM_PROVIDER` value | Required env vars |
|---|---|---|
| OpenAI | `openai` | `OPENAI_API_KEY` |
| Anthropic | `anthropic` | `ANTHROPIC_API_KEY` |
| Google Gemini | `gemini` | `GEMINI_API_KEY` |
| xAI Grok | `grok` | `XAI_API_KEY` |
| Ollama (local) | `ollama` | none |
| Azure OpenAI | `azure_openai` | `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_DEPLOYMENT` |
| Azure Anthropic | `azure_anthropic` | `AZURE_ANTHROPIC_ENDPOINT`, `AZURE_ANTHROPIC_API_KEY` |
| Azure OpenAI Responses | `azure_openai_responses` | `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_DEPLOYMENT` |