Quick Reference

A quick reference guide to Turn's core primitives, syntax, expressions, and built-in tools.


Core Primitives

turn

First-class unit of behaviour. turn { ... } defines a control cycle representing one step of agent execution. Closures take the form turn(args) -> Type { ... }.
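A minimal sketch (treating the final expression of a closure body as its return value is an assumption):

```
turn {
    context.append("one step of agent execution");
};

let add_one = turn(x: Num) -> Num { x + 1 };
```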

context.append / context.system

Context window management. context.append(str) adds to the LLM's working input. context.system(str) sets the system prompt. Both are per-process and isolated.
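For example:

```
context.system("You are a concise research assistant.");
context.append("User question: summarize the last release.");
```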

remember / recall

Persistent process memory. remember("key", value) writes; recall("key") reads. Returns null when key is missing. Persists across suspend and VM restarts.
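A sketch of a counter that survives suspend and VM restarts:

```
let count = recall("visits");
if count == null {
    remember("visits", 1);
} else {
    remember("visits", count + 1);
}
```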

call

Tool invocation. call("tool_name", arg) or call("tool_name", arg1, arg2, ...) for multi-arg tools. Serves as the single suspension boundary enabling deterministic replay.
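For example, with the tools listed under Built-in Tools:

```
let echoed = call("echo", "hello");
let n = call("len", [1, 2, 3]);
```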

infer

Probabilistic effect returning a typed value. infer Type { "prompt"; } invokes an LLM with cognitive type safety. The VM validates the response against the struct schema.
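A sketch using a user-defined struct as the target type (the struct and prompt are illustrative):

```
struct Sentiment { label: Str, score: Num };

let s = infer Sentiment { "Classify the sentiment of: 'the build is green again'"; };
```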

confidence

Extract confidence from an Uncertain value. if confidence result < 0.85 { ... } gates execution on the model's certainty. First-class VM instruction.
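A sketch gating on certainty (the Answer struct and the 0.85 threshold are illustrative):

```
struct Answer { text: Str };

let result = infer Answer { "Summarize the request in one sentence."; };
if confidence result < 0.85 {
    context.append("Confidence below threshold; escalating.");
}
```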

suspend

Explicit durable checkpoint. suspend; writes the entire VM state to disk. Execution can be resumed externally from exactly this point.
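For example, state written with remember is still available when execution resumes past the checkpoint:

```
remember("phase", "pre-checkpoint");
suspend;
let phase = recall("phase");
```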

spawn / spawn_link / spawn_each

Actor-style concurrency. spawn turn() { ... } creates an isolated process. spawn_link adds bidirectional exit signal propagation. spawn_each(list, closure) is the concurrent scatter/gather primitive.
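A sketch, assuming spawn returns a process id usable with send, and that spawn_each gathers its results into a list:

```
let worker = spawn turn() {
    let task = receive;
    call("echo", task);
};
send worker, "fetch-report";

let results = spawn_each([1, 2, 3], turn(n: Num) -> Num { n + 10 });
```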

map / filter

Native higher-order list operations. map(list, turn(x: T) -> R { ... }) and filter(list, turn(x: T) -> Bool { ... }) are compiler-expanded to inline bytecode. No imports are required.
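For example:

```
let nums = [1, 2, 3, 4];
let bumped = map(nums, turn(x: Num) -> Num { x + 1 });
let small = filter(nums, turn(x: Num) -> Bool { x < 3 });
```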


Types

  • Identity: An opaque cryptographic capability handle for Zero-Trust authentication. Cannot be coerced to string.

Expressions & Statements

  • Literals: 42, "hello", true, false, null, [1, 2, 3], { "key": val }
  • Variables: let id = expr;
  • Struct init: Name { field: val, ..base } (spread copies from base)
  • Operators: +, ==, !=, <, >, <=, >=, and, or, !, ~> (Vec similarity)
  • Indexing: expr[index] for lists and maps
  • Member access: expr.field for structs and maps
  • Control flow: if expr { ... } else { ... } (also usable as expression), while expr { ... }
  • Structs: struct Name { field: Type };
  • Error handling: try { ... } catch(e) { ... } and throw expr;
  • Message passing: send pid, value; and let msg = receive;
  • Module import: let mod = use "path/to/module";
  • Identity grant: grant identity::oauth("name") requests a secure Identity capability from the Turn VM host. The raw token never enters Turn memory.
  • Schema absorption: use schema::openapi("url") performs compile-time schema absorption. It fetches an API schema and synthesizes native Turn closures at compile time.
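Several of the forms above in one sketch (the struct fields and thrown value are illustrative):

```
struct Job { id: Num, retries: Num };

let base = Job { id: 1, retries: 0 };
let retried = Job { retries: 1, ..base };

try {
    throw "transient failure";
} catch(e) {
    context.append(e);
}
```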

Built-in Tools

These tools are available directly via call(...) with no import required:

| Tool | Input | Purpose |
| --- | --- | --- |
| call("echo", val) | any value | Print to stdout, returns the value |
| call("len", val) | Str/List/Map/Vec | Returns length as Num |
| call("list_push", [list, item]) | list + item | Returns new list with item appended |
| call("list_contains", [list, item]) | list + item | Returns Bool |
| call("llm_generate", map) | {"messages": List, "model": Str?} | Raw LLM generation via WASM driver |

NOTE

list_push and list_contains take a two-element list [list, item] as their single argument.
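For example:

```
let xs = call("list_push", [[1, 2], 3]);
let has = call("list_contains", [xs, 3]);
```

Here xs is [1, 2, 3] and has is true.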


Standard Library

System I/O is provided by Turn's Standard Library, a set of pure Turn modules embedded in the binary. Import them with use "std/<name>". Capability-sensitive operations require an identity token obtained via grant.

```
let net  = use "std/net";
let fs   = use "std/fs";
let json = use "std/json";
let time = use "std/time";
let env  = use "std/env";
let re   = use "std/regex";
let math = use "std/math";
```
| Module | Function | Identity | Purpose |
| --- | --- | --- | --- |
| std/net | net.get({ url, identity }) | network(...) | HTTP GET |
| std/net | net.post({ url, body, identity }) | network(...) | HTTP POST |
| std/fs | fs.read({ path, identity }) | filesystem(...) | Read file |
| std/fs | fs.write({ path, content, identity }) | filesystem(...) | Write file |
| std/json | json.parse(str) | none | Parse JSON string |
| std/json | json.stringify(val) | none | Serialize to JSON string |
| std/time | time.now() | none | Unix epoch as Num |
| std/time | time.sleep(seconds) | none | Pause execution |
| std/env | env.get({ key, identity }) | environment(...) | Read env variable |
| std/env | env.set({ key, value, identity }) | environment(...) | Set env variable |
| std/regex | re.matches(pattern, text) | none | Returns Bool |
| std/regex | re.replace(pattern, text, replacement) | none | Returns Str |
| std/math | math.max(a, b) | none | Larger of two Num |
| std/math | math.min(a, b) | none | Smaller of two Num |
| std/math | math.abs(n) | none | Absolute value |
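A sketch of a capability-gated file read. The identity::filesystem grant form and its scope string are assumptions extrapolated from the Identity column above; only identity::oauth is documented on this page:

```
let fs = use "std/fs";

let id = grant identity::filesystem("workspace");
let content = fs.read({ "path": "notes.txt", "identity": id });
```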

See Standard Library for full documentation.


LLM Providers

Set the TURN_LLM_PROVIDER environment variable to route inference to the appropriate WASM driver:
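For example, to route inference through the Anthropic driver (key value elided):

```shell
export TURN_LLM_PROVIDER=anthropic
export ANTHROPIC_API_KEY=...
```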

| Provider | TURN_LLM_PROVIDER value | Required env vars |
| --- | --- | --- |
| OpenAI | openai | OPENAI_API_KEY |
| Anthropic | anthropic | ANTHROPIC_API_KEY |
| Google Gemini | gemini | GEMINI_API_KEY |
| xAI Grok | grok | XAI_API_KEY |
| Ollama (local) | ollama | none |
| Azure OpenAI | azure_openai | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, AZURE_OPENAI_DEPLOYMENT |
| Azure Anthropic | azure_anthropic | AZURE_ANTHROPIC_ENDPOINT, AZURE_ANTHROPIC_API_KEY |
| Azure OpenAI Responses | azure_openai_responses | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, AZURE_OPENAI_DEPLOYMENT |

Next Steps