The infer Primitive
infer is the most important statement in Turn. It marks the boundary between deterministic computation and stochastic intelligence, and it is a first-class language construct rather than just a function call.
What infer Actually Does
In most frameworks, LLM inference looks like this:
response = llm.chat("Analyze this review: " + review)
# response is a str. You parse it. You hope it's JSON. You handle ValueError.

In Turn, infer is a typed statement:
struct Sentiment {
    score: Num,
    label: Str,
    reasoning: Str
};
let result = infer Sentiment { "I love this product!"; };
// result.score is a Num. result.label is a Str.
// The VM guarantees structural correctness - not just at runtime, but semantically.

The difference is not syntactic sugar. infer is a VM trap: it suspends the process, delegates computation to a sandboxed Wasm provider, validates the response against a compile-time JSON Schema, and then resumes with a structurally guaranteed value.
Cognitive Type Safety
When you write infer Sentiment { ... }, the Turn compiler does something remarkable at compile time: it generates a full JSON Schema from the Sentiment struct definition and bakes it into the bytecode as part of the inference call.
This schema travels to the LLM provider as a response_format constraint. The model is forced to produce output that matches it. The Turn VM validates the JSON before binding it to your type.
struct Analysis {
    summary: Str,
    certainty: Num,
    tags: List, // List of strings
    action: Str
};
let doc = "Q4 revenue declined 12%, primarily driven by slowing enterprise deals in APAC.";
let result = infer Analysis {
    "Analyze this business document and extract the key findings: " + doc;
};
// All fields are fully typed; no manual parsing or casting needed
call("echo", "Summary: " + result.summary);
call("echo", "Certainty: " + result.certainty);
call("echo", "Action: " + result.action);
// Structured access to list fields
if result.certainty < 0.6 {
    call("echo", "⚠ Low certainty requires human review");
}

If infer Struct { ... } completes without error, the bound value structurally conforms to the declared type. Field names, types, and nesting are all guaranteed. A field access on an infer result cannot produce a type mismatch at runtime.
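The nesting guarantee is worth spelling out with an example. The sketch below assumes Turn allows one struct type to appear as a field of another; the Verdict and Review types (and the claim text) are illustrative, not from the examples above:

struct Verdict {
    label: Str,
    confidence: Num
};

struct Review {
    verdict: Verdict, // nested struct-typed field
    rationale: Str
};

let claim = "Our churn dropped 40% after the redesign.";
let r = infer Review {
    "Assess the plausibility of this claim: " + claim;
};

// Nested field access is type-safe all the way down
call("echo", "Label: " + r.verdict.label);

If validation succeeds, r.verdict is itself a fully validated Verdict, so the chained access cannot fail at runtime.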
Anatomy of an infer Statement
let <binding> = infer <Type> {
    <prompt_expression>;
};

The prompt expression can be any Turn expression that evaluates to a string. You can concatenate variables, format values, and compose prompts dynamically:
let user_query = recall("last_question");
let product_name = recall("product");
let result = infer Summary {
    "Summarize the following user question about " + product_name + ": " + user_query;
};

Inference Without a Schema (Free-Form)
When you don't need a typed response, you can infer without a struct. The result is a raw string:
let poem = infer Str {
    "Write a haiku about distributed systems.";
};
call("echo", poem);

Use this sparingly. Prefer typed infer whenever the output will be compared, assigned to a struct field, or returned, so that the VM can enforce correctness.
Context and Working Memory in Prompts
The Turn VM automatically enriches infer calls with the agent's context window. Anything you've appended with context.append() is included in the prompt, in priority-stack order.
context.append("You are an expert financial analyst.");
context.append("Clients have conservative risk profiles.");
// The LLM receives the context above + the user prompt
let result = infer Portfolio {
    "Allocate $50,000 across 5 asset classes for a 10-year growth horizon.";
};

Any values you've written with remember can be retrieved with recall and added to context manually, giving agents explicit control over what prior knowledge is available to each inference call.
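A hedged sketch of that remember/recall pattern, using only the constructs shown above (the "risk_profile" key is illustrative, not a built-in):

remember("risk_profile", "conservative");

// Later, possibly after the process has been suspended and resumed:
let profile = recall("risk_profile");
context.append("Client risk profile: " + profile);

let result = infer Portfolio {
    "Allocate $50,000 across 5 asset classes.";
};

Because the recall happens in agent code rather than inside the VM, the agent decides exactly which remembered values reach each inference call.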
Handling Inference Errors
infer is stochastic. The model might fail, the provider might be rate-limited, or the schema validation might fail on a malformed response. Wrap inference in try/catch to handle failures gracefully:
try {
    let result = infer Sentiment { review; };
    call("echo", "Score: " + result.score);
} catch(e) {
    call("echo", "Inference failed: " + e);
}

NOTE
Turn's VM includes a hidden JSON recovery loop. If the LLM produces malformed JSON (stray backticks, trailing commas, or a missing field), the VM automatically strips the artifact and re-prompts with the exact validation error (up to 3 retries) before surfacing a failure. You typically never see raw parse errors.
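When the retry budget is exhausted and the error does surface, one option is to degrade to free-form inference inside the catch block. A minimal sketch, assuming the Analysis type and doc variable from the earlier example:

try {
    let result = infer Analysis {
        "Analyze this business document and extract the key findings: " + doc;
    };
    call("echo", "Summary: " + result.summary);
} catch(e) {
    // Typed inference failed even after the VM's automatic retries;
    // fall back to an untyped one-line summary rather than aborting.
    let raw = infer Str { "Summarize this document in one sentence: " + doc; };
    call("echo", "Fallback summary: " + raw);
}

The fallback trades the structural guarantee for availability, which is often the right call in non-critical paths.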
Performance Characteristics
infer suspends the Turn process (not an OS thread) while the HTTP call is in flight. The VM serializes its continuation and the Tokio scheduler picks up other work. From the programmer's perspective, infer is synchronous; from the runtime's perspective, it is fully async.
- Cold start: ~0μs (Wasm module is pre-loaded at VM startup)
- Network round-trip: depends on provider and model
- Schema validation: O(n) in response size, typically under 1ms