Ecosystem Bridges
Every AI framework eventually hits the same problem: how do you let a probabilistic model interact with the deterministic, structured world?
The industry's current answer is to build massive, fragile scaffolding: CLI integration layers that let LLMs write bash scripts, JIT compilers that generate SDK wrappers on the fly, or manual HTTP boilerplate generation.
Turn solves this at the compiler level. We believe that interacting with an API should require zero runtime scaffolding.
Turn relies on four foundational primitives to absorb the entire ecosystem natively: the grant identity primitive, Compile-Time Schema Adapters, the universal infer Struct fallback, and Wasm Component FFI.
1. Zero-Trust Authentication: grant identity
In traditional frameworks, API keys and OAuth tokens are passed around as strings in the agent's memory or environment variables. If an agent is exposed to a prompt injection attack, the attacker can simply ask the agent to print or email the raw token.
Turn introduces Identity as a Language Primitive.
Instead of reading a string, your Turn code requests an opaque Cryptographic Capability. The Turn VM (the secure Rust host) manages the actual OAuth state machine, token refreshes, and secret storage. The agent's memory only holds an unforgeable reference.
```
let net = use "std/net";

// Request an identity capability from the VM host.
// The raw OAuth token never enters the agent's memory.
let my_google = grant identity::oauth("google_workspace");

// Pass the capability through the Standard Library network module.
// The Turn VM intercepts at the HTTP boundary, retrieves the real
// token from the host environment, and injects the Bearer header automatically.
let events = net.get({
    "url": "https://www.googleapis.com/calendar/v3/users/me/calendarList",
    "identity": my_google
});
```

By moving identity management into the VM, Turn ensures absolute Zero-Trust execution. The LLM never sees the credentials.
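The host-side mechanics behind this can be sketched in plain Python. Everything here (`IdentityHost`, `grant`, `inject_auth`, the token value) is illustrative, not a real Turn VM API; the point is only that the agent-visible value is an opaque handle, and the raw secret is resolved inside the host at the HTTP boundary:

```python
import secrets


class IdentityHost:
    """Illustrative stand-in for the VM's secret store (not a real Turn API)."""

    def __init__(self):
        self._secrets = {}  # handle -> raw token; never exposed to the agent

    def grant(self, provider, raw_token):
        # The agent receives only this opaque, unguessable handle.
        handle = f"cap:{provider}:{secrets.token_hex(8)}"
        self._secrets[handle] = raw_token
        return handle

    def inject_auth(self, request_headers, handle):
        # Called at the HTTP boundary: the host resolves the handle and
        # attaches the Bearer header; the agent never sees the token itself.
        token = self._secrets[handle]
        return {**request_headers, "Authorization": f"Bearer {token}"}


host = IdentityHost()
cap = host.grant("google_workspace", "ya29.real-oauth-token")
headers = host.inject_auth({"Accept": "application/json"}, cap)
```

Because the handle carries no secret material, printing or exfiltrating it gives an attacker nothing usable outside this host process.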
2. Compile-Time Schema Absorption: use schema::<protocol>
When an API is well-documented and structured (like OpenAPI, GraphQL, or FHIR), writing manual http_get requests and parsing JSON is an anti-pattern. You shouldn't have to build custom "function registries."
Turn features a built-in Compile-Time Schema Absorption macro system. During compilation, the use schema macro fetches the remote schema, parses it natively, and synthesizes pure Turn structs and closures on the fly.
It behaves exactly like a native library import.
```
// The Turn compiler fetches the schema, parses the paths, and synthesizes
// a native Turn module with fully typed functions at compile time.
let petstore = use schema::openapi("https://petstore.swagger.io/v2/swagger.json");

// The agent interacts with it exactly like a native function.
// No HTTP boilerplate, no JSON parsing, no manual string manipulation.
let available_pets = petstore.find_pets_by_status("available");
```

By absorbing schemas at compile time, the Turn Virtual Machine doesn't need to carry massive SDK dependencies. The output is a highly optimized bytecode closure that does exactly what the schema dictated. The openapi adapter is shipped by default, with graphql, fhir, and mcp adapters in active development.
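A rough intuition for what the macro does can be sketched in Python. The miniature schema below is hand-written (the real adapter consumes a full OpenAPI document fetched at compile time), and `absorb` is a hypothetical name; the idea is that each `operationId` becomes a closure over its path, method, and parameter list:

```python
# Hand-written miniature of an OpenAPI fragment (illustrative only).
SCHEMA = {
    "servers": [{"url": "https://petstore.swagger.io/v2"}],
    "paths": {
        "/pet/findByStatus": {
            "get": {
                "operationId": "findPetsByStatus",
                "parameters": [{"name": "status", "in": "query"}],
            }
        }
    },
}


def absorb(schema):
    """Synthesize one callable per operationId, closing over path and method."""
    base = schema["servers"][0]["url"]
    module = {}
    for path, ops in schema["paths"].items():
        for method, op in ops.items():
            params = [p["name"] for p in op.get("parameters", [])]

            def call(*args, _m=method, _p=path, _params=params):
                # Build the request description the closure would execute.
                query = "&".join(f"{k}={v}" for k, v in zip(_params, args))
                return f"{_m.upper()} {base}{_p}?{query}"

            module[op["operationId"]] = call
    return module


petstore = absorb(SCHEMA)
request = petstore["findPetsByStatus"]("available")
```

The difference in Turn is that this synthesis happens in the compiler, so the generated closures ship as typed bytecode rather than runtime reflection.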
3. The Universal Fallback: infer Struct
While schemas are perfect for deterministic enterprise APIs, much of the world is unstructured, undocumented, or chaotic. For these scenarios, Turn leans entirely on its native cognitive type system.
Instead of writing a custom regex parser or a fragile retry loop for an undocumented API, you simply make a raw request and let the infer engine coerce the chaos into typed memory.
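Mechanically, infer is a model-backed coercion into a declared shape. With the model call stubbed out, the typed-coercion half can be sketched in Python; the dataclass, the `infer` helper, and the `extracted` dict are all illustrative stand-ins, not Turn internals:

```python
from dataclasses import dataclass, fields


@dataclass
class PatientData:
    id: str
    condition: str
    certainty: float


def infer(cls, raw: dict):
    """Coerce an untyped extraction result into the declared struct,
    failing loudly if a field is missing or cannot be converted."""
    kwargs = {}
    for f in fields(cls):
        if f.name not in raw:
            raise ValueError(f"missing field: {f.name}")
        kwargs[f.name] = f.type(raw[f.name])  # e.g. str -> float for certainty
    return cls(**kwargs)


# Stand-in for the model's extraction from the raw HTML dump.
extracted = {"id": "123", "condition": "stable angina", "certainty": "0.87"}
patient = infer(PatientData, extracted)
```

In Turn itself, the shape declaration and the coercion collapse into a single infer expression, as the example below shows.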
```
let net = use "std/net";

struct PatientData {
    id: Str,
    condition: Str,
    certainty: Num
};

let id = grant identity::network("public");

let raw_html_dump = net.get({
    "url": "https://legacy.hospital.com/record/123",
    "identity": id
});

let patient = infer PatientData {
    "Extract patient ID, core condition, and diagnostic certainty from this record: " + raw_html_dump;
};

call("echo", "Patient condition: " + patient.condition);
```

4. Wasm Component FFI: Safe Logic Execution
When you need to execute deterministic, heavy logic (like a cryptographic hash, a complex math library, or custom protocol bridging), asking an LLM to generate and run a shell command is an architectural trap.
Turn solves "CLI domestication" through WebAssembly (Wasm) Component FFI.
Instead of shelling out to a binary, you mount a sandboxed Wasm component. Turn exposes its exported functions as native Turn closures.
```
// Mount a sandboxed Wasm component
let math = use "math.wasm";

// Call it exactly like a native Turn function
let result = math.calculate_trajectory(15.0, 27.5);
```

Why Wasm?
- Strict Sandboxing: Wasm modules run in isolated linear memory. They cannot access the host filesystem or network unless you explicitly grant capabilities.
- Native Types: Wasm types map flawlessly to Turn types (Num, Str, Struct).
- No Shell Injections: The agent doesn't hallucinate bash flags. It calls a typed function.
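The "typed function, no shell" property can be sketched in Python without a real Wasm runtime. The `Component` class, its signature table, and the trajectory formula are all illustrative, not Turn's actual loader; the point is that the agent can only invoke declared exports, and every argument is checked against the declared type before the call crosses the boundary:

```python
import math


class Component:
    """Illustrative typed FFI boundary (not the Turn VM's real Wasm loader)."""

    def __init__(self, exports, signatures):
        self._exports = exports  # name -> function
        self._sigs = signatures  # name -> (argument types, return type)

    def call(self, name, *args):
        if name not in self._sigs:
            raise LookupError(f"no such export: {name}")
        arg_types, ret_type = self._sigs[name]
        if len(args) != len(arg_types) or not all(
            isinstance(a, t) for a, t in zip(args, arg_types)
        ):
            raise TypeError(f"{name} expects arguments of types {arg_types}")
        result = self._exports[name](*args)
        assert isinstance(result, ret_type)
        return result


# A toy "math component": projectile range from speed (m/s) and angle (degrees).
mathc = Component(
    exports={
        "calculate_trajectory": lambda v, deg: v * v * math.sin(2 * math.radians(deg)) / 9.81
    },
    signatures={"calculate_trajectory": ((float, float), float)},
)
result = mathc.call("calculate_trajectory", 15.0, 27.5)
```

A string like `"rm -rf /"` is rejected at the type check: there is no shell on the other side of the boundary to interpret it.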
The Model Context Protocol (MCP) Bridge
Turn natively supports the Model Context Protocol (MCP), treating any MCP server as a native module. Because MCP standardizes tool and resource definitions via JSON-RPC, Turn can seamlessly translate MCP schemas into native Turn closures via the use schema::mcp macro (currently in active development, alongside the graphql and fhir adapters).
This means the Turn ecosystem is immediately compatible with all community-built MCP servers, from Google Drive integrations to local database introspection, without writing a single line of adapter code.
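The translation itself is mechanical, and can be sketched in Python. The tool definition, `FakeTransport`, and `bind_tool` below are illustrative stand-ins (a real client would discover tools via `tools/list` over a real transport); what carries over is wrapping each MCP tool as a plain callable that speaks `tools/call` JSON-RPC underneath:

```python
import json

# Hand-written MCP-style tool definition (real ones come from tools/list).
TOOL = {
    "name": "search_files",
    "inputSchema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}


class FakeTransport:
    """In-process stand-in for an MCP server speaking JSON-RPC."""

    def send(self, payload: str) -> str:
        req = json.loads(payload)
        assert req["method"] == "tools/call"
        args = req["params"]["arguments"]
        return json.dumps(
            {"jsonrpc": "2.0", "id": req["id"],
             "result": {"content": [f"hit for {args['query']}"]}}
        )


def bind_tool(tool, transport, _counter=[0]):
    """Wrap one MCP tool definition as a plain callable (a closure)."""
    required = tool["inputSchema"].get("required", [])

    def call(**kwargs):
        for name in required:
            if name not in kwargs:
                raise TypeError(f"missing required argument: {name}")
        _counter[0] += 1
        payload = json.dumps(
            {"jsonrpc": "2.0", "id": _counter[0], "method": "tools/call",
             "params": {"name": tool["name"], "arguments": kwargs}}
        )
        return json.loads(transport.send(payload))["result"]

    return call


search_files = bind_tool(TOOL, FakeTransport())
result = search_files(query="quarterly report")
```

The caller never touches JSON-RPC framing, which is exactly the property use schema::mcp aims to give Turn agents at compile time.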