Probabilistic Routing
LLMs are inherently stochastic. Even with Cognitive Type Safety (infer Struct), an LLM might return an answer that structurally conforms to the required schema but is factually or contextually uncertain. Turn solves this problem at the language level via Probabilistic Routing.
In Turn, inference is not just about obtaining a typed result. It is also about managing the model's confidence. The confidence operator allows you to access the underlying probability of any variable and guarantee a deterministic fallback path if the provider's reported confidence falls below your specified threshold.
The confidence Operator
A probabilistic threshold is an explicit confidence boundary. The LLM provider assigns a float scalar in [0.0, 1.0], derived from token log-probabilities, representing its certainty in the response. If the confidence fails to clear the threshold, you can use standard control flow to abort the primary assignment and fall back safely.
struct Sentiment {
    score: Num,
    reasoning: Str
};

// Execute inference
let analysis = infer Sentiment {
    "Categorize the following sentiment: 'It was okay, I guess.'";
};

// Route execution if the model is less than 80% confident
if confidence analysis < 0.80 {
    // Deterministic fallback
    return Sentiment {
        score: 0.5,
        reasoning: "Automated Fallback: Model uncertainty triggered."
    };
}

call("echo", "Final Reasoning: " + analysis.reasoning);
TIP
The fallback block is completely deterministic. It executes natively in the Turn VM (zero network latency) and enforces structural type safety. The block must assign a value that strictly conforms to the expected structural type constraint of the originating infer statement.
Handling Total Provider Failures
Probabilistic Routing isn't just for low-confidence scores; you can use it to build a safety net for the entire stochastic operation. The conditional block is triggered under any of the following failure scenarios:
- Provider Confidence Failure: The returned score is explicitly below your scalar threshold (e.g. < 0.80).
- Schema Coercion Failure: After exhausting all automatic retry loops, the LLM keeps returning JSON that does not match the schema.
- Provider Outage: The upstream LLM API goes down, rate-limits, or returns a non-200 HTTP status code.
Instead of writing verbose try/catch or match blocks for network I/O, Turn treats every non-deterministic failure state the same as a low-confidence response: all of them are caught by a single if confidence < threshold check.
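To make the idea concrete, here is a minimal host-side sketch of how every failure mode could collapse onto the same confidence scale. The function and its arguments are hypothetical illustrations, not Turn's actual host API.

```python
# Hypothetical sketch: map every failure state onto [0.0, 1.0] so a single
# threshold check covers outages, schema failures, and low confidence alike.

def effective_confidence(status_code, parsed, provider_confidence):
    """Return the confidence score the threshold check would see."""
    if status_code != 200:       # provider outage, rate limit, non-200 code
        return 0.0
    if parsed is None:           # schema coercion failed after all retries
        return 0.0
    return provider_confidence   # normal case: provider-reported score

# A healthy response keeps its reported score...
assert effective_confidence(200, {"score": 0.5}, 0.93) == 0.93
# ...while an outage or a schema failure both read as zero confidence.
assert effective_confidence(503, None, 0.93) == 0.0
assert effective_confidence(200, None, 0.93) == 0.0
```

Because an outage or schema failure reads as confidence 0.0, it can never clear any positive threshold, so the fallback branch is guaranteed to run.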
Untyped Fallbacks
The confidence operator also applies seamlessly to untyped infer calls (raw generic Strings).
// If the model cannot generate a suitable string with 90% confidence
let joke = infer Str {
    "Tell me a highly original joke about WebAssembly.";
};

if confidence joke < 0.90 {
    return "I couldn't think of a good Wasm joke. I guess it got lost in the binary translation.";
}

call("echo", joke);
Why Language-Level Support?
In traditional frameworks, handling confidence branching requires manual orchestration:
- Asking the LLM to output a `"confidence_score"` field inside its JSON response.
- Writing manual `if/else` logic after parsing.
- Dealing with the edge case where the LLM forgets the score field or hallucinates a `"high"` string instead of a float.
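The manual pattern described above might look like the following Python sketch. The field name `confidence_score` comes from the bullet list; the function name and fallback shape are illustrative.

```python
import json

# Manual orchestration without language-level support: the model is asked
# to include a "confidence_score" field, and the caller must defend
# against every way that request can go wrong.

FALLBACK = {"score": 0.5, "reasoning": "Automated fallback."}

def route(raw_response, threshold=0.80):
    try:
        data = json.loads(raw_response)
    except json.JSONDecodeError:
        return FALLBACK                     # model emitted broken JSON
    conf = data.get("confidence_score")     # model may omit the field...
    if not isinstance(conf, (int, float)):  # ...or hallucinate "high"
        return FALLBACK
    if conf < threshold:
        return FALLBACK
    return data

assert route('{"score": 0.9, "confidence_score": 0.95}')["score"] == 0.9
assert route('{"score": 0.9, "confidence_score": "high"}') == FALLBACK
assert route('not json at all') == FALLBACK
```

Every branch in this sketch is boilerplate that Turn's confidence operator absorbs into the runtime.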
Turn handles routing mechanically inside the Wasm Driver boundary. The host extracts top-level token probabilities and evaluates the branch natively. The contract is strict: either you get a guaranteed, typed value above your confidence threshold, or you execute the exact fallback logic you dictated. There are no silent failures.
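As an illustration of the host's side of this contract, one common way to reduce per-token log-probabilities to a single scalar is the geometric mean of token probabilities, i.e. exp of the mean log-probability. This sketch shows the idea only; it is not a statement of how Turn's Wasm Driver actually computes its score.

```python
import math

# Reduce per-token log-probabilities to one scalar in [0.0, 1.0] using the
# geometric mean of token probabilities: exp(mean(logprobs)). This is one
# plausible scheme, assumed here for illustration.

def sequence_confidence(token_logprobs):
    if not token_logprobs:
        return 0.0
    mean_logprob = sum(token_logprobs) / len(token_logprobs)
    return math.exp(mean_logprob)

# Tokens the model was certain of (log-prob ~ 0) yield confidence near 1.0...
assert sequence_confidence([0.0, 0.0]) == 1.0
# ...while uncertain tokens drag the aggregate score below the threshold.
assert sequence_confidence([-1.0, -3.0]) < 0.80
```

Using the geometric mean keeps the score in [0.0, 1.0] and penalizes a single very uncertain token more than an arithmetic mean of probabilities would.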
Next Steps
- Inference Providers: the Wasm sandbox architecture and how to write a driver.
- Context Window: the token-budgeted priority stack for working memory.