# zyn

> Type-safe LLM orchestration framework with composable reliability patterns
## The Problem
Working with Large Language Models presents unique challenges:
- **Unstructured outputs** - LLMs return free-form text, requiring parsing and validation
- **Unreliable responses** - Rate limits, timeouts, and malformed JSON are common
- **Prompt drift** - Ad-hoc prompting leads to inconsistent behavior over time
- **No type safety** - Runtime errors from unexpected response formats
- **Context management** - Multi-turn conversations require careful state handling
## The Solution
zyn provides structured, type-safe LLM interactions through synapses - specialized units that handle specific interaction patterns with compile-time guarantees.
```go
// Type-safe extraction with automatic validation
type Contact struct {
	Name  string `json:"name"`
	Email string `json:"email"`
}

func (c Contact) Validate() error {
	if c.Email == "" {
		return fmt.Errorf("email required")
	}
	return nil
}

extractor, _ := zyn.Extract[Contact]("contact information", provider)
contact, err := extractor.Fire(ctx, session, "John at john@example.com")
// contact is Contact{Name: "John", Email: "john@example.com"}
// Validation runs automatically - err is non-nil if Email is empty
```
## Core Philosophy
### Synapses Over Prompts
Instead of crafting prompts, you declare intent through synapses:
| Pattern | Synapse | Input | Output |
|---|---|---|---|
| Yes/No decision | `Binary` | `string` | `bool` |
| Categorization | `Classification` | `string` | `string` |
| Data extraction | `Extract[T]` | `string` | `T` |
| Text transformation | `Transform` | `string` | `string` |
| Structured analysis | `Analyze[T]` | `T` | `string` |
| Type conversion | `Convert[T,U]` | `T` | `U` |
| Ordering | `Ranking` | `string` | `string` |
| Emotional analysis | `Sentiment` | `string` | `SentimentResult` |
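For instance, a `Binary` synapse turns a yes/no question into a typed `bool`. The sketch below reuses the constructor and `Fire` shapes shown elsewhere in this document; the spam-check question and input text are illustrative, and `provider`, `ctx`, and `session` are assumed to be set up as in the other examples:

```go
// Sketch: a Binary synapse declared once, fired with typed results.
spamCheck, err := zyn.Binary("is this message spam?", provider)
if err != nil {
	log.Fatal(err)
}

// Fire returns a bool - no string parsing at the call site.
isSpam, err := spamCheck.Fire(ctx, session, "WIN A FREE CRUISE!!! Click now")
if err != nil {
	log.Fatal(err)
}
if isSpam {
	// route to the spam folder
}
```

The same declare-then-fire shape applies to every synapse in the table; only the input and output types change.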
### Sessions for Context
Sessions manage conversation history across synapse calls:
```go
session := zyn.NewSession()

// First call
classifier.Fire(ctx, session, "I love this product!")
// Session: [user: "I love this product!", assistant: {sentiment analysis}]

// Second call sees previous context
followup.Fire(ctx, session, "What did I say about the product?")
// LLM can reference the previous exchange
```
### Reliability Built-In
Every synapse integrates with pipz for production-grade reliability:
```go
synapse, _ := zyn.Binary("question", provider,
	zyn.WithRetry(3),                          // Retry on failure
	zyn.WithTimeout(10*time.Second),           // Timeout protection
	zyn.WithCircuitBreaker(5, 30*time.Second), // Circuit breaker
	zyn.WithRateLimit(10, 100),                // Rate limiting
)
```
### Observable by Default
All synapses emit capitan hooks for observability:
```go
capitan.Hook(zyn.ProviderCallCompleted, func(ctx context.Context, e *capitan.Event) {
	tokens, _ := zyn.TotalTokensKey.From(e)
	log.Printf("Used %d tokens", tokens)
})
```
## Design Priorities
- **Type Safety** - Compile-time guarantees over runtime surprises
- **Composability** - Small, focused synapses that combine naturally
- **Reliability** - Production patterns built-in, not bolted on
- **Observability** - Full visibility into LLM interactions
- **Simplicity** - Minimal API surface, maximum capability
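Composability in practice means the typed output of one synapse can be the typed input of the next. The sketch below chains `Extract[T]` into `Analyze[T]` using this document's `Contact` type and call shapes; the "assess data completeness" task string is an illustrative assumption, and `provider`, `ctx`, and `session` are assumed to be set up as in the earlier examples:

```go
// Sketch: compose two synapses - Extract[Contact] produces a Contact,
// which Analyze[Contact] accepts directly, per the synapse table above.
extractor, _ := zyn.Extract[Contact]("contact information", provider)
analyzer, _ := zyn.Analyze[Contact]("assess data completeness", provider)

contact, err := extractor.Fire(ctx, session, "John at john@example.com")
if err != nil {
	log.Fatal(err)
}

// No re-serialization between steps: the compiler checks the handoff.
report, err := analyzer.Fire(ctx, session, contact)
if err != nil {
	log.Fatal(err)
}
fmt.Println(report)
```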
## Next Steps
- **Quickstart** - Build your first synapse in 10 minutes
- **Core Concepts** - Deep dive into synapses, sessions, and providers
- **Guides** - Practical implementation patterns
- **Cookbook** - Real-world recipes
- **Reference** - Complete API documentation