zoobzio December 13, 2025

zyn

Type-safe LLM orchestration framework with composable reliability patterns

The Problem

Working with Large Language Models presents a recurring set of challenges:

  1. Unstructured outputs - LLMs return free-form text, requiring parsing and validation
  2. Unreliable responses - Rate limits, timeouts, and malformed JSON are common
  3. Prompt drift - Ad-hoc prompting leads to inconsistent behavior over time
  4. No type safety - Runtime errors from unexpected response formats
  5. Context management - Multi-turn conversations require careful state handling

The Solution

zyn provides structured, type-safe LLM interactions through synapses - specialized units that handle specific interaction patterns with compile-time guarantees.

// Type-safe extraction with automatic validation
type Contact struct {
    Name  string `json:"name"`
    Email string `json:"email"`
}

func (c Contact) Validate() error {
    if c.Email == "" {
        return fmt.Errorf("email required")
    }
    return nil
}

extractor, _ := zyn.Extract[Contact]("contact information", provider)
contact, err := extractor.Fire(ctx, session, "John at john@example.com")
// contact is Contact{Name: "John", Email: "john@example.com"}
// Validation runs automatically - err if Email is empty
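
The extract-then-validate step above can be sketched in plain Go. `decodeValidated` below is a hypothetical stand-in for what happens after the model replies: unmarshal the JSON into `T`, then run `T`'s own `Validate` method. It is illustrative only, not zyn's actual implementation.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Validator mirrors the Validate pattern from the Contact example.
type Validator interface{ Validate() error }

type Contact struct {
	Name  string `json:"name"`
	Email string `json:"email"`
}

func (c Contact) Validate() error {
	if c.Email == "" {
		return fmt.Errorf("email required")
	}
	return nil
}

// decodeValidated sketches the extract-then-validate step: unmarshal
// the model's JSON output into T, then run T's own validation.
func decodeValidated[T Validator](raw []byte) (T, error) {
	var v T
	if err := json.Unmarshal(raw, &v); err != nil {
		return v, err
	}
	return v, v.Validate()
}

func main() {
	c, err := decodeValidated[Contact]([]byte(`{"name":"John","email":"john@example.com"}`))
	fmt.Println(c, err)

	_, err = decodeValidated[Contact]([]byte(`{"name":"Jane"}`))
	fmt.Println(err) // email required
}
```

Because `Validate` runs inside the decode step, a caller can never receive a `Contact` that skipped validation, which is the compile-time-plus-runtime guarantee the example above relies on.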

Core Philosophy

Synapses Over Prompts

Instead of crafting prompts, you declare intent through synapses:

| Pattern             | Synapse        | Input  | Output          |
|---------------------|----------------|--------|-----------------|
| Yes/No decision     | Binary         | string | bool            |
| Categorization      | Classification | string | string          |
| Data extraction     | Extract[T]     | string | T               |
| Text transformation | Transform      | string | string          |
| Structured analysis | Analyze[T]     | T      | string          |
| Type conversion     | Convert[T,U]   | T      | U               |
| Ordering            | Ranking        | string | string          |
| Emotional analysis  | Sentiment      | string | SentimentResult |

Sessions for Context

Sessions manage conversation history across synapse calls:

session := zyn.NewSession()

// First call
classifier.Fire(ctx, session, "I love this product!")
// Session: [user: "I love this product!", assistant: {sentiment analysis}]

// Second call sees previous context
followup.Fire(ctx, session, "What did I say about the product?")
// LLM can reference the previous exchange
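
Conceptually, a session is an append-only message history that each Fire call grows, so later prompts can include earlier exchanges. A minimal sketch, using illustrative types rather than zyn's own:

```go
package main

import "fmt"

// message and session sketch conversation state: an ordered
// history that every synapse call appends to.
type message struct {
	Role    string
	Content string
}

type session struct {
	history []message
}

func (s *session) append(role, content string) {
	s.history = append(s.history, message{Role: role, Content: content})
}

func main() {
	s := &session{}
	// The first call records both sides of the exchange...
	s.append("user", "I love this product!")
	s.append("assistant", "{sentiment analysis}")
	// ...so the second call's prompt can reference them.
	s.append("user", "What did I say about the product?")
	fmt.Println(len(s.history)) // 3
}
```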

Reliability Built-In

Every synapse integrates with pipz for production-grade reliability:

synapse, _ := zyn.Binary("question", provider,
    zyn.WithRetry(3),                              // Retry on failure
    zyn.WithTimeout(10*time.Second),               // Timeout protection
    zyn.WithCircuitBreaker(5, 30*time.Second),     // Circuit breaker
    zyn.WithRateLimit(10, 100),                    // Rate limiting
)

Observable by Default

All synapses emit capitan hooks for observability:

capitan.Hook(zyn.ProviderCallCompleted, func(ctx context.Context, e *capitan.Event) {
    tokens, _ := zyn.TotalTokensKey.From(e)
    log.Printf("Used %d tokens", tokens)
})
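
The hook mechanism boils down to named events fanning out to registered callbacks that carry typed payload fields. The sketch below illustrates that pattern; the event name `"provider.call.completed"` and the `"total_tokens"` field are made-up stand-ins for `zyn.ProviderCallCompleted` and `TotalTokensKey`, not capitan's real API:

```go
package main

import "fmt"

// event sketches a hook payload: a bag of named fields.
type event struct {
	fields map[string]any
}

var hooks = map[string][]func(event){}

// hook registers a callback for a named event.
func hook(name string, fn func(event)) {
	hooks[name] = append(hooks[name], fn)
}

// emit fans an event out to every registered callback.
func emit(name string, e event) {
	for _, fn := range hooks[name] {
		fn(e)
	}
}

func main() {
	var seen int
	hook("provider.call.completed", func(e event) {
		seen = e.fields["total_tokens"].(int)
	})
	emit("provider.call.completed", event{fields: map[string]any{"total_tokens": 42}})
	fmt.Println(seen) // 42
}
```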

Design Priorities

  1. Type Safety - Compile-time guarantees over runtime surprises
  2. Composability - Small, focused synapses that combine naturally
  3. Reliability - Production patterns built-in, not bolted on
  4. Observability - Full visibility into LLM interactions
  5. Simplicity - Minimal API surface, maximum capability

Next Steps