LLM Integration
LLMUtils Overview
The LLMUtils class in src/utils/llm provides a unified interface for interacting with different LLM providers, supporting both OpenAI and OpenRouter APIs.
Text Generation
Generate text responses using different LLM models:
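A minimal sketch of a text-generation call. It assumes a method along the lines of generateText(prompt, size); the actual method name, constructor, and import path in src/utils/llm may differ.

```typescript
// Adjust the import path to your project layout.
import { LLMUtils, LLMSize } from "./utils/llm";

const llm = new LLMUtils();

// Generate a plain-text completion with the small model
// (generateText is an assumed method name).
const reply = await llm.generateText(
  "Summarize the latest conversation in two sentences.",
  LLMSize.SMALL
);
console.log(reply);
```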
Structured Output
Get structured JSON responses using Zod schemas for type safety:
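A sketch of structured output, assuming a method such as getObjectFromPrompt(prompt, schema, size) that validates the model's JSON against the supplied Zod schema. The method name and signature are assumptions; the Zod usage itself is standard.

```typescript
import { z } from "zod";
import { LLMUtils, LLMSize } from "./utils/llm";

// Define the shape of the response we expect back from the model.
const SentimentSchema = z.object({
  sentiment: z.enum(["positive", "neutral", "negative"]),
  confidence: z.number().min(0).max(1),
});

const llm = new LLMUtils();

// The parsed result is typed from the schema
// (getObjectFromPrompt is an assumed method name).
const result = await llm.getObjectFromPrompt(
  "Classify the sentiment of: 'I love this product!'",
  SentimentSchema,
  LLMSize.LARGE
);
console.log(result.sentiment, result.confidence);
```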
Boolean Decisions
Get simple true/false decisions from the LLM:
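A sketch assuming a boolean helper along the lines of getBooleanDecision(prompt, size); the method name is an assumption.

```typescript
import { LLMUtils, LLMSize } from "./utils/llm";

const llm = new LLMUtils();

// Assumed helper that returns only true or false.
const shouldReply: boolean = await llm.getBooleanDecision(
  "Is this message a direct question addressed to the bot?",
  LLMSize.SMALL
);

if (shouldReply) {
  // ...generate and send a reply
}
```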
Image Analysis
Process images and get text descriptions or structured analysis:
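A sketch of image analysis, assuming methods such as describeImage for plain-text descriptions and analyzeImage for schema-validated results. Both names and signatures are assumptions, and the image URL is a placeholder.

```typescript
import { z } from "zod";
import { LLMUtils, LLMSize } from "./utils/llm";

const llm = new LLMUtils();

// Plain-text description of an image (assumed method name).
const description = await llm.describeImage(
  "https://example.com/chart.png",
  "Describe what this chart shows."
);

// Structured analysis of the same image using a Zod schema
// (analyzeImage is an assumed method name).
const ChartSchema = z.object({
  chartType: z.string(),
  keyTakeaway: z.string(),
});
const analysis = await llm.analyzeImage(
  "https://example.com/chart.png",
  ChartSchema,
  LLMSize.LARGE
);
console.log(description, analysis.keyTakeaway);
```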
Model Selection
LLMSize.SMALL
Uses gpt-4o-mini
Faster response times
Lower cost per request
Good for simple decisions
LLMSize.LARGE
Uses gpt-4o
Better reasoning
More nuanced responses
Suited to complex analysis tasks
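For illustration, a small helper (not part of LLMUtils) that picks a size based on how demanding the task is; the task labels here are examples only.

```typescript
import { LLMSize } from "./utils/llm";

// Route cheap, simple decisions to the small model and
// reserve the large model for heavier analysis.
function pickSize(task: "routing" | "analysis"): LLMSize {
  return task === "analysis" ? LLMSize.LARGE : LLMSize.SMALL;
}
```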
Best Practices
Use structured output for predictable responses
Stream responses for better user experience
Choose appropriate model size for the task
Handle API errors gracefully (see the sketch after this list)
Monitor token usage and costs
Cache responses when possible
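As an example of graceful error handling, a sketch that wraps a call and returns a fallback instead of throwing; generateText is the same assumed method name used above.

```typescript
import { LLMUtils, LLMSize } from "./utils/llm";

const llm = new LLMUtils();

// Wrap calls so provider or network failures degrade gracefully.
async function safeGenerate(prompt: string): Promise<string | null> {
  try {
    return await llm.generateText(prompt, LLMSize.SMALL);
  } catch (err) {
    console.error("LLM request failed:", err);
    return null; // caller decides on a fallback response
  }
}
```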