# LLM Clients
The LLM Client layer abstracts different providers into a unified interface.
## Usage
While you will usually interact with an `Agent`, you can also use clients directly, or configure them for Agents.
```python
from yosrai.engine.llm import get_client

# Factory pattern: select a provider by name
client = get_client("anthropic", model="claude-3-opus-20240229", api_key="sk-...")
response = client.chat([{"role": "user", "content": "Hi"}])
```
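The factory behind `get_client` can be thought of as a registry mapping provider names to client classes. The following is a hypothetical sketch of that pattern, not the library's actual implementation; the stand-in classes are placeholders:

```python
# Hypothetical registry-based factory sketch. The real get_client may differ;
# the client classes here are placeholders for illustration only.

class FakeOpenAIClient:
    def __init__(self, model, api_key=None):
        self.model = model
        self.api_key = api_key

class FakeAnthropicClient:
    def __init__(self, model, api_key=None):
        self.model = model
        self.api_key = api_key

# Map provider name -> client class.
_REGISTRY = {
    "openai": FakeOpenAIClient,
    "anthropic": FakeAnthropicClient,
}

def get_client(provider, **kwargs):
    """Look up the provider by name and instantiate its client with kwargs."""
    try:
        cls = _REGISTRY[provider]
    except KeyError:
        raise ValueError(f"Unknown provider: {provider!r}") from None
    return cls(**kwargs)
```

A registry like this lets new providers be added without changing call sites: callers only ever name the provider as a string.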
## Message Primitives
### `yosrai.engine.core.messages.Message`

Bases: `BaseModel`

Represents a chat message with a role and content (which can be a list of blocks).

- `text_content` (property): Get the text content of the message.
- `to_anthropic_dict()`: Convert to Anthropic API format.
- `to_google_dict()`: Convert to Google Gemini API format.
- `to_openai_dict()`: Convert to OpenAI API format.
- `user(text, images=[])` (classmethod): Factory for a User message with optional images.
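As a rough guide to what the `to_*_dict()` converters target, the same user message takes a different shape in each provider's public wire format. The shapes below follow the providers' documented chat formats; the converters' exact output may differ in detail:

```python
# One user message, expressed in each provider's chat format.
text = "Hi"

# OpenAI Chat Completions: flat role/content pairs.
openai_msg = {"role": "user", "content": text}

# Anthropic Messages API: content is a list of typed blocks.
anthropic_msg = {"role": "user", "content": [{"type": "text", "text": text}]}

# Google Gemini: messages carry "parts" instead of "content".
google_msg = {"role": "user", "parts": [{"text": text}]}
```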
### `yosrai.engine.core.messages.ContentBlock`

Bases: `BaseModel`

Represents a single block of content in a message (Text or Image).

- `ensure_base64()`: Ensure `image_base64` is populated. If it is a URL, download it.
- `from_image(source, media_type=None)` (classmethod): Create an Image block from a URL, local file path, or base64 string.
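Since `from_image()` accepts a URL, a local file path, or a base64 string, it has to classify the source first. A minimal sketch of how such detection might work (a hypothetical helper, not the library's code):

```python
import base64
import os

def detect_image_source(source: str) -> str:
    """Classify an image source as 'url', 'path', or 'base64'.

    Hypothetical sketch of the kind of dispatch from_image() would need.
    """
    if source.startswith(("http://", "https://")):
        return "url"
    if os.path.exists(source):
        return "path"
    # Fall back to treating the string as base64-encoded image data.
    try:
        base64.b64decode(source, validate=True)
        return "base64"
    except Exception:
        raise ValueError(f"Unrecognized image source: {source[:40]!r}")
```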
## API Reference
### `yosrai.engine.llm.base.LLMClient`

Bases: `ABC`

Abstract base class for LLM providers. Directly wraps the provider's SDK or API.

#### `achat(messages, tools=None, response_model=None)` (abstractmethod, async)

Send an async chat request to the LLM.

#### `chat(messages, tools=None, response_model=None)` (abstractmethod)

Send a chat request to the LLM.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `messages` | `List[Dict[str, Any]]` | List of message dicts: `[{'role': 'user', 'content': '...'}, ...]` | *required* |
| `tools` | `Optional[List[Dict[str, Any]]]` | List of tool definitions (JSON Schema) | `None` |
| `response_model` | `Any` | Optional Pydantic model for structured output | `None` |
Returns:

| Type | Description |
|---|---|
| `LLMResponse` | `LLMResponse` object |
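The `tools` parameter expects JSON-Schema tool definitions. Below is a hypothetical weather tool in the widely used function-tool shape, where `parameters` is a standard JSON Schema object; the exact envelope yosrai forwards to each provider may differ:

```python
# A hypothetical tool definition. "get_weather" and its fields are invented
# for illustration; the "parameters" object is plain JSON Schema.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

tools = [get_weather_tool]
# response = client.chat(messages, tools=tools)  # client as in the Usage section
```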
#### `stream(messages, tools=None)`

Stream response tokens.
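Consuming a token stream typically means accumulating chunks while displaying them incrementally. A sketch, assuming `stream()` yields text tokens as the base-class docstring suggests (the generator below is a stand-in, not a real client):

```python
def fake_stream(messages, tools=None):
    """Stand-in for LLMClient.stream(): yields response tokens one at a time."""
    for token in ["Hel", "lo", "!"]:
        yield token

chunks = []
for token in fake_stream([{"role": "user", "content": "Hi"}]):
    chunks.append(token)
    print(token, end="", flush=True)  # incremental display as tokens arrive

full_text = "".join(chunks)  # reassemble the complete response
```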
### `yosrai.engine.llm.openai_client.OpenAIClient`
### `yosrai.engine.llm.anthropic_client.AnthropicClient`
### `yosrai.engine.llm.google_client.GoogleClient`

Bases: `LLMClient`

Client for Google's Gemini API (via the `google-genai` SDK). Requires the `google-genai` package.
### `yosrai.engine.llm.ollama_client.OllamaClient`

Bases: `LLMClient`

Client for the Ollama API (running locally). Uses direct HTTP calls via `httpx` to avoid extra dependencies.
#### `stream(messages, tools=None)`

Stream response tokens from Ollama. Returns a generator yielding `(token, done, tool_calls_if_any)` tuples.
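Ollama's `/api/chat` endpoint streams newline-delimited JSON, one object per chunk, with the text under `message.content` and a `done` flag on the final chunk. A sketch of parsing one such line into the `(token, done, tool_calls_if_any)` tuple this method yields (a hypothetical helper; the field names follow Ollama's documented response format):

```python
import json

def parse_ollama_chunk(line: str):
    """Parse one NDJSON line from Ollama's /api/chat stream into
    (token, done, tool_calls_if_any). Hypothetical sketch."""
    chunk = json.loads(line)
    message = chunk.get("message", {})
    token = message.get("content", "")
    done = chunk.get("done", False)
    tool_calls = message.get("tool_calls")  # None unless the model called a tool
    return token, done, tool_calls

# Example chunks in the shape Ollama emits:
mid_line = '{"message": {"role": "assistant", "content": "Hel"}, "done": false}'
end_line = '{"message": {"role": "assistant", "content": ""}, "done": true}'
```

In the real client, these lines would come from iterating an `httpx` streaming response; parsing is separated out here so the format itself is easy to see.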