# LLM

The `LLM` class provides a standardized interface for configuring and managing language model settings across different providers. It handles model selection, temperature settings, and token limits in a consistent way.
## Class Overview

```python
from yosrai.core import LLM

llm = LLM(
    provider="openai",
    model="gpt-3.5-turbo",
    temperature=0.7,
    max_tokens=1000
)
```
## Constructor Parameters

- `provider` (str, optional): The LLM provider to use (e.g., `"openai"`, `"anthropic"`)
- `model` (str, optional): The specific model to use from the provider
- `temperature` (float, optional): Controls response randomness (0.0 to 1.0)
- `max_tokens` (int, optional): Maximum number of tokens in the response

If a parameter is not provided, the class falls back to the default values from the configuration (see Default Values below).
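The fallback behaviour can be sketched as follows. This is a hypothetical illustration only: the default values and the `resolve_settings` helper are assumptions for the example, while the real class reads its defaults from yosrai's `Defaults` class.

```python
# Hypothetical sketch of how constructor arguments fall back to
# configured defaults; the actual values come from yosrai's Defaults.
ASSUMED_DEFAULT_TEMPERATURE = 0.7
ASSUMED_DEFAULT_MAX_TOKENS = 1000

def resolve_settings(temperature=None, max_tokens=None):
    """Return explicit values when given, configured defaults otherwise."""
    if temperature is None:
        temperature = ASSUMED_DEFAULT_TEMPERATURE
    if max_tokens is None:
        max_tokens = ASSUMED_DEFAULT_MAX_TOKENS
    return temperature, max_tokens
```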
## Properties

- `provider_name`: Returns the name of the current provider.
- `model_name`: Returns the name of the current model.
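A minimal sketch of how read-only properties like these might be exposed. The property names follow the documentation above, but the `LLMSketch` class and its internals are assumptions for illustration, not the actual implementation:

```python
# Hypothetical sketch: read-only properties wrapping private attributes.
class LLMSketch:
    def __init__(self, provider: str, model: str):
        self._provider = provider
        self._model = model

    @property
    def provider_name(self) -> str:
        """Name of the current provider."""
        return self._provider

    @property
    def model_name(self) -> str:
        """Name of the current model."""
        return self._model
```

Exposing these as properties rather than plain attributes keeps them read-only after construction.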
## Usage Examples

### Basic Configuration

```python
from yosrai.core import LLM

# Create with specific settings
llm = LLM(
    provider="openai",
    model="gpt-4",
    temperature=0.8,
    max_tokens=2000
)
```
### Using with Agent

```python
from yosrai.core import Agent, LLM

# Configure the LLM
llm = LLM(provider="anthropic", model="claude-2")

# Use it with an agent
agent = Agent(
    agent_code="assistant",
    agent_name="AI Assistant",
    llm=llm
)
```
### Changing Settings

```python
# Create with defaults
llm = LLM()

# Update settings as needed
llm.temperature = 0.9
llm.max_tokens = 1500
```
## Default Values

The `LLM` class uses default values from the `Defaults` class:

- Default provider: configured in `Defaults.DEFAULT_LLM_PROVIDER`
- Default temperature: `Defaults.DEFAULT_TEMPERATURE`
- Default max tokens: `Defaults.DEFAULT_MAX_TOKENS`
- Default model: determined by the provider's default configuration
## Validation

The class includes built-in validation:

- Temperature must be between 0.0 and 1.0
- Max tokens must be greater than 0
- Provider and model must be valid strings
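The rules above can be sketched as a standalone check. This is an assumed illustration of the documented constraints, not the class's actual validation code, and the real class may raise different exception types:

```python
# Hypothetical sketch of the documented validation rules.
def validate_llm_settings(temperature: float, max_tokens: int) -> None:
    """Raise ValueError if the settings violate the documented constraints."""
    if not 0.0 <= temperature <= 1.0:
        raise ValueError("temperature must be between 0.0 and 1.0")
    if max_tokens <= 0:
        raise ValueError("max_tokens must be greater than 0")
```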
## Provider Integration

The `LLM` class integrates with the `Providers` system to ensure compatibility:

```python
from yosrai import Providers
from yosrai.core import LLM

# Get available providers
providers = Providers()

# Create an LLM with a specific provider
llm = LLM(
    provider=providers.get_provider("openai").provider_name,
    model=providers.get_default_model("openai")
)
```