Providers

The Providers class in YosrAI manages integrations with large language model (LLM) providers. It exposes a unified interface for accessing the supported providers and their models, with built-in configuration management and dynamic (load-on-first-access) loading.

Class Overview

from yosrai.core import Providers
from yosrai.utils.config import Config

# Initialize providers
providers = Providers(config=Config())

# Access a provider
openai = providers("openai")

Supported Providers

YosrAI supports multiple LLM providers out of the box:

  • OpenAI
  • Anthropic
  • Groq
  • Mistral
  • Ollama
  • Fireworks
  • Google AI
  • NVIDIA
  • Together
  • XAI
  • Cohere
  • Nebius
  • DeepSeek
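Before selecting a provider from this list, it can be useful to know which ones are actually usable in the current environment. The sketch below is illustrative only: the environment-variable naming convention (`<PROVIDER>_API_KEY`) is an assumption for the example, not part of the documented YosrAI API.

```python
import os

# Assumed convention for this sketch: each provider's key lives in an
# environment variable named <PROVIDER>_API_KEY (e.g. OPENAI_API_KEY).
SUPPORTED_PROVIDERS = [
    "openai", "anthropic", "groq", "mistral", "ollama", "fireworks",
    "google", "nvidia", "together", "xai", "cohere", "nebius", "deepseek",
]

def configured_providers(env=os.environ):
    """Return the supported providers whose API key is present in env."""
    return [p for p in SUPPORTED_PROVIDERS if f"{p.upper()}_API_KEY" in env]
```

You could then pick the first configured provider, or fall back to a local one such as Ollama when no keys are set.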

Usage Examples

Basic Provider Access

from yosrai.core import Providers

providers = Providers()

# Access OpenAI provider
openai = providers("openai")

# Access Anthropic provider
anthropic = providers("anthropic")

Getting Available Models

# List models for a provider
openai_models = providers.get_models("openai")

# Get default model for a provider
default_model = providers.get_default_model("openai")

Listing All Providers

# Get list of all available providers
available_providers = providers.list()

Visualizing Provider Structure

# Print provider and model tree
providers.print_tree()

Configuration

The Providers class can be initialized with custom configuration:

from yosrai.utils.config import Config

config = Config(
    api_keys={
        "openai": "your-api-key",
        "anthropic": "your-api-key"
    }
)

providers = Providers(config=config)

Provider Management

Dynamic Loading

Providers are loaded dynamically when first accessed:

# Provider is loaded only when accessed
openai = providers("openai")  # First access loads the provider
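The pattern behind this behavior is load-on-first-access with caching: the expensive setup runs once, and later calls return the cached instance. The class below is a minimal, self-contained sketch of that pattern, not YosrAI's actual implementation.

```python
class LazyProviders:
    """Minimal sketch of load-on-first-access provider caching.

    Illustrates the dynamic-loading pattern only; YosrAI's real
    Providers class is more involved.
    """

    def __init__(self):
        self._loaded = {}    # provider name -> loaded instance
        self.load_count = 0  # how many real loads have happened

    def _load(self, name):
        # Stand-in for an expensive import / client construction.
        self.load_count += 1
        return {"name": name}

    def __call__(self, name):
        if name not in self._loaded:           # load only on first access
            self._loaded[name] = self._load(name)
        return self._loaded[name]              # cached thereafter
```

A second call with the same name returns the cached object without triggering another load.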

Error Handling

Accessing an unknown provider raises an exception, so wrap access in a try/except when the provider name may be invalid:

# Safe provider access with fallback
try:
    provider = providers("unknown_provider")
except Exception as e:
    print(f"Provider not available: {e}")

Integration with LLM Class

The Providers class works seamlessly with the LLM class:

from yosrai.core import LLM

# Get provider and create LLM instance
provider = providers("openai")
llm = LLM(
    provider=provider.provider_name,
    model=providers.get_default_model("openai")
)

Provider Features

Each provider instance offers:

  • List of available models
  • Default model selection
  • Provider-specific configurations
  • API key management
  • Model compatibility checks
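These features can be composed through the documented methods. The helper below is a hypothetical convenience function (not part of YosrAI) that validates a provider/model pair using only `list`, `get_models`, and `get_default_model`, falling back to the provider's default model:

```python
def resolve_model(providers, provider_name, model=None):
    """Return a usable model name for provider_name.

    `providers` is any object exposing the documented Providers methods:
    list(), get_models(provider), get_default_model(provider).
    If `model` is missing or not offered by the provider, fall back to
    the provider's default model.
    """
    if provider_name not in providers.list():
        raise ValueError(f"Unknown provider: {provider_name}")
    available = providers.get_models(provider_name)
    if model is not None and model in available:
        return model
    return providers.get_default_model(provider_name)
```

This keeps model selection in one place instead of scattering membership checks across the codebase.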

Methods

__call__

def __call__(self, provider: str)

Load (on first access) and return the named provider.

list

def list(self)

Get list of all available providers.

get_models

def get_models(self, provider: Any)

Get list of models available for a provider.

get_default_model

def get_default_model(self, provider: Any)

Get the default model for a provider.

print_tree

def print_tree(self)

Display a tree view of providers and their models.

Best Practices

  1. Configuration Management:

    # Load configuration from environment or file
    config = Config.from_env()
    providers = Providers(config=config)
    

  2. Error Handling:

    # Always handle potential provider errors
    if provider in providers.list():
        models = providers.get_models(provider)
    

  3. Model Selection:

    # Use default models when unsure
    model = providers.get_default_model(provider)
    

  4. Provider Inspection:

    # Inspect available providers and models
    providers.print_tree()
    

Advanced Usage

Custom Provider Integration

# Access provider-specific features
provider = providers("openai")
if hasattr(provider, "custom_feature"):
    result = provider.custom_feature()

Provider-Specific Configuration

config = Config(
    provider_configs={
        "openai": {
            "organization": "org-id",
            "api_version": "2024-01"
        }
    }
)
providers = Providers(config=config)

Model Compatibility

# Check if model is available
provider = providers("openai")
if model_name in providers.get_models("openai"):
    # Use model
    pass