Core Module

The core module is the heart of YosrAI, providing the fundamental components for building AI-powered applications: the classes and utilities for managing AI agents, LLM interactions, and chat sessions.

Key Components

  • Agent - The main class for creating and managing AI agents
  • LLM - Language Model interface and configuration
  • Chat Manager - Handles chat interactions and message management
  • Action - Defines actions and behaviors for agents
  • Context - Manages contextual information and state
  • Providers - Handles different LLM providers and configurations

Quick Start

import asyncio

from yosrai.core import Agent, LLM


async def main():
    # Initialize an LLM
    llm = LLM(provider="openai", model="gpt-3.5-turbo")

    # Create an agent
    agent = Agent(
        agent_code="assistant",
        agent_name="AI Assistant",
        llm=llm
    )

    # Use the agent (act is a coroutine, so it must be awaited)
    response = await agent.act("Hello, how can you help me?")
    print(response)


asyncio.run(main())
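Because provider configuration lives on the LLM object, the same agent setup can target a different backend by changing only the LLM arguments. The sketch below is illustrative: the provider and model names shown are assumptions, not a verified list of supported integrations.

from yosrai.core import Agent, LLM

# Illustrative only: the provider/model values below are assumptions,
# not an exhaustive list of backends shipped with YosrAI.
claude_llm = LLM(provider="anthropic", model="claude-3-haiku-20240307")

agent = Agent(
    agent_code="assistant",
    agent_name="AI Assistant",
    llm=claude_llm
)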

Module Structure

The core module is organized into several key components that work together to provide a flexible and powerful AI agent framework:

  1. Agent: The central component that manages AI behavior and interactions
  2. LLM: Handles communication with language models
  3. Chat Manager: Manages conversation flow and message handling
  4. Action: Defines specific behaviors and capabilities
  5. Context: Maintains state and contextual information
  6. Providers: Manages different AI provider integrations

Each component is designed to be modular and extensible, allowing for easy customization and integration with various AI services and use cases.
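For example, because an agent's behavior is exposed through the async act method shown in the Quick Start, one way to customize an agent is to subclass it. This is a minimal sketch under the assumption that act is the intended override point; the LoggingAgent class and its logging behavior are hypothetical and not part of the YosrAI API.

from yosrai.core import Agent, LLM


class LoggingAgent(Agent):
    # Hypothetical subclass: logs each incoming message before delegating
    # to the parent Agent.act implementation.
    async def act(self, message: str):
        print(f"LoggingAgent received: {message}")
        return await super().act(message)


# Usage mirrors the Quick Start above.
llm = LLM(provider="openai", model="gpt-3.5-turbo")
agent = LoggingAgent(
    agent_code="logger",
    agent_name="Logging Assistant",
    llm=llm
)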