LLMs Module
The LLMs module in the Autoppia SDK provides a standardized way to interact with various language model (LLM) services. It converts backend configuration into concrete LLM service instances, allowing users to work with different providers through a single interface. The module currently supports OpenAI and Anthropic, and can be extended to support additional providers in the future.
Module Components
LLMServiceInterface: An abstract base class defining the interface that all LLM service implementations must follow.
Service Implementations: Concrete classes for LLM services, including:
OpenAIService: A service implementation for OpenAI models using LangChain.
AnthropicService: A service implementation for Anthropic models using LangChain.
LLMAdapter: An adapter that initializes the correct LLM service from backend configuration.
1. LLM Service Interface
This abstract base class defines the required methods for all LLM service implementations.
File: autoppia_sdk/src/llms/interface.py
```python
from abc import ABC, abstractmethod

from langchain.schema.language_model import BaseLanguageModel


class LLMServiceInterface(ABC):
    """Interface for language model services.

    This abstract base class defines the required interface that all
    language model service implementations must follow.
    """

    @abstractmethod
    def get_llm(self) -> BaseLanguageModel:
        """Get the language model instance.

        Returns:
            BaseLanguageModel: The configured language model instance
        """
        pass

    @abstractmethod
    def update_model(self, model_name: str) -> None:
        """Update the model name.

        Args:
            model_name (str): New model name to use
        """
        pass

    @abstractmethod
    def update_api_key(self, api_key: str) -> None:
        """Update the API key.

        Args:
            api_key (str): New API key to use
        """
        pass
```

2. Service Implementations
OpenAI Service
The OpenAIService class provides an interface to OpenAI's language models through LangChain. It handles model initialization, API key management, and model updates.
File: autoppia_sdk/src/llms/implementations/openai_service.py
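The implementation file is not reproduced here, so the following is only a minimal sketch of what such a service might look like. The class shape follows the interface above; the `ChatOpenAI` constructor arguments and the default model name are assumptions based on common LangChain usage, not the SDK's actual code:

```python
class OpenAIService:
    """Hypothetical sketch of an OpenAI-backed LLM service (not the SDK's actual code)."""

    def __init__(self, api_key: str, model_name: str = "gpt-4"):
        self.api_key = api_key
        self.model_name = model_name

    def get_llm(self):
        # Deferred import so model/key configuration can be managed
        # even before langchain is needed at call time.
        from langchain.chat_models import ChatOpenAI

        return ChatOpenAI(model_name=self.model_name, openai_api_key=self.api_key)

    def update_model(self, model_name: str) -> None:
        # Swap the model; the new name takes effect on the next get_llm() call.
        self.model_name = model_name

    def update_api_key(self, api_key: str) -> None:
        # Rotate the credential; likewise applied on the next get_llm() call.
        self.api_key = api_key
```

Note that `update_model` and `update_api_key` only mutate stored configuration; a fresh language model instance is built on each `get_llm()` call, so updates take effect lazily.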
Anthropic Service
The AnthropicService class provides an interface to Anthropic's language models through LangChain. It handles model initialization, API key management, and model updates.
File: autoppia_sdk/src/llms/implementations/anthropic_service.py
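As with the OpenAI service, the file itself is not shown here; a parallel sketch follows, with the `ChatAnthropic` constructor arguments and the default model name assumed rather than taken from the SDK:

```python
class AnthropicService:
    """Hypothetical sketch of an Anthropic-backed LLM service (not the SDK's actual code)."""

    def __init__(self, api_key: str, model_name: str = "claude-3-opus-20240229"):
        self.api_key = api_key
        self.model_name = model_name

    def get_llm(self):
        # Deferred import, mirroring the OpenAI sketch above.
        from langchain.chat_models import ChatAnthropic

        return ChatAnthropic(model=self.model_name, anthropic_api_key=self.api_key)

    def update_model(self, model_name: str) -> None:
        self.model_name = model_name

    def update_api_key(self, api_key: str) -> None:
        self.api_key = api_key
```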
3. LLM Adapter
The LLMAdapter class converts a user's backend LLM configuration into a concrete LLM service instance. It reads the provider specified in the configuration and returns the corresponding service instance, so callers never construct provider classes directly.
File: autoppia_sdk/src/llms/adapter.py
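A minimal sketch of the provider-dispatch pattern described above follows. The configuration keys (`"provider"`, `"api_key"`, `"model"`), the `from_backend` method name, and the placeholder service classes are all assumptions for illustration, not the SDK's actual schema:

```python
class OpenAIService:
    """Placeholder standing in for the real OpenAI implementation."""

    def __init__(self, api_key: str, model_name: str):
        self.api_key = api_key
        self.model_name = model_name


class AnthropicService:
    """Placeholder standing in for the real Anthropic implementation."""

    def __init__(self, api_key: str, model_name: str):
        self.api_key = api_key
        self.model_name = model_name


class LLMAdapter:
    """Hypothetical sketch: maps a backend config dict to a service instance."""

    # Registry of supported providers; extending the SDK means adding an entry here.
    _PROVIDERS = {
        "openai": OpenAIService,
        "anthropic": AnthropicService,
    }

    def __init__(self, config: dict):
        self.config = config

    def from_backend(self):
        # Dispatch on the provider name, failing loudly on unknown providers.
        provider = self.config.get("provider", "").lower()
        service_cls = self._PROVIDERS.get(provider)
        if service_cls is None:
            raise ValueError(f"Unsupported LLM provider: {provider!r}")
        return service_cls(
            api_key=self.config["api_key"],
            model_name=self.config["model"],
        )
```

A registry dict keeps the adapter open for extension: supporting a new provider is a one-line addition rather than another branch in an if/elif chain.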