LLMs Module
The LLMs Module in the Autoppia SDK provides a standardized way to interact with language model (LLM) services. It converts backend configuration into concrete LLM service instances, so users can work with different providers through a single interface. The module currently supports OpenAI and Anthropic, and can be extended to support more providers in the future.
Module Components
LLMServiceInterface: An abstract base class defining the interface that all LLM service implementations must follow.
Service Implementations: Concrete classes for LLM services, including:
OpenAIService: A service implementation for OpenAI models using LangChain.
AnthropicService: A service implementation for Anthropic models using LangChain.
LLMAdapter: An adapter that initializes the correct LLM service from backend configuration.
1. LLM Service Interface
This abstract base class defines the required methods for all LLM service implementations.
File: autoppia_sdk/src/llms/interface.py
```python
from abc import ABC, abstractmethod

from langchain.schema.language_model import BaseLanguageModel


class LLMServiceInterface(ABC):
    """Interface for language model services.

    This abstract base class defines the required interface that all
    language model service implementations must follow.
    """

    @abstractmethod
    def get_llm(self) -> BaseLanguageModel:
        """Get the language model instance.

        Returns:
            BaseLanguageModel: The configured language model instance
        """
        pass

    @abstractmethod
    def update_model(self, model_name: str) -> None:
        """Update the model name.

        Args:
            model_name (str): New model name to use
        """
        pass

    @abstractmethod
    def update_api_key(self, api_key: str) -> None:
        """Update the API key.

        Args:
            api_key (str): New API key to use
        """
        pass
```
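A new provider can be supported by implementing the three methods above. The sketch below is illustrative and self-contained: it redefines a minimal version of the interface inline (the real one lives in `autoppia_sdk/src/llms/interface.py` and returns a LangChain `BaseLanguageModel`), and `EchoService` is a hypothetical provider, not part of the SDK.

```python
from abc import ABC, abstractmethod


# Minimal stand-in for LLMServiceInterface, redefined inline so this
# sketch runs without the SDK or LangChain installed.
class LLMServiceInterface(ABC):
    @abstractmethod
    def get_llm(self): ...

    @abstractmethod
    def update_model(self, model_name: str) -> None: ...

    @abstractmethod
    def update_api_key(self, api_key: str) -> None: ...


class EchoService(LLMServiceInterface):
    """Hypothetical provider that just echoes its configuration."""

    def __init__(self, api_key: str, model: str = "echo-1"):
        self.api_key = api_key
        self.model = model
        self._llm = None

    def get_llm(self):
        if not self._llm:
            self._llm = f"llm({self.model})"  # stand-in for a real client
        return self._llm

    def update_model(self, model_name: str) -> None:
        self.model = model_name
        self._llm = None  # next get_llm() rebuilds with the new model

    def update_api_key(self, api_key: str) -> None:
        self.api_key = api_key
        self._llm = None


service = EchoService(api_key="sk-test")
print(service.get_llm())  # llm(echo-1)
service.update_model("echo-2")
print(service.get_llm())  # llm(echo-2)
```

Because every implementation satisfies the same three methods, callers can hold any service behind the `LLMServiceInterface` type and swap providers without code changes.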
2. Service Implementations
OpenAI Service
The OpenAIService class provides an interface to OpenAI's language models through LangChain. It handles model initialization, API key management, and model updates.
File: autoppia_sdk/src/llms/implementations/openai_service.py
```python
from langchain_openai import ChatOpenAI

from autoppia_sdk.src.llms.interface import LLMServiceInterface


class OpenAIService(LLMServiceInterface):
    """OpenAI language model service implementation.

    This class provides an interface to OpenAI's language models through the LangChain
    integration. It handles model initialization, API key management, and model updates.

    Attributes:
        api_key (str): OpenAI API key for authentication.
        model (str): Name of the OpenAI model to use (default: "gpt-4o").
        _llm (ChatOpenAI): Cached LangChain ChatOpenAI instance.
    """

    def __init__(self, api_key: str, model: str = "gpt-4o"):
        """Initialize the OpenAI service.

        Args:
            api_key (str): OpenAI API key for authentication.
            model (str, optional): Name of the OpenAI model. Defaults to "gpt-4o".
        """
        self.api_key = api_key
        self.model = model
        self._llm = None

    def get_llm(self):
        """Get or create the LangChain ChatOpenAI instance.

        Returns:
            ChatOpenAI: Configured LangChain ChatOpenAI instance.
        """
        if not self._llm:
            self._llm = ChatOpenAI(
                openai_api_key=self.api_key,
                model=self.model
            )
        return self._llm

    def update_model(self, model_name: str):
        """Update the model name and reset the LLM instance.

        Args:
            model_name (str): New model name to use.
        """
        self.model = model_name
        self._llm = None  # Force recreation with new model

    def update_api_key(self, api_key: str):
        """Update the API key and reset the LLM instance.

        Args:
            api_key (str): New API key to use.
        """
        self.api_key = api_key
        self._llm = None  # Force recreation with new API key
```
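Note that the service builds the `ChatOpenAI` client lazily and caches it: the first `get_llm()` call constructs the client, later calls reuse it, and `update_model`/`update_api_key` clear the cache so the next call rebuilds with the new settings. The self-contained sketch below reproduces that lazy-init/reset pattern with a dummy client (`FakeLLM` is a hypothetical stand-in so the example runs without an OpenAI key):

```python
class FakeLLM:
    """Stand-in for ChatOpenAI; records the config it was built with."""

    def __init__(self, api_key: str, model: str):
        self.api_key = api_key
        self.model = model


class CachingService:
    """Mirrors the lazy-init/reset pattern used by OpenAIService."""

    def __init__(self, api_key: str, model: str = "gpt-4o"):
        self.api_key = api_key
        self.model = model
        self._llm = None

    def get_llm(self) -> FakeLLM:
        if not self._llm:  # build once, then reuse the cached instance
            self._llm = FakeLLM(api_key=self.api_key, model=self.model)
        return self._llm

    def update_model(self, model_name: str) -> None:
        self.model = model_name
        self._llm = None  # force recreation with the new model


svc = CachingService(api_key="sk-test")
first = svc.get_llm()
assert svc.get_llm() is first        # cached between calls
svc.update_model("gpt-4o-mini")
assert svc.get_llm() is not first    # rebuilt after the update
assert svc.get_llm().model == "gpt-4o-mini"
```

This keeps repeated `get_llm()` calls cheap while guaranteeing that configuration changes always take effect on the next use.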
Anthropic Service
The AnthropicService class provides an interface to Anthropic's language models through LangChain. It handles model initialization, API key management, and model updates.
File: autoppia_sdk/src/llms/implementations/anthropic_service.py
```python
from langchain_anthropic import ChatAnthropic
from langchain.schema.language_model import BaseLanguageModel

from autoppia_sdk.src.llms.interface import LLMServiceInterface


class AnthropicService(LLMServiceInterface):
    """Anthropic language model service implementation.

    This class provides an interface to Anthropic's language models through the LangChain
    integration. It handles model initialization, API key management, and model updates.

    Attributes:
        api_key (str): Anthropic API key for authentication.
        model (str): Name of the Anthropic model to use (default: "claude-3-opus-20240229").
        _llm (BaseLanguageModel): Cached LangChain ChatAnthropic instance.
    """

    def __init__(self, api_key: str, model: str = "claude-3-opus-20240229"):
        """Initialize the Anthropic service.

        Args:
            api_key (str): Anthropic API key for authentication.
            model (str, optional): Name of the Anthropic model. Defaults to "claude-3-opus-20240229".
        """
        self.api_key = api_key
        self.model = model
        self._llm = None

    def get_llm(self) -> BaseLanguageModel:
        """Get or create the LangChain ChatAnthropic instance.

        Returns:
            BaseLanguageModel: Configured LangChain ChatAnthropic instance.
        """
        if not self._llm:
            self._llm = ChatAnthropic(
                model=self.model,
                anthropic_api_key=self.api_key
            )
        return self._llm

    def update_model(self, model_name: str) -> None:
        """Update the model name and reset the LLM instance.

        Args:
            model_name (str): New model name to use.
        """
        self.model = model_name
        self._llm = None

    def update_api_key(self, api_key: str) -> None:
        """Update the API key and reset the LLM instance.

        Args:
            api_key (str): New API key to use.
        """
        self.api_key = api_key
        self._llm = None
```
3. LLM Adapter
The LLMAdapter class converts backend LLM configuration data from the user into a concrete LLM service instance. It supports multiple providers by reading the configuration and returning the corresponding service instance.
File: autoppia_sdk/src/llms/adapter.py
```python
from autoppia_backend_client.models import UserLLMModel

from autoppia_sdk.src.llms.implementations.openai_service import OpenAIService
from autoppia_sdk.src.llms.implementations.anthropic_service import AnthropicService


class LLMAdapter:
    """Adapter for initializing LLM services from backend configuration.

    This class handles the conversion of backend LLM configuration into
    appropriate LLM service instances.

    Attributes:
        llm_dto (UserLLMModel): Data transfer object containing LLM configuration.
    """

    def __init__(self, llm_dto):
        """Initialize the LLM adapter.

        Args:
            llm_dto (UserLLMModel): Backend LLM configuration.
        """
        self.llm_dto: UserLLMModel = llm_dto

    def from_backend(self):
        """Initialize LLM service from backend configuration.

        Returns:
            LLMServiceInterface: Initialized LLM service instance.
            None: For unsupported providers.

        Raises:
            ValueError: If required API key is missing.
        """
        provider_type = self.llm_dto.llm_model.provider.provider_type.upper()
        api_key = self.llm_dto.api_key
        model_name = self.llm_dto.llm_model.name.lower()

        if not api_key:
            raise ValueError(f"Missing API key for {provider_type} provider")

        if provider_type == "OPENAI":
            return OpenAIService(api_key=api_key, model=model_name)
        elif provider_type == "ANTHROPIC":
            return AnthropicService(api_key=api_key, model=model_name)
        return None
```
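The adapter's dispatch logic can be exercised without the backend client by building an object with the same attribute shape as `UserLLMModel`. In the self-contained sketch below, `make_dto` and `select_provider` are hypothetical stand-ins: the DTO mimics the attributes the adapter reads (`llm_model.provider.provider_type`, `api_key`, `llm_model.name`), and the function replicates `from_backend`'s provider check while returning a label instead of constructing a real service.

```python
from types import SimpleNamespace


def make_dto(provider_type: str, api_key: str, model_name: str):
    """Hypothetical stand-in with the attribute shape of UserLLMModel."""
    provider = SimpleNamespace(provider_type=provider_type)
    llm_model = SimpleNamespace(provider=provider, name=model_name)
    return SimpleNamespace(llm_model=llm_model, api_key=api_key)


def select_provider(llm_dto):
    """Replicates LLMAdapter.from_backend's dispatch for illustration."""
    provider_type = llm_dto.llm_model.provider.provider_type.upper()
    if not llm_dto.api_key:
        raise ValueError(f"Missing API key for {provider_type} provider")
    if provider_type == "OPENAI":
        return "OpenAIService"
    elif provider_type == "ANTHROPIC":
        return "AnthropicService"
    return None  # unsupported provider


print(select_provider(make_dto("openai", "sk-test", "gpt-4o")))
# OpenAIService
print(select_provider(make_dto("anthropic", "sk-test", "claude-3-opus-20240229")))
# AnthropicService
```

Because `provider_type` is uppercased before comparison, the backend can store the provider name in any case; a missing API key fails fast with a `ValueError` rather than surfacing later as an authentication error.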