LLM Interface¶
This page documents the common interface that all LLMs in Director follow: the base configuration, the base class, and the response model.
Base LLM Config¶
Base LLM Config is the configuration object for an LLM. It configures the LLM and is passed to it at creation time.
director.llm.base.BaseLLMConfig¶

Bases: BaseSettings
Base configuration for all LLMs.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| llm_type | str | Type of LLM, e.g. "openai", "anthropic", etc. | required |
| api_key | str | API key for the LLM. | required |
| api_base | str | Base URL for the LLM API. | required |
| chat_model | str | Model name for chat completions. | required |
| temperature | float | Sampling temperature for completions. | required |
| top_p | float | Top-p (nucleus) sampling for completions. | required |
| max_tokens | int | Maximum number of tokens to generate. | required |
| timeout | int | Timeout for the request. | required |
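The configuration fields above can be sketched as a plain Python class. This is an illustrative stand-in for the pydantic `BaseSettings` model; the field names follow the parameter table, but the default values shown are assumptions, not Director's actual defaults:

```python
from dataclasses import dataclass


@dataclass
class ExampleLLMConfig:
    """Illustrative stand-in for director.llm.base.BaseLLMConfig."""

    llm_type: str              # e.g. "openai", "anthropic"
    api_key: str               # API key for the provider
    api_base: str              # base URL for the LLM API
    chat_model: str            # model name for chat completions
    temperature: float = 0.7   # sampling temperature (default assumed)
    top_p: float = 1.0         # nucleus sampling (default assumed)
    max_tokens: int = 4096     # maximum tokens to generate (default assumed)
    timeout: int = 30          # request timeout (default assumed)


# Example: constructing a config for a hypothetical OpenAI-backed LLM.
config = ExampleLLMConfig(
    llm_type="openai",
    api_key="sk-...",
    api_base="https://api.openai.com/v1",
    chat_model="gpt-4o",
)
```

In the real class, `BaseSettings` additionally lets each field be populated from environment variables, which is why a settings model is used rather than a plain dataclass.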
Base LLM¶
Base LLM is the base class for all LLMs. It provides a common interface for all LLMs to follow.
director.llm.base.BaseLLM¶

Bases: ABC
Interface for all LLMs. All LLMs should inherit from this class.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| config | BaseLLMConfig | Configuration for the LLM. | required |
Source code in backend/director/llm/base.py
chat_completions (abstractmethod)¶

Abstract method for chat completions; every LLM subclass must implement it.
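A minimal sketch of how a provider inherits from the base class and implements the abstract `chat_completions` method. The class names and the method signature here are assumptions for illustration, not the actual Director API:

```python
from abc import ABC, abstractmethod


class SketchBaseLLM(ABC):
    """Illustrative stand-in for director.llm.base.BaseLLM."""

    def __init__(self, config):
        # A BaseLLMConfig-like object; providers read api_key, api_base, etc.
        self.config = config

    @abstractmethod
    def chat_completions(self, messages):
        """Subclasses must implement this to call their provider's API."""
        ...


class EchoLLM(SketchBaseLLM):
    """Toy subclass: echoes the last message instead of calling a real API."""

    def chat_completions(self, messages):
        return {"content": messages[-1]["content"]}


llm = EchoLLM(config=None)
reply = llm.chat_completions([{"role": "user", "content": "hello"}])
```

Because `chat_completions` is abstract, instantiating the base class directly raises a `TypeError`; only concrete subclasses like `EchoLLM` can be constructed.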
LLM Response¶
LLM Response is the response object for an LLM. The LLM returns it after processing an input message.
director.llm.base.LLMResponse (pydantic model)¶
Bases: BaseModel
Response model for completions from LLMs.
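As an illustration, a response model of this shape could be sketched as below. The field names are hypothetical, chosen only to show how a typed response object is used; consult the source in backend/director/llm/base.py for the actual `LLMResponse` fields:

```python
from dataclasses import dataclass


@dataclass
class SketchLLMResponse:
    """Hypothetical response model -- field names are assumptions,
    not the real LLMResponse schema."""

    content: str = ""        # generated text (assumed field)
    status: str = "success"  # outcome of the call (assumed field)


# Example: what a caller might receive back from an LLM.
resp = SketchLLMResponse(content="Hi there")
```

A typed response model gives every provider a uniform return shape, so callers can handle OpenAI, Anthropic, or any other backend identically.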