AnthropicAI
AnthropicAI extends the base LLM and implements the Anthropic API.
AnthropicAI Config
AnthropicAIConfig is the configuration object for AnthropicAI. It is passed to AnthropicAI when the client is created and controls how it is set up.
director.llm.anthropic.AnthropicAIConfig
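As a minimal sketch of constructing the config (the keyword names `api_key` and `chat_model` below are assumptions for illustration; check the source file for the actual fields):

```python
# Sketch only: the field names api_key and chat_model are assumptions,
# not confirmed by this reference page.
from director.llm.anthropic import AnthropicAIConfig

config = AnthropicAIConfig(
    api_key="sk-ant-...",                     # Anthropic API key
    chat_model="claude-3-5-sonnet-20241022",  # model used for chat completions
)
```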
AnthropicAI Interface
AnthropicAI is the LLM used by agents and tools to generate responses to messages.
director.llm.anthropic.AnthropicAI
Bases: BaseLLM
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`config` | `AnthropicAIConfig` | AnthropicAI Config | `None` |
Source code in backend/director/llm/anthropic.py
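Based on the parameter table above, the client can be built with the default configuration or with an explicit one; a minimal sketch:

```python
# Sketch only: inferred from the parameter table (config defaults to None,
# so a default AnthropicAIConfig is presumably created internally).
from director.llm.anthropic import AnthropicAI, AnthropicAIConfig

llm = AnthropicAI()                                   # default configuration
llm_custom = AnthropicAI(config=AnthropicAIConfig())  # explicit configuration
```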
_format_tools
Format the tools into the structure that the Anthropic API expects.
Example:

[
    {
        "name": "get_weather",
        "description": "Get the current weather in a given location",
        "input_schema": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                }
            },
            "required": ["location"],
        },
    }
]
Source code in backend/director/llm/anthropic.py
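The following is an illustrative sketch, not the actual implementation, of the kind of mapping `_format_tools` performs; it assumes each incoming tool carries `name`, `description`, and a JSON-schema `parameters` field:

```python
def format_tools_sketch(tools: list[dict]) -> list[dict]:
    # Illustrative only: assumes each tool has "name", "description", and a
    # JSON-schema "parameters" field; the real _format_tools may differ.
    formatted = []
    for tool in tools:
        formatted.append(
            {
                "name": tool["name"],
                "description": tool["description"],
                # Anthropic expects the JSON schema under "input_schema"
                "input_schema": tool.get(
                    "parameters", {"type": "object", "properties": {}}
                ),
            }
        )
    return formatted
```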
chat_completions
Get completions for chat.
Tool-use docs: https://docs.anthropic.com/en/docs/build-with-claude/tool-use
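For context, the linked tool-use docs describe requests like the one below against the Anthropic SDK; `chat_completions` wraps this kind of call (its exact signature is not shown on this page, so the wrapper itself is not used here):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[
        {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["location"],
            },
        }
    ],
    messages=[{"role": "user", "content": "What is the weather in San Francisco, CA?"}],
)
print(response.content)  # may include a tool_use block requesting get_weather
```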