Feat/add litellm provider #91

Open
RheagalFire wants to merge 3 commits into SkyworkAI:main from RheagalFire:feat/add-litellm-provider

Conversation

@RheagalFire

Summary

  • Adds LiteLLM as an LLM provider, enabling access to 100+ LLM providers (OpenAI, Anthropic, Google, Azure, Bedrock, Ollama, etc.) through a single unified interface
  • New ChatLiteLLM client calls litellm.acompletion() directly and follows the same Pydantic BaseModel + async __call__ pattern as ChatOpenAI, ChatOpenRouter, ChatAnthropic, and ChatGoogle
  • No proxy server required; the litellm SDK is used directly
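The client shape described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: field names beyond `model_id`/`api_key` are assumed, and a dataclass stands in for the Pydantic BaseModel so the sketch runs without extra dependencies.

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class ChatLiteLLM:
    # The real client is a Pydantic BaseModel; a dataclass stands in here.
    model_id: str
    api_key: Optional[str] = None
    temperature: float = 0.7

    async def __call__(self, messages: list, **kwargs: Any) -> Any:
        try:
            # Lazy import, per the PR: the dependency is only needed at call time.
            import litellm
        except ImportError as exc:
            raise ImportError(
                "litellm is not installed; install it with `pip install litellm`"
            ) from exc
        return await litellm.acompletion(
            model=self.model_id,
            messages=messages,
            api_key=self.api_key,
            temperature=self.temperature,
            drop_params=True,  # let litellm drop kwargs the target provider rejects
            **kwargs,
        )
```

Because `acompletion()` speaks the OpenAI chat-completions schema for every backend, the client needs no per-provider branching.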

Motivation

DeepResearchAgent currently supports OpenAI, OpenRouter, Anthropic, and Google as LLM providers, each with its own dedicated client class using raw provider SDKs. Adding LiteLLM gives users access to 100+
additional providers (Azure, Bedrock, Vertex, Groq, Together, Ollama, etc.) through the same pattern. Since every existing provider uses raw SDKs (not LangChain), ChatLiteLLM follows the same approach by
calling litellm.acompletion() directly.

Changes

  • src/model/litellm/__init__.py - module init
  • src/model/litellm/chat.py - ChatLiteLLM client class with:
    • litellm.acompletion() for async completion (lazy-imported)
    • drop_params=True for cross-provider kwargs compatibility
    • OpenAIChatSerializer for message/tool formatting (same as OpenAI/OpenRouter)
    • Full tool call + structured output support matching ChatOpenAI
  • src/model/manager.py - registered litellm provider:
    • Added ChatLiteLLM import
    • Added _initialize_litellm_models() (auto-registers when LITELLM_MODEL env var is set)
    • Updated _create_client() with litellm branch
    • Updated register_model() validation to accept "litellm"
  • requirements.txt - added litellm>=1.60.0,<2.0.0
  • tests/test_litellm.py - unit tests covering instantiation, properties, drop_params=True, tool calls, import error handling, manager registration

Tests

1. Code verification (AST parse):
$ python -c "import ast; ..."
Classes: ['ChatLiteLLM']
Async methods: ['call', '_format_response']
PASS: AST parse valid, all required methods and patterns present
PASS: manager.py correctly imports and registers litellm

2. Lint (ruff):
$ ruff check src/model/litellm/
All checks passed!

$ ruff format --check src/model/litellm/
2 files already formatted

3. Tests (mocked litellm):

  • test_call_dispatches_to_litellm_acompletion - verifies model, drop_params, api_key forwarded
  • test_call_includes_drop_params_true - ensures cross-provider compat default
  • test_call_handles_tool_calls - verifies function calling response parsing
  • test_call_returns_failure_on_import_error - graceful ImportError when litellm not installed
  • test_litellm_in_allowed_providers - manager accepts litellm provider
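The mocked-test approach can be sketched as below. Names here are hypothetical stand-ins; the real tests live in tests/test_litellm.py. The key idea is stubbing the `litellm` module in `sys.modules` so assertions on forwarded kwargs run without the real SDK installed:

```python
import asyncio
import sys
import types
from unittest.mock import AsyncMock

# Stub litellm module so the test runs even when the real SDK is absent.
fake_litellm = types.ModuleType("litellm")
fake_litellm.acompletion = AsyncMock(return_value={"choices": []})
sys.modules["litellm"] = fake_litellm


async def call_through_litellm(model: str, messages: list) -> dict:
    # Minimal stand-in for the client call path under test (hypothetical).
    import litellm

    return await litellm.acompletion(model=model, messages=messages, drop_params=True)


result = asyncio.run(
    call_through_litellm(
        "anthropic/claude-sonnet-4-20250514",
        [{"role": "user", "content": "hi"}],
    )
)
fake_litellm.acompletion.assert_awaited_once()
assert fake_litellm.acompletion.await_args.kwargs["drop_params"] is True
```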

Example usage

# Option 1: Auto-initialize via environment variables
export LITELLM_API_KEY=your-key
export LITELLM_MODEL=anthropic/claude-sonnet-4-20250514
# Models are auto-registered on manager.initialize()

# Option 2: Register programmatically
from src.model.types import ModelConfig

config = ModelConfig(
    model_name="litellm/anthropic/claude-sonnet-4-20250514",
    model_id="anthropic/claude-sonnet-4-20250514",
    model_type="chat/completions",
    provider="litellm",
    api_key="your-anthropic-key",  # or set ANTHROPIC_API_KEY env var
    temperature=0.7,
    max_completion_tokens=16384,
)
await model_manager.register_model(config)
response = await model_manager(
    "litellm/anthropic/claude-sonnet-4-20250514",
    messages=[...]
)

See https://docs.litellm.ai/docs/providers for 100+ supported model strings.

Risk / Compatibility

  • Additive only; existing providers (OpenAI, OpenRouter, Anthropic, Google) are untouched
  • litellm is lazy-imported and fails gracefully with a clear error if not installed
  • drop_params=True silently drops provider-unsupported kwargs (e.g. seed on Anthropic)
  • Reuses OpenAIChatSerializer for message formatting, same as the OpenAI and OpenRouter providers
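The drop_params=True behavior amounts to filtering kwargs against each provider's supported parameter set. litellm implements this internally; the sketch below only illustrates the semantics, and the parameter set shown is made up, not Anthropic's real one:

```python
def drop_unsupported_params(kwargs: dict, supported: frozenset) -> dict:
    """Illustration of drop_params semantics: unsupported kwargs are
    silently filtered instead of raising a provider error."""
    return {k: v for k, v in kwargs.items() if k in supported}


# Illustrative (not exhaustive) parameter set for an Anthropic-style provider.
ANTHROPIC_PARAMS = frozenset({"temperature", "max_tokens", "top_p"})

call_kwargs = {"temperature": 0.7, "seed": 42}  # `seed` is OpenAI-specific
filtered = drop_unsupported_params(call_kwargs, ANTHROPIC_PARAMS)
```

The trade-off: requests never fail on an unknown kwarg, but a silently dropped parameter (like `seed` above) may surprise callers expecting it to take effect.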
