Introduce Kaapi-Abstracted LLM Parameters to Support Multiple Providers #499

@avirajsingh7

Description

Currently, the LLM call endpoint accepts OpenAI-specific parameters and forwards them to OpenAI as-is.

We should introduce Kaapi-abstracted LLM parameters that provide a unified interface across multiple LLM providers. Kaapi will be responsible for mapping these abstracted parameters to provider-specific parameters in the backend.

This will significantly simplify the user experience when switching between models and providers.
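A minimal sketch of what such a mapping layer could look like. All names here (`KaapiLLMParams`, `to_openai`, `to_anthropic`) are hypothetical and only illustrate the idea of one abstracted parameter set translated to per-provider parameter names:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class KaapiLLMParams:
    """Provider-agnostic generation parameters (illustrative names)."""
    temperature: Optional[float] = None
    max_output_tokens: Optional[int] = None
    top_p: Optional[float] = None


def _drop_unset(mapping: dict) -> dict:
    """Omit parameters the caller did not set, so provider defaults apply."""
    return {k: v for k, v in mapping.items() if v is not None}


def to_openai(params: KaapiLLMParams) -> dict:
    """Map abstracted params to OpenAI-style parameter names."""
    return _drop_unset({
        "temperature": params.temperature,
        "max_tokens": params.max_output_tokens,
        "top_p": params.top_p,
    })


def to_anthropic(params: KaapiLLMParams) -> dict:
    """Map the same abstracted params to Anthropic-style parameter names."""
    return _drop_unset({
        "temperature": params.temperature,
        "max_tokens": params.max_output_tokens,
        "top_p": params.top_p,
    })


params = KaapiLLMParams(temperature=0.2, max_output_tokens=512)
print(to_openai(params))  # {'temperature': 0.2, 'max_tokens': 512}
```

With this shape, switching providers means swapping the mapper, not rewriting the request payload; the endpoint can keep accepting one stable set of Kaapi parameters.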

Metadata

Labels

enhancement (New feature or request)

Status

Closed

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
