Unified API: Add support for Kaapi Abstracted LLM Call #498
Merged
10 commits (all by avirajsingh7):

- 6980bbe: Add Kaapi LLM parameters and completion config; implement transformat…
- 2847d50: Refine LLM API documentation and improve code formatting for clarity;…
- 63a3bce: add/fix tests
- b3393d8: Fix validation logic in map_kaapi_to_openai_params to prevent simulta…
- f248495: Remove default value for 'model' in KaapiLLMParams to enforce explici…
- 0840c51: Refactor KaapiLLMParams to enforce explicit reasoning levels; update …
- 0a21c4e: Enhance LLM API documentation to clarify ad-hoc configuration paramet…
- f5ab685: Refactor execute_job to use completion_config directly instead of con…
- 1399f22: Refactor LLM provider interfaces to use NativeCompletionConfig instea…
- 78c20ad: precommit
New file (+94 lines):

```python
"""Parameter mappers for converting Kaapi-abstracted parameters to provider-specific formats."""

import litellm

from app.models.llm import KaapiCompletionConfig, KaapiLLMParams, NativeCompletionConfig


def map_kaapi_to_openai_params(kaapi_params: KaapiLLMParams) -> tuple[dict, list[str]]:
    """Map Kaapi-abstracted parameters to OpenAI API parameters.

    This mapper transforms standardized Kaapi parameters into the OpenAI-specific
    parameter format, enabling a provider-agnostic interface design.

    Args:
        kaapi_params: KaapiLLMParams instance with standardized parameters

    Supported mapping:
        - model → model
        - instructions → instructions
        - knowledge_base_ids → tools[file_search].vector_store_ids
        - max_num_results → tools[file_search].max_num_results (fallback default)
        - reasoning → reasoning.effort (suppressed if the model does not support reasoning)
        - temperature → temperature (suppressed if the model supports reasoning)

    Returns:
        Tuple of:
            - Dictionary of OpenAI API parameters ready to be passed to the API
            - List of warnings describing suppressed or ignored parameters
    """
    openai_params = {}
    warnings = []

    supports_reasoning = litellm.supports_reasoning(model=f"openai/{kaapi_params.model}")

    # Handle reasoning vs. temperature mutual exclusivity.
    if supports_reasoning:
        if kaapi_params.reasoning is not None:
            openai_params["reasoning"] = {"effort": kaapi_params.reasoning}

        if kaapi_params.temperature is not None:
            warnings.append(
                "Parameter 'temperature' was suppressed because the selected model "
                "supports reasoning, and temperature is ignored when reasoning is enabled."
            )
    else:
        if kaapi_params.reasoning is not None:
            warnings.append(
                "Parameter 'reasoning' was suppressed because the selected model "
                "does not support reasoning."
            )

        if kaapi_params.temperature is not None:
            openai_params["temperature"] = kaapi_params.temperature

    if kaapi_params.model:
        openai_params["model"] = kaapi_params.model

    if kaapi_params.instructions:
        openai_params["instructions"] = kaapi_params.instructions

    if kaapi_params.knowledge_base_ids:
        openai_params["tools"] = [
            {
                "type": "file_search",
                "vector_store_ids": kaapi_params.knowledge_base_ids,
                "max_num_results": kaapi_params.max_num_results or 20,
            }
        ]

    return openai_params, warnings


def transform_kaapi_config_to_native(
    kaapi_config: KaapiCompletionConfig,
) -> tuple[NativeCompletionConfig, list[str]]:
    """Transform a Kaapi completion config into a native provider config with mapped parameters.

    Currently supports OpenAI. Future: Claude and Gemini mappers.

    Args:
        kaapi_config: KaapiCompletionConfig with abstracted parameters

    Returns:
        NativeCompletionConfig with provider-native parameters ready for the API
    """
    if kaapi_config.provider == "openai":
        mapped_params, warnings = map_kaapi_to_openai_params(kaapi_config.params)
        return (
            NativeCompletionConfig(provider="openai-native", params=mapped_params),
            warnings,
        )

    raise ValueError(f"Unsupported provider: {kaapi_config.provider}")
```
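The mapping behavior above can be exercised with a minimal stand-alone sketch. Note the caveats: the `KaapiLLMParams` stand-in below is a plain dataclass with fields inferred from this diff (the real model lives in `app.models.llm`), and `litellm.supports_reasoning` is replaced by a hypothetical stub so the example runs without litellm installed; the stub's "o-series models reason" rule is an illustrative assumption, not litellm's actual capability table.

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in for app.models.llm.KaapiLLMParams; fields inferred from the diff.
@dataclass
class KaapiLLMParams:
    model: str
    instructions: Optional[str] = None
    temperature: Optional[float] = None
    reasoning: Optional[str] = None  # e.g. "low" / "medium" / "high"
    knowledge_base_ids: Optional[list] = None
    max_num_results: Optional[int] = None

def supports_reasoning(model: str) -> bool:
    # Stub for litellm.supports_reasoning: pretend only "o"-series models reason.
    return model.removeprefix("openai/").startswith("o")

def map_kaapi_to_openai_params(p: KaapiLLMParams) -> tuple[dict, list[str]]:
    openai_params, warnings = {}, []
    if supports_reasoning(f"openai/{p.model}"):
        # Reasoning model: forward reasoning effort, suppress temperature.
        if p.reasoning is not None:
            openai_params["reasoning"] = {"effort": p.reasoning}
        if p.temperature is not None:
            warnings.append("Parameter 'temperature' was suppressed.")
    else:
        # Non-reasoning model: forward temperature, suppress reasoning.
        if p.reasoning is not None:
            warnings.append("Parameter 'reasoning' was suppressed.")
        if p.temperature is not None:
            openai_params["temperature"] = p.temperature
    if p.model:
        openai_params["model"] = p.model
    if p.knowledge_base_ids:
        # Knowledge bases become a file_search tool, with a default result cap.
        openai_params["tools"] = [{
            "type": "file_search",
            "vector_store_ids": p.knowledge_base_ids,
            "max_num_results": p.max_num_results or 20,
        }]
    return openai_params, warnings

# A reasoning-capable model: temperature is dropped with a warning,
# and knowledge bases are mapped to a file_search tool.
params, warns = map_kaapi_to_openai_params(
    KaapiLLMParams(model="o4-mini", reasoning="high", temperature=0.7,
                   knowledge_base_ids=["vs_123"])
)
print(params)  # {'reasoning': {'effort': 'high'}, 'model': 'o4-mini', 'tools': [...]}
print(warns)   # ["Parameter 'temperature' was suppressed."]
```

This illustrates the design choice in the PR: rather than erroring when `temperature` and `reasoning` collide, the mapper silently drops the inapplicable parameter and reports it in the returned warnings list, so callers can surface the suppression to users.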