
feat: BYOK enhancements #153

Open
kah-seng wants to merge 12 commits into staging from feat/byok

Conversation

@kah-seng
Member

Key Changes:

  • Added temperature, parallel tool calls, and store as configurable parameters for BYOK custom models
  • Added tooltips
  • Frontend prevents adding two custom models with the same name or slug

Still need to fix a bad request error for certain DeepSeek requests.
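For reviewers, here is a minimal sketch of how the three new optional parameters could be applied when building an OpenAI-compatible request body. The struct and function names are illustrative, not the PR's actual types; the real mapping lives in internal/services/toolkit/client/utils_v2.go.

```go
package main

import "fmt"

// CustomModel mirrors the new BYOK fields added in this PR; the exact
// struct and field names here are illustrative, not the PR's actual types.
type CustomModel struct {
	Slug              string
	Temperature       *float64 // nil means "use provider default"
	ParallelToolCalls *bool
	Store             *bool
}

// buildChatParams applies the optional BYOK fields to an OpenAI-compatible
// request body, omitting any field the user left unset so the provider's
// defaults still apply.
func buildChatParams(m CustomModel) map[string]any {
	params := map[string]any{"model": m.Slug}
	if m.Temperature != nil {
		params["temperature"] = *m.Temperature
	}
	if m.ParallelToolCalls != nil {
		params["parallel_tool_calls"] = *m.ParallelToolCalls
	}
	if m.Store != nil {
		params["store"] = *m.Store
	}
	return params
}

func main() {
	temp := 0.2
	store := false
	p := buildChatParams(CustomModel{Slug: "deepseek-chat", Temperature: &temp, Store: &store})
	fmt.Println(p["model"], p["temperature"], p["store"])
}
```

Using pointers (or proto optional fields) rather than zero values matters here: a user who never set temperature should get the provider default, not 0.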

@kah-seng kah-seng requested a review from 4ndrelim April 10, 2026 13:28
@kah-seng kah-seng self-assigned this Apr 10, 2026
Copilot AI review requested due to automatic review settings April 10, 2026 13:28
@kah-seng kah-seng added the enhancement New feature or request label Apr 10, 2026
Contributor

Copilot AI left a comment


Pull request overview

Adds BYOK (Bring Your Own Key) enhancements across the webapp, proto API, and backend LLM client so custom models can expose additional OpenAI-compatible parameters and be selected more reliably.

Changes:

  • Extends custom model settings (temperature, parallel tool calls, store) and updates the settings UI with validation, sorting, and tooltips.
  • Adds custom model identifiers to supported-model listings and includes custom_model_id in the chat streaming request path.
  • Updates backend model mapping/storage and OpenAI request parameter construction to use the new custom-model fields.

Reviewed changes

Copilot reviewed 11 out of 11 changed files in this pull request and generated 5 comments.

| File | Description |
| --- | --- |
| webapp/_webapp/src/views/settings/sections/api-key-settings.tsx | UI/UX for custom model CRUD; adds new BYOK fields, tooltips, validation, duplicate prevention |
| webapp/_webapp/src/views/chat/footer/toolbar/selection.tsx | Extends selection item metadata to carry custom model identity flags |
| webapp/_webapp/src/views/chat/footer/toolbar/model-selection.tsx | Selects custom models by ID (when available) to disambiguate custom entries |
| webapp/_webapp/src/utils/stream-request-builder.ts | Adds customModelId to the stream request builder params/payload |
| webapp/_webapp/src/stores/setting-store.ts | Initializes settings defaults to include customModels |
| webapp/_webapp/src/stores/conversation/conversation-ui-store.ts | Persists lastUsedCustomModelId in UI store |
| webapp/_webapp/src/pkg/gen/apiclient/user/v1/user_pb.ts | Regenerated TS client types for added custom model fields |
| webapp/_webapp/src/pkg/gen/apiclient/chat/v2/chat_pb.ts | Regenerated TS client types for supported model IDs + stream request customModelId |
| webapp/_webapp/src/hooks/useSendMessageStream.ts | Sends customModelId with message stream requests |
| webapp/_webapp/src/hooks/useLanguageModels.ts | Tracks/uses lastUsedCustomModelId for selecting custom models |
| proto/user/v1/user.proto | Adds custom model fields (temperature, parallel tool calls, store) |
| proto/chat/v2/chat.proto | Adds supported model id + stream request custom_model_id |
| pkg/gen/api/user/v1/user.pb.go | Regenerated Go proto for custom model fields |
| pkg/gen/api/chat/v2/chat.pb.go | Regenerated Go proto for supported model IDs + stream request custom_model_id |
| internal/services/toolkit/client/utils_v2.go | Uses custom model fields when building OpenAI chat completion params |
| internal/services/toolkit/client/get_conversation_title_v2.go | Plumbs custom model through title generation path |
| internal/services/toolkit/client/get_citation_keys.go | Updates call signature for ChatCompletionV2 with custom model param |
| internal/services/toolkit/client/completion_v2.go | Updates ChatCompletionV2/StreamV2 signatures to accept custom model |
| internal/models/user.go | Persists new custom model fields in MongoDB model struct |
| internal/api/mapper/user.go | Maps new custom model fields between proto and internal models |
| internal/api/chat/list_supported_models_v2.go | Includes custom model ID in supported models response |
| internal/api/chat/create_conversation_message_stream_v2.go | Resolves custom models by custom_model_id and passes through to LLM client |
Comments suppressed due to low confidence (1)

internal/api/chat/create_conversation_message_stream_v2.go:300

  • Custom models are now only resolved when custom_model_id is present. If the client omits custom_model_id (e.g., older clients, or loading an existing conversation where the UI store has no matching lastUsedCustomModelId), a custom model slug will be treated as a built-in model and the BYOK endpoint/API key won’t be applied. Consider keeping a backward-compatible fallback: when custom_model_id is empty, attempt to resolve settings.CustomModels by modelSlug (or persist custom model ID in the conversation state and always send it).
	customModelID := req.GetCustomModelId()
	if customModelID != "" {
		for i := range settings.CustomModels {
			if settings.CustomModels[i].Id.Hex() == customModelID {
				customModel = &settings.CustomModels[i]
				break
			}
		}
		if customModel == nil {
			return s.sendStreamError(stream, fmt.Errorf("custom model not found: %q", customModelID))
		}
		modelSlug = customModel.Slug
	}

	if customModel == nil {


