fix(opencode): merge system prompts for non-anthropic providers#23656

Closed
zhangdw156 wants to merge 1 commit into anomalyco:dev from zhangdw156:fix/system-message-merge-vllm
Conversation

@zhangdw156

Issue for this PR

Closes #15059

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

It collapses plugin-added system prompts into a single system message for non-OpenAI-OAuth, non-workflow requests.

This keeps the existing Anthropic-compatible `/messages` behavior untouched, while preventing vLLM-served Qwen models from failing with `System message must be at the beginning.` when plugins append extra system context.
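A minimal sketch of the merge behavior described above (the types and the `mergeSystemMessages` helper are illustrative, not opencode's actual API): all system messages are joined into one leading system message, so providers that require the system prompt to appear exactly once at the start accept the request.

```typescript
// Illustrative message shape; opencode's internal types may differ.
type Message = {
  role: "system" | "user" | "assistant";
  content: string;
};

// Collapse every system message into a single leading system message,
// preserving the relative order of all non-system messages.
function mergeSystemMessages(messages: Message[]): Message[] {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");
  if (system.length <= 1) return messages;
  const merged: Message = {
    role: "system",
    content: system.map((m) => m.content).join("\n\n"),
  };
  return [merged, ...rest];
}

// Example: a plugin appended a second system message mid-conversation.
const out = mergeSystemMessages([
  { role: "system", content: "You are a coding assistant." },
  { role: "user", content: "hi" },
  { role: "system", content: "Plugin context: repo uses Bun." },
]);
// out now starts with one merged system message, followed by the user turn.
```

The key design point is that the merge is skipped for providers whose APIs already accept multiple system blocks (the Anthropic-compatible path), so only strict providers such as vLLM-served Qwen see the collapsed form.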

It also adds a regression test covering experimental.chat.system.transform with an Alibaba/Qwen fixture.

Related: #16560, #20785, previous attempts #15018 and #16981.

How did you verify your code works?

  • Added a regression test and verified the red/green cycle locally:
  • ~/.bun/bin/bun test packages/opencode/test/session/llm.test.ts
  • PATH=$HOME/.bun/bin:$PATH ~/.bun/bin/bun turbo typecheck --filter=opencode

Screenshots / recordings

n/a

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

@zhangdw156 zhangdw156 closed this Apr 21, 2026
@zhangdw156 zhangdw156 deleted the fix/system-message-merge-vllm branch April 21, 2026 09:14


Development

Successfully merging this pull request may close these issues.

Multiple system prompts break Qwen3.5-* models

1 participant