feat(opencode): show model provider quota in prompt metrics #24826
sotero wants to merge 7 commits
Conversation
Hey! Your PR title does not follow our naming convention. Please update it to start with one of the approved prefixes. See CONTRIBUTING.md for details.
The following comment was made by an LLM; it may be inaccurate: Based on my searches, I found one PR that appears potentially related.
Related PR found: #16337. However, #16337 appears to be an older architectural change for providing footer space, while the current PR (#24826) specifically implements Codex quota display using that infrastructure.
None of the other tangentially related PRs appear to be direct duplicates of PR #24826's specific implementation of showing Codex quota in prompt metrics.
Thanks for updating your PR! It now meets our contributing guidelines. 👍 |
Closing this for now while I refine the quota meter presentation and screenshot evidence. I will submit a cleaner PR once that is sorted out. |
Issue for this PR
Closes #
Type of change
What does this PR do?
Adds a prompt-metrics surface for model/provider quota information. This PR wires up Codex quota first: it reads the existing OpenAI OAuth session, fetches a best-effort five-hour/weekly quota snapshot through a non-blocking experimental console endpoint, and renders it below the chat box in the existing prompt metrics row.
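The fetch-and-render flow described above could be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: the type names, payload field names, and `formatQuota` helper are all assumptions, and only the `/backend-api/wham/usage` endpoint path comes from the description.

```typescript
// Hypothetical sketch of the best-effort quota fetch; names are
// illustrative, not opencode's actual API.
interface QuotaSnapshot {
  fiveHourUsedPercent: number;
  weeklyUsedPercent: number;
}

// Any failure yields null so the prompt-metrics row simply omits the
// quota segment instead of blocking or erroring the TUI.
async function fetchCodexQuota(accessToken: string): Promise<QuotaSnapshot | null> {
  try {
    const res = await fetch("https://chatgpt.com/backend-api/wham/usage", {
      headers: { Authorization: `Bearer ${accessToken}` },
      signal: AbortSignal.timeout(2000), // never stall the prompt
    });
    if (!res.ok) return null;
    const body: any = await res.json();
    // Field names below are assumptions about the private payload shape.
    return {
      fiveHourUsedPercent: body.five_hour?.used_percent ?? 0,
      weeklyUsedPercent: body.weekly?.used_percent ?? 0,
    };
  } catch {
    return null; // quota display is strictly best-effort
  }
}

// Render a compact segment for the existing prompt-metrics row.
function formatQuota(q: QuotaSnapshot | null): string {
  if (q === null) return "";
  return `5h ${q.fiveHourUsedPercent}% · wk ${q.weeklyUsedPercent}%`;
}
```

Because the fetch is wrapped in a timeout and a catch-all, the chat box renders immediately and the quota segment appears only when the endpoint answers in time.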
This keeps the global footer/status/version area unchanged. The Codex usage source is ChatGPT's private `/backend-api/wham/usage` endpoint, so the quota data is intentionally best-effort and isolated under the experimental console surface.
How did you verify your code works?
bun test test/plugin/codex.test.ts test/cli/tui/sync-provider.test.tsx test/cli/cmd/tui/prompt-metrics.test.ts
bun typecheck
OPENCODE_VERSION=1.14.28 bun run build --single --skip-install --skip-embed-web-ui
bun turbo typecheck
Screenshots / recordings
Temporarily removed while screenshots are sanitized.
Checklist