llm call: polling llm db endpoint #726
base: main
Changes from all commits:

- edeb059
- 2861753
- cb51ef7
- 20f96cb
- 05e9f57
- 5372913
New endpoint documentation (new file):

```diff
@@ -0,0 +1,10 @@
+Retrieve the status and results of an LLM call job by job ID.
+
+This endpoint allows you to poll for the status and results of an asynchronous LLM call job that was previously initiated via the POST `/llm/call` endpoint.
+
+### Notes
+
+- This endpoint returns both the job status AND the actual LLM response when complete
+- LLM responses are also delivered asynchronously via the callback URL (if provided)
+- Jobs can be queried at any time after creation
```
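The polling workflow documented above can be sketched as a small client loop. This is a hypothetical sketch, not project code: the injected `fetch` callable stands in for one HTTP GET against the polling endpoint (the exact route path is not shown in this PR excerpt), and the terminal status names are assumptions to be matched against the real `JobStatus` enum.

```python
import time
from typing import Callable

def poll_llm_job(
    job_id: str,
    fetch: Callable[[str], dict],
    interval: float = 0.5,
    timeout: float = 60.0,
) -> dict:
    """Poll until the job reaches a terminal status or the timeout expires.

    `fetch` performs one GET against the polling endpoint and returns the
    decoded JSON payload; injecting it keeps this loop testable offline.
    """
    deadline = time.monotonic() + timeout
    while True:
        payload = fetch(job_id)
        # Terminal status names are an assumption; align with the real JobStatus enum.
        if payload["status"] in ("completed", "failed"):
            return payload
        if time.monotonic() >= deadline:
            raise TimeoutError(f"job {job_id} still {payload['status']} after {timeout}s")
        time.sleep(interval)
```

Injecting `fetch` rather than hard-coding an HTTP client makes the backoff logic trivially unit-testable with a stub.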
Response model changes:

```diff
@@ -3,6 +3,9 @@
 This module contains structured response models for LLM API calls.
 """
+from datetime import datetime
+from uuid import UUID
+
 from sqlmodel import SQLModel, Field
 from typing import Literal, Annotated
 from app.models.llm.request import AudioContent, TextContent

@@ -100,3 +103,26 @@ class IntermediateChainResponse(SQLModel):
         default=None,
         description="Unmodified raw response from the LLM provider from the current block",
     )
+
+
+# Job response models
+class LLMJobBasePublic(SQLModel):
+    """Base response model for LLM job information."""
+
+    job_id: UUID
+    status: str  # JobStatus from job.py
+
+
+class LLMJobImmediatePublic(LLMJobBasePublic):
+    """Immediate response after creating an LLM job."""
+
+    message: str
+    job_inserted_at: datetime
+    job_updated_at: datetime
+
+
+class LLMJobPublic(LLMJobBasePublic):
+    """Full job response with nested LLM response when complete."""
+
+    llm_response: LLMCallResponse | None = None
+    error_message: str | None = None
```
**Review comment on lines +124 to +128**

Keep the polling schema aligned with the route payload. The new GET handler already builds …

Suggested fix:

```diff
 class LLMJobPublic(LLMJobBasePublic):
     """Full job response with nested LLM response when complete."""
+    job_inserted_at: datetime
+    job_updated_at: datetime
     llm_response: LLMCallResponse | None = None
     error_message: str | None = None
```
**Review comment**

Authorize `job_id` before returning status.

This lookup is scoped only by raw UUID, and `_current_user` is otherwise unused here. Any caller with project access who learns a job ID can poll another tenant's job, and this endpoint will also return non-LLM jobs unless you enforce ownership and expected job type before responding.
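The ownership check the reviewer asks for can be sketched framework-agnostically. All names below (`Job`, `owner_id`, `job_type`) are hypothetical stand-ins, not the project's models; the point is that missing, foreign-tenant, and non-LLM jobs should all fail identically before any status is returned, so job IDs are not enumerable across tenants.

```python
from dataclasses import dataclass
from typing import Optional
from uuid import UUID, uuid4

# Hypothetical shapes; field names are assumptions, not the project's real models.
@dataclass
class Job:
    id: UUID
    owner_id: UUID
    job_type: str  # e.g. "llm_call"

class JobAccessError(Exception):
    pass

def authorize_llm_job(job: Optional[Job], current_user_id: UUID) -> Job:
    """Enforce ownership and expected job type before returning job status.

    Raising the same "not found" error for missing, foreign, and non-LLM
    jobs avoids leaking which job IDs exist to other tenants.
    """
    if job is None or job.owner_id != current_user_id or job.job_type != "llm_call":
        raise JobAccessError("job not found")
    return job
```

In a FastAPI route this check would sit between the database lookup and the response, with `current_user_id` taken from the `_current_user` dependency that is currently unused.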