use streaming API route to directly upload to backend without Next.js server load, use rich color toasts #8
📝 Walkthrough
Migrates document upload handling from a server action to a new streaming API endpoint. The API validates chat ownership and forwards requests to a Python backend with streaming support. Client-side upload logic switches to the new `/api/upload` route with `toast.promise`-based feedback.
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client as Browser Client
    participant API as API Endpoint<br/>/api/upload
    participant Auth as Auth Service
    participant DB as Database
    participant Backend as Python Backend<br/>/ingest

    Client->>API: POST /api/upload?chatId=...
    API->>Auth: Authenticate request
    Auth-->>API: User confirmed
    API->>DB: Verify chat exists & ownership
    DB-->>API: Chat details
    API->>Backend: Forward request body (streaming)
    Backend->>Backend: Process document
    Backend-->>API: JSON result
    API->>DB: Update chat documentCount<br/>& updatedAt
    DB-->>API: Confirmed
    API-->>Client: { success: true, result }
    Client->>Client: Update optimistic document<br/>uploading → false
    Client->>Client: router.refresh()
```
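For orientation, the flow in the diagram can be sketched as a minimal route handler. This is a sketch, not the PR's actual code: the auth service and Prisma access are stubbed in memory, `BACKEND_API_BASE` is an assumed name, and only the web-standard `Request`/`Response` APIs are used.

```typescript
// Sketch of the /api/upload proxy from the sequence diagram.
// Auth and DB are stubbed in memory; names are assumptions, not the real code.
type Chat = { id: string; userId: string; documentCount: number };

const chats = new Map<string, Chat>([
  ["chat-1", { id: "chat-1", userId: "user-1", documentCount: 0 }],
]);
const BACKEND_API_BASE = "http://localhost:8000"; // assumed backend base URL

// Stand-in for the real auth service.
async function authenticate(_req: Request): Promise<string | null> {
  return "user-1";
}

export async function POST(request: Request): Promise<Response> {
  const userId = await authenticate(request);
  if (!userId) return Response.json({ error: "Unauthorized" }, { status: 401 });

  const chatId = new URL(request.url).searchParams.get("chatId");
  if (!chatId) return Response.json({ error: "Missing chatId" }, { status: 400 });

  const chat = chats.get(chatId);
  if (!chat || chat.userId !== userId) {
    return Response.json({ error: "Chat not found" }, { status: 404 });
  }

  try {
    // Forward the multipart body to the Python backend without buffering it.
    const backendResponse = await fetch(`${BACKEND_API_BASE}/ingest`, {
      method: "POST",
      body: request.body,
      headers: { "content-type": request.headers.get("content-type") ?? "" },
      duplex: "half", // required by Node's fetch when streaming a request body
    } as RequestInit);
    const result = await backendResponse.json();
    if (!backendResponse.ok) return Response.json(result, { status: 502 });

    chat.documentCount += 1; // the real route does this via Prisma, with updatedAt
    return Response.json({ success: true, result });
  } catch (err) {
    return Response.json(
      { error: "Backend request failed", detail: String(err) },
      { status: 502 },
    );
  }
}
```

The early-return checks mean the backend is never contacted for unauthenticated requests or chats the user does not own.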
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 4 | ❌ 1
❌ Failed checks (1 inconclusive)
✅ Passed checks (4 passed)
Pull request overview
This PR shifts document uploads from a Next.js Server Action to a dedicated App Router API route that streams the upload through to the Python backend, and updates toast styling/UX for richer feedback during messaging and uploads.
Changes:
- Removed the `uploadDocumentAction` Server Action and replaced uploads with a streaming `/api/upload` route.
- Updated the chat UI to use `toast.promise`, show upload progress (spinner), and improve error toasts.
- Enabled Sonner's `richColors` styling globally.
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| lib/actions/chat.ts | Removes the Server Action responsible for forwarding document uploads. |
| components/chat-area.tsx | Switches upload flow to /api/upload, adds upload progress state/UI, and improves toast error handling. |
| app/layout.tsx | Enables rich-colored toasts via <Toaster richColors />. |
| app/api/upload/route.ts | Adds an authenticated streaming upload proxy to the Python /ingest endpoint and updates Prisma document counts. |
Actionable comments posted: 3
🧹 Nitpick comments (1)
components/chat-area.tsx (1)
161-197: Consider parallel upload completion with a single refresh. Current flow uploads sequentially and refreshes per file. For multi-file batches, this increases total wait time and causes repeated page refresh work.
⚙️ Refactor sketch

```diff
-      for (const { file, optimisticDocument } of uploadQueue) {
+      const tasks = uploadQueue.map(({ file, optimisticDocument }) => {
         const formData = new FormData();
         formData.append("file", file);
         formData.append("chat_id", currentChat?.id || "default-chat");
-
-        try {
-          const uploadPromise = fetch(
+        const uploadPromise = fetch(
           `/api/upload?chatId=${encodeURIComponent(currentChat?.id || "default-chat")}`,
           { method: "POST", body: formData },
         ).then(async (response) => {
           if (!response.ok) {
             const err = await response.json().catch(() => ({}));
             throw new Error(err.error || "Upload failed");
           }
           return response.json();
         });

         toast.promise(uploadPromise, {
           loading: `Uploading "${file.name}"...`,
           success: () => {
             setDocuments((prev) =>
               prev.map((d) =>
                 d.id === optimisticDocument.id ? { ...d, uploading: false } : d,
               ),
             );
-            router.refresh();
             return `"${file.name}" uploaded successfully`;
           },
           error: (err) => {
             setDocuments((prev) =>
               prev.filter((d) => d.id !== optimisticDocument.id),
             );
             return `Failed to upload "${file.name}": ${err.message}`;
           },
         });
-
-        await uploadPromise;
-      } catch {
-        console.error("Failed to upload file");
-      }
-    }
+        return uploadPromise;
+      });
+
+      await Promise.allSettled(tasks);
+      router.refresh();
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/chat-area.tsx` around lines 161 - 197, The loop currently awaits each uploadPromise sequentially (uploadQueue, optimisticDocument, uploadPromise) and calls router.refresh() inside each toast success handler, causing serial uploads and repeated refreshes; refactor to start all fetches in parallel by mapping uploadQueue to an array of upload promises (each still using toast.promise and updating setDocuments on success/error for the specific optimisticDocument), then await Promise.allSettled on that array and call router.refresh() once after all uploads complete; ensure per-file success/error handlers still update documents (setDocuments) and only the final router.refresh() is performed after all promises settle.
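Stripped of the toast and state wiring, the pattern this refactor aims for is small enough to state on its own. A sketch under assumptions: `refresh` stands in for `router.refresh`, and each task wraps one fetch plus its `toast.promise` handlers.

```typescript
// Run all upload tasks in parallel, then refresh exactly once after they settle.
// `refresh` is a stand-in for router.refresh(); tasks wrap fetch + toast wiring.
async function uploadAll<T>(
  tasks: Array<() => Promise<T>>,
  refresh: () => void,
): Promise<PromiseSettledResult<T>[]> {
  // Start every task immediately rather than awaiting them one by one.
  const results = await Promise.allSettled(tasks.map((task) => task()));
  refresh(); // a single refresh regardless of how many uploads succeeded
  return results;
}
```

`Promise.allSettled` (rather than `Promise.all`) matters here: one failed upload must not prevent the other files' success handlers, or the final refresh, from running.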
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@app/api/upload/route.ts`:
- Around line 26-27: The handler reads chatId from
request.nextUrl.searchParams.get("chatId") but trusts client-sent multipart
field chat_id, allowing ingestion into a different chat; instead, after
extracting server-validated chatId, ensure you override or replace any
client-supplied chat_id in the parsed multipart body with that server value (or
reject if they differ) before calling the ingestion logic. Locate where you
parse the multipart form and where ingestion is invoked (references: chatId,
chat_id, and the upload route handler) and either inject the validated chatId
into the form/body object used downstream or validate and throw on mismatch so
only the server-validated chatId is used. Ensure this change is applied to all
code paths (the initial query handling at
request.nextUrl.searchParams.get("chatId") and the multipart handling around
lines 47-58).
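As a sketch of that first finding (the helper name is hypothetical, and it assumes the route parses the form before forwarding):

```typescript
// Force the server-validated chatId into the multipart body, rejecting on
// mismatch. A sketch: the real route may instead pass chat_id via the query
// string to the backend and ignore the form field entirely.
function enforceChatId(form: FormData, validatedChatId: string): FormData {
  const clientChatId = form.get("chat_id");
  if (typeof clientChatId === "string" && clientChatId !== validatedChatId) {
    throw new Error(
      `chat_id mismatch: got ${clientChatId}, expected ${validatedChatId}`,
    );
  }
  form.set("chat_id", validatedChatId); // also covers the missing-field case
  return form;
}
```

Note the trade-off: parsing the form buffers the upload in the Next.js server, which is what this PR set out to avoid. The later Copilot review notes the FastAPI endpoint was changed to read `chat_id` from the query string, which keeps the body untouched and achieves the same guarantee.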
- Around line 50-59: Wrap the outbound fetch to `${BACKEND_API_BASE}/ingest` and
the subsequent response parsing in a try/catch so network/parse exceptions are
caught and converted into a deterministic JSON error response (e.g., status
502). Specifically, around the fetch that assigns backendResponse and wherever
you call backendResponse.json() (and the similar block at the other occurrence),
catch any thrown errors and return a controlled NextResponse.json({ error:
"...", detail: error.message }) with status 502; also handle non-ok
backendResponse statuses by reading the body safely (text or json) and returning
that payload with an appropriate 502/propagated status instead of letting
exceptions escape. Ensure the error path includes contextual info (method/URL
via BACKEND_API_BASE and that the request.body was proxied) for easier
debugging.
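Under the same assumptions (the `error`/`detail` payload shape and the function names here come from the review text, not from the actual route code), that second finding might be realized as:

```typescript
// Read a backend error body safely: it may be JSON, or it may be plain text.
async function safeErrorPayload(
  res: Response,
): Promise<{ error: string; detail?: string }> {
  const text = await res.text(); // read once; the body may not be JSON
  try {
    const parsed = JSON.parse(text);
    if (parsed && typeof parsed === "object") {
      return { error: parsed.error ?? "Backend error", detail: parsed.detail };
    }
  } catch {
    // fall through to the plain-text case
  }
  return { error: "Backend error", detail: text };
}

// Wrap the outbound call so network/parse failures become a deterministic 502.
async function forwardToIngest(
  backendBase: string,
  body: BodyInit,
  contentType: string,
): Promise<Response> {
  try {
    const backendResponse = await fetch(`${backendBase}/ingest`, {
      method: "POST",
      body,
      headers: { "content-type": contentType },
    });
    if (!backendResponse.ok) {
      return Response.json(await safeErrorPayload(backendResponse), { status: 502 });
    }
    return Response.json(await backendResponse.json());
  } catch (err) {
    // Include method/URL context for debugging, as the prompt asks.
    return Response.json(
      {
        error: `POST ${backendBase}/ingest failed`,
        detail: err instanceof Error ? err.message : String(err),
      },
      { status: 502 },
    );
  }
}
```

Reading the body with `text()` first means a non-JSON error page from the backend still surfaces as a structured `detail` instead of throwing inside the handler.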
In `@components/chat-area.tsx`:
- Around line 171-174: The current response error handling in the upload flow
(the block checking response.ok in components/chat-area.tsx) only surfaces
err.error and discards err.detail; update the throw to include backend detail by
reading both err.error and err.detail (e.g., build a message like `${err.error
|| 'Upload failed'}${err.detail ? ': ' + err.detail : ''}`) so the toast shows
the backend failure details; keep the existing fallback to "Upload failed" when
neither field exists and preserve the try/catch around response.json() that
defaults to {}.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 62f71d07-0976-46a1-9e71-cda4117ab9bb
📒 Files selected for processing (4)
- app/api/upload/route.ts
- app/layout.tsx
- components/chat-area.tsx
- lib/actions/chat.ts
💤 Files with no reviewable changes (1)
- lib/actions/chat.ts
```typescript
if (!response.ok) {
  const err = await response.json().catch(() => ({}));
  throw new Error(err.error || "Upload failed");
}
```
Preserve backend failure detail in toast errors.
Only err.error is surfaced; useful backend messages in detail are lost, so users get generic upload failures.
💡 Suggested tweak

```diff
-          if (!response.ok) {
-            const err = await response.json().catch(() => ({}));
-            throw new Error(err.error || "Upload failed");
-          }
+          if (!response.ok) {
+            const err = await response.json().catch(() => null);
+            const message =
+              err?.detail?.message ??
+              err?.detail ??
+              err?.error ??
+              `Upload failed (${response.status})`;
+            throw new Error(message);
+          }
```
Pull request overview
This PR shifts document uploads from a Next.js Server Action to a dedicated streaming App Router API route (proxying to the FastAPI backend) to reduce Next.js server memory load, and improves user feedback via richer Sonner toasts.
Changes:
- Remove the Server Action used for document uploads and switch the client to POST to a new `/api/upload` streaming proxy route.
- Update the FastAPI ingestion endpoint to take `chat_id` from the query string instead of multipart form data.
- Improve UX with toast error reporting, upload progress toasts, and `richColors` Toaster styling.
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| lib/actions/chat.ts | Removes `uploadDocumentAction` (upload now handled by API route). |
| components/chat-area.tsx | Uploads via `/api/upload`, adds optimistic uploading state + toast feedback, increases max client-side file size to 50MB. |
| backend/api/v1/ingestion_routes.py | Changes ingestion API to accept `chat_id` as a query parameter. |
| app/layout.tsx | Enables Sonner Toaster `richColors`. |
| app/api/upload/route.ts | Adds streaming upload proxy route with auth + chat ownership verification and Prisma doc count increment + revalidation. |
No description provided.