Conversation
…as_more functionality
- Add GET /projects/organization/{org_id} endpoint with org validation
- Add has_more pagination to organizations list endpoint
- Add Swagger docs for list_by_org
…organizations; update read_projects_by_organization response type
…om/ProjectTech4DevAI/kaapi-backend into feat/google-integration-auth-flow
⚠️ Warning: Rate limit exceeded

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 16 minutes and 55 seconds.

⌛ How to resolve this issue? After the wait time has elapsed, a review can be triggered. We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work? CodeRabbit enforces hourly rate limits for each developer per organization. Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout. Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
📝 Walkthrough

Refactors JWT auth into a reusable helper, expands `get_auth_context` to accept a `Request` and prefer API keys, adds refresh-token support and Google OAuth endpoints (`/auth/google`, `/auth/select-project`, `/auth/refresh`, `/auth/logout`), extends token/user models, adds Google client config and dependency, and updates tests to cover cookie- and Google-based flows.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Server
    participant Google as Google Auth
    participant DB as Database
    participant JWT as JWT/Cookie
    Client->>Server: POST /auth/google (google_id_token)
    Server->>Google: verify_oauth2_token(token, audience=GOOGLE_CLIENT_ID)
    alt invalid / expired / unverified
        Google-->>Server: error
        Server-->>Client: 400/401/403
    else valid
        Google-->>Server: idinfo (email, name, ...)
        Server->>DB: lookup user by email
        alt no user / inactive
            DB-->>Server: not found / inactive
            Server-->>Client: 401/403
        else user active
            Server->>DB: query active org/project via API keys
            DB-->>Server: available projects
            Server->>JWT: create_access_token + create_refresh_token (embed org/project if selected)
            JWT-->>Server: tokens
            Server-->>Client: set cookies, return access_token (or require project selection)
        end
    end
```

```mermaid
sequenceDiagram
    participant Client
    participant Server
    participant JWT as JWT Validator
    participant DB as Database
    Client->>Server: POST /auth/refresh (refresh_token cookie)
    Server->>JWT: decode & validate refresh_token
    alt invalid / expired / wrong type
        JWT-->>Server: error
        Server-->>Client: 401/403
    else valid
        JWT-->>Server: claims (sub, org_id, project_id)
        Server->>DB: verify user exists & active
        alt user missing/inactive
            DB-->>Server: not found / inactive
            Server-->>Client: 401/403
        else valid
            Server->>JWT: create new access & refresh tokens (preserve org/project)
            JWT-->>Server: tokens
            Server-->>Client: set new cookies, return access_token
        end
    end
```
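The refresh branch above reduces to a handful of claim checks before a new token pair is minted. The following sketch is illustrative only: `validate_refresh_claims` is a hypothetical name, and a real implementation would verify the JWT signature (e.g. with PyJWT) before trusting any claims.

```python
# Hypothetical sketch of the refresh-flow claim checks from the diagram above.
# A real service would first verify the JWT signature; this only models the
# "wrong type / expired / missing subject" branches.
from datetime import datetime, timedelta, timezone


def validate_refresh_claims(claims: dict) -> dict:
    """Reject refresh tokens that are the wrong type, expired, or anonymous."""
    if claims.get("type") != "refresh":
        raise PermissionError("wrong token type")
    exp = claims.get("exp")
    if exp is None or datetime.fromtimestamp(exp, tz=timezone.utc) <= datetime.now(timezone.utc):
        raise PermissionError("token expired")
    if not claims.get("sub"):
        raise PermissionError("missing subject")
    # Carry org/project scoping forward so the new pair preserves the selection.
    return {
        "sub": claims["sub"],
        "org_id": claims.get("org_id"),
        "project_id": claims.get("project_id"),
    }
```

On success the caller would mint a fresh access/refresh pair from the returned claims; on `PermissionError` the endpoint maps to 401/403 as in the diagram.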
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs
Suggested labels
Suggested reviewers
Poem
🚥 Pre-merge checks: ✅ Passed checks (3 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
✨ Finishing Touches: 🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
…kend into feat/google-integration-auth-flow
Actionable comments posted: 5
🧹 Nitpick comments (6)
backend/app/models/auth.py (1)

27-33: Consider using more specific types for `dict` fields.

The `google_profile: dict` and `available_projects: list[dict]` fields lack specific type parameters. Using `dict[str, Any]` provides better type clarity, or consider defining dedicated schema classes for stronger typing.

Proposed improvement

```diff
+from typing import Any
+
 class GoogleAuthResponse(SQLModel):
     access_token: str
     token_type: str = "bearer"
     user: UserPublic
-    google_profile: dict
+    google_profile: dict[str, Any]
     requires_project_selection: bool = False
-    available_projects: list[dict] = []
+    available_projects: list[dict[str, Any]] = []
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@backend/app/models/auth.py` around lines 27 - 33, The GoogleAuthResponse model uses untyped dicts; update the type annotations for google_profile and available_projects to be more specific (e.g., google_profile: dict[str, Any] and available_projects: list[dict[str, Any]]) or better yet define dedicated Pydantic/SQLModel schema classes (e.g., GoogleProfile and ProjectInfo) and replace google_profile and available_projects types with those classes (or list[ProjectInfo]) to improve type clarity and validation in the GoogleAuthResponse model.

backend/app/core/config.py (1)
57-58: Consider validating `GOOGLE_CLIENT_ID` in production environments.

The empty default for `GOOGLE_CLIENT_ID` could lead to silent failures if Google OAuth is expected to work but the client ID is not configured. Consider adding validation in `_enforce_non_default_secrets` to warn or error when this is empty in production, similar to how `SECRET_KEY` and `POSTGRES_PASSWORD` are validated.

Example validation addition

```python
@model_validator(mode="after")
def _enforce_non_default_secrets(self) -> Self:
    self._check_default_secret("SECRET_KEY", self.SECRET_KEY)
    self._check_default_secret("POSTGRES_PASSWORD", self.POSTGRES_PASSWORD)
    self._check_default_secret(
        "FIRST_SUPERUSER_PASSWORD", self.FIRST_SUPERUSER_PASSWORD
    )

    # Warn if Google OAuth is not configured in production
    if self.ENVIRONMENT == "production" and not self.GOOGLE_CLIENT_ID:
        warnings.warn(
            "GOOGLE_CLIENT_ID is not set. Google OAuth will not work.",
            stacklevel=1,
        )

    return self
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@backend/app/core/config.py` around lines 57 - 58, Add validation for GOOGLE_CLIENT_ID in the model validator _enforce_non_default_secrets: if ENVIRONMENT == "production" and GOOGLE_CLIENT_ID is empty, call _check_default_secret-like behavior or emit a warning/error (consistent with how SECRET_KEY and POSTGRES_PASSWORD are handled) so missing Google OAuth configuration is surfaced in production; reference the GOOGLE_CLIENT_ID field and the _enforce_non_default_secrets method when implementing this check.

backend/app/core/security.py (2)
88-88: Use the `UTC` alias from `datetime` instead of `timezone.utc`.

Python 3.11+ provides `UTC` in the `datetime` module as a cleaner alias for `timezone.utc`. Since the project requires Python 3.12+, this is the preferred form.

Proposed fix

```diff
- expire = datetime.now(timezone.utc) + expires_delta
+ expire = datetime.now(UTC) + expires_delta
```

Also update the imports:

```diff
-from datetime import datetime, timedelta, timezone
+from datetime import UTC, datetime, timedelta
```

Also applies to: 115-115

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@backend/app/core/security.py` at line 88, Replace uses of timezone.utc with the `UTC` alias: change expressions like "datetime.now(timezone.utc)" to "datetime.now(UTC)" (e.g., the "expire" assignments). Update imports to add `UTC` and remove `timezone` (e.g., change "from datetime import datetime, timedelta, timezone" to "from datetime import UTC, datetime, timedelta"); ensure any other occurrences (both instances noted) are updated to use UTC.
70-121: Consider extracting shared token creation logic.

`create_access_token` and `create_refresh_token` share nearly identical logic, differing only in the `type` claim. This duplication could be reduced with a private helper function.

Proposed refactor

```python
def _create_token(
    subject: str | Any,
    expires_delta: timedelta,
    token_type: str,
    organization_id: int | None = None,
    project_id: int | None = None,
) -> str:
    """Internal helper to create JWT tokens."""
    expire = datetime.now(UTC) + expires_delta
    to_encode: dict[str, Any] = {"exp": expire, "sub": str(subject), "type": token_type}
    if organization_id is not None:
        to_encode["org_id"] = organization_id
    if project_id is not None:
        to_encode["project_id"] = project_id
    return jwt.encode(to_encode, settings.SECRET_KEY, algorithm=ALGORITHM)


def create_access_token(
    subject: str | Any,
    expires_delta: timedelta,
    organization_id: int | None = None,
    project_id: int | None = None,
) -> str:
    """Create a JWT access token."""
    return _create_token(subject, expires_delta, "access", organization_id, project_id)


def create_refresh_token(
    subject: str | Any,
    expires_delta: timedelta,
    organization_id: int | None = None,
    project_id: int | None = None,
) -> str:
    """Create a JWT refresh token."""
    return _create_token(subject, expires_delta, "refresh", organization_id, project_id)
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@backend/app/core/security.py` around lines 70 - 121, The two functions create_access_token and create_refresh_token duplicate token-building logic; extract the shared behavior into a private helper (e.g., _create_token) that accepts subject, expires_delta, token_type, organization_id, and project_id, builds the same to_encode payload (including "exp", "sub", "type", optional "org_id" and "project_id"), and returns jwt.encode(..., settings.SECRET_KEY, algorithm=ALGORITHM); then have create_access_token call _create_token(..., "access", ...) and create_refresh_token call _create_token(..., "refresh", ...).

backend/app/api/docs/auth/google.md (1)
14-16: Documentation may be incomplete regarding refresh token handling.

The documentation mentions setting an `access_token` cookie, but based on the PR summary, refresh tokens are also managed via HTTP-only cookies. Consider updating to clarify whether both tokens are set as cookies or just the access token.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@backend/app/api/docs/auth/google.md` around lines 14 - 16, Update the Google auth docs to explicitly document refresh token behavior: state that the server sets both access and refresh tokens as HTTP-only secure cookies (e.g., access_token and refresh_token) when issuing tokens, describe when a refresh token is issued/rotated and its cookie expiration, and note that the response body returns user details and Google profile but tokens are stored in cookies for client security; reference the Google auth flow described in this file so readers know both cookies are used and how to obtain a new access token via the refresh endpoint.

backend/app/api/deps.py (1)
59-64: Fail closed on unexpected JWT `type` values.

Anything other than `"refresh"` is currently accepted here, so a future token type would silently gain API access. It's safer to allowlist `"access"` and reject everything else.

🔐 Tighten the token-type check

```diff
-    if token_data.type == "refresh":
+    if token_data.type != "access":
         raise HTTPException(
             status_code=status.HTTP_401_UNAUTHORIZED,
-            detail="Refresh tokens cannot be used for API access",
+            detail="Invalid token type for API access",
         )
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@backend/app/api/deps.py` around lines 59 - 64, The current check only rejects token_data.type == "refresh" and lets any other type through; update the logic in the dependency that checks token_data (e.g., the code handling token_data.type in deps.py / the get_current_user or similar function) to explicitly allow only "access" tokens and raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token type") for any other value; keep the existing refresh-specific message if you prefer, but ensure any non-"access" type is rejected to fail closed.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/app/api/routes/google_auth.py`:
- Around line 128-137: The response currently puts access_token into the JSON
body even though _set_auth_cookies already sets it as an HttpOnly cookie; remove
access_token from the JSON response payload and only return user/metadata (keep
google_profile, requires_project_selection, available_projects). Update the code
that constructs GoogleAuthResponse (the instantiation here and other occurrences
that build JSONResponse with access_token) to stop including access_token, and
if necessary adjust the GoogleAuthResponse schema so access_token is not
required; still call _set_auth_cookies(response, access_token, refresh_token) to
set the cookie. Ensure any other return sites that mirror this pattern (other
GoogleAuthResponse constructions) are changed consistently.
- Line 182: Replace direct interpolation of raw identifiers in all logger calls
in google_auth.py with masked versions using the existing mask_string utility
and ensure messages are prefixed with the function name in square brackets.
Concretely, change logger.info(f"[google_auth] No account found for email:
{email}") (and the other logger calls that include email, user_id, project_id,
etc.) to logger.info(f"[google_auth] No account found for email:
{mask_string(email)}") and similarly wrap project_id and user identifiers with
mask_string(project_id) or mask_string(user_id); ensure any
logger.error/logger.warning uses the same pattern and import/usage of
mask_string in the file.
- Around line 241-244: The router decorators for the auth endpoints
/select-project, /refresh, and /logout are missing external Swagger
descriptions; update each `@router.post`(...) decorator to include
load_description("auth/<action>.md") like the existing /google decorator (e.g.,
use load_description("auth/select-project.md") for the /select-project route)
and create corresponding markdown files (auth/select-project.md,
auth/refresh.md, auth/logout.md) with the endpoint docs; ensure you use the same
decorator pattern and keep response_model=Token (or the correct model) unchanged
so all operations are consistently documented.
- Around line 327-331: The refresh flow currently reuses org_id/project_id from
token_data when calling _create_token_pair (lines around the refresh branch),
which can extend revoked API-key scoped access; instead, re-query the user's
current project authorizations via _get_user_projects (or the existing
project-checking helper) using the user.id and the token_data (e.g., api_key id)
before minting a new token pair, then pass the re-derived organization_id and
project_id to _create_token_pair (or null them if no access) so refresh cannot
re-grant projects that were revoked.
- Around line 35-61: The functions _create_token_and_response and
_get_user_projects use overly-generic types; add precise type annotations by
changing the _create_token_and_response signature to accept user: User and
replace google_profile: dict and available_projects: list[dict] with concrete
types (define a GoogleProfile TypedDict or Pydantic model and a ProjectEntry
TypedDict/Pydantic model for project objects), update all internal usages to
those types, and import User from your models; also change _get_user_projects
return type from list[dict] to list[ProjectEntry] (or the chosen concrete
project type) and update the comprehension to match the new TypedDict/Model
fields. Ensure the new types are exported/imported where needed and used
consistently across token creation and response construction.
---
Nitpick comments:
In `@backend/app/api/deps.py`:
- Around line 59-64: The current check only rejects token_data.type == "refresh"
and lets any other type through; update the logic in the dependency that checks
token_data (e.g., the code handling token_data.type in deps.py / the
get_current_user or similar function) to explicitly allow only "access" tokens
and raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED,
detail="Invalid token type") for any other value; keep the existing
refresh-specific message if you prefer, but ensure any non-"access" type is
rejected to fail closed.
In `@backend/app/api/docs/auth/google.md`:
- Around line 14-16: Update the Google auth docs to explicitly document refresh
token behavior: state that the server sets both access and refresh tokens as
HTTP-only secure cookies (e.g., access_token and refresh_token) when issuing
tokens, describe when a refresh token is issued/rotated and its cookie
expiration, and note that the response body returns user details and Google
profile but tokens are stored in cookies for client security; reference the
Google auth flow described in this file so readers know both cookies are used
and how to obtain a new access token via the refresh endpoint.
In `@backend/app/core/config.py`:
- Around line 57-58: Add validation for GOOGLE_CLIENT_ID in the model validator
_enforce_non_default_secrets: if ENVIRONMENT == "production" and
GOOGLE_CLIENT_ID is empty, call _check_default_secret-like behavior or emit a
warning/error (consistent with how SECRET_KEY and POSTGRES_PASSWORD are handled)
so missing Google OAuth configuration is surfaced in production; reference the
GOOGLE_CLIENT_ID field and the _enforce_non_default_secrets method when
implementing this check.
In `@backend/app/core/security.py`:
- Line 88: Replace uses of timezone.utc with the `UTC` alias: change
expressions like "datetime.now(timezone.utc)" to "datetime.now(UTC)"
(e.g., the "expire" assignments). Update imports to add `UTC` and remove
`timezone` (e.g., change "from datetime import datetime, timedelta, timezone"
to "from datetime import UTC, datetime, timedelta"); ensure any other
occurrences (both instances noted) are updated to use UTC.
- Around line 70-121: The two functions create_access_token and
create_refresh_token duplicate token-building logic; extract the shared behavior
into a private helper (e.g., _create_token) that accepts subject, expires_delta,
token_type, organization_id, and project_id, builds the same to_encode payload
(including "exp", "sub", "type", optional "org_id" and "project_id"), and
returns jwt.encode(..., settings.SECRET_KEY, algorithm=ALGORITHM); then have
create_access_token call _create_token(..., "access", ...) and
create_refresh_token call _create_token(..., "refresh", ...).
In `@backend/app/models/auth.py`:
- Around line 27-33: The GoogleAuthResponse model uses untyped dicts; update the
type annotations for google_profile and available_projects to be more specific
(e.g., google_profile: dict[str, Any] and available_projects: list[dict[str,
Any]]) or better yet define dedicated Pydantic/SQLModel schema classes (e.g.,
GoogleProfile and ProjectInfo) and replace google_profile and available_projects
types with those classes (or list[ProjectInfo]) to improve type clarity and
validation in the GoogleAuthResponse model.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: aad3753a-f63f-4429-8291-11acc69e1a8f
⛔ Files ignored due to path filters (1)
`backend/uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (10)
- backend/app/api/deps.py
- backend/app/api/docs/auth/google.md
- backend/app/api/main.py
- backend/app/api/routes/google_auth.py
- backend/app/core/config.py
- backend/app/core/security.py
- backend/app/models/__init__.py
- backend/app/models/auth.py
- backend/app/models/user.py
- backend/pyproject.toml
```python
response_data = GoogleAuthResponse(
    access_token=access_token,
    user=UserPublic.model_validate(user),
    google_profile=google_profile,
    requires_project_selection=requires_project_selection,
    available_projects=available_projects,
)

response = JSONResponse(content=response_data.model_dump())
_set_auth_cookies(response, access_token, refresh_token)
```
Avoid returning the access token in JSON when you already set it as an HttpOnly cookie.
This exposes the same bearer credential to JavaScript and undercuts most of the XSS containment you get from httponly=True. For the browser flow, keep the token in the cookie only and return just the user / selection metadata.
Also applies to: 272-273, 333-334
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/app/api/routes/google_auth.py` around lines 128 - 137, The response
currently puts access_token into the JSON body even though _set_auth_cookies
already sets it as an HttpOnly cookie; remove access_token from the JSON
response payload and only return user/metadata (keep google_profile,
requires_project_selection, available_projects). Update the code that constructs
GoogleAuthResponse (the instantiation here and other occurrences that build
JSONResponse with access_token) to stop including access_token, and if necessary
adjust the GoogleAuthResponse schema so access_token is not required; still call
_set_auth_cookies(response, access_token, refresh_token) to set the cookie.
Ensure any other return sites that mirror this pattern (other GoogleAuthResponse
constructions) are changed consistently.
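As a rough illustration of the cookie-only shape this comment asks for, here is a minimal stdlib sketch (`build_auth_response` is a hypothetical helper, not the project's code; real FastAPI code would call `set_cookie` on the `JSONResponse` instead):

```python
from http.cookies import SimpleCookie


def build_auth_response(user: dict, access_token: str) -> tuple[dict, str]:
    """Return a JSON payload without the token, plus a Set-Cookie header value."""
    body = {
        "user": user,
        "requires_project_selection": False,
        "available_projects": [],
        # access_token deliberately omitted from the body; it travels in the cookie
    }
    cookie = SimpleCookie()
    cookie["access_token"] = access_token
    cookie["access_token"]["httponly"] = True
    cookie["access_token"]["secure"] = True
    cookie["access_token"]["samesite"] = "Lax"
    return body, cookie["access_token"].OutputString()
```

With this shape, JavaScript never sees the bearer credential: the browser stores it in the HttpOnly cookie and the body carries only user/selection metadata.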
```python
# Look up user by email
user = get_user_by_email(session=session, email=email)
if not user:
    logger.info(f"[google_auth] No account found for email: {email}")
```
Mask identifiers in these auth logs.
These messages emit raw email/user/project identifiers from a sensitive auth path. Please mask them before logging so we don’t leak PII or internal IDs into log sinks.
As per coding guidelines, Prefix all log messages with the function name in square brackets: logger.info(f"[function_name] Message {mask_string(sensitive_value)}").
Also applies to: 208-209, 220-221, 231-232, 275-276, 336-336
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/app/api/routes/google_auth.py` at line 182, Replace direct
interpolation of raw identifiers in all logger calls in google_auth.py with
masked versions using the existing mask_string utility and ensure messages are
prefixed with the function name in square brackets. Concretely, change
logger.info(f"[google_auth] No account found for email: {email}") (and the other
logger calls that include email, user_id, project_id, etc.) to
logger.info(f"[google_auth] No account found for email: {mask_string(email)}")
and similarly wrap project_id and user identifiers with mask_string(project_id)
or mask_string(user_id); ensure any logger.error/logger.warning uses the same
pattern and import/usage of mask_string in the file.
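The prompts above rely on an existing `mask_string` utility that is not shown in this review. For context, a typical masking helper looks something like this hypothetical sketch (the repository's actual implementation may differ):

```python
def mask_string(value: object, visible: int = 2) -> str:
    """Keep the first and last `visible` characters, mask the rest."""
    text = str(value)
    # Too short to partially reveal: mask everything.
    if len(text) <= visible * 2:
        return "*" * len(text)
    return text[:visible] + "*" * (len(text) - visible * 2) + text[-visible:]
```

A log line would then read `logger.info(f"[google_auth] No account found for email: {mask_string(email)}")`, matching the guideline quoted above.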
```python
@router.post(
    "/select-project",
    response_model=Token,
)
```
🛠️ Refactor suggestion | 🟠 Major
Add external Swagger descriptions for the remaining auth endpoints.
/select-project, /refresh, and /logout skip load_description(...), so this router now mixes documented and undocumented operations. Mirror the /google decorator pattern here and add the matching markdown files.
As per coding guidelines, Load Swagger endpoint descriptions from external markdown files instead of inline strings using load_description("domain/action.md").
Also applies to: 281-284, 340-340
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/app/api/routes/google_auth.py` around lines 241 - 244, The router
decorators for the auth endpoints /select-project, /refresh, and /logout are
missing external Swagger descriptions; update each `@router.post`(...) decorator
to include load_description("auth/<action>.md") like the existing /google
decorator (e.g., use load_description("auth/select-project.md") for the
/select-project route) and create corresponding markdown files
(auth/select-project.md, auth/refresh.md, auth/logout.md) with the endpoint
docs; ensure you use the same decorator pattern and keep response_model=Token
(or the correct model) unchanged so all operations are consistently documented.
| access_token, new_refresh_token = _create_token_pair( | ||
| user.id, | ||
| organization_id=token_data.org_id, | ||
| project_id=token_data.project_id, | ||
| ) |
Re-check current project access before rotating the refresh token.
This branch copies org_id/project_id straight from the old refresh token instead of re-deriving them from _get_user_projects(). If that API-key grant was revoked after login, /auth/refresh will keep extending project-scoped access instead of stopping at the access-token boundary.
🛡️ Re-authorize before minting the next token pair

```diff
 user = session.get(User, token_data.sub)
 if not user:
     raise HTTPException(status_code=404, detail="User not found")
 if not user.is_active:
     raise HTTPException(status_code=403, detail="Inactive user")
+if token_data.project_id is not None:
+    current_projects = _get_user_projects(session, user.id)
+    still_authorized = any(
+        p["project_id"] == token_data.project_id
+        and p["organization_id"] == token_data.org_id
+        for p in current_projects
+    )
+    if not still_authorized:
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail="Project access has been revoked",
+        )
+
 access_token, new_refresh_token = _create_token_pair(
     user.id,
     organization_id=token_data.org_id,
     project_id=token_data.project_id,
 )
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/app/api/routes/google_auth.py` around lines 327 - 331, The refresh
flow currently reuses org_id/project_id from token_data when calling
_create_token_pair (lines around the refresh branch), which can extend revoked
API-key scoped access; instead, re-query the user's current project
authorizations via _get_user_projects (or the existing project-checking helper)
using the user.id and the token_data (e.g., api_key id) before minting a new
token pair, then pass the re-derived organization_id and project_id to
_create_token_pair (or null them if no access) so refresh cannot re-grant
projects that were revoked.
Actionable comments posted: 1
🧹 Nitpick comments (1)
backend/app/tests/api/test_permissions.py (1)
13-17: Use a real `Request` object instead of bare `MagicMock` to catch unexpected attribute access.

The `_mock_request()` helper at lines 13-17 creates a bare `MagicMock()` which silently returns nested mocks for any undefined attribute access. This can mask regressions in `get_auth_context` if its request-dependent behavior changes. Replace with a real `Request` object that will fail fast on unexpected attribute access.

Suggested change

```diff
-from unittest.mock import MagicMock
+from starlette.requests import Request

-def _mock_request() -> MagicMock:
-    """Create a mock Request object with empty cookies."""
-    request = MagicMock()
-    request.cookies = {}
-    return request
+def _mock_request() -> Request:
+    """Create a minimal Request object with empty cookies."""
+    return Request({"type": "http", "headers": []})
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/app/tests/api/test_permissions.py` around lines 13 - 17, Replace the bare MagicMock returned by _mock_request with a real Request instance so unexpected attribute access fails fast; specifically, stop returning MagicMock() in _mock_request and instead construct a real starlette.requests.Request (or fastapi.Request) with an empty scope/cookies and pass that into get_auth_context in tests; update references to _mock_request() used in test_permissions to ensure tests use this real Request object rather than a MagicMock to catch regressions in get_auth_context.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/app/tests/api/test_deps.py`:
- Line 46: The test's docstring in backend/app/tests/api/test_deps.py claims to
validate "invalid API key" but asserts the generic fallback message ("Invalid
Authorization format") produced by backend/app/api/deps.py; either make the test
explicitly assert an API-key-specific failure (e.g., exercise the API-key
validation branch and assert a distinct message like "Invalid API key" or check
the API-key validation function returns the API-key error) or rename/retitle the
test to indicate it verifies the generic fallback behavior (e.g.,
"authentication fails with invalid or missing credentials -> generic fallback"),
and update the assertion and test name/docstring accordingly so the intent
matches the asserted message from deps.py.
---
Nitpick comments:
In `@backend/app/tests/api/test_permissions.py`:
- Around line 13-17: Replace the bare MagicMock returned by _mock_request with a
real Request instance so unexpected attribute access fails fast; specifically,
stop returning MagicMock() in _mock_request and instead construct a real
starlette.requests.Request (or fastapi.Request) with an empty scope/cookies and
pass that into get_auth_context in tests; update references to _mock_request()
used in test_permissions to ensure tests use this real Request object rather
than a MagicMock to catch regressions in get_auth_context.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: a4c2dcd6-f849-4c9c-bfb2-4fe6e9aafd79
📒 Files selected for processing (2)
- backend/app/tests/api/test_deps.py
- backend/app/tests/api/test_permissions.py
```diff
 def test_get_auth_context_with_invalid_api_key(self, db: Session) -> None:
-    """Test authentication fails with invalid API key"""
+    """Test authentication fails with invalid API key when no other auth is provided"""
```
This assertion no longer validates the “invalid API key” path specifically.
At Line 58, the test now expects "Invalid Authorization format", which in backend/app/api/deps.py (Line 123) is the generic fallback used when auth methods fail, including missing credentials. This makes the test intent at Line 46 (“invalid API key”) ambiguous and less branch-specific.
Please either:
- assert an API-key-specific failure detail (if that behavior is intended), or
- rename/retitle this test to explicitly validate the generic fallback behavior.
Based on learnings: "Use JWT-based authentication for user access and API key support for programmatic access".
Also applies to: 58-58
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/app/tests/api/test_deps.py` at line 46, The test's docstring in
backend/app/tests/api/test_deps.py claims to validate "invalid API key" but
asserts the generic fallback message ("Invalid Authorization format") produced
by backend/app/api/deps.py; either make the test explicitly assert an
API-key-specific failure (e.g., exercise the API-key validation branch and
assert a distinct message like "Invalid API key" or check the API-key validation
function returns the API-key error) or rename/retitle the test to indicate it
verifies the generic fallback behavior (e.g., "authentication fails with invalid
or missing credentials -> generic fallback"), and update the assertion and test
name/docstring accordingly so the intent matches the asserted message from
deps.py.
Codecov Report

❌ Patch coverage is

📢 Thoughts on this report? Let us know!
Actionable comments posted: 1
🧹 Nitpick comments (5)
backend/app/tests/core/test_security.py (2)
205-233: Add `-> None` return type hints to test methods.

Per coding guidelines, all function parameters and return values require type hints. The test methods in `TestCreateAccessToken` are missing return type annotations.

🔧 Proposed fix

```diff
-    def test_creates_valid_jwt(self):
+    def test_creates_valid_jwt(self) -> None:
         """Test that a valid JWT is created."""
         ...

-    def test_includes_org_and_project(self):
+    def test_includes_org_and_project(self) -> None:
         """Test that org_id and project_id are embedded in the token."""
         ...

-    def test_omits_org_and_project_when_none(self):
+    def test_omits_org_and_project_when_none(self) -> None:
         """Test that org_id and project_id are omitted when not provided."""
```

As per coding guidelines: "Always add type hints to all function parameters and return values in Python code".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/app/tests/core/test_security.py` around lines 205 - 233, Add explicit return type hints "-> None" to the three test methods so they follow the project's typing guidelines: update test_creates_valid_jwt, test_includes_org_and_project, and test_omits_org_and_project_when_none to include "-> None" on their definitions (no other behavior changes).
239-268: Add `-> None` return type hints to test methods.

Same issue in `TestCreateRefreshToken`: the test methods are missing return type annotations.

🔧 Proposed fix

```diff
-    def test_creates_valid_refresh_jwt(self):
+    def test_creates_valid_refresh_jwt(self) -> None:
         """Test that a valid refresh JWT is created."""
     ...
-    def test_includes_org_and_project(self):
+    def test_includes_org_and_project(self) -> None:
         """Test that org_id and project_id are embedded in the refresh token."""
     ...
-    def test_omits_org_and_project_when_none(self):
+    def test_omits_org_and_project_when_none(self) -> None:
         """Test that org_id and project_id are omitted when not provided."""
```
- def test_creates_valid_refresh_jwt(self): + def test_creates_valid_refresh_jwt(self) -> None: """Test that a valid refresh JWT is created.""" ... - def test_includes_org_and_project(self): + def test_includes_org_and_project(self) -> None: """Test that org_id and project_id are embedded in the refresh token.""" ... - def test_omits_org_and_project_when_none(self): + def test_omits_org_and_project_when_none(self) -> None: """Test that org_id and project_id are omitted when not provided."""As per coding guidelines: "Always add type hints to all function parameters and return values in Python code".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/app/tests/core/test_security.py` around lines 239 - 268, Add explicit return type hints (-> None) to the test methods in TestCreateRefreshToken: test_creates_valid_refresh_jwt, test_includes_org_and_project, and test_omits_org_and_project_when_none. Update each def signature to include "-> None" after the parameter list (e.g., def test_creates_valid_refresh_jwt(self) -> None:) so they conform to the project's typing guideline; no other logic changes are needed.

backend/app/tests/api/test_deps.py (1)
207-224: Move `timedelta` import to file top.

The `from datetime import timedelta` is imported inline in multiple test methods (lines 209, 228, 249). Since this is a standard library import used across several tests, it should be at the file top for consistency and to avoid redundant imports.

🔧 Proposed fix

Add at file top (around line 3):

```python
from datetime import timedelta
```

Then remove the inline imports at lines 209, 228, and 249.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/app/tests/api/test_deps.py` around lines 207 - 224, Move the inline "from datetime import timedelta" into the module-level imports and remove the redundant inline imports in the test functions; specifically add a single "from datetime import timedelta" at the top of the file and delete the inline imports inside test_get_auth_context_with_expired_token and the other test functions that currently re-import timedelta so those tests (e.g., test_get_auth_context_with_expired_token) use the top-level timedelta import.

backend/app/tests/api/test_google_auth.py (2)
253-267: Consider verifying org/project claims are actually preserved in the new token.

The test `test_refresh_with_org_project` creates a refresh token with org/project claims and verifies a 200 response, but doesn't decode the returned access token to confirm the claims were preserved. This would strengthen the test's assertion.

🔧 Proposed enhancement

```diff
+import jwt
+from app.core.security import ALGORITHM
+
 def test_refresh_with_org_project(
     self, db: Session, client: TestClient, user_api_key: TestAuthContext
 ):
     """Test refresh preserves org/project claims."""
     refresh_token = create_refresh_token(
         subject=str(user_api_key.user.id),
         expires_delta=timedelta(days=7),
         organization_id=user_api_key.organization.id,
         project_id=user_api_key.project.id,
     )
     client.cookies.set("refresh_token", refresh_token)
     resp = client.post(REFRESH_URL)
     assert resp.status_code == 200
-    assert "access_token" in resp.json()
+
+    new_access_token = resp.json()["access_token"]
+    payload = jwt.decode(new_access_token, settings.SECRET_KEY, algorithms=[ALGORITHM])
+    assert payload["org_id"] == user_api_key.organization.id
+    assert payload["project_id"] == user_api_key.project.id
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/app/tests/api/test_google_auth.py` around lines 253 - 267, Update the test_refresh_with_org_project test to decode the returned access token and assert the organization/project claims are preserved: after calling client.post(REFRESH_URL) and confirming status_code==200, extract resp.json()["access_token"], decode it (using the same JWT secret/algorithm as create_refresh_token) and assert the decoded payload contains organization_id == user_api_key.organization.id and project_id == user_api_key.project.id; reference the test function name test_refresh_with_org_project and the create_refresh_token/REFRESH_URL symbols to locate where to add the decoding and assertions.
34-40: Add `-> None` return type hints to test methods.

Per coding guidelines, all test methods should have `-> None` return type annotations. This applies to all test methods in this file.

🔧 Proposed fix for test class signatures

```diff
-    def test_google_auth_not_configured(self, mock_settings, client: TestClient):
+    def test_google_auth_not_configured(self, mock_settings, client: TestClient) -> None:

-    def test_google_auth_invalid_token(self, mock_settings, mock_verify, client: TestClient):
+    def test_google_auth_invalid_token(self, mock_settings, mock_verify, client: TestClient) -> None:
```

Apply similar changes to all test methods in this file.
As per coding guidelines: "Always add type hints to all function parameters and return values in Python code".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/app/tests/api/test_google_auth.py` around lines 34 - 40, The test methods (e.g., test_google_auth_not_configured) lack return type annotations; update each test function in this file to include the return type hint "-> None" on their definitions (preserve existing decorators like `@patch` and parameter lists such as mock_settings, client: TestClient) so every test_* function signature ends with "-> None".
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/app/tests/api/test_google_auth.py`:
- Around line 87-111: The test test_google_auth_activates_inactive_user
conflicts with the google_auth handler which currently rejects inactive users
(raises HTTPException 403 with "Inactive user account"); either update the
handler (in google_auth) to flip the user.is_active to True and persist
(db.commit()/db.refresh()) before the inactive-user check so the request returns
200 and the user becomes active, or change the test to expect a 403 response and
the "Inactive user account" detail instead of asserting activation; reference
the google_auth handler and the test_google_auth_activates_inactive_user test
when making the fix.
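A hedged sketch of the first option described above (re-activating the user on a verified Google sign-in instead of raising 403). The `User` dataclass and `FakeDB` class here are stand-ins for the real SQLModel user and database session, whose fields and types may differ; only the ordering matters — flip `is_active` and persist before any inactive-user check runs.

```python
from dataclasses import dataclass


@dataclass
class User:
    email: str
    is_active: bool


class FakeDB:
    """Minimal stand-in for a SQLAlchemy/SQLModel session."""

    def commit(self) -> None:
        pass

    def refresh(self, obj: object) -> None:
        pass


def activate_on_google_login(user: User, db: FakeDB) -> User:
    # Re-activate before the inactive-user check so the request returns 200
    # and the user record ends up active, as the test expects.
    if not user.is_active:
        user.is_active = True
        db.commit()
        db.refresh(user)
    return user


reactivated = activate_on_google_login(User("a@example.com", False), FakeDB())
assert reactivated.is_active is True
```

With this ordering, `test_google_auth_activates_inactive_user` would pass as written; the alternative is to keep the 403 behavior and change the test instead.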
---
Nitpick comments:
In `@backend/app/tests/api/test_deps.py`:
- Around line 207-224: Move the inline "from datetime import timedelta" into the
module-level imports and remove the redundant inline imports in the test
functions; specifically add a single "from datetime import timedelta" at the top
of the file and delete the inline imports inside
test_get_auth_context_with_expired_token and the other test functions that
currently re-import timedelta so those tests (e.g.,
test_get_auth_context_with_expired_token) use the top-level timedelta import.
In `@backend/app/tests/api/test_google_auth.py`:
- Around line 253-267: Update the test_refresh_with_org_project test to decode
the returned access token and assert the organization/project claims are
preserved: after calling client.post(REFRESH_URL) and confirming
status_code==200, extract resp.json()["access_token"], decode it (using the same
JWT secret/algorithm as create_refresh_token) and assert the decoded payload
contains organization_id == user_api_key.organization.id and project_id ==
user_api_key.project.id; reference the test function name
test_refresh_with_org_project and the create_refresh_token/REFRESH_URL symbols
to locate where to add the decoding and assertions.
- Around line 34-40: The test methods (e.g., test_google_auth_not_configured)
lack return type annotations; update each test function in this file to include
the return type hint "-> None" on their definitions (preserve existing
decorators like `@patch` and parameter lists such as mock_settings, client:
TestClient) so every test_* function signature ends with "-> None".
In `@backend/app/tests/core/test_security.py`:
- Around line 205-233: Add explicit return type hints "-> None" to the three
test methods so they follow the project's typing guidelines: update
test_creates_valid_jwt, test_includes_org_and_project, and
test_omits_org_and_project_when_none to include "-> None" on their definitions
(no other behavior changes).
- Around line 239-268: Add explicit return type hints (-> None) to the test
methods in TestCreateRefreshToken: test_creates_valid_refresh_jwt,
test_includes_org_and_project, and test_omits_org_and_project_when_none. Update
each def signature to include "-> None" after the parameter list (e.g., def
test_creates_valid_refresh_jwt(self) -> None:) so they conform to the project's
typing guideline; no other logic changes are needed.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2c9473ab-ec7a-4ed6-9dd6-3cf21db24ef7
📒 Files selected for processing (3)
backend/app/tests/api/test_deps.py
backend/app/tests/api/test_google_auth.py
backend/app/tests/core/test_security.py
Summary
Target issue is: #731
Explain the motivation for making this change. What existing problem does the pull request solve?
Checklist
Before submitting a pull request, please ensure that you mark these tasks.
Run `fastapi run --reload app/main.py` or `docker compose up` in the repository root and test.

Notes
Please add here if any other information is required for the reviewer.
Summary by CodeRabbit
New Features
Bug Fixes
Documentation
Tests
Chores