Changes to how users are added and deleted #235
Conversation
📝 Walkthrough
Added bidirectional SQLAlchemy relationships between …

Changes
Sequence Diagrams

```mermaid
sequenceDiagram
    participant Client
    participant GraphQL as GraphQL Server (CreateUser)
    participant S3 as S3/Spaces (boto3)
    participant DB as Database
    Client->>GraphQL: POST CreateUser(net_id, encoded_image)
    GraphQL->>GraphQL: base64.b64decode(encoded_image)
    GraphQL->>S3: put_object(Bucket, Key, Body, ContentType, ACL)
    S3-->>GraphQL: Success / ClientError
    GraphQL->>GraphQL: construct public Spaces URL
    GraphQL->>DB: INSERT user (including photo_url)
    DB-->>GraphQL: created user
    GraphQL-->>Client: Return created user + photo_url
```
```mermaid
sequenceDiagram
    participant Client
    participant GraphQL as GraphQL Server (DeleteUserById)
    participant DB as Database
    participant S3 as S3/Spaces (boto3)
    Client->>GraphQL: POST DeleteUserById(user_id)
    GraphQL->>DB: Query user by id
    DB-->>GraphQL: user record (contains net_id/encoded_image)
    GraphQL->>S3: delete_object(Bucket, Key) (if encoded_image)
    S3-->>GraphQL: Success / ClientError
    GraphQL->>DB: DELETE user record
    DB-->>GraphQL: User deleted
    GraphQL-->>Client: Deletion confirmed
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks | ✅ 1 | ❌ 2
❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)
Actionable comments posted: 4
🧹 Nitpick comments (2)
src/schema.py (2)
10-10: Unused import flagged by static analysis.
`WeeklyChallengeModel` is imported but never used. Consider removing this import.

Proposed fix:

```diff
-from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 10, The import WeeklyChallengeModel in src/schema.py is unused; remove the line importing WeeklyChallenge from src.models.weekly_challenge (the symbol WeeklyChallengeModel) so there is no unused import left in the module.
833-837: Consider extracting S3 client initialization to avoid duplication.
The boto3 client initialization is duplicated in `CreateUser` (lines 833-837) and `DeleteUserById` (lines 1170-1174). Consider extracting to a helper function or module-level lazy singleton.

Example helper function:

```python
def get_s3_client():
    return boto3.client(
        's3',
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
    )
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 833 - 837, Extract the duplicated boto3.client('s3', ...) initialization into a shared helper (e.g., get_s3_client) or a module-level lazy singleton (e.g., _s3_client with a get_s3_client() accessor) and update the CreateUser and DeleteUserById handlers to call that helper instead of inlining the client creation; ensure the helper reads the same env vars (DIGITAL_OCEAN_URL, DIGITAL_OCEAN_ACCESS, DIGITAL_OCEAN_SECRET_ACCESS) and returns a boto3 S3 client so both CreateUser and DeleteUserById use the single implementation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Around line 1078-1079: Re-enable JWT protection on the logWorkout mutation by
restoring the `@jwt_required()` decorator above the mutate method and enforce
token-based user identity: inside mutate (the logWorkout resolver) call the JWT
helper (e.g., get_jwt_identity or equivalent) to obtain the authenticated user
id and validate it against the incoming user_id (or reject if mismatched) so
that only the authenticated user can log their own workouts; also ensure the JWT
decorator import is present (from flask_jwt_extended import jwt_required,
get_jwt_identity) and add an appropriate unauthorized error path.
- Around line 1022-1023: The SetWorkoutGoals.mutate method has its
`@jwt_required()` decorator commented out; re-enable authentication by restoring
`@jwt_required()` above the mutate definition, ensure jwt_required is imported
from flask_jwt_extended (or the project's JWT helper), and inside mutate use
get_jwt_identity() (or equivalent) to verify the caller matches user_id (or has
permission) before applying changes to prevent unauthorized updates.
- Around line 1176-1182: The except block after s3.delete_object is using the
wrong message and silently swallowing failures; update the handler around
s3.delete_object to log a correct, informative message (e.g., "S3 delete error
for user {user.net_id}") including the exception details (ClientError e) and
propagate the failure instead of swallowing it—either by re-raising the
exception or returning/raising a controlled error so the caller can roll back
the DB delete; locate the try/except that calls s3.delete_object and modify the
ClientError handling accordingly.
- Around line 839-859: The code decodes and uploads encoded_image without
guarding for None and may leave final_photo_url undefined on exceptions; update
the logic around base64.b64decode(encoded_image), s3.put_object(...) and
final_photo_url so: first check if encoded_image is truthy before
decoding/uploading; if falsy, skip upload and set final_photo_url to None or a
default placeholder; catch ClientError and assign a safe final_photo_url (or
None) in the except block; remove the unused variable key; ensure
UserModel(name=..., net_id=..., email=..., encoded_image=final_photo_url) always
receives a defined value.
---
Nitpick comments:
In `@src/schema.py`:
- Line 10: The import WeeklyChallengeModel in src/schema.py is unused; remove
the line importing WeeklyChallenge from src.models.weekly_challenge (the symbol
WeeklyChallengeModel) so there is no unused import left in the module.
- Around line 833-837: Extract the duplicated boto3.client('s3', ...)
initialization into a shared helper (e.g., get_s3_client) or a module-level lazy
singleton (e.g., _s3_client with a get_s3_client() accessor) and update the
CreateUser and DeleteUserById handlers to call that helper instead of inlining
the client creation; ensure the helper reads the same env vars
(DIGITAL_OCEAN_URL, DIGITAL_OCEAN_ACCESS, DIGITAL_OCEAN_SECRET_ACCESS) and
returns a boto3 S3 client so both CreateUser and DeleteUserById use the single
implementation.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 42acbf42-23f4-498e-a164-3409b2d55ca1
📒 Files selected for processing (4)
- requirements.txt
- src/models/user.py
- src/models/workout.py
- src/schema.py
♻️ Duplicate comments (4)
src/schema.py (4)
1031-1032: ⚠️ Potential issue | 🟠 Major
Security: JWT protection disabled on `SetWorkoutGoals` mutation.
The `@jwt_required()` decorator is commented out, allowing unauthenticated users to modify any user's workout goals. Re-enable authentication before merging.

Proposed fix:

```diff
-    # @jwt_required()
+    @jwt_required()
     def mutate(self, info, user_id, workout_goal):
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1031 - 1032, The SetWorkoutGoals mutation currently has its authentication disabled: uncomment and restore the `@jwt_required()` decorator above the mutate method (the mutate function inside the SetWorkoutGoals class) so that JWT-authenticated users are required to call this mutation; ensure the decorator from flask_jwt_extended (or your JWT library) is imported and applied directly above def mutate(self, info, user_id, workout_goal) to enforce authentication.
843-869: ⚠️ Potential issue | 🔴 Critical
Critical: Incorrect URL construction and missing None-guard for `encoded_image`.
Multiple issues in this code segment:
- URL is incorrect (new issue): The constructed URL on line 863 is missing the bucket name. The upload goes to bucket `appdev-upload` with key `uplift-dev/user-profile/{net_id}-profile.png`, but the URL omits the bucket. This will result in broken profile image links.
- `encoded_image` can be `None`: Since `encoded_image` is optional (`required=False` on line 833), calling `base64.b64decode(encoded_image)` will raise `TypeError` when `None` is passed.
- `final_photo_url` undefined on error: If `ClientError` occurs, `final_photo_url` is never assigned, causing `NameError` on line 869.
- Dead code: `response` (line 854) and `key` (line 862) are assigned but never used.

Proposed fix:

```diff
     s3 = boto3.client('s3',
         endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
         aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
         aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
     )
-    # Decode the base64 image
-    image_data = base64.b64decode(encoded_image)
-
-    # Upload to Spaces
-    try:
-        response = s3.put_object(
-            Bucket="appdev-upload",
-            Key=f"uplift-dev/user-profile/{net_id}-profile.png",
-            Body=image_data,
-            ContentType="image/png",
-            ACL="public-read"
-        )
-
-        key = "uplift-dev/photo.jpg"
-        final_photo_url = f"https://nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
-
-    except ClientError as e:
-        print("Upload error:", e)
-
-    new_user = UserModel(name=name, net_id=net_id, email=email, encoded_image=final_photo_url)
+    final_photo_url = None
+    if encoded_image:
+        # Decode the base64 image
+        image_data = base64.b64decode(encoded_image)
+
+        # Upload to Spaces
+        try:
+            s3.put_object(
+                Bucket="appdev-upload",
+                Key=f"uplift-dev/user-profile/{net_id}-profile.png",
+                Body=image_data,
+                ContentType="image/png",
+                ACL="public-read"
+            )
+            final_photo_url = f"https://appdev-upload.nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
+        except ClientError as e:
+            raise GraphQLError(f"Failed to upload profile image: {e}")
+
+    new_user = UserModel(name=name, net_id=net_id, email=email, encoded_image=final_photo_url)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 843 - 869, Guard against encoded_image being None before calling base64.b64decode and only attempt S3 upload when encoded_image is present; in the upload block use a single canonical key (e.g. key = f"uplift-dev/user-profile/{net_id}-profile.png") and construct final_photo_url using the bucket name and key (include "appdev-upload" in the URL) so it points to the actual object, handle ClientError by setting final_photo_url to a safe fallback (None or empty string) inside the except block, and remove unused variables (response and the previous unused key assignment) so only the used key and final_photo_url remain when creating the UserModel.
1083-1084: ⚠️ Potential issue | 🟠 Major
Security: JWT protection disabled on `logWorkout` mutation.
The `@jwt_required()` decorator is commented out, allowing unauthenticated users to log workouts for any user. Re-enable authentication before merging.

Proposed fix:

```diff
-    # @jwt_required()
+    @jwt_required()
     def mutate(self, info, workout_time, user_id, facility_id):
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1083 - 1084, The logWorkout GraphQL mutation currently has its authentication disabled because the `@jwt_required()` decorator is commented out; re-enable authentication by uncommenting and applying `@jwt_required()` to the mutate method (the method named mutate on the logWorkout mutation class) and ensure the resolver uses the authenticated identity (e.g., read current_user or info.context after JWT validation) rather than trusting the passed user_id parameter to prevent unauthenticated or spoofed workout logs.
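The identity check these comments call for (validate the JWT identity against the incoming user_id) reduces to a small framework-free guard. This is a hedged sketch: in the real resolver the identity would come from flask_jwt_extended's `get_jwt_identity()` after `@jwt_required()` has validated the token, and `PermissionError` here stands in for the project's `GraphQLError`; `ensure_caller_owns_resource` is a hypothetical helper name.

```python
def ensure_caller_owns_resource(jwt_identity, user_id):
    """Reject a mutation unless the authenticated identity matches user_id.

    jwt_identity is passed in explicitly so the guard is testable;
    in the resolver it would be the return value of get_jwt_identity().
    """
    if jwt_identity is None:
        raise PermissionError("Missing or invalid token")
    # Normalize to strings so an int user_id matches a string JWT subject
    if str(jwt_identity) != str(user_id):
        raise PermissionError(
            f"User {jwt_identity} may not act on behalf of user {user_id}"
        )

# A matching identity passes silently; a mismatch raises
ensure_caller_owns_resource("42", 42)
try:
    ensure_caller_owns_resource("42", 7)
except PermissionError as e:
    print("rejected:", e)
```

Calling this at the top of `mutate` (after the decorator) closes the spoofed-user_id hole the review describes.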
1181-1187: ⚠️ Potential issue | 🟡 Minor
Incorrect error message and silently swallowed delete failure.
Line 1187 prints "Upload error" but this is a delete operation. Additionally, silently swallowing the `ClientError` means the user is deleted from the database but their profile image may remain in cloud storage as an orphan. Consider either propagating the error or logging it correctly and proceeding with a warning.

Proposed fix:

```diff
     try:
         s3.delete_object(
             Bucket="appdev-upload",
             Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
         )
     except ClientError as e:
-        print("Upload error:", e)
+        print(f"Warning: Failed to delete profile image for user {user.net_id}: {e}")
+        # Proceeding with user deletion despite image deletion failure
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1181 - 1187, The delete_object error handler incorrectly logs "Upload error" and swallows failures; update the except block for s3.delete_object to log a correct, descriptive message (e.g., "S3 delete_object failed for uplift-dev/user-profile/{user.net_id}-profile.png") including the caught ClientError details and either re-raise the exception or log as a warning depending on desired flow; locate the try/except around s3.delete_object (references: s3.delete_object, ClientError, user.net_id) and change the print to use the process logger or raise the error so orphaned files are not silently ignored.
🧹 Nitpick comments (2)
src/schema.py (2)
11-11: Remove unused import.
`WeeklyChallengeModel` is imported but never used in this file. As per static analysis hint (F401).

Proposed fix:

```diff
-from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 11, The import WeeklyChallengeModel in src/schema.py is unused; remove the line "from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel" from the file to satisfy the F401 lint and avoid dead imports, and run tests/lint to ensure no remaining references to WeeklyChallengeModel in src/schema.py.
38-38: Remove duplicate import.
`base64` is already imported on line 2. As per static analysis hint (F811).

Proposed fix:

```diff
 import boto3
 from botocore.exceptions import ClientError
-import base64
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 38, The file contains a duplicate import of the base64 module (another import already exists near the top of src/schema.py); remove the redundant "import base64" statement shown in the diff so only the single original import remains, ensuring no F811 duplicate-import lint error.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/schema.py`:
- Around line 1031-1032: The SetWorkoutGoals mutation currently has its
authentication disabled: uncomment and restore the `@jwt_required()` decorator
above the mutate method (the mutate function inside the SetWorkoutGoals class)
so that JWT-authenticated users are required to call this mutation; ensure the
decorator from flask_jwt_extended (or your JWT library) is imported and applied
directly above def mutate(self, info, user_id, workout_goal) to enforce
authentication.
- Around line 843-869: Guard against encoded_image being None before calling
base64.b64decode and only attempt S3 upload when encoded_image is present; in
the upload block use a single canonical key (e.g. key =
f"uplift-dev/user-profile/{net_id}-profile.png") and construct final_photo_url
using the bucket name and key (include "appdev-upload" in the URL) so it points
to the actual object, handle ClientError by setting final_photo_url to a safe
fallback (None or empty string) inside the except block, and remove unused
variables (response and the previous unused key assignment) so only the used key
and final_photo_url remain when creating the UserModel.
- Around line 1083-1084: The logWorkout GraphQL mutation currently has its
authentication disabled because the `@jwt_required()` decorator is commented out;
re-enable authentication by uncommenting and applying `@jwt_required()` to the
mutate method (the method named mutate on the logWorkout mutation class) and
ensure the resolver uses the authenticated identity (e.g., read current_user or
info.context after JWT validation) rather than trusting the passed user_id
parameter to prevent unauthenticated or spoofed workout logs.
- Around line 1181-1187: The delete_object error handler incorrectly logs
"Upload error" and swallows failures; update the except block for
s3.delete_object to log a correct, descriptive message (e.g., "S3 delete_object
failed for uplift-dev/user-profile/{user.net_id}-profile.png") including the
caught ClientError details and either re-raise the exception or log as a warning
depending on desired flow; locate the try/except around s3.delete_object
(references: s3.delete_object, ClientError, user.net_id) and change the print to
use the process logger or raise the error so orphaned files are not silently
ignored.
---
Nitpick comments:
In `@src/schema.py`:
- Line 11: The import WeeklyChallengeModel in src/schema.py is unused; remove
the line "from src.models.weekly_challenge import WeeklyChallenge as
WeeklyChallengeModel" from the file to satisfy the F401 lint and avoid dead
imports, and run tests/lint to ensure no remaining references to
WeeklyChallengeModel in src/schema.py.
- Line 38: The file contains a duplicate import of the base64 module (another
import already exists near the top of src/schema.py); remove the redundant
"import base64" statement shown in the diff so only the single original import
remains, ensuring no F811 duplicate-import lint error.
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/schema.py (1)
1169-1173: ⚠️ Potential issue | 🟠 Major
Add `@jwt_required()` decorator to `DeleteUserById.mutate()`.
The mutation currently allows deleting any user by ID without authentication. This is inconsistent with other user-modifying mutations like `SetWorkoutGoals` and `LogWorkout`, which are protected with `@jwt_required()`. A public endpoint that can delete any user is a security risk.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1169 - 1173, The DeleteUserById.mutate method is missing authentication; add the `@jwt_required()` decorator to the DeleteUserById.mutate method so the mutation requires a valid JWT, and ensure you import jwt_required (e.g., from flask_jwt_extended import jwt_required) at the top of the module; update the decorator directly above the mutate method in the DeleteUserById class and run tests to confirm other auth logic (current_user access, permission checks) still works.
♻️ Duplicate comments (2)
src/schema.py (2)
1181-1187: ⚠️ Potential issue | 🟡 Minor
Incorrect error message and silently swallowed failure.
The issues from the previous review remain:
- Line 1187 prints "Upload error" but this is a delete operation.
- The `ClientError` is swallowed, so the user is deleted from the database while their profile image may remain orphaned in cloud storage.

Consider either propagating the error or at minimum logging correctly:

Proposed fix:

```diff
     try:
         s3.delete_object(
             Bucket="appdev-upload",
             Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
         )
     except ClientError as e:
-        print("Upload error:", e)
+        # Log but continue - deleting the user is more important than cleaning up the image
+        print(f"Warning: Failed to delete profile image for user {user.net_id}: {e}")
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1181 - 1187, The delete-block around s3.delete_object is swallowing ClientError and has the wrong log text; update the except handler for s3.delete_object (Bucket "appdev-upload", Key f"uplift-dev/user-profile/{user.net_id}-profile.png") to log an accurate message (e.g., "Delete profile image error") including the exception details and then either re-raise the ClientError or propagate it to the caller instead of silently continuing so the user record isn't deleted while the S3 object may remain; ensure you reference ClientError and user.net_id in the handler so the caller can handle or abort the DB delete upon failure.
849-867: ⚠️ Potential issue | 🔴 Critical
Critical: `encoded_image` can be `None` and `final_photo_url` may be undefined.
The issues from the previous review remain unaddressed:
- `encoded_image` is optional (`required=False`), but `base64.b64decode(encoded_image)` will raise `TypeError` if `encoded_image` is `None`.
- If a `ClientError` occurs, `final_photo_url` is never assigned, causing `NameError` on line 869.
- Lines 854 and 862 assign `response` and `key` but neither is used.

Proposed fix:

```diff
+        final_photo_url = None
+        if encoded_image:
-        # Decode the base64 image
-        image_data = base64.b64decode(encoded_image)
-
-        # Upload to Spaces
-        try:
-            response = s3.put_object(
-                Bucket="appdev-upload",
-                Key=f"uplift-dev/user-profile/{net_id}-profile.png",
-                Body=image_data,
-                ContentType="image/png",
-                ACL="public-read"
-            )
-
-            key = "uplift-dev/photo.jpg"
-            final_photo_url = f"https://nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
-
-
-        except ClientError as e:
-            print("Upload error:", e)
+            # Decode the base64 image
+            image_data = base64.b64decode(encoded_image)
+
+            # Upload to Spaces
+            try:
+                s3.put_object(
+                    Bucket="appdev-upload",
+                    Key=f"uplift-dev/user-profile/{net_id}-profile.png",
+                    Body=image_data,
+                    ContentType="image/png",
+                    ACL="public-read"
+                )
+                final_photo_url = f"https://appdev-upload.nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
+            except ClientError as e:
+                raise GraphQLError(f"Failed to upload profile image: {e}")
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 849 - 867, Handle the optional encoded_image and ensure final_photo_url is always defined: before calling base64.b64decode(encoded_image) in the block using encoded_image and s3.put_object, check if encoded_image is None and either skip the upload (set final_photo_url to a sensible default/None) or return/raise a clear error; initialize final_photo_url to a safe default before the try so a ClientError cannot leave it undefined; remove or use the unused locals response and key (either drop them or use response/Key in further logic) and catch ClientError from botocore.exceptions to log the exception (e) while keeping final_photo_url defined.
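Because the suggested guard is mostly pure logic, it can be factored out and exercised without boto3 at all. A minimal sketch under this review's assumptions (bucket `appdev-upload`, region `nyc3`; `build_photo_upload` and the sample net_id `jd123` are made up for illustration):

```python
import base64
from typing import Optional, Tuple

def build_photo_upload(
    encoded_image: Optional[str], net_id: str
) -> Tuple[Optional[bytes], Optional[str], Optional[str]]:
    """Return (image_bytes, object_key, public_url) for an optional avatar.

    If no image was sent, all three are None, so the caller always has a
    defined value to store - avoiding the TypeError/NameError paths above.
    """
    if not encoded_image:
        return None, None, None
    image_data = base64.b64decode(encoded_image)
    key = f"uplift-dev/user-profile/{net_id}-profile.png"
    url = f"https://appdev-upload.nyc3.digitaloceanspaces.com/{key}"
    return image_data, key, url

# No image: everything is None, nothing to upload
print(build_photo_upload(None, "jd123"))
# With an image: decoded bytes plus a bucket-qualified key and URL
data, key, url = build_photo_upload(base64.b64encode(b"png-bytes").decode(), "jd123")
print(key)
```

The resolver would then call `s3.put_object` only when `image_data` is not None, and pass `url` (possibly None) to `UserModel`.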
🧹 Nitpick comments (2)
src/schema.py (2)
11-11: Remove unused and duplicate imports.
Static analysis flagged:
- Line 11: `WeeklyChallengeModel` is imported but never used in this file.
- Line 38: `base64` is imported again, duplicating line 2.

Proposed fix:

```diff
-from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
 import boto3
 from botocore.exceptions import ClientError
-import base64
```

Also applies to: 36-38
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 11, Remove the unused import WeeklyChallengeModel and the duplicate base64 import: delete the line that imports WeeklyChallenge as WeeklyChallengeModel (it's never referenced) and remove the second/base64 re-import so only a single import of base64 remains; confirm only necessary imports exist at the top of the module and run linters/type-checks to verify no remaining unused imports.
843-847: Consider extracting S3 client initialization and constants.
The S3 client creation and bucket/path configurations are duplicated. Extracting these would reduce duplication and make configuration changes easier.

Suggested approach:

```python
# At module level or in a separate utils module
SPACES_BUCKET = "appdev-upload"
PROFILE_IMAGE_PREFIX = "uplift-dev/user-profile"

def get_s3_client():
    return boto3.client(
        's3',
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
    )

def get_profile_image_key(net_id: str) -> str:
    return f"{PROFILE_IMAGE_PREFIX}/{net_id}-profile.png"

def get_profile_image_url(net_id: str) -> str:
    return f"https://{SPACES_BUCKET}.nyc3.digitaloceanspaces.com/{PROFILE_IMAGE_PREFIX}/{net_id}-profile.png"
```

Also applies to: 1175-1179
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 843 - 847, The S3 client creation (variable s3) and hard-coded bucket/path strings are duplicated; extract them into module-level constants (e.g., SPACES_BUCKET, PROFILE_IMAGE_PREFIX) and a factory function get_s3_client() that returns the boto3.client configured with DIGITAL_OCEAN_URL/ACCESS/SECRET, and add helper functions get_profile_image_key(net_id) and get_profile_image_url(net_id) to build keys/URLs; then replace direct instantiations of s3 and inlined path strings (the s3 = boto3.client(...) usages and the profile path constructions currently duplicated at the shown diff and around lines 1175-1179) with calls to get_s3_client(), get_profile_image_key(), and get_profile_image_url() so configuration is centralized and duplication removed.
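One payoff of this extraction: the key/URL builders are pure string functions, so they can be sanity-checked without touching Spaces. A self-contained sketch (the net_id `jd123` is made up; bucket and prefix are the values named in this review):

```python
SPACES_BUCKET = "appdev-upload"
PROFILE_IMAGE_PREFIX = "uplift-dev/user-profile"

def get_profile_image_key(net_id: str) -> str:
    # Single canonical object key shared by CreateUser and DeleteUserById
    return f"{PROFILE_IMAGE_PREFIX}/{net_id}-profile.png"

def get_profile_image_url(net_id: str) -> str:
    # URL embeds the bucket name, addressing the broken-link finding above
    return f"https://{SPACES_BUCKET}.nyc3.digitaloceanspaces.com/{get_profile_image_key(net_id)}"

print(get_profile_image_key("jd123"))
print(get_profile_image_url("jd123"))
```

With these in place, both mutations derive the same key from one function, so upload and delete can no longer drift apart.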
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Line 863: The constructed final_photo_url is missing the bucket and CDN
domain; update how final_photo_url is built (the variable final_photo_url in
this file) to use the canonical DigitalOcean Spaces format used elsewhere
(ASSET_BASE_URL from constants.py) or explicitly include the bucket
"appdev-upload" and the CDN host (e.g.,
{bucket}.{region}.cdn.digitaloceanspaces.com) so the URL becomes
"https://appdev-upload.{region}.cdn.digitaloceanspaces.com/user-profile/{net_id}-profile.png"
(use the existing ASSET_BASE_URL constant if available to assemble the path with
net_id and the same user-profile path).
---
Outside diff comments:
In `@src/schema.py`:
- Around line 1169-1173: The DeleteUserById.mutate method is missing
authentication; add the `@jwt_required()` decorator to the DeleteUserById.mutate
method so the mutation requires a valid JWT, and ensure you import jwt_required
(e.g., from flask_jwt_extended import jwt_required) at the top of the module;
update the decorator directly above the mutate method in the DeleteUserById
class and run tests to confirm other auth logic (current_user access, permission
checks) still works.
---
Duplicate comments:
In `@src/schema.py`:
- Around line 1181-1187: The delete-block around s3.delete_object is swallowing
ClientError and has the wrong log text; update the except handler for
s3.delete_object (Bucket "appdev-upload", Key
f"uplift-dev/user-profile/{user.net_id}-profile.png") to log an accurate message
(e.g., "Delete profile image error") including the exception details and then
either re-raise the ClientError or propagate it to the caller instead of
silently continuing so the user record isn't deleted while the S3 object may
remain; ensure you reference ClientError and user.net_id in the handler so the
caller can handle or abort the DB delete upon failure.
- Around line 849-867: Handle the optional encoded_image and ensure
final_photo_url is always defined: before calling
base64.b64decode(encoded_image) in the block using encoded_image and
s3.put_object, check if encoded_image is None and either skip the upload (set
final_photo_url to a sensible default/None) or return/raise a clear error;
initialize final_photo_url to a safe default before the try so a ClientError
cannot leave it undefined; remove or use the unused locals response and key
(either drop them or use response/Key in further logic) and catch ClientError
from botocore.exceptions to log the exception (e) while keeping final_photo_url
defined.
---
Nitpick comments:
In `@src/schema.py`:
- Line 11: Remove the unused import WeeklyChallengeModel and the duplicate
base64 import: delete the line that imports WeeklyChallenge as
WeeklyChallengeModel (it's never referenced) and remove the second/base64
re-import so only a single import of base64 remains; confirm only necessary
imports exist at the top of the module and run linters/type-checks to verify no
remaining unused imports.
- Around line 843-847: The S3 client creation (variable s3) and hard-coded
bucket/path strings are duplicated; extract them into module-level constants
(e.g., SPACES_BUCKET, PROFILE_IMAGE_PREFIX) and a factory function
get_s3_client() that returns the boto3.client configured with
DIGITAL_OCEAN_URL/ACCESS/SECRET, and add helper functions
get_profile_image_key(net_id) and get_profile_image_url(net_id) to build
keys/URLs; then replace direct instantiations of s3 and inlined path strings
(the s3 = boto3.client(...) usages and the profile path constructions currently
duplicated at the shown diff and around lines 1175-1179) with calls to
get_s3_client(), get_profile_image_key(), and get_profile_image_url() so
configuration is centralized and duplication removed.
JoshD94 left a comment:
lgtm after we fix the 3 issues found by coderabbit
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Line 11: Remove the unused import WeeklyChallengeModel and the duplicate
base64 import to fix the lint regressions: delete the line importing
WeeklyChallenge as WeeklyChallengeModel (since it isn't referenced anywhere) and
remove the redundant second import of base64 so only a single base64 import
remains; ensure there are no other references to WeeklyChallengeModel before
removing.
- Around line 1178-1194: Add an authorization/ownership check before calling
s3.delete_object so only the account owner or an authorized admin can delete the
profile image: fetch the current authenticated user (e.g., from context or auth
helper used elsewhere in schema.py), compare its identifier to the target "user"
(or check an is_admin flag) and raise GraphQLError("Not authorized") if the
caller is not the owner/admin; only proceed to call s3.delete_object for
user.net_id when the check passes (leave the existing ClientError handling
intact).
- Around line 856-868: The upload code currently hardcodes an object Key (path)
and builds final_photo_url, but EditUser still calls the legacy upload contract
(sending only bucket and image) and DeleteUserById re-derives the fixed key —
causing orphaned avatars; fix by unifying the storage contract: either (A) stop
hardcoding Key and instead call the same legacy upload service from this code
and use the returned URL (so the stored value matches what EditUser saves), or
(B) change EditUser to send the object Key and bucket (not the raw image) so it
stores the same path this code writes; update DeleteUserById to delete using the
stored URL (parse bucket and key from final_photo_url or from stored fields) and
ensure functions EditUser and DeleteUserById and the upload code that builds
final_photo_url/path all use the same bucket+key format.
Quoted code (src/schema.py, imports):

```python
from graphql import GraphQLError
from src.models.capacity import Capacity as CapacityModel
from src.models.capacity_reminder import CapacityReminder as CapacityReminderModel
from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
```
Remove the new lint regressions.
WeeklyChallengeModel is unused, and base64 is imported twice (Lines 2 and 38). Both are already lint errors in this file.
Also applies to: 38-38
🧰 Tools
🪛 Flake8 (7.3.0)
[error] 11-11: 'src.models.weekly_challenge.WeeklyChallenge as WeeklyChallengeModel' imported but unused
(F401)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/schema.py` at line 11, Remove the unused import WeeklyChallengeModel and
the duplicate base64 import to fix the lint regressions: delete the line
importing WeeklyChallenge as WeeklyChallengeModel (since it isn't referenced
anywhere) and remove the redundant second import of base64 so only a single
base64 import remains; ensure there are no other references to
WeeklyChallengeModel before removing.
    bucket = "appdev-upload"
    path = f"uplift-dev/user-profile/{net_id}-profile.png"
    region = "nyc3"

    s3.put_object(
        Bucket=bucket,
        Key=path,
        Body=image_data,
        ContentType="image/png",
        ACL="public-read"
    )

    final_photo_url = f"https://{bucket}.{region}.digitaloceanspaces.com/{path}"
Keep create/edit/delete on one storage contract.
This mutation now writes a hard-coded object key, but EditUser still sends only bucket and image to the legacy upload service on Lines 903-906 and stores whatever URL comes back. DeleteUserById later re-derives this fixed key instead of deleting the stored object, so avatars uploaded through EditUser will be orphaned.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/schema.py` around lines 856 - 868, The upload code currently hardcodes an
object Key (path) and builds final_photo_url, but EditUser still calls the
legacy upload contract (sending only bucket and image) and DeleteUserById
re-derives the fixed key — causing orphaned avatars; fix by unifying the storage
contract: either (A) stop hardcoding Key and instead call the same legacy upload
service from this code and use the returned URL (so the stored value matches
what EditUser saves), or (B) change EditUser to send the object Key and bucket
(not the raw image) so it stores the same path this code writes; update
DeleteUserById to delete using the stored URL (parse bucket and key from
final_photo_url or from stored fields) and ensure functions EditUser and
DeleteUserById and the upload code that builds final_photo_url/path all use the
same bucket+key format.
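One way to unify the contract, sketched under the assumption that the avatar key is always derived from `net_id`, is to centralize the bucket/key/URL logic in helpers that CreateUser, EditUser, and DeleteUserById all call. The names `profile_object_key`, `spaces_url`, and `parse_spaces_url` are hypothetical, not functions in the PR:

```python
from urllib.parse import urlparse

BUCKET = "appdev-upload"
REGION = "nyc3"


def profile_object_key(net_id: str) -> str:
    """Single source of truth for where a user's avatar lives."""
    return f"uplift-dev/user-profile/{net_id}-profile.png"


def spaces_url(bucket: str, key: str, region: str = REGION) -> str:
    """Build the public Spaces URL for a stored object."""
    return f"https://{bucket}.{region}.digitaloceanspaces.com/{key}"


def parse_spaces_url(url: str) -> tuple[str, str]:
    """Recover (bucket, key) from a stored photo_url, so deletes
    target exactly the object that was written."""
    parsed = urlparse(url)
    return parsed.netloc.split(".")[0], parsed.path.lstrip("/")
```

With helpers like these, DeleteUserById can delete whatever URL is actually stored on the user row instead of re-deriving a fixed key.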
    s3 = boto3.client('s3',
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
    )

    if user.encoded_image:
        try:
            s3.delete_object(
                Bucket="appdev-upload",
                Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
            )
        except ClientError as e:
            print("Delete error:", e)
            raise GraphQLError("Error deleting user profile picture")
Protect the new delete path with authorization.
This resolver still has no auth or ownership check, so any caller that can reach GraphQL can delete another user's account and profile image by ID.
🧰 Tools
🪛 Flake8 (7.3.0)
[error] 1180-1180: continuation line under-indented for visual indent
(E128)
[error] 1181-1181: continuation line under-indented for visual indent
(E128)
[error] 1182-1182: continuation line under-indented for visual indent
(E128)
[error] 1183-1183: closing bracket does not match visual indentation
(E124)
🪛 Ruff (0.15.6)
[warning] 1193-1193: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/schema.py` around lines 1178 - 1194, Add an authorization/ownership check
before calling s3.delete_object so only the account owner or an authorized admin
can delete the profile image: fetch the current authenticated user (e.g., from
context or auth helper used elsewhere in schema.py), compare its identifier to
the target "user" (or check an is_admin flag) and raise GraphQLError("Not
authorized") if the caller is not the owner/admin; only proceed to call
s3.delete_object for user.net_id when the check passes (leave the existing
ClientError handling intact).
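A minimal ownership check along those lines might look like the following sketch. `NotAuthorized` stands in for `GraphQLError("Not authorized")` so the snippet stays self-contained, and the id/flag names are assumptions rather than fields confirmed by the PR:

```python
class NotAuthorized(Exception):
    """Stand-in for GraphQLError("Not authorized") in this sketch."""


def assert_may_delete(current_user_id, target_user_id, is_admin=False):
    """Allow the owner to delete their own account, or an admin to
    delete anyone; reject every other caller."""
    if current_user_id != target_user_id and not is_admin:
        raise NotAuthorized("Not authorized")
```

The resolver would call this before touching S3 or the database, leaving the existing `ClientError` handling unchanged.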
Overview
Our team had some issues with adding users (specifically user profile pictures) and deleting users on the frontend through GraphQL mutations. This PR addresses some of those issues.
The issues relating to adding users and uploading profile pictures were the first priority. I realized we were uploading profile pictures to Digital Ocean through a `request` when we should be using the `boto3` client and passing in access keys to perform WRITE and DELETE operations on the buckets.

The issues relating to deleting users stemmed from workouts linked to users not being deleted, and this was easily fixed by adding a foreign key with a cascade-delete constraint on the relationship between workouts and users.
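The cascade-delete behavior described above can be demonstrated with stdlib `sqlite3` (the table and column names here are illustrative, not the app's actual schema):

```python
import sqlite3

# SQLite enforces foreign keys only when the pragma is switched on
# per connection.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, net_id TEXT)")
conn.execute("""
    CREATE TABLE workouts (
        id INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL
            REFERENCES users(id) ON DELETE CASCADE
    )
""")
conn.execute("INSERT INTO users VALUES (1, 'abc123')")
conn.execute("INSERT INTO workouts VALUES (10, 1)")

# Deleting the user removes the linked workout automatically.
conn.execute("DELETE FROM users WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM workouts").fetchone()[0]
```

With the constraint in place, no application code has to chase down dependent rows before a user delete.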
Lastly, I realized the DeleteUser mutation was not deleting user profile pictures, so I added that functionality using the `boto3` client.

Test Coverage
Tested locally.
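The delete path could also be exercised automatically with a stubbed client; a sketch, where `FakeS3` and `delete_profile_picture` are hypothetical helpers but the Key format mirrors the PR's DeleteUserById:

```python
class FakeS3:
    """Minimal stand-in for the boto3 S3 client, recording calls."""

    def __init__(self):
        self.deleted = []

    def delete_object(self, Bucket, Key):
        self.deleted.append((Bucket, Key))


def delete_profile_picture(s3, net_id):
    # Mirrors the bucket and key format used in the mutation.
    s3.delete_object(
        Bucket="appdev-upload",
        Key=f"uplift-dev/user-profile/{net_id}-profile.png",
    )


s3 = FakeS3()
delete_profile_picture(s3, "abc123")
```

A check like this catches regressions in the key format without touching real Spaces buckets.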
Summary by CodeRabbit
Updates
Bug Fixes