📝 Walkthrough

This PR adds deep linking and web support to the Polybuys app. It configures iOS/Android deep link handlers via app.json and site association files, refactors the listing detail screen to support messaging flows and web metadata, and enhances the listing edit screen with form state management and image handling.
Sequence Diagram

```mermaid
sequenceDiagram
    participant User as User / Web
    participant Client as Mobile/Web Client
    participant Auth as Authentication Service
    participant Server as Convex Server
    User->>Client: Load listing detail page
    Client->>Auth: Check authentication state
    alt Authenticated
        Auth-->>Client: Return user session
        Client->>Server: Fetch listing + current user subject
    else Not Authenticated
        Auth-->>Client: No session
        Client->>Server: Fetch listing only (skip user subject)
    end
    Server-->>Client: Return listing data
    alt Web Platform
        Client->>Client: Generate Head metadata (OG tags)
        Client->>User: Render listing with web banner + deep link button
    else Mobile Platform
        alt User is Owner
            Client->>User: Render "Edit Listing" action
        else User is Authenticated
            Client->>User: Render "Message Seller" action
        else User Not Authenticated
            Client->>User: Render "Sign in to message seller"
            User->>Client: Click sign in
            Client->>Auth: Redirect to login (with returnTo)
        end
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 1 passed | ❌ 2 failed (1 warning, 1 inconclusive)
Actionable comments posted: 10
🧹 Nitpick comments (4)
frontend/app/index.tsx (1)
22-24: Consider adding accessibility props to the button.

The navigation and styling look correct. For better accessibility support, consider adding `accessibilityLabel` and `accessibilityRole="button"` to the TouchableOpacity.

♿ Optional accessibility improvement

```diff
- <TouchableOpacity style={styles.createButton} onPress={() => router.push('/listings/new')}>
+ <TouchableOpacity
+   style={styles.createButton}
+   onPress={() => router.push('/listings/new')}
+   accessibilityRole="button"
+   accessibilityLabel="Create a new listing"
+ >
    <Text style={styles.createButtonText}>Create Listing</Text>
  </TouchableOpacity>
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@frontend/app/index.tsx` around lines 22 - 24, Add accessibility props to the TouchableOpacity that triggers router.push('/listings/new') so screen readers identify it as a button; update the TouchableOpacity (the element using styles.createButton and the inner Text createButtonText) to include accessibilityLabel (e.g., "Create listing") and accessibilityRole="button" (and optionally accessible={true}) so assistive tech can announce it properly.

frontend/app/listings/[id]/edit.tsx (1)
27-38: Consider adding a guard for `listing.images`.

If `listing.images` is undefined or null (e.g., due to schema migration or data inconsistency), this could cause issues. Adding a fallback ensures robustness.

♻️ Proposed defensive fallback

```diff
   setCategory(listing.category);
   setCondition(listing.condition);
-  setImages(listing.images);
+  setImages(listing.images ?? []);
   setHasInitialized(true);
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@frontend/app/listings/`[id]/edit.tsx around lines 27 - 38, The effect in useEffect that initializes form state (functions setTitle, setDescription, setPrice, setCategory, setCondition, setImages) should defensively handle a missing or null listing.images; update the setImages call to use a safe fallback (e.g., an empty array when listing.images is undefined/null or not an array) so initialization never receives undefined and downstream code expecting an array is safe; locate the useEffect block and replace the direct setImages(listing.images) with a guarded assignment (using a nullish or type check) and keep the rest of the initialization and setHasInitialized(true) unchanged.

frontend/hooks/useResolvedImageUrls.ts (1)
43-50: Consider moving ref update outside the setState callback.

Updating `resolvedUrlsRef.current` inside the `setState` callback works but could lead to subtle timing issues if multiple rapid state updates occur. Moving the ref update outside ensures the ref always reflects the latest resolved state.

♻️ Proposed refactor

```diff
   if (cancelled) {
     return;
   }
+  const newResolved: Record<string, string | null> = {};
+  for (const [storageId, url] of nextEntries) {
+    newResolved[storageId] = url;
+  }
+
   setResolvedUrls((prev) => {
-    const next = { ...prev };
-    for (const [storageId, url] of nextEntries) {
-      next[storageId] = url;
-    }
-    resolvedUrlsRef.current = next;
+    const next = { ...prev, ...newResolved };
+    resolvedUrlsRef.current = next;
     return next;
   });
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@frontend/hooks/useResolvedImageUrls.ts` around lines 43 - 50, The state/ref update should be decoupled: build the merged object from the current ref, apply nextEntries to produce next (e.g. const next = { ...resolvedUrlsRef.current }; for (const [storageId, url] of nextEntries) next[storageId] = url;), then call setResolvedUrls(next) and finally set resolvedUrlsRef.current = next; this keeps resolvedUrlsRef in sync while avoiding updating the ref inside the setResolvedUrls callback (references: setResolvedUrls, resolvedUrlsRef, nextEntries).

frontend/components/ImageUploader.tsx (1)
277-283: Use collision-resistant pending IDs.

`pending-${Date.now()}` can collide during fast multi-pick flows. A UUID/random suffix avoids accidental state overwrite.

🔧 Proposed fix

```diff
- const localId = `pending-${Date.now()}`;
+ const localId = `pending-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`;
  ...
- const localId = `pending-${Date.now()}`;
+ const localId = `pending-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`;
```

Also applies to: 302-310

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@frontend/components/ImageUploader.tsx` around lines 277 - 283, The pending upload IDs use `pending-${Date.now()}` which can collide during fast multi-pick flows; update the code that generates `localId` (the value pushed into `setPendingUploads`) to use a collision-resistant generator (e.g., `uuid` or a crypto-random suffix) instead of Date.now(), and change the other similar occurrence (the one around the block that updates pending uploads at the later occurrence) to use the same generator so `localId` is globally unique across `localId`, `setPendingUploads`, and any references to that id when resolving/removing pending uploads.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/convex/listings.ts`:
- Around line 122-124: The createListing mutation is missing server-side
enforcement of non-negative prices; add the same check used in updateListing by
calling the price validation (e.g., invoke validatePrice(args.price) or perform
if (args.price < 0) throw new Error(...)) inside createListing before persisting
the listing (alongside validateTitle and validateImages) so negative prices are
rejected server-side.
- Around line 113-121: The current code allows using client-provided
args.sellerEmail when identity.email is absent (variables authenticatedEmail,
providedEmail, sellerEmail), enabling impersonation; change the logic to require
a verified identity email from identity.email only and never accept
args.sellerEmail as a fallback: remove the fallback assignment (sellerEmail =
authenticatedEmail ?? providedEmail) and instead throw an error if
identity.email is missing or null, or implement a server-side verification step
(e.g., lookup/verify args.sellerEmail against a trusted server-side binding)
before using it; update any code paths that reference sellerEmail to rely
exclusively on the verified identity email (authenticatedEmail).
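The two listings.ts findings above boil down to two server-side guards. A minimal sketch of what they could look like as standalone helpers; the names (`validatePrice`, `requireVerifiedEmail`) are illustrative and not the backend's actual API:

```typescript
// Hypothetical guard mirroring the price check updateListing is said to have:
// reject negative (and non-finite) prices before persisting.
function validatePrice(price: number): void {
  if (!Number.isFinite(price) || price < 0) {
    throw new Error('Price must be a non-negative number.');
  }
}

// Hypothetical guard for the seller email: use only the identity's verified
// email and never fall back to a client-supplied address.
function requireVerifiedEmail(identityEmail: string | null | undefined): string {
  if (!identityEmail) {
    throw new Error('A verified account email is required to create a listing.');
  }
  return identityEmail.trim().toLowerCase();
}
```

Inside `createListing`, these would run alongside `validateTitle` and `validateImages` before the insert.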
In `@backend/convex/messages.ts`:
- Around line 60-66: The patch currently leaves the sender's unreadCount as
(existingRow.unreadCount ?? 0), which can preserve stale badges; in the
ctx.db.patch call that updates existingRow (refer to existingRow._id and the
unreadCount field in that patch), change the logic so when participantId ===
senderId you explicitly set unreadCount to 0, otherwise set it to
(existingRow.unreadCount ?? 0) + 1; update the unreadCount expression in the
function handling message writes (where participantId and senderId are in scope)
accordingly.
- Around line 161-185: The unbounded read risk: ensure args.limit is
validated/capped and avoid using .collect() for same-timestamp scans; in the
conversationParticipants queries (see parseConversationCursor, participantRows,
sameTimestampRows, and index by_user_lastActivityAt) validate and clamp limit
(e.g., max cap like 100) before use and replace the .collect() on the
sameTimestampRows branch with a bounded query (e.g., .take(limit + 1) or add a
deterministic secondary sort/key and limit by that key) so the same-timestamp
branch cannot read an arbitrarily large result set in one request.
In `@frontend/app/listings/`[id].tsx:
- Around line 26-28: The query is using the raw route param `id` which may be an
array/undefined; update the call to use the normalized param `listingId` instead
so the backend receives a valid ID shape. In the useQuery invocation for
`api.listings.getListing` (the call currently passing `{ id: id as
Id<'listings'> }`), replace the `id` value with `listingId` (and keep the
existing Id<'listings'> typing if needed) so the normalized/validated route
param is always sent to the backend.
- Around line 51-60: The handleOpenInApp callback sets a fallback setTimeout but
never clears it, causing stray redirects; fix by storing the timeout ID (e.g.,
in a ref) and clearing any existing timeout before creating a new one, and add a
useEffect cleanup that calls clearTimeout on that ref when the component
unmounts; reference handleOpenInApp, deepLinkUrl, fallbackStoreUrl, and
listingId to locate the code and ensure you call clearTimeout(timerId) both
before setting a new timer and in the component cleanup.
In `@frontend/app/listings/`[id]/edit.tsx:
- Around line 14-16: The code uses useLocalSearchParams to read id and
immediately casts it to Id<'listings'> for useQuery (api.listings.getListing)
and useMutation (api.listings.updateListing) without validating it; add
validation to ensure id is present and non-empty before calling
useQuery/useMutation: read const { id } = useLocalSearchParams..., check if id
is a non-empty string (return an error UI, redirect, or render a loading/invalid
state) and only call useQuery(api.listings.getListing, { id: id as
Id<'listings'> }) and wire updateListing
(useMutation(api.listings.updateListing)) when id is valid; ensure all
downstream code that assumes listing data guards against the invalid state.
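Both route-param findings above reduce to one normalization step: expo-router's `useLocalSearchParams` can yield `string | string[] | undefined`, so collapse that to a single non-empty string (or null) before casting to `Id<'listings'>`. A sketch; the helper name is illustrative:

```typescript
// Normalize an expo-router search param to one usable string, or null when
// the param is absent, empty, or an empty array.
function normalizeRouteParam(param: string | string[] | undefined): string | null {
  const value = Array.isArray(param) ? param[0] : param;
  return typeof value === 'string' && value.length > 0 ? value : null;
}
```

The screen would render an invalid/loading state when this returns null, and only then call `useQuery(api.listings.getListing, ...)` with the normalized value.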
In `@frontend/app/listings/new.tsx`:
- Around line 31-47: The form validates sellerEmail into normalizedEmail but
never includes it in the create-listing payload; update the code that sends the
listing (the API call or create function invoked after validation) to include
the normalizedEmail value under the backend-expected key (e.g., "email" or
"sellerEmail") so the server receives the user's email; apply the same fix for
the second occurrence referenced around lines 68-75 so both submit paths attach
normalizedEmail to the payload.
In `@frontend/components/ImageUploader.tsx`:
- Around line 203-239: The uploadToConvex function can hang indefinitely because
XMLHttpRequest has no timeout handling; update uploadToConvex to set xhr.timeout
(e.g., a sensible default or configurable value) and implement xhr.ontimeout to
reject with a clear timeout Error so upstream stays unblocked, and ensure the
timeout path mirrors xhr.onerror/ontimeout behavior (rejecting and stopping any
progress handling); reference the xhr instance, its ontimeout handler, and the
Promise resolve/reject flow in uploadToConvex to add this logic.
In `@frontend/public/.well-known/apple-app-site-association`:
- Around line 1-11: Replace the placeholder value in the AASA JSON where
"appID": "TEAMID.com.polybuy.app" with your actual Apple Developer Team ID
(format: <TEAMID>.com.polybuy.app) before production deployment; keep the
"paths": ["/listings/*"] entry as-is. Ensure the file is served from the
well-known location with Content-Type: application/json and that your server
configuration does not perform any redirects (including trailing-slash
redirects) for this file so Apple can fetch it unchanged.
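For reference, an AASA file matching the path described above would look roughly like the following. This is a sketch of the classic `applinks` layout, not the exact contents of the file in this PR; `TEAMID` is the placeholder the comment says to replace:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.com.polybuy.app",
        "paths": ["/listings/*"]
      }
    ]
  }
}
```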
---
Nitpick comments:
In `@frontend/app/index.tsx`:
- Around line 22-24: Add accessibility props to the TouchableOpacity that
triggers router.push('/listings/new') so screen readers identify it as a button;
update the TouchableOpacity (the element using styles.createButton and the inner
Text createButtonText) to include accessibilityLabel (e.g., "Create listing")
and accessibilityRole="button" (and optionally accessible={true}) so assistive
tech can announce it properly.
In `@frontend/app/listings/`[id]/edit.tsx:
- Around line 27-38: The effect in useEffect that initializes form state
(functions setTitle, setDescription, setPrice, setCategory, setCondition,
setImages) should defensively handle a missing or null listing.images; update
the setImages call to use a safe fallback (e.g., an empty array when
listing.images is undefined/null or not an array) so initialization never
receives undefined and downstream code expecting an array is safe; locate the
useEffect block and replace the direct setImages(listing.images) with a guarded
assignment (using a nullish or type check) and keep the rest of the
initialization and setHasInitialized(true) unchanged.
In `@frontend/components/ImageUploader.tsx`:
- Around line 277-283: The pending upload IDs use `pending-${Date.now()}` which
can collide during fast multi-pick flows; update the code that generates
`localId` (the value pushed into `setPendingUploads`) to use a
collision-resistant generator (e.g., `uuid` or a crypto-random suffix) instead
of Date.now(), and change the other similar occurrence (the one around the block
that updates pending uploads at the later occurrence) to use the same generator
so `localId` is globally unique across `localId`, `setPendingUploads`, and any
references to that id when resolving/removing pending uploads.
In `@frontend/hooks/useResolvedImageUrls.ts`:
- Around line 43-50: The state/ref update should be decoupled: build the merged
object from the current ref, apply nextEntries to produce next (e.g. const next
= { ...resolvedUrlsRef.current }; for (const [storageId, url] of nextEntries)
next[storageId] = url;), then call setResolvedUrls(next) and finally set
resolvedUrlsRef.current = next; this keeps resolvedUrlsRef in sync while
avoiding updating the ref inside the setResolvedUrls callback (references:
setResolvedUrls, resolvedUrlsRef, nextEntries).
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
- `backend/convex/_generated/api.d.ts` is excluded by `!**/_generated/**`
- `package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (24)
- babel.config.js
- backend/convex/__mocks__/server.js
- backend/convex/__tests__/messages.test.ts
- backend/convex/__tests__/schema.test.ts
- backend/convex/__tests__/tsconfig.json
- backend/convex/listings.ts
- backend/convex/messages.ts
- backend/convex/schema.ts
- backend/convex/tsconfig.json
- backend/jest.config.js
- backend/package.json
- frontend/app.json
- frontend/app/_layout.tsx
- frontend/app/index.tsx
- frontend/app/listings/[id].tsx
- frontend/app/listings/[id]/edit.tsx
- frontend/app/listings/new.tsx
- frontend/components/ImageUploader.tsx
- frontend/hooks/useResolvedImageUrls.ts
- frontend/package.json
- frontend/public/.well-known/apple-app-site-association
- frontend/public/.well-known/assetlinks.json
- jest.config.js
- package.json
💤 Files with no reviewable changes (1)
- backend/convex/tsconfig.json
```ts
await ctx.db.patch(existingRow._id, {
  lastActivityAt: now,
  unreadCount:
    participantId === senderId
      ? (existingRow.unreadCount ?? 0)
      : (existingRow.unreadCount ?? 0) + 1,
  updatedAt: now,
```
Reset sender unread count when they send a message.
Keeping (existingRow.unreadCount ?? 0) for the sender can preserve stale unread badges. Sending should clear the sender’s unread count.
🔧 Proposed fix

```diff
 unreadCount:
   participantId === senderId
-    ? (existingRow.unreadCount ?? 0)
+    ? 0
     : (existingRow.unreadCount ?? 0) + 1,
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/convex/messages.ts` around lines 60 - 66, The patch currently leaves
the sender's unreadCount as (existingRow.unreadCount ?? 0), which can preserve
stale badges; in the ctx.db.patch call that updates existingRow (refer to
existingRow._id and the unreadCount field in that patch), change the logic so
when participantId === senderId you explicitly set unreadCount to 0, otherwise
set it to (existingRow.unreadCount ?? 0) + 1; update the unreadCount expression
in the function handling message writes (where participantId and senderId are in
scope) accordingly.
```ts
const limit = args.limit || 20;
const parsedCursor = parseConversationCursor(args.cursor);
let participantRows: ConversationParticipantRow[];

if (!parsedCursor) {
  participantRows = (await ctx.db
    .query('conversationParticipants')
    .withIndex('by_user_lastActivityAt', (q) => q.eq('userId', userId))
    .order('desc')
    .take(limit + 1)) as ConversationParticipantRow[];
} else {
  const olderRows = (await ctx.db
    .query('conversationParticipants')
    .withIndex('by_user_lastActivityAt', (q) =>
      q.eq('userId', userId).lt('lastActivityAt', parsedCursor.lastActivityAt)
    )
    .order('desc')
    .take(limit + 1)) as ConversationParticipantRow[];

  const sameTimestampRows = (await ctx.db
    .query('conversationParticipants')
    .withIndex('by_user_lastActivityAt', (q) =>
      q.eq('userId', userId).eq('lastActivityAt', parsedCursor.lastActivityAt)
    )
    .collect()) as ConversationParticipantRow[];
```
Bound pagination inputs and avoid unbounded same-timestamp scans.
limit is unchecked, and the same-timestamp branch uses .collect(), which can become an unbounded read for hot timestamps.
🔧 Proposed fix

```diff
- const limit = args.limit || 20;
+ const requestedLimit = args.limit ?? 20;
+ if (!Number.isInteger(requestedLimit) || requestedLimit < 1 || requestedLimit > 50) {
+   throw new Error('limit must be an integer between 1 and 50');
+ }
+ const limit = requestedLimit;
```

Consider capping the same-timestamp query (or introducing a deterministic secondary sort key) so this path cannot read an arbitrarily large row set in one request.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@backend/convex/messages.ts` around lines 161 - 185, The unbounded read risk:
ensure args.limit is validated/capped and avoid using .collect() for
same-timestamp scans; in the conversationParticipants queries (see
parseConversationCursor, participantRows, sameTimestampRows, and index
by_user_lastActivityAt) validate and clamp limit (e.g., max cap like 100) before
use and replace the .collect() on the sameTimestampRows branch with a bounded
query (e.g., .take(limit + 1) or add a deterministic secondary sort/key and
limit by that key) so the same-timestamp branch cannot read an arbitrarily large
result set in one request.
```ts
const handleOpenInApp = useCallback(() => {
  if (Platform.OS !== 'web' || !listingId) {
    return;
  }

  window.location.href = deepLinkUrl;
  window.setTimeout(() => {
    window.location.href = fallbackStoreUrl;
  }, 1200);
}, [deepLinkUrl, fallbackStoreUrl, listingId]);
```
Clear the deep-link fallback timer to prevent delayed redirects.
The fallback setTimeout is never cleared. If the user navigates away quickly, the pending timer can still redirect them to the store from another screen.
🔧 Proposed fix

```diff
-import { useCallback, useMemo } from 'react';
+import { useCallback, useEffect, useMemo, useRef } from 'react';
 ...
+  const fallbackTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
+  useEffect(
+    () => () => {
+      if (fallbackTimerRef.current) {
+        clearTimeout(fallbackTimerRef.current);
+      }
+    },
+    []
+  );
 ...
-    window.setTimeout(() => {
+    if (fallbackTimerRef.current) {
+      clearTimeout(fallbackTimerRef.current);
+    }
+    fallbackTimerRef.current = window.setTimeout(() => {
       window.location.href = fallbackStoreUrl;
     }, 1200);
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
import { useCallback, useEffect, useMemo, useRef } from 'react';

// ... other imports and code ...

const fallbackTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
useEffect(
  () => () => {
    if (fallbackTimerRef.current) {
      clearTimeout(fallbackTimerRef.current);
    }
  },
  []
);

const handleOpenInApp = useCallback(() => {
  if (Platform.OS !== 'web' || !listingId) {
    return;
  }

  window.location.href = deepLinkUrl;
  if (fallbackTimerRef.current) {
    clearTimeout(fallbackTimerRef.current);
  }
  fallbackTimerRef.current = window.setTimeout(() => {
    window.location.href = fallbackStoreUrl;
  }, 1200);
}, [deepLinkUrl, fallbackStoreUrl, listingId]);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/app/listings/`[id].tsx around lines 51 - 60, The handleOpenInApp
callback sets a fallback setTimeout but never clears it, causing stray
redirects; fix by storing the timeout ID (e.g., in a ref) and clearing any
existing timeout before creating a new one, and add a useEffect cleanup that
calls clearTimeout on that ref when the component unmounts; reference
handleOpenInApp, deepLinkUrl, fallbackStoreUrl, and listingId to locate the code
and ensure you call clearTimeout(timerId) both before setting a new timer and in
the component cleanup.
```ts
const normalizedEmail = sellerEmail.trim().toLowerCase();
const emailPattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

if (!title.trim() || !description.trim() || !normalizedEmail) {
  Alert.alert('Missing fields', 'Title, description, and email are required.');
  return;
}

if (!emailPattern.test(normalizedEmail)) {
  Alert.alert('Invalid email', 'Please enter a valid email address.');
  return;
}

if (!normalizedEmail.endsWith('@calpoly.edu')) {
  Alert.alert('Invalid email', 'Please use your @calpoly.edu email address.');
  return;
}
```
sellerEmail is validated but never sent in the create payload.
This makes the email field effectively a no-op and can fail listing creation when backend identity email is unavailable.
🔧 Proposed fix

```diff
 await createListing({
   title: title.trim(),
   description: description.trim(),
   price: parsedPrice,
+  sellerEmail: normalizedEmail,
   category,
   condition,
   images,
 });
```

Also applies to: 68-75
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/app/listings/new.tsx` around lines 31 - 47, The form validates
sellerEmail into normalizedEmail but never includes it in the create-listing
payload; update the code that sends the listing (the API call or create function
invoked after validation) to include the normalizedEmail value under the
backend-expected key (e.g., "email" or "sellerEmail") so the server receives the
user's email; apply the same fix for the second occurrence referenced around
lines 68-75 so both submit paths attach normalizedEmail to the payload.
```ts
async function uploadToConvex(blob: Blob, onProgress: (progress: number) => void) {
  const uploadUrl = await generateUploadUrl({});

  return await new Promise<string>((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('POST', uploadUrl);
    xhr.setRequestHeader('Content-Type', blob.type || 'image/jpeg');

    xhr.upload.onprogress = (event) => {
      if (!event.lengthComputable) {
        return;
      }
      onProgress(event.loaded / event.total);
    };

    xhr.onerror = () => reject(new Error('Network error during upload.'));
    xhr.onload = () => {
      if (xhr.status < 200 || xhr.status >= 300) {
        reject(new Error(`Upload failed (${xhr.status}).`));
        return;
      }

      try {
        const parsed = JSON.parse(xhr.responseText) as { storageId?: string };
        if (!parsed.storageId) {
          reject(new Error('Upload response missing storage ID.'));
          return;
        }
        resolve(parsed.storageId);
      } catch {
        reject(new Error('Upload response could not be parsed.'));
      }
    };

    xhr.send(blob);
  });
}
```
Add an upload timeout path for stuck network requests.
Without xhr.timeout/ontimeout, a stalled upload can remain permanently in uploading state and block submit flows upstream.
🔧 Proposed fix

```diff
   const xhr = new XMLHttpRequest();
   xhr.open('POST', uploadUrl);
+  xhr.timeout = 30000;
   xhr.setRequestHeader('Content-Type', blob.type || 'image/jpeg');
 ...
   xhr.onerror = () => reject(new Error('Network error during upload.'));
+  xhr.ontimeout = () => reject(new Error('Upload timed out.'));
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
async function uploadToConvex(blob: Blob, onProgress: (progress: number) => void) {
  const uploadUrl = await generateUploadUrl({});

  return await new Promise<string>((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('POST', uploadUrl);
    xhr.timeout = 30000;
    xhr.setRequestHeader('Content-Type', blob.type || 'image/jpeg');

    xhr.upload.onprogress = (event) => {
      if (!event.lengthComputable) {
        return;
      }
      onProgress(event.loaded / event.total);
    };

    xhr.onerror = () => reject(new Error('Network error during upload.'));
    xhr.ontimeout = () => reject(new Error('Upload timed out.'));
    xhr.onload = () => {
      if (xhr.status < 200 || xhr.status >= 300) {
        reject(new Error(`Upload failed (${xhr.status}).`));
        return;
      }

      try {
        const parsed = JSON.parse(xhr.responseText) as { storageId?: string };
        if (!parsed.storageId) {
          reject(new Error('Upload response missing storage ID.'));
          return;
        }
        resolve(parsed.storageId);
      } catch {
        reject(new Error('Upload response could not be parsed.'));
      }
    };

    xhr.send(blob);
  });
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/components/ImageUploader.tsx` around lines 203 - 239, The
uploadToConvex function can hang indefinitely because XMLHttpRequest has no
timeout handling; update uploadToConvex to set xhr.timeout (e.g., a sensible
default or configurable value) and implement xhr.ontimeout to reject with a
clear timeout Error so upstream stays unblocked, and ensure the timeout path
mirrors xhr.onerror/ontimeout behavior (rejecting and stopping any progress
handling); reference the xhr instance, its ontimeout handler, and the Promise
resolve/reject flow in uploadToConvex to add this logic.
Created clean companion branch. Feedback:

Force-pushed b4cbf82 to fe85691
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
frontend/app/listings/[id].tsx (1)
46-46: ⚠️ Potential issue | 🟠 Major

Use the `/listings` URL in sharing to match deep-link routing.

Line 46 shares `https://polybuys.com/l/...`, but the app-link setup and OG URL use `/listings/...`. This inconsistency can prevent app-link open behavior from shared links.

🔧 Proposed fix

```diff
- const shareUrl = `https://polybuys.com/l/${listing._id}`;
+ const shareUrl = `https://polybuys.com/listings/${listing._id}`;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app/listings/`[id].tsx at line 46, The shareUrl constant uses the short path `/l/` which is inconsistent with the app-link and OG canonical path; update the code that defines shareUrl (the shareUrl constant in frontend/app/listings/[id].tsx) to build the URL using `/listings/${listing._id}` instead of `/l/${listing._id}` so shared links use the deep-link routing path.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@frontend/app/listings/`[id].tsx:
- Around line 163-164: The Message Seller button currently has an empty onPress;
implement a handler (e.g., handleMessageSeller) and wire it to
TouchableOpacity's onPress to either (1) require authentication and redirect
unauthenticated users to the login flow, or (2) for authenticated users, create
or fetch a messaging thread for this listing/seller (send listingId and
sellerId) and navigate to the chat screen (use your router/navigation, e.g.,
useRouter or navigation.navigate) passing the threadId and listingId; update
imports/usages as needed and use styles.messageButton/styles.messageButtonText
unchanged.
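The auth branch this comment asks for can be factored into a small, testable helper before wiring it to the router; the function name, the target shape, and the returnTo path below are hypothetical, not the app's actual API:

```typescript
// Hypothetical sketch of the Message Seller decision logic.
type MessageTarget =
  | { kind: "login"; returnTo: string }
  | { kind: "chat"; listingId: string; sellerId: string };

function resolveMessageSellerTarget(
  isAuthenticated: boolean,
  listingId: string,
  sellerId: string,
): MessageTarget {
  // Unauthenticated users are bounced to login with a returnTo path
  // so they land back on the listing afterwards.
  if (!isAuthenticated) {
    return { kind: "login", returnTo: `/listings/${listingId}` };
  }
  // Authenticated users go on to create/fetch a thread and open chat.
  return { kind: "chat", listingId, sellerId };
}
```

The onPress handler would then switch on `kind` and call the router accordingly, keeping the navigation side effects out of the pure logic.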
- Around line 90-99: Update the Head block in the listings page (the Head
component that currently sets og:title/og:description/og:url for listing) to
include an og:image meta (use listing.image or listing.images[0], falling back
to a site default and ensure an absolute https:// URL), add Twitter card tags
(twitter:card, twitter:title, twitter:description, twitter:image) mirroring the
OG values, and add an iOS Smart App Banner meta tag (meta
name="apple-itunes-app" with app-id and optional app-argument pointing to the
listing URL) so social previews and native app promotion are supported.
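One way to keep the OG, Twitter, and Smart App Banner tags consistent is to build them from a single helper. In this sketch the listing field names and the og-default.png fallback are assumptions, and APP_STORE_ID is the placeholder the PR still needs to fill in:

```typescript
// Sketch: assemble the meta tags the review comment lists for a listing page.
type ListingMetaInput = {
  id: string;
  title: string;
  description: string;
  imageUrl?: string;
};

function buildListingMeta(listing: ListingMetaInput) {
  const url = `https://polybuys.com/listings/${listing.id}`;
  // Fall back to an absolute site-default image when the listing has none.
  const image = listing.imageUrl ?? "https://polybuys.com/og-default.png";
  return [
    { property: "og:title", content: listing.title },
    { property: "og:description", content: listing.description },
    { property: "og:image", content: image },
    { property: "og:url", content: url },
    // Twitter card tags mirror the OG values.
    { name: "twitter:card", content: "summary_large_image" },
    { name: "twitter:title", content: listing.title },
    { name: "twitter:description", content: listing.description },
    { name: "twitter:image", content: image },
    // iOS Smart App Banner: app-argument deep-links straight to the listing.
    { name: "apple-itunes-app", content: `app-id=APP_STORE_ID, app-argument=${url}` },
  ];
}
```

The Head component can then map this array into `<meta>` elements, so the OG and Twitter values can never drift apart.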
- Around line 101-110: The web banner's "Open in App" onPress currently only
calls Linking.openURL with the custom scheme
(polybuys://listings/{listing._id}); change the handler (the TouchableOpacity
onPress in frontend/app/listings/[id].tsx that renders styles.webBannerContainer
/ styles.webBannerButton) to first attempt to open the custom URL (using
Linking.canOpenURL or Linking.openURL) and if that fails (catch the rejected
promise or false from canOpenURL) fall back to opening the appropriate app-store
URL (Apple App Store for iOS, Google Play for Android or a generic web store
landing page) so users are redirected to install the app when the custom scheme
isn’t handled.
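A sketch of that fallback flow, with the Linking dependency injected so the logic can be exercised without a device; in the app this would be react-native's Linking module, and the store URL here is a placeholder:

```typescript
// Sketch: try the custom scheme first, fall back to the store URL.
type LinkingLike = {
  canOpenURL(url: string): Promise<boolean>;
  openURL(url: string): Promise<unknown>;
};

async function openInAppOrStore(
  linking: LinkingLike,
  listingId: string,
  storeUrl: string,
): Promise<"app" | "store"> {
  const deepLink = `polybuys://listings/${listingId}`;
  try {
    // canOpenURL may return false (or throw) when no app handles the scheme.
    if (await linking.canOpenURL(deepLink)) {
      await linking.openURL(deepLink);
      return "app";
    }
  } catch {
    // Fall through to the store link below.
  }
  await linking.openURL(storeUrl);
  return "store";
}
```

The caller would pass the App Store URL on iOS and the Play Store URL on Android.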
In `@frontend/public/.well-known/assetlinks.json`:
- Line 7: The assetlinks.json currently contains a placeholder
"YOUR_SHA256_CERT_FINGERPRINT" under the "sha256_cert_fingerprints" array which
will break Android Digital Asset Links; replace that string with the real
SHA-256 fingerprint of the app signing certificate (the fingerprint used for
your release key or Play App Signing), preserving the JSON structure and the
"sha256_cert_fingerprints" key so App Links autoVerify can succeed.
---
Outside diff comments:
In `@frontend/app/listings/`[id].tsx:
- Line 46: The shareUrl constant uses the short path `/l/` which is inconsistent
with the app-link and OG canonical path; update the code that defines shareUrl
(the shareUrl constant in frontend/app/listings/[id].tsx) to build the URL using
`/listings/${listing._id}` instead of `/l/${listing._id}` so shared links use
the deep-link routing path.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- frontend/app.json
- frontend/app/listings/[id].tsx
- frontend/app/listings/[id]/edit.tsx
- frontend/public/.well-known/apple-app-site-association
- frontend/public/.well-known/assetlinks.json
🚧 Files skipped from review as they are similar to previous changes (1)
- frontend/public/.well-known/apple-app-site-association
| "target": { | ||
| "namespace": "android_app", | ||
| "package_name": "com.polybuy.app", | ||
| "sha256_cert_fingerprints": ["YOUR_SHA256_CERT_FINGERPRINT"] |
🧩 Analysis chain
🏁 Script executed:
# First, verify the assetlinks.json file exists and check its content
cat -n frontend/public/.well-known/assetlinks.json
Repository: codebox-calpoly/PolyBuys
Length of output: 377
🏁 Script executed:
# Run the verification command from the review comment to check for placeholders
rg -n "YOUR_SHA256_CERT_FINGERPRINT|TEAMID|APP_STORE_ID|YOUR_APP_STORE_ID"Repository: codebox-calpoly/PolyBuys
Length of output: 50
Replace the placeholder SHA256 certificate fingerprint before release.
The file contains "YOUR_SHA256_CERT_FINGERPRINT" which will cause Android Digital Asset Links verification to fail and prevent App Links (autoVerify) from activating. Replace with the actual SHA256 certificate fingerprint from your Android app's signing key.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/public/.well-known/assetlinks.json` at line 7, The assetlinks.json
currently contains a placeholder "YOUR_SHA256_CERT_FINGERPRINT" under the
"sha256_cert_fingerprints" array which will break Android Digital Asset Links;
replace that string with the real SHA-256 fingerprint of the app signing
certificate (the fingerprint used for your release key or Play App Signing),
preserving the JSON structure and the "sha256_cert_fingerprints" key so App
Links autoVerify can succeed.
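For reference, a complete assetlinks.json of the shape this review expects looks like the following. The fingerprint is left as the placeholder the file already uses; the real value can be printed with `keytool -list -v` against the release keystore, or copied from the Play App Signing page:

```json
[
  {
    "relation": ["delegate_permission/common.handle_all_urls"],
    "target": {
      "namespace": "android_app",
      "package_name": "com.polybuy.app",
      "sha256_cert_fingerprints": ["YOUR_SHA256_CERT_FINGERPRINT"]
    }
  }
]
```

The file must be served at https://polybuys.com/.well-known/assetlinks.json with content type application/json and no redirects for autoVerify to pass.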
0f395bd to f066dc8
f066dc8 to 5ecc2e7
Linked Issues
Closes #
Linear: (e.g., POLY-143)
Summary
Briefly explain the change and why.
Did what the issue said.
What I changed and why:
File: app.json
Added:
scheme: "polybuys" so links like polybuys://... can open the app.
expo-router origin set to polybuys.com so the router knows the web origin.
iOS associatedDomains: ["applinks:polybuys.com"] for Universal Links.
Android intentFilters for https://polybuys.com/listings... with autoVerify.
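Assuming standard Expo config keys, those app.json additions would look roughly like this sketch (the expo-router origin value is an assumption based on the description above):

```json
{
  "expo": {
    "scheme": "polybuys",
    "plugins": [["expo-router", { "origin": "https://polybuys.com" }]],
    "ios": {
      "associatedDomains": ["applinks:polybuys.com"]
    },
    "android": {
      "intentFilters": [
        {
          "action": "VIEW",
          "autoVerify": true,
          "data": [
            { "scheme": "https", "host": "polybuys.com", "pathPrefix": "/listings" }
          ],
          "category": ["BROWSABLE", "DEFAULT"]
        }
      ]
    }
  }
}
```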
File: apple-app-site-association
Added applinks config with /listings/*.
Purpose: tells iOS that your domain is allowed to open your app for listing URLs.
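A matching apple-app-site-association sketch, assuming the iOS bundle ID mirrors the Android package name; TEAMID stays a placeholder, as noted later in this description:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.com.polybuy.app",
        "paths": ["/listings/*"]
      }
    ]
  }
}
```

iOS fetches this file from https://polybuys.com/.well-known/apple-app-site-association, which must be served over HTTPS without redirects and without a file extension.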
File: assetlinks.json
Added common.handle_all_urls with your package name.
Purpose: tells Android that your domain is allowed to open your app.
File: listings/[id].tsx
Added:
Metadata for og:title, og:description, og:image, og:url, and Twitter card.
iOS Smart App Banner meta (apple-itunes-app).
Mobile web CTA banner:
Open in App: attempts deep link polybuys://listings/... then falls back to store.
Get the App: direct app store/play store link.
Values you still need to set before production:
Replace TEAMID in apple-app-site-association.
Replace Android SHA256 fingerprint in assetlinks.json.
Replace iOS App Store ID in frontend/app/listings/[id].tsx.
Confirm domain serves /.well-known/* exactly (no redirects/auth, correct JSON content type).
Important implementation note:
OG tags are now in the route, but social crawlers often require server-rendered HTML for dynamic per-listing previews. If previews don’t show listing-specific data, add server-side prerendering for /listings/:id.
Steps to verify locally:
npm run lint
npm run typecheck
npm test
npm run dev:backend (in terminal A)
npm run dev (in terminal B)
Checklist
(npm run lint)
dev
Screenshots / Demos
(if UI or visible behavior - attach images, videos, or GIFs)
Summary by CodeRabbit
New Features
Documentation