Conversation
Note: Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the CodeRabbit settings.
📝 Walkthrough

Updates Supabase client versions, adds a migration to include the webhook queues in the `high_frequency_queues` cron task, and adds tests covering webhook queue processing and cron registration.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Test as Test Runner
    participant PGMQ as PGMQ
    participant Dispatcher as webhook_dispatcher Consumer
    participant DB as Postgres (webhook_deliveries)
    participant Delivery as webhook_delivery Consumer
    Test->>PGMQ: send message to topic `webhook_dispatcher` (apps.UPDATE payload)
    PGMQ->>Dispatcher: deliver message
    Dispatcher->>DB: insert webhook_deliveries row (status: pending, event_type)
    Test->>Dispatcher: POST /triggers/queue_consumer/sync (retry/backoff) -> 202 OK
    Dispatcher->>DB: ensure delivery row exists
    Test->>Delivery: POST /triggers/queue_consumer/sync (retry/backoff) -> 202 OK
    Delivery->>DB: process delivery -> update status (failed), attempt_count, response_body
    Test->>DB: poll until final status observed
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Review the following changes in direct dependencies. Learn more about Socket for GitHub.
🧹 Nitpick comments (1)
tests/webhook-queue-processing.test.ts (1)
27-45: Retry transient fetch exceptions in `fetchQueueSync`.

Right now only non-202 responses are retried; thrown network errors fail immediately and make this test flaky.
Suggested change
```diff
 async function fetchQueueSync(queueName: string, maxRetries = 4) {
   for (let attempt = 0; attempt < maxRetries; attempt++) {
-    const response = await fetch(`${BASE_URL_TRIGGER}/queue_consumer/sync`, {
-      method: 'POST',
-      headers: headersInternal,
-      body: JSON.stringify({ queue_name: queueName }),
-    })
-
-    if (response.status === 202) {
-      expect(await response.json()).toEqual({ status: 'ok' })
-      return
-    }
+    try {
+      const response = await fetch(`${BASE_URL_TRIGGER}/queue_consumer/sync`, {
+        method: 'POST',
+        headers: headersInternal,
+        body: JSON.stringify({ queue_name: queueName }),
+      })
+
+      if (response.status === 202) {
+        expect(await response.json()).toEqual({ status: 'ok' })
+        return
+      }
+    }
+    catch {
+      if (attempt === maxRetries - 1)
+        throw new Error(`queue_consumer/sync network failure for ${queueName}`)
+    }

     if (attempt < maxRetries - 1)
       await new Promise(resolve => setTimeout(resolve, 250 * (attempt + 1)))
   }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/webhook-queue-processing.test.ts` around lines 27 - 45, The fetchQueueSync helper currently only retries on non-202 responses and will fail immediately on thrown network errors; wrap the fetch call in a try/catch inside the for-loop in fetchQueueSync, treat thrown exceptions like a non-202 attempt (i.e., if attempt < maxRetries-1 await the same backoff and continue), and on final failure rethrow a descriptive Error that includes the original exception and the queueName so test logs show the underlying network error.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@tests/webhook-queue-processing.test.ts`:
- Around line 27-45: The fetchQueueSync helper currently only retries on non-202
responses and will fail immediately on thrown network errors; wrap the fetch
call in a try/catch inside the for-loop in fetchQueueSync, treat thrown
exceptions like a non-202 attempt (i.e., if attempt < maxRetries-1 await the
same backoff and continue), and on final failure rethrow a descriptive Error
that includes the original exception and the queueName so test logs show the
underlying network error.
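Generalizing the fix described above: route both failure modes, a bad status and a thrown exception, through one backoff loop. The sketch below assumes nothing from the PR; `retryWithBackoff` and its parameters are hypothetical names for illustration.

```typescript
// Generic sketch of the review's retry pattern: treat thrown errors the
// same as a failed attempt, back off linearly between attempts, and
// surface the last underlying error in the final throw.
async function retryWithBackoff<T>(
  action: () => Promise<T>,
  maxRetries = 4,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      return await action() // success: stop retrying
    }
    catch (error) {
      lastError = error // remember the cause for the final throw
    }
    if (attempt < maxRetries - 1)
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * (attempt + 1)))
  }
  throw new Error(`all ${maxRetries} attempts failed: ${String(lastError)}`)
}
```

A caller would wrap the `fetch` and throw on any status other than 202, so HTTP failures and network errors share the same retry budget and the final error message still carries the underlying cause.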
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: e315b39e-b804-49d7-8daf-e3eb4d9501c4
⛔ Files ignored due to path filters (1)
`bun.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (4)
- package.json
- supabase/migrations/20260327220305_add_webhook_queues_to_cron_tasks.sql
- supabase/tests/49_test_webhook_cron_registration.sql
- tests/webhook-queue-processing.test.ts
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 2a5b2db865
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
supabase/migrations/20260327220305_add_webhook_queues_to_cron_tasks.sql (outdated; comment resolved)
Summary (AI generated)
Adds `webhook_dispatcher` and `webhook_delivery` to the active `high_frequency_queues` cron task in `public.cron_tasks`.

Motivation (AI generated)
A scheduler regression left webhook jobs enqueued but never drained. The webhook migration added the queues to the old hard-coded cron runner, but the later table-driven `cron_tasks` migration rebuilt `high_frequency_queues` without carrying those queue names forward.

Business Impact (AI generated)
Webhook deliveries back customer integrations and downstream automation. When those jobs stay stuck in the queue, Capgo records the events but never sends them, which breaks customer workflows and undermines reliability.
Test Plan (AI generated)
- `bun lint`
- `bun lint:backend`
- `bunx eslint tests/webhook-queue-processing.test.ts`
- `bunx supabase test db`
- `bun run supabase:with-env -- bunx vitest run tests/webhook-queue-processing.test.ts`

Local Supabase bootstrap was still pulling first-run Docker images during this session, so the DB and integration runs were not completed here.
Generated with AI
Summary by CodeRabbit
Chores
Bug Fixes
Tests