📝 Walkthrough

Introduces Nova Poshta shipping integration with database schema for shipments, cities, and warehouses; API routes for shipping methods, city/warehouse lookups, and admin actions; cart UI for shipping selection; shipment processing workers; and PII-safe retention logic. Also includes blog caching improvements, visibility-based notification polling, Sanity CDN enablement, and removal of Speed Insights.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as User/Cart
    participant Cart as CartPageClient
    participant ShippingAPI as /api/shipping/*
    participant NP as Nova Poshta API
    participant DB as Database
    participant Checkout as /api/checkout
    participant Worker as Shipments Worker
    User->>Cart: Open cart, enable shipping
    Cart->>ShippingAPI: GET /shipping/methods
    ShippingAPI->>DB: Check availability flags
    ShippingAPI-->>Cart: Return available methods
    User->>Cart: Select city
    Cart->>ShippingAPI: GET /shipping/np/cities?q=Kyiv
    ShippingAPI->>DB: Query cached cities
    alt Cache hit
        ShippingAPI-->>Cart: Return cached cities
    else Cache miss
        ShippingAPI->>NP: Search settlements
        NP-->>ShippingAPI: City data
        ShippingAPI->>DB: Upsert to np_cities
        ShippingAPI-->>Cart: Return cities
    end
    User->>Cart: Select warehouse
    Cart->>ShippingAPI: GET /shipping/np/warehouses?cityRef=...
    ShippingAPI->>DB: Query cached warehouses
    alt Cache hit
        ShippingAPI-->>Cart: Return cached warehouses
    else Cache miss
        ShippingAPI->>NP: Get warehouses
        NP-->>ShippingAPI: Warehouse data
        ShippingAPI->>DB: Upsert to np_warehouses
        ShippingAPI-->>Cart: Return warehouses
    end
    User->>Cart: Enter recipient details, place order
    Cart->>Checkout: POST with shipping payload
    Checkout->>DB: Create order with shipping
    Checkout->>DB: Create order_shipping snapshot
    Checkout->>DB: Enqueue shipping_shipments
    Checkout-->>Cart: Order created
    Note over Worker,NP: Async shipment processing
    Worker->>DB: Claim queued shipments
    Worker->>DB: Load shipping snapshot
    Worker->>NP: Create Internet Document (TTN)
    NP-->>Worker: Provider ref, tracking number
    Worker->>DB: Update shipment status to succeeded
    Worker->>DB: Update order tracking number
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs
Suggested labels
Suggested reviewers
Poem
🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
✨ Finishing Touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
✅ Deploy Preview for develop-devlovers ready!
To edit notification comments on pull requests, go to your Netlify project configuration.
Actionable comments posted: 12
Note
Due to the large number of review comments, Critical, Major severity comments were prioritized as inline comments.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
frontend/lib/services/orders/checkout.ts (1)
714-749: ⚠️ Potential issue | 🟡 Minor
Backfill hash uses current request locale, not persisted order context.
When `idempotencyRequestHash` is missing, `derivedExistingHash` is computed with the current request locale. That makes hash backfill request-context dependent for legacy rows and can weaken conflict detection. Use only persisted order-derived inputs (or a stable fallback like `null`) for backfill hashing.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/orders/checkout.ts` around lines 714 - 749, The backfill hash computation for derivedExistingHash is using the current request variable locale (locale ?? null) which makes legacy-row hashing request-dependent; change the hash inputs to use only persisted order-derived values: pass the persisted locale (e.g., row.locale ?? null) or a stable null fallback instead of the request-scoped locale, and ensure all other inputs to hashIdempotencyRequest remain sourced from persisted fields (row.*, existing.*, existingCityRef/existingWarehouseRef) so the hash is stable and independent of the current request context.
🟡 Minor comments (10)
frontend/lib/tests/shop/checkout-shipping-phase3.test.ts-111-120 (1)
111-120: ⚠️ Potential issue | 🟡 Minor
Missing cleanup for `productPrices` table.
The `seedCheckoutShippingData` function inserts rows into `productPrices` (lines 47-66), but `cleanupSeedData` does not delete them. If `productPrices` does not have a cascade-on-delete constraint from `products`, this will leave orphaned test data in the database.
Proposed fix to add productPrices cleanup

```diff
 async function cleanupSeedData(data: SeedData, orderIds: string[]) {
   for (const orderId of orderIds) {
     await db.delete(orders).where(eq(orders.id, orderId));
   }
+  await db.delete(productPrices).where(eq(productPrices.productId, data.productId));
   await db.delete(npWarehouses).where(eq(npWarehouses.ref, data.warehouseRefA));
   await db.delete(npWarehouses).where(eq(npWarehouses.ref, data.warehouseRefB));
   await db.delete(npCities).where(eq(npCities.ref, data.cityRef));
   await db.delete(products).where(eq(products.id, data.productId));
 }
```

Run the following script to check if `productPrices` has cascade delete on `productId`:

```bash
#!/bin/bash
# Check if productPrices has cascade delete configured
rg -n -A5 'productPrices|product_prices' --type ts | rg -i 'cascade|onDelete'
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/tests/shop/checkout-shipping-phase3.test.ts` around lines 111 - 120, The cleanupSeedData function is missing deletion of productPrices rows inserted by seedCheckoutShippingData; modify cleanupSeedData to delete from productPrices where eq(productPrices.productId, data.productId) (using the same db.delete pattern as other tables) before deleting products to avoid orphaned productPrices entries; reference the productPrices model/identifier and the cleanupSeedData and seedCheckoutShippingData functions when making this change.frontend/components/header/NotificationBell.tsx-297-300 (1)
297-300: ⚠️ Potential issue | 🟡 Minor
Avoid hardcoded unread tooltip text
Line 299 uses `title="Unread"` directly. This bypasses localization in an otherwise translated component.
Proposed fix

```diff
 <div
   className="mt-1.5 h-1.5 w-1.5 shrink-0 rounded-full bg-(--accent-primary) shadow-[0_0_8px_var(--accent-primary)]"
-  title="Unread"
+  title={t('unread')}
 />
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/components/header/NotificationBell.tsx` around lines 297 - 300, The unread indicator uses a hardcoded title "Unread" in NotificationBell; replace this with the localized string from the component's translation mechanism (e.g., use the existing t function or translations prop used in this file) so the title becomes t('notifications.unread') or the appropriate key; update the title attribute on the <div> inside NotificationBell to use the translated value (ensuring the translation key exists) rather than the literal "Unread".frontend/components/header/NotificationBell.tsx-25-35 (1)
25-35: ⚠️ Potential issue | 🟡 Minor
Relative-time rounding can flip units too early
On Line 25, Line 29, and Line 33, `Math.round` can produce premature transitions (e.g., ~23.5h showing as 1 day). For past timestamps this creates visible off-by-one behavior.
Proposed fix

```diff
 function getRelativeTime(date: Date, locale: string, justNow: string) {
   const rtf = new Intl.RelativeTimeFormat(locale, { numeric: 'auto' });
   const now = new Date().getTime();
-  const daysDifference = Math.round(
-    (date.getTime() - now) / (1000 * 60 * 60 * 24)
-  );
-  if (daysDifference === 0) {
-    const hoursDifference = Math.round(
-      (date.getTime() - now) / (1000 * 60 * 60)
-    );
-
-    if (hoursDifference === 0) {
-      const minutesDifference = Math.round(
-        (date.getTime() - now) / (1000 * 60)
-      );
-
-      if (minutesDifference === 0) return justNow;
-      return rtf.format(minutesDifference, 'minute');
-    }
-
-    return rtf.format(hoursDifference, 'hour');
-  }
-
-  return rtf.format(daysDifference, 'day');
+  const diffMs = date.getTime() - now;
+  const minutesDifference = Math.trunc(diffMs / (1000 * 60));
+  if (minutesDifference === 0) return justNow;
+  if (Math.abs(minutesDifference) < 60) return rtf.format(minutesDifference, 'minute');
+
+  const hoursDifference = Math.trunc(minutesDifference / 60);
+  if (Math.abs(hoursDifference) < 24) return rtf.format(hoursDifference, 'hour');
+
+  const daysDifference = Math.trunc(hoursDifference / 24);
+  return rtf.format(daysDifference, 'day');
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/components/header/NotificationBell.tsx` around lines 25 - 35, The current use of Math.round for daysDifference/hoursDifference/minutesDifference causes unit flips near boundaries (e.g., ~23.5h -> 1 day); update the rounding logic in NotificationBell.tsx so each difference uses floor when the delta is positive (future) and ceil when negative (past) — e.g., compute rawDelta = (date.getTime() - now) / msPerUnit, then set daysDifference = rawDelta >= 0 ? Math.floor(rawDelta) : Math.ceil(rawDelta), and apply the same pattern for hoursDifference and minutesDifference to prevent premature transitions.frontend/app/api/shop/webhooks/stripe/route.ts-1000-1010 (1)
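The boundary behavior this comment describes can be checked in isolation. The `bucket` helper below is a hypothetical standalone sketch of the trunc-based fallthrough from the suggested diff, not the component's actual code:

```typescript
// Hypothetical sketch: truncation keeps a ~23.5h delta in the "hour" bucket,
// where Math.round on days would have rounded it to 1 day too early.
function bucket(diffMs: number): { unit: 'minute' | 'hour' | 'day'; value: number } {
  const minutes = Math.trunc(diffMs / 60_000);
  if (Math.abs(minutes) < 60) return { unit: 'minute', value: minutes };
  const hours = Math.trunc(minutes / 60);
  if (Math.abs(hours) < 24) return { unit: 'hour', value: hours };
  return { unit: 'day', value: Math.trunc(hours / 24) };
}

// ~23.5 hours in the past: Math.round((-23.5/24)) would give -1 day;
// truncation reports -23 hours instead.
console.log(bucket(-23.5 * 3_600_000)); // { unit: 'hour', value: -23 }
console.log(bucket(5 * 60_000)); // { unit: 'minute', value: 5 }
```

The resulting unit and signed value can then be fed to `Intl.RelativeTimeFormat.format`, as the suggested fix does.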
1000-1010: ⚠️ Potential issue | 🟡 Minor
Document assumption: shipment enqueue relies on background worker as fallback for already-paid orders.
This edge case is real: when `order.paymentStatus === 'paid'`, the WHERE clause at line 447 (`where ${shouldAttemptEnqueue}`) skips enqueueing because `paymentBecamePaidInThisApply` is false. If a webhook crashes between marking the order paid and enqueueing the shipment, a retry won't re-attempt the enqueue.
However, there is an existing mitigation: the shipments-worker (`shipments-worker.ts`) processes orders with `shipping_status='pending'` (set during checkout), eventually handling the enqueue even if the webhook fails. This pattern works but isn't documented in the webhook code.
Consider adding a code comment explicitly stating the dependency on shipments-worker as the compensating mechanism for this scenario, or document the assumption in the function's JSDoc.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app/api/shop/webhooks/stripe/route.ts` around lines 1000 - 1010, Add a short JSDoc or inline comment on applyStripePaidAndQueueShipmentAtomic (near where paymentBecamePaidInThisApply is computed and the WHERE clause uses shouldAttemptEnqueue) stating that if order.paymentStatus === 'paid' the enqueue step may be skipped and that shipments-worker (shipments-worker.ts) serves as the compensating background process that will eventually pick up orders with shipping_status='pending' and enqueue shipments; reference the paymentBecamePaidInThisApply flag and shipments-worker to make the retry/fallback assumption explicit for future maintainers.frontend/app/api/blog-author/route.ts-35-35 (1)
35-35: ⚠️ Potential issue | 🟡 Minor
Use `withConfig({ useCdn: false })` to respect the dynamic intent of this route.
This API route declares `revalidate = 0` and sets `Cache-Control: no-store`, indicating the intent to always serve fresh data. However, the query uses the default `useCdn: true`, which queries Sanity's API CDN. CDN responses are cached and invalidated on publish, but brief staleness windows (stale-while-revalidate pattern) can occur before invalidation propagates.
For consistency with this route's dynamic contract and to match the pattern used in `blog-search/route.ts`, consider adding `.withConfig({ useCdn: false })` to guarantee fresh author data on every request:
Suggested change

```diff
-  const author = await client.fetch(authorQuery, { name, locale });
+  const author = await client.withConfig({ useCdn: false }).fetch(authorQuery, { name, locale });
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app/api/blog-author/route.ts` at line 35, The route currently calls client.fetch(authorQuery, { name, locale }) which uses the CDN; change it to use a non-CDN client by calling client.withConfig({ useCdn: false }).fetch(authorQuery, { name, locale }) so authorQuery is always fetched fresh for the dynamic route (match the pattern used in blog-search/route.ts and keep the same name and locale params).frontend/lib/services/shop/shipping/checkout-payload.ts-93-99 (1)
93-99: ⚠️ Potential issue | 🟡 Minor
Align NP `cityRef`/`warehouseRef` validation with backend schema.
This builder currently accepts any non-empty string, while backend validation is stricter (`frontend/lib/validation/shop.ts`, Line [322]-Line [327]). That mismatch can produce `ok: true` client payloads that fail server-side.
Suggested fix

```diff
 const UA_PHONE_REGEX = /^(?:\+380\d{9}|0\d{9})$/;
 const EMAIL_REGEX = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
+const NP_REF_REGEX = /^[A-Za-z0-9-]{20,64}$/;
@@
   const cityRef = trimOrNull(input.cityRef);
-  if (!cityRef) {
+  if (!cityRef || !NP_REF_REGEX.test(cityRef)) {
     return {
       ok: false,
       code: 'CITY_REQUIRED',
     };
   }
@@
   if (methodCode === 'NP_WAREHOUSE' || methodCode === 'NP_LOCKER') {
-    if (!warehouseRef) {
+    if (!warehouseRef || !NP_REF_REGEX.test(warehouseRef)) {
       return {
         ok: false,
         code: 'WAREHOUSE_REQUIRED',
       };
     }
   }
```

Also applies to: 110-117
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/shop/shipping/checkout-payload.ts` around lines 93 - 99, The NP cityRef/warehouseRef checks accept any non-empty string; update them to use the same strict validation rules as the frontend shop validation (the logic around NP cityRef/warehouseRef in the validation module) instead of just trimOrNull. Import or reuse the validator used in frontend/lib/validation/shop.ts and replace the simple presence checks for cityRef and warehouseRef in checkout-payload.ts (variables cityRef, warehouseRef) with that validator; if validation fails return ok: false with code 'CITY_INVALID' or 'WAREHOUSE_INVALID' (keep 'CITY_REQUIRED'/'WAREHOUSE_REQUIRED' for empty values). Also apply the same change to the other occurrence noted (the 110-117 block).frontend/lib/validation/shop.ts-393-398 (1)
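The ref check the review suggests can be exercised on its own. The pattern below is copied from the suggested diff's `NP_REF_REGEX` (it is the reviewer's proposed shape, not Nova Poshta's official ref specification), and `isValidNpRef` is a hypothetical helper:

```typescript
// Proposed shape from the review diff: 20-64 chars of letters, digits, dashes.
// Nova Poshta refs are UUID-like strings, which this pattern accepts.
const NP_REF_REGEX = /^[A-Za-z0-9-]{20,64}$/;

// Hypothetical guard mirroring the tightened cityRef/warehouseRef checks.
function isValidNpRef(ref: string | null): boolean {
  return ref !== null && NP_REF_REGEX.test(ref);
}

console.log(isValidNpRef('8d5a980d-391c-11dd-90d9-001a92567626')); // true (UUID-like)
console.log(isValidNpRef('kyiv')); // false (too short, not a ref)
console.log(isValidNpRef(null)); // false
```

Running the same predicate in both the client payload builder and the server schema is what keeps `ok: true` payloads from failing server-side.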
393-398: ⚠️ Potential issue | 🟡 Minor
Tighten `country` validation to alphabetic ISO-2 values.
Current `.length(2)` accepts non-letters (for example, `1$`). This should be rejected at schema level to avoid invalid country data entering the checkout flow.
Suggested fix

```diff
 country: z
   .string()
   .trim()
-  .length(2)
+  .regex(/^[A-Za-z]{2}$/)
   .transform(value => value.toUpperCase())
   .optional(),
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/validation/shop.ts` around lines 393 - 398, The country field currently uses .length(2) which allows non-letter characters; update the country schema in frontend/lib/validation/shop.ts (the country z.string().trim().length(2).transform(...).optional() entry) to enforce alphabetic ISO-2 codes by replacing .length(2) with a regex check that allows only two letters — e.g. apply the .transform(value => value.toUpperCase()) and then .regex(/^[A-Z]{2}$/, "Invalid country code") (or use .regex(/^[A-Za-z]{2}$/, "Invalid country code") before transform) while keeping .trim() and .optional().frontend/.env.example-85-90 (1)
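The rule the suggested schema encodes (trim, require exactly two ASCII letters, normalize to upper case) can be sketched without zod. `normalizeCountry` below is a hypothetical standalone mirror of that rule, not the schema itself:

```typescript
// Hypothetical helper mirroring the suggested zod rule:
// trim, then accept only two ASCII letters, then upper-case (ISO 3166-1 alpha-2 shape).
function normalizeCountry(input: string): string | null {
  const trimmed = input.trim();
  return /^[A-Za-z]{2}$/.test(trimmed) ? trimmed.toUpperCase() : null;
}

console.log(normalizeCountry(' ua ')); // 'UA'
console.log(normalizeCountry('1$')); // null (rejected by the regex, unlike .length(2))
console.log(normalizeCountry('PL')); // 'PL'
```

Note that this only validates the shape; it does not check membership in the actual ISO 3166-1 code list, which would need a lookup table if required.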
85-90: ⚠️ Potential issue | 🟡 Minor
Resolve dotenv-linter findings in this block.
Please align the `NP_SENDER_*` key ordering and add a trailing blank line at EOF to clear the current dotenv-linter warnings.
Also applies to: 166-166
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/.env.example` around lines 85 - 90, Reorder the NP_SENDER_* keys in the .env.example so they follow a consistent, expected sequence (e.g., NP_SENDER_REF, NP_SENDER_NAME, NP_SENDER_CONTACT_REF, NP_SENDER_PHONE, NP_SENDER_CITY_REF, NP_SENDER_WAREHOUSE_REF) to satisfy dotenv-linter key ordering rules and ensure the same change is applied to the duplicate block at line ~166; also add a single trailing blank line at EOF so the file ends with a newline.frontend/app/api/shop/shipping/methods/route.ts-51-71 (1)
51-71: ⚠️ Potential issue | 🟡 Minor
Method titles are hardcoded in English despite locale-aware API input.
This will return non-localized labels for `uk`/`pl`/... callers. Prefer returning translation keys (or locale-specific titles) instead of fixed English strings.
💡 Suggested direction

```diff
 type ShippingMethod = {
   provider: 'nova_poshta';
   methodCode: 'NP_WAREHOUSE' | 'NP_LOCKER' | 'NP_COURIER';
-  title: string;
+  titleKey:
+    | 'shop.shipping.methods.npWarehouse'
+    | 'shop.shipping.methods.npLocker'
+    | 'shop.shipping.methods.npCourier';
   requiredFields: Array<'cityRef' | 'warehouseRef' | 'addressLine1' | 'recipientName' | 'recipientPhone'>;
 };
```

Then localize in the UI using `titleKey`.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app/api/shop/shipping/methods/route.ts` around lines 51 - 71, The getMethods function currently returns hardcoded English strings in the title field; change it to return locale-agnostic translation keys instead (e.g., replace title: 'Nova Poshta warehouse' with titleKey: 'shipping.nova_poshta.warehouse') for each ShippingMethod entry (NP_WAREHOUSE, NP_LOCKER, NP_COURIER) and ensure the ShippingMethod type supports titleKey (or both titleKey and optional title). Update consumers to read titleKey and perform UI localization rather than relying on title.frontend/app/[locale]/admin/shop/orders/[id]/page.tsx-384-397 (1)
384-397: ⚠️ Potential issue | 🟡 Minor
Localize remaining hardcoded item labels.
At Line [384], Line [389], and Line [396], `Qty`, `Unit`, and `Line` are hardcoded English while the rest of the page is translated. Use translation keys for consistent locale output.
Suggested fix

```diff
-  <dd className="text-sm">Qty: {it.quantity}</dd>
+  <dd className="text-sm">
+    {t('quantity')}: {it.quantity}
+  </dd>
 ...
-  <dd className="text-muted-foreground text-sm">
-    Unit:{' '}
+  <dd className="text-muted-foreground text-sm">
+    {t('unitPrice')}:{' '}
     {safeFormatMoneyMajor(it.unitPrice, currency, locale)}
   </dd>
 ...
-  <dd className="text-sm font-medium">
-    Line:{' '}
+  <dd className="text-sm font-medium">
+    {t('lineTotal')}:{' '}
     {safeFormatMoneyMajor(it.lineTotal, currency, locale)}
   </dd>
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app/`[locale]/admin/shop/orders/[id]/page.tsx around lines 384 - 397, Replace the hardcoded visible item labels "Qty", "Unit", and "Line" with the translation function so they follow the page locale (use t('quantity') for Qty, t('unitPrice') for Unit, and t('lineTotal') for Line); update the JSX where these strings are rendered (the block that renders it.quantity, safeFormatMoneyMajor(it.unitPrice...), and safeFormatMoneyMajor(it.lineTotal...)) to call the existing t(...) helper (it is already used for the sr-only <dt> labels) so no new imports are required and translations remain consistent.
🧹 Nitpick comments (25)
frontend/scripts/np-mock-server.mjs (3)
48-63: Consider wrapping the async handler with error handling.
If `readBody` rejects (e.g., the client aborts the connection), the promise rejection is unhandled and will emit an `unhandledRejection` warning. Wrapping the handler or adding a try-catch would make the server more robust.
🛡️ Optional: Add error handling

```diff
 const server = http.createServer(async (req, res) => {
+  try {
     const url = new URL(req.url || '/', `http://${req.headers.host}`);
     if (
       req.method === 'GET' &&
       (url.pathname === '/' || url.pathname === '/health')
     ) {
       return json(res, 200, { ok: true, service: 'np-mock', port: PORT });
     }
     if (req.method !== 'POST') {
       return json(res, 405, { ok: false, error: 'Method not allowed' });
     }
     const raw = await readBody(req);
     // ... rest of handler ...
+  } catch (err) {
+    console.error('[np-mock] Request error:', err.message);
+    if (!res.headersSent) {
+      json(res, 500, { ok: false, error: 'Internal server error' });
+    }
+  }
 });
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/scripts/np-mock-server.mjs` around lines 48 - 63, The async request handler passed to http.createServer can throw (e.g., readBody rejects) causing unhandled rejections; wrap the handler body in a try-catch (or use an async wrapper) around the code inside the function created for server so any errors from readBody, normalizeReqPayload, or subsequent logic are caught; on catch, log the error and return a json(res, 500, { ok: false, error: 'Internal Server Error' }) (or a suitable message) and ensure the response is ended to avoid leaking connections.
6-13: Consider adding a body size limit.
The `readBody` function accumulates all chunks without a size limit, which could exhaust memory if a large payload is sent. For a development mock server, this is low risk, but adding a reasonable limit (e.g., 1MB) would be good defensive practice.
🛡️ Optional: Add body size limit

```diff
-function readBody(req) {
+function readBody(req, maxBytes = 1024 * 1024) {
   return new Promise((resolve, reject) => {
     let data = '';
-    req.on('data', chunk => (data += chunk));
+    let size = 0;
+    req.on('data', chunk => {
+      size += chunk.length;
+      if (size > maxBytes) {
+        req.destroy();
+        reject(new Error('Request body too large'));
+        return;
+      }
+      data += chunk;
+    });
     req.on('end', () => resolve(data));
     req.on('error', reject);
   });
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/scripts/np-mock-server.mjs` around lines 6 - 13, The readBody function currently accumulates request chunks without a size guard; add a MAX_BODY_BYTES constant (e.g., 1 * 1024 * 1024) and, inside readBody's 'data' handler, track the accumulated byte length (use Buffer.byteLength on the accumulating string or prefer collecting Buffer chunks) and if it exceeds MAX_BODY_BYTES immediately reject the promise with a clear error (e.g., Error('Payload too large') or an HTTP 413-style error), remove/cleanup the listeners and destroy the request stream to stop reading; keep the existing 'end' and 'error' flows but ensure the oversized-path cleans up listeners to avoid leaks.
119-133: Inconsistent condition: `searchSettlements` doesn't verify `modelName`.
The condition on line 121 matches `searchSettlements` regardless of `modelName`, while other handlers consistently check both. This could match unintended requests.
♻️ Suggested fix for consistency

```diff
 if (
-  (modelName === 'Address' && calledMethod === 'getCities') ||
-  calledMethod === 'searchSettlements'
+  modelName === 'Address' &&
+  (calledMethod === 'getCities' || calledMethod === 'searchSettlements')
 ) {
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/scripts/np-mock-server.mjs` around lines 119 - 133, The conditional wrongly allows calledMethod === 'searchSettlements' for any model; update the if to require both modelName === 'Address' and calledMethod === 'searchSettlements' (so it mirrors the getCities check) — i.e., change the branch that currently reads ((modelName === 'Address' && calledMethod === 'getCities') || calledMethod === 'searchSettlements') to require modelName === 'Address' for the searchSettlements case as well, ensuring both modelName and calledMethod are validated before returning the mock response.frontend/scripts/seed-np-local.sql (1)
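The corrected predicate is easy to sanity-check when pulled out of the routing code. `isAddressLookup` below is a hypothetical extraction of the suggested condition, not a function in the mock server:

```typescript
// Hypothetical extraction of the tightened condition: both modelName and
// calledMethod must match before the Address lookup branch runs.
function isAddressLookup(modelName: string, calledMethod: string): boolean {
  return (
    modelName === 'Address' &&
    (calledMethod === 'getCities' || calledMethod === 'searchSettlements')
  );
}

console.log(isAddressLookup('Address', 'getCities')); // true
console.log(isAddressLookup('Address', 'searchSettlements')); // true
// The case the original condition wrongly matched:
console.log(isAddressLookup('InternetDocument', 'searchSettlements')); // false
```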
40-48: ON CONFLICT clause hardcodes `is_active = true`, ignoring the incoming value.
In the `ON CONFLICT DO UPDATE` clause, `is_active` is hardcoded to `true` instead of using `EXCLUDED.is_active`. This means running the seed will always activate cities regardless of what value is provided in the INSERT. If this is intentional behavior for the seed script, consider adding a comment to clarify.
🔧 Suggested fix if you want to preserve the incoming value

```diff
   settlement_type = EXCLUDED.settlement_type,
-  is_active = true,
+  is_active = EXCLUDED.is_active,
   updated_at = now();
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/scripts/seed-np-local.sql` around lines 40 - 48, The ON CONFLICT (ref) DO UPDATE clause currently sets is_active = true, which ignores the incoming INSERT value; update the SET to assign is_active = EXCLUDED.is_active so the inserted value is preserved (or, if always activating is intentional for this seed script, add a clear comment next to the ON CONFLICT block documenting that behavior); look for the ON CONFLICT (ref) DO UPDATE block and modify the is_active assignment accordingly in the seed-np-local.sql script.frontend/scripts/seed-np-local.ps1 (1)
24-24: Consider using `Write-Output` or `Write-Information` for better scriptability.
As flagged by static analysis, `Write-Host` may not work in all hosts and cannot be captured or redirected. For a seed script that might be invoked from CI or other automation contexts, `Write-Output` or `Write-Information` would be more flexible.
🔧 Suggested fix

```diff
-  Write-Host "Seeding NP local catalog from: $SeedPath"
+  Write-Information "Seeding NP local catalog from: $SeedPath" -InformationAction Continue
```

Or simply:

```diff
-  Write-Host "Seeding NP local catalog from: $SeedPath"
+  Write-Output "Seeding NP local catalog from: $SeedPath"
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/scripts/seed-np-local.ps1` at line 24, Replace the Write-Host call used to announce the seed path with a capturable output cmdlet so CI/automation can capture it; change the Write-Host "Seeding NP local catalog from: $SeedPath" usage to Write-Output (or Write-Information) while preserving the message string and variable $SeedPath so the same text is emitted but can be redirected or logged by callers.frontend/lib/services/orders/payment-attempts.ts (2)
242-249: Consider extracting the snapshot-update pattern to reduce duplication.
The same pattern of fetching snapshot params and updating the attempt record with `currency` and `expectedAmountMinor` appears three times. A small helper could improve maintainability:
♻️ Optional helper extraction

```typescript
async function syncAttemptWithOrderSnapshot(
  attemptId: string,
  orderId: string
): Promise<void> {
  const snapshot = await readStripePaymentIntentParams(orderId);
  await db
    .update(paymentAttempts)
    .set({
      currency: snapshot.currency,
      expectedAmountMinor: snapshot.amountMinor,
      updatedAt: new Date(),
    })
    .where(eq(paymentAttempts.id, attemptId));
}
```

Then each call site becomes:

```typescript
await syncAttemptWithOrderSnapshot(attempt.id, orderId);
```

Also applies to: 278-288, 300-307
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/orders/payment-attempts.ts` around lines 242 - 249, Duplicate snapshot-to-attempt update logic (reading snapshot via readStripePaymentIntentParams and updating paymentAttempts with currency, expectedAmountMinor, updatedAt) should be extracted into a small helper like syncAttemptWithOrderSnapshot(attemptId, orderId); implement that helper to call readStripePaymentIntentParams(orderId) and perform the db.update on paymentAttempts, then replace the three repeated blocks (the updates around nextAttempt.id / attempt.id locations) with calls to syncAttemptWithOrderSnapshot to reduce duplication and centralize the behavior.
278-288: Backfill logic is sound, but note potential data skew.
The condition correctly identifies missing data for backfill. However, `readStripePaymentIntentParams` reads from the current order state, while the actual PaymentIntent was created with potentially different amounts. For legacy backfill scenarios this is acceptable, but be aware that `expectedAmountMinor` in the attempt record may not match the actual PI amount if the order was modified after PI creation.
If strict audit accuracy is required, consider fetching the amount directly from the existing PI instead of the order snapshot.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/orders/payment-attempts.ts` around lines 278 - 288, The backfill currently uses readStripePaymentIntentParams(orderId) which reads the current order snapshot and can diverge from the original PaymentIntent; change the backfill to fetch the actual PaymentIntent data for the attempt (use the PaymentIntent id stored on the attempt record and call the Stripe-read helper that returns the PI amount/currency) and write those values into paymentAttempts (currency, expectedAmountMinor, updatedAt); if the PI id is missing or the PI fetch fails, fall back to readStripePaymentIntentParams(orderId) so the existing behavior remains safe.frontend/lib/services/orders/monobank-janitor.ts (1)
59-59: `raw_sha256` is selected but not used in job3 processing.
The `raw_sha256` field was added to `Job3CandidateRow` and is selected in both `readUnclaimedJob3Candidates` and `claimJob3Events`, but it's never referenced in the job3 processing loop (lines 897–973). If this is intentional for future use or observability, consider adding a brief comment. Otherwise, it could be omitted from the select to reduce data transfer.
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/orders/monobank-janitor.ts` at line 59, The new raw_sha256 field on Job3CandidateRow is being selected by readUnclaimedJob3Candidates and claimJob3Events but never used in the job3 processing loop (job3 handler); either remove raw_sha256 from the Job3CandidateRow type and from the SELECTs in readUnclaimedJob3Candidates/claimJob3Events to avoid transferring unused data, or leave it and add a one-line comment next to raw_sha256 in Job3CandidateRow (and/or in the job3 processing loop) indicating it’s intentionally reserved for future use/observability; update all three locations (Job3CandidateRow, readUnclaimedJob3Candidates, claimJob3Events) to keep them consistent.frontend/lib/tests/shop/monobank-janitor-job3.test.ts (1)
3-3: Unused import: `sql` is imported but never used.
The `sql` import was added but is not referenced anywhere in this test file. Only `eq` is used (in the `cleanup` function and various queries).
🧹 Remove unused import

```diff
-import { eq, sql } from 'drizzle-orm';
+import { eq } from 'drizzle-orm';
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/tests/shop/monobank-janitor-job3.test.ts` at line 3, The import statement currently brings in both eq and sql from 'drizzle-orm' but only eq is used in this test (e.g., in the cleanup function and assertions); remove the unused sql symbol from the import so the line imports only eq (ensure there are no other references to sql elsewhere in monobank-janitor-job3.test.ts before committing).frontend/client.ts (1)
6-6: The shared Sanity client only fetches published content; optional refactor to split clients if draft/preview functionality is added later.
The client on line 6 with `useCdn: true` is used exclusively for published blog content in `layout.tsx`, `blog/page.tsx`, `blog/[slug]/page.tsx`, and `PostDetails.tsx`, all with appropriate 1-hour ISR revalidation. No draft or preview queries use this client, so CDN caching of published-only content is safe. If draft/preview functionality is added to content fetching in the future, consider exporting a separate `useCdn: false` client at that time.
Verify each finding against the current code and only fix it if needed. In `@frontend/client.ts` at line 6, The current Sanity client uses useCdn: true for published content only; prepare for future draft/preview support by exporting an additional client with useCdn: false (e.g., export const previewClient / draftClient) alongside the existing published client (e.g., publishedClient or default export) in frontend/client.ts, and update import sites (layout.tsx, blog/page.tsx, blog/[slug]/page.tsx, PostDetails.tsx) to keep using the published client while switching to the previewClient when preview/draft mode is enabled; ensure the clients are named clearly (publishedClient, previewClient) so callers can opt into non-CDN queries as needed.frontend/app/[locale]/blog/category/[category]/page.tsx (1)
191-199: getCategoryLabel is duplicated in PostDetails.tsx.

This helper function is identical to the one in frontend/app/[locale]/blog/[slug]/PostDetails.tsx (lines 687-695). Consider extracting it to a shared utility file (e.g., @/lib/blog/category.ts) to avoid duplication and ensure consistent category label mapping.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/[locale]/blog/category/[category]/page.tsx around lines 191-199, extract the duplicated getCategoryLabel function into a shared utility (e.g., create and export it from a new module like @/lib/blog/category.ts) and update both page.tsx and PostDetails.tsx to import and use that exported getCategoryLabel; ensure the function signature (categoryName: string, t: (key: string) => string) and returned mappings are identical, remove the duplicate implementation from both files, and run a quick build/test to confirm imports resolve correctly.

frontend/app/[locale]/blog/page.tsx (1)
70-77: Categories are fetched both here and in the layout.

The layout (frontend/app/[locale]/layout.tsx) already fetches and caches blog categories via getCachedBlogCategories(). This page fetches them again separately. While not incorrect (different components may need different data), consider whether you could reuse the cached categories to avoid redundant Sanity queries.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/[locale]/blog/page.tsx around lines 70-77, this page redundantly fetches categories; instead reuse the cached categories from the layout by replacing the direct Sanity query in page.tsx (the const categories = await client.fetch(...) block) with the shared cached accessor getCachedBlogCategories() (or accept categories as a prop from the parent layout if you prefer prop drilling), ensuring you import getCachedBlogCategories from the same module used in frontend/app/[locale]/layout.tsx and remove the duplicate client.fetch call to avoid extra Sanity queries.

frontend/lib/tests/shop/shipping-internal-np-sync-route-p2.test.ts (1)
21-28: Consider calling resetEnvCache() after stubbing environment variables.

The similar test file shipping-internal-retention-route-phase7.test.ts calls resetEnvCache() after stubbing environment variables to ensure the env module picks up the new values. Without this, cached environment values from prior runs may persist.

♻️ Proposed fix

+import { resetEnvCache } from '@/lib/env';
+
 const dbExecuteMock = vi.fn();

 beforeEach(() => {
   vi.clearAllMocks();
   vi.unstubAllEnvs();
   vi.stubEnv('INTERNAL_JANITOR_SECRET', 'test-secret');
   vi.stubEnv('SHOP_SHIPPING_ENABLED', 'true');
   vi.stubEnv('SHOP_SHIPPING_NP_ENABLED', 'true');
   vi.stubEnv('SHOP_SHIPPING_SYNC_ENABLED', 'true');
+  resetEnvCache();
 });

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/lib/tests/shop/shipping-internal-np-sync-route-p2.test.ts around lines 21-28, the beforeEach in this test sets env vars with vi.stubEnv but doesn't call resetEnvCache(), so the env module may still use stale cached values; update the beforeEach to call resetEnvCache() (the same helper used in shipping-internal-retention-route-phase7.test.ts) immediately after the vi.stubEnv(...) calls so the code that reads process.env via the env module (used by the shipping sync route tests) picks up the new values.

frontend/lib/services/shop/shipping/checkout-payload.ts (1)
33-36: Use reasonCode as a consistency guard (or remove it).

reasonCode is part of the input contract but not used. Consider validating reasonCode === 'OK' when shippingAvailable is true to avoid contradictory states.

Suggested fix

-  if (!input.shippingAvailable) {
+  if (!input.shippingAvailable || (input.reasonCode && input.reasonCode !== 'OK')) {
     return {
       ok: false,
       code: 'SHIPPING_UNAVAILABLE',
     };
   }

Also applies to: 79-84
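The pair check can also live in a small pure helper so both call sites (33-36 and 79-84) share one rule. This is a sketch with assumed names and an illustrative non-OK reason code, not code from the PR:

```typescript
// Minimal sketch of the availability/reason consistency rule.
// 'NP_DISABLED' below is an illustrative reason code, not from the PR.
type ShippingInput = {
  shippingAvailable: boolean;
  reasonCode?: string;
};

export function isShippingInputConsistent(input: ShippingInput): boolean {
  if (input.shippingAvailable) {
    // When shipping is available, reasonCode must be absent or 'OK'.
    return input.reasonCode === undefined || input.reasonCode === 'OK';
  }
  // When unavailable, 'OK' would be contradictory.
  return input.reasonCode !== 'OK';
}
```

A helper like this is trivial to unit-test, which is exactly what the prompt below asks for.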
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/lib/services/shop/shipping/checkout-payload.ts around lines 33-36, the input type BuildCheckoutShippingPayloadInput exposes reasonCode but the code never uses it; either remove reasonCode from the type or enforce it as a consistency guard by validating that when shippingAvailable is true then reasonCode === 'OK' (and when shippingAvailable is false reasonCode is non-'OK' or null as appropriate) in the consumer (e.g., the function that builds the checkout shipping payload such as buildCheckoutShippingPayload or any function using BuildCheckoutShippingPayloadInput); update the function to throw or return a validation error if the pair (shippingAvailable, reasonCode) is contradictory and add unit tests to cover both valid and invalid combinations.

frontend/lib/services/shop/shipping/inventory-eligibility.ts (1)
25-31: Add a defensive guard for an empty status list.

If INVENTORY_COMMITTED_FOR_SHIPPING ever becomes empty, this will emit IN (), which is invalid SQL. A small guard makes this robust to future edits.

Suggested fix

 export function inventoryCommittedForShippingSql(
   columnReference: SQLWrapper
 ): SQL {
+  if (INVENTORY_COMMITTED_FOR_SHIPPING.length === 0) {
+    return sql`false`;
+  }
+
   return sql`${columnReference} in (${sql.join(
     INVENTORY_COMMITTED_FOR_SHIPPING.map(value => sql`${value}`),
     sql`, `
   )})`;
 }

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/lib/services/shop/shipping/inventory-eligibility.ts around lines 25-31, inventoryCommittedForShippingSql can emit "IN ()" if INVENTORY_COMMITTED_FOR_SHIPPING is empty; add a defensive guard at the top of the function (inventoryCommittedForShippingSql) to detect when INVENTORY_COMMITTED_FOR_SHIPPING.length === 0 and return a safe SQL fragment (e.g. a false predicate) instead of building the IN list; keep the rest of the logic unchanged and still use columnReference and INVENTORY_COMMITTED_FOR_SHIPPING when the list is non-empty.

frontend/lib/validation/shop-shipping.ts (1)
104-116: Consider adding a default for retentionDays or documenting the expected fallback.

The retentionDays field is optional without a default value (unlike other fields in this schema). If the calling code relies on a fallback from environment configuration, this is fine, but it differs from the pattern used in other schemas where defaults are explicit.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/lib/validation/shop-shipping.ts around lines 104-116, internalShippingRetentionRunPayloadSchema's retentionDays is optional but has no default, breaking the explicit-default pattern used for batchSize and minIntervalSeconds; either add a sensible default via retentionDays: z.coerce.number().int().min(1).max(3650).optional().default(<value>) (pick a safe integer like 30) or update the schema's usage docs and callers to explicitly handle undefined/merge in env fallback. Locate the schema definition (internalShippingRetentionRunPayloadSchema) and implement one of these fixes consistently so callers always have a known value for retentionDays.

frontend/db/schema/shop.ts (1)
662-666: Potentially redundant indexes on npCities.

Lines 663 and 665 define two indexes that appear functionally identical:

- np_cities_active_name_idx on (isActive, nameUa)
- np_cities_active_name_prefix_idx on (isActive, nameUa)

If these are intended to serve the same purpose, consider removing the duplicate. If one is meant for prefix/pattern searches (e.g., using a different operator class), the schema definition doesn't reflect that distinction.

♻️ Suggested fix: Remove duplicate index

 table => [
   index('np_cities_active_name_idx').on(table.isActive, table.nameUa),
   index('np_cities_last_sync_run_idx').on(table.lastSyncRunId),
-  index('np_cities_active_name_prefix_idx').on(table.isActive, table.nameUa),
 ]

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/db/schema/shop.ts around lines 662-666, the schema defines two identical indexes np_cities_active_name_idx and np_cities_active_name_prefix_idx, both on (table.isActive, table.nameUa); remove the redundant index or make the intent explicit: either delete the np_cities_active_name_prefix_idx entry, or if you intended a prefix/text-search index, replace it with an index using the appropriate operator class (e.g., a trigram/GIN or a text_pattern_ops variant) on table.nameUa while keeping np_cities_active_name_idx for exact matches; update the index list where index('np_cities_active_name_idx') and index('np_cities_active_name_prefix_idx') are defined accordingly.

frontend/lib/services/shop/shipping/log-sanitizer.ts (1)
6-8: Phone regex is UA-specific; consider if broader coverage is needed.

The regex patterns (PHONE_UA_E164_RE and PHONE_UA_LOCAL_RE) only match Ukrainian phone formats. If the shipping integration expands to other countries, additional patterns may be needed.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/shop/shipping/log-sanitizer.ts` around lines 6 - 8, The phone regexes are UA-specific but named generically; update PHONE_UA_E164_RE and PHONE_UA_LOCAL_RE to clearly indicate they target Ukraine (e.g., PHONE_UKRAINE_E164_RE, PHONE_UKRAINE_LOCAL_RE) and add a short comment/TODO above them noting they are Nova Poshta/Ukraine-only and should be extended or made configurable if other country formats are needed; ensure any references to PHONE_UA_E164_RE and PHONE_UA_LOCAL_RE in the codebase are renamed accordingly.frontend/lib/tests/shop/shipping-retention-phase7.test.ts (1)
22-39: Type cast as any bypasses schema validation.

The as any cast on line 39 is used to override updatedAt for testing, but it bypasses TypeScript's type checking. While this is sometimes necessary in tests to set specific dates, consider whether a more type-safe approach is feasible.

Alternatively, document why the cast is needed (e.g., drizzle-orm doesn't expose a way to set updatedAt on insert since it has $onUpdate).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/lib/tests/shop/shipping-retention-phase7.test.ts around lines 22-39, the test currently uses an unsafe cast ("as any") on the object passed to db.insert(orders).values to override updatedAt; remove the broad cast and instead provide a properly typed insert object or a narrow cast just for the insert type so TypeScript still validates other fields. Concretely, replace the global "as any" with the appropriate insert type for the orders table (or cast only the literal updatedAt value), e.g. use the Insertable/InsertType for orders when calling db.insert(orders).values or construct the object as the table's insert type; if drizzle-orm truly prevents setting updatedAt on insert, add a short inline comment explaining why the narrow cast is required (mentioning updatedAt and db.insert(orders).values).

frontend/app/[locale]/shop/cart/CartPageClient.tsx (1)
869-873: canPlaceOrder does not fully reflect delivery readiness.

Right now it gates on method selection only. City/warehouse/address/recipient requirements are validated later on submit.

Suggested direction

+  const shippingPayloadPreview = shippingAvailable
+    ? buildCheckoutShippingPayload({
+        shippingAvailable,
+        reasonCode: shippingReasonCode,
+        locale,
+        methodCode: selectedShippingMethod,
+        cityRef: selectedCityRef,
+        warehouseRef: selectedWarehouseRef,
+        addressLine1: courierAddressLine1,
+        addressLine2: courierAddressLine2,
+        recipientFullName: recipientName,
+        recipientPhone,
+        recipientEmail,
+        recipientComment,
+      })
+    : null;
+
+  const shippingReady =
+    !shippingAvailable || Boolean(shippingPayloadPreview?.ok);
+
   const canPlaceOrder =
     hasSelectableProvider &&
     !shippingMethodsLoading &&
     !shippingUnavailableHardBlock &&
-    (!shippingAvailable || !!selectedShippingMethod);
+    shippingReady;

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/[locale]/shop/cart/CartPageClient.tsx around lines 869-873, the canPlaceOrder condition only checks shipping method selection and misses delivery readiness (city, warehouse, address, recipient), which are validated only on submit; add explicit delivery validation into canPlaceOrder by creating a helper (e.g., isDeliveryReady) that returns false when required delivery pieces (selectedCity, selectedWarehouse or address fields, recipientName/phone as applicable) are missing or invalid, then use that helper in the canPlaceOrder expression alongside hasSelectableProvider, shippingMethodsLoading, shippingUnavailableHardBlock, shippingAvailable and selectedShippingMethod so the button is disabled until all delivery fields are present and valid.

frontend/app/[locale]/admin/shop/orders/[id]/ShippingActions.tsx (1)
99-134: Accessibility: Connect error message to buttons via aria-describedby.

The errorId is generated but not linked to the buttons. When an error occurs, screen reader users won't know which element the error relates to.

♿ Suggested fix

 <button
   type="button"
   onClick={() => runAction('retry_label_creation')}
   disabled={isPending || !retryEnabled}
   aria-busy={isPending}
+  aria-describedby={error ? errorId : undefined}
   className="border-border text-foreground hover:bg-secondary rounded-md border px-3 py-1.5 text-sm font-medium transition-colors disabled:cursor-not-allowed disabled:opacity-50"
 >
   Retry label creation
 </button>
 <button
   type="button"
   onClick={() => runAction('mark_shipped')}
   disabled={isPending || !shippedEnabled}
   aria-busy={isPending}
+  aria-describedby={error ? errorId : undefined}
   className="border-border text-foreground hover:bg-secondary rounded-md border px-3 py-1.5 text-sm font-medium transition-colors disabled:cursor-not-allowed disabled:opacity-50"
 >
   Mark shipped
 </button>
 <button
   type="button"
   onClick={() => runAction('mark_delivered')}
   disabled={isPending || !deliveredEnabled}
   aria-busy={isPending}
+  aria-describedby={error ? errorId : undefined}
   className="border-border text-foreground hover:bg-secondary rounded-md border px-3 py-1.5 text-sm font-medium transition-colors disabled:cursor-not-allowed disabled:opacity-50"
 >
   Mark delivered
 </button>

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/[locale]/admin/shop/orders/[id]/ShippingActions.tsx around lines 99-134, the error message element (error with id errorId) is not connected to the buttons, so add aria-describedby to each action button (the three buttons that call runAction('retry_label_creation'), runAction('mark_shipped'), and runAction('mark_delivered')) when an error exists; set aria-describedby to errorId only if error is truthy (e.g. aria-describedby={error ? errorId : undefined}) so screen readers will announce the error for the relevant controls without adding an empty reference when there's no error.

frontend/app/api/shop/shipping/np/warehouses/route.ts (1)
66-70: Keep request tracing consistent on 429 responses.

The rate-limited branch returns rateLimitResponse(...) without X-Request-Id, while other branches include it. Add the header before returning so logs and client errors correlate consistently.

💡 Suggested patch

 if (!decision.ok) {
   logWarn('shop_shipping_np_warehouses_rate_limited', {
     ...baseMeta,
     code: 'RATE_LIMITED',
     retryAfterSeconds: decision.retryAfterSeconds,
   });
-  return rateLimitResponse({
+  const res = rateLimitResponse({
     retryAfterSeconds: decision.retryAfterSeconds,
     details: { scope: 'shop_shipping_np_warehouses' },
   });
+  res.headers.set('X-Request-Id', requestId);
+  return res;
 }

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/api/shop/shipping/np/warehouses/route.ts around lines 66-70, the 429 branch returns rateLimitResponse(...) without the X-Request-Id header, making tracing inconsistent; update the rate-limited return to include the same X-Request-Id header used elsewhere (add headers: { 'X-Request-Id': requestId } or equivalent) when calling rateLimitResponse with decision.retryAfterSeconds and details: { scope: 'shop_shipping_np_warehouses' } so logs and client errors can be correlated; ensure you use the same requestId variable/derivation as the other branches in this file.

frontend/app/api/shop/internal/shipping/shipments/run/route.ts (1)
27-78: Consider extracting the job-gate helpers to a shared internal module.

normalizeDate / retryAfterSeconds / acquireJobSlot are effectively duplicated in the NP sync route. Centralizing them reduces drift and makes gate semantics easier to maintain.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/api/shop/internal/shipping/shipments/run/route.ts around lines 27-78, extract the common job-gate helpers normalizeDate, retryAfterSeconds, and acquireJobSlot into a new internal/shared module (e.g., an internal job-gate util) and replace the local implementations in this route and the NP sync route with imports from that module; move the SQL/DB-dependent logic (acquireJobSlot) into the shared module and keep JOB_NAME and minIntervalSeconds as parameters, export the three functions (normalizeDate, retryAfterSeconds, acquireJobSlot) and update the route to import and call those exported symbols instead of its local copies so both routes share one canonical implementation.

frontend/app/api/shop/shipping/methods/route.ts (1)
99-103: Add X-Request-Id to the 429 response path as well.

All non-rate-limit responses include it, so this branch should too for consistent diagnostics.

💡 Suggested patch

-  return rateLimitResponse({
+  const res = rateLimitResponse({
     retryAfterSeconds: decision.retryAfterSeconds,
     details: { scope: 'shop_shipping_methods' },
   });
+  res.headers.set('X-Request-Id', requestId);
+  return res;

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In frontend/app/api/shop/shipping/methods/route.ts around lines 99-103, the 429 branch returns rateLimitResponse({ retryAfterSeconds: decision.retryAfterSeconds, details: { scope: 'shop_shipping_methods' } }) but omits the X-Request-Id header; update this return to include the current request ID (sourced from the same variable used in other branches, e.g. requestId or ctx.requestId) so the rateLimitResponse call includes the X-Request-Id (either via a headers param or by adding it into the response metadata/details) while preserving retryAfterSeconds and scope (reference: rateLimitResponse and decision.retryAfterSeconds).

frontend/lib/services/shop/shipping/nova-poshta-catalog.ts (1)
108-117: Search pattern bypasses the new prefix-index strategy.

The city search at lines 114-115 uses ILIKE '%...%' (contains search), but the migration 0019_p2_shop_invariants.sql adds a prefix-optimized index using text_pattern_ops. This pattern won't benefit from that optimization. Consider switching to prefix matching (string% instead of %string%) for typeahead, or add a separate index suited for contains search (standard btree on name_ua or full-text search).
Verify each finding against the current code and only fix it if needed. In `@frontend/lib/services/shop/shipping/nova-poshta-catalog.ts` around lines 108 - 117, The current search uses a contains pattern (`%${q}%`) in the db.execute call querying np_cities (fields name_ua and name_ru), which defeats the prefix index added in the migration; change the search pattern used in the query (the like variable passed to db.execute/SQL) to use prefix matching (e.g. `${q}%`) for typeahead behavior so the existing text_pattern_ops index on name_ua can be used, or alternatively add/use a separate index (btree or trigram/full-text) and adjust the query accordingly to support contains searches if that behavior must be preserved; update the references to like and the WHERE clauses on name_ua/name_ru in the SQL template inside nova-poshta-catalog.ts.
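If the route switches to prefix matching, the pattern builder should also escape LIKE metacharacters so user input cannot change the match semantics. A sketch (the helper name is illustrative, not from the PR):

```typescript
// Builds a LIKE prefix pattern for the typeahead query, escaping
// %, _ and \ so user input is matched literally.
export function toLikePrefix(query: string): string {
  const escaped = query.replace(/[\\%_]/g, ch => `\\${ch}`);
  return `${escaped}%`; // prefix-only wildcard keeps the index usable
}
```

Note that a text_pattern_ops btree only helps case-sensitive LIKE prefix scans; keeping ILIKE would instead call for an expression index on lower(name_ua) with lower-cased prefix patterns.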
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
frontend/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (84)
- CHANGELOG.md
- frontend/.env.example
- frontend/actions/notifications.ts
- frontend/app/[locale]/admin/shop/orders/[id]/RefundButton.tsx
- frontend/app/[locale]/admin/shop/orders/[id]/ShippingActions.tsx
- frontend/app/[locale]/admin/shop/orders/[id]/page.tsx
- frontend/app/[locale]/blog/[slug]/PostDetails.tsx
- frontend/app/[locale]/blog/[slug]/page.tsx
- frontend/app/[locale]/blog/category/[category]/page.tsx
- frontend/app/[locale]/blog/page.tsx
- frontend/app/[locale]/layout.tsx
- frontend/app/[locale]/shop/cart/CartPageClient.tsx
- frontend/app/[locale]/shop/orders/[id]/page.tsx
- frontend/app/api/blog-author/route.ts
- frontend/app/api/shop/admin/orders/[id]/route.ts
- frontend/app/api/shop/admin/orders/[id]/shipping/route.ts
- frontend/app/api/shop/checkout/route.ts
- frontend/app/api/shop/internal/shipping/np/sync/route.ts
- frontend/app/api/shop/internal/shipping/retention/run/route.ts
- frontend/app/api/shop/internal/shipping/shipments/run/route.ts
- frontend/app/api/shop/orders/[id]/route.ts
- frontend/app/api/shop/shipping/methods/route.ts
- frontend/app/api/shop/shipping/np/cities/route.ts
- frontend/app/api/shop/shipping/np/warehouses/route.ts
- frontend/app/api/shop/webhooks/stripe/route.ts
- frontend/app/layout.tsx
- frontend/client.ts
- frontend/components/header/NotificationBell.tsx
- frontend/db/queries/shop/admin-orders.ts
- frontend/db/schema/shop.ts
- frontend/drizzle/0017_shop_shipping_core.sql
- frontend/drizzle/0018_shop_orders_shipping_invariants.sql
- frontend/drizzle/0019_p2_shop_invariants.sql
- frontend/drizzle/meta/0017_snapshot.json
- frontend/drizzle/meta/0018_snapshot.json
- frontend/drizzle/meta/0019_snapshot.json
- frontend/drizzle/meta/_journal.json
- frontend/lib/env/index.ts
- frontend/lib/env/nova-poshta.ts
- frontend/lib/services/orders/_shared.ts
- frontend/lib/services/orders/checkout.ts
- frontend/lib/services/orders/monobank-janitor.ts
- frontend/lib/services/orders/monobank-webhook.ts
- frontend/lib/services/orders/payment-attempts.ts
- frontend/lib/services/shop/shipping/admin-actions.ts
- frontend/lib/services/shop/shipping/availability.ts
- frontend/lib/services/shop/shipping/checkout-payload.ts
- frontend/lib/services/shop/shipping/inventory-eligibility.ts
- frontend/lib/services/shop/shipping/log-sanitizer.ts
- frontend/lib/services/shop/shipping/metrics.ts
- frontend/lib/services/shop/shipping/nova-poshta-catalog.ts
- frontend/lib/services/shop/shipping/nova-poshta-client.ts
- frontend/lib/services/shop/shipping/retention.ts
- frontend/lib/services/shop/shipping/shipments-worker.ts
- frontend/lib/shop/locale.ts
- frontend/lib/tests/shop/admin-api-killswitch.test.ts
- frontend/lib/tests/shop/admin-csrf-contract.test.ts
- frontend/lib/tests/shop/checkout-currency-policy.test.ts
- frontend/lib/tests/shop/checkout-shipping-phase3.test.ts
- frontend/lib/tests/shop/monobank-janitor-job3.test.ts
- frontend/lib/tests/shop/monobank-webhook-apply-outcomes.test.ts
- frontend/lib/tests/shop/monobank-webhook-apply.test.ts
- frontend/lib/tests/shop/nova-poshta-client-network-failure.test.ts
- frontend/lib/tests/shop/shipping-checkout-payload-phase6.test.ts
- frontend/lib/tests/shop/shipping-internal-np-sync-route-p2.test.ts
- frontend/lib/tests/shop/shipping-internal-retention-route-phase7.test.ts
- frontend/lib/tests/shop/shipping-log-sanitizer-phase7.test.ts
- frontend/lib/tests/shop/shipping-methods-route-p2.test.ts
- frontend/lib/tests/shop/shipping-np-cities-route-p2.test.ts
- frontend/lib/tests/shop/shipping-np-warehouses-route-p2.test.ts
- frontend/lib/tests/shop/shipping-retention-phase7.test.ts
- frontend/lib/tests/shop/shipping-shipments-worker-phase5.test.ts
- frontend/lib/tests/shop/stripe-webhook-psp-fields.test.ts
- frontend/lib/types/shop.ts
- frontend/lib/validation/shop-shipping.ts
- frontend/lib/validation/shop.ts
- frontend/messages/en.json
- frontend/messages/pl.json
- frontend/messages/uk.json
- frontend/package.json
- frontend/scripts/np-mock-server.mjs
- frontend/scripts/seed-np-local.ps1
- frontend/scripts/seed-np-local.sql
- netlify.toml
💤 Files with no reviewable changes (1)
- frontend/actions/notifications.ts
SHOP_SHIPPING_ENABLED=0
SHOP_SHIPPING_NP_ENABLED=0

# Retention (optional; days, used for cleanup/retention policies)
SHOP_SHIPPING_RETENTION_DAYS=

# Required when shipping is enabled (SHOP_SHIPPING_ENABLED=1 and SHOP_SHIPPING_NP_ENABLED=1).
# If shipping is enabled without required NP config, app throws NovaPoshtaConfigError at runtime.
# Optional if code has a default; set explicitly in production for clarity
Shipping toggle examples/comments conflict with runtime parsing.
This file suggests enabling shipping with =1, but frontend/lib/env/nova-poshta.ts checks for literal 'true'. Using 1 will keep features disabled unexpectedly.
💡 Suggested patch
-# Toggles (optional; defaults are handled in code)
-SHOP_SHIPPING_ENABLED=0
-SHOP_SHIPPING_NP_ENABLED=0
+# Toggles (optional; runtime checks expect "true"/"false")
+SHOP_SHIPPING_ENABLED=false
+SHOP_SHIPPING_NP_ENABLED=false
-# Required when shipping is enabled (SHOP_SHIPPING_ENABLED=1 and SHOP_SHIPPING_NP_ENABLED=1).
+# Required when shipping is enabled (SHOP_SHIPPING_ENABLED=true and SHOP_SHIPPING_NP_ENABLED=true).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/.env.example` around lines 74 - 82, The .env.example suggests
enabling shipping with SHOP_SHIPPING_ENABLED=1/SHOP_SHIPPING_NP_ENABLED=1 but
frontend/lib/env/nova-poshta.ts checks for the literal 'true', causing the
toggle to stay disabled; fix by making the toggle parsing robust and/or the
example consistent: update frontend/lib/env/nova-poshta.ts to parse
SHOP_SHIPPING_ENABLED and SHOP_SHIPPING_NP_ENABLED with a small helper (e.g.,
isTruthyEnv) that treats '1', 'true', 'yes' (case-insensitive) as true and falsy
values as false, and also update the .env.example comment to show the accepted
true form(s) (or change it to use 'true' explicitly) so both the code and
documentation match.
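A tolerant parser along the lines the prompt suggests could look like this; the accepted token set ('1', 'true', 'yes', 'on') is an assumption, not the project's current behavior:

```typescript
// Treats common truthy env tokens as true, everything else
// (including undefined and empty strings) as false.
export function isTruthyEnv(value: string | undefined): boolean {
  if (!value) return false;
  return ['1', 'true', 'yes', 'on'].includes(value.trim().toLowerCase());
}
```

With this helper, both the =1 style shown in .env.example and the 'true' literal the code currently expects would enable the feature.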
const providerOk = m.provider === 'nova_poshta';
const methodCodeOk =
  typeof m.methodCode === 'string' && m.methodCode.trim().length > 0;
const titleOk =
  typeof m.title === 'string' && m.title.trim().length > 0;

if (!providerOk || !methodCodeOk || !titleOk) {
  hardBlock();
  return;
}

methods.push({
  provider: 'nova_poshta',
  methodCode: m.methodCode as CheckoutDeliveryMethodCode,
  title: String(m.title),
Fail closed on unknown shipping method codes.
methodCode is currently accepted as any non-empty string and cast. That lets unsupported codes slip through client validation.
Suggested patch
+ const VALID_METHOD_CODES = new Set<CheckoutDeliveryMethodCode>([
+ 'NP_WAREHOUSE',
+ 'NP_LOCKER',
+ 'NP_COURIER',
+ ]);
+
const methods: ShippingMethod[] = [];
for (const item of methodsRaw) {
@@
- const methodCodeOk =
- typeof m.methodCode === 'string' && m.methodCode.trim().length > 0;
+ const methodCode =
+ typeof m.methodCode === 'string' &&
+ VALID_METHOD_CODES.has(m.methodCode as CheckoutDeliveryMethodCode)
+ ? (m.methodCode as CheckoutDeliveryMethodCode)
+ : null;
const titleOk =
typeof m.title === 'string' && m.title.trim().length > 0;
- if (!providerOk || !methodCodeOk || !titleOk) {
+ if (!providerOk || !methodCode || !titleOk) {
hardBlock();
return;
}
methods.push({
provider: 'nova_poshta',
- methodCode: m.methodCode as CheckoutDeliveryMethodCode,
+ methodCode,
title: String(m.title),
});
}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/app/`[locale]/shop/cart/CartPageClient.tsx around lines 345 - 359,
The code currently treats any non-empty string as a valid methodCode and casts
it to CheckoutDeliveryMethodCode, letting unsupported codes pass; change the
validation to check that m.methodCode is one of the supported delivery codes
(e.g. compare against the CheckoutDeliveryMethodCode enum or a shared
SUPPORTED_DELIVERY_METHOD_CODES array) before pushing; if the code is not in
that allowed set call hardBlock() and return; update the methods.push entry to
use the validated value (no unsafe cast) and keep provider/title checks as-is.
const gate = bypass
  ? ({ ok: true as const } as const)
  : await acquireJobSlot({
      runId,
      minIntervalSeconds: parsed.data.minIntervalSeconds,
    });
Handle gate-acquisition failures with the same structured error contract.
acquireJobSlot(...) can throw, and that path currently escapes your route-level JSON/error contract.
Suggested patch
- const gate = bypass
- ? ({ ok: true as const } as const)
- : await acquireJobSlot({
- runId,
- minIntervalSeconds: parsed.data.minIntervalSeconds,
- });
+ let gate: { ok: true } | { ok: false; nextAllowedAt: Date | null };
+ try {
+ gate = bypass
+ ? ({ ok: true as const } as const)
+ : await acquireJobSlot({
+ runId,
+ minIntervalSeconds: parsed.data.minIntervalSeconds,
+ });
+ } catch (error) {
+ logError(
+ 'shop_shipping_retention_gate_failed',
+ sanitizeShippingErrorForLog(error, 'Shipping retention gate failed.'),
+ sanitizeShippingLogMeta({
+ ...safeBaseMeta,
+ code: 'RETENTION_GATE_FAILED',
+ })
+ );
+ return noStoreJson(
+ { success: false, code: 'RETENTION_FAILED' },
+ requestId,
+ 500
+ );
+ }📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
-const gate = bypass
-  ? ({ ok: true as const } as const)
-  : await acquireJobSlot({
-      runId,
-      minIntervalSeconds: parsed.data.minIntervalSeconds,
-    });
+let gate: { ok: true } | { ok: false; nextAllowedAt: Date | null };
+try {
+  gate = bypass
+    ? ({ ok: true as const } as const)
+    : await acquireJobSlot({
+        runId,
+        minIntervalSeconds: parsed.data.minIntervalSeconds,
+      });
+} catch (error) {
+  logError(
+    'shop_shipping_retention_gate_failed',
+    sanitizeShippingErrorForLog(error, 'Shipping retention gate failed.'),
+    sanitizeShippingLogMeta({
+      ...safeBaseMeta,
+      code: 'RETENTION_GATE_FAILED',
+    })
+  );
+  return noStoreJson(
+    { success: false, code: 'RETENTION_FAILED' },
+    requestId,
+    500
+  );
+}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/app/api/shop/internal/shipping/retention/run/route.ts` around lines
180 - 186, The call to acquireJobSlot can throw and currently escapes the
route's JSON/error contract; wrap the await acquireJobSlot(...) call in a
try/catch (or handle Promise rejection) where you catch any error, normalize it
into the same structured error response your route returns (e.g., { ok: false,
error: { message, code?, details? } } or whatever contract your route uses), and
either assign that normalized value to gate or immediately return the structured
JSON response; update the gate assignment site (the bypass ternary and the
acquireJobSlot call) so both success and failure paths conform to the
route-level error contract.
function cachedJson(body: unknown, requestId: string) {
  const res = NextResponse.json(body, { status: 200 });
  res.headers.set(
    'Cache-Control',
    'public, max-age=30, s-maxage=60, stale-while-revalidate=120'
  );
  res.headers.set('X-Request-Id', requestId);
  return res;
}
Avoid public caching when availability is derived from request context.
`locale`/`currency` can come from the request context, but responses are cached as `public`. The same URL can then serve different availability to different users and leak stale or wrong availability via shared caches.
Use `no-store` unless locale/currency/country are explicitly provided in the query (or make cache keys explicitly vary by those inputs).
🔧 Suggested guard

```diff
-  if (!availability.available) {
-    return cachedJson(
+  const hasExplicitContext =
+    Boolean(parsed.data.locale) &&
+    Boolean(parsed.data.currency) &&
+    Boolean(parsed.data.country);
+
+  if (!availability.available) {
+    const respond = hasExplicitContext ? cachedJson : noStoreJson;
+    return respond(
       {
         success: true,
         available: false,
         reasonCode: availability.reasonCode,
         items: [],
       },
       requestId
     );
   }
   ...
-  return cachedJson(
+  const respond = hasExplicitContext ? cachedJson : noStoreJson;
+  return respond(
     {
       success: true,
       available: true,
       reasonCode: 'OK',
       items,
     },
     requestId
   );
```

Also applies to: 86-87, 97-107, 116-124
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/app/api/shop/shipping/np/cities/route.ts` around lines 24 - 32, The
cachedJson function currently sets responses as public-cachable which can leak
request-derived availability (locale/currency/country); update cachedJson (and
the other similar response creators at the other locations) to use no-store by
default unless locale/currency/country are explicitly provided in the request
query, or instead compute a cache key that incorporates those explicit inputs
and set appropriate Cache-Control/Vary headers; specifically, remove 'public,
max-age=...' and use 'no-store' when request-derived context is present, or when
you choose to allow caching, ensure the cache key or headers vary by
locale/currency/country so shared caches cannot serve wrong availability (keep
setting 'X-Request-Id' as before).
```diff
   const handleMarkAllAsRead = async () => {
     await markAllAsRead();
-    setNotifications((prev) =>
-      prev.map((n) => ({ ...n, isRead: true }))
-    );
+    setNotifications(prev => prev.map(n => ({ ...n, isRead: true })));
   };

   const handleMarkAsRead = async (id: string, isRead: boolean) => {
     if (isRead) return;
     await markAsRead(id);
-    setNotifications((prev) =>
-      prev.map((n) => (n.id === id ? { ...n, isRead: true } : n))
+    setNotifications(prev =>
+      prev.map(n => (n.id === id ? { ...n, isRead: true } : n))
     );
```
Only update local read state when server mutation succeeds
On lines 113–115 and 119–122, local state is marked as read even if the server action fails, which can desync the UI from persisted state.
Proposed fix

```diff
 const handleMarkAllAsRead = async () => {
-  await markAllAsRead();
-  setNotifications(prev => prev.map(n => ({ ...n, isRead: true })));
+  try {
+    const result = await markAllAsRead();
+    if (result?.success === false) return;
+    setNotifications(prev => prev.map(n => ({ ...n, isRead: true })));
+  } catch {
+    // keep current UI state; optionally add toast/log here
+  }
 };

 const handleMarkAsRead = async (id: string, isRead: boolean) => {
   if (isRead) return;
-  await markAsRead(id);
+  const result = await markAsRead(id);
+  if (!result?.success) return;
   setNotifications(prev =>
     prev.map(n => (n.id === id ? { ...n, isRead: true } : n))
   );
 };
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```tsx
const handleMarkAllAsRead = async () => {
  try {
    const result = await markAllAsRead();
    if (result?.success === false) return;
    setNotifications(prev => prev.map(n => ({ ...n, isRead: true })));
  } catch {
    // keep current UI state; optionally add toast/log here
  }
};

const handleMarkAsRead = async (id: string, isRead: boolean) => {
  if (isRead) return;
  const result = await markAsRead(id);
  if (!result?.success) return;
  setNotifications(prev =>
    prev.map(n => (n.id === id ? { ...n, isRead: true } : n))
  );
};
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/components/header/NotificationBell.tsx` around lines 112 - 122, The
handlers handleMarkAllAsRead and handleMarkAsRead currently update local state
immediately after calling markAllAsRead()/markAsRead() which can desync UI if
the server call fails; change each to await the mutation inside a try/catch (or
check the returned success value) and only call setNotifications when the server
call succeeds, and on failure do not mutate state (or optionally revert/notify
the user); reference the functions handleMarkAllAsRead, handleMarkAsRead,
markAllAsRead, markAsRead and setNotifications when making this change.
```ts
const cityRef = args.shipping.selection.cityRef;
const warehouseRef = args.shipping.selection.warehouseRef ?? null;
const methodCode = args.shipping.methodCode;
```
Reject or nullify warehouseRef for courier method.
For NP_COURIER, warehouseRef is not validated away and can flow into snapshot/hash. That creates inconsistent idempotency behavior and can send contradictory NP payload fields.
🔧 Suggested fix

```diff
-  const cityRef = args.shipping.selection.cityRef;
-  const warehouseRef = args.shipping.selection.warehouseRef ?? null;
+  const cityRef = args.shipping.selection.cityRef;
+  const requestedWarehouseRef = args.shipping.selection.warehouseRef ?? null;
   const methodCode = args.shipping.methodCode;
+  let warehouseRef: string | null = null;
   ...
   if (methodCode === 'NP_WAREHOUSE' || methodCode === 'NP_LOCKER') {
-    if (!warehouseRef) {
+    warehouseRef = requestedWarehouseRef;
+    if (!warehouseRef) {
       throw new InvalidPayloadError(
         'warehouseRef is required for this method.',
         {
           code: 'INVALID_SHIPPING_ADDRESS',
         }
       );
     }
     ...
   }
+  if (methodCode === 'NP_COURIER' && requestedWarehouseRef) {
+    throw new InvalidPayloadError('warehouseRef is not allowed for courier.', {
+      code: 'INVALID_SHIPPING_ADDRESS',
+    });
+  }
```

Also applies to: 487-523
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/lib/services/orders/checkout.ts` around lines 410 - 413, The code
reads shipping.selection.cityRef, warehouseRef and methodCode but doesn't clear
warehouseRef for NP_COURIER; update the handling around cityRef/warehouseRef
(and the later similar block at lines ~487-523) so that when methodCode ===
'NP_COURIER' you explicitly set warehouseRef to null (or remove it) before
building the snapshot/hash and before composing NP payloads; locate uses of
warehouseRef, cityRef and methodCode in checkout.ts and ensure NP_COURIER
branches never propagate a non-null warehouseRef into idempotency keys or NP
payload generation.
```ts
const preparedShipping = await prepareCheckoutShipping({
  shipping: shipping ?? null,
  locale,
  country: country ?? null,
  currency,
});
```
Idempotent replay can fail because shipping is validated before existing-order lookup.
prepareCheckoutShipping executes before getOrderByIdempotencyKey. If flags/catalog availability changed after the first successful attempt, a valid retry with the same idempotency key can be rejected instead of returning the existing order.
Move existing-order lookup ahead of strict shipping validation, or add an existing-order fast path that bypasses live catalog gating.
Also applies to: 794-803, 934-939
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/lib/services/orders/checkout.ts` around lines 657 - 663, The current
flow validates shipping via prepareCheckoutShipping before checking for an
existing order with getOrderByIdempotencyKey, which can cause legitimate
idempotent retries to be rejected if catalog/flags changed; move the
getOrderByIdempotencyKey call to run before prepareCheckoutShipping (or
implement an early fast-path that calls getOrderByIdempotencyKey and returns the
existing order if found) so that shipping validation is skipped on idempotent
replay, and apply the same change to the other occurrences referenced (around
the blocks at ~794-803 and ~934-939) to ensure all idempotent replay paths
bypass live catalog gating.
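The reorder can be sketched with an in-memory idempotency store. This is a minimal illustration of the fast-path idea only: `checkout` and `validateShipping` below are hypothetical stand-ins for the route's actual `getOrderByIdempotencyKey`/`prepareCheckoutShipping` flow, not the real implementation.

```typescript
// Hypothetical sketch: look up the existing order BEFORE strict shipping
// validation, so idempotent replays succeed even after catalog/flag changes.
type Order = { id: string; status: string };

const ordersByIdempotencyKey = new Map<string, Order>();

// Stand-in for prepareCheckoutShipping's live catalog gating, which may
// reject inputs that were valid on the first attempt.
function validateShipping(shipping: unknown): void {
  if (shipping === null) throw new Error('SHIPPING_UNAVAILABLE');
}

function checkout(args: { idempotencyKey: string; shipping: unknown }): Order {
  // Fast path: an existing order short-circuits validation entirely.
  const existing = ordersByIdempotencyKey.get(args.idempotencyKey);
  if (existing) return existing;

  validateShipping(args.shipping);
  const order: Order = {
    id: `ord_${ordersByIdempotencyKey.size + 1}`,
    status: 'created',
  };
  ordersByIdempotencyKey.set(args.idempotencyKey, order);
  return order;
}
```

With this ordering, a retry whose shipping method has since been disabled still returns the original order instead of a validation error.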
```ts
export async function findCachedCities(args: {
  q: string;
  limit: number;
}): Promise<ShippingCity[]> {
  const q = args.q.trim();
  if (!q) return [];

  const like = `%${q}%`;
  const res = await db.execute<CityRow>(sql`
    SELECT ref, name_ua, name_ru, area, region, settlement_type
    FROM np_cities
    WHERE is_active = true
      AND (
        name_ua ILIKE ${like}
        OR COALESCE(name_ru, '') ILIKE ${like}
      )
    ORDER BY name_ua ASC
    LIMIT ${args.limit}
  `);
```
Clamp limit before passing it to SQL.
At lines 118 and 151, raw `args.limit` is used directly. Enforce bounds (e.g., 1..50) at this layer to prevent oversized scans and keep invalid limits from slipping through.
Suggested fix

```diff
 export async function findCachedCities(args: {
   q: string;
   limit: number;
 }): Promise<ShippingCity[]> {
   const q = args.q.trim();
   if (!q) return [];
+  const limit = Math.max(1, Math.min(50, Math.floor(args.limit)));
   const like = `%${q}%`;
   const res = await db.execute<CityRow>(sql`
 @@
-    LIMIT ${args.limit}
+    LIMIT ${limit}
   `);
 @@
 export async function findCachedWarehouses(args: {
   settlementRef: string;
   q?: string;
   limit: number;
 }): Promise<ShippingWarehouse[]> {
+  const limit = Math.max(1, Math.min(50, Math.floor(args.limit)));
   const hasQ = !!args.q?.trim();
   const like = `%${(args.q ?? '').trim()}%`;
 @@
-    LIMIT ${args.limit}
+    LIMIT ${limit}
   `);
```

Also applies to: 130-152
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/lib/services/shop/shipping/nova-poshta-catalog.ts` around lines 101
- 119, Clamp the incoming args.limit to a safe range (e.g., 1..50) before
interpolating it into SQL to avoid oversized scans and invalid values: inside
findCachedCities (and the other function that also uses args.limit in the same
file) compute a local numeric limitClamped = Math.max(1, Math.min(50,
Number(args.limit) || 0)) or equivalent, and use limitClamped in the SQL LIMIT
and any other places rather than raw args.limit; ensure the clamping is applied
to both occurrences that reference args.limit (search for uses in
findCachedCities and the adjacent function around lines 130–152).
```ts
  let payload: NovaPoshtaEnvelope<T>;
  try {
    payload = (await response.json()) as NovaPoshtaEnvelope<T>;
  } catch {
    throw new NovaPoshtaApiError(
      'NP_INVALID_JSON',
      'Nova Poshta response parsing failed',
      503
    );
  }

  if (!payload.success) {
    const message = firstMessage(
      payload.errors,
      payload.warnings,
      payload.info
    );

    throw new NovaPoshtaApiError(
      normalizeNpBusinessErrorCode({
        rawCode: payload.errorCodes?.[0],
        message,
      }),
      message,
      503
    );
  }

  return payload.data ?? [];
}
```
Validate NP response shape before using/returning it.
response.json() is trusted as NovaPoshtaEnvelope<T>, but malformed payloads can bypass typing and cause runtime errors or non-array returns from a function typed as T[].
🔧 Suggested hardening

```diff
-  let payload: NovaPoshtaEnvelope<T>;
+  let payload: unknown;
   try {
-    payload = (await response.json()) as NovaPoshtaEnvelope<T>;
+    payload = await response.json();
   } catch {
     throw new NovaPoshtaApiError(
       'NP_INVALID_JSON',
       'Nova Poshta response parsing failed',
       503
     );
   }

-  if (!payload.success) {
+  if (!payload || typeof payload !== 'object') {
+    throw new NovaPoshtaApiError(
+      'NP_INVALID_PAYLOAD',
+      'Nova Poshta response shape is invalid',
+      503
+    );
+  }
+  const envelope = payload as NovaPoshtaEnvelope<T>;
+  if (typeof envelope.success !== 'boolean') {
+    throw new NovaPoshtaApiError(
+      'NP_INVALID_PAYLOAD',
+      'Nova Poshta response shape is invalid',
+      503
+    );
+  }
+
+  if (!envelope.success) {
     const message = firstMessage(
-      payload.errors,
-      payload.warnings,
-      payload.info
+      envelope.errors,
+      envelope.warnings,
+      envelope.info
     );

     throw new NovaPoshtaApiError(
       normalizeNpBusinessErrorCode({
-        rawCode: payload.errorCodes?.[0],
+        rawCode: envelope.errorCodes?.[0],
         message,
       }),
       message,
       503
     );
   }

-  return payload.data ?? [];
+  if (envelope.data !== undefined && !Array.isArray(envelope.data)) {
+    throw new NovaPoshtaApiError(
+      'NP_INVALID_PAYLOAD',
+      'Nova Poshta response shape is invalid',
+      503
+    );
+  }
+  return envelope.data ?? [];
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/lib/services/shop/shipping/nova-poshta-client.ts` around lines 292 -
321, The parsed response is blindly cast to NovaPoshtaEnvelope<T> and may be
malformed; after awaiting response.json() in this block and before checking
payload.success, validate the shape (e.g., ensure payload is an object, typeof
payload.success === 'boolean', and payload.data is either undefined or an array)
and if the shape is invalid throw a NovaPoshtaApiError (e.g., use a new code
like 'NP_MALFORMED_RESPONSE') instead of proceeding; update the logic around the
payload variable usage (the parsed value from response.json(), the success
check, and the final return payload.data ?? []) so callers always get T[] or a
clear API error, referencing NovaPoshtaEnvelope, payload, NovaPoshtaApiError,
firstMessage and normalizeNpBusinessErrorCode to locate where to add the
validation and error path.
```ts
export async function claimQueuedShipmentsForProcessing(args: {
  runId: string;
  leaseSeconds: number;
  limit: number;
}): Promise<ClaimedShipmentRow[]> {
  const res = await db.execute<ClaimedShipmentRow>(sql`
    with candidates as (
      select s.id
      from shipping_shipments s
      where (
          (
            s.status in ('queued', 'failed')
            and (s.next_attempt_at is null or s.next_attempt_at <= now())
          )
          or s.status = 'processing'
        )
        and (s.lease_expires_at is null or s.lease_expires_at < now())
      order by coalesce(s.next_attempt_at, s.created_at) asc, s.created_at asc
      for update skip locked
      limit ${args.limit}
    ),
    claimed as (
      update shipping_shipments s
      set status = 'processing',
          lease_owner = ${args.runId},
          lease_expires_at = now() + make_interval(secs => ${args.leaseSeconds}),
          updated_at = now()
```
Prevent duplicate TTN creation on lease expiry during external call.
createInternetDocument performs a non-idempotent provider write before durable success marking. If lease expires and markSucceeded returns null, the worker reports retried and another worker can create a second TTN for the same shipment.
Use an idempotency guard for provider calls (or persistent “label creation started” token + lease heartbeat / strict lease > request timeout margin) before issuing the external write.
Also applies to: 541-558
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@frontend/lib/services/shop/shipping/shipments-worker.ts` around lines 331 -
357, Add a persistent idempotency guard in the shipment row and check/update it
atomically to prevent duplicate provider writes: when a worker claims a shipment
(claimQueuedShipmentsForProcessing) or immediately before calling
createInternetDocument, set a DB column like label_creation_started_by (string)
and label_creation_started_at (timestamp) (or a single token) only if it is
currently null; make createInternetDocument first verify and set that token in
the same transactional flow so another worker will skip provider calls if the
token exists, and ensure markSucceeded and the failure/rollback paths clear or
record the final label id (e.g., ttn) and remove the in-progress token
consistently; also ensure lease heartbeat/expiry is kept longer than the
external request timeout or that the token prevents re-creation even if lease
expires.
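The token idea can be sketched in memory. This is an illustrative model only: the `label_creation_started_by` column, `tryClaimLabelCreation`, and the simplified `createInternetDocument` below are hypothetical names, not the actual schema or worker code; in production the claim would be a single conditional `UPDATE ... WHERE label_creation_started_by IS NULL`.

```typescript
// Hypothetical sketch: claim a persistent token BEFORE the non-idempotent
// provider write, so a second worker (after lease expiry) skips the call.
type Shipment = {
  id: string;
  labelCreationStartedBy: string | null; // models a label_creation_started_by column
  ttn: string | null;
};

const shipments = new Map<string, Shipment>([
  ['s1', { id: 's1', labelCreationStartedBy: null, ttn: null }],
]);

let providerCalls = 0;

// Stand-in for the non-idempotent Nova Poshta createInternetDocument call.
function createInternetDocument(): string {
  providerCalls += 1;
  return `TTN-${providerCalls}`;
}

// Models an atomic conditional UPDATE: only one claimant can win.
function tryClaimLabelCreation(shipmentId: string, runId: string): boolean {
  const s = shipments.get(shipmentId);
  if (!s || s.labelCreationStartedBy !== null) return false;
  s.labelCreationStartedBy = runId;
  return true;
}

function processShipment(shipmentId: string, runId: string): void {
  // A worker that loses the claim returns without touching the provider,
  // even if the original claimant's lease has expired.
  if (!tryClaimLabelCreation(shipmentId, runId)) return;
  const ttn = createInternetDocument();
  shipments.get(shipmentId)!.ttn = ttn;
}
```

Under this guard, a lease expiry mid-call can still leave a claimed-but-unlabelled row that needs reconciliation, but it can no longer produce two TTNs for one shipment.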
Summary by CodeRabbit
New Features
Performance & Infrastructure