diff --git a/docs/audit-final-deferred.md b/docs/audit-final-deferred.md index 6e79167..d89a691 100644 --- a/docs/audit-final-deferred.md +++ b/docs/audit-final-deferred.md @@ -82,3 +82,189 @@ headerH + 10` after the rect+stroke block computes against the per-port override silently falls through to defaults. Not a security bug; a tuning footgun. Fix: accept `"true"`/`"false"` string forms in `asBool`. + +# Audit-final v2 (post-merge platform-wide pass) deferred findings + +A second comprehensive audit (security, routes, DB, integrations, UI/UX) +ran after the merge. The high-impact items landed in commit +`fix(audit-final-v2): platform-wide hardening` (or similar). Items below +are deferred follow-ups. + +## Routes / API + +- **Saved-views routes lack `withPermission`** — + `src/app/api/v1/saved-views/[id]/route.ts:4-5` and + `src/app/api/v1/saved-views/route.ts:24`. Convention is + `withAuth(withPermission(...))`. Verify the service applies + `(ctx.userId, ctx.portId)` ownership filtering, then add either an + explicit owner-only comment or wrap with a benign permission gate. + +- **Custom-fields permission resource hardcoded to `clients`** — + `src/app/api/v1/custom-fields/[entityId]/route.ts:15,29`. Custom fields + attach to client / yacht / interest / berth / company, but the route + always checks `clients.view` / `clients.edit`. A user with + `companies.view` can read confidential company custom-field values via + this endpoint (the service-level `customFieldDefinitions.portId` filter + prevents cross-tenant access but not cross-resource within a tenant). + Fix: split into per-entity routes, OR resolve `entityType` and gate on + the matching permission inline. + +- **`alerts/[id]/acknowledge|dismiss` ungated** — + `src/app/api/v1/alerts/[id]/acknowledge/route.ts:6` etc. only `withAuth`, + no `withPermission`. Verify the service requires user ownership; if + not, gate on `reports.view_dashboard` or similar.
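The `withAuth(withPermission(...))` convention the bullets above reference can be sketched as follows. This is a self-contained illustration, not the project's real helpers (those live in `@/lib/api/helpers`); the `Ctx`/`Handler` shapes and the `resource.action` permission-string format are assumptions made for the sketch.

```typescript
// Illustrative only: minimal shapes standing in for the real route helpers.
type Ctx = { userId: string; portId: string; permissions: Set<string> };
type Result = { status: number; body?: unknown };
type Handler = (ctx: Ctx) => Result;

// Wraps a handler so it only runs when the caller holds `resource.action`;
// otherwise it short-circuits with a 403, mirroring the gating convention.
function withPermission(resource: string, action: string, handler: Handler): Handler {
  return (ctx) =>
    ctx.permissions.has(`${resource}.${action}`)
      ? handler(ctx)
      : { status: 403, body: { error: 'Forbidden' } };
}

// e.g. gating the acknowledge route on `reports.view_dashboard`:
const acknowledge = withPermission('reports', 'view_dashboard', () => ({ status: 204 }));
```

The point of the convention is that an ungated route like `alerts/[id]/acknowledge` differs from a gated one by exactly one wrapper, so retrofitting is mechanical.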
+ +- **Public POST routes bypass service layer** — + `src/app/api/public/interests/route.ts`, `…/website-inquiries/route.ts`, + `…/residential-inquiries/route.ts`. These do extensive `tx.insert(...)` + with hand-rolled audit logs (`userId: null as unknown as string`). + Extract a `publicInterestService.create(...)` so the same code path is + unit-testable and port-id discipline is uniform. Verify + `audit_logs.user_id` is nullable (the cast pattern signals it is, but + enforce in schema if not). + +- **Inconsistent response shapes** — most endpoints return `{ data: ... }`, + but `notifications/[notificationId]` returns `{ success: true }`, + `website-inquiries` returns `{ id, deduped }`. Document a convention in + CLAUDE.md and migrate. + +- **`req.json()` without `parseBody` helper** — admin custom-fields + routes use `await req.json(); schema.parse(body)` directly instead of + the project's `parseBody(req, schema)` helper. Migrate for uniform + 400 error shapes. + +## Documenso integration + +- **v2 voidDocument endpoint may not match real API** — + `src/lib/services/documenso-client.ts:450-466`. The audit flagged that + Documenso 2.x exposes envelope deletion as + `POST /api/v2/envelope/delete` with `{ envelopeId }` body, not + `DELETE /api/v2/envelope/{id}`. The unit test mocks fetch so it can't + catch the real shape. Verify against a live Documenso 2.x instance + (`pnpm exec playwright test --project=realapi`) before flipping any + port to v2. + +- **Webhook dedup vs per-recipient signed events** — + `src/app/api/webhooks/documenso/route.ts:103-110`. The top-level + `signatureHash` (sha256 of raw body) blocks exact replays, but a + duplicate webhook delivery for a multi-recipient document with a + re-encoded body will go through the per-recipient loop. Make the + `documentEvents.signatureHash` unique index cover the suffixed values OR add + a composite unique index `(documensoDocumentId, recipientEmail, eventType)`.
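The composite dedup key suggested in the webhook bullet above can be sketched like this. Unlike a sha256 of the raw body, it is stable across re-encoded redeliveries of the same event; the function name and argument shapes are assumptions for illustration, not the project's schema.

```typescript
import { createHash } from 'node:crypto';

// Sketch: a dedup key derived from the event's identity rather than its
// serialized bytes. A redelivered webhook with a re-encoded body produces
// the same key, so a unique constraint on it makes the per-recipient loop
// idempotent. Email is lowercased so case drift between deliveries
// doesn't defeat the dedup.
function recipientEventKey(
  documensoDocumentId: number,
  recipientEmail: string,
  eventType: string,
): string {
  return createHash('sha256')
    .update(`${documensoDocumentId}:${recipientEmail.toLowerCase()}:${eventType}`)
    .digest('hex');
}
```

Storing this in a unique column is equivalent in effect to the composite unique index `(documensoDocumentId, recipientEmail, eventType)` proposed above; either works, the index is just more transparent to query planners.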
+ +- **v1 `placeFields` per-field POST has no retry** — + `src/lib/services/documenso-client.ts:374-398`. A single transient 500 + mid-loop leaves the document with a partial field set. Add 3-attempt + exponential backoff on 5xx + voidDocument on final failure. + +## Storage + +- **S3 backend has no startup bucket-exists check** — + `src/lib/storage/s3.ts:100-111`. A typo'd bucket name surfaces as a + 500 inside a user-facing request rather than at boot. Add + `await client.bucketExists(bucket)` in `S3Backend.create` with a clear + error message. + +- **Storage cache fingerprint includes encrypted secret** — + `src/lib/storage/index.ts:158-159`. After a key rotation the old + cached client survives until `resetStorageBackendCache()` is called + (already called via the settings-write hook). Document the + invariant or fingerprint on a content-hash that excludes encrypted + material. + +- **Filesystem dev HMAC silent fallback** — + `src/lib/storage/filesystem.ts:309-332`. Two dev nodes started with + different `BETTER_AUTH_SECRET` derive different secrets and reject + each other's tokens. Log a one-line warn at backend boot in non-prod. + +## DB schema + +- **`berths.current_pdf_version_id` lacks Drizzle FK** — + `src/lib/db/schema/berths.ts:83`. The FK exists in migration 0030 + but not in the schema source-of-truth, so `pnpm db:push` against an + empty DB skips the constraint. Either add the FK with a deferred + declaration or document that `db:push` is unsupported. + +- **Missing indexes on FK columns** — `berthReservations.interestId`, + `berthReservations.contractFileId`, `documents.fileId`, + `documents.signedFileId`, `documentEvents.signerId`, + `documentTemplates.sourceFileId`, `formSubmissions.formTemplateId`, + `formSubmissions.clientId`, `documentSends.brochureId`, + `documentSends.brochureVersionId`, `documentSends.sentByUserId`. Add + `index(...)` declarations to avoid full-scan FK checks on parent + delete. 
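As a sketch, the index backfill for the first few FK columns listed above might look like the following. The snake_case spellings and `idx_*` names are assumptions inferred from the naming conventions visible in the existing migrations; verify against the actual schema before use.

```sql
-- Assumed table/column spellings; follow the project's idx_* convention.
CREATE INDEX IF NOT EXISTS idx_berth_reservations_interest_id
  ON berth_reservations (interest_id);
CREATE INDEX IF NOT EXISTS idx_documents_file_id
  ON documents (file_id);
CREATE INDEX IF NOT EXISTS idx_document_events_signer_id
  ON document_events (signer_id);
-- ...and so on for the remaining FK columns listed above.
```

The matching `index(...)` declarations should also be added to the Drizzle schema files so `db:push` and the SQL migrations stay in sync, per the `current_pdf_version_id` bullet above.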
- **`systemSettings` PK / unique-index drift** — + `src/lib/db/schema/system.ts:119-133`. Schema declares only a + `uniqueIndex` on `(key, port_id)` but the migration uses `key` as PK. + `port_id` is nullable, so it cannot participate in a PK at all, and + under default NULLs-not-equal semantics the unique index still permits + duplicate `(key, NULL)` rows. Reconcile: declare + `primaryKey({ columns: [table.key, table.portId] })` (after making + `portId` non-null with a sentinel) OR use partial unique indexes for + global + per-port settings. + +- **Composite vs partial archived indexes** — many tables use + `index('idx_*_archived').on(portId, archivedAt)` when the dominant + query is `WHERE port_id = ? AND archived_at IS NULL`. Convert to + `index(...).on(portId).where(sql\`archived_at IS NULL\`)` partial + indexes for smaller storage + faster planner choice. + +- **`documentSends.sentByUserId` unenforced FK** — + `src/lib/db/schema/brochures.ts:118` is `notNull()` but has no FK + reference. If a user is hard-deleted (rare; we soft-delete), an + orphan id remains. Add `.references(() => users.id, { onDelete: 'set null' })` + and make the column nullable. Same audit-trail rationale as the + other documentSends FK fixes (migration 0035). + +## UI/UX + +- **Storage admin migration mutation lacks toasts** — + `src/components/admin/storage-admin-panel.tsx:61-72`. Add `onSuccess` + toast with row count + `onError` toast. + +- **Invoice detail send/payment mutations lack error feedback + gates** — + `src/components/invoices/invoice-detail.tsx:93-99,152-167`. Add + `onError: (e) => toast.error(...)` and wrap mutating buttons in + `PermissionGate` gates for the `send` / + `record_payment` actions. + +- **Admin user list edit button ungated** — + `src/components/admin/users/user-list.tsx:114`. Wrap in + `PermissionGate`. + +- **Email threads list missing skeleton** — + `src/components/email/email-threads-list.tsx:29-45`. Use skeleton + rows during load + an empty-state component for the empty case. + +- **Scan page mutations swallow OCR errors** — + `src/app/(dashboard)/[portSlug]/expenses/scan/page.tsx:67-87`.
Add an + inline error state for `scanMutation.isError` (the upload-side + already does this). + +- **Invoice detail uses `any` for query data** — strict-mode escape + hatch. Define a proper response type matching the API contract. + +## Security defense-in-depth + +- **Storage proxy token does not bind to port_id** — + `src/lib/storage/filesystem.ts:73-84`. Token's HMAC is global. Fix: + add `p` (portId) claim and have the proxy resolve key→owner row + + assert `owner.portId === payload.p`. + +- **Documenso webhook does not enforce port_id** — + `src/app/api/webhooks/documenso/route.ts:96-148`. Handlers dispatch + by global `documensoId`. Verify `documents(documenso_id)` is unique + platform-wide OR include the originating instance/team in the lookup. + +- **EOI in-app pathway silently swallows missing `Berth Range` field** — + `src/lib/pdf/fill-eoi-form.ts:93`. Log warn when + `context.eoiBerthRange` is non-empty AND the field is absent so the + Documenso template deployment gap is observable. + +- **AI worker has no cost-tracking ledger write** — + `src/lib/queue/workers/ai.ts:122-177`. Persist token usage to the + `ai_usage` ledger after every call. + +- **Logger redact paths miss nested credentials** — + `src/lib/logger.ts:5-19`. Extend redact list to cover + `*.headers.authorization`, `**.token`, `secretKeyEncrypted`, etc. diff --git a/src/app/(dashboard)/[portSlug]/expenses/scan/page.tsx b/src/app/(dashboard)/[portSlug]/expenses/scan/page.tsx index 38608a4..a9902d4 100644 --- a/src/app/(dashboard)/[portSlug]/expenses/scan/page.tsx +++ b/src/app/(dashboard)/[portSlug]/expenses/scan/page.tsx @@ -241,15 +241,6 @@ export default function ScanReceiptPage() {

JPEG, PNG, HEIC, WebP up to 10 MB

-

- Have many receipts?{' '} - - Bulk upload → - -

)} {/* `image/*` is the broadest accept — includes HEIC on iOS, diff --git a/src/app/api/storage/[token]/route.ts b/src/app/api/storage/[token]/route.ts index 24a8119..70f9e8d 100644 --- a/src/app/api/storage/[token]/route.ts +++ b/src/app/api/storage/[token]/route.ts @@ -52,7 +52,7 @@ export async function GET( ); } - const result = verifyProxyToken(token, backend.getHmacSecret()); + const result = verifyProxyToken(token, backend.getHmacSecret(), 'get'); if (!result.ok) { logger.warn({ reason: result.reason }, 'Storage proxy token rejected'); return NextResponse.json({ error: 'Invalid or expired token' }, { status: 403 }); @@ -145,7 +145,7 @@ export async function PUT( ); } - const result = verifyProxyToken(token, backend.getHmacSecret()); + const result = verifyProxyToken(token, backend.getHmacSecret(), 'put'); if (!result.ok) { logger.warn({ reason: result.reason }, 'Storage proxy upload token rejected'); return NextResponse.json({ error: 'Invalid or expired token' }, { status: 403 }); diff --git a/src/app/api/v1/admin/templates/[templateId]/rollback/route.ts b/src/app/api/v1/admin/templates/[templateId]/rollback/route.ts index f00b3d1..a8c184a 100644 --- a/src/app/api/v1/admin/templates/[templateId]/rollback/route.ts +++ b/src/app/api/v1/admin/templates/[templateId]/rollback/route.ts @@ -14,7 +14,7 @@ import { rollbackAdminTemplateSchema } from '@/lib/validators/document-templates * Body: { version: number } */ export const POST = withAuth( - withPermission('documents', 'manage', async (req, ctx, params) => { + withPermission('document_templates', 'manage', async (req, ctx, params) => { try { const body = await parseBody(req, rollbackAdminTemplateSchema); const result = await rollbackAdminTemplate( diff --git a/src/app/api/v1/admin/templates/[templateId]/route.ts b/src/app/api/v1/admin/templates/[templateId]/route.ts index a023f3d..c0fed8e 100644 --- a/src/app/api/v1/admin/templates/[templateId]/route.ts +++ 
b/src/app/api/v1/admin/templates/[templateId]/route.ts @@ -15,7 +15,7 @@ import { updateAdminTemplateSchema } from '@/lib/validators/document-templates'; * Retrieve a single TipTap-based document template. */ export const GET = withAuth( - withPermission('documents', 'manage', async (req, ctx, params) => { + withPermission('document_templates', 'manage', async (req, ctx, params) => { try { const template = await getAdminTemplate(ctx.portId, params.templateId!); return NextResponse.json({ data: template }); @@ -30,21 +30,15 @@ export const GET = withAuth( * Update a TipTap-based document template. Increments version if content changes. */ export const PATCH = withAuth( - withPermission('documents', 'manage', async (req, ctx, params) => { + withPermission('document_templates', 'manage', async (req, ctx, params) => { try { const body = await parseBody(req, updateAdminTemplateSchema); - const updated = await updateAdminTemplate( - ctx.portId, - params.templateId!, - ctx.userId, - body, - { - userId: ctx.userId, - portId: ctx.portId, - ipAddress: ctx.ipAddress, - userAgent: ctx.userAgent, - }, - ); + const updated = await updateAdminTemplate(ctx.portId, params.templateId!, ctx.userId, body, { + userId: ctx.userId, + portId: ctx.portId, + ipAddress: ctx.ipAddress, + userAgent: ctx.userAgent, + }); return NextResponse.json({ data: updated }); } catch (error) { return errorResponse(error); @@ -57,19 +51,14 @@ export const PATCH = withAuth( * Delete a TipTap-based document template. 
*/ export const DELETE = withAuth( - withPermission('documents', 'manage', async (req, ctx, params) => { + withPermission('document_templates', 'manage', async (req, ctx, params) => { try { - await deleteAdminTemplate( - ctx.portId, - params.templateId!, - ctx.userId, - { - userId: ctx.userId, - portId: ctx.portId, - ipAddress: ctx.ipAddress, - userAgent: ctx.userAgent, - }, - ); + await deleteAdminTemplate(ctx.portId, params.templateId!, ctx.userId, { + userId: ctx.userId, + portId: ctx.portId, + ipAddress: ctx.ipAddress, + userAgent: ctx.userAgent, + }); return new NextResponse(null, { status: 204 }); } catch (error) { return errorResponse(error); diff --git a/src/app/api/v1/admin/templates/[templateId]/versions/route.ts b/src/app/api/v1/admin/templates/[templateId]/versions/route.ts index bb822f0..d67678c 100644 --- a/src/app/api/v1/admin/templates/[templateId]/versions/route.ts +++ b/src/app/api/v1/admin/templates/[templateId]/versions/route.ts @@ -9,12 +9,9 @@ import { getAdminTemplateVersions } from '@/lib/services/document-templates.serv * Returns version history for a template, sourced from audit_logs. 
*/ export const GET = withAuth( - withPermission('documents', 'manage', async (req, ctx, params) => { + withPermission('document_templates', 'manage', async (req, ctx, params) => { try { - const versions = await getAdminTemplateVersions( - ctx.portId, - params.templateId!, - ); + const versions = await getAdminTemplateVersions(ctx.portId, params.templateId!); return NextResponse.json({ data: versions }); } catch (error) { return errorResponse(error); diff --git a/src/app/api/v1/admin/templates/preview/route.ts b/src/app/api/v1/admin/templates/preview/route.ts index 0cb5985..64c240f 100644 --- a/src/app/api/v1/admin/templates/preview/route.ts +++ b/src/app/api/v1/admin/templates/preview/route.ts @@ -25,7 +25,7 @@ import { previewAdminTemplateSchema } from '@/lib/validators/document-templates' * sampleData?: Record - variable substitutions */ export const POST = withAuth( - withPermission('documents', 'manage', async (req, _ctx) => { + withPermission('document_templates', 'manage', async (req, _ctx) => { try { const body = await parseBody(req, previewAdminTemplateSchema); diff --git a/src/app/api/v1/admin/templates/route.ts b/src/app/api/v1/admin/templates/route.ts index fca5590..cb98427 100644 --- a/src/app/api/v1/admin/templates/route.ts +++ b/src/app/api/v1/admin/templates/route.ts @@ -3,10 +3,7 @@ import { NextResponse } from 'next/server'; import { withAuth, withPermission } from '@/lib/api/helpers'; import { parseQuery, parseBody } from '@/lib/api/route-helpers'; import { errorResponse } from '@/lib/errors'; -import { - listAdminTemplates, - createAdminTemplate, -} from '@/lib/services/document-templates.service'; +import { listAdminTemplates, createAdminTemplate } from '@/lib/services/document-templates.service'; import { listAdminTemplatesSchema, createAdminTemplateSchema, @@ -17,7 +14,7 @@ import { * List TipTap-based document templates for the port. 
*/ export const GET = withAuth( - withPermission('documents', 'manage', async (req, ctx) => { + withPermission('document_templates', 'manage', async (req, ctx) => { try { const query = parseQuery(req, listAdminTemplatesSchema); const data = await listAdminTemplates(ctx.portId, query); @@ -33,7 +30,7 @@ export const GET = withAuth( * Create a new TipTap-based document template. */ export const POST = withAuth( - withPermission('documents', 'manage', async (req, ctx) => { + withPermission('document_templates', 'manage', async (req, ctx) => { try { const body = await parseBody(req, createAdminTemplateSchema); const template = await createAdminTemplate(ctx.portId, ctx.userId, body, { diff --git a/src/app/api/webhooks/documenso/route.ts b/src/app/api/webhooks/documenso/route.ts index 353b6ea..fa92449 100644 --- a/src/app/api/webhooks/documenso/route.ts +++ b/src/app/api/webhooks/documenso/route.ts @@ -111,7 +111,15 @@ export async function POST(req: NextRequest): Promise { } case 'DOCUMENT_OPENED': { - const openedRecipients = recipients.filter((r) => r.readStatus === 'OPENED'); + // Documenso v1 sends `readStatus: 'OPENED'`; v2 has used both + // upper and lower case across releases and may omit the field + // entirely (the event itself signals the open). Treat the event + // as the signal: dispatch a per-recipient open for every + // recipient on the document so v2 deployments stop silently + // dropping opens. 
+ const openedRecipients = recipients.filter( + (r) => !r.readStatus || String(r.readStatus).toUpperCase() === 'OPENED', + ); for (const r of openedRecipients) { await handleDocumentOpened({ documentId: documensoId, diff --git a/src/components/admin/brochures-admin-panel.tsx b/src/components/admin/brochures-admin-panel.tsx index e6c64c2..45847d4 100644 --- a/src/components/admin/brochures-admin-panel.tsx +++ b/src/components/admin/brochures-admin-panel.tsx @@ -123,6 +123,8 @@ function BrochureCard({ brochure, onChange }: { brochure: BrochureRow; onChange: toast.success('Default brochure updated'); onChange(); }, + onError: (e) => + toast.error(e instanceof Error ? e.message : 'Could not update default brochure'), }); const archiveMutation = useMutation({ @@ -131,6 +133,7 @@ function BrochureCard({ brochure, onChange }: { brochure: BrochureRow; onChange: toast.success('Brochure archived'); onChange(); }, + onError: (e) => toast.error(e instanceof Error ? e.message : 'Archive failed'), }); async function handleUpload(file: File) { @@ -284,6 +287,7 @@ function CreateBrochureDialog({ onCreated(); onOpenChange(false); }, + onError: (e) => toast.error(e instanceof Error ? 
e.message : 'Could not create brochure'), }); return ( diff --git a/src/components/expenses/expense-detail.tsx b/src/components/expenses/expense-detail.tsx index 9d1f285..649e826 100644 --- a/src/components/expenses/expense-detail.tsx +++ b/src/components/expenses/expense-detail.tsx @@ -3,17 +3,87 @@ import { useState, useEffect } from 'react'; import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'; import { format } from 'date-fns'; -import { Loader2, Receipt, Edit, Archive } from 'lucide-react'; +import { Archive, Download, Edit, FileText, Loader2, Receipt } from 'lucide-react'; import { Button } from '@/components/ui/button'; import { Badge } from '@/components/ui/badge'; import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card'; import { ArchiveConfirmDialog } from '@/components/shared/archive-confirm-dialog'; +import { PermissionGate } from '@/components/shared/permission-gate'; +import { toast } from 'sonner'; import { apiFetch } from '@/lib/api/client'; import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider'; import type { ExpenseRow } from './expense-columns'; import { ExpenseDuplicateBanner } from './expense-duplicate-banner'; +/** + * Renders an image thumbnail for previewable receipts (jpeg/png/webp/heic + * via the existing /files/[id]/preview presign), falling back to a "Download" + * link for PDFs and other non-previewable types. Replaces the prior + * impossible-to-use UUID-badge list — reps can finally see the receipt + * they uploaded against the expense. 
+ */ +function ReceiptThumbnail({ fileId }: { fileId: string }) { + const { data, isLoading, isError } = useQuery<{ + data: { url: string; mimeType: string } | null; + error?: string; + }>({ + queryKey: ['file-preview', fileId], + queryFn: async () => { + try { + const res = await apiFetch<{ data: { url: string; mimeType: string } }>( + `/api/v1/files/${fileId}/preview`, + ); + return res; + } catch (e) { + // Non-image files raise ValidationError ("This file type cannot be + // previewed") — fall through to the Download link. + return { data: null, error: e instanceof Error ? e.message : 'preview unavailable' }; + } + }, + staleTime: 5 * 60 * 1000, + }); + + if (isLoading) { + return ( +
+ Loading preview… +
+ ); + } + + const url = data?.data?.url; + const mime = data?.data?.mimeType ?? ''; + const isImage = mime.startsWith('image/'); + + return ( +
+ {url && isImage ? ( + + Receipt + + ) : ( +
+ +
+ )} +
+ {mime || (isError ? 'Receipt' : 'File')} + + Download + +
+
+ ); +} + const PAYMENT_STATUS_COLORS: Record = { unpaid: 'bg-red-100 text-red-700 border-red-200', paid: 'bg-green-100 text-green-700 border-green-200', @@ -48,8 +118,13 @@ export function ExpenseDetail({ expenseId, onEdit, onArchived }: ExpenseDetailPr onSuccess: () => { queryClient.invalidateQueries({ queryKey: ['expenses'] }); setArchiveOpen(false); + toast.success('Expense archived'); onArchived?.(); }, + onError: (e) => { + toast.error(e instanceof Error ? e.message : 'Archive failed'); + setArchiveOpen(false); + }, }); if (isLoading) { @@ -84,20 +159,24 @@ export function ExpenseDetail({ expenseId, onEdit, onArchived }: ExpenseDetailPr
{onEdit && ( - + + + )} - + + +
@@ -172,11 +251,9 @@ export function ExpenseDetail({ expenseId, onEdit, onArchived }: ExpenseDetailPr -
- {(expense.receiptFileIds as string[]).map((fileId: string) => ( - - {fileId} - +
+ {(expense.receiptFileIds as string[]).map((fileId) => ( + ))}
diff --git a/src/lib/db/migrations/0035_document_sends_preserve_audit_on_parent_delete.sql b/src/lib/db/migrations/0035_document_sends_preserve_audit_on_parent_delete.sql new file mode 100644 index 0000000..f81aa00 --- /dev/null +++ b/src/lib/db/migrations/0035_document_sends_preserve_audit_on_parent_delete.sql @@ -0,0 +1,28 @@ +-- Audit-final v2 fix: document_sends FKs default to NO ACTION which means +-- a hard-delete of a referenced client/interest/berth/brochure either +-- silently blocks the parent delete OR (if a future cascade path is added) +-- nukes the send-out audit row. The audit trail must outlast its source — +-- recipient_email + document_kind + body_markdown + from_address are +-- already denormalized onto the row for exactly this purpose. +-- +-- Switch every parent FK to ON DELETE SET NULL so the audit row keeps a +-- timestamp + email even when the source row is gone. + +ALTER TABLE document_sends + DROP CONSTRAINT IF EXISTS document_sends_client_id_clients_id_fk, + DROP CONSTRAINT IF EXISTS document_sends_interest_id_interests_id_fk, + DROP CONSTRAINT IF EXISTS document_sends_berth_id_berths_id_fk, + DROP CONSTRAINT IF EXISTS document_sends_brochure_id_brochures_id_fk, + DROP CONSTRAINT IF EXISTS document_sends_brochure_version_id_brochure_versions_id_fk; + +ALTER TABLE document_sends + ADD CONSTRAINT document_sends_client_id_clients_id_fk + FOREIGN KEY (client_id) REFERENCES clients(id) ON DELETE SET NULL, + ADD CONSTRAINT document_sends_interest_id_interests_id_fk + FOREIGN KEY (interest_id) REFERENCES interests(id) ON DELETE SET NULL, + ADD CONSTRAINT document_sends_berth_id_berths_id_fk + FOREIGN KEY (berth_id) REFERENCES berths(id) ON DELETE SET NULL, + ADD CONSTRAINT document_sends_brochure_id_brochures_id_fk + FOREIGN KEY (brochure_id) REFERENCES brochures(id) ON DELETE SET NULL, + ADD CONSTRAINT document_sends_brochure_version_id_brochure_versions_id_fk + FOREIGN KEY (brochure_version_id) REFERENCES brochure_versions(id) ON DELETE SET 
NULL; diff --git a/src/lib/db/migrations/0036_polymorphic_check_constraints.sql b/src/lib/db/migrations/0036_polymorphic_check_constraints.sql new file mode 100644 index 0000000..dac98d0 --- /dev/null +++ b/src/lib/db/migrations/0036_polymorphic_check_constraints.sql @@ -0,0 +1,13 @@ +-- Audit-final v2 fix: polymorphic discriminator columns are currently free-text +-- and a typo in service code (e.g. `'clients'` vs `'client'`) inserts silently +-- and the row becomes unreachable through every read path that uses the +-- resolver service. Add CHECK constraints on the two most load-bearing +-- discriminators to surface drift at the DB level. + +ALTER TABLE yachts + ADD CONSTRAINT yachts_current_owner_type_chk + CHECK (current_owner_type IN ('client', 'company')); + +ALTER TABLE invoices + ADD CONSTRAINT invoices_billing_entity_type_chk + CHECK (billing_entity_type IN ('client', 'company')); diff --git a/src/lib/db/migrations/meta/_journal.json b/src/lib/db/migrations/meta/_journal.json index 95466ee..0f5e39a 100644 --- a/src/lib/db/migrations/meta/_journal.json +++ b/src/lib/db/migrations/meta/_journal.json @@ -246,6 +246,20 @@ "when": 1778000000000, "tag": "0034_normalize_mooring_numbers_broaden", "breakpoints": true + }, + { + "idx": 35, + "version": "7", + "when": 1778050000000, + "tag": "0035_document_sends_preserve_audit_on_parent_delete", + "breakpoints": true + }, + { + "idx": 36, + "version": "7", + "when": 1778100000000, + "tag": "0036_polymorphic_check_constraints", + "breakpoints": true } ] } diff --git a/src/lib/db/schema/brochures.ts b/src/lib/db/schema/brochures.ts index 27582d8..abb3f26 100644 --- a/src/lib/db/schema/brochures.ts +++ b/src/lib/db/schema/brochures.ts @@ -102,17 +102,25 @@ export const documentSends = pgTable( portId: text('port_id') .notNull() .references(() => ports.id), - /** Either client_id or interest_id is set (or both). 
*/ - clientId: text('client_id').references(() => clients.id), - interestId: text('interest_id').references(() => interests.id), + /** + * Either client_id or interest_id is set (or both). All five FKs use + * `onDelete: 'set null'` so the audit row survives if the parent + * client/interest/berth/brochure is deleted — `recipient_email`, + * `document_kind`, `body_markdown`, and `from_address` are denormalized + * onto the row precisely so the audit trail outlasts the source. + */ + clientId: text('client_id').references(() => clients.id, { onDelete: 'set null' }), + interestId: text('interest_id').references(() => interests.id, { onDelete: 'set null' }), recipientEmail: text('recipient_email').notNull(), /** 'berth_pdf' | 'brochure' */ documentKind: text('document_kind').notNull(), - berthId: text('berth_id').references(() => berths.id), + berthId: text('berth_id').references(() => berths.id, { onDelete: 'set null' }), /** Forward FK ref — berth_pdf_versions defined in Phase 6b. Loose-coupled. */ berthPdfVersionId: text('berth_pdf_version_id'), - brochureId: text('brochure_id').references(() => brochures.id), - brochureVersionId: text('brochure_version_id').references(() => brochureVersions.id), + brochureId: text('brochure_id').references(() => brochures.id, { onDelete: 'set null' }), + brochureVersionId: text('brochure_version_id').references(() => brochureVersions.id, { + onDelete: 'set null', + }), /** Exact body used (after merge-field expansion + sanitization). 
*/ bodyMarkdown: text('body_markdown'), sentByUserId: text('sent_by_user_id').notNull(), diff --git a/src/lib/email/index.ts b/src/lib/email/index.ts index 743b654..63097a4 100644 --- a/src/lib/email/index.ts +++ b/src/lib/email/index.ts @@ -69,7 +69,10 @@ async function resolveAttachments( const { files } = await import('@/lib/db/schema/documents'); const { eq } = await import('drizzle-orm'); const { ForbiddenError, NotFoundError } = await import('@/lib/errors'); - const { minioClient } = await import('@/lib/minio'); + // Pluggable storage backend (s3 OR filesystem). Direct MinIO imports + // break the filesystem-mode deployment path documented in CLAUDE.md. + const { getStorageBackend } = await import('@/lib/storage'); + const backend = await getStorageBackend(); return Promise.all( refs.map(async (ref) => { @@ -78,9 +81,9 @@ async function resolveAttachments( if (portId && file.portId !== portId) { throw new ForbiddenError('File belongs to a different port'); } - const stream = await minioClient.getObject(file.storageBucket, file.storagePath); + const stream = await backend.get(file.storagePath); const chunks: Buffer[] = []; - for await (const chunk of stream) { + for await (const chunk of stream as AsyncIterable) { chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk)); } return { diff --git a/src/lib/storage/filesystem.ts b/src/lib/storage/filesystem.ts index 5d22475..e349fb6 100644 --- a/src/lib/storage/filesystem.ts +++ b/src/lib/storage/filesystem.ts @@ -70,6 +70,17 @@ export function validateStorageKey(key: string): void { // ─── HMAC token helpers ───────────────────────────────────────────────────── +/** + * Token op binding. `'get'` tokens are issued by `presignDownload` and only + * accepted by the proxy GET handler. `'put'` tokens are issued by + * `presignUpload` and only accepted by the proxy PUT handler. 
Without this + * binding a long-lived 24h download URL emailed to a customer could be + * replayed against the PUT handler to overwrite the original storage object + * (since both routes share an HMAC and key — the magic-byte check is also + * skipped when `c` is unset). + */ +export type ProxyTokenOp = 'get' | 'put'; + interface ProxyTokenPayload { /** Storage key (validated). */ k: string; @@ -77,6 +88,11 @@ interface ProxyTokenPayload { e: number; /** Random nonce so two URLs for the same (key, expiry) differ. */ n: string; + /** + * Bound operation. Tokens minted before this field was added (legacy) + * fail-closed: the proxy handlers require the field's exact value. + */ + op: ProxyTokenOp; /** Optional download filename. */ f?: string; /** Optional content-type override. */ @@ -102,6 +118,12 @@ export function signProxyToken(payload: ProxyTokenPayload, secret: string): stri export function verifyProxyToken( token: string, secret: string, + /** + * Required: the operation the verifier is allowed to perform. The token + * must have been minted with the same `op`. Without this argument an + * upload token could be replayed as a download (and vice versa). + */ + expectedOp: ProxyTokenOp, ): { ok: true; payload: ProxyTokenPayload } | { ok: false; reason: string } { if (typeof token !== 'string' || !token.includes('.')) { return { ok: false, reason: 'malformed' }; @@ -138,6 +160,11 @@ export function verifyProxyToken( } catch { return { ok: false, reason: 'invalid-key' }; } + // Op-binding: tokens minted before this field was added have no `op` + // and are now rejected. Fresh tokens must match `expectedOp` exactly. + if (payload.op !== expectedOp) { + return { ok: false, reason: 'op-mismatch' }; + } return { ok: true, payload }; } @@ -269,7 +296,7 @@ export class FilesystemBackend implements StorageBackend { validateStorageKey(key); const expiresAt = Math.floor(Date.now() / 1000) + (opts.expirySeconds ?? 
900); const token = signProxyToken( - { k: key, e: expiresAt, n: randomUUID(), c: opts.contentType }, + { k: key, e: expiresAt, n: randomUUID(), op: 'put', c: opts.contentType }, this.hmacSecret, ); return { url: `/api/storage/${token}`, method: 'PUT' }; @@ -280,7 +307,14 @@ export class FilesystemBackend implements StorageBackend { const expirySec = opts.expirySeconds ?? 900; const expiresAtSec = Math.floor(Date.now() / 1000) + expirySec; const token = signProxyToken( - { k: key, e: expiresAtSec, n: randomUUID(), f: opts.filename, c: opts.contentType }, + { + k: key, + e: expiresAtSec, + n: randomUUID(), + op: 'get', + f: opts.filename, + c: opts.contentType, + }, this.hmacSecret, ); // ABSOLUTE URL: send-out emails interpolate this verbatim into the diff --git a/src/lib/utils/markdown-email.ts b/src/lib/utils/markdown-email.ts index 1ae6050..132b297 100644 --- a/src/lib/utils/markdown-email.ts +++ b/src/lib/utils/markdown-email.ts @@ -133,11 +133,41 @@ export function extractTokens(markdown: string): string[] { return matches ? Array.from(new Set(matches)) : []; } +/** + * Markdown-significant characters that need to be neutralized before a merge + * value is substituted into the rep-authored body. Without this, a value + * like `[click here](https://attacker.tld)` stored on a client/company would + * survive `renderEmailBody`'s HTML escape (escapeHtml leaves `[`, `]`, `(`, + * `)` intact) and produce a real `` in the outbound email — a + * phishing lure delivered from the legitimate sales account. + * + * Each char is replaced with its HTML entity. The entity encoding survives + * `escapeHtml` (which only re-escapes `&`) and renders as the original + * literal character — visually the user still sees their data verbatim, + * but the markdown rules (link, emphasis, code) no longer fire on it. 
+ */
+const MERGE_VALUE_ESCAPE_MAP: Record<string, string> = {
+  '\\': '&#92;',
+  '`': '&#96;',
+  '*': '&#42;',
+  _: '&#95;',
+  '[': '&#91;',
+  ']': '&#93;',
+  '(': '&#40;',
+  ')': '&#41;',
+  '{': '&#123;',
+  '}': '&#125;',
+};
+
+function escapeMergeValue(value: string): string {
+  return value.replace(/[\\`*_[\](){}]/g, (ch) => MERGE_VALUE_ESCAPE_MAP[ch] ?? ch);
+}
+
 /**
  * Replace `{{token}}` references with values from the supplied map. Tokens
  * not present in the map are left intact so the dry-run reporter can flag
- * them. Values are HTML-escape-safe by virtue of being run BEFORE
- * `renderEmailBody()`; the caller is expected to pass plain strings.
+ * them. Values are markdown-escaped before substitution so a malicious
+ * field cannot inject a link, emphasis, or another `{{token}}` form.
  */
 export function expandMergeTokens(
   markdown: string,
@@ -147,7 +177,7 @@
     const key = `{{${raw}}}`;
     const value = values[key];
     if (value === null || value === undefined || value === '') return full;
-    return String(value);
+    return escapeMergeValue(String(value));
   });
 }
diff --git a/src/middleware.ts b/src/middleware.ts
index 4743ba7..3dbe5de 100644
--- a/src/middleware.ts
+++ b/src/middleware.ts
@@ -32,9 +32,58 @@
 function isApiRoute(pathname: string): boolean {
   return pathname.startsWith('/api/');
 }

+const STATE_CHANGING_METHODS = new Set(['POST', 'PUT', 'PATCH', 'DELETE']);
+
+/**
+ * SameSite=Lax cookies block top-level cross-site POSTs in modern browsers,
+ * but as defense in depth, every state-changing request to a session-authed
+ * `/api/v1/**` endpoint must originate from the same origin as the app.
+ * Webhooks (`/api/webhooks/**`) and public posts (`/api/public/**`) are
+ * exempt because they're called by external systems with no session
+ * cookie. Auth flows (`/api/auth/**`) and portal (`/api/portal/**`) handle
+ * their own origin/CSRF checks via better-auth.
+ */
+function isOriginCheckedPath(pathname: string): boolean {
+  return pathname.startsWith('/api/v1/');
+}
+
+function originAllowed(request: NextRequest): boolean {
+  const origin = request.headers.get('origin');
+  const referer = request.headers.get('referer');
+  // A same-origin fetch from the app sends an Origin header matching the
+  // deployed origin. Use request.nextUrl.origin as the source of truth.
+  const expectedOrigin = request.nextUrl.origin;
+  if (origin) return origin === expectedOrigin;
+  if (referer) {
+    try {
+      return new URL(referer).origin === expectedOrigin;
+    } catch {
+      return false;
+    }
+  }
+  // Neither header present: modern browsers always send Origin on
+  // POST/PUT/PATCH/DELETE, so a missing header likely means a same-origin
+  // server-side call (e.g. Next.js internal fetch). Allow.
+  return true;
+}
+
 export function middleware(request: NextRequest): NextResponse {
   const { pathname } = request.nextUrl;
+  // CSRF defense-in-depth: state-changing requests to authed /api/v1
+  // endpoints must come from the app's own origin.
+ if ( + STATE_CHANGING_METHODS.has(request.method) && + isOriginCheckedPath(pathname) && + !originAllowed(request) + ) { + return NextResponse.json( + { error: 'Cross-origin state-changing request rejected' }, + { status: 403 }, + ); + } + // Always allow public paths through if (isPublicPath(pathname)) { return NextResponse.next(); diff --git a/tests/integration/storage/proxy-route.test.ts b/tests/integration/storage/proxy-route.test.ts index e0e5ce9..a4c3769 100644 --- a/tests/integration/storage/proxy-route.test.ts +++ b/tests/integration/storage/proxy-route.test.ts @@ -111,6 +111,7 @@ describe('GET /api/storage/[token]', () => { k: 'berths/abc/file.txt', e: Math.floor(Date.now() / 1000) + 60, n: 'nonce', + op: 'get' as const, }, 'wrong-secret', ); @@ -130,6 +131,7 @@ describe('GET /api/storage/[token]', () => { k: 'berths/abc/file.txt', e: Math.floor(Date.now() / 1000) - 1, n: 'nonce', + op: 'get' as const, }, backend.getHmacSecret(), ); diff --git a/tests/unit/markdown-email-sanitization.test.ts b/tests/unit/markdown-email-sanitization.test.ts index 785c458..d5da919 100644 --- a/tests/unit/markdown-email-sanitization.test.ts +++ b/tests/unit/markdown-email-sanitization.test.ts @@ -143,4 +143,30 @@ describe('merge token helpers', () => { }); expect(unresolved).toEqual(['{{b}}', '{{c}}']); }); + + // Audit-final v2: a malicious merge value (e.g. a client.fullName imported + // from a low-trust source) must NOT inject a link or emphasis into the + // rendered email body. escapeMergeValue neutralizes the markdown chars + // inside the value before substitution. + it('escapes markdown control chars inside merge values', () => { + const expanded = expandMergeTokens('Hi {{client.fullName}}, welcome.', { + '{{client.fullName}}': '[click here](https://attacker.tld)', + }); + // The brackets/parens are now entity-encoded, so the markdown link + // rule will not fire. 
+    expect(expanded).not.toContain('[click here](https://attacker.tld)');
+    expect(expanded).toContain('&#91;click here&#93;');
+
+    const html = renderEmailBody(expanded);
+    expect(html).not.toContain('<a href="https://attacker.tld"');
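Reviewer note: the entity-escaping rule in the markdown-email hunk can be sanity-checked in isolation. A minimal sketch follows; the constant and function mirror the patched `MERGE_VALUE_ESCAPE_MAP` / `escapeMergeValue`, while the `console.log` harness is illustrative and not part of the patch.

```typescript
// Each markdown-significant character maps to its numeric HTML entity, so
// the rendered text is visually unchanged but markdown rules cannot match.
const MERGE_VALUE_ESCAPE_MAP: Record<string, string> = {
  '\\': '&#92;', '`': '&#96;', '*': '&#42;', _: '&#95;',
  '[': '&#91;', ']': '&#93;', '(': '&#40;', ')': '&#41;',
  '{': '&#123;', '}': '&#125;',
};

function escapeMergeValue(value: string): string {
  // Single-pass replace: every char in the class is swapped for its entity.
  return value.replace(/[\\`*_[\](){}]/g, (ch) => MERGE_VALUE_ESCAPE_MAP[ch] ?? ch);
}

console.log(escapeMergeValue('[click here](https://attacker.tld)'));
// → &#91;click here&#93;&#40;https://attacker.tld&#41;
```

Because the brackets and parens are gone before the markdown pass runs, the link rule has nothing to match; plain values without markdown characters pass through unchanged.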