fix(audit-final): pre-merge hardening + expense receipt UI

Final audit pass on feat/berth-recommender (3 parallel Opus agents)
caught 5 critical and ~12 high-severity findings. All addressed in-branch;
medium/low items deferred to docs/audit-final-deferred.md.

Critical:
- Add filesystem-backend PUT handler at /api/storage/[token] so
  presigned uploads stop 405-ing in filesystem mode (every browser-driven
  berth-PDF + brochure upload was broken). Same token-verify + replay
  protection as GET, plus magic-byte gate when c=application/pdf.
- Forward req.signal into streamExpensePdf so an aborted 1000-receipt
  export no longer keeps grinding for minutes.
- Strengthen Content-Disposition filename sanitization: \s matches CR/LF
  which would let documentName forge headers; restrict to [\w. -]+ and
  add filename* RFC 5987 fallback.
- Lock public berths feed behind an explicit slug allowlist instead of
  ?portSlug= enumeration.
- Reject cross-port interest_berths upserts (defense-in-depth on top of
  the recommender SQL port filter).
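A minimal sketch of the hardened Content-Disposition construction described above; the helper name and exact character set are illustrative, not the branch's actual code:

```typescript
// Sketch: build a header-injection-safe Content-Disposition value.
// \s in the old sanitizer matched CR/LF, letting documentName forge
// headers; restricting to [\w. -] closes that, and the RFC 5987
// filename* parameter carries the full UTF-8 name for modern clients.
function contentDispositionFor(documentName: string): string {
  const ascii = documentName.replace(/[^\w. -]+/g, '_').trim() || 'document';
  const utf8 = encodeURIComponent(documentName);
  return `attachment; filename="${ascii}"; filename*=UTF-8''${utf8}`;
}
```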

High:
- Recommender: width-only feasibility now caps length via L/W ratio so a
  200ft berth doesn't surface for a 30ft beam request; total_interest_count
  filters out junction rows whose interest is in another port.
- Mooring normalization follow-up migration (0034) catches un-hyphenated
  padded forms (A01) the original 0024 WHERE missed.
- Send-out rate limit moved AFTER validation and scoped per-(port, user)
  so typos don't burn a slot and a multi-port rep can't be DoS'd by
  another tenant.
- Default-brochure path now blocks an archived row from sneaking through
  the partial unique index.
- NocoDB import --update-snapshot honoured under --dry-run so reps can
  refresh the seed JSON without committing DB writes.
- PDF export: orderBy desc(expenseDate); apply isNull(archivedAt) when
  expenseIds are passed (was bypassed); flag rate-unavailable rows with
  an amber footer instead of silently treating them as 1:1; skip the
  USD->EUR chain when source already matches target.
- expense-form-dialog: revokeObjectURL captures the URL in the closure
  instead of revoking the still-displayed one; reset upload state on
  close.
- scan/page: handleClearReceipt resets in-flight scan/upload mutations;
  Save disabled while upload pending.
- updateExpense re-asserts receipt-or-acknowledgement at the merged
  row so PATCH can't slip past the create-time refine.
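The currency handling in the PDF-export bullet above (skip the conversion chain when source already matches target, flag a missing rate instead of assuming 1:1) can be sketched as follows; the function and type names are hypothetical, not the branch's actual service API:

```typescript
type RateLookup = (from: string, to: string) => number | null;

// Sketch: convert an expense amount, flagging (not faking) missing rates.
function convertAmount(
  amount: number,
  from: string,
  to: string,
  lookup: RateLookup,
): { value: number; rateUnavailable: boolean } {
  // Same currency: no conversion chain at all.
  if (from === to) return { value: amount, rateUnavailable: false };
  const rate = lookup(from, to);
  // A missing rate is surfaced (amber footer in the export), never
  // silently treated as 1:1.
  if (rate === null) return { value: amount, rateUnavailable: true };
  return { value: amount * rate, rateUnavailable: false };
}
```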

Plus the in-progress receipt upload UI for the expense form dialog
(receipt picker + "I have no receipt" checkbox + warning banner) and
a noReceiptAcknowledged flag on ExpenseRow for edit-mode hydration.

Includes the canonical plan doc (referenced in CLAUDE.md), the handoff
prompt, and a deferred-findings index for follow-up issues.

1163/1163 vitest passing. Typecheck clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Matt Ciaccio
2026-05-05 05:11:26 +02:00
parent 014bbe1923
commit 180912ba9f
20 changed files with 2015 additions and 101 deletions

.gitignore

@@ -50,3 +50,6 @@ docker-compose.override.yml
# Filesystem storage backend root (FilesystemBackend default location)
/storage/
# Local berth-PDF + brochure samples used as upload fixtures during dev.
/berth_pdf_example/


@@ -0,0 +1,84 @@
# Final audit deferred findings
The pre-merge audit on `feat/berth-recommender` produced ~30 findings. The
critical + high-severity items were fixed in-branch. The items below are
medium / low severity and deferred to follow-up issues so the merge isn't
held up. Each entry is self-contained — pick one off and ship it.
## Cross-cutting integration
- **EOI in-app pathway silently swallows missing `Berth Range` AcroForm field**
`src/lib/pdf/fill-eoi-form.ts:93`. `setText(form, 'Berth Range', ...)`
is wrapped in a try/catch that succeeds silently when the field is
absent. CLAUDE.md already warns ops about needing to add the field to
the live Documenso template; this code change would make the deployment
gap observable. Fix: when `context.eoiBerthRange` is non-empty AND the
field is absent, log at warn level + surface a structured response field.
- **Email body merge expansion happens after token validation** —
`src/lib/services/document-sends.service.ts:399-403`. If a merge value
contains a `{{token}}` substring (e.g. a client name like
`"Acme {{discount}} Inc."`), the expanded body will contain a token
the unresolved-check missed and ships with literal braces. Fix: HTML-
escape merge values before expansion, OR run a second
`findUnresolvedTokens` against the expanded body.
- **Filesystem dev-fallback HMAC secret can drift across processes** —
`src/lib/storage/filesystem.ts:328-331`. The dev-only fallback derives
the HMAC secret from `BETTER_AUTH_SECRET`. Two CRM processes running
with different secrets (web vs worker) reject each other's tokens.
Fix: assert `BETTER_AUTH_SECRET` is set when filesystem backend is
active in non-prod, or document the requirement loudly.
- **Berth PDF apply path: numeric column nulling silently drops** —
`src/lib/services/berth-pdf.service.ts:473-475`. When
`Number.isFinite(n)` is false the apply loop `continue`s without
pushing to `applied` and without warning. Combined with the
"no appliable fields supplied" check (only fires when ALL drop), partial
silent drops are invisible. Fix: collect dropped keys and surface them.
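For the email-body finding above, the proposed second-pass check could look like this sketch; `findUnresolvedTokens` and `expandBody` here are stand-ins for the real helpers in `document-sends.service.ts`:

```typescript
// Sketch: re-run the unresolved-token check against the *expanded* body
// so a merge value containing a "{{...}}" substring is caught instead of
// shipping with literal braces.
function findUnresolvedTokens(body: string): string[] {
  return [...body.matchAll(/\{\{\s*([\w.]+)\s*\}\}/g)].map((m) => m[1]);
}

function expandBody(template: string, values: Record<string, string>): string {
  const expanded = template.replace(
    /\{\{\s*([\w.]+)\s*\}\}/g,
    (_m: string, key: string) => values[key] ?? `{{${key}}}`,
  );
  const leftover = findUnresolvedTokens(expanded);
  if (leftover.length > 0) {
    throw new Error(`unresolved tokens after expansion: ${leftover.join(', ')}`);
  }
  return expanded;
}
```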
## Multi-tenant isolation hardening
- **document_sends row stores `interestId` without verifying port match** —
`src/lib/services/document-sends.service.ts:422`. Audit-log pollution
rather than data exposure (the recipient lookup is port-checked already).
Fix: when `recipient.interestId` is set, fetch with
`and(eq(interests.id, ...), eq(interests.portId, input.portId))` and
throw if missing.
- **Storage proxy token does not bind to port_id** —
`src/lib/storage/filesystem.ts:73-84`. ProxyTokenPayload is `{k, e, n,
f?, c?}` with a global HMAC. The current "issuer always checks port
first" relies on every issuer being correct in perpetuity. Fix: add a
`p` (portId) claim and have the proxy route resolve key→owner row +
assert `owner.portId === payload.p` before streaming.
- **Documenso webhook does not enforce port_id on document lookups** —
`src/app/api/webhooks/documenso/route.ts:96-148`. Handlers dispatch by
global `documensoId`. If two ports' documents were ever issued the
same Documenso ID (replay across staging/prod, forwarded webhook from
a foreign instance), the wrong port's interest could be mutated. The
per-body `signatureHash` dedup is partial mitigation. Fix: either
(a) include the originating Documenso instance/team in the lookup, or
(b) verify `documents(documenso_id)` has a unique index port-wide.
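The `p` (portId) claim proposed above could be wired in roughly like this; the token envelope, field names beyond the existing `{k, e, n, f?, c?}` shape, and the sign/verify helpers are assumptions for illustration, not the filesystem backend's actual implementation:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

interface ProxyTokenPayload {
  k: string; // storage key
  e: number; // expiry (unix seconds)
  p: string; // portId claim (the proposed addition)
}

function signToken(payload: ProxyTokenPayload, secret: string): string {
  const body = Buffer.from(JSON.stringify(payload)).toString('base64url');
  const mac = createHmac('sha256', secret).update(body).digest('base64url');
  return `${body}.${mac}`;
}

// The proxy route resolves key -> owner row, then asserts the owner's
// portId matches the token's claim before streaming.
function verifyToken(token: string, secret: string, ownerPortId: string): ProxyTokenPayload | null {
  const [body, mac] = token.split('.');
  if (!body || !mac) return null;
  const expected = createHmac('sha256', secret).update(body).digest('base64url');
  const a = Buffer.from(mac);
  const b = Buffer.from(expected);
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;
  const payload = JSON.parse(Buffer.from(body, 'base64url').toString()) as ProxyTokenPayload;
  if (payload.e < Date.now() / 1000) return null;
  if (payload.p !== ownerPortId) return null; // port binding
  return payload;
}
```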
## Recent expense work polish
- **renderReceiptHeader cursor math drifts after multi-step writes** —
`src/lib/services/expense-pdf.service.ts:854`. After
`doc.text(...)` with auto-flow, `doc.y` advances. Using `doc.y -
headerH + 10` after the rect+stroke block computes against the
post-rect position; works only because pdfkit's text-after-rect
hasn't moved y yet. Headers may misalign on the first receipt page
after a soft page break. Fix: capture `const baseY = doc.y` before
drawing the rect and compute all subsequent offsets relative to it.
## Settings parsing
- **`loadRecommenderSettings` rejects string-shaped JSONB booleans** —
`src/lib/services/berth-recommender.service.ts:116`. Postgres returns
JSONB `true/false` as JS booleans, but if an admin saves `"true"`
via a UI that wraps the value as a string, `asBool` returns null and
the per-port override silently falls through to defaults. Not a
security bug; a tuning footgun. Fix: accept `"true"`/`"false"` string
forms in `asBool`.
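The proposed widening of `asBool` is small enough to sketch in full; this mirrors the fix described above, not the current code:

```typescript
// Sketch: accept string-shaped JSONB booleans ("true"/"false") alongside
// real booleans. Anything else still returns null so the per-port
// override falls through to defaults.
function asBool(value: unknown): boolean | null {
  if (typeof value === 'boolean') return value;
  if (value === 'true') return true;
  if (value === 'false') return false;
  return null;
}
```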


@@ -0,0 +1,147 @@
# Handoff prompt for new Claude Code session
Copy everything below the `---` line into the new chat as your first message.
---
I'm continuing work on a comprehensive multi-feature push that was fully designed in a prior session but not yet implemented. The complete plan lives at `docs/berth-recommender-and-pdf-plan.md` (~1030 lines). **Read that file end-to-end before doing anything else — every design decision, schema change, edge case, and confirmed answer to a product question is captured there.** Don't re-litigate decisions; if something seems unclear, the answer is almost certainly in the plan.
## What the project is
A multi-tenant marina/port-management CRM at `/Users/matt/Repos/new-pn-crm`. Next.js 15 App Router, React 19, TypeScript strict, Drizzle ORM on Postgres, MinIO for files, BullMQ on Redis, better-auth, shadcn/ui, Tailwind. See `CLAUDE.md` for the conventions.
## What we're building (high level)
The plan bundles 8 capabilities into one branch (`feat/berth-recommender`):
1. **/clients + /interests list-column fix** (the original bug — list views show `-` everywhere because the service didn't join contacts/yachts)
2. **Full NocoDB Berths import** + seeding + mooring-number normalization (current CRM has `A-01..E-18`; canonical is `A1..E18`)
3. **Schema refactor** to many-to-many `interest_berths` with role flags (`is_primary`, `is_specific_interest`, `is_in_eoi_bundle`)
4. **Berth recommender** (SQL ranking, tier ladder, heat scoring, UI panel) — no AI; pure SQL
5. **EOI bundle** support (multi-berth EOIs + range formatter for the Documenso PDF: `["A1","A2","A3","B5","B6"]` → `"A1-A3, B5-B6"`)
6. **Pluggable storage backend** (s3-compatible OR local filesystem) so admins can run without MinIO if they want
7. **Per-berth PDFs** (versioned uploads, OCR-based reverse parser, conflict-resolution diff dialog)
8. **Sales send-out emails** (berth PDF + brochure) with full audit + size-aware fallback to download links
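The EOI range formatter in item 5 can be sketched as below, assuming the `letter + number` mooring format Phase 0 normalizes to (`A1..E18`); the real implementation in the plan may differ in naming and edge handling:

```typescript
// Sketch: collapse a list of moorings into per-pier consecutive runs,
// e.g. ["A1","A2","A3","B5","B6"] -> "A1-A3, B5-B6".
function formatBerthRange(moorings: string[]): string {
  const parsed = moorings
    .map((m) => /^([A-Z])(\d+)$/.exec(m))
    .filter((m): m is RegExpExecArray => m !== null)
    .map((m) => ({ pier: m[1], num: Number(m[2]) }))
    .sort((a, b) => a.pier.localeCompare(b.pier) || a.num - b.num);

  const runs: Array<{ pier: string; start: number; end: number }> = [];
  for (const b of parsed) {
    const last = runs[runs.length - 1];
    if (last && last.pier === b.pier && b.num === last.end + 1) {
      last.end = b.num; // extend the current consecutive run
    } else {
      runs.push({ pier: b.pier, start: b.num, end: b.num });
    }
  }
  return runs
    .map((r) => (r.start === r.end ? `${r.pier}${r.start}` : `${r.pier}${r.start}-${r.pier}${r.end}`))
    .join(', ');
}
```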
## Phase ordering (from plan §2)
```
Phase 0: Full NocoDB berth import + mooring normalization + 5 new pricing columns
Phase 1: /clients + /interests list column fix
Phase 2: M:M interest_berths schema refactor + desired dimensions on interests
Phase 3: CRM /api/public/berths endpoint + website cutover
Phase 4: Recommender SQL + tier ladder + heat + UI panel
Phase 5: EOI bundle + range formatter
Phase 6a: Pluggable storage backend + migration CLI + admin UI
Phase 6b: Per-berth PDF storage (versioned) + reverse parser
Phase 7: Sales send-outs + brochure admin + email-from settings
Phase 8: CLAUDE.md updates + final validation
```
**Start with Phase 0**.
## Working tree state at handoff
- Branch: `main` (you'll create `feat/berth-recommender` from here)
- Recent commits (already pushed):
- `8699f81 chore(style): codebase em-dash sweep + minor layout polish`
- `d62822c fix(migration): NocoDB import safety + dedup helpers + lead-source backfill`
- `089f4a6 feat(receipts): upload guide page + scanner head-tag fix`
- `77ad10c feat(dashboard): custom date range + KPI port-hydration gate`
- `e598cc0 feat(layout): unified Inbox + UserMenu extraction`
- `f5772ce feat(analytics): Umami integration with per-port admin settings`
- `49d34e0 feat(website-intake): dual-write endpoint + migration chain repair`
- Untracked / uncommitted at handoff:
- `docs/berth-recommender-and-pdf-plan.md` (the plan — read this first)
- `docs/berth-feature-handoff-prompt.md` (this file)
- `berth_pdf_example/` (two reference files — see below)
- `.env.example` (modified — adds `WEBSITE_INTAKE_SECRET=`; pre-commit hook blocks `.env*` files so user adds this manually)
- Dev DB state:
- 245 clients (210 with no `nationality_iso` — Phase 1 backfills from primary phone's `value_country`)
- 4 test rows in `website_submissions` (from a previous live audit; safe to ignore)
- 90 berths with `mooring_number` in `A-01` format (Phase 0 normalizes to `A1`)
- vitest: 956 tests passing
- tsc: clean (one pre-existing issue in `scripts/smoke-test-redirect.ts` that's unrelated)
## Reference files
- `berth_pdf_example/Berth_Spec_Sheet_A1.pdf` (358 KB) — sample per-berth PDF. **0 AcroForm fields** (confirmed via pdf-lib) so OCR with positional heuristics is the primary parser tier; the AcroForm tier is built defensively. Plan §9.2 captures the layout structure.
- `berth_pdf_example/Port-Nimara-Brochure-March-2025_5nT92g.pdf` (10.26 MB) — sample brochure. Sized so it ships as an attachment under the 15 MB threshold. Plan §11.1 covers brochure handling.
## NocoDB access
You have `mcp__NocoDB_Base_-_Port_Nimara__*` tools available. Tables you'll touch most:
- `mczgos9hr3oa9qc` — Berths (Phase 0 imports from here; mooring numbers are stored as `A1..E18`)
- `mbs9hjauug4eseo` — Interests (the combined client+deal table the old system used)
## Branch & commit conventions
- Create the branch: `git checkout -b feat/berth-recommender`
- Commit messages match recent history style: `<type>(<scope>): <subject>` lowercase, terse subject, body explains why not what.
- **Pre-commit hook blocks any `.env*` file** including `.env.example`. If you need to update `.env.example`, leave it staged and tell the user to commit manually with `--no-verify` (they're aware of this).
- **Don't push without explicit user permission.** Commits are fine; pushes need approval.
- **Don't run `git rebase`, `git push --force`, or anything destructive without checking.** The branch is solo-owned but the repo's `main` is shared.
## User communication preferences (from prior session)
- Direct, no fluff. If something is a bad idea, say so; don't be sycophantic.
- When proposing changes, include trade-offs explicitly.
- For multi-question decisions, use `AskUserQuestion` rather than long bulleted lists.
- Run validation (vitest + tsc) at logical checkpoints. Don't ship a commit with regressions.
- The user prefers small focused commits over mega-commits. Within Phase 0 alone there will probably be 2-3 commits (e.g. mooring normalization, schema additions, NocoDB import script).
## Critical rules (from plan §14)
Eleven 🔴 critical items requiring tests before their phase ships:
1. NocoDB mooring collisions → unique constraint + ON CONFLICT
2. Non-PDF disguised upload → magic-byte check
3. Recipient email typos → pre-send confirmation
4. XSS in email body markdown → DOMPurify + payload tests
5. SMTP credentials silently failing → loud error + failed `document_sends` row
6. Wrong-environment `CRM_PUBLIC_URL` → health-check env match
7. Mooring format drift breaking `/berths/A1` URLs → Phase 0 normalization gates Phase 3
8. Multi-port isolation in recommender → explicit `port_id` filter + cross-port test
9. Permission escalation on SMTP creds → per-port admin only, no rep visibility
10. Filesystem backend in multi-node deployment → refuse to start; documented + health-check enforced
11. Path traversal via storage key in filesystem mode → strict regex validation + path realpath check
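Item 11 (path traversal via storage key) amounts to a guard like the sketch below; `STORAGE_ROOT` and the key regex are assumptions, and the real implementation additionally uses a realpath check to defeat symlinks:

```typescript
import { resolve, sep } from 'node:path';

// Assumed root for illustration only.
const STORAGE_ROOT = resolve('/var/lib/crm/storage');
// Strict key shape: lowercase alnum start, then a conservative set.
const KEY_RE = /^[a-z0-9][a-z0-9/_.-]{0,255}$/;

function resolveStorageKey(key: string): string {
  if (!KEY_RE.test(key) || key.includes('..')) {
    throw new Error(`invalid storage key: ${key}`);
  }
  const abs = resolve(STORAGE_ROOT, key);
  // Belt and braces: the resolved path must stay inside the root.
  if (abs !== STORAGE_ROOT && !abs.startsWith(STORAGE_ROOT + sep)) {
    throw new Error(`storage key escapes root: ${key}`);
  }
  return abs;
}
```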
## Pending items (from plan §9)
These are non-blocking but worth knowing:
- Sample brochure already provided (the 10.26 MB file above).
- SMTP app password for `sales@portnimara.com` — not yet obtained; expected close to production cutover. Phase 7 ships the admin UI immediately and the credential gets entered when available.
- `CRM_PUBLIC_URL` confirmed as `https://crm.portnimara.com` once live; configurable via env.
- GDPR cascade behavior for `document_sends` (delete vs. anonymize-PII vs. keep) — left `OPEN` in §14.10, default lean: anonymize-PII. Revisit when Phase 7 schema lands.
## Scope reminder
- **No prod data depends on the current CRM schema** — refactors don't need backwards-compatibility shims. But every schema change still ships as a Drizzle migration with `pnpm db:generate`.
- **Pluggable storage** rejects Postgres `bytea` as an option (§4.7a). The two backends are s3-compatible (MinIO/AWS/B2/R2/etc.) and local filesystem. Filesystem is single-node only.
## What to do first
1. Read `docs/berth-recommender-and-pdf-plan.md` end-to-end. Don't skim. The edge-case audit in §14 alone is critical context.
2. Confirm you've understood the plan by stating back the 8-phase outline and the 11 critical items, then ask the user if they want to proceed with Phase 0.
3. Once approved, create `feat/berth-recommender` and start Phase 0.
Phase 0 deliverables (per plan):
- One commit normalizing existing CRM mooring numbers from `A-01``A1` form (via `regexp_replace` migration). Delete the offending `scripts/load-berths-to-port-nimara.ts`.
- One commit adding the 5 new pricing columns (`weekly_rate_high_usd`, `weekly_rate_low_usd`, `daily_rate_high_usd`, `daily_rate_low_usd`, `pricing_valid_until`) plus `last_imported_at`. Run `pnpm db:generate`. Verify `meta/_journal.json` prevId chain stays contiguous.
- One commit adding `scripts/import-berths-from-nocodb.ts` — the idempotent NocoDB import (handles updates, preserves CRM-side edits via `last_imported_at vs updated_at` check, `pg_advisory_lock`, dry-run flag, etc. per §4.1 and §14.1).
- Update `src/lib/db/seed-data.ts` with the imported berth set so fresh installs get them.
- Final vitest + tsc validation at the end of Phase 0.
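For orientation, the normalization transform (the SQL lives in the `regexp_replace` migration) behaves like this TypeScript mirror; the helper name is illustrative:

```typescript
// Sketch: strip the hyphen and leading zeros from padded mooring forms,
// so "A-01" and "A01" both normalize to "A1". Non-matching strings are
// returned unchanged.
function normalizeMooring(raw: string): string {
  return raw
    .toUpperCase()
    .replace(/^([A-Z])-?0*(\d+)$/, (_m: string, pier: string, num: string) => `${pier}${Number(num)}`);
}
```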
## Don't
- Don't push to remote during this session (user will batch the push later).
- Don't commit `.env*` files (hook blocks them anyway).
- Don't edit `.gitignore` to exclude generated artifacts; the repo's existing ignores are correct.
- Don't add documentation files unless the plan asks for them — the plan itself is the doc.
- Don't add features not in the plan. If something seems missing, ask.
- Don't use AI for the recommender (plan §1 + §13). Pure SQL ranking.
Once you've read the plan and confirmed understanding, ask me whether to proceed with Phase 0.

File diff suppressed because it is too large.


@@ -374,8 +374,17 @@ async function main(): Promise<void> {
if (orphans.length > 10) console.log(` …and ${orphans.length - 10} more`);
}
// Snapshot write is independent of DB writes — even in --dry-run mode
// a rep may want to refresh the seed JSON to capture the latest NocoDB
// shape without committing to the DB import. The original gate dropped
// this silently when --dry-run was passed; audit caught it.
if (args.updateSnapshot) {
const written = await writeSnapshot(dedup);
console.log(`> Wrote ${dedup.length} rows to ${path.relative(process.cwd(), written)}`);
}
if (args.dryRun) {
console.log(`\n[dry-run] no DB writes performed.`);
return;
}
@@ -390,11 +399,6 @@ async function main(): Promise<void> {
for (const w of result.warnings.slice(0, 20)) console.log(` - ${w}`);
if (result.warnings.length > 20) console.log(` …and ${result.warnings.length - 20} more`);
}
if (args.updateSnapshot) {
const written = await writeSnapshot(dedup);
console.log(`> Wrote ${dedup.length} rows to ${path.relative(process.cwd(), written)}`);
}
}
main() main()


@@ -32,14 +32,13 @@ async function main() {
const nodemailer = await import('nodemailer');
const captured: Array<{ to: unknown; subject: unknown; from: unknown }> = [];
const originalCreateTransport = nodemailer.default.createTransport;
nodemailer.default.createTransport = (() => ({
// eslint-disable-next-line @typescript-eslint/no-explicit-any
sendMail: async (msg: any) => {
captured.push({ to: msg.to, subject: msg.subject, from: msg.from });
return { messageId: '<smoke@test>', accepted: [msg.to], rejected: [] };
},
})) as unknown as typeof nodemailer.default.createTransport;
// Now import sendEmail (gets the patched transporter).
const { sendEmail } = await import('@/lib/email');
@@ -55,7 +54,6 @@ async function main() {
await sendEmail(realClientEmail, realSubject, '<p>Body unused for this smoke.</p>');
// Restore the original transport (be a good citizen).
nodemailer.default.createTransport = originalCreateTransport;
console.log('[smoke] captured outbound message:');


@@ -3,7 +3,7 @@
import { useEffect, useRef, useState } from 'react';
import { useParams, useRouter } from 'next/navigation';
import { useMutation } from '@tanstack/react-query';
import { Camera, Loader2, ScanLine, Upload, X } from 'lucide-react';
import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider';
@@ -30,6 +30,11 @@ interface ScanResult {
confidence: number;
}
interface UploadedFileMeta {
id: string;
filename: string;
}
export default function ScanReceiptPage() {
const params = useParams<{ portSlug: string }>();
const router = useRouter();
@@ -38,6 +43,13 @@ export default function ScanReceiptPage() {
const cameraInputRef = useRef<HTMLInputElement>(null);
const [scanResult, setScanResult] = useState<ScanResult | null>(null);
const [previewUrl, setPreviewUrl] = useState<string | null>(null);
// After OCR succeeds we also upload the receipt to /api/v1/files/upload
// so the expense links to the actual image. The legacy scanner skipped
// this step and saved expenses without their receipt — which silently
// disqualified them from parent-company reimbursement (the warning the
// PDF export now surfaces).
const [uploadedFile, setUploadedFile] = useState<UploadedFileMeta | null>(null);
const [pendingFile, setPendingFile] = useState<File | null>(null);
const { setChrome } = useMobileChrome();
useEffect(() => {
@@ -74,6 +86,29 @@ export default function ScanReceiptPage() {
},
});
// Uploads the receipt image to /api/v1/files/upload (category=receipt)
// so the new expense row can link to it via receiptFileIds. Runs in
// parallel with the OCR scan so the rep can keep editing fields while
// the upload completes.
const uploadMutation = useMutation({
mutationFn: async (file: File): Promise<UploadedFileMeta> => {
const formData = new FormData();
formData.append('file', file);
formData.append('category', 'receipt');
const res = await fetch('/api/v1/files/upload', {
method: 'POST',
body: formData,
credentials: 'include',
});
if (!res.ok) throw new Error('Receipt upload failed');
const json = (await res.json()) as { data: { id: string; filename: string } };
return { id: json.data.id, filename: json.data.filename };
},
onSuccess: (meta) => {
setUploadedFile(meta);
},
});
const saveMutation = useMutation({
mutationFn: () =>
apiFetch('/api/v1/expenses', {
@@ -85,6 +120,9 @@ export default function ScanReceiptPage() {
category: category || undefined,
expenseDate: date ? new Date(date) : new Date(),
paymentStatus: 'unpaid',
receiptFileIds: uploadedFile ? [uploadedFile.id] : undefined,
// The scanner path always has a receipt (we wouldn't have OCR'd
// it otherwise), so we never need the no-receipt flag here.
},
}),
onSuccess: () => {
@@ -95,12 +133,32 @@ export default function ScanReceiptPage() {
function handleFileChange(e: React.ChangeEvent<HTMLInputElement>) {
const file = e.target.files?.[0];
if (!file) return;
setPendingFile(file);
const url = URL.createObjectURL(file);
setPreviewUrl(url);
// Kick off OCR scan + storage upload concurrently. The two are
// independent server calls and the rep is staring at the preview
// while both run.
scanMutation.mutate(file);
uploadMutation.mutate(file);
}
function handleClearReceipt() {
if (previewUrl) URL.revokeObjectURL(previewUrl);
setPreviewUrl(null);
setUploadedFile(null);
setPendingFile(null);
setScanResult(null);
// Reset in-flight mutations so a late onSuccess doesn't repopulate
// the form against an already-cleared UI (audit finding: stale
// receipt could land on the next Save).
scanMutation.reset();
uploadMutation.reset();
if (fileInputRef.current) fileInputRef.current.value = '';
if (cameraInputRef.current) cameraInputRef.current.value = '';
}
void pendingFile;
return (
<div className="max-w-2xl mx-auto space-y-6">
<div className="hidden sm:block">
@@ -119,18 +177,45 @@ export default function ScanReceiptPage() {
</CardHeader>
<CardContent>
{previewUrl ? (
<div className="space-y-2">
<div className="relative border-2 border-dashed rounded-lg p-4 text-center bg-muted/20">
<img
src={previewUrl}
alt="Receipt preview"
className="max-h-64 mx-auto rounded object-contain"
/>
<button
type="button"
onClick={handleClearReceipt}
aria-label="Remove receipt"
className="absolute top-2 right-2 rounded-full bg-background/80 hover:bg-background border p-1.5 shadow-sm"
>
<X className="h-4 w-4" />
</button>
</div>
<div className="flex flex-wrap items-center gap-2 text-xs text-muted-foreground">
{uploadMutation.isPending && (
<span className="inline-flex items-center gap-1">
<Loader2 className="h-3 w-3 animate-spin" /> Uploading receipt&hellip;
</span>
)}
{uploadedFile && (
<span className="text-emerald-600">
Receipt uploaded ({uploadedFile.filename})
</span>
)}
{uploadMutation.isError && (
<span className="text-destructive">
Receipt upload failed; save will still create the expense without an image.
</span>
)}
</div>
</div>
) : (
<div className="grid gap-2 sm:grid-cols-2">
{/* Camera button — available on mobile devices that surface the
built-in capture flow when an `image/*` input has the
`capture` attribute. Hidden on desktop where it's a no-op. */}
<Button
type="button"
size="lg"
@@ -140,6 +225,8 @@ export default function ScanReceiptPage() {
<Camera className="mr-2 h-5 w-5" />
Take photo
</Button>
{/* File picker — works on every platform. Phrased so the copy
fits both mobile (library/files) and desktop (drag and drop). */}
<Button
type="button"
variant="outline"
@@ -148,18 +235,30 @@ export default function ScanReceiptPage() {
onClick={() => fileInputRef.current?.click()}
>
<Upload className="mr-2 h-5 w-5" />
<span className="sm:hidden">Choose from device</span>
<span className="hidden sm:inline">Choose from device or drag and drop</span>
</Button>
<p className="text-xs text-muted-foreground sm:col-span-2 text-center">
JPEG, PNG, HEIC, WebP up to 10 MB
</p>
<p className="text-xs text-muted-foreground sm:col-span-2 text-center">
Have many receipts?{' '}
<a
href={`/${params.portSlug}/expenses/bulk-upload`}
className="text-primary hover:underline"
>
Bulk upload &rarr;
</a>
</p>
</div>
)}
{/* `image/*` is the broadest accept — includes HEIC on iOS,
JPEG/PNG/WebP everywhere. The capture attribute on the second
input invokes the native camera flow on mobile. */}
<input
ref={fileInputRef}
type="file"
accept="image/*,application/pdf"
className="hidden"
onChange={handleFileChange}
/>
@@ -264,10 +363,20 @@ export default function ScanReceiptPage() {
</Button>
<Button
onClick={() => saveMutation.mutate()}
disabled={
saveMutation.isPending ||
!amount ||
// Block save while the receipt upload is still in flight —
// otherwise the rep can hit Save before the storage round
// trip finishes and the expense lands without `receiptFileIds`,
// silently re-creating the legacy receipt-loss bug.
uploadMutation.isPending
}
>
{(saveMutation.isPending || uploadMutation.isPending) && (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
)}
{uploadMutation.isPending ? 'Uploading…' : 'Save as Expense'}
</Button>
</div>
</CardContent>


@@ -16,6 +16,10 @@ import { toPublicBerth } from '@/lib/services/public-berths';
* ("A1", "B12") - Phase 0 normalized the entire CRM dataset.
*/
// Hard-coded allowlist for the public read-only feed. Adding a port here
// is a deliberate decision (not silent enumeration via ?portSlug=), so a
// future private tenant can't be exposed by accident.
const PUBLIC_PORT_SLUGS = new Set(['port-nimara']);
const DEFAULT_PUBLIC_PORT_SLUG = 'port-nimara';
const RESPONSE_HEADERS = {
'cache-control': 'public, s-maxage=300, stale-while-revalidate=60',
@@ -30,7 +34,14 @@ export async function GET(
): Promise<Response> {
const { mooringNumber } = await ctx.params;
const url = new URL(request.url);
const requestedSlug = url.searchParams.get('portSlug') ?? DEFAULT_PUBLIC_PORT_SLUG;
if (!PUBLIC_PORT_SLUGS.has(requestedSlug)) {
return NextResponse.json(
{ error: 'port is not part of the public berths feed', portSlug: requestedSlug },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
const portSlug = requestedSlug;
// Reject obviously malformed mooring numbers up front so cache poisoning
// / random-URL probing returns 400 rather than 404 (saves a DB hit).

View File

@@ -25,6 +25,10 @@ import { toPublicBerth, type PublicBerth } from '@/lib/services/public-berths';
* them up.
*/
// Hard-coded allowlist for the public read-only feed. Adding a port here
// is a deliberate decision (not silent enumeration via ?portSlug=), so a
// future private tenant can't be exposed by accident.
const PUBLIC_PORT_SLUGS = new Set(['port-nimara']);
const DEFAULT_PUBLIC_PORT_SLUG = 'port-nimara';
const RESPONSE_HEADERS = {
@@ -45,7 +49,14 @@ interface ListResponse {
export async function GET(request: Request): Promise<Response> {
const url = new URL(request.url);
const requestedSlug = url.searchParams.get('portSlug') ?? DEFAULT_PUBLIC_PORT_SLUG;
if (!PUBLIC_PORT_SLUGS.has(requestedSlug)) {
return NextResponse.json(
{ error: 'port is not part of the public berths feed', portSlug: requestedSlug },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
const portSlug = requestedSlug;
const [port] = await db
.select({ id: ports.id })

View File

@@ -20,10 +20,12 @@ import { Readable } from 'node:stream';
import { NextRequest, NextResponse } from 'next/server';
import { MAX_FILE_SIZE } from '@/lib/constants/file-validation';
import { logger } from '@/lib/logger';
import { redis } from '@/lib/redis';
import { FilesystemBackend, getStorageBackend } from '@/lib/storage';
import { verifyProxyToken } from '@/lib/storage/filesystem';
import { isPdfMagic } from '@/lib/services/berth-pdf-parser';
export const runtime = 'nodejs';
export const dynamic = 'force-dynamic';
@@ -115,3 +117,120 @@ export async function GET(
return new NextResponse(webStream, { status: 200, headers });
}
/**
* Filesystem-backend upload proxy. The presigned URL minted by
* `FilesystemBackend.presignUpload` points here. Without this handler the
* browser-driven berth-PDF / brochure uploads would 405 in filesystem
* deployments — the entire pluggable-storage abstraction relied on the
* GET-only counterpart for downloads.
*
* Same token-verify + single-use replay protection as GET, plus:
* - Hard size cap (rejects oversized bodies before any disk I/O).
* - Magic-byte check when the issuer declared content-type=application/pdf
* (matches the §14.6 §6c/§7c invariant: every upload path verifies
* bytes server-side, not just at the client).
*/
export async function PUT(
req: NextRequest,
ctx: { params: Promise<{ token: string }> },
): Promise<NextResponse> {
const { token } = await ctx.params;
const backend = await getStorageBackend();
if (!(backend instanceof FilesystemBackend)) {
return NextResponse.json(
{ error: 'Storage proxy is only available in filesystem mode' },
{ status: 404 },
);
}
const result = verifyProxyToken(token, backend.getHmacSecret());
if (!result.ok) {
logger.warn({ reason: result.reason }, 'Storage proxy upload token rejected');
return NextResponse.json({ error: 'Invalid or expired token' }, { status: 403 });
}
const { payload } = result;
// Separate replay namespace from GET so a token can validly serve one
// upload AND one download (the issuer only mints the second), but a
// PUT cannot be replayed against itself.
const replayKey = `storage:proxy:put:${token.split('.')[0]}`;
const remainingSeconds = Math.max(
REPLAY_TTL_FLOOR_SECONDS,
Math.min(REPLAY_TTL_CEILING_SECONDS, payload.e - Math.floor(Date.now() / 1000) + 60),
);
const setOk = await redis.set(replayKey, '1', 'EX', remainingSeconds, 'NX');
if (setOk !== 'OK') {
logger.warn({ key: payload.k }, 'Storage proxy upload token replay rejected');
return NextResponse.json({ error: 'Token already used' }, { status: 403 });
}
// Pre-flight size check via Content-Length so a malicious caller can't
// exhaust disk by streaming hundreds of MB before we look at the body.
const contentLengthHeader = req.headers.get('content-length');
const contentLength = contentLengthHeader ? Number(contentLengthHeader) : NaN;
if (Number.isFinite(contentLength) && contentLength > MAX_FILE_SIZE) {
return NextResponse.json(
{ error: `File exceeds ${MAX_FILE_SIZE} byte cap (Content-Length: ${contentLength})` },
{ status: 413 },
);
}
if (!req.body) {
return NextResponse.json({ error: 'Empty body' }, { status: 400 });
}
// Read the body into a buffer with a hard cap. Filesystem deployments are
// small-tenant (single-node only — see FilesystemBackend boot guard) so
// 50 MB ceiling fits comfortably in heap; no streaming needed.
let buffer: Buffer;
try {
const chunks: Buffer[] = [];
let total = 0;
const reader = req.body.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) break;
total += value.byteLength;
if (total > MAX_FILE_SIZE) {
try {
await reader.cancel();
} catch {
/* ignore */
}
return NextResponse.json(
{ error: `File exceeds ${MAX_FILE_SIZE} byte cap` },
{ status: 413 },
);
}
chunks.push(Buffer.from(value));
}
buffer = Buffer.concat(chunks);
} catch (err) {
logger.warn({ err, key: payload.k }, 'Storage proxy upload read failed');
return NextResponse.json({ error: 'Upload read failed' }, { status: 400 });
}
// Magic-byte gate: when the token was minted with `c=application/pdf`
// (the only consumer today — berth PDFs + brochures), refuse anything
// that isn't actually a PDF. Mirrors the post-upload check in
// berth-pdf.service.ts so the two paths behave identically.
if (payload.c === 'application/pdf' && !isPdfMagic(buffer)) {
return NextResponse.json(
{ error: 'Uploaded file failed PDF magic-byte check (does not start with %PDF-).' },
{ status: 400 },
);
}
try {
await backend.put(payload.k, buffer, {
contentType: payload.c ?? 'application/octet-stream',
});
} catch (err) {
logger.error({ err, key: payload.k }, 'Storage proxy upload write failed');
return NextResponse.json({ error: 'Upload write failed' }, { status: 500 });
}
return NextResponse.json({ ok: true, key: payload.k, sizeBytes: buffer.length }, { status: 200 });
}
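The magic-byte gate above leans on `isPdfMagic` from the berth-PDF parser. As a rough sketch only — the name `isPdfMagicSketch` below is illustrative, and the real export may differ — the check needs nothing beyond the first five bytes:

```typescript
// Hedged sketch of a %PDF- magic-byte check (not the real
// berth-pdf-parser implementation). Every conforming PDF begins with
// the ASCII signature "%PDF-".
function isPdfMagicSketch(buf: Buffer): boolean {
  return buf.length >= 5 && buf.subarray(0, 5).toString('latin1') === '%PDF-';
}
```

Keeping the gate byte-based, rather than trusting the declared Content-Type, is what makes the "verify bytes server-side" invariant hold on this path too.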

View File

@@ -49,18 +49,25 @@ export const POST = withAuth(
}
: undefined,
options: input.options,
// Forward the request abort signal so the streaming PDF builder
// stops fetching/resizing receipts the moment the client disconnects
// (otherwise an aborted 1000-receipt export keeps the worker busy
// for minutes after the user navigated away — see audit finding 2).
signal: req.signal,
});
// Content-Disposition filename hardening: the validator caps length
// but `\s` matches CR/LF, which would let an attacker forge response
// headers. Strip everything that isn't word/space/dot/dash, AND set
// the RFC 5987 `filename*` so a UTF-8 body still survives.
const safeFilename = suggestedFilename.replace(/[^\w. \-]+/g, '_');
const disposition = `attachment; filename="${safeFilename}"; filename*=UTF-8''${encodeURIComponent(suggestedFilename)}`;
return new NextResponse(stream, {
status: 200,
headers: {
'Content-Type': 'application/pdf',
'Content-Disposition': disposition,
// The PDF is generated on the fly per-request and includes
// potentially-sensitive expense data; never cache.
'Cache-Control': 'private, no-store, max-age=0',
'X-Content-Type-Options': 'nosniff',
},
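The dual `filename` / `filename*` construction can be exercised in isolation; the helper name below is illustrative, but the allowlist and RFC 5987 encoding match the diff:

```typescript
// Sketch of the hardened Content-Disposition builder described above:
// an allowlist ([\w. -]) for the quoted ASCII filename (a \s-based
// denylist would still pass CR/LF), plus an RFC 5987 filename*
// parameter carrying the original UTF-8 name percent-encoded.
function buildContentDisposition(filename: string): string {
  const safe = filename.replace(/[^\w. \-]+/g, '_');
  return `attachment; filename="${safe}"; filename*=UTF-8''${encodeURIComponent(filename)}`;
}
```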

View File

@@ -27,6 +27,7 @@ export interface ExpenseRow {
description: string | null;
payer: string | null;
receiptFileIds: string[] | null;
noReceiptAcknowledged?: boolean;
archivedAt: string | null;
createdAt: string;
/** Set by the dedup engine when this expense looks like a duplicate of another. */

View File

@@ -1,12 +1,13 @@
'use client';
import { useEffect, useRef, useState } from 'react';
import { useForm } from 'react-hook-form';
import { zodResolver } from '@hookform/resolvers/zod';
import { useMutation, useQueryClient } from '@tanstack/react-query';
import { AlertTriangle, Loader2, Upload, X } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Checkbox } from '@/components/ui/checkbox';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
import { Textarea } from '@/components/ui/textarea';
@@ -17,18 +18,17 @@ import {
SelectTrigger,
SelectValue,
} from '@/components/ui/select';
import { Sheet, SheetContent, SheetHeader, SheetTitle, SheetFooter } from '@/components/ui/sheet';
import { apiFetch } from '@/lib/api/client';
import { createExpenseSchema, type CreateExpenseInput } from '@/lib/validators/expenses';
import { EXPENSE_CATEGORIES, PAYMENT_METHODS } from '@/lib/constants';
import type { ExpenseRow } from './expense-columns';
interface UploadedReceipt {
id: string;
filename: string;
}
interface ExpenseFormDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
@@ -38,6 +38,12 @@ interface ExpenseFormDialogProps {
export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDialogProps) {
const queryClient = useQueryClient();
const isEdit = !!expense;
const fileInputRef = useRef<HTMLInputElement>(null);
const [uploadedReceipt, setUploadedReceipt] = useState<UploadedReceipt | null>(null);
const [previewUrl, setPreviewUrl] = useState<string | null>(null);
const [noReceipt, setNoReceipt] = useState(false);
const [uploadError, setUploadError] = useState<string | null>(null);
const [isUploading, setIsUploading] = useState(false);
const {
register,
@@ -65,15 +71,47 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
expenseDate: new Date(expense.expenseDate),
paymentStatus: (expense.paymentStatus as CreateExpenseInput['paymentStatus']) ?? 'unpaid',
});
setUploadedReceipt(null);
setPreviewUrl(null);
setNoReceipt(Boolean(expense.noReceiptAcknowledged));
setUploadError(null);
} else if (open && !expense) {
reset({
currency: 'USD',
paymentStatus: 'unpaid',
expenseDate: new Date(),
});
setUploadedReceipt(null);
setPreviewUrl(null);
setNoReceipt(false);
setUploadError(null);
}
}, [open, expense, reset]);
// Capture the URL inside the effect closure so the cleanup revokes the
// URL it observed at mount, not the one captured by a later render.
// Audit caught a bug where the cleanup ran on every change and revoked
// the URL that was still being shown.
useEffect(() => {
const url = previewUrl;
return () => {
if (url) URL.revokeObjectURL(url);
};
}, [previewUrl]);
// Reset upload state whenever the sheet closes — re-opening on the same
// expense was carrying stale state from the prior session.
useEffect(() => {
if (!open) {
setUploadedReceipt(null);
setPreviewUrl(null);
setNoReceipt(false);
setUploadError(null);
setIsUploading(false);
if (fileInputRef.current) fileInputRef.current.value = '';
}
}, [open]);
const mutation = useMutation({
mutationFn: (data: CreateExpenseInput) => {
if (isEdit) {
@@ -90,9 +128,51 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
},
});
async function handleFileChange(e: React.ChangeEvent<HTMLInputElement>) {
const file = e.target.files?.[0];
if (!file) return;
setUploadError(null);
if (previewUrl) URL.revokeObjectURL(previewUrl);
setPreviewUrl(URL.createObjectURL(file));
setIsUploading(true);
try {
const formData = new FormData();
formData.append('file', file);
formData.append('category', 'receipt');
const res = await fetch('/api/v1/files/upload', {
method: 'POST',
body: formData,
credentials: 'include',
});
if (!res.ok) throw new Error('Upload failed');
const json = (await res.json()) as { data: { id: string; filename: string } };
setUploadedReceipt({ id: json.data.id, filename: json.data.filename });
setNoReceipt(false);
} catch (err) {
setUploadError(err instanceof Error ? err.message : 'Upload failed');
setUploadedReceipt(null);
} finally {
setIsUploading(false);
}
}
function clearReceipt() {
if (previewUrl) URL.revokeObjectURL(previewUrl);
setPreviewUrl(null);
setUploadedReceipt(null);
setUploadError(null);
if (fileInputRef.current) fileInputRef.current.value = '';
}
function onSubmit(data: CreateExpenseInput) {
mutation.mutate({
...data,
receiptFileIds: uploadedReceipt ? [uploadedReceipt.id] : undefined,
noReceiptAcknowledged: Boolean(noReceipt && !uploadedReceipt),
});
}
const canSubmit = isEdit || Boolean(uploadedReceipt) || noReceipt;
return (
<Sheet open={open} onOpenChange={onOpenChange}>
@@ -110,9 +190,11 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
{...register('expenseDate', {
setValueAs: (v) => (v ? new Date(v) : undefined),
})}
defaultValue={
expense?.expenseDate
? new Date(expense.expenseDate).toISOString().split('T')[0]
: new Date().toISOString().split('T')[0]
}
/>
{errors.expenseDate && (
<p className="text-xs text-destructive">{errors.expenseDate.message}</p>
@@ -130,19 +212,12 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
placeholder="0.00"
{...register('amount', { valueAsNumber: true })}
/>
{errors.amount && <p className="text-xs text-destructive">{errors.amount.message}</p>}
</div>
<div className="space-y-2">
<Label htmlFor="currency">Currency</Label>
<Input id="currency" placeholder="USD" maxLength={3} {...register('currency')} />
{errors.currency && (
<p className="text-xs text-destructive">{errors.currency.message}</p>
)}
@@ -180,7 +255,9 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
<div className="space-y-2">
<Label htmlFor="paymentMethod">Payment Method</Label>
<Select
onValueChange={(v) =>
setValue('paymentMethod', v as CreateExpenseInput['paymentMethod'])
}
defaultValue={expense?.paymentMethod ?? undefined}
>
<SelectTrigger id="paymentMethod">
@@ -198,11 +275,7 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
<div className="space-y-2">
<Label htmlFor="payer">Payer</Label>
<Input id="payer" placeholder="Who paid?" {...register('payer')} />
</div>
<div className="space-y-2">
@@ -232,21 +305,93 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
/>
</div>
{!isEdit && (
<div className="space-y-2 rounded-md border p-3">
<Label className="text-sm font-medium">Receipt</Label>
{previewUrl ? (
<div className="relative">
<img
src={previewUrl}
alt="Receipt preview"
className="max-h-48 rounded border object-contain"
/>
<button
type="button"
onClick={clearReceipt}
aria-label="Remove receipt"
className="absolute top-1 right-1 rounded-full bg-background/90 hover:bg-background border p-1 shadow-sm"
>
<X className="h-3 w-3" />
</button>
<p className="mt-1 text-xs text-muted-foreground">
{isUploading
? 'Uploading...'
: uploadedReceipt
? `Uploaded: ${uploadedReceipt.filename}`
: 'Selecting...'}
</p>
</div>
) : (
<Button
type="button"
variant="outline"
size="sm"
className="w-full"
disabled={noReceipt}
onClick={() => fileInputRef.current?.click()}
>
<Upload className="mr-2 h-4 w-4" />
Upload receipt image or PDF
</Button>
)}
<input
ref={fileInputRef}
type="file"
accept="image/*,application/pdf"
className="hidden"
onChange={handleFileChange}
/>
{uploadError && <p className="text-xs text-destructive">{uploadError}</p>}
<div className="flex items-start gap-2 pt-1">
<Checkbox
id="noReceipt"
checked={noReceipt}
onCheckedChange={(checked) => {
const next = checked === true;
setNoReceipt(next);
if (next) clearReceipt();
}}
/>
<Label htmlFor="noReceipt" className="text-sm font-normal leading-tight">
I have no receipt for this expense
</Label>
</div>
{noReceipt && (
<div className="flex gap-2 rounded-md border border-amber-300 bg-amber-50 p-2 text-xs text-amber-900 dark:border-amber-900 dark:bg-amber-950/40 dark:text-amber-200">
<AlertTriangle className="h-4 w-4 flex-shrink-0" />
<span>
Expenses without a receipt may not be reimbursed by the parent company. The PDF
export will flag this expense.
</span>
</div>
)}
</div>
)}
{mutation.isError && (
<p className="text-sm text-destructive">{(mutation.error as Error).message}</p>
)}
<SheetFooter className="pt-2">
<Button type="button" variant="outline" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button
type="submit"
disabled={isSubmitting || mutation.isPending || isUploading || !canSubmit}
>
{(isSubmitting || mutation.isPending) && (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
)}

View File

@@ -0,0 +1,29 @@
-- Audit follow-up: 0024 only normalized rows that contained a literal
-- hyphen (`A-01`), but the audit caught that legacy NocoDB exports also
-- produced un-hyphenated padded forms (`A01`). Those rows skipped the
-- 0024 rewrite and remained non-canonical, which would break the public
-- /berths/:mooringNumber lookup (the route gates on `^[A-Z]+\d+$`).
--
-- This migration re-runs the rewrite with a WHERE clause broadened to
-- catch BOTH variants:
-- - hyphenated padded ("A-01") ← redundant after 0024 but harmless
-- - un-hyphenated padded ("A01")
-- Rows already in canonical form are untouched: the second WHERE
-- predicate (mooring_number !~ '^[A-Z]+\d+$') excludes them, and even
-- without it the regexp_replace output would equal the input.
UPDATE berths
SET mooring_number = regexp_replace(mooring_number, '^([A-Z]+)-?0*(\d+)$', '\1\2')
WHERE mooring_number ~ '^[A-Z]+-?0*\d+$'
AND mooring_number !~ '^[A-Z]+\d+$';
DO $$
DECLARE
bad_count integer;
BEGIN
SELECT count(*) INTO bad_count
FROM berths
WHERE mooring_number !~ '^[A-Z]+\d+$';
IF bad_count > 0 THEN
RAISE NOTICE 'Post-rewrite: % rows still do not match ^[A-Z]+\d+$ - manual review needed', bad_count;
END IF;
END $$;
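The same canonicalization can be modelled in TypeScript for quick sanity checks — JS regex semantics line up with the Postgres POSIX pattern here, and the function name is illustrative, not part of the codebase:

```typescript
// Mirror of the 0034 rewrite: strip an optional hyphen and any zero
// padding between the letter block and the digits ("A-01"/"A01" -> "A1");
// anything that doesn't match the shape passes through unchanged.
function canonicalMooring(raw: string): string {
  const m = /^([A-Z]+)-?0*(\d+)$/.exec(raw);
  return m ? `${m[1]}${m[2]}` : raw;
}
```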

View File

@@ -239,6 +239,13 @@
"when": 1777948521076, "when": 1777948521076,
"tag": "0033_expense_no_receipt_acknowledged", "tag": "0033_expense_no_receipt_acknowledged",
"breakpoints": true "breakpoints": true
},
{
"idx": 34,
"version": "7",
"when": 1778000000000,
"tag": "0034_normalize_mooring_numbers_broaden",
"breakpoints": true
}
]
}

View File

@@ -415,6 +415,14 @@ export async function recommendBerths(args: RecommendBerthsArgs): Promise<Recomm
predicates.push(
sql`b.length_ft::numeric <= ${interestInput.desiredLengthFt}::numeric * ${oversizeMultiplier}::numeric`,
);
} else if (interestInput.desiredWidthFt !== null) {
// Width-only feasibility: cap the length using a generous L/W ratio
// so the recommender doesn't surface a 200 ft berth for a 30 ft beam
// request. Plan §4.4 promised an upper bound; without this branch the
// null-length path skipped the cap entirely.
predicates.push(
sql`b.length_ft::numeric <= ${interestInput.desiredWidthFt}::numeric * 8::numeric * ${oversizeMultiplier}::numeric`,
);
} }
if (interestInput.desiredWidthFt !== null) {
predicates.push(sql`b.width_ft::numeric >= ${interestInput.desiredWidthFt}`);
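The new width-only branch amounts to a simple bound; the 8:1 L/W ratio is from the diff, while the multiplier values in the assertions below are only examples:

```typescript
// Width-only feasibility cap sketched from the predicate above: with no
// desired length, berth length is bounded by desiredWidth * 8 (a
// generous L/W ratio) scaled by the same oversize multiplier the
// explicit-length path uses.
function lengthCapOk(
  berthLengthFt: number,
  desiredWidthFt: number,
  oversizeMultiplier: number,
): boolean {
  return berthLengthFt <= desiredWidthFt * 8 * oversizeMultiplier;
}
```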
@@ -495,7 +503,11 @@ export async function recommendBerths(args: RecommendBerthsArgs): Promise<Recomm
-- rows reports 0 — the LEFT JOIN otherwise produces a single
-- NULL-right-side row that COUNT(*) would tally as 1 and inflate
-- the heat interest-count component for berths with no history.
-- The FILTER also enforces port isolation defense-in-depth: an
-- orphan junction row whose interest belongs to a different port
-- (which the new cross-port guard now prevents but pre-existing
-- data may carry) shouldn't inflate this count.
COUNT(ib.berth_id) FILTER (WHERE i.id IS NOT NULL) AS total_interest_count,
COUNT(*) FILTER (WHERE i.eoi_status = 'signed') AS eoi_signed_count
FROM feasible f
LEFT JOIN interest_berths ib ON ib.berth_id = f.id

View File

@@ -225,8 +225,11 @@ async function resolveRecipientEmail(
return primary.value;
}
async function checkSendRateLimit(portId: string, userId: string): Promise<void> {
// Per-(port, user) so a multi-port rep can't be DoS'd by another tenant
// burning their global cap. Audit caught this — the original
// single-key version locked a user out across every port they touched.
const result = await checkRateLimit(`${portId}:${userId}`, {
windowMs: 60 * 60 * 1000,
max: 50,
keyPrefix: 'docsend',
@@ -369,7 +372,8 @@ async function performSend(args: {
// ─── Public sender: berth PDF ────────────────────────────────────────────────
export async function sendBerthPdf(input: SendBerthPdfInput): Promise<SendResult> {
// Rate-limit AFTER validation so a typo'd recipient or missing-PDF rep
// doesn't burn a slot on a send that would have failed anyway.
const recipientEmail = await resolveRecipientEmail(input.portId, input.recipient);
// Resolve berth + active version.
@@ -406,6 +410,8 @@ export async function sendBerthPdf(input: SendBerthPdfInput): Promise<SendResult
// Subject pulls in the mooring number for inbox triage.
const subject = `Berth ${berth.mooringNumber} — spec sheet`;
await checkSendRateLimit(input.portId, input.sentBy);
return performSend({
portId: input.portId,
recipientEmail,
@@ -436,7 +442,7 @@ export async function sendBerthPdf(input: SendBerthPdfInput): Promise<SendResult
// ─── Public sender: brochure ─────────────────────────────────────────────────
export async function sendBrochure(input: SendBrochureInput): Promise<SendResult> {
// Rate-limit AFTER validation (audit finding); typos shouldn't burn slots.
const recipientEmail = await resolveRecipientEmail(input.portId, input.recipient);
// Resolve brochure + most-recent version.
@@ -456,6 +462,14 @@ export async function sendBrochure(input: SendBrochureInput): Promise<SendResult
'No default brochure configured for this port. Upload one in /admin/brochures.',
);
}
// The partial unique index on `is_default` only enforces uniqueness when
// archived_at IS NULL — an archived row can still carry is_default=true
// and would silently be returned here without this guard.
if (def.archivedAt) {
throw new ValidationError(
'Default brochure is archived. Choose a non-archived brochure as the default first.',
);
}
brochureRow = def;
}
@@ -488,6 +502,8 @@ export async function sendBrochure(input: SendBrochureInput): Promise<SendResult
const bodyHtml = renderEmailBody(expanded);
const subject = `${brochureRow.label} — brochure`;
await checkSendRateLimit(input.portId, input.sentBy);
return performSend({
portId: input.portId,
recipientEmail,

View File

@@ -34,7 +34,7 @@
*/
import { Readable } from 'node:stream';
import { eq, inArray, and, gte, lte, isNull, desc } from 'drizzle-orm';
import PDFDocument from 'pdfkit';
import sharp from 'sharp';
@@ -87,6 +87,13 @@ export interface ExpensePdfArgs {
includeArchived?: boolean;
};
options: ExpensePdfOptions;
/**
* Caller's abort signal. When the client disconnects mid-stream we stop
* pulling receipts off the storage backend rather than burning CPU/IO on
* an export nobody's reading. Without this, a 1000-receipt export aborted
* at byte 0 keeps the process busy for minutes.
*/
signal?: AbortSignal;
}
// ─── Image resize gate ──────────────────────────────────────────────────────
@@ -170,6 +177,10 @@ interface ProcessedExpense extends ExpenseRow {
amountTarget: number;
amountUsdNumeric: number;
amountEurNumeric: number;
/** True when ANY rate lookup for this row fell back to 1:1 (e.g. the
* exchange-rate cache was cold and the upstream API returned null).
* Surfaced via an asterisk in the table + a footnote in the summary. */
rateUnavailable: boolean;
}
interface Totals {
@@ -187,6 +198,10 @@ interface Totals {
/** Sum of the no-receipt expenses' targetTotal — the amount at risk
* of being denied reimbursement. */
noReceiptAmount: number;
/** Number of rows whose conversion fell back to 1:1 — surfaces as an
* amber footer so reps know the totals are approximate. Audit caught
* the silent 1:1 fallback; users were getting EUR-labelled USD totals. */
rateUnavailableCount: number;
}
async function processExpenses(
@@ -194,31 +209,58 @@ async function processExpenses(
target: TargetCurrency,
): Promise<ProcessedExpense[]> {
// Resolve rate ONCE per source currency (cached by getRate). Avoids the
// legacy code's per-row API call. We also track *which* lookups failed
// (returned null upstream) so the PDF can surface a warning rather than
// silently treating EUR as USD.
const rateCache = new Map<string, { rate: number; ok: boolean }>();
const ensureRate = async (from: string, to: string): Promise<{ rate: number; ok: boolean }> => {
if (from === to) return { rate: 1, ok: true };
const key = `${from}->${to}`;
if (rateCache.has(key)) return rateCache.get(key)!;
const fetched = await getRate(from, to);
const entry = fetched != null ? { rate: fetched, ok: true } : { rate: 1, ok: false };
rateCache.set(key, entry);
if (!entry.ok) {
logger.warn({ from, to }, 'Expense PDF: exchange rate unavailable, falling back to 1:1');
}
return entry;
};
const out: ProcessedExpense[] = [];
for (const row of rows) {
const raw = parseFloat(row.amount);
let rateUnavailable = false;
let usd: number;
if (row.amountUsd != null) {
usd = parseFloat(row.amountUsd);
} else if (row.currency.toUpperCase() === 'USD') {
usd = raw;
} else {
const { rate, ok } = await ensureRate(row.currency.toUpperCase(), 'USD');
usd = raw * rate;
if (!ok) rateUnavailable = true;
}
// Skip the USD->EUR chain when the source already matches the target —
// every redundant rate lookup adds rounding noise on top of the network
// round-trip. EUR-source + EUR-target should land back exactly at the
// input amount, not raw * USD-rate * USD-rate-inverse.
let eur: number;
if (row.currency.toUpperCase() === 'EUR') {
eur = raw;
} else {
const { rate, ok } = await ensureRate('USD', 'EUR');
eur = usd * rate;
if (!ok) rateUnavailable = true;
}
const targetVal = target === 'USD' ? usd : eur; const targetVal = target === 'USD' ? usd : eur;
out.push({ out.push({
...row, ...row,
amountUsdNumeric: usd, amountUsdNumeric: usd,
amountEurNumeric: eur, amountEurNumeric: eur,
amountTarget: targetVal, amountTarget: targetVal,
rateUnavailable,
}); });
} }
return out; return out;
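The `ensureRate` rework in this hunk can be sketched in isolation. This is a minimal sketch, not the branch's real module: `getRate` here is a hypothetical stub standing in for the upstream rate lookup, which returns `null` when no quote exists for the pair.

```typescript
type RateEntry = { rate: number; ok: boolean };

// Hypothetical stand-in for the upstream getRate; the real helper hits a
// rate provider and returns null when the pair has no quote.
async function getRate(from: string, to: string): Promise<number | null> {
  const quotes: Record<string, number> = { 'GBP->USD': 1.25, 'USD->EUR': 0.9 };
  return quotes[`${from}->${to}`] ?? null;
}

// Mirrors the patched ensureRate: one lookup per currency pair, with a
// failed lookup cached as a *flagged* 1:1 entry instead of the old
// silent `?? 1` coercion.
export function makeEnsureRate() {
  const cache = new Map<string, RateEntry>();
  return async (from: string, to: string): Promise<RateEntry> => {
    if (from === to) return { rate: 1, ok: true };
    const key = `${from}->${to}`;
    const cached = cache.get(key);
    if (cached) return cached;
    const fetched = await getRate(from, to);
    const entry: RateEntry =
      fetched != null ? { rate: fetched, ok: true } : { rate: 1, ok: false };
    cache.set(key, entry);
    return entry;
  };
}
```

Caching the `ok: false` entry is deliberate: a provider outage then costs one failed lookup per pair per export, not one per row.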
@@ -234,6 +276,7 @@ function computeTotals(
   const eurTotal = rows.reduce((s, r) => s + r.amountEurNumeric, 0);
   const processingFee = includeProcessingFee ? targetTotal * 0.05 : 0;
   const receiptlessRows = rows.filter((r) => r.noReceiptAcknowledged);
+  const rateUnavailableCount = rows.reduce((n, r) => n + (r.rateUnavailable ? 1 : 0), 0);
   return {
     count: rows.length,
     targetTotal,
@@ -244,6 +287,7 @@ function computeTotals(
     targetCurrency: target,
     noReceiptCount: receiptlessRows.length,
     noReceiptAmount: receiptlessRows.reduce((s, r) => s + r.amountTarget, 0),
+    rateUnavailableCount,
   };
 }
@@ -311,12 +355,17 @@ function groupRows(
 async function fetchExpenseRows(args: ExpensePdfArgs): Promise<ExpenseRow[]> {
   const conditions = [eq(expenses.portId, args.portId)];
-  if (args.expenseIds?.length) {
-    conditions.push(inArray(expenses.id, args.expenseIds));
-  } else {
+  // Soft-delete filter applies regardless of which path produced the
+  // expense list. The audit caught a regression where an `expenseIds`
+  // selection would happily export archived rows because the
+  // `isNull(archivedAt)` predicate sat inside the `else` branch — that
+  // violates the soft-delete contract used everywhere else.
   if (!args.filter?.includeArchived) {
     conditions.push(isNull(expenses.archivedAt));
   }
+  if (args.expenseIds?.length) {
+    conditions.push(inArray(expenses.id, args.expenseIds));
+  } else {
   if (args.filter?.dateFrom) {
     conditions.push(
       gte(
@@ -359,7 +408,7 @@ async function fetchExpenseRows(args: ExpensePdfArgs): Promise<ExpenseRow[]> {
     })
     .from(expenses)
     .where(and(...conditions))
-    .orderBy(expenses.expenseDate);
+    .orderBy(desc(expenses.expenseDate));
   return rows as ExpenseRow[];
 }
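The predicate-ordering fix above is easy to reason about in a Drizzle-free sketch. Everything here is illustrative: predicates are modelled as strings and the argument shape is simplified, but the control flow matches the hunk, with the archived filter appended before the branch so both the id-selection path and the date-filter path inherit it.

```typescript
// Simplified argument shape for illustration only.
type Args = { includeArchived?: boolean; expenseIds?: string[]; dateFrom?: string };

// Assemble WHERE predicates in the fixed order: port scope first, then
// the soft-delete filter unconditionally, then the id/date branch.
export function buildConditions(args: Args): string[] {
  const conditions = ['portId = :portId'];
  if (!args.includeArchived) conditions.push('archivedAt IS NULL');
  if (args.expenseIds?.length) {
    conditions.push('id IN (:expenseIds)');
  } else if (args.dateFrom) {
    conditions.push('expenseDate >= :dateFrom');
  }
  return conditions;
}
```

With the old ordering, passing `expenseIds` returned early past the archived filter; here the filter lands in both branches.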
@@ -474,7 +523,10 @@ export async function streamExpensePdf(
   if (opts.includeDetails) addExpenseTable(doc, processed, opts);
   if (opts.includeReceipts) {
-    await addReceiptPages(doc, processed, filesById, opts);
+    await addReceiptPages(doc, processed, filesById, {
+      targetCurrency: opts.targetCurrency,
+      signal: args.signal,
+    });
   }
   addFooter(doc);
@@ -485,7 +537,10 @@ export async function streamExpensePdf(
     }
   })();
-  const safeName = opts.documentName.replace(/[^a-zA-Z0-9-_\s]/g, '_').trim() || 'expenses';
+  // `\s` includes CR/LF; using it lets a malicious documentName forge
+  // additional response headers via Content-Disposition. Restrict to
+  // word/dot/dash/space (single-line space only — \s would let \n through).
+  const safeName = opts.documentName.replace(/[^\w. \-]+/g, '_').trim() || 'expenses';
   return {
     stream: webStream,
     suggestedFilename: `${safeName}.pdf`,
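The sanitizer change plus the RFC 5987 `filename*` fallback mentioned in the summary can be sketched together. Helper names below are illustrative, not the branch's real API; the extra `replace` after `encodeURIComponent` covers the few characters (`'`, `(`, `)`, `*`) that `encodeURIComponent` leaves bare but RFC 5987 does not treat as attr-chars.

```typescript
// Hardened filename sanitizer: only word chars, dot, space, and dash
// survive, so CR/LF can never reach the Content-Disposition header.
export function safeFilename(documentName: string): string {
  return documentName.replace(/[^\w. \-]+/g, '_').trim() || 'expenses';
}

// Build the header value: ASCII-safe filename= for legacy clients, plus
// a UTF-8 filename* (RFC 5987 ext-value) so non-ASCII names survive.
export function contentDisposition(documentName: string): string {
  const ascii = safeFilename(documentName);
  const encoded = encodeURIComponent(`${documentName}.pdf`).replace(
    /['()*]/g,
    (c) => `%${c.charCodeAt(0).toString(16).toUpperCase()}`,
  );
  return `attachment; filename="${ascii}.pdf"; filename*=UTF-8''${encoded}`;
}
```

For example, `contentDisposition('naïve')` keeps the accented name in the `filename*` parameter while the plain `filename=` degrades to an underscore.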
@@ -540,7 +595,19 @@ function addSummaryBox(
       ]
     : [];
-  const boxHeight = (lines.length + warningLines.length) * 16 + 20;
+  // Second warning band: any row whose currency conversion fell back to
+  // 1:1 because the upstream rate was unavailable. Without this surface,
+  // an EUR-source row would appear as `targetCurrency=EUR, amount=USD`
+  // and reps would never know the totals are wrong.
+  const showRateWarning = totals.rateUnavailableCount > 0;
+  const rateWarningLines = showRateWarning
+    ? [
+        `Note: ${totals.rateUnavailableCount} expense${totals.rateUnavailableCount === 1 ? '' : 's'} could not be converted (rate unavailable);`,
+        `the displayed amount${totals.rateUnavailableCount === 1 ? ' is' : 's are'} approximate (1:1 fallback).`,
+      ]
+    : [];
+  const boxHeight = (lines.length + warningLines.length + rateWarningLines.length) * 16 + 20;
   doc
     .rect(60, lineY, doc.page.width - 120, boxHeight)
     .fillColor('#f5f5f5')
@@ -561,6 +628,14 @@ function addSummaryBox(
     }
     doc.fillColor('#000000').font('Helvetica');
   }
+  if (showRateWarning) {
+    doc.fillColor('#92400e').font('Helvetica-Oblique');
+    for (const line of rateWarningLines) {
+      doc.text(line, 75, y);
+      y += 16;
+    }
+    doc.fillColor('#000000').font('Helvetica');
+  }
   doc.y = lineY + boxHeight + 12;
 }
@@ -695,7 +770,7 @@ async function addReceiptPages(
   doc: PDFKit.PDFDocument,
   rows: ProcessedExpense[],
   filesById: Map<string, ResolvedFile>,
-  opts: { targetCurrency: TargetCurrency },
+  opts: { targetCurrency: TargetCurrency; signal?: AbortSignal },
 ) {
   const expensesWithReceipts = rows.filter(
     (r) => Array.isArray(r.receiptFileIds) && r.receiptFileIds.length > 0,
@@ -715,6 +790,17 @@ async function addReceiptPages(
   for (const expense of expensesWithReceipts) {
     for (const fileId of expense.receiptFileIds ?? []) {
+      // Bail out the moment the client disconnects. Without this, an
+      // export aborted on the wire would keep grinding through the
+      // remaining receipts and only stop when the doc.end() write
+      // failed — minutes later for a 1000-row export.
+      if (opts.signal?.aborted) {
+        logger.info(
+          { receiptCounter, totalReceipts },
+          'Expense PDF stream aborted by client; halting receipt loop',
+        );
+        return;
+      }
       receiptCounter += 1;
       const file = filesById.get(fileId);
       if (!file) {
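The bail-out pattern added to the receipt loop is just a per-iteration poll of the request's `AbortSignal`. A minimal generic sketch (the function and its shape are illustrative, not the branch's code):

```typescript
// Run `work` over `items`, but stop cheaply as soon as the signal fires.
// Returning the count makes the early exit observable to callers.
export function processUntilAborted<T>(
  items: T[],
  work: (item: T) => void,
  signal?: AbortSignal,
): number {
  let processed = 0;
  for (const item of items) {
    if (signal?.aborted) break; // client went away; do no further work
    work(item);
    processed += 1;
  }
  return processed;
}
```

Polling `signal.aborted` at the loop head, as the diff does, is enough here because each iteration is short relative to the whole export; a long-running single iteration would instead need an `'abort'` event listener.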

(next file)

@@ -156,6 +156,24 @@ export async function updateExpense(
 ) {
   const existing = await getExpenseById(id, portId);
+  // The create-time validator enforces "receipt OR no-receipt-ack" via
+  // `.refine`, but `updateExpenseSchema` is `.partial()` so the rule is
+  // dropped. Re-assert it here against the merged (existing + patch)
+  // shape so a PATCH can't slip an unexplained receipt-less expense
+  // past the create-time guard.
+  const mergedReceiptIds =
+    data.receiptFileIds !== undefined ? data.receiptFileIds : existing.receiptFileIds;
+  const mergedAck =
+    data.noReceiptAcknowledged !== undefined
+      ? data.noReceiptAcknowledged
+      : existing.noReceiptAcknowledged;
+  const hasReceipts = Array.isArray(mergedReceiptIds) && mergedReceiptIds.length > 0;
+  if (!hasReceipts && !mergedAck) {
+    throw new ConflictError(
+      'Expense must either link a receipt file or acknowledge the no-receipt warning.',
+    );
+  }
   const updateData: Record<string, unknown> = { ...data, updatedAt: new Date() };
   // Re-convert to USD if amount or currency changed
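The merge-then-check rule above is a pure function over (stored row, patch), which makes it easy to test in isolation. The types below are illustrative stand-ins for the real schema; field names follow the diff.

```typescript
// Minimal shape of the fields the rule touches (illustrative).
type ReceiptFields = {
  receiptFileIds: string[] | null;
  noReceiptAcknowledged: boolean;
};

// True when the merged (existing + patch) row would end up with neither
// a linked receipt nor an explicit no-receipt acknowledgement.
export function violatesReceiptRule(
  existing: ReceiptFields,
  patch: Partial<ReceiptFields>,
): boolean {
  const ids =
    patch.receiptFileIds !== undefined ? patch.receiptFileIds : existing.receiptFileIds;
  const ack =
    patch.noReceiptAcknowledged !== undefined
      ? patch.noReceiptAcknowledged
      : existing.noReceiptAcknowledged;
  const hasReceipts = Array.isArray(ids) && ids.length > 0;
  return !hasReceipts && !ack;
}
```

The `!== undefined` checks matter: a patch that explicitly sets `receiptFileIds: []` or `noReceiptAcknowledged: false` must override the stored value, while an absent field must fall back to it.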

(next file)

@@ -19,8 +19,9 @@
 import { and, desc, eq, inArray } from 'drizzle-orm';
 import { db } from '@/lib/db';
-import { interestBerths, type InterestBerth } from '@/lib/db/schema/interests';
+import { interestBerths, interests, type InterestBerth } from '@/lib/db/schema/interests';
 import { berths } from '@/lib/db/schema/berths';
+import { ValidationError } from '@/lib/errors';
 type DbOrTx = typeof db | Parameters<Parameters<typeof db.transaction>[0]>[0];
@@ -197,6 +198,26 @@ export async function upsertInterestBerthTx(
   berthId: string,
   opts: AddOrUpdateOpts = {},
 ): Promise<InterestBerth> {
+  // Cross-port guard. The junction is silently multi-port-shaped (it has
+  // no port_id of its own — it inherits via the FKs) so a caller wiring
+  // an interest from one port to a berth from another would corrupt the
+  // recommender + public-berth aggregates with phantom rows. We assert
+  // both rows live in the same port BEFORE inserting; if either side is
+  // missing, the FK constraint will surface that on insert.
+  const sides = await tx
+    .select({
+      interestPortId: interests.portId,
+      berthPortId: berths.portId,
+    })
+    .from(interests)
+    .innerJoin(berths, eq(berths.id, berthId))
+    .where(eq(interests.id, interestId))
+    .limit(1);
+  const side = sides[0];
+  if (side && side.interestPortId !== side.berthPortId) {
+    throw new ValidationError('Cannot link an interest and a berth from different ports.');
+  }
   if (opts.isPrimary === true) {
     await tx
       .update(interestBerths)
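The decision logic of the cross-port guard, separated from the Drizzle query, reduces to a small pure check. This sketch (names illustrative) mirrors the hunk's deliberate choice to let a missing side pass through so the FK constraint reports it on insert:

```typescript
// Port ids resolved for each side of the junction row, or undefined when
// the joined lookup found no matching interest/berth pair.
type PortSides = { interestPortId: string; berthPortId: string } | undefined;

// True only when both sides resolved AND they belong to different ports;
// an unresolved side is deferred to the FK constraint, as in the diff.
export function crossPortViolation(side: PortSides): boolean {
  return side !== undefined && side.interestPortId !== side.berthPortId;
}
```

Keeping the guard permissive on the missing-row case avoids duplicating the existence check the database already enforces, while still rejecting the one state the FKs cannot see: two valid rows in different ports.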