Merge feat/berth-recommender into main

Multi-phase work bundle (24 commits, 159 files, ~127k LOC) implementing
the plan in docs/berth-recommender-and-pdf-plan.md:

  Phase 0 — NocoDB berth import + mooring normalization (A-01 → A1)
  Phase 1 — /clients + /interests list-column redesign (contacts/yachts join)
  Phase 2 — M:M interest_berths junction with role flags
  Phase 3 — Public berths API + /api/public/health
  Phase 4 — Berth recommender (SQL ranking, tier ladder, heat scoring)
  Phase 5 — Multi-berth EOI bundle + range formatter
  Phase 6 — Pluggable storage backend + per-berth PDF parser
  Phase 7 — Sales send-outs + brochures + email-from settings
  Phase 8 — CLAUDE.md conventions update

Plus a memory-efficient streaming expense PDF export (replaces a legacy
implementation that OOM'd on hundreds of receipts), receipt-less expense
flag with PDF warning annotations, receipt upload UI in the expense
form dialog, and the scan-receipt page accepting device-uploaded photos
in parallel with the OCR scan.

Four audit passes (audit-1 → audit-final, mostly Opus 4.7 reviewers in
parallel) drove progressive hardening: ~50 findings landed; the last
audit's 5 critical / 12 high items are fixed in 180912b. Medium/low
items are deferred and indexed in docs/audit-final-deferred.md.

Tests: 1163/1163 vitest passing. tsc clean. 12 new migrations applied
in dev (0023..0034), three of which (0028/0029, 0034) involve careful
backfills.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Author: Matt Ciaccio
Date: 2026-05-05 05:12:24 +02:00
166 changed files with 128122 additions and 1403 deletions

.gitignore

@@ -47,3 +47,9 @@ docker-compose.override.yml
/.claude/
/.serena/
/ruvector.db
# Filesystem storage backend root (FilesystemBackend default location)
/storage/
# Local berth-PDF + brochure samples used as upload fixtures during dev.
/berth_pdf_example/

CLAUDE.md

@@ -95,6 +95,16 @@ src/
- **Inline editing pattern:** detail pages (clients, yachts, companies, interests, residential clients/interests) use `<InlineEditableField>` (`src/components/shared/inline-editable-field.tsx`) for click-to-edit text/select/textarea fields and `<InlineTagEditor>` (`src/components/shared/inline-tag-editor.tsx`) for tag chips. Each entity exposes a `PUT /api/v1/<entity>/[id]/tags` endpoint backed by a `set<Entity>Tags` service helper that wipes-and-rewrites the join table inside a single transaction. There are no separate "Edit" modal forms on detail pages — the entire overview tab is editable in place.
- **Notes (polymorphic across entity types):** `notes.service.ts` dispatches across `clientNotes`, `interestNotes`, `yachtNotes`, `companyNotes` based on an `entityType` discriminator. `<NotesList entityType="…" />` works for all four. `companyNotes` lacks an `updatedAt` column — the service substitutes `createdAt` so callers get a uniform shape.
- **Route handler exports:** Next.js App Router `route.ts` files only allow specific named exports (`GET|POST|…`). Service-tested handler functions live in sibling `handlers.ts` files (e.g. `src/app/api/v1/yachts/[id]/handlers.ts`) and are imported by the colocated `route.ts` for `withAuth(withPermission(...))` wrapping. Integration tests import from `handlers.ts` directly to bypass auth/permission middleware.
- **Multi-berth interest model:** `interest_berths` is the source of truth for which berths an interest is linked to; `interests.berth_id` does not exist (dropped in migration 0029). Three role flags: `is_primary` (≤1 row per interest, enforced by partial unique index — surfaces as "the berth for this deal" in templates / forms / list views), `is_specific_interest` (true → berth shows as "Under Offer" on the public map; false → legal/EOI-only link), `is_in_eoi_bundle` (covered by the interest's EOI signature). Read/write through `src/lib/services/interest-berths.service.ts` helpers (`getPrimaryBerth`, `getPrimaryBerthsForInterests`, `upsertInterestBerth`, `setPrimaryBerth`, `removeInterestBerth`); never query `interest_berths` from outside that service.
- **Mooring number canonical format:** `^[A-Z]+\d+$` (e.g. `A1`, `B12`, `E18`) — no hyphen, no leading zeros. Stored, displayed, URL-encoded, and rendered in EOIs in this exact form. Phase 0 normalized the entire CRM dataset; the mooring-pattern regex gates the public `/api/public/berths/[mooringNumber]` route before any DB hit.
- **Public berths API:** `/api/public/berths` (list) and `/api/public/berths/[mooringNumber]` (single) are the public-facing data feed for the marketing website. Output shape mirrors the legacy NocoDB Berths shape verbatim (`"Mooring Number"`, `"Side Pontoon"`, etc.) — see `src/lib/services/public-berths.ts`. Cache headers: `s-maxage=300, stale-while-revalidate=60`. Status mapping: `"Sold"` (berth.status=sold) > `"Under Offer"` (status=under_offer OR has any active `interest_berths.is_specific_interest=true` link with `interests.outcome IS NULL`) > `"Available"`. The companion `/api/public/health` endpoint returns `{env, appUrl}` so the website refuses to start when its `CRM_PUBLIC_URL` points at a different deployment env.
- **Berth recommender:** Pure SQL ranking (no AI). Lives in `src/lib/services/berth-recommender.service.ts`. Tier ladder A/B/C/D classifies each feasible berth based on its `interest_berths` aggregates. Heat scoring (recency / furthest stage / interest count / EOI count) only fires for tier B (lost/cancelled-only history); per-port admin tunes weights via `system_settings` keys (`heat_weight_*`, `recommender_max_oversize_pct`, `recommender_top_n_default`, `fallthrough_policy`, `fallthrough_cooldown_days`, `tier_ladder_hide_late_stage`). The recommender enforces multi-port isolation both at the entry point (rejects cross-port interest lookups) AND inside the SQL aggregates CTE (defense-in-depth `i.port_id` filter).
- **EOI bundle / range formatter:** Multi-berth EOIs render the in-bundle berth set as a compact range string ("A1-A3, B5-B7") via `formatBerthRange()` in `src/lib/templates/berth-range.ts`. Used only inside the Documenso `Berth Range` form field — CRM UI always shows berths as individual chips. The `{{eoi.berthRange}}` token is in `VALID_MERGE_TOKENS`.
- **Pluggable storage backend:** Code never imports MinIO/S3 directly. All file I/O goes through `getStorageBackend()` from `src/lib/storage/`. Configured via `system_settings.storage_backend` ('s3' | 'filesystem'). Switching backends is a settings change + `pnpm tsx scripts/migrate-storage.ts` run. **Filesystem backend is single-node only**: refuses to start when `MULTI_NODE_DEPLOYMENT=true`. Multi-node deployments must use the s3-compatible backend.
- **Per-berth PDFs:** Versioned via `berth_pdf_versions`; `berths.current_pdf_version_id` always points to the latest active version. Storage key is UUID-based per upload (not version-numbered) so concurrent uploads can't collide on blob paths; `pg_advisory_xact_lock` per berth_id serializes the version-number allocation. 3-tier parser: AcroForm → OCR (Tesseract.js with positional heuristics) → optional AI (rep clicks "AI parse" only when OCR confidence is low). Magic-byte (`%PDF-`) check enforced on BOTH the in-server upload path AND the presigned-PUT path (the post-upload service streams the first 5 bytes via the storage backend). Mooring-number mismatch between PDF and target berth surfaces as a service-level `ConflictError` unless the apply call passes `confirmMooringMismatch: true`.
- **Brochures:** Per-port; default brochure marked via `is_default` (enforced by partial unique index on `(port_id) WHERE is_default=true AND archived_at IS NULL`). Archived brochures retain version history. Same upload flow as berth PDFs (presign + magic-byte verification on the post-upload register endpoint).
- **Send-from accounts (sales send-outs):** Configurable via `system_settings`; defaults to `sales@portnimara.com` for human-touch and `noreply@portnimara.com` for automation. SMTP/IMAP passwords are AES-256-GCM encrypted at rest; the API never returns decrypted secrets — only `*PassIsSet` boolean markers. Send-out audit goes to `document_sends` (separate from `audit_logs` because of volume + binary refs). Body markdown is XSS-safe via `renderEmailBody()` (escape-then-allowlist; tested against the standard XSS vector list). Rate limit: 50 sends/user/hour individual. Pre-send size threshold: files > `email_attach_threshold_mb` ship as a 24h signed-URL link rather than an attachment (avoids the duplicate-send race from async bounces). The download-link fallback HTML-escapes the filename to prevent injection from admin-supplied brochure names. Bounce monitoring requires IMAP credentials in addition to SMTP — without them, the size-rejection banner stays disabled.
- **NocoDB berth import:** `pnpm tsx scripts/import-berths-from-nocodb.ts --apply --port-slug port-nimara` re-imports from the legacy NocoDB Berths table. Idempotent: rows where `updated_at > last_imported_at` (the "human edited this since last import" guard) are skipped unless `--force`. Adds `--update-snapshot` to also rewrite `src/lib/db/seed-data/berths.json`. Uses `pg_advisory_xact_lock` so two simultaneous runs serialize. Pure helpers in `src/lib/services/berth-import.ts` are unit-tested.
- **Routes:** Multi-tenant via `[portSlug]` dynamic segment. Typed routes enabled.
- **Pre-commit:** Husky + lint-staged runs ESLint fix + Prettier on staged `.ts`/`.tsx` files. The hook also blocks `.env*` files (including `.env.example`) from being committed; pass them via a separate workflow if needed.
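The mooring-number rules above can be illustrated with a small sketch (a hypothetical helper written against the documented `^[A-Z]+\d+$` canonical format; the function name is an assumption, not the shipped Phase 0 code):

```typescript
// Canonical mooring format per the convention above: no hyphen, no leading zeros.
const CANONICAL_MOORING = /^[A-Z]+\d+$/;

// Hypothetical normalizer: maps legacy forms like "A-01" to "A1".
// Returns null for unrecognized shapes so the caller can report them.
function normalizeMooringNumber(raw: string): string | null {
  const m = raw.trim().toUpperCase().match(/^([A-Z]+)-?0*(\d+)$/);
  if (!m) return null;
  const normalized = `${m[1]}${m[2]}`; // the 0* in the regex strips leading zeros
  return CANONICAL_MOORING.test(normalized) ? normalized : null;
}
```

Already-canonical inputs like `B12` pass through unchanged, which keeps a normalizer of this shape safe to run repeatedly against an idempotent import.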
@@ -139,6 +149,14 @@ Domain-specific references:
- `docs/eoi-documenso-field-mapping.md` — canonical mapping from `EoiContext`
paths to the Documenso template's `formValues` keys, with the matching
AcroForm field names used by the in-app pathway. **Note:** the multi-
berth EOI bundle adds a new `Berth Range` form field populated by
`formatBerthRange()` from `src/lib/templates/berth-range.ts` — the live
Documenso template needs the field added before multi-berth EOIs render
with the compact range string instead of just the primary mooring.
- `assets/README.md` — what the in-app EOI source PDF must contain and how
to override its path in dev/test.
- `docs/berth-recommender-and-pdf-plan.md` — the comprehensive plan for the
Phase 08 berth-recommender + PDF + send-outs work bundle. Single source
of truth for the multi-berth interest model, recommender tier ladder,
pluggable storage, per-berth PDF parser, and sales send-out flows.

docs/audit-final-deferred.md

@@ -0,0 +1,84 @@
# Final audit deferred findings
The pre-merge audit on `feat/berth-recommender` produced ~30 findings. The
critical + high-severity items were fixed in-branch. The items below are
medium / low severity and deferred to follow-up issues so the merge isn't
held up. Each entry is self-contained — pick one off and ship it.
## Cross-cutting integration
- **EOI in-app pathway silently swallows missing `Berth Range` AcroForm field**
`src/lib/pdf/fill-eoi-form.ts:93`. `setText(form, 'Berth Range', ...)`
is wrapped in a try/catch that succeeds silently when the field is
absent. CLAUDE.md already warns ops about needing to add the field to
the live Documenso template; this code change would make the deployment
gap observable. Fix: when `context.eoiBerthRange` is non-empty AND the
field is absent, log at warn level + surface a structured response field.
- **Email body merge expansion happens after token validation** —
`src/lib/services/document-sends.service.ts:399-403`. If a merge value
contains a `{{token}}` substring (e.g. a client name like
`"Acme {{discount}} Inc."`), the expanded body will contain a token
the unresolved-check missed and ships with literal braces. Fix: HTML-
escape merge values before expansion, OR run a second
`findUnresolvedTokens` against the expanded body.
- **Filesystem dev-fallback HMAC secret can drift across processes** —
`src/lib/storage/filesystem.ts:328-331`. The dev-only fallback derives
the HMAC secret from `BETTER_AUTH_SECRET`. Two CRM processes running
with different secrets (web vs worker) reject each other's tokens.
Fix: assert `BETTER_AUTH_SECRET` is set when filesystem backend is
active in non-prod, or document the requirement loudly.
- **Berth PDF apply path: numeric column nulling silently drops** —
`src/lib/services/berth-pdf.service.ts:473-475`. When
`Number.isFinite(n)` is false the apply loop `continue`s without
pushing to `applied` and without warning. Combined with the
"no appliable fields supplied" check (only fires when ALL drop), partial
silent drops are invisible. Fix: collect dropped keys and surface them.
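The merge-expansion finding above suggests a second validation pass over the expanded body; a minimal sketch of that option (illustrative helper names only, not the actual `document-sends.service.ts` code):

```typescript
// Expand {{token}} placeholders; unknown tokens are left as-is.
function expandTokens(body: string, values: Record<string, string>): string {
  return body.replace(/\{\{([\w.]+)\}\}/g, (whole, name: string) => values[name] ?? whole);
}

function findUnresolvedTokens(body: string): string[] {
  return [...body.matchAll(/\{\{([\w.]+)\}\}/g)].map((m) => m[1]);
}

// The proposed fix: re-check the EXPANDED body, so a merge value that itself
// contains brace syntax (e.g. a client named "Acme {{discount}} Inc.") is
// caught before the email ships with literal braces.
function expandOrFail(body: string, values: Record<string, string>): string {
  const expanded = expandTokens(body, values);
  const leftover = findUnresolvedTokens(expanded); // the second pass
  if (leftover.length > 0) {
    throw new Error(`unresolved tokens after expansion: ${leftover.join(", ")}`);
  }
  return expanded;
}
```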
## Multi-tenant isolation hardening
- **document_sends row stores `interestId` without verifying port match** —
`src/lib/services/document-sends.service.ts:422`. Audit-log pollution
rather than data exposure (the recipient lookup is port-checked already).
Fix: when `recipient.interestId` is set, fetch with
`and(eq(interests.id, ...), eq(interests.portId, input.portId))` and
throw if missing.
- **Storage proxy token does not bind to port_id** —
`src/lib/storage/filesystem.ts:73-84`. ProxyTokenPayload is `{k, e, n,
f?, c?}` with a global HMAC. The current "issuer always checks port
first" relies on every issuer being correct in perpetuity. Fix: add a
`p` (portId) claim and have the proxy route resolve key→owner row +
assert `owner.portId === payload.p` before streaming.
- **Documenso webhook does not enforce port_id on document lookups** —
`src/app/api/webhooks/documenso/route.ts:96-148`. Handlers dispatch by
global `documensoId`. If two ports' documents were ever issued the
same Documenso ID (replay across staging/prod, forwarded webhook from
a foreign instance), the wrong port's interest could be mutated. The
per-body `signatureHash` dedup is partial mitigation. Fix: either
(a) include the originating Documenso instance/team in the lookup, or
(b) verify `documents(documenso_id)` has a unique index port-wide.
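The proxy-token finding above can be sketched as follows, assuming an HMAC-signed `{k, e, p}` payload (claim names follow the finding's shorthand; the real payload shape, field set, and secret handling may differ):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

interface ProxyTokenPayload {
  k: string; // storage key
  e: number; // expiry, ms since epoch
  p: string; // portId claim (the addition this finding proposes)
}

function signProxyToken(payload: ProxyTokenPayload, secret: string): string {
  const body = Buffer.from(JSON.stringify(payload)).toString("base64url");
  const mac = createHmac("sha256", secret).update(body).digest("base64url");
  return `${body}.${mac}`;
}

// Returns the payload only when the MAC verifies, the token is unexpired,
// AND the port claim matches the port that owns the requested key.
function verifyProxyToken(
  token: string,
  secret: string,
  ownerPortId: string,
): ProxyTokenPayload | null {
  const [body, mac] = token.split(".");
  if (!body || !mac) return null;
  const expected = createHmac("sha256", secret).update(body).digest("base64url");
  const a = Buffer.from(mac);
  const b = Buffer.from(expected);
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;
  const payload = JSON.parse(Buffer.from(body, "base64url").toString()) as ProxyTokenPayload;
  if (payload.e < Date.now()) return null; // expired
  if (payload.p !== ownerPortId) return null; // cross-port token: refuse to stream
  return payload;
}
```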
## Recent expense work polish
- **renderReceiptHeader cursor math drifts after multi-step writes** —
`src/lib/services/expense-pdf.service.ts:854`. After
`doc.text(...)` with auto-flow, `doc.y` advances. Using `doc.y -
headerH + 10` after the rect+stroke block computes against the
post-rect position; works only because pdfkit's text-after-rect
hasn't moved y yet. Headers may misalign on the first receipt page
after a soft page break. Fix: capture `const baseY = doc.y` before
drawing the rect and compute all subsequent offsets relative to it.
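The proposed `baseY` capture looks roughly like this (a sketch against a minimal stand-in for pdfkit's document object; the real `expense-pdf.service.ts` signatures differ):

```typescript
// Minimal stand-in for pdfkit's PDFDocument; only the members the sketch uses.
interface DocLike {
  y: number;
  rect(x: number, y: number, w: number, h: number): DocLike;
  stroke(): DocLike;
  text(text: string, x: number, y: number): DocLike;
}

function renderReceiptHeader(doc: DocLike, headerH: number): void {
  const baseY = doc.y; // capture BEFORE any drawing can move the cursor
  doc.rect(40, baseY, 515, headerH).stroke();
  doc.text("Receipt", 50, baseY + 10); // offset from baseY, not from doc.y
  doc.y = baseY + headerH; // advance explicitly past the header box
}
```

Because every offset derives from the one captured baseline, the header stays aligned even if an intermediate call advances the cursor.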
## Settings parsing
- **`loadRecommenderSettings` rejects string-shaped JSONB booleans** —
`src/lib/services/berth-recommender.service.ts:116`. Postgres returns
JSONB `true/false` as JS booleans, but if an admin saves `"true"`
via a UI that wraps the value as a string, `asBool` returns null and
the per-port override silently falls through to defaults. Not a
security bug; a tuning footgun. Fix: accept `"true"`/`"false"` string
forms in `asBool`.
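A sketch of the widened `asBool` (illustrative only; the real helper lives in `berth-recommender.service.ts` and may differ):

```typescript
// Accept real JSONB booleans AND the "true"/"false" string forms that a
// string-wrapping admin UI may save. Anything else yields null so the
// caller falls through to the default setting.
function asBool(value: unknown): boolean | null {
  if (typeof value === "boolean") return value;
  if (typeof value === "string") {
    const s = value.trim().toLowerCase();
    if (s === "true") return true;
    if (s === "false") return false;
  }
  return null;
}
```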

docs/berth-feature-handoff-prompt.md

@@ -0,0 +1,147 @@
# Handoff prompt for new Claude Code session
Copy everything below the `---` line into the new chat as your first message.
---
I'm continuing work on a comprehensive multi-feature push that was fully designed in a prior session but not yet implemented. The complete plan lives at `docs/berth-recommender-and-pdf-plan.md` (~1030 lines). **Read that file end-to-end before doing anything else — every design decision, schema change, edge case, and confirmed answer to a product question is captured there.** Don't re-litigate decisions; if something seems unclear, the answer is almost certainly in the plan.
## What the project is
A multi-tenant marina/port-management CRM at `/Users/matt/Repos/new-pn-crm`. Next.js 15 App Router, React 19, TypeScript strict, Drizzle ORM on Postgres, MinIO for files, BullMQ on Redis, better-auth, shadcn/ui, Tailwind. See `CLAUDE.md` for the conventions.
## What we're building (high level)
The plan bundles 8 capabilities into one branch (`feat/berth-recommender`):
1. **/clients + /interests list-column fix** (the original bug — list views show `-` everywhere because the service didn't join contacts/yachts)
2. **Full NocoDB Berths import** + seeding + mooring-number normalization (current CRM has `A-01..E-18`; canonical is `A1..E18`)
3. **Schema refactor** to many-to-many `interest_berths` with role flags (`is_primary`, `is_specific_interest`, `is_in_eoi_bundle`)
4. **Berth recommender** (SQL ranking, tier ladder, heat scoring, UI panel) — no AI; pure SQL
5. **EOI bundle** support (multi-berth EOIs + range formatter for the Documenso PDF: `["A1","A2","A3","B5","B6"]` → `"A1-A3, B5-B6"`)
6. **Pluggable storage backend** (s3-compatible OR local filesystem) so admins can run without MinIO if they want
7. **Per-berth PDFs** (versioned uploads, OCR-based reverse parser, conflict-resolution diff dialog)
8. **Sales send-out emails** (berth PDF + brochure) with full audit + size-aware fallback to download links
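The range formatter in item 5 boils down to collapsing consecutive numbers per letter prefix; a sketch of the idea (not the shipped `formatBerthRange()` implementation):

```typescript
// Collapse canonical mooring numbers into per-letter consecutive runs,
// e.g. ["A1","A2","A3","B5","B6"] -> "A1-A3, B5-B6".
function formatBerthRange(moorings: string[]): string {
  const parsed = moorings
    .map((m) => /^([A-Z]+)(\d+)$/.exec(m))
    .filter((m): m is RegExpExecArray => m !== null)
    .map((m) => ({ letter: m[1], num: Number(m[2]) }))
    .sort((a, b) => a.letter.localeCompare(b.letter) || a.num - b.num);

  const parts: string[] = [];
  let start: { letter: string; num: number } | null = null;
  let prev: { letter: string; num: number } | null = null;

  const flush = () => {
    if (!start || !prev) return;
    parts.push(
      start.num === prev.num
        ? `${start.letter}${start.num}` // single berth, no range
        : `${start.letter}${start.num}-${prev.letter}${prev.num}`
    );
  };

  for (const b of parsed) {
    if (prev && b.letter === prev.letter && b.num === prev.num + 1) {
      prev = b; // extend the current consecutive run
    } else {
      flush(); // close the previous run and start a new one
      start = b;
      prev = b;
    }
  }
  flush();
  return parts.join(", ");
}
```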
## Phase ordering (from plan §2)
```
Phase 0: Full NocoDB berth import + mooring normalization + 5 new pricing columns
Phase 1: /clients + /interests list column fix
Phase 2: M:M interest_berths schema refactor + desired dimensions on interests
Phase 3: CRM /api/public/berths endpoint + website cutover
Phase 4: Recommender SQL + tier ladder + heat + UI panel
Phase 5: EOI bundle + range formatter
Phase 6a: Pluggable storage backend + migration CLI + admin UI
Phase 6b: Per-berth PDF storage (versioned) + reverse parser
Phase 7: Sales send-outs + brochure admin + email-from settings
Phase 8: CLAUDE.md updates + final validation
```
**Start with Phase 0**.
## Working tree state at handoff
- Branch: `main` (you'll create `feat/berth-recommender` from here)
- Recent commits (already pushed):
- `8699f81 chore(style): codebase em-dash sweep + minor layout polish`
- `d62822c fix(migration): NocoDB import safety + dedup helpers + lead-source backfill`
- `089f4a6 feat(receipts): upload guide page + scanner head-tag fix`
- `77ad10c feat(dashboard): custom date range + KPI port-hydration gate`
- `e598cc0 feat(layout): unified Inbox + UserMenu extraction`
- `f5772ce feat(analytics): Umami integration with per-port admin settings`
- `49d34e0 feat(website-intake): dual-write endpoint + migration chain repair`
- Untracked / uncommitted at handoff:
- `docs/berth-recommender-and-pdf-plan.md` (the plan — read this first)
- `docs/berth-feature-handoff-prompt.md` (this file)
- `berth_pdf_example/` (two reference files — see below)
- `.env.example` (modified — adds `WEBSITE_INTAKE_SECRET=`; pre-commit hook blocks `.env*` files so user adds this manually)
- Dev DB state:
- 245 clients (210 with no `nationality_iso` — Phase 1 backfills from primary phone's `value_country`)
- 4 test rows in `website_submissions` (from a previous live audit; safe to ignore)
- 90 berths with `mooring_number` in `A-01` format (Phase 0 normalizes to `A1`)
- vitest: 956 tests passing
- tsc: clean apart from one pre-existing issue in `scripts/smoke-test-redirect.ts` (unrelated to this work)
## Reference files
- `berth_pdf_example/Berth_Spec_Sheet_A1.pdf` (358 KB) — sample per-berth PDF. **0 AcroForm fields** (confirmed via pdf-lib) so OCR with positional heuristics is the primary parser tier; the AcroForm tier is built defensively. Plan §9.2 captures the layout structure.
- `berth_pdf_example/Port-Nimara-Brochure-March-2025_5nT92g.pdf` (10.26 MB) — sample brochure. Sized so it ships as an attachment under the 15 MB threshold. Plan §11.1 covers brochure handling.
## NocoDB access
You have `mcp__NocoDB_Base_-_Port_Nimara__*` tools available. Tables you'll touch most:
- `mczgos9hr3oa9qc` — Berths (Phase 0 imports from here; mooring numbers are stored as `A1..E18`)
- `mbs9hjauug4eseo` — Interests (the combined client+deal table the old system used)
## Branch & commit conventions
- Create the branch: `git checkout -b feat/berth-recommender`
- Commit messages match recent history style: `<type>(<scope>): <subject>` lowercase, terse subject, body explains why not what.
- **Pre-commit hook blocks any `.env*` file** including `.env.example`. If you need to update `.env.example`, leave it staged and tell the user to commit manually with `--no-verify` (they're aware of this).
- **Don't push without explicit user permission.** Commits are fine; pushes need approval.
- **Don't run `git rebase`, `git push --force`, or anything destructive without checking.** The branch is solo-owned but the repo's `main` is shared.
## User communication preferences (from prior session)
- Direct, no fluff. If something is a bad idea, say so rather than being sycophantic.
- When proposing changes, include trade-offs explicitly.
- For multi-question decisions, use `AskUserQuestion` rather than long bulleted lists.
- Run validation (vitest + tsc) at logical checkpoints. Don't ship a commit with regressions.
- The user prefers small focused commits over mega-commits. Within Phase 0 alone there will probably be 2-3 commits (e.g. mooring normalization, schema additions, NocoDB import script).
## Critical rules (from plan §14)
Eleven 🔴 critical items requiring tests before their phase ships:
1. NocoDB mooring collisions → unique constraint + ON CONFLICT
2. Non-PDF disguised upload → magic-byte check
3. Recipient email typos → pre-send confirmation
4. XSS in email body markdown → DOMPurify + payload tests
5. SMTP credentials silently failing → loud error + failed `document_sends` row
6. Wrong-environment `CRM_PUBLIC_URL` → health-check env match
7. Mooring format drift breaking `/berths/A1` URLs → Phase 0 normalization gates Phase 3
8. Multi-port isolation in recommender → explicit `port_id` filter + cross-port test
9. Permission escalation on SMTP creds → per-port admin only, no rep visibility
10. Filesystem backend in multi-node deployment → refuse to start; documented + health-check enforced
11. Path traversal via storage key in filesystem mode → strict regex validation + path realpath check
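Item 2's magic-byte check reduces to a prefix comparison; a minimal sketch (the real check streams the first bytes via the storage backend, this shows only the byte test):

```typescript
// A PDF upload is accepted only when its first bytes are the %PDF- magic,
// regardless of filename or Content-Type.
const PDF_MAGIC = Buffer.from("%PDF-");

function looksLikePdf(firstBytes: Buffer): boolean {
  return (
    firstBytes.length >= PDF_MAGIC.length &&
    firstBytes.subarray(0, PDF_MAGIC.length).equals(PDF_MAGIC)
  );
}
```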
## Pending items (from plan §9)
These are non-blocking but worth knowing:
- Sample brochure already provided (the 10.26 MB file above).
- SMTP app password for `sales@portnimara.com` — not yet obtained; expected close to production cutover. Phase 7 ships the admin UI immediately and the credential gets entered when available.
- `CRM_PUBLIC_URL` confirmed as `https://crm.portnimara.com` once live; configurable via env.
- GDPR cascade behavior for `document_sends` (delete vs. anonymize-PII vs. keep) — left `OPEN` in §14.10, default lean: anonymize-PII. Revisit when Phase 7 schema lands.
## Scope reminder
- **No prod data depends on the current CRM schema** — refactors don't need backwards-compatibility shims. But every schema change still ships as a Drizzle migration with `pnpm db:generate`.
- **Pluggable storage** rejects Postgres `bytea` as an option (§4.7a). The two backends are s3-compatible (MinIO/AWS/B2/R2/etc.) and local filesystem. Filesystem is single-node only.
## What to do first
1. Read `docs/berth-recommender-and-pdf-plan.md` end-to-end. Don't skim. The edge-case audit in §14 alone is critical context.
2. Confirm you've understood the plan by stating back the Phase 0-8 outline and the 11 critical items, then ask the user if they want to proceed with Phase 0.
3. Once approved, create `feat/berth-recommender` and start Phase 0.
Phase 0 deliverables (per plan):
- One commit normalizing existing CRM mooring numbers from `A-01` to `A1` form (via `regexp_replace` migration). Delete the offending `scripts/load-berths-to-port-nimara.ts`.
- One commit adding the 5 new pricing columns (`weekly_rate_high_usd`, `weekly_rate_low_usd`, `daily_rate_high_usd`, `daily_rate_low_usd`, `pricing_valid_until`) plus `last_imported_at`. Run `pnpm db:generate`. Verify `meta/_journal.json` prevId chain stays contiguous.
- One commit adding `scripts/import-berths-from-nocodb.ts` — the idempotent NocoDB import (handles updates, preserves CRM-side edits via `last_imported_at vs updated_at` check, `pg_advisory_lock`, dry-run flag, etc. per §4.1 and §14.1).
- Update `src/lib/db/seed-data.ts` with the imported berth set so fresh installs get them.
- Final vitest + tsc validation at the end of Phase 0.
## Don't
- Don't push to remote during this session (user will batch the push later).
- Don't commit `.env*` files (hook blocks them anyway).
- Don't edit `.gitignore` to exclude generated artifacts; the repo's existing ignores are correct.
- Don't add documentation files unless the plan asks for them — the plan itself is the doc.
- Don't add features not in the plan. If something seems missing, ask.
- Don't use AI for the recommender (plan §1 + §13). Pure SQL ranking.
Once you've read the plan and confirmed understanding, ask me whether to proceed with Phase 0.

File diff suppressed because it is too large.

docs/eoi-documenso-field-mapping.md

@@ -19,19 +19,24 @@ The template exposes eight text fields (`formValues` keys) and two boolean check
## Field mapping
The legacy template (Documenso template `8`, configured in production) auto-fills exactly the fields below. All nine text fields (the eight legacy ones plus the new `Berth Range`) + two booleans are populated by `buildDocumensoPayload()` from the resolved `EoiContext`. Anything else on the form (signature, date, terms acknowledgment) is filled in by the client inside Documenso.
| Documenso key | Type | Legacy source | New `EoiContext` path | Notes |
| -------------- | ------- | --------------------------- | ----------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `Name` | text | `interest['Full Name']` | `context.client.fullName` | The interest's point-of-contact client (billing signer). |
| `Email` | text | `interest['Email Address']` | `context.client.primaryEmail` | Primary email contact from `client_contacts`. |
| `Address` | text | `interest['Address']` | concat `context.client.address.{street,city,country}` | Concatenate street, city, country with `', '`. Empty if address is null. |
| `Yacht Name` | text | `interest['Yacht Name']` | `context.yacht.name` | Yacht is now a first-class row; pulled via `interest.yachtId`. Empty string when no yacht is linked yet. |
| `Length` | text | `interest['Length']` | `context.yacht.lengthFt` | Boat dimension. Send as string. Documenso doesn't enforce numeric format. Empty string when not applicable. |
| `Width` | text | `interest['Width']` | `context.yacht.widthFt` | Same. |
| `Draft` | text | `interest['Depth']` | `context.yacht.draftFt` | Legacy field was named "Depth" in NocoDB; Documenso key is "Draft". |
| `Berth Number` | text | `berthNumbers` (joined) | `context.berth.mooringNumber` | The interest's PRIMARY berth (resolved via `interest_berths.is_primary=true`). Empty string when no primary set. |
| `Berth Range` | text | (new) | `context.eoiBerthRange` | **NEW IN PHASE 5** — compact range string for multi-berth EOIs (e.g. `"A1-A3, B5-B7"`) covering every junction row marked `is_in_eoi_bundle=true`. Empty string when the bundle is empty. **The live Documenso template (id `8`) does NOT yet have this field. Add a `Berth Range` text field to the template before multi-berth EOIs render the range; until then Documenso silently drops the value and only `Berth Number` (the primary mooring) renders.** |
| `Lease_10` | boolean | hardcoded `false` | `false` | Hardcoded — legacy flow defaults to Purchase (not Lease). |
| `Purchase` | boolean | hardcoded `true` | `true` | Hardcoded — legacy flow defaults to Purchase. |
**Backwards-compatibility guarantee**: every legacy `formValues` key is still emitted with the same name and type. The only addition is `Berth Range` (Phase 5). Documenso silently ignores unknown formValues keys, so old templates that don't have `Berth Range` will simply not render it — single-berth EOIs continue to work identically. No template changes are required for legacy use.
## Document `meta` fields (non-`formValues`)
| Documenso key | Type | Legacy source | New source |

package.json

@@ -52,6 +52,7 @@
"@tanstack/react-query": "^5.62.0",
"@tanstack/react-query-devtools": "^5.62.0",
"@tanstack/react-table": "^8.21.3",
"@types/pdfkit": "^0.17.6",
"archiver": "^7.0.1",
"better-auth": "^1.2.0",
"bullmq": "^5.25.0",
@@ -73,6 +74,7 @@
"nodemailer": "^6.9.0",
"openai": "^6.27.0",
"pdf-lib": "^1.17.1",
"pdfkit": "^0.18.0",
"pino": "^9.5.0",
"pino-pretty": "^13.0.0",
"postgres": "^3.4.0",
@@ -81,6 +83,7 @@
"react-dom": "^19.0.0",
"react-hook-form": "^7.54.0",
"recharts": "^3.8.0",
"sharp": "^0.34.5",
"socket.io": "^4.8.0",
"socket.io-client": "^4.8.0",
"sonner": "^1.7.0",

pnpm-lock.yaml

@@ -101,6 +101,9 @@ importers:
'@tanstack/react-table':
specifier: ^8.21.3
version: 8.21.3(react-dom@19.2.4(react@19.2.4))(react@19.2.4)
'@types/pdfkit':
specifier: ^0.17.6
version: 0.17.6
archiver:
specifier: ^7.0.1
version: 7.0.1
@@ -164,6 +167,9 @@ importers:
pdf-lib:
specifier: ^1.17.1
version: 1.17.1
pdfkit:
specifier: ^0.18.0
version: 0.18.0
pino:
specifier: ^9.5.0
version: 9.14.0
@@ -188,6 +194,9 @@ importers:
recharts:
specifier: ^3.8.0
version: 3.8.0(@types/react@19.2.14)(react-dom@19.2.4(react@19.2.4))(react-is@18.3.1)(react@19.2.4)(redux@5.0.1)
sharp:
specifier: ^0.34.5
version: 0.34.5
socket.io:
specifier: ^4.8.0
version: 4.8.3
@@ -1153,64 +1162,138 @@ packages:
resolution: {integrity: sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==}
engines: {node: '>=18.18'}
'@img/colour@1.1.0':
resolution: {integrity: sha512-Td76q7j57o/tLVdgS746cYARfSyxk8iEfRxewL9h4OMzYhbW4TAcppl0mT4eyqXddh6L/jwoM75mo7ixa/pCeQ==}
engines: {node: '>=18'}
'@img/sharp-darwin-arm64@0.33.5':
resolution: {integrity: sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [darwin]
'@img/sharp-darwin-arm64@0.34.5':
resolution: {integrity: sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [darwin]
'@img/sharp-darwin-x64@0.33.5':
resolution: {integrity: sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [darwin]
'@img/sharp-darwin-x64@0.34.5':
resolution: {integrity: sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [darwin]
'@img/sharp-libvips-darwin-arm64@1.0.4':
resolution: {integrity: sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg==}
cpu: [arm64]
os: [darwin]
'@img/sharp-libvips-darwin-arm64@1.2.4':
resolution: {integrity: sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==}
cpu: [arm64]
os: [darwin]
'@img/sharp-libvips-darwin-x64@1.0.4':
resolution: {integrity: sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ==}
cpu: [x64]
os: [darwin]
'@img/sharp-libvips-darwin-x64@1.2.4':
resolution: {integrity: sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==}
cpu: [x64]
os: [darwin]
'@img/sharp-libvips-linux-arm64@1.0.4':
resolution: {integrity: sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA==}
cpu: [arm64]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-arm64@1.2.4':
resolution: {integrity: sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==}
cpu: [arm64]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-arm@1.0.5':
resolution: {integrity: sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g==}
cpu: [arm]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-arm@1.2.4':
resolution: {integrity: sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==}
cpu: [arm]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-ppc64@1.2.4':
resolution: {integrity: sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==}
cpu: [ppc64]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-riscv64@1.2.4':
resolution: {integrity: sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==}
cpu: [riscv64]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-s390x@1.0.4':
resolution: {integrity: sha512-u7Wz6ntiSSgGSGcjZ55im6uvTrOxSIS8/dgoVMoiGE9I6JAfU50yH5BoDlYA1tcuGS7g/QNtetJnxA6QEsCVTA==}
cpu: [s390x]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-s390x@1.2.4':
resolution: {integrity: sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==}
cpu: [s390x]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-x64@1.0.4':
resolution: {integrity: sha512-MmWmQ3iPFZr0Iev+BAgVMb3ZyC4KeFc3jFxnNbEPas60e1cIfevbtuyf9nDGIzOaW9PdnDciJm+wFFaTlj5xYw==}
cpu: [x64]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linux-x64@1.2.4':
resolution: {integrity: sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==}
cpu: [x64]
os: [linux]
libc: [glibc]
'@img/sharp-libvips-linuxmusl-arm64@1.0.4':
resolution: {integrity: sha512-9Ti+BbTYDcsbp4wfYib8Ctm1ilkugkA/uscUn6UXK1ldpC1JjiXbLfFZtRlBhjPZ5o1NCLiDbg8fhUPKStHoTA==}
cpu: [arm64]
os: [linux]
libc: [musl]
'@img/sharp-libvips-linuxmusl-arm64@1.2.4':
resolution: {integrity: sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==}
cpu: [arm64]
os: [linux]
libc: [musl]
'@img/sharp-libvips-linuxmusl-x64@1.0.4':
resolution: {integrity: sha512-viYN1KX9m+/hGkJtvYYp+CCLgnJXwiQB39damAO7WMdKWlIhmYTfHjwSbQeUK/20vY154mwezd9HflVFM1wVSw==}
cpu: [x64]
os: [linux]
libc: [musl]
'@img/sharp-libvips-linuxmusl-x64@1.2.4':
resolution: {integrity: sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==}
cpu: [x64]
os: [linux]
libc: [musl]
'@img/sharp-linux-arm64@0.33.5':
resolution: {integrity: sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1218,6 +1301,13 @@ packages:
os: [linux]
libc: [glibc]
'@img/sharp-linux-arm64@0.34.5':
resolution: {integrity: sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [linux]
libc: [glibc]
'@img/sharp-linux-arm@0.33.5':
resolution: {integrity: sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1225,6 +1315,27 @@ packages:
os: [linux]
libc: [glibc]
'@img/sharp-linux-arm@0.34.5':
resolution: {integrity: sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm]
os: [linux]
libc: [glibc]
'@img/sharp-linux-ppc64@0.34.5':
resolution: {integrity: sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [ppc64]
os: [linux]
libc: [glibc]
'@img/sharp-linux-riscv64@0.34.5':
resolution: {integrity: sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [riscv64]
os: [linux]
libc: [glibc]
'@img/sharp-linux-s390x@0.33.5':
resolution: {integrity: sha512-y/5PCd+mP4CA/sPDKl2961b+C9d+vPAveS33s6Z3zfASk2j5upL6fXVPZi7ztePZ5CuH+1kW8JtvxgbuXHRa4Q==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1232,6 +1343,13 @@ packages:
os: [linux]
libc: [glibc]
'@img/sharp-linux-s390x@0.34.5':
resolution: {integrity: sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [s390x]
os: [linux]
libc: [glibc]
'@img/sharp-linux-x64@0.33.5':
resolution: {integrity: sha512-opC+Ok5pRNAzuvq1AG0ar+1owsu842/Ab+4qvU879ippJBHvyY5n2mxF1izXqkPYlGuP/M556uh53jRLJmzTWA==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1239,6 +1357,13 @@ packages:
os: [linux]
libc: [glibc]
'@img/sharp-linux-x64@0.34.5':
resolution: {integrity: sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [linux]
libc: [glibc]
'@img/sharp-linuxmusl-arm64@0.33.5':
resolution: {integrity: sha512-XrHMZwGQGvJg2V/oRSUfSAfjfPxO+4DkiRh6p2AFjLQztWUuY/o8Mq0eMQVIY7HJ1CDQUJlxGGZRw1a5bqmd1g==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1246,6 +1371,13 @@ packages:
os: [linux]
libc: [musl]
'@img/sharp-linuxmusl-arm64@0.34.5':
resolution: {integrity: sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [linux]
libc: [musl]
'@img/sharp-linuxmusl-x64@0.33.5':
resolution: {integrity: sha512-WT+d/cgqKkkKySYmqoZ8y3pxx7lx9vVejxW/W4DOFMYVSkErR+w7mf2u8m/y4+xHe7yY9DAXQMWQhpnMuFfScw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1253,23 +1385,53 @@ packages:
os: [linux]
libc: [musl]
'@img/sharp-linuxmusl-x64@0.34.5':
resolution: {integrity: sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [linux]
libc: [musl]
'@img/sharp-wasm32@0.33.5':
resolution: {integrity: sha512-ykUW4LVGaMcU9lu9thv85CbRMAwfeadCJHRsg2GmeRa/cJxsVY9Rbd57JcMxBkKHag5U/x7TSBpScF4U8ElVzg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [wasm32]
'@img/sharp-wasm32@0.34.5':
resolution: {integrity: sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [wasm32]
'@img/sharp-win32-arm64@0.34.5':
resolution: {integrity: sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [win32]
'@img/sharp-win32-ia32@0.33.5':
resolution: {integrity: sha512-T36PblLaTwuVJ/zw/LaH0PdZkRz5rd3SmMHX8GSmR7vtNSP5Z6bQkExdSK7xGWyxLw4sUknBuugTelgw2faBbQ==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [ia32]
os: [win32]
'@img/sharp-win32-ia32@0.34.5':
resolution: {integrity: sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [ia32]
os: [win32]
'@img/sharp-win32-x64@0.33.5':
resolution: {integrity: sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [win32]
'@img/sharp-win32-x64@0.34.5':
resolution: {integrity: sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [win32]
'@ioredis/commands@1.5.0':
resolution: {integrity: sha512-eUgLqrMf8nJkZxT24JvVRrQya1vZkQh8BBeYNwGDqa5I0VUi8ACx7uFvAaLxintokpTenkK6DASvo/bvNbBGow==}
@@ -1390,10 +1552,18 @@ packages:
cpu: [x64]
os: [win32]
'@noble/ciphers@1.3.0':
resolution: {integrity: sha512-2I0gnIVPtfnMw9ee9h1dJG7tp81+8Ob3OJb3Mv37rx5L40/b0i7djjCVvGOVqc9AEIQyvyu1i6ypKdFw8R8gQw==}
engines: {node: ^14.21.3 || >=16}
'@noble/ciphers@2.1.1':
resolution: {integrity: sha512-bysYuiVfhxNJuldNXlFEitTVdNnYUc+XNJZd7Qm2a5j1vZHgY+fazadNFWFaMK/2vye0JVlxV3gHmC0WDfAOQw==}
engines: {node: '>= 20.19.0'}
'@noble/hashes@1.8.0':
resolution: {integrity: sha512-jCs9ldd7NwzpgXDIf6P3+NrHh9/sD6CQdxHyjQI+h/6rDNo88ypBxxz45UDuZHz9r3tNz7N/VInSVoVdtXEI4A==}
engines: {node: ^14.21.3 || >=16}
'@noble/hashes@2.0.1':
resolution: {integrity: sha512-XlOlEbQcE9fmuXxrVTXCTlG2nlRXa9Rj3rr5Ue/+tX+nmkgbX720YHh0VR3hBF9xDvwnb8D2shVGOwNx+ulArw==}
engines: {node: '>= 20.19.0'}
@@ -2328,6 +2498,9 @@ packages:
'@types/nodemailer@6.4.23':
resolution: {integrity: sha512-aFV3/NsYFLSx9mbb5gtirBSXJnAlrusoKNuPbxsASWc7vrKLmIrTQRpdcxNcSFL3VW2A2XpeLEavwb2qMi6nlQ==}
'@types/pdfkit@0.17.6':
resolution: {integrity: sha512-tIwzxk2uWKp0Cq9JIluQXJid77lYhF52EsIOwhsMF4iWLA6YneoBR1xVKYYdAysHuepUB0OX4tdwMiUDdGKmig==}
'@types/react-dom@19.2.3':
resolution: {integrity: sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ==}
peerDependencies:
@@ -2776,6 +2949,10 @@ packages:
bare-url@2.4.2:
resolution: {integrity: sha512-/9a2j4ac6ckpmAHvod/ob7x439OAHst/drc2Clnq+reRYd/ovddwcF4LfoxHyNk5AuGBnPg+HqFjmE/Zpq6v0A==}
base64-js@0.0.8:
resolution: {integrity: sha512-3XSA2cR/h/73EzlXXdU6YNycmYI7+kicTxks4eJg2g39biHR84slg2+des+p7iHYhbRg/udIS4TD53WabcOUkw==}
engines: {node: '>= 0.4'}
base64-js@1.5.1:
resolution: {integrity: sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==}
@@ -2893,6 +3070,9 @@ packages:
browser-or-node@2.1.1:
resolution: {integrity: sha512-8CVjaLJGuSKMVTxJ2DpBl5XnlNDiT4cQFeuCJJrvJmts9YrTZDizTX7PjC2s6W4x+MBGZeEY6dGMrF04/6Hgqg==}
browserify-zlib@0.2.0:
resolution: {integrity: sha512-Z942RysHXmJrhqk88FmKBVq/v5tqmSkDz7p54G/MGyjMnCFFnC79XWNbg+Vta8W6Wb2qtSZTSxIGkJrRpCFEiA==}
browserslist@4.28.1:
resolution: {integrity: sha512-ZC5Bd0LgJXgwGqUknZY/vkUQ04r8NXnJZ3yYi4vDmSiZmC/pdSN0NbNRPxZpbtO4uAfDUAFffO8IZoM3Gj8IkA==}
engines: {node: ^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7}
@@ -4134,6 +4314,9 @@ packages:
resolution: {integrity: sha512-cEiJEAEoIbWfCZYKWhVwFuvPX1gETRYPw6LlaTKoxD3s2AkXzkCjnp6h0V77ozyqj0jakteJ4YqDJT830+lVGw==}
engines: {node: '>=14'}
js-md5@0.8.3:
resolution: {integrity: sha512-qR0HB5uP6wCuRMrWPTrkMaev7MJZwJuuw4fnwAzRgP4J4/F8RwtodOKpGp4XpqsLBFzzgqIO42efFAyz2Et6KQ==}
js-tokens@10.0.0:
resolution: {integrity: sha512-lM/UBzQmfJRo9ABXbPWemivdCW8V2G8FHaHdypQaIy523snUjog0W71ayWXTjiR+ixeMyVHN2XcpnTd/liPg/Q==}
@@ -4286,6 +4469,9 @@ packages:
resolution: {integrity: sha512-/vlFKAoH5Cgt3Ie+JLhRbwOsCQePABiU3tJ1egGvyQ+33R/vcwM2Zl2QR/LzjsBeItPt3oSVXapn+m4nQDvpzw==}
engines: {node: '>=14'}
linebreak@1.1.0:
resolution: {integrity: sha512-MHp03UImeVhB7XZtjd0E4n6+3xr5Dq/9xI/5FptGk5FrbDR3zagPa2DS6U8ks/3HjbKWG9Q1M2ufOzxV2qLYSQ==}
lines-and-columns@1.2.4:
resolution: {integrity: sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==}
@@ -4696,6 +4882,9 @@ packages:
pdf-lib@1.17.1:
resolution: {integrity: sha512-V/mpyJAoTsN4cnP31vc0wfNA1+p20evqqnap0KLoRUN0Yk/p3wN52DOEsL4oBFcLdb76hlpKPtzJIgo67j/XLw==}
pdfkit@0.18.0:
resolution: {integrity: sha512-NvUwSDZ0eYEzqAiWwVQkRkjYUkZ48kcsHuCO31ykqPPIVkwoSDjDGiwIgHHNtsiwls3z3P/zy4q00hl2chg2Ug==}
peberminta@0.9.0:
resolution: {integrity: sha512-XIxfHpEuSJbITd1H3EeQwpcZbTLHc+VVr8ANI9t5sit565tsI4/xK3KWTUFE2e6QiangUkh3B0jihzmGnNrRsQ==}
@@ -4757,6 +4946,9 @@ packages:
engines: {node: '>=18'}
hasBin: true
png-js@1.1.0:
resolution: {integrity: sha512-PM/uYGzGdNSzqeOgly68+6wKQDL1SY0a/N+OEa/+br6LnHWOAJB0Npiamnodfq3jd2LS/i2fMeOKSAILjA+m5Q==}
possible-typed-array-names@1.1.0:
resolution: {integrity: sha512-/+5VFTchJDoVj3bhoqi6UeymcD00DAwb1nJwamzPvHEszJ4FpF6SNNbUbOS8yI56qHzdV8eK0qEfOSiodkTdxg==}
engines: {node: '>= 0.4'}
@@ -5386,6 +5578,10 @@ packages:
resolution: {integrity: sha512-haPVm1EkS9pgvHrQ/F3Xy+hgcuMV0Wm9vfIBSiwZ05k+xgb0PkBQpGsAA/oWdDobNaZTH5ppvHtzCFbnSEwHVw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
sharp@0.34.5:
resolution: {integrity: sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
shebang-command@2.0.0:
resolution: {integrity: sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==}
engines: {node: '>=8'}
@@ -6648,81 +6844,177 @@ snapshots:
'@humanwhocodes/retry@0.4.3': {}
'@img/colour@1.1.0': {}
'@img/sharp-darwin-arm64@0.33.5':
optionalDependencies:
'@img/sharp-libvips-darwin-arm64': 1.0.4
optional: true
'@img/sharp-darwin-arm64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-darwin-arm64': 1.2.4
optional: true
'@img/sharp-darwin-x64@0.33.5':
optionalDependencies:
'@img/sharp-libvips-darwin-x64': 1.0.4
optional: true
'@img/sharp-darwin-x64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-darwin-x64': 1.2.4
optional: true
'@img/sharp-libvips-darwin-arm64@1.0.4':
optional: true
'@img/sharp-libvips-darwin-arm64@1.2.4':
optional: true
'@img/sharp-libvips-darwin-x64@1.0.4':
optional: true
'@img/sharp-libvips-darwin-x64@1.2.4':
optional: true
'@img/sharp-libvips-linux-arm64@1.0.4':
optional: true
'@img/sharp-libvips-linux-arm64@1.2.4':
optional: true
'@img/sharp-libvips-linux-arm@1.0.5':
optional: true
'@img/sharp-libvips-linux-arm@1.2.4':
optional: true
'@img/sharp-libvips-linux-ppc64@1.2.4':
optional: true
'@img/sharp-libvips-linux-riscv64@1.2.4':
optional: true
'@img/sharp-libvips-linux-s390x@1.0.4':
optional: true
'@img/sharp-libvips-linux-s390x@1.2.4':
optional: true
'@img/sharp-libvips-linux-x64@1.0.4':
optional: true
'@img/sharp-libvips-linux-x64@1.2.4':
optional: true
'@img/sharp-libvips-linuxmusl-arm64@1.0.4':
optional: true
'@img/sharp-libvips-linuxmusl-arm64@1.2.4':
optional: true
'@img/sharp-libvips-linuxmusl-x64@1.0.4':
optional: true
'@img/sharp-libvips-linuxmusl-x64@1.2.4':
optional: true
'@img/sharp-linux-arm64@0.33.5':
optionalDependencies:
'@img/sharp-libvips-linux-arm64': 1.0.4
optional: true
'@img/sharp-linux-arm64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-arm64': 1.2.4
optional: true
'@img/sharp-linux-arm@0.33.5':
optionalDependencies:
'@img/sharp-libvips-linux-arm': 1.0.5
optional: true
'@img/sharp-linux-arm@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-arm': 1.2.4
optional: true
'@img/sharp-linux-ppc64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-ppc64': 1.2.4
optional: true
'@img/sharp-linux-riscv64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-riscv64': 1.2.4
optional: true
'@img/sharp-linux-s390x@0.33.5':
optionalDependencies:
'@img/sharp-libvips-linux-s390x': 1.0.4
optional: true
'@img/sharp-linux-s390x@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-s390x': 1.2.4
optional: true
'@img/sharp-linux-x64@0.33.5':
optionalDependencies:
'@img/sharp-libvips-linux-x64': 1.0.4
optional: true
'@img/sharp-linux-x64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-x64': 1.2.4
optional: true
'@img/sharp-linuxmusl-arm64@0.33.5':
optionalDependencies:
'@img/sharp-libvips-linuxmusl-arm64': 1.0.4
optional: true
'@img/sharp-linuxmusl-arm64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linuxmusl-arm64': 1.2.4
optional: true
'@img/sharp-linuxmusl-x64@0.33.5':
optionalDependencies:
'@img/sharp-libvips-linuxmusl-x64': 1.0.4
optional: true
'@img/sharp-linuxmusl-x64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linuxmusl-x64': 1.2.4
optional: true
'@img/sharp-wasm32@0.33.5':
dependencies:
'@emnapi/runtime': 1.9.0
optional: true
'@img/sharp-wasm32@0.34.5':
dependencies:
'@emnapi/runtime': 1.9.0
optional: true
'@img/sharp-win32-arm64@0.34.5':
optional: true
'@img/sharp-win32-ia32@0.33.5':
optional: true
'@img/sharp-win32-ia32@0.34.5':
optional: true
'@img/sharp-win32-x64@0.33.5':
optional: true
'@img/sharp-win32-x64@0.34.5':
optional: true
'@ioredis/commands@1.5.0': {}
'@ioredis/commands@1.5.1': {}
@@ -6816,8 +7108,12 @@ snapshots:
'@next/swc-win32-x64-msvc@15.1.0':
optional: true
'@noble/ciphers@1.3.0': {}
'@noble/ciphers@2.1.1': {}
'@noble/hashes@1.8.0': {}
'@noble/hashes@2.0.1': {}
'@nodelib/fs.scandir@2.1.5':
@@ -7747,6 +8043,10 @@ snapshots:
dependencies:
'@types/node': 22.19.15
'@types/pdfkit@0.17.6':
dependencies:
'@types/node': 22.19.15
'@types/react-dom@19.2.3(@types/react@19.2.14)':
dependencies:
'@types/react': 19.2.14
@@ -8283,6 +8583,8 @@ snapshots:
dependencies:
bare-path: 3.0.0
base64-js@0.0.8: {}
base64-js@1.5.1: {}
base64id@2.0.0: {}
@@ -8363,6 +8665,10 @@ snapshots:
browser-or-node@2.1.1: {}
browserify-zlib@0.2.0:
dependencies:
pako: 1.0.11
browserslist@4.28.1:
dependencies:
baseline-browser-mapping: 2.10.8
@@ -9774,6 +10080,8 @@ snapshots:
js-cookie@3.0.5: {}
js-md5@0.8.3: {}
js-tokens@10.0.0: {}
js-tokens@4.0.0: {}
@@ -9894,6 +10202,11 @@ snapshots:
lilconfig@3.1.3: {}
linebreak@1.1.0:
dependencies:
base64-js: 0.0.8
unicode-trie: 2.0.0
lines-and-columns@1.2.4: {}
linkify-it@5.0.0:
@@ -10308,6 +10621,15 @@ snapshots:
pako: 1.0.11
tslib: 1.14.1
pdfkit@0.18.0:
dependencies:
'@noble/ciphers': 1.3.0
'@noble/hashes': 1.8.0
fontkit: 2.0.4
js-md5: 0.8.3
linebreak: 1.1.0
png-js: 1.1.0
peberminta@0.9.0: {}
performance-now@2.1.0: {}
@@ -10386,6 +10708,10 @@ snapshots:
optionalDependencies:
fsevents: 2.3.2
png-js@1.1.0:
dependencies:
browserify-zlib: 0.2.0
possible-typed-array-names@1.1.0: {}
postcss-import@15.1.0(postcss@8.5.8):
@@ -11175,6 +11501,37 @@ snapshots:
'@img/sharp-win32-x64': 0.33.5
optional: true
sharp@0.34.5:
dependencies:
'@img/colour': 1.1.0
detect-libc: 2.1.2
semver: 7.7.4
optionalDependencies:
'@img/sharp-darwin-arm64': 0.34.5
'@img/sharp-darwin-x64': 0.34.5
'@img/sharp-libvips-darwin-arm64': 1.2.4
'@img/sharp-libvips-darwin-x64': 1.2.4
'@img/sharp-libvips-linux-arm': 1.2.4
'@img/sharp-libvips-linux-arm64': 1.2.4
'@img/sharp-libvips-linux-ppc64': 1.2.4
'@img/sharp-libvips-linux-riscv64': 1.2.4
'@img/sharp-libvips-linux-s390x': 1.2.4
'@img/sharp-libvips-linux-x64': 1.2.4
'@img/sharp-libvips-linuxmusl-arm64': 1.2.4
'@img/sharp-libvips-linuxmusl-x64': 1.2.4
'@img/sharp-linux-arm': 0.34.5
'@img/sharp-linux-arm64': 0.34.5
'@img/sharp-linux-ppc64': 0.34.5
'@img/sharp-linux-riscv64': 0.34.5
'@img/sharp-linux-s390x': 0.34.5
'@img/sharp-linux-x64': 0.34.5
'@img/sharp-linuxmusl-arm64': 0.34.5
'@img/sharp-linuxmusl-x64': 0.34.5
'@img/sharp-wasm32': 0.34.5
'@img/sharp-win32-arm64': 0.34.5
'@img/sharp-win32-ia32': 0.34.5
'@img/sharp-win32-x64': 0.34.5
shebang-command@2.0.0:
dependencies:
shebang-regex: 3.0.0

View File

@@ -0,0 +1,52 @@
/**
* Dev-only smoke check for the berth recommender. Resolves the first
* port-nimara interest (with desired dims set) and prints the top-N
* recommendations.
*
* pnpm tsx scripts/dev-recommender-smoke.ts
*/
import 'dotenv/config';
import { eq, isNotNull, and } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { interests } from '@/lib/db/schema/interests';
import { recommendBerths } from '@/lib/services/berth-recommender.service';
async function main() {
const [port] = await db
.select({ id: ports.id })
.from(ports)
.where(eq(ports.slug, 'port-nimara'))
.limit(1);
if (!port) throw new Error('port-nimara not found');
const [interest] = await db
.select({ id: interests.id })
.from(interests)
.where(and(eq(interests.portId, port.id), isNotNull(interests.desiredLengthFt)))
.limit(1);
if (!interest) throw new Error('No interest with desired dims set');
console.log(`> Recommending berths for interest ${interest.id} on port ${port.id}`);
const recs = await recommendBerths({
interestId: interest.id,
portId: port.id,
});
console.log(`> ${recs.length} recommendations:`);
for (const r of recs) {
console.log(
` ${r.mooringNumber.padEnd(5)} tier=${r.tier} fit=${r.fitScore} ` +
`${r.lengthFt}×${r.widthFt}×${r.draftFt} ft buf=${r.sizeBufferPct}% ` +
`${r.reasons.dimensional}; ${r.reasons.pipeline}`,
);
}
}
main()
.then(() => process.exit(0))
.catch((err) => {
console.error(err);
process.exit(1);
});

View File

@@ -0,0 +1,409 @@
/**
* Idempotent NocoDB Berths → CRM `berths` import.
*
* Re-running picks up NocoDB additions/edits without clobbering CRM-side
* overrides: rows where `updated_at > last_imported_at` are treated as
* human-edited and skipped (use `--force` to override). Map Data JSON
* is validated and upserted into `berth_map_data` as a separate step.
*
* Usage:
* pnpm tsx scripts/import-berths-from-nocodb.ts --dry-run [--port-slug port-nimara]
* pnpm tsx scripts/import-berths-from-nocodb.ts --apply [--port-slug port-nimara]
* pnpm tsx scripts/import-berths-from-nocodb.ts --apply --force
* pnpm tsx scripts/import-berths-from-nocodb.ts --apply --update-snapshot
*
* Edge cases mitigated (see plan §14.1):
* - Mooring collisions : unique (port_id, mooring_number) on the table.
* - Concurrent runs : pg_advisory_xact_lock on a stable key.
* - Numeric-with-units : parseDecimalWithUnit() strips trailing units.
* - Metric drift : NocoDB metric formula columns are ignored;
* metric values are recomputed from imperial.
* - Map Data shape : zod-validated; failures are skipped silently
* rather than aborting the whole import.
* - Status enum : NocoDB display strings → CRM snake_case.
* - NocoDB row deleted : reported as "orphaned in CRM"; not auto-deleted.
*/
import 'dotenv/config';
import { eq, sql } from 'drizzle-orm';
import { promises as fs } from 'node:fs';
import path from 'node:path';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { berths, berthMapData } from '@/lib/db/schema/berths';
import { fetchAllRows, loadNocoDbConfig, NOCO_TABLES } from '@/lib/dedup/nocodb-source';
import {
buildPlan,
mapRow,
type Action,
type ImportedBerth,
type PlanEntry,
type ExistingBerthRow,
} from '@/lib/services/berth-import';
// ─── CLI ────────────────────────────────────────────────────────────────────
interface CliArgs {
dryRun: boolean;
apply: boolean;
portSlug: string;
force: boolean;
updateSnapshot: boolean;
}
function parseArgs(argv: string[]): CliArgs {
const args: CliArgs = {
dryRun: false,
apply: false,
portSlug: 'port-nimara',
force: false,
updateSnapshot: false,
};
for (let i = 0; i < argv.length; i += 1) {
const a = argv[i]!;
if (a === '--dry-run') args.dryRun = true;
else if (a === '--apply') args.apply = true;
else if (a === '--port-slug') args.portSlug = argv[++i] ?? 'port-nimara';
else if (a === '--force') args.force = true;
else if (a === '--update-snapshot') args.updateSnapshot = true;
else if (a === '-h' || a === '--help') {
printHelp();
process.exit(0);
} else {
console.error(`Unknown argument: ${a}`);
printHelp();
process.exit(1);
}
}
if (args.dryRun === args.apply) {
console.error('Specify exactly one of --dry-run or --apply.');
printHelp();
process.exit(1);
}
return args;
}
function printHelp(): void {
console.log(`Usage:
pnpm tsx scripts/import-berths-from-nocodb.ts --dry-run [--port-slug <slug>]
pnpm tsx scripts/import-berths-from-nocodb.ts --apply [--port-slug <slug>] [--force] [--update-snapshot]
Flags:
--dry-run Read NocoDB + diff vs CRM. No writes.
--apply Apply the plan to the DB.
--port-slug <slug> Target port slug (default: port-nimara).
--force Overwrite rows where CRM updated_at > last_imported_at.
--update-snapshot Rewrite src/lib/db/seed-data/berths.json after apply.
-h, --help Show this help.
`);
}
// ─── Stable advisory lock key ───────────────────────────────────────────────
// 64-bit BIGINT; the first 4 bytes spell "BRTH" so the key is grep-able in pg_locks.
const BERTH_IMPORT_LOCK_KEY = 0x4252544800000001n;
// ─── Apply ──────────────────────────────────────────────────────────────────
interface ApplyResult {
inserted: number;
updated: number;
skipped: number;
mapDataWritten: number;
warnings: string[];
}
async function apply(
portId: string,
plan: PlanEntry[],
orphans: ExistingBerthRow[],
importedAt: Date,
): Promise<ApplyResult> {
const result: ApplyResult = {
inserted: 0,
updated: 0,
skipped: 0,
mapDataWritten: 0,
warnings: [],
};
for (const orphan of orphans) {
result.warnings.push(
`Orphan: CRM has mooring="${orphan.mooringNumber}" but NocoDB no longer does (id=${orphan.id})`,
);
}
await db.transaction(async (tx) => {
// Stable lock so two simultaneous --apply runs serialize.
await tx.execute(sql`SELECT pg_advisory_xact_lock(${BERTH_IMPORT_LOCK_KEY})`);
for (const entry of plan) {
if (entry.action === 'skip-edited' || entry.action === 'noop') {
result.skipped += 1;
result.warnings.push(`Skipped ${entry.imported.mooringNumber}: ${entry.reason ?? 'no-op'}`);
continue;
}
const i = entry.imported;
const n = i.numerics;
const baseValues = {
portId,
mooringNumber: i.mooringNumber,
area: i.area,
status: i.status,
lengthFt: n.lengthFt != null ? String(n.lengthFt) : null,
widthFt: n.widthFt != null ? String(n.widthFt) : null,
draftFt: n.draftFt != null ? String(n.draftFt) : null,
lengthM: n.lengthM != null ? String(n.lengthM) : null,
widthM: n.widthM != null ? String(n.widthM) : null,
draftM: n.draftM != null ? String(n.draftM) : null,
widthIsMinimum: i.widthIsMinimum,
nominalBoatSize: n.nominalBoatSize != null ? String(n.nominalBoatSize) : null,
nominalBoatSizeM: n.nominalBoatSizeM != null ? String(n.nominalBoatSizeM) : null,
waterDepth: n.waterDepth != null ? String(n.waterDepth) : null,
waterDepthM: n.waterDepthM != null ? String(n.waterDepthM) : null,
waterDepthIsMinimum: i.waterDepthIsMinimum,
sidePontoon: i.sidePontoon,
powerCapacity: n.powerCapacity != null ? String(n.powerCapacity) : null,
voltage: n.voltage != null ? String(n.voltage) : null,
mooringType: i.mooringType,
cleatType: i.cleatType,
cleatCapacity: i.cleatCapacity,
bollardType: i.bollardType,
bollardCapacity: i.bollardCapacity,
access: i.access,
price: n.price != null ? String(n.price) : null,
priceCurrency: 'USD' as const,
bowFacing: i.bowFacing,
berthApproved: i.berthApproved,
statusOverrideMode: i.statusOverrideMode,
lastImportedAt: importedAt,
updatedAt: importedAt,
};
let berthId: string;
if (entry.action === 'insert') {
const [inserted] = await tx
.insert(berths)
.values({ ...baseValues, tenureType: 'permanent' })
.returning({ id: berths.id });
berthId = inserted!.id;
result.inserted += 1;
} else {
await tx.update(berths).set(baseValues).where(eq(berths.id, entry.existing!.id));
berthId = entry.existing!.id;
result.updated += 1;
}
if (i.mapData) {
const mapValues = {
berthId,
svgPath: i.mapData.path ?? null,
x: i.mapData.x != null ? String(i.mapData.x) : null,
y: i.mapData.y != null ? String(i.mapData.y) : null,
transform: i.mapData.transform ?? null,
fontSize: i.mapData.fontSize != null ? String(i.mapData.fontSize) : null,
updatedAt: importedAt,
};
await tx
.insert(berthMapData)
.values(mapValues)
.onConflictDoUpdate({
target: berthMapData.berthId,
set: {
svgPath: mapValues.svgPath,
x: mapValues.x,
y: mapValues.y,
transform: mapValues.transform,
fontSize: mapValues.fontSize,
updatedAt: importedAt,
},
});
result.mapDataWritten += 1;
}
}
});
return result;
}
// ─── Snapshot writer (for seed-data refresh) ────────────────────────────────
async function writeSnapshot(imported: ImportedBerth[]): Promise<string> {
// Ordering: idx 0..4 available (small), 5..9 under_offer (medium),
// 10..11 sold (large), then everything else by mooring number. The
// first 12 indexes feed `seed-data.ts` interest/reservation stubs.
const sortByLength = (a: ImportedBerth, b: ImportedBerth) =>
(a.numerics.lengthFt ?? 0) - (b.numerics.lengthFt ?? 0);
const available = imported
.filter((b) => b.status === 'available')
.sort(sortByLength)
.slice(0, 5);
const underOffer = imported
.filter((b) => b.status === 'under_offer')
.sort(sortByLength)
.slice(0, 5);
const sold = imported
.filter((b) => b.status === 'sold')
.sort((a, b) => sortByLength(b, a)) // descending: largest sold berths first
.slice(0, 2);
const featured = new Set([...available, ...underOffer, ...sold].map((b) => b.mooringNumber));
const rest = imported
.filter((b) => !featured.has(b.mooringNumber))
.sort((a, b) => a.mooringNumber.localeCompare(b.mooringNumber, 'en', { numeric: true }));
const ordered = [...available, ...underOffer, ...sold, ...rest];
const payload = ordered.map((b) => ({
legacyId: b.legacyId,
mooringNumber: b.mooringNumber,
area: b.area,
status: b.status,
lengthFt: b.numerics.lengthFt,
widthFt: b.numerics.widthFt,
draftFt: b.numerics.draftFt,
lengthM: b.numerics.lengthM,
widthM: b.numerics.widthM,
draftM: b.numerics.draftM,
widthIsMinimum: b.widthIsMinimum,
nominalBoatSize: b.numerics.nominalBoatSize,
nominalBoatSizeM: b.numerics.nominalBoatSizeM,
waterDepth: b.numerics.waterDepth,
waterDepthM: b.numerics.waterDepthM,
waterDepthIsMinimum: b.waterDepthIsMinimum,
sidePontoon: b.sidePontoon,
powerCapacity: b.numerics.powerCapacity,
voltage: b.numerics.voltage,
mooringType: b.mooringType,
cleatType: b.cleatType,
cleatCapacity: b.cleatCapacity,
bollardType: b.bollardType,
bollardCapacity: b.bollardCapacity,
access: b.access,
price: b.numerics.price,
bowFacing: b.bowFacing,
berthApproved: b.berthApproved,
statusOverrideMode: b.statusOverrideMode,
}));
const target = path.resolve(process.cwd(), 'src/lib/db/seed-data/berths.json');
await fs.writeFile(target, JSON.stringify(payload, null, 2) + '\n', 'utf8');
return target;
}
// ─── Main ───────────────────────────────────────────────────────────────────
async function main(): Promise<void> {
const args = parseArgs(process.argv.slice(2));
const config = loadNocoDbConfig();
const [port] = await db
.select({ id: ports.id, slug: ports.slug })
.from(ports)
.where(eq(ports.slug, args.portSlug))
.limit(1);
if (!port) {
console.error(`No port found with slug "${args.portSlug}".`);
process.exit(1);
}
console.log(`> Fetching NocoDB Berths…`);
const rows = await fetchAllRows(NOCO_TABLES.berths, config);
console.log(` fetched ${rows.length} rows from NocoDB`);
const imported: ImportedBerth[] = [];
let skippedMalformed = 0;
for (const r of rows) {
const m = mapRow(r);
if (m) imported.push(m);
else skippedMalformed += 1;
}
if (skippedMalformed > 0) {
console.warn(` ${skippedMalformed} rows skipped (missing Mooring Number)`);
}
// De-dup against any same-mooring twins surfacing from NocoDB
// (defensive — the Berths table is keyed on Mooring Number in NocoDB).
const seen = new Set<string>();
const dedup: ImportedBerth[] = [];
for (const b of imported) {
if (seen.has(b.mooringNumber)) {
console.warn(` duplicate mooring "${b.mooringNumber}" in NocoDB — keeping first`);
continue;
}
seen.add(b.mooringNumber);
dedup.push(b);
}
console.log(`> Reading current CRM berths for port "${port.slug}"…`);
const existingRows = await db
.select({
id: berths.id,
mooringNumber: berths.mooringNumber,
updatedAt: berths.updatedAt,
lastImportedAt: berths.lastImportedAt,
})
.from(berths)
.where(eq(berths.portId, port.id));
console.log(` ${existingRows.length} existing rows`);
const existingByMooring = new Map(existingRows.map((r) => [r.mooringNumber, r]));
const { plan, orphans } = buildPlan(dedup, existingByMooring, args.force);
const counts = plan.reduce(
(acc, e) => {
acc[e.action] += 1;
return acc;
},
{ insert: 0, update: 0, 'skip-edited': 0, noop: 0 } as Record<Action, number>,
);
console.log(`> Plan:`);
console.log(` insert : ${counts.insert}`);
console.log(` update : ${counts.update}`);
console.log(` skip-edited : ${counts['skip-edited']}`);
console.log(` no-op : ${counts.noop}`);
console.log(` orphans (CRM): ${orphans.length}`);
if (counts['skip-edited'] > 0) {
console.log(` ↳ Skipped (CRM-edited; pass --force to overwrite):`);
for (const e of plan.filter((p) => p.action === 'skip-edited').slice(0, 10)) {
console.log(` - ${e.imported.mooringNumber} ${e.reason}`);
}
if (counts['skip-edited'] > 10) console.log(` …and ${counts['skip-edited'] - 10} more`);
}
if (orphans.length > 0) {
console.log(` ↳ Orphans (in CRM but missing from NocoDB):`);
for (const o of orphans.slice(0, 10)) console.log(` - ${o.mooringNumber}`);
if (orphans.length > 10) console.log(` …and ${orphans.length - 10} more`);
}
// Snapshot write is independent of DB writes — even in --dry-run mode
// a rep may want to refresh the seed JSON to capture the latest NocoDB
// shape without committing to the DB import. The original gate dropped
// this silently when --dry-run was passed; audit caught it.
if (args.updateSnapshot) {
const written = await writeSnapshot(dedup);
console.log(`> Wrote ${dedup.length} rows to ${path.relative(process.cwd(), written)}`);
}
if (args.dryRun) {
console.log(`\n[dry-run] no DB writes performed.`);
return;
}
console.log(`> Applying…`);
const result = await apply(port.id, plan, orphans, new Date());
console.log(` inserted : ${result.inserted}`);
console.log(` updated : ${result.updated}`);
console.log(` skipped : ${result.skipped}`);
console.log(` map data writes : ${result.mapDataWritten}`);
if (result.warnings.length) {
console.log(` warnings :`);
for (const w of result.warnings.slice(0, 20)) console.log(` - ${w}`);
if (result.warnings.length > 20) console.log(` …and ${result.warnings.length - 20} more`);
}
}
main()
.then(() => process.exit(0))
.catch((err: unknown) => {
console.error(err);
process.exit(1);
});
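The insert/update/skip-edited/no-op counts printed above come out of `buildPlan`'s per-row decision. A minimal sketch of how such a classifier could work (the real `buildPlan` lives elsewhere in this commit; the `updatedAt`/`lastImportedAt` fields follow the select above, and the field-by-field comparison that distinguishes `noop` from `update` is elided):

```typescript
type Action = 'insert' | 'update' | 'skip-edited' | 'noop';

interface ExistingBerthRow {
  mooringNumber: string;
  updatedAt: Date;
  lastImportedAt: Date | null;
}

// Classify one imported row against the CRM's current state. A row whose
// CRM `updatedAt` is newer than its `lastImportedAt` was hand-edited in
// the CRM and is protected from the import unless --force is passed.
function classifyRow(
  existing: ExistingBerthRow | undefined,
  force: boolean,
): Action {
  if (!existing) return 'insert';
  const crmEdited =
    existing.lastImportedAt !== null &&
    existing.updatedAt.getTime() > existing.lastImportedAt.getTime();
  if (crmEdited && !force) return 'skip-edited';
  // The real plan builder compares imported fields against the row to
  // pick 'noop' vs 'update'; treat every survivor as an update here.
  return 'update';
}
```

Orphan detection is the mirror image: any `existingByMooring` key that never matched an imported row lands in `orphans`.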

View File

@@ -1,126 +0,0 @@
/**
* One-shot: load the 117-berth NocoDB snapshot into the port-nimara
* port, skipping any moorings that already exist.
*
* The original seed only seeded 12 hand-rolled berths into port-nimara
* (A-01..D-03), but the migration's interest rows reference moorings
* across A-01..E-18. This loads the full set so interest→berth links
* resolve cleanly on the next migration run.
*/
import 'dotenv/config';
import { eq, and, sql, inArray } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { berths } from '@/lib/db/schema/berths';
import berthSnapshot from '@/lib/db/seed-data/berths.json';
interface SnapshotBerth {
mooringNumber: string;
area: string;
status: 'available' | 'under_offer' | 'sold';
lengthFt: number | null;
widthFt: number | null;
draftFt: number | null;
lengthM: number | null;
widthM: number | null;
draftM: number | null;
widthIsMinimum: boolean;
nominalBoatSize: number | null;
nominalBoatSizeM: number | null;
waterDepth: number | null;
waterDepthM: number | null;
waterDepthIsMinimum: boolean;
sidePontoon: string | null;
powerCapacity: number | null;
voltage: number | null;
mooringType: string | null;
cleatType: string | null;
cleatCapacity: string | null;
bollardType: string | null;
bollardCapacity: string | null;
access: string | null;
price: number | null;
bowFacing: string | null;
berthApproved: boolean;
statusOverrideMode: string | null;
}
async function main() {
const [port] = await db
.select({ id: ports.id })
.from(ports)
.where(eq(ports.slug, 'port-nimara'))
.limit(1);
if (!port) throw new Error('port-nimara not found');
const snapshot = berthSnapshot as unknown as SnapshotBerth[];
// Existing moorings — skip these.
const existingRows = await db
.select({ mooringNumber: berths.mooringNumber })
.from(berths)
.where(eq(berths.portId, port.id));
const existingMoorings = new Set(existingRows.map((r) => r.mooringNumber));
const toInsert = snapshot.filter((b) => !existingMoorings.has(b.mooringNumber));
console.log(
`Snapshot: ${snapshot.length} berths, existing in port-nimara: ${existingRows.length}, to insert: ${toInsert.length}`,
);
if (toInsert.length === 0) {
console.log('Nothing to do.');
return;
}
const inserted = await db
.insert(berths)
.values(
toInsert.map((b) => ({
portId: port.id,
mooringNumber: b.mooringNumber,
area: b.area,
status: b.status,
lengthFt: b.lengthFt != null ? String(b.lengthFt) : null,
widthFt: b.widthFt != null ? String(b.widthFt) : null,
draftFt: b.draftFt != null ? String(b.draftFt) : null,
lengthM: b.lengthM != null ? String(b.lengthM) : null,
widthM: b.widthM != null ? String(b.widthM) : null,
draftM: b.draftM != null ? String(b.draftM) : null,
widthIsMinimum: b.widthIsMinimum,
nominalBoatSize: b.nominalBoatSize != null ? String(b.nominalBoatSize) : null,
nominalBoatSizeM: b.nominalBoatSizeM != null ? String(b.nominalBoatSizeM) : null,
waterDepth: b.waterDepth != null ? String(b.waterDepth) : null,
waterDepthM: b.waterDepthM != null ? String(b.waterDepthM) : null,
waterDepthIsMinimum: b.waterDepthIsMinimum,
sidePontoon: b.sidePontoon,
powerCapacity: b.powerCapacity != null ? String(b.powerCapacity) : null,
voltage: b.voltage != null ? String(b.voltage) : null,
mooringType: b.mooringType,
cleatType: b.cleatType,
cleatCapacity: b.cleatCapacity,
bollardType: b.bollardType,
bollardCapacity: b.bollardCapacity,
access: b.access,
price: b.price != null ? String(b.price) : null,
priceCurrency: 'USD',
bowFacing: b.bowFacing,
berthApproved: b.berthApproved,
statusOverrideMode: b.statusOverrideMode,
tenureType: 'permanent' as const,
})),
)
.returning({ id: berths.id, mooringNumber: berths.mooringNumber });
console.log(`Inserted ${inserted.length} berths.`);
// Suppress unused-import warning if eslint is strict.
void and;
void sql;
void inArray;
}
main().catch((e) => {
console.error(e);
process.exit(1);
});
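The repeated `x != null ? String(x) : null` ternaries in the insert above exist because Drizzle maps Postgres numeric/decimal columns to strings. If the pattern recurs outside this deleted one-shot script, it could be factored into a tiny helper; a sketch (the helper name is made up here):

```typescript
// Drizzle represents Postgres numeric/decimal columns as strings.
// Preserve null rather than producing the string "null".
function numStr(n: number | null | undefined): string | null {
  return n != null ? String(n) : null;
}
```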

View File

@@ -0,0 +1,29 @@
/**
* Storage backend migration CLI — see §4.7a + §14.9a of
* docs/berth-recommender-and-pdf-plan.md.
*
* pnpm tsx scripts/migrate-storage.ts --from s3 --to filesystem [--dry-run]
* pnpm tsx scripts/migrate-storage.ts --from filesystem --to s3
*
* The actual migration logic lives in `src/lib/storage/migrate.ts` so the
* admin UI's "Switch backend" button can run the exact same code path. This
* file is a thin CLI wrapper.
*/
import { logger } from '@/lib/logger';
import { parseArgs, runMigration } from '@/lib/storage/migrate';
async function main(): Promise<void> {
const args = parseArgs(process.argv.slice(2));
logger.info({ args }, 'Starting storage migration');
const result = await runMigration(args);
logger.info({ result }, 'Storage migration complete');
console.log(JSON.stringify(result, null, 2));
process.exit(0);
}
main().catch((err) => {
logger.error({ err }, 'Storage migration failed');
console.error(err);
process.exit(2);
});

View File

@@ -32,14 +32,13 @@ async function main() {
const nodemailer = await import('nodemailer');
const captured: Array<{ to: unknown; subject: unknown; from: unknown }> = [];
const originalCreateTransport = nodemailer.default.createTransport;
// @ts-expect-error monkey-patch
nodemailer.default.createTransport = () => ({
nodemailer.default.createTransport = (() => ({
// eslint-disable-next-line @typescript-eslint/no-explicit-any
sendMail: async (msg: any) => {
captured.push({ to: msg.to, subject: msg.subject, from: msg.from });
return { messageId: '<smoke@test>', accepted: [msg.to], rejected: [] };
},
});
})) as unknown as typeof nodemailer.default.createTransport;
// Now import sendEmail (gets the patched transporter).
const { sendEmail } = await import('@/lib/email');
@@ -55,7 +54,6 @@ async function main() {
await sendEmail(realClientEmail, realSubject, '<p>Body unused for this smoke.</p>');
// Restore the original transport (be a good citizen).
// @ts-expect-error monkey-patch
nodemailer.default.createTransport = originalCreateTransport;
console.log('[smoke] captured outbound message:');

View File

@@ -0,0 +1,21 @@
import { PageHeader } from '@/components/shared/page-header';
import { BrochuresAdminPanel } from '@/components/admin/brochures-admin-panel';
/**
* Per-port admin page for managing brochures (Phase 7 §5.8).
*
* Lists brochures, lets per-port admins upload new versions via direct-to-
* storage presigned URLs (so the 20MB+ file never traverses Next.js's
* body-size limit — see §11.1), and toggle the default flag.
*/
export default function BrochuresAdminPage() {
return (
<div className="space-y-6">
<PageHeader
title="Brochures"
description="Port-wide marketing PDFs available to the sales send-out flow. The default brochure is the one the /clients picker pre-selects."
/>
<BrochuresAdminPanel />
</div>
);
}

View File

@@ -3,6 +3,7 @@ import {
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
import { PageHeader } from '@/components/shared/page-header';
import { SalesEmailConfigCard } from '@/components/admin/sales-email-config-card';
const FIELDS: SettingFieldDef[] = [
{
@@ -94,6 +95,7 @@ export default function EmailSettingsPage() {
description="Optional per-port SMTP credentials. Leave blank to use the global env defaults."
fields={FIELDS.slice(5)}
/>
<SalesEmailConfigCard />
</div>
);
}

View File

@@ -180,6 +180,13 @@ const GROUPS: AdminGroup[] = [
description: 'Database snapshots and on-demand exports.',
icon: HardDrive,
},
{
href: 'storage',
label: 'Storage Backend',
description:
'Choose between S3-compatible object store or local filesystem; migrate between them.',
icon: HardDrive,
},
],
},
{

View File

@@ -0,0 +1,7 @@
import { StorageAdminPanel } from '@/components/admin/storage-admin-panel';
export const dynamic = 'force-dynamic';
export default function StorageAdminPage() {
return <StorageAdminPanel />;
}

View File

@@ -3,7 +3,7 @@
import { useEffect, useRef, useState } from 'react';
import { useParams, useRouter } from 'next/navigation';
import { useMutation } from '@tanstack/react-query';
import { Camera, Loader2, ScanLine, Upload } from 'lucide-react';
import { Camera, Loader2, ScanLine, Upload, X } from 'lucide-react';
import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider';
@@ -30,6 +30,11 @@ interface ScanResult {
confidence: number;
}
interface UploadedFileMeta {
id: string;
filename: string;
}
export default function ScanReceiptPage() {
const params = useParams<{ portSlug: string }>();
const router = useRouter();
@@ -38,6 +43,13 @@ export default function ScanReceiptPage() {
const cameraInputRef = useRef<HTMLInputElement>(null);
const [scanResult, setScanResult] = useState<ScanResult | null>(null);
const [previewUrl, setPreviewUrl] = useState<string | null>(null);
// After OCR succeeds we also upload the receipt to /api/v1/files/upload
// so the expense links to the actual image. The legacy scanner skipped
// this step and saved expenses without their receipt — which silently
// disqualified them from parent-company reimbursement (the warning the
// PDF export now surfaces).
const [uploadedFile, setUploadedFile] = useState<UploadedFileMeta | null>(null);
const [pendingFile, setPendingFile] = useState<File | null>(null);
const { setChrome } = useMobileChrome();
useEffect(() => {
@@ -74,6 +86,29 @@ export default function ScanReceiptPage() {
},
});
// Uploads the receipt image to /api/v1/files/upload (category=receipt)
// so the new expense row can link to it via receiptFileIds. Runs in
// parallel with the OCR scan so the rep can keep editing fields while
// the upload completes.
const uploadMutation = useMutation({
mutationFn: async (file: File): Promise<UploadedFileMeta> => {
const formData = new FormData();
formData.append('file', file);
formData.append('category', 'receipt');
const res = await fetch('/api/v1/files/upload', {
method: 'POST',
body: formData,
credentials: 'include',
});
if (!res.ok) throw new Error('Receipt upload failed');
const json = (await res.json()) as { data: { id: string; filename: string } };
return { id: json.data.id, filename: json.data.filename };
},
onSuccess: (meta) => {
setUploadedFile(meta);
},
});
const saveMutation = useMutation({
mutationFn: () =>
apiFetch('/api/v1/expenses', {
@@ -85,6 +120,9 @@ export default function ScanReceiptPage() {
category: category || undefined,
expenseDate: date ? new Date(date) : new Date(),
paymentStatus: 'unpaid',
receiptFileIds: uploadedFile ? [uploadedFile.id] : undefined,
// The scanner path always has a receipt (we wouldn't have OCR'd
// it otherwise), so we never need the no-receipt flag here.
},
}),
onSuccess: () => {
@@ -95,12 +133,32 @@ export default function ScanReceiptPage() {
function handleFileChange(e: React.ChangeEvent<HTMLInputElement>) {
const file = e.target.files?.[0];
if (!file) return;
setPendingFile(file);
const url = URL.createObjectURL(file);
setPreviewUrl(url);
// Kick off OCR scan + storage upload concurrently. The two are
// independent server calls and the rep is staring at the preview
// while both run.
scanMutation.mutate(file);
uploadMutation.mutate(file);
}
function handleClearReceipt() {
if (previewUrl) URL.revokeObjectURL(previewUrl);
setPreviewUrl(null);
setUploadedFile(null);
setPendingFile(null);
setScanResult(null);
// Reset in-flight mutations so a late onSuccess doesn't repopulate
// the form against an already-cleared UI (audit finding: stale
// receipt could land on the next Save).
scanMutation.reset();
uploadMutation.reset();
if (fileInputRef.current) fileInputRef.current.value = '';
if (cameraInputRef.current) cameraInputRef.current.value = '';
}
void pendingFile;
return (
<div className="max-w-2xl mx-auto space-y-6">
<div className="hidden sm:block">
@@ -119,18 +177,45 @@ export default function ScanReceiptPage() {
</CardHeader>
<CardContent>
{previewUrl ? (
<div
className="border-2 border-dashed rounded-lg p-4 text-center cursor-pointer hover:bg-muted/50 transition-colors"
onClick={() => fileInputRef.current?.click()}
>
<div className="space-y-2">
<div className="relative border-2 border-dashed rounded-lg p-4 text-center bg-muted/20">
<img
src={previewUrl}
alt="Receipt preview"
className="max-h-64 mx-auto rounded object-contain"
/>
<button
type="button"
onClick={handleClearReceipt}
aria-label="Remove receipt"
className="absolute top-2 right-2 rounded-full bg-background/80 hover:bg-background border p-1.5 shadow-sm"
>
<X className="h-4 w-4" />
</button>
</div>
<div className="flex flex-wrap items-center gap-2 text-xs text-muted-foreground">
{uploadMutation.isPending && (
<span className="inline-flex items-center gap-1">
<Loader2 className="h-3 w-3 animate-spin" /> Uploading receipt&hellip;
</span>
)}
{uploadedFile && (
<span className="text-emerald-600">
Receipt uploaded ({uploadedFile.filename})
</span>
)}
{uploadMutation.isError && (
<span className="text-destructive">
Receipt upload failed; saving will still create the expense without an image.
</span>
)}
</div>
</div>
) : (
<div className="grid gap-2 sm:grid-cols-2">
{/* Camera button — available on mobile devices that surface the
built-in capture flow when an `image/*` input has the
`capture` attribute. Hidden on desktop where it's a no-op. */}
<Button
type="button"
size="lg"
@@ -140,6 +225,8 @@ export default function ScanReceiptPage() {
<Camera className="mr-2 h-5 w-5" />
Take photo
</Button>
{/* File picker — works on every platform. Phrased so the copy
fits both mobile (library/files) and desktop (drag and drop). */}
<Button
type="button"
variant="outline"
@@ -148,18 +235,30 @@ export default function ScanReceiptPage() {
onClick={() => fileInputRef.current?.click()}
>
<Upload className="mr-2 h-5 w-5" />
<span className="sm:hidden">Choose from library</span>
<span className="hidden sm:inline">Click to upload or drag and drop</span>
<span className="sm:hidden">Choose from device</span>
<span className="hidden sm:inline">Choose from device or drag and drop</span>
</Button>
<p className="text-xs text-muted-foreground sm:col-span-2 text-center">
JPEG, PNG, WebP up to 10MB
JPEG, PNG, HEIC, WebP, or PDF up to 10 MB
</p>
<p className="text-xs text-muted-foreground sm:col-span-2 text-center">
Have many receipts?{' '}
<a
href={`/${params.portSlug}/expenses/bulk-upload`}
className="text-primary hover:underline"
>
Bulk upload &rarr;
</a>
</p>
</div>
)}
{/* `image/*` is the broadest image accept — includes HEIC on iOS,
JPEG/PNG/WebP everywhere; `application/pdf` additionally admits
PDF receipts. The capture attribute on the second
input invokes the native camera flow on mobile. */}
<input
ref={fileInputRef}
type="file"
accept="image/*"
accept="image/*,application/pdf"
className="hidden"
onChange={handleFileChange}
/>
@@ -264,10 +363,20 @@ export default function ScanReceiptPage() {
</Button>
<Button
onClick={() => saveMutation.mutate()}
disabled={saveMutation.isPending || !amount}
disabled={
saveMutation.isPending ||
!amount ||
// Block save while the receipt upload is still in flight —
// otherwise the rep can hit Save before the storage round
// trip finishes and the expense lands without `receiptFileIds`,
// silently re-creating the legacy receipt-loss bug.
uploadMutation.isPending
}
>
{saveMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Save as Expense
{(saveMutation.isPending || uploadMutation.isPending) && (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
)}
{uploadMutation.isPending ? 'Uploading…' : 'Save as Expense'}
</Button>
</div>
</CardContent>
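The Save gating at the end of this hunk combines three conditions; written as a pure function (a hypothetical refactor, names assumed), the invariant reads:

```typescript
// Save is allowed only when an amount exists and neither the save nor
// the receipt upload is still in flight. Blocking on uploadPending is
// what prevents the legacy receipt-loss bug described above.
function canSaveExpense(opts: {
  amount: string;
  savePending: boolean;
  uploadPending: boolean;
}): boolean {
  return opts.amount.length > 0 && !opts.savePending && !opts.uploadPending;
}
```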

View File

@@ -0,0 +1,108 @@
import { NextResponse } from 'next/server';
import { and, eq, isNull } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { berths, berthMapData } from '@/lib/db/schema/berths';
import { interestBerths, interests } from '@/lib/db/schema/interests';
import { logger } from '@/lib/logger';
import { toPublicBerth } from '@/lib/services/public-berths';
/**
* GET /api/public/berths/[mooringNumber]
*
* Single-berth lookup for the public website's `/berths/[number]`
* page. Mooring numbers are matched against the canonical bare form
 * ("A1", "B12"); Phase 0 normalized the entire CRM dataset.
*/
// Hard-coded allowlist for the public read-only feed. Adding a port here
// is a deliberate decision (not silent enumeration via ?portSlug=), so a
// future private tenant can't be exposed by accident.
const PUBLIC_PORT_SLUGS = new Set(['port-nimara']);
const DEFAULT_PUBLIC_PORT_SLUG = 'port-nimara';
const RESPONSE_HEADERS = {
'cache-control': 'public, s-maxage=300, stale-while-revalidate=60',
'content-type': 'application/json; charset=utf-8',
};
const MOORING_PATTERN = /^[A-Z]+\d+$/;
export async function GET(
request: Request,
ctx: { params: Promise<{ mooringNumber: string }> },
): Promise<Response> {
const { mooringNumber } = await ctx.params;
const url = new URL(request.url);
const requestedSlug = url.searchParams.get('portSlug') ?? DEFAULT_PUBLIC_PORT_SLUG;
if (!PUBLIC_PORT_SLUGS.has(requestedSlug)) {
return NextResponse.json(
{ error: 'port is not part of the public berths feed', portSlug: requestedSlug },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
const portSlug = requestedSlug;
// Reject obviously malformed mooring numbers up front so cache poisoning
// / random-URL probing returns 400 rather than 404 (saves a DB hit).
if (!MOORING_PATTERN.test(mooringNumber)) {
return NextResponse.json(
{ error: 'invalid mooring number', mooringNumber },
{ status: 400, headers: { 'cache-control': 'no-store' } },
);
}
const [port] = await db
.select({ id: ports.id })
.from(ports)
.where(eq(ports.slug, portSlug))
.limit(1);
if (!port) {
return NextResponse.json(
{ error: 'port not found', portSlug },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
const [berth] = await db
.select()
.from(berths)
.where(and(eq(berths.portId, port.id), eq(berths.mooringNumber, mooringNumber)))
.limit(1);
if (!berth) {
return NextResponse.json(
{ error: 'berth not found', mooringNumber },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
const [mapData, specificInterestRows] = await Promise.all([
db.select().from(berthMapData).where(eq(berthMapData.berthId, berth.id)).limit(1),
db
.select({ berthId: interestBerths.berthId })
.from(interestBerths)
.innerJoin(interests, eq(interests.id, interestBerths.interestId))
.where(
and(
eq(interestBerths.berthId, berth.id),
eq(interestBerths.isSpecificInterest, true),
isNull(interests.archivedAt),
// Closed deals (won/lost/cancelled) don't promote to "Under
// Offer" - won flows through berths.status='sold' handled in
// derivePublicStatus; lost/cancelled means back on the market.
isNull(interests.outcome),
),
)
.limit(1),
]);
const out = toPublicBerth(berth, mapData[0] ?? null, specificInterestRows.length > 0);
if (out.Status !== 'Available' && out.Status !== 'Under Offer' && out.Status !== 'Sold') {
logger.error({ berthId: berth.id, status: out.Status }, 'Public berth status out of range');
return NextResponse.json({ error: 'internal' }, { status: 500 });
}
return new Response(JSON.stringify(out), { headers: RESPONSE_HEADERS, status: 200 });
}
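The bare-form matching here leans on Phase 0's `A-01 → A1` normalization. A sketch of what such a normalizer could look like (a hypothetical helper; the commit's actual Phase 0 code is not shown in this hunk):

```typescript
const MOORING_PATTERN = /^[A-Z]+\d+$/;

// Collapse legacy forms like "A-01" or "b 7" to the canonical bare
// form ("A1", "B7"): uppercase letters, separator dropped, numeric
// part stripped of leading zeros.
function normalizeMooring(raw: string): string | null {
  const m = raw.trim().toUpperCase().match(/^([A-Z]+)[-\s]*0*(\d+)$/);
  if (!m) return null;
  const bare = `${m[1]}${m[2]}`;
  return MOORING_PATTERN.test(bare) ? bare : null;
}
```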

View File

@@ -0,0 +1,157 @@
import { NextResponse } from 'next/server';
import { and, eq, inArray, isNull } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { berths, berthMapData } from '@/lib/db/schema/berths';
import { interestBerths, interests } from '@/lib/db/schema/interests';
import { logger } from '@/lib/logger';
import { toPublicBerth, type PublicBerth } from '@/lib/services/public-berths';
/**
* GET /api/public/berths
*
* Public-website data feed. Returns the full berth list for the public-
* facing port (default: port-nimara) in the same JSON shape NocoDB
* returned, so the website's existing `getBerths()` swap is a one-line
* URL change (plan §4.5 + §7.3).
*
* Auth: none. The endpoint is read-only and exposes only the explicit
* field allowlist defined in `toPublicBerth`.
*
* Caching: `s-maxage=300, stale-while-revalidate=60` matches the
* website's existing 5-minute TTL behaviour against NocoDB. Edge/CDN
* caches honour these headers; the Next.js fetch cache also picks
* them up.
*/
// Hard-coded allowlist for the public read-only feed. Adding a port here
// is a deliberate decision (not silent enumeration via ?portSlug=), so a
// future private tenant can't be exposed by accident.
const PUBLIC_PORT_SLUGS = new Set(['port-nimara']);
const DEFAULT_PUBLIC_PORT_SLUG = 'port-nimara';
const RESPONSE_HEADERS = {
'cache-control': 'public, s-maxage=300, stale-while-revalidate=60',
'content-type': 'application/json; charset=utf-8',
};
interface ListResponse {
list: PublicBerth[];
pageInfo: {
totalRows: number;
page: 1;
pageSize: number;
isFirstPage: true;
isLastPage: true;
};
}
export async function GET(request: Request): Promise<Response> {
const url = new URL(request.url);
const requestedSlug = url.searchParams.get('portSlug') ?? DEFAULT_PUBLIC_PORT_SLUG;
if (!PUBLIC_PORT_SLUGS.has(requestedSlug)) {
return NextResponse.json(
{ error: 'port is not part of the public berths feed', portSlug: requestedSlug },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
const portSlug = requestedSlug;
const [port] = await db
.select({ id: ports.id })
.from(ports)
.where(eq(ports.slug, portSlug))
.limit(1);
if (!port) {
return NextResponse.json(
{ error: 'port not found', portSlug },
{ status: 404, headers: { 'cache-control': 'no-store' } },
);
}
// 1. Active berths for the port. Plan §4.5 says "filters out berths
//    archived in CRM", but the current schema has no archived flag on
//    berths, so the filter is a no-op today; a future archived_at
//    column plugs in here.
const berthRows = await db.select().from(berths).where(eq(berths.portId, port.id));
if (berthRows.length === 0) {
return jsonResponse({ list: [], pageInfo: emptyPageInfo() });
}
const berthIds = berthRows.map((b) => b.id);
// 2. Bulk-fetch map_data + the "has specific-interest link" flag.
const [mapRows, specificInterestRows] = await Promise.all([
db.select().from(berthMapData).where(inArray(berthMapData.berthId, berthIds)),
db
.selectDistinct({ berthId: interestBerths.berthId })
.from(interestBerths)
.innerJoin(interests, eq(interests.id, interestBerths.interestId))
.where(
and(
inArray(interestBerths.berthId, berthIds),
eq(interestBerths.isSpecificInterest, true),
isNull(interests.archivedAt),
// Don't promote a berth to "Under Offer" when the only specific-
// interest link is a closed deal. `won` flips happen via
// berths.status='sold' (handled in derivePublicStatus). Lost/
// cancelled outcomes mean the berth is back on the market.
isNull(interests.outcome),
),
),
]);
const mapByBerth = new Map(mapRows.map((m) => [m.berthId, m]));
const specificInterestSet = new Set(specificInterestRows.map((r) => r.berthId));
const list = berthRows.map((b) =>
toPublicBerth(b, mapByBerth.get(b.id) ?? null, specificInterestSet.has(b.id)),
);
// Validate the response enum before returning: any unknown status
// value would hit a 500 (per §14.8) rather than silently shipping
// invalid data downstream.
for (const row of list) {
if (row.Status !== 'Available' && row.Status !== 'Under Offer' && row.Status !== 'Sold') {
// Log just the identifying fields; never the full berth row, which
// includes price + amenity columns that don't belong in error logs.
logger.error(
{ berthId: row.Id, mooringNumber: row['Mooring Number'], status: row.Status },
'Public berth status out of range',
);
return NextResponse.json(
{ error: 'internal', detail: 'berth status enum drift' },
{ status: 500 },
);
}
}
return jsonResponse({
list,
pageInfo: {
totalRows: list.length,
page: 1,
pageSize: list.length,
isFirstPage: true,
isLastPage: true,
},
});
}
function jsonResponse(body: ListResponse): Response {
return new Response(JSON.stringify(body), { headers: RESPONSE_HEADERS, status: 200 });
}
function emptyPageInfo() {
return {
totalRows: 0,
page: 1 as const,
pageSize: 0,
isFirstPage: true as const,
isLastPage: true as const,
};
}

View File

@@ -0,0 +1,25 @@
import { NextResponse } from 'next/server';
import { env } from '@/lib/env';
/**
* GET /api/public/health
*
* Public-facing health probe. Used by the marketing-website server on
* startup to verify it's pointed at a CRM matching its own deployment
* env (plan §14.8 critical: prevent staging-website-talking-to-prod-CRM).
*
* Returns the CRM's `NODE_ENV` and `APP_URL` so the website can do a
* strict equality check before serving any request.
*/
export function GET(): Response {
return NextResponse.json(
{
status: 'ok',
env: env.NODE_ENV,
appUrl: env.APP_URL,
timestamp: new Date().toISOString(),
},
{ headers: { 'cache-control': 'no-store' } },
);
}
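The strict-equality check lives on the website side; a hedged sketch of what that consumer guard might look like (the website codebase is not part of this diff, so every name here is assumed):

```typescript
interface CrmHealth {
  status: string;
  env: string;
  appUrl: string;
}

// Refuse to start when the CRM we're pointed at isn't running in the
// same deployment env (e.g. a staging website wired to the prod CRM).
function assertCrmEnvMatches(health: CrmHealth, websiteEnv: string): void {
  if (health.status !== 'ok') {
    throw new Error(`CRM health check failed: status=${health.status}`);
  }
  if (health.env !== websiteEnv) {
    throw new Error(`CRM env mismatch: crm=${health.env} website=${websiteEnv}`);
  }
}
```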

View File

@@ -4,7 +4,7 @@ import type { z } from 'zod';
import { db } from '@/lib/db';
import { withTransaction } from '@/lib/db/utils';
import { interests } from '@/lib/db/schema/interests';
import { interests, interestBerths } from '@/lib/db/schema/interests';
import { clients, clientContacts, clientAddresses } from '@/lib/db/schema/clients';
import { berths } from '@/lib/db/schema/berths';
import { ports } from '@/lib/db/schema/ports';
@@ -213,13 +213,17 @@ export async function POST(req: NextRequest) {
}
}
// 5. Create interest with yachtId wired up.
// 5. Create interest with yachtId wired up. The legacy
// interests.berth_id column has been replaced by the
// interest_berths junction (plan §3.4); when the public form
// resolves to a known berth we materialise it as a primary,
// specific-interest junction row in the same transaction so it
// rolls back together with the parent interest insert.
const [newInterest] = await tx
.insert(interests)
.values({
portId,
clientId,
berthId,
yachtId,
source: 'website',
pipelineStage: 'open',
@@ -227,6 +231,16 @@ export async function POST(req: NextRequest) {
})
.returning();
if (berthId) {
await tx.insert(interestBerths).values({
interestId: newInterest!.id,
berthId,
isPrimary: true,
isSpecificInterest: true,
isInEoiBundle: false,
});
}
return {
interestId: newInterest!.id,
clientId,

View File

@@ -0,0 +1,236 @@
/**
* Filesystem-backend download proxy.
*
* The `FilesystemBackend.presignDownload(...)` returns a CRM-internal URL of
* the form `/api/storage/<hmac-signed-token>`. This route verifies the HMAC,
* checks expiry, enforces single-use via a short Redis cache, then streams
* the file out with explicit `Content-Type` + `Content-Disposition`.
*
* §14.9a mitigations exercised here:
* - HMAC verification (timingSafeEqual via filesystem.verifyProxyToken)
* - expiry check (token includes `e` epoch seconds)
* - single-use replay protection via short Redis SET-NX
* - Node runtime only (no edge); explicit headers so Next.js doesn't try to
* process the bytes (no image optimization, no streaming transforms)
*/
import { createReadStream } from 'node:fs';
import * as fs from 'node:fs/promises';
import { Readable } from 'node:stream';
import { NextRequest, NextResponse } from 'next/server';
import { MAX_FILE_SIZE } from '@/lib/constants/file-validation';
import { logger } from '@/lib/logger';
import { redis } from '@/lib/redis';
import { FilesystemBackend, getStorageBackend } from '@/lib/storage';
import { verifyProxyToken } from '@/lib/storage/filesystem';
import { isPdfMagic } from '@/lib/services/berth-pdf-parser';
export const runtime = 'nodejs';
export const dynamic = 'force-dynamic';
// Replay-protection TTL must outlive the token itself, otherwise the
// dedup key expires and the same token can be redeemed twice. We pin it
// to the token's own expiry (clamped to a 25-day ceiling so a forged
// far-future token can't pollute Redis indefinitely). Send-out emails
// mint 24-hour tokens so the typical TTL is 24h + a small buffer.
const REPLAY_TTL_FLOOR_SECONDS = 60; // never below 60s (post-expiry tail).
const REPLAY_TTL_CEILING_SECONDS = 25 * 24 * 60 * 60; // 25 days.
export async function GET(
_req: NextRequest,
ctx: { params: Promise<{ token: string }> },
): Promise<NextResponse> {
const { token } = await ctx.params;
const backend = await getStorageBackend();
if (!(backend instanceof FilesystemBackend)) {
return NextResponse.json(
{ error: 'Storage proxy is only available in filesystem mode' },
{ status: 404 },
);
}
const result = verifyProxyToken(token, backend.getHmacSecret());
if (!result.ok) {
logger.warn({ reason: result.reason }, 'Storage proxy token rejected');
return NextResponse.json({ error: 'Invalid or expired token' }, { status: 403 });
}
const { payload } = result;
// Single-use enforcement. SET NX with a TTL pinned to the token's own
// expiry so the dedup window never closes before the token does. The
// body half of the token serves as the dedup key; including the
// signature as well would work, but the body alone is enough (a reused
// token has the same body).
const replayKey = `storage:proxy:seen:${token.split('.')[0]}`;
const remainingSeconds = Math.max(
REPLAY_TTL_FLOOR_SECONDS,
Math.min(REPLAY_TTL_CEILING_SECONDS, payload.e - Math.floor(Date.now() / 1000) + 60),
);
const setOk = await redis.set(replayKey, '1', 'EX', remainingSeconds, 'NX');
if (setOk !== 'OK') {
logger.warn({ key: payload.k }, 'Storage proxy token replay rejected');
return NextResponse.json({ error: 'Token already used' }, { status: 403 });
}
let absolutePath: string;
try {
absolutePath = backend.resolveKeyForProxy(payload.k);
} catch (err) {
logger.warn({ err, key: payload.k }, 'Storage proxy key resolution failed');
return NextResponse.json({ error: 'Invalid key' }, { status: 400 });
}
let size: number;
try {
const stat = await fs.stat(absolutePath);
if (!stat.isFile()) {
return NextResponse.json({ error: 'Not found' }, { status: 404 });
}
size = stat.size;
} catch (err) {
const code = (err as NodeJS.ErrnoException).code;
if (code === 'ENOENT') {
return NextResponse.json({ error: 'Not found' }, { status: 404 });
}
throw err;
}
// Convert the Node Readable into a Web ReadableStream for NextResponse.
const nodeStream = createReadStream(absolutePath);
const webStream = Readable.toWeb(nodeStream) as unknown as ReadableStream<Uint8Array>;
const headers = new Headers();
headers.set('Content-Type', payload.c ?? 'application/octet-stream');
headers.set('Content-Length', String(size));
if (payload.f) {
// RFC 5987: plain quoted filename as the ASCII fallback, filename* for the UTF-8 form.
const safe = payload.f.replace(/"/g, '');
headers.set(
'Content-Disposition',
`attachment; filename="${safe}"; filename*=UTF-8''${encodeURIComponent(payload.f)}`,
);
}
headers.set('Cache-Control', 'private, no-store');
headers.set('X-Content-Type-Options', 'nosniff');
return new NextResponse(webStream, { status: 200, headers });
}
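The floor/ceiling clamp used for the replay key's TTL is easiest to see as a pure function (extracted here purely for illustration; the route inlines it):

```typescript
const REPLAY_TTL_FLOOR_SECONDS = 60;
const REPLAY_TTL_CEILING_SECONDS = 25 * 24 * 60 * 60;

// TTL for the Redis replay key: the token's remaining lifetime plus a
// 60s tail, clamped so an already-expired token still gets a short
// window and a forged far-future expiry can't pin Redis for months.
function replayTtlSeconds(tokenExpiryEpochSeconds: number, nowMs: number): number {
  const remaining = tokenExpiryEpochSeconds - Math.floor(nowMs / 1000) + 60;
  return Math.max(
    REPLAY_TTL_FLOOR_SECONDS,
    Math.min(REPLAY_TTL_CEILING_SECONDS, remaining),
  );
}
```

For the 24-hour send-out tokens this lands at roughly 24h plus the 60s buffer, matching the comment near the constants above.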
/**
* Filesystem-backend upload proxy. The presigned URL minted by
* `FilesystemBackend.presignUpload` points here. Without this handler the
* browser-driven berth-PDF / brochure uploads would 405 in filesystem
* deployments — the entire pluggable-storage abstraction relied on the
* GET-only counterpart for downloads.
*
* Same token-verify + single-use replay protection as GET, plus:
* - Hard size cap (rejects oversized bodies before any disk I/O).
* - Magic-byte check when the issuer declared content-type=application/pdf
* (matches the §14.6 §6c/§7c invariant: every upload path verifies
* bytes server-side, not just at the client).
*/
export async function PUT(
req: NextRequest,
ctx: { params: Promise<{ token: string }> },
): Promise<NextResponse> {
const { token } = await ctx.params;
const backend = await getStorageBackend();
if (!(backend instanceof FilesystemBackend)) {
return NextResponse.json(
{ error: 'Storage proxy is only available in filesystem mode' },
{ status: 404 },
);
}
const result = verifyProxyToken(token, backend.getHmacSecret());
if (!result.ok) {
logger.warn({ reason: result.reason }, 'Storage proxy upload token rejected');
return NextResponse.json({ error: 'Invalid or expired token' }, { status: 403 });
}
const { payload } = result;
// Separate replay namespace from GET so a token can validly serve one
// upload AND one download (the issuer only mints the second), but a
// PUT cannot be replayed against itself.
const replayKey = `storage:proxy:put:${token.split('.')[0]}`;
const remainingSeconds = Math.max(
REPLAY_TTL_FLOOR_SECONDS,
Math.min(REPLAY_TTL_CEILING_SECONDS, payload.e - Math.floor(Date.now() / 1000) + 60),
);
const setOk = await redis.set(replayKey, '1', 'EX', remainingSeconds, 'NX');
if (setOk !== 'OK') {
logger.warn({ key: payload.k }, 'Storage proxy upload token replay rejected');
return NextResponse.json({ error: 'Token already used' }, { status: 403 });
}
// Pre-flight size check via Content-Length so a malicious caller can't
// exhaust disk by streaming hundreds of MB before we look at the body.
const contentLengthHeader = req.headers.get('content-length');
const contentLength = contentLengthHeader ? Number(contentLengthHeader) : NaN;
if (Number.isFinite(contentLength) && contentLength > MAX_FILE_SIZE) {
return NextResponse.json(
{ error: `File exceeds ${MAX_FILE_SIZE} byte cap (Content-Length: ${contentLength})` },
{ status: 413 },
);
}
if (!req.body) {
return NextResponse.json({ error: 'Empty body' }, { status: 400 });
}
// Read the body into a buffer with a hard cap. Filesystem deployments are
// small-tenant (single-node only — see FilesystemBackend boot guard) so
// a 50 MB ceiling fits comfortably in heap; no streaming needed.
let buffer: Buffer;
try {
const chunks: Buffer[] = [];
let total = 0;
const reader = req.body.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) break;
total += value.byteLength;
if (total > MAX_FILE_SIZE) {
try {
await reader.cancel();
} catch {
/* ignore */
}
return NextResponse.json(
{ error: `File exceeds ${MAX_FILE_SIZE} byte cap` },
{ status: 413 },
);
}
chunks.push(Buffer.from(value));
}
buffer = Buffer.concat(chunks);
} catch (err) {
logger.warn({ err, key: payload.k }, 'Storage proxy upload read failed');
return NextResponse.json({ error: 'Upload read failed' }, { status: 400 });
}
// Magic-byte gate: when the token was minted with `c=application/pdf`
// (the only consumer today — berth PDFs + brochures), refuse anything
// that isn't actually a PDF. Mirrors the post-upload check in
// berth-pdf.service.ts so the two paths behave identically.
if (payload.c === 'application/pdf' && !isPdfMagic(buffer)) {
return NextResponse.json(
{ error: 'Uploaded file failed PDF magic-byte check (does not start with %PDF-).' },
{ status: 400 },
);
}
try {
await backend.put(payload.k, buffer, {
contentType: payload.c ?? 'application/octet-stream',
});
} catch (err) {
logger.error({ err, key: payload.k }, 'Storage proxy upload write failed');
return NextResponse.json({ error: 'Upload write failed' }, { status: 500 });
}
return NextResponse.json({ ok: true, key: payload.k, sizeBytes: buffer.length }, { status: 200 });
}
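`isPdfMagic` is referenced but not shown; a minimal version, assuming it checks only the standard `%PDF-` header bytes:

```typescript
function isPdfMagic(buf: Buffer): boolean {
  // Every PDF starts with the ASCII bytes "%PDF-" (e.g. "%PDF-1.7").
  const magic = Buffer.from('%PDF-', 'ascii');
  return buf.length >= magic.length && buf.subarray(0, magic.length).equals(magic);
}
```

This rejects obvious mismatches (a renamed PNG, an HTML error page) but is not a structural validation of the PDF.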

View File

@@ -0,0 +1,44 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { archiveBrochure, getBrochure, updateBrochure } from '@/lib/services/brochures.service';
import { updateBrochureSchema } from '@/lib/validators/brochures';
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (_req, ctx, params) => {
try {
const id = params.id!;
const data = await getBrochure(ctx.portId, id);
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}),
);
export const PATCH = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx, params) => {
try {
const id = params.id!;
const input = await parseBody(req, updateBrochureSchema);
const data = await updateBrochure(ctx.portId, id, input);
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}),
);
export const DELETE = withAuth(
withPermission('admin', 'manage_settings', async (_req, ctx, params) => {
try {
const id = params.id!;
await archiveBrochure(ctx.portId, id);
return NextResponse.json({ success: true });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,68 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import {
generateBrochureStorageKey,
registerBrochureVersion,
} from '@/lib/services/brochures.service';
import { registerBrochureVersionSchema } from '@/lib/validators/brochures';
import { getStorageBackend } from '@/lib/storage';
import { getSalesContentConfig } from '@/lib/services/sales-email-config.service';

/**
 * Two-step upload (per §11.1):
 * 1. GET (no body) — server returns a fresh storage key + presigned URL.
 * 2. POST (metadata) — after the browser PUTs to the URL, register the
 *    version row server-side.
 *
 * Direct-to-storage uploads bypass Next.js's body-size limit; the server
 * never holds the 20 MB+ payload in memory.
 */
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (_req, ctx, params) => {
try {
const id = params.id!;
const content = await getSalesContentConfig(ctx.portId);
const storageKey = await generateBrochureStorageKey(ctx.portId, id);
const storage = await getStorageBackend();
const { url } = await storage.presignUpload(storageKey, {
expirySeconds: 900,
contentType: 'application/pdf',
});
return NextResponse.json({
data: {
storageKey,
uploadUrl: url,
method: 'PUT',
maxBytes: content.brochureMaxUploadMb * 1024 * 1024,
},
});
} catch (error) {
return errorResponse(error);
}
}),
);
export const POST = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx, params) => {
try {
const id = params.id!;
const input = await parseBody(req, registerBrochureVersionSchema);
const data = await registerBrochureVersion({
portId: ctx.portId,
brochureId: id,
storageKey: input.storageKey,
fileName: input.fileName,
fileSizeBytes: input.fileSizeBytes,
contentSha256: input.contentSha256,
uploadedBy: ctx.userId,
});
return NextResponse.json({ data }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
}),
);
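The `contentSha256` the register step expects can be computed on either side of the wire; a Node-flavored sketch (assumption: the service expects a lowercase hex digest, and the browser path would use `crypto.subtle.digest` instead):

```typescript
import { createHash } from 'node:crypto';

// Hex SHA-256 of the uploaded bytes, computed before the register POST so
// the server can pin the exact content the browser PUT to storage.
function sha256Hex(bytes: Buffer): string {
  return createHash('sha256').update(bytes).digest('hex');
}
```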

View File

@@ -0,0 +1,36 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { createBrochure, listBrochures } from '@/lib/services/brochures.service';
import { createBrochureSchema } from '@/lib/validators/brochures';
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (_req, ctx) => {
try {
const data = await listBrochures(ctx.portId, { includeArchived: true });
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}),
);
export const POST = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const input = await parseBody(req, createBrochureSchema);
const data = await createBrochure({
portId: ctx.portId,
label: input.label,
description: input.description ?? null,
isDefault: input.isDefault,
createdBy: ctx.userId,
});
return NextResponse.json({ data }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,74 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import {
getSalesEmailConfig,
getSalesImapConfig,
getSalesContentConfig,
redactSalesConfigForResponse,
updateSalesEmailConfig,
} from '@/lib/services/sales-email-config.service';
import { updateSalesEmailConfigSchema } from '@/lib/validators/sales-email-config';
/**
* GET /api/v1/admin/email/sales-config
*
* Returns the redacted view of the sales-email config. Per §14.10
* reps can't see the decrypted password — the response only carries
* `*PassIsSet` boolean markers via `redactSalesConfigForResponse`.
*
* Today this endpoint is admin-only because it's consumed only by the
* admin UI panel (`src/components/admin/sales-email-config-card.tsx`).
* A future rep-facing surface that needs the from-address or body
* templates can split into a separate `/email/sales-config/preview`
* endpoint scoped to `email.view` — keeping the admin endpoint locked
* to `manage_settings` avoids accidentally widening secret-adjacent
* surfaces (e.g. the SMTP host name itself can be a leak vector).
*/
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (_req, ctx) => {
try {
const [email, imap, content] = await Promise.all([
getSalesEmailConfig(ctx.portId),
getSalesImapConfig(ctx.portId),
getSalesContentConfig(ctx.portId),
]);
const redacted = redactSalesConfigForResponse(email, imap, content);
return NextResponse.json({ data: redacted });
} catch (error) {
return errorResponse(error);
}
}),
);
/**
* PATCH /api/v1/admin/email/sales-config
*
* Per-port admin only. Encrypts SMTP/IMAP passwords via AES-256-GCM before
* storage; the API never returns decrypted secrets (mirror enforcement on
* the GET handler).
*/
export const PATCH = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const input = await parseBody(req, updateSalesEmailConfigSchema);
await updateSalesEmailConfig(ctx.portId, input, {
userId: ctx.userId,
portId: ctx.portId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
// Return the freshly-redacted view so the UI can re-render.
const [email, imap, content] = await Promise.all([
getSalesEmailConfig(ctx.portId),
getSalesImapConfig(ctx.portId),
getSalesContentConfig(ctx.portId),
]);
return NextResponse.json({ data: redactSalesConfigForResponse(email, imap, content) });
} catch (error) {
return errorResponse(error);
}
}),
);
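`redactSalesConfigForResponse` isn't in this diff; the pattern it implements (per the §14.10 note above, secrets replaced by `*PassIsSet` markers) looks roughly like the sketch below. Field names here are illustrative, not the real schema:

```typescript
interface StoredSmtpConfig {
  host: string;
  fromAddress: string;
  smtpPassEncrypted: string | null; // AES-256-GCM ciphertext, never returned
}

function redactSmtp(cfg: StoredSmtpConfig) {
  const { smtpPassEncrypted, ...rest } = cfg;
  // Replace the secret with a boolean marker so the UI can show
  // "password set" without ever receiving ciphertext or plaintext.
  return { ...rest, smtpPassIsSet: smtpPassEncrypted !== null };
}
```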

View File

@@ -0,0 +1,40 @@
/**
* Admin-triggered storage migration. Same code path as `scripts/migrate-storage.ts`
* (both delegate to `runMigration()` in `@/lib/storage/migrate`). Body:
* { from: 's3'|'filesystem', to: 's3'|'filesystem', dryRun?: boolean }
*
* Super-admin only. The `/[portSlug]/admin` segment is already gated; this
* route enforces the same constraint defensively.
*/
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { runMigration } from '@/lib/storage/migrate';
const schema = z.object({
from: z.enum(['s3', 'filesystem']),
to: z.enum(['s3', 'filesystem']),
dryRun: z.boolean().default(false),
});
export const runtime = 'nodejs';
export const POST = withAuth(async (req, ctx) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Super admin only');
}
const body = await parseBody(req, schema);
if (body.from === body.to) {
return NextResponse.json({ error: 'from and to must differ' }, { status: 400 });
}
const result = await runMigration({ ...body, userId: ctx.userId });
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -0,0 +1,72 @@
/**
* Admin storage status + connection test. Super-admin only.
*
* GET /api/v1/admin/storage — current backend + capacity stats
 * POST /api/v1/admin/storage — exercise list/put/get/delete on s3
*/
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { TABLES_WITH_STORAGE_KEYS } from '@/lib/storage/migrate';
import { getStorageBackend } from '@/lib/storage';
import { S3Backend } from '@/lib/storage/s3';
import { db } from '@/lib/db';
import { sql } from 'drizzle-orm';
export const runtime = 'nodejs';
export const GET = withAuth(async (_req, ctx) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Super admin only');
}
const backend = await getStorageBackend();
// Aggregate row count across every storage-bearing table. Per-object byte
// sizes aren't tracked in the DB, so `totalBytes` is reported as a constant 0.
let fileCount = 0;
const totalBytes = 0;
for (const tbl of TABLES_WITH_STORAGE_KEYS) {
const result = await db.execute(
sql.raw(
`SELECT COUNT(*)::bigint AS n FROM ${tbl.table} WHERE ${tbl.keyColumn} IS NOT NULL`,
),
);
const rows = (
Array.isArray(result) ? result : ((result as { rows?: unknown[] }).rows ?? [])
) as Array<{ n: number | string }>;
fileCount += Number(rows[0]?.n ?? 0);
}
return NextResponse.json({
data: {
backend: backend.name,
fileCount,
totalBytes,
tablesTracked: TABLES_WITH_STORAGE_KEYS.map((t) => t.table),
},
});
} catch (error) {
return errorResponse(error);
}
});
export const POST = withAuth(async (_req, ctx) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Super admin only');
}
const backend = await getStorageBackend();
if (!(backend instanceof S3Backend)) {
return NextResponse.json(
{ ok: false, error: 'Test connection only available for S3 backend' },
{ status: 400 },
);
}
const result = await backend.healthCheck();
return NextResponse.json(result);
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -0,0 +1,70 @@
/**
* Returns a presigned URL the browser can use to PUT a PDF directly to the
* active storage backend. The URL is constrained by content-length-range up
* to `system_settings.berth_pdf_max_upload_mb` (default 15 MB) per §11.1.
*
* For S3 backends this is a true signed URL; for filesystem backends it's a
* CRM-internal proxy URL with an HMAC token (see `FilesystemBackend`).
*/
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { berths } from '@/lib/db/schema/berths';
import { eq } from 'drizzle-orm';
import { errorResponse, NotFoundError, ValidationError } from '@/lib/errors';
import { getMaxUploadMb } from '@/lib/services/berth-pdf.service';
import { getStorageBackend } from '@/lib/storage';
interface PostBody {
fileName: string;
/** Size hint in bytes — used to early-reject oversized uploads before we
* burn a presigned URL. */
sizeBytes?: number;
}
export const postHandler: RouteHandler = async (req, _ctx, params) => {
try {
const body = (await req.json()) as Partial<PostBody>;
const fileName = (body.fileName ?? '').trim();
if (!fileName) throw new ValidationError('fileName is required');
const berthRow = await db.query.berths.findFirst({ where: eq(berths.id, params.id!) });
if (!berthRow) throw new NotFoundError('Berth');
const maxMb = await getMaxUploadMb(berthRow.portId);
const maxBytes = maxMb * 1024 * 1024;
if (typeof body.sizeBytes === 'number' && body.sizeBytes > maxBytes) {
throw new ValidationError(
`File exceeds ${maxMb} MB upload cap (got ${(body.sizeBytes / 1024 / 1024).toFixed(1)} MB).`,
);
}
// Provisional version number: the actual row insert happens in POST
// /pdf-versions and re-computes via SELECT max+1 inside a transaction,
// so a race between two reps just shifts which one wins the version
// slot. The storage key is crypto.randomUUID()-namespaced, so collisions
// in the storage layer are practically impossible.
const sanitized = fileName.replace(/[^a-zA-Z0-9._-]/g, '_').slice(0, 200) || 'berth.pdf';
const storageKey = `berths/${params.id!}/uploads/${crypto.randomUUID()}_${sanitized}`;
const backend = await getStorageBackend();
const presigned = await backend.presignUpload(storageKey, {
contentType: 'application/pdf',
expirySeconds: 900,
});
return NextResponse.json({
data: {
url: presigned.url,
method: presigned.method,
storageKey,
maxBytes,
backend: backend.name,
},
});
} catch (error) {
return errorResponse(error);
}
};
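The key derivation above inlines its filename sanitization; pulled out for illustration (same regex and cap as the handler):

```typescript
function sanitizeUploadName(fileName: string): string {
  // Whitelist: alphanumerics, dot, underscore, dash. Everything else
  // (spaces, slashes, accents, control chars) becomes '_', keeping the
  // storage key shell- and URL-safe. Cap at 200 chars; fall back to a
  // default when the name sanitizes away entirely.
  return fileName.replace(/[^a-zA-Z0-9._-]/g, '_').slice(0, 200) || 'berth.pdf';
}
```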

View File

@@ -0,0 +1,5 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { postHandler } from './handlers';
export const POST = withAuth(withPermission('berths', 'edit', postHandler));

View File

@@ -0,0 +1,14 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { rollbackToVersion } from '@/lib/services/berth-pdf.service';
export const postHandler: RouteHandler = async (_req, _ctx, params) => {
try {
const result = await rollbackToVersion(params.id!, params.versionId!);
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
};

View File

@@ -0,0 +1,5 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { postHandler } from './handlers';
export const POST = withAuth(withPermission('berths', 'edit', postHandler));

View File

@@ -0,0 +1,88 @@
/**
* Route handlers for `/api/v1/berths/[id]/pdf-versions` (Phase 6b).
*
* Lives in handlers.ts (not route.ts) so integration tests can call them
* directly, bypassing the auth/permission middleware (per CLAUDE.md
* "Route handler exports" convention).
*/
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse, ValidationError } from '@/lib/errors';
import { listBerthPdfVersions, uploadBerthPdf } from '@/lib/services/berth-pdf.service';
interface PostBody {
storageKey: string;
fileName: string;
fileSizeBytes: number;
sha256: string;
parseResults?: {
engine: 'acroform' | 'ocr' | 'ai';
extracted?: Record<string, unknown>;
meanConfidence?: number;
warnings?: string[];
};
}
export const getHandler: RouteHandler = async (_req, _ctx, params) => {
try {
const versions = await listBerthPdfVersions(params.id!);
return NextResponse.json({ data: versions });
} catch (error) {
return errorResponse(error);
}
};
export const postHandler: RouteHandler = async (req, ctx, params) => {
try {
const body = (await req.json()) as Partial<PostBody>;
if (!body.storageKey || !body.fileName) {
throw new ValidationError('storageKey and fileName are required');
}
if (typeof body.fileSizeBytes !== 'number' || body.fileSizeBytes <= 0) {
throw new ValidationError('fileSizeBytes must be a positive integer');
}
if (!body.sha256 || typeof body.sha256 !== 'string') {
throw new ValidationError('sha256 is required');
}
const result = await uploadBerthPdf({
berthId: params.id!,
storageKey: body.storageKey,
fileName: body.fileName,
fileSizeBytes: body.fileSizeBytes,
sha256: body.sha256,
uploadedBy: ctx.userId,
parseResult: body.parseResults
? {
engine: body.parseResults.engine,
// Reconstruct just enough of the ParseResult shape to round-trip
// through serialization; the rep already saw the conflicts in the
// diff dialog, so storing the engine + extracted fields is all we
// need for audit.
fields: Object.fromEntries(
Object.entries(body.parseResults.extracted ?? {}).map(([k, v]) => {
if (v && typeof v === 'object' && 'value' in v) {
const obj = v as { value: unknown; confidence?: number };
return [
k,
{
value: obj.value as never,
confidence: typeof obj.confidence === 'number' ? obj.confidence : 1,
engine: body.parseResults!.engine,
},
];
}
return [k, undefined];
}),
) as never,
meanConfidence: body.parseResults.meanConfidence ?? 1,
warnings: body.parseResults.warnings ?? [],
}
: undefined,
});
return NextResponse.json({ data: result }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
};

View File

@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse, ValidationError } from '@/lib/errors';
import { applyParseResults, type ExtractedBerthFields } from '@/lib/services/berth-pdf.service';
interface PostBody {
versionId: string;
fieldsToApply: Partial<ExtractedBerthFields>;
}
export const postHandler: RouteHandler = async (req, _ctx, params) => {
try {
const body = (await req.json()) as Partial<PostBody>;
if (!body.versionId) throw new ValidationError('versionId is required');
if (!body.fieldsToApply || typeof body.fieldsToApply !== 'object') {
throw new ValidationError('fieldsToApply must be an object');
}
const result = await applyParseResults(params.id!, body.versionId, body.fieldsToApply);
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
};

View File

@@ -0,0 +1,5 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { postHandler } from './handlers';
export const POST = withAuth(withPermission('berths', 'edit', postHandler));

View File

@@ -0,0 +1,6 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getHandler, postHandler } from './handlers';
export const GET = withAuth(withPermission('berths', 'view', getHandler));
export const POST = withAuth(withPermission('berths', 'edit', postHandler));

View File

@@ -0,0 +1,33 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { sendBerthPdf } from '@/lib/services/document-sends.service';
import { sendBerthPdfSchema } from '@/lib/validators/document-sends';
/**
* POST /api/v1/document-sends/berth-pdf
*
* Sends the active per-berth PDF version to a client recipient. The body
* markdown goes through the merge-field expander + sanitizer
* (`renderEmailBody`) before reaching nodemailer (§14.7 critical mitigation:
* body XSS).
*/
export const POST = withAuth(async (req, ctx) => {
try {
const input = await parseBody(req, sendBerthPdfSchema);
const result = await sendBerthPdf({
portId: ctx.portId,
berthId: input.berthId,
recipient: input.recipient,
customBodyMarkdown: input.customBodyMarkdown,
sentBy: ctx.userId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -0,0 +1,31 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { sendBrochure } from '@/lib/services/document-sends.service';
import { sendBrochureSchema } from '@/lib/validators/document-sends';
/**
* POST /api/v1/document-sends/brochure
*
* Sends a brochure (default or specified) to a client recipient. Same
* sanitization + audit-row pipeline as the berth-pdf endpoint.
*/
export const POST = withAuth(async (req, ctx) => {
try {
const input = await parseBody(req, sendBrochureSchema);
const result = await sendBrochure({
portId: ctx.portId,
brochureId: input.brochureId,
recipient: input.recipient,
customBodyMarkdown: input.customBodyMarkdown,
sentBy: ctx.userId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -0,0 +1,31 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { previewBody } from '@/lib/services/document-sends.service';
import { previewBodySchema } from '@/lib/validators/document-sends';
/**
* POST /api/v1/document-sends/preview
*
* Renders a body for the dry-run UI without actually sending. Returns the
* sanitized HTML, the post-merge markdown, and the list of unresolved
* `{{tokens}}` so the UI can block submit until the rep fills them in
* (§14.7 mitigation).
*/
export const POST = withAuth(async (req, ctx) => {
try {
const input = await parseBody(req, previewBodySchema);
const result = await previewBody(
ctx.portId,
input.documentKind,
input.recipient,
input.customBodyMarkdown ?? null,
{ berthId: input.berthId, brochureLabel: input.brochureId },
);
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -0,0 +1,23 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listSends } from '@/lib/services/document-sends.service';
import { listSendsQuerySchema } from '@/lib/validators/document-sends';
export const GET = withAuth(async (req, ctx) => {
try {
const query = parseQuery(req, listSendsQuerySchema);
const data = await listSends({
portId: ctx.portId,
clientId: query.clientId,
interestId: query.interestId,
berthId: query.berthId,
limit: query.limit,
});
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -2,21 +2,74 @@ import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { exportPdf } from '@/lib/services/expense-export';
import { listExpensesSchema } from '@/lib/validators/expenses';
import { streamExpensePdf } from '@/lib/services/expense-pdf.service';
import { exportExpensePdfSchema } from '@/lib/validators/expenses';
/**
* POST /api/v1/expenses/export/pdf
*
* Streams the expense report PDF directly to the client — body bytes
* leave the process as pdfkit writes them, so the route is safe for
* hundreds of expenses with full-resolution receipt images. See
* `expense-pdf.service.ts` for the memory-budget design.
*
* Request body shape (zod-validated):
* {
* expenseIds?: string[] // explicit selection (preferred)
* filter?: {...} // listExpenses-style filter when no ids
* options: {
* documentName, subheader?, groupBy, includeReceipts,
* includeReceiptContents, includeSummary, includeDetails,
* includeProcessingFee, targetCurrency, pageFormat,
* }
* }
*
* Response: `application/pdf` binary stream + Content-Disposition.
*/
export const runtime = 'nodejs';
export const dynamic = 'force-dynamic';
export const POST = withAuth(
withPermission('expenses', 'view', async (req, ctx) => {
withPermission('expenses', 'export', async (req, ctx) => {
try {
const body = await req.json().catch(() => ({}));
const query = listExpensesSchema.parse(body);
const pdf = await exportPdf(ctx.portId, query);
const input = exportExpensePdfSchema.parse(body);
return new NextResponse(Buffer.from(pdf), {
const { stream, suggestedFilename } = await streamExpensePdf({
portId: ctx.portId,
expenseIds: input.expenseIds,
filter: input.filter
? {
dateFrom: input.filter.dateFrom ?? null,
dateTo: input.filter.dateTo ?? null,
category: input.filter.category ?? null,
paymentStatus: input.filter.paymentStatus ?? null,
payer: input.filter.payer ?? null,
includeArchived: input.filter.includeArchived ?? false,
}
: undefined,
options: input.options,
// Forward the request abort signal so the streaming PDF builder
// stops fetching/resizing receipts the moment the client disconnects
// (otherwise an aborted 1000-receipt export keeps the worker busy
// for minutes after the user navigated away — see audit finding 2).
signal: req.signal,
});
// Content-Disposition filename hardening: the validator caps length
// but `\s` matches CR/LF, which would let an attacker forge response
// headers. Strip everything that isn't word/space/dot/dash, AND set
// the RFC 5987 `filename*` so a UTF-8 body still survives.
const safeFilename = suggestedFilename.replace(/[^\w. \-]+/g, '_');
const disposition = `attachment; filename="${safeFilename}"; filename*=UTF-8''${encodeURIComponent(suggestedFilename)}`;
return new NextResponse(stream, {
status: 200,
headers: {
'Content-Type': 'application/pdf',
'Content-Disposition': `attachment; filename="expenses-${Date.now()}.pdf"`,
'Content-Disposition': disposition,
'Cache-Control': 'private, no-store, max-age=0',
'X-Content-Type-Options': 'nosniff',
},
});
} catch (error) {

View File
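The Content-Disposition hardening in the expense-export hunk above can be sketched in isolation (assumption: the caller's filename is already length-capped by the validator):

```typescript
function contentDisposition(fileName: string): string {
  // ASCII fallback: strip anything outside word/space/dot/dash so CR/LF
  // (and other header-breaking bytes) can never reach the response header.
  const ascii = fileName.replace(/[^\w. \-]+/g, '_');
  // RFC 5987 filename*: lets a UTF-8 name survive for modern clients,
  // percent-encoded so it is also injection-safe.
  return `attachment; filename="${ascii}"; filename*=UTF-8''${encodeURIComponent(fileName)}`;
}
```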

@@ -0,0 +1,156 @@
import { NextResponse } from 'next/server';
import { and, eq } from 'drizzle-orm';
import { z } from 'zod';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse, NotFoundError, ValidationError } from '@/lib/errors';
import { db } from '@/lib/db';
import { interests, interestBerths } from '@/lib/db/schema/interests';
import { berths } from '@/lib/db/schema/berths';
import { removeInterestBerth, upsertInterestBerth } from '@/lib/services/interest-berths.service';
import { createAuditLog } from '@/lib/audit';
import { emitToRoom } from '@/lib/socket/server';
// ─── Schemas ────────────────────────────────────────────────────────────────
/**
* Partial update of a junction row's role flags + EOI bypass fields. Every
 * field is optional; pass only the ones the rep wants to change.
*
* `eoiBypassReason` is a tri-state:
* - omitted → no change
* - non-empty → record bypass (server stamps `eoiBypassedAt = now()` and
* `eoiBypassedBy = caller`)
* - null → clear bypass (also clears `eoiBypassedBy` / `eoiBypassedAt`)
*/
const patchBerthSchema = z
.object({
isPrimary: z.boolean().optional(),
isSpecificInterest: z.boolean().optional(),
isInEoiBundle: z.boolean().optional(),
eoiBypassReason: z.string().max(2000).nullable().optional(),
})
.refine((v) => Object.values(v).some((x) => x !== undefined), {
message: 'At least one field must be provided.',
});
// ─── Helpers ────────────────────────────────────────────────────────────────
async function loadScopedRow(interestId: string, berthId: string, portId: string) {
// Verify interest port-scope first so unrelated 404s look identical to a
// truly-missing row (enumeration prevention — plan §14.10).
const interest = await db.query.interests.findFirst({
where: eq(interests.id, interestId),
});
if (!interest || interest.portId !== portId) {
throw new NotFoundError('Interest');
}
const link = await db.query.interestBerths.findFirst({
where: and(eq(interestBerths.interestId, interestId), eq(interestBerths.berthId, berthId)),
});
if (!link) {
throw new NotFoundError('Berth link');
}
// Also confirm the berth itself is in-port; defensive against a junction row
// pointing at a foreign berth (shouldn't happen, but cheap to check).
const berth = await db.query.berths.findFirst({
where: and(eq(berths.id, berthId), eq(berths.portId, portId)),
});
if (!berth) {
throw new NotFoundError('Berth');
}
return { interest, link, berth };
}
// ─── PATCH /api/v1/interests/[id]/berths/[berthId] ──────────────────────────
export const patchHandler: RouteHandler = async (req, ctx, params) => {
try {
const interestId = params.id!;
const berthId = params.berthId!;
const body = await parseBody(req, patchBerthSchema);
const { interest } = await loadScopedRow(interestId, berthId, ctx.portId);
// Plan §5.5: the bypass control is only available once the interest's
// primary EOI is signed. Defend the API too — never trust the UI to
// gate this.
if (body.eoiBypassReason !== undefined && interest.eoiStatus !== 'signed') {
throw new ValidationError('EOI bypass requires a signed primary EOI on the interest');
}
const updated = await upsertInterestBerth(interestId, berthId, {
isPrimary: body.isPrimary,
isSpecificInterest: body.isSpecificInterest,
isInEoiBundle: body.isInEoiBundle,
eoiBypassReason: body.eoiBypassReason,
      // Only touch eoiBypassedBy when the bypass reason itself is in the patch;
      // otherwise an unrelated PATCH would silently clear the recorded user.
      eoiBypassedBy:
        body.eoiBypassReason === undefined ? undefined : body.eoiBypassReason ? ctx.userId : null,
});
void createAuditLog({
userId: ctx.userId,
portId: ctx.portId,
action: 'update',
entityType: 'interest',
entityId: interestId,
newValue: { berthId, ...body },
metadata: { type: 'berth_link_updated' },
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
emitToRoom(`port:${ctx.portId}`, 'interest:berthLinkUpdated', {
interestId,
berthId,
});
void import('@/lib/services/webhook-dispatch').then(({ dispatchWebhookEvent }) =>
dispatchWebhookEvent(ctx.portId, 'interest:berthLinkUpdated', {
interestId,
berthId,
}),
);
return NextResponse.json({ data: updated });
} catch (error) {
return errorResponse(error);
}
};
// ─── DELETE /api/v1/interests/[id]/berths/[berthId] ─────────────────────────
export const deleteHandler: RouteHandler = async (_req, ctx, params) => {
try {
const interestId = params.id!;
const berthId = params.berthId!;
await loadScopedRow(interestId, berthId, ctx.portId);
await removeInterestBerth(interestId, berthId);
void createAuditLog({
userId: ctx.userId,
portId: ctx.portId,
action: 'update',
entityType: 'interest',
entityId: interestId,
oldValue: { berthId },
metadata: { type: 'berth_removed_from_interest' },
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
emitToRoom(`port:${ctx.portId}`, 'interest:berthUnlinked', {
interestId,
berthId,
});
void import('@/lib/services/webhook-dispatch').then(({ dispatchWebhookEvent }) =>
dispatchWebhookEvent(ctx.portId, 'interest:berthUnlinked', {
interestId,
berthId,
}),
);
return new NextResponse(null, { status: 204 });
} catch (error) {
return errorResponse(error);
}
};

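The uniform-404 scoping that `loadScopedRow` applies (a missing row and a wrong-port row are indistinguishable to the caller) can be distilled into a small helper. This is an illustrative sketch only; `requireInPort` and this local `NotFoundError` are hypothetical names, not exports of this PR:

```typescript
// Enumeration prevention: any scope failure surfaces as the same NotFound,
// so a caller can't tell "exists in another port" apart from "doesn't exist".
class NotFoundError extends Error {
  constructor(entity: string) {
    super(`${entity} not found`);
  }
}

interface Scoped {
  id: string;
  portId: string;
}

// Returns the row only when it exists AND belongs to the caller's port.
function requireInPort<T extends Scoped>(
  row: T | undefined,
  portId: string,
  entity: string,
): T {
  if (!row || row.portId !== portId) throw new NotFoundError(entity);
  return row;
}
```

The real handlers repeat this check once per entity (interest, junction row, berth), which is why a 404 from any of the three looks identical on the wire.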
View File

@@ -0,0 +1,6 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { deleteHandler, patchHandler } from './handlers';
export const PATCH = withAuth(withPermission('interests', 'edit', patchHandler));
export const DELETE = withAuth(withPermission('interests', 'edit', deleteHandler));

View File

@@ -0,0 +1,109 @@
import { NextResponse } from 'next/server';
import { and, eq } from 'drizzle-orm';
import { z } from 'zod';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse, NotFoundError, ValidationError } from '@/lib/errors';
import { db } from '@/lib/db';
import { interests } from '@/lib/db/schema/interests';
import { berths } from '@/lib/db/schema/berths';
import { listBerthsForInterest, upsertInterestBerth } from '@/lib/services/interest-berths.service';
import { createAuditLog } from '@/lib/audit';
import { emitToRoom } from '@/lib/socket/server';
// ─── Schemas ────────────────────────────────────────────────────────────────
const addBerthSchema = z.object({
berthId: z.string().min(1),
/** Drives the public-map "Under Offer" sub-status. See plan §5.4. */
isSpecificInterest: z.boolean(),
});
// ─── GET /api/v1/interests/[id]/berths ──────────────────────────────────────
//
// Returns the linked-berths list (plan §5.5) along with the parent interest's
// `eoiStatus` so the UI can decide whether to show the EOI-bypass control.
// Tenant-scoped: 404 when the interest doesn't belong to the caller's port,
// matching the recommender route's enumeration-prevention behaviour.
export const listHandler: RouteHandler = async (_req, ctx, params) => {
try {
const interestId = params.id!;
const interest = await db.query.interests.findFirst({
where: eq(interests.id, interestId),
});
if (!interest || interest.portId !== ctx.portId) {
throw new NotFoundError('Interest');
}
const links = await listBerthsForInterest(interestId);
return NextResponse.json({
data: links,
meta: { eoiStatus: interest.eoiStatus },
});
} catch (error) {
return errorResponse(error);
}
};
// ─── POST /api/v1/interests/[id]/berths ─────────────────────────────────────
//
// Add a (non-primary) berth link to the interest. Defaults to
// `isInEoiBundle=false`, `isPrimary=false`; the rep can flip these later via
// the linked-berths list (PATCH route below).
export const addHandler: RouteHandler = async (req, ctx, params) => {
try {
const body = await parseBody(req, addBerthSchema);
const interestId = params.id!;
const interest = await db.query.interests.findFirst({
where: eq(interests.id, interestId),
});
if (!interest || interest.portId !== ctx.portId) {
throw new NotFoundError('Interest');
}
// Tenant scope: berth must belong to this port (never trust a client-
// supplied id to cross port boundaries — plan §14.10).
const berth = await db.query.berths.findFirst({
where: and(eq(berths.id, body.berthId), eq(berths.portId, ctx.portId)),
});
if (!berth) {
throw new ValidationError('berthId not found in this port');
}
const link = await upsertInterestBerth(interestId, body.berthId, {
isSpecificInterest: body.isSpecificInterest,
addedBy: ctx.userId,
});
void createAuditLog({
userId: ctx.userId,
portId: ctx.portId,
action: 'update',
entityType: 'interest',
entityId: interestId,
newValue: { berthId: body.berthId, isSpecificInterest: body.isSpecificInterest },
metadata: { type: 'berth_added_to_interest' },
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
emitToRoom(`port:${ctx.portId}`, 'interest:berthLinked', {
interestId,
berthId: body.berthId,
});
// Outbound webhook: the legacy /link-berth path dispatched
// `interest.berth_linked` and external integrations subscribe to it.
// The new junction-add path must keep that contract.
void import('@/lib/services/webhook-dispatch').then(({ dispatchWebhookEvent }) =>
dispatchWebhookEvent(ctx.portId, 'interest:berthLinked', {
interestId,
berthId: body.berthId,
}),
);
return NextResponse.json({ data: link }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
};

View File

@@ -0,0 +1,6 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { addHandler, listHandler } from './handlers';
export const GET = withAuth(withPermission('interests', 'view', listHandler));
export const POST = withAuth(withPermission('interests', 'edit', addHandler));

View File

@@ -0,0 +1,44 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { recommendBerths } from '@/lib/services/berth-recommender.service';
/**
* POST body — mirrors `RecommendBerthsArgs` minus the `interestId` (route
* param) and `portId` (resolved from the auth context — never trust a
* client-supplied port, plan §14.10).
*/
const recommendBerthsSchema = z.object({
topN: z.number().int().min(1).max(999).optional(),
maxOversizePct: z.number().min(0).max(1000).optional(),
showLateStage: z.boolean().optional(),
amenityFilters: z
.object({
minPowerCapacityKw: z.number().min(0).optional(),
requiredVoltage: z.number().int().min(0).optional(),
requiredAccess: z.string().min(1).optional(),
requiredMooringType: z.string().min(1).optional(),
requiredCleatCapacity: z.string().min(1).optional(),
})
.optional(),
});
// POST /api/v1/interests/[id]/recommend-berths
export const POST = withAuth(
withPermission('interests', 'view', async (req, ctx, params) => {
try {
const body = await parseBody(req, recommendBerthsSchema);
const data = await recommendBerths({
interestId: params.id!,
portId: ctx.portId,
...body,
});
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}),
);

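To make the accepted ranges in `recommendBerthsSchema` concrete, here is a hand-rolled mirror of its two numeric bounds. Purely illustrative: the route itself validates with zod, and `validateRecommendBody` is not a real function in this PR.

```typescript
// Mirrors the zod bounds on the recommend-berths POST body:
//   topN           -> integer in [1, 999]
//   maxOversizePct -> number in [0, 1000]
interface RecommendBody {
  topN?: number;
  maxOversizePct?: number;
  showLateStage?: boolean;
}

function validateRecommendBody(body: RecommendBody): string[] {
  const errors: string[] = [];
  if (
    body.topN !== undefined &&
    (!Number.isInteger(body.topN) || body.topN < 1 || body.topN > 999)
  ) {
    errors.push('topN must be an integer in [1, 999]');
  }
  if (
    body.maxOversizePct !== undefined &&
    (body.maxOversizePct < 0 || body.maxOversizePct > 1000)
  ) {
    errors.push('maxOversizePct must be in [0, 1000]');
  }
  return errors;
}
```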
View File

@@ -0,0 +1,345 @@
'use client';
/**
* Brochures admin panel (Phase 7 §5.8).
*
* Lists every brochure for the port (including archived). Lets a
* `manage_settings` admin:
* - Create new brochures.
* - Upload a new version (direct-to-storage presigned PUT, see §11.1).
* - Mark default / archive.
*/
import { useState } from 'react';
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import { Archive, FileText, Loader2, Plus, Star, Upload } from 'lucide-react';
import { toast } from 'sonner';
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
import { Textarea } from '@/components/ui/textarea';
import { Switch } from '@/components/ui/switch';
import { apiFetch } from '@/lib/api/client';
interface BrochureRow {
id: string;
label: string;
description: string | null;
isDefault: boolean;
archivedAt: string | null;
versionCount: number;
currentVersion: {
id: string;
fileName: string;
fileSizeBytes: number;
uploadedAt: string;
} | null;
}
interface BrochuresResponse {
data: BrochureRow[];
}
interface UploadGrantResponse {
data: { storageKey: string; uploadUrl: string; method: 'PUT'; maxBytes: number };
}
export function BrochuresAdminPanel() {
const queryClient = useQueryClient();
const [createOpen, setCreateOpen] = useState(false);
const brochuresQuery = useQuery<BrochuresResponse>({
queryKey: ['brochures', 'admin'],
queryFn: () => apiFetch('/api/v1/admin/brochures'),
});
const rows = brochuresQuery.data?.data ?? [];
return (
<div className="space-y-4">
<div className="flex justify-end">
<Button onClick={() => setCreateOpen(true)}>
<Plus className="mr-2 h-4 w-4" /> New brochure
</Button>
</div>
{brochuresQuery.isLoading && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" /> Loading…
</div>
)}
{!brochuresQuery.isLoading && rows.length === 0 && (
<Card>
<CardContent className="py-8 text-center text-sm text-muted-foreground">
No brochures yet. Click &ldquo;New brochure&rdquo; to add one.
</CardContent>
</Card>
)}
<div className="space-y-3">
{rows.map((b) => (
<BrochureCard
key={b.id}
brochure={b}
onChange={() => {
void queryClient.invalidateQueries({ queryKey: ['brochures', 'admin'] });
void queryClient.invalidateQueries({ queryKey: ['brochures', 'list'] });
}}
/>
))}
</div>
<CreateBrochureDialog
open={createOpen}
onOpenChange={setCreateOpen}
onCreated={() => {
void queryClient.invalidateQueries({ queryKey: ['brochures', 'admin'] });
}}
/>
</div>
);
}
function BrochureCard({ brochure, onChange }: { brochure: BrochureRow; onChange: () => void }) {
const [uploading, setUploading] = useState(false);
const setDefaultMutation = useMutation({
mutationFn: () =>
apiFetch(`/api/v1/admin/brochures/${brochure.id}`, {
method: 'PATCH',
body: { isDefault: true },
}),
onSuccess: () => {
toast.success('Default brochure updated');
onChange();
},
});
const archiveMutation = useMutation({
mutationFn: () => apiFetch(`/api/v1/admin/brochures/${brochure.id}`, { method: 'DELETE' }),
onSuccess: () => {
toast.success('Brochure archived');
onChange();
},
});
async function handleUpload(file: File) {
setUploading(true);
try {
const grant: UploadGrantResponse = await apiFetch(
`/api/v1/admin/brochures/${brochure.id}/versions`,
);
if (file.size > grant.data.maxBytes) {
throw new Error(
`File is too large. Max is ${(grant.data.maxBytes / 1024 / 1024).toFixed(0)}MB.`,
);
}
// Direct-to-storage PUT (§11.1).
const putRes = await fetch(grant.data.uploadUrl, {
method: 'PUT',
body: file,
headers: { 'Content-Type': 'application/pdf' },
});
if (!putRes.ok) throw new Error(`Upload failed: ${putRes.status}`);
const sha = await sha256Hex(file);
await apiFetch(`/api/v1/admin/brochures/${brochure.id}/versions`, {
method: 'POST',
body: {
storageKey: grant.data.storageKey,
fileName: file.name,
fileSizeBytes: file.size,
contentSha256: sha,
},
});
toast.success('New version uploaded');
onChange();
} catch (err) {
toast.error(err instanceof Error ? err.message : 'Upload failed');
} finally {
setUploading(false);
}
}
return (
<Card className={brochure.archivedAt ? 'opacity-60' : ''}>
<CardHeader>
<CardTitle className="flex items-center justify-between text-base">
<span className="flex items-center gap-2">
<FileText className="h-4 w-4" /> {brochure.label}
{brochure.isDefault && (
<span className="flex items-center gap-1 rounded bg-primary/10 px-2 py-0.5 text-xs text-primary">
<Star className="h-3 w-3" /> default
</span>
)}
{brochure.archivedAt && (
<span className="rounded bg-muted px-2 py-0.5 text-xs text-muted-foreground">
archived
</span>
)}
</span>
<span className="text-xs text-muted-foreground">{brochure.versionCount} versions</span>
</CardTitle>
</CardHeader>
<CardContent className="space-y-2">
{brochure.description && (
<p className="text-sm text-muted-foreground">{brochure.description}</p>
)}
{brochure.currentVersion && (
<p className="text-xs text-muted-foreground">
Latest: {brochure.currentVersion.fileName} (
{(brochure.currentVersion.fileSizeBytes / 1024 / 1024).toFixed(2)} MB,{' '}
{new Date(brochure.currentVersion.uploadedAt).toLocaleDateString()})
</p>
)}
<div className="flex gap-2 pt-2">
{!brochure.archivedAt && (
<>
<label className="cursor-pointer">
<input
type="file"
accept="application/pdf"
className="hidden"
onChange={(e) => {
const file = e.target.files?.[0];
if (file) void handleUpload(file);
e.target.value = '';
}}
/>
<Button asChild variant="outline" size="sm" disabled={uploading}>
<span>
{uploading ? (
<Loader2 className="mr-2 h-3 w-3 animate-spin" />
) : (
<Upload className="mr-2 h-3 w-3" />
)}
Upload version
</span>
</Button>
</label>
{!brochure.isDefault && (
<Button
variant="outline"
size="sm"
onClick={() => setDefaultMutation.mutate()}
disabled={setDefaultMutation.isPending}
>
<Star className="mr-2 h-3 w-3" /> Mark default
</Button>
)}
<Button
variant="outline"
size="sm"
onClick={() => archiveMutation.mutate()}
disabled={archiveMutation.isPending}
>
<Archive className="mr-2 h-3 w-3" /> Archive
</Button>
</>
)}
</div>
</CardContent>
</Card>
);
}
function CreateBrochureDialog({
open,
onOpenChange,
onCreated,
}: {
open: boolean;
onOpenChange: (o: boolean) => void;
onCreated: () => void;
}) {
const [label, setLabel] = useState('');
const [description, setDescription] = useState('');
const [isDefault, setIsDefault] = useState(false);
const createMutation = useMutation({
mutationFn: () =>
apiFetch('/api/v1/admin/brochures', {
method: 'POST',
body: {
label,
description: description || null,
isDefault,
},
}),
onSuccess: () => {
toast.success('Brochure created. Upload a version next.');
setLabel('');
setDescription('');
setIsDefault(false);
onCreated();
onOpenChange(false);
},
});
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent>
<DialogHeader>
<DialogTitle>New brochure</DialogTitle>
<DialogDescription>
Create the brochure container, then upload a PDF version on the card that appears.
</DialogDescription>
</DialogHeader>
<div className="space-y-3">
<div className="space-y-1">
<Label htmlFor="b-label">Label</Label>
<Input
id="b-label"
value={label}
onChange={(e) => setLabel(e.target.value)}
placeholder="General overview"
/>
</div>
<div className="space-y-1">
<Label htmlFor="b-desc">Description (optional)</Label>
<Textarea
id="b-desc"
rows={2}
value={description}
onChange={(e) => setDescription(e.target.value)}
/>
</div>
<div className="flex items-center justify-between">
<Label htmlFor="b-def">Set as default</Label>
<Switch id="b-def" checked={isDefault} onCheckedChange={setIsDefault} />
</div>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button
disabled={!label.trim() || createMutation.isPending}
onClick={() => createMutation.mutate()}
>
{createMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Create
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
async function sha256Hex(file: File): Promise<string> {
const buf = await file.arrayBuffer();
const hash = await crypto.subtle.digest('SHA-256', buf);
return Array.from(new Uint8Array(hash))
.map((b) => b.toString(16).padStart(2, '0'))
.join('');
}

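`handleUpload`'s grant → presigned PUT → register sequence can be sketched as a transport-agnostic function, with `http` injected so the flow runs without a network. `uploadVersion`, `Grant`, and `Http` are hypothetical names for illustration only, not the PR's real client code:

```typescript
// Shape of the upload grant returned by the versions endpoint (assumed).
interface Grant {
  storageKey: string;
  uploadUrl: string;
  maxBytes: number;
}

type Http = (
  url: string,
  init?: { method?: string; body?: unknown },
) => Promise<{ ok: boolean; status: number; json: () => Promise<unknown> }>;

// Three-step direct-to-storage upload:
//   1. GET a grant (presigned URL + storage key + size cap)
//   2. PUT the bytes to the presigned URL
//   3. POST the metadata back to register the new version
async function uploadVersion(
  http: Http,
  grantUrl: string,
  file: { name: string; size: number; bytes: Uint8Array },
): Promise<string> {
  const grantRes = await http(grantUrl);
  const grant = (await grantRes.json()) as { data: Grant };
  if (file.size > grant.data.maxBytes) {
    throw new Error(`File is too large. Max is ${grant.data.maxBytes} bytes.`);
  }
  const putRes = await http(grant.data.uploadUrl, { method: 'PUT', body: file.bytes });
  if (!putRes.ok) throw new Error(`Upload failed: ${putRes.status}`);
  await http(grantUrl, {
    method: 'POST',
    body: { storageKey: grant.data.storageKey, fileName: file.name, fileSizeBytes: file.size },
  });
  return grant.data.storageKey;
}
```

Checking the size cap before the PUT (as the component does) avoids burning bandwidth on an upload the register step would reject anyway.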
View File

@@ -0,0 +1,381 @@
'use client';
/**
* Sales send-from config card (Phase 7 §5.9).
*
* Lives on /[portSlug]/admin/email below the existing noreply transport
* card. Lets per-port admins configure the SMTP/IMAP creds + body templates
* that the document-sends flow uses.
*
* §14.10 enforcement: passwords are write-only. The GET endpoint never
* returns the decrypted value — only a `*PassIsSet` boolean. Empty
* password input means "leave unchanged"; explicit `null` sent over the
* wire means "clear".
*/
import { useEffect, useState } from 'react';
import { Loader2 } from 'lucide-react';
import { toast } from 'sonner';
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
import { Switch } from '@/components/ui/switch';
import { Textarea } from '@/components/ui/textarea';
import { apiFetch } from '@/lib/api/client';
interface SalesConfigResponse {
data: {
email: {
fromAddress: string;
smtpHost: string | null;
smtpPort: number;
smtpSecure: boolean;
smtpUser: string | null;
authMethod: string;
smtpPassIsSet: boolean;
isUsable: boolean;
};
imap: {
imapHost: string | null;
imapPort: number;
imapUser: string | null;
imapPassIsSet: boolean;
isUsable: boolean;
};
content: {
noreplyFromAddress: string;
templateBerthPdfBody: string;
templateBrochureBody: string;
brochureMaxUploadMb: number;
emailAttachThresholdMb: number;
};
};
}
interface FormState {
fromAddress: string;
smtpHost: string;
smtpPort: number | '';
smtpSecure: boolean;
smtpUser: string;
smtpPass: string; // empty = unchanged
imapHost: string;
imapPort: number | '';
imapUser: string;
imapPass: string;
noreplyFromAddress: string;
templateBerthPdfBody: string;
templateBrochureBody: string;
brochureMaxUploadMb: number | '';
emailAttachThresholdMb: number | '';
}
const EMPTY_FORM: FormState = {
fromAddress: '',
smtpHost: '',
smtpPort: 587,
smtpSecure: false,
smtpUser: '',
smtpPass: '',
imapHost: '',
imapPort: 993,
imapUser: '',
imapPass: '',
noreplyFromAddress: '',
templateBerthPdfBody: '',
templateBrochureBody: '',
brochureMaxUploadMb: 50,
emailAttachThresholdMb: 15,
};
export function SalesEmailConfigCard() {
const [loading, setLoading] = useState(true);
const [saving, setSaving] = useState(false);
const [smtpPassSet, setSmtpPassSet] = useState(false);
const [imapPassSet, setImapPassSet] = useState(false);
const [form, setForm] = useState<FormState>(EMPTY_FORM);
async function refresh() {
setLoading(true);
try {
const res: SalesConfigResponse = await apiFetch('/api/v1/admin/email/sales-config');
setSmtpPassSet(res.data.email.smtpPassIsSet);
setImapPassSet(res.data.imap.imapPassIsSet);
setForm({
fromAddress: res.data.email.fromAddress,
smtpHost: res.data.email.smtpHost ?? '',
smtpPort: res.data.email.smtpPort,
smtpSecure: res.data.email.smtpSecure,
smtpUser: res.data.email.smtpUser ?? '',
smtpPass: '',
imapHost: res.data.imap.imapHost ?? '',
imapPort: res.data.imap.imapPort,
imapUser: res.data.imap.imapUser ?? '',
imapPass: '',
noreplyFromAddress: res.data.content.noreplyFromAddress,
templateBerthPdfBody: res.data.content.templateBerthPdfBody,
templateBrochureBody: res.data.content.templateBrochureBody,
brochureMaxUploadMb: res.data.content.brochureMaxUploadMb,
emailAttachThresholdMb: res.data.content.emailAttachThresholdMb,
});
} finally {
setLoading(false);
}
}
useEffect(() => {
void refresh();
}, []);
function update<K extends keyof FormState>(key: K, value: FormState[K]) {
setForm((prev) => ({ ...prev, [key]: value }));
}
async function handleSave() {
setSaving(true);
try {
const payload: Record<string, unknown> = {
fromAddress: form.fromAddress || null,
smtpHost: form.smtpHost || null,
smtpPort: typeof form.smtpPort === 'number' ? form.smtpPort : null,
smtpSecure: form.smtpSecure,
smtpUser: form.smtpUser || null,
imapHost: form.imapHost || null,
imapPort: typeof form.imapPort === 'number' ? form.imapPort : null,
imapUser: form.imapUser || null,
noreplyFromAddress: form.noreplyFromAddress || null,
templateBerthPdfBody: form.templateBerthPdfBody,
templateBrochureBody: form.templateBrochureBody,
brochureMaxUploadMb:
typeof form.brochureMaxUploadMb === 'number' ? form.brochureMaxUploadMb : null,
emailAttachThresholdMb:
typeof form.emailAttachThresholdMb === 'number' ? form.emailAttachThresholdMb : null,
};
// Only send password fields when the user actually typed something.
if (form.smtpPass !== '') payload.smtpPass = form.smtpPass;
if (form.imapPass !== '') payload.imapPass = form.imapPass;
await apiFetch('/api/v1/admin/email/sales-config', { method: 'PATCH', body: payload });
toast.success('Sales email settings saved');
await refresh();
} catch (err) {
toast.error(err instanceof Error ? err.message : 'Save failed');
} finally {
setSaving(false);
}
}
if (loading) {
return (
<Card>
<CardContent className="flex items-center gap-2 py-6 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" /> Loading sales email config…
</CardContent>
</Card>
);
}
return (
<div className="space-y-4">
<Card>
<CardHeader>
<CardTitle>Sales send-from account</CardTitle>
<CardDescription>
SMTP credentials for human-touch outbound (brochures + per-berth PDFs). IMAP creds
enable the bounce monitor; leave blank to disable bounce-rejection banners. Passwords
are encrypted at rest and never returned by the API.
</CardDescription>
</CardHeader>
<CardContent className="space-y-4">
<div className="grid gap-3 md:grid-cols-2">
<Field label="From address" id="sef-from">
<Input
id="sef-from"
type="email"
value={form.fromAddress}
onChange={(e) => update('fromAddress', e.target.value)}
placeholder="sales@portnimara.com"
/>
</Field>
<Field label="SMTP host" id="sef-smtp-host">
<Input
id="sef-smtp-host"
value={form.smtpHost}
onChange={(e) => update('smtpHost', e.target.value)}
placeholder="smtp.gmail.com"
/>
</Field>
<Field label="SMTP port" id="sef-smtp-port">
<Input
id="sef-smtp-port"
type="number"
value={form.smtpPort}
onChange={(e) =>
update('smtpPort', e.target.value === '' ? '' : Number(e.target.value))
}
/>
</Field>
<div className="flex items-end justify-between gap-2">
<Label htmlFor="sef-smtp-secure" className="text-sm">
SSL (true=465, false=STARTTLS on 587)
</Label>
<Switch
id="sef-smtp-secure"
checked={form.smtpSecure}
onCheckedChange={(v) => update('smtpSecure', v)}
/>
</div>
<Field label="SMTP username" id="sef-smtp-user">
<Input
id="sef-smtp-user"
value={form.smtpUser}
onChange={(e) => update('smtpUser', e.target.value)}
/>
</Field>
<Field
label={`SMTP password ${smtpPassSet ? '(stored — leave blank to keep)' : ''}`}
id="sef-smtp-pass"
>
<Input
id="sef-smtp-pass"
type="password"
value={form.smtpPass}
onChange={(e) => update('smtpPass', e.target.value)}
placeholder={smtpPassSet ? '••••••••' : 'app password'}
/>
</Field>
</div>
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>Bounce monitor (IMAP)</CardTitle>
<CardDescription>
Required only for the async-bounce banner (§14.9). Same provider account as SMTP in most
setups.
</CardDescription>
</CardHeader>
<CardContent className="grid gap-3 md:grid-cols-2">
<Field label="IMAP host" id="sef-imap-host">
<Input
id="sef-imap-host"
value={form.imapHost}
onChange={(e) => update('imapHost', e.target.value)}
placeholder="imap.gmail.com"
/>
</Field>
<Field label="IMAP port" id="sef-imap-port">
<Input
id="sef-imap-port"
type="number"
value={form.imapPort}
onChange={(e) =>
update('imapPort', e.target.value === '' ? '' : Number(e.target.value))
}
/>
</Field>
<Field label="IMAP username" id="sef-imap-user">
<Input
id="sef-imap-user"
value={form.imapUser}
onChange={(e) => update('imapUser', e.target.value)}
/>
</Field>
<Field
label={`IMAP password ${imapPassSet ? '(stored — leave blank to keep)' : ''}`}
id="sef-imap-pass"
>
<Input
id="sef-imap-pass"
type="password"
value={form.imapPass}
onChange={(e) => update('imapPass', e.target.value)}
placeholder={imapPassSet ? '••••••••' : 'app password'}
/>
</Field>
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>Body templates</CardTitle>
<CardDescription>
Default markdown bodies used when a rep doesn&rsquo;t write a custom one. Tokens like{' '}
<code>{'{{client.fullName}}'}</code> are expanded server-side.
</CardDescription>
</CardHeader>
<CardContent className="space-y-4">
<Field label="Berth PDF body" id="sef-tmpl-berth">
<Textarea
id="sef-tmpl-berth"
rows={6}
value={form.templateBerthPdfBody}
onChange={(e) => update('templateBerthPdfBody', e.target.value)}
className="font-mono text-sm"
/>
</Field>
<Field label="Brochure body" id="sef-tmpl-broc">
<Textarea
id="sef-tmpl-broc"
rows={6}
value={form.templateBrochureBody}
onChange={(e) => update('templateBrochureBody', e.target.value)}
className="font-mono text-sm"
/>
</Field>
<div className="grid gap-3 md:grid-cols-2">
<Field label="Brochure max upload (MB)" id="sef-broc-max">
<Input
id="sef-broc-max"
type="number"
value={form.brochureMaxUploadMb}
onChange={(e) =>
update('brochureMaxUploadMb', e.target.value === '' ? '' : Number(e.target.value))
}
/>
</Field>
<Field label="Attach-vs-link threshold (MB)" id="sef-attach">
<Input
id="sef-attach"
type="number"
value={form.emailAttachThresholdMb}
onChange={(e) =>
update(
'emailAttachThresholdMb',
e.target.value === '' ? '' : Number(e.target.value),
)
}
/>
</Field>
</div>
<Field label="Noreply from address" id="sef-noreply">
<Input
id="sef-noreply"
type="email"
value={form.noreplyFromAddress}
onChange={(e) => update('noreplyFromAddress', e.target.value)}
/>
</Field>
</CardContent>
</Card>
<div className="flex justify-end">
<Button onClick={handleSave} disabled={saving}>
{saving && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Save sales email settings
</Button>
</div>
</div>
);
}
function Field({ label, id, children }: { label: string; id: string; children: React.ReactNode }) {
return (
<div className="space-y-1">
<Label htmlFor={id}>{label}</Label>
{children}
</div>
);
}

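The write-only password contract in the card's doc comment (omitted or empty input means keep the stored value, explicit `null` means clear, a non-empty string means replace) amounts to a three-state merge on the server side. A minimal sketch of that rule; `resolveSecretUpdate` is a hypothetical name, not part of this PR:

```typescript
// Three-state secret-update semantics:
//   undefined (field omitted) -> keep the stored value
//   ''                        -> also treated as "keep", defensively
//   null                      -> clear the stored secret
//   non-empty string          -> replace with the new secret
type SecretPatch = string | null | undefined;

function resolveSecretUpdate(patch: SecretPatch, current: string | null): string | null {
  if (patch === undefined || patch === '') return current; // leave unchanged
  if (patch === null) return null; // explicit clear
  return patch; // overwrite
}
```

This is why the form only adds `smtpPass`/`imapPass` to the payload when the user actually typed something: an omitted key is the "keep" state on the wire.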
View File

@@ -0,0 +1,239 @@
'use client';
import { useState } from 'react';
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import { CheckCircle2, HardDrive, Loader2, RefreshCw, ServerCog, XCircle } from 'lucide-react';
import { PageHeader } from '@/components/shared/page-header';
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
import { apiFetch } from '@/lib/api/client';
type BackendName = 's3' | 'filesystem';
interface StorageStatus {
backend: BackendName;
fileCount: number;
totalBytes: number;
tablesTracked: string[];
}
interface MigrationResult {
rowsConsidered: number;
rowsMigrated: number;
rowsSkippedAlreadyDone: number;
totalBytes: number;
flipped: boolean;
dryRun: boolean;
}
export function StorageAdminPanel() {
const queryClient = useQueryClient();
const [confirmOpen, setConfirmOpen] = useState(false);
const [dryRun, setDryRun] = useState<MigrationResult | null>(null);
const [testResult, setTestResult] = useState<{ ok: boolean; error?: string } | null>(null);
const status = useQuery({
queryKey: ['admin', 'storage', 'status'],
queryFn: () => apiFetch<{ data: StorageStatus }>('/api/v1/admin/storage'),
});
const dryRunMutation = useMutation({
mutationFn: async (opts: { from: BackendName; to: BackendName }) =>
apiFetch<{ data: MigrationResult }>('/api/v1/admin/storage/migrate', {
method: 'POST',
// apiFetch takes a plain object body (as every other call site here does);
// pre-stringifying would double-encode the JSON payload.
body: { ...opts, dryRun: true },
}),
onSuccess: (result) => {
setDryRun(result.data);
setConfirmOpen(true);
},
});
const migrateMutation = useMutation({
mutationFn: async (opts: { from: BackendName; to: BackendName }) =>
apiFetch<{ data: MigrationResult }>('/api/v1/admin/storage/migrate', {
method: 'POST',
body: { ...opts, dryRun: false },
}),
onSuccess: () => {
setConfirmOpen(false);
setDryRun(null);
void queryClient.invalidateQueries({ queryKey: ['admin', 'storage', 'status'] });
},
});
const testMutation = useMutation({
mutationFn: async () =>
apiFetch<{ ok: boolean; error?: string }>('/api/v1/admin/storage', {
method: 'POST',
}),
onSuccess: (r) => setTestResult(r),
onError: (e: Error) => setTestResult({ ok: false, error: e.message }),
});
if (status.isLoading) {
return (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" /> Loading storage status…
</div>
);
}
if (status.isError || !status.data?.data) {
return <div className="text-sm text-destructive">Failed to load storage status.</div>;
}
const s = status.data.data;
const otherBackend: BackendName = s.backend === 's3' ? 'filesystem' : 's3';
return (
<div className="space-y-6">
<PageHeader
title="Storage Backend"
description="Where the CRM stores per-berth PDFs, brochures, GDPR exports, and other binary files."
/>
<div className="grid gap-4 lg:grid-cols-3">
<Card className="lg:col-span-2">
<CardHeader className="flex flex-row items-start gap-3 space-y-0 pb-2">
{s.backend === 's3' ? (
<ServerCog className="mt-0.5 h-5 w-5 text-muted-foreground" />
) : (
<HardDrive className="mt-0.5 h-5 w-5 text-muted-foreground" />
)}
<div>
<CardTitle className="text-base">Active backend: {s.backend}</CardTitle>
<CardDescription>
{s.backend === 's3'
? 'Files stored in an S3-compatible object store (MinIO, AWS S3, Backblaze B2, Cloudflare R2, Wasabi, Tigris).'
: 'Files stored on the local filesystem under storage_filesystem_root. Single-node deployments only.'}
</CardDescription>
</div>
</CardHeader>
<CardContent className="space-y-4">
<dl className="grid grid-cols-2 gap-3 text-sm">
<div>
<dt className="text-muted-foreground">Tracked tables</dt>
<dd>
{s.tablesTracked.length === 0
? '(none yet — Phase 6b)'
: s.tablesTracked.join(', ')}
</dd>
</div>
<div>
<dt className="text-muted-foreground">File count</dt>
<dd>{s.fileCount}</dd>
</div>
</dl>
<div className="flex flex-wrap gap-3">
<Button
variant="outline"
disabled={dryRunMutation.isPending}
onClick={() => dryRunMutation.mutate({ from: s.backend, to: otherBackend })}
>
{dryRunMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Switch to {otherBackend}
</Button>
{s.backend === 's3' && (
<Button
variant="outline"
onClick={() => testMutation.mutate()}
disabled={testMutation.isPending}
>
{testMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Test connection
</Button>
)}
<Button variant="ghost" onClick={() => status.refetch()} disabled={status.isFetching}>
<RefreshCw className="mr-2 h-4 w-4" /> Refresh
</Button>
</div>
{testResult && (
<div className="rounded-md border p-3 text-sm">
{testResult.ok ? (
<div className="flex items-center gap-2 text-emerald-600">
<CheckCircle2 className="h-4 w-4" /> Connection OK; round-trip succeeded.
</div>
) : (
<div className="flex items-center gap-2 text-destructive">
<XCircle className="h-4 w-4" /> {testResult.error ?? 'Connection failed'}
</div>
)}
</div>
)}
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle className="text-base">Backup notes</CardTitle>
</CardHeader>
<CardContent className="space-y-2 text-sm text-muted-foreground">
{s.backend === 's3' ? (
<p>
S3 mode: configure your provider&apos;s lifecycle / replication / versioning
policies as your primary backup. The CRM does not duplicate object storage in its
own backups.
</p>
) : (
<p>
Filesystem mode: include the storage root directory in your backup tool (restic,
borg, snapshots). It sits next to the database; the two should be backed up
together.
</p>
)}
<p className="pt-2 text-xs">
Filesystem mode refuses to start when MULTI_NODE_DEPLOYMENT=true. For multi-node
deployments, switch to an S3-compatible backend.
</p>
</CardContent>
</Card>
</div>
<Dialog open={confirmOpen} onOpenChange={setConfirmOpen}>
<DialogContent>
<DialogHeader>
<DialogTitle>Switch storage backend</DialogTitle>
<DialogDescription>
Move all tracked files from the current backend to the new backend, verify each file
via sha256, then atomically flip the active backend.
</DialogDescription>
</DialogHeader>
{dryRun && (
<div className="rounded-md border p-3 text-sm">
<dl className="grid grid-cols-2 gap-2">
<dt className="text-muted-foreground">Rows considered</dt>
<dd>{dryRun.rowsConsidered}</dd>
<dt className="text-muted-foreground">Already migrated (resumable)</dt>
<dd>{dryRun.rowsSkippedAlreadyDone}</dd>
<dt className="text-muted-foreground">Total bytes</dt>
<dd>{Math.round(dryRun.totalBytes / 1024)} KB</dd>
</dl>
</div>
)}
<DialogFooter>
<Button variant="outline" onClick={() => setConfirmOpen(false)}>
Cancel
</Button>
<Button
disabled={migrateMutation.isPending}
onClick={() => migrateMutation.mutate({ from: s.backend, to: otherBackend })}
>
{migrateMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Migrate now
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}
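The confirm dialog above describes the migration contract: copy every tracked file, verify each round-trip via sha256, then flip the active backend, with already-verified keys skipped so a re-run is resumable. A minimal self-contained sketch of that loop, using in-memory stand-ins for the real backends (the `Backend` interface and function names here are assumptions, not the project's actual storage API):

```typescript
import { createHash } from 'node:crypto';

// In-memory stand-in for a storage backend (hypothetical interface).
interface Backend {
  get(key: string): Buffer | undefined;
  put(key: string, data: Buffer): void;
}

function memBackend(): Backend & { store: Map<string, Buffer> } {
  const store = new Map<string, Buffer>();
  return { store, get: (k) => store.get(k), put: (k, d) => store.set(k, d) };
}

const sha256 = (b: Buffer) => createHash('sha256').update(b).digest('hex');

// Copy every tracked object, verify the round-trip via sha256, and let the
// caller flip the active backend afterwards. Keys that already verify on the
// target are skipped, matching the dry-run's "Already migrated" count.
function migrateObjects(keys: string[], from: Backend, to: Backend) {
  let copied = 0;
  let skipped = 0;
  for (const key of keys) {
    const src = from.get(key);
    if (!src) throw new Error(`missing source object: ${key}`);
    const want = sha256(src);
    const existing = to.get(key);
    if (existing && sha256(existing) === want) {
      skipped++; // already migrated in a prior (interrupted) run
      continue;
    }
    to.put(key, src);
    const dst = to.get(key);
    if (!dst || sha256(dst) !== want) throw new Error(`verify failed: ${key}`);
    copied++;
  }
  return { copied, skipped };
}
```

The flip itself happens only after every key verifies, which is what makes the switch "atomic" from the UI's point of view.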

View File

@@ -0,0 +1,269 @@
/**
* Documents tab on the berth detail page (Phase 6b — see plan §5.6).
*
* Sections:
* - Current PDF panel (download link, "Replace PDF" button, parse-engine chip).
* - Version history list — newest first, with rollback affordance on every
* non-current row.
* - Reconcile-diff dialog (PdfReconcileDialog), opened after a successful
* upload + parse. Shows auto-applied vs conflicted fields and lets the
* rep accept the conflict resolution.
*
 * The actual upload is split into three steps:
 *   1. POST /pdf-upload-url -> presigned URL + storageKey
 *   2. PUT/POST the file to that URL (proxied through the app in
 *      filesystem mode, signed PUT in S3 mode)
 *   3. POST /pdf-versions with the storage key + sha256; the server parses
 *      the PDF from storage.
 */
'use client';
import { useRef, useState } from 'react';
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import { toast } from 'sonner';
import { apiFetch } from '@/lib/api/client';
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Badge } from '@/components/ui/badge';
import { PdfReconcileDialog } from './pdf-reconcile-dialog';
interface PdfVersionRow {
id: string;
versionNumber: number;
fileName: string;
fileSizeBytes: number;
uploadedBy: string;
uploadedAt: string;
isCurrent: boolean;
downloadUrl: string;
downloadUrlExpiresAt: string;
parseEngine: 'acroform' | 'ocr' | 'ai' | null;
}
interface UploadUrlResponse {
url: string;
method: 'PUT' | 'POST';
storageKey: string;
maxBytes: number;
backend: 's3' | 'filesystem';
}
export function BerthDocumentsTab({ berthId }: { berthId: string }) {
const qc = useQueryClient();
const fileInputRef = useRef<HTMLInputElement | null>(null);
const [pendingDiff, setPendingDiff] = useState<{
versionId: string;
autoApplied: Array<{ field: string; value: string | number }>;
conflicts: Array<{
field: string;
crmValue: string | number | null;
pdfValue: string | number | null;
pdfConfidence: number;
}>;
warnings: string[];
} | null>(null);
const { data: versions, isLoading } = useQuery<PdfVersionRow[]>({
queryKey: ['berth-pdf-versions', berthId],
queryFn: () =>
apiFetch<{ data: PdfVersionRow[] }>(`/api/v1/berths/${berthId}/pdf-versions`).then(
(r) => r.data,
),
});
const rollback = useMutation({
mutationFn: (versionId: string) =>
apiFetch(`/api/v1/berths/${berthId}/pdf-versions/${versionId}/rollback`, {
method: 'POST',
}),
onSuccess: () => {
void qc.invalidateQueries({ queryKey: ['berth-pdf-versions', berthId] });
void qc.invalidateQueries({ queryKey: ['berth', berthId] });
toast.success('Rolled back to selected version.');
},
onError: (err: Error) => {
toast.error('Rollback failed', { description: err.message });
},
});
const upload = useMutation({
mutationFn: async (file: File) => {
// 1. ask the server for a presigned upload URL
const upRes = await apiFetch<{ data: UploadUrlResponse }>(
`/api/v1/berths/${berthId}/pdf-upload-url`,
{
method: 'POST',
body: { fileName: file.name, sizeBytes: file.size },
},
);
const { url, method, storageKey, maxBytes } = upRes.data;
if (file.size > maxBytes) {
throw new Error(
`File ${(file.size / 1024 / 1024).toFixed(1)} MB exceeds ${(maxBytes / 1024 / 1024).toFixed(0)} MB limit`,
);
}
// 2. upload directly to storage (filesystem-proxy or S3)
const putRes = await fetch(url, {
method,
body: file,
headers: { 'content-type': 'application/pdf' },
credentials: url.startsWith('/') ? 'include' : 'omit',
});
if (!putRes.ok) {
throw new Error(`Storage PUT failed (${putRes.status})`);
}
// 3. compute sha256 in the browser for the metadata row
const sha256 = await sha256Hex(file);
// 4. register the version metadata + parse server-side. The server
// runs parseBerthPdf via the buffer from storage; the client
// doesn't ship the raw PDF a second time.
      const verRes = await apiFetch<{
        data: {
          versionId: string;
          autoApplied: Array<{ field: string; value: string | number }>;
          conflicts: Array<{
            field: string;
            crmValue: string | number | null;
            pdfValue: string | number | null;
            pdfConfidence: number;
          }>;
          warnings: string[];
        };
      }>(`/api/v1/berths/${berthId}/pdf-versions`, {
        method: 'POST',
        body: {
          storageKey,
          fileName: file.name,
          fileSizeBytes: file.size,
          sha256,
        },
      });
      return verRes.data;
    },
    onSuccess: (result) => {
      void qc.invalidateQueries({ queryKey: ['berth-pdf-versions', berthId] });
      void qc.invalidateQueries({ queryKey: ['berth', berthId] });
      toast.success('PDF uploaded.');
      // Surface the parse results in the reconcile dialog (see file header).
      setPendingDiff(result);
    },
onError: (err: Error) => {
toast.error('Upload failed', { description: err.message });
},
});
const onFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file) return;
if (!file.name.toLowerCase().endsWith('.pdf')) {
toast.error('Only PDFs are accepted.');
return;
}
upload.mutate(file);
if (fileInputRef.current) fileInputRef.current.value = '';
};
const current = versions?.find((v) => v.isCurrent);
const others = versions?.filter((v) => !v.isCurrent) ?? [];
return (
<div className="space-y-6">
<Card>
<CardHeader className="flex flex-row items-center justify-between pb-3">
<CardTitle className="text-sm font-medium">Current PDF</CardTitle>
<div>
<input
ref={fileInputRef}
type="file"
accept="application/pdf"
className="hidden"
onChange={onFileChange}
/>
<Button
size="sm"
onClick={() => fileInputRef.current?.click()}
disabled={upload.isPending}
>
{upload.isPending ? 'Uploading…' : current ? 'Replace PDF' : 'Upload PDF'}
</Button>
</div>
</CardHeader>
<CardContent className="pt-0 text-sm">
{isLoading ? (
<p className="text-muted-foreground">Loading</p>
) : current ? (
<div className="flex flex-wrap items-center gap-2">
<a
href={current.downloadUrl}
target="_blank"
rel="noreferrer"
className="font-medium underline underline-offset-2"
>
{current.fileName}
</a>
<span className="text-muted-foreground">
v{current.versionNumber} · {(current.fileSizeBytes / 1024 / 1024).toFixed(2)} MB
</span>
{current.parseEngine ? <ParseEngineBadge engine={current.parseEngine} /> : null}
</div>
) : (
<p className="text-muted-foreground">No PDF uploaded yet.</p>
)}
</CardContent>
</Card>
<Card>
<CardHeader className="pb-3">
<CardTitle className="text-sm font-medium">Version history</CardTitle>
</CardHeader>
<CardContent className="pt-0">
{others.length === 0 ? (
<p className="text-sm text-muted-foreground">No prior versions.</p>
) : (
<ul className="divide-y">
{others.map((v) => (
<li key={v.id} className="flex items-center justify-between py-2 text-sm">
<div>
<a href={v.downloadUrl} target="_blank" rel="noreferrer" className="underline">
{v.fileName}
</a>{' '}
<span className="text-muted-foreground">
v{v.versionNumber} · {(v.fileSizeBytes / 1024 / 1024).toFixed(2)} MB ·{' '}
{new Date(v.uploadedAt).toLocaleDateString()}
</span>
</div>
<Button
size="sm"
variant="outline"
onClick={() => rollback.mutate(v.id)}
disabled={rollback.isPending}
>
Rollback
</Button>
</li>
))}
</ul>
)}
</CardContent>
</Card>
{pendingDiff ? (
<PdfReconcileDialog
berthId={berthId}
versionId={pendingDiff.versionId}
autoApplied={pendingDiff.autoApplied}
conflicts={pendingDiff.conflicts}
warnings={pendingDiff.warnings}
onClose={() => setPendingDiff(null)}
/>
) : null}
</div>
);
}
function ParseEngineBadge({ engine }: { engine: 'acroform' | 'ocr' | 'ai' }) {
const tone = engine === 'acroform' ? 'default' : engine === 'ocr' ? 'secondary' : 'outline';
const label = engine === 'acroform' ? 'AcroForm' : engine === 'ocr' ? 'OCR' : 'AI';
return <Badge variant={tone}>{label}</Badge>;
}
async function sha256Hex(file: File): Promise<string> {
const buf = await file.arrayBuffer();
const hash = await crypto.subtle.digest('SHA-256', buf);
return Array.from(new Uint8Array(hash))
.map((b) => b.toString(16).padStart(2, '0'))
.join('');
}

View File

@@ -11,6 +11,7 @@ import { apiFetch } from '@/lib/api/client';
import { stageBadgeClass, stageLabel } from '@/lib/constants';
import { computeUrgencyBadges } from '@/components/interests/urgency';
import type { InterestRow } from '@/components/interests/interest-columns';
import { useRealtimeInvalidation } from '@/hooks/use-realtime-invalidation';
import { cn } from '@/lib/utils';
interface InterestsResponse {
@@ -33,8 +34,9 @@ export function BerthInterestPulse({ berthId }: { berthId: string }) {
const params = useParams<{ portSlug: string }>();
const portSlug = params?.portSlug ?? '';
const queryKey = ['interests', { berthId, sort: 'dateLastContact', order: 'desc' }];
const { data, isLoading } = useQuery<InterestsResponse>({
queryKey: ['interests', { berthId, sort: 'dateLastContact', order: 'desc' }],
queryKey,
queryFn: () =>
apiFetch<InterestsResponse>(
`/api/v1/interests?berthId=${berthId}&limit=10&sort=dateLastContact&order=desc`,
@@ -42,6 +44,19 @@ export function BerthInterestPulse({ berthId }: { berthId: string }) {
staleTime: 30_000,
});
// Stay in sync with the linked-berths list + add-to-interest dialog.
// Each of those flows emits a realtime socket event but does NOT
// invalidate this exact query key (it's berth-scoped, theirs are
// interest-scoped) — bridge via the invalidation hook.
useRealtimeInvalidation({
'interest:berthLinked': [queryKey],
'interest:berthUnlinked': [queryKey],
'interest:berthLinkUpdated': [queryKey],
'interest:created': [queryKey],
'interest:stageChanged': [queryKey],
'interest:archived': [queryKey],
});
const all = data?.data ?? [];
const active = all.filter((i) => !i.archivedAt && !i.outcome);
const preview = active.slice(0, PREVIEW_LIMIT);
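The comment above describes bridging interest-scoped socket events to this berth-scoped query key. A self-contained sketch of that event-name-to-query-key mapping (the real `useRealtimeInvalidation` hook's API is not shown here, so the wiring below is an assumption; a plain callback stands in for the socket client so the mapping logic is testable):

```typescript
// A query key is an array, as in React Query.
type QueryKey = unknown[];

// Build a handler that, given a socket event name, invalidates every
// query key registered for that event. Unmapped events are no-ops.
function makeInvalidationBridge(
  map: Record<string, QueryKey[]>,
  invalidate: (key: QueryKey) => void,
) {
  return (event: string) => {
    for (const key of map[event] ?? []) invalidate(key);
  };
}
```

In the component, `invalidate` would be `queryClient.invalidateQueries({ queryKey })` and the handler would be registered once per socket event name.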

View File

@@ -6,6 +6,7 @@ import { TagBadge } from '@/components/shared/tag-badge';
import { BerthReservationsTab } from './berth-reservations-tab';
import { BerthInterestsTab } from './berth-interests-tab';
import { BerthInterestPulse } from './berth-interest-pulse';
import { BerthDocumentsTab } from './berth-documents-tab';
type BerthData = {
id: string;
@@ -231,6 +232,11 @@ export function buildBerthTabs(berth: BerthData): DetailTab[] {
label: 'Reservations',
content: <BerthReservationsTab berthId={berth.id} />,
},
{
id: 'documents',
label: 'Documents',
content: <BerthDocumentsTab berthId={berth.id} />,
},
{
id: 'waiting-list',
label: 'Waiting List',

View File

@@ -0,0 +1,195 @@
/**
* Reconcile-diff dialog (Phase 6b — see plan §4.7b, §14.6).
*
* Shown after a successful per-berth PDF upload + parse. Surfaces three
* sections:
* - Warnings (mooring-number mismatch, imperial-vs-metric drift, etc.)
* so the rep can abort before applying.
* - Auto-applied fields — fields the parser found that the CRM had as null;
* these are pre-checked and applied on confirm.
* - Conflicts — fields where CRM and PDF disagree on a non-null value.
* The rep picks "Keep CRM" or "Use PDF" per row before confirming.
*
* On confirm, the dialog POSTs to /pdf-versions/parse-results/apply with the
* rep-curated `fieldsToApply` map.
*/
'use client';
import { useState } from 'react';
import { useMutation, useQueryClient } from '@tanstack/react-query';
import { toast } from 'sonner';
import { apiFetch } from '@/lib/api/client';
import { Button } from '@/components/ui/button';
import { Checkbox } from '@/components/ui/checkbox';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
interface AutoAppliedField {
field: string;
value: string | number;
}
interface ConflictField {
field: string;
crmValue: string | number | null;
pdfValue: string | number | null;
pdfConfidence: number;
}
export interface PdfReconcileDialogProps {
berthId: string;
versionId: string;
autoApplied: AutoAppliedField[];
conflicts: ConflictField[];
warnings: string[];
onClose: () => void;
}
export function PdfReconcileDialog({
berthId,
versionId,
autoApplied,
conflicts,
warnings,
onClose,
}: PdfReconcileDialogProps) {
const qc = useQueryClient();
// For each auto-applied field: rep can opt out by unchecking.
const [autoChecked, setAutoChecked] = useState<Record<string, boolean>>(
Object.fromEntries(autoApplied.map((f) => [f.field, true])),
);
  // For each conflict: 'pdf' applies the PDF value, 'crm' keeps the CRM
  // value (the field is simply omitted from the payload).
const [conflictChoice, setConflictChoice] = useState<Record<string, 'pdf' | 'crm'>>(
Object.fromEntries(conflicts.map((c) => [c.field, 'crm'])),
);
const apply = useMutation({
mutationFn: async () => {
const fieldsToApply: Record<string, string | number> = {};
for (const f of autoApplied) if (autoChecked[f.field]) fieldsToApply[f.field] = f.value;
for (const c of conflicts) {
if (conflictChoice[c.field] === 'pdf' && c.pdfValue != null) {
fieldsToApply[c.field] = c.pdfValue;
}
}
return apiFetch(`/api/v1/berths/${berthId}/pdf-versions/parse-results/apply`, {
method: 'POST',
body: { versionId, fieldsToApply },
});
},
onSuccess: () => {
void qc.invalidateQueries({ queryKey: ['berth', berthId] });
void qc.invalidateQueries({ queryKey: ['berth-pdf-versions', berthId] });
toast.success('Berth fields updated from PDF.');
onClose();
},
onError: (err: Error) => {
toast.error('Apply failed', { description: err.message });
},
});
return (
<Dialog open onOpenChange={(open) => (!open ? onClose() : undefined)}>
<DialogContent className="max-w-2xl">
<DialogHeader>
<DialogTitle>Review parsed fields</DialogTitle>
<DialogDescription>
The PDF parser extracted these values. Review and apply the ones you trust.
</DialogDescription>
</DialogHeader>
{warnings.length > 0 ? (
<div className="rounded-md border border-yellow-300 bg-yellow-50 p-3 text-sm">
<p className="font-medium">Warnings</p>
<ul className="mt-1 list-disc pl-5">
{warnings.map((w, i) => (
<li key={i}>{w}</li>
))}
</ul>
</div>
) : null}
{autoApplied.length > 0 ? (
<section>
<h3 className="text-sm font-medium">
Auto-applied <span className="text-muted-foreground">({autoApplied.length})</span>
</h3>
<p className="text-xs text-muted-foreground">
CRM had no value; the PDF supplied one. Uncheck to skip.
</p>
<ul className="mt-2 space-y-1">
{autoApplied.map((f) => (
<li key={f.field} className="flex items-center gap-2 text-sm">
<Checkbox
id={`auto-${f.field}`}
checked={autoChecked[f.field]}
onCheckedChange={(checked) =>
setAutoChecked((prev) => ({ ...prev, [f.field]: checked === true }))
}
/>
<label htmlFor={`auto-${f.field}`} className="flex-1">
<span className="font-medium">{f.field}</span>:{' '}
<span className="text-muted-foreground">{String(f.value)}</span>
</label>
</li>
))}
</ul>
</section>
) : null}
{conflicts.length > 0 ? (
<section>
<h3 className="text-sm font-medium">
Conflicts <span className="text-muted-foreground">({conflicts.length})</span>
</h3>
<p className="text-xs text-muted-foreground">
Pick which value to keep for each field.
</p>
<ul className="mt-2 space-y-2">
{conflicts.map((c) => (
<li
key={c.field}
className="grid grid-cols-[1fr_auto_auto] items-center gap-2 rounded border p-2 text-sm"
>
<span className="font-medium">{c.field}</span>
<Button
size="sm"
variant={conflictChoice[c.field] === 'crm' ? 'default' : 'outline'}
onClick={() => setConflictChoice((prev) => ({ ...prev, [c.field]: 'crm' }))}
>
Keep: {String(c.crmValue)}
</Button>
<Button
size="sm"
variant={conflictChoice[c.field] === 'pdf' ? 'default' : 'outline'}
onClick={() => setConflictChoice((prev) => ({ ...prev, [c.field]: 'pdf' }))}
>
Use PDF: {String(c.pdfValue)} ({Math.round(c.pdfConfidence * 100)}%)
</Button>
</li>
))}
</ul>
</section>
) : null}
<DialogFooter>
<Button variant="outline" onClick={onClose}>
Cancel
</Button>
<Button onClick={() => apply.mutate()} disabled={apply.isPending}>
{apply.isPending ? 'Applying…' : 'Apply'}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
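The dialog above consumes a pre-computed split of parsed fields into "auto-applied" (CRM had null) and "conflicts" (both sides non-null and different). A self-contained sketch of that server-side classification, with field shapes mirroring the dialog's props (the classifier itself is a hypothetical reconstruction, not the project's actual parser code):

```typescript
type Val = string | number | null;

interface ParsedField {
  field: string;
  value: Val;
  confidence: number;
}

// Split parser output against current CRM values:
// - CRM null + PDF value  -> auto-applied
// - both non-null, differ -> conflict (rep picks a side in the dialog)
// - equal or PDF null     -> no action
function classifyParsedFields(crm: Record<string, Val>, parsed: ParsedField[]) {
  const autoApplied: Array<{ field: string; value: string | number }> = [];
  const conflicts: Array<{
    field: string;
    crmValue: Val;
    pdfValue: Val;
    pdfConfidence: number;
  }> = [];
  for (const p of parsed) {
    if (p.value == null) continue; // parser extracted nothing for this field
    const current = crm[p.field] ?? null;
    if (current == null) {
      autoApplied.push({ field: p.field, value: p.value });
    } else if (current !== p.value) {
      conflicts.push({
        field: p.field,
        crmValue: current,
        pdfValue: p.value,
        pdfConfidence: p.confidence,
      });
    }
  }
  return { autoApplied, conflicts };
}
```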

View File

@@ -0,0 +1,41 @@
'use client';
/**
* Berth-detail "Send to client" dialog (Phase 7 §5.6 / §5.7).
*
* Thin wrapper around {@link SendDocumentDialog} that pins documentKind to
* `berth_pdf`. Used by the berth detail page header action and by the
* recommender panel quick-send shortcut.
*/
import { SendDocumentDialog } from '@/components/shared/send-document-dialog';
interface SendBerthPdfDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
berthId: string;
berthMooringNumber: string;
recipient: { clientId?: string; email?: string; interestId?: string };
onSent?: () => void;
}
export function SendBerthPdfDialog({
open,
onOpenChange,
berthId,
berthMooringNumber,
recipient,
onSent,
}: SendBerthPdfDialogProps) {
return (
<SendDocumentDialog
open={open}
onOpenChange={onOpenChange}
documentKind="berth_pdf"
recipient={recipient}
context={{ berthId }}
title={`Send berth ${berthMooringNumber} spec sheet`}
subtitle="The current PDF version is attached automatically."
onSent={onSent}
/>
);
}

View File

@@ -35,7 +35,8 @@ interface ClientCardProps {
}
export function ClientCard({ client, portSlug, onEdit, onArchive }: ClientCardProps) {
const primary = client.contacts?.find((c) => c.isPrimary);
// Card display: prefer email, fall back to phone.
const primaryContactValue = client.primaryEmail ?? client.primaryPhone ?? null;
const nationality = client.nationalityIso ? getCountryName(client.nationalityIso, 'en') : null;
const sourceLabel = client.source ? (SOURCE_LABELS[client.source] ?? client.source) : null;
const tags = client.tags ?? [];
@@ -93,8 +94,8 @@ export function ClientCard({ client, portSlug, onEdit, onArchive }: ClientCardPr
<span aria-hidden className="block h-9 w-9 shrink-0" />
</div>
{primary ? (
<p className="truncate text-sm text-muted-foreground">{primary.value}</p>
{primaryContactValue ? (
<p className="truncate text-sm text-muted-foreground">{primaryContactValue}</p>
) : null}
{meta.length > 0 ? (

View File

@@ -13,7 +13,6 @@ import {
DropdownMenuTrigger,
} from '@/components/ui/dropdown-menu';
import { Badge } from '@/components/ui/badge';
import { TagBadge } from '@/components/shared/tag-badge';
import { getCountryName } from '@/lib/i18n/countries';
export interface ClientRow {
@@ -23,14 +22,27 @@ export interface ClientRow {
source: string | null;
archivedAt: string | null;
createdAt: string;
primaryEmail?: string | null;
primaryPhone?: string | null;
yachtCount?: number;
companyCount?: number;
interestCount?: number;
latestInterest?: { stage: string; mooringNumber: string | null } | null;
contacts?: Array<{ channel: string; value: string; isPrimary: boolean }>;
tags?: Array<{ id: string; name: string; color: string }>;
}
const STAGE_LABELS: Record<string, string> = {
open: 'Open',
qualified: 'Qualified',
eoi_sent: 'EOI sent',
eoi_signed: 'EOI signed',
deposit: 'Deposit',
contract: 'Contract',
signed: 'Signed',
closed_won: 'Won',
closed_lost: 'Lost',
};
const SOURCE_LABELS: Record<string, string> = {
website: 'Website',
manual: 'Manual',
@@ -65,24 +77,29 @@ export function getClientColumns({
),
},
{
id: 'primaryContact',
header: 'Primary Contact',
id: 'email',
header: 'Email',
enableSorting: false,
cell: ({ row }) => {
const primary = row.original.contacts?.find((c) => c.isPrimary);
if (!primary) return <span className="text-muted-foreground">-</span>;
return (
<span className="text-sm">
<span className="text-muted-foreground capitalize">{primary.channel}: </span>
{primary.value}
</span>
);
const value = row.original.primaryEmail;
if (!value) return <span className="text-muted-foreground">-</span>;
return <span className="text-sm">{value}</span>;
},
},
{
id: 'nationality',
id: 'phone',
header: 'Phone',
enableSorting: false,
cell: ({ row }) => {
const value = row.original.primaryPhone;
if (!value) return <span className="text-muted-foreground">-</span>;
return <span className="text-sm">{value}</span>;
},
},
{
id: 'country',
accessorKey: 'nationalityIso',
header: 'Nationality',
header: 'Country',
cell: ({ getValue }) => {
const iso = getValue() as string | null;
return (
@@ -105,51 +122,20 @@ export function getClientColumns({
},
},
{
id: 'yachtCount',
header: 'Yachts',
id: 'latestStage',
header: 'Latest stage',
enableSorting: false,
cell: ({ row }) => {
const c = row.original.yachtCount ?? 0;
return c === 0 ? (
<span className="text-muted-foreground">-</span>
) : (
<Badge variant="secondary" className="text-xs">
{c}
</Badge>
);
},
},
{
id: 'companyCount',
header: 'Companies',
enableSorting: false,
cell: ({ row }) => {
const c = row.original.companyCount ?? 0;
return c === 0 ? (
<span className="text-muted-foreground">-</span>
) : (
<Badge variant="secondary" className="text-xs">
{c}
</Badge>
);
},
},
{
id: 'tags',
header: 'Tags',
enableSorting: false,
cell: ({ row }) => {
const clientTags = row.original.tags ?? [];
if (clientTags.length === 0) return <span className="text-muted-foreground">-</span>;
const latest = row.original.latestInterest;
if (!latest) return <span className="text-muted-foreground">-</span>;
const stageLabel = STAGE_LABELS[latest.stage] ?? latest.stage;
return (
<div className="flex flex-wrap gap-1">
{clientTags.slice(0, 3).map((tag) => (
<TagBadge key={tag.id} name={tag.name} color={tag.color} />
))}
{clientTags.length > 3 && (
<Badge variant="secondary" className="text-xs">
+{clientTags.length - 3}
<div className="flex items-center gap-2 text-sm">
<Badge variant="secondary" className="text-xs capitalize">
{stageLabel}
</Badge>
{latest.mooringNumber && (
<span className="text-muted-foreground">{latest.mooringNumber}</span>
)}
</div>
);

View File

@@ -0,0 +1,164 @@
'use client';
/**
* Client-detail multi-step "Send documents" dialog (Phase 7 §5.7).
*
 * The client header action opens this dialog. The rep picks a brochure
 * to send (defaulting to the port default when unspecified); per-berth
 * spec sheets are sent from the berth detail page instead. The actual
 * send flow delegates to {@link SendDocumentDialog}; this wrapper is
 * the picker.
*/
import { useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { FileText, Loader2, Mail } from 'lucide-react';
import { Button } from '@/components/ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
import { SendDocumentDialog } from '@/components/shared/send-document-dialog';
import { apiFetch } from '@/lib/api/client';
interface SendDocumentsDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
clientId: string;
clientName: string;
/** When the rep is launching from a specific interest, pin it. */
interestId?: string;
}
interface BrochureOption {
id: string;
label: string;
isDefault: boolean;
archivedAt: string | null;
versionCount: number;
}
interface BrochuresResponse {
data: BrochureOption[];
}
export function SendDocumentsDialog({
open,
onOpenChange,
clientId,
clientName,
interestId,
}: SendDocumentsDialogProps) {
const [activeSend, setActiveSend] = useState<
| { kind: 'brochure'; brochureId?: string }
| { kind: 'berth_pdf'; berthId: string; mooring: string }
| null
>(null);
// Lightweight brochures fetch — only fires once dialog is opened.
const brochuresQuery = useQuery<BrochuresResponse>({
queryKey: ['brochures', 'list'],
queryFn: () => apiFetch('/api/v1/admin/brochures'),
enabled: open,
});
const usableBrochures =
brochuresQuery.data?.data.filter((b) => !b.archivedAt && b.versionCount > 0) ?? [];
return (
<>
<Dialog open={open && activeSend === null} onOpenChange={onOpenChange}>
<DialogContent>
<DialogHeader>
<DialogTitle>Send documents to {clientName}</DialogTitle>
<DialogDescription>
Pick a brochure or open the berth detail page to send a per-berth spec sheet.
</DialogDescription>
</DialogHeader>
<div className="space-y-3">
<div>
<h3 className="mb-2 flex items-center gap-2 text-sm font-semibold">
<Mail className="h-4 w-4" /> Brochures
</h3>
{brochuresQuery.isLoading && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" /> Loading brochures
</div>
)}
{!brochuresQuery.isLoading && usableBrochures.length === 0 && (
<p className="text-sm text-muted-foreground">
No brochures uploaded yet. Add one in /admin/brochures.
</p>
)}
<div className="space-y-2">
{usableBrochures.map((b) => (
<Button
key={b.id}
variant="outline"
className="w-full justify-between"
onClick={() => setActiveSend({ kind: 'brochure', brochureId: b.id })}
>
<span className="flex items-center gap-2">
<FileText className="h-4 w-4" />
{b.label}
{b.isDefault && (
<span className="rounded bg-primary/10 px-2 py-0.5 text-xs text-primary">
default
</span>
)}
</span>
<span className="text-xs text-muted-foreground">{b.versionCount} ver.</span>
</Button>
))}
</div>
</div>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>
Close
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
{activeSend?.kind === 'brochure' && (
<SendDocumentDialog
open
onOpenChange={(o) => {
if (!o) {
setActiveSend(null);
onOpenChange(false);
}
}}
documentKind="brochure"
recipient={{ clientId, interestId }}
context={{ brochureId: activeSend.brochureId }}
title={`Send brochure to ${clientName}`}
onSent={() => setActiveSend(null)}
/>
)}
{activeSend?.kind === 'berth_pdf' && (
<SendDocumentDialog
open
onOpenChange={(o) => {
if (!o) {
setActiveSend(null);
onOpenChange(false);
}
}}
documentKind="berth_pdf"
recipient={{ clientId, interestId }}
context={{ berthId: activeSend.berthId }}
title={`Send berth ${activeSend.mooring} spec sheet`}
onSent={() => setActiveSend(null)}
/>
)}
</>
);
}

View File

@@ -27,6 +27,7 @@ export interface ExpenseRow {
description: string | null;
payer: string | null;
receiptFileIds: string[] | null;
noReceiptAcknowledged?: boolean;
archivedAt: string | null;
createdAt: string;
/** Set by the dedup engine when this expense looks like a duplicate of another. */

View File

@@ -1,12 +1,13 @@
'use client';
import { useEffect } from 'react';
import { useEffect, useRef, useState } from 'react';
import { useForm } from 'react-hook-form';
import { zodResolver } from '@hookform/resolvers/zod';
import { useMutation, useQueryClient } from '@tanstack/react-query';
import { Loader2 } from 'lucide-react';
import { AlertTriangle, Loader2, Upload, X } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Checkbox } from '@/components/ui/checkbox';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
import { Textarea } from '@/components/ui/textarea';
@@ -17,18 +18,17 @@ import {
SelectTrigger,
SelectValue,
} from '@/components/ui/select';
import {
Sheet,
SheetContent,
SheetHeader,
SheetTitle,
SheetFooter,
} from '@/components/ui/sheet';
import { Sheet, SheetContent, SheetHeader, SheetTitle, SheetFooter } from '@/components/ui/sheet';
import { apiFetch } from '@/lib/api/client';
import { createExpenseSchema, type CreateExpenseInput } from '@/lib/validators/expenses';
import { EXPENSE_CATEGORIES, PAYMENT_METHODS } from '@/lib/constants';
import type { ExpenseRow } from './expense-columns';
interface UploadedReceipt {
id: string;
filename: string;
}
interface ExpenseFormDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
@@ -38,6 +38,12 @@ interface ExpenseFormDialogProps {
export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDialogProps) {
const queryClient = useQueryClient();
const isEdit = !!expense;
const fileInputRef = useRef<HTMLInputElement>(null);
const [uploadedReceipt, setUploadedReceipt] = useState<UploadedReceipt | null>(null);
const [previewUrl, setPreviewUrl] = useState<string | null>(null);
const [noReceipt, setNoReceipt] = useState(false);
const [uploadError, setUploadError] = useState<string | null>(null);
const [isUploading, setIsUploading] = useState(false);
const {
register,
@@ -65,15 +71,47 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
expenseDate: new Date(expense.expenseDate),
paymentStatus: (expense.paymentStatus as CreateExpenseInput['paymentStatus']) ?? 'unpaid',
});
setUploadedReceipt(null);
setPreviewUrl(null);
setNoReceipt(Boolean(expense.noReceiptAcknowledged));
setUploadError(null);
} else if (open && !expense) {
reset({
currency: 'USD',
paymentStatus: 'unpaid',
expenseDate: new Date(),
});
setUploadedReceipt(null);
setPreviewUrl(null);
setNoReceipt(false);
setUploadError(null);
}
}, [open, expense, reset]);
  // Capture the URL in the effect closure so the cleanup revokes the URL
  // from the render that created it, not a newer one. An audit caught a
  // bug where the cleanup revoked the URL that was still being shown.
useEffect(() => {
const url = previewUrl;
return () => {
if (url) URL.revokeObjectURL(url);
};
}, [previewUrl]);
// Reset upload state whenever the sheet closes — re-opening on the same
// expense was carrying stale state from the prior session.
useEffect(() => {
if (!open) {
setUploadedReceipt(null);
setPreviewUrl(null);
setNoReceipt(false);
setUploadError(null);
setIsUploading(false);
if (fileInputRef.current) fileInputRef.current.value = '';
}
}, [open]);
const mutation = useMutation({
mutationFn: (data: CreateExpenseInput) => {
if (isEdit) {
@@ -90,9 +128,51 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
},
});
function onSubmit(data: CreateExpenseInput) {
mutation.mutate(data);
async function handleFileChange(e: React.ChangeEvent<HTMLInputElement>) {
const file = e.target.files?.[0];
if (!file) return;
setUploadError(null);
if (previewUrl) URL.revokeObjectURL(previewUrl);
setPreviewUrl(URL.createObjectURL(file));
setIsUploading(true);
try {
const formData = new FormData();
formData.append('file', file);
formData.append('category', 'receipt');
const res = await fetch('/api/v1/files/upload', {
method: 'POST',
body: formData,
credentials: 'include',
});
if (!res.ok) throw new Error('Upload failed');
const json = (await res.json()) as { data: { id: string; filename: string } };
setUploadedReceipt({ id: json.data.id, filename: json.data.filename });
setNoReceipt(false);
} catch (err) {
setUploadError(err instanceof Error ? err.message : 'Upload failed');
setUploadedReceipt(null);
} finally {
setIsUploading(false);
}
}
function clearReceipt() {
if (previewUrl) URL.revokeObjectURL(previewUrl);
setPreviewUrl(null);
setUploadedReceipt(null);
setUploadError(null);
if (fileInputRef.current) fileInputRef.current.value = '';
}
function onSubmit(data: CreateExpenseInput) {
mutation.mutate({
...data,
receiptFileIds: uploadedReceipt ? [uploadedReceipt.id] : undefined,
noReceiptAcknowledged: Boolean(noReceipt && !uploadedReceipt),
});
}
const canSubmit = isEdit || Boolean(uploadedReceipt) || noReceipt;
return (
<Sheet open={open} onOpenChange={onOpenChange}>
@@ -110,9 +190,11 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
{...register('expenseDate', {
setValueAs: (v) => (v ? new Date(v) : undefined),
})}
defaultValue={expense?.expenseDate
defaultValue={
expense?.expenseDate
? new Date(expense.expenseDate).toISOString().split('T')[0]
: new Date().toISOString().split('T')[0]}
: new Date().toISOString().split('T')[0]
}
/>
{errors.expenseDate && (
<p className="text-xs text-destructive">{errors.expenseDate.message}</p>
@@ -130,19 +212,12 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
placeholder="0.00"
{...register('amount', { valueAsNumber: true })}
/>
{errors.amount && (
<p className="text-xs text-destructive">{errors.amount.message}</p>
)}
{errors.amount && <p className="text-xs text-destructive">{errors.amount.message}</p>}
</div>
<div className="space-y-2">
<Label htmlFor="currency">Currency</Label>
<Input
id="currency"
placeholder="USD"
maxLength={3}
{...register('currency')}
/>
<Input id="currency" placeholder="USD" maxLength={3} {...register('currency')} />
{errors.currency && (
<p className="text-xs text-destructive">{errors.currency.message}</p>
)}
@@ -180,7 +255,9 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
<div className="space-y-2">
<Label htmlFor="paymentMethod">Payment Method</Label>
<Select
onValueChange={(v) => setValue('paymentMethod', v as CreateExpenseInput['paymentMethod'])}
onValueChange={(v) =>
setValue('paymentMethod', v as CreateExpenseInput['paymentMethod'])
}
defaultValue={expense?.paymentMethod ?? undefined}
>
<SelectTrigger id="paymentMethod">
@@ -198,11 +275,7 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
<div className="space-y-2">
<Label htmlFor="payer">Payer</Label>
<Input
id="payer"
placeholder="Who paid?"
{...register('payer')}
/>
<Input id="payer" placeholder="Who paid?" {...register('payer')} />
</div>
<div className="space-y-2">
@@ -232,21 +305,93 @@ export function ExpenseFormDialog({ open, onOpenChange, expense }: ExpenseFormDi
/>
</div>
{!isEdit && (
<div className="space-y-2 rounded-md border p-3">
<Label className="text-sm font-medium">Receipt</Label>
{previewUrl ? (
<div className="relative">
<img
src={previewUrl}
alt="Receipt preview"
className="max-h-48 rounded border object-contain"
/>
<button
type="button"
onClick={clearReceipt}
aria-label="Remove receipt"
className="absolute top-1 right-1 rounded-full bg-background/90 hover:bg-background border p-1 shadow-sm"
>
<X className="h-3 w-3" />
</button>
<p className="mt-1 text-xs text-muted-foreground">
{isUploading
? 'Uploading...'
: uploadedReceipt
? `Uploaded: ${uploadedReceipt.filename}`
: 'Selecting...'}
</p>
</div>
) : (
<Button
type="button"
variant="outline"
size="sm"
className="w-full"
disabled={noReceipt}
onClick={() => fileInputRef.current?.click()}
>
<Upload className="mr-2 h-4 w-4" />
Upload receipt image or PDF
</Button>
)}
<input
ref={fileInputRef}
type="file"
accept="image/*,application/pdf"
className="hidden"
onChange={handleFileChange}
/>
{uploadError && <p className="text-xs text-destructive">{uploadError}</p>}
<div className="flex items-start gap-2 pt-1">
<Checkbox
id="noReceipt"
checked={noReceipt}
onCheckedChange={(checked) => {
const next = checked === true;
setNoReceipt(next);
if (next) clearReceipt();
}}
/>
<Label htmlFor="noReceipt" className="text-sm font-normal leading-tight">
I have no receipt for this expense
</Label>
</div>
{noReceipt && (
<div className="flex gap-2 rounded-md border border-amber-300 bg-amber-50 p-2 text-xs text-amber-900 dark:border-amber-900 dark:bg-amber-950/40 dark:text-amber-200">
<AlertTriangle className="h-4 w-4 flex-shrink-0" />
<span>
Expenses without a receipt may not be reimbursed by the parent company. The PDF
export will flag this expense.
</span>
</div>
)}
</div>
)}
{mutation.isError && (
<p className="text-sm text-destructive">{(mutation.error as Error).message}</p>
)}
<SheetFooter className="pt-2">
<Button type="button" variant="outline" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button type="submit" disabled={isSubmitting || mutation.isPending}>
<Button
type="submit"
disabled={isSubmitting || mutation.isPending || isUploading || !canSubmit}
>
{(isSubmitting || mutation.isPending) && (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
)}

View File

@@ -0,0 +1,150 @@
'use client';
import { useState } from 'react';
import { useMutation, useQueryClient } from '@tanstack/react-query';
import { Eye, EyeOff, Loader2 } from 'lucide-react';
import { Button } from '@/components/ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
import { RadioGroup, RadioGroupItem } from '@/components/ui/radio-group';
import { Label } from '@/components/ui/label';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
interface AddBerthToInterestDialogProps {
interestId: string;
berth: { berthId: string; mooringNumber: string };
open: boolean;
onOpenChange: (open: boolean) => void;
onAdded?: () => void;
}
type RoleChoice = 'specific' | 'exploring';
export function AddBerthToInterestDialog({
interestId,
berth,
open,
onOpenChange,
onAdded,
}: AddBerthToInterestDialogProps) {
const queryClient = useQueryClient();
const [choice, setChoice] = useState<RoleChoice>('specific');
const mutation = useMutation({
mutationFn: async (isSpecificInterest: boolean) =>
apiFetch(`/api/v1/interests/${interestId}/berths`, {
method: 'POST',
body: { berthId: berth.berthId, isSpecificInterest },
}),
onSuccess: () => {
// Invalidate the recommender cache + linked-berths cache so both
// surfaces re-fetch immediately. (See plan §5.3 / §5.5.)
queryClient.invalidateQueries({ queryKey: ['berth-recommendations', interestId] });
queryClient.invalidateQueries({ queryKey: ['interest-berths', interestId] });
queryClient.invalidateQueries({ queryKey: ['interests', interestId] });
onAdded?.();
onOpenChange(false);
},
});
const handleSubmit = () => {
mutation.mutate(choice === 'specific');
};
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent>
<DialogHeader>
<DialogTitle>Add berth {berth.mooringNumber} to interest</DialogTitle>
<DialogDescription>
Choose how this berth relates to the deal. This drives whether it shows as &ldquo;Under
Offer&rdquo; on the public map.
</DialogDescription>
</DialogHeader>
<RadioGroup
value={choice}
onValueChange={(v) => setChoice(v as RoleChoice)}
className="gap-3"
>
<RoleCard
value="specific"
checked={choice === 'specific'}
title="Pitching specifically"
description="The client is pitched on this exact berth."
consequence="This berth will appear as under interest on the public map."
icon={<Eye className="size-4" />}
/>
<RoleCard
value="exploring"
checked={choice === 'exploring'}
title="Just exploring"
description="The berth is being considered or covered by the EOI bundle, but not pitched specifically."
consequence="This berth is hidden from the public map."
icon={<EyeOff className="size-4" />}
/>
</RadioGroup>
{mutation.isError ? (
<p className="text-sm text-destructive">
{(mutation.error as Error)?.message ?? 'Failed to add berth.'}
</p>
) : null}
<DialogFooter>
<Button
type="button"
variant="outline"
onClick={() => onOpenChange(false)}
disabled={mutation.isPending}
>
Cancel
</Button>
<Button type="button" onClick={handleSubmit} disabled={mutation.isPending}>
{mutation.isPending ? <Loader2 className="mr-1.5 size-3.5 animate-spin" /> : null}
Add berth to interest
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
interface RoleCardProps {
value: RoleChoice;
checked: boolean;
title: string;
description: string;
consequence: string;
icon: React.ReactNode;
}
function RoleCard({ value, checked, title, description, consequence, icon }: RoleCardProps) {
return (
<Label
htmlFor={`role-${value}`}
className={cn(
'flex cursor-pointer items-start gap-3 rounded-lg border p-3 transition-colors',
checked ? 'border-brand-300 bg-brand-50/50 ring-1 ring-brand-200' : 'border-border',
)}
>
<RadioGroupItem value={value} id={`role-${value}`} className="mt-1" />
<div className="flex-1 space-y-1">
<div className="flex items-center gap-1.5 text-sm font-semibold">
{icon}
{title}
</div>
<p className="text-xs text-muted-foreground">{description}</p>
<p className="text-xs font-medium text-foreground/80">{consequence}</p>
</div>
</Label>
);
}
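The dialog reduces its two radio choices to the single `isSpecificInterest` boolean it POSTs to `/api/v1/interests/:id/berths`. A minimal standalone sketch of that mapping (`toLinkPayload` is a hypothetical helper name; the endpoint body shape comes from the mutation above):

```typescript
type RoleChoice = 'specific' | 'exploring';

// "Pitching specifically" is what drives the public-map "Under Offer"
// sub-status; "just exploring" keeps the berth hidden from the map.
function toLinkPayload(choice: RoleChoice, berthId: string): { berthId: string; isSpecificInterest: boolean } {
  return { berthId, isSpecificInterest: choice === 'specific' };
}
```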

View File

@@ -0,0 +1,470 @@
'use client';
import { useState, useMemo } from 'react';
import Link from 'next/link';
import { useParams } from 'next/navigation';
import { useQuery } from '@tanstack/react-query';
import { ChevronDown, ChevronUp, Filter, Flame, Plus, RefreshCw, Sparkles } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from '@/components/ui/select';
import { StatusPill, type StatusPillStatus } from '@/components/ui/status-pill';
import { AddBerthToInterestDialog } from '@/components/interests/add-berth-to-interest-dialog';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
// ─── Types (mirror the recommender service Recommendation shape) ───────────
type Tier = 'A' | 'B' | 'C' | 'D';
interface HeatBreakdown {
recency: number;
furthestStage: number;
interestCount: number;
eoiCount: number;
total: number;
}
export interface Recommendation {
berthId: string;
mooringNumber: string;
area: string | null;
tier: Tier;
fitScore: number;
sizeBufferPct: number | null;
heat: HeatBreakdown | null;
reasons: {
dimensional: string;
pipeline: string;
amenities?: string;
heat?: string;
};
lengthFt: number | null;
widthFt: number | null;
draftFt: number | null;
status: string;
amenities: {
powerCapacity: number | null;
voltage: number | null;
access: string | null;
mooringType: string | null;
cleatCapacity: string | null;
};
}
interface AmenityFilters {
minPowerCapacityKw?: number;
requiredVoltage?: number;
requiredAccess?: string;
requiredMooringType?: string;
requiredCleatCapacity?: string;
}
interface BerthRecommenderPanelProps {
interestId: string;
/** Display label for the dimensions in the header. */
desiredLengthFt: number | null;
desiredWidthFt: number | null;
desiredDraftFt: number | null;
}
const TIER_LABELS: Record<Tier, { label: string; tone: string }> = {
A: { label: 'Open', tone: 'border-emerald-200 bg-emerald-50 text-emerald-800' },
B: { label: 'Fall-through', tone: 'border-amber-200 bg-amber-50 text-amber-800' },
C: { label: 'Active interest', tone: 'border-sky-200 bg-sky-50 text-sky-800' },
D: { label: 'Late stage', tone: 'border-slate-300 bg-slate-100 text-slate-700' },
};
function statusToPill(status: string): StatusPillStatus {
switch (status) {
case 'available':
return 'active';
case 'under_offer':
return 'sent';
case 'sold':
return 'completed';
case 'reserved':
return 'partial';
default:
return 'pending';
}
}
function formatStatus(status: string): string {
return status.replace(/_/g, ' ').replace(/\b\w/g, (m) => m.toUpperCase());
}
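The `formatStatus` helper above title-cases snake_case berth statuses. Reproduced standalone, it can be exercised directly:

```typescript
// Replace underscores with spaces, then uppercase the first letter of
// each word (\b\w matches the character starting each word).
function formatStatus(status: string): string {
  return status.replace(/_/g, ' ').replace(/\b\w/g, (m) => m.toUpperCase());
}
```

For example, `formatStatus('under_offer')` returns `'Under Offer'`.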
function formatDimensions(
length: number | null,
width: number | null,
draft: number | null,
): string {
const parts: string[] = [];
if (length !== null) parts.push(`${length.toFixed(1)}ft L`);
if (width !== null) parts.push(`${width.toFixed(1)}ft W`);
if (draft !== null) parts.push(`${draft.toFixed(1)}ft D`);
return parts.join(' · ');
}
function formatDesired(length: number | null, width: number | null, draft: number | null): string {
const parts: string[] = [];
if (length !== null) parts.push(`${length}ft L`);
if (width !== null) parts.push(`${width}ft W`);
if (draft !== null) parts.push(`${draft}ft D`);
return parts.length > 0 ? parts.join(' · ') : 'no dimensions set';
}
interface RecommendationCardProps {
rec: Recommendation;
portSlug: string;
onAdd: (rec: Recommendation) => void;
}
function RecommendationCard({ rec, portSlug, onAdd }: RecommendationCardProps) {
const [expanded, setExpanded] = useState(false);
const tier = TIER_LABELS[rec.tier];
const showHeat = rec.heat && rec.heat.total > 0;
return (
<div className="rounded-lg border bg-card text-sm">
<button
type="button"
onClick={() => setExpanded((v) => !v)}
className="flex w-full items-start gap-3 p-3 text-left hover:bg-muted/40"
>
<div className="min-w-0 flex-1 space-y-1">
<div className="flex flex-wrap items-center gap-2">
<span className="font-semibold">{rec.mooringNumber}</span>
{rec.area ? <span className="text-xs text-muted-foreground">{rec.area}</span> : null}
<StatusPill status={statusToPill(rec.status)}>{formatStatus(rec.status)}</StatusPill>
<span
className={cn(
'inline-flex items-center rounded-md border px-2 py-0.5 text-xs font-medium',
tier.tone,
)}
>
Tier {rec.tier} · {tier.label}
</span>
{showHeat ? (
<span className="inline-flex items-center gap-1 rounded-md border border-rose-200 bg-rose-50 px-2 py-0.5 text-xs font-medium text-rose-800">
<Flame className="size-3" />
Heat {Math.round(rec.heat!.total)}
</span>
) : null}
</div>
<div className="text-xs text-muted-foreground">
{formatDimensions(rec.lengthFt, rec.widthFt, rec.draftFt)}
{rec.sizeBufferPct !== null ? (
<span>
{' '}
· {rec.sizeBufferPct >= 0 ? '+' : ''}
{rec.sizeBufferPct}% vs desired
</span>
) : null}
<span className="ml-2 font-medium text-foreground">Fit {rec.fitScore}</span>
</div>
</div>
{expanded ? (
<ChevronUp className="size-4 shrink-0 text-muted-foreground" />
) : (
<ChevronDown className="size-4 shrink-0 text-muted-foreground" />
)}
</button>
{expanded ? (
<div className="space-y-3 border-t bg-muted/20 p-3">
<dl className="space-y-1 text-xs">
<div className="flex gap-2">
<dt className="w-28 shrink-0 text-muted-foreground">Dimensional</dt>
<dd>{rec.reasons.dimensional}</dd>
</div>
<div className="flex gap-2">
<dt className="w-28 shrink-0 text-muted-foreground">Pipeline</dt>
<dd>{rec.reasons.pipeline}</dd>
</div>
{rec.reasons.amenities ? (
<div className="flex gap-2">
<dt className="w-28 shrink-0 text-muted-foreground">Amenities</dt>
<dd>{rec.reasons.amenities}</dd>
</div>
) : null}
{rec.reasons.heat ? (
<div className="flex gap-2">
<dt className="w-28 shrink-0 text-muted-foreground">Heat</dt>
<dd>{rec.reasons.heat}</dd>
</div>
) : null}
</dl>
<div className="flex flex-wrap gap-2">
<Button
type="button"
size="sm"
onClick={(e) => {
e.stopPropagation();
onAdd(rec);
}}
>
<Plus className="mr-1.5 size-3.5" />
Add to interest
</Button>
<Button asChild size="sm" variant="outline">
<Link href={`/${portSlug}/berths/${rec.berthId}`}>View berth</Link>
</Button>
</div>
</div>
) : null}
</div>
);
}
interface AmenityFilterFormProps {
filters: AmenityFilters;
onChange: (next: AmenityFilters) => void;
}
function AmenityFilterForm({ filters, onChange }: AmenityFilterFormProps) {
const update = <K extends keyof AmenityFilters>(key: K, value: AmenityFilters[K]) => {
const next = { ...filters };
if (value === undefined || value === '' || (typeof value === 'number' && Number.isNaN(value))) {
delete next[key];
} else {
next[key] = value;
}
onChange(next);
};
return (
<div className="grid grid-cols-1 gap-3 rounded-lg border bg-muted/20 p-3 sm:grid-cols-2 lg:grid-cols-3">
<div className="space-y-1">
<Label htmlFor="filter-power" className="text-xs">
Min power (kW)
</Label>
<Input
id="filter-power"
type="number"
min={0}
step="0.1"
value={filters.minPowerCapacityKw ?? ''}
onChange={(e) =>
update('minPowerCapacityKw', e.target.value ? parseFloat(e.target.value) : undefined)
}
/>
</div>
<div className="space-y-1">
<Label htmlFor="filter-voltage" className="text-xs">
Voltage
</Label>
<Input
id="filter-voltage"
type="number"
min={0}
step="1"
value={filters.requiredVoltage ?? ''}
onChange={(e) =>
update('requiredVoltage', e.target.value ? parseInt(e.target.value, 10) : undefined)
}
/>
</div>
<div className="space-y-1">
<Label htmlFor="filter-mooring" className="text-xs">
Mooring type
</Label>
<Select
value={filters.requiredMooringType ?? ''}
onValueChange={(v) => update('requiredMooringType', v || undefined)}
>
<SelectTrigger id="filter-mooring">
<SelectValue placeholder="Any" />
</SelectTrigger>
<SelectContent>
<SelectItem value="stern_to">Stern-to</SelectItem>
<SelectItem value="alongside">Alongside</SelectItem>
<SelectItem value="bow_to">Bow-to</SelectItem>
<SelectItem value="swing">Swing mooring</SelectItem>
</SelectContent>
</Select>
</div>
<div className="space-y-1">
<Label htmlFor="filter-cleat" className="text-xs">
Cleat capacity
</Label>
<Select
value={filters.requiredCleatCapacity ?? ''}
onValueChange={(v) => update('requiredCleatCapacity', v || undefined)}
>
<SelectTrigger id="filter-cleat">
<SelectValue placeholder="Any" />
</SelectTrigger>
<SelectContent>
<SelectItem value="standard">Standard</SelectItem>
<SelectItem value="heavy">Heavy</SelectItem>
<SelectItem value="superyacht">Superyacht</SelectItem>
</SelectContent>
</Select>
</div>
<div className="space-y-1">
<Label htmlFor="filter-access" className="text-xs">
Access
</Label>
<Select
value={filters.requiredAccess ?? ''}
onValueChange={(v) => update('requiredAccess', v || undefined)}
>
<SelectTrigger id="filter-access">
<SelectValue placeholder="Any" />
</SelectTrigger>
<SelectContent>
<SelectItem value="land">Land access</SelectItem>
<SelectItem value="dinghy">Dinghy only</SelectItem>
<SelectItem value="walk_on">Walk-on</SelectItem>
</SelectContent>
</Select>
</div>
</div>
);
}
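The `update` closure above keeps the filter object sparse: clearing a field deletes the key instead of storing an empty value, so the request body only carries active filters. The same semantics as a standalone pure function (`updateFilter` is an illustrative name for this sketch):

```typescript
interface AmenityFilters {
  minPowerCapacityKw?: number;
  requiredVoltage?: number;
  requiredAccess?: string;
  requiredMooringType?: string;
  requiredCleatCapacity?: string;
}

// Empty strings, undefined, and NaN (e.g. from a cleared number input)
// all remove the key; anything else is stored.
function updateFilter<K extends keyof AmenityFilters>(
  filters: AmenityFilters,
  key: K,
  value: AmenityFilters[K],
): AmenityFilters {
  const next = { ...filters };
  if (value === undefined || value === '' || (typeof value === 'number' && Number.isNaN(value))) {
    delete next[key];
  } else {
    next[key] = value;
  }
  return next;
}
```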
export function BerthRecommenderPanel({
interestId,
desiredLengthFt,
desiredWidthFt,
desiredDraftFt,
}: BerthRecommenderPanelProps) {
const params = useParams<{ portSlug: string }>();
const portSlug = params?.portSlug ?? '';
const [filtersOpen, setFiltersOpen] = useState(false);
const [amenityFilters, setAmenityFilters] = useState<AmenityFilters>({});
const [showAll, setShowAll] = useState(false);
const [pendingBerth, setPendingBerth] = useState<Recommendation | null>(null);
const hasDimensions = desiredLengthFt !== null;
const queryKey = useMemo(
() => ['berth-recommendations', interestId, amenityFilters, showAll] as const,
[interestId, amenityFilters, showAll],
);
const { data, isFetching, refetch } = useQuery({
queryKey,
enabled: hasDimensions,
queryFn: () =>
apiFetch<{ data: Recommendation[] }>(`/api/v1/interests/${interestId}/recommend-berths`, {
method: 'POST',
body: {
...(showAll ? { topN: 999 } : {}),
...(Object.keys(amenityFilters).length > 0 ? { amenityFilters } : {}),
},
}).then((r) => r.data),
staleTime: 60_000,
});
const recommendations = data ?? [];
return (
<Card>
<CardHeader className="gap-3">
<div className="flex flex-wrap items-start justify-between gap-2">
<div className="min-w-0 space-y-1">
<CardTitle className="flex items-center gap-2 text-base">
<Sparkles className="size-4 text-brand-600" />
Recommendations for {formatDesired(desiredLengthFt, desiredWidthFt, desiredDraftFt)}
</CardTitle>
{!hasDimensions ? (
<p className="text-xs text-muted-foreground">
Set desired dimensions to see recommendations.
</p>
) : null}
</div>
<div className="flex flex-wrap items-center gap-2">
<Button
type="button"
size="sm"
variant="outline"
onClick={() => setFiltersOpen((v) => !v)}
disabled={!hasDimensions}
>
<Filter className="mr-1.5 size-3.5" />
{filtersOpen ? 'Hide filters' : 'Add filters'}
</Button>
<Button
type="button"
size="sm"
variant="outline"
onClick={() => refetch()}
disabled={!hasDimensions || isFetching}
>
<RefreshCw className={cn('mr-1.5 size-3.5', isFetching && 'animate-spin')} />
Refresh
</Button>
</div>
</div>
{filtersOpen && hasDimensions ? (
<AmenityFilterForm filters={amenityFilters} onChange={setAmenityFilters} />
) : null}
</CardHeader>
<CardContent className="space-y-3">
{!hasDimensions ? (
<p className="text-sm text-muted-foreground">
Once length, width, and draft are set on this interest, the recommender will surface
berths that fit. Edit the desired dimensions on the{' '}
<Link href="?tab=overview" className="text-primary underline">
Overview tab
</Link>
.
</p>
) : isFetching && recommendations.length === 0 ? (
<div className="space-y-2">
{[0, 1, 2].map((i) => (
<div key={i} className="h-16 animate-pulse rounded-lg bg-muted" />
))}
</div>
) : recommendations.length === 0 ? (
<p className="py-6 text-center text-sm text-muted-foreground">
No berths match the current dimensions and filters.
</p>
) : (
<div className="space-y-2">
{recommendations.map((rec) => (
<RecommendationCard
key={rec.berthId}
rec={rec}
portSlug={portSlug}
onAdd={setPendingBerth}
/>
))}
</div>
)}
{hasDimensions && recommendations.length > 0 ? (
<div className="flex justify-center pt-1">
<Button type="button" size="sm" variant="ghost" onClick={() => setShowAll((v) => !v)}>
{showAll ? 'Show top recommendations' : 'Show all feasible'}
</Button>
</div>
) : null}
</CardContent>
{pendingBerth ? (
<AddBerthToInterestDialog
interestId={interestId}
berth={{
berthId: pendingBerth.berthId,
mooringNumber: pendingBerth.mooringNumber,
}}
open={pendingBerth !== null}
onOpenChange={(open) => {
if (!open) setPendingBerth(null);
}}
/>
) : null}
</Card>
);
}

View File

@@ -13,7 +13,6 @@ import {
DropdownMenuTrigger,
} from '@/components/ui/dropdown-menu';
import { Badge } from '@/components/ui/badge';
import { TagBadge } from '@/components/shared/tag-badge';
import { stageBadgeClass, stageLabel } from '@/lib/constants';
import { computeUrgencyBadges, type InterestUrgencyInput } from '@/components/interests/urgency';
@@ -21,6 +20,8 @@ export interface InterestRow {
id: string;
clientId: string;
clientName: string | null;
yachtId?: string | null;
yachtName?: string | null;
berthId: string | null;
berthMooringNumber: string | null;
pipelineStage: string;
@@ -36,15 +37,33 @@ export interface InterestRow {
dateDepositReceived?: string | null;
eoiStatus?: string | null;
outcome?: string | null;
/** Imperial; nullable. Recommender treats nulls as "no constraint" on
* that axis. Rendered as a compact "60×18×6 ft" string in the list. */
desiredLengthFt?: string | number | null;
desiredWidthFt?: string | number | null;
desiredDraftFt?: string | number | null;
notesCount?: number;
tags?: Array<{ id: string; name: string; color: string }>;
}
const CATEGORY_LABELS: Record<string, string> = {
general_interest: 'General Interest',
specific_qualified: 'Specific Qualified',
hot_lead: 'Hot Lead',
};
function formatDim(value: string | number | null | undefined): string {
if (value === null || value === undefined || value === '') return '?';
const n = typeof value === 'number' ? value : parseFloat(value);
if (!Number.isFinite(n)) return '?';
return Number.isInteger(n) ? String(n) : n.toFixed(1);
}
function formatDesiredSize(row: InterestRow): string | null {
const { desiredLengthFt, desiredWidthFt, desiredDraftFt } = row;
if (
(desiredLengthFt === null || desiredLengthFt === undefined || desiredLengthFt === '') &&
(desiredWidthFt === null || desiredWidthFt === undefined || desiredWidthFt === '') &&
(desiredDraftFt === null || desiredDraftFt === undefined || desiredDraftFt === '')
) {
return null;
}
return `${formatDim(desiredLengthFt)}×${formatDim(desiredWidthFt)}×${formatDim(desiredDraftFt)} ft`;
}
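The two formatters above produce the compact dimension string shown in the list column. Reproduced standalone (with a pared-down row shape standing in for `InterestRow`):

```typescript
type Dim = string | number | null | undefined;

// Missing or unparsable values render as '?'; whole numbers drop the
// decimal, fractional values keep one decimal place.
function formatDim(value: Dim): string {
  if (value === null || value === undefined || value === '') return '?';
  const n = typeof value === 'number' ? value : parseFloat(value);
  if (!Number.isFinite(n)) return '?';
  return Number.isInteger(n) ? String(n) : n.toFixed(1);
}

function formatDesiredSize(row: {
  desiredLengthFt?: Dim;
  desiredWidthFt?: Dim;
  desiredDraftFt?: Dim;
}): string | null {
  const empty = (v: Dim) => v === null || v === undefined || v === '';
  // All three blank → no string at all (the column renders a dash).
  if (empty(row.desiredLengthFt) && empty(row.desiredWidthFt) && empty(row.desiredDraftFt)) {
    return null;
  }
  return `${formatDim(row.desiredLengthFt)}×${formatDim(row.desiredWidthFt)}×${formatDim(row.desiredDraftFt)} ft`;
}
```

For example, `formatDesiredSize({ desiredLengthFt: '60', desiredWidthFt: 18.5 })` yields `'60×18.5×? ft'`, and a row with no dimensions yields `null`.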
const SOURCE_LABELS: Record<string, string> = {
website: 'Website',
@@ -53,6 +72,12 @@ const SOURCE_LABELS: Record<string, string> = {
broker: 'Broker',
};
const EOI_STATUS_LABELS: Record<string, { label: string; tone: string }> = {
waiting_for_signatures: { label: 'Waiting', tone: 'bg-amber-100 text-amber-900' },
signed: { label: 'Signed', tone: 'bg-emerald-100 text-emerald-900' },
expired: { label: 'Expired', tone: 'bg-rose-100 text-rose-900' },
};
interface GetColumnsOptions {
portSlug: string;
onEdit: (interest: InterestRow) => void;
@@ -93,6 +118,27 @@ export function getInterestColumns({
);
},
},
{
id: 'yachtName',
accessorKey: 'yachtName',
header: 'Yacht',
enableSorting: false,
cell: ({ row }) => {
const name = row.original.yachtName;
if (!name) return <span className="text-muted-foreground">-</span>;
const yachtId = row.original.yachtId;
if (!yachtId) return <span className="truncate text-sm">{name}</span>;
return (
<Link
href={`/${portSlug}/yachts/${yachtId}`}
className="truncate text-primary hover:underline text-sm"
onClick={(e) => e.stopPropagation()}
>
{name}
</Link>
);
},
},
{
id: 'berthMooringNumber',
accessorKey: 'berthMooringNumber',
@@ -112,6 +158,16 @@ export function getInterestColumns({
);
},
},
{
id: 'desiredSize',
header: 'Berth size desired',
enableSorting: false,
cell: ({ row }) => {
const label = formatDesiredSize(row.original);
if (!label) return <span className="text-muted-foreground">-</span>;
return <span className="text-sm tabular-nums">{label}</span>;
},
},
{
id: 'pipelineStage',
accessorKey: 'pipelineStage',
@@ -145,16 +201,22 @@ export function getInterestColumns({
},
},
{
id: 'leadCategory',
accessorKey: 'leadCategory',
header: 'Category',
id: 'eoiStatus',
accessorKey: 'eoiStatus',
header: 'EOI status',
enableSorting: false,
cell: ({ getValue }) => {
const cat = getValue() as string | null;
if (!cat) return <span className="text-muted-foreground">-</span>;
const status = getValue() as string | null;
if (!status) return <span className="text-muted-foreground">-</span>;
const meta = EOI_STATUS_LABELS[status];
return (
<Badge variant="outline" className="text-xs capitalize">
{CATEGORY_LABELS[cat] ?? cat}
</Badge>
<span
className={`inline-flex items-center rounded-full px-2 py-0.5 text-xs font-medium ${
meta?.tone ?? 'bg-muted text-muted-foreground'
}`}
>
{meta?.label ?? status}
</span>
);
},
},
@@ -172,27 +234,6 @@ export function getInterestColumns({
);
},
},
{
id: 'tags',
header: 'Tags',
enableSorting: false,
cell: ({ row }) => {
const rowTags = row.original.tags ?? [];
if (rowTags.length === 0) return <span className="text-muted-foreground">-</span>;
return (
<div className="flex flex-wrap gap-1">
{rowTags.slice(0, 3).map((tag) => (
<TagBadge key={tag.id} name={tag.name} color={tag.color} />
))}
{rowTags.length > 3 && (
<Badge variant="secondary" className="text-xs">
+{rowTags.length - 3}
</Badge>
)}
</div>
);
},
},
{
// Sales-triage default: prefer the explicit dateLastContact, fall back
// to updatedAt. Sortable on dateLastContact server-side; the column

View File

@@ -41,6 +41,11 @@ interface InterestData {
} | null;
berthId: string | null;
berthMooringNumber: string | null;
/** Yacht-fit dimensions (numeric strings from postgres). Drive the
* recommender panel guard ("Set desired dimensions to see recommendations"). */
desiredLengthFt: string | null;
desiredWidthFt: string | null;
desiredDraftFt: string | null;
pipelineStage: string;
leadCategory: string | null;
source: string | null;

View File

@@ -66,6 +66,9 @@ interface InterestFormProps {
reminderEnabled?: boolean;
reminderDays?: number | null;
tags?: Array<{ id: string }>;
desiredLengthFt?: string | number | null;
desiredWidthFt?: string | number | null;
desiredDraftFt?: string | number | null;
};
}
@@ -131,6 +134,18 @@ export function InterestForm({ open, onOpenChange, defaultClientId, interest }:
reminderEnabled: interest.reminderEnabled ?? false,
reminderDays: interest.reminderDays ?? undefined,
tagIds: interest.tags?.map((t) => t.id) ?? [],
desiredLengthFt:
interest.desiredLengthFt === null || interest.desiredLengthFt === undefined
? undefined
: String(interest.desiredLengthFt),
desiredWidthFt:
interest.desiredWidthFt === null || interest.desiredWidthFt === undefined
? undefined
: String(interest.desiredWidthFt),
desiredDraftFt:
interest.desiredDraftFt === null || interest.desiredDraftFt === undefined
? undefined
: String(interest.desiredDraftFt),
});
} else if (!interest && open) {
reset({
@@ -394,6 +409,54 @@ export function InterestForm({ open, onOpenChange, defaultClientId, interest }:
<Separator />
{/* Desired berth dimensions (recommender inputs) */}
<div className="space-y-2">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wide">
Berth size desired
</h3>
<p className="text-xs text-muted-foreground">
Imperial. Optional - the recommender treats blank fields as no constraint on that
axis.
</p>
<div className="grid grid-cols-3 gap-3">
<div className="space-y-1">
<Label htmlFor="desiredLengthFt">Length (ft)</Label>
<Input
id="desiredLengthFt"
{...register('desiredLengthFt')}
type="number"
step="0.01"
min={0}
placeholder="e.g. 60"
/>
</div>
<div className="space-y-1">
<Label htmlFor="desiredWidthFt">Width (ft)</Label>
<Input
id="desiredWidthFt"
{...register('desiredWidthFt')}
type="number"
step="0.01"
min={0}
placeholder="e.g. 18"
/>
</div>
<div className="space-y-1">
<Label htmlFor="desiredDraftFt">Draft (ft)</Label>
<Input
id="desiredDraftFt"
{...register('desiredDraftFt')}
type="number"
step="0.01"
min={0}
placeholder="e.g. 6"
/>
</div>
</div>
</div>
<Separator />
{/* Notes */}
<div className="space-y-2">
<Label>Notes</Label>

View File

@@ -12,6 +12,8 @@ import { NotesList } from '@/components/shared/notes-list';
import { InlineEditableField } from '@/components/shared/inline-editable-field';
import { InlineTagEditor } from '@/components/shared/inline-tag-editor';
import { RecommendationList } from '@/components/interests/recommendation-list';
import { BerthRecommenderPanel } from '@/components/interests/berth-recommender-panel';
import { LinkedBerthsList } from '@/components/interests/linked-berths-list';
import { InterestTimeline } from '@/components/interests/interest-timeline';
import { InterestDocumentsTab } from '@/components/interests/interest-documents-tab';
import { InterestFilesTab } from '@/components/interests/interest-files-tab';
@@ -37,6 +39,10 @@ interface InterestTabsOptions {
currentUserId?: string;
interest: {
pipelineStage: string;
/** Drives the recommender panel mounted on the Overview tab. */
desiredLengthFt?: string | null;
desiredWidthFt?: string | null;
desiredDraftFt?: string | null;
leadCategory: string | null;
source: string | null;
eoiStatus: string | null;
@@ -306,6 +312,12 @@ function OverviewTab({
activeMilestone = 'contract';
}
const toNum = (v: string | null | undefined): number | null => {
if (v === null || v === undefined) return null;
const n = parseFloat(v);
return Number.isFinite(n) ? n : null;
};
return (
<div className="space-y-6">
{/* Sales-process milestones - the heart of the system. Each section is a
@@ -498,6 +510,22 @@ function OverviewTab({
/>
</div>
</div>
{/* Linked berths (plan §5.5) - shown ABOVE the recommender so reps see
what's already linked before browsing more options. Each row exposes
per-berth role-flag toggles and the EOI bypass control (only visible
once the parent interest's primary EOI is signed). */}
<LinkedBerthsList interestId={interestId} />
{/* Berth recommender (plan §5.3) - always-mounted card driven by the
interest's desired dimensions. Renders an inline guidance message
when dimensions aren't set yet. */}
<BerthRecommenderPanel
interestId={interestId}
desiredLengthFt={toNum(interest.desiredLengthFt)}
desiredWidthFt={toNum(interest.desiredWidthFt)}
desiredDraftFt={toNum(interest.desiredDraftFt)}
/>
</div>
);
}

View File

@@ -0,0 +1,478 @@
'use client';
/**
* Linked-berths list — plan §5.5.
*
* Shows every berth currently linked to the interest with per-row controls:
* - "Specifically pitching" toggle (`is_specific_interest`) — drives the
* public-map "Under Offer" sub-status. Each state surfaces its consequence
* in plain text below the toggle.
* - "Mark in EOI bundle" toggle (`is_in_eoi_bundle`).
* - "Set as primary" button when this row isn't already primary. The
* service helper handles the demote-prior-primary case in a single tx.
* - "Bypass EOI for this berth" with a reason textarea. Only rendered when
* the parent interest's `eoiStatus === 'signed'`. Writes
* `eoi_bypass_reason`, `eoi_bypassed_by`, `eoi_bypassed_at`.
* - "Remove" — calls `removeInterestBerth`.
*/
import { useState } from 'react';
import Link from 'next/link';
import { useParams } from 'next/navigation';
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import { Anchor, Loader2, Star, Trash2 } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
import { Label } from '@/components/ui/label';
import { StatusPill, type StatusPillStatus } from '@/components/ui/status-pill';
import { Switch } from '@/components/ui/switch';
import { Textarea } from '@/components/ui/textarea';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
// ─── Types (mirror the API GET shape — see interest-berths.service.ts) ─────
export interface LinkedBerthRow {
id: string;
interestId: string;
berthId: string;
isPrimary: boolean;
isSpecificInterest: boolean;
isInEoiBundle: boolean;
eoiBypassReason: string | null;
eoiBypassedBy: string | null;
eoiBypassedAt: string | null;
addedBy: string | null;
addedAt: string;
notes: string | null;
mooringNumber: string | null;
area: string | null;
status: string;
lengthFt: string | null;
widthFt: string | null;
draftFt: string | null;
}
interface LinkedBerthsResponse {
data: LinkedBerthRow[];
meta: { eoiStatus: string | null };
}
interface LinkedBerthsListProps {
interestId: string;
}
// ─── Helpers ────────────────────────────────────────────────────────────────
function statusToPill(status: string): StatusPillStatus {
switch (status) {
case 'available':
return 'active';
case 'under_offer':
return 'sent';
case 'sold':
return 'completed';
case 'reserved':
return 'partial';
default:
return 'pending';
}
}
function formatStatus(status: string): string {
return status.replace(/_/g, ' ').replace(/\b\w/g, (m) => m.toUpperCase());
}
function formatDimensions(
length: string | null,
width: string | null,
draft: string | null,
): string | null {
const parts: string[] = [];
const toNum = (v: string | null) => {
if (v === null) return null;
const n = parseFloat(v);
return Number.isFinite(n) ? n : null;
};
const l = toNum(length);
const w = toNum(width);
const d = toNum(draft);
if (l !== null) parts.push(`${l.toFixed(1)}ft L`);
if (w !== null) parts.push(`${w.toFixed(1)}ft W`);
if (d !== null) parts.push(`${d.toFixed(1)}ft D`);
return parts.length > 0 ? parts.join(' · ') : null;
}
const SPECIFIC_CONSEQUENCE_ON = 'This berth will appear as under interest on the public map.';
const SPECIFIC_CONSEQUENCE_OFF = 'This berth is not flagged as under interest on the public map.';
// ─── Hooks ──────────────────────────────────────────────────────────────────
function useLinkedBerths(interestId: string) {
return useQuery({
queryKey: ['interest-berths', interestId] as const,
queryFn: () => apiFetch<LinkedBerthsResponse>(`/api/v1/interests/${interestId}/berths`),
staleTime: 30_000,
});
}
interface PatchPayload {
isPrimary?: boolean;
isSpecificInterest?: boolean;
isInEoiBundle?: boolean;
eoiBypassReason?: string | null;
}
function useUpdateLink(interestId: string) {
const qc = useQueryClient();
return useMutation({
mutationFn: async (args: { berthId: string; patch: PatchPayload }) =>
apiFetch(`/api/v1/interests/${interestId}/berths/${args.berthId}`, {
method: 'PATCH',
body: args.patch,
}),
onSuccess: () => {
qc.invalidateQueries({ queryKey: ['interest-berths', interestId] });
qc.invalidateQueries({ queryKey: ['interests', interestId] });
},
});
}
function useRemoveLink(interestId: string) {
const qc = useQueryClient();
return useMutation({
mutationFn: async (berthId: string) =>
apiFetch(`/api/v1/interests/${interestId}/berths/${berthId}`, { method: 'DELETE' }),
onSuccess: () => {
qc.invalidateQueries({ queryKey: ['interest-berths', interestId] });
qc.invalidateQueries({ queryKey: ['interests', interestId] });
qc.invalidateQueries({ queryKey: ['berth-recommendations', interestId] });
},
});
}
// ─── Bypass dialog ──────────────────────────────────────────────────────────
interface BypassDialogProps {
row: LinkedBerthRow;
open: boolean;
onOpenChange: (open: boolean) => void;
onSubmit: (reason: string | null) => void;
isPending: boolean;
}
function BypassDialog({ row, open, onOpenChange, onSubmit, isPending }: BypassDialogProps) {
const [reason, setReason] = useState(row.eoiBypassReason ?? '');
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent>
<DialogHeader>
<DialogTitle>Bypass EOI for berth {row.mooringNumber}</DialogTitle>
<DialogDescription>
Record why this berth&apos;s individual EOI is being waived. The interest&apos;s primary
EOI signature will cover it instead.
</DialogDescription>
</DialogHeader>
<div className="space-y-2">
<Label htmlFor={`bypass-reason-${row.berthId}`} className="text-xs">
Reason
</Label>
<Textarea
id={`bypass-reason-${row.berthId}`}
value={reason}
onChange={(e) => setReason(e.target.value)}
placeholder="e.g. covered under bundle EOI signed 2025-04-12"
rows={4}
/>
</div>
<DialogFooter className="gap-2 sm:gap-2">
{row.eoiBypassReason ? (
<Button
type="button"
variant="outline"
onClick={() => onSubmit(null)}
disabled={isPending}
>
Clear bypass
</Button>
) : null}
<Button
type="button"
variant="outline"
onClick={() => onOpenChange(false)}
disabled={isPending}
>
Cancel
</Button>
<Button
type="button"
onClick={() => onSubmit(reason.trim().length > 0 ? reason.trim() : null)}
disabled={isPending || reason.trim().length === 0}
>
{isPending ? <Loader2 className="mr-1.5 size-3.5 animate-spin" /> : null}
Save bypass
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
// ─── Row ────────────────────────────────────────────────────────────────────
interface RowProps {
row: LinkedBerthRow;
portSlug: string;
eoiStatus: string | null;
onUpdate: (berthId: string, patch: PatchPayload) => void;
onRemove: (berthId: string) => void;
isPending: boolean;
}
function LinkedBerthRowItem({ row, portSlug, eoiStatus, onUpdate, onRemove, isPending }: RowProps) {
const [bypassOpen, setBypassOpen] = useState(false);
const [confirmRemove, setConfirmRemove] = useState(false);
const dims = formatDimensions(row.lengthFt, row.widthFt, row.draftFt);
const showBypassControl = eoiStatus === 'signed';
return (
<div
className={cn(
'rounded-lg border bg-card p-3 text-sm',
row.isPrimary ? 'border-brand-300 ring-1 ring-brand-200' : 'border-border',
)}
>
<div className="flex flex-wrap items-start justify-between gap-2">
<div className="min-w-0 space-y-1">
<div className="flex flex-wrap items-center gap-2">
<Link
href={`/${portSlug}/berths/${row.berthId}`}
className="font-semibold text-primary hover:underline"
>
{row.mooringNumber ?? row.berthId}
</Link>
{row.area ? <span className="text-xs text-muted-foreground">{row.area}</span> : null}
<StatusPill status={statusToPill(row.status)}>{formatStatus(row.status)}</StatusPill>
{row.isPrimary ? (
<span className="inline-flex items-center gap-1 rounded-md border border-brand-200 bg-brand-50 px-2 py-0.5 text-xs font-medium text-brand-800">
<Star className="size-3" />
Primary
</span>
) : null}
{row.eoiBypassReason ? (
<span className="inline-flex items-center rounded-md border border-amber-200 bg-amber-50 px-2 py-0.5 text-xs font-medium text-amber-800">
EOI bypassed
</span>
) : null}
</div>
{dims ? <div className="text-xs text-muted-foreground">{dims}</div> : null}
</div>
<div className="flex flex-wrap items-center gap-2">
{!row.isPrimary ? (
<Button
type="button"
size="sm"
variant="outline"
onClick={() => onUpdate(row.berthId, { isPrimary: true })}
disabled={isPending}
>
<Star className="mr-1.5 size-3.5" />
Set as primary
</Button>
) : null}
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => setConfirmRemove(true)}
disabled={isPending}
className="text-destructive hover:text-destructive"
aria-label={`Remove berth ${row.mooringNumber ?? row.berthId}`}
>
<Trash2 className="size-3.5" />
</Button>
</div>
</div>
<div className="mt-3 grid grid-cols-1 gap-3 border-t pt-3 sm:grid-cols-2">
<div className="space-y-1">
<div className="flex items-center justify-between gap-2">
<Label htmlFor={`specific-${row.berthId}`} className="text-sm font-medium">
Specifically pitching
</Label>
<Switch
id={`specific-${row.berthId}`}
checked={row.isSpecificInterest}
disabled={isPending}
onCheckedChange={(checked) => onUpdate(row.berthId, { isSpecificInterest: checked })}
/>
</div>
<p className="text-xs text-muted-foreground">
{row.isSpecificInterest ? SPECIFIC_CONSEQUENCE_ON : SPECIFIC_CONSEQUENCE_OFF}
</p>
</div>
<div className="space-y-1">
<div className="flex items-center justify-between gap-2">
<Label htmlFor={`bundle-${row.berthId}`} className="text-sm font-medium">
Mark in EOI bundle
</Label>
<Switch
id={`bundle-${row.berthId}`}
checked={row.isInEoiBundle}
disabled={isPending}
onCheckedChange={(checked) => onUpdate(row.berthId, { isInEoiBundle: checked })}
/>
</div>
<p className="text-xs text-muted-foreground">
                {row.isInEoiBundle
                  ? 'Covered by the interest\'s EOI signature.'
                  : 'Not covered by the EOI bundle.'}
</p>
</div>
</div>
{showBypassControl ? (
<div className="mt-3 flex flex-wrap items-start justify-between gap-2 border-t pt-3">
<div className="min-w-0 space-y-0.5">
<p className="text-sm font-medium">Bypass EOI for this berth</p>
{row.eoiBypassReason ? (
<p className="text-xs text-muted-foreground">
<span className="font-medium text-foreground/80">Bypassed:</span>{' '}
{row.eoiBypassReason}
</p>
) : (
<p className="text-xs text-muted-foreground">
Record a reason if this berth doesn&apos;t need its own EOI.
</p>
)}
</div>
<Button
type="button"
size="sm"
variant="outline"
onClick={() => setBypassOpen(true)}
disabled={isPending}
>
{row.eoiBypassReason ? 'Edit bypass' : 'Add bypass'}
</Button>
</div>
) : null}
{bypassOpen ? (
<BypassDialog
row={row}
open={bypassOpen}
onOpenChange={setBypassOpen}
isPending={isPending}
onSubmit={(reason) => {
onUpdate(row.berthId, { eoiBypassReason: reason });
setBypassOpen(false);
}}
/>
) : null}
<Dialog open={confirmRemove} onOpenChange={setConfirmRemove}>
<DialogContent>
<DialogHeader>
<DialogTitle>Remove berth {row.mooringNumber} from interest?</DialogTitle>
<DialogDescription>
              The berth itself isn&apos;t deleted; only its link to this interest is removed.
</DialogDescription>
</DialogHeader>
<DialogFooter className="gap-2 sm:gap-2">
<Button
type="button"
variant="outline"
onClick={() => setConfirmRemove(false)}
disabled={isPending}
>
Cancel
</Button>
<Button
type="button"
variant="destructive"
onClick={() => {
onRemove(row.berthId);
setConfirmRemove(false);
}}
disabled={isPending}
>
Remove
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}
// ─── Component ──────────────────────────────────────────────────────────────
export function LinkedBerthsList({ interestId }: LinkedBerthsListProps) {
const params = useParams<{ portSlug: string }>();
const portSlug = params?.portSlug ?? '';
const { data, isLoading } = useLinkedBerths(interestId);
const updateMutation = useUpdateLink(interestId);
const removeMutation = useRemoveLink(interestId);
const rows = data?.data ?? [];
const eoiStatus = data?.meta.eoiStatus ?? null;
const isPending = updateMutation.isPending || removeMutation.isPending;
return (
<Card>
<CardHeader>
<CardTitle className="flex items-center gap-2 text-base">
<Anchor className="size-4 text-brand-600" />
Linked berths{rows.length > 0 ? ` (${rows.length})` : ''}
</CardTitle>
</CardHeader>
<CardContent className="space-y-3">
{isLoading ? (
<div className="space-y-2">
{[0, 1].map((i) => (
<div key={i} className="h-24 animate-pulse rounded-lg bg-muted" />
))}
</div>
) : rows.length === 0 ? (
<p className="py-6 text-center text-sm text-muted-foreground">
No berths linked yet. Use the recommender below to add one.
</p>
) : (
<div className="space-y-2">
{rows.map((row) => (
<LinkedBerthRowItem
key={row.id}
row={row}
portSlug={portSlug}
eoiStatus={eoiStatus}
onUpdate={(berthId, patch) => updateMutation.mutate({ berthId, patch })}
onRemove={(berthId) => removeMutation.mutate(berthId)}
isPending={isPending}
/>
))}
</div>
)}
{updateMutation.isError ? (
<p className="text-sm text-destructive">
{(updateMutation.error as Error)?.message ?? 'Failed to update berth.'}
</p>
) : null}
{removeMutation.isError ? (
<p className="text-sm text-destructive">
{(removeMutation.error as Error)?.message ?? 'Failed to remove berth.'}
</p>
) : null}
</CardContent>
</Card>
);
}

View File

@@ -0,0 +1,42 @@
'use client';
/**
* Per-interest send launcher (Phase 7 §5.9).
*
* Shown on the interest detail page header. Opens the same picker as
* {@link SendDocumentsDialog} but pre-pins the `interestId` so the resulting
* audit row is filed against the interest timeline.
*/
import { useState } from 'react';
import { Send } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { SendDocumentsDialog } from '@/components/clients/send-documents-dialog';
interface SendFromInterestButtonProps {
interestId: string;
clientId: string;
clientName: string;
}
export function SendFromInterestButton({
interestId,
clientId,
clientName,
}: SendFromInterestButtonProps) {
const [open, setOpen] = useState(false);
return (
<>
<Button variant="outline" size="sm" onClick={() => setOpen(true)}>
<Send className="mr-2 h-4 w-4" /> Send documents
</Button>
<SendDocumentsDialog
open={open}
onOpenChange={setOpen}
clientId={clientId}
clientName={clientName}
interestId={interestId}
/>
</>
);
}

View File

@@ -0,0 +1,277 @@
'use client';
/**
* Shared send-document dialog (Phase 7).
*
* Used by:
* - {@link SendBerthPdfDialog} (berths/) — single berth, recipient picker.
* - {@link SendBrochureDialog} (clients/, interests/) — brochure picker.
* - The interest "send from interest" pattern reuses both via a wrapper.
*
* §14.7 mitigations enforced client-side:
* - Recipient email is shown verbatim in the confirm step (no quick-send).
* - Pre-send dry-run calls /preview first; the Send button is disabled
* until the unresolved-tokens list is empty.
* - Body length capped at 50KB; char count visible.
*/
import { useEffect, useMemo, useState } from 'react';
import { useMutation, useQuery } from '@tanstack/react-query';
import { Loader2 } from 'lucide-react';
import { toast } from 'sonner';
import { Button } from '@/components/ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from '@/components/ui/dialog';
import { Label } from '@/components/ui/label';
import { Textarea } from '@/components/ui/textarea';
import { Input } from '@/components/ui/input';
import { apiFetch } from '@/lib/api/client';
const BODY_MAX = 50_000;
export type DocumentKind = 'berth_pdf' | 'brochure';
interface SendDocumentDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
documentKind: DocumentKind;
/** Pre-filled recipient. Leave both blank to let the rep type one. */
recipient: { clientId?: string; email?: string; interestId?: string };
/** Either a berthId (for berth_pdf) or brochureId (for brochure). */
context: { berthId?: string; brochureId?: string };
/** Title displayed in the dialog header. */
title: string;
/** Short context line under the title (e.g. "Berth A1 — primary version"). */
subtitle?: string;
onSent?: () => void;
}
interface PreviewResponse {
data: { html: string; markdown: string; unresolved: string[] };
}
export function SendDocumentDialog({
open,
onOpenChange,
documentKind,
recipient,
context,
title,
subtitle,
onSent,
}: SendDocumentDialogProps) {
const [step, setStep] = useState<'compose' | 'confirm'>('compose');
const [emailOverride, setEmailOverride] = useState(recipient.email ?? '');
const [customBody, setCustomBody] = useState('');
useEffect(() => {
if (open) {
setStep('compose');
setEmailOverride(recipient.email ?? '');
setCustomBody('');
}
}, [open, recipient.email]);
const recipientForApi = useMemo(
() => ({
clientId: recipient.clientId,
email: emailOverride || recipient.email,
interestId: recipient.interestId,
}),
[recipient.clientId, recipient.email, recipient.interestId, emailOverride],
);
  // Live preview via /api/v1/document-sends/preview. Re-runs whenever the
  // body text or recipient changes; React Query caches per query key but
  // does not debounce, so each distinct body value fires a request.
const previewQuery = useQuery<PreviewResponse>({
queryKey: [
'document-sends-preview',
documentKind,
context.berthId ?? null,
context.brochureId ?? null,
recipientForApi.clientId ?? null,
recipientForApi.email ?? null,
customBody,
],
queryFn: () =>
apiFetch('/api/v1/document-sends/preview', {
method: 'POST',
body: {
documentKind,
recipient: recipientForApi,
berthId: context.berthId,
brochureId: context.brochureId,
customBodyMarkdown: customBody.trim() ? customBody : undefined,
},
}),
enabled: open && Boolean(recipientForApi.clientId || recipientForApi.email),
});
type SendResp = { data: { error?: string; deliveredAsAttachment: boolean } };
const sendMutation = useMutation<SendResp, Error, void>({
mutationFn: async (): Promise<SendResp> => {
const endpoint =
documentKind === 'berth_pdf'
? '/api/v1/document-sends/berth-pdf'
: '/api/v1/document-sends/brochure';
const body =
documentKind === 'berth_pdf'
? {
berthId: context.berthId,
recipient: recipientForApi,
customBodyMarkdown: customBody.trim() ? customBody : undefined,
}
: {
brochureId: context.brochureId,
recipient: recipientForApi,
customBodyMarkdown: customBody.trim() ? customBody : undefined,
};
      return apiFetch<SendResp>(endpoint, { method: 'POST', body });
},
onSuccess: (resp) => {
if (resp.data.error) {
toast.error(`Send failed: ${resp.data.error}`);
} else {
toast.success(
resp.data.deliveredAsAttachment
? 'Sent as attachment'
: 'Sent (large file delivered as download link)',
);
onSent?.();
onOpenChange(false);
}
},
onError: (err) => {
toast.error(err instanceof Error ? err.message : 'Send failed');
},
});
const unresolved = previewQuery.data?.data.unresolved ?? [];
const previewHtml = previewQuery.data?.data.html ?? '';
const recipientResolved = Boolean(recipientForApi.clientId || recipientForApi.email);
const canPreview = recipientResolved;
const canSend = step === 'confirm' && unresolved.length === 0 && !sendMutation.isPending;
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="max-w-2xl">
<DialogHeader>
<DialogTitle>{title}</DialogTitle>
{subtitle && <DialogDescription>{subtitle}</DialogDescription>}
</DialogHeader>
{step === 'compose' ? (
<div className="space-y-4">
<div className="space-y-1">
<Label htmlFor="ds-email">Recipient email</Label>
<Input
id="ds-email"
type="email"
value={emailOverride}
onChange={(e) => setEmailOverride(e.target.value)}
placeholder={recipient.email ? '' : 'recipient@example.com'}
/>
<p className="text-xs text-muted-foreground">
{recipient.clientId
? 'Defaults to the client primary email; override here if needed.'
: 'Type the address you want to send to.'}
</p>
</div>
<div className="space-y-1">
<Label htmlFor="ds-body">Message body</Label>
<Textarea
id="ds-body"
rows={10}
value={customBody}
onChange={(e) => setCustomBody(e.target.value.slice(0, BODY_MAX))}
placeholder="Leave blank to use the port template…"
className="font-mono text-sm"
/>
<div className="flex items-center justify-between text-xs text-muted-foreground">
<span>Markdown supported. Tokens like {'{{client.fullName}}'} are expanded.</span>
<span>
{customBody.length} / {BODY_MAX}
</span>
</div>
</div>
{canPreview && previewQuery.isSuccess && (
<div className="rounded border bg-muted/30 p-3">
<p className="mb-2 text-xs font-semibold uppercase tracking-wide text-muted-foreground">
Preview
</p>
<div
className="prose prose-sm max-w-none"
dangerouslySetInnerHTML={{ __html: previewHtml }}
/>
{unresolved.length > 0 && (
<p className="mt-3 rounded border border-amber-300 bg-amber-50 p-2 text-xs text-amber-900">
Unresolved merge fields: {unresolved.join(', ')}. Replace these before sending.
</p>
)}
</div>
)}
{previewQuery.isLoading && canPreview && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" /> Rendering preview
</div>
)}
</div>
) : (
<div className="space-y-3">
<div className="rounded border bg-muted/30 p-3 text-sm">
<p>
Sending to: <span className="font-mono">{recipientForApi.email}</span>
</p>
{context.berthId && <p>Document: berth PDF</p>}
{context.brochureId !== undefined && <p>Document: brochure</p>}
</div>
<div
className="prose prose-sm max-w-none rounded border p-3"
dangerouslySetInnerHTML={{ __html: previewHtml }}
/>
<p className="text-xs text-muted-foreground">
              The PDF file is added by the system after the body; your text won&rsquo;t override
              it.
            </p>
</div>
)}
<DialogFooter className="gap-2">
{step === 'compose' ? (
<>
<Button variant="outline" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button
onClick={() => setStep('confirm')}
disabled={!recipientResolved || unresolved.length > 0 || previewQuery.isLoading}
>
Review & send
</Button>
</>
) : (
<>
<Button variant="outline" onClick={() => setStep('compose')}>
Back
</Button>
<Button onClick={() => sendMutation.mutate()} disabled={!canSend}>
{sendMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Confirm and send
</Button>
</>
)}
</DialogFooter>
</DialogContent>
</Dialog>
);
}

View File

@@ -0,0 +1,31 @@
-- Normalize berth mooring numbers from legacy hyphen+padded form ("A-01")
-- to the canonical form ("A1") that NocoDB, the public website, the
-- Documenso EOI templates, and every external reference use.
--
-- Idempotent: rows already in canonical form are untouched. The
-- replacement regex accepts:
--   - an optional hyphen between letter prefix and digits
--   - optional leading zeros on the digits
--   - one or more letters in the prefix (future-proofs "AA1" etc.)
-- but the WHERE filter only selects hyphenated padded forms ("A-01"),
-- so un-hyphenated padded values are not rewritten here.
-- Pure-numeric or otherwise non-conforming moorings (e.g. "B-LEG") are
-- left unchanged so they show up in the sanity check below for manual
-- review.
UPDATE berths
SET mooring_number = regexp_replace(mooring_number, '^([A-Z]+)-?0*(\d+)$', '\1\2')
WHERE mooring_number ~ '^[A-Z]+-0*\d+$';
-- Sanity check: surface any moorings that don't match the canonical
-- pattern after the rewrite. These need manual triage before Phase 3
-- can ship (the public website builds /berths/:mooring URLs from this
-- value). Logged via NOTICE so the migration runner prints them.
DO $$
DECLARE
bad_count integer;
BEGIN
  SELECT count(*) INTO bad_count
  FROM berths
  WHERE mooring_number !~ '^[A-Z]+[1-9]\d*$';
  IF bad_count > 0 THEN
    RAISE NOTICE 'Mooring normalization: % rows do not match ^[A-Z]+[1-9]\d*$ - manual review needed', bad_count;
  END IF;
END $$;

View File

@@ -0,0 +1,6 @@
ALTER TABLE "berths" ADD COLUMN "weekly_rate_high_usd" numeric;--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "weekly_rate_low_usd" numeric;--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "daily_rate_high_usd" numeric;--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "daily_rate_low_usd" numeric;--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "pricing_valid_until" date;--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "last_imported_at" timestamp with time zone;

View File

@@ -0,0 +1 @@
CREATE UNIQUE INDEX "idx_cc_one_primary_per_channel" ON "client_contacts" USING btree ("client_id","channel") WHERE "client_contacts"."is_primary" = true;

View File

@@ -0,0 +1,24 @@
-- Backfill clients.nationality_iso from the primary phone's parsed
-- value_country. Idempotent (only runs on rows where nationality_iso
-- is null), safe to re-execute. Phone country is a *proxy* for
-- nationality - the client-list UI labels the column "Country" rather
-- than "Nationality" to avoid implying it's authoritative (see §14.2).
--
-- Pattern: prefer the row marked `is_primary=true`; fall back to the
-- most recently created phone contact when no row is flagged primary.
WITH primary_phone AS (
SELECT DISTINCT ON (cc.client_id)
cc.client_id,
cc.value_country
FROM client_contacts cc
WHERE cc.channel = 'phone'
AND cc.value_country IS NOT NULL
ORDER BY cc.client_id,
cc.is_primary DESC,
cc.created_at DESC
)
UPDATE clients c
SET nationality_iso = primary_phone.value_country
FROM primary_phone
WHERE c.nationality_iso IS NULL
AND c.id = primary_phone.client_id;
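
The `DISTINCT ON` selection above is easiest to unit-test when restated in application code. A minimal TypeScript sketch under assumed shapes; `PhoneContact` and `pickNationality` are illustrative names, not identifiers from the repo:

```typescript
interface PhoneContact {
  clientId: string;
  valueCountry: string | null;
  isPrimary: boolean;
  createdAt: string; // ISO timestamp, lexicographically sortable
}

// Mirrors ORDER BY client_id, is_primary DESC, created_at DESC with
// DISTINCT ON (client_id): prefer the primary phone, otherwise the
// most recently created one; rows without a parsed country are skipped.
export function pickNationality(contacts: PhoneContact[]): Map<string, string> {
  const result = new Map<string, string>();
  const sorted = contacts
    .filter((c): c is PhoneContact & { valueCountry: string } => c.valueCountry !== null)
    .sort(
      (a, b) =>
        a.clientId.localeCompare(b.clientId) ||
        Number(b.isPrimary) - Number(a.isPrimary) ||
        b.createdAt.localeCompare(a.createdAt),
    );
  for (const c of sorted) {
    // DISTINCT ON keeps only the first row per client_id
    if (!result.has(c.clientId)) result.set(c.clientId, c.valueCountry);
  }
  return result;
}
```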

View File

@@ -0,0 +1,61 @@
CREATE TABLE "interest_berths" (
"id" text PRIMARY KEY NOT NULL,
"interest_id" text NOT NULL,
"berth_id" text NOT NULL,
"is_primary" boolean DEFAULT false NOT NULL,
"is_specific_interest" boolean DEFAULT true NOT NULL,
"is_in_eoi_bundle" boolean DEFAULT false NOT NULL,
"eoi_bypass_reason" text,
"eoi_bypassed_by" text,
"eoi_bypassed_at" timestamp with time zone,
"added_by" text,
"added_at" timestamp with time zone DEFAULT now() NOT NULL,
"notes" text
);
--> statement-breakpoint
ALTER TABLE "interests" ADD COLUMN "desired_length_ft" numeric;--> statement-breakpoint
ALTER TABLE "interests" ADD COLUMN "desired_width_ft" numeric;--> statement-breakpoint
ALTER TABLE "interests" ADD COLUMN "desired_draft_ft" numeric;--> statement-breakpoint
ALTER TABLE "interest_berths" ADD CONSTRAINT "interest_berths_interest_id_interests_id_fk" FOREIGN KEY ("interest_id") REFERENCES "public"."interests"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "interest_berths" ADD CONSTRAINT "interest_berths_berth_id_berths_id_fk" FOREIGN KEY ("berth_id") REFERENCES "public"."berths"("id") ON DELETE restrict ON UPDATE no action;--> statement-breakpoint
CREATE UNIQUE INDEX "idx_ib_interest_berth" ON "interest_berths" USING btree ("interest_id","berth_id");--> statement-breakpoint
CREATE UNIQUE INDEX "idx_ib_one_primary" ON "interest_berths" USING btree ("interest_id") WHERE "interest_berths"."is_primary" = true;--> statement-breakpoint
CREATE INDEX "idx_ib_berth" ON "interest_berths" USING btree ("berth_id");--> statement-breakpoint
CREATE INDEX "idx_ib_specific" ON "interest_berths" USING btree ("berth_id") WHERE "interest_berths"."is_specific_interest" = true;--> statement-breakpoint
-- Pre-flight: halt if any interests.berth_id points at a row that no
-- longer exists. The new junction's FK is `restrict`, so a dangling
-- value would otherwise abort the insert mid-batch with a confusing
-- error.
DO $$
DECLARE
orphan_count integer;
BEGIN
SELECT count(*) INTO orphan_count
FROM interests i
LEFT JOIN berths b ON b.id = i.berth_id
WHERE i.berth_id IS NOT NULL
AND b.id IS NULL;
IF orphan_count > 0 THEN
RAISE EXCEPTION 'interests.berth_id has % dangling references; resolve manually before re-running', orphan_count;
END IF;
END $$;--> statement-breakpoint
-- Migrate existing interest.berth_id values into the junction. Every
-- pre-existing single-berth link becomes a primary, specific-interest
-- row. is_in_eoi_bundle = true only when the interest already has a
-- signed EOI (the legacy "the berth is contractually committed" case).
INSERT INTO interest_berths (
id, interest_id, berth_id,
is_primary, is_specific_interest, is_in_eoi_bundle,
added_at
)
SELECT
gen_random_uuid()::text,
i.id,
i.berth_id,
true,
true,
COALESCE(i.eoi_status = 'signed', false),
i.created_at
FROM interests i
WHERE i.berth_id IS NOT NULL
ON CONFLICT (interest_id, berth_id) DO NOTHING;
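
The backfill rule in the INSERT above can be restated as a small pure function, which is handy when unit-testing the junction service against legacy fixtures. A sketch with illustrative names (`LegacyInterest` and `toJunctionSeed` are not repo identifiers):

```typescript
interface LegacyInterest {
  id: string;
  berthId: string | null;
  eoiStatus: string | null;
  createdAt: string;
}

interface InterestBerthSeed {
  interestId: string;
  berthId: string;
  isPrimary: true;
  isSpecificInterest: true;
  isInEoiBundle: boolean;
  addedAt: string;
}

// Every pre-existing single-berth link becomes a primary, specific-interest
// row; isInEoiBundle is true only for already-signed EOIs. This mirrors
// COALESCE(i.eoi_status = 'signed', false) in the SQL above.
export function toJunctionSeed(i: LegacyInterest): InterestBerthSeed | null {
  if (i.berthId === null) return null; // no legacy link -> no junction row
  return {
    interestId: i.id,
    berthId: i.berthId,
    isPrimary: true,
    isSpecificInterest: true,
    isInEoiBundle: i.eoiStatus === 'signed',
    addedAt: i.createdAt,
  };
}
```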

View File

@@ -0,0 +1,2 @@
DROP INDEX "idx_interests_berth";--> statement-breakpoint
ALTER TABLE "interests" DROP COLUMN "berth_id";

View File

@@ -0,0 +1,24 @@
CREATE TABLE "berth_pdf_versions" (
"id" text PRIMARY KEY NOT NULL,
"berth_id" text NOT NULL,
"version_number" integer NOT NULL,
"storage_key" text NOT NULL,
"file_name" text NOT NULL,
"file_size_bytes" integer NOT NULL,
"content_sha256" text NOT NULL,
"uploaded_by" text NOT NULL,
"uploaded_at" timestamp with time zone DEFAULT now() NOT NULL,
"download_url_expires_at" timestamp with time zone,
"parse_results" jsonb
);
--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "current_pdf_version_id" text;--> statement-breakpoint
ALTER TABLE "berth_pdf_versions" ADD CONSTRAINT "berth_pdf_versions_berth_id_berths_id_fk" FOREIGN KEY ("berth_id") REFERENCES "public"."berths"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
CREATE UNIQUE INDEX "berth_pdf_versions_berth_version_idx" ON "berth_pdf_versions" USING btree ("berth_id","version_number");--> statement-breakpoint
CREATE INDEX "idx_bpv_berth" ON "berth_pdf_versions" USING btree ("berth_id","uploaded_at");--> statement-breakpoint
-- berths.current_pdf_version_id -> berth_pdf_versions.id (added after both tables
-- exist to break the circular FK declaration; ON DELETE SET NULL so deleting the
-- pointed-at row keeps the berth and just clears the pointer).
ALTER TABLE "berths" ADD CONSTRAINT "berths_current_pdf_version_id_fk"
FOREIGN KEY ("current_pdf_version_id") REFERENCES "public"."berth_pdf_versions"("id")
ON DELETE SET NULL ON UPDATE NO ACTION;

View File

@@ -0,0 +1,59 @@
CREATE TABLE "brochure_versions" (
"id" text PRIMARY KEY NOT NULL,
"brochure_id" text NOT NULL,
"version_number" integer NOT NULL,
"storage_key" text NOT NULL,
"file_name" text NOT NULL,
"file_size_bytes" integer NOT NULL,
"content_sha256" text NOT NULL,
"uploaded_by" text NOT NULL,
"uploaded_at" timestamp with time zone DEFAULT now() NOT NULL,
"download_url_expires_at" timestamp with time zone
);
--> statement-breakpoint
CREATE TABLE "brochures" (
"id" text PRIMARY KEY NOT NULL,
"port_id" text NOT NULL,
"label" text NOT NULL,
"description" text,
"is_default" boolean DEFAULT false NOT NULL,
"archived_at" timestamp with time zone,
"created_by" text NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "document_sends" (
"id" text PRIMARY KEY NOT NULL,
"port_id" text NOT NULL,
"client_id" text,
"interest_id" text,
"recipient_email" text NOT NULL,
"document_kind" text NOT NULL,
"berth_id" text,
"berth_pdf_version_id" text,
"brochure_id" text,
"brochure_version_id" text,
"body_markdown" text,
"sent_by_user_id" text NOT NULL,
"from_address" text NOT NULL,
"sent_at" timestamp with time zone DEFAULT now() NOT NULL,
"message_id" text,
"fallback_to_link_reason" text,
"failed_at" timestamp with time zone,
"error_reason" text
);
--> statement-breakpoint
ALTER TABLE "brochure_versions" ADD CONSTRAINT "brochure_versions_brochure_id_brochures_id_fk" FOREIGN KEY ("brochure_id") REFERENCES "public"."brochures"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "brochures" ADD CONSTRAINT "brochures_port_id_ports_id_fk" FOREIGN KEY ("port_id") REFERENCES "public"."ports"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "document_sends" ADD CONSTRAINT "document_sends_port_id_ports_id_fk" FOREIGN KEY ("port_id") REFERENCES "public"."ports"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "document_sends" ADD CONSTRAINT "document_sends_client_id_clients_id_fk" FOREIGN KEY ("client_id") REFERENCES "public"."clients"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "document_sends" ADD CONSTRAINT "document_sends_interest_id_interests_id_fk" FOREIGN KEY ("interest_id") REFERENCES "public"."interests"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "document_sends" ADD CONSTRAINT "document_sends_berth_id_berths_id_fk" FOREIGN KEY ("berth_id") REFERENCES "public"."berths"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "document_sends" ADD CONSTRAINT "document_sends_brochure_id_brochures_id_fk" FOREIGN KEY ("brochure_id") REFERENCES "public"."brochures"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "document_sends" ADD CONSTRAINT "document_sends_brochure_version_id_brochure_versions_id_fk" FOREIGN KEY ("brochure_version_id") REFERENCES "public"."brochure_versions"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "idx_brochure_versions_brochure" ON "brochure_versions" USING btree ("brochure_id","uploaded_at");--> statement-breakpoint
CREATE INDEX "idx_brochures_port" ON "brochures" USING btree ("port_id");--> statement-breakpoint
CREATE INDEX "idx_ds_client" ON "document_sends" USING btree ("client_id","sent_at");--> statement-breakpoint
CREATE INDEX "idx_ds_interest" ON "document_sends" USING btree ("interest_id","sent_at");--> statement-breakpoint
CREATE INDEX "idx_ds_berth" ON "document_sends" USING btree ("berth_id","sent_at");--> statement-breakpoint
CREATE INDEX "idx_ds_port" ON "document_sends" USING btree ("port_id","sent_at");

View File

@@ -0,0 +1 @@
CREATE UNIQUE INDEX "idx_brochures_one_default_per_port" ON "brochures" USING btree ("port_id") WHERE "brochures"."is_default" = true AND "brochures"."archived_at" IS NULL;

View File

@@ -0,0 +1 @@
ALTER TABLE "expenses" ADD COLUMN "no_receipt_acknowledged" boolean DEFAULT false NOT NULL;

View File

@@ -0,0 +1,29 @@
-- Audit follow-up: 0024 only normalized rows that contained a literal
-- hyphen (`A-01`), but the audit caught that legacy NocoDB exports also
-- produced un-hyphenated padded forms (`A01`). Those rows skipped the
-- 0024 rewrite and remained non-canonical, which would break the public
-- /berths/:mooringNumber lookup (the route gates on `^[A-Z]+\d+$`).
--
-- This migration re-runs the rewrite with a WHERE clause broadened to
-- catch BOTH variants:
-- - hyphenated padded ("A-01") ← redundant after 0024 but harmless
-- - un-hyphenated padded ("A01")
-- Rows already in canonical form match `^[A-Z]+\d+$`, so the second WHERE
-- predicate (`!~ '^[A-Z]+\d+$'`) excludes them and the UPDATE (via
-- regexp_replace) never touches them.
UPDATE berths
SET mooring_number = regexp_replace(mooring_number, '^([A-Z]+)-?0*(\d+)$', '\1\2')
WHERE mooring_number ~ '^[A-Z]+-?0*\d+$'
AND mooring_number !~ '^[A-Z]+\d+$';
DO $$
DECLARE
bad_count integer;
BEGIN
SELECT count(*) INTO bad_count
FROM berths
WHERE mooring_number !~ '^[A-Z]+\d+$';
IF bad_count > 0 THEN
RAISE NOTICE 'Post-rewrite: % rows still do not match ^[A-Z]+\d+$ - manual review needed', bad_count;
END IF;
END $$;
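For reference, the normalization this migration performs can be mirrored as a pure TypeScript helper (a hypothetical sketch, not part of this diff; the regex is the same one passed to `regexp_replace` above):

```typescript
// Mirrors migration 0034: strip an optional hyphen and leading zeros from
// the numeric suffix, e.g. "A-01" -> "A1", "A01" -> "A1". Inputs that do
// not match the legacy pattern are returned unchanged, just as the SQL
// WHERE clause skips them.
function normalizeMooringNumber(raw: string): string {
  const m = /^([A-Z]+)-?0*(\d+)$/.exec(raw);
  return m ? `${m[1]}${m[2]}` : raw;
}
```

Note the already-canonical case (`"C12"`) falls through unchanged because the `0*` group matches zero characters.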

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -169,6 +169,83 @@
"when": 1777927586934,
"tag": "0023_omniscient_reaper",
"breakpoints": true
},
{
"idx": 24,
"version": "7",
"when": 1777938954111,
"tag": "0024_normalize_mooring_numbers",
"breakpoints": true
},
{
"idx": 25,
"version": "7",
"when": 1777939212954,
"tag": "0025_berth_pricing_columns",
"breakpoints": true
},
{
"idx": 26,
"version": "7",
"when": 1777939906731,
"tag": "0026_client_contacts_one_primary_per_channel",
"breakpoints": true
},
{
"idx": 27,
"version": "7",
"when": 1777939914252,
"tag": "0027_backfill_nationality_iso_from_phone",
"breakpoints": true
},
{
"idx": 28,
"version": "7",
"when": 1777940421236,
"tag": "0028_interest_berths_junction",
"breakpoints": true
},
{
"idx": 29,
"version": "7",
"when": 1777941465866,
"tag": "0029_puzzling_romulus",
"breakpoints": true
},
{
"idx": 30,
"version": "7",
"when": 1777944021221,
"tag": "0030_berth_pdf_versions",
"breakpoints": true
},
{
"idx": 31,
"version": "7",
"when": 1777944191753,
"tag": "0031_brochures_and_document_sends",
"breakpoints": true
},
{
"idx": 32,
"version": "7",
"when": 1777946048910,
"tag": "0032_brochures_one_default_per_port_and_storage_fixes",
"breakpoints": true
},
{
"idx": 33,
"version": "7",
"when": 1777948521076,
"tag": "0033_expense_no_receipt_acknowledged",
"breakpoints": true
},
{
"idx": 34,
"version": "7",
"when": 1778000000000,
"tag": "0034_normalize_mooring_numbers_broaden",
"breakpoints": true
}
]
}

View File

@@ -50,9 +50,19 @@ export const berths = pgTable(
access: text('access'),
price: numeric('price'),
priceCurrency: text('price_currency').notNull().default('USD'),
// Lease/rental rates surfaced by the per-berth PDFs (Phase 6b). Null
// until reps upload PDFs; rendered on the berth detail page with a
// "Pricing data may be stale" chip once pricing_valid_until is in the past.
weeklyRateHighUsd: numeric('weekly_rate_high_usd'),
weeklyRateLowUsd: numeric('weekly_rate_low_usd'),
dailyRateHighUsd: numeric('daily_rate_high_usd'),
dailyRateLowUsd: numeric('daily_rate_low_usd'),
pricingValidUntil: date('pricing_valid_until'),
bowFacing: text('bow_facing'),
berthApproved: boolean('berth_approved').default(false),
tenureType: text('tenure_type').notNull().default('permanent'), // permanent, fixed_term
// permanent, fixed_term, fee_simple, strata_lot (the last two map to
// the Fee Simple / Strata Lot tenures shown in the per-berth PDFs).
tenureType: text('tenure_type').notNull().default('permanent'),
tenureYears: integer('tenure_years'),
tenureStartDate: date('tenure_start_date'),
tenureEndDate: date('tenure_end_date'),
@@ -62,6 +72,15 @@ export const berths = pgTable(
// Optional override flag carried over from NocoDB ("auto" or null in legacy data).
// Reserved for future "manual override" semantics; not surfaced in the UI today.
statusOverrideMode: text('status_override_mode'),
// Set by scripts/import-berths-from-nocodb.ts. The import compares this
// against updated_at to detect human edits made after the last import,
// so re-running the import doesn't clobber CRM-side overrides.
lastImportedAt: timestamp('last_imported_at', { withTimezone: true }),
// Pointer to the active per-berth PDF version (Phase 6b). Null until a
// rep uploads the first PDF; a later rollback can re-target this column
// to any prior `berth_pdf_versions.id`. The full history lives in the
// junction table — this column is just the "current" pointer.
currentPdfVersionId: text('current_pdf_version_id'),
createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp('updated_at', { withTimezone: true }).notNull().defaultNow(),
},
@@ -73,6 +92,17 @@ export const berths = pgTable(
],
);
// Note: `berths.current_pdf_version_id` has an `ON DELETE SET NULL` FK to
// `berth_pdf_versions.id` installed by migration 0030. The column is left
// without a `.references()` / `foreignKey()` declaration in the Drizzle
// schema because the two tables form a circular FK (berth_pdf_versions →
// berths), and Drizzle's relation inference doesn't tolerate the cycle
// when both sides are declared via column-level `.references()`. The
// migration chain authoritatively maintains the constraint; a fresh
// `db:push` against an empty DB would skip the FK and require a follow-up
// generated migration to add it back. This is acceptable because we
// always apply migrations in order in dev/CI/prod.
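The `lastImportedAt` vs `updated_at` comparison described in the column comment above can be sketched as a pure helper (hypothetical names; the real check lives in `scripts/import-berths-from-nocodb.ts`):

```typescript
// A berth row is safe for a re-import to overwrite only when nothing has
// touched it since the last import. Assumption: every human edit bumps
// updated_at, while the importer writes last_imported_at == updated_at.
function isSafeToReimport(
  lastImportedAt: Date | null,
  updatedAt: Date,
): boolean {
  // Never imported before: nothing CRM-side to clobber.
  if (lastImportedAt === null) return true;
  // A human edit after the last import pushes updated_at past it.
  return updatedAt.getTime() <= lastImportedAt.getTime();
}
```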
export const berthMapData = pgTable(
'berth_map_data',
{
@@ -167,6 +197,46 @@ export const berthMaintenanceLog = pgTable(
(table) => [index('idx_bml_berth').on(table.berthId), index('idx_bml_port').on(table.portId)],
);
/**
* Per-berth PDF version history (Phase 6b — see plan §3.3 / §4.7b).
*
* Each upload creates a new row with a monotonic `versionNumber` per berth.
* The active version is referenced by `berths.current_pdf_version_id`. The
* storage_key points at the file in the active `StorageBackend` (s3/filesystem),
* which is resolved at access time via `getStorageBackend()`.
*
* `parseResults` captures what the 3-tier reverse parser extracted at upload
* time plus any conflicts the rep resolved in the diff dialog. Kept as audit
* trail; rolling back to a prior version does NOT replay these (per §14.6).
*/
export const berthPdfVersions = pgTable(
'berth_pdf_versions',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
berthId: text('berth_id')
.notNull()
.references(() => berths.id, { onDelete: 'cascade' }),
versionNumber: integer('version_number').notNull(),
/** Object key in the active storage backend (renamed from `s3_key` per §4.7a). */
storageKey: text('storage_key').notNull(),
fileName: text('file_name').notNull(),
fileSizeBytes: integer('file_size_bytes').notNull(),
contentSha256: text('content_sha256').notNull(),
uploadedBy: text('uploaded_by').notNull(),
uploadedAt: timestamp('uploaded_at', { withTimezone: true }).notNull().defaultNow(),
/** Cached signed-URL expiry per §11.1 — re-sign only when within 1h of expiry. */
downloadUrlExpiresAt: timestamp('download_url_expires_at', { withTimezone: true }),
/** { engine: 'acroform'|'ocr'|'ai', extracted: {...}, conflicts: [...], appliedFields: [...] } */
parseResults: jsonb('parse_results'),
},
(table) => [
uniqueIndex('berth_pdf_versions_berth_version_idx').on(table.berthId, table.versionNumber),
index('idx_bpv_berth').on(table.berthId, table.uploadedAt),
],
);
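Allocating the monotonic per-berth `versionNumber` might look like the following pure sketch (an assumption about the service layer, not code from this diff; the composite unique index `berth_pdf_versions_berth_version_idx` is the real concurrency guard):

```typescript
// Compute the next version number from a berth's existing versions. If two
// uploads race and compute the same number, the (berth_id, version_number)
// unique index fails the loser's insert, which can then retry.
function nextVersionNumber(existingVersions: number[]): number {
  return existingVersions.length === 0 ? 1 : Math.max(...existingVersions) + 1;
}
```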
export const berthTags = pgTable(
'berth_tags',
{
@@ -188,3 +258,5 @@ export type BerthWaitingList = typeof berthWaitingList.$inferSelect;
export type NewBerthWaitingList = typeof berthWaitingList.$inferInsert;
export type BerthMaintenanceLog = typeof berthMaintenanceLog.$inferSelect;
export type NewBerthMaintenanceLog = typeof berthMaintenanceLog.$inferInsert;
export type BerthPdfVersion = typeof berthPdfVersions.$inferSelect;
export type NewBerthPdfVersion = typeof berthPdfVersions.$inferInsert;

View File

@@ -0,0 +1,146 @@
import {
pgTable,
text,
boolean,
integer,
timestamp,
index,
uniqueIndex,
} from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';
import { ports } from './ports';
import { clients } from './clients';
import { interests } from './interests';
import { berths } from './berths';
/**
* Port-wide brochures (Phase 7 — see plan §3.3 / §4.8).
*
* Each port can have multiple brochures (e.g. "General", "Investor Pack")
* with one marked as `isDefault`. Archived brochures stay queryable for
* audit purposes but are hidden from the send-out picker.
*/
export const brochures = pgTable(
'brochures',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
portId: text('port_id')
.notNull()
.references(() => ports.id, { onDelete: 'cascade' }),
label: text('label').notNull(),
description: text('description'),
isDefault: boolean('is_default').notNull().default(false),
archivedAt: timestamp('archived_at', { withTimezone: true }),
createdBy: text('created_by').notNull(),
createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
},
(table) => [
index('idx_brochures_port').on(table.portId),
// At most one default brochure per port (excluding archived rows).
// Service-layer "demote prior default then insert" is correct in the
// single-writer case, but two concurrent createBrochure(isDefault: true)
// calls can both pass the read-then-write check; with this index the
// race loser fails on insert instead of leaving two defaults.
uniqueIndex('idx_brochures_one_default_per_port')
.on(table.portId)
.where(sql`${table.isDefault} = true AND ${table.archivedAt} IS NULL`),
],
);
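The demote-then-insert step guarded by the partial unique index above could be modelled as a pure function over a port's existing rows (hypothetical row shape for illustration; the index remains the backstop against concurrent writers):

```typescript
interface BrochureRow {
  id: string;
  portId: string;
  isDefault: boolean;
  archivedAt: Date | null;
}

// Given a port's existing brochures, return the ids that must be demoted
// (is_default -> false) before inserting a new default. Archived defaults
// are ignored, matching the index's WHERE clause.
function defaultsToDemote(existing: BrochureRow[], portId: string): string[] {
  return existing
    .filter((b) => b.portId === portId && b.isDefault && b.archivedAt === null)
    .map((b) => b.id);
}
```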
/**
* Versioned brochure files. Identical lifecycle to `berth_pdf_versions`:
* each upload creates a new immutable row with a monotonic version number
* per brochure. `storageKey` follows the §4.7a renamed convention.
*/
export const brochureVersions = pgTable(
'brochure_versions',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
brochureId: text('brochure_id')
.notNull()
.references(() => brochures.id, { onDelete: 'cascade' }),
versionNumber: integer('version_number').notNull(),
/** Object key in the active storage backend (renamed from `s3_key` per §4.7a). */
storageKey: text('storage_key').notNull(),
fileName: text('file_name').notNull(),
fileSizeBytes: integer('file_size_bytes').notNull(),
contentSha256: text('content_sha256').notNull(),
uploadedBy: text('uploaded_by').notNull(),
uploadedAt: timestamp('uploaded_at', { withTimezone: true }).notNull().defaultNow(),
/** Cached signed-URL expiry per §11.1 — re-sign only when within 1h of expiry. */
downloadUrlExpiresAt: timestamp('download_url_expires_at', { withTimezone: true }),
},
(table) => [index('idx_brochure_versions_brochure').on(table.brochureId, table.uploadedAt)],
);
/**
* Send-out audit log for berth PDFs and brochures (Phase 7 — plan §3.3).
*
* One row per recipient per send. `documentKind` discriminates between
* `'berth_pdf'` and `'brochure'`; the corresponding `*_version_id` column
* pins the exact version sent.
*
* `berthPdfVersionId` is intentionally a plain text column (no FK) — the
* referenced table `berth_pdf_versions` is owned by Phase 6b. Loose coupling
* keeps the two phases independent (per Phase 7 task brief).
*
* `failedAt` and `errorReason` capture send failures (SMTP auth, transport
* errors). Failed sends are still written so reps can see "I clicked send
* but it didn't go" in the timeline (per §14.7).
*/
export const documentSends = pgTable(
'document_sends',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
portId: text('port_id')
.notNull()
.references(() => ports.id),
/** Either client_id or interest_id is set (or both). */
clientId: text('client_id').references(() => clients.id),
interestId: text('interest_id').references(() => interests.id),
recipientEmail: text('recipient_email').notNull(),
/** 'berth_pdf' | 'brochure' */
documentKind: text('document_kind').notNull(),
berthId: text('berth_id').references(() => berths.id),
/** Forward FK ref — berth_pdf_versions defined in Phase 6b. Loose-coupled. */
berthPdfVersionId: text('berth_pdf_version_id'),
brochureId: text('brochure_id').references(() => brochures.id),
brochureVersionId: text('brochure_version_id').references(() => brochureVersions.id),
/** Exact body used (after merge-field expansion + sanitization). */
bodyMarkdown: text('body_markdown'),
sentByUserId: text('sent_by_user_id').notNull(),
fromAddress: text('from_address').notNull(),
sentAt: timestamp('sent_at', { withTimezone: true }).notNull().defaultNow(),
/** SMTP provider message-id for deliverability tracking. */
messageId: text('message_id'),
/** When the initial send had its attachment dropped because the SMTP server
* rejected the size (552 etc.) and the system retried with a download
* link, this captures the rejection reason for ops visibility. Null when
* the original send went through as-is. */
fallbackToLinkReason: text('fallback_to_link_reason'),
/** Set when the SMTP send transaction itself failed (auth/transport/etc). */
failedAt: timestamp('failed_at', { withTimezone: true }),
/** Human-readable failure reason; only meaningful when failedAt is non-null. */
errorReason: text('error_reason'),
},
(t) => [
index('idx_ds_client').on(t.clientId, t.sentAt),
index('idx_ds_interest').on(t.interestId, t.sentAt),
index('idx_ds_berth').on(t.berthId, t.sentAt),
index('idx_ds_port').on(t.portId, t.sentAt),
],
);
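The attachment-size fallback captured by `fallback_to_link_reason` vs the terminal failure captured by `failed_at`/`error_reason` might be decided along these lines (a hedged sketch; SMTP reply codes vary by provider, and 552 is only the classic "message too large" rejection assumed here):

```typescript
// Classify an SMTP rejection: a size rejection (552) means "retry without
// the attachment and record why"; anything else is terminal for this send
// and is written to failed_at/error_reason so reps see it in the timeline.
function classifySmtpFailure(
  code: number,
  reply: string,
): { kind: 'retry_as_link' | 'failed'; reason: string } {
  const reason = `${code} ${reply}`;
  if (code === 552) return { kind: 'retry_as_link', reason };
  return { kind: 'failed', reason };
}
```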
export type Brochure = typeof brochures.$inferSelect;
export type NewBrochure = typeof brochures.$inferInsert;
export type BrochureVersion = typeof brochureVersions.$inferSelect;
export type NewBrochureVersion = typeof brochureVersions.$inferInsert;
export type DocumentSend = typeof documentSends.$inferSelect;
export type NewDocumentSend = typeof documentSends.$inferInsert;

View File

@@ -77,6 +77,11 @@ export const clientContacts = pgTable(
index('idx_cc_phone')
.on(table.channel, table.value)
.where(sql`${table.channel} = 'phone'`),
// At most one is_primary=true per (client_id, channel). Prevents
// ambiguity when the /clients list pulls "the" primary phone/email.
uniqueIndex('idx_cc_one_primary_per_channel')
.on(table.clientId, table.channel)
.where(sql`${table.isPrimary} = true`),
],
);

View File

@@ -1,6 +1,7 @@
import {
pgTable,
text,
boolean,
numeric,
integer,
timestamp,
@@ -36,6 +37,14 @@ export const expenses = pgTable(
expenseDate: timestamp('expense_date', { withTimezone: true }).notNull(),
description: text('description'),
receiptFileIds: text('receipt_file_ids').array(), // references to files table
/**
* True when the rep deliberately created the expense WITHOUT a receipt
* (e.g. the receipt was lost or never issued). Surfaces a warning at
* creation time AND in the PDF export — the legacy parent-company flow
* may refuse to reimburse expenses without proof, so the warning is
* load-bearing for ops.
*/
noReceiptAcknowledged: boolean('no_receipt_acknowledged').notNull().default(false),
paymentStatus: text('payment_status').default('unpaid'), // unpaid, paid, partial
paymentDate: date('payment_date'),
paymentReference: text('payment_reference'),

View File

@@ -25,6 +25,9 @@ export * from './reservations';
// Documents & Files
export * from './documents';
// Brochures + send-outs (Phase 7)
export * from './brochures';
// Financial
export * from './financial';

View File

@@ -1,6 +1,18 @@
import { pgTable, text, boolean, integer, timestamp, primaryKey, index } from 'drizzle-orm/pg-core';
import {
pgTable,
text,
boolean,
integer,
numeric,
timestamp,
primaryKey,
index,
uniqueIndex,
} from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';
import { ports } from './ports';
import { clients } from './clients';
import { berths } from './berths';
// Pipeline stages: open, details_sent, in_communication, eoi_sent, eoi_signed, deposit_10pct, contract_sent, contract_signed, completed
@@ -16,7 +28,6 @@ export const interests = pgTable(
clientId: text('client_id')
.notNull()
.references(() => clients.id),
berthId: text('berth_id'), // nullable - FK to berths defined in berths.ts, added via relation
yachtId: text('yacht_id'), // FK added via relation; nullable until pipeline leaves 'open'
pipelineStage: text('pipeline_stage').notNull().default('open'),
leadCategory: text('lead_category'), // general_interest, specific_qualified, hot_lead
@@ -47,6 +58,11 @@ export const interests = pgTable(
/** When the outcome was decided. Lets us age 'how long ago did we lose'. */
outcomeAt: timestamp('outcome_at', { withTimezone: true }),
notes: text('notes'),
/** Recommender inputs (imperial feet). The resolver treats nulls as "no
* constraint" on that axis and shows a banner prompting the rep to add the
* missing dimension. */
desiredLengthFt: numeric('desired_length_ft'),
desiredWidthFt: numeric('desired_width_ft'),
desiredDraftFt: numeric('desired_draft_ft'),
archivedAt: timestamp('archived_at', { withTimezone: true }),
createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp('updated_at', { withTimezone: true }).notNull().defaultNow(),
@@ -54,7 +70,6 @@ export const interests = pgTable(
(table) => [
index('idx_interests_port').on(table.portId),
index('idx_interests_client').on(table.clientId),
index('idx_interests_berth').on(table.berthId),
index('idx_interests_yacht').on(table.yachtId),
index('idx_interests_stage').on(table.portId, table.pipelineStage),
index('idx_interests_archived').on(table.portId, table.archivedAt),
@@ -62,6 +77,59 @@ export const interests = pgTable(
],
);
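The nulls-as-no-constraint resolution described for the `desired_*_ft` columns can be sketched as a pure predicate (hypothetical shapes; the real resolver is part of the Phase 4 recommender, and treating an unknown berth dimension as a pass is an assumption made here for illustration):

```typescript
interface Dims {
  lengthFt: number | null;
  widthFt: number | null;
  draftFt: number | null;
}

// A berth passes a constraint axis when the rep stated no preference
// (desired null = "no constraint") or the berth dimension is unknown
// (don't silently exclude; the UI banner flags the gap instead).
function fitsDesiredDims(berth: Dims, desired: Dims): boolean {
  const axes: (keyof Dims)[] = ['lengthFt', 'widthFt', 'draftFt'];
  return axes.every((axis) => {
    const have = berth[axis];
    const want = desired[axis];
    if (want === null || have === null) return true;
    return have >= want;
  });
}
```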
/**
* Many-to-many junction between interests and berths.
*
* Replaces the old single-berth `interests.berth_id` column. Each row
* carries three role flags so a rep can model "actively pitching this
* berth" vs "covered by the EOI bundle but not pitched" vs "primary
* berth for the deal" independently:
*
* - is_primary : at most one row per interest is the primary;
* templates / forms / "the berth for this deal"
* semantics resolve through this row.
* - is_specific_interest : true = berth shows as "Under Offer" on the
* public map. false = legal/EOI-only link.
* - is_in_eoi_bundle : covered by the interest's EOI signature.
*
* EOI bypass: when the interest has a signed primary EOI but a specific
* berth in the bundle still needs its own EOI, a rep records the bypass
* reason here.
*/
export const interestBerths = pgTable(
'interest_berths',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
interestId: text('interest_id')
.notNull()
.references(() => interests.id, { onDelete: 'cascade' }),
berthId: text('berth_id')
.notNull()
.references(() => berths.id, { onDelete: 'restrict' }),
isPrimary: boolean('is_primary').notNull().default(false),
isSpecificInterest: boolean('is_specific_interest').notNull().default(true),
isInEoiBundle: boolean('is_in_eoi_bundle').notNull().default(false),
eoiBypassReason: text('eoi_bypass_reason'),
eoiBypassedBy: text('eoi_bypassed_by'),
eoiBypassedAt: timestamp('eoi_bypassed_at', { withTimezone: true }),
addedBy: text('added_by'),
addedAt: timestamp('added_at', { withTimezone: true }).notNull().defaultNow(),
notes: text('notes'),
},
(table) => [
uniqueIndex('idx_ib_interest_berth').on(table.interestId, table.berthId),
uniqueIndex('idx_ib_one_primary')
.on(table.interestId)
.where(sql`${table.isPrimary} = true`),
index('idx_ib_berth').on(table.berthId),
index('idx_ib_specific')
.on(table.berthId)
.where(sql`${table.isSpecificInterest} = true`),
],
);
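Resolving "the berth for this deal" through the junction rows, as the doc comment above describes, reduces to a small lookup (hypothetical row shape for illustration):

```typescript
interface InterestBerthRow {
  berthId: string;
  isPrimary: boolean;
}

// The partial unique index idx_ib_one_primary guarantees at most one
// primary row per interest, so `find` is unambiguous; null means the rep
// has not picked a primary berth yet.
function resolvePrimaryBerthId(rows: InterestBerthRow[]): string | null {
  const primary = rows.find((r) => r.isPrimary);
  return primary ? primary.berthId : null;
}
```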
export const interestNotes = pgTable(
'interest_notes',
{
@@ -96,3 +164,5 @@ export type Interest = typeof interests.$inferSelect;
export type NewInterest = typeof interests.$inferInsert;
export type InterestNote = typeof interestNotes.$inferSelect;
export type NewInterestNote = typeof interestNotes.$inferInsert;
export type InterestBerth = typeof interestBerths.$inferSelect;
export type NewInterestBerth = typeof interestBerths.$inferInsert;

View File

@@ -18,7 +18,7 @@ import {
} from './clients';
// Interests
import { interests, interestNotes, interestTags } from './interests';
import { interests, interestNotes, interestTags, interestBerths } from './interests';
// Yachts
import { yachts, yachtOwnershipHistory, yachtNotes, yachtTags } from './yachts';
@@ -40,6 +40,7 @@ import {
berthWaitingList,
berthMaintenanceLog,
berthTags,
berthPdfVersions,
} from './berths';
// Reservations
@@ -57,6 +58,9 @@ import {
formSubmissions,
} from './documents';
// Brochures + send-outs (Phase 7)
import { brochures, brochureVersions, documentSends } from './brochures';
// Financial
import { expenses, invoices, invoiceLineItems, invoiceExpenses } from './financial';
@@ -260,10 +264,6 @@ export const interestsRelations = relations(interests, ({ one, many }) => ({
fields: [interests.clientId],
references: [clients.id],
}),
berth: one(berths, {
fields: [interests.berthId],
references: [berths.id],
}),
yacht: one(yachts, {
fields: [interests.yachtId],
references: [yachts.id],
@@ -274,6 +274,18 @@ export const interestsRelations = relations(interests, ({ one, many }) => ({
reminders: many(reminders),
berthRecommendations: many(berthRecommendations),
formSubmissions: many(formSubmissions),
interestBerths: many(interestBerths),
}));
export const interestBerthsRelations = relations(interestBerths, ({ one }) => ({
interest: one(interests, {
fields: [interestBerths.interestId],
references: [interests.id],
}),
berth: one(berths, {
fields: [interestBerths.berthId],
references: [berths.id],
}),
}));
export const interestNotesRelations = relations(interestNotes, ({ one }) => ({
@@ -401,8 +413,21 @@ export const berthsRelations = relations(berths, ({ one, many }) => ({
waitingList: many(berthWaitingList),
maintenanceLogs: many(berthMaintenanceLog),
tags: many(berthTags),
interests: many(interests),
interestBerths: many(interestBerths),
reminders: many(reminders),
pdfVersions: many(berthPdfVersions),
currentPdfVersion: one(berthPdfVersions, {
fields: [berths.currentPdfVersionId],
references: [berthPdfVersions.id],
relationName: 'berthCurrentPdfVersion',
}),
}));
export const berthPdfVersionsRelations = relations(berthPdfVersions, ({ one }) => ({
berth: one(berths, {
fields: [berthPdfVersions.berthId],
references: [berths.id],
}),
}));
export const berthMapDataRelations = relations(berthMapData, ({ one }) => ({
@@ -861,3 +886,49 @@ export const residentialInterestsRelations = relations(residentialInterests, ({
references: [residentialClients.id],
}),
}));
// ─── Brochures + send-outs (Phase 7) ──────────────────────────────────────────
export const brochuresRelations = relations(brochures, ({ one, many }) => ({
port: one(ports, {
fields: [brochures.portId],
references: [ports.id],
}),
versions: many(brochureVersions),
sends: many(documentSends),
}));
export const brochureVersionsRelations = relations(brochureVersions, ({ one, many }) => ({
brochure: one(brochures, {
fields: [brochureVersions.brochureId],
references: [brochures.id],
}),
sends: many(documentSends),
}));
export const documentSendsRelations = relations(documentSends, ({ one }) => ({
port: one(ports, {
fields: [documentSends.portId],
references: [ports.id],
}),
client: one(clients, {
fields: [documentSends.clientId],
references: [clients.id],
}),
interest: one(interests, {
fields: [documentSends.interestId],
references: [interests.id],
}),
berth: one(berths, {
fields: [documentSends.berthId],
references: [berths.id],
}),
brochure: one(brochures, {
fields: [documentSends.brochureId],
references: [brochures.id],
}),
brochureVersion: one(brochureVersions, {
fields: [documentSends.brochureVersionId],
references: [brochureVersions.id],
}),
}));

View File

@@ -49,11 +49,10 @@ import berthSnapshot from './seed-data/berths.json';
// ─── Berth snapshot ──────────────────────────────────────────────────────────
// 117 rows imported from the legacy NocoDB Berths table on 2026-05-03.
// Refresh by re-running the snapshot script (see git history of this file).
// Refresh via `pnpm tsx scripts/import-berths-from-nocodb.ts --update-snapshot`.
type SeedBerth = {
legacyId: number;
mooringNumber: string;
legacyMooringNumber: string;
area: string | null;
status: 'available' | 'under_offer' | 'sold';
lengthFt: number | null;

File diff suppressed because it is too large Load Diff

Some files were not shown because too many files have changed in this diff