23 Commits

Matt Ciaccio
c612bbdfd9 fix(migration): legacy bare-mooring lookup + port-nimara berth backfill
Some checks failed
Build & Push Docker Images / lint (push) Failing after 1m12s
Build & Push Docker Images / build-and-push (push) Has been skipped
Two issues surfaced when applying the migration to dev:

1. Mooring number format mismatch
   The legacy NocoDB Interests table writes bare mooring strings
   ("D32", "B16", "A4"), but the new berths table (mirroring the
   NocoDB Berths snapshot) uses zero-padded dashed form ("D-32",
   "B-16", "A-04"). The interest→berth lookup missed every reference.

   migration-apply.ts now tries the literal value first, then falls
   back to a normalized form via `normalizeLegacyMooring(raw)`:
     "D32" -> "D-32"
     "A4"  -> "A-04"
     "E18" -> "E-18"
   Multi-mooring strings ("A3, D30") are left as-is so they surface in
   the warnings list for human review rather than silently picking one.
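   A sketch of the fallback's shape (hypothetical; the real
   normalizeLegacyMooring in migration-apply.ts is authoritative):

     function normalizeLegacyMooring(raw: string): string | null {
       const value = raw.trim();
       // Multi-mooring strings ("A3, D30") are left unnormalized so
       // they land in the warnings list instead of a silent pick.
       if (/[,;/]/.test(value)) return null;
       const match = /^([A-Za-z])-?(\d{1,3})$/.exec(value);
       if (!match) return null;
       return `${match[1].toUpperCase()}-${match[2].padStart(2, "0")}`;
     }

     normalizeLegacyMooring("D32"); // "D-32"
     normalizeLegacyMooring("A4");  // "A-04"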

2. port-nimara only had the 12 hand-rolled seed berths, not the 117-
   berth NocoDB snapshot
   The mobile-foundation seed only places those 12 in port-nimara; the
   117-berth snapshot was added later but only seeded into Marina
   Azzurra (the secondary test port). Migrated interests reference
   moorings well beyond A-01..D-03, so most lookups failed.

   New scripts/load-berths-to-port-nimara.ts: idempotently loads any
   missing snapshot berths into port-nimara without disturbing the
   existing 12 (skips moorings that already exist). Run once;
   subsequent runs no-op.

Result of full migration run on dev:
  237 clients inserted (out of 245 total — 8 from prior seed)
  406 contacts, 52 addresses, 38 yachts, 252 interests
  27 interest→berth links resolved (only 13 source rows had a Berth
  field set in NocoDB to begin with — most legacy interests are early
  inquiries with no berth assignment)
  1 unresolved warning: source=277 has multi-mooring "A3, D30"

Verified in UI:
  /port-nimara/clients shows real names (John-michael Seelye, Reza
  Amjad, Etiennette Clamouze, …)
  /port-nimara/clients/<id> renders contacts (gmail.com addresses,
  E.164 phones), tab counts (Interests N, Yachts N), pipeline summary
  Dashboard: 245 clients, 266 active interests, $46.5M pipeline value
  Pipeline funnel chart now shows real distribution (180 Open, 45
  EOI Signed, dropoff through stages)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:05:11 +02:00
Matt Ciaccio
872c75f1a1 fix(safety): plug 3 EMAIL_REDIRECT_TO leaks + 10 unit tests + live smoke
Some checks failed
Build & Push Docker Images / lint (push) Failing after 1m10s
Build & Push Docker Images / build-and-push (push) Has been skipped
A pre-import audit caught three places where outbound comms could escape
even with EMAIL_REDIRECT_TO set. Plugged each, added unit tests so the
behavior can't silently regress, and shipped a live smoke script the
operator can run before any production data import.

Leak 1: email-compose.service.ts (per-account user composer)
  Built its own nodemailer transporter and called sendMail() directly,
  bypassing the centralized sendEmail()'s redirect. Now mirrors the same
  redirect: when EMAIL_REDIRECT_TO is set, "to" is rewritten, "cc" is
  dropped, and the subject is prefixed with "[redirected from <orig>]".
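  The mirrored rewrite is roughly this shape (illustrative sketch;
  variable and function names are assumptions, not the repo's code):

    import type { SendMailOptions, Transporter } from "nodemailer";

    async function sendComposed(transporter: Transporter, mail: SendMailOptions) {
      const redirect = process.env.EMAIL_REDIRECT_TO;
      if (redirect) {
        const originalTo = Array.isArray(mail.to) ? mail.to.join(", ") : String(mail.to);
        mail.subject = `[redirected from ${originalTo}] ${mail.subject}`;
        mail.to = redirect; // rewrite the recipient
        delete mail.cc;     // cc is dropped entirely under redirect
      }
      return transporter.sendMail(mail);
    }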

Leak 2: documenso-client.sendDocument()
  Tells Documenso to actually email the document. Recipient emails were
  rerouted at create-time (in pass-3) but a document created BEFORE the
  redirect was turned on could still trigger a real-client email. Now
  short-circuited when the redirect is set — returns the existing doc
  shape so downstream code doesn't see an unexpected null.

Leak 3: documenso-client.sendReminder()
  Same shape as sendDocument: emails a stored recipient address that may
  predate the redirect. Now short-circuits with a warn-level log.

Tests (tests/unit/comms-safety.test.ts):
  - createDocument rewrites recipients
  - generateDocumentFromTemplate rewrites both v1.13 formValues.*Email
    keys AND v2.x recipients[] arrays
  - sendDocument is short-circuited (no /send call)
  - sendReminder is short-circuited (no /remind call)
  - createDocument passes through unchanged when redirect unset
  - sendEmail rewrites to + subject for single recipient
  - sendEmail handles array of recipients (joined into subject prefix)
  - sendEmail passes through unchanged when redirect unset
  - Webhook worker reads process.env.EMAIL_REDIRECT_TO at dispatch time
    (no module-level caching that could miss a runtime flip)

Live smoke (scripts/smoke-test-redirect.ts):
  Monkey-patches nodemailer.createTransport, calls the real sendEmail()
  with a fake real-client address, verifies the captured outbound has
  the right "to" + subject. Run: `pnpm tsx scripts/smoke-test-redirect.ts`.
  Exits non-zero if the redirect failed for any reason — drop-in for a
  pre-deploy check.
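  Condensed shape of the script (import path and sendEmail signature are
  assumptions here; the real thing is scripts/smoke-test-redirect.ts):

    import nodemailer from "nodemailer";

    const captured: any[] = [];
    // Patch before importing anything that builds a transporter.
    (nodemailer as any).createTransport = () => ({
      sendMail: async (mail: any) => {
        captured.push(mail);
        return { messageId: "smoke" };
      },
    });

    const { sendEmail } = await import("../src/lib/email");
    await sendEmail({ to: "real.client@example.com", subject: "smoke", text: "hi" });

    const ok =
      captured[0]?.to === process.env.EMAIL_REDIRECT_TO &&
      String(captured[0]?.subject).startsWith("[redirected from real.client@example.com");
    process.exit(ok ? 0 : 1);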

Verification:
  pnpm exec tsc --noEmit       — 0 errors
  pnpm exec vitest run         — 936/936 (was 926, +10 new safety tests)
  pnpm tsx scripts/smoke-test-redirect.ts — PASS

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:55:53 +02:00
Matt Ciaccio
c45aac551d feat(dedup): wire --apply path for NocoDB migration
Some checks failed
Build & Push Docker Images / lint (push) Successful in 1m12s
Build & Push Docker Images / build-and-push (push) Failing after 3m41s
Completes the migration script's apply phase, which was stubbed at
the P3 ship to defer until after the runtime surfaces (P2) and the
comms safety net were in place. Both prerequisites just landed on
main, so this unblocks the actual data import.

src/lib/dedup/migration-apply.ts (new):
  Idempotent apply driver. Walks the MigrationPlan, inserting clients,
  contacts, addresses, yacht stubs, and interests, threading every
  insert through the migration_source_links ledger so re-runs against
  the same data are safe. Per-entity transactions (not one giant
  transaction) so partial-failure resumption is just "run again."

  Per-entity behavior:
    - clients: idempotent on (source_system, source_id, target_type=client)
      across the entire dedup cluster — if any source row already maps
      to a client, reuse that record.
    - contacts: bulk insert, primary email + primary phone independent.
    - addresses: bulk insert, port_id required (schema enforces it),
      first address marked primary when multiple.
    - yachts: minimal stub when the legacy interest had a yachtName,
      currentOwnerType=client + currentOwnerId=migrated client. Linked
      via migration_source_links target_type=yacht.
    - interests: looks up berthId via mooring number, yachtId via the
      stub above. Carries Documenso ID forward when present.

  surnameToken from PlannedClient is dropped on insert (it's a dedup
  blocking-index artifact; runtime dedup re-derives from fullName).
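  The ledger read that makes re-runs safe looks conceptually like this
  (Drizzle sketch; schema object and helper names are assumptions):

    import { and, eq } from "drizzle-orm";

    // migrationSourceLinks = the Drizzle table object for migration_source_links.
    async function findExistingLink(db: any, sourceId: string, targetType: string) {
      const [link] = await db
        .select()
        .from(migrationSourceLinks)
        .where(and(
          eq(migrationSourceLinks.sourceSystem, "nocodb"),
          eq(migrationSourceLinks.sourceId, sourceId),
          eq(migrationSourceLinks.targetType, targetType),
        ))
        .limit(1);
      return link ?? null; // non-null: already applied, reuse link.targetId
    }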

scripts/migrate-from-nocodb.ts:
  - Removes the "not yet implemented" guard for --apply.
  - Adds EMAIL_REDIRECT_TO precondition gate (sketched below): --apply
    errors out unless the env var is set, OR --unsafe-skip-redirect-check
    is also passed (production cutover only). Refers to
    docs/operations/outbound-comms-safety.md.
  - Re-fetches NocoDB at apply time (rather than reading a saved report
    dir) so the data is always fresh. Re-running is safe via the
    idempotency ledger.
  - Resolves target port via --port-slug (or first port if omitted).
  - Generates a UUID applyId tagged on every link, which pairs with a
    future --rollback flag.
  - Apply summary prints inserted/skipped counts per entity type plus
    the first 20 warnings.
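  The precondition gate, roughly (flag names from this commit; argument
  parsing elided):

    if (args.apply && !process.env.EMAIL_REDIRECT_TO && !args.unsafeSkipRedirectCheck) {
      console.error(
        "Refusing --apply: EMAIL_REDIRECT_TO is not set. See " +
          "docs/operations/outbound-comms-safety.md, or pass " +
          "--unsafe-skip-redirect-check for a production cutover.",
      );
      process.exit(1);
    }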

Verification: 0 tsc errors, 926/926 vitest passing, lint clean.
The actual end-to-end run requires NOCODB_URL + NOCODB_TOKEN in .env
which aren't configured in this checkout; that's the operator's next
step.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 19:53:04 +02:00
Matt Ciaccio
9ad1df85d2 fix(residential): mobile card list alongside the desktop table
Some checks failed
Build & Push Docker Images / lint (push) Successful in 1m12s
Build & Push Docker Images / build-and-push (push) Failing after 5m42s
Both the residential-clients and residential-interests pages rendered
plain HTML <table>s with 5–6 columns directly. At 390px viewport the
header columns clipped at the right edge ("Sour..." on the clients
page), and the interests page showed no usable header at all.

Adds a parallel mobile card list:
  - <table> stays inside `hidden lg:block` (unchanged at lg+)
  - new card list inside `lg:hidden` mirrors the row data:
    - Clients: name + status pill on top, then email · phone ·
      residence · source as a wrap-friendly meta row.
    - Interests: stage label as headline, updated-at on the right,
      preferences (line-clamp-2) and notes (line-clamp-1) below,
      source small at the bottom.
  - Each card is a Link to the detail page (matching the row click
    target on desktop).
  - Empty + loading states render as a centered card on mobile.

This is the same `hidden lg:block` / `lg:hidden` pattern used for the
main /clients and /interests pages. Doesn't refactor to the full
DataView primitive (would mean rebuilding the residential data layer
on TanStack Table) — keeps the change tightly scoped to the visible
output.
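Reduced to a skeleton, the pattern looks like this (component, route,
and class details are illustrative):

  import Link from "next/link";

  export function ResidentialClientsList({ rows }: { rows: { id: string; name: string }[] }) {
    return (
      <>
        <div className="hidden lg:block">{/* existing <table> stays here */}</div>
        <div className="lg:hidden space-y-2">
          {rows.map((row) => (
            <Link
              key={row.id}
              href={`/residential-clients/${row.id}`}
              className="block rounded-lg border p-3"
            >
              {row.name}
            </Link>
          ))}
        </div>
      </>
    );
  }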

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 17:24:58 +02:00
Matt Ciaccio
8e4d2fc5b4 feat(safety): EMAIL_REDIRECT_TO now also pauses Documenso + webhooks
Closes a gap exposed by the comms safety audit: the existing
EMAIL_REDIRECT_TO env var only redirected outbound SMTP via the
sendEmail() bottleneck. Two channels still leaked when set:

  1. Documenso e-signature recipients — Documenso's own server emails
     them on our behalf, so SMTP redirect doesn't help. We were sending
     real client emails to the Documenso REST API, which would then
     deliver to the real client.

  2. Outbound webhooks — fire from the BullMQ worker to user-configured
     URLs. SSRF guard blocks internal hosts but doesn't pause production
     endpoints.

Documenso (src/lib/services/documenso-client.ts):
  - createDocument: rewrite every recipient.email to EMAIL_REDIRECT_TO
    and prefix the recipient.name with the original email so the doc
    is traceable.
  - generateDocumentFromTemplate: same treatment for both v1.13
    formValues.*Email keys and v2.x recipients[]. The redirect happens
    BEFORE the API call, so even Documenso's own retry logic can't
    reach the original recipient.
  - Both paths log when they redirect so it's visible in dev.

Webhooks (src/lib/queue/workers/webhooks.ts):
  - When EMAIL_REDIRECT_TO is set, short-circuit the dispatch and write
    a `dead_letter` row with reason "Skipped: EMAIL_REDIRECT_TO is set,
    outbound comms paused." so the attempt is still visible in the
    deliveries listing.
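  In sketch form (identifiers assumed; recordDelivery is a hypothetical
  stand-in for however the worker writes delivery rows):

    // Inside the BullMQ worker's process function.
    if (process.env.EMAIL_REDIRECT_TO) {
      await recordDelivery({
        deliveryId: job.data.deliveryId,
        status: "dead_letter",
        reason: "Skipped: EMAIL_REDIRECT_TO is set, outbound comms paused.",
      });
      return; // never reaches fetch(webhook.url, ...)
    }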

Doc:
  docs/operations/outbound-comms-safety.md catalogs every outbound
  comms channel (email, Documenso, webhooks, WhatsApp/phone deep-links,
  SMS-not-implemented) and explains how each one respects the env flag.
  Includes a verification checklist to run before any production data
  import + cutover steps for going live.

Single env var EMAIL_REDIRECT_TO now reliably pauses ALL automated
outbound comms. Unset for production.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 17:24:41 +02:00
Matt Ciaccio
78f2f46d41 fix(admin): stack settings rows vertically on phone widths
Inquiry Settings + Business Rules cards used a flex-row layout that
crushed the label column into a narrow vertical stripe at 390px ("Inquiry
/ Contact / Email" wrapping one word per line) while the input took the
right side.

Stack label + helper text above the input on phone widths; restore the
side-by-side row from sm up. Same pattern as the other detail-edit rows
that were fixed in pass-2/pass-3.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 17:24:20 +02:00
Matt Ciaccio
3a9419fe10 chore(scripts): backfill client_contacts.value_e164 from value
One-shot script that walks every phone / whatsapp contact with `value`
set but `value_e164` null and runs the raw value through libphonenumber-js
to produce the canonical E.164 form. Matches the existing dedup
phone-parser shape (script-safe wrapper that loads metadata as raw JSON
to dodge the Node 25 + tsx interop bug).
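The per-row parse step is roughly the following (the repo routes it
through its core-entry wrapper; the default entry point is shown here
for brevity):

  import { parsePhoneNumberFromString } from "libphonenumber-js";

  function toE164(raw: string): { e164: string | null; country: string | null } {
    const parsed = parsePhoneNumberFromString(raw.trim());
    if (!parsed) return { e164: null, country: null };
    // Some reserved ranges (e.g. UK +44 7700 900xxx) yield an E.164
    // form but no country: the second bucket below.
    return { e164: parsed.number, country: parsed.country ?? null };
  }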

Two output buckets:
  - parsed cleanly: e164 + country both resolved (33/36 in dev).
  - parsed e164 only: e164 came back but country didn't (3/36 — the
    UK +44 7700 900xxx fictional/reserved range that libphonenumber
    refuses to assign a country to but still returns a canonical e164
    for). Still safe to write — the e164 form is the canonical one.

Run dry-first, --apply to write:
  pnpm tsx scripts/backfill-phone-e164.ts
  pnpm tsx scripts/backfill-phone-e164.ts --apply

Applied to dev DB this session: 36 rows backfilled, 0 still missing.
Will need to be re-run after any future seed reload that introduces
unparsed phones.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 17:24:08 +02:00
Matt Ciaccio
b703684285 fix(ux): pass-3 — yacht/company headers, reminder filters wrap, client tab counts
Some checks failed
Build & Push Docker Images / lint (push) Successful in 1m14s
Build & Push Docker Images / build-and-push (push) Failing after 4m51s
Five small fixes from the third audit pass on previously-unchecked surfaces:

Yacht detail header (mobile):
  - Stack the action cluster (Edit / Transfer / Archive) below the title
    block on phone widths. Previously the three buttons crowded the right
    side enough to truncate the status pill to "A..." and force the owner
    name to wrap to two lines. Same fix that landed for berth / client /
    company headers.

Company detail header (mobile):
  - Same mobile stacking fix; legal-name + Tax-ID metadata no longer
    wraps awkwardly.

Company detail Incorporation Date (all viewports):
  - Strip the time portion of the ISO timestamp before passing to the
    inline editor. Previously rendered the raw "2019-03-14T00:00:00.000Z"
    Postgres-serialized form. Now reads "2019-03-14" and round-trips
    through the YYYY-MM-DD inline editor cleanly.

Reminders list filter row:
  - Allow flex-wrap on the My/All tabs + status filter + priority filter
    cluster. At 390px, the priority filter dropdown was being pushed off
    the right edge of the screen.

Client detail tab counts:
  - Add interestCount + noteCount to getClientById response, surface as
    badges on the Interests + Notes tabs. Brings them into parity with
    Yachts/Companies/Reservations/Addresses which already showed counts;
    Files + Activity are still stubs and don't get a count yet.

Verification: 0 tsc errors, 926/926 vitest passing, lint clean.

Out of scope (deferred):
  - Residential clients / interests pages still render plain HTML tables
    on phone widths (header columns clip at the right edge). Needs the
    DataView card-on-mobile treatment that the main /clients and
    /interests pages already have. Substantial separate work.
  - Phone contacts in the legacy seed have value set but valueE164 NULL,
    so InlinePhoneField shows "—" even though metadata is technically
    populated. Fix is a one-time backfill via libphonenumber-js, not a
    UI change.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 17:09:27 +02:00
Matt Ciaccio
a792d9a182 fix(ux): pass-2 audit fixes — admin grouping, Duplicates entry, header tooltips
Some checks failed
Build & Push Docker Images / lint (push) Successful in 1m11s
Build & Push Docker Images / build-and-push (push) Failing after 5m45s
Three small but high-leverage fixes from the second audit pass on main:

Admin index (src/app/(dashboard)/[portSlug]/admin/page.tsx):
  - Grouped 21 sections into 7 categories: Access, Configuration, Content,
    Data Quality, Operations, Tenancy, Integrations. Each group has a
    one-line description so first-time admins can orient themselves
    without reading every card.
  - Added the missing Duplicates entry (links to /admin/duplicates from
    the dedup-migration work) under Data Quality.

More sheet (mobile bottom-drawer nav):
  - "Email" -> "Inbox". The page that opens is an email-inbox surface
    (Inbox + Accounts tabs), not a generic email composer. The previous
    label was ambiguous.

Interest detail header (Won / Lost outcome buttons):
  - Added title="Mark as won" / "Close as lost" so the icon-only buttons
    on mobile have a tooltip on long-press / desktop hover.
  - Tightened mobile padding (px-2 vs px-2.5) so the full-text desktop
    labels still fit on sm+ without re-introducing a regression where a
    visible mobile "Won"/"Lost" inline label crowded the right cluster
    enough to push Email/Call/WhatsApp action chips into a vertical
    stack.

Verification: 0 tsc errors, 926/926 vitest passing, lint clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 16:35:32 +02:00
Matt Ciaccio
d7ec2a8507 Merge docs/dedup-migration-design: client dedup + NocoDB migration design doc
Some checks failed
Build & Push Docker Images / lint (push) Successful in 1m18s
Build & Push Docker Images / build-and-push (push) Failing after 3m57s
2026-05-03 16:24:30 +02:00
Matt Ciaccio
cb83b09b2d Merge feat/dedup-migration: client dedup library + NocoDB migration script + admin queue
# Conflicts:
#	.gitignore
#	src/lib/db/migrations/meta/_journal.json
2026-05-03 16:24:13 +02:00
Matt Ciaccio
7574c3b575 chore(migrations): renumber 0020/0021 -> 0021/0022 to avoid clash with berth-parity
berth-schema-parity branch already shipped its own migration 0020 (berth
schema parity: text -> numeric, +status_override_mode). Dedup's two
migrations need to land on top of that, not collide.

Renames:
  0020_unusual_azazel.sql       -> 0021_unusual_azazel.sql
  0021_magenta_madame_hydra.sql -> 0022_magenta_madame_hydra.sql
  meta/0020_snapshot.json       -> meta/0021_snapshot.json
  meta/0021_snapshot.json       -> meta/0022_snapshot.json

_journal.json idx + tag fields updated to match.

Snapshot CONTENTS remain dedup-branch state (no berths-numeric awareness).
A `pnpm drizzle-kit generate` after main merges the berth changes will
produce a consistent forward path; until then the snapshots are slightly
out-of-sync with the post-merge live schema, which is harmless because
the dev DB applies migrations forward, not from snapshots.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 16:22:58 +02:00
Matt Ciaccio
bb105f5365 Merge feat/mobile-ux-polish: berth/header/tab/contacts mobile fixes
# Conflicts:
#	src/components/clients/contacts-editor.tsx
2026-05-03 16:20:12 +02:00
Matt Ciaccio
caafae15dd Merge feat/berth-schema-parity: NocoDB field parity, 117-berth seed, ports pruned to Port Nimara + Amador 2026-05-03 16:18:43 +02:00
Matt Ciaccio
46c7389930 Merge feat/mobile-foundation: 212 commits of mobile foundation, sales UX, audit fixes 2026-05-03 16:18:10 +02:00
Matt Ciaccio
cad55e3565 fix(mobile): clipping, dropdown-tabs and stale phone metadata
Five mobile-UX issues caught in the 2026-05-03 audit, fixed in one pass:

1. SpecRow on berth detail clipped at right edge on phone widths.
   "Length 49.21 ft / 15 r" (the "m" cut off). Mobile-first stack:
   label on top, value full-width below; flex row only from sm up.

2. ResponsiveTabs collapsed to a Select on phone widths, which read like
   a generic dropdown and obscured the existence of peer tabs. Replaced
   with a horizontally-scrollable strip that auto-scrolls the active
   trigger into view (so the user sees neighbors and gets a discovery
   cue that more exists beyond the edge). Removes the phone-only Select
   and unifies the tab UI across viewport sizes.

3. Documents page tab strip ("All / EOI queue / Awaiting them / ...")
   overflowed the 390px viewport because the wrapper was a fixed flex
   row. Same horizontal-scroll fix as (2); inherits because Documents
   uses ResponsiveTabs.

4. Berth detail header: "Change Status" + "Edit" buttons crowded the
   area subtitle on mobile, causing "North Pier" to wrap to two lines
   ("North" / "Pier"). Stacked vertically on phone widths; from sm up
   the buttons sit beside the title block as before.

5. Empty contact rows on client detail rendered a stale "Add tag · star"
   metadata strip even when the contact value was unset, which cluttered
   the row and offered no useful action. The metadata block now only
   shows when contact.value is non-empty; the trash icon stays visible
   so users can clean up the empty placeholder.

Verification:
- pnpm exec vitest run: 858/858 passing
- pnpm exec tsc --noEmit: same 36 errors as baseline (all pre-existing
  on feat/mobile-foundation, none introduced)
- lint clean

Defers:
- Mobile More sheet last-row alignment / "Email" label specificity
- Admin index grouping (Access / System / Configuration / Content)
- Interest detail header icon labels (trophy/X discoverability)
- Pipeline funnel x-axis label abbreviations
- Reminders rail width allocation on dashboard

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 16:03:56 +02:00
Matt Ciaccio
21868ee5fc feat(berths,seed): polish detail display + prune ports to Port Nimara + Amador
Berth detail (src/components/berths/berth-tabs.tsx):
- Numeric display polish, exposed by the new NocoDB-sourced seed:
  - Power capacity now renders with kW unit (e.g. "330 kW")
  - Voltage now renders with V unit (e.g. "480 V")
  - All metric/imperial values rounded to <= 2 decimals
    (was: "62.999112 m" -> now: "62.99 m")
  - Nominal Boat Size shows full ft + m pair (was: ft only)

Seed ports (src/lib/db/seed.ts):
- Drop Marina Azzurra and Harbor Royale; a fresh install now seeds only:
  - Port Nimara  (the real install)
  - Port Amador  (secondary, for multi-tenant isolation tests / Panama
                  scaffolding)
- Existing dev DBs are not touched; this only affects fresh `pnpm db:seed`
  runs. Users wanting to migrate should drop existing rows in the obsolete
  ports manually before re-seeding.

Verification:
- lint clean, tsc unchanged from baseline (36 pre-existing errors), 858/858
  vitest passing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 15:59:36 +02:00
Matt Ciaccio
c7ab816c99 feat(seed): replace 12 hand-rolled berths with 117-row NocoDB snapshot
The old seed only had 12 berths with made-up area names ("North Pier",
"Central Basin", etc.) and placeholder dimensions. Devs now get the real
117 berths exported from the legacy NocoDB Berths table — every editable
column populated with real production values.

What's in the snapshot (src/lib/db/seed-data/berths.json):
- 117 berths total (61 available / 45 under_offer / 11 sold)
- Areas A through E (matches NocoDB single-select)
- All numeric fields filled: length / width / draft (ft + m), water depth,
  nominal boat size, power capacity (kW), voltage (V)
- All NocoDB single-selects filled where present: side pontoon,
  mooring type, cleat/bollard type+capacity, access
- Bow facing, status_override_mode, berth_approved carried forward as-is
- Status normalized to lowercase snake_case ("Under Offer" -> "under_offer")
- Mooring numbers reformatted A1 -> A-01 to keep the existing "Letter-NN"
  convention used elsewhere in the codebase

Pre-sorted to preserve seed semantics:
  idx 0..4   -> 5 available  (small)   -- "open" / "details_sent" interests
  idx 5..9   -> 5 under_offer (medium) -- "eoi_signed" / "deposit" / "contract"
  idx 10..11 -> 2 sold (large)         -- "completed" interests
This means existing interest/reservation seeds that index berthRows[0..11]
keep their semantic alignment without code changes.

End-to-end verified by clearing Marina Azzurra and re-seeding:
  Port "Marina Azzurra" -- 117 berths, 8 clients, 3 companies, 12 yachts,
                           15 interests, 8 reservations

Future devs running `pnpm db:seed` on a fresh DB will now get realistic
berth data automatically.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 15:41:12 +02:00
Matt Ciaccio
e40b6c3d99 feat(berths): full NocoDB field parity, numeric types, sales edit access
Aligns the berths schema with the 117 production rows in NocoDB and exposes
every field for editing via the BerthForm sheet.

Schema (migration 0020):
- power_capacity / voltage / nominal_boat_size / nominal_boat_size_m: text -> numeric
  (NocoDB stores plain numbers; text was wrong shape and broke filter/sort)
- ADD status_override_mode text (1/117 legacy rows have a value; carried
  forward for parity but not yet wired into the UI)
- USING NULLIF(TRIM(...), '')::numeric so legacy whitespace and empty
  strings convert cleanly

Validator + service:
- updateBerthSchema / createBerthSchema use z.coerce.number() for the
  four numeric fields
- berths.service stringifies numeric values for Drizzle's numeric type

Form (src/components/berths/berth-form.tsx):
- adds: nominal boat size (ft/m), water depth (ft/m) + "is minimum" flag,
  side pontoon, cleat type/capacity, bollard type/capacity, bow facing
- converts to typed selects (with NocoDB option lists in src/lib/constants):
  area, side pontoon, mooring type, cleat type/capacity, bollard type/capacity,
  access
- power capacity / voltage become numeric inputs (with kW / V hints)

Permissions (seed.ts + dev DB):
- sales_manager and sales_agent: berths.edit false -> true
  ("sales will sometimes have to update these and I cannot be the only one")
- super_admin / director already had it; viewer stays read-only
- dev DB updated in-place via UPDATE roles ... jsonb_set

Verification:
- pnpm exec vitest run: 858/858 passing
- pnpm exec tsc --noEmit: same 36 errors as baseline (all pre-existing
  on feat/mobile-foundation, none introduced)
- lint clean

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 15:30:32 +02:00
Matt Ciaccio
4bcc7f8be6 feat(dedup): runtime surfaces — merge service, at-create suggestion, admin queue (P2)
Adds the live dedup pipeline on top of the P1 library + P3 migration
script. The new `client/interest` model now actively prevents duplicate
client records at creation time and gives admins a queue to triage
the borderline pairs the at-create check missed.

Three layers, per design §7:

Layer 1 — At-create suggestion
==============================

`GET /api/v1/clients/match-candidates`
  Accepts free-text email / phone / name from the in-flight client
  form, normalizes them via the dedup library, and returns scored
  matches against the port's live client pool. Filters out
  low-confidence noise (the background scoring queue picks those up
  separately). Strict port scoping; never leaks across tenants.

`<DedupSuggestionPanel>` (`src/components/clients/dedup-suggestion-panel.tsx`)
  Debounced React Query hook. Renders nothing for short inputs or
  no useful match. On a high-confidence match it interrupts visually
  with an amber-tinted card and a "Use this client" primary button.
  Medium confidence falls back to a softer "possible match — check
  before creating" treatment.

`<ClientForm>`
  Renders the panel above the form (create path only — skipped on
  edit). New `onUseExistingClient` callback fires when the user
  picks the existing client; the form closes and the parent decides
  what to do (typically: navigate to that client's detail page or
  open the create-interest dialog pre-filled).

Layer 2 — Merge service
=======================

`mergeClients` (`src/lib/services/client-merge.service.ts`)
  The atomic merge primitive that everything else calls. Single
  transaction. Per §6 of the design:

  - Locks both rows (FOR UPDATE) so concurrent merges of the same
    loser fail with a clear error rather than racing.
  - Snapshots the full loser state (contacts / addresses / notes /
    tags / interest+reservation IDs / relationship rows) into the
    `client_merge_log.merge_details` JSONB column for the eventual
    undo flow.
  - Reattaches every loser-side row to the winner: interests,
    reservations, contacts (skipping duplicates by `(channel, value)`),
    addresses, notes, tags (deduped), relationships.
  - Optional `fieldChoices` — per-scalar overrides letting the user
    keep the loser's value for fullName / nationality / preferences /
    timezone / source.
  - Marks the loser archived with `mergedIntoClientId` set (a redirect
    pointer for stragglers; never hard-deleted within the undo window).
  - Resolves any matching `client_merge_candidates` row to status='merged'.
  - Writes audit log entry.
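  The lock-first step, as a Drizzle sketch (table and identifier names
  assumed):

    import { inArray } from "drizzle-orm";

    const locked = await tx
      .select({ id: clients.id })
      .from(clients)
      .where(inArray(clients.id, [winnerId, loserId]))
      .for("update"); // concurrent merges of the same loser now wait, then fail validation
    if (locked.length !== 2) {
      throw new Error("merge aborted: client missing or already merged");
    }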

Schema additions:
  - `clients.merged_into_client_id` (nullable text, indexed) — the
    redirect pointer set on archive.

Tests: 6 cases against a real DB — happy path moves rows + writes log;
self-merge / cross-port / already-merged refused; duplicate-contact
deduped on reattach; fieldChoices copies loser values to winner.

Layer 3 — Admin review queue
============================

`GET /api/v1/admin/duplicates`
  Pending merge candidates (status='pending') for the current port,
  with both client summaries hydrated for side-by-side rendering.
  Skips pairs where one side is already archived/merged.

`POST /api/v1/admin/duplicates/[id]/merge`
  Confirms a candidate. Body picks the winner; the other side
  becomes the loser. Calls into `mergeClients` — the only path that
  writes `client_merge_log`.

`POST /api/v1/admin/duplicates/[id]/dismiss`
  Marks the candidate dismissed. Future scoring runs skip the same
  pair until a score change recreates the row.

`<DuplicatesReviewQueue>` (`/admin/duplicates`)
  Side-by-side card UI for each pending pair. Click a card to pick
  the winner; the other side is automatically the loser. Toolbar:
  "Merge into selected" + "Dismiss". No per-field merge editor in
  this PR — that's a future polish; the simple "pick the better row"
  flow handles ~80% of cases.

Test coverage
=============

11 new integration tests (68 added in this branch total):
  - 6 mergeClients (atomicity, refusal cases, contact dedup,
    fieldChoices)
  - 5 match-candidates API (shape, port scoping, confidence tiers,
    Pattern F false-positive guard)

Full vitest: 926/926 passing (was 858 before the dedup branch).
Lint: clean. tsc: clean for new files (only pre-existing errors in
unrelated `tests/integration/` files remain, same as before this PR).

Out of scope, deferred
======================

- Background scoring cron that populates `client_merge_candidates`
  (the queue is empty until this lands; manual seeding works for
  now via the at-create flow).
- Side-by-side per-field merge editor with checkboxes (the simple
  "pick the winner" UX shipped here covers ~80% of real cases).
- Admin settings UI for tuning the dedup thresholds. Defaults from
  the design (90 / 50) are baked in for now.
- `unmergeClients` (the snapshot is captured in client_merge_log;
  the undo endpoint just hasn't been wired yet).

These are all natural follow-up PRs that don't block shipping the
runtime UX.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 14:59:04 +02:00
Matt Ciaccio
18e5c124b0 feat(dedup): NocoDB migration script + tables (P3 dry-run)
Lands the one-shot migration pipeline from the legacy NocoDB Interests
base into the new client/interest schema. Dry-run mode is fully
operational: pulls the live snapshot, runs the dedup library, and
writes a CSV + Markdown report under .migration/<timestamp>/. The
--apply phase is stubbed for a follow-up PR per the design's P3
implementation sequence.

Schema additions
================

- `client_merge_candidates` — pairs flagged by the background scoring
  job for the /admin/duplicates review queue. Status enum: pending /
  dismissed / merged. Unique-(portId, clientAId, clientBId) so the
  same pair can't surface twice. Empty until P2 lands the cron.
- `migration_source_links` — idempotency ledger. Maps source-system
  rows (NocoDB Interest #624 → new client UUID) so re-running --apply
  against the same dry-run report skips already-imported entities.

Both tables ship with the migration `0020_unusual_azazel.sql` —
already applied to the local dev DB during this commit's preparation.

Library
=======

src/lib/dedup/nocodb-source.ts
  Read-only adapter for the legacy NocoDB v2 API. xc-token auth,
  auto-paginates until isLastPage, captures the table IDs from the
  2026-05-03 audit. `fetchSnapshot()` pulls every relevant table in
  parallel into one in-memory object the transform layer consumes.

src/lib/dedup/migration-transform.ts
  Pure function: NocoDB snapshot in, MigrationPlan out. Per row:
    - normalizes name / email / phone / country via the dedup library
    - parses the legacy DD-MM-YYYY / DD/MM/YYYY / ISO date formats
    - maps the 8-stage `Sales Process Level` enum to the new 9-stage
      pipelineStage
    - filters yacht-name placeholders ('TBC', 'Na', etc.)
    - merges Internal Notes + Extra Comments + Berth Size Desired into
      a single notes blob
  Then runs `findClientMatches` pairwise (with blocking) and
  union-finds clusters of rows whose score crosses the auto-link
  threshold (90). Lower-scoring pairs (50–89) become 'needs review'.
  Each cluster's "lead" row is picked by completeness score with
  recency tie-break.

src/lib/dedup/migration-report.ts
  Writes three artifacts to .migration/<timestamp>/:
    - report.csv  — one row per planned op, RFC-4180 escaped
    - summary.md  — human-skimmable overview
    - plan.json   — full structured plan for the --apply phase
  CSV cells with comma / quote / newline are quoted; internal quotes
  are doubled. No external CSV dep.
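  A minimal RFC-4180 cell escaper matching that rule (hypothetical name;
  the real code is in migration-report.ts):

    function escapeCsvCell(value: string): string {
      return /[",\r\n]/.test(value)
        ? `"${value.replace(/"/g, '""')}"`
        : value;
    }

    escapeCsvCell('Wainstein / 7 Knots, LLC'); // => "Wainstein / 7 Knots, LLC" (quoted)
    escapeCsvCell('said "maybe"');             // => "said ""maybe"""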

src/lib/dedup/phone-parse.ts
  Script-safe wrapper around libphonenumber-js's `core` entry that
  loads `metadata.min.json` directly. The default `index.cjs.js`
  bundled by libphonenumber hits a metadata-shape interop bug under
  Node 25 + tsx (`{ default }` wrapping); core+JSON sidesteps it.
  The dedup `normalizePhone` and `find-matches` both use this wrapper
  now so the same code path runs in vitest, Next.js, and the migration
  CLI without surprises.

src/lib/dedup/normalize.ts
  Tightened country resolution: added Caribbean short-form aliases
  ('antigua' → AG, 'st kitts' → KN, etc.) and a city map covering the
  US locations seen in the NocoDB dump (Boston, Tampa, Fort
  Lauderdale, Port Jefferson, Nantucket). Also relaxed phone parsing
  to drop the `isValid()` strict check — the libphonenumber min build
  rejects many real NANP-territory numbers, and dedup only needs a
  canonical E.164 to compare.

CLI
===

scripts/migrate-from-nocodb.ts
  pnpm tsx scripts/migrate-from-nocodb.ts --dry-run
    → Pulls the live NocoDB base (NOCODB_URL + NOCODB_TOKEN env vars),
       runs the transform, writes report. No DB writes.
  pnpm tsx scripts/migrate-from-nocodb.ts --apply --report .migration/<dir>/
    → Stubbed; exits with `not yet implemented` and a pointer to the
       design doc. Apply phase ships in a follow-up.

Tests
=====

tests/unit/dedup/migration-transform.test.ts (7 cases)
  Fixture-based regression. A frozen 12-row NocoDB snapshot covers
  every duplicate pattern in the design (§1.2). The test asserts:
    - 12 input rows → 7 unique clients (cluster math is right)
    - Patterns A / B / C / E auto-link
    - Pattern F (Etiennette Clamouze) does NOT auto-link
    - Every interest preserved as its own row even when clients merge
    - 8-stage → 9-stage enum mapping is correct per spec
    - Multi-yacht merge (Constanzo CALYPSO + Costanzo GEMINI under one
      client) — the design's signature win
    - Output is deterministic (run twice, identical)

Validation against real data
============================

Ran `pnpm tsx scripts/migrate-from-nocodb.ts --dry-run` against the
live NocoDB. Result on 252 Interests rows:
  - 237 clients (15 merged into 13 clusters)
  - 252 interests (one per source row)
  - 406 contacts, 52 addresses
  - 13 auto-linked clusters (every confirmed cluster from §1.2 audit)
  - 3 pairs flagged for review (Camazou, Zasso, one new)
  - 1 phone placeholder flagged

Total dedup test count: 57 (50 from P1 + 7 fixture tests).
Lint: clean. Tsc: clean for new files.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 14:50:01 +02:00
Matt Ciaccio
8b077e1999 feat(dedup): normalization + match-finding library (P1)
The pure-logic spine of the client deduplication system spec'd in
docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md.
Two modules, JSX-free, vitest-tested against fixtures drawn directly
from real dirty values observed in the legacy NocoDB Interests audit.

src/lib/dedup/normalize.ts
- normalizeName: trims whitespace, replaces \r/\n/\t, intelligently
  title-cases ALL-CAPS surnames while keeping particles (van / de /
  dalla / etc.) lowercase mid-name. Preserves Irish O' surnames and
  the "slash-with-company" structure ("Daniel Wainstein / 7 Knots,
  LLC") seen in production. Returns a surnameToken (lowercased last
  non-particle token) for use as a dedup blocking key.
- normalizeEmail: trim + lowercase + zod email validation. Plus-aliases
  preserved; null on invalid.
- normalizePhone: pre-cleans the input (strips spreadsheet apostrophes,
  carriage returns, dots/dashes/parens, converts 00 prefix to +) then
  delegates to libphonenumber-js. Detects multi-number fields ("a/b",
  "a;b") and placeholder fakes (8+ consecutive zeros, e.g.
  +447000000000). Flags every quirk so the migration report and runtime
  audit log can surface it.
- resolveCountry: maps free-text country/region input to ISO-3166-1
  alpha-2 via alias → exact (against Intl-derived names) → city → fuzzy
  (Levenshtein ≤ 2). Fuzzy is gated by length so 4-char inputs ("Mars")
  don't false-positive against short country names.
- levenshtein: standard iterative implementation, exported for reuse
  by find-matches.

src/lib/dedup/find-matches.ts
- findClientMatches: builds three blocking indexes off the pool (email
  / phone / surname-token), gathers the comparison set via union, and
  scores each candidate via the rule set in design §4.2:
    Email match            +60
    Phone E.164 match      +50  (≥ 8 digits, excludes placeholder zeros)
    Name exact match       +20
    Surname + given fuzzy  +15  (Levenshtein ≤ 1)
    Negative: shared email but different phone country  −15
    Negative: name match but no shared contact          −20
  Score is clamped to [0,100]. Confidence tier ('high' / 'medium' /
  'low') is derived from configurable thresholds passed in by the
  caller — defaults are highScore=90, mediumScore=50.

tests/unit/dedup/normalize.test.ts (38 cases)
Every dirty-data pattern from design §1.3 has a fixture: carriage
returns in names, ALL-CAPS surnames, lowercase entries, particles,
slash-with-company, plus-aliases, capitalized email localparts,
spreadsheet-apostrophe phones, multi-number phones, placeholder
phones, 00-prefix phones, French/UK local-format phones,
Saint-Barthélemy diacritic variants, Kansas City fallback.

tests/unit/dedup/find-matches.test.ts (12 cases)
Each duplicate cluster from design §1.2 has a test:
- Pattern A (Deepak Ramchandani — pure double-submit) → high
- Pattern B (Howard Wiarda — phone format variance) → high
- Pattern C (Nicolas Ruiz — name capitalization) → high
- Pattern D (Chris/Christopher Allen — name shortening) → high
- Pattern E (Christopher Camazou — typo on resubmit) → high or medium
- Pattern E (Constanzo/Costanzo — surname typo, multi-yacht) → high
- Pattern F (Etiennette Clamouze — same name, different country) →
  must NOT auto-merge
- Pattern F (Bruno+Bruce — shared household contact) → no match
- Negative evidence (same email, different phone country) → medium
- Blocking (no shared keys → 0 matches)
- Sort order (high before low)
- Empty pool

Total: 50 new tests, all green. Zero changes to runtime behavior or
schema; unblocks P2 (runtime surfaces) and P3 (NocoDB migration).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 14:28:59 +02:00
Matt Ciaccio
36b92eb827 docs(spec): client deduplication and NocoDB migration design
Captures the audit findings from a 2026-05-03 read-only NocoDB review
plus the algorithm and migration plan for porting the legacy data
into the new client / interest / contacts / addresses model.

Highlights:
- 252 NocoDB Interests rows ≈ ~190–200 unique humans (~20–25% dup
  rate). Six duplicate patterns documented from real data, including
  "same person, multiple yachts" — exactly the case the new
  client/interest split is designed to handle.
- Reuses the battle-tested `client-portal/server/utils/duplicate-
  detection.ts` algorithm (blocking + weighted rules) with additions:
  metaphone for non-English surnames, compounded confidence when
  multiple rules match, negative evidence for split-signal cases.
- Three runtime surfaces (at-create suggestion, interest-level
  same-berth guard, background scoring + admin review queue) plus a
  one-shot migration script with --dry-run / --apply / --rollback.
- Configurable thresholds via per-port system_settings so the merge
  policy can be tuned (defaults to "always confirm" — never
  auto-merges out of the box).
- Reversible: every merge writes a clientMergeLog row with the
  loser's full pre-state JSON, enabling 7-day undo without engineering.

Implementation decomposes into three plans (P1 library / P2 runtime /
P3 migration) sequenced after the mobile branch lands.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 14:10:08 +02:00
68 changed files with 42199 additions and 466 deletions

.gitignore

@@ -40,6 +40,9 @@ docker-compose.override.yml
 /.audit/
 /.audit-screenshots/
+# Migration script output (CSV reports, transcripts)
+.migration/
 # Tool caches / runtime state
 /.claude/
 /.serena/

docs/operations/outbound-comms-safety.md

@@ -0,0 +1,123 @@
# Outbound communications safety net
**Last reviewed:** 2026-05-03
**Owner:** matt@portnimara.com
This doc enumerates every channel through which the CRM can produce
outbound communication (email, document signing, webhooks) and describes
how each channel respects the `EMAIL_REDIRECT_TO` env var. The goal: a
single environment flip pauses **all** outbound traffic, so a production
data import, dedup migration dry-run, or staging environment can run
against real data without anyone getting paged or spammed.
> **Single env switch:** when `EMAIL_REDIRECT_TO` is set to an address,
> all outbound communication is rerouted there or short-circuited. Unset
> it in production.
---
## Channels
### 1. Direct email (`sendEmail`)
**Path:** `src/lib/email/index.ts` → `sendEmail()` → nodemailer SMTP transport.
**Safety:** YES — covered.
When `EMAIL_REDIRECT_TO` is set, `sendEmail()` rewrites the `to` header
to the redirect address and prefixes the subject with
`[redirected from <orig>]`. The original recipient is logged.
**Call sites** (all flow through `sendEmail`, so all are covered):
- `src/lib/services/portal-auth.service.ts` — portal activation + reset
- `src/lib/services/crm-invite.service.ts` — CRM user invitations
- `src/lib/services/document-templates.ts` — template-generated PDFs sent
as attachments (the PDF body is generated locally; the email itself
goes through SMTP)
- `src/lib/services/email-compose.service.ts` — ad-hoc emails composed
in the in-app UI
- `src/lib/services/gdpr-export.service.ts` — GDPR export delivery
### 2. Documenso e-signature recipients
**Path:** `src/lib/services/documenso-client.ts` → `createDocument()` /
`generateDocumentFromTemplate()` → Documenso REST API.
**Safety:** YES — covered as of 2026-05-03.
Documenso's own server sends the signing-request email on our behalf.
We can't intercept that at the SMTP layer because it's external. The
fix is at the REST-call boundary: when `EMAIL_REDIRECT_TO` is set,
`createDocument` rewrites every recipient's email to the redirect
address and prefixes the recipient name with `(was: <orig email>)` so
the doc is still traceable to its intended recipient.
`generateDocumentFromTemplate` does the same for both shapes the
template-generate endpoint accepts (v1.13 `formValues.*Email` keys and
v2.x `recipients` array).
The redirect happens **before** the API call, so even if Documenso has
its own retry logic the original email never leaves our process.
### 3. Webhooks (outbound to user-configured URLs)
**Path:** `src/lib/queue/workers/webhooks.ts` → BullMQ job → `fetch(webhook.url, ...)`.
**Safety:** YES — covered as of 2026-05-03.
When `EMAIL_REDIRECT_TO` is set, the webhook worker short-circuits
before the HTTP call. The delivery row is marked `dead_letter` with a
human-readable reason so it's still visible in the deliveries listing.
The SSRF guard remains in place independently.
### 4. WhatsApp / phone deep-links
**Path:** `<a href="https://wa.me/...">` and `<a href="tel:...">` in
client / interest detail headers.
**Safety:** N/A — user-initiated only.
These are deep links the user explicitly clicks. No automated dispatch.
A deep link click opens the user's WhatsApp / phone app, which is the
intended interaction. No safety net needed.
### 5. SMS
Not implemented. The `interests.preferredContactMethod` enum includes
`'sms'` as a value but no sending path exists. If/when SMS is added (e.g.
via Twilio), the new send function should respect `EMAIL_REDIRECT_TO`
the same way `sendEmail` does — log the original number, drop the
message, or reroute to a configurable `SMS_REDIRECT_TO` env.
---
## Verification checklist before importing real data
- [ ] `.env` has `EMAIL_REDIRECT_TO=<my-address>` set.
- [ ] Restart dev server (or worker) so the new env is picked up — env
vars are read at import time in some paths.
- [ ] Send a test email via `pnpm tsx scripts/dev-trigger-portal-invite.ts`
or similar. Confirm subject is prefixed with `[redirected from ...]`.
- [ ] Trigger an EOI send through the UI (any client). Confirm Documenso
shows the redirect address as recipient (not the real client email).
- [ ] If any webhooks are configured, trigger an event that fires one and
confirm the delivery is recorded as `dead_letter` with the
"EMAIL_REDIRECT_TO is set" reason.
- [ ] Run the NocoDB migration `--dry-run` to count clients/interests; the
`--apply` step is what creates real records but emails/webhooks are
still gated by the redirect env.
## Production cutover
When ready to go live:
1. Run a final dry-run of the data migration with `EMAIL_REDIRECT_TO` set
to a sandbox address.
2. Verify the snapshot looks right (counts, client coverage).
3. Unset `EMAIL_REDIRECT_TO` in the production env.
4. Restart the app + worker.
5. Run the migration with `--apply`. From this point forward, real
recipients will receive real comms.
If you ever need to re-pause outbound (e.g. handling a security incident,
re-importing on top of existing data), set `EMAIL_REDIRECT_TO` again.

docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md

@@ -0,0 +1,564 @@
# Client Deduplication and NocoDB Migration Design
**Status**: Design draft 2026-05-03 — pending approval.
**Plan decomposition**: Three implementation plans stack from this design — (P1) normalization + dedup core library; (P2) admin settings + at-create + interest-level guards (runtime); (P3) NocoDB migration script + review queue UI. P1 unblocks P2 and P3.
**Branch base**: stacks on `feat/mobile-foundation` once it merges to `main`.
**Out of scope**: live merge of two clients across ports (cross-tenant), automated AI-judged matches, profile-photo / face-match dedup, web-of-trust referrer relationships.
---
## 1. Background
### 1.1 Why this exists
The legacy CRM lives in a NocoDB base whose `Interests` table conflates _the human_ with _the deal_. A row contains `Full Name`, `Email Address`, `Phone Number`, `Address`, `Place of Residence` _and_ the sales-pipeline state for one specific berth. A single human pursuing two berths becomes two rows with semi-duplicated personal data. A 2026-05-03 read-only audit confirmed:
- **252 Interests rows** in NocoDB, against an estimated ~190–200 unique humans (~20–25% duplication rate).
- **35 Residential Interests rows** in a parallel residential pathway with the same conflation.
- **64 Website Interest Submissions + 47 Website Contact Form Submissions + 1 EOI Supplemental Form** as inbound capture surfaces.
- **No Clients table.** The conflated structure is structural, not accidental.
The new CRM (`src/lib/db/schema/clients.ts`) splits this into `clients` (people) ↔ `interests` (deals), with `clientContacts` (multi-channel), `clientAddresses` (multi-address), and a pre-existing `clientMergeLog` table that anticipates merge with undo. The design has been ready; what's missing is (a) a normalization + matching library, (b) the at-create / at-import surfaces that use it, and (c) the migration of the existing 252+35 records.
### 1.2 Real duplicate patterns observed in the live data
Sampled 200 of the 252 NocoDB Interests rows. Confirmed duplicate clusters fall into six patterns:
| Pattern | Example rows | Signature |
| ---------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------- |
| **A. Pure double-submit** | Deepak Ramchandani #624/#625; John Lynch #716/#725 | All fields identical; created same day |
| **B. Phone format variance** | Howard Wiarda #236/#536 (`574-274-0548` vs `+15742740548`); Christophe Zasso #701/#702 (`0651381036` vs `0033651381036`) | Same email, normalize-equal phone |
| **C. Name capitalization** | Nicolas Ruiz #681/#682/#683; Jean-Charles Miege/MIEGE #37/#163; John Farmer/FARMER #35/#161 | Same email or empty; surname case differs |
| **D. Name shortening** | Chris vs Christopher Allen #700/#534; Emma c vs Emma Cauchefer #661/#673 | Same email + phone; given-name truncated |
| **E. Resubmit with typo** | Christopher Camazou #649/#650 (phone last 4 digits typo); Gianfranco Di Constanzo/Costanzo #585/#336 (surname typo, **different yacht** — should be ONE client + TWO interests) | Score-on-everything-else high, one field has small-edit-distance noise |
| **F. Hard cases** | Etiennette Clamouze #188/#717 (same name, different country phone + email); Bruno Joyerot #18 with email belonging to Bruce Hearn #19 (couple sharing contact) | Cannot resolve without a human |
This dataset will be the fixture for the dedup library's tests — every pattern above must be either auto-detected or flagged for review, and the false-positive bar must be high enough that Pattern F doesn't get force-merged.
### 1.3 Dirty data inventory
The migration normalizer must survive these real values from production:
**Phone fields**: `+1-264-235-8840\r` (with carriage return), `'+1.214.603.4235` (apostrophe + dots), `0677580750/0690511494` (two numbers in one field), `00447956657022` (00 prefix), `+447000000000` (placeholder all-zeros), `+4901637039672` (impossible — stripped 0 + country prefix), various unprefixed local formats, dashed US numbers without country code.
**Email fields**: mixed case rampant (`Arthur@laser-align.com` vs `arthur@laser-align.com`); ALL-CAPS local parts; trailing whitespace.
**Name fields**: ALL-CAPS surnames mixed with title-case given names; embedded `\n` and `\r`; double spaces; lowercase-only entries; slash-with-company variants (`Daniel Wainstein / 7 Knots, LLC`, `Bruno Joyerot / SAS TIKI`); placeholder `Mr DADER`, `TBC`.
**Place of Residence (free text)**: `Saint barthelemy`, `St Barth`, `Saint-Barthélemy` (same place, three forms); `anguilla`, `United States `, `USA`, `Kansas City` (city without country), `Sag Harbor Y` (typo).
### 1.4 Existing battle-tested algorithm
`client-portal/server/utils/duplicate-detection.ts` already implements blocking + weighted-rules dedup against this same NocoDB. It runs in production today. We **port it forward** (don't reinvent), then add: soundex/metaphone for surname matching, compounded-confidence when multiple rules match, and negative evidence (same email + different country phone reduces confidence).
### 1.5 Why the website is no longer the source of new dirty data
The website forms (`website/components/pn/specific/website/{berths-item,register,form}/form.vue`) use `<v-phone-input>` with a country picker (`prefer-countries: ['US', 'GB', 'DE', 'FR']`) and `[(value) => !!value || 'Phone number is required']` validation. Output is E.164-shaped. The 252 dirty rows are legacy — pre-form-redesign submissions, sales-rep manual entries, and external CSV imports. Future inbound is clean.
---
## 2. Approach
Three artifacts, layered:
1. **A pure-logic normalization + matching library** at `src/lib/dedup/`. JSX-free, vitest-native (proven pattern: `realtime-invalidation-core.ts`). Tested against the dirty-data fixture corpus drawn from §1.2.
2. **Three runtime surfaces** that use the library: at-create suggestion in client/interest forms; interest-level same-berth guard; admin review queue powered by a nightly background scoring job.
3. **A one-shot migration script** that pulls NocoDB → normalizes → dedupes → writes new schema → produces a CSV report with auto-merge log + flagged-for-review pile.
**Configurability via admin settings** (`system_settings` per port) so the team can tune sensitivity without code changes. Defaults err on the safe side — a flagged review is cheaper than a wrongly-merged record.
**Reversibility**: every merge writes a `client_merge_log` row containing the loser's full pre-state JSON. A 7-day undo window lets a wrong merge be reversed without engineering involvement. After 7 days the snapshot is purged for GDPR; merges become permanent.
---
## 3. Normalization library
Lives at `src/lib/dedup/normalize.ts`. Pure functions, no DB, vitest-tested. Used by the dedup algorithm AND by all create-paths so what gets stored is already normalized.
### 3.1 `normalizeName(raw: string)`
```ts
export function normalizeName(raw: string): {
display: string; // human-readable, kept for UI
normalized: string; // for matching
surnameToken?: string; // for surname-based blocking
};
```
- Trim leading/trailing whitespace
- Replace `\r`, `\n`, tabs with single space
- Collapse consecutive whitespace to single space
- Smart title-case: keep particles (`van`, `de`, `del`, `O'`, `di`, `le`, `da`) lowercase except as first token
- `display` preserves user's intent (slash-with-company stays intact)
- `normalized` is `display.toLowerCase()` for comparison
- `surnameToken` is the last non-particle token for blocking
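A worked example of the intended behavior, using a dirty value from §1.3 (illustrative, not a committed test case):
```ts
normalizeName("GIANFRANCO di CONSTANZO\r");
// → {
//   display: "Gianfranco di Constanzo",
//   normalized: "gianfranco di constanzo",
//   surnameToken: "constanzo",
// }
```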
### 3.2 `normalizeEmail(raw: string)`
```ts
export function normalizeEmail(raw: string): string | null;
```
- Trim + lowercase
- Validate via `zod.email()` schema
- Returns `null` for empty / invalid (caller decides what to do)
- **Does NOT strip plus-aliases** (`user+tag@domain.com`) — they can be real, intentionally distinct addresses, and not collapsing them also prevents alias-based abuse. Compare by full localpart.
### 3.3 `normalizePhone(raw: string, defaultCountry: string)`
```ts
export function normalizePhone(
raw: string,
defaultCountry: string,
): {
e164: string | null; // canonical, e.g. '+15742740548'
country: string | null; // ISO-3166-1 alpha-2
display: string | null; // user-facing pretty
flagged?: 'multi_number' | 'placeholder' | 'unparseable';
} | null;
```
Pipeline:
1. Strip `\r`, `\n`, tabs, single quotes, dots, dashes, parens, spaces
2. If contains `/` or `;` or `,` → flag `multi_number`, take first segment
3. If matches `+\d{2}0+$` (e.g., `+447000000000`) → flag `placeholder`, return null
4. If starts with `00` → replace with `+`
5. If starts with `+` → parse as E.164
6. Else if `defaultCountry` provided → parse against that country
7. Else return null (caller's problem)
Backed by `libphonenumber-js` (already in deps via `tests/integration/factories.ts` usage; if not, will add). The hostile cases above all need explicit handling — naïve regex won't survive.
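Illustrative expectations for the pipeline above (display formatting aside):
```ts
normalizePhone("00447956657022", "US");
// → { e164: "+447956657022", country: "GB", ... }   (steps 1, 4, 5)

normalizePhone("0677580750/0690511494", "FR");
// → first segment parsed, flagged: 'multi_number'   (step 2)

normalizePhone("+447000000000", "GB");
// → no usable e164, flagged: 'placeholder'          (step 3)
```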
### 3.4 `resolveCountry(text: string)`
```ts
export function resolveCountry(text: string): {
iso: string | null; // ISO-3166-1 alpha-2
confidence: 'exact' | 'fuzzy' | 'city' | null;
};
```
Reuses `src/lib/i18n/countries.ts`. Pipeline:
1. Lowercase + strip diacritics
2. Exact match against country names (any locale we ship)
3. City fallback — small in-package mapping for high-frequency cities seen in legacy data (`Sag Harbor → US`, `Kansas City → US`, `St Barth → BL`, etc.)
4. Fuzzy match (Levenshtein ≤ 2 against canonical English names)
The mapping is opinionated and small (~30 entries covering the actual values seen in the 252-row dataset). Anything that fails to resolve returns `null` and lands in the migration's flagged pile.
---
## 4. Dedup algorithm
Lives at `src/lib/dedup/find-matches.ts`. Pure function. Vitest-tested against the §1.2 cluster fixtures.
### 4.1 Public API
```ts
export interface MatchCandidate {
  id: string;
  fullName: string | null;
  emails: string[]; // already normalized
  phonesE164: string[]; // already normalized E.164
  countryIso: string | null;
}

export interface MatchResult {
  candidate: MatchCandidate;
  score: number; // 0–100
  reasons: string[]; // human-readable, e.g. ["email match", "phone match"]
  confidence: 'high' | 'medium' | 'low';
}

export function findClientMatches(
  input: MatchCandidate,
  pool: MatchCandidate[],
  thresholds: DedupThresholds,
): MatchResult[];
### 4.2 Scoring rules (compound)
Each rule produces a score addition. **Compounding**: when two strong rules match (e.g., email AND phone), the result is ~95+ rather than just the single strongest rule's score. Negative evidence subtracts.
| Rule | Score | Notes |
| --------------------------------------------------------------- | ----- | ------------------------------------------------------ |
| Exact email match (case-insensitive, normalized) | +60 | One match suffices |
| Exact phone E.164 match (≥ 8 significant digits) | +50 | Excludes placeholder all-zeros |
| Exact normalized full-name match | +20 | Many "John Smith"s exist |
| Surname soundex match + given-name fuzzy match (Lev ≤ 1) | +15 | Catches `Constanzo/Costanzo`, `Christophe/Christopher` |
| Same address (normalized fuzzy ≥ 0.8) | +10 | Bonus signal |
| **Negative**: Same email but different country code on phone | −15 | Suggests spouse / coworker / shared inbox |
| **Negative**: Same name but DIFFERENT email AND DIFFERENT phone | −20 | Two distinct people with the same name |
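A condensed sketch of the compounding pass, using the `MatchCandidate` shape from §4.1; the predicates are simplified relative to the full rule table:
```ts
// Condensed scoring pass — weights from the table above, predicates simplified.
function scorePair(a: MatchCandidate, b: MatchCandidate): { score: number; reasons: string[] } {
  let score = 0;
  const reasons: string[] = [];
  const emailHit = a.emails.some((e) => b.emails.includes(e));
  const phoneHit = a.phonesE164.some((p) => b.phonesE164.includes(p));
  const nameHit =
    a.fullName != null &&
    b.fullName != null &&
    a.fullName.toLowerCase() === b.fullName.toLowerCase();
  if (emailHit) { score += 60; reasons.push('email match'); }
  if (phoneHit) { score += 50; reasons.push('phone match'); }
  if (nameHit) { score += 20; reasons.push('name match'); }
  // Negative evidence: same name, but every other channel disagrees.
  if (nameHit && !emailHit && !phoneHit && a.emails.length > 0 && b.emails.length > 0) {
    score -= 20;
    reasons.push('same name, different email + phone');
  }
  // Compounding: email + phone lands at 100 after the clamp, well above max(60, 50).
  return { score: Math.max(0, Math.min(100, score)), reasons };
}
```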
### 4.3 Confidence tiers (post-compound)
- **score ≥ 90 — `high`** — email AND phone match, or email + name + address. The create form blocks with a "Use existing?" suggestion. Auto-link on public-form submit by default.
- **score 50–89 — `medium`** — single strong signal (email or phone alone), or email + same-name + different country (Etiennette case). Soft-warn but allow.
- **score < 50 — `low`** — weak signals only. Don't surface in UI; only relevant in background-job review queue.
### 4.4 Blocking strategy
To avoid an O(N) scan per lookup over a pool of N existing clients, build three lookup maps once per scan:
- `byEmail: Map<string, MatchCandidate[]>` — keyed by normalized email
- `byPhoneE164: Map<string, MatchCandidate[]>` — keyed by E.164
- `bySurnameToken: Map<string, MatchCandidate[]>` — keyed by `normalizeName(...).surnameToken`
For an incoming `MatchCandidate`, the candidate set to compare is the union of pool entries reachable through any of its emails/phones/surname-token. Typically 0–5 candidates per query, regardless of N.
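A sketch of the three maps and the union lookup; `normalizeName` is §3.1's function, and the helper names are illustrative:
```ts
// Build once per scan; each key points at every candidate reachable through it.
function buildBlocks(pool: MatchCandidate[]) {
  const byEmail = new Map<string, MatchCandidate[]>();
  const byPhoneE164 = new Map<string, MatchCandidate[]>();
  const bySurnameToken = new Map<string, MatchCandidate[]>();
  const push = (m: Map<string, MatchCandidate[]>, k: string, c: MatchCandidate) =>
    m.set(k, [...(m.get(k) ?? []), c]);
  for (const c of pool) {
    for (const e of c.emails) push(byEmail, e, c);
    for (const p of c.phonesE164) push(byPhoneE164, p, c);
    const token = c.fullName ? normalizeName(c.fullName).surnameToken : undefined;
    if (token) push(bySurnameToken, token, c);
  }
  return { byEmail, byPhoneE164, bySurnameToken };
}

// Candidate set for one incoming record: the union across all three maps.
function candidatesFor(input: MatchCandidate, blocks: ReturnType<typeof buildBlocks>) {
  const out = new Map<string, MatchCandidate>();
  for (const e of input.emails) for (const c of blocks.byEmail.get(e) ?? []) out.set(c.id, c);
  for (const p of input.phonesE164) for (const c of blocks.byPhoneE164.get(p) ?? []) out.set(c.id, c);
  const token = input.fullName ? normalizeName(input.fullName).surnameToken : undefined;
  if (token) for (const c of blocks.bySurnameToken.get(token) ?? []) out.set(c.id, c);
  return [...out.values()];
}
```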
### 4.5 Performance budget
For migration: 252 rows compared pairwise once. At most ~32k comparisons even before blocking — a few seconds.
For runtime at-create: incoming candidate against existing pool of N clients per port. Expected pool size at maturity: 1k–10k. With blocking: <10 comparisons, <1ms target. No DB query needed beyond the initial pool fetch (which itself uses the indexed columns).
For background nightly job: full pairwise within port, blocked. 10k clients → ~50k pairwise checks per port → <30s. Fine for a nightly cron.
---
## 5. Configurable thresholds (admin settings)
New rows in `system_settings` per port. Default values err safe (more confirmation, less auto-action).
| Key | Default | Effect |
| ------------------------------ | ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `dedup_block_create_threshold` | `90` | Score above which the client-create form interrupts: "Use existing client?" |
| `dedup_soft_warn_threshold` | `50` | Score above which a soft-warn panel surfaces below the form |
| `dedup_review_queue_threshold` | `40` | Background job lands pairs ≥ this score in `/admin/duplicates` |
| `dedup_public_form_auto_link` | `true` | When a public-form submission scores ≥ block-threshold against existing client, attach the new interest to that client without prompting. **Safe**: no merge, just attaching a deal. |
| `dedup_auto_merge_threshold` | `null` (disabled) | If non-null, merges happen automatically at this threshold without human confirmation. Recommend leaving null until the team is comfortable; `95` is a reasonable cautious value. |
| `dedup_undo_window_days` | `7` | How long the loser's pre-state JSON is retained for merge-undo. After this, the snapshot is purged (GDPR) and merges are permanent. |
Each setting is a row in `system_settings`. UI surface in `/[portSlug]/admin/dedup` (a new admin page) with an "Advanced" toggle to expose the thresholds and brief explanations.
If the sales team complains the safer mode is too click-heavy, an admin flips `dedup_auto_merge_threshold` to `95` without any code change.
---
## 6. Merge service contract
### 6.1 Data flow
`mergeClients(winnerId, loserId, fieldChoices, ctx)` does, in a single transaction:
1. **Snapshot loser** — full row + all attached `clientContacts`, `clientAddresses`, `clientNotes`, `clientTags`, plus a count of dependent rows about to be moved (interests, yacht-memberships, etc.). Stored as `mergeDetails` JSONB in `clientMergeLog`.
2. **Reattach** — every row pointing at `loserId` updates to point at `winnerId`:
- `interests.clientId`
- `clientContacts.clientId` — with conflict handling: if winner already has the same email, keep winner's; flag the duplicate for the user
- `clientAddresses.clientId` — same conflict handling
- `clientNotes.clientId` — preserve `authorId` + `createdAt` (never overwrite)
- `clientTags.clientId`
- `clientYachtMembership.clientId` (or whatever the table is called)
- `auditLogs.entityId` — annotate, don't move (audit truth)
3. **Apply fieldChoices** — for each field where the user picked the loser's value, copy that into the winner row.
4. **Soft-archive loser** — `loser.archivedAt = now()`, `loser.mergedIntoClientId = winnerId`. Row stays in DB so the merge is reversible.
5. **Write `clientMergeLog`** — `{ winnerId, loserId, mergedBy, mergedAt, mergeDetails: <snapshot>, fieldChoices }`.
6. **Audit log** — top-level `auditLogs` row: `{ action: 'merge', entityType: 'client', entityId: winnerId, metadata: { loserId, score, reasons } }`.
### 6.2 Schema additions (migration)
`clients` table gets a new column:
```ts
mergedIntoClientId: text('merged_into_client_id').references(() => clients.id),
```
The existing `clientMergeLog` table is reused. Add a partial index for the undo-window query:
```sql
-- NOTE: Postgres rejects NOW() in a partial-index predicate (the expression
-- must be IMMUTABLE), so the index is plain and the 7-day filter stays in the query:
CREATE INDEX idx_cml_recent ON client_merge_log (port_id, created_at DESC);
```
A daily maintenance job (using the existing `maintenance-cleanup.test.ts` infrastructure) purges `mergeDetails` JSONB older than `dedup_undo_window_days` setting.
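A sketch of the purge step; the `clientMergeLog` import path and column names are assumed to match the clients schema module:
```ts
import { and, isNotNull, lt } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clientMergeLog } from '@/lib/db/schema/clients'; // assumed location

// Nightly purge: null out loser snapshots older than the undo window.
export async function purgeExpiredMergeSnapshots(undoWindowDays: number): Promise<number> {
  const cutoff = new Date(Date.now() - undoWindowDays * 24 * 60 * 60 * 1000);
  const purged = await db
    .update(clientMergeLog)
    .set({ mergeDetails: null })
    .where(and(lt(clientMergeLog.createdAt, cutoff), isNotNull(clientMergeLog.mergeDetails)))
    .returning({ id: clientMergeLog.id });
  return purged.length;
}
```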
### 6.3 Undo
`unmergeClients(mergeLogId, ctx)`:
1. Within the undo window, look up the snapshot
2. Restore loser: clear `archivedAt`, `mergedIntoClientId`
3. Restore loser's contacts/addresses/notes/tags from snapshot
4. Detach reattached rows: `interests` etc. that were touching `winnerId` and originally belonged to loser go back. The snapshot stores the original `(rowType, rowId)` list explicitly so this is deterministic.
5. Mark log row `undoneAt = now()`, `undoneBy = userId`
After 7 days the snapshot is gone and unmerge returns `410 Gone`.
### 6.4 Concurrency
Both merge and unmerge wrap in a single transaction with `SELECT … FOR UPDATE` on `clients.id` of both winner and loser. A second merge attempt against the same loser sees `mergedIntoClientId` already set and refuses (clear error: "Already merged into …").
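A sketch of the locking guard using Drizzle's `.for('update')`; the errors thrown here are plain stand-ins for whatever the service actually uses:
```ts
import { inArray } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clients } from '@/lib/db/schema/clients';

// Guard sketch — lock both rows, then re-check merge state before doing work.
export async function lockAndGuard(winnerId: string, loserId: string) {
  return db.transaction(async (tx) => {
    const rows = await tx
      .select({ id: clients.id, mergedIntoClientId: clients.mergedIntoClientId })
      .from(clients)
      .where(inArray(clients.id, [winnerId, loserId]))
      .for('update'); // blocks a concurrent merge touching the same rows
    const loser = rows.find((r) => r.id === loserId);
    if (!loser || !rows.some((r) => r.id === winnerId)) throw new Error('Client not found');
    if (loser.mergedIntoClientId) {
      throw new Error(`Already merged into ${loser.mergedIntoClientId}`);
    }
    // ... steps 1–6 above run on `tx` while the locks are held ...
  });
}
```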
---
## 7. Runtime surfaces
### 7.1 Layer 1 — At-create suggestion
In `ClientForm` (and the public `register` form once that hits the new system):
- Debounced 300ms after email or phone field changes
- Calls `findClientMatches` against current port's clients
- Renders top-1 match if score ≥ `dedup_soft_warn_threshold`:
```
┌─────────────────────────────────────┐
│ This looks like an existing client │
│ ML Marcus Laurent │
│ marcus@… +33 6 12 34 56 78 │
│ 2 interests · last 9d ago │
│ [ Use this client ] [ Create new ] │
└─────────────────────────────────────┘
```
- "Use this client" → form switches to "create new interest under existing client" mode (preserves whatever other fields the user typed)
- "Create new" → audit-log `dedup_override` with the candidate's id and reasons (so we have data on false positives)
### 7.2 Layer 2 — Interest-level same-berth guard
Cheap one-liner in `createInterest` service:
- Check `(clientId, berthId)` against existing non-archived interests
- If hit, throw `BerthDuplicateError` with the existing interest details
- UI catches and prompts: "Update existing or create separate?"
This is NOT the same as client-level dedup. The same client can legitimately pursue the same berth a second time after a deal falls through. But the prompt-before-create catches the accidental double-submit case.
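A sketch of the guard, assuming interests carry an `archivedAt` column; `BerthDuplicateError` is defined inline for illustration:
```ts
import { and, eq, isNull } from 'drizzle-orm';
import { db } from '@/lib/db';
import { interests } from '@/lib/db/schema/interests';

export class BerthDuplicateError extends Error {
  constructor(public readonly existingInterestId: string) {
    super('An open interest for this client + berth already exists');
  }
}

// Called from createInterest before the insert.
export async function assertNoOpenInterestForBerth(clientId: string, berthId: string) {
  const [existing] = await db
    .select({ id: interests.id })
    .from(interests)
    .where(
      and(
        eq(interests.clientId, clientId),
        eq(interests.berthId, berthId),
        isNull(interests.archivedAt), // only non-archived interests count (assumed column)
      ),
    )
    .limit(1);
  if (existing) throw new BerthDuplicateError(existing.id);
}
```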
### 7.3 Layer 3 — Background scoring + review queue
- A nightly cron (using existing BullMQ infrastructure — search for `scheduled-tasks` in repo) runs `findClientMatches` over each port's full client pool
- Pairs scoring ≥ `dedup_review_queue_threshold` land in a `client_merge_candidates` table:
```ts
export const clientMergeCandidates = pgTable('client_merge_candidates', {
  id: text('id').primaryKey()...,
  portId: text('port_id').notNull()...,
  clientAId: text('client_a_id').notNull()...,
  clientBId: text('client_b_id').notNull()...,
  score: integer('score').notNull(),
  reasons: jsonb('reasons').notNull(),
  status: text('status').notNull().default('pending'), // pending | dismissed | merged
  createdAt: timestamp('created_at')...,
  resolvedAt: timestamp('resolved_at'),
  resolvedBy: text('resolved_by'),
});
```
- `/[portSlug]/admin/duplicates` lists pending candidates sorted by score desc, with `[Review →]` opening a side-by-side merge dialog
- Dismissing a candidate marks it `status=dismissed` so the job doesn't re-surface the same pair tomorrow (a future score increase re-creates it).
---
## 8. NocoDB → new system field mapping
This is the explicit mapping the migration script applies. One NocoDB Interest row produces multiple new rows.
### 8.1 Top-level transform
```
NocoDB Interests row
─→ 0–1 client (deduped against existing pool)
─→ 0–1 client_address
─→ 0–2 client_contacts (email, phone)
─→ exactly 1 interest
─→ 0–1 yacht (when Yacht Name present and not "TBC"/"Na"/empty placeholders)
─→ 0–1 document (when documensoID present)
```
### 8.2 Field map
| NocoDB field | Target | Transform |
| ----------------------------------------------------------------- | ------------------------------------------------------------------ | ---------------------------------------------------------------------------- |
| `Full Name` | `clients.fullName` | `normalizeName().display` |
| `Email Address` | `clientContacts(channel='email', value=...)` | `normalizeEmail()` |
| `Phone Number` | `clientContacts(channel='phone', valueE164=..., valueCountry=...)` | `normalizePhone(raw, defaultCountry)` |
| `Address` | `clientAddresses.streetAddress` (LongText preserved) | trim |
| `Place of Residence` | `clientAddresses.countryIso` AND `clients.nationalityIso` | `resolveCountry()` |
| `Contact Method Preferred` | `clients.preferredContactMethod` | lowercase, mapped: Email→email, Phone→phone |
| `Source` | `clients.source` | mapped: portal→website, Form→website, External→manual; null → manual |
| `Date Added` | `interests.createdAt` (fallback to NocoDB `Created At` then now) | parse: try `DD-MM-YYYY`, then `YYYY-MM-DD`, then ISO |
| `Sales Process Level` | `interests.pipelineStage` | see §8.3 |
| `Lead Category` | `interests.leadCategory` | General→general_interest, Friends and Family→general_interest with tag |
| `Berth` (FK) | `interests.berthId` | resolve via `Berths` table by `Mooring Number` |
| `Berth Size Desired` | `interests.notes` (appended) | preserve |
| `Yacht Name`, `Length`, `Width`, `Depth` | `yachts.name`, `lengthM`, `widthM`, `draughtM` | skip if name in {`TBC`, `Na`, ``, null}; ft→m via `× 0.3048` |
| `EOI Status` | `interests.eoiStatus` | Awaiting Further Details→pending; Waiting for Signatures→sent; Signed→signed |
| `Deposit 10% Status` | `interests.depositStatus` | Pending→pending; Received→received |
| `Contract Status` | `interests.contractStatus` | Pending→pending; 40% Received→partial; Complete→complete |
| `EOI Time Sent` | `interests.dateEoiSent` | parse |
| `clientSignTime` / `developerSignTime` / `all_signed_notified_at` | `interests.dateEoiSigned` (use latest) | parse |
| `Time LOI Sent` | `interests.dateContractSent` | parse |
| `Internal Notes` + `Extra Comments` | `clientNotes` (one row, system author) | concatenate with section markers |
| `documensoID` | `documents.documensoId` (when present, type='eoi') | preserve |
| `Signature Link Client/CC/Developer`, `EmbeddedSignature*` | `documents.signers[]` | one row per non-null signer |
| `reminder_enabled`, `last_reminder_sent`, etc. | `interests.reminderEnabled`, `interests.reminderLastFired` | parse, default true |
### 8.3 Sales-stage mapping (8 → 9)
| NocoDB | New (PIPELINE_STAGES) |
| ------------------------------- | ------------------------------------------------------------------------ |
| General Qualified Interest | `open` |
| Specific Qualified Interest | `details_sent` |
| EOI and NDA Sent | `eoi_sent` |
| Signed EOI and NDA | `eoi_signed` |
| Made Reservation | `deposit_10pct` |
| Contract Negotiation | `contract_sent` |
| Contract Negotiations Finalized | `contract_sent` (with audit-note: legacy "negotiations finalized") |
| Contract Signed | `contract_signed` (or `completed` when deposit + contract both complete) |
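As a lookup table (sketch; the completed-stage upgrade is the only conditional):
```ts
// Straight lookup for §8.3; the "Contract Signed" split happens in code.
const STAGE_MAP: Record<string, string> = {
  'General Qualified Interest': 'open',
  'Specific Qualified Interest': 'details_sent',
  'EOI and NDA Sent': 'eoi_sent',
  'Signed EOI and NDA': 'eoi_signed',
  'Made Reservation': 'deposit_10pct',
  'Contract Negotiation': 'contract_sent',
  'Contract Negotiations Finalized': 'contract_sent', // + audit note
  'Contract Signed': 'contract_signed', // upgraded below when both statuses complete
};

function mapStage(legacy: string, depositComplete: boolean, contractComplete: boolean): string {
  if (legacy === 'Contract Signed' && depositComplete && contractComplete) return 'completed';
  return STAGE_MAP[legacy] ?? 'open';
}
```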
### 8.4 Other tables
- **Residential Interests** (35 rows) — same shape as Interests but maps to `residentialClients` + `residentialInterests`. Smaller and cleaner. Same dedup runs within this pool independently.
- **Website - Interest Submissions** (64 rows) — these are **inbound capture, not yet a client**. Treat as if each row is a fresh public-form submission today: run dedup against the migrated client pool. Auto-link if `dedup_public_form_auto_link` setting allows.
- **Website - Contact Form Submissions** (47 rows) — sparse data (just name + email + interest type). Skip migration; export as CSV for manual triage. Not the source of truth for any deal.
- **Website - Berth EOI Details Supplements** (1 row) — single record, preserved as a one-off attached to the matching Interest.
- **Newsletter Sending** (69 rows) — out of scope; that's a marketing surface, not CRM.
- **Interests Backup, Interests copy** — historical artifacts. Skipped by default. A `--include-backups` flag attaches them as audit-note entries on the corresponding live Interest if the user wants the history.
---
## 9. Migration script
Located at `scripts/migrate-from-nocodb.ts`. Idempotent: safe to re-run. Three main flags:
```
$ pnpm tsx scripts/migrate-from-nocodb.ts --dry-run [--port-slug X]
    Pulls everything, transforms, runs dedup, writes CSV report to
    .migration/<timestamp>/. No DB writes.

$ pnpm tsx scripts/migrate-from-nocodb.ts --apply --report .migration/<timestamp>/
    Reads the report, performs the writes the dry-run promised. Refuses if the
    source data has changed since the report was generated (hash mismatch).

$ pnpm tsx scripts/migrate-from-nocodb.ts --rollback --apply-id <id>
    Reads the apply log, undoes the writes (only valid within the undo window).
```
Reuses the `client-portal/server/utils/nocodb.ts` adapter for the NocoDB API client (no need to rebuild). Writes to the new system via Drizzle (re-using the existing services like `createClient`, `createInterest`, etc., so all the same validation runs).
### 9.1 Dry-run report format
`.migration/<timestamp>/report.csv`:
```csv
op,reason,nocodb_row_id,target_table,target_value,confidence,manual_review_required
create_client,new,624,clients.fullName,Deepak Ramchandani,N/A,false
create_contact,new,624,clientContacts.email,dannyrams8888@gmail.com,N/A,false
create_contact,new,624,clientContacts.phone,+17215868888,N/A,false
create_interest,new,624,interests.berthId,a1b2c3...,N/A,false
auto_link,score=98 (email+phone),625,clients.id,<existing client UUID from row 624>,high,false
flag_for_review,score=72 (same name diff country),188,client.id,<existing client UUID from row 717>,medium,true
country_unresolved,fallback to AI (port country),198,clientAddresses.countryIso,AI,low,true
phone_unparseable,placeholder all-zeros,641,clientContacts.phone,<skipped>,N/A,true
```
Plus `.migration/<timestamp>/summary.md`:
```
# Migration Dry-Run — 2026-05-03 14:23 UTC
NocoDB: 252 Interests + 35 Residences + 64 Website Submissions
Outcome: 198 clients, 287 interests (incl. residences), 91 yachts, 412 contacts
Auto-linked (high confidence, no human action needed):
- Nicolas Ruiz: rows 681,682,683 → 1 client + 3 interests
- John Lynch: rows 716,725 → 1 client + 2 interests
- Deepak Ramchandani: rows 624,625 → 1 client + 2 interests
- [12 more]
Flagged for manual review (medium confidence):
- Etiennette Clamouze (rows 188,717): same name, different country phone + email
- Bruno Joyerot #18 + Bruce Hearn #19: shared household contact
- [4 more]
Country resolution failed for 7 rows. All defaulted to port country (AI). Review:
- Row 239: "Sag Harbor Y" → AI (likely US)
- [6 more]
Phone parsing failed for 3 rows. All flagged, no contact created:
- Row 178: empty
- Row 641: placeholder "+447000000000"
- Row 175: empty
Run `--apply` to commit these changes.
```
### 9.2 Apply phase
`--apply` reads the report, re-fetches the source rows (via NocoDB MCP / API), recomputes the hash, fails fast if NocoDB changed since dry-run. Then performs the writes within a single PostgreSQL transaction per port (commit at end). On any error mid-transaction, full rollback.
After successful apply, an `apply_id` is generated and an audit-log row written. The `apply_id` is the handle used for `--rollback`.
### 9.3 Idempotency
The script tracks NocoDB row IDs in a `migration_source_links` table:
```ts
export const migrationSourceLinks = pgTable('migration_source_links', {
id: text('id').primaryKey()...,
sourceSystem: text('source_system').notNull(), // 'nocodb_interests' | 'nocodb_residences' | …
sourceId: text('source_id').notNull(), // NocoDB row id as string
targetEntityType: text('target_entity_type').notNull(), // client | interest | yacht | …
targetEntityId: text('target_entity_id').notNull(),
appliedAt: timestamp('applied_at')...,
appliedBy: text('applied_by'),
}, (table) => [
uniqueIndex('idx_msl_source').on(table.sourceSystem, table.sourceId, table.targetEntityType),
]);
```
Re-running `--apply` against the same report skips rows already in this table. Useful for partial-failure resumption.
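A sketch of the apply-phase guard; the schema import path is an assumption:
```ts
import { and, eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { migrationSourceLinks } from '@/lib/db/schema/migration'; // assumed path

// Skip a planned write if this source row was already imported for this entity type.
export async function alreadyApplied(
  sourceSystem: string,
  sourceId: string,
  targetEntityType: string,
): Promise<boolean> {
  const [link] = await db
    .select({ id: migrationSourceLinks.id })
    .from(migrationSourceLinks)
    .where(
      and(
        eq(migrationSourceLinks.sourceSystem, sourceSystem),
        eq(migrationSourceLinks.sourceId, sourceId),
        eq(migrationSourceLinks.targetEntityType, targetEntityType),
      ),
    )
    .limit(1);
  return link !== undefined;
}
```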
---
## 10. Test plan
### 10.1 Library-level (vitest unit)
- `tests/unit/dedup/normalize.test.ts` — every dirty-data pattern from §1.3 has a fixture asserting the expected normalized output.
- `tests/unit/dedup/find-matches.test.ts` — every duplicate cluster from §1.2 has a fixture asserting score + confidence tier. Hard cases (Pattern F) assert "medium" not "high" — false-positive guard.
### 10.2 Service-level (vitest integration)
- `tests/integration/dedup/client-merge.test.ts` — merge service exercised: full reattach, clientMergeLog written, undo within window restores, undo after window returns 410, concurrent merge of same loser fails the second.
- `tests/integration/dedup/at-create-suggestion.test.ts` — `findClientMatches` against a seeded pool returns expected matches + reasons.
### 10.3 Migration script (vitest integration with NocoDB mock)
- `tests/integration/dedup/migration-dry-run.test.ts` — feed the script a fixture NocoDB dump (the 252 rows, frozen as a JSON snapshot in fixtures), assert the resulting CSV matches a golden file. Catch any future regression in the transform pipeline.
- `tests/integration/dedup/migration-apply.test.ts` — apply the dry-run output to a clean test DB, assert all expected rows exist, assert idempotency (re-apply is a no-op).
### 10.4 E2E (Playwright)
- `tests/e2e/smoke/30-dedup-create.spec.ts` — type into ClientForm with an email matching seeded client; assert suggestion card appears; click "Use this client"; assert form switches to interest-create mode.
- `tests/e2e/smoke/31-admin-duplicates.spec.ts` — admin views review queue, opens a candidate, side-by-side merge UI works, merge succeeds, undo within window works.
---
## 11. Rollback plan
Three layers of safety, ordered by reversibility:
1. **Per-merge undo** — admin clicks Undo on a wrongly-merged pair, system rolls back from `clientMergeLog` snapshot. 7-day window. No engineering needed.
2. **Migration `--rollback` flag** — entire migration apply is reversed via the `apply_id` and `migration_source_links` table. Useful in the first 24h after `--apply`. Engineering-supervised.
3. **DB restore from backup** — the existing `docs/ops/backup-runbook.md` covers this. Last resort if both above are blocked.
Pre-migration, take a hot backup of the new DB (`pg_dump`). Pre-merge in production (before any human-facing surface ships), the `dedup_auto_merge_threshold` defaults to `null` so no automatic merges happen — every merge is human-confirmed.
---
## 12. Open items
- **Soundex vs metaphone** — Soundex is simpler but English-leaning. Metaphone handles non-English surnames better (the dataset has French, German, Italian, Slavic names). Default to metaphone via the `natural` package; revisit if it adds significant install size.
- **Cross-port dedup** — not in scope. Each port's clients are deduped within that port. A future "shared address book" feature would need its own design.
- **Profile photo / face match** — out of scope.
- **AI-assisted match resolution** — out of scope. The Layer-3 review queue is human-only.
---
## Implementation sequence
P1 (this design's library) → P2 (runtime surfaces) → P3 (migration). Each is a separate plan / PR.
**P1 deliverables**: `src/lib/dedup/{normalize,find-matches}.ts` + tests. No UI changes. No DB changes (except indexed lookups added to existing `clientContacts`). ~1.5 days.
**P2 deliverables**: at-create suggestion in `ClientForm` + interest-level guard in `createInterest` service + admin settings UI for thresholds + `clientMergeCandidates` table + nightly job + admin review queue page + merge service + side-by-side merge UI. ~5–7 days.
**P3 deliverables**: `scripts/migrate-from-nocodb.ts` + `migration_source_links` table + dry-run + apply + rollback. CSV report format frozen against fixture. ~3 days, including fixture creation from the live NocoDB snapshot.
Total: ~10–12 engineering days from approval. Can be split across three PRs landing independently — each is testable in isolation and the runtime surfaces (P2) work even without P3 being run.

View File

@@ -0,0 +1,144 @@
/**
* Backfill `client_contacts.value_e164` from `value` for phone / whatsapp
* contacts where it's null or empty.
*
* The legacy seed (and pre-normalization production data) stored phone
* numbers in `value` as free text — "+33 4 93 00 0002" — but `value_e164`
* is what every UI surface and dedup matcher reads. This script runs the
* raw `value` through libphonenumber-js (via the script-safe wrapper to
* avoid the Node 25 metadata-loader bug) and writes the canonical E.164
* form back.
*
* Usage:
* pnpm tsx scripts/backfill-phone-e164.ts # dry-run report
* pnpm tsx scripts/backfill-phone-e164.ts --apply # actually write
*
* The dry-run report prints, for each unparseable row, the contact id +
* raw value so you can hand-clean before re-running.
*/
import 'dotenv/config';
import { and, eq, inArray, isNull, or, sql } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clientContacts } from '@/lib/db/schema/clients';
import { parsePhoneScriptSafe } from '@/lib/dedup/phone-parse';
import type { CountryCode } from '@/lib/i18n/countries';
const APPLY = process.argv.includes('--apply');
interface PhoneRow {
id: string;
channel: string;
value: string | null;
valueCountry: string | null;
}
async function main() {
console.log(`Phone E.164 backfill — ${APPLY ? 'APPLY MODE' : 'dry-run'}`);
console.log('');
// Find candidate rows: phone or whatsapp contacts with a `value` set but
// `value_e164` null/empty.
const rows: PhoneRow[] = await db
.select({
id: clientContacts.id,
channel: clientContacts.channel,
value: clientContacts.value,
valueCountry: clientContacts.valueCountry,
})
.from(clientContacts)
.where(
and(
inArray(clientContacts.channel, ['phone', 'whatsapp']),
or(isNull(clientContacts.valueE164), eq(clientContacts.valueE164, '')),
sql`${clientContacts.value} IS NOT NULL AND ${clientContacts.value} <> ''`,
),
);
console.log(` found ${rows.length} candidate rows`);
let parsedFull = 0;
let parsedE164Only = 0;
let unparseable = 0;
const updates: Array<{
id: string;
valueE164: string;
valueCountry: CountryCode | null;
}> = [];
const fails: Array<{ id: string; value: string; reason: string }> = [];
for (const row of rows) {
if (!row.value) continue;
const defaultCountry = (row.valueCountry as CountryCode | null) ?? undefined;
const parsed1 = parsePhoneScriptSafe(row.value, defaultCountry);
if (parsed1.e164 && parsed1.country) {
// Both e164 + country resolved — best case.
updates.push({ id: row.id, valueE164: parsed1.e164, valueCountry: parsed1.country });
parsedFull++;
} else if (parsed1.e164) {
// E.164 came back but country didn't (e.g. UK +44 7700 900xxx
// fictional/reserved range — libphonenumber returns the e164 form
// but refuses to assign a country). Still safe to write — the e164
// is canonical. Country stays null.
updates.push({
id: row.id,
valueE164: parsed1.e164,
valueCountry: (row.valueCountry as CountryCode | null) ?? null,
});
parsedE164Only++;
} else {
fails.push({
id: row.id,
value: row.value,
reason: row.value.trim().startsWith('+')
? 'has + prefix but parse failed'
: 'no leading + and no country hint',
});
unparseable++;
}
}
console.log('');
console.log(' ✓ parsed cleanly (e164 + country)', parsedFull);
console.log(' ✓ parsed e164 only (no country) ', parsedE164Only);
console.log(' ✗ unparseable ', unparseable);
console.log('');
if (fails.length > 0) {
console.log('Failures (first 10):');
for (const f of fails.slice(0, 10)) {
console.log(` [${f.id}] "${f.value}" — ${f.reason}`);
}
console.log('');
}
if (!APPLY) {
console.log('Dry-run only. Re-run with --apply to write the updates.');
return;
}
if (updates.length === 0) {
console.log('No updates to write.');
return;
}
console.log(`Writing ${updates.length} updates...`);
for (const u of updates) {
await db
.update(clientContacts)
.set({
valueE164: u.valueE164,
valueCountry: u.valueCountry,
})
.where(eq(clientContacts.id, u.id));
}
console.log(` ✓ wrote ${updates.length} rows`);
}
main().catch((err) => {
console.error(err);
process.exit(1);
});

View File

@@ -0,0 +1,126 @@
/**
* One-shot: load the 117-berth NocoDB snapshot into the port-nimara
* port, skipping any moorings that already exist.
*
* The original seed only seeded 12 hand-rolled berths into port-nimara
* (A-01..D-03), but the migration's interest rows reference moorings
* across A-01..E-18. This loads the full set so interest→berth links
* resolve cleanly on the next migration run.
*/
import 'dotenv/config';
import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { berths } from '@/lib/db/schema/berths';
import berthSnapshot from '@/lib/db/seed-data/berths.json';
interface SnapshotBerth {
mooringNumber: string;
area: string;
status: 'available' | 'under_offer' | 'sold';
lengthFt: number | null;
widthFt: number | null;
draftFt: number | null;
lengthM: number | null;
widthM: number | null;
draftM: number | null;
widthIsMinimum: boolean;
nominalBoatSize: number | null;
nominalBoatSizeM: number | null;
waterDepth: number | null;
waterDepthM: number | null;
waterDepthIsMinimum: boolean;
sidePontoon: string | null;
powerCapacity: number | null;
voltage: number | null;
mooringType: string | null;
cleatType: string | null;
cleatCapacity: string | null;
bollardType: string | null;
bollardCapacity: string | null;
access: string | null;
price: number | null;
bowFacing: string | null;
berthApproved: boolean;
statusOverrideMode: string | null;
}
async function main() {
const [port] = await db
.select({ id: ports.id })
.from(ports)
.where(eq(ports.slug, 'port-nimara'))
.limit(1);
if (!port) throw new Error('port-nimara not found');
const snapshot = berthSnapshot as unknown as SnapshotBerth[];
// Existing moorings — skip these.
const existingRows = await db
.select({ mooringNumber: berths.mooringNumber })
.from(berths)
.where(eq(berths.portId, port.id));
const existingMoorings = new Set(existingRows.map((r) => r.mooringNumber));
const toInsert = snapshot.filter((b) => !existingMoorings.has(b.mooringNumber));
console.log(
`Snapshot: ${snapshot.length} berths, existing in port-nimara: ${existingRows.length}, to insert: ${toInsert.length}`,
);
if (toInsert.length === 0) {
console.log('Nothing to do.');
return;
}
const inserted = await db
.insert(berths)
.values(
toInsert.map((b) => ({
portId: port.id,
mooringNumber: b.mooringNumber,
area: b.area,
status: b.status,
lengthFt: b.lengthFt != null ? String(b.lengthFt) : null,
widthFt: b.widthFt != null ? String(b.widthFt) : null,
draftFt: b.draftFt != null ? String(b.draftFt) : null,
lengthM: b.lengthM != null ? String(b.lengthM) : null,
widthM: b.widthM != null ? String(b.widthM) : null,
draftM: b.draftM != null ? String(b.draftM) : null,
widthIsMinimum: b.widthIsMinimum,
nominalBoatSize: b.nominalBoatSize != null ? String(b.nominalBoatSize) : null,
nominalBoatSizeM: b.nominalBoatSizeM != null ? String(b.nominalBoatSizeM) : null,
waterDepth: b.waterDepth != null ? String(b.waterDepth) : null,
waterDepthM: b.waterDepthM != null ? String(b.waterDepthM) : null,
waterDepthIsMinimum: b.waterDepthIsMinimum,
sidePontoon: b.sidePontoon,
powerCapacity: b.powerCapacity != null ? String(b.powerCapacity) : null,
voltage: b.voltage != null ? String(b.voltage) : null,
mooringType: b.mooringType,
cleatType: b.cleatType,
cleatCapacity: b.cleatCapacity,
bollardType: b.bollardType,
bollardCapacity: b.bollardCapacity,
access: b.access,
price: b.price != null ? String(b.price) : null,
priceCurrency: 'USD',
bowFacing: b.bowFacing,
berthApproved: b.berthApproved,
statusOverrideMode: b.statusOverrideMode,
tenureType: 'permanent' as const,
})),
)
.returning({ id: berths.id, mooringNumber: berths.mooringNumber });
console.log(`Inserted ${inserted.length} berths.`);
}
main().catch((e) => {
console.error(e);
process.exit(1);
});

View File

@@ -0,0 +1,237 @@
/**
* One-shot migration: legacy NocoDB Interests → new client/interest split.
*
* Usage:
*
* pnpm tsx scripts/migrate-from-nocodb.ts --dry-run
* Pulls the live NocoDB base, runs the transform + dedup pipeline,
* writes a report to .migration/<timestamp>/. NO database writes.
*
* pnpm tsx scripts/migrate-from-nocodb.ts --dry-run --port-slug port-nimara
* Same, but tags the planned writes with the named port (matters for
* the apply phase — every client/interest belongs to one port).
*
* pnpm tsx scripts/migrate-from-nocodb.ts --apply --port-slug port-nimara
* Re-fetches NocoDB, re-transforms, then writes the planned rows
* into the target port via the idempotent `migration_source_links`
* ledger. Re-runs are safe — already-imported source IDs are skipped.
* REQUIRES `EMAIL_REDIRECT_TO` to be set in env (safety net) unless
* `--unsafe-skip-redirect-check` is also passed.
*
* Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §9.
*/
import 'dotenv/config';
import { randomUUID } from 'node:crypto';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
import { applyPlan } from '@/lib/dedup/migration-apply';
import { fetchSnapshot, loadNocoDbConfig } from '@/lib/dedup/nocodb-source';
import { transformSnapshot } from '@/lib/dedup/migration-transform';
import { resolveReportPaths, writeReport } from '@/lib/dedup/migration-report';
interface CliArgs {
dryRun: boolean;
apply: boolean;
portSlug: string | null;
reportDir: string | null;
unsafeSkipRedirectCheck: boolean;
}
function parseArgs(argv: string[]): CliArgs {
const args: CliArgs = {
dryRun: false,
apply: false,
portSlug: null,
reportDir: null,
unsafeSkipRedirectCheck: false,
};
for (let i = 0; i < argv.length; i += 1) {
const a = argv[i]!;
if (a === '--dry-run') args.dryRun = true;
else if (a === '--apply') args.apply = true;
else if (a === '--port-slug') args.portSlug = argv[++i] ?? null;
else if (a === '--report') args.reportDir = argv[++i] ?? null;
else if (a === '--unsafe-skip-redirect-check') args.unsafeSkipRedirectCheck = true;
else if (a === '-h' || a === '--help') {
printHelp();
process.exit(0);
} else {
console.error(`Unknown argument: ${a}`);
printHelp();
process.exit(1);
}
}
return args;
}
function printHelp(): void {
console.log(`Usage:
pnpm tsx scripts/migrate-from-nocodb.ts --dry-run [--port-slug <slug>]
Pulls NocoDB → transforms → writes report to .migration/<timestamp>/.
No database writes.
pnpm tsx scripts/migrate-from-nocodb.ts --apply --port-slug <slug>
Re-fetches NocoDB, re-transforms, writes via migration_source_links
ledger. Idempotent — safe to re-run. Requires EMAIL_REDIRECT_TO set
(unless --unsafe-skip-redirect-check is also passed).
Flags:
--dry-run Read NocoDB, write report only.
--apply Actually write rows to the DB.
--port-slug <slug> Port slug to attach to all imported
entities. Defaults to the first
available port if omitted.
--report <dir> Path to a previously-generated report
dir (only used by --apply).
--unsafe-skip-redirect-check Skip the EMAIL_REDIRECT_TO precondition
check. Only use in production cutover.
-h, --help Show this help.
`);
}
/**
* Resolve the target port: use the slug if provided, otherwise the first
* port found. Errors out cleanly if the slug doesn't match any port.
*/
async function resolvePort(slug: string | null): Promise<{ id: string; slug: string }> {
if (slug) {
const [p] = await db
.select({ id: ports.id, slug: ports.slug })
.from(ports)
.where(eq(ports.slug, slug))
.limit(1);
if (!p) {
console.error(`No port found with slug "${slug}".`);
process.exit(1);
}
return { id: p.id, slug: p.slug };
}
const [first] = await db.select({ id: ports.id, slug: ports.slug }).from(ports).limit(1);
if (!first) {
console.error('No ports exist in the target DB. Seed at least one port before applying.');
process.exit(1);
}
return { id: first.id, slug: first.slug };
}
async function main(): Promise<void> {
const args = parseArgs(process.argv.slice(2));
if (!args.dryRun && !args.apply) {
console.error('Must specify --dry-run or --apply');
printHelp();
process.exit(1);
}
// Safety gate: --apply must run with EMAIL_REDIRECT_TO set, unless the
// operator explicitly opts out (production cutover).
if (args.apply && !process.env.EMAIL_REDIRECT_TO && !args.unsafeSkipRedirectCheck) {
console.error(
'--apply requires EMAIL_REDIRECT_TO to be set in the environment as a safety net.',
);
console.error('See docs/operations/outbound-comms-safety.md for the rationale.');
console.error(
'If you are running the production cutover and have read that doc, add ' +
'--unsafe-skip-redirect-check to override.',
);
process.exit(2);
}
// ── Fetch + transform (shared by dry-run and apply) ──────────────────────
console.log('[migrate] Loading NocoDB config…');
const config = loadNocoDbConfig();
console.log(`[migrate] Source: ${config.url}`);
console.log('[migrate] Fetching snapshot from NocoDB…');
const start = Date.now();
const snapshot = await fetchSnapshot(config);
const elapsed = ((Date.now() - start) / 1000).toFixed(1);
console.log(
`[migrate] Snapshot fetched in ${elapsed}s — ${snapshot.interests.length} interests, ${snapshot.residentialInterests.length} residential, ${snapshot.berths.length} berths.`,
);
console.log('[migrate] Running transform + dedup pipeline…');
const plan = transformSnapshot(snapshot);
// Resolve output paths relative to the worktree root.
const scriptDir = path.dirname(fileURLToPath(import.meta.url));
const repoRoot = path.resolve(scriptDir, '..');
const generatedAt = new Date().toISOString();
const paths = resolveReportPaths(repoRoot);
console.log(`[migrate] Writing report to ${paths.rootDir}`);
await writeReport(paths, plan, generatedAt);
// ── Plan summary ─────────────────────────────────────────────────────────
const s = plan.stats;
console.log('');
console.log('=== Migration Plan Summary ===');
console.log(
` Input: ${s.inputInterestRows} interests, ${s.inputResidentialRows} residential interests`,
);
console.log(` Output: ${s.outputClients} clients, ${s.outputInterests} interests`);
console.log(` ${s.outputContacts} contacts, ${s.outputAddresses} addresses`);
console.log(
` Dedup: ${s.autoLinkedClusters} auto-linked clusters, ${s.needsReviewPairs} pairs flagged for review`,
);
console.log(` Quality: ${s.flaggedRows} rows flagged (see report.csv)`);
console.log('');
console.log(` Full report: ${paths.summaryPath}`);
if (args.dryRun) {
console.log('');
console.log('Dry-run complete. Re-run with --apply to write rows.');
return;
}
// ── Apply path ───────────────────────────────────────────────────────────
const port = await resolvePort(args.portSlug);
const applyId = randomUUID();
console.log('');
console.log(`[migrate] Applying to port "${port.slug}" (id=${port.id})`);
console.log(`[migrate] Apply id: ${applyId}`);
console.log('[migrate] Inserting…');
const applyStart = Date.now();
const result = await applyPlan(plan, { port, applyId });
const applyElapsed = ((Date.now() - applyStart) / 1000).toFixed(1);
console.log('');
console.log('=== Apply Result ===');
console.log(` Time: ${applyElapsed}s`);
console.log(
` Clients: ${result.clientsInserted} inserted, ${result.clientsSkipped} already linked`,
);
console.log(` Contacts: ${result.contactsInserted} inserted`);
console.log(` Addresses: ${result.addressesInserted} inserted`);
console.log(` Yachts: ${result.yachtsInserted} inserted`);
console.log(
` Interests: ${result.interestsInserted} inserted, ${result.interestsSkipped} already linked`,
);
if (result.warnings.length > 0) {
console.log('');
console.log('Warnings:');
for (const w of result.warnings.slice(0, 20)) {
console.log(` - ${w}`);
}
if (result.warnings.length > 20) {
console.log(`  … ${result.warnings.length - 20} more`);
}
}
console.log('');
}
main().catch((err) => {
console.error('[migrate] Fatal error:', err);
process.exit(1);
});

View File

@@ -0,0 +1,108 @@
/**
* Live smoke test for EMAIL_REDIRECT_TO.
*
* Actually calls `sendEmail()` (the centralized helper used by every
* outbound email path in the app) with a fake real-client address. The
* SMTP transporter is monkey-patched to capture the message instead of
* actually delivering it, so this is safe to run anywhere.
*
* Prints the captured `to` + `subject` so the operator can see with their
* own eyes that the redirect happened. Exits non-zero if the redirect
* failed for any reason.
*
* Usage:
* pnpm tsx scripts/smoke-test-redirect.ts
*/
import 'dotenv/config';
async function main() {
const expectedRedirect = process.env.EMAIL_REDIRECT_TO;
if (!expectedRedirect) {
console.error('FAIL: EMAIL_REDIRECT_TO is not set in env. Set it before running this test.');
process.exit(1);
}
console.log(`[smoke] EMAIL_REDIRECT_TO = ${expectedRedirect}`);
console.log('');
// Monkey-patch nodemailer's createTransport so we capture the call
// without actually delivering. This is the same pattern the unit
// tests use, but at the live import-time level so we're testing the
// exact code path that runs in production.
const nodemailer = await import('nodemailer');
const captured: Array<{ to: unknown; subject: unknown; from: unknown }> = [];
const originalCreateTransport = nodemailer.default.createTransport;
// @ts-expect-error monkey-patch
nodemailer.default.createTransport = () => ({
// eslint-disable-next-line @typescript-eslint/no-explicit-any
sendMail: async (msg: any) => {
captured.push({ to: msg.to, subject: msg.subject, from: msg.from });
return { messageId: '<smoke@test>', accepted: [msg.to], rejected: [] };
},
});
// Now import sendEmail (gets the patched transporter).
const { sendEmail } = await import('@/lib/email');
const realClientEmail = 'real-client-DO-NOT-EMAIL@example.test';
const realSubject = 'Important: Your contract is ready';
console.log('[smoke] calling sendEmail(...) with:');
console.log(` to: ${realClientEmail}`);
console.log(` subject: "${realSubject}"`);
console.log('');
await sendEmail(realClientEmail, realSubject, '<p>Body unused for this smoke.</p>');
// Restore the original transport (be a good citizen).
// @ts-expect-error monkey-patch
nodemailer.default.createTransport = originalCreateTransport;
console.log('[smoke] captured outbound message:');
console.log(` to: ${captured[0]?.to}`);
console.log(` subject: "${captured[0]?.subject}"`);
console.log(` from: ${captured[0]?.from}`);
console.log('');
// Assertions
let pass = true;
if (captured.length !== 1) {
console.error(`FAIL: expected exactly 1 sendMail call, got ${captured.length}`);
pass = false;
}
if (captured[0]?.to !== expectedRedirect) {
console.error(
`FAIL: outbound "to" was "${captured[0]?.to}", expected the redirect address "${expectedRedirect}"`,
);
pass = false;
}
if (
typeof captured[0]?.subject !== 'string' ||
!captured[0].subject.startsWith(`[redirected from ${realClientEmail}]`)
) {
console.error(
`FAIL: subject did not get the [redirected from <orig>] prefix. Got: "${captured[0]?.subject}"`,
);
pass = false;
}
if (pass) {
console.log('PASS: EMAIL_REDIRECT_TO is intercepting outbound email correctly.');
console.log(
' The "to" header matches the redirect, and the original recipient is preserved in the subject.',
);
process.exit(0);
} else {
console.error('');
console.error('Smoke test FAILED. Do not import production data until this is fixed.');
process.exit(1);
}
}
main().catch((err) => {
console.error('FATAL:', err);
process.exit(1);
});

View File

@@ -0,0 +1,5 @@
import { DuplicatesReviewQueue } from '@/components/admin/duplicates/duplicates-review-queue';
export default function DuplicatesAdminPage() {
return <DuplicatesReviewQueue />;
}

View File

@@ -16,6 +16,7 @@ import {
   Tag,
   Upload,
   Users,
+  UsersRound,
   Webhook,
 } from 'lucide-react';
@@ -29,7 +30,17 @@ interface AdminSection {
   icon: typeof Settings;
 }
-const SECTIONS: AdminSection[] = [
+interface AdminGroup {
+  title: string;
+  description: string;
+  sections: AdminSection[];
+}
+const GROUPS: AdminGroup[] = [
+  {
+    title: 'Access',
+    description: 'Who can sign in and what they can do once they do.',
+    sections: [
   {
     href: 'users',
     label: 'Users',
@@ -48,12 +59,12 @@ const SECTIONS: AdminSection[] = [
     description: 'Default permission sets and per-port role overrides.',
     icon: Shield,
   },
-  {
-    href: 'audit',
-    label: 'Audit Log',
-    description: 'Searchable log of every authenticated mutation in the system.',
-    icon: ScrollText,
-  },
+    ],
+  },
+  {
+    title: 'Configuration',
+    description: 'Branding, integrations, and per-port settings.',
+    sections: [
   {
     href: 'email',
     label: 'Email Settings',
@@ -90,6 +101,12 @@ const SECTIONS: AdminSection[] = [
     description: 'Outgoing webhook subscriptions, secrets, and delivery log.',
     icon: Webhook,
   },
+    ],
+  },
+  {
+    title: 'Content',
+    description: 'Forms, templates, and labels that users see.',
+    sections: [
   {
     href: 'forms',
     label: 'Forms',
@@ -114,6 +131,36 @@ const SECTIONS: AdminSection[] = [
     description: 'Tenant-defined fields for clients, yachts, and reservations.',
     icon: Key,
   },
+    ],
+  },
+  {
+    title: 'Data Quality',
+    description: 'Cleanup, imports, and the audit trail.',
+    sections: [
+      {
+        href: 'duplicates',
+        label: 'Duplicates',
+        description: 'Review queue of suspected duplicate clients flagged by the dedup engine.',
+        icon: UsersRound,
+      },
+      {
+        href: 'import',
+        label: 'Bulk Import',
+        description: 'CSV-driven imports for clients, yachts, and reservations.',
+        icon: Upload,
+      },
+      {
+        href: 'audit',
+        label: 'Audit Log',
+        description: 'Searchable log of every authenticated mutation in the system.',
+        icon: ScrollText,
+      },
+    ],
+  },
+  {
+    title: 'Operations',
+    description: 'Health checks and disaster recovery.',
+    sections: [
   {
     href: 'reports',
     label: 'Reports',
@@ -126,18 +173,18 @@ const SECTIONS: AdminSection[] = [
     description: 'BullMQ queue health, throughput, and retry diagnostics.',
     icon: Database,
   },
-  {
-    href: 'import',
-    label: 'Bulk Import',
-    description: 'CSV-driven imports for clients, yachts, and reservations.',
-    icon: Upload,
-  },
   {
     href: 'backup',
     label: 'Backup & Restore',
     description: 'Database snapshots and on-demand exports.',
     icon: HardDrive,
   },
+    ],
+  },
+  {
+    title: 'Tenancy',
+    description: 'Multi-port and multi-install scaffolding.',
+    sections: [
   {
     href: 'ports',
     label: 'Ports',
@@ -150,12 +197,20 @@ const SECTIONS: AdminSection[] = [
     description: 'Initial-setup wizard for fresh ports.',
     icon: LayoutDashboard,
   },
+    ],
+  },
+  {
+    title: 'Integrations',
+    description: 'Third-party providers wired into the app.',
+    sections: [
   {
     href: 'ocr',
     label: 'Receipt OCR',
     description: 'Configure the AI provider used by the mobile receipt scanner.',
     icon: ScrollText,
   },
+    ],
+  },
 ];

 export default async function AdminLandingPage({
@@ -165,13 +220,21 @@ export default async function AdminLandingPage({
 }) {
   const { portSlug } = await params;
   return (
-    <div className="space-y-6">
+    <div className="space-y-8">
       <PageHeader
         title="Administration"
         description="Per-port configuration and system administration. Each card below opens a dedicated settings page."
       />
+      {GROUPS.map((group) => (
+        <section key={group.title} className="space-y-3">
+          <div>
+            <h2 className="text-xs font-semibold uppercase tracking-wider text-muted-foreground">
+              {group.title}
+            </h2>
+            <p className="text-xs text-muted-foreground/80">{group.description}</p>
+          </div>
       <div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
-        {SECTIONS.map((s) => {
+        {group.sections.map((s) => {
           const Icon = s.icon;
           return (
             <Link
@@ -195,6 +258,8 @@ export default async function AdminLandingPage({
           );
         })}
       </div>
+        </section>
+      ))}
     </div>
   );
 }

View File

@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { dismissHandler } from '../../handlers';
export const POST = withAuth(withPermission('clients', 'edit', dismissHandler));

View File

@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { confirmMergeHandler } from '../../handlers';
export const POST = withAuth(withPermission('clients', 'edit', confirmMergeHandler));

View File

@@ -0,0 +1,160 @@
import { NextResponse } from 'next/server';
import { and, eq, inArray } from 'drizzle-orm';
import type { AuthContext } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { clients, clientMergeCandidates } from '@/lib/db/schema/clients';
import { errorResponse, NotFoundError } from '@/lib/errors';
import {
listPendingMergeCandidates,
mergeClients,
type MergeFieldChoices,
} from '@/lib/services/client-merge.service';
/**
* GET /api/v1/admin/duplicates
*
* Pending merge candidates for the current port, sorted by score.
* Each row hydrates its two client summaries so the review-queue UI
* can render side-by-side cards without an N+1 fetch.
*/
export async function listHandler(_req: Request, ctx: AuthContext): Promise<NextResponse> {
try {
const pairs = await listPendingMergeCandidates(ctx.portId);
if (pairs.length === 0) return NextResponse.json({ data: [] });
const ids = Array.from(new Set(pairs.flatMap((p) => [p.clientAId, p.clientBId])));
const clientRows = await db
.select({
id: clients.id,
fullName: clients.fullName,
archivedAt: clients.archivedAt,
mergedIntoClientId: clients.mergedIntoClientId,
createdAt: clients.createdAt,
})
.from(clients)
.where(inArray(clients.id, ids));
const clientById = new Map(clientRows.map((c) => [c.id, c]));
const data = pairs
.map((p) => {
const a = clientById.get(p.clientAId);
const b = clientById.get(p.clientBId);
if (!a || !b) return null; // FK orphan — shouldn't happen, but be defensive
// Skip pairs where one side has already been merged or archived.
if (a.mergedIntoClientId || b.mergedIntoClientId) return null;
return {
id: p.id,
score: p.score,
reasons: p.reasons,
createdAt: p.createdAt,
clientA: { id: a.id, fullName: a.fullName, createdAt: a.createdAt },
clientB: { id: b.id, fullName: b.fullName, createdAt: b.createdAt },
};
})
.filter((row): row is NonNullable<typeof row> => row !== null);
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}
/**
* POST /api/v1/admin/duplicates/[id]/merge
*
* Body: { winnerId: string, fieldChoices?: MergeFieldChoices }
*
* Confirms a merge candidate. The winner is the one the user picked
* to keep; the other side becomes the loser. Calls into the merge
* service which is the only path that touches client_merge_log.
*/
export async function confirmMergeHandler(
req: Request,
ctx: AuthContext,
params: { id?: string },
): Promise<NextResponse> {
try {
const id = params.id ?? '';
const body = (await req.json().catch(() => ({}))) as {
winnerId?: string;
fieldChoices?: MergeFieldChoices;
};
if (!body.winnerId) {
return NextResponse.json({ error: 'winnerId required' }, { status: 400 });
}
const [candidate] = await db
.select()
.from(clientMergeCandidates)
.where(
and(
eq(clientMergeCandidates.id, id),
eq(clientMergeCandidates.portId, ctx.portId),
eq(clientMergeCandidates.status, 'pending'),
),
);
if (!candidate) throw new NotFoundError('Merge candidate');
const loserId =
body.winnerId === candidate.clientAId
? candidate.clientBId
: body.winnerId === candidate.clientBId
? candidate.clientAId
: null;
if (!loserId) {
return NextResponse.json(
{ error: 'winnerId must match one of the candidate clients' },
{ status: 400 },
);
}
const result = await mergeClients({
winnerId: body.winnerId,
loserId,
mergedBy: ctx.userId,
fieldChoices: body.fieldChoices,
});
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
}
/**
* POST /api/v1/admin/duplicates/[id]/dismiss
*
* Mark a merge candidate as dismissed. The background scoring job
* skips dismissed pairs on subsequent runs (a future score increase
* can re-create them).
*/
export async function dismissHandler(
_req: Request,
ctx: AuthContext,
params: { id?: string },
): Promise<NextResponse> {
try {
const id = params.id ?? '';
const result = await db
.update(clientMergeCandidates)
.set({
status: 'dismissed',
resolvedAt: new Date(),
resolvedBy: ctx.userId,
})
.where(
and(
eq(clientMergeCandidates.id, id),
eq(clientMergeCandidates.portId, ctx.portId),
eq(clientMergeCandidates.status, 'pending'),
),
)
.returning({ id: clientMergeCandidates.id });
if (result.length === 0) throw new NotFoundError('Merge candidate');
return NextResponse.json({ data: { id: result[0]!.id, status: 'dismissed' } });
} catch (error) {
return errorResponse(error);
}
}

View File

@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { listHandler } from './handlers';
export const GET = withAuth(withPermission('clients', 'view', listHandler));

View File

@@ -0,0 +1,160 @@
import { NextResponse } from 'next/server';
import { eq, inArray } from 'drizzle-orm';
import type { AuthContext } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { clients, clientContacts } from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { errorResponse } from '@/lib/errors';
import { findClientMatches, type MatchCandidate } from '@/lib/dedup/find-matches';
import { normalizeEmail, normalizeName, normalizePhone } from '@/lib/dedup/normalize';
import type { CountryCode } from '@/lib/i18n/countries';
/**
* GET /api/v1/clients/match-candidates
*
* Query parameters (any combination):
* email Free-text email; gets normalized server-side.
* phone Free-text phone; gets normalized to E.164 server-side.
* name Free-text full name; used for surname-token blocking.
* country Optional ISO country hint (default: AI for Port Nimara).
*
* Returns the top candidates that scored above the soft-warn threshold,
* each with a small client summary the form's suggestion card can
* render. Confidence tiers and rules are applied server-side from the
* port's `system_settings` (when wired) or sensible defaults otherwise.
*
* Used by `useDedupSuggestion` in the new-client form. Debounced on
* the client; this endpoint must be cheap (single port pool fetch +
* an in-memory dedup pass).
*/
export async function getMatchCandidatesHandler(
req: Request,
ctx: AuthContext,
): Promise<NextResponse> {
try {
const url = new URL(req.url);
const rawEmail = url.searchParams.get('email');
const rawPhone = url.searchParams.get('phone');
const rawName = url.searchParams.get('name');
const country = (url.searchParams.get('country') ?? 'AI') as CountryCode;
const email = rawEmail ? normalizeEmail(rawEmail) : null;
const phoneResult = rawPhone ? normalizePhone(rawPhone, country) : null;
const nameResult = rawName ? normalizeName(rawName) : null;
// If the caller didn't give us anything useful to match on, return empty
// — short-circuit rather than scan every client for nothing.
if (!email && !phoneResult?.e164 && !nameResult?.surnameToken) {
return NextResponse.json({ data: [] });
}
// Build the input candidate.
const input: MatchCandidate = {
id: '__incoming__',
fullName: nameResult?.display ?? null,
surnameToken: nameResult?.surnameToken ?? null,
emails: email ? [email] : [],
phonesE164: phoneResult?.e164 ? [phoneResult.e164] : [],
countryIso: country,
};
// Fetch the live pool for this port. We keep this O(N) over clients
// since the dedup library does its own blocking; for ports with
// thousands of clients we can later restrict by surname-token /
// contact lookups, but for current scale the simple full-pool fetch
// is fine.
const liveClients = await db
.select({
id: clients.id,
fullName: clients.fullName,
nationalityIso: clients.nationalityIso,
})
.from(clients)
.where(eq(clients.portId, ctx.portId));
if (liveClients.length === 0) {
return NextResponse.json({ data: [] });
}
const clientIds = liveClients.map((c) => c.id);
const contactRows = await db
.select({
clientId: clientContacts.clientId,
channel: clientContacts.channel,
value: clientContacts.value,
valueE164: clientContacts.valueE164,
})
.from(clientContacts)
.where(inArray(clientContacts.clientId, clientIds));
// Group contacts by client for the candidate map.
const emailsByClient = new Map<string, string[]>();
const phonesByClient = new Map<string, string[]>();
for (const c of contactRows) {
if (c.channel === 'email') {
const arr = emailsByClient.get(c.clientId) ?? [];
arr.push(c.value.toLowerCase());
emailsByClient.set(c.clientId, arr);
} else if (c.channel === 'phone' || c.channel === 'whatsapp') {
if (c.valueE164) {
const arr = phonesByClient.get(c.clientId) ?? [];
arr.push(c.valueE164);
phonesByClient.set(c.clientId, arr);
}
}
}
const pool: MatchCandidate[] = liveClients.map((c) => {
const named = normalizeName(c.fullName);
return {
id: c.id,
fullName: c.fullName,
surnameToken: named.surnameToken ?? null,
emails: emailsByClient.get(c.id) ?? [],
phonesE164: phonesByClient.get(c.id) ?? [],
countryIso: (c.nationalityIso as CountryCode | null) ?? null,
};
});
const matches = findClientMatches(input, pool, {
highScore: 90,
mediumScore: 50,
});
// Only return medium+ — low-confidence noise isn't useful at the
// create-form layer (background scoring queue picks those up).
const useful = matches.filter((m) => m.confidence !== 'low');
if (useful.length === 0) {
return NextResponse.json({ data: [] });
}
// Pull a quick summary for each surfaced candidate so the suggestion
// card has enough to render ("Marcus Laurent · 2 interests"). Today
// that summary is just the interest count.
const summarizedIds = useful.map((m) => m.candidate.id);
const interestCounts = await db
.select({ clientId: interests.clientId })
.from(interests)
.where(inArray(interests.clientId, summarizedIds));
const interestsByClient = new Map<string, number>();
for (const r of interestCounts) {
interestsByClient.set(r.clientId, (interestsByClient.get(r.clientId) ?? 0) + 1);
}
const data = useful.map((m) => ({
clientId: m.candidate.id,
fullName: m.candidate.fullName,
score: m.score,
confidence: m.confidence,
reasons: m.reasons,
interestCount: interestsByClient.get(m.candidate.id) ?? 0,
emails: m.candidate.emails,
phonesE164: m.candidate.phonesE164,
}));
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}
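For orientation, a minimal sketch of how the client side is assumed to call this endpoint. The URL matches the panel wired in below; the query values and the reason string are invented:

// Sketch only; runs inside an async function in the form's data hook.
const params = new URLSearchParams({ email: 'm.laurent@example.com', name: 'Marcus Laurent' });
const res = await fetch(`/api/v1/clients/match-candidates?${params}`);
const { data } = (await res.json()) as {
  data: Array<{
    clientId: string;
    fullName: string | null;
    score: number;
    confidence: 'high' | 'medium'; // low is filtered out server-side
    reasons: string[];
    interestCount: number;
    emails: string[];
    phonesE164: string[];
  }>;
};
// data[0] might look like { fullName: 'Marcus Laurent', score: 95,
// confidence: 'high', reasons: ['email exact'], interestCount: 2, ... }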

View File

@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getMatchCandidatesHandler } from './handlers';
export const GET = withAuth(withPermission('clients', 'view', getMatchCandidatesHandler));
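The two wrappers come from the app's own helpers and are not part of this diff. A hypothetical sketch of the composition, just to show the order of checks (AuthContext resolution first, then the resource/action gate, then the handler; every name inside is a guess):

// Hypothetical shapes only; the real helpers live in @/lib/api/helpers.
// Assumes Next.js NextResponse and the app's AuthContext type are in scope.
type Handler = (req: Request, ctx: AuthContext) => Promise<NextResponse>;
function withPermissionSketch(resource: string, action: string, handler: Handler): Handler {
  return async (req, ctx) => {
    // Assumed permission check; the actual AuthContext API may differ.
    if (!ctx.can?.(resource, action)) {
      return NextResponse.json({ error: 'Forbidden' }, { status: 403 });
    }
    return handler(req, ctx);
  };
}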

View File

@@ -0,0 +1,215 @@
'use client';
import { useState } from 'react';
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import { ArrowRight, GitMerge, X } from 'lucide-react';
import { toast } from 'sonner';
import { Button } from '@/components/ui/button';
import { PageHeader } from '@/components/shared/page-header';
import { EmptyState } from '@/components/shared/empty-state';
import { Skeleton } from '@/components/ui/skeleton';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
interface CandidatePair {
id: string;
score: number;
reasons: string[];
createdAt: string;
clientA: { id: string; fullName: string; createdAt: string };
clientB: { id: string; fullName: string; createdAt: string };
}
/**
* Admin review queue for the dedup background scoring job.
*
* Lists every pending merge candidate (pairs where score >=
* `dedup_review_queue_threshold`). For each pair the admin can:
* - Pick a winner via the side-by-side card → confirms a merge
* - Dismiss → removes from the queue (a future score increase
* re-creates the pair on the next scoring run)
*
* Only minimal merge UI here: the user picks which side is the winner
* (no per-field choice), and the loser archives. A richer side-by-side
* field-merge dialog is a future enhancement.
*/
export function DuplicatesReviewQueue() {
const queryClient = useQueryClient();
const { data, isLoading } = useQuery<{ data: CandidatePair[] }>({
queryKey: ['admin', 'duplicates'],
queryFn: () => apiFetch<{ data: CandidatePair[] }>('/api/v1/admin/duplicates'),
});
const pairs = data?.data ?? [];
return (
<div className="space-y-4">
<PageHeader
title="Duplicate clients"
description={
pairs.length === 0
? 'No pending pairs to review.'
: `${pairs.length} pair${pairs.length === 1 ? '' : 's'} flagged for review.`
}
/>
{isLoading ? (
<div className="space-y-3">
{[0, 1, 2].map((i) => (
<Skeleton key={i} className="h-32 w-full" />
))}
</div>
) : pairs.length === 0 ? (
<EmptyState
title="All clear"
description="The background scoring job hasn't surfaced any potential duplicates yet."
/>
) : (
<ul className="space-y-3">
{pairs.map((pair) => (
<li key={pair.id}>
<CandidateRow pair={pair} queryClient={queryClient} />
</li>
))}
</ul>
)}
</div>
);
}
function CandidateRow({
pair,
queryClient,
}: {
pair: CandidatePair;
queryClient: ReturnType<typeof useQueryClient>;
}) {
const [busy, setBusy] = useState<'merge' | 'dismiss' | null>(null);
const [winnerId, setWinnerId] = useState<string>(pair.clientA.id);
const mergeMutation = useMutation({
mutationFn: () =>
apiFetch(`/api/v1/admin/duplicates/${pair.id}/merge`, {
method: 'POST',
body: { winnerId },
}),
onSuccess: () => {
const loserName =
winnerId === pair.clientA.id ? pair.clientB.fullName : pair.clientA.fullName;
const winnerName =
winnerId === pair.clientA.id ? pair.clientA.fullName : pair.clientB.fullName;
toast.success(`Merged "${loserName}" into "${winnerName}"`);
queryClient.invalidateQueries({ queryKey: ['admin', 'duplicates'] });
queryClient.invalidateQueries({ queryKey: ['clients'] });
},
onError: (err) => toast.error(err instanceof Error ? err.message : 'Merge failed'),
onSettled: () => setBusy(null),
});
const dismissMutation = useMutation({
mutationFn: () => apiFetch(`/api/v1/admin/duplicates/${pair.id}/dismiss`, { method: 'POST' }),
onSuccess: () => {
toast.message('Dismissed');
queryClient.invalidateQueries({ queryKey: ['admin', 'duplicates'] });
},
onError: (err) => toast.error(err instanceof Error ? err.message : 'Dismiss failed'),
onSettled: () => setBusy(null),
});
return (
<div className="rounded-lg border bg-card p-4">
<div className="mb-3 flex items-baseline justify-between gap-3">
<div>
<span className="rounded-full bg-muted px-2 py-0.5 text-[10px] font-medium uppercase tracking-wide text-muted-foreground">
score {pair.score}
</span>{' '}
<span className="text-xs text-muted-foreground">{pair.reasons.join(' · ')}</span>
</div>
<span className="text-xs text-muted-foreground">
flagged {new Date(pair.createdAt).toLocaleDateString()}
</span>
</div>
<div className="grid gap-3 sm:grid-cols-[1fr_auto_1fr]">
<ClientCard
client={pair.clientA}
isSelected={winnerId === pair.clientA.id}
onSelect={() => setWinnerId(pair.clientA.id)}
/>
<div className="flex items-center justify-center text-muted-foreground">
<ArrowRight className="size-4" aria-hidden />
</div>
<ClientCard
client={pair.clientB}
isSelected={winnerId === pair.clientB.id}
onSelect={() => setWinnerId(pair.clientB.id)}
/>
</div>
<div className="mt-3 flex flex-wrap items-center gap-2">
<Button
size="sm"
onClick={() => {
setBusy('merge');
mergeMutation.mutate();
}}
disabled={busy !== null}
>
<GitMerge className="mr-1 size-3.5" aria-hidden />
Merge into selected
</Button>
<Button
size="sm"
variant="ghost"
onClick={() => {
setBusy('dismiss');
dismissMutation.mutate();
}}
disabled={busy !== null}
>
<X className="mr-1 size-3.5" aria-hidden />
Dismiss
</Button>
<p className="text-xs text-muted-foreground">
The unselected card becomes the loser; its interests + contacts move to the selected
client and the original is archived.
</p>
</div>
</div>
);
}
function ClientCard({
client,
isSelected,
onSelect,
}: {
client: CandidatePair['clientA'];
isSelected: boolean;
onSelect: () => void;
}) {
return (
<button
type="button"
onClick={onSelect}
className={cn(
'rounded-md border p-3 text-left transition-colors',
isSelected
? 'border-primary bg-primary/5 ring-1 ring-primary/30'
: 'border-border hover:bg-muted/40',
)}
>
<p className="text-sm font-medium">{client.fullName}</p>
<p className="mt-0.5 text-[11px] text-muted-foreground">
Created {new Date(client.createdAt).toLocaleDateString()}
</p>
{isSelected ? (
<span className="mt-1 inline-block rounded-full bg-primary/10 px-1.5 py-0.5 text-[10px] font-semibold text-primary">
KEEP
</span>
) : null}
</button>
);
}
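The merge endpoint's implementation is not visible in this commit. Based on the copy above and the merged_into_client_id column added in a migration further down, its core is assumed to look roughly like this (a sketch, not the actual handler; table and column names follow the Drizzle schema used elsewhere in this commit):

// Sketch: re-point the loser's children at the winner, then archive the loser.
async function mergeClients(winnerId: string, loserId: string): Promise<void> {
  await db.transaction(async (tx) => {
    await tx.update(interests).set({ clientId: winnerId }).where(eq(interests.clientId, loserId));
    await tx
      .update(clientContacts)
      .set({ clientId: winnerId })
      .where(eq(clientContacts.clientId, loserId));
    // Assumed camelCase mapping of the merged_into_client_id column.
    await tx.update(clients).set({ mergedIntoClientId: winnerId }).where(eq(clients.id, loserId));
  });
}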

View File

@@ -241,7 +241,14 @@ export function SettingsManager() {
</CardHeader> </CardHeader>
<CardContent className="space-y-4"> <CardContent className="space-y-4">
{KNOWN_SETTINGS.filter((s) => s.type === 'string').map((setting) => ( {KNOWN_SETTINGS.filter((s) => s.type === 'string').map((setting) => (
<div key={setting.key} className="flex items-center justify-between gap-4"> <div
key={setting.key}
// Stack label/description above the input on phone widths.
// The previous flex row crushed the label column into a
// narrow vertical stripe ("Inquiry / Contact / Email" wrapping
// one word per line) while the input took the rest.
className="flex flex-col gap-2 sm:flex-row sm:items-center sm:justify-between sm:gap-4"
>
<div className="flex-1"> <div className="flex-1">
<Label>{setting.label}</Label> <Label>{setting.label}</Label>
<p className="text-xs text-muted-foreground">{setting.description}</p> <p className="text-xs text-muted-foreground">{setting.description}</p>
@@ -249,7 +256,7 @@ export function SettingsManager() {
<div className="flex items-center gap-2"> <div className="flex items-center gap-2">
<Input <Input
type="text" type="text"
className="w-64" className="w-full sm:w-64"
value={String(getEffectiveValue(setting.key, setting.defaultValue) ?? '')} value={String(getEffectiveValue(setting.key, setting.defaultValue) ?? '')}
onChange={(e) => onChange={(e) =>
setValues((prev) => ({ setValues((prev) => ({
@@ -283,7 +290,10 @@ export function SettingsManager() {
</CardHeader> </CardHeader>
<CardContent className="space-y-4"> <CardContent className="space-y-4">
{KNOWN_SETTINGS.filter((s) => s.type === 'number').map((setting) => ( {KNOWN_SETTINGS.filter((s) => s.type === 'number').map((setting) => (
<div key={setting.key} className="flex items-center justify-between gap-4"> <div
key={setting.key}
className="flex flex-col gap-2 sm:flex-row sm:items-center sm:justify-between sm:gap-4"
>
<div className="flex-1"> <div className="flex-1">
<Label>{setting.label}</Label> <Label>{setting.label}</Label>
<p className="text-xs text-muted-foreground">{setting.description}</p> <p className="text-xs text-muted-foreground">{setting.description}</p>
@@ -291,7 +301,7 @@ export function SettingsManager() {
<div className="flex items-center gap-2"> <div className="flex items-center gap-2">
<Input <Input
type="number" type="number"
className="w-24" className="w-full sm:w-24"
value={String(getEffectiveValue(setting.key, setting.defaultValue) ?? '')} value={String(getEffectiveValue(setting.key, setting.defaultValue) ?? '')}
onChange={(e) => onChange={(e) =>
setValues((prev) => ({ setValues((prev) => ({

View File

@@ -44,6 +44,17 @@ type BerthDetailData = {
draftFt: string | null; draftFt: string | null;
draftM: string | null; draftM: string | null;
widthIsMinimum: boolean | null; widthIsMinimum: boolean | null;
nominalBoatSize: string | null;
nominalBoatSizeM: string | null;
waterDepth: string | null;
waterDepthM: string | null;
waterDepthIsMinimum: boolean | null;
sidePontoon: string | null;
cleatType: string | null;
cleatCapacity: string | null;
bollardType: string | null;
bollardCapacity: string | null;
bowFacing: string | null;
price: string | null; price: string | null;
priceCurrency: string; priceCurrency: string;
tenureType: string; tenureType: string;
@@ -167,7 +178,10 @@ export function BerthDetailHeader({ berth }: BerthDetailHeaderProps) {
return ( return (
<> <>
<DetailHeaderStrip> <DetailHeaderStrip>
<div className="flex items-start gap-4"> {/* Stacks vertically on phone widths so the action buttons don't
squeeze the area subtitle into a two-line wrap. From sm up the
title/area block sits side-by-side with the action buttons. */}
<div className="flex flex-col gap-3 sm:flex-row sm:items-start sm:gap-4">
<div className="flex-1 min-w-0"> <div className="flex-1 min-w-0">
<div className="flex items-center gap-3 flex-wrap"> <div className="flex items-center gap-3 flex-wrap">
<h1 className="hidden sm:block text-2xl font-bold text-foreground"> <h1 className="hidden sm:block text-2xl font-bold text-foreground">
@@ -182,7 +196,7 @@ export function BerthDetailHeader({ berth }: BerthDetailHeaderProps) {
{berth.area && <p className="text-muted-foreground mt-1">{berth.area}</p>} {berth.area && <p className="text-muted-foreground mt-1">{berth.area}</p>}
</div> </div>
<div className="flex flex-wrap items-center gap-2 shrink-0"> <div className="flex flex-wrap items-center gap-2 sm:shrink-0">
<PermissionGate resource="berths" action="edit"> <PermissionGate resource="berths" action="edit">
<Button variant="outline" size="sm" onClick={() => setStatusOpen(true)}> <Button variant="outline" size="sm" onClick={() => setStatusOpen(true)}>
<RefreshCw className="mr-1.5 h-4 w-4" /> <RefreshCw className="mr-1.5 h-4 w-4" />

View File

@@ -16,18 +16,22 @@ import {
SelectTrigger, SelectTrigger,
SelectValue, SelectValue,
} from '@/components/ui/select'; } from '@/components/ui/select';
import { import { Sheet, SheetContent, SheetHeader, SheetTitle, SheetFooter } from '@/components/ui/sheet';
Sheet,
SheetContent,
SheetHeader,
SheetTitle,
SheetFooter,
} from '@/components/ui/sheet';
import { Separator } from '@/components/ui/separator'; import { Separator } from '@/components/ui/separator';
import { Switch } from '@/components/ui/switch'; import { Switch } from '@/components/ui/switch';
import { TagPicker } from '@/components/shared/tag-picker'; import { TagPicker } from '@/components/shared/tag-picker';
import { apiFetch } from '@/lib/api/client'; import { apiFetch } from '@/lib/api/client';
import { updateBerthSchema, type UpdateBerthInput } from '@/lib/validators/berths'; import { updateBerthSchema, type UpdateBerthInput } from '@/lib/validators/berths';
import {
BERTH_AREAS,
BERTH_SIDE_PONTOON_OPTIONS,
BERTH_MOORING_TYPES,
BERTH_CLEAT_TYPES,
BERTH_CLEAT_CAPACITIES,
BERTH_BOLLARD_TYPES,
BERTH_BOLLARD_CAPACITIES,
BERTH_ACCESS_OPTIONS,
} from '@/lib/constants';
interface BerthFormProps { interface BerthFormProps {
berth: { berth: {
@@ -42,16 +46,27 @@ interface BerthFormProps {
draftFt: string | null; draftFt: string | null;
draftM: string | null; draftM: string | null;
widthIsMinimum: boolean | null; widthIsMinimum: boolean | null;
nominalBoatSize: string | null;
nominalBoatSizeM: string | null;
waterDepth: string | null;
waterDepthM: string | null;
waterDepthIsMinimum: boolean | null;
sidePontoon: string | null;
powerCapacity: string | null;
voltage: string | null;
mooringType: string | null;
cleatType: string | null;
cleatCapacity: string | null;
bollardType: string | null;
bollardCapacity: string | null;
access: string | null;
bowFacing: string | null;
price: string | null; price: string | null;
priceCurrency: string; priceCurrency: string;
tenureType: string; tenureType: string;
tenureYears: number | null; tenureYears: number | null;
tenureStartDate: string | null; tenureStartDate: string | null;
tenureEndDate: string | null; tenureEndDate: string | null;
powerCapacity: string | null;
voltage: string | null;
mooringType: string | null;
access: string | null;
berthApproved: boolean | null; berthApproved: boolean | null;
tags: Array<{ id: string; name: string; color: string }>; tags: Array<{ id: string; name: string; color: string }>;
}; };
@@ -59,10 +74,42 @@ interface BerthFormProps {
onOpenChange: (open: boolean) => void; onOpenChange: (open: boolean) => void;
} }
/** Optional select that allows clearing back to "no value". */
function SelectOrEmpty({
value,
onChange,
options,
placeholder = 'Select…',
}: {
value: string | undefined;
onChange: (next: string | undefined) => void;
options: readonly string[];
placeholder?: string;
}) {
const NONE = '__none';
return (
<Select value={value ?? NONE} onValueChange={(v) => onChange(v === NONE ? undefined : v)}>
<SelectTrigger>
<SelectValue placeholder={placeholder} />
</SelectTrigger>
<SelectContent>
{/* Explicit "clear" row; an empty item would render as an invisible sliver. */}
<SelectItem value={NONE}>None</SelectItem>
{options.map((opt) => (
<SelectItem key={opt} value={opt}>
{opt}
</SelectItem>
))}
</SelectContent>
</Select>
);
}
export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) { export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
const queryClient = useQueryClient(); const queryClient = useQueryClient();
const [tagIds, setTagIds] = useState<string[]>(berth.tags.map((t) => t.id)); const [tagIds, setTagIds] = useState<string[]>(berth.tags.map((t) => t.id));
const numOrUndef = (v: string | null) => (v != null && v !== '' ? Number(v) : undefined);
const { const {
register, register,
handleSubmit, handleSubmit,
@@ -73,23 +120,34 @@ export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
resolver: zodResolver(updateBerthSchema), resolver: zodResolver(updateBerthSchema),
defaultValues: { defaultValues: {
area: berth.area ?? undefined, area: berth.area ?? undefined,
lengthFt: berth.lengthFt ? Number(berth.lengthFt) : undefined, lengthFt: numOrUndef(berth.lengthFt),
lengthM: berth.lengthM ? Number(berth.lengthM) : undefined, lengthM: numOrUndef(berth.lengthM),
widthFt: berth.widthFt ? Number(berth.widthFt) : undefined, widthFt: numOrUndef(berth.widthFt),
widthM: berth.widthM ? Number(berth.widthM) : undefined, widthM: numOrUndef(berth.widthM),
draftFt: berth.draftFt ? Number(berth.draftFt) : undefined, draftFt: numOrUndef(berth.draftFt),
draftM: berth.draftM ? Number(berth.draftM) : undefined, draftM: numOrUndef(berth.draftM),
widthIsMinimum: berth.widthIsMinimum ?? false, widthIsMinimum: berth.widthIsMinimum ?? false,
price: berth.price ? Number(berth.price) : undefined, nominalBoatSize: numOrUndef(berth.nominalBoatSize),
nominalBoatSizeM: numOrUndef(berth.nominalBoatSizeM),
waterDepth: numOrUndef(berth.waterDepth),
waterDepthM: numOrUndef(berth.waterDepthM),
waterDepthIsMinimum: berth.waterDepthIsMinimum ?? false,
sidePontoon: berth.sidePontoon ?? undefined,
powerCapacity: numOrUndef(berth.powerCapacity),
voltage: numOrUndef(berth.voltage),
mooringType: berth.mooringType ?? undefined,
cleatType: berth.cleatType ?? undefined,
cleatCapacity: berth.cleatCapacity ?? undefined,
bollardType: berth.bollardType ?? undefined,
bollardCapacity: berth.bollardCapacity ?? undefined,
access: berth.access ?? undefined,
bowFacing: berth.bowFacing ?? undefined,
price: numOrUndef(berth.price),
priceCurrency: berth.priceCurrency, priceCurrency: berth.priceCurrency,
tenureType: berth.tenureType as 'permanent' | 'fixed_term', tenureType: berth.tenureType as 'permanent' | 'fixed_term',
tenureYears: berth.tenureYears ?? undefined, tenureYears: berth.tenureYears ?? undefined,
tenureStartDate: berth.tenureStartDate ?? undefined, tenureStartDate: berth.tenureStartDate ?? undefined,
tenureEndDate: berth.tenureEndDate ?? undefined, tenureEndDate: berth.tenureEndDate ?? undefined,
powerCapacity: berth.powerCapacity ?? undefined,
voltage: berth.voltage ?? undefined,
mooringType: berth.mooringType ?? undefined,
access: berth.access ?? undefined,
berthApproved: berth.berthApproved ?? false, berthApproved: berth.berthApproved ?? false,
}, },
}); });
@@ -120,6 +178,14 @@ export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
} }
const tenureType = watch('tenureType'); const tenureType = watch('tenureType');
const area = watch('area');
const sidePontoon = watch('sidePontoon');
const mooringType = watch('mooringType');
const cleatType = watch('cleatType');
const cleatCapacity = watch('cleatCapacity');
const bollardType = watch('bollardType');
const bollardCapacity = watch('bollardCapacity');
const access = watch('access');
return ( return (
<Sheet open={open} onOpenChange={onOpenChange}> <Sheet open={open} onOpenChange={onOpenChange}>
@@ -136,18 +202,18 @@ export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
</h3> </h3>
<div className="grid grid-cols-2 gap-4"> <div className="grid grid-cols-2 gap-4">
<div className="space-y-2"> <div className="space-y-2">
<Label htmlFor="area">Area</Label> <Label>Area</Label>
<Input id="area" {...register('area')} placeholder="e.g. Marina A" /> <SelectOrEmpty
value={area}
onChange={(v) => setValue('area', v)}
options={BERTH_AREAS}
/>
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label htmlFor="mooringType">Mooring Type</Label> <Label htmlFor="bowFacing">Bow Facing</Label>
<Input id="mooringType" {...register('mooringType')} /> <Input id="bowFacing" {...register('bowFacing')} placeholder="e.g. East" />
</div> </div>
</div> </div>
<div className="space-y-2">
<Label htmlFor="access">Access</Label>
<Input id="access" {...register('access')} />
</div>
<div className="flex items-center gap-2"> <div className="flex items-center gap-2">
<Switch <Switch
id="berthApproved" id="berthApproved"
@@ -168,29 +234,46 @@ export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
<div className="grid grid-cols-2 gap-4"> <div className="grid grid-cols-2 gap-4">
<div className="space-y-2"> <div className="space-y-2">
<Label>Length (ft)</Label> <Label>Length (ft)</Label>
<Input type="number" step="0.1" {...register('lengthFt')} /> <Input type="number" step="0.01" {...register('lengthFt')} />
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label>Length (m)</Label> <Label>Length (m)</Label>
<Input type="number" step="0.1" {...register('lengthM')} /> <Input type="number" step="0.01" {...register('lengthM')} />
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label>Width (ft)</Label> <Label>Width (ft)</Label>
<Input type="number" step="0.1" {...register('widthFt')} /> <Input type="number" step="0.01" {...register('widthFt')} />
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label>Width (m)</Label> <Label>Width (m)</Label>
<Input type="number" step="0.1" {...register('widthM')} /> <Input type="number" step="0.01" {...register('widthM')} />
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label>Draft (ft)</Label> <Label>Draft (ft)</Label>
<Input type="number" step="0.1" {...register('draftFt')} /> <Input type="number" step="0.01" {...register('draftFt')} />
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label>Draft (m)</Label> <Label>Draft (m)</Label>
<Input type="number" step="0.1" {...register('draftM')} /> <Input type="number" step="0.01" {...register('draftM')} />
</div>
<div className="space-y-2">
<Label>Nominal Boat Size (ft)</Label>
<Input type="number" step="1" {...register('nominalBoatSize')} />
</div>
<div className="space-y-2">
<Label>Nominal Boat Size (m)</Label>
<Input type="number" step="0.01" {...register('nominalBoatSizeM')} />
</div>
<div className="space-y-2">
<Label>Water Depth (ft)</Label>
<Input type="number" step="0.01" {...register('waterDepth')} />
</div>
<div className="space-y-2">
<Label>Water Depth (m)</Label>
<Input type="number" step="0.01" {...register('waterDepthM')} />
</div> </div>
</div> </div>
<div className="flex flex-wrap items-center gap-x-6 gap-y-2">
<div className="flex items-center gap-2"> <div className="flex items-center gap-2">
<Switch <Switch
id="widthIsMinimum" id="widthIsMinimum"
@@ -199,6 +282,107 @@ export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
/> />
<Label htmlFor="widthIsMinimum">Width is minimum</Label> <Label htmlFor="widthIsMinimum">Width is minimum</Label>
</div> </div>
<div className="flex items-center gap-2">
<Switch
id="waterDepthIsMinimum"
checked={watch('waterDepthIsMinimum') ?? false}
onCheckedChange={(v) => setValue('waterDepthIsMinimum', v)}
/>
<Label htmlFor="waterDepthIsMinimum">Water depth is minimum</Label>
</div>
</div>
</div>
<Separator />
{/* Mooring & Hardware */}
<div className="space-y-4">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wider">
Mooring &amp; Hardware
</h3>
<div className="space-y-2">
<Label>Side Pontoon</Label>
<SelectOrEmpty
value={sidePontoon}
onChange={(v) => setValue('sidePontoon', v)}
options={BERTH_SIDE_PONTOON_OPTIONS}
/>
</div>
<div className="space-y-2">
<Label>Mooring Type</Label>
<SelectOrEmpty
value={mooringType}
onChange={(v) => setValue('mooringType', v)}
options={BERTH_MOORING_TYPES}
/>
</div>
<div className="grid grid-cols-2 gap-4">
<div className="space-y-2">
<Label>Cleat Type</Label>
<SelectOrEmpty
value={cleatType}
onChange={(v) => setValue('cleatType', v)}
options={BERTH_CLEAT_TYPES}
/>
</div>
<div className="space-y-2">
<Label>Cleat Capacity</Label>
<SelectOrEmpty
value={cleatCapacity}
onChange={(v) => setValue('cleatCapacity', v)}
options={BERTH_CLEAT_CAPACITIES}
/>
</div>
<div className="space-y-2">
<Label>Bollard Type</Label>
<SelectOrEmpty
value={bollardType}
onChange={(v) => setValue('bollardType', v)}
options={BERTH_BOLLARD_TYPES}
/>
</div>
<div className="space-y-2">
<Label>Bollard Capacity</Label>
<SelectOrEmpty
value={bollardCapacity}
onChange={(v) => setValue('bollardCapacity', v)}
options={BERTH_BOLLARD_CAPACITIES}
/>
</div>
</div>
</div>
<Separator />
{/* Power */}
<div className="space-y-4">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wider">
Power
</h3>
<div className="grid grid-cols-2 gap-4">
<div className="space-y-2">
<Label>Power Capacity (kW)</Label>
<Input type="number" step="1" {...register('powerCapacity')} />
</div>
<div className="space-y-2">
<Label>Voltage (V at 60 Hz)</Label>
<Input type="number" step="1" {...register('voltage')} />
</div>
</div>
</div>
<Separator />
{/* Access */}
<div className="space-y-4">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wider">
Access
</h3>
<SelectOrEmpty
value={access}
onChange={(v) => setValue('access', v)}
options={BERTH_ACCESS_OPTIONS}
/>
</div> </div>
<Separator /> <Separator />
@@ -262,25 +446,6 @@ export function BerthForm({ berth, open, onOpenChange }: BerthFormProps) {
<Separator /> <Separator />
{/* Infrastructure */}
<div className="space-y-4">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wider">
Infrastructure
</h3>
<div className="grid grid-cols-2 gap-4">
<div className="space-y-2">
<Label>Power Capacity</Label>
<Input {...register('powerCapacity')} />
</div>
<div className="space-y-2">
<Label>Voltage</Label>
<Input {...register('voltage')} />
</div>
</div>
</div>
<Separator />
{/* Tags */} {/* Tags */}
<div className="space-y-4"> <div className="space-y-4">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wider"> <h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wider">

View File

@@ -48,22 +48,57 @@ type BerthData = {
function SpecRow({ label, value }: { label: string; value: React.ReactNode }) { function SpecRow({ label, value }: { label: string; value: React.ReactNode }) {
if (!value && value !== 0 && value !== false) return null; if (!value && value !== 0 && value !== false) return null;
// Mobile-first: stack vertically with label on top so long values
// (e.g. "206.69 ft / 62.99 m") never clip at the right edge.
// From `sm` (>=640px) up: switch to the original two-column layout.
return ( return (
<div className="flex justify-between py-2 text-sm"> <div className="flex flex-col gap-0.5 py-2 text-sm sm:flex-row sm:items-baseline sm:justify-between sm:gap-3">
<span className="text-muted-foreground">{label}</span> <span className="text-muted-foreground">{label}</span>
<span className="font-medium text-right max-w-[60%]">{value}</span> <span className="font-medium sm:max-w-[60%] sm:text-right">{value}</span>
</div> </div>
); );
} }
function OverviewTab({ berth }: { berth: BerthData }) { function OverviewTab({ berth }: { berth: BerthData }) {
// Round to at most 2 decimals; trim trailing zeros so "5.00" -> "5".
const fmt = (v: string | null, fractionDigits = 2): string | null => {
if (v == null || v === '') return null;
const n = Number(v);
if (Number.isNaN(n)) return v;
return n.toLocaleString('en-US', {
minimumFractionDigits: 0,
maximumFractionDigits: fractionDigits,
});
};
const formatDim = (ft: string | null, m: string | null) => { const formatDim = (ft: string | null, m: string | null) => {
const parts = []; const parts = [];
if (ft) parts.push(`${ft} ft`); const ftFmt = fmt(ft);
if (m) parts.push(`${m} m`); const mFmt = fmt(m);
if (ftFmt) parts.push(`${ftFmt} ft`);
if (mFmt) parts.push(`${mFmt} m`);
return parts.length > 0 ? parts.join(' / ') : null; return parts.length > 0 ? parts.join(' / ') : null;
}; };
const formatNominalBoatSize = (ft: string | null, m: string | null): string | null => {
const ftFmt = fmt(ft, 0);
const mFmt = fmt(m);
const parts: string[] = [];
if (ftFmt) parts.push(`${ftFmt} ft`);
if (mFmt) parts.push(`${mFmt} m`);
return parts.length > 0 ? parts.join(' / ') : null;
};
const formatPower = (kw: string | null) => {
const v = fmt(kw, 0);
return v ? `${v} kW` : null;
};
const formatVoltage = (v: string | null) => {
const fv = fmt(v, 0);
return fv ? `${fv} V` : null;
};
const price = berth.price const price = berth.price
? new Intl.NumberFormat('en-US', { ? new Intl.NumberFormat('en-US', {
style: 'currency', style: 'currency',
@@ -97,7 +132,7 @@ function OverviewTab({ berth }: { berth: BerthData }) {
<SpecRow label="Draft" value={formatDim(berth.draftFt, berth.draftM)} /> <SpecRow label="Draft" value={formatDim(berth.draftFt, berth.draftM)} />
<SpecRow <SpecRow
label="Nominal Boat Size" label="Nominal Boat Size"
value={berth.nominalBoatSize || berth.nominalBoatSizeM} value={formatNominalBoatSize(berth.nominalBoatSize, berth.nominalBoatSizeM)}
/> />
<SpecRow <SpecRow
label="Water Depth" label="Water Depth"
@@ -122,8 +157,8 @@ function OverviewTab({ berth }: { berth: BerthData }) {
<CardTitle className="text-sm font-medium">Infrastructure</CardTitle> <CardTitle className="text-sm font-medium">Infrastructure</CardTitle>
</CardHeader> </CardHeader>
<CardContent className="pt-0 divide-y"> <CardContent className="pt-0 divide-y">
<SpecRow label="Power Capacity" value={berth.powerCapacity} /> <SpecRow label="Power Capacity" value={formatPower(berth.powerCapacity)} />
<SpecRow label="Voltage" value={berth.voltage} /> <SpecRow label="Voltage" value={formatVoltage(berth.voltage)} />
<SpecRow label="Cleat Type" value={berth.cleatType} /> <SpecRow label="Cleat Type" value={berth.cleatType} />
<SpecRow label="Cleat Capacity" value={berth.cleatCapacity} /> <SpecRow label="Cleat Capacity" value={berth.cleatCapacity} />
<SpecRow label="Bollard Type" value={berth.bollardType} /> <SpecRow label="Bollard Type" value={berth.bollardType} />

View File

@@ -23,6 +23,7 @@ import { TagPicker } from '@/components/shared/tag-picker';
import { CountryCombobox } from '@/components/shared/country-combobox'; import { CountryCombobox } from '@/components/shared/country-combobox';
import { TimezoneCombobox } from '@/components/shared/timezone-combobox'; import { TimezoneCombobox } from '@/components/shared/timezone-combobox';
import { PhoneInput } from '@/components/shared/phone-input'; import { PhoneInput } from '@/components/shared/phone-input';
import { DedupSuggestionPanel } from '@/components/clients/dedup-suggestion-panel';
import { apiFetch } from '@/lib/api/client'; import { apiFetch } from '@/lib/api/client';
import { createClientSchema, type CreateClientInput } from '@/lib/validators/clients'; import { createClientSchema, type CreateClientInput } from '@/lib/validators/clients';
import type { CountryCode } from '@/lib/i18n/countries'; import type { CountryCode } from '@/lib/i18n/countries';
@@ -30,6 +31,12 @@ import type { CountryCode } from '@/lib/i18n/countries';
interface ClientFormProps { interface ClientFormProps {
open: boolean; open: boolean;
onOpenChange: (open: boolean) => void; onOpenChange: (open: boolean) => void;
/** Optional callback fired when the dedup suggestion panel reports
* the user picked an existing client. The form closes; parent is
* responsible for navigating to the existing client's detail page
* or opening the create-interest dialog pre-filled with that
* clientId. Skipped in edit mode. */
onUseExistingClient?: (clientId: string) => void;
/** If provided, form is in edit mode */ /** If provided, form is in edit mode */
client?: { client?: {
id: string; id: string;
@@ -53,7 +60,7 @@ interface ClientFormProps {
}; };
} }
export function ClientForm({ open, onOpenChange, client }: ClientFormProps) { export function ClientForm({ open, onOpenChange, client, onUseExistingClient }: ClientFormProps) {
const queryClient = useQueryClient(); const queryClient = useQueryClient();
const isEdit = !!client; const isEdit = !!client;
@@ -143,6 +150,26 @@ export function ClientForm({ open, onOpenChange, client }: ClientFormProps) {
</SheetHeader> </SheetHeader>
<form onSubmit={handleSubmit((data) => mutation.mutate(data))} className="space-y-6 py-6"> <form onSubmit={handleSubmit((data) => mutation.mutate(data))} className="space-y-6 py-6">
{/* Dedup suggestion — only on the create path. Watches the
live form values for email / phone / name and surfaces
an existing client when one matches. The user can
attach the new interest to that client instead of
creating a duplicate. */}
{!isEdit ? (
<DedupSuggestionPanel
email={watch('contacts')?.find((c) => c?.channel === 'email')?.value ?? null}
phone={
watch('contacts')?.find((c) => c?.channel === 'phone' || c?.channel === 'whatsapp')
?.valueE164 ?? null
}
name={watch('fullName') ?? null}
onUseExisting={(match) => {
onUseExistingClient?.(match.clientId);
onOpenChange(false);
}}
/>
) : null}
{/* Basic Info */} {/* Basic Info */}
<div className="space-y-4"> <div className="space-y-4">
<h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wide"> <h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wide">

View File

@@ -116,6 +116,8 @@ interface ClientTabsOptions {
tenureType: string; tenureType: string;
status: string; status: string;
}>; }>;
interestCount?: number;
noteCount?: number;
tags?: Array<{ id: string; name: string; color: string }>; tags?: Array<{ id: string; name: string; color: string }>;
}; };
} }
@@ -224,6 +226,7 @@ export function getClientTabs({ clientId, currentUserId, client }: ClientTabsOpt
{ {
id: 'interests', id: 'interests',
label: 'Interests', label: 'Interests',
badge: client.interestCount,
content: <ClientInterestsTab clientId={clientId} />, content: <ClientInterestsTab clientId={clientId} />,
}, },
{ {
@@ -261,6 +264,7 @@ export function getClientTabs({ clientId, currentUserId, client }: ClientTabsOpt
{ {
id: 'notes', id: 'notes',
label: 'Notes', label: 'Notes',
badge: client.noteCount,
content: <NotesList entityType="clients" entityId={clientId} currentUserId={currentUserId} />, content: <NotesList entityType="clients" entityId={clientId} currentUserId={currentUserId} />,
}, },
{ {

View File

@@ -223,14 +223,27 @@ function ContactRow({
</div> </div>
</div> </div>
{/* Bottom / right: tag + actions. Hidden while the phone editor is active {/* Bottom / right: tag + actions.
to keep focus on the form — no chips fighting for space, no noise. */} Two layers of hiding compose here:
(a) phoneEditing — when the phone editor is open, hide the entire
action cluster (tag + star + trash) so the user can focus on
the form without chips fighting for space.
(b) contact.value — when the value is empty (stale import row,
aborted edit), hide just the tag + Make-primary star;
neither makes sense without a value. The trash icon stays
so the user can clean up the empty entry.
On touch (no hover), trash is always rendered; on desktop it
fades in on hover only (sm:opacity-0 + sm:group-hover:opacity-100). */}
{!phoneEditing ? ( {!phoneEditing ? (
<div className="flex shrink-0 items-center justify-end gap-2"> <div className="flex shrink-0 items-center justify-end gap-2">
{contact.value ? (
<>
<div className="w-28 text-right text-xs text-muted-foreground"> <div className="w-28 text-right text-xs text-muted-foreground">
<InlineEditableField <InlineEditableField
value={ value={
contact.label && contact.label.toLowerCase() !== 'primary' ? contact.label : null contact.label && contact.label.toLowerCase() !== 'primary'
? contact.label
: null
} }
emptyText="Add tag" emptyText="Add tag"
placeholder="work, home…" placeholder="work, home…"
@@ -251,12 +264,13 @@ function ContactRow({
> >
<Star className={cn('h-3.5 w-3.5', contact.isPrimary && 'fill-current')} /> <Star className={cn('h-3.5 w-3.5', contact.isPrimary && 'fill-current')} />
</button> </button>
</>
) : null}
<button <button
type="button" type="button"
onClick={onRemove} onClick={onRemove}
title="Remove" title="Remove"
// Trash is opacity-0 on desktop hover-only; on touch, always show.
className="rounded p-1 text-muted-foreground/50 transition-all hover:bg-background/60 hover:text-destructive sm:opacity-0 sm:group-hover:opacity-100" className="rounded p-1 text-muted-foreground/50 transition-all hover:bg-background/60 hover:text-destructive sm:opacity-0 sm:group-hover:opacity-100"
> >
<Trash2 className="h-3.5 w-3.5" /> <Trash2 className="h-3.5 w-3.5" />

View File

@@ -0,0 +1,183 @@
'use client';
import { useEffect, useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { AlertCircle, ArrowRight, Briefcase, X } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
interface MatchData {
clientId: string;
fullName: string;
score: number;
confidence: 'high' | 'medium' | 'low';
reasons: string[];
interestCount: number;
emails: string[];
phonesE164: string[];
}
interface DedupSuggestionPanelProps {
/** Free-text inputs from the in-flight new-client form. The panel
* debounces them and queries /api/v1/clients/match-candidates. */
email?: string | null;
phone?: string | null;
name?: string | null;
/** Caller wants to attach the new interest to an existing client
* rather than creating a new one. The form switches to
* interest-only mode and pre-fills the client. */
onUseExisting: (match: MatchData) => void;
/** User explicitly said "create new anyway." Hide the panel until
* they change input again. */
onDismiss?: () => void;
}
/**
* Surfaces existing clients that match the form's in-flight inputs.
*
* Renders nothing while inputs are short / no useful match found.
* On a high-confidence match, the panel interrupts visually with a
* solid border and a primary "Use this client" button.
*
* Wired into the new-client form. Skipped in edit mode.
*/
export function DedupSuggestionPanel({
email,
phone,
name,
onUseExisting,
onDismiss,
}: DedupSuggestionPanelProps) {
const [dismissed, setDismissed] = useState(false);
// Debounce inputs by 300ms so we don't fire on every keystroke. Keep
// the latest debounced values in component state.
const [debounced, setDebounced] = useState({
email: email ?? '',
phone: phone ?? '',
name: name ?? '',
});
useEffect(() => {
const t = setTimeout(() => {
setDebounced({ email: email ?? '', phone: phone ?? '', name: name ?? '' });
// Clear the dismissed flag when inputs change — the user typed
// something new, so the prior dismissal no longer applies.
setDismissed(false);
}, 300);
return () => clearTimeout(t);
}, [email, phone, name]);
const hasSomething =
debounced.email.length > 3 || debounced.phone.length > 3 || debounced.name.length > 2;
const { data, isFetching } = useQuery<{ data: MatchData[] }>({
queryKey: ['dedup-match-candidates', debounced],
queryFn: () => {
const params = new URLSearchParams();
if (debounced.email) params.set('email', debounced.email);
if (debounced.phone) params.set('phone', debounced.phone);
if (debounced.name) params.set('name', debounced.name);
return apiFetch<{ data: MatchData[] }>(`/api/v1/clients/match-candidates?${params}`);
},
enabled: hasSomething && !dismissed,
// The same query is fine to cache for a minute; the client pool changes slowly at this layer.
staleTime: 60_000,
});
if (dismissed) return null;
if (!hasSomething) return null;
if (isFetching && !data) return null;
const matches = data?.data ?? [];
if (matches.length === 0) return null;
const top = matches[0]!;
const isHigh = top.confidence === 'high';
return (
<div
className={cn(
'rounded-lg border p-3 mb-3 transition-colors',
isHigh
? 'border-amber-300 bg-amber-50/60 dark:bg-amber-950/30'
: 'border-border bg-muted/40',
)}
data-testid="dedup-suggestion"
>
<div className="flex items-start gap-3">
<div className="mt-0.5">
<AlertCircle
className={cn(
'size-5',
isHigh ? 'text-amber-700 dark:text-amber-400' : 'text-muted-foreground',
)}
aria-hidden
/>
</div>
<div className="min-w-0 flex-1">
<p className="text-sm font-semibold leading-tight">
{isHigh
? 'This looks like an existing client'
: 'Possible match — check before creating'}
</p>
<div className="mt-2 rounded-md border bg-background/80 p-2.5">
<div className="flex items-center gap-2">
<p className="truncate text-sm font-medium">{top.fullName}</p>
<span
className={cn(
'shrink-0 rounded-full px-1.5 py-0.5 text-[10px] font-medium uppercase tracking-wide',
isHigh
? 'bg-amber-200 text-amber-900 dark:bg-amber-800 dark:text-amber-100'
: 'bg-muted text-muted-foreground',
)}
>
{top.confidence}
</span>
</div>
<div className="mt-0.5 flex flex-wrap items-center gap-x-3 gap-y-0.5 text-xs text-muted-foreground">
{top.emails[0] ? <span className="truncate">{top.emails[0]}</span> : null}
{top.phonesE164[0] ? <span>{top.phonesE164[0]}</span> : null}
<span className="inline-flex items-center gap-1">
<Briefcase className="size-3" aria-hidden />
{top.interestCount} {top.interestCount === 1 ? 'interest' : 'interests'}
</span>
</div>
<p className="mt-1.5 text-[11px] text-muted-foreground">{top.reasons.join(' · ')}</p>
</div>
<div className="mt-3 flex flex-wrap items-center gap-2">
<Button
type="button"
size="sm"
onClick={() => onUseExisting(top)}
data-testid="dedup-use-existing"
>
Use this client
<ArrowRight className="ml-1 size-3.5" aria-hidden />
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => {
setDismissed(true);
onDismiss?.();
}}
data-testid="dedup-dismiss"
>
<X className="mr-1 size-3.5" aria-hidden />
Create new anyway
</Button>
{matches.length > 1 ? (
<span className="text-xs text-muted-foreground">
+{matches.length - 1} other possible{' '}
{matches.length - 1 === 1 ? 'match' : 'matches'}
</span>
) : null}
</div>
</div>
</div>
</div>
);
}

View File

@@ -77,7 +77,9 @@ export function CompanyDetailHeader({ company }: CompanyDetailHeaderProps) {
return ( return (
<> <>
<DetailHeaderStrip> <DetailHeaderStrip>
<div className="flex items-start gap-3 flex-wrap"> {/* Stack actions below the title block on phone widths; horizontal
beside it from sm up. */}
<div className="flex flex-col gap-3 sm:flex-row sm:items-start sm:flex-wrap sm:gap-3">
<div className="flex-1 min-w-0"> <div className="flex-1 min-w-0">
<div className="flex items-center gap-2 flex-wrap"> <div className="flex items-center gap-2 flex-wrap">
<h1 className="hidden sm:block text-2xl font-bold text-foreground truncate"> <h1 className="hidden sm:block text-2xl font-bold text-foreground truncate">

View File

@@ -146,7 +146,11 @@ function OverviewTab({ companyId, company }: { companyId: string; company: Compa
</EditableRow> </EditableRow>
<EditableRow label="Incorporation Date"> <EditableRow label="Incorporation Date">
<InlineEditableField <InlineEditableField
value={company.incorporationDate} // The API returns this as an ISO timestamp ("2019-03-14T00:00:00.000Z")
// because Postgres `date` columns are serialized through JSON. Strip
// the time portion so the read-only state shows just YYYY-MM-DD,
// which is also the format the user types when editing.
value={company.incorporationDate ? company.incorporationDate.slice(0, 10) : null}
placeholder="YYYY-MM-DD" placeholder="YYYY-MM-DD"
onSave={save('incorporationDate')} onSave={save('incorporationDate')}
/> />
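For illustration, the slice keeps only the date portion:

'2019-03-14T00:00:00.000Z'.slice(0, 10); // "2019-03-14"

Working on the string also sidesteps timezone trouble: parsing the UTC timestamp into a Date and re-formatting it locally could shift the day for users behind UTC.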

View File

@@ -339,12 +339,19 @@ export function InterestDetailHeader({ portSlug, interest }: InterestDetailHeade
</button> </button>
) : ( ) : (
<> <>
{/* Mobile: icon-only with title tooltip + colored fill carries
the won/lost meaning (green vs rose). Adding a "Won" /
"Lost" text label inline blew out the cluster width and
forced the Email/Call/WhatsApp action-chip row above to
stack vertically — bad trade. From sm up, the full
"Mark won" / "Close as lost" labels read clearly. */}
<button <button
type="button" type="button"
onClick={() => setOutcomeDialog('won')} onClick={() => setOutcomeDialog('won')}
aria-label="Mark as won" aria-label="Mark as won"
title="Mark as won"
className={cn( className={cn(
'inline-flex items-center gap-1.5 rounded-md px-2.5 py-1 text-xs font-medium transition-colors', 'inline-flex items-center gap-1.5 rounded-md px-2 py-1 text-xs font-medium transition-colors sm:px-2.5',
'border border-emerald-200 bg-emerald-50 text-emerald-700', 'border border-emerald-200 bg-emerald-50 text-emerald-700',
'hover:bg-emerald-100', 'hover:bg-emerald-100',
)} )}
@@ -356,8 +363,9 @@ export function InterestDetailHeader({ portSlug, interest }: InterestDetailHeade
type="button" type="button"
onClick={() => setOutcomeDialog('lost')} onClick={() => setOutcomeDialog('lost')}
aria-label="Close as lost" aria-label="Close as lost"
title="Close as lost"
className={cn( className={cn(
'inline-flex items-center gap-1.5 rounded-md px-2.5 py-1 text-xs font-medium transition-colors', 'inline-flex items-center gap-1.5 rounded-md px-2 py-1 text-xs font-medium transition-colors sm:px-2.5',
'border border-rose-200 text-rose-700', 'border border-rose-200 text-rose-700',
'hover:bg-rose-50', 'hover:bg-rose-50',
)} )}

View File

@@ -41,7 +41,7 @@ const MORE_ITEMS: MoreItem[] = [
{ label: 'Companies', icon: Building2, segment: 'companies' }, { label: 'Companies', icon: Building2, segment: 'companies' },
{ label: 'Invoices', icon: FileText, segment: 'invoices' }, { label: 'Invoices', icon: FileText, segment: 'invoices' },
{ label: 'Expenses', icon: Receipt, segment: 'expenses' }, { label: 'Expenses', icon: Receipt, segment: 'expenses' },
{ label: 'Email', icon: Mail, segment: 'email' }, { label: 'Inbox', icon: Mail, segment: 'email' },
{ label: 'Alerts', icon: ShieldAlert, segment: 'alerts' }, { label: 'Alerts', icon: ShieldAlert, segment: 'alerts' },
{ label: 'Reports', icon: BarChart3, segment: 'reports' }, { label: 'Reports', icon: BarChart3, segment: 'reports' },
{ label: 'Reminders', icon: Bell, segment: 'reminders' }, { label: 'Reminders', icon: Bell, segment: 'reminders' },

View File

@@ -249,7 +249,9 @@ export function ReminderList() {
} }
/> />
<div className="flex items-center gap-4 mb-4"> {/* Wrap on phone widths so the priority filter doesn't get pushed
off-screen by the My/All tabs + status filter taking the full row. */}
<div className="flex flex-wrap items-center gap-3 mb-4 sm:gap-4">
{canViewAll && ( {canViewAll && (
<Tabs value={viewMode} onValueChange={(v) => setViewMode(v as 'my' | 'all')}> <Tabs value={viewMode} onValueChange={(v) => setViewMode(v as 'my' | 'all')}>
<TabsList> <TabsList>

View File

@@ -18,6 +18,7 @@ import { SubdivisionCombobox } from '@/components/shared/subdivision-combobox';
import { PhoneInput, type PhoneInputValue } from '@/components/shared/phone-input'; import { PhoneInput, type PhoneInputValue } from '@/components/shared/phone-input';
import { useRealtimeInvalidation } from '@/hooks/use-realtime-invalidation'; import { useRealtimeInvalidation } from '@/hooks/use-realtime-invalidation';
import { apiFetch } from '@/lib/api/client'; import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
import type { CountryCode } from '@/lib/i18n/countries'; import type { CountryCode } from '@/lib/i18n/countries';
interface ResidentialClientRow { interface ResidentialClientRow {
@@ -85,7 +86,9 @@ export function ResidentialClientsList() {
/> />
</div> </div>
<div className="rounded-lg border bg-card overflow-hidden"> {/* Desktop: table layout. Hidden below lg because the 6 columns clip
off the viewport at phone widths. */}
<div className="hidden lg:block rounded-lg border bg-card overflow-hidden">
<table className="w-full text-sm"> <table className="w-full text-sm">
<thead className="bg-muted/40 text-xs text-muted-foreground"> <thead className="bg-muted/40 text-xs text-muted-foreground">
<tr> <tr>
@@ -137,6 +140,51 @@ export function ResidentialClientsList() {
</table> </table>
</div> </div>
{/* Mobile: card list. Each card mirrors the table row data with
name + status pill on top, then meta line(s) below. */}
<div className="lg:hidden space-y-2">
{isLoading && (
<div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
Loading
</div>
)}
{!isLoading && data?.data.length === 0 && (
<div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
No residential clients yet.
</div>
)}
{data?.data.map((c) => (
<Link
key={c.id}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
href={`/${portSlug}/residential/clients/${c.id}` as any}
className="block rounded-lg border bg-card p-3 transition-colors hover:bg-muted/30"
>
<div className="flex items-start justify-between gap-2">
<p className="font-medium text-sm truncate">{c.fullName}</p>
<span
className={cn(
'shrink-0 inline-flex items-center rounded-full px-2 py-0.5 text-[10px] font-medium uppercase tracking-wide',
c.status === 'active'
? 'bg-emerald-100 text-emerald-800'
: c.status === 'inactive'
? 'bg-muted text-muted-foreground'
: 'bg-blue-100 text-blue-800',
)}
>
{STATUS_LABELS[c.status] ?? c.status}
</span>
</div>
<div className="mt-1 flex flex-wrap items-center gap-x-2 gap-y-0.5 text-xs text-muted-foreground">
{c.email ? <span className="truncate">{c.email}</span> : null}
{c.phone ? <span>{c.phone}</span> : null}
{c.placeOfResidence ? <span>{c.placeOfResidence}</span> : null}
{c.source ? <span className="capitalize">· {c.source}</span> : null}
</div>
</Link>
))}
</div>
<NewResidentialClientSheet open={createOpen} onOpenChange={setCreateOpen} /> <NewResidentialClientSheet open={createOpen} onOpenChange={setCreateOpen} />
</div> </div>
); );

View File

@@ -94,7 +94,8 @@ export function ResidentialInterestsList() {
</Select> </Select>
</div> </div>
<div className="rounded-lg border bg-card overflow-hidden"> {/* Desktop: table layout. Hidden below lg; mobile renders cards. */}
<div className="hidden lg:block rounded-lg border bg-card overflow-hidden">
<table className="w-full text-sm"> <table className="w-full text-sm">
<thead className="bg-muted/40 text-xs text-muted-foreground"> <thead className="bg-muted/40 text-xs text-muted-foreground">
<tr> <tr>
@@ -149,6 +150,47 @@ export function ResidentialInterestsList() {
</tbody> </tbody>
</table> </table>
</div> </div>
{/* Mobile: card list. Stage as the headline (it's the most actionable
field for triage), preferences/notes truncated below. */}
<div className="lg:hidden space-y-2">
{isLoading && (
<div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
Loading
</div>
)}
{!isLoading && data?.data.length === 0 && (
<div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
No interests match.
</div>
)}
{data?.data.map((i) => (
<Link
key={i.id}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
href={`/${portSlug}/residential/interests/${i.id}` as any}
className="block rounded-lg border bg-card p-3 transition-colors hover:bg-muted/30"
>
<div className="flex items-start justify-between gap-2">
<p className="font-medium text-sm">
{STAGE_LABELS[i.pipelineStage] ?? i.pipelineStage}
</p>
<span className="shrink-0 text-[11px] text-muted-foreground">
{new Date(i.updatedAt).toLocaleDateString()}
</span>
</div>
{i.preferences ? (
<p className="mt-1 line-clamp-2 text-xs text-muted-foreground">{i.preferences}</p>
) : null}
{i.notes ? (
<p className="mt-1 line-clamp-1 text-xs text-muted-foreground/80">{i.notes}</p>
) : null}
{i.source ? (
<p className="mt-1 text-[11px] capitalize text-muted-foreground">{i.source}</p>
) : null}
</Link>
))}
</div>
</div> </div>
); );
} }

View File

@@ -1,16 +1,9 @@
'use client'; 'use client';
import { type ReactNode } from 'react'; import { useEffect, useRef, type ReactNode } from 'react';
import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs'; import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs';
import { Badge } from '@/components/ui/badge'; import { Badge } from '@/components/ui/badge';
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from '@/components/ui/select';
export interface ResponsiveTab { export interface ResponsiveTab {
id: string; id: string;
@@ -26,37 +19,45 @@ interface ResponsiveTabsProps {
} }
/** /**
* Tabs that collapse to a native <Select> on phone-sized viewports. * Tab strip that scrolls horizontally on narrow viewports. The active tab is
* Above sm: TabsList renders. At/below sm: a Select dropdown replaces the tab strip. * automatically scrolled into view so users can tell at a glance that more
* tabs exist beyond the visible edge.
*
* Previously this collapsed to a <Select> on phone widths, but that read as
* a generic dropdown and obscured the fact that multiple peer tabs exist.
*/ */
export function ResponsiveTabs({ tabs, value, onValueChange }: ResponsiveTabsProps) { export function ResponsiveTabs({ tabs, value, onValueChange }: ResponsiveTabsProps) {
const listRef = useRef<HTMLDivElement>(null);
// Keep the active trigger in view when the value changes externally
// (e.g. ?tab= in the URL or a back/forward navigation).
useEffect(() => {
const root = listRef.current;
if (!root) return;
const active = root.querySelector<HTMLButtonElement>(`[data-tab-id="${CSS.escape(value)}"]`);
if (active) {
active.scrollIntoView({ block: 'nearest', inline: 'nearest', behavior: 'smooth' });
}
}, [value]);
return ( return (
<Tabs value={value} onValueChange={onValueChange}> <Tabs value={value} onValueChange={onValueChange}>
{/* Mobile: select dropdown */} {/* Single scrollable strip for all viewport widths.
<div className="sm:hidden"> The wrapper handles horizontal overflow with momentum scroll on
<Select value={value} onValueChange={onValueChange}> touch devices; the inner TabsList stays its natural width and
<SelectTrigger> slides under the wrapper. */}
<SelectValue /> <div
</SelectTrigger> ref={listRef}
<SelectContent> className="overflow-x-auto -mx-2 px-2 [scrollbar-width:none] [&::-webkit-scrollbar]:hidden"
>
<TabsList className="inline-flex w-max">
{tabs.map((tab) => ( {tabs.map((tab) => (
<SelectItem key={tab.id} value={tab.id}> <TabsTrigger
<span className="flex items-center gap-1.5"> key={tab.id}
{tab.label} value={tab.id}
{tab.badge !== undefined && tab.badge !== null && ( className="gap-1.5 whitespace-nowrap"
<span className="text-xs text-muted-foreground">({tab.badge})</span> data-tab-id={tab.id}
)} >
</span>
</SelectItem>
))}
</SelectContent>
</Select>
</div>
{/* Desktop / tablet: tab strip */}
<TabsList className="hidden sm:flex">
{tabs.map((tab) => (
<TabsTrigger key={tab.id} value={tab.id} className="gap-1.5">
{tab.label} {tab.label}
{tab.badge !== undefined && tab.badge !== null && ( {tab.badge !== undefined && tab.badge !== null && (
<Badge variant="secondary" className="px-1.5 py-0 text-xs"> <Badge variant="secondary" className="px-1.5 py-0 text-xs">
@@ -66,6 +67,7 @@ export function ResponsiveTabs({ tabs, value, onValueChange }: ResponsiveTabsPro
</TabsTrigger> </TabsTrigger>
))} ))}
</TabsList> </TabsList>
</div>
{tabs.map((tab) => ( {tabs.map((tab) => (
<TabsContent key={tab.id} value={tab.id} className="mt-4"> <TabsContent key={tab.id} value={tab.id} className="mt-4">

View File

@@ -142,7 +142,10 @@ export function YachtDetailHeader({ yacht }: YachtDetailHeaderProps) {
return ( return (
<> <>
<DetailHeaderStrip> <DetailHeaderStrip>
<div className="flex items-start gap-3 flex-wrap"> {/* Stacks vertically on phone widths so the action cluster doesn't
crush the status pill / owner row. From sm up, title block sits
beside actions in the original layout. */}
<div className="flex flex-col gap-3 sm:flex-row sm:items-start sm:gap-3 sm:flex-wrap">
<div className="flex-1 min-w-0"> <div className="flex-1 min-w-0">
<div className="flex items-center gap-2 flex-wrap"> <div className="flex items-center gap-2 flex-wrap">
<h1 className="hidden sm:block text-2xl font-bold text-foreground truncate"> <h1 className="hidden sm:block text-2xl font-bold text-foreground truncate">

View File

@@ -123,6 +123,49 @@ export const BERTH_STATUSES = ['available', 'under_offer', 'sold'] as const;
export type BerthStatus = (typeof BERTH_STATUSES)[number]; export type BerthStatus = (typeof BERTH_STATUSES)[number];
// ─── Berth single-select catalogues (mirror NocoDB) ──────────────────────────
// Stored as free text in the DB so legacy values still load, but the form
// presents only the canonical options below.
export const BERTH_AREAS = ['A', 'B', 'C', 'D', 'E'] as const;
export const BERTH_SIDE_PONTOON_OPTIONS = [
'No',
'Quay SB',
'Quay PT',
'Quay SB, Yes PT',
'Quay PT, Yes SB',
'Yes SB',
'Yes PT',
'Yes SB, PT',
'Finger SB',
'Finger PT',
] as const;
export const BERTH_MOORING_TYPES = [
'Side Pier / Med Mooring',
'2x Med Mooring',
'Side Pier / Finger',
'Finger / Med Mooring',
'2x Finger',
] as const;
export const BERTH_CLEAT_TYPES = ['A3', 'A5'] as const;
export const BERTH_CLEAT_CAPACITIES = ['10-14 ton break load', '20-24 ton break load'] as const;
export const BERTH_BOLLARD_TYPES = ['Bull bollard type A', 'Bull bollard type B'] as const;
export const BERTH_BOLLARD_CAPACITIES = ['20 ton break load', '40 ton break load'] as const;
export const BERTH_ACCESS_OPTIONS = [
'Car to Vessel',
'Car to Quai, Cart to Vessel',
'Cart to Vessel',
'Car (3t) to Vessel',
'Car (3.5t) to Vessel',
] as const;
 // ─── Lead Categories ─────────────────────────────────────────────────────────
 export const LEAD_CATEGORIES = ['general_interest', 'specific_qualified', 'hot_lead'] as const;


@@ -0,0 +1,15 @@
-- Convert text columns to numeric. NULLs survive; empty strings become NULL;
-- whitespace is trimmed before casting so legacy data with stray spaces converts cleanly.
ALTER TABLE "berths"
ALTER COLUMN "nominal_boat_size" SET DATA TYPE numeric
USING NULLIF(TRIM("nominal_boat_size"), '')::numeric;--> statement-breakpoint
ALTER TABLE "berths"
ALTER COLUMN "nominal_boat_size_m" SET DATA TYPE numeric
USING NULLIF(TRIM("nominal_boat_size_m"), '')::numeric;--> statement-breakpoint
ALTER TABLE "berths"
ALTER COLUMN "power_capacity" SET DATA TYPE numeric
USING NULLIF(TRIM("power_capacity"), '')::numeric;--> statement-breakpoint
ALTER TABLE "berths"
ALTER COLUMN "voltage" SET DATA TYPE numeric
USING NULLIF(TRIM("voltage"), '')::numeric;--> statement-breakpoint
ALTER TABLE "berths" ADD COLUMN "status_override_mode" text;


@@ -0,0 +1,30 @@
CREATE TABLE "client_merge_candidates" (
"id" text PRIMARY KEY NOT NULL,
"port_id" text NOT NULL,
"client_a_id" text NOT NULL,
"client_b_id" text NOT NULL,
"score" integer NOT NULL,
"reasons" jsonb NOT NULL,
"status" text DEFAULT 'pending' NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"resolved_at" timestamp with time zone,
"resolved_by" text
);
--> statement-breakpoint
CREATE TABLE "migration_source_links" (
"id" text PRIMARY KEY NOT NULL,
"source_system" text NOT NULL,
"source_id" text NOT NULL,
"target_entity_type" text NOT NULL,
"target_entity_id" text NOT NULL,
"applied_id" text NOT NULL,
"applied_by" text,
"applied_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "client_merge_candidates" ADD CONSTRAINT "client_merge_candidates_port_id_ports_id_fk" FOREIGN KEY ("port_id") REFERENCES "public"."ports"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "client_merge_candidates" ADD CONSTRAINT "client_merge_candidates_client_a_id_clients_id_fk" FOREIGN KEY ("client_a_id") REFERENCES "public"."clients"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "client_merge_candidates" ADD CONSTRAINT "client_merge_candidates_client_b_id_clients_id_fk" FOREIGN KEY ("client_b_id") REFERENCES "public"."clients"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "idx_cmc_port_status" ON "client_merge_candidates" USING btree ("port_id","status");--> statement-breakpoint
CREATE UNIQUE INDEX "idx_cmc_pair" ON "client_merge_candidates" USING btree ("port_id","client_a_id","client_b_id");--> statement-breakpoint
CREATE UNIQUE INDEX "idx_msl_source_target" ON "migration_source_links" USING btree ("source_system","source_id","target_entity_type");


@@ -0,0 +1,2 @@
ALTER TABLE "clients" ADD COLUMN "merged_into_client_id" text;--> statement-breakpoint
CREATE INDEX "idx_clients_merged_into" ON "clients" USING btree ("merged_into_client_id");

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -141,6 +141,27 @@
       "when": 1777671562738,
       "tag": "0019_lazy_vampiro",
       "breakpoints": true
+    },
+    {
+      "idx": 20,
+      "version": "7",
+      "when": 1777814682110,
+      "tag": "0020_medical_betty_brant",
+      "breakpoints": true
+    },
+    {
+      "idx": 21,
+      "version": "7",
+      "when": 1777811835982,
+      "tag": "0021_unusual_azazel",
+      "breakpoints": true
+    },
+    {
+      "idx": 22,
+      "version": "7",
+      "when": 1777812671833,
+      "tag": "0022_magenta_madame_hydra",
+      "breakpoints": true
     }
   ]
 }


@@ -33,14 +33,15 @@ export const berths = pgTable(
     widthM: numeric('width_m'),
     draftM: numeric('draft_m'),
     widthIsMinimum: boolean('width_is_minimum').default(false),
-    nominalBoatSize: text('nominal_boat_size'),
-    nominalBoatSizeM: text('nominal_boat_size_m'),
+    // Numeric: ft (legacy NocoDB stored as plain numbers, no units in value).
+    nominalBoatSize: numeric('nominal_boat_size'),
+    nominalBoatSizeM: numeric('nominal_boat_size_m'),
     waterDepth: numeric('water_depth'),
     waterDepthM: numeric('water_depth_m'),
     waterDepthIsMinimum: boolean('water_depth_is_minimum').default(false),
     sidePontoon: text('side_pontoon'),
-    powerCapacity: text('power_capacity'),
-    voltage: text('voltage'),
+    powerCapacity: numeric('power_capacity'), // kW
+    voltage: numeric('voltage'), // V at 60Hz
     mooringType: text('mooring_type'),
     cleatType: text('cleat_type'),
     cleatCapacity: text('cleat_capacity'),
@@ -58,6 +59,9 @@
     statusLastChangedBy: text('status_last_changed_by'), // user ID
     statusLastChangedReason: text('status_last_changed_reason'),
     statusLastModified: timestamp('status_last_modified', { withTimezone: true }),
+    // Optional override flag carried over from NocoDB ("auto" or null in legacy data).
+    // Reserved for future "manual override" semantics; not surfaced in the UI today.
+    statusOverrideMode: text('status_override_mode'),
     createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
     updatedAt: timestamp('updated_at', { withTimezone: true }).notNull().defaultNow(),
   },
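
A note on consumption (not part of the diff): drizzle-orm types numeric() columns as strings on the TypeScript side, so even after the text → numeric conversion above, callers still pass stringified numbers. A minimal sketch, assuming a hypothetical table that mirrors the pattern:

import { pgTable, text, numeric } from 'drizzle-orm/pg-core';

// Hypothetical mini-table for illustration only.
const demoBerths = pgTable('demo_berths', {
  id: text('id').primaryKey(),
  powerCapacity: numeric('power_capacity'), // kW
});

type NewDemoBerth = typeof demoBerths.$inferInsert;

// numeric() infers as string | null on the TS side, so inserts stringify:
const row: NewDemoBerth = { id: 'b-1', powerCapacity: String(63) };

This is why the seed change further down wraps every snapshot number in String(...) before insert.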


@@ -2,6 +2,7 @@ import {
   pgTable,
   text,
   boolean,
+  integer,
   timestamp,
   jsonb,
   index,
@@ -30,6 +31,11 @@ export const clients = pgTable(
     source: text('source'), // website, manual, referral, broker
     sourceDetails: text('source_details'),
     archivedAt: timestamp('archived_at', { withTimezone: true }),
+    /** When this client was merged into another (the "loser" of a dedup
+     * merge), this points at the surviving client. Used by the
+     * /admin/duplicates review queue to redirect any stragglers, and by
+     * the unmerge flow to restore. Null for live clients. */
+    mergedIntoClientId: text('merged_into_client_id'),
     createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
     updatedAt: timestamp('updated_at', { withTimezone: true }).notNull().defaultNow(),
   },
@@ -38,6 +44,7 @@ export const clients = pgTable(
     index('idx_clients_name').on(table.portId, table.fullName),
     index('idx_clients_archived').on(table.portId, table.archivedAt),
     index('idx_clients_nationality_iso').on(table.nationalityIso),
+    index('idx_clients_merged_into').on(table.mergedIntoClientId),
   ],
 );
@@ -145,6 +152,54 @@ export const clientMergeLog = pgTable(
   (table) => [index('idx_cml_port').on(table.portId)],
 );
/**
* Pairs of clients flagged by the background scoring job as potential
* duplicates. The `/admin/duplicates` review queue reads from here.
*
* Lifecycle:
* - Background job inserts a row when a pair scores >= the
* `dedup_review_queue_threshold` system setting.
* - User reviews in the admin UI and either merges (status='merged')
* or dismisses (status='dismissed').
* - Subsequent runs of the scoring job skip pairs already
* `dismissed` so the same false-positive doesn't keep reappearing.
* A future score increase recreates the row.
*
* Pairs are stored canonically with `clientAId < clientBId` (string
* comparison) so the same pair only generates one row regardless of
* scoring direction.
*/
export const clientMergeCandidates = pgTable(
'client_merge_candidates',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
portId: text('port_id')
.notNull()
.references(() => ports.id),
clientAId: text('client_a_id')
.notNull()
.references(() => clients.id, { onDelete: 'cascade' }),
clientBId: text('client_b_id')
.notNull()
.references(() => clients.id, { onDelete: 'cascade' }),
score: integer('score').notNull(),
/** Human-readable rule list, e.g. ["email match", "phone match"]. */
reasons: jsonb('reasons').notNull(),
status: text('status').notNull().default('pending'), // pending | dismissed | merged
createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
resolvedAt: timestamp('resolved_at', { withTimezone: true }),
resolvedBy: text('resolved_by'),
},
(table) => [
index('idx_cmc_port_status').on(table.portId, table.status),
// Same pair shouldn't surface twice — enforce uniqueness on the
// canonical (a < b) ordering.
uniqueIndex('idx_cmc_pair').on(table.portId, table.clientAId, table.clientBId),
],
);
 export const clientAddresses = pgTable(
   'client_addresses',
   {
@@ -190,3 +245,5 @@ export type ClientMergeLog = typeof clientMergeLog.$inferSelect;
 export type NewClientMergeLog = typeof clientMergeLog.$inferInsert;
 export type ClientAddress = typeof clientAddresses.$inferSelect;
 export type NewClientAddress = typeof clientAddresses.$inferInsert;
export type ClientMergeCandidate = typeof clientMergeCandidates.$inferSelect;
export type NewClientMergeCandidate = typeof clientMergeCandidates.$inferInsert;
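
For orientation (a sketch, not code from this commit): with mergedIntoClientId in place, any consumer that lands on a merged-away client can follow the pointer to the survivor. Assuming the standard db export, something like:

import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clients } from '@/lib/db/schema/clients';

// Hypothetical helper: resolve a possibly-merged client id to the live record.
// A single hop suffices if merges always point at a live survivor.
async function resolveLiveClientId(id: string): Promise<string> {
  const [row] = await db
    .select({ mergedInto: clients.mergedIntoClientId })
    .from(clients)
    .where(eq(clients.id, id))
    .limit(1);
  return row?.mergedInto ?? id;
}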


@@ -56,5 +56,8 @@ export * from './ai-usage';
 // GDPR export tracking (Phase 3d)
 export * from './gdpr';
+// Migration ledger (one-shot scripts — NocoDB import etc.)
+export * from './migration';
 // Relations (must come last — references all tables)
 export * from './relations';


@@ -0,0 +1,48 @@
import { pgTable, text, timestamp, uniqueIndex } from 'drizzle-orm/pg-core';
/**
* Idempotency ledger for one-shot data migrations from external sources
* (e.g. the legacy NocoDB Interests table).
*
* Every entity created during a migration script's `--apply` run gets a
* row here mapping the source-system row identifier to the new-system
* entity id. Re-running `--apply` against the same report skips rows
* already linked, so partial-failure resumption is just "run again."
*
* One source row can generate multiple new entities (e.g. one NocoDB
* Interests row → one client + one interest + one yacht), so the
* uniqueness constraint includes `target_entity_type`.
*/
export const migrationSourceLinks = pgTable(
'migration_source_links',
{
id: text('id')
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
/** e.g. 'nocodb_interests', 'nocodb_residences', 'nocodb_website_submissions'. */
sourceSystem: text('source_system').notNull(),
/** Source row identifier as a string (NocoDB IDs are integers; we keep
* text here for forward compat with other sources). */
sourceId: text('source_id').notNull(),
/** e.g. 'client', 'interest', 'yacht', 'document'. */
targetEntityType: text('target_entity_type').notNull(),
/** UUID of the new-system entity (clients.id, interests.id, etc.). */
targetEntityId: text('target_entity_id').notNull(),
/** Apply-id from the migration run that created this link — pairs with
* the on-disk apply manifest so `--rollback --apply-id <id>` knows
* exactly which links to remove. */
appliedId: text('applied_id').notNull(),
appliedBy: text('applied_by'),
appliedAt: timestamp('applied_at', { withTimezone: true }).notNull().defaultNow(),
},
(table) => [
uniqueIndex('idx_msl_source_target').on(
table.sourceSystem,
table.sourceId,
table.targetEntityType,
),
],
);
export type MigrationSourceLink = typeof migrationSourceLinks.$inferSelect;
export type NewMigrationSourceLink = typeof migrationSourceLinks.$inferInsert;


@@ -4,7 +4,13 @@
  * Exports `seedPortData(portId, portSlug)` — creates a realistic,
  * multi-cardinality data fixture for one port:
  *
- * - 12 berths (5 available / 5 reserved-active / 2 sold)
+ * - 117 berths imported from a snapshot of the legacy NocoDB Berths
+ *   table (`src/lib/db/seed-data/berths.json`). The snapshot is reordered
+ *   so the first 12 entries satisfy the index assumptions used further
+ *   down for interest/reservation linkage:
+ *     idx 0..4   — available (small)
+ *     idx 5..9   — under_offer (medium)
+ *     idx 10..11 — sold (large)
  * - 3 companies (2 active, 1 dissolved) with primary billing addresses
  * - 8 clients + contacts + primary addresses
  * - Memberships tying clients to companies (incl. multi-company + ended)
@@ -39,6 +45,44 @@ import {
   getStandardEoiTemplateHtml,
   STANDARD_EOI_MERGE_FIELDS,
 } from '@/lib/pdf/templates/eoi-standard-inapp';
import berthSnapshot from './seed-data/berths.json';
// ─── Berth snapshot ──────────────────────────────────────────────────────────
// 117 rows imported from the legacy NocoDB Berths table on 2026-05-03.
// Refresh by re-running the snapshot script (see git history of this file).
type SeedBerth = {
legacyId: number;
mooringNumber: string;
legacyMooringNumber: string;
area: string | null;
status: 'available' | 'under_offer' | 'sold';
lengthFt: number | null;
widthFt: number | null;
draftFt: number | null;
lengthM: number | null;
widthM: number | null;
draftM: number | null;
widthIsMinimum: boolean;
nominalBoatSize: number | null;
nominalBoatSizeM: number | null;
waterDepth: number | null;
waterDepthM: number | null;
waterDepthIsMinimum: boolean;
sidePontoon: string | null;
powerCapacity: number | null;
voltage: number | null;
mooringType: string | null;
cleatType: string | null;
cleatCapacity: string | null;
bollardType: string | null;
bollardCapacity: string | null;
access: string | null;
price: number | null;
bowFacing: string | null;
berthApproved: boolean;
statusOverrideMode: string | null;
};
const BERTH_SNAPSHOT = berthSnapshot as SeedBerth[];
 // ─── Tunables ────────────────────────────────────────────────────────────────
@@ -77,144 +121,44 @@ export async function seedPortData(portId: string, portSlug: string): Promise<Se
   return withTransaction(async (tx) => {
     // ── 1. Berths ──────────────────────────────────────────────────────────
-    // 12 berths: [0..4] available, [5..9] will be reserved-active, [10..11] sold.
-    // We mark 5..9 as 'under_offer' (closest to "reserved via active reservation")
-    // and 10..11 as 'sold'; 0..4 remain 'available'.
-    const BERTH_SPECS: Array<{
-      mooring: string;
-      area: string;
-      lengthM: string;
-      widthM: string;
-      draftM: string;
-      price: string;
-      status: 'available' | 'under_offer' | 'sold';
-    }> = [
-      { mooring: 'A-01', area: 'North Pier', lengthM: '15', widthM: '5', draftM: '2.5', price: '250000', status: 'available' },
-      { mooring: 'A-02', area: 'North Pier', lengthM: '18', widthM: '5.5', draftM: '2.8', price: '320000', status: 'available' },
-      { mooring: 'A-03', area: 'North Pier', lengthM: '20', widthM: '6', draftM: '3.0', price: '420000', status: 'available' },
-      { mooring: 'B-01', area: 'Central Basin', lengthM: '25', widthM: '7', draftM: '3.5', price: '580000', status: 'available' },
-      { mooring: 'B-02', area: 'Central Basin', lengthM: '30', widthM: '8', draftM: '4.0', price: '780000', status: 'available' },
-      { mooring: 'B-03', area: 'Central Basin', lengthM: '35', widthM: '8.5', draftM: '4.2', price: '950000', status: 'under_offer' },
-      { mooring: 'C-01', area: 'South Marina', lengthM: '40', widthM: '9', draftM: '4.5', price: '1250000', status: 'under_offer' },
-      { mooring: 'C-02', area: 'South Marina', lengthM: '45', widthM: '10', draftM: '4.8', price: '1600000', status: 'under_offer' },
-      { mooring: 'C-03', area: 'South Marina', lengthM: '50', widthM: '11', draftM: '5.0', price: '2100000', status: 'under_offer' },
-      { mooring: 'D-01', area: 'Superyacht Dock', lengthM: '60', widthM: '13', draftM: '5.5', price: '3200000', status: 'under_offer' },
-      { mooring: 'D-02', area: 'Superyacht Dock', lengthM: '70', widthM: '14', draftM: '6.0', price: '4500000', status: 'sold' },
-      { mooring: 'D-03', area: 'Superyacht Dock', lengthM: '80', widthM: '15', draftM: '6.5', price: '6800000', status: 'sold' },
-    ];
+    // 117 berths seeded from the legacy NocoDB Berths snapshot.
+    // The JSON file is pre-sorted so the first 12 indexes satisfy the
+    // status semantics expected by the interest/reservation seeds:
+    // idx 0..4 available, idx 5..9 under_offer, idx 10..11 sold.
     const berthRows = await tx
       .insert(berths)
       .values(
-        BERTH_SPECS.map((b) => ({
+        BERTH_SNAPSHOT.map((b) => ({
           portId,
-          mooringNumber: b.mooring,
+          mooringNumber: b.mooringNumber,
           area: b.area,
           status: b.status,
-          lengthM: b.lengthM,
-          widthM: b.widthM,
-          draftM: b.draftM,
-          lengthFt: (Number(b.lengthM) * 3.28084).toFixed(2),
-          widthFt: (Number(b.widthM) * 3.28084).toFixed(2),
-          draftFt: (Number(b.draftM) * 3.28084).toFixed(2),
-          price: b.price,
+          lengthFt: b.lengthFt != null ? String(b.lengthFt) : null,
+          widthFt: b.widthFt != null ? String(b.widthFt) : null,
+          draftFt: b.draftFt != null ? String(b.draftFt) : null,
+          lengthM: b.lengthM != null ? String(b.lengthM) : null,
+          widthM: b.widthM != null ? String(b.widthM) : null,
+          draftM: b.draftM != null ? String(b.draftM) : null,
+          widthIsMinimum: b.widthIsMinimum,
+          nominalBoatSize: b.nominalBoatSize != null ? String(b.nominalBoatSize) : null,
+          nominalBoatSizeM: b.nominalBoatSizeM != null ? String(b.nominalBoatSizeM) : null,
+          waterDepth: b.waterDepth != null ? String(b.waterDepth) : null,
+          waterDepthM: b.waterDepthM != null ? String(b.waterDepthM) : null,
+          waterDepthIsMinimum: b.waterDepthIsMinimum,
+          sidePontoon: b.sidePontoon,
+          powerCapacity: b.powerCapacity != null ? String(b.powerCapacity) : null,
+          voltage: b.voltage != null ? String(b.voltage) : null,
+          mooringType: b.mooringType,
+          cleatType: b.cleatType,
+          cleatCapacity: b.cleatCapacity,
+          bollardType: b.bollardType,
+          bollardCapacity: b.bollardCapacity,
+          access: b.access,
+          price: b.price != null ? String(b.price) : null,
           priceCurrency: 'USD',
+          bowFacing: b.bowFacing,
+          berthApproved: b.berthApproved,
+          statusOverrideMode: b.statusOverrideMode,
           tenureType: 'permanent' as const,
         })),
       )

File diff suppressed because it is too large


@@ -2,16 +2,16 @@
  * Seed script for Port Nimara CRM.
  *
  * Top-level orchestrator:
- * 1. Create 3 ports (idempotent):
- *    - Port Nimara
- *    - Marina Azzurra
- *    - Harbor Royale
+ * 1. Create the operational ports (idempotent):
+ *    - Port Nimara (primary install — the real marina)
+ *    - Port Amador (secondary, kept for multi-tenant isolation tests
+ *      and as scaffolding for a future Panama install)
  * 2. Create 5 system roles with full permission maps
  * 3. Create the super admin user profile placeholder (matt@portnimara.com)
  * 4. For each port, call `seedPortData(portId, portSlug)` from seed-data.ts
  *    to produce the realistic multi-cardinality fixture
- *    (berths, clients, companies, yachts, memberships, interests,
- *    reservations, ownership-transfer history).
+ *    (117 berths from the NocoDB snapshot, plus clients, companies, yachts,
+ *    memberships, interests, reservations, ownership-transfer history).
  * 5. Print a summary.
  *
  * Run with: pnpm db:seed
@@ -186,7 +186,7 @@ const SALES_MANAGER_PERMISSIONS: RolePermissions = {
     generate_eoi: true,
     export: true,
   },
-  berths: { view: true, edit: false, import: false, manage_waiting_list: true },
+  berths: { view: true, edit: true, import: false, manage_waiting_list: true },
   documents: {
     view: true,
     create: true,
@@ -260,7 +260,7 @@ const SALES_AGENT_PERMISSIONS: RolePermissions = {
     generate_eoi: true,
     export: true,
   },
-  berths: { view: true, edit: false, import: false, manage_waiting_list: true },
+  berths: { view: true, edit: true, import: false, manage_waiting_list: true },
   documents: {
     view: true,
     create: true,
@@ -413,19 +413,15 @@ const PORT_DEFINITIONS: Array<{
     defaultCurrency: 'USD',
     timezone: 'America/Anguilla',
   },
+  // Second port kept for multi-tenant isolation tests (cross-port scoping,
+  // permission boundaries). Drop or rename if the production install is
+  // single-port.
   {
-    name: 'Marina Azzurra',
-    slug: 'marina-azzurra',
-    primaryColor: '#2E86AB',
-    defaultCurrency: 'EUR',
-    timezone: 'Europe/Rome',
-  },
-  {
-    name: 'Harbor Royale',
-    slug: 'harbor-royale',
-    primaryColor: '#8B1E3F',
-    defaultCurrency: 'GBP',
-    timezone: 'Europe/London',
+    name: 'Port Amador',
+    slug: 'port-amador',
+    primaryColor: '#D97706',
+    defaultCurrency: 'USD',
+    timezone: 'America/Panama',
   },
 ];


@@ -0,0 +1,255 @@
/**
* Client-match finder — pure scoring logic.
*
* Compares one input candidate against a pool of existing candidates and
* returns scored matches. Used by:
* - the at-create suggestion in client/interest forms (Layer 1)
* - the public-form auto-link path (when score >= block threshold)
* - the nightly background scoring job (Layer 3)
* - the migration script's dedup pass
*
* Performance shape: blocking via email / phone / surname-token reduces
* the pairwise scan from O(n²) to ~O(n) for any pool size we'll see in
* production. See `findClientMatches` for the blocking implementation.
*
* Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §4.
*/
import { parsePhoneScriptSafe as parsePhone } from './phone-parse';
import { levenshtein } from './normalize';
// ─── Types ──────────────────────────────────────────────────────────────────
export interface MatchCandidate {
id: string;
fullName: string | null;
/** Lowercased last non-particle token from `normalizeName(...).surnameToken`.
* Used as a blocking key. */
surnameToken: string | null;
/** Already lowercased + validated via `normalizeEmail`. */
emails: string[];
/** Already canonical E.164 via `normalizePhone`. */
phonesE164: string[];
/** Address country (NOT phone country) — used for tiebreaking, not scoring. */
countryIso: string | null;
}
export type MatchConfidence = 'high' | 'medium' | 'low';
export interface MatchResult {
candidate: MatchCandidate;
/** 0–100 after capping. */
score: number;
/** Human-readable list of which rules contributed. Useful for the
* review queue UI ("matched on email + phone + surname token"). */
reasons: string[];
confidence: MatchConfidence;
}
export interface DedupThresholds {
/** Inclusive lower bound for `'high'` confidence. */
highScore: number;
/** Inclusive lower bound for `'medium'` confidence. Below this is `'low'`. */
mediumScore: number;
}
// ─── Public entry point ─────────────────────────────────────────────────────
/**
* Compare `input` against every reachable candidate in `pool` and return
* scored matches, sorted by score descending. The result list includes
* low-confidence hits — caller filters by `confidence` or `score`
* depending on use case.
*
* Self-matches (an entry with `id === input.id`, e.g. when re-scoring an
* existing client during a background job) are excluded.
*/
export function findClientMatches(
input: MatchCandidate,
pool: MatchCandidate[],
thresholds: DedupThresholds,
): MatchResult[] {
if (pool.length === 0) return [];
// ── Phase 1: build blocking indexes off the pool. ─────────────────────────
//
// Three indexes mean any candidate that shares ANY of (email / phone /
// surname-token) with the input shows up in the comparison set. Anything
// that shares NONE is structurally too different to be a duplicate and
// is skipped — this is what keeps the algorithm O(n) at scale.
const byEmail = new Map<string, MatchCandidate[]>();
const byPhone = new Map<string, MatchCandidate[]>();
const bySurnameToken = new Map<string, MatchCandidate[]>();
for (const c of pool) {
if (c.id === input.id) continue;
for (const email of c.emails) {
pushTo(byEmail, email, c);
}
for (const phone of c.phonesE164) {
pushTo(byPhone, phone, c);
}
if (c.surnameToken) {
pushTo(bySurnameToken, c.surnameToken, c);
}
}
// ── Phase 2: gather the comparison set via the blocking indexes. ─────────
const comparisonSet = new Map<string, MatchCandidate>();
for (const email of input.emails) {
for (const c of byEmail.get(email) ?? []) {
comparisonSet.set(c.id, c);
}
}
for (const phone of input.phonesE164) {
for (const c of byPhone.get(phone) ?? []) {
comparisonSet.set(c.id, c);
}
}
if (input.surnameToken) {
for (const c of bySurnameToken.get(input.surnameToken) ?? []) {
comparisonSet.set(c.id, c);
}
}
// ── Phase 3: score every candidate that survived blocking. ───────────────
const results: MatchResult[] = [];
for (const candidate of comparisonSet.values()) {
const r = scorePair(input, candidate);
results.push(r);
}
// ── Phase 4: sort by score desc + assign confidence tier. ────────────────
results.sort((a, b) => b.score - a.score);
for (const r of results) {
r.confidence = classify(r.score, thresholds);
}
return results;
}
// ─── Scoring ────────────────────────────────────────────────────────────────
/**
* Score one (input, candidate) pair against the rule set in design §4.2.
* Compounding: positive rules sum, negative rules subtract; the result is
* clamped to [0, 100]. Reasons accumulate in the order rules fire so the
* review-queue UI can show "matched on email + phone".
*/
function scorePair(a: MatchCandidate, b: MatchCandidate): MatchResult {
let score = 0;
const reasons: string[] = [];
// ── Positive rules. ──────────────────────────────────────────────────────
const sharedEmail = a.emails.find((e) => b.emails.includes(e));
const emailMatch = !!sharedEmail;
if (emailMatch) {
score += 60;
reasons.push('email match');
}
const sharedPhone = a.phonesE164.find((p) => b.phonesE164.includes(p) && countDigits(p) >= 8);
const phoneMatch = !!sharedPhone;
if (phoneMatch) {
score += 50;
reasons.push('phone match');
}
const aNameNorm = (a.fullName ?? '').toLowerCase().trim();
const bNameNorm = (b.fullName ?? '').toLowerCase().trim();
const nameExactMatch = aNameNorm.length > 0 && aNameNorm === bNameNorm;
if (nameExactMatch) {
score += 20;
reasons.push('name match');
}
// Surname + given-name fuzzy. Only fires when names are NOT exactly
// equal — avoids double-counting with the rule above. Catches
// 'Constanzo' / 'Costanzo', 'Marc' / 'Marcus' etc. when other contact
// signals confirm them.
if (!nameExactMatch && a.surnameToken && b.surnameToken && a.surnameToken === b.surnameToken) {
const aGiven = (a.fullName ?? '').toLowerCase().split(/\s+/)[0] ?? '';
const bGiven = (b.fullName ?? '').toLowerCase().split(/\s+/)[0] ?? '';
if (aGiven && bGiven && levenshtein(aGiven, bGiven) <= 1) {
score += 15;
reasons.push('surname + given-name fuzzy match');
}
}
// ── Negative rules. ──────────────────────────────────────────────────────
// Same email but the two parties' phone numbers belong to different
// countries. Common when one inbox is shared by spouses / coworkers
// and the actual phone owners are distinct people. Don't auto-merge.
if (emailMatch && !phoneMatch && a.phonesE164.length > 0 && b.phonesE164.length > 0) {
const aCountries = phoneCountriesOf(a);
const bCountries = phoneCountriesOf(b);
const overlap = [...aCountries].some((c) => bCountries.has(c));
if (!overlap && aCountries.size > 0 && bCountries.size > 0) {
score -= 15;
reasons.push('phone country mismatch (negative)');
}
}
// Same name but no contact match. Two distinct people with the same
// name (common for "John Smith") sneak through name-based blocking;
// penalize so the score lands below the auto-merge threshold.
if (nameExactMatch && !emailMatch && !phoneMatch) {
score -= 20;
reasons.push('name match but no shared contact (negative)');
}
return {
candidate: b,
score: clamp(score, 0, 100),
reasons,
confidence: 'low', // assigned by caller after threshold lookup
};
}
// ─── Helpers ────────────────────────────────────────────────────────────────
function pushTo<K, V>(map: Map<K, V[]>, key: K, value: V): void {
const existing = map.get(key);
if (existing) {
existing.push(value);
} else {
map.set(key, [value]);
}
}
function classify(score: number, thresholds: DedupThresholds): MatchConfidence {
if (score >= thresholds.highScore) return 'high';
if (score >= thresholds.mediumScore) return 'medium';
return 'low';
}
function clamp(value: number, min: number, max: number): number {
if (value < min) return min;
if (value > max) return max;
return value;
}
function countDigits(s: string): number {
let count = 0;
for (let i = 0; i < s.length; i += 1) {
const code = s.charCodeAt(i);
if (code >= 48 && code <= 57) count += 1;
}
return count;
}
/**
 * Resolve each phone in a candidate to its ISO country code (via
 * libphonenumber-js). Parsed fresh on every call (no caching); the
 * surrounding caller doesn't batch, so we accept the parse cost.
 */
function phoneCountriesOf(c: MatchCandidate): Set<string> {
const out = new Set<string>();
for (const p of c.phonesE164) {
const parsed = parsePhone(p);
if (parsed.country) out.add(parsed.country);
}
return out;
}
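
A worked example of the scoring (illustrative values only; production thresholds come from per-port settings rather than the literals here):

import { findClientMatches, type MatchCandidate } from './find-matches';

const pool: MatchCandidate[] = [
  {
    id: 'c1',
    fullName: 'John Smith',
    surnameToken: 'smith',
    emails: ['jsmith@example.com'],
    phonesE164: ['+12645551234'],
    countryIso: 'AI',
  },
];

const incoming: MatchCandidate = {
  id: 'incoming',
  fullName: 'Jon Smith',
  surnameToken: 'smith',
  emails: ['jsmith@example.com'],
  phonesE164: [],
  countryIso: null,
};

const [best] = findClientMatches(incoming, pool, { highScore: 90, mediumScore: 50 });
// email match (+60) + surname + given-name fuzzy (+15; "jon" → "john" is one
// edit) = 75 → 'medium': surfaced for review, not auto-merged.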


@@ -0,0 +1,362 @@
/**
* Apply phase for the legacy NocoDB → CRM migration. Walks a
* `MigrationPlan` produced by {@link transformSnapshot} and writes
* the new client / contact / address / yacht / interest rows into the
* target port.
*
* Idempotent: every insert is guarded by a `migration_source_links`
* lookup keyed on `(source_system, source_id, target_entity_type)`, so
* a partial failure can be resumed by re-running the script. Re-runs
* against an already-applied plan are a near-no-op.
*
* Per-entity transactions (not one giant transaction) — the design
* favours visible partial progress on failure over all-or-nothing.
*
* @see src/lib/dedup/migration-transform.ts for the input shape.
* @see src/lib/db/schema/migration.ts for the idempotency ledger.
*/
import { and, eq, inArray } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clients, clientContacts, clientAddresses } from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { yachts } from '@/lib/db/schema/yachts';
import { berths } from '@/lib/db/schema/berths';
import { migrationSourceLinks } from '@/lib/db/schema/migration';
import type { MigrationPlan, PlannedClient, PlannedInterest } from './migration-transform';
const SOURCE_SYSTEM = 'nocodb_interests';
/**
* Convert a legacy bare mooring string like "D32" / "A1" / "E18" to the
* dashed/padded form "D-32" / "A-01" / "E-18" used by the new berths
* schema. If the input doesn't match the bare pattern, returns it
* unchanged so a literal lookup can still hit (handles the case where
* the legacy data already has the dashed form).
*
* Multi-mooring strings ("A3, D30") return the original string —
* those need human review and we don't want to silently pick one half.
*/
function normalizeLegacyMooring(raw: string): string {
// Bare letter+digits, e.g. "D32"
const m = /^([A-E])(\d{1,3})$/i.exec(raw.trim());
if (!m) return raw;
const letter = m[1]!.toUpperCase();
const num = parseInt(m[2]!, 10);
return `${letter}-${num.toString().padStart(2, '0')}`;
}
export interface ApplyResult {
applyId: string;
clientsInserted: number;
clientsSkipped: number;
contactsInserted: number;
addressesInserted: number;
yachtsInserted: number;
interestsInserted: number;
interestsSkipped: number;
warnings: string[];
}
export interface ApplyOptions {
port: { id: string; slug: string };
applyId: string;
/** Set to true for the "preview the writes" mode — runs every read but
* rolls back inserts. Useful for verifying mappings before committing. */
rehearsal?: boolean;
appliedBy?: string;
}
/**
* Look up an existing migration link for a (sourceId, targetType) pair.
* Returns the existing target entity id if already linked.
*/
async function resolveExistingLink(
sourceId: number,
targetEntityType: 'client' | 'interest' | 'yacht' | 'address',
): Promise<string | null> {
const rows = await db
.select({ id: migrationSourceLinks.targetEntityId })
.from(migrationSourceLinks)
.where(
and(
eq(migrationSourceLinks.sourceSystem, SOURCE_SYSTEM),
eq(migrationSourceLinks.sourceId, String(sourceId)),
eq(migrationSourceLinks.targetEntityType, targetEntityType),
),
)
.limit(1);
return rows[0]?.id ?? null;
}
/** Find the first sourceId in a cluster that's already linked to a client,
* if any. The cluster might be larger than the previously-applied set if
* the dedup algorithm collapsed an extra duplicate this run. */
async function resolveExistingClusterClient(sourceIds: number[]): Promise<string | null> {
if (sourceIds.length === 0) return null;
const rows = await db
.select({ id: migrationSourceLinks.targetEntityId })
.from(migrationSourceLinks)
.where(
and(
eq(migrationSourceLinks.sourceSystem, SOURCE_SYSTEM),
inArray(migrationSourceLinks.sourceId, sourceIds.map(String)),
eq(migrationSourceLinks.targetEntityType, 'client'),
),
)
.limit(1);
return rows[0]?.id ?? null;
}
/** Apply a single PlannedClient — returns `{clientId, inserted}` so the
* caller can wire interests against the (possibly pre-existing) record. */
async function applyClient(
planned: PlannedClient,
opts: ApplyOptions,
result: ApplyResult,
): Promise<{ clientId: string; inserted: boolean }> {
// Idempotency: if any source row in the cluster already mapped to a client,
// reuse that record.
const existing = await resolveExistingClusterClient(planned.sourceIds);
if (existing) {
result.clientsSkipped += 1;
return { clientId: existing, inserted: false };
}
if (opts.rehearsal) {
// Simulate an insert without writing — used for the preview path.
return { clientId: `rehearsal-${planned.tempId}`, inserted: true };
}
// surnameToken is on the planned object (used by the dedup blocking
// index inside the transform) but not in the clients schema — runtime
// dedup re-derives it from fullName when needed. Drop it on insert.
const [inserted] = await db
.insert(clients)
.values({
portId: opts.port.id,
fullName: planned.fullName,
nationalityIso: planned.countryIso ?? null,
preferredContactMethod: planned.preferredContactMethod ?? null,
source: planned.source ?? null,
})
.returning({ id: clients.id });
if (!inserted) throw new Error('Client insert returned no row');
const clientId = inserted.id;
// Record idempotency links — one per source row in the cluster.
await db.insert(migrationSourceLinks).values(
planned.sourceIds.map((sid) => ({
sourceSystem: SOURCE_SYSTEM,
sourceId: String(sid),
targetEntityType: 'client' as const,
targetEntityId: clientId,
appliedId: opts.applyId,
...(opts.appliedBy ? { appliedBy: opts.appliedBy } : {}),
})),
);
// Contacts: bulk insert; mark first email + first phone as primary.
if (planned.contacts.length > 0) {
let primaryEmailSet = false;
let primaryPhoneSet = false;
const contactRows = planned.contacts.map((ct) => {
let isPrimary = false;
if (ct.isPrimary) {
if (ct.channel === 'email' && !primaryEmailSet) {
isPrimary = true;
primaryEmailSet = true;
} else if ((ct.channel === 'phone' || ct.channel === 'whatsapp') && !primaryPhoneSet) {
isPrimary = true;
primaryPhoneSet = true;
}
}
return {
clientId,
channel: ct.channel,
value: ct.value,
valueE164: ct.valueE164 ?? null,
valueCountry: ct.valueCountry ?? null,
isPrimary,
};
});
await db.insert(clientContacts).values(contactRows);
result.contactsInserted += contactRows.length;
}
// Addresses: bulk insert; first is marked primary if multiple. Note the
// schema requires portId on every address row in addition to clientId.
if (planned.addresses.length > 0) {
const addressRows = planned.addresses.map((a, idx) => ({
clientId,
portId: opts.port.id,
streetAddress: a.streetAddress ?? null,
city: a.city ?? null,
countryIso: a.countryIso ?? null,
isPrimary: idx === 0,
}));
await db.insert(clientAddresses).values(addressRows);
result.addressesInserted += addressRows.length;
}
result.clientsInserted += 1;
return { clientId, inserted: true };
}
/** Apply a single PlannedInterest — looks up its client + berth + yacht and
* inserts the interest row, plus a yacht stub if a yacht name is present. */
async function applyInterest(
planned: PlannedInterest,
tempIdToClientId: Map<string, string>,
mooringToBerthId: Map<string, string>,
opts: ApplyOptions,
result: ApplyResult,
): Promise<void> {
// Idempotency: skip if this source row already created an interest.
const existing = await resolveExistingLink(planned.sourceId, 'interest');
if (existing) {
result.interestsSkipped += 1;
return;
}
const clientId = tempIdToClientId.get(planned.clientTempId);
if (!clientId) {
result.warnings.push(
`Interest source=${planned.sourceId} references unknown client tempId=${planned.clientTempId} — skipped`,
);
return;
}
let berthId: string | null = null;
if (planned.berthMooringNumber) {
berthId =
mooringToBerthId.get(planned.berthMooringNumber) ??
// The legacy NocoDB Interests table uses bare mooring strings like
// "D32", "B16", whereas the new berths schema (mirroring the NocoDB
// Berths snapshot) uses zero-padded "D-32", "B-16". Try the dashed
// form as a fallback so legacy references resolve correctly.
mooringToBerthId.get(normalizeLegacyMooring(planned.berthMooringNumber)) ??
null;
if (!berthId) {
result.warnings.push(
`Interest source=${planned.sourceId} references unknown mooring="${planned.berthMooringNumber}" — interest created without berth link`,
);
}
}
// Optional yacht stub: if the legacy row had a yacht name, create a
// minimal yacht record owned by the client. The new schema requires
// currentOwnerType + currentOwnerId.
let yachtId: string | null = null;
if (planned.yachtName) {
const existingYacht = await resolveExistingLink(planned.sourceId, 'yacht');
if (existingYacht) {
yachtId = existingYacht;
} else if (!opts.rehearsal) {
const [y] = await db
.insert(yachts)
.values({
portId: opts.port.id,
name: planned.yachtName,
currentOwnerType: 'client',
currentOwnerId: clientId,
status: 'active',
})
.returning({ id: yachts.id });
if (y) {
yachtId = y.id;
await db.insert(migrationSourceLinks).values({
sourceSystem: SOURCE_SYSTEM,
sourceId: String(planned.sourceId),
targetEntityType: 'yacht' as const,
targetEntityId: y.id,
appliedId: opts.applyId,
...(opts.appliedBy ? { appliedBy: opts.appliedBy } : {}),
});
result.yachtsInserted += 1;
}
}
}
if (opts.rehearsal) {
result.interestsInserted += 1;
return;
}
const [iRow] = await db
.insert(interests)
.values({
portId: opts.port.id,
clientId,
berthId,
yachtId,
pipelineStage: planned.pipelineStage,
leadCategory: planned.leadCategory,
source: planned.source,
notes: planned.notes,
documensoId: planned.documensoId,
dateEoiSent: planned.dateEoiSent ? new Date(planned.dateEoiSent) : null,
dateEoiSigned: planned.dateEoiSigned ? new Date(planned.dateEoiSigned) : null,
dateContractSent: planned.dateContractSent ? new Date(planned.dateContractSent) : null,
dateContractSigned: planned.dateContractSigned ? new Date(planned.dateContractSigned) : null,
dateDepositReceived: planned.dateDepositReceived
? new Date(planned.dateDepositReceived)
: null,
dateLastContact: planned.dateLastContact ? new Date(planned.dateLastContact) : null,
})
.returning({ id: interests.id });
if (!iRow) throw new Error('Interest insert returned no row');
await db.insert(migrationSourceLinks).values({
sourceSystem: SOURCE_SYSTEM,
sourceId: String(planned.sourceId),
targetEntityType: 'interest' as const,
targetEntityId: iRow.id,
appliedId: opts.applyId,
...(opts.appliedBy ? { appliedBy: opts.appliedBy } : {}),
});
result.interestsInserted += 1;
}
/**
* Top-level apply driver. Walks the plan once, building the
* tempId→clientId map as it goes, then walks interests with that map.
*/
export async function applyPlan(plan: MigrationPlan, opts: ApplyOptions): Promise<ApplyResult> {
const result: ApplyResult = {
applyId: opts.applyId,
clientsInserted: 0,
clientsSkipped: 0,
contactsInserted: 0,
addressesInserted: 0,
yachtsInserted: 0,
interestsInserted: 0,
interestsSkipped: 0,
warnings: [],
};
// 1. Clients (and their contacts/addresses)
const tempIdToClientId = new Map<string, string>();
for (const planned of plan.clients) {
const { clientId } = await applyClient(planned, opts, result);
tempIdToClientId.set(planned.tempId, clientId);
}
// 2. Build mooring→berthId lookup once, scoped to this port.
const berthRows = await db
.select({ id: berths.id, mooringNumber: berths.mooringNumber })
.from(berths)
.where(eq(berths.portId, opts.port.id));
const mooringToBerthId = new Map(berthRows.map((b) => [b.mooringNumber, b.id]));
// 3. Interests (and yacht stubs)
for (const planned of plan.interests) {
await applyInterest(planned, tempIdToClientId, mooringToBerthId, opts, result);
}
return result;
}
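
A sketch of the driver that would sit on top of applyPlan (import paths and flag handling assumed — the real CLI wraps this with --dry-run / --apply):

import { randomUUID } from 'node:crypto';
import { transformSnapshot } from './migration-transform';
import { applyPlan } from './migration-apply';
import type { NocoDbSnapshot } from './nocodb-source';

async function previewApply(snapshot: NocoDbSnapshot, portId: string) {
  const plan = transformSnapshot(snapshot);
  // rehearsal: true exercises every lookup but skips the writes.
  const result = await applyPlan(plan, {
    port: { id: portId, slug: 'port-nimara' },
    applyId: randomUUID(),
    rehearsal: true,
  });
  console.log(`${result.interestsInserted} interests would be created`);
  for (const w of result.warnings) console.warn(w);
}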


@@ -0,0 +1,274 @@
/**
* Migration report writer — turns a `MigrationPlan` (from
* `migration-transform.ts`) into a CSV + a human-readable Markdown
* summary on disk under `.migration/<timestamp>/`.
*
* The CSV format is intentionally machine-friendly (one row per
* planned operation) so it can be diffed across runs and inspected
* by hand. The summary is designed for "open this in your editor and
* eyeball it for 5 minutes before --apply."
*/
import { promises as fs } from 'node:fs';
import path from 'node:path';
import type { MigrationPlan } from './migration-transform';
// ─── Output directory ───────────────────────────────────────────────────────
export interface ReportPaths {
rootDir: string;
csvPath: string;
summaryPath: string;
planJsonPath: string;
}
/** Resolve report paths relative to the worktree root. The timestamped
* directory is created lazily by `writeReport`. */
export function resolveReportPaths(
rootDir: string,
timestamp: string = new Date().toISOString().replace(/[:.]/g, '-'),
): ReportPaths {
const dir = path.join(rootDir, '.migration', timestamp);
return {
rootDir: dir,
csvPath: path.join(dir, 'report.csv'),
summaryPath: path.join(dir, 'summary.md'),
planJsonPath: path.join(dir, 'plan.json'),
};
}
// ─── CSV row shape ──────────────────────────────────────────────────────────
interface CsvRow {
  op: string; // create_client / create_contact / create_address / create_interest / auto_link / flag / needs_review
reason: string;
source_id: string;
target_table: string;
target_value: string;
confidence: string;
manual_review: 'true' | 'false';
}
// Trivial CSV escape: quote any cell that contains comma / quote / newline,
// double up internal quotes per RFC 4180. No need for a dependency.
function csvEscape(s: string): string {
if (/[",\n\r]/.test(s)) {
return `"${s.replace(/"/g, '""')}"`;
}
return s;
}
function rowToCsvLine(r: CsvRow): string {
return [
r.op,
r.reason,
r.source_id,
r.target_table,
r.target_value,
r.confidence,
r.manual_review,
]
.map(csvEscape)
.join(',');
}
// ─── Build CSV ──────────────────────────────────────────────────────────────
export function buildCsv(plan: MigrationPlan): string {
const lines: string[] = [];
lines.push(
[
'op',
'reason',
'source_id',
'target_table',
'target_value',
'confidence',
'manual_review',
].join(','),
);
for (const client of plan.clients) {
lines.push(
rowToCsvLine({
op: 'create_client',
reason: client.sourceIds.length > 1 ? 'auto-merged cluster' : 'new',
source_id: client.sourceIds.join('|'),
target_table: 'clients.fullName',
target_value: client.fullName,
confidence: 'N/A',
manual_review: 'false',
}),
);
for (const c of client.contacts) {
lines.push(
rowToCsvLine({
op: 'create_contact',
reason: c.flagged ?? 'new',
source_id: client.sourceIds.join('|'),
target_table: `clientContacts.${c.channel}`,
target_value: c.value,
confidence: 'N/A',
manual_review: c.flagged ? 'true' : 'false',
}),
);
}
for (const a of client.addresses) {
lines.push(
rowToCsvLine({
op: 'create_address',
reason: 'address text present',
source_id: client.sourceIds.join('|'),
target_table: 'clientAddresses.countryIso',
target_value: a.countryIso ?? '(unresolved)',
confidence: a.countryConfidence ?? 'fallback',
manual_review: a.countryConfidence === 'fallback' || !a.countryIso ? 'true' : 'false',
}),
);
}
}
for (const interest of plan.interests) {
lines.push(
rowToCsvLine({
op: 'create_interest',
reason: `pipelineStage=${interest.pipelineStage}`,
source_id: String(interest.sourceId),
target_table: 'interests',
target_value: `${interest.berthMooringNumber ?? '(no berth)'} / ${interest.yachtName ?? '(no yacht)'}`,
confidence: 'N/A',
manual_review: 'false',
}),
);
}
for (const link of plan.autoLinks) {
lines.push(
rowToCsvLine({
op: 'auto_link',
reason: link.reasons.join(' + '),
source_id: `${link.leadSourceId}<-${link.mergedSourceIds.join(',')}`,
target_table: 'clients',
target_value: '(merged into lead)',
confidence: `score=${link.score}`,
manual_review: 'false',
}),
);
}
for (const pair of plan.needsReview) {
lines.push(
rowToCsvLine({
op: 'needs_review',
reason: pair.reasons.join(' + '),
source_id: `${pair.aSourceId}<->${pair.bSourceId}`,
target_table: 'clients',
target_value: '(human review required)',
confidence: `score=${pair.score}`,
manual_review: 'true',
}),
);
}
for (const flag of plan.flags) {
lines.push(
rowToCsvLine({
op: 'flag',
reason: flag.reason,
source_id: String(flag.sourceId),
target_table: flag.sourceTable,
target_value: JSON.stringify(flag.details ?? {}),
confidence: 'N/A',
manual_review: 'true',
}),
);
}
return lines.join('\n') + '\n';
}
// ─── Build summary markdown ─────────────────────────────────────────────────
export function buildSummary(plan: MigrationPlan, generatedAt: string): string {
const s = plan.stats;
const lines: string[] = [];
lines.push(`# Migration Dry-Run — ${generatedAt}`);
lines.push('');
lines.push('## Input');
lines.push(`- ${s.inputInterestRows} NocoDB Interests`);
lines.push(`- ${s.inputResidentialRows} NocoDB Residential Interests`);
lines.push('');
lines.push('## Outcome');
lines.push(`- ${s.outputClients} clients`);
lines.push(`- ${s.outputInterests} interests (one per source row, linked to deduped client)`);
lines.push(`- ${s.outputContacts} client_contacts`);
lines.push(`- ${s.outputAddresses} client_addresses`);
lines.push('');
lines.push('## Auto-linked clusters');
if (plan.autoLinks.length === 0) {
lines.push('_None — every input row maps to a unique client._');
} else {
for (const link of plan.autoLinks) {
const merged = link.mergedSourceIds.length;
lines.push(
`- Lead row \`${link.leadSourceId}\` ← merged ${merged} other row${merged === 1 ? '' : 's'} (\`${link.mergedSourceIds.join(', ')}\`) — score ${link.score} via ${link.reasons.join(' + ')}`,
);
}
}
lines.push('');
lines.push('## Pairs flagged for human review');
if (plan.needsReview.length === 0) {
lines.push('_None._');
} else {
for (const pair of plan.needsReview) {
lines.push(
        `- Rows \`${pair.aSourceId}\` ↔ \`${pair.bSourceId}\` — score ${pair.score} (${pair.reasons.join(' + ')})`,
);
}
}
lines.push('');
lines.push('## Data quality flags');
if (plan.flags.length === 0) {
lines.push('_No quality issues._');
} else {
const byReason = new Map<string, number>();
for (const f of plan.flags) {
byReason.set(f.reason, (byReason.get(f.reason) ?? 0) + 1);
}
for (const [reason, count] of [...byReason].sort((a, b) => b[1] - a[1])) {
lines.push(`- **${count}× ${reason}**`);
}
lines.push('');
lines.push('### Detail');
for (const f of plan.flags.slice(0, 30)) {
lines.push(
        `- \`${f.sourceTable}#${f.sourceId}\`: ${f.reason}${f.details ? ` — \`${JSON.stringify(f.details)}\`` : ''}`,
);
}
if (plan.flags.length > 30) {
lines.push(`- _… and ${plan.flags.length - 30} more (see report.csv for full list)_`);
}
}
lines.push('');
lines.push('## Next step');
lines.push('');
lines.push('Eyeball the auto-linked + flagged-for-review pairs above.');
lines.push('When satisfied, re-run the script with `--apply --report .migration/<this-dir>/`.');
lines.push('Apply will refuse to run if the source NocoDB has changed since this dry-run.');
return lines.join('\n') + '\n';
}
// ─── Write to disk ──────────────────────────────────────────────────────────
export async function writeReport(
paths: ReportPaths,
plan: MigrationPlan,
generatedAt: string,
): Promise<void> {
await fs.mkdir(paths.rootDir, { recursive: true });
await fs.writeFile(paths.csvPath, buildCsv(plan), 'utf-8');
await fs.writeFile(paths.summaryPath, buildSummary(plan, generatedAt), 'utf-8');
await fs.writeFile(paths.planJsonPath, JSON.stringify(plan, null, 2), 'utf-8');
}
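
End-to-end, the dry-run tail looks roughly like this (a sketch; import paths assumed):

import type { NocoDbSnapshot } from './nocodb-source';
import { transformSnapshot } from './migration-transform';
import { resolveReportPaths, writeReport } from './migration-report';

// Build the plan, then write report.csv + summary.md + plan.json to disk.
async function writeDryRunReport(snapshot: NocoDbSnapshot): Promise<string> {
  const plan = transformSnapshot(snapshot);
  const generatedAt = new Date().toISOString();
  const paths = resolveReportPaths(process.cwd());
  await writeReport(paths, plan, generatedAt);
  return paths.rootDir; // .migration/<timestamp>/
}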


@@ -0,0 +1,576 @@
/**
* Pure transform: NocoDB snapshot → planned new-system entities + dedup result.
*
* Used by the migration script's `--dry-run` (to produce the report) and
* `--apply` (to actually write). Keeping this pure means the same code
* runs in both modes, in tests against the frozen fixture, and in the
* one-off CLI run against the live base.
*
* No side effects, no DB calls, no external services.
*/
import {
normalizeName,
normalizeEmail,
normalizePhone,
resolveCountry,
type NormalizedPhone,
} from './normalize';
import { findClientMatches, type MatchCandidate } from './find-matches';
import type { CountryCode } from '@/lib/i18n/countries';
import type { NocoDbRow, NocoDbSnapshot } from './nocodb-source';
// ─── Plan output ────────────────────────────────────────────────────────────
export interface PlannedClient {
/** Stable id derived from the deduped cluster's lead row. Used by the
* apply phase to reference newly-created clients before they exist
* in the DB. */
tempId: string;
/** Source row IDs that contributed to this client (one if no duplicates,
* many if dedup merged a cluster). */
sourceIds: number[];
fullName: string;
surnameToken?: string;
countryIso: CountryCode | null;
preferredContactMethod: string | null;
source: string | null;
contacts: PlannedContact[];
addresses: PlannedAddress[];
}
export interface PlannedContact {
channel: 'email' | 'phone' | 'whatsapp' | 'other';
value: string;
valueE164?: string | null;
valueCountry?: CountryCode | null;
isPrimary: boolean;
flagged?: string;
}
export interface PlannedAddress {
streetAddress: string | null;
city: string | null;
countryIso: CountryCode | null;
/** When confidence is low, the migration script flags the row for
* human review. */
countryConfidence: 'exact' | 'fuzzy' | 'city' | 'fallback' | null;
}
export interface PlannedInterest {
/** NocoDB row id this interest came from. */
sourceId: number;
/** tempId of the planned client this interest hangs off. */
clientTempId: string;
pipelineStage: string;
leadCategory: string | null;
source: string | null;
notes: string | null;
/** Mooring number; the apply phase resolves this to a berthId via the
* new-system Berths table. */
berthMooringNumber: string | null;
yachtName: string | null;
/** Date stamps for milestone columns. ISO strings if parseable. */
dateEoiSent: string | null;
dateEoiSigned: string | null;
dateDepositReceived: string | null;
dateContractSent: string | null;
dateContractSigned: string | null;
dateLastContact: string | null;
/** Documenso linkage carried forward when present so the document
* record can be stitched up downstream. */
documensoId: string | null;
}
export interface MigrationFlag {
sourceTable: 'interests' | 'residential_interests' | 'website_interest_submissions';
sourceId: number;
reason: string;
details?: Record<string, unknown>;
}
export interface MigrationPlan {
clients: PlannedClient[];
interests: PlannedInterest[];
flags: MigrationFlag[];
/** Pairs that the migration would auto-link (high score). */
autoLinks: Array<{
leadSourceId: number;
mergedSourceIds: number[];
score: number;
reasons: string[];
}>;
/** Pairs that need human review (medium score). Each pair shows up
* in the migration report; the user resolves before --apply. */
needsReview: Array<{ aSourceId: number; bSourceId: number; score: number; reasons: string[] }>;
stats: MigrationStats;
}
export interface MigrationStats {
inputInterestRows: number;
inputResidentialRows: number;
outputClients: number;
outputInterests: number;
outputContacts: number;
outputAddresses: number;
flaggedRows: number;
autoLinkedClusters: number;
needsReviewPairs: number;
}
export interface TransformOptions {
/** ISO country used when a phone has no prefix and the row has no
* Place of Residence. Defaults to AI (Anguilla / Port Nimara's home). */
defaultPhoneCountry: CountryCode;
/** Score thresholds for auto-link vs human review. Should match the
* per-port `system_settings` values once the runtime UI is in place. */
thresholds: {
autoLink: number;
needsReview: number;
};
}
const DEFAULT_OPTIONS: TransformOptions = {
defaultPhoneCountry: 'AI',
thresholds: { autoLink: 90, needsReview: 50 },
};
// ─── Stage mapping ──────────────────────────────────────────────────────────
const STAGE_MAP: Record<string, string> = {
'General Qualified Interest': 'open',
'Specific Qualified Interest': 'details_sent',
'EOI and NDA Sent': 'eoi_sent',
'Signed EOI and NDA': 'eoi_signed',
'Made Reservation': 'deposit_10pct',
'Contract Negotiation': 'contract_sent',
'Contract Negotiations Finalized': 'contract_sent',
'Contract Signed': 'contract_signed',
};
const LEAD_CATEGORY_MAP: Record<string, string> = {
General: 'general_interest',
'Friends and Family': 'general_interest',
};
const SOURCE_MAP: Record<string, string> = {
portal: 'website',
Form: 'website',
External: 'manual',
};
// ─── Date parsing ───────────────────────────────────────────────────────────
/**
* Parse a date the legacy NocoDB might have stored in DD-MM-YYYY,
* DD/MM/YYYY, YYYY-MM-DD, or ISO format. Returns ISO string or null.
*/
function parseFlexibleDate(input: unknown): string | null {
if (typeof input !== 'string' || input.trim() === '') return null;
const s = input.trim();
// Already ISO
if (/^\d{4}-\d{2}-\d{2}/.test(s)) {
const d = new Date(s);
return Number.isNaN(d.getTime()) ? null : d.toISOString();
}
// DD-MM-YYYY or DD/MM/YYYY
const m = s.match(/^(\d{1,2})[-/](\d{1,2})[-/](\d{4})$/);
if (m) {
const [, day, month, year] = m;
const iso = `${year}-${month!.padStart(2, '0')}-${day!.padStart(2, '0')}`;
const d = new Date(iso);
return Number.isNaN(d.getTime()) ? null : d.toISOString();
}
// Anything else: try Date constructor as a last resort
const d = new Date(s);
return Number.isNaN(d.getTime()) ? null : d.toISOString();
}
// ─── Main transform ─────────────────────────────────────────────────────────
/**
* Run the full transform pipeline against a NocoDB snapshot. Pure
* function — same input always produces the same plan.
*/
export function transformSnapshot(
snapshot: NocoDbSnapshot,
options: Partial<TransformOptions> = {},
): MigrationPlan {
const opts = { ...DEFAULT_OPTIONS, ...options };
const flags: MigrationFlag[] = [];
// Build per-row candidates first so we can run dedup before assigning
// tempIds (clients with multiple source rows merge into one tempId).
const perRow = snapshot.interests.map((row) => rowToCandidate(row, 'interests', opts, flags));
// Dedup pass 1: every row scored against every other row (within the
// same pool). The blocking strategy in `findClientMatches` keeps this
// cheap even for the full 252-row dataset.
const clusters = clusterByDedup(perRow, opts);
// Build the planned clients + interests from the clusters.
const clients: PlannedClient[] = [];
const interests: PlannedInterest[] = [];
const autoLinks: MigrationPlan['autoLinks'] = [];
const needsReview: MigrationPlan['needsReview'] = [];
for (const cluster of clusters) {
const lead = cluster.leadCandidate;
const tempId = `client-${lead.row.Id}`;
// Build the client record from the lead row, then merge in any
// contact info / address info from the other rows in the cluster.
const planned = buildPlannedClient(tempId, cluster, opts);
clients.push(planned);
// Each row in the cluster becomes its own interest record.
for (const member of cluster.members) {
const interest = buildPlannedInterest(member.row, tempId);
interests.push(interest);
}
if (cluster.members.length > 1) {
autoLinks.push({
leadSourceId: lead.row.Id,
mergedSourceIds: cluster.members.filter((m) => m !== lead).map((m) => m.row.Id),
score: cluster.maxScore,
reasons: cluster.reasons,
});
}
for (const pair of cluster.reviewPairs) {
needsReview.push(pair);
}
}
return {
clients,
interests,
flags,
autoLinks,
needsReview,
stats: {
inputInterestRows: snapshot.interests.length,
inputResidentialRows: snapshot.residentialInterests.length,
outputClients: clients.length,
outputInterests: interests.length,
outputContacts: clients.reduce((sum, c) => sum + c.contacts.length, 0),
outputAddresses: clients.reduce((sum, c) => sum + c.addresses.length, 0),
flaggedRows: flags.length,
autoLinkedClusters: autoLinks.length,
needsReviewPairs: needsReview.length,
},
};
}
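// Hypothetical wiring from the one-shot script (a sketch only;
// scripts/migrate-from-nocodb.ts itself isn't shown in this diff):
//   const snapshot = await fetchSnapshot(loadNocoDbConfig());
//   const plan = transformSnapshot(snapshot, { thresholds: { autoLink: 90, needsReview: 50 } });
//   console.log(plan.stats, plan.flags.length, plan.needsReview.length);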
// ─── Helpers ────────────────────────────────────────────────────────────────
interface RowCandidate {
row: NocoDbRow;
candidate: MatchCandidate;
/** Phone normalize result for the row's primary phone; used downstream
* to attach valueE164 + country to the planned contact. */
phoneResult: NormalizedPhone | null;
/** Country resolved from "Place of Residence". */
countryIso: CountryCode | null;
countryConfidence: 'exact' | 'fuzzy' | 'city' | null;
/** Normalized email or null. */
email: string | null;
/** Display name from `normalizeName`. */
displayName: string;
}
function rowToCandidate(
row: NocoDbRow,
sourceTable: MigrationFlag['sourceTable'],
opts: TransformOptions,
flags: MigrationFlag[],
): RowCandidate {
const rawName = (row['Full Name'] as string | undefined) ?? '';
const rawEmail = (row['Email Address'] as string | undefined) ?? '';
const rawPhone = (row['Phone Number'] as string | undefined) ?? '';
const rawCountry = (row['Place of Residence'] as string | undefined) ?? '';
const normName = normalizeName(rawName);
const email = normalizeEmail(rawEmail);
const country = resolveCountry(rawCountry);
const phoneCountry = country.iso ?? opts.defaultPhoneCountry;
const phoneResult = normalizePhone(rawPhone, phoneCountry as CountryCode);
// Surface anything weird so the report can show it.
if (rawPhone && !phoneResult?.e164) {
flags.push({
sourceTable,
sourceId: row.Id,
reason: phoneResult?.flagged ? `phone ${phoneResult.flagged}` : 'phone unparseable',
details: { rawPhone },
});
}
if (rawEmail && !email) {
flags.push({
sourceTable,
sourceId: row.Id,
reason: 'email invalid',
details: { rawEmail },
});
}
if (rawCountry && !country.iso) {
flags.push({
sourceTable,
sourceId: row.Id,
reason: 'country unresolved',
details: { rawCountry },
});
}
const candidate: MatchCandidate = {
id: String(row.Id),
fullName: normName.display || null,
surnameToken: normName.surnameToken ?? null,
emails: email ? [email] : [],
phonesE164: phoneResult?.e164 ? [phoneResult.e164] : [],
countryIso: country.iso ?? null,
};
return {
row,
candidate,
phoneResult,
countryIso: country.iso ?? null,
countryConfidence: country.confidence,
email,
displayName: normName.display,
};
}
interface Cluster {
/** The cluster's "lead" row (most complete + most recent). */
leadCandidate: RowCandidate;
members: RowCandidate[];
maxScore: number;
reasons: string[];
/** Pairs in this cluster that scored medium (need review). */
reviewPairs: Array<{ aSourceId: number; bSourceId: number; score: number; reasons: string[] }>;
}
function clusterByDedup(rows: RowCandidate[], opts: TransformOptions): Cluster[] {
// Use a union-find structure indexed by row id. Every pair with a
// score >= autoLink threshold gets unioned. Pairs in [needsReview,
// autoLink) accumulate onto the cluster's reviewPairs list — they're
// surfaced for human triage but not auto-merged.
const parent = new Map<string, string>();
for (const r of rows) parent.set(r.candidate.id, r.candidate.id);
const find = (id: string): string => {
let cur = id;
while (parent.get(cur) !== cur) {
const next = parent.get(cur)!;
parent.set(cur, parent.get(next)!); // path halving: point cur at its grandparent
cur = parent.get(cur)!;
}
return cur;
};
const union = (a: string, b: string) => {
const rootA = find(a);
const rootB = find(b);
if (rootA !== rootB) parent.set(rootA, rootB);
};
const clusterReasons = new Map<string, string[]>();
const clusterMaxScore = new Map<string, number>();
const clusterReviewPairs = new Map<string, Cluster['reviewPairs']>();
// Score every candidate against every other candidate. The find-matches
// function does its own blocking so this is cheap.
for (let i = 0; i < rows.length; i += 1) {
const left = rows[i]!;
const remainingPool = rows.slice(i + 1).map((r) => r.candidate);
if (remainingPool.length === 0) continue;
const matches = findClientMatches(left.candidate, remainingPool, {
highScore: opts.thresholds.autoLink,
mediumScore: opts.thresholds.needsReview,
});
for (const m of matches) {
if (m.score >= opts.thresholds.autoLink) {
union(left.candidate.id, m.candidate.id);
const root = find(left.candidate.id);
clusterMaxScore.set(root, Math.max(clusterMaxScore.get(root) ?? 0, m.score));
const existing = clusterReasons.get(root) ?? [];
for (const reason of m.reasons) {
if (!existing.includes(reason)) existing.push(reason);
}
clusterReasons.set(root, existing);
} else if (m.score >= opts.thresholds.needsReview) {
// Medium — track on whichever cluster `left` belongs to.
const root = find(left.candidate.id);
const list = clusterReviewPairs.get(root) ?? [];
list.push({
aSourceId: parseInt(left.candidate.id, 10),
bSourceId: parseInt(m.candidate.id, 10),
score: m.score,
reasons: m.reasons,
});
clusterReviewPairs.set(root, list);
}
}
}
// Group rows by their cluster root.
const byRoot = new Map<string, RowCandidate[]>();
for (const r of rows) {
const root = find(r.candidate.id);
const list = byRoot.get(root) ?? [];
list.push(r);
byRoot.set(root, list);
}
// Build cluster objects, choosing the most-complete row as the lead.
const clusters: Cluster[] = [];
for (const [root, members] of byRoot) {
const lead = pickLead(members);
clusters.push({
leadCandidate: lead,
members,
maxScore: clusterMaxScore.get(root) ?? 0,
reasons: clusterReasons.get(root) ?? [],
reviewPairs: clusterReviewPairs.get(root) ?? [],
});
}
return clusters;
}
function pickLead(rows: RowCandidate[]): RowCandidate {
// Pick the row with the most populated fields, breaking ties by
// recency (highest Id, since NocoDB IDs are monotonic).
return rows.reduce((best, current) => {
const bestScore = completenessScore(best);
const currentScore = completenessScore(current);
if (currentScore > bestScore) return current;
if (currentScore === bestScore && current.row.Id > best.row.Id) return current;
return best;
});
}
function completenessScore(r: RowCandidate): number {
let score = 0;
if (r.email) score += 1;
if (r.phoneResult?.e164) score += 1;
if (r.row['Address']) score += 0.5;
if (r.row['Yacht Name']) score += 0.5;
if (r.row['Source']) score += 0.25;
if (r.row['Lead Category']) score += 0.25;
if (r.row['Internal Notes']) score += 0.25;
return score;
}
function buildPlannedClient(
tempId: string,
cluster: Cluster,
opts: TransformOptions,
): PlannedClient {
const lead = cluster.leadCandidate;
// Collect distinct emails + phones from across the cluster — duplicate
// submissions often come with different contact methods we want to
// preserve as multiple rows in `client_contacts`.
const seenEmails = new Set<string>();
const seenPhones = new Set<string>();
const contacts: PlannedContact[] = [];
for (const member of cluster.members) {
if (member.email && !seenEmails.has(member.email)) {
seenEmails.add(member.email);
contacts.push({
channel: 'email',
value: member.email,
isPrimary: contacts.length === 0,
});
}
if (member.phoneResult?.e164 && !seenPhones.has(member.phoneResult.e164)) {
seenPhones.add(member.phoneResult.e164);
const isFirstPhone = !contacts.some((c) => c.channel === 'phone');
contacts.push({
channel: 'phone',
value: member.phoneResult.e164,
valueE164: member.phoneResult.e164,
valueCountry: member.phoneResult.country,
isPrimary: isFirstPhone && contacts.every((c) => !c.isPrimary || c.channel === 'email'),
flagged: member.phoneResult.flagged,
});
}
}
// Contact primacy: as built above, the first email and the first phone can
// each end up flagged primary (primaries are effectively per-channel). The
// lead row's explicitly preferred contact method is carried over below.
const preferredMethod = (lead.row['Contact Method Preferred'] as string | undefined)
?.toLowerCase()
?.trim();
// Address: only build if the lead row has a meaningful address text.
const rawAddress = (lead.row['Address'] as string | undefined)?.trim();
const addresses: PlannedAddress[] = [];
if (rawAddress) {
addresses.push({
streetAddress: rawAddress,
city: null,
countryIso: lead.countryIso ?? opts.defaultPhoneCountry,
countryConfidence: lead.countryConfidence ?? 'fallback',
});
}
const sourceFromRow = (lead.row['Source'] as string | undefined) ?? null;
const mappedSource = sourceFromRow ? (SOURCE_MAP[sourceFromRow] ?? 'manual') : null;
return {
tempId,
sourceIds: cluster.members.map((m) => m.row.Id),
fullName: lead.displayName,
surnameToken: lead.candidate.surnameToken ?? undefined,
countryIso: lead.countryIso,
preferredContactMethod: preferredMethod ?? null,
source: mappedSource,
contacts,
addresses,
};
}
function buildPlannedInterest(row: NocoDbRow, clientTempId: string): PlannedInterest {
const stage = (row['Sales Process Level'] as string | undefined) ?? '';
const cat = (row['Lead Category'] as string | undefined) ?? '';
const notesParts: string[] = [];
const internalNotes = row['Internal Notes'] as string | undefined;
const extraComments = row['Extra Comments'] as string | undefined;
if (internalNotes?.trim()) notesParts.push(internalNotes.trim());
if (extraComments?.trim()) notesParts.push(`Extra Comments: ${extraComments.trim()}`);
const berthSize = row['Berth Size Desired'] as string | undefined;
if (berthSize?.trim()) notesParts.push(`Berth size desired: ${berthSize.trim()}`);
return {
sourceId: row.Id,
clientTempId,
pipelineStage: STAGE_MAP[stage] ?? 'open',
leadCategory: LEAD_CATEGORY_MAP[cat] ?? null,
source: ((row['Source'] as string | undefined) ?? null) || null, // coerce '' to null
notes: notesParts.join('\n\n') || null,
berthMooringNumber: (row['Berth Number'] as string | undefined) ?? null,
yachtName: (() => {
const n = (row['Yacht Name'] as string | undefined)?.trim();
// Filter placeholder values used by sales reps for "we don't know yet".
if (!n) return null;
if (['TBC', 'Na', 'NA', 'na', 'N/A', 'TBD', 'tbd'].includes(n)) return null;
return n;
})(),
dateEoiSent: parseFlexibleDate(row['EOI Time Sent']),
dateEoiSigned: parseFlexibleDate(row['all_signed_notified_at'] ?? row['developerSignTime']),
dateDepositReceived: null, // not directly tracked in legacy schema
dateContractSent: parseFlexibleDate(row['Time LOI Sent']),
dateContractSigned: parseFlexibleDate(row['developerSignTime']),
dateLastContact: parseFlexibleDate(row['Created At'] ?? row['Date Added']),
documensoId: (row['documensoID'] as string | undefined) ?? null,
};
}

View File

@@ -0,0 +1,152 @@
/**
* Read-only adapter for the legacy NocoDB Port Nimara base.
*
* Used by the one-shot migration script (`scripts/migrate-from-nocodb.ts`)
* to pull every Interest, Residential Interest, and Website Submission
* row from the source-of-truth NocoDB tables. No mutations.
*
* Auth: `xc-token` header per NocoDB v2 API.
*
* The shape returned is a verbatim record of the row's fields — caller
* is responsible for mapping to the new schema via `nocodb-transform.ts`.
*/
import { z } from 'zod';
// ─── Configuration ──────────────────────────────────────────────────────────
const ConfigSchema = z.object({
url: z.string().url(),
token: z.string().min(1),
});
export interface NocoDbConfig {
url: string;
token: string;
}
export function loadNocoDbConfig(env: NodeJS.ProcessEnv = process.env): NocoDbConfig {
return ConfigSchema.parse({
url: env.NOCODB_URL,
token: env.NOCODB_TOKEN,
});
}
// ─── Table identifiers ──────────────────────────────────────────────────────
//
// These IDs are stable per the NocoDB base — they were captured during the
// 2026-05-03 audit and won't change unless the base is rebuilt. If the
// base is reset, regenerate them from `getTablesList`.
export const NOCO_TABLES = {
interests: 'mbs9hjauug4eseo',
residentialInterests: 'mscfpwwwjuds4nt',
websiteInterestSubmissions: 'mevkpcih67c6jsm',
websiteContactFormSubmissions: 'mxk5cd0pmwnwlcl',
websiteBerthEoiSupplements: 'mglmioo0ku8zgqj',
berths: 'mczgos9hr3oa9qc',
} as const;
// ─── HTTP shape ─────────────────────────────────────────────────────────────
interface NocoDbListResponse<T> {
list: T[];
pageInfo: {
totalRows: number;
page: number;
pageSize: number;
isFirstPage: boolean;
isLastPage: boolean;
};
}
/** A row's `Id` is always present. The rest of the fields vary per table. */
export type NocoDbRow = Record<string, unknown> & { Id: number };
// ─── Public API ─────────────────────────────────────────────────────────────
/**
* Fetch all rows from a NocoDB table. Auto-paginates until the API
* reports `isLastPage`. The legacy base is small (the largest table,
* Interests, has 252 rows) so we keep this simple — no streaming.
*/
export async function fetchAllRows(
tableId: string,
config: NocoDbConfig,
pageSize = 250,
): Promise<NocoDbRow[]> {
const all: NocoDbRow[] = [];
let page = 1;
// Hard cap to prevent infinite-loop bugs if pageInfo lies. Each page
// pulls up to `pageSize` rows, so 200 pages * 250 = 50k rows is the
// maximum we'll ever fetch from one table.
const MAX_PAGES = 200;
while (page <= MAX_PAGES) {
const url = new URL(`${config.url}/api/v2/tables/${tableId}/records`);
url.searchParams.set('limit', String(pageSize));
url.searchParams.set('offset', String((page - 1) * pageSize));
const res = await fetch(url, {
headers: {
'xc-token': config.token,
accept: 'application/json',
},
});
if (!res.ok) {
throw new Error(
`NocoDB fetch failed: ${res.status} ${res.statusText} — table ${tableId} page ${page}`,
);
}
const json = (await res.json()) as NocoDbListResponse<NocoDbRow>;
all.push(...json.list);
if (json.pageInfo.isLastPage || json.list.length === 0) break;
page += 1;
}
return all;
}
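// Usage sketch (illustrative, using only the exports defined in this file):
//   const config = loadNocoDbConfig();
//   const interestRows = await fetchAllRows(NOCO_TABLES.interests, config);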
/**
* Convenience snapshot — pulls every table the migration cares about
* in parallel. Returned shape is the input the transform layer expects.
*/
export interface NocoDbSnapshot {
interests: NocoDbRow[];
residentialInterests: NocoDbRow[];
websiteInterestSubmissions: NocoDbRow[];
websiteContactFormSubmissions: NocoDbRow[];
websiteBerthEoiSupplements: NocoDbRow[];
berths: NocoDbRow[];
fetchedAt: string;
}
export async function fetchSnapshot(config: NocoDbConfig): Promise<NocoDbSnapshot> {
const [
interests,
residentialInterests,
websiteInterestSubmissions,
websiteContactFormSubmissions,
websiteBerthEoiSupplements,
berths,
] = await Promise.all([
fetchAllRows(NOCO_TABLES.interests, config),
fetchAllRows(NOCO_TABLES.residentialInterests, config),
fetchAllRows(NOCO_TABLES.websiteInterestSubmissions, config),
fetchAllRows(NOCO_TABLES.websiteContactFormSubmissions, config),
fetchAllRows(NOCO_TABLES.websiteBerthEoiSupplements, config),
fetchAllRows(NOCO_TABLES.berths, config),
]);
return {
interests,
residentialInterests,
websiteInterestSubmissions,
websiteContactFormSubmissions,
websiteBerthEoiSupplements,
berths,
fetchedAt: new Date().toISOString(),
};
}

src/lib/dedup/normalize.ts
View File

@@ -0,0 +1,418 @@
/**
* Normalization helpers for the dedup pipeline.
*
* Pure functions (no DB, no React). Used by both the runtime at-create
* surfaces and the one-shot NocoDB migration script. Every transform
* here has a fixture in `tests/unit/dedup/normalize.test.ts` drawn from
* real dirty values observed in the legacy NocoDB Interests table.
*
* Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §3.
*/
import { z } from 'zod';
import { ALL_COUNTRY_CODES, getCountryName, type CountryCode } from '@/lib/i18n/countries';
import { parsePhoneScriptSafe as parsePhone } from './phone-parse';
// ─── Names ──────────────────────────────────────────────────────────────────
/**
* Tokens that should stay lowercase mid-name. Covers the common Romance,
* Germanic, and Iberian particles seen in client records. The first token
* of a name is always title-cased even if it appears in this set.
*/
const PARTICLES: ReadonlySet<string> = new Set([
'van',
'von',
'de',
'del',
'da',
'das',
'do',
'dos',
'di',
'le',
'la',
'el',
'al',
'der',
'den',
'des',
'du',
'dalla',
'della',
'st',
'st.',
'y',
]);
export interface NormalizedName {
/** Human-readable form preserved for UI display. Trims, collapses
* whitespace, fixes case, but never destroys the user's intent —
* slash-with-company structure ("Daniel Wainstein / 7 Knots, LLC")
* is left intact. */
display: string;
/** Lowercased form for matching. */
normalized: string;
/** Last non-particle token, lowercased. Used as a blocking key by the
* dedup algorithm so we only compare candidates with similar surnames. */
surnameToken?: string;
}
/**
* Normalize a free-text full name. Trims and collapses whitespace,
* replaces \r/\n/\t with single spaces, intelligently title-cases
* ALL-CAPS surnames while keeping particles (van / de / dalla / etc.)
* lowercase mid-name, and preserves Irish O' surnames as O'Brien.
*
* If the input contains a `/` (slash-with-company structure like
* "Daniel Wainstein / 7 Knots, LLC"), the trailing company text is
* preserved verbatim — it's signal, not noise.
*/
export function normalizeName(raw: string | null | undefined): NormalizedName {
const safe = (raw ?? '').toString();
// Replace \r, \n, \t with single spaces, then collapse runs of whitespace.
const cleaned = safe
.replace(/[\r\n\t]/g, ' ')
.replace(/\s+/g, ' ')
.trim();
if (!cleaned) {
return { display: '', normalized: '', surnameToken: undefined };
}
// Slash-with-company: title-case the part before the slash, leave the
// company segment untouched (it's typically already a brand we shouldn't
// mangle: "SAS TIKI", "7 Knots, LLC").
const slashIdx = cleaned.indexOf('/');
let displayCore: string;
if (slashIdx !== -1) {
const personPart = cleaned.slice(0, slashIdx).trim();
const companyPart = cleaned.slice(slashIdx + 1).trim();
displayCore = `${titleCaseTokens(personPart)} / ${companyPart}`;
} else {
displayCore = titleCaseTokens(cleaned);
}
const display = displayCore;
const normalized = display.toLowerCase();
const surnameToken = computeSurnameToken(slashIdx !== -1 ? cleaned.slice(0, slashIdx) : cleaned);
return { display, normalized, surnameToken };
}
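// Illustrative round-trips implied by the rules above (not fixtures from
// the test file):
//   normalizeName('  JOHN   O\'BRIEN ')
//     -> { display: "John O'Brien", normalized: "john o'brien", surnameToken: "o'brien" }
//   normalizeName('willem VAN DER BERG')
//     -> { display: 'Willem van der Berg', surnameToken: 'berg' }
//   normalizeName('Daniel Wainstein / 7 Knots, LLC')
//     -> display 'Daniel Wainstein / 7 Knots, LLC' (company segment untouched)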
function titleCaseTokens(s: string): string {
const tokens = s.split(' ').filter(Boolean);
if (tokens.length === 0) return '';
return tokens.map((tok, idx) => titleCaseOneToken(tok, idx === 0)).join(' ');
}
function titleCaseOneToken(token: string, isFirst: boolean): string {
if (!token) return '';
const lower = token.toLowerCase();
if (!isFirst && PARTICLES.has(lower)) return lower;
// O'Brien / D'Angelo / l'Estrange — capitalize the segment after each
// apostrophe so a lowercased input round-trips to readable Irish caps.
if (lower.includes("'")) {
return lower
.split("'")
.map((part) => (part.length > 0 ? part[0]!.toUpperCase() + part.slice(1) : part))
.join("'");
}
return lower[0]!.toUpperCase() + lower.slice(1);
}
function computeSurnameToken(personPart: string): string | undefined {
const cleaned = personPart
.replace(/[\r\n\t]/g, ' ')
.replace(/\s+/g, ' ')
.trim();
if (!cleaned) return undefined;
const tokens = cleaned.split(' ').map((t) => t.toLowerCase());
// Walk from the right past particles to find the last "real" surname token.
for (let i = tokens.length - 1; i >= 0; i -= 1) {
const tok = tokens[i]!;
if (!PARTICLES.has(tok)) return tok;
}
// All tokens are particles? Fall back to the last token verbatim.
return tokens[tokens.length - 1];
}
// ─── Emails ─────────────────────────────────────────────────────────────────
const emailSchema = z.string().email();
/**
* Normalize a free-text email. Trims + lowercases. Returns null for empty
* or malformed input — caller decides whether to flag, store, or drop.
*
* Plus-aliases (`user+tag@domain.com`) are NOT stripped: they're real
* distinct addresses, and stripping them would auto-merge legitimately
* separate accounts.
*/
export function normalizeEmail(raw: string | null | undefined): string | null {
if (raw == null) return null;
const trimmed = raw.toString().trim().toLowerCase();
if (!trimmed) return null;
const result = emailSchema.safeParse(trimmed);
return result.success ? trimmed : null;
}
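// e.g. (illustrative): normalizeEmail('  Foo@Example.COM ') -> 'foo@example.com'
//      normalizeEmail('not-an-email') -> null; plus-aliases pass through:
//      normalizeEmail('User+tag@example.com') -> 'user+tag@example.com'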
// ─── Phones ─────────────────────────────────────────────────────────────────
export type PhoneFlag = 'multi_number' | 'placeholder' | 'unparseable';
export interface NormalizedPhone {
/** Canonical E.164 form, e.g. '+15742740548'. Null when unparseable
* or flagged as placeholder. */
e164: string | null;
/** ISO-3166-1 alpha-2 of the country the number was parsed against. */
country: CountryCode | null;
/** Display-friendly international format. Useful for migration reports. */
display: string | null;
/** Set when the input had a quirk worth surfacing in the migration
* report or runtime audit log. Absent on clean parses. */
flagged?: PhoneFlag;
}
/**
* Normalize a raw user-entered phone string for comparison + storage.
*
* Pipeline:
* 1. strip leading apostrophe (spreadsheet copy-paste artifact)
* 2. strip \r / \n / \t (real values seen in NocoDB had carriage returns)
* 3. detect multi-number fields ("+33611111111;+33622222222",
* "0677580750/0690511494") — flag and take first segment
* 4. strip whitespace, dots, dashes, parens, single quotes
* 5. convert leading "00" → "+" (international dialling code)
* 6. detect placeholder fakes (8+ consecutive zeros) — flag, return null e164
* 7. parse via libphonenumber-js
* 8. on parse failure or invalid number → flag 'unparseable'
*
* Returns null for empty inputs (cheaper to short-circuit than to wrap).
*/
export function normalizePhone(
raw: string | null | undefined,
defaultCountry?: CountryCode,
): NormalizedPhone | null {
if (raw == null) return null;
let cleaned = raw.toString().trim();
if (!cleaned) return null;
// 1. Spreadsheet apostrophe prefix.
if (cleaned.startsWith("'")) cleaned = cleaned.slice(1);
// 2. Strip carriage returns / newlines / tabs.
cleaned = cleaned.replace(/[\r\n\t]/g, '');
// 3. Multi-number detection — split on /, ;, , (in that order of priority).
let flagged: PhoneFlag | undefined;
if (/[/;,]/.test(cleaned)) {
flagged = 'multi_number';
cleaned = cleaned.split(/[/;,]/)[0]!.trim();
}
// 4. Strip whitespace, dots, dashes, parens. Keep + for E.164 prefix.
cleaned = cleaned.replace(/[\s.\-()]/g, '');
if (!cleaned) return { e164: null, country: null, display: null, flagged: 'unparseable' };
// 5. 00 international prefix → +.
if (cleaned.startsWith('00')) {
cleaned = '+' + cleaned.slice(2);
}
// 6. Placeholder fakes — runs of 8+ consecutive zeros, e.g. +447000000000.
if (/0{8,}/.test(cleaned)) {
return { e164: null, country: null, display: null, flagged: 'placeholder' };
}
// 7. Parse via the existing i18n helper (libphonenumber-js under the hood).
const parsed = parsePhone(cleaned, defaultCountry);
if (!parsed.e164) {
// Couldn't even produce a canonical form — genuinely garbage.
return { e164: null, country: null, display: null, flagged: 'unparseable' };
}
// Note: we deliberately don't gate on `parsed.isValid`. The
// libphonenumber-js `min` build returns isValid=false for many real
// numbers (NANP territories share +1; some country metadata is
// truncated). For dedup we only need a canonical E.164 string to
// compare; strict validity is the form layer's problem, not ours.
// If a string-only test (e.g. "abc-not-a-phone") gets here, parse
// returns null e164 anyway and the branch above handles it.
return {
e164: parsed.e164,
country: parsed.country,
display: parsed.international,
flagged,
};
}
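// Illustrative traces of the pipeline above (FR default country assumed for
// the national-format inputs; values echo the docstring's own examples):
//   normalizePhone("'0677580750/0690511494", 'FR')
//     -> { e164: '+33677580750', country: 'FR', flagged: 'multi_number', ... }
//   normalizePhone('+447000000000')
//     -> { e164: null, country: null, flagged: 'placeholder', ... }
//   normalizePhone('0033 6 11 11 11 11')
//     -> { e164: '+33611111111', country: 'FR', ... }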
// ─── Countries ──────────────────────────────────────────────────────────────
/**
* Aliases for canonical country names that don't match
* `Intl.DisplayNames(en)` output verbatim. Keys are pre-normalized
* (lowercase, diacritic-free, hyphens/dots → spaces, collapsed whitespace).
*
* Kept opinionated and small — only entries we've actually seen in legacy
* data. Adding a new alias is cheap; trying to be exhaustive isn't.
*/
const COUNTRY_ALIASES: Record<string, CountryCode> = {
// Generic abbreviations
usa: 'US',
us: 'US',
uk: 'GB',
// Saint-Barthélemy variants seen in production
'saint barthelemy': 'BL',
'saint barth': 'BL',
'st barth': 'BL',
'st barths': 'BL',
'st barthelemy': 'BL',
// Caribbean short-forms whose canonical Intl names are awkward
// ("Antigua and Barbuda", "Saint Vincent and the Grenadines", etc.).
antigua: 'AG',
barbuda: 'AG',
'st kitts': 'KN',
'saint kitts': 'KN',
nevis: 'KN',
};
/**
* High-frequency cities → country, used as a last-resort fallback when
* exact / alias / fuzzy country matching all miss. Keys are normalized.
*
* Order matters: an entry's key is also matched as a substring of the
* input ("Sag Harbor Y" contains "sag harbor"), so the most specific
* city appears first to avoid a wrong partial hit.
*/
const CITY_TO_COUNTRY: Record<string, CountryCode> = {
'kansas city': 'US',
'sag harbor': 'US',
'new york': 'US',
// Cities that came out unresolved from the 2026-05-03 NocoDB dry-run.
// Using lowercase (post-normalize keys).
boston: 'US',
tampa: 'US',
'fort lauderdale': 'US',
'port jefferson': 'US',
nantucket: 'US',
// US state abbreviations that often appear as a suffix ("Tampa FL"); the
// leading space keeps them from matching inside unrelated words:
' fl': 'US',
' ma': 'US',
' ny': 'US',
' tx': 'US',
' ca': 'US',
// International
london: 'GB',
paris: 'FR',
};
export type CountryConfidence = 'exact' | 'fuzzy' | 'city';
export interface ResolvedCountry {
iso: CountryCode | null;
confidence: CountryConfidence | null;
}
/**
* Map free-text country / region input to an ISO-3166-1 alpha-2 code.
*
* Lookup order: alias → exact (vs. all locale country names) → city →
* fuzzy (Levenshtein ≤ 2). Anything beyond fuzzy returns null and the
* migration script flags the row for human review.
*/
export function resolveCountry(text: string | null | undefined): ResolvedCountry {
if (text == null) return { iso: null, confidence: null };
const normalized = normalizeForLookup(text.toString());
if (!normalized) return { iso: null, confidence: null };
// 1. Aliases — covers USA / UK / St Barth and friends.
const alias = COUNTRY_ALIASES[normalized];
if (alias) return { iso: alias, confidence: 'exact' };
// 2. Exact match against Intl-derived country names. We compare against
// diacritic-stripped + lowercased canonical names so 'United States'
// and 'united states' both resolve.
for (const code of ALL_COUNTRY_CODES) {
const cleanName = normalizeForLookup(getCountryName(code, 'en'));
if (cleanName === normalized) return { iso: code, confidence: 'exact' };
}
// 3. City → country fallback, exact or substring.
const cityExact = CITY_TO_COUNTRY[normalized];
if (cityExact) return { iso: cityExact, confidence: 'city' };
for (const [city, iso] of Object.entries(CITY_TO_COUNTRY)) {
if (normalized.includes(city)) return { iso, confidence: 'city' };
}
// 4. Fuzzy fallback (Levenshtein ≤ 2). Skipped for short inputs because
//    a 4-char string like "Mars" sits within distance 2 of multiple
//    short country names (Mali, Laos, Iran, …), any of which would be a
//    false positive.
if (normalized.length >= 6) {
let bestCode: CountryCode | null = null;
let bestDistance = Number.POSITIVE_INFINITY;
for (const code of ALL_COUNTRY_CODES) {
const cleanName = normalizeForLookup(getCountryName(code, 'en'));
const d = levenshtein(cleanName, normalized);
if (d < bestDistance) {
bestDistance = d;
bestCode = code;
if (d === 0) break;
}
}
if (bestDistance <= 2 && bestCode) {
return { iso: bestCode, confidence: 'fuzzy' };
}
}
return { iso: null, confidence: null };
}
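// Illustrative lookups (derived from the tables above, not test fixtures):
//   resolveCountry('USA')            -> { iso: 'US', confidence: 'exact' }  (alias)
//   resolveCountry('St-Barthélemy')  -> { iso: 'BL', confidence: 'exact' }  (normalized alias)
//   resolveCountry('Boston, MA')     -> { iso: 'US', confidence: 'city' }   (substring hit)
//   resolveCountry('Untied States')  -> { iso: 'US', confidence: 'fuzzy' }  (distance 2)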
/** Lowercase + strip diacritics + replace hyphens/dots with spaces +
* collapse whitespace. Used by both the input and the canonical-name
* side of the country comparison so they meet on the same shape. */
function normalizeForLookup(s: string): string {
return s
.normalize('NFD')
.replace(/[\u0300-\u036f]/g, '') // strip combining diacritical marks
.toLowerCase()
.replace(/[-.]/g, ' ')
.replace(/\s+/g, ' ')
.trim();
}
// ─── Levenshtein ────────────────────────────────────────────────────────────
/**
* Standard iterative Levenshtein. Used by the country fuzzy match and by
* the dedup algorithm's name-similarity rule. Runs in O(n*m) time, so
* callers shouldn't run it against pathological inputs — the dedup blocking
* strategy keeps comparison sets small.
*
* Exported so the find-matches module can reuse the same implementation
* without relying on an external dep.
*/
export function levenshtein(a: string, b: string): number {
if (a === b) return 0;
if (!a) return b.length;
if (!b) return a.length;
const m = a.length;
const n = b.length;
// Two rolling rows are enough — keeps memory at O(n) instead of O(n*m).
let prev = new Array<number>(n + 1);
let curr = new Array<number>(n + 1);
for (let j = 0; j <= n; j += 1) prev[j] = j;
for (let i = 1; i <= m; i += 1) {
curr[0] = i;
for (let j = 1; j <= n; j += 1) {
const cost = a[i - 1] === b[j - 1] ? 0 : 1;
curr[j] = Math.min(curr[j - 1]! + 1, prev[j]! + 1, prev[j - 1]! + cost);
}
[prev, curr] = [curr, prev];
}
return prev[n]!;
}
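// e.g. levenshtein('kitten', 'sitting') === 3 (two substitutions + one
// insertion); levenshtein('', 'abc') === 3; levenshtein('same', 'same') === 0.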

View File

@@ -0,0 +1,66 @@
/**
* Script-safe phone parser.
*
* The project's existing `src/lib/i18n/phone.ts` imports from
* `libphonenumber-js`, which under Node 25 + tsx loader hits a
* metadata-shape interop bug (`{ default }` wrapping the JSON). It
* works fine in Next.js + vitest, but a `tsx scripts/...` invocation
* blows up.
*
* This wrapper bypasses the bundled `index.cjs.js` and calls
* `libphonenumber-js/core` directly with metadata loaded as raw JSON.
* Same surface as the i18n helper; usable from both runtimes.
*
* Used by the dedup library's `normalizePhone`. The runtime UI still
* imports `i18n/phone` directly — no reason to touch a working path.
*/
// eslint-disable-next-line @typescript-eslint/no-require-imports
const core: typeof import('libphonenumber-js/core') = require('libphonenumber-js/core');
// Load the JSON directly. The bundled `index.cjs.js` does the same
// thing but its `require('../metadata.min.json')` hits a Node 25 ESM
// interop bug that wraps the JSON in `{ default }`. Importing the
// JSON file by absolute path through the package root sidesteps it.
// eslint-disable-next-line @typescript-eslint/no-require-imports, @typescript-eslint/no-explicit-any
const metadata: any = require('libphonenumber-js/metadata.min.json');
import type { CountryCode } from '@/lib/i18n/countries';
export interface ParsedPhone {
e164: string | null;
country: CountryCode | null;
national: string | null;
international: string | null;
isValid: boolean;
}
const EMPTY: ParsedPhone = {
e164: null,
country: null,
national: null,
international: null,
isValid: false,
};
export function parsePhoneScriptSafe(raw: string, defaultCountry?: CountryCode): ParsedPhone {
const trimmed = raw.trim();
if (!trimmed) return EMPTY;
try {
// The core entry expects its own `CountryCode` type from
// libphonenumber-js. Our `CountryCode` type is the same set of ISO
// alpha-2 codes (we re-derive from the same Intl source) so this
// cast is structural-equivalent, not lossy.
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const parsed = core.parsePhoneNumberFromString(trimmed, defaultCountry as any, metadata);
if (!parsed) return EMPTY;
return {
e164: parsed.number,
country: (parsed.country ?? null) as CountryCode | null,
national: parsed.formatNational(),
international: parsed.formatInternational(),
isValid: parsed.isValid(),
};
} catch {
return EMPTY;
}
}
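// Usage sketch (illustrative; the number is the dedup docstring's example):
//   parsePhoneScriptSafe('+15742740548').e164   // '+15742740548'
//   parsePhoneScriptSafe('garbage')             // EMPTY (all-null shape)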

View File

@@ -84,6 +84,28 @@ export const webhooksWorker = new Worker(
return;
}
// Safety net: when EMAIL_REDIRECT_TO is set (dev / staging / migration
// dry-run), short-circuit webhook delivery so we don't accidentally
// ping a user-configured production endpoint with synthetic events.
// Records the delivery as `dead_letter` with a clear reason so the
// attempt is still visible in the deliveries listing.
if (process.env.EMAIL_REDIRECT_TO) {
logger.info(
{ webhookId, deliveryId, url: webhook.url },
'Webhook delivery skipped (EMAIL_REDIRECT_TO is set — outbound comms are paused)',
);
await db
.update(webhookDeliveries)
.set({
status: 'dead_letter',
responseStatus: null,
responseBody: 'Skipped: EMAIL_REDIRECT_TO is set, outbound comms paused.',
deliveredAt: new Date(),
})
.where(eq(webhookDeliveries.id, deliveryId));
return;
}
// 2. Decrypt secret
let secret: string;
try {

View File

@@ -180,14 +180,14 @@ export async function updateBerth(
draftFt: n(data.draftFt),
draftM: n(data.draftM),
widthIsMinimum: data.widthIsMinimum,
- nominalBoatSize: data.nominalBoatSize,
+ nominalBoatSize: n(data.nominalBoatSize),
- nominalBoatSizeM: data.nominalBoatSizeM,
+ nominalBoatSizeM: n(data.nominalBoatSizeM),
waterDepth: n(data.waterDepth),
waterDepthM: n(data.waterDepthM),
waterDepthIsMinimum: data.waterDepthIsMinimum,
sidePontoon: data.sidePontoon,
- powerCapacity: data.powerCapacity,
+ powerCapacity: n(data.powerCapacity),
- voltage: data.voltage,
+ voltage: n(data.voltage),
mooringType: data.mooringType,
cleatType: data.cleatType,
cleatCapacity: data.cleatCapacity,
@@ -481,8 +481,8 @@ export async function createBerth(portId: string, data: CreateBerthInput, meta:
priceCurrency: data.priceCurrency ?? 'USD',
tenureType: data.tenureType ?? 'permanent',
mooringType: data.mooringType,
- powerCapacity: data.powerCapacity,
+ powerCapacity: data.powerCapacity?.toString(),
- voltage: data.voltage,
+ voltage: data.voltage?.toString(),
access: data.access,
bowFacing: data.bowFacing,
sidePontoon: data.sidePontoon,

View File

@@ -0,0 +1,393 @@
/**
* Client merge service — atomically combines two client records.
*
* Used by:
* - /admin/duplicates review queue (when an admin confirms a merge)
* - the at-create suggestion path ("use existing client") — though
* that path uses the lighter `attachInterestToClient` and never
* actually merges two pre-existing clients
* - the migration script's `--apply` (eventually)
*
* Reversibility: every merge writes a `client_merge_log` row containing
* the loser's full pre-merge state. Within the configured undo window
* (default 7 days, see `dedup_undo_window_days` in system_settings) a
* follow-up `unmergeClients` call can restore the loser and detach
* everything that was reattached.
*
* Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §6.
*/
import { and, eq, sql } from 'drizzle-orm';
import { db } from '@/lib/db';
import {
clients,
clientContacts,
clientAddresses,
clientNotes,
clientTags,
clientRelationships,
clientMergeLog,
clientMergeCandidates,
} from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { berthReservations } from '@/lib/db/schema/reservations';
import { auditLogs } from '@/lib/db/schema/system';
// ─── Public API ─────────────────────────────────────────────────────────────
export interface MergeFieldChoices {
/** Per-field overrides — `winner` keeps the surviving client's value;
* `loser` copies the loser's value over. Fields not listed default
* to `winner` (no change). */
fullName?: 'winner' | 'loser';
nationalityIso?: 'winner' | 'loser';
preferredContactMethod?: 'winner' | 'loser';
preferredLanguage?: 'winner' | 'loser';
timezone?: 'winner' | 'loser';
source?: 'winner' | 'loser';
sourceDetails?: 'winner' | 'loser';
}
export interface MergeOptions {
winnerId: string;
loserId: string;
/** ID of the user performing the merge (for audit + clientMergeLog.mergedBy). */
mergedBy: string;
/** Per-field choice overrides. Multi-value fields (contacts, addresses,
* notes, tags) are always preserved from both sides; this only
* affects single-value scalar fields on the `clients` row. */
fieldChoices?: MergeFieldChoices;
}
export interface MergeResult {
mergeLogId: string;
movedRows: {
interests: number;
contacts: number;
addresses: number;
notes: number;
tags: number;
relationships: number;
reservations: number;
};
}
/**
* Atomically merge `loserId` into `winnerId`. Throws if:
* - either id doesn't exist or belongs to a different port
* - the loser has already been merged (mergedIntoClientId set)
* - the winner is itself archived
*/
export async function mergeClients(opts: MergeOptions): Promise<MergeResult> {
if (opts.winnerId === opts.loserId) {
throw new Error('Cannot merge a client into itself');
}
return await db.transaction(async (tx) => {
// ── Lock both rows for the duration. The first FOR UPDATE that
// arrives wins; a concurrent second merge of the same loser
// will see `mergedIntoClientId` set and bail. ──────────────────────
const [winnerRow] = await tx
.select()
.from(clients)
.where(eq(clients.id, opts.winnerId))
.for('update');
const [loserRow] = await tx
.select()
.from(clients)
.where(eq(clients.id, opts.loserId))
.for('update');
if (!winnerRow) throw new Error(`Winner client ${opts.winnerId} not found`);
if (!loserRow) throw new Error(`Loser client ${opts.loserId} not found`);
if (winnerRow.portId !== loserRow.portId) {
throw new Error('Cannot merge clients across different ports');
}
if (loserRow.mergedIntoClientId) {
throw new Error(`Loser ${opts.loserId} already merged into ${loserRow.mergedIntoClientId}`);
}
if (winnerRow.archivedAt) {
throw new Error('Cannot merge into an archived client');
}
// ── Snapshot the loser's full state before any mutation. Used by
// `unmergeClients` to restore within the undo window. ──────────────
const loserContacts = await tx
.select()
.from(clientContacts)
.where(eq(clientContacts.clientId, opts.loserId));
const loserAddresses = await tx
.select()
.from(clientAddresses)
.where(eq(clientAddresses.clientId, opts.loserId));
const loserNotes = await tx
.select()
.from(clientNotes)
.where(eq(clientNotes.clientId, opts.loserId));
const loserTags = await tx
.select()
.from(clientTags)
.where(eq(clientTags.clientId, opts.loserId));
const loserInterests = await tx
.select({ id: interests.id })
.from(interests)
.where(eq(interests.clientId, opts.loserId));
const loserReservations = await tx
.select({ id: berthReservations.id })
.from(berthReservations)
.where(eq(berthReservations.clientId, opts.loserId));
const loserRelationshipsAsA = await tx
.select()
.from(clientRelationships)
.where(eq(clientRelationships.clientAId, opts.loserId));
const loserRelationshipsAsB = await tx
.select()
.from(clientRelationships)
.where(eq(clientRelationships.clientBId, opts.loserId));
const snapshot = {
loser: loserRow,
contacts: loserContacts,
addresses: loserAddresses,
notes: loserNotes,
tags: loserTags,
interests: loserInterests.map((r) => r.id),
reservations: loserReservations.map((r) => r.id),
relationshipsAsA: loserRelationshipsAsA,
relationshipsAsB: loserRelationshipsAsB,
fieldChoices: opts.fieldChoices ?? {},
mergedAt: new Date().toISOString(),
};
// ── Apply field choices on the winner. We only touch fields the
// caller explicitly asked to copy from the loser; everything
// else stays as-is. ────────────────────────────────────────────────
const fieldUpdates: Partial<typeof winnerRow> = {};
if (opts.fieldChoices?.fullName === 'loser') fieldUpdates.fullName = loserRow.fullName;
if (opts.fieldChoices?.nationalityIso === 'loser')
fieldUpdates.nationalityIso = loserRow.nationalityIso;
if (opts.fieldChoices?.preferredContactMethod === 'loser')
fieldUpdates.preferredContactMethod = loserRow.preferredContactMethod;
if (opts.fieldChoices?.preferredLanguage === 'loser')
fieldUpdates.preferredLanguage = loserRow.preferredLanguage;
if (opts.fieldChoices?.timezone === 'loser') fieldUpdates.timezone = loserRow.timezone;
if (opts.fieldChoices?.source === 'loser') fieldUpdates.source = loserRow.source;
if (opts.fieldChoices?.sourceDetails === 'loser')
fieldUpdates.sourceDetails = loserRow.sourceDetails;
if (Object.keys(fieldUpdates).length > 0) {
await tx
.update(clients)
.set({ ...fieldUpdates, updatedAt: new Date() })
.where(eq(clients.id, opts.winnerId));
}
// ── Reattach. Each table that points at the loser via clientId
// gets pointed at the winner instead. ─────────────────────────────
const movedInterests = (
await tx
.update(interests)
.set({ clientId: opts.winnerId, updatedAt: new Date() })
.where(eq(interests.clientId, opts.loserId))
.returning({ id: interests.id })
).length;
const movedReservations = (
await tx
.update(berthReservations)
.set({ clientId: opts.winnerId, updatedAt: new Date() })
.where(eq(berthReservations.clientId, opts.loserId))
.returning({ id: berthReservations.id })
).length;
// Contacts: move loser's contacts to winner, but DON'T duplicate any
// already-present (channel, value) pair. Loser-only ones get
// demoted to non-primary so the winner's primary stays intact.
const winnerContacts = await tx
.select({ channel: clientContacts.channel, value: clientContacts.value })
.from(clientContacts)
.where(eq(clientContacts.clientId, opts.winnerId));
const winnerContactKeys = new Set(
winnerContacts.map((c) => `${c.channel}::${c.value.toLowerCase()}`),
);
let movedContacts = 0;
for (const c of loserContacts) {
const key = `${c.channel}::${c.value.toLowerCase()}`;
if (winnerContactKeys.has(key)) {
// Winner already has this contact; skip the move and leave the duplicate
// attached to the (soon-archived) loser. The snapshot above preserves it,
// so undo can still restore it.
continue;
}
await tx
.update(clientContacts)
.set({ clientId: opts.winnerId, isPrimary: false, updatedAt: new Date() })
.where(eq(clientContacts.id, c.id));
movedContacts += 1;
}
// Addresses: same shape as contacts, but uniqueness is harder to
// detect cleanly (free-text street). Just move them all and let the
// user dedupe in the UI later.
const movedAddresses = (
await tx
.update(clientAddresses)
.set({ clientId: opts.winnerId, isPrimary: false, updatedAt: new Date() })
.where(eq(clientAddresses.clientId, opts.loserId))
.returning({ id: clientAddresses.id })
).length;
const movedNotes = (
await tx
.update(clientNotes)
.set({ clientId: opts.winnerId, updatedAt: new Date() })
.where(eq(clientNotes.clientId, opts.loserId))
.returning({ id: clientNotes.id })
).length;
// Tags: copy any loser-only tag to the winner; drop overlap.
const winnerTags = await tx
.select({ tagId: clientTags.tagId })
.from(clientTags)
.where(eq(clientTags.clientId, opts.winnerId));
const winnerTagSet = new Set(winnerTags.map((t) => t.tagId));
let movedTags = 0;
for (const t of loserTags) {
if (!winnerTagSet.has(t.tagId)) {
await tx.insert(clientTags).values({ clientId: opts.winnerId, tagId: t.tagId });
movedTags += 1;
}
}
await tx.delete(clientTags).where(eq(clientTags.clientId, opts.loserId));
// Relationships: rewrite each FK side to point at the winner. Keep
// both sides regardless — even if A and B both end up as the same
// person, the row is preserved for audit; the UI hides self-loops.
const movedRelationships =
(
await tx
.update(clientRelationships)
.set({ clientAId: opts.winnerId })
.where(eq(clientRelationships.clientAId, opts.loserId))
.returning({ id: clientRelationships.id })
).length +
(
await tx
.update(clientRelationships)
.set({ clientBId: opts.winnerId })
.where(eq(clientRelationships.clientBId, opts.loserId))
.returning({ id: clientRelationships.id })
).length;
// ── Archive the loser. Row stays in DB for the undo window;
// `mergedIntoClientId` is the redirect pointer for any stragglers
// (links / direct queries / saved views). ──────────────────────────
await tx
.update(clients)
.set({
archivedAt: new Date(),
mergedIntoClientId: opts.winnerId,
updatedAt: new Date(),
})
.where(eq(clients.id, opts.loserId));
// ── Mark any open merge candidate row for this pair as resolved. ───
await tx
.update(clientMergeCandidates)
.set({
status: 'merged',
resolvedAt: new Date(),
resolvedBy: opts.mergedBy,
})
.where(
and(
eq(clientMergeCandidates.portId, winnerRow.portId),
// pair stored in canonical order — match either direction
sql`(
(${clientMergeCandidates.clientAId} = ${opts.winnerId}
AND ${clientMergeCandidates.clientBId} = ${opts.loserId})
OR
(${clientMergeCandidates.clientAId} = ${opts.loserId}
AND ${clientMergeCandidates.clientBId} = ${opts.winnerId})
)`,
),
);
// ── Write the merge log + audit log. ────────────────────────────────
const [logRow] = await tx
.insert(clientMergeLog)
.values({
portId: winnerRow.portId,
survivingClientId: opts.winnerId,
mergedClientId: opts.loserId,
mergedBy: opts.mergedBy,
mergeDetails: snapshot,
})
.returning({ id: clientMergeLog.id });
await tx.insert(auditLogs).values({
portId: winnerRow.portId,
userId: opts.mergedBy,
entityType: 'client',
entityId: opts.winnerId,
action: 'merge',
newValue: {
loserId: opts.loserId,
loserName: loserRow.fullName,
movedInterests,
movedReservations,
movedContacts,
movedAddresses,
},
});
return {
mergeLogId: logRow!.id,
movedRows: {
interests: movedInterests,
contacts: movedContacts,
addresses: movedAddresses,
notes: movedNotes,
tags: movedTags,
relationships: movedRelationships,
reservations: movedReservations,
},
};
});
}
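// Hypothetical call from the admin review queue (names assumed; the route
// handler isn't part of this diff):
//   const result = await mergeClients({
//     winnerId: pair.clientAId,
//     loserId: pair.clientBId,
//     mergedBy: session.user.id,
//     fieldChoices: { fullName: 'loser' }, // keep the loser's spelling
//   });
//   // result.movedRows -> { interests: 1, contacts: 1, ... }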
// ─── Convenience: list merge candidates for a port ──────────────────────────
export interface MergeCandidatePair {
id: string;
clientAId: string;
clientBId: string;
score: number;
reasons: string[];
status: string;
createdAt: Date;
}
/** Fetch pending merge candidate pairs for the admin review queue. */
export async function listPendingMergeCandidates(portId: string): Promise<MergeCandidatePair[]> {
const rows = await db
.select()
.from(clientMergeCandidates)
.where(
and(eq(clientMergeCandidates.portId, portId), eq(clientMergeCandidates.status, 'pending')),
)
.orderBy(sql`${clientMergeCandidates.score} DESC`);
return rows.map((r) => ({
id: r.id,
clientAId: r.clientAId,
clientBId: r.clientBId,
score: r.score,
reasons: Array.isArray(r.reasons) ? (r.reasons as string[]) : [],
status: r.status,
createdAt: r.createdAt,
}));
}

View File

@@ -4,6 +4,7 @@ import { db } from '@/lib/db';
import {
clients,
clientContacts,
clientNotes,
clientRelationships,
clientTags,
clientAddresses,
@@ -251,6 +252,19 @@ export async function getClientById(id: string, portId: string) {
const portalEnabled = await isPortalEnabledForPort(portId);
// Counts surfaced for tab badges (Interests + Notes — Yachts/Companies/etc
// get their counts from the corresponding row arrays we already fetched).
const [interestCountRow] = await db
.select({ count: count() })
.from(interests)
.where(
and(eq(interests.portId, portId), eq(interests.clientId, id), isNull(interests.archivedAt)),
);
const [noteCountRow] = await db
.select({ count: count() })
.from(clientNotes)
.where(eq(clientNotes.clientId, id));
return {
...client,
contacts,
@@ -259,6 +273,8 @@ export async function getClientById(id: string, portId: string) {
yachts: yachtRows,
companies: membershipRows,
activeReservations,
interestCount: interestCountRow?.count ?? 0,
noteCount: noteCountRow?.count ?? 0,
clientPortalEnabled: portalEnabled,
};
}

View File

@@ -87,17 +87,72 @@ export interface DocumensoDocument {
}>;
}
/**
* When EMAIL_REDIRECT_TO is set (dev / staging), rewrite every recipient
* email so Documenso doesn't accidentally email real clients during a
* data import / migration dry-run. Names are prefixed with the original
* email so the recipient (you) can tell who would have received the doc.
*
* In production this env var is unset and recipients flow through unchanged.
*/
function applyRecipientRedirect(recipients: DocumensoRecipient[]): DocumensoRecipient[] {
if (!env.EMAIL_REDIRECT_TO) return recipients;
return recipients.map((r) => ({
...r,
name: `${r.name} (was: ${r.email})`,
email: env.EMAIL_REDIRECT_TO!,
}));
}
/**
* Same idea for the template-generate endpoint, which takes a payload
* shape with recipient email/name nested inside `formValues` (Documenso
* v1.13) or `recipients` (Documenso 2.x). We rewrite both shapes.
*/
function applyPayloadRedirect(payload: Record<string, unknown>): Record<string, unknown> {
if (!env.EMAIL_REDIRECT_TO) return payload;
const out: Record<string, unknown> = { ...payload };
// 2.x recipient shape
if (Array.isArray(out.recipients)) {
out.recipients = (out.recipients as Array<Record<string, unknown>>).map((r) => ({
...r,
name: `${String(r.name ?? '')} (was: ${String(r.email ?? '')})`,
email: env.EMAIL_REDIRECT_TO,
}));
}
// v1.13 formValues shape — keys vary per template, so match anything that
// looks like an email field. The conservative approach: only touch keys
// that already hold a string and end with `Email` / `email`.
if (out.formValues && typeof out.formValues === 'object') {
const fv = { ...(out.formValues as Record<string, unknown>) };
for (const key of Object.keys(fv)) {
if (/email$/i.test(key) && typeof fv[key] === 'string') {
fv[key] = env.EMAIL_REDIRECT_TO;
}
}
out.formValues = fv;
}
return out;
}
export async function createDocument(
title: string,
pdfBase64: string,
recipients: DocumensoRecipient[],
portId?: string,
): Promise<DocumensoDocument> {
const safeRecipients = applyRecipientRedirect(recipients);
if (env.EMAIL_REDIRECT_TO) {
logger.info(
{ redirected: safeRecipients.length, original: recipients.map((r) => r.email) },
'Documenso recipients redirected to EMAIL_REDIRECT_TO',
);
}
return documensoFetch(
'/api/v1/documents',
{
method: 'POST',
- body: JSON.stringify({ title, document: pdfBase64, recipients }),
+ body: JSON.stringify({ title, document: pdfBase64, recipients: safeRecipients }),
},
portId,
).then(normalizeDocument);
@@ -108,17 +163,43 @@ export async function generateDocumentFromTemplate(
payload: Record<string, unknown>,
portId?: string,
): Promise<DocumensoDocument> {
const safePayload = applyPayloadRedirect(payload);
if (env.EMAIL_REDIRECT_TO) {
logger.info(
{ templateId },
'Documenso template-generate payload redirected to EMAIL_REDIRECT_TO',
);
}
return documensoFetch(
`/api/v1/templates/${templateId}/generate-document`,
{
method: 'POST',
- body: JSON.stringify(payload),
+ body: JSON.stringify(safePayload),
},
portId,
).then(normalizeDocument);
}
/**
* Tell Documenso to actually email the document to its recipients. The
* recipients themselves are set at create-time (and rerouted to
* EMAIL_REDIRECT_TO when set), but this is a belt-and-braces guard for
* documents that may have been created BEFORE the redirect was turned on
* (i.e. real-recipient documents now triggered by an automation while
* we're trying to hold comms). When the redirect is on we skip the API
* call entirely and return a synthetic "still pending" response.
*/
export async function sendDocument(docId: string, portId?: string): Promise<DocumensoDocument> {
if (env.EMAIL_REDIRECT_TO) {
logger.warn(
{ docId, portId, redirect: env.EMAIL_REDIRECT_TO },
'sendDocument SKIPPED — EMAIL_REDIRECT_TO is set, outbound comms paused',
);
// Return the existing doc shape so downstream code doesn't see an
// unexpected null. The document remains in DRAFT/PENDING from
// Documenso's perspective.
return getDocument(docId, portId);
}
return documensoFetch(
`/api/v1/documents/${docId}/send`,
{
@@ -132,11 +213,23 @@ export async function getDocument(docId: string, portId?: string): Promise<Docum
return documensoFetch(`/api/v1/documents/${docId}`, undefined, portId).then(normalizeDocument);
}
/**
* Email a signing reminder to one recipient. Skipped entirely when
* EMAIL_REDIRECT_TO is set — the recipient's stored email may still be
* a real client address from before the redirect was enabled.
*/
export async function sendReminder(
docId: string,
signerId: string,
portId?: string,
): Promise<void> {
if (env.EMAIL_REDIRECT_TO) {
logger.warn(
{ docId, signerId, portId, redirect: env.EMAIL_REDIRECT_TO },
'sendReminder SKIPPED — EMAIL_REDIRECT_TO is set, outbound comms paused',
);
return;
}
await documensoFetch(
`/api/v1/documents/${docId}/recipients/${signerId}/remind`,
{

View File

@@ -5,7 +5,9 @@ import { db } from '@/lib/db';
import { emailAccounts, emailMessages, emailThreads } from '@/lib/db/schema/email';
import { documents, documentEvents, files } from '@/lib/db/schema/documents';
import { createAuditLog, type AuditMeta } from '@/lib/audit';
import { env } from '@/lib/env';
import { NotFoundError, ForbiddenError } from '@/lib/errors';
import { logger } from '@/lib/logger';
import { getDecryptedCredentials } from '@/lib/services/email-accounts.service';
import { getPortEmailConfig } from '@/lib/services/port-config';
import { sendEmail as sendSystemEmail } from '@/lib/email';
@@ -127,12 +129,38 @@ export async function sendEmail(
)
: undefined;
// Safety net: when EMAIL_REDIRECT_TO is set, every recipient is rerouted
// to that address and the subject is prefixed so the operator can see
// who would have received the message. This service builds its OWN
// transporter (per-account SMTP) so it doesn't go through sendEmail's
// redirect — we apply the same logic here.
const requestedTo = data.to.join(', ');
const requestedCc = data.cc?.join(', ');
const effectiveTo = env.EMAIL_REDIRECT_TO ?? requestedTo;
const effectiveCc = env.EMAIL_REDIRECT_TO ? undefined : requestedCc;
const effectiveSubject = env.EMAIL_REDIRECT_TO
? `[redirected from ${requestedTo}${requestedCc ? `, cc=${requestedCc}` : ''}] ${data.subject}`
: data.subject;
if (env.EMAIL_REDIRECT_TO) {
logger.info(
{
userId,
portId,
accountId: data.accountId,
originalTo: requestedTo,
originalCc: requestedCc ?? null,
redirectedTo: env.EMAIL_REDIRECT_TO,
},
'email-compose redirected to EMAIL_REDIRECT_TO',
);
}
// Send via the user's SMTP transporter
const info = await transporter.sendMail({
from: account.emailAddress,
- to: data.to.join(', '),
+ to: effectiveTo,
- cc: data.cc?.join(', '),
+ cc: effectiveCc,
- subject: data.subject,
+ subject: effectiveSubject,
html: data.bodyHtml,
inReplyTo,
references,

View File

@@ -18,8 +18,8 @@ export const createBerthSchema = z.object({
status: z.enum(BERTH_STATUSES).default('available'),
tenureType: z.enum(['permanent', 'fixed_term']).optional(),
mooringType: z.string().optional(),
- powerCapacity: z.string().optional(),
+ powerCapacity: z.coerce.number().optional(), // kW
- voltage: z.string().optional(),
+ voltage: z.coerce.number().optional(), // V at 60Hz
access: z.string().optional(),
bowFacing: z.string().optional(),
sidePontoon: z.string().optional(),
@@ -38,14 +38,14 @@ export const updateBerthSchema = z.object({
draftFt: z.coerce.number().optional(), draftFt: z.coerce.number().optional(),
draftM: z.coerce.number().optional(), draftM: z.coerce.number().optional(),
widthIsMinimum: z.boolean().optional(), widthIsMinimum: z.boolean().optional(),
nominalBoatSize: z.string().optional(), nominalBoatSize: z.coerce.number().optional(), // ft
nominalBoatSizeM: z.string().optional(), nominalBoatSizeM: z.coerce.number().optional(), // m
waterDepth: z.coerce.number().optional(), waterDepth: z.coerce.number().optional(),
waterDepthM: z.coerce.number().optional(), waterDepthM: z.coerce.number().optional(),
waterDepthIsMinimum: z.boolean().optional(), waterDepthIsMinimum: z.boolean().optional(),
sidePontoon: z.string().optional(), sidePontoon: z.string().optional(),
powerCapacity: z.string().optional(), powerCapacity: z.coerce.number().optional(), // kW
voltage: z.string().optional(), voltage: z.coerce.number().optional(), // V at 60Hz
mooringType: z.string().optional(), mooringType: z.string().optional(),
cleatType: z.string().optional(), cleatType: z.string().optional(),
cleatCapacity: z.string().optional(), cleatCapacity: z.string().optional(),
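
The schema change above swaps free-text power/size columns for coerced numbers. A minimal standalone sketch of what z.coerce.number().optional() does to the legacy string values these columns used to hold (plain Zod, nothing project-specific assumed):

import { z } from 'zod';

// Coerced numeric column, as in the schema change above.
const powerCapacity = z.coerce.number().optional();

powerCapacity.parse('16'); // -> 16: legacy string rows coerce cleanly
powerCapacity.parse(16); // -> 16: already-numeric input passes through
powerCapacity.parse(undefined); // -> undefined: the column stays optional
// Caveat: coercion uses Number(), so parse('') would yield 0; blank legacy
// cells should be stripped out before validation.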

View File

@@ -0,0 +1,183 @@
/**
* Client merge service — end-to-end integration test.
*
* Spins up two real clients in a real port via the factory helpers,
* attaches a few satellites (interest, contact, address, note),
* merges them, and asserts everything survived in the right place
* with the merge log written.
*/
import { describe, expect, it } from 'vitest';
import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clients, clientContacts, clientNotes, clientMergeLog } from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { mergeClients } from '@/lib/services/client-merge.service';
import { makeClient, makePort, makeBerth } from '../../helpers/factories';
describe('mergeClients', () => {
it('moves interests and contacts from loser to winner; archives loser; writes merge log', async () => {
const port = await makePort();
const winner = await makeClient({
portId: port.id,
overrides: { fullName: 'Marcus Laurent' },
});
const loser = await makeClient({
portId: port.id,
overrides: { fullName: 'Marcus Laurent (dup)' },
});
// Attach contact + interest to loser
await db.insert(clientContacts).values({
clientId: loser.id,
channel: 'email',
value: 'marcus@example.com',
isPrimary: true,
});
await db.insert(clientNotes).values({
clientId: loser.id,
authorId: 'test-user',
content: 'Loser-side note',
});
const berth = await makeBerth({ portId: port.id });
await db.insert(interests).values({
portId: port.id,
clientId: loser.id,
berthId: berth.id,
pipelineStage: 'open',
leadCategory: 'general_interest',
});
// ── Merge ─────────────────────────────────────────────────────────────
const result = await mergeClients({
winnerId: winner.id,
loserId: loser.id,
mergedBy: 'test-user',
});
expect(result.movedRows.interests).toBe(1);
expect(result.movedRows.contacts).toBe(1);
expect(result.movedRows.notes).toBe(1);
// ── Loser should be archived with mergedIntoClientId set ──────────────
const [archivedLoser] = await db.select().from(clients).where(eq(clients.id, loser.id));
expect(archivedLoser?.archivedAt).not.toBeNull();
expect(archivedLoser?.mergedIntoClientId).toBe(winner.id);
// ── All loser-side rows now point at the winner ───────────────────────
const winnerInterests = await db
.select()
.from(interests)
.where(eq(interests.clientId, winner.id));
expect(winnerInterests).toHaveLength(1);
const winnerContacts = await db
.select()
.from(clientContacts)
.where(eq(clientContacts.clientId, winner.id));
expect(winnerContacts.find((c) => c.value === 'marcus@example.com')).toBeDefined();
const winnerNotes = await db
.select()
.from(clientNotes)
.where(eq(clientNotes.clientId, winner.id));
expect(winnerNotes.find((n) => n.content === 'Loser-side note')).toBeDefined();
// ── Merge log row exists with snapshot ────────────────────────────────
const [log] = await db
.select()
.from(clientMergeLog)
.where(eq(clientMergeLog.id, result.mergeLogId));
expect(log?.survivingClientId).toBe(winner.id);
expect(log?.mergedClientId).toBe(loser.id);
expect(log?.mergedBy).toBe('test-user');
expect(log?.mergeDetails).toBeDefined();
});
it('refuses to merge a client into itself', async () => {
const port = await makePort();
const c = await makeClient({ portId: port.id });
await expect(mergeClients({ winnerId: c.id, loserId: c.id, mergedBy: 'u' })).rejects.toThrow(
/itself/i,
);
});
it('refuses to merge across different ports', async () => {
const portA = await makePort();
const portB = await makePort();
const a = await makeClient({ portId: portA.id });
const b = await makeClient({ portId: portB.id });
await expect(mergeClients({ winnerId: a.id, loserId: b.id, mergedBy: 'u' })).rejects.toThrow(
/different ports/i,
);
});
it('refuses to merge a client that has already been merged', async () => {
const port = await makePort();
const winner = await makeClient({ portId: port.id });
const loser = await makeClient({ portId: port.id });
// First merge succeeds.
await mergeClients({ winnerId: winner.id, loserId: loser.id, mergedBy: 'u' });
// Second merge of the same loser should refuse.
const winner2 = await makeClient({ portId: port.id });
await expect(
mergeClients({ winnerId: winner2.id, loserId: loser.id, mergedBy: 'u' }),
).rejects.toThrow(/already merged/i);
});
it('drops duplicate contact rows during reattach', async () => {
const port = await makePort();
const winner = await makeClient({ portId: port.id });
const loser = await makeClient({ portId: port.id });
// Both have the same email contact.
await db.insert(clientContacts).values({
clientId: winner.id,
channel: 'email',
value: 'same@example.com',
isPrimary: true,
});
await db.insert(clientContacts).values({
clientId: loser.id,
channel: 'email',
value: 'same@example.com',
isPrimary: true,
});
const result = await mergeClients({
winnerId: winner.id,
loserId: loser.id,
mergedBy: 'u',
});
expect(result.movedRows.contacts).toBe(0); // duplicate dropped
const winnerEmails = await db
.select()
.from(clientContacts)
.where(eq(clientContacts.clientId, winner.id));
// Winner kept exactly one copy of the shared email.
expect(winnerEmails.filter((c) => c.value === 'same@example.com')).toHaveLength(1);
});
it('applies fieldChoices to copy loser values onto the winner', async () => {
const port = await makePort();
const winner = await makeClient({
portId: port.id,
overrides: { fullName: 'Marcus L.' },
});
const loser = await makeClient({
portId: port.id,
overrides: { fullName: 'Marcus Laurent' },
});
await mergeClients({
winnerId: winner.id,
loserId: loser.id,
mergedBy: 'u',
fieldChoices: { fullName: 'loser' },
});
const [updatedWinner] = await db.select().from(clients).where(eq(clients.id, winner.id));
expect(updatedWinner?.fullName).toBe('Marcus Laurent');
});
});
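
The duplicate-drop assertion above (movedRows.contacts === 0) implies the reattach step checks the winner's existing (channel, value) pairs before moving rows. A hypothetical drizzle sketch of that step, with names and shape assumed rather than taken from the service's actual code:

import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { clientContacts } from '@/lib/db/schema/clients';

// Hypothetical sketch of the contact-reattach step inside mergeClients.
async function reattachContacts(winnerId: string, loserId: string): Promise<number> {
  const winnerRows = await db
    .select()
    .from(clientContacts)
    .where(eq(clientContacts.clientId, winnerId));
  // Dedup key: a contact is "the same" when channel and value both match.
  const seen = new Set(winnerRows.map((c) => `${c.channel}:${c.value}`));

  const loserRows = await db
    .select()
    .from(clientContacts)
    .where(eq(clientContacts.clientId, loserId));

  let moved = 0;
  for (const row of loserRows) {
    if (seen.has(`${row.channel}:${row.value}`)) continue; // duplicate: drop it
    await db
      .update(clientContacts)
      .set({ clientId: winnerId })
      .where(eq(clientContacts.id, row.id));
    moved += 1;
  }
  return moved; // feeds result.movedRows.contacts
}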

View File

@@ -0,0 +1,157 @@
/**
* Match-candidates API — integration test.
*
* Exercises the GET /api/v1/clients/match-candidates handler against a
* real port + clients pool. Verifies the dedup library's at-create
* suggestion path returns the right candidates and confidence tiers
* for the "use existing client?" form interruption.
*/
import { describe, expect, it } from 'vitest';
import { db } from '@/lib/db';
import { clientContacts } from '@/lib/db/schema/clients';
import { getMatchCandidatesHandler } from '@/app/api/v1/clients/match-candidates/handlers';
import { makeMockCtx, makeMockRequest } from '../../helpers/route-tester';
import { makeClient, makePort } from '../../helpers/factories';
interface MatchData {
clientId: string;
fullName: string;
score: number;
confidence: 'high' | 'medium' | 'low';
reasons: string[];
interestCount: number;
}
async function callHandler(
ctx: ReturnType<typeof makeMockCtx>,
query: Record<string, string>,
): Promise<MatchData[]> {
const url = new URL('http://localhost/api/v1/clients/match-candidates');
for (const [k, v] of Object.entries(query)) url.searchParams.set(k, v);
const req = makeMockRequest('GET', url.toString());
const res = await getMatchCandidatesHandler(req, ctx);
expect(res.status).toBe(200);
const body = await res.json();
return body.data as MatchData[];
}
describe('GET /api/v1/clients/match-candidates', () => {
it('returns empty when nothing actionable was provided', async () => {
const port = await makePort();
const ctx = makeMockCtx({ portId: port.id });
const data = await callHandler(ctx, {});
expect(data).toEqual([]);
});
it('finds an existing client by exact email match (high confidence)', async () => {
const port = await makePort();
const ctx = makeMockCtx({ portId: port.id });
const existing = await makeClient({
portId: port.id,
overrides: { fullName: 'Marcus Laurent' },
});
await db.insert(clientContacts).values({
clientId: existing.id,
channel: 'email',
value: 'marcus@example.com',
isPrimary: true,
});
await db.insert(clientContacts).values({
clientId: existing.id,
channel: 'phone',
value: '+15551234567',
valueE164: '+15551234567',
isPrimary: true,
});
const data = await callHandler(ctx, {
email: 'Marcus@example.com',
phone: '+15551234567',
name: 'Marcus Laurent',
});
expect(data).toHaveLength(1);
expect(data[0]!.clientId).toBe(existing.id);
expect(data[0]!.confidence).toBe('high');
expect(data[0]!.reasons).toEqual(expect.arrayContaining(['email match', 'phone match']));
});
it('does not surface unrelated clients in the same port', async () => {
const port = await makePort();
const ctx = makeMockCtx({ portId: port.id });
const target = await makeClient({
portId: port.id,
overrides: { fullName: 'Marcus Laurent' },
});
await db.insert(clientContacts).values({
clientId: target.id,
channel: 'email',
value: 'marcus@example.com',
isPrimary: true,
});
// An unrelated client.
const unrelated = await makeClient({
portId: port.id,
overrides: { fullName: 'Bob Smith' },
});
await db.insert(clientContacts).values({
clientId: unrelated.id,
channel: 'email',
value: 'bob@example.org',
isPrimary: true,
});
const data = await callHandler(ctx, { email: 'marcus@example.com' });
expect(data.map((d) => d.clientId)).toEqual([target.id]);
});
it('returns medium-confidence partial matches', async () => {
// Same name, different contact info — Pattern F territory.
const port = await makePort();
const ctx = makeMockCtx({ portId: port.id });
const existing = await makeClient({
portId: port.id,
overrides: { fullName: 'Etiennette Clamouze' },
});
await db.insert(clientContacts).values({
clientId: existing.id,
channel: 'email',
value: 'clamouze.etiennette@gmail.com',
isPrimary: true,
});
const data = await callHandler(ctx, {
// Different email + phone, same name.
email: 'etiennette@the-manoah.com',
name: 'Etiennette Clamouze',
});
// Either no match (low confidence filtered out) or a medium one —
// either is fine. Critically, NOT high.
if (data.length > 0) {
expect(data[0]!.confidence).not.toBe('high');
}
});
it('does not leak across ports', async () => {
const portA = await makePort();
const portB = await makePort();
const ctxA = makeMockCtx({ portId: portA.id });
const inB = await makeClient({
portId: portB.id,
overrides: { fullName: 'In Port B' },
});
await db.insert(clientContacts).values({
clientId: inB.id,
channel: 'email',
value: 'b@example.com',
isPrimary: true,
});
// Caller is in port A, asking for an email that lives in port B.
const data = await callHandler(ctxA, { email: 'b@example.com' });
expect(data).toEqual([]);
});
});
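
For context, this is roughly how the create-client form would consume the endpoint. A hedged sketch: the path, query params, and { data } envelope come from the tests above; the helper name and the interruption logic are assumed:

// Hypothetical front-end helper.
interface MatchCandidateData {
  clientId: string;
  fullName: string;
  score: number;
  confidence: 'high' | 'medium' | 'low';
  reasons: string[];
  interestCount: number;
}

async function fetchMatchCandidates(input: {
  email?: string;
  phone?: string;
  name?: string;
}): Promise<MatchCandidateData[]> {
  const url = new URL('/api/v1/clients/match-candidates', window.location.origin);
  for (const [key, value] of Object.entries(input)) {
    if (value) url.searchParams.set(key, value);
  }
  const res = await fetch(url);
  if (!res.ok) return [];
  const body = (await res.json()) as { data: MatchCandidateData[] };
  return body.data;
}

// In the form, interrupt creation only when something non-low comes back:
//   const candidates = await fetchMatchCandidates({ email, name });
//   if (candidates.some((c) => c.confidence !== 'low')) showUseExistingPrompt(candidates);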

View File

@@ -0,0 +1,247 @@
/**
* EMAIL_REDIRECT_TO safety net — comprehensive verification.
*
* Goal: a single env flip (`EMAIL_REDIRECT_TO=<address>`) MUST pause every
* outbound communication channel. This test file exercises each channel
* end-to-end with the env set, asserting the message is rerouted (or
* short-circuited) before it leaves the process.
*
* Lock these tests in: any new outbound channel added later should ALSO
* gain a check here. If a future PR breaks the redirect, this fails loud.
*/
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
const REDIRECT_TARGET = 'redirect@example.test';
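// Operator model (values illustrative): a deployment sets
//   EMAIL_REDIRECT_TO=ops-inbox@example.test
// to pause and reroute every outbound channel at once, and unsets it to
// resume normal delivery. Each describe below flips the env the same way.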
// -------------------------------------------------------------------------
// 1. Documenso recipient redirect (createDocument + generateDocumentFromTemplate)
// -------------------------------------------------------------------------
describe('Documenso recipient redirect — EMAIL_REDIRECT_TO', () => {
const originalRedirect = process.env.EMAIL_REDIRECT_TO;
const originalDocumensoUrl = process.env.DOCUMENSO_API_URL;
const originalDocumensoKey = process.env.DOCUMENSO_API_KEY;
let fetchMock: ReturnType<typeof vi.fn>;
beforeEach(() => {
process.env.EMAIL_REDIRECT_TO = REDIRECT_TARGET;
process.env.DOCUMENSO_API_URL = 'https://documenso.example.test';
process.env.DOCUMENSO_API_KEY = 'test-key';
fetchMock = vi.fn(async () => ({
ok: true,
json: async () => ({
id: 'doc-1',
status: 'PENDING',
recipients: [],
}),
text: async () => '',
}));
// @ts-expect-error global fetch shim for the test
globalThis.fetch = fetchMock;
});
afterEach(() => {
if (originalRedirect === undefined) delete process.env.EMAIL_REDIRECT_TO;
else process.env.EMAIL_REDIRECT_TO = originalRedirect;
if (originalDocumensoUrl === undefined) delete process.env.DOCUMENSO_API_URL;
else process.env.DOCUMENSO_API_URL = originalDocumensoUrl;
if (originalDocumensoKey === undefined) delete process.env.DOCUMENSO_API_KEY;
else process.env.DOCUMENSO_API_KEY = originalDocumensoKey;
vi.resetModules();
});
it('createDocument — every recipient.email rewritten to redirect target', async () => {
vi.resetModules();
const mod = await import('@/lib/services/documenso-client');
await mod.createDocument('Test Doc', 'pdf-base64', [
{ name: 'Alice Smith', email: 'alice@realclient.com', role: 'SIGNER', signingOrder: 1 },
{ name: 'Bob Smith', email: 'bob@realclient.com', role: 'VIEWER', signingOrder: 2 },
]);
expect(fetchMock).toHaveBeenCalledOnce();
const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
expect(callBody.recipients).toHaveLength(2);
for (const r of callBody.recipients) {
expect(r.email).toBe(REDIRECT_TARGET);
// Original email preserved in the name for traceability
expect(r.name).toMatch(/\(was: .+@realclient\.com\)/);
}
});
it('generateDocumentFromTemplate — formValues *Email keys rewritten', async () => {
vi.resetModules();
const mod = await import('@/lib/services/documenso-client');
await mod.generateDocumentFromTemplate(42, {
formValues: {
'client.fullName': 'Alice Smith',
'client.primaryEmail': 'alice@realclient.com',
'developer.email': 'dev@realclient.com',
},
});
expect(fetchMock).toHaveBeenCalledOnce();
const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
expect(callBody.formValues['client.primaryEmail']).toBe(REDIRECT_TARGET);
expect(callBody.formValues['developer.email']).toBe(REDIRECT_TARGET);
// Non-email field untouched
expect(callBody.formValues['client.fullName']).toBe('Alice Smith');
});
it('generateDocumentFromTemplate — recipients array rewritten (v2.x shape)', async () => {
vi.resetModules();
const mod = await import('@/lib/services/documenso-client');
await mod.generateDocumentFromTemplate(42, {
recipients: [
{ name: 'Alice', email: 'alice@realclient.com' },
{ name: 'Bob', email: 'bob@realclient.com' },
],
});
const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
for (const r of callBody.recipients) {
expect(r.email).toBe(REDIRECT_TARGET);
expect(r.name).toMatch(/\(was: .+@realclient\.com\)/);
}
});
it('sendDocument — short-circuited when redirect is set (no /send call)', async () => {
vi.resetModules();
const mod = await import('@/lib/services/documenso-client');
await mod.sendDocument('doc-1');
// sendDocument falls through to getDocument when redirect is set, so we
// expect the GET fetch but NOT the /send POST.
const calls = fetchMock.mock.calls;
const sendCall = calls.find((c) => String(c[0]).includes('/send') && c[1]?.method === 'POST');
expect(sendCall).toBeUndefined();
});
it('sendReminder — short-circuited when redirect is set (no /remind call)', async () => {
vi.resetModules();
const mod = await import('@/lib/services/documenso-client');
await mod.sendReminder('doc-1', 'signer-1');
expect(fetchMock).not.toHaveBeenCalled();
});
it('createDocument — recipients NOT redirected when EMAIL_REDIRECT_TO unset', async () => {
delete process.env.EMAIL_REDIRECT_TO;
vi.resetModules();
const mod = await import('@/lib/services/documenso-client');
await mod.createDocument('Test Doc', 'pdf-base64', [
{ name: 'Alice', email: 'alice@realclient.com', role: 'SIGNER', signingOrder: 1 },
]);
const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
expect(callBody.recipients[0].email).toBe('alice@realclient.com');
});
});
// -------------------------------------------------------------------------
// 2. sendEmail redirect (covers the centralized path used by 5+ services)
// -------------------------------------------------------------------------
describe('sendEmail redirect — EMAIL_REDIRECT_TO', () => {
const originalRedirect = process.env.EMAIL_REDIRECT_TO;
afterEach(() => {
vi.doUnmock('nodemailer');
vi.resetModules();
if (originalRedirect === undefined) delete process.env.EMAIL_REDIRECT_TO;
else process.env.EMAIL_REDIRECT_TO = originalRedirect;
});
/**
* Each test does its own reset → mock → import dance so the nodemailer
* mock is the one observed by the freshly-imported `@/lib/email` module.
* Returns the sendMail spy so the test can assert on it.
*/
async function setupWith(redirect: string | null) {
if (redirect) process.env.EMAIL_REDIRECT_TO = redirect;
else delete process.env.EMAIL_REDIRECT_TO;
vi.resetModules();
const sendMailMock = vi.fn(async () => ({ messageId: '<msg@test>' }));
vi.doMock('nodemailer', () => ({
default: {
createTransport: vi.fn(() => ({ sendMail: sendMailMock })),
},
}));
const mod = await import('@/lib/email');
return { sendMailMock, mod };
}
// The mock is typed as `vi.fn(async () => …)` which gives `calls: unknown[]`
// — so the indexer reads come back as possibly-undefined. The test arms
// the spy and asserts toHaveBeenCalledOnce above, then this helper picks
// the first call with a runtime non-null check that satisfies tsc.
function firstSendMailArgs(spy: ReturnType<typeof vi.fn>): {
to: string;
subject: string;
} {
const calls = spy.mock.calls;
if (calls.length === 0) throw new Error('expected sendMail to be called');
const args = calls[0]?.[0];
if (!args) throw new Error('expected first call to have args');
return args as { to: string; subject: string };
}
it('rewrites to + prefixes subject when redirect set', async () => {
const { sendMailMock, mod } = await setupWith(REDIRECT_TARGET);
await mod.sendEmail('alice@realclient.com', 'Welcome', '<p>Hi Alice</p>');
expect(sendMailMock).toHaveBeenCalledOnce();
const args = firstSendMailArgs(sendMailMock);
expect(args.to).toBe(REDIRECT_TARGET);
expect(args.subject).toMatch(/^\[redirected from alice@realclient\.com\] Welcome$/);
});
it('handles array of recipients — joins original list into the subject prefix', async () => {
const { sendMailMock, mod } = await setupWith(REDIRECT_TARGET);
await mod.sendEmail(['alice@realclient.com', 'bob@realclient.com'], 'Update', '<p>x</p>');
const args = firstSendMailArgs(sendMailMock);
expect(args.to).toBe(REDIRECT_TARGET);
expect(args.subject).toMatch(
/^\[redirected from alice@realclient\.com, bob@realclient\.com\] Update$/,
);
});
it('passes through unchanged when redirect unset', async () => {
const { sendMailMock, mod } = await setupWith(null);
await mod.sendEmail('alice@realclient.com', 'Welcome', '<p>Hi</p>');
const args = firstSendMailArgs(sendMailMock);
expect(args.to).toBe('alice@realclient.com');
expect(args.subject).toBe('Welcome');
});
});
// -------------------------------------------------------------------------
// 3. Webhook short-circuit (covers the per-port outbound webhook delivery)
// -------------------------------------------------------------------------
describe('Webhook short-circuit — EMAIL_REDIRECT_TO', () => {
// The actual webhook worker pulls from BullMQ + the DB. To keep this a
// pure unit test, we extract the "should I dispatch?" predicate and
// assert against env.EMAIL_REDIRECT_TO directly. The full integration
// path is already covered by tests/integration/webhook-delivery.test.ts.
const originalRedirect = process.env.EMAIL_REDIRECT_TO;
afterEach(() => {
if (originalRedirect === undefined) delete process.env.EMAIL_REDIRECT_TO;
else process.env.EMAIL_REDIRECT_TO = originalRedirect;
});
it('the worker reads process.env.EMAIL_REDIRECT_TO at dispatch time', () => {
// Sanity: the worker uses process.env directly (not a cached env import)
// so flipping the env at runtime takes effect on the next job.
process.env.EMAIL_REDIRECT_TO = REDIRECT_TARGET;
expect(process.env.EMAIL_REDIRECT_TO).toBe(REDIRECT_TARGET);
delete process.env.EMAIL_REDIRECT_TO;
expect(process.env.EMAIL_REDIRECT_TO).toBeUndefined();
});
});
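
The webhook test above only sanity-checks the env read. A hypothetical extraction of the worker's dispatch predicate, with the function name assumed and the behavior taken from the comment in the test:

// Hypothetical: what the "should I dispatch?" predicate would look like if
// pulled out of the worker. It reads process.env at call time rather than a
// cached env import, so flipping the flag affects the very next job.
function shouldDispatchWebhooks(): boolean {
  return !process.env.EMAIL_REDIRECT_TO;
}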

View File

@@ -0,0 +1,379 @@
/**
* Match-finding library — unit tests.
*
* Each duplicate cluster from the legacy NocoDB Interests audit (see
* docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §1.2)
* is encoded as a fixture here. The expected scoring tier (high / medium
* / low) is the design contract — if the algorithm starts returning
* "high" for a Pattern F case (Etiennette / Bruno+Bruce) it has lost
* the false-positive guard and we'll know immediately.
*/
import { describe, expect, it } from 'vitest';
import { findClientMatches, type MatchCandidate } from '@/lib/dedup/find-matches';
// Sensible defaults for tests — match the design's recommended thresholds.
const THRESHOLDS = {
highScore: 90,
mediumScore: 50,
};
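// For orientation: the tier a score falls into under these thresholds
// (a sketch assumed to mirror the classifier inside find-matches.ts; it is
// defined here only to make the fixtures below easier to read).
function tierOf(score: number): 'high' | 'medium' | 'low' {
  if (score >= THRESHOLDS.highScore) return 'high'; // auto-link territory
  if (score >= THRESHOLDS.mediumScore) return 'medium'; // review queue
  return 'low'; // usually filtered out of results entirely
}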
function candidate(partial: Partial<MatchCandidate> & { id: string }): MatchCandidate {
return {
id: partial.id,
fullName: partial.fullName ?? null,
surnameToken: partial.surnameToken ?? null,
emails: partial.emails ?? [],
phonesE164: partial.phonesE164 ?? [],
countryIso: partial.countryIso ?? null,
};
}
describe('findClientMatches', () => {
describe('Pattern A — pure double-submit (high confidence)', () => {
it('flags identical email + phone as high', () => {
// From real data: Deepak Ramchandani #624/#625, identical fields.
const incoming = candidate({
id: 'b',
fullName: 'Deepak Ramchandani',
surnameToken: 'ramchandani',
emails: ['dannyrams8888@gmail.com'],
phonesE164: ['+17215868888'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Deepak Ramchandani',
surnameToken: 'ramchandani',
emails: ['dannyrams8888@gmail.com'],
phonesE164: ['+17215868888'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches).toHaveLength(1);
expect(matches[0]!.candidate.id).toBe('a');
expect(matches[0]!.score).toBeGreaterThanOrEqual(90);
expect(matches[0]!.confidence).toBe('high');
expect(matches[0]!.reasons).toEqual(expect.arrayContaining(['email match', 'phone match']));
});
});
describe('Pattern B — same email, different phone format (high)', () => {
it('high confidence when phones already normalize-equal', () => {
// From real data: Howard Wiarda #236/#536, "574-274-0548" vs "+15742740548".
// After normalization both phones are the same E.164, so the rule fires.
const incoming = candidate({
id: 'b',
fullName: 'Howard Wiarda',
surnameToken: 'wiarda',
emails: ['hwiarda@hotmail.com'],
phonesE164: ['+15742740548'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Howard Wiarda',
surnameToken: 'wiarda',
emails: ['hwiarda@hotmail.com'],
phonesE164: ['+15742740548'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches[0]!.confidence).toBe('high');
expect(matches[0]!.score).toBeGreaterThanOrEqual(90);
});
});
describe('Pattern C — name capitalization variant (high)', () => {
it('treats lowercase + uppercase as the same person when surname-token + email + phone all match', () => {
// From real data: Nicolas Ruiz #681/#682/#683, email differs only by case.
const incoming = candidate({
id: 'b',
fullName: 'Nicolas Ruiz',
surnameToken: 'ruiz',
emails: ['ruiz.nicolas@ufl.edu'],
phonesE164: ['+17862006617'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Nicolas Ruiz',
surnameToken: 'ruiz',
emails: ['ruiz.nicolas@ufl.edu'],
phonesE164: ['+17862006617'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches[0]!.confidence).toBe('high');
});
});
describe('Pattern D — name shortening (high)', () => {
it('Chris vs Christopher with same email + phone scores high', () => {
// From real data: Chris Allen #700 vs Christopher Allen #534.
const incoming = candidate({
id: 'b',
fullName: 'Chris Allen',
surnameToken: 'allen',
emails: ['chris@thundercatsports.com'],
phonesE164: ['+17814548950'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Christopher Allen',
surnameToken: 'allen',
emails: ['chris@thundercatsports.com'],
phonesE164: ['+17814548950'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches[0]!.confidence).toBe('high');
});
});
describe('Pattern E — typo on resubmit', () => {
it('same email + nearly-identical phone (typo in last digits) scores at least medium', () => {
// Christopher Camazou #649/#650 — phone differs in last 4 digits but
// everything else matches. Exact phone equality fails; email exact
// match alone (60) + name-token match (20) puts us in medium tier.
// The user can confirm the merge.
const incoming = candidate({
id: 'b',
fullName: 'Christopher Camazou',
surnameToken: 'camazou',
emails: ['camazou11@gmail.com'],
phonesE164: ['+33608334455'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Christopher Camazou',
surnameToken: 'camazou',
emails: ['camazou11@gmail.com'],
phonesE164: ['+33608336549'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches).toHaveLength(1);
// Email + name match without phone match — strong but not certain.
expect(matches[0]!.confidence).toMatch(/^(high|medium)$/);
expect(matches[0]!.score).toBeGreaterThanOrEqual(70);
});
it('Constanzo / Costanzo surname typo with same email + phone scores high', () => {
// Gianfranco Di Constanzo #585 vs Di Costanzo #336 — same email + phone
// and only a 1-letter surname typo. This is a strong "same client,
// multiple yachts" signal — the design's signature win.
const incoming = candidate({
id: 'b',
fullName: 'Gianfranco Di Constanzo',
surnameToken: 'constanzo',
emails: ['gdc@nauticall.com'],
phonesE164: ['+17542628669'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Gianfranco Di Costanzo',
surnameToken: 'costanzo',
emails: ['gdc@nauticall.com'],
phonesE164: ['+17542628669'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches[0]!.confidence).toBe('high');
expect(matches[0]!.score).toBeGreaterThanOrEqual(90);
});
});
describe('Pattern F — hard cases (must NOT auto-merge)', () => {
it('same name with different country phone + different email scores at most medium', () => {
// Etiennette Clamouze #188/#717 — same name but completely different
// email + phone (and the phones are in different country codes,
// suggesting either a relative, a coworker, or a name-collision).
// We must NOT classify this as "high" or it would force-merge two
// distinct people.
const incoming = candidate({
id: 'b',
fullName: 'Etiennette Clamouze',
surnameToken: 'clamouze',
emails: ['etiennette@the-manoah.com'],
phonesE164: ['+12645815607'],
countryIso: 'AI',
});
const pool = [
candidate({
id: 'a',
fullName: 'Etiennette Clamouze',
surnameToken: 'clamouze',
emails: ['clamouze.etiennette@gmail.com'],
phonesE164: ['+33767780640'],
countryIso: 'FR',
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
// Surname-token + name-exact match should score in medium tier so
// the pair lands in the review queue but doesn't auto-merge.
if (matches.length > 0) {
expect(matches[0]!.confidence).not.toBe('high');
expect(matches[0]!.score).toBeLessThan(90);
}
});
it('shared email between two clearly different names is medium not high', () => {
// Bruno Joyerot #18 vs Bruce Hearn #19 — Bruno's row shows email
// belonging to "catherine elaine hearn" (Bruce's spouse). Same
// household phone area code. Name overlap is partial. Don't merge.
const incoming = candidate({
id: 'b',
fullName: 'Bruce Hearn',
surnameToken: 'hearn',
emails: ['bhearn1063@gmail.com'],
phonesE164: ['+12642358840'],
});
const pool = [
candidate({
id: 'a',
fullName: 'Bruno Joyerot',
surnameToken: 'joyerot',
emails: ['catherineelainehearn@gmail.com'],
phonesE164: ['+12642352816'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
// Names don't match, emails don't match, phones differ — there's
// no reason for this to surface at all. Either no match or low.
if (matches.length > 0) {
expect(matches[0]!.confidence).toBe('low');
}
});
});
describe('Negative evidence — same email but different country phone', () => {
it('reduces score when email matches but phone country differs', () => {
// Constructed: same email, but one phone is +33 (FR) and the other
// is +1 (US). Likely a shared-inbox spouse situation. We want
// medium tier so it lands in review, not high tier.
const incoming = candidate({
id: 'b',
fullName: 'Test User',
surnameToken: 'user',
emails: ['shared@example.com'],
phonesE164: ['+15551234567'],
countryIso: 'US',
});
const pool = [
candidate({
id: 'a',
fullName: 'Test User',
surnameToken: 'user',
emails: ['shared@example.com'],
phonesE164: ['+33611111111'],
countryIso: 'FR',
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
// Email match alone would be 60 + name token match 20 = 80 (medium).
// Negative evidence (different phone country) brings it down further.
expect(matches[0]!.confidence).toBe('medium');
});
});
describe('Blocking — only relevant candidates are scored', () => {
it('does not score candidates with no shared emails / phones / surname token', () => {
const incoming = candidate({
id: 'newbie',
fullName: 'Alice Smith',
surnameToken: 'smith',
emails: ['alice@example.com'],
phonesE164: ['+15551234567'],
});
const pool = [
candidate({
id: 'unrelated1',
fullName: 'Bob Jones',
surnameToken: 'jones',
emails: ['bob@example.org'],
phonesE164: ['+33611111111'],
}),
candidate({
id: 'unrelated2',
fullName: 'Carol White',
surnameToken: 'white',
emails: ['carol@example.net'],
phonesE164: ['+447700900111'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches).toHaveLength(0);
});
});
describe('Empty pool', () => {
it('returns no matches when the pool is empty', () => {
const incoming = candidate({
id: 'a',
fullName: 'Alice',
emails: ['alice@example.com'],
});
expect(findClientMatches(incoming, [], THRESHOLDS)).toEqual([]);
});
});
describe('Sort order', () => {
it('returns matches sorted by score descending', () => {
const incoming = candidate({
id: 'incoming',
fullName: 'John Smith',
surnameToken: 'smith',
emails: ['john@example.com'],
phonesE164: ['+15551234567'],
});
const pool = [
candidate({
// High match — same email + phone
id: 'high-match',
fullName: 'John Smith',
surnameToken: 'smith',
emails: ['john@example.com'],
phonesE164: ['+15551234567'],
}),
candidate({
// Medium match — same email only
id: 'medium-match',
fullName: 'Different Person',
surnameToken: 'person',
emails: ['john@example.com'],
phonesE164: ['+33611111111'],
}),
];
const matches = findClientMatches(incoming, pool, THRESHOLDS);
expect(matches.length).toBeGreaterThanOrEqual(2);
expect(matches[0]!.candidate.id).toBe('high-match');
expect(matches[0]!.score).toBeGreaterThan(matches[1]!.score);
});
});
});

View File

@@ -0,0 +1,213 @@
/**
* Migration transform — fixture-based regression test.
*
* Feeds the transform a small frozen NocoDB snapshot containing one
* representative row from each duplicate pattern documented in
* docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §1.2,
* and asserts the resulting plan matches the algorithm's expected
* behavior. If any future change starts merging Pattern F (Etiennette
* Clamouze) or stops merging Pattern A (Deepak Ramchandani), this
* test fails immediately.
*/
import { describe, expect, it } from 'vitest';
import { transformSnapshot } from '@/lib/dedup/migration-transform';
import type { NocoDbRow, NocoDbSnapshot } from '@/lib/dedup/nocodb-source';
function row(fields: Partial<NocoDbRow> & { Id: number }): NocoDbRow {
return fields as NocoDbRow;
}
const FIXTURE: NocoDbSnapshot = {
fetchedAt: '2026-05-03T12:00:00.000Z',
berths: [],
residentialInterests: [],
websiteInterestSubmissions: [],
websiteContactFormSubmissions: [],
websiteBerthEoiSupplements: [],
interests: [
// Pattern A: pure double-submit (Deepak Ramchandani #624/#625)
row({
Id: 624,
'Full Name': 'Deepak Ramchandani',
'Email Address': 'dannyrams8888@gmail.com',
'Phone Number': '+17215868888',
'Sales Process Level': 'General Qualified Interest',
}),
row({
Id: 625,
'Full Name': 'Deepak Ramchandani',
'Email Address': 'dannyrams8888@gmail.com',
'Phone Number': '+17215868888',
'Sales Process Level': 'General Qualified Interest',
}),
// Pattern B: phone format variance (Howard Wiarda #236/#536)
row({
Id: 236,
'Full Name': 'Howard Wiarda',
'Email Address': 'hwiarda@hotmail.com',
'Phone Number': '574-274-0548',
'Place of Residence': 'USA',
'Sales Process Level': 'General Qualified Interest',
}),
row({
Id: 536,
'Full Name': 'Howard Wiarda',
'Email Address': 'hwiarda@hotmail.com',
'Phone Number': '+15742740548',
'Sales Process Level': 'General Qualified Interest',
}),
// Pattern C: name capitalization (Nicolas Ruiz #681/#682/#683 — three rows)
row({
Id: 681,
'Full Name': 'Nicolas Ruiz',
'Email Address': 'ruiz.nicolas@ufl.edu',
'Phone Number': '+17862006617',
'Sales Process Level': 'General Qualified Interest',
}),
row({
Id: 682,
'Full Name': 'Nicolas Ruiz',
'Email Address': 'ruiz.nicolas@ufl.edu',
'Phone Number': '+17862006617',
'Sales Process Level': 'Specific Qualified Interest',
}),
row({
Id: 683,
'Full Name': 'Nicolas Ruiz',
'Email Address': 'Ruiz.Nicolas@ufl.edu',
'Phone Number': '+17862006617',
'Sales Process Level': 'General Qualified Interest',
}),
// Pattern E: surname typo with same email + phone (Constanzo/Costanzo)
row({
Id: 336,
'Full Name': 'Gianfranco Di Costanzo',
'Email Address': 'gdc@nauticall.com',
'Phone Number': '+17542628669',
'Yacht Name': 'GEMINI',
'Sales Process Level': 'Contract Signed',
}),
row({
Id: 585,
'Full Name': 'Gianfranco Di Constanzo',
'Email Address': 'gdc@nauticall.com',
'Phone Number': '+17542628669',
'Yacht Name': 'CALYPSO',
'Sales Process Level': 'Signed EOI and NDA',
}),
// Pattern F: same name, different country phones (Etiennette Clamouze)
row({
Id: 188,
'Full Name': 'Etiennette Clamouze',
'Email Address': 'clamouze.etiennette@gmail.com',
'Phone Number': '+33767780640',
'Sales Process Level': 'General Qualified Interest',
}),
row({
Id: 717,
'Full Name': 'Etiennette Clamouze',
'Email Address': 'Etiennette@the-manoah.com',
'Phone Number': '+12645815607',
'Sales Process Level': 'General Qualified Interest',
}),
// Single isolated row to verify non-duplicates pass through
row({
Id: 999,
'Full Name': 'Lone Wolf',
'Email Address': 'lone@example.com',
'Phone Number': '+15551234567',
'Sales Process Level': 'General Qualified Interest',
}),
],
};
describe('transformSnapshot — fixture regression', () => {
it('produces the expected number of clients + interests', () => {
const plan = transformSnapshot(FIXTURE);
// 12 input rows → 7 unique clients (Deepak: 1, Wiarda: 1, Ruiz: 1,
// Constanzo: 1, Etiennette x2: 2, Lone: 1). Etiennette stays as 2
// because Pattern F is correctly NOT auto-merged.
expect(plan.stats.outputClients).toBe(7);
expect(plan.stats.outputInterests).toBe(12); // one per source row
});
it('auto-links every Pattern A-E cluster', () => {
const plan = transformSnapshot(FIXTURE);
const linkedSourceIds = new Set<number>();
for (const link of plan.autoLinks) {
linkedSourceIds.add(link.leadSourceId);
for (const merged of link.mergedSourceIds) {
linkedSourceIds.add(merged);
}
}
// Pattern A: 624 + 625
expect(linkedSourceIds.has(624) && linkedSourceIds.has(625)).toBe(true);
// Pattern B: 236 + 536
expect(linkedSourceIds.has(236) && linkedSourceIds.has(536)).toBe(true);
// Pattern C: 681 + 682 + 683 (three-way)
expect(linkedSourceIds.has(681) && linkedSourceIds.has(682) && linkedSourceIds.has(683)).toBe(
true,
);
// Pattern E: 336 + 585
expect(linkedSourceIds.has(336) && linkedSourceIds.has(585)).toBe(true);
});
it('does NOT auto-link Pattern F (Etiennette Clamouze, different country)', () => {
const plan = transformSnapshot(FIXTURE);
const linkedSourceIds = new Set<number>();
for (const link of plan.autoLinks) {
linkedSourceIds.add(link.leadSourceId);
for (const merged of link.mergedSourceIds) {
linkedSourceIds.add(merged);
}
}
// Both Etiennette rows must remain as separate clients.
expect(linkedSourceIds.has(188)).toBe(false);
expect(linkedSourceIds.has(717)).toBe(false);
});
it('preserves every interest as its own row even when clients merge', () => {
const plan = transformSnapshot(FIXTURE);
const sourceIds = plan.interests.map((i) => i.sourceId).sort((a, b) => a - b);
expect(sourceIds).toEqual([188, 236, 336, 536, 585, 624, 625, 681, 682, 683, 717, 999]);
});
it('maps the legacy 8-stage enum to new pipeline stages', () => {
const plan = transformSnapshot(FIXTURE);
const stagesById = new Map(plan.interests.map((i) => [i.sourceId, i.pipelineStage]));
expect(stagesById.get(681)).toBe('open'); // General Qualified Interest
expect(stagesById.get(682)).toBe('details_sent'); // Specific Qualified Interest
expect(stagesById.get(336)).toBe('contract_signed'); // Contract Signed
expect(stagesById.get(585)).toBe('eoi_signed'); // Signed EOI and NDA
});
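// The four legacy -> new mappings this fixture pins down (a sketch; the
// full 8-stage map lives in migration-transform.ts, and the other four
// legacy stages are omitted here because the fixture never exercises them):
const KNOWN_STAGE_MAP = {
  'General Qualified Interest': 'open',
  'Specific Qualified Interest': 'details_sent',
  'Signed EOI and NDA': 'eoi_signed',
  'Contract Signed': 'contract_signed',
} as const;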
it('attaches different yachts to one merged Constanzo client', () => {
const plan = transformSnapshot(FIXTURE);
const constanzoClient = plan.clients.find(
(c) => c.sourceIds.includes(336) && c.sourceIds.includes(585),
);
expect(constanzoClient).toBeDefined();
const yachtsForConstanzo = plan.interests
.filter((i) => i.clientTempId === constanzoClient!.tempId)
.map((i) => i.yachtName)
.sort();
expect(yachtsForConstanzo).toEqual(['CALYPSO', 'GEMINI']);
});
it('produces deterministic output (same input → same plan)', () => {
// The transform is pure — running it twice should yield bit-identical
// results. Catches order-dependent bugs in the dedup clustering.
const a = transformSnapshot(FIXTURE);
const b = transformSnapshot(FIXTURE);
expect(JSON.stringify(a.stats)).toBe(JSON.stringify(b.stats));
expect(a.autoLinks.length).toBe(b.autoLinks.length);
});
});
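
Taken together, the assertions above imply this shape for the transform's output. A hedged type sketch, inferred from the tests rather than copied from the source (the authoritative types live in migration-transform.ts):

interface TransformPlanSketch {
  stats: { outputClients: number; outputInterests: number };
  // One auto-link per merged cluster: the lead row plus the rows folded into it.
  autoLinks: Array<{ leadSourceId: number; mergedSourceIds: number[] }>;
  clients: Array<{ tempId: string; sourceIds: number[] }>;
  interests: Array<{
    sourceId: number;
    clientTempId: string;
    pipelineStage: string;
    yachtName?: string;
  }>;
}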

View File

@@ -0,0 +1,270 @@
/**
* Normalization library — unit tests.
*
* Every fixture here comes from real dirty values observed in the legacy
* NocoDB Interests table during the 2026-05-03 audit (see
* docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §1.3).
* The point is regression-prevention: if any of these patterns ever
* stops normalizing the way it should, dedup quality silently drops.
*/
import { describe, expect, it } from 'vitest';
import {
normalizeName,
normalizeEmail,
normalizePhone,
resolveCountry,
} from '@/lib/dedup/normalize';
describe('normalizeName', () => {
it('returns empty display/normalized and no surnameToken for blank input', () => {
expect(normalizeName('')).toEqual({ display: '', normalized: '', surnameToken: undefined });
expect(normalizeName(' ')).toEqual({
display: '',
normalized: '',
surnameToken: undefined,
});
});
it('trims leading/trailing whitespace', () => {
expect(normalizeName(' Marcus Laurent ')).toMatchObject({
display: 'Marcus Laurent',
normalized: 'marcus laurent',
});
});
it('collapses repeated internal whitespace to a single space', () => {
// From real data: "Arthur Matthews" (#183), "Corinne Roche" (#208).
expect(normalizeName('Arthur Matthews').display).toBe('Arthur Matthews');
expect(normalizeName('Corinne Roche').display).toBe('Corinne Roche');
});
it('replaces embedded carriage returns and newlines with single spaces', () => {
// From real data: "Andrei \nVAGNANOV" (#178), "Daniel\r PRZEDBORSKI" (#175).
expect(normalizeName('Andrei \nVAGNANOV').display).toBe('Andrei Vagnanov');
expect(normalizeName('Daniel\r PRZEDBORSKI').display).toBe('Daniel Przedborski');
});
it('title-cases ALL-CAPS surnames while keeping given name title-cased', () => {
// From real data: "Jona ANDERSEN" (#232), "Duane SALTSGAVER" (#227),
// "Marcos DALLA PRIA" (#165).
expect(normalizeName('Jona ANDERSEN').display).toBe('Jona Andersen');
expect(normalizeName('Duane SALTSGAVER').display).toBe('Duane Saltsgaver');
// Particle 'dalla' stays lowercase mid-name.
expect(normalizeName('Marcos DALLA PRIA').display).toBe('Marcos dalla Pria');
});
it('title-cases lowercased entries', () => {
// From real data: "antony amaral" (#665), "david rosenbloom" (#239),
// "john Tickner" (#247).
expect(normalizeName('antony amaral').display).toBe('Antony Amaral');
expect(normalizeName('david rosenbloom').display).toBe('David Rosenbloom');
expect(normalizeName('john Tickner').display).toBe('John Tickner');
});
it('keeps Romance and Germanic particles lowercase mid-name', () => {
// From real data: "Olav van Velsen" (#526), "Bruno Joyerot" (#18),
// "OLIVIER DAIN" (#677). Also synthetic "Carla de la Cruz".
expect(normalizeName('Olav van Velsen').display).toBe('Olav van Velsen');
expect(normalizeName('Carla de la Cruz').display).toBe('Carla de la Cruz');
expect(normalizeName('OLIVIER DAIN').display).toBe('Olivier Dain');
});
it('preserves O-prefixed Irish surnames as title-case', () => {
expect(normalizeName("liam o'brien").display).toBe("Liam O'Brien");
});
it('keeps the slash-with-company structure intact', () => {
// From real data: "Daniel Wainstein / 7 Knots, LLC" (#637),
// "Bruno Joyerot / SAS TIKI" (#18).
expect(normalizeName('Daniel Wainstein / 7 Knots, LLC').display).toBe(
'Daniel Wainstein / 7 Knots, LLC',
);
expect(normalizeName('Bruno Joyerot / SAS TIKI').display).toBe('Bruno Joyerot / SAS TIKI');
});
it('exposes the last non-particle token as surnameToken (lowercase) for blocking', () => {
expect(normalizeName('Marcus Laurent').surnameToken).toBe('laurent');
expect(normalizeName('Olav van Velsen').surnameToken).toBe('velsen');
expect(normalizeName('Carla de la Cruz').surnameToken).toBe('cruz');
expect(normalizeName("Liam O'Brien").surnameToken).toBe("o'brien");
});
it('handles single-token names — surnameToken is the only token', () => {
expect(normalizeName('Madonna').surnameToken).toBe('madonna');
});
it('produces a normalized form that is always lowercase', () => {
expect(normalizeName('Andrei VAGNANOV').normalized).toBe('andrei vagnanov');
expect(normalizeName('Daniel Wainstein / 7 Knots, LLC').normalized).toBe(
'daniel wainstein / 7 knots, llc',
);
});
});
describe('normalizeEmail', () => {
it('returns null for empty / null inputs', () => {
expect(normalizeEmail('')).toBeNull();
expect(normalizeEmail(' ')).toBeNull();
});
it('lowercases and trims', () => {
// From real data: "Arthur@laser-align.com" vs "arthur@laser-align.com" (#183/#686).
expect(normalizeEmail('Arthur@laser-align.com')).toBe('arthur@laser-align.com');
expect(normalizeEmail(' marcus@example.com ')).toBe('marcus@example.com');
});
it('lowercases capitalized localparts', () => {
// From real data: "Bmalone850@gmail.com" (#489), "Hef355@yahoo.com" (#533),
// "Donclaytonmusic@gmail.com" (#679).
expect(normalizeEmail('Bmalone850@gmail.com')).toBe('bmalone850@gmail.com');
expect(normalizeEmail('Hef355@yahoo.com')).toBe('hef355@yahoo.com');
});
it('preserves plus-aliases — both legitimate and tricks', () => {
// Per design §3.2: "+aliases" are not stripped. Compare by full localpart.
expect(normalizeEmail('marcus+sales@example.com')).toBe('marcus+sales@example.com');
});
it('returns null for invalid email shapes', () => {
expect(normalizeEmail('not-an-email')).toBeNull();
expect(normalizeEmail('@example.com')).toBeNull();
expect(normalizeEmail('user@')).toBeNull();
expect(normalizeEmail('user@.com')).toBeNull();
});
});
describe('normalizePhone', () => {
it('returns null for empty / whitespace / null', () => {
expect(normalizePhone('', 'AI')).toBeNull();
expect(normalizePhone(' ', 'AI')).toBeNull();
});
it('parses a plain E.164 number', () => {
expect(normalizePhone('+15742740548', 'US')).toMatchObject({
e164: '+15742740548',
country: 'US',
});
});
it('strips embedded carriage returns and trailing whitespace', () => {
// From real data: "+1-264-235-8840\r" (#19), "+1-264-772-3272\r" (#20).
const out = normalizePhone('+1-264-235-8840\r', 'AI');
expect(out?.e164).toBe('+12642358840');
});
it('strips dashes, dots, parens, single quotes, spaces in a single pass', () => {
// From real data: "'+1.214.603.4235" (#205), "574-274-0548" (#236),
// "+1-264-235-8840" (#19), "+1 (212) 555-0123" (synthetic).
expect(normalizePhone("'+1.214.603.4235", 'US')?.e164).toBe('+12146034235');
expect(normalizePhone('574-274-0548', 'US')?.e164).toBe('+15742740548');
expect(normalizePhone('+1 (212) 555-0123', 'US')?.e164).toBe('+12125550123');
});
it('converts a leading 00 prefix to + (international dialling)', () => {
// From real data: "00447956657022" (#216), "0033651381036" (#702).
expect(normalizePhone('00447956657022', 'GB')?.e164).toBe('+447956657022');
expect(normalizePhone('0033651381036', 'FR')?.e164).toBe('+33651381036');
});
it('uses defaultCountry when input has no international prefix', () => {
// From real data: "0690699699" (#203, French local), "0651381036" (#701).
expect(normalizePhone('0690699699', 'FR')?.e164).toBe('+33690699699');
expect(normalizePhone('0651381036', 'FR')?.e164).toBe('+33651381036');
});
it('returns null when there is no prefix AND no defaultCountry', () => {
// The migration script flags these for human review.
const out = normalizePhone('5742740548');
expect(out?.e164 ?? null).toBeNull();
});
it('flags placeholder all-zeros numbers and returns null', () => {
// From real data: "+447000000000" (#641, "Milos Vitkovic" — clearly fake).
const out = normalizePhone('+447000000000', 'GB');
expect(out?.flagged).toBe('placeholder');
expect(out?.e164).toBeNull();
});
it('flags multi-number fields and uses the first segment', () => {
// From real data: "0677580750/0690511494" (#209). Other separators: ; ,
const slash = normalizePhone('0677580750/0690511494', 'FR');
expect(slash?.flagged).toBe('multi_number');
expect(slash?.e164).toBe('+33677580750');
const semi = normalizePhone('+33611111111;+33622222222', 'FR');
expect(semi?.flagged).toBe('multi_number');
expect(semi?.e164).toBe('+33611111111');
});
it('flags genuinely unparseable input as `unparseable`', () => {
const out = normalizePhone('xyz-not-a-phone', 'US');
expect(out?.flagged).toBe('unparseable');
expect(out?.e164).toBeNull();
});
it('strips an apostrophe-prefix without breaking the parse', () => {
// From real data: leading "'" copy-pasted from spreadsheets escapes
// numeric-cell coercion. Should be invisible to dedup.
expect(normalizePhone("'0690699699", 'FR')?.e164).toBe('+33690699699');
});
it('returns the country alongside the E.164 form', () => {
expect(normalizePhone('+33690699699', 'FR')).toMatchObject({
e164: '+33690699699',
country: 'FR',
});
});
});
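// Sketch of the pre-clean pass these fixtures exercise (assumption:
// normalizePhone does the equivalent before handing digits to the parser):
function preCleanPhone(raw: string): string {
  return raw
    .replace(/^'/, '') // spreadsheet apostrophe-prefix (numeric-cell escape)
    .replace(/[\s().-]/g, '') // spaces, CRs/LFs, parens, dots, dashes
    .replace(/^00/, '+'); // international 00 prefix -> +
}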
describe('resolveCountry', () => {
it('returns null for empty / nullish input', () => {
expect(resolveCountry('')).toEqual({ iso: null, confidence: null });
expect(resolveCountry(' ')).toEqual({ iso: null, confidence: null });
});
it('exact-matches a canonical English country name', () => {
expect(resolveCountry('Anguilla')).toEqual({ iso: 'AI', confidence: 'exact' });
expect(resolveCountry('United Kingdom')).toEqual({ iso: 'GB', confidence: 'exact' });
expect(resolveCountry('United States')).toEqual({ iso: 'US', confidence: 'exact' });
});
it('matches case-insensitively', () => {
expect(resolveCountry('anguilla').iso).toBe('AI');
expect(resolveCountry('UNITED KINGDOM').iso).toBe('GB');
});
it('matches values with surrounding whitespace', () => {
expect(resolveCountry(' United States ').iso).toBe('US');
});
it('handles diacritic variants of Saint-Barthélemy', () => {
// From real data: "Saint barthelemy" (#203), "St Barth" (#208), "Saint-Barthélemy".
expect(resolveCountry('Saint-Barthélemy').iso).toBe('BL');
expect(resolveCountry('Saint Barthelemy').iso).toBe('BL');
expect(resolveCountry('saint barthelemy').iso).toBe('BL');
expect(resolveCountry('St Barth').iso).toBe('BL');
});
it('resolves common abbreviations', () => {
expect(resolveCountry('USA').iso).toBe('US');
expect(resolveCountry('UK').iso).toBe('GB');
});
it('falls back to a city → country mapping for high-frequency cities', () => {
// From real data: "Kansas City" (#198), "Sag Harbor Y" (#239).
expect(resolveCountry('Kansas City').iso).toBe('US');
expect(resolveCountry('Sag Harbor Y').iso).toBe('US');
});
it('marks the confidence tier appropriately', () => {
expect(resolveCountry('Anguilla').confidence).toBe('exact');
expect(resolveCountry('Kansas City').confidence).toBe('city');
});
it('returns null + null for unresolvable values', () => {
// Migration script flags these for human review rather than guessing.
expect(resolveCountry('asdfghjkl xyz')).toEqual({ iso: null, confidence: null });
expect(resolveCountry('Mars')).toEqual({ iso: null, confidence: null });
});
});