Compare commits

29 commits: `feat/berth` ... `c612bbdfd9`

| SHA1 |
| --- |
| c612bbdfd9 |
| 872c75f1a1 |
| c45aac551d |
| 9ad1df85d2 |
| 8e4d2fc5b4 |
| 78f2f46d41 |
| 3a9419fe10 |
| b703684285 |
| a792d9a182 |
| d7ec2a8507 |
| cb83b09b2d |
| 7574c3b575 |
| bb105f5365 |
| caafae15dd |
| 46c7389930 |
| 80fc5932be |
| b26b87b2fa |
| 88f76b6b04 |
| a32f41b91d |
| cf1c8b66db |
| 596476280d |
| e9359fc431 |
| 4767caec01 |
| 49d92234dd |
| cad55e3565 |
| 4bcc7f8be6 |
| 18e5c124b0 |
| 8b077e1999 |
| 36b92eb827 |
@@ -1 +0,0 @@
-{"sessionId":"fd05cbd7-d695-4a70-9223-4b25f3369829","pid":88534,"acquiredAt":1776866083076}
`.gitignore` (vendored, 13 changes)

@@ -28,9 +28,22 @@ docker-compose.override.yml

# Ad-hoc screenshots / scratch artifacts at repo root
/*.png
/*.jpg

# Legacy Nuxt portal — kept on disk for reference, not tracked here
/client-portal/

# Sister marketing site — separate Nuxt project, not part of CRM tracking
/website/

# Mobile audit screenshots — generated locally, regenerable
/.audit/
/.audit-screenshots/

# Migration script output (CSV reports, transcripts)
.migration/

# Tool caches / runtime state
/.claude/
/.serena/
/ruvector.db
`docs/operations/outbound-comms-safety.md` (new file, 123 lines)

@@ -0,0 +1,123 @@
# Outbound communications safety net

**Last reviewed:** 2026-05-03
**Owner:** matt@portnimara.com

This doc enumerates every channel through which the CRM can produce outbound communication (email, document signing, webhooks) and describes how each channel respects the `EMAIL_REDIRECT_TO` env var. The goal: a single environment flip pauses **all** outbound traffic, so a production data import, dedup migration dry-run, or staging environment can run against real data without anyone getting paged or spammed.

> **Single env switch:** when `EMAIL_REDIRECT_TO` is set to an address, all outbound communication is rerouted there or short-circuited. Unset it in production.

---

## Channels

### 1. Direct email (`sendEmail`)

**Path:** `src/lib/email/index.ts` → `sendEmail()` → nodemailer SMTP transport.

**Safety:** YES — covered.

When `EMAIL_REDIRECT_TO` is set, `sendEmail()` rewrites the `to` header to the redirect address and prefixes the subject with `[redirected from <orig>]`. The original recipient is logged.

**Call sites** (all flow through `sendEmail`, so all are covered):

- `src/lib/services/portal-auth.service.ts` — portal activation + reset
- `src/lib/services/crm-invite.service.ts` — CRM user invitations
- `src/lib/services/document-templates.ts` — template-generated PDFs sent as attachments (the PDF body is generated locally; the email itself goes through SMTP)
- `src/lib/services/email-compose.service.ts` — ad-hoc emails composed in the in-app UI
- `src/lib/services/gdpr-export.service.ts` — GDPR export delivery
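The rewrite described above can be sketched as a pure function applied just before the message reaches the transport. This is an illustrative sketch, not the repo's actual `sendEmail` code; the `OutboundMessage` shape and `applyEmailRedirect` name are assumptions.

```typescript
// Illustrative sketch of the EMAIL_REDIRECT_TO rewrite. Names and shapes
// are assumptions, not the actual src/lib/email implementation.
interface OutboundMessage {
  to: string;
  subject: string;
  html: string;
}

export function applyEmailRedirect(
  msg: OutboundMessage,
  redirectTo: string | undefined,
): OutboundMessage {
  if (!redirectTo) return msg; // production: env unset, send as-is
  return {
    ...msg,
    to: redirectTo, // reroute to the safety address
    subject: `[redirected from ${msg.to}] ${msg.subject}`, // keep the original recipient visible
  };
}
```

Keeping the rewrite pure makes the gate unit-testable without an SMTP server; the transport only ever sees the already-rewritten message.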
### 2. Documenso e-signature recipients

**Path:** `src/lib/services/documenso-client.ts` → `createDocument()` / `generateDocumentFromTemplate()` → Documenso REST API.

**Safety:** YES — covered as of 2026-05-03.

Documenso's own server sends the signing-request email on our behalf. We can't intercept that at the SMTP layer because it's external. The fix is at the REST-call boundary: when `EMAIL_REDIRECT_TO` is set, `createDocument` rewrites every recipient's email to the redirect address and prefixes the recipient name with `(was: <orig email>)` so the doc is still traceable to its intended recipient. `generateDocumentFromTemplate` does the same for both shapes the template-generate endpoint accepts (v1.13 `formValues.*Email` keys and v2.x `recipients` array).

The redirect happens **before** the API call, so even if Documenso has its own retry logic the original email never leaves our process.
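The recipient rewrite for the v2.x `recipients`-array shape can be sketched as follows. This is a minimal sketch under assumed names; it does not cover the v1.13 `formValues.*Email` shape, which the real client handles separately.

```typescript
// Sketch of the recipient rewrite at the Documenso REST-call boundary.
// The Recipient shape and function name are illustrative assumptions.
interface Recipient {
  name: string;
  email: string;
}

export function redirectRecipients(
  recipients: Recipient[],
  redirectTo: string | undefined,
): Recipient[] {
  if (!redirectTo) return recipients;
  return recipients.map((r) => ({
    // annotate the name so the doc stays traceable to its intended recipient
    name: `${r.name} (was: ${r.email})`,
    email: redirectTo,
  }));
}
```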
### 3. Webhooks (outbound to user-configured URLs)

**Path:** `src/lib/queue/workers/webhooks.ts` → BullMQ job → `fetch(webhook.url, ...)`.

**Safety:** YES — covered as of 2026-05-03.

When `EMAIL_REDIRECT_TO` is set, the webhook worker short-circuits before the HTTP call. The delivery row is marked `dead_letter` with a human-readable reason so it's still visible in the deliveries listing. The SSRF guard remains in place independently.
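The worker's short-circuit decision can be factored out as a pure gate, which keeps the behavior testable without BullMQ or a network. A sketch under assumed names:

```typescript
// Sketch of the webhook worker's short-circuit gate. Names are illustrative,
// not the actual src/lib/queue/workers/webhooks.ts code.
type DeliveryOutcome =
  | { kind: "deliver" }
  | { kind: "dead_letter"; reason: string };

export function gateWebhookDelivery(
  redirectTo: string | undefined,
): DeliveryOutcome {
  if (redirectTo) {
    // recorded on the delivery row so the skip stays visible in the listing
    return {
      kind: "dead_letter",
      reason: `EMAIL_REDIRECT_TO is set (${redirectTo}); outbound paused`,
    };
  }
  return { kind: "deliver" };
}
```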
### 4. WhatsApp / phone deep-links

**Path:** `<a href="https://wa.me/...">` and `<a href="tel:...">` in client / interest detail headers.

**Safety:** N/A — user-initiated only.

These are deep links the user explicitly clicks. No automated dispatch. A deep-link click opens the user's WhatsApp / phone app, which is the intended interaction. No safety net needed.

### 5. SMS

Not implemented. The `interests.preferredContactMethod` enum includes `'sms'` as a value, but no sending path exists. If/when SMS is added (e.g. via Twilio), the new send function should respect `EMAIL_REDIRECT_TO` the same way `sendEmail` does — log the original number, drop the message, or reroute to a configurable `SMS_REDIRECT_TO` env var.

---

## Verification checklist before importing real data

- [ ] `.env` has `EMAIL_REDIRECT_TO=<my-address>` set.
- [ ] Restart dev server (or worker) so the new env is picked up — env vars are read at import time in some paths.
- [ ] Send a test email via `pnpm tsx scripts/dev-trigger-portal-invite.ts` or similar. Confirm the subject is prefixed with `[redirected from ...]`.
- [ ] Trigger an EOI send through the UI (any client). Confirm Documenso shows the redirect address as recipient (not the real client email).
- [ ] If any webhooks are configured, trigger an event that fires one and confirm the delivery is recorded as `dead_letter` with the "EMAIL_REDIRECT_TO is set" reason.
- [ ] Run the NocoDB migration `--dry-run` to count clients/interests; the `--apply` step is what creates real records, but emails/webhooks are still gated by the redirect env.

## Production cutover

When ready to go live:

1. Run a final dry-run of the data migration with `EMAIL_REDIRECT_TO` set to a sandbox address.
2. Verify the snapshot looks right (counts, client coverage).
3. Unset `EMAIL_REDIRECT_TO` in the production env.
4. Restart the app + worker.
5. Run the migration with `--apply`. From this point forward, real recipients will receive real comms.

If you ever need to re-pause outbound (e.g. handling a security incident, re-importing on top of existing data), set `EMAIL_REDIRECT_TO` again.
`docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md` (new file, 564 lines)

@@ -0,0 +1,564 @@
# Client Deduplication and NocoDB Migration Design

**Status**: Design draft 2026-05-03 — pending approval.
**Plan decomposition**: Three implementation plans stack from this design — (P1) normalization + dedup core library; (P2) admin settings + at-create + interest-level guards (runtime); (P3) NocoDB migration script + review queue UI. P1 unblocks P2 and P3.
**Branch base**: stacks on `feat/mobile-foundation` once it merges to `main`.
**Out of scope**: live merge of two clients across ports (cross-tenant), automated AI-judged matches, profile-photo / face-match dedup, web-of-trust referrer relationships.

---

## 1. Background

### 1.1 Why this exists

The legacy CRM lives in a NocoDB base whose `Interests` table conflates _the human_ with _the deal_. A row contains `Full Name`, `Email Address`, `Phone Number`, `Address`, `Place of Residence` _and_ the sales-pipeline state for one specific berth. A single human pursuing two berths becomes two rows with semi-duplicated personal data. A 2026-05-03 read-only audit confirmed:

- **252 Interests rows** in NocoDB, against an estimated ~190–200 unique humans (~20–25% duplication rate).
- **35 Residential Interests rows** in a parallel residential pathway with the same conflation.
- **64 Website Interest Submissions + 47 Website Contact Form Submissions + 1 EOI Supplemental Form** as inbound capture surfaces.
- **No Clients table.** The conflation is structural, not accidental.

The new CRM (`src/lib/db/schema/clients.ts`) splits this into `clients` (people) ↔ `interests` (deals), with `clientContacts` (multi-channel), `clientAddresses` (multi-address), and a pre-existing `clientMergeLog` table that anticipates merge with undo. The target schema is already in place; what's missing is (a) a normalization + matching library, (b) the at-create / at-import surfaces that use it, and (c) the migration of the existing 252 + 35 records.

### 1.2 Real duplicate patterns observed in the live data

Sampled 200 of the 252 NocoDB Interests rows. Confirmed duplicate clusters fall into six patterns:

| Pattern | Example rows | Signature |
| --- | --- | --- |
| **A. Pure double-submit** | Deepak Ramchandani #624/#625; John Lynch #716/#725 | All fields identical; created same day |
| **B. Phone format variance** | Howard Wiarda #236/#536 (`574-274-0548` vs `+15742740548`); Christophe Zasso #701/#702 (`0651381036` vs `0033651381036`) | Same email, normalize-equal phone |
| **C. Name capitalization** | Nicolas Ruiz #681/#682/#683; Jean-Charles Miege/MIEGE #37/#163; John Farmer/FARMER #35/#161 | Same email or empty; surname case differs |
| **D. Name shortening** | Chris vs Christopher Allen #700/#534; Emma c vs Emma Cauchefer #661/#673 | Same email + phone; given name truncated |
| **E. Resubmit with typo** | Christopher Camazou #649/#650 (phone last 4 digits typo); Gianfranco Di Constanzo/Costanzo #585/#336 (surname typo, **different yacht** — should be ONE client + TWO interests) | Everything else scores high; one field carries small-edit-distance noise |
| **F. Hard cases** | Etiennette Clamouze #188/#717 (same name, different country phone + email); Bruno Joyerot #18 with email belonging to Bruce Hearn #19 (couple sharing contact) | Cannot resolve without a human |

This dataset will be the fixture for the dedup library's tests — every pattern above must be either auto-detected or flagged for review, and the false-positive bar must be high enough that Pattern F doesn't get force-merged.

### 1.3 Dirty data inventory

The migration normalizer must survive these real values from production:

**Phone fields**: `+1-264-235-8840\r` (with carriage return), `'+1.214.603.4235` (apostrophe + dots), `0677580750/0690511494` (two numbers in one field), `00447956657022` (00 prefix), `+447000000000` (placeholder all-zeros), `+4901637039672` (impossible — the local leading 0 was kept after the country prefix), various unprefixed local formats, dashed US numbers without country code.

**Email fields**: mixed case rampant (`Arthur@laser-align.com` vs `arthur@laser-align.com`); ALL-CAPS local parts; trailing whitespace.

**Name fields**: ALL-CAPS surnames mixed with title-case given names; embedded `\n` and `\r`; double spaces; lowercase-only entries; slash-with-company variants (`Daniel Wainstein / 7 Knots, LLC`, `Bruno Joyerot / SAS TIKI`); placeholders `Mr DADER`, `TBC`.

**Place of Residence (free text)**: `Saint barthelemy`, `St Barth`, `Saint-Barthélemy` (same place, three forms); `anguilla`; `United States ` (trailing space); `USA`; `Kansas City` (city without country); `Sag Harbor Y` (typo).

### 1.4 Existing battle-tested algorithm

`client-portal/server/utils/duplicate-detection.ts` already implements blocking + weighted-rules dedup against this same NocoDB. It runs in production today. We **port it forward** (don't reinvent), then add: soundex/metaphone for surname matching, compounded confidence when multiple rules match, and negative evidence (same email + different-country phone reduces confidence).

### 1.5 Why the website is no longer the source of new dirty data

The website forms (`website/components/pn/specific/website/{berths-item,register,form}/form.vue`) use `<v-phone-input>` with a country picker (`prefer-countries: ['US', 'GB', 'DE', 'FR']`) and `[(value) => !!value || 'Phone number is required']` validation. Output is E.164-shaped. The 252 dirty rows are legacy — pre-form-redesign submissions, sales-rep manual entries, and external CSV imports. Future inbound is clean.
---

## 2. Approach

Three artifacts, layered:

1. **A pure-logic normalization + matching library** at `src/lib/dedup/`. JSX-free, vitest-native (proven pattern: `realtime-invalidation-core.ts`). Tested against the dirty-data fixture corpus drawn from §1.2.
2. **Three runtime surfaces** that use the library: at-create suggestion in client/interest forms; interest-level same-berth guard; admin review queue powered by a nightly background scoring job.
3. **A one-shot migration script** that pulls NocoDB → normalizes → dedupes → writes the new schema → produces a CSV report with an auto-merge log + a flagged-for-review pile.

**Configurability via admin settings** (`system_settings` per port) so the team can tune sensitivity without code changes. Defaults err on the safe side — a flagged review is cheaper than a wrongly merged record.

**Reversibility**: every merge writes a `client_merge_log` row containing the loser's full pre-state JSON. A 7-day undo window lets a wrong merge be reversed without engineering involvement. After 7 days the snapshot is purged for GDPR; merges become permanent.

---
## 3. Normalization library

Lives at `src/lib/dedup/normalize.ts`. Pure functions, no DB, vitest-tested. Used by the dedup algorithm AND by all create paths, so what gets stored is already normalized.

### 3.1 `normalizeName(raw: string)`

```ts
export function normalizeName(raw: string): {
  display: string; // human-readable, kept for UI
  normalized: string; // for matching
  surnameToken?: string; // for surname-based blocking
};
```

- Trim leading/trailing whitespace
- Replace `\r`, `\n`, tabs with a single space
- Collapse consecutive whitespace to a single space
- Smart title-case: keep particles (`van`, `de`, `del`, `O'`, `di`, `le`, `da`) lowercase except as the first token
- `display` preserves the user's intent (slash-with-company stays intact)
- `normalized` is `display.toLowerCase()` for comparison
- `surnameToken` is the last non-particle token, used for blocking
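A minimal sketch of `normalizeName` per the bullets above. Assumptions: the particle list is abbreviated, the `O'` prefix and all-caps acronym preservation are omitted, and naive per-token title-casing stands in for the "smart" variant.

```typescript
// Simplified sketch of normalizeName. Particle list and title-casing are
// illustrative; the real implementation handles O'-prefixes and acronyms.
const PARTICLES = new Set(["van", "de", "del", "di", "le", "da"]);

export function normalizeName(raw: string): {
  display: string;
  normalized: string;
  surnameToken?: string;
} {
  // \r, \n, tabs → space; collapse runs; trim
  const cleaned = raw.replace(/[\r\n\t]/g, " ").replace(/\s+/g, " ").trim();
  const tokens = cleaned.split(" ").map((tok, i) => {
    const lower = tok.toLowerCase();
    // particles stay lowercase except as the first token
    if (i > 0 && PARTICLES.has(lower)) return lower;
    return lower.charAt(0).toUpperCase() + lower.slice(1);
  });
  const display = tokens.join(" ");
  // blocking key: last token that is not a particle
  const surname = [...tokens].reverse().find((t) => !PARTICLES.has(t.toLowerCase()));
  return {
    display,
    normalized: display.toLowerCase(),
    surnameToken: surname?.toLowerCase(),
  };
}
```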
### 3.2 `normalizeEmail(raw: string)`

```ts
export function normalizeEmail(raw: string): string | null;
```

- Trim + lowercase
- Validate via `zod`'s email schema
- Returns `null` for empty / invalid (the caller decides what to do)
- **Does NOT strip plus-aliases** (`user+tag@domain.com`) — these can be genuinely distinct addresses, and stripping them would invite wrong merges. Compare by full local part.
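A sketch of the email normalizer, substituting a simple regex where the design specifies zod's email validator:

```typescript
// Sketch of normalizeEmail. The real implementation validates with zod;
// the regex here is a coarse stand-in for illustration.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

export function normalizeEmail(raw: string): string | null {
  const cleaned = raw.trim().toLowerCase();
  // plus-aliases are deliberately preserved: user+tag@x.com stays distinct
  return EMAIL_RE.test(cleaned) ? cleaned : null;
}
```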
### 3.3 `normalizePhone(raw: string, defaultCountry: string)`

```ts
export function normalizePhone(
  raw: string,
  defaultCountry: string,
): {
  e164: string | null; // canonical, e.g. '+15742740548'
  country: string | null; // ISO-3166-1 alpha-2
  display: string | null; // user-facing pretty form
  flagged?: 'multi_number' | 'placeholder' | 'unparseable';
} | null;
```

Pipeline:

1. Strip `\r`, `\n`, tabs, single quotes, dots, dashes, parens, spaces
2. If the value contains `/`, `;`, or `,` → flag `multi_number`, take the first segment
3. If it matches `+\d{2}0+$` (e.g. `+447000000000`) → flag `placeholder`, return null
4. If it starts with `00` → replace with `+`
5. If it starts with `+` → parse as E.164
6. Else, if `defaultCountry` is provided → parse against that country
7. Else return null (the caller's problem)

Backed by `libphonenumber-js` (believed to be in deps already via `tests/integration/factories.ts`; if not, we'll add it). The hostile cases above all need explicit handling — a naïve regex won't survive.
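The pre-parse steps (1–4) can be sketched as a pure function; steps 5–6, the actual E.164 parsing, are delegated to `libphonenumber-js` in the design and omitted here. The trailing-zeros placeholder heuristic is an illustrative stand-in for the spec's pattern.

```typescript
// Sketch of normalizePhone steps 1–4. E.164 parsing (steps 5–6) is left to
// libphonenumber-js; names and the placeholder heuristic are illustrative.
export function preprocessPhone(raw: string): {
  cleaned: string | null;
  flagged?: "multi_number" | "placeholder";
} {
  // 1. strip \r, \n, tabs, quotes, dots, dashes, parens, spaces
  let s = raw.replace(/[\r\n\t'.()\s-]/g, "");
  let flagged: "multi_number" | "placeholder" | undefined;

  // 2. two numbers crammed into one field → take the first, flag it
  if (/[/;,]/.test(s)) {
    flagged = "multi_number";
    s = s.split(/[/;,]/)[0];
  }

  // 4. international 00 prefix → '+'
  if (s.startsWith("00")) s = "+" + s.slice(2);

  // 3. placeholder (e.g. +447000000000): long trailing run of zeros
  if (/^\+\d+$/.test(s) && /0{7,}$/.test(s)) {
    return { cleaned: null, flagged: "placeholder" };
  }

  return { cleaned: s, flagged };
}
```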
### 3.4 `resolveCountry(text: string)`

```ts
export function resolveCountry(text: string): {
  iso: string | null; // ISO-3166-1 alpha-2
  confidence: 'exact' | 'fuzzy' | 'city' | null;
};
```

Reuses `src/lib/i18n/countries.ts`. Pipeline, in resolution order (exact → city → fuzzy):

1. Lowercase + strip diacritics
2. Exact match against country names (any locale we ship)
3. City fallback — a small in-package mapping for high-frequency cities seen in the legacy data (`Sag Harbor → US`, `Kansas City → US`, `St Barth → BL`, etc.)
4. Fuzzy match (Levenshtein ≤ 2 against canonical English names)

The mapping is opinionated and small (~30 entries covering the actual values seen in the 252-row dataset). Anything that fails to resolve returns `null` and lands in the migration's flagged pile.
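A sketch of the exact + city steps with toy lookup tables (the real tables come from `src/lib/i18n/countries.ts` and the ~30-entry city list); the fuzzy Levenshtein step is omitted:

```typescript
// Sketch of resolveCountry. Lookup tables are toy fixtures; the fuzzy
// (Levenshtein ≤ 2) step is omitted for brevity.
const COUNTRY_NAMES: Record<string, string> = {
  "united states": "US",
  usa: "US",
  anguilla: "AI",
  "saint barthelemy": "BL",
  "saint-barthelemy": "BL",
};
const CITY_FALLBACK: Record<string, string> = {
  "sag harbor": "US",
  "kansas city": "US",
  "st barth": "BL",
};

function fold(text: string): string {
  // lowercase + strip diacritics (é → e) + collapse whitespace
  return text
    .normalize("NFD")
    .replace(/[\u0300-\u036f]/g, "")
    .toLowerCase()
    .trim()
    .replace(/\s+/g, " ");
}

export function resolveCountry(text: string): {
  iso: string | null;
  confidence: "exact" | "fuzzy" | "city" | null;
} {
  const key = fold(text);
  if (COUNTRY_NAMES[key]) return { iso: COUNTRY_NAMES[key], confidence: "exact" };
  if (CITY_FALLBACK[key]) return { iso: CITY_FALLBACK[key], confidence: "city" };
  // fuzzy step would run here; unresolved values land in the flagged pile
  return { iso: null, confidence: null };
}
```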
---

## 4. Dedup algorithm

Lives at `src/lib/dedup/find-matches.ts`. Pure function. Vitest-tested against the §1.2 cluster fixtures.

### 4.1 Public API

```ts
export interface MatchCandidate {
  id: string;
  fullName: string | null;
  emails: string[]; // already normalized
  phonesE164: string[]; // already normalized E.164
  countryIso: string | null;
}

export interface MatchResult {
  candidate: MatchCandidate;
  score: number; // 0–100
  reasons: string[]; // human-readable, e.g. ["email match", "phone match"]
  confidence: 'high' | 'medium' | 'low';
}

export function findClientMatches(
  input: MatchCandidate,
  pool: MatchCandidate[],
  thresholds: DedupThresholds,
): MatchResult[];
```

### 4.2 Scoring rules (compound)

Each rule produces a score addition. **Compounding**: when two strong rules match (e.g. email AND phone), the result is ~95+ rather than max(50, 50). Negative evidence subtracts.

| Rule | Score | Notes |
| --- | --- | --- |
| Exact email match (case-insensitive, normalized) | +60 | One match suffices |
| Exact phone E.164 match (≥ 8 significant digits) | +50 | Excludes placeholder all-zeros |
| Exact normalized full-name match | +20 | Many "John Smith"s exist |
| Surname soundex match + given-name fuzzy match (Lev ≤ 1) | +15 | Catches `Constanzo/Costanzo`, `Christophe/Christopher` |
| Same address (normalized fuzzy ≥ 0.8) | +10 | Bonus signal |
| **Negative**: same email but different country code on phone | −15 | Suggests spouse / coworker / shared inbox |
| **Negative**: same name but DIFFERENT email AND DIFFERENT phone | −20 | Two distinct people with the same name |
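The additive compounding can be sketched as a weighted sum clamped to 0–100. The rule detections themselves (soundex, fuzzy address) live elsewhere, so this sketch takes pre-computed booleans; the `RuleHits` shape is an assumption.

```typescript
// Sketch of the compound scorer from the table above. Rule detection is
// assumed done upstream; RuleHits and scoreMatch are illustrative names.
export interface RuleHits {
  emailMatch: boolean;
  phoneMatch: boolean;
  exactNameMatch: boolean;
  surnameSoundexGivenFuzzy: boolean;
  addressFuzzyMatch: boolean;
  sameEmailDifferentPhoneCountry: boolean;
  sameNameDifferentEmailAndPhone: boolean;
}

export function scoreMatch(h: RuleHits): { score: number; reasons: string[] } {
  const rules: Array<[keyof RuleHits, number, string]> = [
    ["emailMatch", 60, "email match"],
    ["phoneMatch", 50, "phone match"],
    ["exactNameMatch", 20, "exact name match"],
    ["surnameSoundexGivenFuzzy", 15, "surname soundex + fuzzy given name"],
    ["addressFuzzyMatch", 10, "address match"],
    ["sameEmailDifferentPhoneCountry", -15, "same email, different phone country"],
    ["sameNameDifferentEmailAndPhone", -20, "same name, different email and phone"],
  ];
  let score = 0;
  const reasons: string[] = [];
  for (const [key, weight, reason] of rules) {
    if (h[key]) {
      score += weight;
      reasons.push(reason);
    }
  }
  // additive compounding: email + phone = 110, clamped into the 0–100 range
  return { score: Math.max(0, Math.min(100, score)), reasons };
}
```

Note how the Etiennette-style case (email + name, different phone country) lands at 60 + 20 − 15 = 65, i.e. `medium`, matching §4.3.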
### 4.3 Confidence tiers (post-compound)

- **score ≥ 90 — `high`** — email AND phone match, or email + name + address. Block-create and suggest "Use existing." Auto-link on public-form submit by default.
- **score 50–89 — `medium`** — a single strong signal (email or phone alone), or email + same name + different country (the Etiennette case). Soft-warn but allow.
- **score < 50 — `low`** — weak signals only. Don't surface in the UI; only relevant in the background-job review queue.

### 4.4 Blocking strategy

To avoid comparing an incoming candidate against all N existing clients, build three lookup maps once per scan:

- `byEmail: Map<string, MatchCandidate[]>` — keyed by normalized email
- `byPhoneE164: Map<string, MatchCandidate[]>` — keyed by E.164
- `bySurnameToken: Map<string, MatchCandidate[]>` — keyed by `normalizeName(...).surnameToken`

For an incoming `MatchCandidate`, the candidate set to compare is the union of pool entries reachable through any of its emails / phones / surname token. Typically 0–5 candidates per query, regardless of N.
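The blocking index can be sketched as a build-once closure. The `MatchCandidate` here is restated (trimmed) for self-containment, with `surnameToken` assumed precomputed via `normalizeName`:

```typescript
// Sketch of the blocking strategy: build three maps once, then each query
// touches only candidates sharing an email, phone, or surname token.
interface MatchCandidate {
  id: string;
  fullName: string | null;
  emails: string[];
  phonesE164: string[];
  countryIso: string | null;
  surnameToken?: string; // assumed precomputed via normalizeName
}

function push(map: Map<string, MatchCandidate[]>, key: string, c: MatchCandidate) {
  const arr = map.get(key) ?? [];
  arr.push(c);
  map.set(key, arr);
}

export function buildBlocks(pool: MatchCandidate[]) {
  const byEmail = new Map<string, MatchCandidate[]>();
  const byPhone = new Map<string, MatchCandidate[]>();
  const bySurname = new Map<string, MatchCandidate[]>();
  for (const c of pool) {
    for (const e of c.emails) push(byEmail, e, c);
    for (const p of c.phonesE164) push(byPhone, p, c);
    if (c.surnameToken) push(bySurname, c.surnameToken, c);
  }
  // candidate set = union of entries reachable through any blocking key
  return (input: MatchCandidate): MatchCandidate[] => {
    const seen = new Map<string, MatchCandidate>();
    const collect = (cs?: MatchCandidate[]) => {
      cs?.forEach((c) => {
        if (c.id !== input.id) seen.set(c.id, c);
      });
    };
    input.emails.forEach((e) => collect(byEmail.get(e)));
    input.phonesE164.forEach((p) => collect(byPhone.get(p)));
    if (input.surnameToken) collect(bySurname.get(input.surnameToken));
    return [...seen.values()];
  };
}
```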
### 4.5 Performance budget

For the migration: 252 rows compared pairwise once — roughly 32k comparisons even without blocking, far fewer with it. A few seconds.

For runtime at-create: the incoming candidate against the existing pool of N clients per port. Expected pool size at maturity: 1k–10k. With blocking: <10 comparisons, <1 ms target. No DB query needed beyond the initial pool fetch (which itself uses the indexed columns).

For the background nightly job: full pairwise within a port, blocked. 10k clients → ~50k pairwise checks per port → <30 s. Fine for a nightly cron.

---

## 5. Configurable thresholds (admin settings)

New rows in `system_settings` per port. Default values err safe (more confirmation, less auto-action).

| Key | Default | Effect |
| --- | --- | --- |
| `dedup_block_create_threshold` | `90` | Score above which the client-create form interrupts: "Use existing client?" |
| `dedup_soft_warn_threshold` | `50` | Score above which a soft-warn panel surfaces below the form |
| `dedup_review_queue_threshold` | `40` | The background job lands pairs ≥ this score in `/admin/duplicates` |
| `dedup_public_form_auto_link` | `true` | When a public-form submission scores ≥ the block threshold against an existing client, attach the new interest to that client without prompting. **Safe**: no merge, just attaching a deal. |
| `dedup_auto_merge_threshold` | `null` (disabled) | If non-null, merges happen automatically at this threshold without human confirmation. Recommend leaving it null until the team is comfortable; `95` is a reasonable cautious value. |
| `dedup_undo_window_days` | `7` | How long the loser's pre-state JSON is retained for merge-undo. After this, the snapshot is purged (GDPR) and merges are permanent. |

Each setting is a row in `system_settings`. UI surface in `/[portSlug]/admin/dedup` (a new admin page) with an "Advanced" toggle exposing the thresholds with brief explanations.

If the sales team finds the safer mode too click-heavy, an admin flips `dedup_auto_merge_threshold` to `95` without any code change.

---
## 6. Merge service contract

### 6.1 Data flow

`mergeClients(winnerId, loserId, fieldChoices, ctx)` does, in a single transaction:

1. **Snapshot loser** — the full row + all attached `clientContacts`, `clientAddresses`, `clientNotes`, `clientTags`, plus a count of dependent rows about to be moved (interests, yacht memberships, etc.). Stored as `mergeDetails` JSONB in `clientMergeLog`.
2. **Reattach** — every row pointing at `loserId` updates to point at `winnerId`:
   - `interests.clientId`
   - `clientContacts.clientId` — with conflict handling: if the winner already has the same email, keep the winner's; flag the duplicate for the user
   - `clientAddresses.clientId` — same conflict handling
   - `clientNotes.clientId` — preserve `authorId` + `createdAt` (never overwrite)
   - `clientTags.clientId`
   - `clientYachtMembership.clientId` (or whatever the table is called)
   - `auditLogs.entityId` — annotate, don't move (audit truth)
3. **Apply fieldChoices** — for each field where the user picked the loser's value, copy that into the winner row.
4. **Soft-archive loser** — `loser.archivedAt = now()`, `loser.mergedIntoClientId = winnerId`. The row stays in the DB so the merge is reversible.
5. **Write `clientMergeLog`** — `{ winnerId, loserId, mergedBy, mergedAt, mergeDetails: <snapshot>, fieldChoices }`.
6. **Audit log** — top-level `auditLogs` row: `{ action: 'merge', entityType: 'client', entityId: winnerId, metadata: { loserId, score, reasons } }`.

### 6.2 Schema additions (migration)

`clients` table gets a new column:

```ts
mergedIntoClientId: text('merged_into_client_id').references(() => clients.id),
```

The existing `clientMergeLog` table is reused. Add an index for the undo-window query (note: Postgres rejects a partial-index predicate like `WHERE created_at > NOW() - INTERVAL '7 days'` because `NOW()` is not `IMMUTABLE`, so the 7-day filter lives in the query instead):

```sql
CREATE INDEX idx_cml_recent ON client_merge_log (port_id, created_at DESC);
```

A daily maintenance job (using the existing `maintenance-cleanup.test.ts` infrastructure) purges `mergeDetails` JSONB older than the `dedup_undo_window_days` setting.

### 6.3 Undo

`unmergeClients(mergeLogId, ctx)`:

1. Within the undo window, look up the snapshot
2. Restore the loser: clear `archivedAt`, `mergedIntoClientId`
3. Restore the loser's contacts/addresses/notes/tags from the snapshot
4. Detach reattached rows: `interests` etc. that were moved to `winnerId` and originally belonged to the loser go back. The snapshot stores the original `(rowType, rowId)` list explicitly, so this is deterministic.
5. Mark the log row `undoneAt = now()`, `undoneBy = userId`

After 7 days the snapshot is gone and unmerge returns `410 Gone`.
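The undo-window boundary that `unmergeClients` enforces is pure date math and can be unit-tested without a DB. A sketch under assumed names:

```typescript
// Sketch of the undo-window check performed before any restore work.
// Function name and return shape are illustrative.
export function undoEligibility(
  mergedAt: Date,
  now: Date,
  undoWindowDays = 7,
): { allowed: boolean; httpStatus: 200 | 410 } {
  const ageMs = now.getTime() - mergedAt.getTime();
  const windowMs = undoWindowDays * 24 * 60 * 60 * 1000;
  // past the window the snapshot has been purged (GDPR) → 410 Gone
  return ageMs <= windowMs
    ? { allowed: true, httpStatus: 200 }
    : { allowed: false, httpStatus: 410 };
}
```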
### 6.4 Concurrency

Both merge and unmerge wrap in a single transaction with `SELECT … FOR UPDATE` on `clients.id` for both winner and loser. A second merge attempt against the same loser sees `mergedIntoClientId` already set and refuses (clear error: "Already merged into …").

---

## 7. Runtime surfaces

### 7.1 Layer 1 — At-create suggestion

In `ClientForm` (and the public `register` form once that hits the new system):

- Debounced 300 ms after the email or phone field changes
- Calls `findClientMatches` against the current port's clients
- Renders the top-1 match if score ≥ `dedup_soft_warn_threshold`:

```
┌─────────────────────────────────────┐
│ This looks like an existing client  │
│ ML Marcus Laurent                   │
│ marcus@…  +33 6 12 34 56 78         │
│ 2 interests · last 9d ago           │
│ [ Use this client ] [ Create new ]  │
└─────────────────────────────────────┘
```

- "Use this client" → the form switches to "create new interest under existing client" mode (preserving whatever other fields the user typed)
- "Create new" → audit-log `dedup_override` with the candidate's id and reasons (so we have data on false positives)

### 7.2 Layer 2 — Interest-level same-berth guard

A cheap one-liner in the `createInterest` service:

- Check `(clientId, berthId)` against existing non-archived interests
- On a hit, throw `BerthDuplicateError` with the existing interest's details
- The UI catches it and prompts: "Update existing or create separate?"

This is NOT the same as client-level dedup. The same client can legitimately pursue the same berth a second time after it falls through. But the prompt-before-create catches the accidental double-submit case.

### 7.3 Layer 3 — Background scoring + review queue

- A nightly cron (using the existing BullMQ infrastructure — search for `scheduled-tasks` in the repo) runs `findClientMatches` over each port's full client pool
- Pairs scoring ≥ `dedup_review_queue_threshold` land in a `client_merge_candidates` table:

```ts
export const clientMergeCandidates = pgTable('client_merge_candidates', {
  id: text('id').primaryKey()...,
  portId: text('port_id').notNull()...,
  clientAId: text('client_a_id').notNull()...,
  clientBId: text('client_b_id').notNull()...,
  score: integer('score').notNull(),
  reasons: jsonb('reasons').notNull(),
  status: text('status').notNull().default('pending'), // pending | dismissed | merged
  createdAt: timestamp('created_at')...,
  resolvedAt: timestamp('resolved_at'),
  resolvedBy: text('resolved_by'),
})
```

- `/[portSlug]/admin/duplicates` lists pending candidates sorted by score descending, with `[Review →]` opening a side-by-side merge dialog
- Dismissing a candidate marks it `status=dismissed` so the job doesn't re-surface the same pair tomorrow (a future score increase re-creates it).

---
|
||||
|
||||
## 8. NocoDB → new system field mapping
|
||||
|
||||
This is the explicit mapping the migration script applies. One NocoDB Interest row produces multiple new rows.
|
||||
|
||||
### 8.1 Top-level transform
|
||||
|
||||
```
|
||||
NocoDB Interests row
|
||||
─→ 0–1 client (deduped against existing pool)
|
||||
─→ 0–1 client_address
|
||||
─→ 0–2 client_contacts (email, phone)
|
||||
─→ exactly 1 interest
|
||||
─→ 0–1 yacht (when Yacht Name present and not "TBC"/"Na"/empty placeholders)
|
||||
─→ 0–1 document (when documensoID present)
|
||||
```
### 8.2 Field map

| NocoDB field | Target | Transform |
| --- | --- | --- |
| `Full Name` | `clients.fullName` | `normalizeName().display` |
| `Email Address` | `clientContacts(channel='email', value=...)` | `normalizeEmail()` |
| `Phone Number` | `clientContacts(channel='phone', valueE164=..., valueCountry=...)` | `normalizePhone(raw, defaultCountry)` |
| `Address` | `clientAddresses.streetAddress` (LongText preserved) | trim |
| `Place of Residence` | `clientAddresses.countryIso` AND `clients.nationalityIso` | `resolveCountry()` |
| `Contact Method Preferred` | `clients.preferredContactMethod` | lowercase, mapped: Email→email, Phone→phone |
| `Source` | `clients.source` | mapped: portal→website, Form→website, External→manual; null → manual |
| `Date Added` | `interests.createdAt` (fallback to NocoDB `Created At`, then now) | parse: try `DD-MM-YYYY`, then `YYYY-MM-DD`, then ISO |
| `Sales Process Level` | `interests.pipelineStage` | see §8.3 |
| `Lead Category` | `interests.leadCategory` | General→general_interest, Friends and Family→general_interest with tag |
| `Berth` (FK) | `interests.berthId` | resolve via `Berths` table by `Mooring Number` |
| `Berth Size Desired` | `interests.notes` (appended) | preserve |
| `Yacht Name`, `Length`, `Width`, `Depth` | `yachts.name`, `lengthM`, `widthM`, `draughtM` | skip if name in {`TBC`, `Na`, empty, null}; ft→m via `* 0.3048` |
| `EOI Status` | `interests.eoiStatus` | Awaiting Further Details→pending; Waiting for Signatures→sent; Signed→signed |
| `Deposit 10% Status` | `interests.depositStatus` | Pending→pending; Received→received |
| `Contract Status` | `interests.contractStatus` | Pending→pending; 40% Received→partial; Complete→complete |
| `EOI Time Sent` | `interests.dateEoiSent` | parse |
| `clientSignTime` / `developerSignTime` / `all_signed_notified_at` | `interests.dateEoiSigned` (use latest) | parse |
| `Time LOI Sent` | `interests.dateContractSent` | parse |
| `Internal Notes` + `Extra Comments` | `clientNotes` (one row, system author) | concatenate with section markers |
| `documensoID` | `documents.documensoId` (when present, type='eoi') | preserve |
| `Signature Link Client/CC/Developer`, `EmbeddedSignature*` | `documents.signers[]` | one row per non-null signer |
| `reminder_enabled`, `last_reminder_sent`, etc. | `interests.reminderEnabled`, `interests.reminderLastFired` | parse, default true |
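The placeholder guard and ft→m conversion described in the yacht row can be sketched as follows. This is an illustrative sketch only: `toYachtRow`, `ftToM`, and `YACHT_NAME_PLACEHOLDERS` are hypothetical names, not the migration script's actual API.

```typescript
// Hypothetical sketch of the yacht-row guard + unit conversion from §8.2.
const YACHT_NAME_PLACEHOLDERS = new Set(['tbc', 'na', '']);

// ft → m, rounded to 2 decimal places; null passes through untouched.
const ftToM = (ft: number | null): number | null =>
  ft == null ? null : Math.round(ft * 0.3048 * 100) / 100;

function toYachtRow(src: {
  name: string | null;
  lengthFt: number | null;
  widthFt: number | null;
  depthFt: number | null;
}): { name: string; lengthM: number | null; widthM: number | null; draughtM: number | null } | null {
  const name = (src.name ?? '').trim();
  // Skip placeholder names: no yacht row is created for TBC / Na / empty.
  if (YACHT_NAME_PLACEHOLDERS.has(name.toLowerCase())) return null;
  return {
    name,
    lengthM: ftToM(src.lengthFt),
    widthM: ftToM(src.widthFt),
    draughtM: ftToM(src.depthFt),
  };
}
```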
### 8.3 Sales-stage mapping (8 → 9)

| NocoDB | New (PIPELINE_STAGES) |
| --- | --- |
| General Qualified Interest | `open` |
| Specific Qualified Interest | `details_sent` |
| EOI and NDA Sent | `eoi_sent` |
| Signed EOI and NDA | `eoi_signed` |
| Made Reservation | `deposit_10pct` |
| Contract Negotiation | `contract_sent` |
| Contract Negotiations Finalized | `contract_sent` (with audit-note: legacy "negotiations finalized") |
| Contract Signed | `contract_signed` (or `completed` when deposit + contract both complete) |
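The stage mapping above is a straight lookup plus one conditional (the "Contract Signed → completed" escalation). A minimal sketch, assuming the legacy label arrives as a plain string; the function name is illustrative, and the fallback to `open` for unknown labels is an assumption, not something the spec states.

```typescript
// Hypothetical sketch of the §8.3 mapping; not the script's actual code.
const STAGE_MAP: Record<string, string> = {
  'General Qualified Interest': 'open',
  'Specific Qualified Interest': 'details_sent',
  'EOI and NDA Sent': 'eoi_sent',
  'Signed EOI and NDA': 'eoi_signed',
  'Made Reservation': 'deposit_10pct',
  'Contract Negotiation': 'contract_sent',
  'Contract Negotiations Finalized': 'contract_sent', // plus audit-note in the real pipeline
  'Contract Signed': 'contract_signed',
};

function mapPipelineStage(
  legacy: string,
  depositComplete: boolean,
  contractComplete: boolean,
): string {
  // Contract Signed escalates to `completed` only when both statuses are complete.
  if (legacy === 'Contract Signed' && depositComplete && contractComplete) return 'completed';
  return STAGE_MAP[legacy] ?? 'open'; // unknown legacy labels fall back to `open` (assumption)
}
```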
### 8.4 Other tables

- **Residential Interests** (35 rows) — same shape as Interests but maps to `residentialClients` + `residentialInterests`. Smaller and cleaner; the same dedup runs within this pool independently.
- **Website - Interest Submissions** (64 rows) — these are **inbound capture, not yet a client**. Treat each row as a fresh public-form submission arriving today: run dedup against the migrated client pool, and auto-link if the `dedup_public_form_auto_link` setting allows.
- **Website - Contact Form Submissions** (47 rows) — sparse data (just name + email + interest type). Skip migration; export as CSV for manual triage. Not the source of truth for any deal.
- **Website - Berth EOI Details Supplements** (1 row) — single record, preserved as a one-off attached to the matching Interest.
- **Newsletter Sending** (69 rows) — out of scope; that's a marketing surface, not CRM.
- **Interests Backup, Interests copy** — historical artifacts, skipped by default. An `--include-backups` flag attaches them as audit-note entries on the corresponding live Interest if the user wants the history.
---

## 9. Migration script

Located at `scripts/migrate-from-nocodb.ts`. Idempotent: safe to re-run. Three main modes:

```
$ pnpm tsx scripts/migrate-from-nocodb.ts --dry-run [--port-slug X]
    Pulls everything, transforms, runs dedup, writes a CSV report to
    .migration/<timestamp>/. No DB writes.

$ pnpm tsx scripts/migrate-from-nocodb.ts --apply --report .migration/<timestamp>/
    Reads the report and performs the writes the dry-run promised. Refuses
    to run if the source data has changed since the report was generated
    (hash mismatch).

$ pnpm tsx scripts/migrate-from-nocodb.ts --rollback --apply-id <id>
    Reads the apply log and undoes the writes (only valid within the undo
    window).
```

Reuses the `client-portal/server/utils/nocodb.ts` adapter for the NocoDB API client, so there is no need to rebuild it. Writes to the new system via Drizzle, reusing the existing services (`createClient`, `createInterest`, etc.) so all the same validation runs.
### 9.1 Dry-run report format

`.migration/<timestamp>/report.csv`:

```csv
op,reason,nocodb_row_id,target_table,target_value,confidence,manual_review_required
create_client,new,624,clients.fullName,Deepak Ramchandani,N/A,false
create_contact,new,624,clientContacts.email,dannyrams8888@gmail.com,N/A,false
create_contact,new,624,clientContacts.phone,+17215868888,N/A,false
create_interest,new,624,interests.berthId,a1b2c3...,N/A,false
auto_link,score=98 (email+phone),625,clients.id,<existing client UUID from row 624>,high,false
flag_for_review,score=72 (same name diff country),188,client.id,<existing client UUID from row 717>,medium,true
country_unresolved,fallback to AI (port country),198,clientAddresses.countryIso,AI,low,true
phone_unparseable,placeholder all-zeros,641,clientContacts.phone,<skipped>,N/A,true
```
Plus `.migration/<timestamp>/summary.md`:

```
# Migration Dry-Run — 2026-05-03 14:23 UTC

NocoDB: 252 Interests + 35 Residences + 64 Website Submissions
Outcome: 198 clients, 287 interests (incl. residences), 91 yachts, 412 contacts

Auto-linked (high confidence, no human action needed):
- Nicolas Ruiz: rows 681,682,683 → 1 client + 3 interests
- John Lynch: rows 716,725 → 1 client + 2 interests
- Deepak Ramchandani: rows 624,625 → 1 client + 2 interests
- [12 more]

Flagged for manual review (medium confidence):
- Etiennette Clamouze (rows 188,717): same name, different country phone + email
- Bruno Joyerot #18 + Bruce Hearn #19: shared household contact
- [4 more]

Country resolution failed for 7 rows. All defaulted to port country (AI). Review:
- Row 239: "Sag Harbor Y" → AI (likely US)
- [6 more]

Phone parsing failed for 3 rows. All flagged, no contact created:
- Row 178: empty
- Row 641: placeholder "+447000000000"
- Row 175: empty

Run `--apply` to commit these changes.
```
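One way to model the rows of `report.csv` in code. The type mirrors the CSV header, but it is a sketch, not the script's actual definition; the naive `join` assumes no field contains a comma (true of the example rows, but a real writer would quote per RFC 4180).

```typescript
// Hypothetical shape for one report.csv row, following the header above.
interface ReportRow {
  op: string;
  reason: string;
  nocodbRowId: number;
  targetTable: string;
  targetValue: string;
  confidence: 'high' | 'medium' | 'low' | 'N/A';
  manualReviewRequired: boolean;
}

// Naive serializer: fine for the comma-free fields shown, not RFC 4180.
const toCsvLine = (r: ReportRow): string =>
  [
    r.op,
    r.reason,
    r.nocodbRowId,
    r.targetTable,
    r.targetValue,
    r.confidence,
    r.manualReviewRequired,
  ].join(',');
```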
### 9.2 Apply phase

`--apply` reads the report, re-fetches the source rows (via the NocoDB MCP / API), recomputes the hash, and fails fast if NocoDB has changed since the dry-run. It then performs the writes within a single PostgreSQL transaction per port (commit at the end); any error mid-transaction triggers a full rollback.

After a successful apply, an `apply_id` is generated and an audit-log row is written. The `apply_id` is the handle used for `--rollback`.
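The fail-fast check can be sketched as a content-hash comparison: hash the re-fetched snapshot and compare against the hash stored alongside the dry-run report. `sha256Json`, `assertSourceUnchanged`, and the error wording are assumptions for illustration; the real script's hashing scheme may differ.

```typescript
import { createHash } from 'node:crypto';

// Deterministic content hash of a JSON-serializable snapshot (assumes the
// refetch serializes keys in the same order, as a same-process fetch does).
const sha256Json = (value: unknown): string =>
  createHash('sha256').update(JSON.stringify(value)).digest('hex');

// Throws when the source changed between dry-run and apply.
function assertSourceUnchanged(snapshot: unknown, reportedHash: string): void {
  const current = sha256Json(snapshot);
  if (current !== reportedHash) {
    throw new Error(
      `Source changed since dry-run (hash ${current.slice(0, 12)} != report). Re-run --dry-run.`,
    );
  }
}
```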
### 9.3 Idempotency

The script tracks NocoDB row IDs in a `migration_source_links` table:

```ts
export const migrationSourceLinks = pgTable('migration_source_links', {
  id: text('id').primaryKey()...,
  sourceSystem: text('source_system').notNull(), // 'nocodb_interests' | 'nocodb_residences' | …
  sourceId: text('source_id').notNull(), // NocoDB row id as string
  targetEntityType: text('target_entity_type').notNull(), // client | interest | yacht | …
  targetEntityId: text('target_entity_id').notNull(),
  appliedAt: timestamp('applied_at')...,
  appliedBy: text('applied_by'),
}, (table) => [
  uniqueIndex('idx_msl_source').on(table.sourceSystem, table.sourceId, table.targetEntityType),
]);
```

Re-running `--apply` against the same report skips rows already in this table. Useful for partial-failure resumption.
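The unique index is what makes re-runs no-ops: a (sourceSystem, sourceId, targetEntityType) triple is written at most once. An in-memory sketch of that rule (the class is illustrative only; the real ledger is the Postgres table above):

```typescript
// Hypothetical in-memory mirror of idx_msl_source's uniqueness guarantee.
class SourceLedger {
  private seen = new Set<string>();

  /** Returns true the first time a triple is recorded, false on re-apply. */
  record(sourceSystem: string, sourceId: string, targetEntityType: string): boolean {
    const key = `${sourceSystem}\u0000${sourceId}\u0000${targetEntityType}`;
    if (this.seen.has(key)) return false; // already imported, so skip
    this.seen.add(key);
    return true;
  }
}
```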
---

## 10. Test plan

### 10.1 Library-level (vitest unit)

- `tests/unit/dedup/normalize.test.ts` — every dirty-data pattern from §1.3 has a fixture asserting the expected normalized output.
- `tests/unit/dedup/find-matches.test.ts` — every duplicate cluster from §1.2 has a fixture asserting score + confidence tier. Hard cases (Pattern F) assert "medium", not "high" — the false-positive guard.

### 10.2 Service-level (vitest integration)

- `tests/integration/dedup/client-merge.test.ts` — exercises the merge service: full reattach, clientMergeLog written, undo within the window restores, undo after the window returns 410, and a concurrent merge of the same loser fails the second attempt.
- `tests/integration/dedup/at-create-suggestion.test.ts` — `findClientMatches` against a seeded pool returns the expected matches + reasons.

### 10.3 Migration script (vitest integration with NocoDB mock)

- `tests/integration/dedup/migration-dry-run.test.ts` — feeds the script a fixture NocoDB dump (the 252 rows, frozen as a JSON snapshot in fixtures) and asserts the resulting CSV matches a golden file, catching any future regression in the transform pipeline.
- `tests/integration/dedup/migration-apply.test.ts` — applies the dry-run output to a clean test DB, asserts all expected rows exist, and asserts idempotency (re-apply is a no-op).

### 10.4 E2E (Playwright)

- `tests/e2e/smoke/30-dedup-create.spec.ts` — type into ClientForm with an email matching a seeded client; assert the suggestion card appears; click "Use this client"; assert the form switches to interest-create mode.
- `tests/e2e/smoke/31-admin-duplicates.spec.ts` — admin views the review queue, opens a candidate, the side-by-side merge UI works, the merge succeeds, and undo within the window works.
---

## 11. Rollback plan

Three layers of safety, ordered by reversibility:

1. **Per-merge undo** — an admin clicks Undo on a wrongly merged pair and the system rolls back from the `clientMergeLog` snapshot. 7-day window. No engineering needed.
2. **Migration `--rollback` flag** — the entire migration apply is reversed via the `apply_id` and the `migration_source_links` table. Useful in the first 24h after `--apply`. Engineering-supervised.
3. **DB restore from backup** — the existing `docs/ops/backup-runbook.md` covers this. Last resort if both of the above are blocked.

Pre-migration, take a hot backup of the new DB (`pg_dump`). Pre-merge in production (before any human-facing surface ships), `dedup_auto_merge_threshold` defaults to `null` so no automatic merges happen — every merge is human-confirmed.
---

## 12. Open items

- **Soundex vs metaphone** — Soundex is simpler but English-leaning. Metaphone handles non-English surnames better (the dataset has French, German, Italian, and Slavic names). Default to metaphone via the `natural` package; revisit if it adds significant install size.
- **Cross-port dedup** — not in scope. Each port's clients are deduped within that port. A future "shared address book" feature would need its own design.
- **Profile photo / face match** — out of scope.
- **AI-assisted match resolution** — out of scope. The Layer-3 review queue is human-only.

---
## Implementation sequence

P1 (this design's library) → P2 (runtime surfaces) → P3 (migration). Each is a separate plan / PR.

**P1 deliverables**: `src/lib/dedup/{normalize,find-matches}.ts` + tests. No UI changes. No DB changes (except indexed lookups added to the existing `clientContacts`). ~1.5 days.

**P2 deliverables**: at-create suggestion in `ClientForm` + interest-level guard in the `createInterest` service + admin settings UI for thresholds + `clientMergeCandidates` table + nightly job + admin review queue page + merge service + side-by-side merge UI. ~5–7 days.

**P3 deliverables**: `scripts/migrate-from-nocodb.ts` + `migration_source_links` table + dry-run + apply + rollback. CSV report format frozen against a fixture. ~3 days, including fixture creation from the live NocoDB snapshot.

Total: ~10–12 engineering days from approval. Can be split across three PRs landing independently — each is testable in isolation, and the runtime surfaces (P2) work even without P3 being run.
144 scripts/backfill-phone-e164.ts (new file)
@@ -0,0 +1,144 @@
|
||||
/**
|
||||
* Backfill `client_contacts.value_e164` from `value` for phone / whatsapp
|
||||
* contacts where it's null or empty.
|
||||
*
|
||||
* The legacy seed (and pre-normalization production data) stored phone
|
||||
* numbers in `value` as free text — "+33 4 93 00 0002" — but `value_e164`
|
||||
* is what every UI surface and dedup matcher reads. This script runs the
|
||||
* raw `value` through libphonenumber-js (via the script-safe wrapper to
|
||||
* avoid the Node 25 metadata-loader bug) and writes the canonical E.164
|
||||
* form back.
|
||||
*
|
||||
* Usage:
|
||||
* pnpm tsx scripts/backfill-phone-e164.ts # dry-run report
|
||||
* pnpm tsx scripts/backfill-phone-e164.ts --apply # actually write
|
||||
*
|
||||
* The dry-run report prints, for each unparseable row, the contact id +
|
||||
* raw value so you can hand-clean before re-running.
|
||||
*/
|
||||
import 'dotenv/config';
|
||||
import { and, eq, inArray, isNull, or, sql } from 'drizzle-orm';
|
||||
|
||||
import { db } from '@/lib/db';
|
||||
import { clientContacts } from '@/lib/db/schema/clients';
|
||||
import { parsePhoneScriptSafe } from '@/lib/dedup/phone-parse';
|
||||
import type { CountryCode } from '@/lib/i18n/countries';
|
||||
|
||||
const APPLY = process.argv.includes('--apply');
|
||||
|
||||
interface PhoneRow {
|
||||
id: string;
|
||||
channel: string;
|
||||
value: string | null;
|
||||
valueCountry: string | null;
|
||||
}
|
||||
|
||||
async function main() {
|
||||
console.log(`Phone E.164 backfill — ${APPLY ? 'APPLY MODE' : 'dry-run'}`);
|
||||
console.log('');
|
||||
|
||||
// Find candidate rows: phone or whatsapp contacts with a `value` set but
|
||||
// `value_e164` null/empty.
|
||||
const rows: PhoneRow[] = await db
|
||||
.select({
|
||||
id: clientContacts.id,
|
||||
channel: clientContacts.channel,
|
||||
value: clientContacts.value,
|
||||
valueCountry: clientContacts.valueCountry,
|
||||
})
|
||||
.from(clientContacts)
|
||||
.where(
|
||||
and(
|
||||
inArray(clientContacts.channel, ['phone', 'whatsapp']),
|
||||
or(isNull(clientContacts.valueE164), eq(clientContacts.valueE164, '')),
|
||||
sql`${clientContacts.value} IS NOT NULL AND ${clientContacts.value} <> ''`,
|
||||
),
|
||||
);
|
||||
|
||||
console.log(` found ${rows.length} candidate rows`);
|
||||
|
||||
let parsedFull = 0;
|
||||
let parsedE164Only = 0;
|
||||
let unparseable = 0;
|
||||
const updates: Array<{
|
||||
id: string;
|
||||
valueE164: string;
|
||||
valueCountry: CountryCode | null;
|
||||
}> = [];
|
||||
const fails: Array<{ id: string; value: string; reason: string }> = [];
|
||||
|
||||
for (const row of rows) {
|
||||
if (!row.value) continue;
|
||||
const defaultCountry = (row.valueCountry as CountryCode | null) ?? undefined;
|
||||
const parsed1 = parsePhoneScriptSafe(row.value, defaultCountry);
|
||||
|
||||
if (parsed1.e164 && parsed1.country) {
|
||||
// Both e164 + country resolved — best case.
|
||||
updates.push({ id: row.id, valueE164: parsed1.e164, valueCountry: parsed1.country });
|
||||
parsedFull++;
|
||||
} else if (parsed1.e164) {
|
||||
// E.164 came back but country didn't (e.g. UK +44 7700 900xxx
|
||||
// fictional/reserved range — libphonenumber returns the e164 form
|
||||
// but refuses to assign a country). Still safe to write — the e164
|
||||
// is canonical. Country stays null.
|
||||
updates.push({
|
||||
id: row.id,
|
||||
valueE164: parsed1.e164,
|
||||
valueCountry: (row.valueCountry as CountryCode | null) ?? null,
|
||||
});
|
||||
parsedE164Only++;
|
||||
} else {
|
||||
fails.push({
|
||||
id: row.id,
|
||||
value: row.value,
|
||||
reason: row.value.trim().startsWith('+')
|
||||
? 'has + prefix but parse failed'
|
||||
: 'no leading + and no country hint',
|
||||
});
|
||||
unparseable++;
|
||||
}
|
||||
}
|
||||
|
||||
console.log('');
|
||||
console.log(' ✓ parsed cleanly (e164 + country)', parsedFull);
|
||||
console.log(' ✓ parsed e164 only (no country) ', parsedE164Only);
|
||||
console.log(' ✗ unparseable ', unparseable);
|
||||
console.log('');
|
||||
|
||||
if (fails.length > 0) {
|
||||
console.log('Failures (first 10):');
|
||||
for (const f of fails.slice(0, 10)) {
|
||||
console.log(` [${f.id}] "${f.value}" — ${f.reason}`);
|
||||
}
|
||||
console.log('');
|
||||
}
|
||||
|
||||
if (!APPLY) {
|
||||
console.log('Dry-run only. Re-run with --apply to write the updates.');
|
||||
return;
|
||||
}
|
||||
|
||||
if (updates.length === 0) {
|
||||
console.log('No updates to write.');
|
||||
return;
|
||||
}
|
||||
|
||||
console.log(`Writing ${updates.length} updates...`);
|
||||
|
||||
for (const u of updates) {
|
||||
await db
|
||||
.update(clientContacts)
|
||||
.set({
|
||||
valueE164: u.valueE164,
|
||||
valueCountry: u.valueCountry,
|
||||
})
|
||||
.where(eq(clientContacts.id, u.id));
|
||||
}
|
||||
|
||||
console.log(` ✓ wrote ${updates.length} rows`);
|
||||
}
|
||||
|
||||
main().catch((err) => {
|
||||
console.error(err);
|
||||
process.exit(1);
|
||||
});
|
||||
126 scripts/load-berths-to-port-nimara.ts (new file)
@@ -0,0 +1,126 @@
|
||||
/**
|
||||
* One-shot: load the 117-berth NocoDB snapshot into the port-nimara
|
||||
* port, skipping any moorings that already exist.
|
||||
*
|
||||
* The original seed only seeded 12 hand-rolled berths into port-nimara
|
||||
* (A-01..D-03), but the migration's interest rows reference moorings
|
||||
* across A-01..E-18. This loads the full set so interest→berth links
|
||||
* resolve cleanly on the next migration run.
|
||||
*/
|
||||
import 'dotenv/config';
|
||||
import { eq, and, sql, inArray } from 'drizzle-orm';
|
||||
|
||||
import { db } from '@/lib/db';
|
||||
import { ports } from '@/lib/db/schema/ports';
|
||||
import { berths } from '@/lib/db/schema/berths';
|
||||
import berthSnapshot from '@/lib/db/seed-data/berths.json';
|
||||
|
||||
interface SnapshotBerth {
|
||||
mooringNumber: string;
|
||||
area: string;
|
||||
status: 'available' | 'under_offer' | 'sold';
|
||||
lengthFt: number | null;
|
||||
widthFt: number | null;
|
||||
draftFt: number | null;
|
||||
lengthM: number | null;
|
||||
widthM: number | null;
|
||||
draftM: number | null;
|
||||
widthIsMinimum: boolean;
|
||||
nominalBoatSize: number | null;
|
||||
nominalBoatSizeM: number | null;
|
||||
waterDepth: number | null;
|
||||
waterDepthM: number | null;
|
||||
waterDepthIsMinimum: boolean;
|
||||
sidePontoon: string | null;
|
||||
powerCapacity: number | null;
|
||||
voltage: number | null;
|
||||
mooringType: string | null;
|
||||
cleatType: string | null;
|
||||
cleatCapacity: string | null;
|
||||
bollardType: string | null;
|
||||
bollardCapacity: string | null;
|
||||
access: string | null;
|
||||
price: number | null;
|
||||
bowFacing: string | null;
|
||||
berthApproved: boolean;
|
||||
statusOverrideMode: string | null;
|
||||
}
|
||||
|
||||
async function main() {
|
||||
const [port] = await db
|
||||
.select({ id: ports.id })
|
||||
.from(ports)
|
||||
.where(eq(ports.slug, 'port-nimara'))
|
||||
.limit(1);
|
||||
if (!port) throw new Error('port-nimara not found');
|
||||
|
||||
const snapshot = berthSnapshot as unknown as SnapshotBerth[];
|
||||
|
||||
// Existing moorings — skip these.
|
||||
const existingRows = await db
|
||||
.select({ mooringNumber: berths.mooringNumber })
|
||||
.from(berths)
|
||||
.where(eq(berths.portId, port.id));
|
||||
const existingMoorings = new Set(existingRows.map((r) => r.mooringNumber));
|
||||
|
||||
const toInsert = snapshot.filter((b) => !existingMoorings.has(b.mooringNumber));
|
||||
console.log(
|
||||
`Snapshot: ${snapshot.length} berths, existing in port-nimara: ${existingRows.length}, to insert: ${toInsert.length}`,
|
||||
);
|
||||
|
||||
if (toInsert.length === 0) {
|
||||
console.log('Nothing to do.');
|
||||
return;
|
||||
}
|
||||
|
||||
const inserted = await db
|
||||
.insert(berths)
|
||||
.values(
|
||||
toInsert.map((b) => ({
|
||||
portId: port.id,
|
||||
mooringNumber: b.mooringNumber,
|
||||
area: b.area,
|
||||
status: b.status,
|
||||
lengthFt: b.lengthFt != null ? String(b.lengthFt) : null,
|
||||
widthFt: b.widthFt != null ? String(b.widthFt) : null,
|
||||
draftFt: b.draftFt != null ? String(b.draftFt) : null,
|
||||
lengthM: b.lengthM != null ? String(b.lengthM) : null,
|
||||
widthM: b.widthM != null ? String(b.widthM) : null,
|
||||
draftM: b.draftM != null ? String(b.draftM) : null,
|
||||
widthIsMinimum: b.widthIsMinimum,
|
||||
nominalBoatSize: b.nominalBoatSize != null ? String(b.nominalBoatSize) : null,
|
||||
nominalBoatSizeM: b.nominalBoatSizeM != null ? String(b.nominalBoatSizeM) : null,
|
||||
waterDepth: b.waterDepth != null ? String(b.waterDepth) : null,
|
||||
waterDepthM: b.waterDepthM != null ? String(b.waterDepthM) : null,
|
||||
waterDepthIsMinimum: b.waterDepthIsMinimum,
|
||||
sidePontoon: b.sidePontoon,
|
||||
powerCapacity: b.powerCapacity != null ? String(b.powerCapacity) : null,
|
||||
voltage: b.voltage != null ? String(b.voltage) : null,
|
||||
mooringType: b.mooringType,
|
||||
cleatType: b.cleatType,
|
||||
cleatCapacity: b.cleatCapacity,
|
||||
bollardType: b.bollardType,
|
||||
bollardCapacity: b.bollardCapacity,
|
||||
access: b.access,
|
||||
price: b.price != null ? String(b.price) : null,
|
||||
priceCurrency: 'USD',
|
||||
bowFacing: b.bowFacing,
|
||||
berthApproved: b.berthApproved,
|
||||
statusOverrideMode: b.statusOverrideMode,
|
||||
tenureType: 'permanent' as const,
|
||||
})),
|
||||
)
|
||||
.returning({ id: berths.id, mooringNumber: berths.mooringNumber });
|
||||
|
||||
console.log(`Inserted ${inserted.length} berths.`);
|
||||
|
||||
// Suppress unused-import warning if eslint is strict.
|
||||
void and;
|
||||
void sql;
|
||||
void inArray;
|
||||
}
|
||||
|
||||
main().catch((e) => {
|
||||
console.error(e);
|
||||
process.exit(1);
|
||||
});
|
||||
237 scripts/migrate-from-nocodb.ts (new file)
@@ -0,0 +1,237 @@
|
||||
/**
|
||||
* One-shot migration: legacy NocoDB Interests → new client/interest split.
|
||||
*
|
||||
* Usage:
|
||||
*
|
||||
* pnpm tsx scripts/migrate-from-nocodb.ts --dry-run
|
||||
* Pulls the live NocoDB base, runs the transform + dedup pipeline,
|
||||
* writes a report to .migration/<timestamp>/. NO database writes.
|
||||
*
|
||||
* pnpm tsx scripts/migrate-from-nocodb.ts --dry-run --port-slug port-nimara
|
||||
* Same, but tags the planned writes with the named port (matters for
|
||||
* the apply phase — every client/interest belongs to one port).
|
||||
*
|
||||
* pnpm tsx scripts/migrate-from-nocodb.ts --apply --port-slug port-nimara
|
||||
* Re-fetches NocoDB, re-transforms, then writes the planned rows
|
||||
* into the target port via the idempotent `migration_source_links`
|
||||
* ledger. Re-runs are safe — already-imported source IDs are skipped.
|
||||
* REQUIRES `EMAIL_REDIRECT_TO` to be set in env (safety net) unless
|
||||
* `--unsafe-skip-redirect-check` is also passed.
|
||||
*
|
||||
* Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §9.
|
||||
*/
|
||||
|
||||
import 'dotenv/config';
|
||||
import { randomUUID } from 'node:crypto';
|
||||
import path from 'node:path';
|
||||
import { fileURLToPath } from 'node:url';
|
||||
|
||||
import { eq } from 'drizzle-orm';
|
||||
|
||||
import { db } from '@/lib/db';
|
||||
import { ports } from '@/lib/db/schema/ports';
|
||||
import { applyPlan } from '@/lib/dedup/migration-apply';
|
||||
import { fetchSnapshot, loadNocoDbConfig } from '@/lib/dedup/nocodb-source';
|
||||
import { transformSnapshot } from '@/lib/dedup/migration-transform';
|
||||
import { resolveReportPaths, writeReport } from '@/lib/dedup/migration-report';
|
||||
|
||||
interface CliArgs {
|
||||
dryRun: boolean;
|
||||
apply: boolean;
|
||||
portSlug: string | null;
|
||||
reportDir: string | null;
|
||||
unsafeSkipRedirectCheck: boolean;
|
||||
}
|
||||
|
||||
function parseArgs(argv: string[]): CliArgs {
|
||||
const args: CliArgs = {
|
||||
dryRun: false,
|
||||
apply: false,
|
||||
portSlug: null,
|
||||
reportDir: null,
|
||||
unsafeSkipRedirectCheck: false,
|
||||
};
|
||||
for (let i = 0; i < argv.length; i += 1) {
|
||||
const a = argv[i]!;
|
||||
if (a === '--dry-run') args.dryRun = true;
|
||||
else if (a === '--apply') args.apply = true;
|
||||
else if (a === '--port-slug') args.portSlug = argv[++i] ?? null;
|
||||
else if (a === '--report') args.reportDir = argv[++i] ?? null;
|
||||
else if (a === '--unsafe-skip-redirect-check') args.unsafeSkipRedirectCheck = true;
|
||||
else if (a === '-h' || a === '--help') {
|
||||
printHelp();
|
||||
process.exit(0);
|
||||
} else {
|
||||
console.error(`Unknown argument: ${a}`);
|
||||
printHelp();
|
||||
process.exit(1);
|
||||
}
|
||||
}
|
||||
return args;
|
||||
}
|
||||
|
||||
function printHelp(): void {
|
||||
console.log(`Usage:
|
||||
pnpm tsx scripts/migrate-from-nocodb.ts --dry-run [--port-slug <slug>]
|
||||
Pulls NocoDB → transforms → writes report to .migration/<timestamp>/.
|
||||
No database writes.
|
||||
|
||||
pnpm tsx scripts/migrate-from-nocodb.ts --apply --port-slug <slug>
|
||||
Re-fetches NocoDB, re-transforms, writes via migration_source_links
|
||||
ledger. Idempotent — safe to re-run. Requires EMAIL_REDIRECT_TO set
|
||||
(unless --unsafe-skip-redirect-check is also passed).
|
||||
|
||||
Flags:
|
||||
--dry-run Read NocoDB, write report only.
|
||||
--apply Actually write rows to the DB.
|
||||
--port-slug <slug> Port slug to attach to all imported
|
||||
entities. Defaults to the first
|
||||
available port if omitted.
|
||||
--report <dir> Path to a previously-generated report
|
||||
dir (only used by --apply).
|
||||
--unsafe-skip-redirect-check Skip the EMAIL_REDIRECT_TO precondition
|
||||
check. Only use in production cutover.
|
||||
-h, --help Show this help.
|
||||
`);
|
||||
}
|
||||
|
||||
/**
|
||||
* Resolve the target port: use the slug if provided, otherwise the first
|
||||
* port found. Errors out cleanly if the slug doesn't match any port.
|
||||
*/
|
||||
async function resolvePort(slug: string | null): Promise<{ id: string; slug: string }> {
|
||||
if (slug) {
|
||||
const [p] = await db
|
||||
.select({ id: ports.id, slug: ports.slug })
|
||||
.from(ports)
|
||||
.where(eq(ports.slug, slug))
|
||||
.limit(1);
|
||||
if (!p) {
|
||||
console.error(`No port found with slug "${slug}".`);
|
||||
process.exit(1);
|
||||
}
|
||||
return { id: p.id, slug: p.slug };
|
||||
}
|
||||
const [first] = await db.select({ id: ports.id, slug: ports.slug }).from(ports).limit(1);
|
||||
if (!first) {
|
||||
console.error('No ports exist in the target DB. Seed at least one port before applying.');
|
||||
process.exit(1);
|
||||
}
|
||||
return { id: first.id, slug: first.slug };
|
||||
}
|
||||
|
||||
async function main(): Promise<void> {
|
||||
const args = parseArgs(process.argv.slice(2));
|
||||
|
||||
if (!args.dryRun && !args.apply) {
|
||||
console.error('Must specify --dry-run or --apply');
|
||||
printHelp();
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
// Safety gate: --apply must run with EMAIL_REDIRECT_TO set, unless the
|
||||
// operator explicitly opts out (production cutover).
|
||||
if (args.apply && !process.env.EMAIL_REDIRECT_TO && !args.unsafeSkipRedirectCheck) {
|
||||
console.error(
|
||||
'--apply requires EMAIL_REDIRECT_TO to be set in the environment as a safety net.',
|
||||
);
|
||||
console.error('See docs/operations/outbound-comms-safety.md for the rationale.');
|
||||
console.error(
|
||||
'If you are running the production cutover and have read that doc, add ' +
|
||||
'--unsafe-skip-redirect-check to override.',
|
||||
);
|
||||
process.exit(2);
|
||||
}
|
||||
|
||||
// ── Fetch + transform (shared by dry-run and apply) ──────────────────────
|
||||
|
||||
console.log('[migrate] Loading NocoDB config…');
|
||||
const config = loadNocoDbConfig();
|
||||
console.log(`[migrate] Source: ${config.url}`);
|
||||
|
||||
console.log('[migrate] Fetching snapshot from NocoDB…');
|
||||
const start = Date.now();
|
||||
const snapshot = await fetchSnapshot(config);
|
||||
const elapsed = ((Date.now() - start) / 1000).toFixed(1);
|
||||
console.log(
|
||||
`[migrate] Snapshot fetched in ${elapsed}s — ${snapshot.interests.length} interests, ${snapshot.residentialInterests.length} residential, ${snapshot.berths.length} berths.`,
|
||||
);
|
||||
|
||||
console.log('[migrate] Running transform + dedup pipeline…');
|
||||
const plan = transformSnapshot(snapshot);
|
||||
|
||||
// Resolve output paths relative to the worktree root.
|
||||
const scriptDir = path.dirname(fileURLToPath(import.meta.url));
|
||||
const repoRoot = path.resolve(scriptDir, '..');
|
||||
const generatedAt = new Date().toISOString();
|
||||
const paths = resolveReportPaths(repoRoot);
|
||||
|
||||
console.log(`[migrate] Writing report to ${paths.rootDir}…`);
|
||||
await writeReport(paths, plan, generatedAt);
|
||||
|
||||
// ── Plan summary ─────────────────────────────────────────────────────────
|
||||
const s = plan.stats;
|
||||
console.log('');
|
||||
console.log('=== Migration Plan Summary ===');
|
||||
console.log(
|
||||
` Input: ${s.inputInterestRows} interests, ${s.inputResidentialRows} residential interests`,
|
||||
);
|
||||
console.log(` Output: ${s.outputClients} clients, ${s.outputInterests} interests`);
|
||||
console.log(` ${s.outputContacts} contacts, ${s.outputAddresses} addresses`);
|
||||
console.log(
|
||||
` Dedup: ${s.autoLinkedClusters} auto-linked clusters, ${s.needsReviewPairs} pairs flagged for review`,
|
||||
);
|
||||
console.log(` Quality: ${s.flaggedRows} rows flagged (see report.csv)`);
|
||||
console.log('');
|
||||
console.log(` Full report: ${paths.summaryPath}`);
|
||||
|
||||
if (args.dryRun) {
|
||||
console.log('');
|
||||
console.log('Dry-run complete. Re-run with --apply to write rows.');
|
||||
return;
|
||||
}
|
||||
|
||||
// ── Apply path ───────────────────────────────────────────────────────────
|
||||
|
||||
const port = await resolvePort(args.portSlug);
|
||||
const applyId = randomUUID();
|
||||
|
||||
console.log('');
|
||||
console.log(`[migrate] Applying to port "${port.slug}" (id=${port.id})`);
|
||||
console.log(`[migrate] Apply id: ${applyId}`);
|
||||
console.log('[migrate] Inserting…');
|
||||
|
||||
const applyStart = Date.now();
|
||||
const result = await applyPlan(plan, { port, applyId });
|
||||
const applyElapsed = ((Date.now() - applyStart) / 1000).toFixed(1);
|
||||
|
||||
console.log('');
|
||||
console.log('=== Apply Result ===');
|
||||
console.log(` Time: ${applyElapsed}s`);
|
||||
console.log(
|
||||
` Clients: ${result.clientsInserted} inserted, ${result.clientsSkipped} already linked`,
|
||||
);
|
||||
console.log(` Contacts: ${result.contactsInserted} inserted`);
|
||||
console.log(` Addresses: ${result.addressesInserted} inserted`);
|
||||
console.log(` Yachts: ${result.yachtsInserted} inserted`);
|
||||
console.log(
|
||||
` Interests: ${result.interestsInserted} inserted, ${result.interestsSkipped} already linked`,
|
||||
);
|
||||
|
||||
if (result.warnings.length > 0) {
|
||||
console.log('');
|
||||
console.log('Warnings:');
|
||||
for (const w of result.warnings.slice(0, 20)) {
|
||||
console.log(` - ${w}`);
|
||||
}
|
||||
if (result.warnings.length > 20) {
|
||||
console.log(` … ${result.warnings.length - 20} more`);
|
||||
}
|
||||
}
|
||||
console.log('');
|
||||
}
|
||||
|
||||
main().catch((err) => {
|
||||
console.error('[migrate] Fatal error:', err);
|
||||
process.exit(1);
|
||||
});
|
||||
108  scripts/smoke-test-redirect.ts  (Normal file)
@@ -0,0 +1,108 @@
/**
 * Live smoke test for EMAIL_REDIRECT_TO.
 *
 * Actually calls `sendEmail()` (the centralized helper used by every
 * outbound email path in the app) with a fake real-client address. The
 * SMTP transporter is monkey-patched to capture the message instead of
 * actually delivering it, so this is safe to run anywhere.
 *
 * Prints the captured `to` + `subject` so the operator can see with their
 * own eyes that the redirect happened. Exits non-zero if the redirect
 * failed for any reason.
 *
 * Usage:
 *   pnpm tsx scripts/smoke-test-redirect.ts
 */
import 'dotenv/config';

async function main() {
  const expectedRedirect = process.env.EMAIL_REDIRECT_TO;
  if (!expectedRedirect) {
    console.error('FAIL: EMAIL_REDIRECT_TO is not set in env. Set it before running this test.');
    process.exit(1);
  }

  console.log(`[smoke] EMAIL_REDIRECT_TO = ${expectedRedirect}`);
  console.log('');

  // Monkey-patch nodemailer's createTransport so we capture the call
  // without actually delivering. This is the same pattern the unit
  // tests use, but at the live import-time level so we're testing the
  // exact code path that runs in production.
  const nodemailer = await import('nodemailer');
  const captured: Array<{ to: unknown; subject: unknown; from: unknown }> = [];
  const originalCreateTransport = nodemailer.default.createTransport;
  // @ts-expect-error monkey-patch
  nodemailer.default.createTransport = () => ({
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    sendMail: async (msg: any) => {
      captured.push({ to: msg.to, subject: msg.subject, from: msg.from });
      return { messageId: '<smoke@test>', accepted: [msg.to], rejected: [] };
    },
  });

  // Now import sendEmail (gets the patched transporter).
  const { sendEmail } = await import('@/lib/email');

  const realClientEmail = 'real-client-DO-NOT-EMAIL@example.test';
  const realSubject = 'Important: Your contract is ready';

  console.log('[smoke] calling sendEmail(...) with:');
  console.log(`  to:      ${realClientEmail}`);
  console.log(`  subject: "${realSubject}"`);
  console.log('');

  await sendEmail(realClientEmail, realSubject, '<p>Body unused for this smoke.</p>');

  // Restore the original transport (be a good citizen).
  // @ts-expect-error monkey-patch
  nodemailer.default.createTransport = originalCreateTransport;

  console.log('[smoke] captured outbound message:');
  console.log(`  to:      ${captured[0]?.to}`);
  console.log(`  subject: "${captured[0]?.subject}"`);
  console.log(`  from:    ${captured[0]?.from}`);
  console.log('');

  // Assertions
  let pass = true;

  if (captured.length !== 1) {
    console.error(`FAIL: expected exactly 1 sendMail call, got ${captured.length}`);
    pass = false;
  }

  if (captured[0]?.to !== expectedRedirect) {
    console.error(
      `FAIL: outbound "to" was "${captured[0]?.to}", expected the redirect address "${expectedRedirect}"`,
    );
    pass = false;
  }

  if (
    typeof captured[0]?.subject !== 'string' ||
    !captured[0].subject.startsWith(`[redirected from ${realClientEmail}]`)
  ) {
    console.error(
      `FAIL: subject did not get the [redirected from <orig>] prefix. Got: "${captured[0]?.subject}"`,
    );
    pass = false;
  }

  if (pass) {
    console.log('PASS: EMAIL_REDIRECT_TO is intercepting outbound email correctly.');
    console.log(
      '  The "to" header matches the redirect, and the original recipient is preserved in the subject.',
    );
    process.exit(0);
  } else {
    console.error('');
    console.error('Smoke test FAILED. Do not import production data until this is fixed.');
    process.exit(1);
  }
}

main().catch((err) => {
  console.error('FATAL:', err);
  process.exit(1);
});
5  src/app/(dashboard)/[portSlug]/admin/duplicates/page.tsx  (Normal file)
@@ -0,0 +1,5 @@
import { DuplicatesReviewQueue } from '@/components/admin/duplicates/duplicates-review-queue';

export default function DuplicatesAdminPage() {
  return <DuplicatesReviewQueue />;
}
@@ -16,6 +16,7 @@ import {
   Tag,
   Upload,
   Users,
+  UsersRound,
   Webhook,
 } from 'lucide-react';
@@ -29,132 +30,186 @@ interface AdminSection {
   icon: typeof Settings;
 }
 
-const SECTIONS: AdminSection[] = [
+interface AdminGroup {
+  title: string;
+  description: string;
+  sections: AdminSection[];
+}
+
+const GROUPS: AdminGroup[] = [
   {
-    href: 'users',
-    label: 'Users',
-    description: 'CRM accounts, role assignments, and per-user residential access toggles.',
-    icon: Users,
+    title: 'Access',
+    description: 'Who can sign in and what they can do once they do.',
+    sections: [
+      {
+        href: 'users',
+        label: 'Users',
+        description: 'CRM accounts, role assignments, and per-user residential access toggles.',
+        icon: Users,
+      },
+      {
+        href: 'invitations',
+        label: 'Invitations',
+        description: 'Send invitations, track pending invites, and resend or revoke them.',
+        icon: Mail,
+      },
+      {
+        href: 'roles',
+        label: 'Roles & Permissions',
+        description: 'Default permission sets and per-port role overrides.',
+        icon: Shield,
+      },
+    ],
   },
   {
-    href: 'invitations',
-    label: 'Invitations',
-    description: 'Send invitations, track pending invites, and resend or revoke them.',
-    icon: Mail,
+    title: 'Configuration',
+    description: 'Branding, integrations, and per-port settings.',
+    sections: [
+      {
+        href: 'email',
+        label: 'Email Settings',
+        description: 'From address, signatures, and per-port SMTP overrides.',
+        icon: Mail,
+      },
+      {
+        href: 'documenso',
+        label: 'Documenso & EOI',
+        description: 'API credentials, EOI template, and default in-app vs Documenso pathway.',
+        icon: FileText,
+      },
+      {
+        href: 'reminders',
+        label: 'Reminders',
+        description: 'Default reminder behaviour and the daily-digest delivery window.',
+        icon: Bell,
+      },
+      {
+        href: 'branding',
+        label: 'Branding',
+        description: 'App name, logo, primary color, and email header/footer HTML.',
+        icon: Palette,
+      },
+      {
+        href: 'settings',
+        label: 'System Settings',
+        description: 'Generic key/value configuration store for advanced flags.',
+        icon: Settings,
+      },
+      {
+        href: 'webhooks',
+        label: 'Webhooks',
+        description: 'Outgoing webhook subscriptions, secrets, and delivery log.',
+        icon: Webhook,
+      },
+    ],
   },
   {
-    href: 'roles',
-    label: 'Roles & Permissions',
-    description: 'Default permission sets and per-port role overrides.',
-    icon: Shield,
+    title: 'Content',
+    description: 'Forms, templates, and labels that users see.',
+    sections: [
+      {
+        href: 'forms',
+        label: 'Forms',
+        description: 'Form templates used by client-facing inquiry and intake flows.',
+        icon: Sliders,
+      },
+      {
+        href: 'templates',
+        label: 'Document Templates',
+        description: 'PDF + email templates with merge-field placeholders.',
+        icon: FileText,
+      },
+      {
+        href: 'tags',
+        label: 'Tags',
+        description: 'Color-coded tags applied to clients, yachts, companies, and interests.',
+        icon: Tag,
+      },
+      {
+        href: 'custom-fields',
+        label: 'Custom Fields',
+        description: 'Tenant-defined fields for clients, yachts, and reservations.',
+        icon: Key,
+      },
+    ],
   },
   {
-    href: 'audit',
-    label: 'Audit Log',
-    description: 'Searchable log of every authenticated mutation in the system.',
-    icon: ScrollText,
+    title: 'Data Quality',
+    description: 'Cleanup, imports, and the audit trail.',
+    sections: [
+      {
+        href: 'duplicates',
+        label: 'Duplicates',
+        description: 'Review queue of suspected duplicate clients flagged by the dedup engine.',
+        icon: UsersRound,
+      },
+      {
+        href: 'import',
+        label: 'Bulk Import',
+        description: 'CSV-driven imports for clients, yachts, and reservations.',
+        icon: Upload,
+      },
+      {
+        href: 'audit',
+        label: 'Audit Log',
+        description: 'Searchable log of every authenticated mutation in the system.',
+        icon: ScrollText,
+      },
+    ],
   },
   {
-    href: 'email',
-    label: 'Email Settings',
-    description: 'From address, signatures, and per-port SMTP overrides.',
-    icon: Mail,
+    title: 'Operations',
+    description: 'Health checks and disaster recovery.',
+    sections: [
+      {
+        href: 'reports',
+        label: 'Reports',
+        description: 'Saved analytics views and ad-hoc query results.',
+        icon: LayoutDashboard,
+      },
+      {
+        href: 'monitoring',
+        label: 'Queue Monitoring',
+        description: 'BullMQ queue health, throughput, and retry diagnostics.',
+        icon: Database,
+      },
+      {
+        href: 'backup',
+        label: 'Backup & Restore',
+        description: 'Database snapshots and on-demand exports.',
+        icon: HardDrive,
+      },
+    ],
   },
   {
-    href: 'documenso',
-    label: 'Documenso & EOI',
-    description: 'API credentials, EOI template, and default in-app vs Documenso pathway.',
-    icon: FileText,
+    title: 'Tenancy',
+    description: 'Multi-port and multi-install scaffolding.',
+    sections: [
+      {
+        href: 'ports',
+        label: 'Ports',
+        description: 'Manage the marinas/ports this installation serves.',
+        icon: Briefcase,
+      },
+      {
+        href: 'onboarding',
+        label: 'Onboarding',
+        description: 'Initial-setup wizard for fresh ports.',
+        icon: LayoutDashboard,
+      },
+    ],
   },
-  {
-    href: 'reminders',
-    label: 'Reminders',
-    description: 'Default reminder behaviour and the daily-digest delivery window.',
-    icon: Bell,
-  },
-  {
-    href: 'branding',
-    label: 'Branding',
-    description: 'App name, logo, primary color, and email header/footer HTML.',
-    icon: Palette,
-  },
-  {
-    href: 'settings',
-    label: 'System Settings',
-    description: 'Generic key/value configuration store for advanced flags.',
-    icon: Settings,
-  },
-  {
-    href: 'webhooks',
-    label: 'Webhooks',
-    description: 'Outgoing webhook subscriptions, secrets, and delivery log.',
-    icon: Webhook,
-  },
-  {
-    href: 'forms',
-    label: 'Forms',
-    description: 'Form templates used by client-facing inquiry and intake flows.',
-    icon: Sliders,
-  },
-  {
-    href: 'templates',
-    label: 'Document Templates',
-    description: 'PDF + email templates with merge-field placeholders.',
-    icon: FileText,
-  },
-  {
-    href: 'tags',
-    label: 'Tags',
-    description: 'Color-coded tags applied to clients, yachts, companies, and interests.',
-    icon: Tag,
-  },
-  {
-    href: 'custom-fields',
-    label: 'Custom Fields',
-    description: 'Tenant-defined fields for clients, yachts, and reservations.',
-    icon: Key,
-  },
-  {
-    href: 'reports',
-    label: 'Reports',
-    description: 'Saved analytics views and ad-hoc query results.',
-    icon: LayoutDashboard,
-  },
-  {
-    href: 'monitoring',
-    label: 'Queue Monitoring',
-    description: 'BullMQ queue health, throughput, and retry diagnostics.',
-    icon: Database,
-  },
-  {
-    href: 'import',
-    label: 'Bulk Import',
-    description: 'CSV-driven imports for clients, yachts, and reservations.',
-    icon: Upload,
-  },
-  {
-    href: 'backup',
-    label: 'Backup & Restore',
-    description: 'Database snapshots and on-demand exports.',
-    icon: HardDrive,
-  },
-  {
-    href: 'ports',
-    label: 'Ports',
-    description: 'Manage the marinas/ports this installation serves.',
-    icon: Briefcase,
-  },
-  {
-    href: 'onboarding',
-    label: 'Onboarding',
-    description: 'Initial-setup wizard for fresh ports.',
-    icon: LayoutDashboard,
-  },
   {
-    href: 'ocr',
-    label: 'Receipt OCR',
-    description: 'Configure the AI provider used by the mobile receipt scanner.',
-    icon: ScrollText,
+    title: 'Integrations',
+    description: 'Third-party providers wired into the app.',
+    sections: [
+      {
+        href: 'ocr',
+        label: 'Receipt OCR',
+        description: 'Configure the AI provider used by the mobile receipt scanner.',
+        icon: ScrollText,
+      },
+    ],
   },
 ];
 
@@ -165,36 +220,46 @@ export default async function AdminLandingPage({
 }) {
   const { portSlug } = await params;
   return (
-    <div className="space-y-6">
+    <div className="space-y-8">
       <PageHeader
         title="Administration"
         description="Per-port configuration and system administration. Each card below opens a dedicated settings page."
       />
-      <div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
-        {SECTIONS.map((s) => {
-          const Icon = s.icon;
-          return (
-            <Link
-              key={s.href}
-              // eslint-disable-next-line @typescript-eslint/no-explicit-any
-              href={`/${portSlug}/admin/${s.href}` as any}
-              className="block group"
-            >
-              <Card className="h-full transition-colors group-hover:border-primary/50 group-hover:bg-muted/30">
-                <CardHeader className="flex flex-row items-start gap-3 space-y-0 pb-2">
-                  <Icon className="h-5 w-5 mt-0.5 text-muted-foreground group-hover:text-primary" />
-                  <div className="flex-1">
-                    <CardTitle className="text-base">{s.label}</CardTitle>
-                  </div>
-                </CardHeader>
-                <CardContent>
-                  <CardDescription>{s.description}</CardDescription>
-                </CardContent>
-              </Card>
-            </Link>
-          );
-        })}
-      </div>
+      {GROUPS.map((group) => (
+        <section key={group.title} className="space-y-3">
+          <div>
+            <h2 className="text-xs font-semibold uppercase tracking-wider text-muted-foreground">
+              {group.title}
+            </h2>
+            <p className="text-xs text-muted-foreground/80">{group.description}</p>
+          </div>
+          <div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
+            {group.sections.map((s) => {
+              const Icon = s.icon;
+              return (
+                <Link
+                  key={s.href}
+                  // eslint-disable-next-line @typescript-eslint/no-explicit-any
+                  href={`/${portSlug}/admin/${s.href}` as any}
+                  className="block group"
+                >
+                  <Card className="h-full transition-colors group-hover:border-primary/50 group-hover:bg-muted/30">
+                    <CardHeader className="flex flex-row items-start gap-3 space-y-0 pb-2">
+                      <Icon className="h-5 w-5 mt-0.5 text-muted-foreground group-hover:text-primary" />
+                      <div className="flex-1">
+                        <CardTitle className="text-base">{s.label}</CardTitle>
+                      </div>
+                    </CardHeader>
+                    <CardContent>
+                      <CardDescription>{s.description}</CardDescription>
+                    </CardContent>
+                  </Card>
+                </Link>
+              );
+            })}
+          </div>
+        </section>
+      ))}
     </div>
   );
 }
4  src/app/api/v1/admin/duplicates/[id]/dismiss/route.ts  (Normal file)
@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { dismissHandler } from '../../handlers';

export const POST = withAuth(withPermission('clients', 'edit', dismissHandler));
4  src/app/api/v1/admin/duplicates/[id]/merge/route.ts  (Normal file)
@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { confirmMergeHandler } from '../../handlers';

export const POST = withAuth(withPermission('clients', 'edit', confirmMergeHandler));
160  src/app/api/v1/admin/duplicates/handlers.ts  (Normal file)
@@ -0,0 +1,160 @@
import { NextResponse } from 'next/server';
import { and, eq, inArray } from 'drizzle-orm';

import type { AuthContext } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { clients, clientMergeCandidates } from '@/lib/db/schema/clients';
import { errorResponse, NotFoundError } from '@/lib/errors';
import {
  listPendingMergeCandidates,
  mergeClients,
  type MergeFieldChoices,
} from '@/lib/services/client-merge.service';

/**
 * GET /api/v1/admin/duplicates
 *
 * Pending merge candidates for the current port, sorted by score.
 * Each row hydrates its two client summaries so the review-queue UI
 * can render side-by-side cards without an N+1 fetch.
 */
export async function listHandler(_req: Request, ctx: AuthContext): Promise<NextResponse> {
  try {
    const pairs = await listPendingMergeCandidates(ctx.portId);
    if (pairs.length === 0) return NextResponse.json({ data: [] });

    const ids = Array.from(new Set(pairs.flatMap((p) => [p.clientAId, p.clientBId])));
    const clientRows = await db
      .select({
        id: clients.id,
        fullName: clients.fullName,
        archivedAt: clients.archivedAt,
        mergedIntoClientId: clients.mergedIntoClientId,
        createdAt: clients.createdAt,
      })
      .from(clients)
      .where(inArray(clients.id, ids));
    const clientById = new Map(clientRows.map((c) => [c.id, c]));

    const data = pairs
      .map((p) => {
        const a = clientById.get(p.clientAId);
        const b = clientById.get(p.clientBId);
        if (!a || !b) return null; // FK orphan — shouldn't happen, but be defensive
        // Skip pairs where one side has already been merged or archived.
        if (a.mergedIntoClientId || b.mergedIntoClientId) return null;
        return {
          id: p.id,
          score: p.score,
          reasons: p.reasons,
          createdAt: p.createdAt,
          clientA: { id: a.id, fullName: a.fullName, createdAt: a.createdAt },
          clientB: { id: b.id, fullName: b.fullName, createdAt: b.createdAt },
        };
      })
      .filter((row): row is NonNullable<typeof row> => row !== null);

    return NextResponse.json({ data });
  } catch (error) {
    return errorResponse(error);
  }
}

/**
 * POST /api/v1/admin/duplicates/[id]/merge
 *
 * Body: { winnerId: string, fieldChoices?: MergeFieldChoices }
 *
 * Confirms a merge candidate. The winner is the one the user picked
 * to keep; the other side becomes the loser. Calls into the merge
 * service which is the only path that touches client_merge_log.
 */
export async function confirmMergeHandler(
  req: Request,
  ctx: AuthContext,
  params: { id?: string },
): Promise<NextResponse> {
  try {
    const id = params.id ?? '';
    const body = (await req.json().catch(() => ({}))) as {
      winnerId?: string;
      fieldChoices?: MergeFieldChoices;
    };
    if (!body.winnerId) {
      return NextResponse.json({ error: 'winnerId required' }, { status: 400 });
    }

    const [candidate] = await db
      .select()
      .from(clientMergeCandidates)
      .where(
        and(
          eq(clientMergeCandidates.id, id),
          eq(clientMergeCandidates.portId, ctx.portId),
          eq(clientMergeCandidates.status, 'pending'),
        ),
      );
    if (!candidate) throw new NotFoundError('Merge candidate');

    const loserId =
      body.winnerId === candidate.clientAId
        ? candidate.clientBId
        : body.winnerId === candidate.clientBId
          ? candidate.clientAId
          : null;
    if (!loserId) {
      return NextResponse.json(
        { error: 'winnerId must match one of the candidate clients' },
        { status: 400 },
      );
    }

    const result = await mergeClients({
      winnerId: body.winnerId,
      loserId,
      mergedBy: ctx.userId,
      fieldChoices: body.fieldChoices,
    });

    return NextResponse.json({ data: result });
  } catch (error) {
    return errorResponse(error);
  }
}

/**
 * POST /api/v1/admin/duplicates/[id]/dismiss
 *
 * Mark a merge candidate as dismissed. The background scoring job
 * skips dismissed pairs on subsequent runs (a future score increase
 * can re-create them).
 */
export async function dismissHandler(
  _req: Request,
  ctx: AuthContext,
  params: { id?: string },
): Promise<NextResponse> {
  try {
    const id = params.id ?? '';
    const result = await db
      .update(clientMergeCandidates)
      .set({
        status: 'dismissed',
        resolvedAt: new Date(),
        resolvedBy: ctx.userId,
      })
      .where(
        and(
          eq(clientMergeCandidates.id, id),
          eq(clientMergeCandidates.portId, ctx.portId),
          eq(clientMergeCandidates.status, 'pending'),
        ),
      )
      .returning({ id: clientMergeCandidates.id });

    if (result.length === 0) throw new NotFoundError('Merge candidate');
    return NextResponse.json({ data: { id: result[0]!.id, status: 'dismissed' } });
  } catch (error) {
    return errorResponse(error);
  }
}
4  src/app/api/v1/admin/duplicates/route.ts  (Normal file)
@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { listHandler } from './handlers';

export const GET = withAuth(withPermission('clients', 'view', listHandler));
160  src/app/api/v1/clients/match-candidates/handlers.ts  (Normal file)
@@ -0,0 +1,160 @@
import { NextResponse } from 'next/server';
import { and, eq, inArray } from 'drizzle-orm';

import type { AuthContext } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { clients, clientContacts } from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { errorResponse } from '@/lib/errors';
import { findClientMatches, type MatchCandidate } from '@/lib/dedup/find-matches';
import { normalizeEmail, normalizeName, normalizePhone } from '@/lib/dedup/normalize';
import type { CountryCode } from '@/lib/i18n/countries';

/**
 * GET /api/v1/clients/match-candidates
 *
 * Query parameters (any combination):
 *   email    Free-text email; gets normalized server-side.
 *   phone    Free-text phone; gets normalized to E.164 server-side.
 *   name     Free-text full name; used for surname-token blocking.
 *   country  Optional ISO country hint (default: AI for Port Nimara).
 *
 * Returns the top candidates that scored above the soft-warn threshold,
 * each with a small client summary the form's suggestion card can
 * render. Confidence tiers and rules are applied server-side from the
 * port's `system_settings` (when wired) or sensible defaults otherwise.
 *
 * Used by `useDedupSuggestion` in the new-client form. Debounced on
 * the client; this endpoint must be cheap (single port pool fetch +
 * an in-memory dedup pass).
 */
export async function getMatchCandidatesHandler(
  req: Request,
  ctx: AuthContext,
): Promise<NextResponse> {
  try {
    const url = new URL(req.url);
    const rawEmail = url.searchParams.get('email');
    const rawPhone = url.searchParams.get('phone');
    const rawName = url.searchParams.get('name');
    const country = (url.searchParams.get('country') ?? 'AI') as CountryCode;

    const email = rawEmail ? normalizeEmail(rawEmail) : null;
    const phoneResult = rawPhone ? normalizePhone(rawPhone, country) : null;
    const nameResult = rawName ? normalizeName(rawName) : null;

    // If the caller didn't give us anything useful to match on, return empty
    // — short-circuit rather than scan every client for nothing.
    if (!email && !phoneResult?.e164 && !nameResult?.surnameToken) {
      return NextResponse.json({ data: [] });
    }

    // Build the input candidate.
    const input: MatchCandidate = {
      id: '__incoming__',
      fullName: nameResult?.display ?? null,
      surnameToken: nameResult?.surnameToken ?? null,
      emails: email ? [email] : [],
      phonesE164: phoneResult?.e164 ? [phoneResult.e164] : [],
      countryIso: country,
    };

    // Fetch the live pool for this port. We keep this O(N) over clients
    // since the dedup library does its own blocking; for ports with
    // thousands of clients we can later restrict by surname-token /
    // contact lookups, but for current scale the simple full-pool fetch
    // is fine.
    const liveClients = await db
      .select({
        id: clients.id,
        fullName: clients.fullName,
        nationalityIso: clients.nationalityIso,
      })
      .from(clients)
      .where(and(eq(clients.portId, ctx.portId)));

    if (liveClients.length === 0) {
      return NextResponse.json({ data: [] });
    }

    const clientIds = liveClients.map((c) => c.id);
    const contactRows = await db
      .select({
        clientId: clientContacts.clientId,
        channel: clientContacts.channel,
        value: clientContacts.value,
        valueE164: clientContacts.valueE164,
      })
      .from(clientContacts)
      .where(inArray(clientContacts.clientId, clientIds));

    // Group contacts by client for the candidate map.
    const emailsByClient = new Map<string, string[]>();
    const phonesByClient = new Map<string, string[]>();
    for (const c of contactRows) {
      if (c.channel === 'email') {
        const arr = emailsByClient.get(c.clientId) ?? [];
        arr.push(c.value.toLowerCase());
        emailsByClient.set(c.clientId, arr);
      } else if (c.channel === 'phone' || c.channel === 'whatsapp') {
        if (c.valueE164) {
          const arr = phonesByClient.get(c.clientId) ?? [];
          arr.push(c.valueE164);
          phonesByClient.set(c.clientId, arr);
        }
      }
    }

    const pool: MatchCandidate[] = liveClients.map((c) => {
      const named = normalizeName(c.fullName);
      return {
        id: c.id,
        fullName: c.fullName,
        surnameToken: named.surnameToken ?? null,
        emails: emailsByClient.get(c.id) ?? [],
        phonesE164: phonesByClient.get(c.id) ?? [],
        countryIso: (c.nationalityIso as CountryCode | null) ?? null,
      };
    });

    const matches = findClientMatches(input, pool, {
      highScore: 90,
      mediumScore: 50,
    });

    // Only return medium+ — low-confidence noise isn't useful at the
    // create-form layer (background scoring queue picks those up).
    const useful = matches.filter((m) => m.confidence !== 'low');
    if (useful.length === 0) {
      return NextResponse.json({ data: [] });
    }

    // Pull a quick summary for each surfaced candidate so the suggestion
    // card has enough to render ("Marcus Laurent · 2 interests · last
    // contact 9d ago").
    const summarizedIds = useful.map((m) => m.candidate.id);
    const interestCounts = await db
      .select({ clientId: interests.clientId })
      .from(interests)
      .where(inArray(interests.clientId, summarizedIds));
    const interestsByClient = new Map<string, number>();
    for (const r of interestCounts) {
      interestsByClient.set(r.clientId, (interestsByClient.get(r.clientId) ?? 0) + 1);
    }

    const data = useful.map((m) => ({
      clientId: m.candidate.id,
      fullName: m.candidate.fullName,
      score: m.score,
      confidence: m.confidence,
      reasons: m.reasons,
      interestCount: interestsByClient.get(m.candidate.id) ?? 0,
      emails: m.candidate.emails,
      phonesE164: m.candidate.phonesE164,
    }));

    return NextResponse.json({ data });
  } catch (error) {
    return errorResponse(error);
  }
}
4  src/app/api/v1/clients/match-candidates/route.ts  (Normal file)
@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getMatchCandidatesHandler } from './handlers';

export const GET = withAuth(withPermission('clients', 'view', getMatchCandidatesHandler));
@@ -4,6 +4,7 @@ import { headers } from 'next/headers';
 import { Inter, JetBrains_Mono } from 'next/font/google';
 import { Toaster } from 'sonner';
 import { classifyFormFactor } from '@/lib/form-factor';
+import { ReactGrabViewportSync } from '@/components/dev/react-grab-viewport-sync';
 import './globals.css';
 
 const inter = Inter({
@@ -66,6 +67,7 @@ export default async function RootLayout({ children }: { children: React.ReactNo
       >
         {children}
         <Toaster richColors position="top-right" />
+        {process.env.NODE_ENV === 'development' && <ReactGrabViewportSync />}
       </body>
     </html>
   );
215
src/components/admin/duplicates/duplicates-review-queue.tsx
Normal file
215
src/components/admin/duplicates/duplicates-review-queue.tsx
Normal file
@@ -0,0 +1,215 @@
'use client';

import { useState } from 'react';
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import { ArrowRight, GitMerge, X } from 'lucide-react';
import { toast } from 'sonner';

import { Button } from '@/components/ui/button';
import { PageHeader } from '@/components/shared/page-header';
import { EmptyState } from '@/components/shared/empty-state';
import { Skeleton } from '@/components/ui/skeleton';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';

interface CandidatePair {
  id: string;
  score: number;
  reasons: string[];
  createdAt: string;
  clientA: { id: string; fullName: string; createdAt: string };
  clientB: { id: string; fullName: string; createdAt: string };
}

/**
 * Admin review queue for the dedup background scoring job.
 *
 * Lists every pending merge candidate (pairs where score >=
 * `dedup_review_queue_threshold`). For each pair the admin can:
 * - Pick a winner via the side-by-side card → confirms a merge
 * - Dismiss → removes from the queue (a future score increase
 *   re-creates the pair on the next scoring run)
 *
 * Only minimal merge UI here: the user picks which side is the winner
 * (no per-field choice), and the loser archives. A richer side-by-side
 * field-merge dialog is a future enhancement.
 */
export function DuplicatesReviewQueue() {
  const queryClient = useQueryClient();

  const { data, isLoading } = useQuery<{ data: CandidatePair[] }>({
    queryKey: ['admin', 'duplicates'],
    queryFn: () => apiFetch<{ data: CandidatePair[] }>('/api/v1/admin/duplicates'),
  });

  const pairs = data?.data ?? [];

  return (
    <div className="space-y-4">
      <PageHeader
        title="Duplicate clients"
        description={
          pairs.length === 0
            ? 'No pending pairs to review.'
            : `${pairs.length} pair${pairs.length === 1 ? '' : 's'} flagged for review.`
        }
      />

      {isLoading ? (
        <div className="space-y-3">
          {[0, 1, 2].map((i) => (
            <Skeleton key={i} className="h-32 w-full" />
          ))}
        </div>
      ) : pairs.length === 0 ? (
        <EmptyState
          title="All clear"
          description="The background scoring job hasn't surfaced any potential duplicates yet."
        />
      ) : (
        <ul className="space-y-3">
          {pairs.map((pair) => (
            <li key={pair.id}>
              <CandidateRow pair={pair} queryClient={queryClient} />
            </li>
          ))}
        </ul>
      )}
    </div>
  );
}

function CandidateRow({
  pair,
  queryClient,
}: {
  pair: CandidatePair;
  queryClient: ReturnType<typeof useQueryClient>;
}) {
  const [busy, setBusy] = useState<'merge' | 'dismiss' | null>(null);
  const [winnerId, setWinnerId] = useState<string>(pair.clientA.id);

  const mergeMutation = useMutation({
    mutationFn: () =>
      apiFetch(`/api/v1/admin/duplicates/${pair.id}/merge`, {
        method: 'POST',
        body: { winnerId },
      }),
    onSuccess: () => {
      const loserName =
        winnerId === pair.clientA.id ? pair.clientB.fullName : pair.clientA.fullName;
      const winnerName =
        winnerId === pair.clientA.id ? pair.clientA.fullName : pair.clientB.fullName;
      toast.success(`Merged "${loserName}" into "${winnerName}"`);
      queryClient.invalidateQueries({ queryKey: ['admin', 'duplicates'] });
      queryClient.invalidateQueries({ queryKey: ['clients'] });
    },
    onError: (err) => toast.error(err instanceof Error ? err.message : 'Merge failed'),
    onSettled: () => setBusy(null),
  });

  const dismissMutation = useMutation({
    mutationFn: () => apiFetch(`/api/v1/admin/duplicates/${pair.id}/dismiss`, { method: 'POST' }),
    onSuccess: () => {
      toast.message('Dismissed');
      queryClient.invalidateQueries({ queryKey: ['admin', 'duplicates'] });
    },
    onError: (err) => toast.error(err instanceof Error ? err.message : 'Dismiss failed'),
    onSettled: () => setBusy(null),
  });

  return (
    <div className="rounded-lg border bg-card p-4">
      <div className="mb-3 flex items-baseline justify-between gap-3">
        <div>
          <span className="rounded-full bg-muted px-2 py-0.5 text-[10px] font-medium uppercase tracking-wide text-muted-foreground">
            score {pair.score}
          </span>{' '}
          <span className="text-xs text-muted-foreground">{pair.reasons.join(' · ')}</span>
        </div>
        <span className="text-xs text-muted-foreground">
          flagged {new Date(pair.createdAt).toLocaleDateString()}
        </span>
      </div>

      <div className="grid gap-3 sm:grid-cols-[1fr_auto_1fr]">
        <ClientCard
          client={pair.clientA}
          isSelected={winnerId === pair.clientA.id}
          onSelect={() => setWinnerId(pair.clientA.id)}
        />
        <div className="flex items-center justify-center text-muted-foreground">
          <ArrowRight className="size-4" aria-hidden />
        </div>
        <ClientCard
          client={pair.clientB}
          isSelected={winnerId === pair.clientB.id}
          onSelect={() => setWinnerId(pair.clientB.id)}
        />
      </div>

      <div className="mt-3 flex flex-wrap items-center gap-2">
        <Button
          size="sm"
          onClick={() => {
            setBusy('merge');
            mergeMutation.mutate();
          }}
          disabled={busy !== null}
        >
          <GitMerge className="mr-1 size-3.5" aria-hidden />
          Merge into selected
        </Button>
        <Button
          size="sm"
          variant="ghost"
          onClick={() => {
            setBusy('dismiss');
            dismissMutation.mutate();
          }}
          disabled={busy !== null}
        >
          <X className="mr-1 size-3.5" aria-hidden />
          Dismiss
        </Button>
        <p className="text-xs text-muted-foreground">
          The unselected card becomes the loser; its interests + contacts move to the selected
          client and the original is archived.
        </p>
      </div>
    </div>
  );
}

function ClientCard({
  client,
  isSelected,
  onSelect,
}: {
  client: CandidatePair['clientA'];
  isSelected: boolean;
  onSelect: () => void;
}) {
  return (
    <button
      type="button"
      onClick={onSelect}
      className={cn(
        'rounded-md border p-3 text-left transition-colors',
        isSelected
          ? 'border-primary bg-primary/5 ring-1 ring-primary/30'
          : 'border-border hover:bg-muted/40',
      )}
    >
      <p className="text-sm font-medium">{client.fullName}</p>
      <p className="mt-0.5 text-[11px] text-muted-foreground">
        Created {new Date(client.createdAt).toLocaleDateString()}
      </p>
      {isSelected ? (
        <span className="mt-1 inline-block rounded-full bg-primary/10 px-1.5 py-0.5 text-[10px] font-semibold text-primary">
          KEEP
        </span>
      ) : null}
    </button>
  );
}
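The merge toast above resolves the winner and loser display names from the selected `winnerId`. A minimal standalone sketch of that resolution logic (the `PairNames` shape is a hypothetical subset mirroring `CandidatePair`, not an export of this repo):

```typescript
// Hypothetical subset of the CandidatePair shape used above.
interface PairNames {
  clientA: { id: string; fullName: string };
  clientB: { id: string; fullName: string };
}

// Given the selected winnerId, return the names that would go into the
// success toast: the unselected side is the loser.
function mergeToastNames(
  pair: PairNames,
  winnerId: string,
): { winnerName: string; loserName: string } {
  const aWins = winnerId === pair.clientA.id;
  return {
    winnerName: aWins ? pair.clientA.fullName : pair.clientB.fullName,
    loserName: aWins ? pair.clientB.fullName : pair.clientA.fullName,
  };
}
```

Keeping this as a pure function of `(pair, winnerId)` makes the toast wording trivially unit-testable without mounting the component.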
@@ -241,7 +241,14 @@ export function SettingsManager() {
        </CardHeader>
        <CardContent className="space-y-4">
          {KNOWN_SETTINGS.filter((s) => s.type === 'string').map((setting) => (
            <div key={setting.key} className="flex items-center justify-between gap-4">
            <div
              key={setting.key}
              // Stack label/description above the input on phone widths.
              // The previous flex row crushed the label column into a
              // narrow vertical stripe ("Inquiry / Contact / Email" wrapping
              // one word per line) while the input took the rest.
              className="flex flex-col gap-2 sm:flex-row sm:items-center sm:justify-between sm:gap-4"
            >
              <div className="flex-1">
                <Label>{setting.label}</Label>
                <p className="text-xs text-muted-foreground">{setting.description}</p>
@@ -249,7 +256,7 @@ export function SettingsManager() {
              <div className="flex items-center gap-2">
                <Input
                  type="text"
                  className="w-64"
                  className="w-full sm:w-64"
                  value={String(getEffectiveValue(setting.key, setting.defaultValue) ?? '')}
                  onChange={(e) =>
                    setValues((prev) => ({
@@ -283,7 +290,10 @@ export function SettingsManager() {
        </CardHeader>
        <CardContent className="space-y-4">
          {KNOWN_SETTINGS.filter((s) => s.type === 'number').map((setting) => (
            <div key={setting.key} className="flex items-center justify-between gap-4">
            <div
              key={setting.key}
              className="flex flex-col gap-2 sm:flex-row sm:items-center sm:justify-between sm:gap-4"
            >
              <div className="flex-1">
                <Label>{setting.label}</Label>
                <p className="text-xs text-muted-foreground">{setting.description}</p>
@@ -291,7 +301,7 @@ export function SettingsManager() {
              <div className="flex items-center gap-2">
                <Input
                  type="number"
                  className="w-24"
                  className="w-full sm:w-24"
                  value={String(getEffectiveValue(setting.key, setting.defaultValue) ?? '')}
                  onChange={(e) =>
                    setValues((prev) => ({

@@ -22,7 +22,11 @@ export function AlertRail() {
    <section
      data-testid="alert-rail"
      aria-label="Active alerts"
      className="flex h-full flex-col gap-3"
      // `h-full` is intentional only at xl: where the parent dashboard grid
      // gives this rail a sibling column whose height it should match. On
      // mobile (single-column stack) there's no fixed-height context, so
      // forcing 100% height makes the section overflow / look stretched.
      className="flex flex-col gap-3 xl:h-full"
    >
      <div className="flex items-baseline justify-between">
        <h2 className="text-sm font-semibold tracking-tight">Alerts</h2>

@@ -178,7 +178,10 @@ export function BerthDetailHeader({ berth }: BerthDetailHeaderProps) {
  return (
    <>
      <DetailHeaderStrip>
        <div className="flex items-start gap-4">
        {/* Stacks vertically on phone widths so the action buttons don't
            squeeze the area subtitle into a two-line wrap. From sm up the
            title/area block sits side-by-side with the action buttons. */}
        <div className="flex flex-col gap-3 sm:flex-row sm:items-start sm:gap-4">
          <div className="flex-1 min-w-0">
            <div className="flex items-center gap-3 flex-wrap">
              <h1 className="hidden sm:block text-2xl font-bold text-foreground">
@@ -193,7 +196,7 @@ export function BerthDetailHeader({ berth }: BerthDetailHeaderProps) {
            {berth.area && <p className="text-muted-foreground mt-1">{berth.area}</p>}
          </div>

          <div className="flex flex-wrap items-center gap-2 shrink-0">
          <div className="flex flex-wrap items-center gap-2 sm:shrink-0">
            <PermissionGate resource="berths" action="edit">
              <Button variant="outline" size="sm" onClick={() => setStatusOpen(true)}>
                <RefreshCw className="mr-1.5 h-4 w-4" />

@@ -48,10 +48,13 @@ type BerthData = {

function SpecRow({ label, value }: { label: string; value: React.ReactNode }) {
  if (!value && value !== 0 && value !== false) return null;
  // Mobile-first: stack vertically with label on top so long values
  // (e.g. "206.69 ft / 62.99 m") never clip at the right edge.
  // From `sm` (>=640px) up: switch to the original two-column layout.
  return (
    <div className="flex justify-between py-2 text-sm">
    <div className="flex flex-col gap-0.5 py-2 text-sm sm:flex-row sm:items-baseline sm:justify-between sm:gap-3">
      <span className="text-muted-foreground">{label}</span>
      <span className="font-medium text-right max-w-[60%]">{value}</span>
      <span className="font-medium sm:max-w-[60%] sm:text-right">{value}</span>
    </div>
  );
}

@@ -25,6 +25,8 @@ export interface ClientRow {
  createdAt: string;
  yachtCount?: number;
  companyCount?: number;
  interestCount?: number;
  latestInterest?: { stage: string; mooringNumber: string | null } | null;
  contacts?: Array<{ channel: string; value: string; isPrimary: boolean }>;
  tags?: Array<{ id: string; name: string; color: string }>;
}

@@ -2,7 +2,8 @@

import { useState } from 'react';
import { useMutation, useQueryClient } from '@tanstack/react-query';
import { Archive, RotateCcw, Mail, Phone } from 'lucide-react';
import { Archive, Mail, MessageCircle, Phone, RotateCcw } from 'lucide-react';
import { format } from 'date-fns';

import { Button } from '@/components/ui/button';
import { Badge } from '@/components/ui/badge';
@@ -12,31 +13,28 @@ import { DetailHeaderStrip } from '@/components/shared/detail-header-strip';
import { PortalInviteButton } from '@/components/clients/portal-invite-button';
import { GdprExportButton } from '@/components/clients/gdpr-export-button';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';
import { getCountryName } from '@/lib/i18n/countries';

interface ClientDetailHeaderProps {
  client: {
    id: string;
    fullName: string;
    nationality?: string | null;
    preferredContactMethod?: string | null;
    preferredLanguage?: string | null;
    timezone?: string | null;
    source?: string | null;
    sourceDetails?: string | null;
    nationalityIso?: string | null;
    archivedAt?: string | null;
    contacts?: Array<{ channel: string; value: string; isPrimary: boolean; label?: string | null }>;
    createdAt?: string;
    contacts?: Array<{
      channel: string;
      value: string;
      valueE164?: string | null;
      isPrimary: boolean;
      label?: string | null;
    }>;
    tags?: Array<{ id: string; name: string; color: string }>;
    clientPortalEnabled?: boolean;
  };
}

const SOURCE_LABELS: Record<string, string> = {
  website: 'Website',
  manual: 'Manual',
  referral: 'Referral',
  broker: 'Broker',
};

export function ClientDetailHeader({ client }: ClientDetailHeaderProps) {
  const queryClient = useQueryClient();
  const [archiveOpen, setArchiveOpen] = useState(false);
@@ -62,19 +60,34 @@ export function ClientDetailHeader({ client }: ClientDetailHeaderProps) {
  });

  const primaryEmail =
    client.contacts?.find((c) => c.channel === 'email' && c.isPrimary) ??
    client.contacts?.find((c) => c.channel === 'email');
  const primaryPhone =
    client.contacts?.find((c) => c.channel === 'email' && c.isPrimary)?.value ??
    client.contacts?.find((c) => c.channel === 'email')?.value;
  const primaryPhoneContact =
    client.contacts?.find((c) => c.channel === 'phone' && c.isPrimary) ??
    client.contacts?.find((c) => c.channel === 'phone');
  const primaryPhone = primaryPhoneContact?.value;
  // wa.me requires the E.164 number without the leading "+". Strip from the
  // canonical E.164 form when available; otherwise strip non-digits from the
  // display value as a best-effort fallback.
  const whatsappNumber = primaryPhoneContact?.valueE164
    ? primaryPhoneContact.valueE164.replace(/^\+/, '')
    : primaryPhoneContact?.value
      ? primaryPhoneContact.value.replace(/[^\d]/g, '')
      : null;
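The `whatsappNumber` derivation above can be sketched as a standalone helper with the same two-tier fallback (the `toWaMeNumber` name is illustrative, not part of the codebase):

```typescript
// wa.me links need the international number as bare digits. Prefer the
// canonical E.164 value (drop only the leading "+"); otherwise strip every
// non-digit character from the free-form display value as a fallback.
function toWaMeNumber(valueE164: string | null, displayValue: string | null): string | null {
  if (valueE164) return valueE164.replace(/^\+/, '');
  if (displayValue) return displayValue.replace(/[^\d]/g, '');
  return null;
}
```

Note the fallback is lossy on purpose: a display value without a country code produces digits wa.me cannot route, which is why the E.164 form is preferred when stored.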
  const country = client.nationalityIso ? getCountryName(client.nationalityIso, 'en') : null;
  const addedLabel = client.createdAt
    ? `Added ${format(new Date(client.createdAt), 'MMM d, yyyy')}`
    : null;
  const meta = [country, addedLabel].filter(Boolean) as string[];

  return (
    <>
      <DetailHeaderStrip>
        <div className="flex items-start gap-3 flex-wrap">
        <div className="flex-1 min-w-0">
        <div className="flex items-start gap-3">
          <div className="min-w-0 flex-1 space-y-2">
            <div className="flex items-center gap-2 flex-wrap">
              <h1 className="hidden sm:block text-2xl font-bold text-foreground truncate">
              <h1 className="truncate text-lg font-bold text-foreground sm:text-2xl">
                {client.fullName}
              </h1>
              {isArchived && (
@@ -84,31 +97,71 @@ export function ClientDetailHeader({ client }: ClientDetailHeaderProps) {
              )}
            </div>

            <div className="flex items-center gap-4 mt-2 flex-wrap text-sm text-muted-foreground">
              {client.source && (
                <span>
                  Source:{' '}
                  <span className="text-foreground">
                    {SOURCE_LABELS[client.source] ?? client.source}
                  </span>
                </span>
              )}
              {primaryEmail && (
                <span className="flex items-center gap-1">
                  <Mail className="h-3.5 w-3.5" />
                  {primaryEmail.value}
                </span>
              )}
              {primaryPhone && (
                <span className="flex items-center gap-1">
                  <Phone className="h-3.5 w-3.5" />
                  {primaryPhone.value}
                </span>
              )}
            {meta.length > 0 ? (
              <p className="text-xs text-muted-foreground sm:text-sm">{meta.join(' · ')}</p>
            ) : null}

            <div className="flex flex-wrap items-center gap-1.5 pt-1">
              {primaryEmail ? (
                <Button
                  asChild
                  variant="outline"
                  size="sm"
                  className="h-8 gap-1.5 px-2.5 [&_svg]:size-3.5"
                >
                  <a href={`mailto:${primaryEmail}`} aria-label={`Email ${primaryEmail}`}>
                    <Mail />
                    Email
                  </a>
                </Button>
              ) : null}
              {primaryPhone ? (
                <Button
                  asChild
                  variant="outline"
                  size="sm"
                  className="h-8 gap-1.5 px-2.5 [&_svg]:size-3.5"
                >
                  <a href={`tel:${primaryPhone}`} aria-label={`Call ${primaryPhone}`}>
                    <Phone />
                    Call
                  </a>
                </Button>
              ) : null}
              {whatsappNumber ? (
                <Button
                  asChild
                  variant="outline"
                  size="sm"
                  className="h-8 gap-1.5 px-2.5 [&_svg]:size-3.5"
                >
                  <a
                    href={`https://wa.me/${whatsappNumber}`}
                    target="_blank"
                    rel="noopener noreferrer"
                    aria-label={`Message ${primaryPhone} on WhatsApp`}
                  >
                    <MessageCircle />
                    WhatsApp
                  </a>
                </Button>
              ) : null}
              {!isArchived && client.clientPortalEnabled !== false ? (
                <div className="hidden sm:inline-flex">
                  <PortalInviteButton
                    clientId={client.id}
                    clientName={client.fullName}
                    defaultEmail={primaryEmail}
                  />
                </div>
              ) : null}
              <div className="hidden sm:inline-flex">
                <GdprExportButton clientId={client.id} />
              </div>
            </div>

            {client.tags && client.tags.length > 0 && (
              <div className="flex flex-wrap gap-1 mt-2">
              <div className="flex flex-wrap gap-1">
                {client.tags.map((tag) => (
                  <TagBadge key={tag.id} name={tag.name} color={tag.color} />
                ))}
@@ -116,34 +169,21 @@ export function ClientDetailHeader({ client }: ClientDetailHeaderProps) {
            )}
          </div>

          {/* Actions */}
          <div className="flex flex-wrap items-center gap-2">
            {!isArchived && client.clientPortalEnabled !== false && (
              <PortalInviteButton
                clientId={client.id}
                clientName={client.fullName}
                defaultEmail={primaryEmail?.value}
              />
            {/* Top-right: archive/restore as a small icon button — destructive
                action sits out of the primary action flow. */}
            <button
              type="button"
              onClick={() => setArchiveOpen(true)}
              aria-label={isArchived ? 'Restore client' : 'Archive client'}
              title={isArchived ? 'Restore client' : 'Archive client'}
              className={cn(
                'shrink-0 rounded-md p-1.5 text-muted-foreground/70 transition-colors',
                'hover:bg-foreground/5 hover:text-foreground',
                isArchived ? 'hover:text-foreground' : 'hover:text-destructive',
              )}
            <GdprExportButton clientId={client.id} />
            <Button
              variant={isArchived ? 'outline' : 'outline'}
              size="sm"
              onClick={() => setArchiveOpen(true)}
            >
              {isArchived ? (
                <>
                  <RotateCcw className="mr-1.5 h-3.5 w-3.5" />
                  Restore
                </>
              ) : (
                <>
                  <Archive className="mr-1.5 h-3.5 w-3.5" />
                  Archive
                </>
              )}
            </Button>
          </div>
            >
              {isArchived ? <RotateCcw className="size-4" /> : <Archive className="size-4" />}
            </button>
        </div>
      </DetailHeaderStrip>

@@ -29,6 +29,8 @@ interface ClientData {
    id: string;
    channel: string;
    value: string;
    valueE164: string | null;
    valueCountry: string | null;
    label: string | null;
    isPrimary: boolean;
    notes: string | null;

@@ -23,6 +23,7 @@ import { TagPicker } from '@/components/shared/tag-picker';
import { CountryCombobox } from '@/components/shared/country-combobox';
import { TimezoneCombobox } from '@/components/shared/timezone-combobox';
import { PhoneInput } from '@/components/shared/phone-input';
import { DedupSuggestionPanel } from '@/components/clients/dedup-suggestion-panel';
import { apiFetch } from '@/lib/api/client';
import { createClientSchema, type CreateClientInput } from '@/lib/validators/clients';
import type { CountryCode } from '@/lib/i18n/countries';
@@ -30,6 +31,12 @@ import type { CountryCode } from '@/lib/i18n/countries';
interface ClientFormProps {
  open: boolean;
  onOpenChange: (open: boolean) => void;
  /** Optional callback fired when the dedup suggestion panel reports
   * the user picked an existing client. The form closes; parent is
   * responsible for navigating to the existing client's detail page
   * or opening the create-interest dialog pre-filled with that
   * clientId. Skipped in edit mode. */
  onUseExistingClient?: (clientId: string) => void;
  /** If provided, form is in edit mode */
  client?: {
    id: string;
@@ -53,7 +60,7 @@ interface ClientFormProps {
  };
}

export function ClientForm({ open, onOpenChange, client }: ClientFormProps) {
export function ClientForm({ open, onOpenChange, client, onUseExistingClient }: ClientFormProps) {
  const queryClient = useQueryClient();
  const isEdit = !!client;

@@ -143,6 +150,26 @@ export function ClientForm({ open, onOpenChange, client }: ClientFormProps) {
        </SheetHeader>

        <form onSubmit={handleSubmit((data) => mutation.mutate(data))} className="space-y-6 py-6">
          {/* Dedup suggestion — only on the create path. Watches the
              live form values for email / phone / name and surfaces
              an existing client when one matches. The user can
              attach the new interest to that client instead of
              creating a duplicate. */}
          {!isEdit ? (
            <DedupSuggestionPanel
              email={watch('contacts')?.find((c) => c?.channel === 'email')?.value ?? null}
              phone={
                watch('contacts')?.find((c) => c?.channel === 'phone' || c?.channel === 'whatsapp')
                  ?.valueE164 ?? null
              }
              name={watch('fullName') ?? null}
              onUseExisting={(match) => {
                onUseExistingClient?.(match.clientId);
                onOpenChange(false);
              }}
            />
          ) : null}

          {/* Basic Info */}
          <div className="space-y-4">
            <h3 className="text-sm font-medium text-muted-foreground uppercase tracking-wide">
@@ -339,10 +366,6 @@ export function ClientForm({ open, onOpenChange, client }: ClientFormProps) {
              </SelectContent>
            </Select>
          </div>
          <div className="space-y-1">
            <Label>Preferred Language</Label>
            <Input {...register('preferredLanguage')} placeholder="English" />
          </div>
          <div className="space-y-1">
            <Label>Timezone</Label>
            <TimezoneCombobox

460 src/components/clients/client-interests-tab.tsx Normal file
@@ -0,0 +1,460 @@
'use client';

import { useState } from 'react';
import Link from 'next/link';
import { useParams } from 'next/navigation';
import type { Route } from 'next';
import { useQuery } from '@tanstack/react-query';
import { format, formatDistanceToNowStrict } from 'date-fns';
import { ArrowRight, CheckCircle2, ChevronRight, Circle, Plus } from 'lucide-react';

import { Button } from '@/components/ui/button';
import { EmptyState } from '@/components/shared/empty-state';
import { Skeleton } from '@/components/ui/skeleton';
import { Drawer, DrawerContent, DrawerHeader, DrawerTitle } from '@/components/shared/drawer';
import { apiFetch } from '@/lib/api/client';
import { PIPELINE_STAGES, type PipelineStage } from '@/lib/constants';
import { cn } from '@/lib/utils';

import { STAGE_BADGE, STAGE_LABELS, safeStage } from '@/components/clients/pipeline-constants';
import {
  StageStepper,
  useClientInterests,
  type ClientInterestRow,
} from '@/components/clients/client-pipeline-summary';
import { InterestForm } from '@/components/interests/interest-form';

const LEAD_CATEGORY_LABELS: Record<string, string> = {
  general_interest: 'General interest',
  specific_qualified: 'Specific qualified',
  hot_lead: 'Hot lead',
};

function InterestRowItem({
  interest,
  onOpen,
}: {
  interest: ClientInterestRow;
  onOpen: (i: ClientInterestRow) => void;
}) {
  const stage = safeStage(interest.pipelineStage);

  const berthLabel = interest.berthMooringNumber
    ? `Berth ${interest.berthMooringNumber}`
    : 'General interest';

  const yachtLabel = interest.yachtName ?? null;

  return (
    // Tap opens a bottom-sheet preview drawer rather than navigating to the
    // full interest page. The drawer covers ~80% of mobile interactions
    // ("what stage is this at, when did we last touch it"). For deeper
    // edits the drawer has an "Open full page" CTA.
    <button
      type="button"
      onClick={() => onOpen(interest)}
      className={cn(
        'group block w-full rounded-xl border border-border bg-card p-4 text-left shadow-sm transition-all',
        'hover:border-border/70 hover:shadow-md',
      )}
    >
      <div className="flex items-start justify-between gap-3">
        <div className="min-w-0 flex-1">
          <div className="flex items-center gap-2 flex-wrap">
            <h3 className="truncate text-base font-semibold tracking-tight text-foreground">
              {berthLabel}
            </h3>
            <span
              className={cn(
                'shrink-0 rounded-full px-2 py-0.5 text-[11px] font-medium',
                STAGE_BADGE[stage],
              )}
            >
              {STAGE_LABELS[stage]}
            </span>
          </div>
          {yachtLabel ? (
            <p className="mt-0.5 truncate text-xs text-muted-foreground">{yachtLabel}</p>
          ) : null}
        </div>
        <ChevronRight className="size-4 shrink-0 text-muted-foreground transition-transform group-hover:translate-x-0.5" />
      </div>

      <div className="mt-3">
        <StageStepper current={stage} />
      </div>
    </button>
  );
}

function lastActivityFor(interest: ClientInterestRow): string | null {
  const candidates = [interest.dateLastContact, interest.updatedAt]
    .filter((v): v is string => Boolean(v))
    .map((v) => new Date(v).getTime())
    .filter((t) => !Number.isNaN(t));
  if (candidates.length === 0) return null;
  return `${formatDistanceToNowStrict(new Date(Math.max(...candidates)))} ago`;
}
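`lastActivityFor` reduces a handful of nullable timestamps to the most recent valid one before formatting. A dependency-free sketch of that reduction (no date-fns; `mostRecent` is an illustrative name, not a repo export):

```typescript
// Pick the most recent parseable timestamp out of a nullable candidate list.
// Nulls and unparseable strings are dropped; an empty result yields null.
function mostRecent(candidates: Array<string | null>): Date | null {
  const times = candidates
    .filter((v): v is string => Boolean(v))
    .map((v) => new Date(v).getTime())
    .filter((t) => !Number.isNaN(t));
  return times.length > 0 ? new Date(Math.max(...times)) : null;
}
```

Filtering NaN timestamps first means a malformed `updatedAt` from seed data degrades to "no activity" instead of rendering "Invalid Date" in the UI.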
/** Full interest record returned by `/api/v1/interests/[id]`. Only the fields
 * the drawer actually reads are typed here; the API returns more. */
interface InterestDetail {
  id: string;
  pipelineStage: string;
  leadCategory: string | null;
  source: string | null;
  notes: string | null;
  dateLastContact: string | null;
  dateEoiSent: string | null;
  dateEoiSigned: string | null;
  dateDepositReceived: string | null;
  dateContractSent: string | null;
  dateContractSigned: string | null;
}

function useInterestDetail(id: string | null) {
  return useQuery<{ data: InterestDetail }>({
    queryKey: ['interest-detail-drawer', id],
    queryFn: () => apiFetch<{ data: InterestDetail }>(`/api/v1/interests/${id}`),
    enabled: id !== null,
    // Detail rarely changes during a single drawer-open session; stale-time
    // keeps re-opens snappy without preventing background refetch.
    staleTime: 30_000,
  });
}

/** Format a date-only or ISO timestamp as e.g. "Apr 8, 2026". Returns null for
 * empty input so callers can render an "empty" state. */
function formatDate(value: string | null | undefined): string | null {
  if (!value) return null;
  const d = new Date(value);
  if (Number.isNaN(d.getTime())) return null;
  return format(d, 'MMM d, yyyy');
}

/** A single milestone row inside the drawer's milestone summary. Filled
 * circle when the step is done, hollow when pending. Trailing meta line
 * shows the date stamp or a "pending" hint. */
function MilestoneRow({
  label,
  done,
  date,
  hint = 'pending',
}: {
  label: string;
  done: boolean;
  date: string | null;
  hint?: string;
}) {
  return (
    <li className="flex items-center gap-2 py-1">
      {done ? (
        <CheckCircle2 className="size-4 shrink-0 text-emerald-600" aria-hidden />
      ) : (
        <Circle className="size-4 shrink-0 text-muted-foreground/40" aria-hidden />
      )}
      <span className={cn('flex-1 text-sm', done ? 'text-foreground' : 'text-muted-foreground')}>
        {label}
      </span>
      <span className="text-xs text-muted-foreground tabular-nums">{date ?? hint}</span>
    </li>
  );
}

/**
 * Bottom-sheet preview of a single interest. Designed for the mobile
 * "tap an interest → see what's happening without leaving the client
 * page" flow. Shows the pipeline progress, a compact milestone summary
 * (EOI / Deposit / Contract), lead context, last contact, and a notes
 * teaser. Tap-out / drag-down dismisses; the full edit page is one tap
 * away via "Open full page →".
 */
function InterestPreviewDrawer({
  interest,
  portSlug,
  onClose,
}: {
  interest: ClientInterestRow | null;
  portSlug: string;
  onClose: () => void;
}) {
  // Pin the most recently selected interest so the drawer stays populated
  // during the close-animation tail (Vaul keeps the content mounted ~250ms
  // after `open=false`). Conditional setState is safe here — the guard
  // ensures it only fires when the prop actually changes to a new row.
  const [pinned, setPinned] = useState<ClientInterestRow | null>(interest);
  if (interest && interest !== pinned) setPinned(interest);
  const showing = pinned;

  const detail = useInterestDetail(showing?.id ?? null);
  const fullDetail = detail.data?.data ?? null;

  const open = interest !== null;
  const stage = showing ? safeStage(showing.pipelineStage) : null;
  const stageIdx = stage ? PIPELINE_STAGES.indexOf(stage) : -1;
  const reached = (target: PipelineStage) =>
    stageIdx !== -1 && PIPELINE_STAGES.indexOf(target) <= stageIdx;
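The `reached` helper above treats the pipeline as an ordered list: a target milestone counts as done once the current stage's index is at or past the target's index. A self-contained sketch of that ordering check (the stage values here are hypothetical; the real ones come from `PIPELINE_STAGES` in `@/lib/constants`):

```typescript
// Hypothetical stage order, earliest to latest. The repo imports the real
// list from '@/lib/constants'.
const STAGES = ['inquiry', 'eoi_sent', 'deposit', 'contract_signed'] as const;
type Stage = (typeof STAGES)[number];

// A target milestone is "reached" when its position in the ordered pipeline
// is at or before the current stage's position.
function reached(current: Stage, target: Stage): boolean {
  return STAGES.indexOf(target) <= STAGES.indexOf(current);
}
```

Deriving done-state from ordering rather than per-step date fields is what lets seed data without milestone timestamps still render a sensible checklist.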
  const berthLabel = showing
    ? showing.berthMooringNumber
      ? `Berth ${showing.berthMooringNumber}`
      : 'General interest'
    : '';
  const yachtLabel = showing?.yachtName ?? null;
  const activity = showing ? lastActivityFor(showing) : null;
  const fullHref = showing ? (`/${portSlug}/interests/${showing.id}` as Route) : ('/' as Route);

  const leadLabel = fullDetail?.leadCategory
    ? (LEAD_CATEGORY_LABELS[fullDetail.leadCategory] ?? fullDetail.leadCategory)
    : null;
  const sourceLabel = fullDetail?.source
    ? fullDetail.source.replace(/\b\w/g, (m) => m.toUpperCase())
    : null;
  const lastContactDate = formatDate(fullDetail?.dateLastContact);
  const notesPreview = fullDetail?.notes?.trim() || null;

  return (
    <Drawer
      open={open}
      onOpenChange={(next) => {
        if (!next) onClose();
      }}
    >
      <DrawerContent className="max-h-[85vh]">
        <DrawerHeader>
          <div className="flex items-start justify-between gap-3">
            <div className="min-w-0 flex-1">
              <DrawerTitle className="truncate">{berthLabel}</DrawerTitle>
              {yachtLabel ? (
                <p className="mt-0.5 truncate text-sm text-muted-foreground">{yachtLabel}</p>
              ) : null}
            </div>
            {stage ? (
              <span
                className={cn(
                  'shrink-0 rounded-full px-2.5 py-1 text-xs font-medium',
                  STAGE_BADGE[stage],
                )}
              >
                {STAGE_LABELS[stage]}
              </span>
            ) : null}
          </div>
        </DrawerHeader>

        <div className="space-y-5 overflow-y-auto px-4 pb-4">
          {/* Pipeline-stepper segmented bar — the same primitive used on the
              row card, so the at-a-glance progress hint is consistent
              across surfaces. */}
          {stage ? (
            <div>
              <p className="mb-1.5 text-xs font-medium uppercase tracking-wide text-muted-foreground">
                Pipeline progress
              </p>
              <StageStepper current={stage} />
            </div>
          ) : null}

          {/* Milestones — three sections matching the full interest detail
              page (EOI / Deposit / Contract). Done-state is derived from
              the pipeline stage so seed data without per-step dates still
              renders correctly. The full milestone columns + per-step
              actions live behind "Open full page". */}
          <section>
            <p className="mb-2 text-xs font-medium uppercase tracking-wide text-muted-foreground">
              Milestones
            </p>
            <div className="space-y-3">
              <div className="rounded-lg border border-border bg-card/50 p-3">
                <p className="mb-1 text-sm font-semibold">EOI</p>
                <ul>
                  <MilestoneRow
                    label="EOI sent"
|
||||
done={reached('eoi_sent') || !!fullDetail?.dateEoiSent}
|
||||
date={formatDate(fullDetail?.dateEoiSent)}
|
||||
/>
|
||||
<MilestoneRow
|
||||
label="EOI signed"
|
||||
done={reached('eoi_signed') || !!fullDetail?.dateEoiSigned}
|
||||
date={formatDate(fullDetail?.dateEoiSigned)}
|
||||
/>
|
||||
</ul>
|
||||
</div>
|
||||
<div className="rounded-lg border border-border bg-card/50 p-3">
|
||||
<p className="mb-1 text-sm font-semibold">Deposit</p>
|
||||
<ul>
|
||||
<MilestoneRow
|
||||
label="Deposit received"
|
||||
done={reached('deposit_10pct') || !!fullDetail?.dateDepositReceived}
|
||||
date={formatDate(fullDetail?.dateDepositReceived)}
|
||||
/>
|
||||
</ul>
|
||||
</div>
|
||||
<div className="rounded-lg border border-border bg-card/50 p-3">
|
||||
<p className="mb-1 text-sm font-semibold">Contract</p>
|
||||
<ul>
|
||||
<MilestoneRow
|
||||
label="Contract sent"
|
||||
done={reached('contract_sent') || !!fullDetail?.dateContractSent}
|
||||
date={formatDate(fullDetail?.dateContractSent)}
|
||||
/>
|
||||
<MilestoneRow
|
||||
label="Contract signed"
|
||||
done={reached('contract_signed') || !!fullDetail?.dateContractSigned}
|
||||
date={formatDate(fullDetail?.dateContractSigned)}
|
||||
/>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
{/* Compact key/value pairs — lead category, source, last contact,
|
||||
activity. Each row collapses cleanly when its value is
|
||||
missing so the drawer scales from sparse seed data to full
|
||||
records without empty placeholders. */}
|
||||
<dl className="grid grid-cols-[max-content_1fr] gap-x-4 gap-y-1.5 text-sm">
|
||||
{leadLabel ? (
|
||||
<>
|
||||
<dt className="text-muted-foreground">Lead</dt>
|
||||
<dd className="text-right font-medium">{leadLabel}</dd>
|
||||
</>
|
||||
) : null}
|
||||
{sourceLabel ? (
|
||||
<>
|
||||
<dt className="text-muted-foreground">Source</dt>
|
||||
<dd className="text-right font-medium">{sourceLabel}</dd>
|
||||
</>
|
||||
) : null}
|
||||
{lastContactDate ? (
|
||||
<>
|
||||
<dt className="text-muted-foreground">Last contact</dt>
|
||||
<dd className="text-right font-medium">{lastContactDate}</dd>
|
||||
</>
|
||||
) : null}
|
||||
{activity ? (
|
||||
<>
|
||||
<dt className="text-muted-foreground">Last activity</dt>
|
||||
<dd className="text-right font-medium">{activity}</dd>
|
||||
</>
|
||||
) : null}
|
||||
</dl>
|
||||
|
||||
{notesPreview ? (
|
||||
<section>
|
||||
<p className="mb-1.5 text-xs font-medium uppercase tracking-wide text-muted-foreground">
|
||||
Notes
|
||||
</p>
|
||||
<p className="line-clamp-3 text-sm text-foreground/90 whitespace-pre-wrap">
|
||||
{notesPreview}
|
||||
</p>
|
||||
</section>
|
||||
) : null}
|
||||
|
||||
<Button asChild className="w-full" size="lg">
|
||||
<Link href={fullHref}>
|
||||
Open full page
|
||||
<ArrowRight className="ml-1.5 size-4" aria-hidden />
|
||||
</Link>
|
||||
</Button>
|
||||
</div>
|
||||
</DrawerContent>
|
||||
</Drawer>
|
||||
);
|
||||
}
|
||||
|
||||
function InterestSkeleton() {
|
||||
return (
|
||||
<div className="rounded-xl border border-border bg-card p-4 shadow-sm">
|
||||
<Skeleton className="h-4 w-32" />
|
||||
<Skeleton className="mt-2 h-3 w-24" />
|
||||
<Skeleton className="mt-3 h-2 w-48" />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
interface ClientInterestsTabProps {
|
||||
clientId: string;
|
||||
}
|
||||
|
||||
export function ClientInterestsTab({ clientId }: ClientInterestsTabProps) {
|
||||
const routeParams = useParams<{ portSlug: string }>();
|
||||
const portSlug = routeParams?.portSlug ?? '';
|
||||
const [createOpen, setCreateOpen] = useState(false);
|
||||
const [previewInterest, setPreviewInterest] = useState<ClientInterestRow | null>(null);
|
||||
|
||||
const { data, isLoading, isError } = useClientInterests(clientId);
|
||||
|
||||
if (isLoading) {
|
||||
return (
|
||||
<div className="space-y-3">
|
||||
<InterestSkeleton />
|
||||
<InterestSkeleton />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
if (isError) {
|
||||
return <p className="text-sm text-destructive">Could not load interests for this client.</p>;
|
||||
}
|
||||
|
||||
const interests = data?.data ?? [];
|
||||
|
||||
if (interests.length === 0) {
|
||||
return (
|
||||
<>
|
||||
<EmptyState
|
||||
title="No interests yet"
|
||||
description="When this client expresses interest in a berth, the sales process will appear here."
|
||||
action={{
|
||||
label: 'Add interest',
|
||||
onClick: () => setCreateOpen(true),
|
||||
}}
|
||||
/>
|
||||
<InterestForm open={createOpen} onOpenChange={setCreateOpen} defaultClientId={clientId} />
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
const active = interests.filter((i) => !i.archivedAt);
|
||||
const archived = interests.filter((i) => i.archivedAt);
|
||||
|
||||
return (
|
||||
<div className="space-y-6">
|
||||
<div className="flex justify-end">
|
||||
<Button size="sm" onClick={() => setCreateOpen(true)}>
|
||||
<Plus className="mr-1.5 size-3.5" />
|
||||
Add interest
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
{active.length > 0 ? (
|
||||
<div className="space-y-3">
|
||||
{active.map((i) => (
|
||||
<InterestRowItem key={i.id} interest={i} onOpen={setPreviewInterest} />
|
||||
))}
|
||||
</div>
|
||||
) : null}
|
||||
|
||||
{archived.length > 0 ? (
|
||||
<div className="space-y-3">
|
||||
<h4 className="text-xs font-medium uppercase tracking-wide text-muted-foreground">
|
||||
Archived
|
||||
</h4>
|
||||
<div className="space-y-3 opacity-60">
|
||||
{archived.map((i) => (
|
||||
<InterestRowItem key={i.id} interest={i} onOpen={setPreviewInterest} />
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
) : null}
|
||||
|
||||
<InterestPreviewDrawer
|
||||
interest={previewInterest}
|
||||
portSlug={portSlug}
|
||||
onClose={() => setPreviewInterest(null)}
|
||||
/>
|
||||
|
||||
<InterestForm open={createOpen} onOpenChange={setCreateOpen} defaultClientId={clientId} />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
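The `reached` helper in the drawer above derives every milestone's done-state purely from the ordering of the stage list, so records without per-step dates still render. A minimal standalone sketch of that pattern — note the stage list here is a hypothetical stand-in for the app's real `PIPELINE_STAGES` (the stage names below that appear in the component are real; `'enquiry'` is an assumed first stage):

```typescript
// Hypothetical stage ordering; the app's real PIPELINE_STAGES is defined elsewhere.
const PIPELINE_STAGES = [
  'enquiry', // assumed initial stage, not confirmed by the source
  'eoi_sent',
  'eoi_signed',
  'deposit_10pct',
  'contract_sent',
  'contract_signed',
] as const;

type PipelineStage = (typeof PIPELINE_STAGES)[number];

// A milestone counts as reached when the current stage sits at or past it
// in the ordered pipeline; an unknown/null stage (index -1) reaches nothing.
function makeReached(current: PipelineStage | null) {
  const stageIdx = current ? PIPELINE_STAGES.indexOf(current) : -1;
  return (target: PipelineStage) =>
    stageIdx !== -1 && PIPELINE_STAGES.indexOf(target) <= stageIdx;
}

const reached = makeReached('deposit_10pct');
console.log(reached('eoi_signed')); // true — an earlier stage
console.log(reached('contract_sent')); // false — a later stage
```

The upshot is that a single stage string drives all checkmarks, with explicit per-step dates only overriding toward "done" (the component ORs them in).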
@@ -9,6 +9,8 @@ import { InlineTimezoneField } from '@/components/shared/inline-timezone-field';
import { InlineTagEditor } from '@/components/shared/inline-tag-editor';
import { NotesList } from '@/components/shared/notes-list';
import type { CountryCode } from '@/lib/i18n/countries';
import { ClientInterestsTab } from '@/components/clients/client-interests-tab';
import { ClientPipelineSummary } from '@/components/clients/client-pipeline-summary';
import { ClientYachtsTab } from '@/components/clients/client-yachts-tab';
import { ClientCompaniesTab } from '@/components/clients/client-companies-tab';
import { ClientReservationsTab } from '@/components/clients/client-reservations-tab';

@@ -114,6 +116,8 @@ interface ClientTabsOptions {
      tenureType: string;
      status: string;
    }>;
    interestCount?: number;
    noteCount?: number;
    tags?: Array<{ id: string; name: string; color: string }>;
  };
}

@@ -131,82 +135,82 @@ function OverviewTab({
  };

  return (
    <div className="grid grid-cols-1 md:grid-cols-2 gap-6">
      {/* Personal Info */}
      <div className="space-y-1">
        <h3 className="text-sm font-medium mb-2">Personal Information</h3>
        <dl>
          <EditableRow label="Full Name">
            <InlineEditableField value={client.fullName} onSave={save('fullName')} />
          </EditableRow>
          <EditableRow label="Nationality">
            <InlineCountryField
              value={client.nationalityIso ?? null}
              onSave={async (iso) => {
                await mutation.mutateAsync({ nationalityIso: iso });
              }}
              data-testid="client-nationality-inline"
            />
          </EditableRow>
          <EditableRow label="Preferred Language">
            <InlineEditableField
              value={client.preferredLanguage}
              onSave={save('preferredLanguage')}
            />
          </EditableRow>
          <EditableRow label="Timezone">
            <InlineTimezoneField
              value={client.timezone}
              countryHint={(client.nationalityIso as CountryCode | null) ?? null}
              onSave={async (tz) => {
                await mutation.mutateAsync({ timezone: tz });
              }}
              data-testid="client-timezone-inline"
            />
          </EditableRow>
          <EditableRow label="Preferred Contact">
            <InlineEditableField
              variant="select"
              options={CONTACT_METHOD_OPTIONS}
              value={client.preferredContactMethod}
              onSave={save('preferredContactMethod')}
            />
          </EditableRow>
        </dl>
    <div className="space-y-6">
      <div className="rounded-xl border border-border bg-card p-4 shadow-sm">
        <ClientPipelineSummary clientId={clientId} variant="panel" />
      </div>

      {/* Contacts */}
      <div className="space-y-1">
        <h3 className="text-sm font-medium mb-2">Contact Details</h3>
        <ContactsEditor clientId={clientId} contacts={client.contacts ?? []} />
      </div>
      <div className="grid grid-cols-1 md:grid-cols-2 gap-6">
        {/* Personal Info */}
        <div className="space-y-1">
          <h3 className="text-sm font-medium mb-2">Personal Information</h3>
          <dl>
            <EditableRow label="Full Name">
              <InlineEditableField value={client.fullName} onSave={save('fullName')} />
            </EditableRow>
            <EditableRow label="Nationality">
              <InlineCountryField
                value={client.nationalityIso ?? null}
                onSave={async (iso) => {
                  await mutation.mutateAsync({ nationalityIso: iso });
                }}
                data-testid="client-nationality-inline"
              />
            </EditableRow>
            <EditableRow label="Timezone">
              <InlineTimezoneField
                value={client.timezone}
                countryHint={(client.nationalityIso as CountryCode | null) ?? null}
                onSave={async (tz) => {
                  await mutation.mutateAsync({ timezone: tz });
                }}
                data-testid="client-timezone-inline"
              />
            </EditableRow>
            <EditableRow label="Preferred Contact">
              <InlineEditableField
                variant="select"
                options={CONTACT_METHOD_OPTIONS}
                value={client.preferredContactMethod}
                onSave={save('preferredContactMethod')}
              />
            </EditableRow>
          </dl>
        </div>

      {/* Source */}
      <div className="space-y-1">
        <h3 className="text-sm font-medium mb-2">Source</h3>
        <dl>
          <EditableRow label="Source">
            <InlineEditableField
              variant="select"
              options={SOURCE_OPTIONS}
              value={client.source}
              onSave={save('source')}
            />
          </EditableRow>
          <EditableRow label="Source Details">
            <InlineEditableField value={client.sourceDetails} onSave={save('sourceDetails')} />
          </EditableRow>
        </dl>
      </div>
        {/* Contacts */}
        <div className="space-y-1">
          <h3 className="text-sm font-medium mb-2">Contact Details</h3>
          <ContactsEditor clientId={clientId} contacts={client.contacts ?? []} />
        </div>

      {/* Tags */}
      <div className="space-y-1">
        <h3 className="text-sm font-medium mb-2">Tags</h3>
        <InlineTagEditor
          endpoint={`/api/v1/clients/${clientId}/tags`}
          currentTags={client.tags ?? []}
          invalidateKey={['clients', clientId]}
        />
        {/* Source */}
        <div className="space-y-1">
          <h3 className="text-sm font-medium mb-2">Source</h3>
          <dl>
            <EditableRow label="Source">
              <InlineEditableField
                variant="select"
                options={SOURCE_OPTIONS}
                value={client.source}
                onSave={save('source')}
              />
            </EditableRow>
            <EditableRow label="Source Details">
              <InlineEditableField value={client.sourceDetails} onSave={save('sourceDetails')} />
            </EditableRow>
          </dl>
        </div>

        {/* Tags */}
        <div className="space-y-1">
          <h3 className="text-sm font-medium mb-2">Tags</h3>
          <InlineTagEditor
            endpoint={`/api/v1/clients/${clientId}/tags`}
            currentTags={client.tags ?? []}
            invalidateKey={['clients', clientId]}
          />
        </div>
      </div>
    </div>
  );

@@ -219,6 +223,12 @@ export function getClientTabs({ clientId, currentUserId, client }: ClientTabsOpt
      label: 'Overview',
      content: <OverviewTab clientId={clientId} client={client} />,
    },
    {
      id: 'interests',
      label: 'Interests',
      badge: client.interestCount,
      content: <ClientInterestsTab clientId={clientId} />,
    },
    {
      id: 'yachts',
      label: 'Yachts',

@@ -251,18 +261,10 @@ export function getClientTabs({ clientId, currentUserId, client }: ClientTabsOpt
        />
      ),
    },
    {
      id: 'interests',
      label: 'Interests',
      content: (
        <div className="text-center py-12 text-muted-foreground">
          <p>Interests will appear here once created.</p>
        </div>
      ),
    },
    {
      id: 'notes',
      label: 'Notes',
      badge: client.noteCount,
      content: <NotesList entityType="clients" entityId={clientId} currentUserId={currentUserId} />,
    },
    {

@@ -155,6 +155,7 @@ function ContactRow({
  onRemove: () => void;
}) {
  const Icon = CHANNEL_ICONS[contact.channel] ?? MoreHorizontal;
  const [phoneEditing, setPhoneEditing] = useState(false);

  async function togglePrimary() {
    try {

@@ -174,17 +175,31 @@ function ContactRow({
  }

  return (
    <div className="group flex items-center gap-2 p-2 rounded-lg border bg-muted/30 text-sm">
      {/* Left: channel + value */}
      <div className="flex items-center gap-2 flex-1 min-w-0">
    <div
      data-editing={phoneEditing ? 'true' : undefined}
      className={cn(
        'group rounded-lg border text-sm transition-all duration-150',
        // Active-edit dilation: lift the row out of the muted baseline with a
        // soft primary ring + slightly brighter surface. Single visual signal
        // replaces the need for any "now editing" label.
        phoneEditing
          ? 'bg-card border-primary/30 ring-2 ring-primary/15 shadow-sm p-3 gap-3'
          : 'bg-muted/30 p-2 gap-2',
        // Stack value editor / action cluster on mobile; single row on sm+.
        'flex flex-col sm:flex-row sm:items-center',
      )}
    >
      {/* Top / left: channel + value */}
      <div className="flex min-w-0 flex-1 items-center gap-2">
        <ChannelPicker value={contact.channel} onChange={changeChannel}>
          <Icon className="h-3.5 w-3.5 text-muted-foreground" />
        </ChannelPicker>
        <div className="min-w-0">
        <div className="min-w-0 flex-1">
          {contact.channel === 'phone' || contact.channel === 'whatsapp' ? (
            <InlinePhoneField
              e164={contact.valueE164 ?? null}
              country={contact.valueCountry ?? null}
              onEditingChange={setPhoneEditing}
              onSave={async ({ e164, country }) => {
                if (!e164) {
                  toast.error('Phone number is required');

@@ -208,42 +223,60 @@ function ContactRow({
        </div>
      </div>

      {/* Right: tag + actions */}
      <div className="flex items-center gap-2 shrink-0">
        <div className="w-28 text-xs text-muted-foreground text-right">
          <InlineEditableField
            value={
              contact.label && contact.label.toLowerCase() !== 'primary' ? contact.label : null
            }
            emptyText="Add tag"
            placeholder="work, home…"
            onSave={async (v) => {
              await onUpdate({ label: v });
            }}
          />
      {/* Bottom / right: tag + actions.
          Two layers of hiding compose here:
          (a) phoneEditing — when the phone editor is open, hide the entire
              action cluster (tag + star + trash) so the user can focus on
              the form without chips fighting for space.
          (b) contact.value — when the value is empty (stale import row,
              aborted edit), hide just the tag + Make-primary star;
              neither makes sense without a value. The trash icon stays
              so the user can clean up the empty entry.
          On touch (no hover), trash is always rendered; on desktop it
          fades in on hover only (sm:opacity-0 + sm:group-hover:opacity-100). */}
      {!phoneEditing ? (
        <div className="flex shrink-0 items-center justify-end gap-2">
          {contact.value ? (
            <>
              <div className="w-28 text-right text-xs text-muted-foreground">
                <InlineEditableField
                  value={
                    contact.label && contact.label.toLowerCase() !== 'primary'
                      ? contact.label
                      : null
                  }
                  emptyText="Add tag"
                  placeholder="work, home…"
                  onSave={async (v) => {
                    await onUpdate({ label: v });
                  }}
                />
              </div>

              <button
                type="button"
                onClick={togglePrimary}
                title={contact.isPrimary ? 'Primary' : 'Make primary'}
                className={cn(
                  'rounded p-1 transition-colors hover:bg-background/60',
                  contact.isPrimary ? 'text-primary' : 'text-muted-foreground/50',
                )}
              >
                <Star className={cn('h-3.5 w-3.5', contact.isPrimary && 'fill-current')} />
              </button>
            </>
          ) : null}

          <button
            type="button"
            onClick={onRemove}
            title="Remove"
            className="rounded p-1 text-muted-foreground/50 transition-all hover:bg-background/60 hover:text-destructive sm:opacity-0 sm:group-hover:opacity-100"
          >
            <Trash2 className="h-3.5 w-3.5" />
          </button>
        </div>

        <button
          type="button"
          onClick={togglePrimary}
          title={contact.isPrimary ? 'Primary' : 'Make primary'}
          className={cn(
            'p-1 rounded hover:bg-background/60 transition-colors',
            contact.isPrimary ? 'text-primary' : 'text-muted-foreground/50',
          )}
        >
          <Star className={cn('h-3.5 w-3.5', contact.isPrimary && 'fill-current')} />
        </button>

        <button
          type="button"
          onClick={onRemove}
          title="Remove"
          className="p-1 rounded text-muted-foreground/50 hover:text-destructive hover:bg-background/60 opacity-0 group-hover:opacity-100 transition-all"
        >
          <Trash2 className="h-3.5 w-3.5" />
        </button>
      </div>
      ) : null}
    </div>
  );
}

@@ -330,7 +363,9 @@ function NewContactForm({
  const submitDisabled = saving || (isPhoneChannel ? !phoneValue?.e164 : !value.trim());

  return (
    <div className="flex items-center gap-2 p-2 rounded-lg border bg-muted/30 text-sm">
    // Single row on sm+; wraps onto multiple lines below 640px so the channel
    // picker, value field, label, and buttons each get their own usable width.
    <div className="flex flex-wrap items-center gap-2 rounded-lg border bg-muted/30 p-2 text-sm">
      <Select
        value={channel}
        onValueChange={(next) => {

@@ -353,7 +388,7 @@ function NewContactForm({
      </Select>

      {isPhoneChannel ? (
        <div className="flex-1 min-w-0">
        <div className="min-w-0 flex-1 basis-full sm:basis-auto">
          <PhoneInput
            value={phoneValue}
            onChange={(v) => setPhoneValue(v)}

@@ -365,7 +400,7 @@ function NewContactForm({
          value={value}
          onChange={(e) => setValue(e.target.value)}
          placeholder={channel === 'email' ? 'name@example.com' : 'value'}
          className="h-7 text-sm flex-1 min-w-0"
          className="h-7 min-w-0 flex-1 basis-full text-sm sm:basis-auto"
          autoFocus
          disabled={saving}
          onKeyDown={(e) => {

@@ -382,7 +417,7 @@ function NewContactForm({
        value={label}
        onChange={(e) => setLabel(e.target.value)}
        placeholder="tag (optional)"
        className="h-7 text-xs w-28"
        className="h-7 w-28 text-xs"
        disabled={saving}
        onKeyDown={(e) => {
          if (e.key === 'Enter') {

@@ -393,12 +428,14 @@ function NewContactForm({
        }}
      />

      <Button type="button" size="sm" onClick={submit} disabled={submitDisabled}>
        {saving ? <Loader2 className="h-3.5 w-3.5 animate-spin" /> : 'Save'}
      </Button>
      <Button type="button" size="sm" variant="ghost" onClick={onCancel} disabled={saving}>
        Cancel
      </Button>
      <div className="ml-auto flex gap-2">
        <Button type="button" size="sm" onClick={submit} disabled={submitDisabled}>
          {saving ? <Loader2 className="h-3.5 w-3.5 animate-spin" /> : 'Save'}
        </Button>
        <Button type="button" size="sm" variant="ghost" onClick={onCancel} disabled={saving}>
          Cancel
        </Button>
      </div>
    </div>
  );
}
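The `submitDisabled` expression in `NewContactForm` gates the Save button on whichever value the selected channel requires. A minimal standalone sketch of that rule, assuming `phoneValue` mirrors the form's `{ e164 }` shape (the `PhoneValue` name here is hypothetical):

```typescript
// Hypothetical shape standing in for the form's phone-input value.
interface PhoneValue {
  e164: string | null;
}

// Disabled while a save is in flight, or while the channel's required
// value is still empty: phone channels need an E.164 number, everything
// else needs non-blank free text.
function isSubmitDisabled(
  saving: boolean,
  isPhoneChannel: boolean,
  phoneValue: PhoneValue | null,
  value: string,
): boolean {
  return saving || (isPhoneChannel ? !phoneValue?.e164 : !value.trim());
}

console.log(isSubmitDisabled(false, true, { e164: '+35699123456' }, '')); // false
console.log(isSubmitDisabled(false, false, null, '   ')); // true — whitespace only
```

Trimming before the truthiness check is what keeps a whitespace-only email/value from enabling Save.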
183  src/components/clients/dedup-suggestion-panel.tsx  Normal file
@@ -0,0 +1,183 @@
'use client';

import { useEffect, useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { AlertCircle, ArrowRight, Briefcase, X } from 'lucide-react';

import { Button } from '@/components/ui/button';
import { apiFetch } from '@/lib/api/client';
import { cn } from '@/lib/utils';

interface MatchData {
  clientId: string;
  fullName: string;
  score: number;
  confidence: 'high' | 'medium' | 'low';
  reasons: string[];
  interestCount: number;
  emails: string[];
  phonesE164: string[];
}

interface DedupSuggestionPanelProps {
  /** Free-text inputs from the in-flight new-client form. The panel
   * debounces them and queries /api/v1/clients/match-candidates. */
  email?: string | null;
  phone?: string | null;
  name?: string | null;
  /** Caller wants to attach the new interest to an existing client
   * rather than creating a new one. The form switches to
   * interest-only mode and pre-fills the client. */
  onUseExisting: (match: MatchData) => void;
  /** User explicitly said "create new anyway." Hide the panel until
   * they change input again. */
  onDismiss?: () => void;
}

/**
 * Surfaces existing clients that match the form's in-flight inputs.
 *
 * Renders nothing while inputs are short / no useful match found.
 * On a high-confidence match, the panel interrupts visually with a
 * solid border and a primary "Use this client" button.
 *
 * Wired into the new-client form. Skipped in edit mode.
 */
export function DedupSuggestionPanel({
  email,
  phone,
  name,
  onUseExisting,
  onDismiss,
}: DedupSuggestionPanelProps) {
  const [dismissed, setDismissed] = useState(false);

  // Debounce inputs by 300ms so we don't fire on every keystroke. Keep
  // the latest debounced values in component state.
  const [debounced, setDebounced] = useState({
    email: email ?? '',
    phone: phone ?? '',
    name: name ?? '',
  });

  useEffect(() => {
    const t = setTimeout(() => {
      setDebounced({ email: email ?? '', phone: phone ?? '', name: name ?? '' });
      // Clear the dismissed flag when inputs change — the user typed
      // something new, so the prior dismissal no longer applies.
      setDismissed(false);
    }, 300);
    return () => clearTimeout(t);
  }, [email, phone, name]);

  const hasSomething =
    debounced.email.length > 3 || debounced.phone.length > 3 || debounced.name.length > 2;

  const { data, isFetching } = useQuery<{ data: MatchData[] }>({
    queryKey: ['dedup-match-candidates', debounced],
    queryFn: () => {
      const params = new URLSearchParams();
      if (debounced.email) params.set('email', debounced.email);
      if (debounced.phone) params.set('phone', debounced.phone);
      if (debounced.name) params.set('name', debounced.name);
      return apiFetch<{ data: MatchData[] }>(`/api/v1/clients/match-candidates?${params}`);
    },
    enabled: hasSomething && !dismissed,
    // Same query is fine to cache for a minute — moves are slow at this layer.
    staleTime: 60_000,
  });

  if (dismissed) return null;
  if (!hasSomething) return null;
  if (isFetching && !data) return null;
  const matches = data?.data ?? [];
  if (matches.length === 0) return null;

  const top = matches[0]!;
  const isHigh = top.confidence === 'high';

  return (
    <div
      className={cn(
        'rounded-lg border p-3 mb-3 transition-colors',
        isHigh
          ? 'border-amber-300 bg-amber-50/60 dark:bg-amber-950/30'
          : 'border-border bg-muted/40',
      )}
      data-testid="dedup-suggestion"
    >
      <div className="flex items-start gap-3">
        <div className="mt-0.5">
          <AlertCircle
            className={cn(
              'size-5',
              isHigh ? 'text-amber-700 dark:text-amber-400' : 'text-muted-foreground',
            )}
            aria-hidden
          />
        </div>
        <div className="min-w-0 flex-1">
          <p className="text-sm font-semibold leading-tight">
            {isHigh
              ? 'This looks like an existing client'
              : 'Possible match — check before creating'}
          </p>
          <div className="mt-2 rounded-md border bg-background/80 p-2.5">
            <div className="flex items-center gap-2">
              <p className="truncate text-sm font-medium">{top.fullName}</p>
              <span
                className={cn(
                  'shrink-0 rounded-full px-1.5 py-0.5 text-[10px] font-medium uppercase tracking-wide',
                  isHigh
                    ? 'bg-amber-200 text-amber-900 dark:bg-amber-800 dark:text-amber-100'
                    : 'bg-muted text-muted-foreground',
                )}
              >
                {top.confidence}
              </span>
            </div>
            <div className="mt-0.5 flex flex-wrap items-center gap-x-3 gap-y-0.5 text-xs text-muted-foreground">
              {top.emails[0] ? <span className="truncate">{top.emails[0]}</span> : null}
              {top.phonesE164[0] ? <span>{top.phonesE164[0]}</span> : null}
              <span className="inline-flex items-center gap-1">
                <Briefcase className="size-3" aria-hidden />
                {top.interestCount} {top.interestCount === 1 ? 'interest' : 'interests'}
              </span>
            </div>
            <p className="mt-1.5 text-[11px] text-muted-foreground">{top.reasons.join(' · ')}</p>
          </div>
          <div className="mt-3 flex flex-wrap items-center gap-2">
            <Button
              type="button"
              size="sm"
              onClick={() => onUseExisting(top)}
              data-testid="dedup-use-existing"
            >
              Use this client
              <ArrowRight className="ml-1 size-3.5" aria-hidden />
            </Button>
            <Button
              type="button"
              size="sm"
              variant="ghost"
              onClick={() => {
                setDismissed(true);
                onDismiss?.();
              }}
              data-testid="dedup-dismiss"
            >
              <X className="mr-1 size-3.5" aria-hidden />
              Create new anyway
            </Button>
            {matches.length > 1 ? (
              <span className="text-xs text-muted-foreground">
                +{matches.length - 1} other possible{' '}
                {matches.length - 1 === 1 ? 'match' : 'matches'}
              </span>
            ) : null}
          </div>
        </div>
      </div>
    </div>
  );
}

@@ -77,7 +77,9 @@ export function CompanyDetailHeader({ company }: CompanyDetailHeaderProps) {
  return (
    <>
      <DetailHeaderStrip>
        <div className="flex items-start gap-3 flex-wrap">
        {/* Stack actions below the title block on phone widths; horizontal
            beside it from sm up. */}
        <div className="flex flex-col gap-3 sm:flex-row sm:items-start sm:flex-wrap sm:gap-3">
          <div className="flex-1 min-w-0">
            <div className="flex items-center gap-2 flex-wrap">
              <h1 className="hidden sm:block text-2xl font-bold text-foreground truncate">

@@ -146,7 +146,11 @@ function OverviewTab({ companyId, company }: { companyId: string; company: Compa
            </EditableRow>
            <EditableRow label="Incorporation Date">
              <InlineEditableField
                value={company.incorporationDate}
                // The API returns this as an ISO timestamp ("2019-03-14T00:00:00.000Z")
                // because Postgres `date` columns are serialized through JSON. Strip
                // the time portion so the read-only state shows just YYYY-MM-DD,
                // which is also the format the user types when editing.
                value={company.incorporationDate ? company.incorporationDate.slice(0, 10) : null}
                placeholder="YYYY-MM-DD"
                onSave={save('incorporationDate')}
              />

@@ -28,11 +28,10 @@ function formatPercent(value: number): string {

function KpiTileSkeleton() {
  return (
    <div className="relative overflow-hidden rounded-xl border border-border bg-card p-5 shadow-sm">
    <div className="relative overflow-hidden rounded-xl border border-border bg-card p-3 shadow-sm sm:p-5">
      <div className="absolute inset-x-0 top-0 h-1 bg-muted" aria-hidden />
      <Skeleton className="h-3 w-24" />
      <Skeleton className="mt-3 h-7 w-32" />
      <Skeleton className="mt-2 h-3 w-12" />
      <Skeleton className="h-3 w-20" />
      <Skeleton className="mt-2 h-6 w-24 sm:mt-3 sm:h-7" />
    </div>
  );
}

@@ -67,8 +67,11 @@ export function MyRemindersRail() {
    return `/${portSlug}/reminders`;
  }

  // `h-full` only at xl: where the dashboard grid pairs this rail with
  // a sibling chart column. On mobile (stacked) it produced a weirdly
  // tall empty card.
  return (
    <Card className="h-full">
    <Card className="xl:h-full">
      <CardHeader className="flex flex-row items-start justify-between gap-2 space-y-0 pb-3">
        <div className="space-y-0.5">
          <CardTitle className="flex items-center gap-1.5 text-base">
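The incorporation-date change above normalizes an ISO timestamp down to its date part with `slice(0, 10)`, because Postgres `date` columns come back as full timestamps once serialized through JSON. A minimal sketch of that normalization (the `toDateOnly` helper name is illustrative, not from the codebase):

```typescript
// Postgres `date` columns arrive over JSON as full ISO timestamps, e.g.
// "2019-03-14T00:00:00.000Z". For a read-only inline field we only want
// the leading YYYY-MM-DD, which is also what the user types when editing.
function toDateOnly(iso: string | null): string | null {
  return iso ? iso.slice(0, 10) : null;
}

console.log(toDateOnly('2019-03-14T00:00:00.000Z')); // "2019-03-14"
console.log(toDateOnly(null)); // null
```

A plain string slice avoids `new Date(...)` entirely, which sidesteps any timezone shift that parsing and re-formatting the timestamp could introduce.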
src/components/dev/react-grab-viewport-sync.tsx (Normal file, 104 lines added)
@@ -0,0 +1,104 @@
+'use client';
+
+import { useEffect } from 'react';
+
+type Edge = 'top' | 'bottom' | 'left' | 'right';
+
+interface ToolbarState {
+  edge: Edge;
+  ratio: number;
+  collapsed: boolean;
+  enabled: boolean;
+  defaultAction?: string;
+}
+
+interface ReactGrabAPI {
+  setToolbarState: (state: Partial<ToolbarState>) => void;
+  onToolbarStateChange: (cb: (state: ToolbarState) => void) => () => void;
+}
+
+declare global {
+  interface Window {
+    __REACT_GRAB__?: ReactGrabAPI;
+  }
+}
+
+const MOBILE_QUERY = '(max-width: 1023.98px)';
+const DESKTOP_KEY = 'react-grab-toolbar-state-desktop';
+const MOBILE_KEY = 'react-grab-toolbar-state-mobile';
+
+const DESKTOP_DEFAULT: Partial<ToolbarState> = {
+  edge: 'bottom',
+  ratio: 0.5,
+  collapsed: false,
+};
+
+const MOBILE_DEFAULT: Partial<ToolbarState> = {
+  edge: 'right',
+  ratio: 0.5,
+  collapsed: false,
+};
+
+export function ReactGrabViewportSync() {
+  useEffect(() => {
+    if (process.env.NODE_ENV !== 'development') return;
+
+    const cleanups: Array<() => void> = [];
+    let pollId: number | undefined;
+
+    const wireUp = (api: ReactGrabAPI) => {
+      const mql = window.matchMedia(MOBILE_QUERY);
+      const keyFor = () => (mql.matches ? MOBILE_KEY : DESKTOP_KEY);
+      const defaultFor = () => (mql.matches ? MOBILE_DEFAULT : DESKTOP_DEFAULT);
+
+      let suppressNextWrite = false;
+      const apply = () => {
+        const stored = localStorage.getItem(keyFor());
+        suppressNextWrite = true;
+        api.setToolbarState(stored ? (JSON.parse(stored) as ToolbarState) : defaultFor());
+      };
+
+      apply();
+
+      const unsubscribe = api.onToolbarStateChange((state) => {
+        if (suppressNextWrite) {
+          suppressNextWrite = false;
+          return;
+        }
+        localStorage.setItem(keyFor(), JSON.stringify(state));
+      });
+
+      mql.addEventListener('change', apply);
+      cleanups.push(unsubscribe, () => mql.removeEventListener('change', apply));
+    };
+
+    const tryWire = () => {
+      const api = window.__REACT_GRAB__;
+      if (!api) return false;
+      wireUp(api);
+      return true;
+    };
+
+    if (!tryWire()) {
+      pollId = window.setInterval(() => {
+        if (tryWire() && pollId !== undefined) {
+          window.clearInterval(pollId);
+          pollId = undefined;
+        }
+      }, 100);
+      window.setTimeout(() => {
+        if (pollId !== undefined) {
+          window.clearInterval(pollId);
+          pollId = undefined;
+        }
+      }, 5000);
+    }
+
+    return () => {
+      if (pollId !== undefined) window.clearInterval(pollId);
+      cleanups.forEach((fn) => fn());
+    };
+  }, []);
+
+  return null;
+}

@@ -339,12 +339,19 @@ export function InterestDetailHeader({ portSlug, interest }: InterestDetailHeade
           </button>
         ) : (
           <>
+            {/* Mobile: icon-only with title tooltip + colored fill carries
+                the won/lost meaning (green vs rose). Adding a "Won" /
+                "Lost" text label inline blew out the cluster width and
+                forced the Email/Call/WhatsApp action-chip row above to
+                stack vertically — bad trade. From sm up, the full
+                "Mark won" / "Close as lost" labels read clearly. */}
             <button
               type="button"
               onClick={() => setOutcomeDialog('won')}
+              aria-label="Mark as won"
+              title="Mark as won"
               className={cn(
-                'inline-flex items-center gap-1.5 rounded-md px-2.5 py-1 text-xs font-medium transition-colors',
+                'inline-flex items-center gap-1.5 rounded-md px-2 py-1 text-xs font-medium transition-colors sm:px-2.5',
                 'border border-emerald-200 bg-emerald-50 text-emerald-700',
                 'hover:bg-emerald-100',
               )}
@@ -356,8 +363,9 @@ export function InterestDetailHeader({ portSlug, interest }: InterestDetailHeade
               type="button"
               onClick={() => setOutcomeDialog('lost')}
+              aria-label="Close as lost"
+              title="Close as lost"
               className={cn(
-                'inline-flex items-center gap-1.5 rounded-md px-2.5 py-1 text-xs font-medium transition-colors',
+                'inline-flex items-center gap-1.5 rounded-md px-2 py-1 text-xs font-medium transition-colors sm:px-2.5',
                 'border border-rose-200 text-rose-700',
                 'hover:bg-rose-50',
               )}

@@ -2,7 +2,7 @@

 import Link from 'next/link';
 import { usePathname } from 'next/navigation';
-import { LayoutDashboard, Users, Ship, Anchor, Menu } from 'lucide-react';
+import { Anchor, FileSignature, LayoutDashboard, Menu, Users } from 'lucide-react';

 import { cn } from '@/lib/utils';

@@ -12,11 +12,27 @@ type TabSpec = {
   segment: string; // route segment after /[portSlug]/
 };

+// Bottom nav ordering, left → right:
+//   Dashboard — daily overview
+//   Berths    — marina inventory grid (touches sales + ops both)
+//   Clients   — the address book / dedup surface (centered: it's the
+//               primary mental anchor for "find this person", with
+//               interests living as a tab on the client detail rather
+//               than a peer in the bottom nav)
+//   Documents — signature tracking (chase signers, EOI queue)
+//   More      — overflow drawer (Interests, Yachts, Companies, …)
+//
+// Interests is intentionally NOT in the bottom row — having both Clients
+// and Interests as peer tabs created a Clients-vs-Interests confusion
+// for sales reps, and the per-client interests tab + the new bottom-sheet
+// drawer cover the day-to-day deal review without needing a dedicated tab.
+// Yachts stays out for the same reason as before: it's an asset record
+// most often reached from inside an interest or client, not browsed.
 const TABS: TabSpec[] = [
   { label: 'Dashboard', icon: LayoutDashboard, segment: 'dashboard' },
-  { label: 'Clients', icon: Users, segment: 'clients' },
-  { label: 'Yachts', icon: Ship, segment: 'yachts' },
   { label: 'Berths', icon: Anchor, segment: 'berths' },
+  { label: 'Clients', icon: Users, segment: 'clients' },
+  { label: 'Documents', icon: FileSignature, segment: 'documents' },
 ];

 export function MobileBottomTabs({ onMoreClick }: { onMoreClick: () => void }) {

@@ -3,17 +3,17 @@

 import Link from 'next/link';
 import { usePathname } from 'next/navigation';
 import {
-  Building2,
-  Bookmark,
-  Receipt,
-  FileText,
-  FolderOpen,
-  Mail,
-  Bell,
-  ShieldAlert,
   BarChart3,
+  Bell,
+  Bookmark,
+  Building2,
+  FileText,
+  Mail,
+  Receipt,
+  Settings,
+  Shield,
+  ShieldAlert,
+  Ship,
 } from 'lucide-react';

 import {

import {
|
||||
@@ -30,13 +30,18 @@ type MoreItem = {
|
||||
segment: string;
|
||||
};
|
||||
|
||||
// Order: most-likely overflow targets first. Interests is here (rather
|
||||
// than the bottom row) to dodge the Clients-vs-Interests UX confusion;
|
||||
// reps reach the active deals via the Interests tab on a client detail
|
||||
// (or via the new bottom-sheet drawer). Yachts is asset-record traffic
|
||||
// best reached contextually from inside an interest or client.
|
||||
const MORE_ITEMS: MoreItem[] = [
|
||||
{ label: 'Companies', icon: Building2, segment: 'companies' },
|
||||
{ label: 'Interests', icon: Bookmark, segment: 'interests' },
|
||||
{ label: 'Yachts', icon: Ship, segment: 'yachts' },
|
||||
{ label: 'Companies', icon: Building2, segment: 'companies' },
|
||||
{ label: 'Invoices', icon: FileText, segment: 'invoices' },
|
||||
{ label: 'Expenses', icon: Receipt, segment: 'expenses' },
|
||||
{ label: 'Documents', icon: FolderOpen, segment: 'documents' },
|
||||
{ label: 'Email', icon: Mail, segment: 'email' },
|
||||
{ label: 'Inbox', icon: Mail, segment: 'email' },
|
||||
{ label: 'Alerts', icon: ShieldAlert, segment: 'alerts' },
|
||||
{ label: 'Reports', icon: BarChart3, segment: 'reports' },
|
||||
{ label: 'Reminders', icon: Bell, segment: 'reminders' },
|
||||
|
||||
@@ -249,7 +249,9 @@ export function ReminderList() {
         }
       />

-      <div className="flex items-center gap-4 mb-4">
+      {/* Wrap on phone widths so the priority filter doesn't get pushed
+          off-screen by the My/All tabs + status filter taking the full row. */}
+      <div className="flex flex-wrap items-center gap-3 mb-4 sm:gap-4">
         {canViewAll && (
           <Tabs value={viewMode} onValueChange={(v) => setViewMode(v as 'my' | 'all')}>
             <TabsList>

@@ -18,6 +18,7 @@ import { SubdivisionCombobox } from '@/components/shared/subdivision-combobox';
 import { PhoneInput, type PhoneInputValue } from '@/components/shared/phone-input';
 import { useRealtimeInvalidation } from '@/hooks/use-realtime-invalidation';
 import { apiFetch } from '@/lib/api/client';
+import { cn } from '@/lib/utils';
 import type { CountryCode } from '@/lib/i18n/countries';

 interface ResidentialClientRow {

@@ -85,7 +86,9 @@ export function ResidentialClientsList() {
         />
       </div>

-      <div className="rounded-lg border bg-card overflow-hidden">
+      {/* Desktop: table layout. Hidden below lg because the 6 columns clip
+          off the viewport at phone widths. */}
+      <div className="hidden lg:block rounded-lg border bg-card overflow-hidden">
         <table className="w-full text-sm">
           <thead className="bg-muted/40 text-xs text-muted-foreground">
             <tr>

@@ -137,6 +140,51 @@ export function ResidentialClientsList() {
         </table>
       </div>

+      {/* Mobile: card list. Each card mirrors the table row data with
+          name + status pill on top, then meta line(s) below. */}
+      <div className="lg:hidden space-y-2">
+        {isLoading && (
+          <div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
+            Loading…
+          </div>
+        )}
+        {!isLoading && data?.data.length === 0 && (
+          <div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
+            No residential clients yet.
+          </div>
+        )}
+        {data?.data.map((c) => (
+          <Link
+            key={c.id}
+            // eslint-disable-next-line @typescript-eslint/no-explicit-any
+            href={`/${portSlug}/residential/clients/${c.id}` as any}
+            className="block rounded-lg border bg-card p-3 transition-colors hover:bg-muted/30"
+          >
+            <div className="flex items-start justify-between gap-2">
+              <p className="font-medium text-sm truncate">{c.fullName}</p>
+              <span
+                className={cn(
+                  'shrink-0 inline-flex items-center rounded-full px-2 py-0.5 text-[10px] font-medium uppercase tracking-wide',
+                  c.status === 'active'
+                    ? 'bg-emerald-100 text-emerald-800'
+                    : c.status === 'inactive'
+                      ? 'bg-muted text-muted-foreground'
+                      : 'bg-blue-100 text-blue-800',
+                )}
+              >
+                {STATUS_LABELS[c.status] ?? c.status}
+              </span>
+            </div>
+            <div className="mt-1 flex flex-wrap items-center gap-x-2 gap-y-0.5 text-xs text-muted-foreground">
+              {c.email ? <span className="truncate">{c.email}</span> : null}
+              {c.phone ? <span>{c.phone}</span> : null}
+              {c.placeOfResidence ? <span>{c.placeOfResidence}</span> : null}
+              {c.source ? <span className="capitalize">· {c.source}</span> : null}
+            </div>
+          </Link>
+        ))}
+      </div>
+
       <NewResidentialClientSheet open={createOpen} onOpenChange={setCreateOpen} />
     </div>
   );

@@ -94,7 +94,8 @@ export function ResidentialInterestsList() {
       </Select>
     </div>

-    <div className="rounded-lg border bg-card overflow-hidden">
+    {/* Desktop: table layout. Hidden below lg; mobile renders cards. */}
+    <div className="hidden lg:block rounded-lg border bg-card overflow-hidden">
       <table className="w-full text-sm">
         <thead className="bg-muted/40 text-xs text-muted-foreground">
           <tr>

@@ -149,6 +150,47 @@ export function ResidentialInterestsList() {
       </tbody>
     </table>
   </div>

+    {/* Mobile: card list. Stage as the headline (it's the most actionable
+        field for triage), preferences/notes truncated below. */}
+    <div className="lg:hidden space-y-2">
+      {isLoading && (
+        <div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
+          Loading…
+        </div>
+      )}
+      {!isLoading && data?.data.length === 0 && (
+        <div className="rounded-lg border bg-card px-3 py-8 text-center text-sm text-muted-foreground">
+          No interests match.
+        </div>
+      )}
+      {data?.data.map((i) => (
+        <Link
+          key={i.id}
+          // eslint-disable-next-line @typescript-eslint/no-explicit-any
+          href={`/${portSlug}/residential/interests/${i.id}` as any}
+          className="block rounded-lg border bg-card p-3 transition-colors hover:bg-muted/30"
+        >
+          <div className="flex items-start justify-between gap-2">
+            <p className="font-medium text-sm">
+              {STAGE_LABELS[i.pipelineStage] ?? i.pipelineStage}
+            </p>
+            <span className="shrink-0 text-[11px] text-muted-foreground">
+              {new Date(i.updatedAt).toLocaleDateString()}
+            </span>
+          </div>
+          {i.preferences ? (
+            <p className="mt-1 line-clamp-2 text-xs text-muted-foreground">{i.preferences}</p>
+          ) : null}
+          {i.notes ? (
+            <p className="mt-1 line-clamp-1 text-xs text-muted-foreground/80">{i.notes}</p>
+          ) : null}
+          {i.source ? (
+            <p className="mt-1 text-[11px] capitalize text-muted-foreground">{i.source}</p>
+          ) : null}
+        </Link>
+      ))}
+    </div>
   </div>
   );
 }

@@ -1,6 +1,6 @@
 'use client';

-import { useState } from 'react';
+import { useRef, useState } from 'react';
 import { useMutation, useQueryClient } from '@tanstack/react-query';
 import { Loader2, MapPin, Plus, Star, Trash2 } from 'lucide-react';
 import { toast } from 'sonner';
@@ -225,6 +225,14 @@ function Field({ label, children }: { label: string; children: React.ReactNode }
   );
 }

+/** Regional-indicator emoji flag for an ISO alpha-2 code (e.g. 'FR' → 🇫🇷). */
+function flagEmoji(code: string | null | undefined): string {
+  if (!code || code.length !== 2) return '';
+  const A = 0x1f1e6;
+  const a = 'A'.charCodeAt(0);
+  return String.fromCodePoint(A + code.charCodeAt(0) - a, A + code.charCodeAt(1) - a);
+}
+
 function CountryFieldInline({
   value,
   onSave,

@@ -233,20 +241,34 @@ function CountryFieldInline({
   onSave: (iso: string | null) => Promise<void>;
 }) {
   const [editing, setEditing] = useState(false);
+  // Tracks whether a value was picked this edit cycle so the open-change
+  // handler doesn't double-exit while commit is still in flight.
+  const pickedRef = useRef(false);

   if (editing) {
     return (
       <CountryCombobox
         value={value ?? null}
         onChange={async (iso) => {
+          pickedRef.current = true;
           setEditing(false);
           await onSave(iso ?? null);
         }}
         clearable
         className="w-full"
+        // Drop the user straight into the picker — no extra click on the
+        // trigger required.
+        defaultOpen
+        onOpenChange={(open) => {
+          // Auto-exit edit mode when the popover closes without a pick so
+          // the user isn't stuck staring at a "Select country…" trigger.
+          if (!open && !pickedRef.current) setEditing(false);
+          if (open) pickedRef.current = false;
+        }}
       />
     );
   }
-  const display = value ? getCountryName(value, 'en') : null;
+  const display = value ? `${flagEmoji(value)} ${getCountryName(value, 'en')}` : null;
   return (
     <button
       type="button"

@@ -268,17 +290,25 @@ function SubdivisionFieldInline({
   onSave: (code: string | null) => Promise<void>;
 }) {
   const [editing, setEditing] = useState(false);
+  const pickedRef = useRef(false);

   if (editing) {
     return (
       <SubdivisionCombobox
         value={value ?? null}
         country={country}
         onChange={async (code) => {
+          pickedRef.current = true;
           setEditing(false);
           await onSave(code ?? null);
         }}
         clearable
         className="w-full"
+        defaultOpen
+        onOpenChange={(open) => {
+          if (!open && !pickedRef.current) setEditing(false);
+          if (open) pickedRef.current = false;
+        }}
       />
     );
   }

@@ -30,6 +30,12 @@ interface CountryComboboxProps {
   clearable?: boolean;
   id?: string;
   'data-testid'?: string;
+  /** Open the dropdown on first render. Used by inline-edit wrappers so the
+   * user lands directly in the picker after clicking the edit affordance. */
+  defaultOpen?: boolean;
+  /** Notified whenever the dropdown opens/closes. Inline-edit wrappers use
+   * this to auto-exit edit mode when the user dismisses without picking. */
+  onOpenChange?: (open: boolean) => void;
 }

 /**

@@ -58,8 +64,14 @@ export function CountryCombobox({
   clearable = true,
   id,
   'data-testid': testId,
+  defaultOpen = false,
+  onOpenChange,
 }: CountryComboboxProps) {
-  const [open, setOpen] = useState(false);
+  const [open, setOpen] = useState(defaultOpen);
+  const handleOpenChange = (next: boolean) => {
+    setOpen(next);
+    onOpenChange?.(next);
+  };
   const effectiveLocale = locale ?? (typeof navigator !== 'undefined' ? navigator.language : 'en');

   // Pre-build the options list once per locale change so the cmdk filter
@@ -75,7 +87,7 @@ export function CountryCombobox({
   const selected = value ? options.find((o) => o.code === value) : undefined;

   return (
-    <Popover open={open} onOpenChange={setOpen}>
+    <Popover open={open} onOpenChange={handleOpenChange}>
       <PopoverTrigger asChild>
         <Button
           id={id}

@@ -1,6 +1,6 @@
 'use client';

-import { useState } from 'react';
+import { useRef, useState } from 'react';
 import { Loader2, Pencil } from 'lucide-react';
 import { toast } from 'sonner';

@@ -31,8 +31,12 @@ export function InlineCountryField({
 }: InlineCountryFieldProps) {
   const [editing, setEditing] = useState(false);
   const [saving, setSaving] = useState(false);
+  // Set true when the user picks a value from the dropdown, so the
+  // popover-close handler knows commit() will exit edit mode itself.
+  const pickedRef = useRef(false);

   async function commit(next: CountryCode | null) {
+    pickedRef.current = true;
     if (next === (value ?? null)) {
       setEditing(false);
       return;

@@ -51,7 +55,23 @@ export function InlineCountryField({
   if (editing) {
     return (
       <div className={cn('flex items-center gap-1', className)}>
-        <CountryCombobox value={value} onChange={(iso) => void commit(iso)} data-testid={testId} />
+        <CountryCombobox
+          value={value}
+          onChange={(iso) => void commit(iso)}
+          data-testid={testId}
+          defaultOpen
+          onOpenChange={(open) => {
+            // When the dropdown closes without a selection, leave edit mode
+            // so the user isn't stuck staring at the trigger button. If a
+            // pick happened, commit() handles the exit (and may need to keep
+            // edit mode briefly to show the saving spinner).
+            if (!open && !pickedRef.current) {
+              setEditing(false);
+            }
+            // Reset for the next open cycle.
+            if (open) pickedRef.current = false;
+          }}
+        />
         {saving && <Loader2 className="h-3 w-3 animate-spin text-muted-foreground" />}
       </div>
     );

@@ -17,6 +17,12 @@ interface InlinePhoneFieldProps {
   /** Falls back to this country if `country` isn't set. */
   defaultCountry?: CountryCode;
   onSave: (next: { e164: string | null; country: CountryCode }) => Promise<void>;
+  /**
+   * Notifies the parent when the field enters/exits edit mode. Lets the row
+   * dim or hide noise (tag chips, action buttons) while the user is focused
+   * on the editor.
+   */
+  onEditingChange?: (editing: boolean) => void;
   emptyText?: string;
   disabled?: boolean;
   className?: string;

@@ -28,12 +34,13 @@ export function InlinePhoneField({
   country,
   defaultCountry,
   onSave,
+  onEditingChange,
   emptyText = '—',
   disabled,
   className,
   'data-testid': testId,
 }: InlinePhoneFieldProps) {
-  const [editing, setEditing] = useState(false);
+  const [editing, setEditingRaw] = useState(false);
   const [draft, setDraft] = useState<PhoneInputValue | null>(() => {
     if (!e164 && !country) return null;
     return {

@@ -43,6 +50,11 @@ export function InlinePhoneField({
   });
   const [saving, setSaving] = useState(false);

+  function setEditing(next: boolean) {
+    setEditingRaw(next);
+    onEditingChange?.(next);
+  }
+
   async function commit() {
     const next = draft ?? { e164: null, country: defaultCountry ?? 'US' };
     if (next.e164 === (e164 ?? null) && next.country === (country ?? null)) {

@@ -62,39 +74,50 @@ export function InlinePhoneField({

   if (editing) {
     return (
-      <div className={cn('flex items-center gap-1', className)}>
+      // Two clean lines: country picker + number on top, action pair below.
+      <div className={cn('flex w-full flex-col gap-2.5', className)}>
         <PhoneInput
           value={draft}
           onChange={(v) => setDraft(v)}
           defaultCountry={defaultCountry}
           data-testid={testId}
         />
-        <button
-          type="button"
-          onClick={() => void commit()}
-          disabled={saving}
-          className="rounded px-2 py-1 text-xs font-medium hover:bg-muted disabled:opacity-50"
-        >
-          {saving ? <Loader2 className="h-3 w-3 animate-spin" /> : 'Save'}
-        </button>
-        <button
-          type="button"
-          onClick={() => {
-            setDraft(
-              e164 || country
-                ? {
-                    e164: e164 ?? null,
-                    country: (country as CountryCode | null) ?? defaultCountry ?? 'US',
-                  }
-                : null,
-            );
-            setEditing(false);
-          }}
-          disabled={saving}
-          className="rounded px-2 py-1 text-xs text-muted-foreground hover:bg-muted disabled:opacity-50"
-        >
-          Cancel
-        </button>
+        <div className="flex items-center justify-end gap-1.5">
+          <button
+            type="button"
+            onClick={() => {
+              setDraft(
+                e164 || country
+                  ? {
+                      e164: e164 ?? null,
+                      country: (country as CountryCode | null) ?? defaultCountry ?? 'US',
+                    }
+                  : null,
+              );
+              setEditing(false);
+            }}
+            disabled={saving}
+            className={cn(
+              'inline-flex h-8 items-center rounded-md px-3 text-xs font-medium',
+              'text-muted-foreground transition-colors hover:bg-muted hover:text-foreground',
+              'disabled:opacity-50',
+            )}
+          >
+            Cancel
+          </button>
+          <button
+            type="button"
+            onClick={() => void commit()}
+            disabled={saving}
+            className={cn(
+              'inline-flex h-8 min-w-[64px] items-center justify-center rounded-md px-3',
+              'bg-primary text-xs font-semibold text-primary-foreground shadow-sm',
+              'transition-colors hover:bg-primary/90 disabled:opacity-50',
+            )}
+          >
+            {saving ? <Loader2 className="size-3.5 animate-spin" /> : 'Save'}
+          </button>
+        </div>
       </div>
     );
   }

@@ -1,6 +1,6 @@
 'use client';

-import { useState } from 'react';
+import { useRef, useState } from 'react';
 import { Loader2, Pencil } from 'lucide-react';
 import { toast } from 'sonner';

@@ -31,8 +31,12 @@ export function InlineTimezoneField({
 }: InlineTimezoneFieldProps) {
   const [editing, setEditing] = useState(false);
   const [saving, setSaving] = useState(false);
+  // Set true when the user picks a value from the dropdown, so the
+  // popover-close handler knows commit() will exit edit mode itself.
+  const pickedRef = useRef(false);

   async function commit(next: string | null) {
+    pickedRef.current = true;
     if (next === (value ?? null)) {
       setEditing(false);
       return;

@@ -56,6 +60,16 @@ export function InlineTimezoneField({
         onChange={(tz) => void commit(tz)}
         countryHint={countryHint ?? undefined}
         data-testid={testId}
+        defaultOpen
+        onOpenChange={(open) => {
+          // Auto-exit edit mode when the dropdown closes without a pick,
+          // so the user isn't stuck looking at the trigger. commit() owns
+          // the exit when a value was selected.
+          if (!open && !pickedRef.current) {
+            setEditing(false);
+          }
+          if (open) pickedRef.current = false;
+        }}
       />
       {saving && <Loader2 className="h-3 w-3 animate-spin text-muted-foreground" />}
     </div>

@@ -1,16 +1,9 @@
 'use client';

-import { type ReactNode } from 'react';
+import { useEffect, useRef, type ReactNode } from 'react';

 import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs';
 import { Badge } from '@/components/ui/badge';
-import {
-  Select,
-  SelectContent,
-  SelectItem,
-  SelectTrigger,
-  SelectValue,
-} from '@/components/ui/select';

 export interface ResponsiveTab {
   id: string;

@@ -26,47 +19,56 @@ interface ResponsiveTabsProps {
 }

 /**
- * Tabs that collapse to a native <Select> on phone-sized viewports.
- * Above sm: TabsList renders. At/below sm: a Select dropdown replaces the tab strip.
+ * Tab strip that scrolls horizontally on narrow viewports. The active tab is
+ * automatically scrolled into view so users can tell at a glance that more
+ * tabs exist beyond the visible edge.
+ *
+ * Previously this collapsed to a <Select> on phone widths, but that read as
+ * a generic dropdown and obscured the fact that multiple peer tabs exist.
 */
 export function ResponsiveTabs({ tabs, value, onValueChange }: ResponsiveTabsProps) {
+  const listRef = useRef<HTMLDivElement>(null);
+
+  // Keep the active trigger in view when the value changes externally
+  // (e.g. ?tab= in the URL or a back/forward navigation).
+  useEffect(() => {
+    const root = listRef.current;
+    if (!root) return;
+    const active = root.querySelector<HTMLButtonElement>(`[data-tab-id="${CSS.escape(value)}"]`);
+    if (active) {
+      active.scrollIntoView({ block: 'nearest', inline: 'nearest', behavior: 'smooth' });
+    }
+  }, [value]);
+
   return (
     <Tabs value={value} onValueChange={onValueChange}>
-      {/* Mobile: select dropdown */}
-      <div className="sm:hidden">
-        <Select value={value} onValueChange={onValueChange}>
-          <SelectTrigger>
-            <SelectValue />
-          </SelectTrigger>
-          <SelectContent>
-            {tabs.map((tab) => (
-              <SelectItem key={tab.id} value={tab.id}>
-                <span className="flex items-center gap-1.5">
-                  {tab.label}
-                  {tab.badge !== undefined && tab.badge !== null && (
-                    <span className="text-xs text-muted-foreground">({tab.badge})</span>
-                  )}
-                </span>
-              </SelectItem>
-            ))}
-          </SelectContent>
-        </Select>
-      </div>
+      {/* Single scrollable strip for all viewport widths.
+         The wrapper handles horizontal overflow with momentum scroll on
+         touch devices; the inner TabsList stays its natural width and
+         slides under the wrapper. */}
+      <div
+        ref={listRef}
+        className="overflow-x-auto -mx-2 px-2 [scrollbar-width:none] [&::-webkit-scrollbar]:hidden"
+      >
+        <TabsList className="inline-flex w-max">
+          {tabs.map((tab) => (
+            <TabsTrigger
+              key={tab.id}
+              value={tab.id}
+              className="gap-1.5 whitespace-nowrap"
+              data-tab-id={tab.id}
+            >
+              {tab.label}
+              {tab.badge !== undefined && tab.badge !== null && (
+                <Badge variant="secondary" className="px-1.5 py-0 text-xs">
+                  {tab.badge}
+                </Badge>
+              )}
+            </TabsTrigger>
+          ))}
+        </TabsList>
+      </div>

-      {/* Desktop / tablet: tab strip */}
-      <TabsList className="hidden sm:flex">
-        {tabs.map((tab) => (
-          <TabsTrigger key={tab.id} value={tab.id} className="gap-1.5">
-            {tab.label}
-            {tab.badge !== undefined && tab.badge !== null && (
-              <Badge variant="secondary" className="px-1.5 py-0 text-xs">
-                {tab.badge}
-              </Badge>
-            )}
-          </TabsTrigger>
-        ))}
-      </TabsList>
-
       {tabs.map((tab) => (
         <TabsContent key={tab.id} value={tab.id} className="mt-4">
           {tab.content}

@@ -32,6 +32,11 @@ interface SubdivisionComboboxProps {
   clearable?: boolean;
   id?: string;
   'data-testid'?: string;
+  /** Open the dropdown on first render. Used by inline-edit wrappers. */
+  defaultOpen?: boolean;
+  /** Notified whenever the dropdown opens/closes. Inline-edit wrappers use
+   * this to auto-exit edit mode when the user dismisses without picking. */
+  onOpenChange?: (open: boolean) => void;
 }

 export function SubdivisionCombobox({

@@ -44,8 +49,14 @@ export function SubdivisionCombobox({
   clearable = true,
   id,
   'data-testid': testId,
+  defaultOpen = false,
+  onOpenChange,
 }: SubdivisionComboboxProps) {
-  const [open, setOpen] = useState(false);
+  const [open, setOpen] = useState(defaultOpen);
+  const handleOpenChange = (next: boolean) => {
+    setOpen(next);
+    onOpenChange?.(next);
+  };

   const options = useMemo(() => {
     if (!country) return [];
@@ -64,7 +75,7 @@ export function SubdivisionCombobox({
   else triggerLabel = placeholder;

   return (
-    <Popover open={open} onOpenChange={setOpen}>
+    <Popover open={open} onOpenChange={handleOpenChange}>
       <PopoverTrigger asChild>
         <Button
           id={id}

@@ -29,6 +29,11 @@ interface TimezoneComboboxProps {
   clearable?: boolean;
   id?: string;
   'data-testid'?: string;
+  /** Open the dropdown on first render. Used by inline-edit wrappers. */
+  defaultOpen?: boolean;
+  /** Notified whenever the dropdown opens/closes. Inline-edit wrappers use
+   * this to auto-exit edit mode when the user dismisses without picking. */
+  onOpenChange?: (open: boolean) => void;
 }

 export function TimezoneCombobox({

@@ -41,8 +46,14 @@ export function TimezoneCombobox({
   clearable = true,
   id,
   'data-testid': testId,
+  defaultOpen = false,
+  onOpenChange,
 }: TimezoneComboboxProps) {
-  const [open, setOpen] = useState(false);
+  const [open, setOpen] = useState(defaultOpen);
+  const handleOpenChange = (next: boolean) => {
+    setOpen(next);
+    onOpenChange?.(next);
+  };

   const allOptions = useMemo(() => {
     return listAllTimezones().map((tz) => ({
@@ -66,7 +77,7 @@ export function TimezoneCombobox({
   const selectedLabel = value ? formatTimezoneLabel(value) : placeholder;

   return (
-    <Popover open={open} onOpenChange={setOpen}>
+    <Popover open={open} onOpenChange={handleOpenChange}>
       <PopoverTrigger asChild>
         <Button
           id={id}

@@ -45,7 +45,7 @@ export function KPITile({
     <div
       data-testid="kpi-tile"
       className={cn(
-        'group relative overflow-hidden rounded-xl border border-border bg-gradient-brand-soft p-5 shadow-sm transition-all duration-base ease-smooth hover:shadow-md',
+        'group relative overflow-hidden rounded-xl border border-border bg-gradient-brand-soft p-3 shadow-sm transition-all duration-base ease-smooth hover:shadow-md sm:p-5',
         className,
       )}
       {...props}
@@ -53,10 +53,12 @@ export function KPITile({
       <div className={cn('absolute inset-x-0 top-0 h-1', ACCENT_STRIPES[accent])} aria-hidden />
       <div className="flex items-start justify-between gap-4">
         <div className="min-w-0">
-          <div className="text-xs font-medium uppercase tracking-wide text-muted-foreground">
+          <div className="text-[10px] font-medium uppercase tracking-wide text-muted-foreground sm:text-xs">
             {title}
           </div>
-          <div className="mt-2 text-2xl font-semibold tabular-nums text-foreground">{value}</div>
+          <div className="mt-1 truncate text-lg font-semibold tabular-nums text-foreground sm:mt-2 sm:text-2xl">
+            {value}
+          </div>
           {typeof delta === 'number' ? (
             <div className={cn('mt-1 text-xs font-medium', deltaClass)}>
               {deltaPrefix}
@@ -142,7 +142,10 @@ export function YachtDetailHeader({ yacht }: YachtDetailHeaderProps) {
   return (
     <>
       <DetailHeaderStrip>
-        <div className="flex items-start gap-3 flex-wrap">
+        {/* Stacks vertically on phone widths so the action cluster doesn't
+            crush the status pill / owner row. From sm up, title block sits
+            beside actions in the original layout. */}
+        <div className="flex flex-col gap-3 sm:flex-row sm:items-start sm:gap-3 sm:flex-wrap">
           <div className="flex-1 min-w-0">
             <div className="flex items-center gap-2 flex-wrap">
               <h1 className="hidden sm:block text-2xl font-bold text-foreground truncate">
src/lib/db/migrations/0021_unusual_azazel.sql (new file, 30 lines)
CREATE TABLE "client_merge_candidates" (
  "id" text PRIMARY KEY NOT NULL,
  "port_id" text NOT NULL,
  "client_a_id" text NOT NULL,
  "client_b_id" text NOT NULL,
  "score" integer NOT NULL,
  "reasons" jsonb NOT NULL,
  "status" text DEFAULT 'pending' NOT NULL,
  "created_at" timestamp with time zone DEFAULT now() NOT NULL,
  "resolved_at" timestamp with time zone,
  "resolved_by" text
);
--> statement-breakpoint
CREATE TABLE "migration_source_links" (
  "id" text PRIMARY KEY NOT NULL,
  "source_system" text NOT NULL,
  "source_id" text NOT NULL,
  "target_entity_type" text NOT NULL,
  "target_entity_id" text NOT NULL,
  "applied_id" text NOT NULL,
  "applied_by" text,
  "applied_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "client_merge_candidates" ADD CONSTRAINT "client_merge_candidates_port_id_ports_id_fk" FOREIGN KEY ("port_id") REFERENCES "public"."ports"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "client_merge_candidates" ADD CONSTRAINT "client_merge_candidates_client_a_id_clients_id_fk" FOREIGN KEY ("client_a_id") REFERENCES "public"."clients"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "client_merge_candidates" ADD CONSTRAINT "client_merge_candidates_client_b_id_clients_id_fk" FOREIGN KEY ("client_b_id") REFERENCES "public"."clients"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "idx_cmc_port_status" ON "client_merge_candidates" USING btree ("port_id","status");--> statement-breakpoint
CREATE UNIQUE INDEX "idx_cmc_pair" ON "client_merge_candidates" USING btree ("port_id","client_a_id","client_b_id");--> statement-breakpoint
CREATE UNIQUE INDEX "idx_msl_source_target" ON "migration_source_links" USING btree ("source_system","source_id","target_entity_type");
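The unique index `idx_cmc_pair` can only deduplicate pairs because callers store them canonically, with the lexicographically smaller client id in `client_a_id`. A minimal sketch of such an ordering helper (the `canonicalPair` name is hypothetical, not from this repo):

```typescript
// Canonicalize an unordered client pair so (a, b) and (b, a) produce the
// same (client_a_id, client_b_id) key. Plain string comparison, matching
// the "clientAId < clientBId" convention described in the schema.
function canonicalPair(a: string, b: string): [string, string] {
  return a < b ? [a, b] : [b, a];
}

const [first, second] = canonicalPair('client-9f', 'client-3a');
console.log(first, second); // client-3a client-9f, stable regardless of input order
```

With this in place the unique index rejects a second row for the same pair no matter which direction the scoring job discovered it in.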
src/lib/db/migrations/0022_magenta_madame_hydra.sql (new file, 2 lines)
ALTER TABLE "clients" ADD COLUMN "merged_into_client_id" text;--> statement-breakpoint
CREATE INDEX "idx_clients_merged_into" ON "clients" USING btree ("merged_into_client_id");
src/lib/db/migrations/meta/0021_snapshot.json (new file, 10482 lines; diff suppressed because it is too large)

src/lib/db/migrations/meta/0022_snapshot.json (new file, 10503 lines; diff suppressed because it is too large)
@@ -148,6 +148,20 @@
       "when": 1777814682110,
       "tag": "0020_medical_betty_brant",
       "breakpoints": true
     },
+    {
+      "idx": 21,
+      "version": "7",
+      "when": 1777811835982,
+      "tag": "0021_unusual_azazel",
+      "breakpoints": true
+    },
+    {
+      "idx": 22,
+      "version": "7",
+      "when": 1777812671833,
+      "tag": "0022_magenta_madame_hydra",
+      "breakpoints": true
+    }
   ]
 }
@@ -2,6 +2,7 @@ import {
   pgTable,
   text,
   boolean,
+  integer,
   timestamp,
   jsonb,
   index,
@@ -30,6 +31,11 @@ export const clients = pgTable(
     source: text('source'), // website, manual, referral, broker
     sourceDetails: text('source_details'),
     archivedAt: timestamp('archived_at', { withTimezone: true }),
+    /** When this client was merged into another (the "loser" of a dedup
+     * merge), this points at the surviving client. Used by the
+     * /admin/duplicates review queue to redirect any stragglers, and by
+     * the unmerge flow to restore. Null for live clients. */
+    mergedIntoClientId: text('merged_into_client_id'),
     createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
     updatedAt: timestamp('updated_at', { withTimezone: true }).notNull().defaultNow(),
   },
@@ -38,6 +44,7 @@ export const clients = pgTable(
     index('idx_clients_name').on(table.portId, table.fullName),
     index('idx_clients_archived').on(table.portId, table.archivedAt),
     index('idx_clients_nationality_iso').on(table.nationalityIso),
+    index('idx_clients_merged_into').on(table.mergedIntoClientId),
   ],
 );
@@ -145,6 +152,54 @@ export const clientMergeLog = pgTable(
   (table) => [index('idx_cml_port').on(table.portId)],
 );

+/**
+ * Pairs of clients flagged by the background scoring job as potential
+ * duplicates. The `/admin/duplicates` review queue reads from here.
+ *
+ * Lifecycle:
+ * - Background job inserts a row when a pair scores >= the
+ *   `dedup_review_queue_threshold` system setting.
+ * - User reviews in the admin UI and either merges (status='merged')
+ *   or dismisses (status='dismissed').
+ * - Subsequent runs of the scoring job skip pairs already
+ *   `dismissed` so the same false-positive doesn't keep reappearing.
+ *   A future score increase recreates the row.
+ *
+ * Pairs are stored canonically with `clientAId < clientBId` (string
+ * comparison) so the same pair only generates one row regardless of
+ * scoring direction.
+ */
+export const clientMergeCandidates = pgTable(
+  'client_merge_candidates',
+  {
+    id: text('id')
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
+    portId: text('port_id')
+      .notNull()
+      .references(() => ports.id),
+    clientAId: text('client_a_id')
+      .notNull()
+      .references(() => clients.id, { onDelete: 'cascade' }),
+    clientBId: text('client_b_id')
+      .notNull()
+      .references(() => clients.id, { onDelete: 'cascade' }),
+    score: integer('score').notNull(),
+    /** Human-readable rule list, e.g. ["email match", "phone match"]. */
+    reasons: jsonb('reasons').notNull(),
+    status: text('status').notNull().default('pending'), // pending | dismissed | merged
+    createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
+    resolvedAt: timestamp('resolved_at', { withTimezone: true }),
+    resolvedBy: text('resolved_by'),
+  },
+  (table) => [
+    index('idx_cmc_port_status').on(table.portId, table.status),
+    // Same pair shouldn't surface twice — enforce uniqueness on the
+    // canonical (a < b) ordering.
+    uniqueIndex('idx_cmc_pair').on(table.portId, table.clientAId, table.clientBId),
+  ],
+);
+
 export const clientAddresses = pgTable(
   'client_addresses',
   {
@@ -190,3 +245,5 @@ export type ClientMergeLog = typeof clientMergeLog.$inferSelect;
 export type NewClientMergeLog = typeof clientMergeLog.$inferInsert;
 export type ClientAddress = typeof clientAddresses.$inferSelect;
 export type NewClientAddress = typeof clientAddresses.$inferInsert;
+export type ClientMergeCandidate = typeof clientMergeCandidates.$inferSelect;
+export type NewClientMergeCandidate = typeof clientMergeCandidates.$inferInsert;
@@ -56,5 +56,8 @@ export * from './ai-usage';
 // GDPR export tracking (Phase 3d)
 export * from './gdpr';

+// Migration ledger (one-shot scripts — NocoDB import etc.)
+export * from './migration';
+
 // Relations (must come last — references all tables)
 export * from './relations';
src/lib/db/schema/migration.ts (new file, 48 lines)
import { pgTable, text, timestamp, uniqueIndex } from 'drizzle-orm/pg-core';

/**
 * Idempotency ledger for one-shot data migrations from external sources
 * (e.g. the legacy NocoDB Interests table).
 *
 * Every entity created during a migration script's `--apply` run gets a
 * row here mapping the source-system row identifier to the new-system
 * entity id. Re-running `--apply` against the same report skips rows
 * already linked, so partial-failure resumption is just "run again."
 *
 * One source row can generate multiple new entities (e.g. one NocoDB
 * Interests row → one client + one interest + one yacht), so the
 * uniqueness constraint includes `target_entity_type`.
 */
export const migrationSourceLinks = pgTable(
  'migration_source_links',
  {
    id: text('id')
      .primaryKey()
      .$defaultFn(() => crypto.randomUUID()),
    /** e.g. 'nocodb_interests', 'nocodb_residences', 'nocodb_website_submissions'. */
    sourceSystem: text('source_system').notNull(),
    /** Source row identifier as a string (NocoDB IDs are integers; we keep
     * text here for forward compat with other sources). */
    sourceId: text('source_id').notNull(),
    /** e.g. 'client', 'interest', 'yacht', 'document'. */
    targetEntityType: text('target_entity_type').notNull(),
    /** UUID of the new-system entity (clients.id, interests.id, etc.). */
    targetEntityId: text('target_entity_id').notNull(),
    /** Apply-id from the migration run that created this link — pairs with
     * the on-disk apply manifest so `--rollback --apply-id <id>` knows
     * exactly which links to remove. */
    appliedId: text('applied_id').notNull(),
    appliedBy: text('applied_by'),
    appliedAt: timestamp('applied_at', { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => [
    uniqueIndex('idx_msl_source_target').on(
      table.sourceSystem,
      table.sourceId,
      table.targetEntityType,
    ),
  ],
);

export type MigrationSourceLink = typeof migrationSourceLinks.$inferSelect;
export type NewMigrationSourceLink = typeof migrationSourceLinks.$inferInsert;
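The skip-if-already-linked behaviour this ledger enables can be sketched in memory, independent of the database. This is a simplified illustration (the `applyOnce` helper and its key scheme are hypothetical, not code from this repo):

```typescript
// In-memory sketch of the idempotency check: an entity is created at most
// once per (sourceSystem, sourceId, targetEntityType) key, and a re-run
// returns the previously created id instead of inserting again.
const ledger = new Map<string, string>();

function applyOnce(
  sourceSystem: string,
  sourceId: string,
  targetEntityType: string,
  create: () => string, // returns the new entity's id
): { id: string; inserted: boolean } {
  const key = `${sourceSystem}\u0000${sourceId}\u0000${targetEntityType}`;
  const existing = ledger.get(key);
  if (existing) return { id: existing, inserted: false };
  const id = create();
  ledger.set(key, id);
  return { id, inserted: true };
}

const firstRun = applyOnce('nocodb_interests', '42', 'client', () => 'client-1');
const rerun = applyOnce('nocodb_interests', '42', 'client', () => 'client-2');
// rerun.inserted is false and rerun.id is 'client-1': the second apply is a no-op.
```

The real table plays the role of `ledger` across process restarts, which is what makes "resume after partial failure" just "run the script again."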
src/lib/dedup/find-matches.ts (new file, 255 lines)
/**
 * Client-match finder — pure scoring logic.
 *
 * Compares one input candidate against a pool of existing candidates and
 * returns scored matches. Used by:
 * - the at-create suggestion in client/interest forms (Layer 1)
 * - the public-form auto-link path (when score >= block threshold)
 * - the nightly background scoring job (Layer 3)
 * - the migration script's dedup pass
 *
 * Performance shape: blocking via email / phone / surname-token reduces
 * the pairwise scan from O(n²) to ~O(n) for any pool size we'll see in
 * production. See `findClientMatches` for the blocking implementation.
 *
 * Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §4.
 */

import { parsePhoneScriptSafe as parsePhone } from './phone-parse';
import { levenshtein } from './normalize';

// ─── Types ──────────────────────────────────────────────────────────────────

export interface MatchCandidate {
  id: string;
  fullName: string | null;
  /** Lowercased last non-particle token from `normalizeName(...).surnameToken`.
   * Used as a blocking key. */
  surnameToken: string | null;
  /** Already lowercased + validated via `normalizeEmail`. */
  emails: string[];
  /** Already canonical E.164 via `normalizePhone`. */
  phonesE164: string[];
  /** Address country (NOT phone country) — used for tiebreaking, not scoring. */
  countryIso: string | null;
}

export type MatchConfidence = 'high' | 'medium' | 'low';

export interface MatchResult {
  candidate: MatchCandidate;
  /** 0–100 after capping. */
  score: number;
  /** Human-readable list of which rules contributed. Useful for the
   * review queue UI ("matched on email + phone + surname token"). */
  reasons: string[];
  confidence: MatchConfidence;
}

export interface DedupThresholds {
  /** Inclusive lower bound for `'high'` confidence. */
  highScore: number;
  /** Inclusive lower bound for `'medium'` confidence. Below this is `'low'`. */
  mediumScore: number;
}

// ─── Public entry point ─────────────────────────────────────────────────────

/**
 * Compare `input` against every reachable candidate in `pool` and return
 * scored matches, sorted by score descending. The result list includes
 * low-confidence hits — caller filters by `confidence` or `score`
 * depending on use case.
 *
 * Self-matches (an entry with `id === input.id`, e.g. when re-scoring an
 * existing client during a background job) are excluded.
 */
export function findClientMatches(
  input: MatchCandidate,
  pool: MatchCandidate[],
  thresholds: DedupThresholds,
): MatchResult[] {
  if (pool.length === 0) return [];

  // ── Phase 1: build blocking indexes off the pool. ─────────────────────────
  //
  // Three indexes mean any candidate that shares ANY of (email / phone /
  // surname-token) with the input shows up in the comparison set. Anything
  // that shares NONE is structurally too different to be a duplicate and
  // is skipped — this is what keeps the algorithm O(n) at scale.
  const byEmail = new Map<string, MatchCandidate[]>();
  const byPhone = new Map<string, MatchCandidate[]>();
  const bySurnameToken = new Map<string, MatchCandidate[]>();

  for (const c of pool) {
    if (c.id === input.id) continue;
    for (const email of c.emails) {
      pushTo(byEmail, email, c);
    }
    for (const phone of c.phonesE164) {
      pushTo(byPhone, phone, c);
    }
    if (c.surnameToken) {
      pushTo(bySurnameToken, c.surnameToken, c);
    }
  }

  // ── Phase 2: gather the comparison set via the blocking indexes. ─────────
  const comparisonSet = new Map<string, MatchCandidate>();
  for (const email of input.emails) {
    for (const c of byEmail.get(email) ?? []) {
      comparisonSet.set(c.id, c);
    }
  }
  for (const phone of input.phonesE164) {
    for (const c of byPhone.get(phone) ?? []) {
      comparisonSet.set(c.id, c);
    }
  }
  if (input.surnameToken) {
    for (const c of bySurnameToken.get(input.surnameToken) ?? []) {
      comparisonSet.set(c.id, c);
    }
  }

  // ── Phase 3: score every candidate that survived blocking. ───────────────
  const results: MatchResult[] = [];
  for (const candidate of comparisonSet.values()) {
    const r = scorePair(input, candidate);
    results.push(r);
  }

  // ── Phase 4: sort by score desc + assign confidence tier. ────────────────
  results.sort((a, b) => b.score - a.score);
  for (const r of results) {
    r.confidence = classify(r.score, thresholds);
  }
  return results;
}

// ─── Scoring ────────────────────────────────────────────────────────────────

/**
 * Score one (input, candidate) pair against the rule set in design §4.2.
 * Compounding: positive rules sum, negative rules subtract; the result is
 * clamped to [0, 100]. Reasons accumulate in the order rules fire so the
 * review-queue UI can show "matched on email + phone".
 */
function scorePair(a: MatchCandidate, b: MatchCandidate): MatchResult {
  let score = 0;
  const reasons: string[] = [];

  // ── Positive rules. ──────────────────────────────────────────────────────

  const sharedEmail = a.emails.find((e) => b.emails.includes(e));
  const emailMatch = !!sharedEmail;
  if (emailMatch) {
    score += 60;
    reasons.push('email match');
  }

  const sharedPhone = a.phonesE164.find((p) => b.phonesE164.includes(p) && countDigits(p) >= 8);
  const phoneMatch = !!sharedPhone;
  if (phoneMatch) {
    score += 50;
    reasons.push('phone match');
  }

  const aNameNorm = (a.fullName ?? '').toLowerCase().trim();
  const bNameNorm = (b.fullName ?? '').toLowerCase().trim();
  const nameExactMatch = aNameNorm.length > 0 && aNameNorm === bNameNorm;
  if (nameExactMatch) {
    score += 20;
    reasons.push('name match');
  }

  // Surname + given-name fuzzy. Only fires when names are NOT exactly
  // equal — avoids double-counting with the rule above. Catches
  // 'Constanzo' / 'Costanzo', 'Marc' / 'Marcus' etc. when other contact
  // signals confirm them.
  if (!nameExactMatch && a.surnameToken && b.surnameToken && a.surnameToken === b.surnameToken) {
    const aGiven = (a.fullName ?? '').toLowerCase().split(/\s+/)[0] ?? '';
    const bGiven = (b.fullName ?? '').toLowerCase().split(/\s+/)[0] ?? '';
    if (aGiven && bGiven && levenshtein(aGiven, bGiven) <= 1) {
      score += 15;
      reasons.push('surname + given-name fuzzy match');
    }
  }

  // ── Negative rules. ──────────────────────────────────────────────────────

  // Same email but the two parties' phone numbers belong to different
  // countries. Common when one inbox is shared by spouses / coworkers
  // and the actual phone owners are distinct people. Don't auto-merge.
  if (emailMatch && !phoneMatch && a.phonesE164.length > 0 && b.phonesE164.length > 0) {
    const aCountries = phoneCountriesOf(a);
    const bCountries = phoneCountriesOf(b);
    const overlap = [...aCountries].some((c) => bCountries.has(c));
    if (!overlap && aCountries.size > 0 && bCountries.size > 0) {
      score -= 15;
      reasons.push('phone country mismatch (negative)');
    }
  }

  // Same name but no contact match. Two distinct people with the same
  // name (common for "John Smith") sneak through name-based blocking;
  // penalize so the score lands below the auto-merge threshold.
  if (nameExactMatch && !emailMatch && !phoneMatch) {
    score -= 20;
    reasons.push('name match but no shared contact (negative)');
  }

  return {
    candidate: b,
    score: clamp(score, 0, 100),
    reasons,
    confidence: 'low', // assigned by caller after threshold lookup
  };
}

// ─── Helpers ────────────────────────────────────────────────────────────────

function pushTo<K, V>(map: Map<K, V[]>, key: K, value: V): void {
  const existing = map.get(key);
  if (existing) {
    existing.push(value);
  } else {
    map.set(key, [value]);
  }
}

function classify(score: number, thresholds: DedupThresholds): MatchConfidence {
  if (score >= thresholds.highScore) return 'high';
  if (score >= thresholds.mediumScore) return 'medium';
  return 'low';
}

function clamp(value: number, min: number, max: number): number {
  if (value < min) return min;
  if (value > max) return max;
  return value;
}

function countDigits(s: string): number {
  let count = 0;
  for (let i = 0; i < s.length; i += 1) {
    const code = s.charCodeAt(i);
    if (code >= 48 && code <= 57) count += 1;
  }
  return count;
}

/**
 * Resolve each phone in a candidate to its ISO country code (via
 * libphonenumber-js). Cached per call; the surrounding caller doesn't
 * batch so we accept the parse cost.
 */
function phoneCountriesOf(c: MatchCandidate): Set<string> {
  const out = new Set<string>();
  for (const p of c.phonesE164) {
    const parsed = parsePhone(p);
    if (parsed.country) out.add(parsed.country);
  }
  return out;
}
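The blocking idea behind `findClientMatches` can be shown in isolation. The sketch below is a simplified, email-only version with hypothetical names (`Person`, `blockByEmail`), not the file's actual code: index the pool once, then only candidates sharing a key with the input ever reach the scorer.

```typescript
// Email-only blocking sketch. Instead of scoring the input against every
// pool entry (quadratic across a whole batch), build an index keyed on
// email and score only the entries that collide with the input's keys.
interface Person {
  id: string;
  emails: string[];
}

function blockByEmail(input: Person, pool: Person[]): Person[] {
  const byEmail = new Map<string, Person[]>();
  for (const p of pool) {
    if (p.id === input.id) continue; // exclude self-matches, as the real code does
    for (const e of p.emails) {
      const bucket = byEmail.get(e);
      if (bucket) bucket.push(p);
      else byEmail.set(e, [p]);
    }
  }
  // Dedupe via a Map so a person sharing several emails appears once.
  const out = new Map<string, Person>();
  for (const e of input.emails) {
    for (const p of byEmail.get(e) ?? []) out.set(p.id, p);
  }
  return [...out.values()];
}

const pool: Person[] = [
  { id: 'a', emails: ['kim@example.com'] },
  { id: 'b', emails: ['lee@example.com'] },
  { id: 'c', emails: ['kim@example.com', 'k2@example.com'] },
];
const hits = blockByEmail({ id: 'x', emails: ['kim@example.com'] }, pool);
// hits contains 'a' and 'c'; 'b' shares no key and never reaches the scorer.
```

The real implementation layers two more indexes (phone, surname token) over the same pattern, so a candidate survives blocking if it collides on any one of the three keys.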
src/lib/dedup/migration-apply.ts (new file, 362 lines)
/**
 * Apply phase for the legacy NocoDB → CRM migration. Walks a
 * `MigrationPlan` produced by {@link transformSnapshot} and writes
 * the new client / contact / address / yacht / interest rows into the
 * target port.
 *
 * Idempotent: every insert is guarded by a `migration_source_links`
 * lookup keyed on `(source_system, source_id, target_entity_type)`, so
 * a partial failure can be resumed by re-running the script. Re-runs
 * against an already-applied plan are a near-no-op.
 *
 * Per-entity transactions (not one giant transaction) — the design
 * favours visible partial progress on failure over all-or-nothing.
 *
 * @see src/lib/dedup/migration-transform.ts for the input shape.
 * @see src/lib/db/schema/migration.ts for the idempotency ledger.
 */

import { and, eq, inArray } from 'drizzle-orm';

import { db } from '@/lib/db';
import { clients, clientContacts, clientAddresses } from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { yachts } from '@/lib/db/schema/yachts';
import { berths } from '@/lib/db/schema/berths';
import { migrationSourceLinks } from '@/lib/db/schema/migration';
import type { MigrationPlan, PlannedClient, PlannedInterest } from './migration-transform';

const SOURCE_SYSTEM = 'nocodb_interests';

/**
 * Convert a legacy bare mooring string like "D32" / "A1" / "E18" to the
 * dashed/padded form "D-32" / "A-01" / "E-18" used by the new berths
 * schema. If the input doesn't match the bare pattern, returns it
 * unchanged so a literal lookup can still hit (handles the case where
 * the legacy data already has the dashed form).
 *
 * Multi-mooring strings ("A3, D30") return the original string —
 * those need human review and we don't want to silently pick one half.
 */
function normalizeLegacyMooring(raw: string): string {
  // Bare letter+digits, e.g. "D32"
  const m = /^([A-E])(\d{1,3})$/i.exec(raw.trim());
  if (!m) return raw;
  const letter = m[1]!.toUpperCase();
  const num = parseInt(m[2]!, 10);
  return `${letter}-${num.toString().padStart(2, '0')}`;
}

export interface ApplyResult {
  applyId: string;
  clientsInserted: number;
  clientsSkipped: number;
  contactsInserted: number;
  addressesInserted: number;
  yachtsInserted: number;
  interestsInserted: number;
  interestsSkipped: number;
  warnings: string[];
}

export interface ApplyOptions {
  port: { id: string; slug: string };
  applyId: string;
  /** Set to true for the "preview the writes" mode — runs every read but
   * rolls back inserts. Useful for verifying mappings before committing. */
  rehearsal?: boolean;
  appliedBy?: string;
}

/**
 * Look up an existing migration link for a (sourceId, targetType) pair.
 * Returns the existing target entity id if already linked.
 */
async function resolveExistingLink(
  sourceId: number,
  targetEntityType: 'client' | 'interest' | 'yacht' | 'address',
): Promise<string | null> {
  const rows = await db
    .select({ id: migrationSourceLinks.targetEntityId })
    .from(migrationSourceLinks)
    .where(
      and(
        eq(migrationSourceLinks.sourceSystem, SOURCE_SYSTEM),
        eq(migrationSourceLinks.sourceId, String(sourceId)),
        eq(migrationSourceLinks.targetEntityType, targetEntityType),
      ),
    )
    .limit(1);
  return rows[0]?.id ?? null;
}

/** Find the first sourceId in a cluster that's already linked to a client,
 * if any. The cluster might be larger than the previously-applied set if
 * the dedup algorithm collapsed an extra duplicate this run. */
async function resolveExistingClusterClient(sourceIds: number[]): Promise<string | null> {
  if (sourceIds.length === 0) return null;
  const rows = await db
    .select({ id: migrationSourceLinks.targetEntityId })
    .from(migrationSourceLinks)
    .where(
      and(
        eq(migrationSourceLinks.sourceSystem, SOURCE_SYSTEM),
        inArray(migrationSourceLinks.sourceId, sourceIds.map(String)),
        eq(migrationSourceLinks.targetEntityType, 'client'),
      ),
    )
    .limit(1);
  return rows[0]?.id ?? null;
}

/** Apply a single PlannedClient — returns `{clientId, inserted}` so the
 * caller can wire interests against the (possibly pre-existing) record. */
async function applyClient(
  planned: PlannedClient,
  opts: ApplyOptions,
  result: ApplyResult,
): Promise<{ clientId: string; inserted: boolean }> {
  // Idempotency: if any source row in the cluster already mapped to a client,
  // reuse that record.
  const existing = await resolveExistingClusterClient(planned.sourceIds);
  if (existing) {
    result.clientsSkipped += 1;
    return { clientId: existing, inserted: false };
  }

  if (opts.rehearsal) {
    // Simulate an insert without writing — used for the preview path.
    return { clientId: `rehearsal-${planned.tempId}`, inserted: true };
  }

  // surnameToken is on the planned object (used by the dedup blocking
  // index inside the transform) but not in the clients schema — runtime
  // dedup re-derives it from fullName when needed. Drop it on insert.
  const [inserted] = await db
    .insert(clients)
    .values({
      portId: opts.port.id,
      fullName: planned.fullName,
      nationalityIso: planned.countryIso ?? null,
      preferredContactMethod: planned.preferredContactMethod ?? null,
      source: planned.source ?? null,
    })
    .returning({ id: clients.id });

  if (!inserted) throw new Error('Client insert returned no row');
  const clientId = inserted.id;

  // Record idempotency links — one per source row in the cluster.
  await db.insert(migrationSourceLinks).values(
    planned.sourceIds.map((sid) => ({
      sourceSystem: SOURCE_SYSTEM,
      sourceId: String(sid),
      targetEntityType: 'client' as const,
      targetEntityId: clientId,
      appliedId: opts.applyId,
      ...(opts.appliedBy ? { appliedBy: opts.appliedBy } : {}),
    })),
  );

  // Contacts: bulk insert; mark first email + first phone as primary.
  if (planned.contacts.length > 0) {
    let primaryEmailSet = false;
    let primaryPhoneSet = false;
    const contactRows = planned.contacts.map((ct) => {
      let isPrimary = false;
      if (ct.isPrimary) {
        if (ct.channel === 'email' && !primaryEmailSet) {
          isPrimary = true;
          primaryEmailSet = true;
        } else if ((ct.channel === 'phone' || ct.channel === 'whatsapp') && !primaryPhoneSet) {
          isPrimary = true;
          primaryPhoneSet = true;
        }
      }
      return {
        clientId,
        channel: ct.channel,
        value: ct.value,
        valueE164: ct.valueE164 ?? null,
        valueCountry: ct.valueCountry ?? null,
        isPrimary,
      };
    });
    await db.insert(clientContacts).values(contactRows);
    result.contactsInserted += contactRows.length;
  }

  // Addresses: bulk insert; first is marked primary if multiple. Note the
  // schema requires portId on every address row in addition to clientId.
  if (planned.addresses.length > 0) {
    const addressRows = planned.addresses.map((a, idx) => ({
      clientId,
      portId: opts.port.id,
      streetAddress: a.streetAddress ?? null,
      city: a.city ?? null,
      countryIso: a.countryIso ?? null,
      isPrimary: idx === 0,
    }));
    await db.insert(clientAddresses).values(addressRows);
    result.addressesInserted += addressRows.length;
  }

  result.clientsInserted += 1;
  return { clientId, inserted: true };
}

/** Apply a single PlannedInterest — looks up its client + berth + yacht and
 * inserts the interest row, plus a yacht stub if a yacht name is present. */
async function applyInterest(
  planned: PlannedInterest,
  tempIdToClientId: Map<string, string>,
  mooringToBerthId: Map<string, string>,
  opts: ApplyOptions,
  result: ApplyResult,
): Promise<void> {
  // Idempotency: skip if this source row already created an interest.
  const existing = await resolveExistingLink(planned.sourceId, 'interest');
  if (existing) {
    result.interestsSkipped += 1;
    return;
  }

  const clientId = tempIdToClientId.get(planned.clientTempId);
  if (!clientId) {
    result.warnings.push(
      `Interest source=${planned.sourceId} references unknown client tempId=${planned.clientTempId} — skipped`,
    );
    return;
  }

  let berthId: string | null = null;
  if (planned.berthMooringNumber) {
    berthId =
      mooringToBerthId.get(planned.berthMooringNumber) ??
      // The legacy NocoDB Interests table uses bare mooring strings like
      // "D32", "B16", whereas the new berths schema (mirroring the NocoDB
      // Berths snapshot) uses zero-padded "D-32", "B-16". Try the dashed
      // form as a fallback so legacy references resolve correctly.
      mooringToBerthId.get(normalizeLegacyMooring(planned.berthMooringNumber)) ??
      null;
    if (!berthId) {
      result.warnings.push(
        `Interest source=${planned.sourceId} references unknown mooring="${planned.berthMooringNumber}" — interest created without berth link`,
      );
    }
  }

  // Optional yacht stub: if the legacy row had a yacht name, create a
  // minimal yacht record owned by the client. The new schema requires
  // currentOwnerType + currentOwnerId.
  let yachtId: string | null = null;
  if (planned.yachtName) {
    const existingYacht = await resolveExistingLink(planned.sourceId, 'yacht');
    if (existingYacht) {
      yachtId = existingYacht;
    } else if (!opts.rehearsal) {
      const [y] = await db
        .insert(yachts)
        .values({
          portId: opts.port.id,
          name: planned.yachtName,
          currentOwnerType: 'client',
          currentOwnerId: clientId,
          status: 'active',
        })
        .returning({ id: yachts.id });
      if (y) {
        yachtId = y.id;
        await db.insert(migrationSourceLinks).values({
          sourceSystem: SOURCE_SYSTEM,
          sourceId: String(planned.sourceId),
          targetEntityType: 'yacht' as const,
          targetEntityId: y.id,
          appliedId: opts.applyId,
          ...(opts.appliedBy ? { appliedBy: opts.appliedBy } : {}),
|
||||
});
|
||||
result.yachtsInserted += 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (opts.rehearsal) {
|
||||
result.interestsInserted += 1;
|
||||
return;
|
||||
}
|
||||
|
||||
const [iRow] = await db
|
||||
.insert(interests)
|
||||
.values({
|
||||
portId: opts.port.id,
|
||||
clientId,
|
||||
berthId,
|
||||
yachtId,
|
||||
pipelineStage: planned.pipelineStage,
|
||||
leadCategory: planned.leadCategory,
|
||||
source: planned.source,
|
||||
notes: planned.notes,
|
||||
documensoId: planned.documensoId,
|
||||
dateEoiSent: planned.dateEoiSent ? new Date(planned.dateEoiSent) : null,
|
||||
dateEoiSigned: planned.dateEoiSigned ? new Date(planned.dateEoiSigned) : null,
|
||||
dateContractSent: planned.dateContractSent ? new Date(planned.dateContractSent) : null,
|
||||
dateContractSigned: planned.dateContractSigned ? new Date(planned.dateContractSigned) : null,
|
||||
dateDepositReceived: planned.dateDepositReceived
|
||||
? new Date(planned.dateDepositReceived)
|
||||
: null,
|
||||
dateLastContact: planned.dateLastContact ? new Date(planned.dateLastContact) : null,
|
||||
})
|
||||
.returning({ id: interests.id });
|
||||
|
||||
if (!iRow) throw new Error('Interest insert returned no row');
|
||||
|
||||
await db.insert(migrationSourceLinks).values({
|
||||
sourceSystem: SOURCE_SYSTEM,
|
||||
sourceId: String(planned.sourceId),
|
||||
targetEntityType: 'interest' as const,
|
||||
targetEntityId: iRow.id,
|
||||
appliedId: opts.applyId,
|
||||
...(opts.appliedBy ? { appliedBy: opts.appliedBy } : {}),
|
||||
});
|
||||
|
||||
result.interestsInserted += 1;
|
||||
}
|
||||
|
||||
/**
|
||||
* Top-level apply driver. Walks the plan once, building the
|
||||
* tempId→clientId map as it goes, then walks interests with that map.
|
||||
*/
|
||||
export async function applyPlan(plan: MigrationPlan, opts: ApplyOptions): Promise<ApplyResult> {
|
||||
const result: ApplyResult = {
|
||||
applyId: opts.applyId,
|
||||
clientsInserted: 0,
|
||||
clientsSkipped: 0,
|
||||
contactsInserted: 0,
|
||||
addressesInserted: 0,
|
||||
yachtsInserted: 0,
|
||||
interestsInserted: 0,
|
||||
interestsSkipped: 0,
|
||||
warnings: [],
|
||||
};
|
||||
|
||||
// 1. Clients (and their contacts/addresses)
|
||||
const tempIdToClientId = new Map<string, string>();
|
||||
for (const planned of plan.clients) {
|
||||
const { clientId } = await applyClient(planned, opts, result);
|
||||
tempIdToClientId.set(planned.tempId, clientId);
|
||||
}
|
||||
|
||||
// 2. Build mooring→berthId lookup once, scoped to this port.
|
||||
const berthRows = await db
|
||||
.select({ id: berths.id, mooringNumber: berths.mooringNumber })
|
||||
.from(berths)
|
||||
.where(eq(berths.portId, opts.port.id));
|
||||
const mooringToBerthId = new Map(berthRows.map((b) => [b.mooringNumber, b.id]));
|
||||
|
||||
// 3. Interests (and yacht stubs)
|
||||
for (const planned of plan.interests) {
|
||||
await applyInterest(planned, tempIdToClientId, mooringToBerthId, opts, result);
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
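`normalizeLegacyMooring` is referenced above but defined elsewhere in the module. As a minimal sketch of the conversion the inline comment describes (bare `"D32"` to dashed `"D-32"`), assuming moorings are a letter prefix followed by digits; this is a hypothetical illustration, not the module's actual implementation:

```typescript
// Hypothetical sketch of normalizeLegacyMooring — inserts a dash between
// the letter prefix and the digits, matching the dashed mooring format
// used by the new berths schema. Unrecognized shapes pass through unchanged.
function normalizeLegacyMooring(raw: string): string {
  const m = raw.trim().toUpperCase().match(/^([A-Z]+)-?(\d+)$/);
  if (!m) return raw; // leave anything unexpected untouched
  return `${m[1]}-${m[2]}`;
}
```

With this shape, already-dashed values pass through unchanged, so chaining both `mooringToBerthId.get(...)` lookups is safe.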
274 src/lib/dedup/migration-report.ts Normal file
@@ -0,0 +1,274 @@
/**
 * Migration report writer — turns a `MigrationPlan` (from
 * `migration-transform.ts`) into a CSV + a human-readable Markdown
 * summary on disk under `.migration/<timestamp>/`.
 *
 * The CSV format is intentionally machine-friendly (one row per
 * planned operation) so it can be diffed across runs and inspected
 * by hand. The summary is designed for "open this in your editor and
 * eyeball it for 5 minutes before --apply."
 */

import { promises as fs } from 'node:fs';
import path from 'node:path';

import type { MigrationPlan } from './migration-transform';

// ─── Output directory ───────────────────────────────────────────────────────

export interface ReportPaths {
  rootDir: string;
  csvPath: string;
  summaryPath: string;
  planJsonPath: string;
}

/** Resolve report paths relative to the worktree root. The timestamped
 * directory is created lazily by `writeReport`. */
export function resolveReportPaths(
  rootDir: string,
  timestamp: string = new Date().toISOString().replace(/[:.]/g, '-'),
): ReportPaths {
  const dir = path.join(rootDir, '.migration', timestamp);
  return {
    rootDir: dir,
    csvPath: path.join(dir, 'report.csv'),
    summaryPath: path.join(dir, 'summary.md'),
    planJsonPath: path.join(dir, 'plan.json'),
  };
}

// ─── CSV row shape ──────────────────────────────────────────────────────────

interface CsvRow {
  op: string; // create_client / create_contact / create_address / create_interest / auto_link / flag / needs_review
  reason: string;
  source_id: string;
  target_table: string;
  target_value: string;
  confidence: string;
  manual_review: 'true' | 'false';
}

// Trivial CSV escape: quote any cell that contains comma / quote / newline,
// double up internal quotes per RFC 4180. No need for a dependency.
function csvEscape(s: string): string {
  if (/[",\n\r]/.test(s)) {
    return `"${s.replace(/"/g, '""')}"`;
  }
  return s;
}
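The escaping rule is easiest to see by example. The snippet below is a standalone copy of `csvEscape` from above so it runs on its own:

```typescript
// Quote a cell only when it contains a comma, quote, or newline;
// internal quotes are doubled per RFC 4180.
function csvEscape(s: string): string {
  if (/[",\n\r]/.test(s)) {
    return `"${s.replace(/"/g, '""')}"`;
  }
  return s;
}

// csvEscape('plain')    → plain            (no quoting needed)
// csvEscape('a,b')      → "a,b"            (comma triggers quoting)
// csvEscape('say "hi"') → "say ""hi"""     (internal quotes doubled)
```

Cells without special characters are emitted verbatim, which keeps the report diffable across runs.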
function rowToCsvLine(r: CsvRow): string {
  return [
    r.op,
    r.reason,
    r.source_id,
    r.target_table,
    r.target_value,
    r.confidence,
    r.manual_review,
  ]
    .map(csvEscape)
    .join(',');
}

// ─── Build CSV ──────────────────────────────────────────────────────────────

export function buildCsv(plan: MigrationPlan): string {
  const lines: string[] = [];
  lines.push(
    [
      'op',
      'reason',
      'source_id',
      'target_table',
      'target_value',
      'confidence',
      'manual_review',
    ].join(','),
  );

  for (const client of plan.clients) {
    lines.push(
      rowToCsvLine({
        op: 'create_client',
        reason: client.sourceIds.length > 1 ? 'auto-merged cluster' : 'new',
        source_id: client.sourceIds.join('|'),
        target_table: 'clients.fullName',
        target_value: client.fullName,
        confidence: 'N/A',
        manual_review: 'false',
      }),
    );
    for (const c of client.contacts) {
      lines.push(
        rowToCsvLine({
          op: 'create_contact',
          reason: c.flagged ?? 'new',
          source_id: client.sourceIds.join('|'),
          target_table: `clientContacts.${c.channel}`,
          target_value: c.value,
          confidence: 'N/A',
          manual_review: c.flagged ? 'true' : 'false',
        }),
      );
    }
    for (const a of client.addresses) {
      lines.push(
        rowToCsvLine({
          op: 'create_address',
          reason: 'address text present',
          source_id: client.sourceIds.join('|'),
          target_table: 'clientAddresses.countryIso',
          target_value: a.countryIso ?? '(unresolved)',
          confidence: a.countryConfidence ?? 'fallback',
          manual_review: a.countryConfidence === 'fallback' || !a.countryIso ? 'true' : 'false',
        }),
      );
    }
  }

  for (const interest of plan.interests) {
    lines.push(
      rowToCsvLine({
        op: 'create_interest',
        reason: `pipelineStage=${interest.pipelineStage}`,
        source_id: String(interest.sourceId),
        target_table: 'interests',
        target_value: `${interest.berthMooringNumber ?? '(no berth)'} / ${interest.yachtName ?? '(no yacht)'}`,
        confidence: 'N/A',
        manual_review: 'false',
      }),
    );
  }

  for (const link of plan.autoLinks) {
    lines.push(
      rowToCsvLine({
        op: 'auto_link',
        reason: link.reasons.join(' + '),
        source_id: `${link.leadSourceId}<-${link.mergedSourceIds.join(',')}`,
        target_table: 'clients',
        target_value: '(merged into lead)',
        confidence: `score=${link.score}`,
        manual_review: 'false',
      }),
    );
  }

  for (const pair of plan.needsReview) {
    lines.push(
      rowToCsvLine({
        op: 'needs_review',
        reason: pair.reasons.join(' + '),
        source_id: `${pair.aSourceId}<->${pair.bSourceId}`,
        target_table: 'clients',
        target_value: '(human review required)',
        confidence: `score=${pair.score}`,
        manual_review: 'true',
      }),
    );
  }

  for (const flag of plan.flags) {
    lines.push(
      rowToCsvLine({
        op: 'flag',
        reason: flag.reason,
        source_id: String(flag.sourceId),
        target_table: flag.sourceTable,
        target_value: JSON.stringify(flag.details ?? {}),
        confidence: 'N/A',
        manual_review: 'true',
      }),
    );
  }

  return lines.join('\n') + '\n';
}

// ─── Build summary markdown ─────────────────────────────────────────────────

export function buildSummary(plan: MigrationPlan, generatedAt: string): string {
  const s = plan.stats;
  const lines: string[] = [];
  lines.push(`# Migration Dry-Run — ${generatedAt}`);
  lines.push('');
  lines.push('## Input');
  lines.push(`- ${s.inputInterestRows} NocoDB Interests`);
  lines.push(`- ${s.inputResidentialRows} NocoDB Residential Interests`);
  lines.push('');
  lines.push('## Outcome');
  lines.push(`- ${s.outputClients} clients`);
  lines.push(`- ${s.outputInterests} interests (one per source row, linked to deduped client)`);
  lines.push(`- ${s.outputContacts} client_contacts`);
  lines.push(`- ${s.outputAddresses} client_addresses`);
  lines.push('');
  lines.push('## Auto-linked clusters');
  if (plan.autoLinks.length === 0) {
    lines.push('_None — every input row maps to a unique client._');
  } else {
    for (const link of plan.autoLinks) {
      const merged = link.mergedSourceIds.length;
      lines.push(
        `- Lead row \`${link.leadSourceId}\` ← merged ${merged} other row${merged === 1 ? '' : 's'} (\`${link.mergedSourceIds.join(', ')}\`) — score ${link.score} via ${link.reasons.join(' + ')}`,
      );
    }
  }
  lines.push('');
  lines.push('## Pairs flagged for human review');
  if (plan.needsReview.length === 0) {
    lines.push('_None._');
  } else {
    for (const pair of plan.needsReview) {
      lines.push(
        `- Rows \`${pair.aSourceId}\` ↔ \`${pair.bSourceId}\` — score ${pair.score} (${pair.reasons.join(' + ')})`,
      );
    }
  }
  lines.push('');
  lines.push('## Data quality flags');
  if (plan.flags.length === 0) {
    lines.push('_No quality issues._');
  } else {
    const byReason = new Map<string, number>();
    for (const f of plan.flags) {
      byReason.set(f.reason, (byReason.get(f.reason) ?? 0) + 1);
    }
    for (const [reason, count] of [...byReason].sort((a, b) => b[1] - a[1])) {
      lines.push(`- **${count}× ${reason}**`);
    }
    lines.push('');
    lines.push('### Detail');
    for (const f of plan.flags.slice(0, 30)) {
      lines.push(
        `- \`${f.sourceTable}#${f.sourceId}\`: ${f.reason}${f.details ? ` — \`${JSON.stringify(f.details)}\`` : ''}`,
      );
    }
    if (plan.flags.length > 30) {
      lines.push(`- _… and ${plan.flags.length - 30} more (see report.csv for full list)_`);
    }
  }
  lines.push('');
  lines.push('## Next step');
  lines.push('');
  lines.push('Eyeball the auto-linked + flagged-for-review pairs above.');
  lines.push('When satisfied, re-run the script with `--apply --report .migration/<this-dir>/`.');
  lines.push('Apply will refuse to run if the source NocoDB has changed since this dry-run.');

  return lines.join('\n') + '\n';
}

// ─── Write to disk ──────────────────────────────────────────────────────────

export async function writeReport(
  paths: ReportPaths,
  plan: MigrationPlan,
  generatedAt: string,
): Promise<void> {
  await fs.mkdir(paths.rootDir, { recursive: true });
  await fs.writeFile(paths.csvPath, buildCsv(plan), 'utf-8');
  await fs.writeFile(paths.summaryPath, buildSummary(plan, generatedAt), 'utf-8');
  await fs.writeFile(paths.planJsonPath, JSON.stringify(plan, null, 2), 'utf-8');
}
576 src/lib/dedup/migration-transform.ts Normal file
@@ -0,0 +1,576 @@
/**
 * Pure transform: NocoDB snapshot → planned new-system entities + dedup result.
 *
 * Used by the migration script's `--dry-run` (to produce the report) and
 * `--apply` (to actually write). Keeping this pure means the same code
 * runs in both modes, in tests against the frozen fixture, and in the
 * one-off CLI run against the live base.
 *
 * No side effects, no DB calls, no external services.
 */

import {
  normalizeName,
  normalizeEmail,
  normalizePhone,
  resolveCountry,
  type NormalizedPhone,
} from './normalize';
import { findClientMatches, type MatchCandidate } from './find-matches';
import type { CountryCode } from '@/lib/i18n/countries';
import type { NocoDbRow, NocoDbSnapshot } from './nocodb-source';

// ─── Plan output ────────────────────────────────────────────────────────────

export interface PlannedClient {
  /** Stable id derived from the deduped cluster's lead row. Used by the
   * apply phase to reference newly-created clients before they exist
   * in the DB. */
  tempId: string;
  /** Source row IDs that contributed to this client (one if no duplicates,
   * many if dedup merged a cluster). */
  sourceIds: number[];
  fullName: string;
  surnameToken?: string;
  countryIso: CountryCode | null;
  preferredContactMethod: string | null;
  source: string | null;
  contacts: PlannedContact[];
  addresses: PlannedAddress[];
}

export interface PlannedContact {
  channel: 'email' | 'phone' | 'whatsapp' | 'other';
  value: string;
  valueE164?: string | null;
  valueCountry?: CountryCode | null;
  isPrimary: boolean;
  flagged?: string;
}

export interface PlannedAddress {
  streetAddress: string | null;
  city: string | null;
  countryIso: CountryCode | null;
  /** When confidence is low, the migration script flags the row for
   * human review. */
  countryConfidence: 'exact' | 'fuzzy' | 'city' | 'fallback' | null;
}

export interface PlannedInterest {
  /** NocoDB row id this interest came from. */
  sourceId: number;
  /** tempId of the planned client this interest hangs off. */
  clientTempId: string;
  pipelineStage: string;
  leadCategory: string | null;
  source: string | null;
  notes: string | null;
  /** Mooring number; the apply phase resolves this to a berthId via the
   * new-system Berths table. */
  berthMooringNumber: string | null;
  yachtName: string | null;
  /** Date stamps for milestone columns. ISO strings if parseable. */
  dateEoiSent: string | null;
  dateEoiSigned: string | null;
  dateDepositReceived: string | null;
  dateContractSent: string | null;
  dateContractSigned: string | null;
  dateLastContact: string | null;
  /** Documenso linkage carried forward when present so the document
   * record can be stitched up downstream. */
  documensoId: string | null;
}

export interface MigrationFlag {
  sourceTable: 'interests' | 'residential_interests' | 'website_interest_submissions';
  sourceId: number;
  reason: string;
  details?: Record<string, unknown>;
}

export interface MigrationPlan {
  clients: PlannedClient[];
  interests: PlannedInterest[];
  flags: MigrationFlag[];
  /** Pairs that the migration would auto-link (high score). */
  autoLinks: Array<{
    leadSourceId: number;
    mergedSourceIds: number[];
    score: number;
    reasons: string[];
  }>;
  /** Pairs that need human review (medium score). Each pair shows up
   * in the migration report; the user resolves before --apply. */
  needsReview: Array<{ aSourceId: number; bSourceId: number; score: number; reasons: string[] }>;
  stats: MigrationStats;
}

export interface MigrationStats {
  inputInterestRows: number;
  inputResidentialRows: number;
  outputClients: number;
  outputInterests: number;
  outputContacts: number;
  outputAddresses: number;
  flaggedRows: number;
  autoLinkedClusters: number;
  needsReviewPairs: number;
}

export interface TransformOptions {
  /** ISO country used when a phone has no prefix and the row has no
   * Place of Residence. Defaults to AI (Anguilla / Port Nimara's home). */
  defaultPhoneCountry: CountryCode;
  /** Score thresholds for auto-link vs human review. Should match the
   * per-port `system_settings` values once the runtime UI is in place. */
  thresholds: {
    autoLink: number;
    needsReview: number;
  };
}

const DEFAULT_OPTIONS: TransformOptions = {
  defaultPhoneCountry: 'AI',
  thresholds: { autoLink: 90, needsReview: 50 },
};

// ─── Stage mapping ──────────────────────────────────────────────────────────

const STAGE_MAP: Record<string, string> = {
  'General Qualified Interest': 'open',
  'Specific Qualified Interest': 'details_sent',
  'EOI and NDA Sent': 'eoi_sent',
  'Signed EOI and NDA': 'eoi_signed',
  'Made Reservation': 'deposit_10pct',
  'Contract Negotiation': 'contract_sent',
  'Contract Negotiations Finalized': 'contract_sent',
  'Contract Signed': 'contract_signed',
};

const LEAD_CATEGORY_MAP: Record<string, string> = {
  General: 'general_interest',
  'Friends and Family': 'general_interest',
};

const SOURCE_MAP: Record<string, string> = {
  portal: 'website',
  Form: 'website',
  External: 'manual',
};

// ─── Date parsing ───────────────────────────────────────────────────────────

/**
 * Parse a date the legacy NocoDB might have stored in DD-MM-YYYY,
 * DD/MM/YYYY, YYYY-MM-DD, or ISO format. Returns ISO string or null.
 */
function parseFlexibleDate(input: unknown): string | null {
  if (typeof input !== 'string' || input.trim() === '') return null;
  const s = input.trim();

  // Already ISO
  if (/^\d{4}-\d{2}-\d{2}/.test(s)) {
    const d = new Date(s);
    return Number.isNaN(d.getTime()) ? null : d.toISOString();
  }

  // DD-MM-YYYY or DD/MM/YYYY
  const m = s.match(/^(\d{1,2})[-/](\d{1,2})[-/](\d{4})$/);
  if (m) {
    const [, day, month, year] = m;
    const iso = `${year}-${month!.padStart(2, '0')}-${day!.padStart(2, '0')}`;
    const d = new Date(iso);
    return Number.isNaN(d.getTime()) ? null : d.toISOString();
  }

  // Anything else: try Date constructor as a last resort
  const d = new Date(s);
  return Number.isNaN(d.getTime()) ? null : d.toISOString();
}
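A few concrete inputs make the fallback order clear. This is a standalone copy of `parseFlexibleDate` from above so the examples run in isolation:

```typescript
// Standalone copy of parseFlexibleDate for illustration: ISO-prefixed
// strings win first, then DD-MM-YYYY / DD/MM/YYYY, then a last-resort
// Date constructor; anything unparseable yields null.
function parseFlexibleDate(input: unknown): string | null {
  if (typeof input !== 'string' || input.trim() === '') return null;
  const s = input.trim();

  // Already ISO (YYYY-MM-DD…): date-only strings parse as UTC midnight.
  if (/^\d{4}-\d{2}-\d{2}/.test(s)) {
    const d = new Date(s);
    return Number.isNaN(d.getTime()) ? null : d.toISOString();
  }

  // DD-MM-YYYY or DD/MM/YYYY — rearranged into ISO before parsing.
  const m = s.match(/^(\d{1,2})[-/](\d{1,2})[-/](\d{4})$/);
  if (m) {
    const [, day, month, year] = m;
    const iso = `${year}-${month!.padStart(2, '0')}-${day!.padStart(2, '0')}`;
    const d = new Date(iso);
    return Number.isNaN(d.getTime()) ? null : d.toISOString();
  }

  const d = new Date(s);
  return Number.isNaN(d.getTime()) ? null : d.toISOString();
}

// parseFlexibleDate('03/05/2026') → '2026-05-03T00:00:00.000Z'
// parseFlexibleDate('2026-05-03') → '2026-05-03T00:00:00.000Z'
// parseFlexibleDate('')           → null
```

Rearranging the DD/MM form into ISO before handing it to `Date` avoids the locale-dependent MM/DD interpretation the constructor would otherwise apply.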
// ─── Main transform ─────────────────────────────────────────────────────────

/**
 * Run the full transform pipeline against a NocoDB snapshot. Pure
 * function — same input always produces the same plan.
 */
export function transformSnapshot(
  snapshot: NocoDbSnapshot,
  options: Partial<TransformOptions> = {},
): MigrationPlan {
  const opts = { ...DEFAULT_OPTIONS, ...options };

  const flags: MigrationFlag[] = [];
  // Build per-row candidates first so we can run dedup before assigning
  // tempIds (clients with multiple source rows merge into one tempId).
  const perRow = snapshot.interests.map((row) => rowToCandidate(row, 'interests', opts, flags));

  // Dedup pass 1: every row scored against every other row (within the
  // same pool). The blocking strategy in `findClientMatches` keeps this
  // cheap even for the full 252-row dataset.
  const clusters = clusterByDedup(perRow, opts);

  // Build the planned clients + interests from the clusters.
  const clients: PlannedClient[] = [];
  const interests: PlannedInterest[] = [];
  const autoLinks: MigrationPlan['autoLinks'] = [];
  const needsReview: MigrationPlan['needsReview'] = [];

  for (const cluster of clusters) {
    const lead = cluster.leadCandidate;
    const tempId = `client-${lead.row.Id}`;

    // Build the client record from the lead row, then merge in any
    // contact info / address info from the other rows in the cluster.
    const planned = buildPlannedClient(tempId, cluster, opts);
    clients.push(planned);

    // Each row in the cluster becomes its own interest record.
    for (const member of cluster.members) {
      const interest = buildPlannedInterest(member.row, tempId);
      interests.push(interest);
    }

    if (cluster.members.length > 1) {
      autoLinks.push({
        leadSourceId: lead.row.Id,
        mergedSourceIds: cluster.members.filter((m) => m !== lead).map((m) => m.row.Id),
        score: cluster.maxScore,
        reasons: cluster.reasons,
      });
    }

    for (const pair of cluster.reviewPairs) {
      needsReview.push(pair);
    }
  }

  return {
    clients,
    interests,
    flags,
    autoLinks,
    needsReview,
    stats: {
      inputInterestRows: snapshot.interests.length,
      inputResidentialRows: snapshot.residentialInterests.length,
      outputClients: clients.length,
      outputInterests: interests.length,
      outputContacts: clients.reduce((sum, c) => sum + c.contacts.length, 0),
      outputAddresses: clients.reduce((sum, c) => sum + c.addresses.length, 0),
      flaggedRows: flags.length,
      autoLinkedClusters: autoLinks.length,
      needsReviewPairs: needsReview.length,
    },
  };
}

// ─── Helpers ────────────────────────────────────────────────────────────────

interface RowCandidate {
  row: NocoDbRow;
  candidate: MatchCandidate;
  /** Phone normalize result for the row's primary phone; used downstream
   * to attach valueE164 + country to the planned contact. */
  phoneResult: NormalizedPhone | null;
  /** Country resolved from "Place of Residence". */
  countryIso: CountryCode | null;
  countryConfidence: 'exact' | 'fuzzy' | 'city' | null;
  /** Normalized email or null. */
  email: string | null;
  /** Display name from `normalizeName`. */
  displayName: string;
}

function rowToCandidate(
  row: NocoDbRow,
  sourceTable: MigrationFlag['sourceTable'],
  opts: TransformOptions,
  flags: MigrationFlag[],
): RowCandidate {
  const rawName = (row['Full Name'] as string | undefined) ?? '';
  const rawEmail = (row['Email Address'] as string | undefined) ?? '';
  const rawPhone = (row['Phone Number'] as string | undefined) ?? '';
  const rawCountry = (row['Place of Residence'] as string | undefined) ?? '';

  const normName = normalizeName(rawName);
  const email = normalizeEmail(rawEmail);
  const country = resolveCountry(rawCountry);
  const phoneCountry = country.iso ?? opts.defaultPhoneCountry;
  const phoneResult = normalizePhone(rawPhone, phoneCountry as CountryCode);

  // Surface anything weird so the report can show it.
  if (rawPhone && !phoneResult?.e164) {
    flags.push({
      sourceTable,
      sourceId: row.Id,
      reason: phoneResult?.flagged ? `phone ${phoneResult.flagged}` : 'phone unparseable',
      details: { rawPhone },
    });
  }
  if (rawEmail && !email) {
    flags.push({
      sourceTable,
      sourceId: row.Id,
      reason: 'email invalid',
      details: { rawEmail },
    });
  }
  if (rawCountry && !country.iso) {
    flags.push({
      sourceTable,
      sourceId: row.Id,
      reason: 'country unresolved',
      details: { rawCountry },
    });
  }

  const candidate: MatchCandidate = {
    id: String(row.Id),
    fullName: normName.display || null,
    surnameToken: normName.surnameToken ?? null,
    emails: email ? [email] : [],
    phonesE164: phoneResult?.e164 ? [phoneResult.e164] : [],
    countryIso: country.iso ?? null,
  };

  return {
    row,
    candidate,
    phoneResult,
    countryIso: country.iso ?? null,
    countryConfidence: country.confidence,
    email,
    displayName: normName.display,
  };
}

interface Cluster {
  /** The cluster's "lead" row (most complete + most recent). */
  leadCandidate: RowCandidate;
  members: RowCandidate[];
  maxScore: number;
  reasons: string[];
  /** Pairs in this cluster that scored medium (need review). */
  reviewPairs: Array<{ aSourceId: number; bSourceId: number; score: number; reasons: string[] }>;
}

function clusterByDedup(rows: RowCandidate[], opts: TransformOptions): Cluster[] {
  // Use a union-find structure indexed by row id. Every pair with a
  // score >= autoLink threshold gets unioned. Pairs in [needsReview,
  // autoLink) accumulate onto the cluster's reviewPairs list — they're
  // surfaced for human triage but not auto-merged.
  const parent = new Map<string, string>();
  for (const r of rows) parent.set(r.candidate.id, r.candidate.id);
  const find = (id: string): string => {
    let cur = id;
    while (parent.get(cur) !== cur) {
      const next = parent.get(cur)!;
      parent.set(cur, parent.get(next)!); // path compression
      cur = parent.get(cur)!;
    }
    return cur;
  };
  const union = (a: string, b: string) => {
    const rootA = find(a);
    const rootB = find(b);
    if (rootA !== rootB) parent.set(rootA, rootB);
  };

  const clusterReasons = new Map<string, string[]>();
  const clusterMaxScore = new Map<string, number>();
  const clusterReviewPairs = new Map<string, Cluster['reviewPairs']>();

  // Score every candidate against every other candidate. The find-matches
  // function does its own blocking so this is cheap.
  for (let i = 0; i < rows.length; i += 1) {
    const left = rows[i]!;
    const remainingPool = rows.slice(i + 1).map((r) => r.candidate);
    if (remainingPool.length === 0) continue;
    const matches = findClientMatches(left.candidate, remainingPool, {
      highScore: opts.thresholds.autoLink,
      mediumScore: opts.thresholds.needsReview,
    });

    for (const m of matches) {
      if (m.score >= opts.thresholds.autoLink) {
        union(left.candidate.id, m.candidate.id);
        const root = find(left.candidate.id);
        clusterMaxScore.set(root, Math.max(clusterMaxScore.get(root) ?? 0, m.score));
        const existing = clusterReasons.get(root) ?? [];
        for (const reason of m.reasons) {
          if (!existing.includes(reason)) existing.push(reason);
        }
        clusterReasons.set(root, existing);
      } else if (m.score >= opts.thresholds.needsReview) {
        // Medium — track on whichever cluster `left` belongs to.
        const root = find(left.candidate.id);
        const list = clusterReviewPairs.get(root) ?? [];
        list.push({
          aSourceId: parseInt(left.candidate.id, 10),
          bSourceId: parseInt(m.candidate.id, 10),
          score: m.score,
          reasons: m.reasons,
        });
        clusterReviewPairs.set(root, list);
      }
    }
  }

  // Group rows by their cluster root.
  const byRoot = new Map<string, RowCandidate[]>();
  for (const r of rows) {
    const root = find(r.candidate.id);
    const list = byRoot.get(root) ?? [];
    list.push(r);
    byRoot.set(root, list);
  }

  // Build cluster objects, choosing the most-complete row as the lead.
  const clusters: Cluster[] = [];
  for (const [root, members] of byRoot) {
    const lead = pickLead(members);
    clusters.push({
      leadCandidate: lead,
      members,
      maxScore: clusterMaxScore.get(root) ?? 0,
      reasons: clusterReasons.get(root) ?? [],
      reviewPairs: clusterReviewPairs.get(root) ?? [],
    });
  }
  return clusters;
}
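The union-find above is the textbook structure. Isolated into a generic sketch with the same Map-based representation and path compression (not tied to `RowCandidate`), its clustering behavior looks like this:

```typescript
// Minimal union-find over string ids, mirroring the find/union pair in
// clusterByDedup: every auto-linked pair is unioned, and cluster
// membership is read back by comparing roots.
function makeUnionFind(ids: string[]) {
  const parent = new Map<string, string>();
  for (const id of ids) parent.set(id, id);
  const find = (id: string): string => {
    let cur = id;
    while (parent.get(cur) !== cur) {
      const next = parent.get(cur)!;
      parent.set(cur, parent.get(next)!); // path compression
      cur = parent.get(cur)!;
    }
    return cur;
  };
  const union = (a: string, b: string) => {
    const rootA = find(a);
    const rootB = find(b);
    if (rootA !== rootB) parent.set(rootA, rootB);
  };
  return { find, union };
}

const uf = makeUnionFind(['1', '2', '3', '4']);
uf.union('1', '2'); // a pair that scored >= autoLink
uf.union('2', '3'); // transitive merge into the same cluster
// '1', '2', '3' now share a root; '4' stays its own singleton cluster.
```

Transitivity is the point: two rows that never matched each other directly still land in one cluster if a third row matched both.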
function pickLead(rows: RowCandidate[]): RowCandidate {
|
||||
// Pick the row with the most populated fields, breaking ties by
|
||||
// recency (highest Id, since NocoDB IDs are monotonic).
|
||||
return rows.reduce((best, current) => {
|
||||
const bestScore = completenessScore(best);
|
||||
const currentScore = completenessScore(current);
|
||||
if (currentScore > bestScore) return current;
|
||||
if (currentScore === bestScore && current.row.Id > best.row.Id) return current;
|
||||
return best;
|
||||
});
|
||||
}
|
||||
|
||||
function completenessScore(r: RowCandidate): number {
|
||||
let score = 0;
|
||||
if (r.email) score += 1;
|
||||
if (r.phoneResult?.e164) score += 1;
|
||||
if (r.row['Address']) score += 0.5;
|
||||
if (r.row['Yacht Name']) score += 0.5;
|
||||
if (r.row['Source']) score += 0.25;
|
||||
if (r.row['Lead Category']) score += 0.25;
|
||||
if (r.row['Internal Notes']) score += 0.25;
|
||||
return score;
|
||||
}
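A standalone sketch of the lead-picking tie-break (the `MiniRow` shape and rows are hypothetical; the weights mirror the scoring above):

```typescript
// Hypothetical reduced shape: the real RowCandidate carries more fields.
interface MiniRow { id: number; email?: string; phone?: string; address?: string }

function miniScore(r: MiniRow): number {
  let s = 0;
  if (r.email) s += 1;
  if (r.phone) s += 1;
  if (r.address) s += 0.5;
  return s;
}

function miniPickLead(rows: MiniRow[]): MiniRow {
  return rows.reduce((best, cur) => {
    if (miniScore(cur) > miniScore(best)) return cur;
    if (miniScore(cur) === miniScore(best) && cur.id > best.id) return cur;
    return best;
  });
}

const lead = miniPickLead([
  { id: 1, email: 'a@x.com' },                 // score 1
  { id: 2, email: 'a@x.com', phone: '+1555' }, // score 2, wins
  { id: 3, email: 'a@x.com' },                 // score 1, newer but lower score
]);
// lead.id === 2

const tie = miniPickLead([{ id: 4, email: 'b@x.com' }, { id: 7, email: 'b@x.com' }]);
// tie.id === 7: equal scores, the newer Id wins
```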

function buildPlannedClient(
  tempId: string,
  cluster: Cluster,
  opts: TransformOptions,
): PlannedClient {
  const lead = cluster.leadCandidate;

  // Collect distinct emails + phones from across the cluster — duplicate
  // submissions often come with different contact methods we want to
  // preserve as multiple rows in `client_contacts`.
  const seenEmails = new Set<string>();
  const seenPhones = new Set<string>();
  const contacts: PlannedContact[] = [];

  for (const member of cluster.members) {
    if (member.email && !seenEmails.has(member.email)) {
      seenEmails.add(member.email);
      contacts.push({
        channel: 'email',
        value: member.email,
        isPrimary: contacts.length === 0,
      });
    }
    if (member.phoneResult?.e164 && !seenPhones.has(member.phoneResult.e164)) {
      seenPhones.add(member.phoneResult.e164);
      const isFirstPhone = !contacts.some((c) => c.channel === 'phone');
      contacts.push({
        channel: 'phone',
        value: member.phoneResult.e164,
        valueE164: member.phoneResult.e164,
        valueCountry: member.phoneResult.country,
        isPrimary: isFirstPhone && contacts.every((c) => !c.isPrimary || c.channel === 'email'),
        flagged: member.phoneResult.flagged,
      });
    }
  }

  // Preferred contact method, if the row stated one. The first contact
  // added above stays primary unless the row explicitly preferred phone.
  const preferredMethod = (lead.row['Contact Method Preferred'] as string | undefined)
    ?.toLowerCase()
    ?.trim();

  // Address: only build if the lead row has meaningful address text.
  const rawAddress = (lead.row['Address'] as string | undefined)?.trim();
  const addresses: PlannedAddress[] = [];
  if (rawAddress) {
    addresses.push({
      streetAddress: rawAddress,
      city: null,
      countryIso: lead.countryIso ?? opts.defaultPhoneCountry,
      countryConfidence: lead.countryConfidence ?? 'fallback',
    });
  }

  const sourceFromRow = (lead.row['Source'] as string | undefined) ?? null;
  const mappedSource = sourceFromRow ? (SOURCE_MAP[sourceFromRow] ?? 'manual') : null;

  return {
    tempId,
    sourceIds: cluster.members.map((m) => m.row.Id),
    fullName: lead.displayName,
    surnameToken: lead.candidate.surnameToken ?? undefined,
    countryIso: lead.countryIso,
    preferredContactMethod: preferredMethod ?? null,
    source: mappedSource,
    contacts,
    addresses,
  };
}

function buildPlannedInterest(row: NocoDbRow, clientTempId: string): PlannedInterest {
  const stage = (row['Sales Process Level'] as string | undefined) ?? '';
  const cat = (row['Lead Category'] as string | undefined) ?? '';

  const notesParts: string[] = [];
  const internalNotes = row['Internal Notes'] as string | undefined;
  const extraComments = row['Extra Comments'] as string | undefined;
  if (internalNotes?.trim()) notesParts.push(internalNotes.trim());
  if (extraComments?.trim()) notesParts.push(`Extra Comments: ${extraComments.trim()}`);
  const berthSize = row['Berth Size Desired'] as string | undefined;
  if (berthSize?.trim()) notesParts.push(`Berth size desired: ${berthSize.trim()}`);

  return {
    sourceId: row.Id,
    clientTempId,
    pipelineStage: STAGE_MAP[stage] ?? 'open',
    leadCategory: LEAD_CATEGORY_MAP[cat] ?? null,
    source: ((row['Source'] as string | undefined) ?? null) || null,
    notes: notesParts.join('\n\n') || null,
    berthMooringNumber: (row['Berth Number'] as string | undefined) ?? null,
    yachtName: (() => {
      const n = (row['Yacht Name'] as string | undefined)?.trim();
      // Filter placeholder values used by sales reps for "we don't know yet".
      if (!n) return null;
      if (['TBC', 'Na', 'NA', 'na', 'N/A', 'TBD', 'tbd'].includes(n)) return null;
      return n;
    })(),
    dateEoiSent: parseFlexibleDate(row['EOI Time Sent']),
    dateEoiSigned: parseFlexibleDate(row['all_signed_notified_at'] ?? row['developerSignTime']),
    dateDepositReceived: null, // not directly tracked in legacy schema
    dateContractSent: parseFlexibleDate(row['Time LOI Sent']),
    dateContractSigned: parseFlexibleDate(row['developerSignTime']),
    dateLastContact: parseFlexibleDate(row['Created At'] ?? row['Date Added']),
    documensoId: (row['documensoID'] as string | undefined) ?? null,
  };
}

152
src/lib/dedup/nocodb-source.ts
Normal file
@@ -0,0 +1,152 @@
/**
 * Read-only adapter for the legacy NocoDB Port Nimara base.
 *
 * Used by the one-shot migration script (`scripts/migrate-from-nocodb.ts`)
 * to pull every Interest, Residential Interest, and Website Submission
 * row from the source-of-truth NocoDB tables. No mutations.
 *
 * Auth: `xc-token` header per the NocoDB v2 API.
 *
 * The shape returned is a verbatim record of the row's fields — the caller
 * is responsible for mapping to the new schema via `nocodb-transform.ts`.
 */

import { z } from 'zod';

// ─── Configuration ──────────────────────────────────────────────────────────

const ConfigSchema = z.object({
  url: z.string().url(),
  token: z.string().min(1),
});

export interface NocoDbConfig {
  url: string;
  token: string;
}

export function loadNocoDbConfig(env: NodeJS.ProcessEnv = process.env): NocoDbConfig {
  return ConfigSchema.parse({
    url: env.NOCODB_URL,
    token: env.NOCODB_TOKEN,
  });
}

// ─── Table identifiers ──────────────────────────────────────────────────────
//
// These IDs are stable per the NocoDB base — they were captured during the
// 2026-05-03 audit and won't change unless the base is rebuilt. If the
// base is reset, regenerate them from `getTablesList`.
export const NOCO_TABLES = {
  interests: 'mbs9hjauug4eseo',
  residentialInterests: 'mscfpwwwjuds4nt',
  websiteInterestSubmissions: 'mevkpcih67c6jsm',
  websiteContactFormSubmissions: 'mxk5cd0pmwnwlcl',
  websiteBerthEoiSupplements: 'mglmioo0ku8zgqj',
  berths: 'mczgos9hr3oa9qc',
} as const;

// ─── HTTP shape ─────────────────────────────────────────────────────────────

interface NocoDbListResponse<T> {
  list: T[];
  pageInfo: {
    totalRows: number;
    page: number;
    pageSize: number;
    isFirstPage: boolean;
    isLastPage: boolean;
  };
}

/** A row's `Id` is always present. The rest of the fields vary per table. */
export type NocoDbRow = Record<string, unknown> & { Id: number };

// ─── Public API ─────────────────────────────────────────────────────────────

/**
 * Fetch all rows from a NocoDB table. Auto-paginates until the API
 * reports `isLastPage`. The legacy base is small (252 Interests rows
 * being the largest table) so we keep this simple — no streaming.
 */
export async function fetchAllRows(
  tableId: string,
  config: NocoDbConfig,
  pageSize = 250,
): Promise<NocoDbRow[]> {
  const all: NocoDbRow[] = [];
  let page = 1;
  // Hard cap to prevent infinite-loop bugs if pageInfo lies. Each page
  // pulls up to `pageSize` rows, so 200 pages * 250 = 50k rows is the
  // maximum we'll ever fetch from one table.
  const MAX_PAGES = 200;

  while (page <= MAX_PAGES) {
    const url = new URL(`${config.url}/api/v2/tables/${tableId}/records`);
    url.searchParams.set('limit', String(pageSize));
    url.searchParams.set('offset', String((page - 1) * pageSize));

    const res = await fetch(url, {
      headers: {
        'xc-token': config.token,
        accept: 'application/json',
      },
    });

    if (!res.ok) {
      throw new Error(
        `NocoDB fetch failed: ${res.status} ${res.statusText} — table ${tableId} page ${page}`,
      );
    }

    const json = (await res.json()) as NocoDbListResponse<NocoDbRow>;
    all.push(...json.list);

    if (json.pageInfo.isLastPage || json.list.length === 0) break;
    page += 1;
  }

  return all;
}
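The pagination loop can be exercised without a live NocoDB by stubbing the page source. A synchronous sketch under that assumption (`fetchAllPaged` and its fixture are invented for illustration; the real fetcher is async):

```typescript
// Hypothetical stand-in for the HTTP layer: one page per call, synchronous
// here so the loop shape is easy to follow.
interface Page<T> { list: T[]; isLastPage: boolean }

function fetchAllPaged<T>(
  getPage: (offset: number, limit: number) => Page<T>,
  pageSize = 2,
  maxPages = 200, // mirrors the MAX_PAGES hard cap above
): T[] {
  const all: T[] = [];
  for (let page = 1; page <= maxPages; page += 1) {
    const { list, isLastPage } = getPage((page - 1) * pageSize, pageSize);
    all.push(...list);
    if (isLastPage || list.length === 0) break;
  }
  return all;
}

// Fixture: 5 rows served 2 at a time. Three calls, then isLastPage stops the loop.
const rows = [1, 2, 3, 4, 5];
const fetched = fetchAllPaged((offset, limit) => ({
  list: rows.slice(offset, offset + limit),
  isLastPage: offset + limit >= rows.length,
}));
// fetched → [1, 2, 3, 4, 5]
```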

/**
 * Convenience snapshot — pulls every table the migration cares about
 * in parallel. The returned shape is the input the transform layer expects.
 */
export interface NocoDbSnapshot {
  interests: NocoDbRow[];
  residentialInterests: NocoDbRow[];
  websiteInterestSubmissions: NocoDbRow[];
  websiteContactFormSubmissions: NocoDbRow[];
  websiteBerthEoiSupplements: NocoDbRow[];
  berths: NocoDbRow[];
  fetchedAt: string;
}

export async function fetchSnapshot(config: NocoDbConfig): Promise<NocoDbSnapshot> {
  const [
    interests,
    residentialInterests,
    websiteInterestSubmissions,
    websiteContactFormSubmissions,
    websiteBerthEoiSupplements,
    berths,
  ] = await Promise.all([
    fetchAllRows(NOCO_TABLES.interests, config),
    fetchAllRows(NOCO_TABLES.residentialInterests, config),
    fetchAllRows(NOCO_TABLES.websiteInterestSubmissions, config),
    fetchAllRows(NOCO_TABLES.websiteContactFormSubmissions, config),
    fetchAllRows(NOCO_TABLES.websiteBerthEoiSupplements, config),
    fetchAllRows(NOCO_TABLES.berths, config),
  ]);

  return {
    interests,
    residentialInterests,
    websiteInterestSubmissions,
    websiteContactFormSubmissions,
    websiteBerthEoiSupplements,
    berths,
    fetchedAt: new Date().toISOString(),
  };
}

418
src/lib/dedup/normalize.ts
Normal file
@@ -0,0 +1,418 @@
/**
 * Normalization helpers for the dedup pipeline.
 *
 * Pure functions (no DB, no React). Used by both the runtime at-create
 * surfaces and the one-shot NocoDB migration script. Every transform
 * here has a fixture in `tests/unit/dedup/normalize.test.ts` drawn from
 * real dirty values observed in the legacy NocoDB Interests table.
 *
 * Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §3.
 */

import { z } from 'zod';

import { ALL_COUNTRY_CODES, getCountryName, type CountryCode } from '@/lib/i18n/countries';
import { parsePhoneScriptSafe as parsePhone } from './phone-parse';

// ─── Names ──────────────────────────────────────────────────────────────────

/**
 * Tokens that should stay lowercase mid-name. Covers the common Romance,
 * Germanic, and Iberian particles seen in client records. The first token
 * of a name is always title-cased even if it appears in this set.
 */
const PARTICLES: ReadonlySet<string> = new Set([
  'van',
  'von',
  'de',
  'del',
  'da',
  'das',
  'do',
  'dos',
  'di',
  'le',
  'la',
  'el',
  'al',
  'der',
  'den',
  'des',
  'du',
  'dalla',
  'della',
  'st',
  'st.',
  'y',
]);

export interface NormalizedName {
  /** Human-readable form preserved for UI display. Trims, collapses
   * whitespace, fixes case, but never destroys the user's intent —
   * slash-with-company structure ("Daniel Wainstein / 7 Knots, LLC")
   * is left intact. */
  display: string;
  /** Lowercased form for matching. */
  normalized: string;
  /** Last non-particle token, lowercased. Used as a blocking key by the
   * dedup algorithm so we only compare candidates with similar surnames. */
  surnameToken?: string;
}

/**
 * Normalize a free-text full name. Trims and collapses whitespace,
 * replaces \r/\n/\t with single spaces, intelligently title-cases
 * ALL-CAPS surnames while keeping particles (van / de / dalla / etc.)
 * lowercase mid-name, and preserves Irish O' surnames as O'Brien.
 *
 * If the input contains a `/` (slash-with-company structure like
 * "Daniel Wainstein / 7 Knots, LLC"), the trailing company text is
 * preserved verbatim — it's signal, not noise.
 */
export function normalizeName(raw: string | null | undefined): NormalizedName {
  const safe = (raw ?? '').toString();
  // Replace \r, \n, \t with single spaces, then collapse runs of whitespace.
  const cleaned = safe
    .replace(/[\r\n\t]/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();

  if (!cleaned) {
    return { display: '', normalized: '', surnameToken: undefined };
  }

  // Slash-with-company: title-case the part before the slash, leave the
  // company segment untouched (it's typically already a brand we shouldn't
  // mangle: "SAS TIKI", "7 Knots, LLC").
  const slashIdx = cleaned.indexOf('/');
  let displayCore: string;
  if (slashIdx !== -1) {
    const personPart = cleaned.slice(0, slashIdx).trim();
    const companyPart = cleaned.slice(slashIdx + 1).trim();
    displayCore = `${titleCaseTokens(personPart)} / ${companyPart}`;
  } else {
    displayCore = titleCaseTokens(cleaned);
  }

  const display = displayCore;
  const normalized = display.toLowerCase();
  const surnameToken = computeSurnameToken(slashIdx !== -1 ? cleaned.slice(0, slashIdx) : cleaned);

  return { display, normalized, surnameToken };
}

function titleCaseTokens(s: string): string {
  const tokens = s.split(' ').filter(Boolean);
  if (tokens.length === 0) return '';
  return tokens.map((tok, idx) => titleCaseOneToken(tok, idx === 0)).join(' ');
}

function titleCaseOneToken(token: string, isFirst: boolean): string {
  if (!token) return '';
  const lower = token.toLowerCase();
  if (!isFirst && PARTICLES.has(lower)) return lower;
  // O'Brien / D'Angelo / l'Estrange — capitalize the segment after each
  // apostrophe so a lowercased input round-trips to readable Irish caps.
  if (lower.includes("'")) {
    return lower
      .split("'")
      .map((part) => (part.length > 0 ? part[0]!.toUpperCase() + part.slice(1) : part))
      .join("'");
  }
  return lower[0]!.toUpperCase() + lower.slice(1);
}

function computeSurnameToken(personPart: string): string | undefined {
  const cleaned = personPart
    .replace(/[\r\n\t]/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
  if (!cleaned) return undefined;
  const tokens = cleaned.split(' ').map((t) => t.toLowerCase());
  // Walk from the right past particles to find the last "real" surname token.
  for (let i = tokens.length - 1; i >= 0; i -= 1) {
    const tok = tokens[i]!;
    if (!PARTICLES.has(tok)) return tok;
  }
  // All tokens are particles? Fall back to the last token verbatim.
  return tokens[tokens.length - 1];
}

// ─── Emails ─────────────────────────────────────────────────────────────────

const emailSchema = z.string().email();

/**
 * Normalize a free-text email. Trims + lowercases. Returns null for empty
 * or malformed input — the caller decides whether to flag, store, or drop.
 *
 * Plus-aliases (`user+tag@domain.com`) are NOT stripped: they're real
 * distinct addresses, and stripping them would auto-merge legitimately
 * separate accounts.
 */
export function normalizeEmail(raw: string | null | undefined): string | null {
  if (raw == null) return null;
  const trimmed = raw.toString().trim().toLowerCase();
  if (!trimmed) return null;
  const result = emailSchema.safeParse(trimmed);
  return result.success ? trimmed : null;
}

// ─── Phones ─────────────────────────────────────────────────────────────────

export type PhoneFlag = 'multi_number' | 'placeholder' | 'unparseable';

export interface NormalizedPhone {
  /** Canonical E.164 form, e.g. '+15742740548'. Null when unparseable
   * or flagged as a placeholder. */
  e164: string | null;
  /** ISO-3166-1 alpha-2 of the country the number was parsed against. */
  country: CountryCode | null;
  /** Display-friendly international format. Useful for migration reports. */
  display: string | null;
  /** Set when the input had a quirk worth surfacing in the migration
   * report or runtime audit log. Absent on clean parses. */
  flagged?: PhoneFlag;
}

/**
 * Normalize a raw user-entered phone string for comparison + storage.
 *
 * Pipeline:
 *   1. strip leading apostrophe (spreadsheet copy-paste artifact)
 *   2. strip \r / \n / \t (real values seen in NocoDB had carriage returns)
 *   3. detect multi-number fields ("+33611111111;+33622222222",
 *      "0677580750/0690511494") — flag and take the first segment
 *   4. strip whitespace, dots, dashes, parens, single quotes
 *   5. convert leading "00" → "+" (international dialling prefix)
 *   6. detect placeholder fakes (8+ consecutive zeros) — flag, return null e164
 *   7. parse via libphonenumber-js
 *   8. on parse failure or invalid number → flag 'unparseable'
 *
 * Returns null for empty inputs (cheaper to short-circuit than to wrap).
 */
export function normalizePhone(
  raw: string | null | undefined,
  defaultCountry?: CountryCode,
): NormalizedPhone | null {
  if (raw == null) return null;
  let cleaned = raw.toString().trim();
  if (!cleaned) return null;

  // 1. Spreadsheet apostrophe prefix.
  if (cleaned.startsWith("'")) cleaned = cleaned.slice(1);

  // 2. Strip carriage returns / newlines / tabs.
  cleaned = cleaned.replace(/[\r\n\t]/g, '');

  // 3. Multi-number detection — split on /, ;, , (in that order of priority).
  let flagged: PhoneFlag | undefined;
  if (/[/;,]/.test(cleaned)) {
    flagged = 'multi_number';
    cleaned = cleaned.split(/[/;,]/)[0]!.trim();
  }

  // 4. Strip whitespace, dots, dashes, parens. Keep + for the E.164 prefix.
  cleaned = cleaned.replace(/[\s.\-()]/g, '');
  if (!cleaned) return { e164: null, country: null, display: null, flagged: 'unparseable' };

  // 5. 00 international prefix → +.
  if (cleaned.startsWith('00')) {
    cleaned = '+' + cleaned.slice(2);
  }

  // 6. Placeholder fakes — runs of 8+ consecutive zeros, e.g. +447000000000.
  if (/0{8,}/.test(cleaned)) {
    return { e164: null, country: null, display: null, flagged: 'placeholder' };
  }

  // 7. Parse via the existing i18n helper (libphonenumber-js under the hood).
  const parsed = parsePhone(cleaned, defaultCountry);
  if (!parsed.e164) {
    // Couldn't even produce a canonical form — genuinely garbage.
    return { e164: null, country: null, display: null, flagged: 'unparseable' };
  }

  // Note: we deliberately don't gate on `parsed.isValid`. The
  // libphonenumber-js `min` build returns isValid=false for many real
  // numbers (NANP territories share +1; some country metadata is
  // truncated). For dedup we only need a canonical E.164 string to
  // compare; strict validity is the form layer's problem, not ours.
  // If a string-only test (e.g. "abc-not-a-phone") gets here, parse
  // returns a null e164 anyway and the branch above handles it.
  return {
    e164: parsed.e164,
    country: parsed.country,
    display: parsed.international,
    flagged,
  };
}
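Steps 1–6 of the pipeline are plain string transforms and can be sketched independently of libphonenumber-js (`preclean` is an illustrative reduction, not the exported API):

```typescript
type Flag = 'multi_number' | 'placeholder' | undefined;

// Illustrative reduction of steps 1-6; the real function then hands
// the cleaned string to the phone parser.
function preclean(raw: string): { cleaned: string | null; flagged: Flag } {
  let s = raw.trim();
  if (s.startsWith("'")) s = s.slice(1);        // 1. spreadsheet apostrophe
  s = s.replace(/[\r\n\t]/g, '');               // 2. control characters
  let flagged: Flag;
  if (/[/;,]/.test(s)) {                        // 3. multi-number fields
    flagged = 'multi_number';
    s = s.split(/[/;,]/)[0]!.trim();
  }
  s = s.replace(/[\s.\-()]/g, '');              // 4. punctuation
  if (!s) return { cleaned: null, flagged };
  if (s.startsWith('00')) s = '+' + s.slice(2); // 5. 00 → +
  if (/0{8,}/.test(s)) return { cleaned: null, flagged: 'placeholder' }; // 6.
  return { cleaned: s, flagged };
}

// preclean('0033 6.11.11.11.11') → { cleaned: '+33611111111', flagged: undefined }
// preclean('+33611111111;+33622222222') → first number, flagged 'multi_number'
// preclean('+447000000000') → { cleaned: null, flagged: 'placeholder' }
```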

// ─── Countries ──────────────────────────────────────────────────────────────

/**
 * Aliases for canonical country names that don't match
 * `Intl.DisplayNames(en)` output verbatim. Keys are pre-normalized
 * (lowercase, diacritic-free, hyphens/dots → spaces, collapsed whitespace).
 *
 * Kept opinionated and small — only entries we've actually seen in legacy
 * data. Adding a new alias is cheap; trying to be exhaustive isn't.
 */
const COUNTRY_ALIASES: Record<string, CountryCode> = {
  // Generic abbreviations
  usa: 'US',
  us: 'US',
  uk: 'GB',
  // Saint-Barthélemy variants seen in production
  'saint barthelemy': 'BL',
  'saint barth': 'BL',
  'st barth': 'BL',
  'st barths': 'BL',
  'st barthelemy': 'BL',
  // Caribbean short-forms whose canonical Intl names are awkward
  // ("Antigua and Barbuda", "Saint Vincent and the Grenadines", etc.).
  antigua: 'AG',
  barbuda: 'AG',
  'st kitts': 'KN',
  'saint kitts': 'KN',
  nevis: 'KN',
};

/**
 * High-frequency cities → country, used as a last-resort fallback when
 * exact / alias / fuzzy country matching all miss. Keys are normalized.
 *
 * Order matters: an entry's key is also matched as a substring of the
 * input ("Sag Harbor Y" contains "sag harbor"), so the most specific
 * city appears first to avoid a wrong partial hit.
 */
const CITY_TO_COUNTRY: Record<string, CountryCode> = {
  'kansas city': 'US',
  'sag harbor': 'US',
  'new york': 'US',
  // Cities that came out unresolved from the 2026-05-03 NocoDB dry-run.
  // Using lowercase (post-normalize) keys.
  boston: 'US',
  tampa: 'US',
  'fort lauderdale': 'US',
  'port jefferson': 'US',
  nantucket: 'US',
  // US state abbreviations that often appear standalone or as a suffix:
  ' fl': 'US',
  ' ma': 'US',
  ' ny': 'US',
  ' tx': 'US',
  ' ca': 'US',
  // International
  london: 'GB',
  paris: 'FR',
};

export type CountryConfidence = 'exact' | 'fuzzy' | 'city';

export interface ResolvedCountry {
  iso: CountryCode | null;
  confidence: CountryConfidence | null;
}

/**
 * Map free-text country / region input to an ISO-3166-1 alpha-2 code.
 *
 * Lookup order: alias → exact (vs. all locale country names) → city →
 * fuzzy (Levenshtein ≤ 2). Anything beyond fuzzy returns null and the
 * migration script flags the row for human review.
 */
export function resolveCountry(text: string | null | undefined): ResolvedCountry {
  if (text == null) return { iso: null, confidence: null };
  const normalized = normalizeForLookup(text.toString());
  if (!normalized) return { iso: null, confidence: null };

  // 1. Aliases — covers USA / UK / St Barth and friends.
  const alias = COUNTRY_ALIASES[normalized];
  if (alias) return { iso: alias, confidence: 'exact' };

  // 2. Exact match against Intl-derived country names. We compare against
  //    diacritic-stripped + lowercased canonical names so 'United States'
  //    and 'united states' both resolve.
  for (const code of ALL_COUNTRY_CODES) {
    const cleanName = normalizeForLookup(getCountryName(code, 'en'));
    if (cleanName === normalized) return { iso: code, confidence: 'exact' };
  }

  // 3. City → country fallback, exact or substring.
  const cityExact = CITY_TO_COUNTRY[normalized];
  if (cityExact) return { iso: cityExact, confidence: 'city' };
  for (const [city, iso] of Object.entries(CITY_TO_COUNTRY)) {
    if (normalized.includes(city)) return { iso, confidence: 'city' };
  }

  // 4. Fuzzy fallback (Levenshtein ≤ 2). Skipped for short inputs because
  //    a 4-char string like "Mars" sits within distance 2 of multiple
  //    short country names (Mali, Laos, Iran, …) — a false-positive risk.
  if (normalized.length >= 6) {
    let bestCode: CountryCode | null = null;
    let bestDistance = Number.POSITIVE_INFINITY;
    for (const code of ALL_COUNTRY_CODES) {
      const cleanName = normalizeForLookup(getCountryName(code, 'en'));
      const d = levenshtein(cleanName, normalized);
      if (d < bestDistance) {
        bestDistance = d;
        bestCode = code;
        if (d === 0) break;
      }
    }
    if (bestDistance <= 2 && bestCode) {
      return { iso: bestCode, confidence: 'fuzzy' };
    }
  }

  return { iso: null, confidence: null };
}
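The alias-then-exact cascade can be illustrated standalone with toy tables (`toyResolve`, `toyNormalize`, and both tables are invented for illustration):

```typescript
// Toy versions of the alias / canonical tables, illustrative only.
const ALIASES: Record<string, string> = { usa: 'US', uk: 'GB', 'saint barthelemy': 'BL' };
const CANONICAL: Record<string, string> = { 'united states': 'US', france: 'FR' };

function toyNormalize(s: string): string {
  return s
    .normalize('NFD')
    .replace(/[\u0300-\u036f]/g, '') // strip combining diacritics
    .toLowerCase()
    .replace(/[-.]/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}

function toyResolve(text: string): string | null {
  const n = toyNormalize(text);
  if (!n) return null;
  if (ALIASES[n]) return ALIASES[n];
  for (const [name, iso] of Object.entries(CANONICAL)) {
    if (name === n) return iso;
  }
  return null;
}

// toyResolve('USA') → 'US'                  (alias hit)
// toyResolve('Saint-Barthélemy') → 'BL'     (diacritics stripped, hyphen → space)
// toyResolve('United   States') → 'US'      (whitespace collapsed, exact hit)
```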

/** Lowercase + strip diacritics + replace hyphens/dots with spaces +
 * collapse whitespace. Used by both the input and the canonical-name
 * side of the country comparison so they meet on the same shape. */
function normalizeForLookup(s: string): string {
  return s
    .normalize('NFD')
    .replace(/[\u0300-\u036f]/g, '')
    .toLowerCase()
    .replace(/[-.]/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}

// ─── Levenshtein ────────────────────────────────────────────────────────────

/**
 * Standard iterative Levenshtein. Used by the country fuzzy match and by
 * the dedup algorithm's name-similarity rule. Runs in O(n*m) time (with
 * O(n) memory via two rolling rows), so callers shouldn't run it against
 * pathological inputs — the dedup blocking strategy keeps comparison
 * sets small.
 *
 * Exported so the find-matches module can reuse the same implementation
 * without relying on an external dep.
 */
export function levenshtein(a: string, b: string): number {
  if (a === b) return 0;
  if (!a) return b.length;
  if (!b) return a.length;

  const m = a.length;
  const n = b.length;
  // Two rolling rows are enough — keeps memory at O(n) instead of O(n*m).
  let prev = new Array<number>(n + 1);
  let curr = new Array<number>(n + 1);
  for (let j = 0; j <= n; j += 1) prev[j] = j;

  for (let i = 1; i <= m; i += 1) {
    curr[0] = i;
    for (let j = 1; j <= n; j += 1) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      curr[j] = Math.min(curr[j - 1]! + 1, prev[j]! + 1, prev[j - 1]! + cost);
    }
    [prev, curr] = [curr, prev];
  }

  return prev[n]!;
}
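The same two-row scheme, restated standalone so it can be sanity-checked in isolation:

```typescript
// Standalone restatement of the two-row Levenshtein above.
function lev(a: string, b: string): number {
  if (a === b) return 0;
  if (!a) return b.length;
  if (!b) return a.length;
  let prev = Array.from({ length: b.length + 1 }, (_, j) => j);
  let curr = new Array<number>(b.length + 1);
  for (let i = 1; i <= a.length; i += 1) {
    curr[0] = i;
    for (let j = 1; j <= b.length; j += 1) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      curr[j] = Math.min(curr[j - 1]! + 1, prev[j]! + 1, prev[j - 1]! + cost);
    }
    [prev, curr] = [curr, prev];
  }
  return prev[b.length]!;
}

// lev('kitten', 'sitting') → 3    (the classic example)
// lev('portugal', 'portugol') → 1 (inside the ≤ 2 fuzzy window above)
```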

66
src/lib/dedup/phone-parse.ts
Normal file
@@ -0,0 +1,66 @@
/**
 * Script-safe phone parser.
 *
 * The project's existing `src/lib/i18n/phone.ts` imports from
 * `libphonenumber-js`, which under Node 25 + the tsx loader hits a
 * metadata-shape interop bug (`{ default }` wrapping the JSON). It
 * works fine in Next.js + vitest, but a `tsx scripts/...` invocation
 * blows up.
 *
 * This wrapper bypasses the bundled `index.cjs.js` and calls
 * `libphonenumber-js/core` directly with metadata loaded as raw JSON.
 * Same surface as the i18n helper; usable from both runtimes.
 *
 * Used by the dedup library's `normalizePhone`. The runtime UI still
 * imports `i18n/phone` directly — no reason to touch a working path.
 */

// eslint-disable-next-line @typescript-eslint/no-require-imports
const core: typeof import('libphonenumber-js/core') = require('libphonenumber-js/core');
// Load the JSON directly. The bundled `index.cjs.js` does the same
// thing but its `require('../metadata.min.json')` hits a Node 25 ESM
// interop bug that wraps the JSON in `{ default }`. Importing the
// JSON file by absolute path through the package root sidesteps it.
// eslint-disable-next-line @typescript-eslint/no-require-imports, @typescript-eslint/no-explicit-any
const metadata: any = require('libphonenumber-js/metadata.min.json');

import type { CountryCode } from '@/lib/i18n/countries';

export interface ParsedPhone {
  e164: string | null;
  country: CountryCode | null;
  national: string | null;
  international: string | null;
  isValid: boolean;
}

const EMPTY: ParsedPhone = {
  e164: null,
  country: null,
  national: null,
  international: null,
  isValid: false,
};

export function parsePhoneScriptSafe(raw: string, defaultCountry?: CountryCode): ParsedPhone {
  const trimmed = raw.trim();
  if (!trimmed) return EMPTY;
  try {
    // The core entry expects its own `CountryCode` type from
    // libphonenumber-js. Our `CountryCode` type is the same set of ISO
    // alpha-2 codes (we re-derive from the same Intl source) so this
    // cast is structurally equivalent, not lossy.
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    const parsed = core.parsePhoneNumberFromString(trimmed, defaultCountry as any, metadata);
    if (!parsed) return EMPTY;
    return {
      e164: parsed.number,
      country: (parsed.country ?? null) as CountryCode | null,
      national: parsed.formatNational(),
      international: parsed.formatInternational(),
      isValid: parsed.isValid(),
    };
  } catch {
    return EMPTY;
  }
}

@@ -58,10 +58,11 @@ export function buildPipelineInputs(
  'open',
  'details_sent',
  'in_communication',
  'visited',
  'signed_eoi_nda',
  'eoi_sent',
  'eoi_signed',
  'deposit_10pct',
  'contract',
  'contract_sent',
  'contract_signed',
  'completed',
];

@@ -73,9 +74,7 @@ export function buildPipelineInputs(
  });

  // Include stages not in the standard order
  const unknownStages = Object.keys(data.stageCounts).filter((s) => !stageOrder.includes(s));
  for (const stage of unknownStages) {
    summaryLines.push(`${stage}: ${data.stageCounts[stage]} interest(s)`);
  }

@@ -50,18 +50,16 @@ export const revenueReportTemplate: Template = {
  ],
};

export function buildRevenueInputs(data: RevenueData, portName?: string): Record<string, string>[] {
  const stageOrder = [
    'open',
    'details_sent',
    'in_communication',
    'visited',
    'signed_eoi_nda',
    'eoi_sent',
    'eoi_signed',
    'deposit_10pct',
    'contract',
    'contract_sent',
    'contract_signed',
    'completed',
  ];

@@ -84,6 +84,28 @@ export const webhooksWorker = new Worker(
      return;
    }

    // Safety net: when EMAIL_REDIRECT_TO is set (dev / staging / migration
    // dry-run), short-circuit webhook delivery so we don't accidentally
    // ping a user-configured production endpoint with synthetic events.
    // Records the delivery as `dead_letter` with a clear reason so the
    // attempt is still visible in the deliveries listing.
    if (process.env.EMAIL_REDIRECT_TO) {
      logger.info(
        { webhookId, deliveryId, url: webhook.url },
        'Webhook delivery skipped (EMAIL_REDIRECT_TO is set — outbound comms are paused)',
      );
      await db
        .update(webhookDeliveries)
        .set({
          status: 'dead_letter',
          responseStatus: null,
          responseBody: 'Skipped: EMAIL_REDIRECT_TO is set, outbound comms paused.',
          deliveredAt: new Date(),
        })
        .where(eq(webhookDeliveries.id, deliveryId));
      return;
    }
|
||||
|
||||
// 2. Decrypt secret
|
||||
let secret: string;
|
||||
try {
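The dead-letter short-circuit in this hunk can be reduced to a small pure decision function, which makes the guard easy to unit-test away from BullMQ and the database. A minimal sketch; the helper name and types are ours, not from the repo:

```typescript
// Hypothetical helper mirroring the worker's guard: decide what to do with a
// delivery given the environment. Illustrative only — the real worker inlines this.
type DeliveryDecision =
  | { kind: 'deliver' }
  | { kind: 'dead_letter'; responseBody: string };

function decideDelivery(env: { EMAIL_REDIRECT_TO?: string }): DeliveryDecision {
  if (env.EMAIL_REDIRECT_TO) {
    return {
      kind: 'dead_letter',
      responseBody: 'Skipped: EMAIL_REDIRECT_TO is set, outbound comms paused.',
    };
  }
  return { kind: 'deliver' };
}
```

Recording the skip as `dead_letter` rather than silently returning keeps the attempt visible in the deliveries listing, which is the design intent stated in the comment above.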
src/lib/services/client-merge.service.ts (new file, 393 lines)
@@ -0,0 +1,393 @@
/**
 * Client merge service — atomically combines two client records.
 *
 * Used by:
 * - /admin/duplicates review queue (when an admin confirms a merge)
 * - the at-create suggestion path ("use existing client") — though
 *   that path uses the lighter `attachInterestToClient` and never
 *   actually merges two pre-existing clients
 * - the migration script's `--apply` (eventually)
 *
 * Reversibility: every merge writes a `client_merge_log` row containing
 * the loser's full pre-merge state. Within the configured undo window
 * (default 7 days, see `dedup_undo_window_days` in system_settings) a
 * follow-up `unmergeClients` call can restore the loser and detach
 * everything that was reattached.
 *
 * Design reference: docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §6.
 */

import { and, eq, sql } from 'drizzle-orm';

import { db } from '@/lib/db';
import {
  clients,
  clientContacts,
  clientAddresses,
  clientNotes,
  clientTags,
  clientRelationships,
  clientMergeLog,
  clientMergeCandidates,
} from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { berthReservations } from '@/lib/db/schema/reservations';
import { auditLogs } from '@/lib/db/schema/system';

// ─── Public API ─────────────────────────────────────────────────────────────

export interface MergeFieldChoices {
  /** Per-field overrides — `winner` keeps the surviving client's value;
   * `loser` copies the loser's value over. Fields not listed default
   * to `winner` (no change). */
  fullName?: 'winner' | 'loser';
  nationalityIso?: 'winner' | 'loser';
  preferredContactMethod?: 'winner' | 'loser';
  preferredLanguage?: 'winner' | 'loser';
  timezone?: 'winner' | 'loser';
  source?: 'winner' | 'loser';
  sourceDetails?: 'winner' | 'loser';
}

export interface MergeOptions {
  winnerId: string;
  loserId: string;
  /** ID of the user performing the merge (for audit + clientMergeLog.mergedBy). */
  mergedBy: string;
  /** Per-field choice overrides. Multi-value fields (contacts, addresses,
   * notes, tags) are always preserved from both sides; this only
   * affects single-value scalar fields on the `clients` row. */
  fieldChoices?: MergeFieldChoices;
}

export interface MergeResult {
  mergeLogId: string;
  movedRows: {
    interests: number;
    contacts: number;
    addresses: number;
    notes: number;
    tags: number;
    relationships: number;
    reservations: number;
  };
}

/**
 * Atomically merge `loserId` into `winnerId`. Throws if:
 * - either id doesn't exist or belongs to a different port
 * - the loser has already been merged (mergedIntoClientId set)
 * - the winner is itself archived
 */
export async function mergeClients(opts: MergeOptions): Promise<MergeResult> {
  if (opts.winnerId === opts.loserId) {
    throw new Error('Cannot merge a client into itself');
  }

  return await db.transaction(async (tx) => {
    // ── Lock both rows for the duration. The first FOR UPDATE that
    // arrives wins; a concurrent second merge of the same loser
    // will see `mergedIntoClientId` set and bail. ──────────────────────
    const [winnerRow] = await tx
      .select()
      .from(clients)
      .where(eq(clients.id, opts.winnerId))
      .for('update');
    const [loserRow] = await tx
      .select()
      .from(clients)
      .where(eq(clients.id, opts.loserId))
      .for('update');

    if (!winnerRow) throw new Error(`Winner client ${opts.winnerId} not found`);
    if (!loserRow) throw new Error(`Loser client ${opts.loserId} not found`);
    if (winnerRow.portId !== loserRow.portId) {
      throw new Error('Cannot merge clients across different ports');
    }
    if (loserRow.mergedIntoClientId) {
      throw new Error(`Loser ${opts.loserId} already merged into ${loserRow.mergedIntoClientId}`);
    }
    if (winnerRow.archivedAt) {
      throw new Error('Cannot merge into an archived client');
    }

    // ── Snapshot the loser's full state before any mutation. Used by
    // `unmergeClients` to restore within the undo window. ──────────────
    const loserContacts = await tx
      .select()
      .from(clientContacts)
      .where(eq(clientContacts.clientId, opts.loserId));
    const loserAddresses = await tx
      .select()
      .from(clientAddresses)
      .where(eq(clientAddresses.clientId, opts.loserId));
    const loserNotes = await tx
      .select()
      .from(clientNotes)
      .where(eq(clientNotes.clientId, opts.loserId));
    const loserTags = await tx
      .select()
      .from(clientTags)
      .where(eq(clientTags.clientId, opts.loserId));
    const loserInterests = await tx
      .select({ id: interests.id })
      .from(interests)
      .where(eq(interests.clientId, opts.loserId));
    const loserReservations = await tx
      .select({ id: berthReservations.id })
      .from(berthReservations)
      .where(eq(berthReservations.clientId, opts.loserId));
    const loserRelationshipsAsA = await tx
      .select()
      .from(clientRelationships)
      .where(eq(clientRelationships.clientAId, opts.loserId));
    const loserRelationshipsAsB = await tx
      .select()
      .from(clientRelationships)
      .where(eq(clientRelationships.clientBId, opts.loserId));

    const snapshot = {
      loser: loserRow,
      contacts: loserContacts,
      addresses: loserAddresses,
      notes: loserNotes,
      tags: loserTags,
      interests: loserInterests.map((r) => r.id),
      reservations: loserReservations.map((r) => r.id),
      relationshipsAsA: loserRelationshipsAsA,
      relationshipsAsB: loserRelationshipsAsB,
      fieldChoices: opts.fieldChoices ?? {},
      mergedAt: new Date().toISOString(),
    };

    // ── Apply field choices on the winner. We only touch fields the
    // caller explicitly asked to copy from the loser; everything
    // else stays as-is. ────────────────────────────────────────────────
    const fieldUpdates: Partial<typeof winnerRow> = {};
    if (opts.fieldChoices?.fullName === 'loser') fieldUpdates.fullName = loserRow.fullName;
    if (opts.fieldChoices?.nationalityIso === 'loser')
      fieldUpdates.nationalityIso = loserRow.nationalityIso;
    if (opts.fieldChoices?.preferredContactMethod === 'loser')
      fieldUpdates.preferredContactMethod = loserRow.preferredContactMethod;
    if (opts.fieldChoices?.preferredLanguage === 'loser')
      fieldUpdates.preferredLanguage = loserRow.preferredLanguage;
    if (opts.fieldChoices?.timezone === 'loser') fieldUpdates.timezone = loserRow.timezone;
    if (opts.fieldChoices?.source === 'loser') fieldUpdates.source = loserRow.source;
    if (opts.fieldChoices?.sourceDetails === 'loser')
      fieldUpdates.sourceDetails = loserRow.sourceDetails;

    if (Object.keys(fieldUpdates).length > 0) {
      await tx
        .update(clients)
        .set({ ...fieldUpdates, updatedAt: new Date() })
        .where(eq(clients.id, opts.winnerId));
    }

    // ── Reattach. Each table that points at the loser via clientId
    // gets pointed at the winner instead. ─────────────────────────────

    const movedInterests = (
      await tx
        .update(interests)
        .set({ clientId: opts.winnerId, updatedAt: new Date() })
        .where(eq(interests.clientId, opts.loserId))
        .returning({ id: interests.id })
    ).length;

    const movedReservations = (
      await tx
        .update(berthReservations)
        .set({ clientId: opts.winnerId, updatedAt: new Date() })
        .where(eq(berthReservations.clientId, opts.loserId))
        .returning({ id: berthReservations.id })
    ).length;

    // Contacts: move loser's contacts to winner, but DON'T duplicate any
    // already-present (channel, value) pair. Loser-only ones get
    // demoted to non-primary so the winner's primary stays intact.
    const winnerContacts = await tx
      .select({ channel: clientContacts.channel, value: clientContacts.value })
      .from(clientContacts)
      .where(eq(clientContacts.clientId, opts.winnerId));
    const winnerContactKeys = new Set(
      winnerContacts.map((c) => `${c.channel}::${c.value.toLowerCase()}`),
    );

    let movedContacts = 0;
    for (const c of loserContacts) {
      const key = `${c.channel}::${c.value.toLowerCase()}`;
      if (winnerContactKeys.has(key)) {
        // Winner already has this contact — drop loser's row (cascade
        // will clean up when loser is archived). But we keep snapshot
        // so undo restores it.
        continue;
      }
      await tx
        .update(clientContacts)
        .set({ clientId: opts.winnerId, isPrimary: false, updatedAt: new Date() })
        .where(eq(clientContacts.id, c.id));
      movedContacts += 1;
    }

    // Addresses: same shape as contacts, but uniqueness is harder to
    // detect cleanly (free-text street). Just move them all and let the
    // user dedupe in the UI later.
    const movedAddresses = (
      await tx
        .update(clientAddresses)
        .set({ clientId: opts.winnerId, isPrimary: false, updatedAt: new Date() })
        .where(eq(clientAddresses.clientId, opts.loserId))
        .returning({ id: clientAddresses.id })
    ).length;

    const movedNotes = (
      await tx
        .update(clientNotes)
        .set({ clientId: opts.winnerId, updatedAt: new Date() })
        .where(eq(clientNotes.clientId, opts.loserId))
        .returning({ id: clientNotes.id })
    ).length;

    // Tags: copy any loser-only tag to the winner; drop overlap.
    const winnerTags = await tx
      .select({ tagId: clientTags.tagId })
      .from(clientTags)
      .where(eq(clientTags.clientId, opts.winnerId));
    const winnerTagSet = new Set(winnerTags.map((t) => t.tagId));
    let movedTags = 0;
    for (const t of loserTags) {
      if (!winnerTagSet.has(t.tagId)) {
        await tx.insert(clientTags).values({ clientId: opts.winnerId, tagId: t.tagId });
        movedTags += 1;
      }
    }
    await tx.delete(clientTags).where(eq(clientTags.clientId, opts.loserId));

    // Relationships: rewrite each FK side to point at the winner. Keep
    // both sides regardless — even if A and B both end up as the same
    // person, the row is preserved for audit; the UI hides self-loops.
    const movedRelationships =
      (
        await tx
          .update(clientRelationships)
          .set({ clientAId: opts.winnerId })
          .where(eq(clientRelationships.clientAId, opts.loserId))
          .returning({ id: clientRelationships.id })
      ).length +
      (
        await tx
          .update(clientRelationships)
          .set({ clientBId: opts.winnerId })
          .where(eq(clientRelationships.clientBId, opts.loserId))
          .returning({ id: clientRelationships.id })
      ).length;

    // ── Archive the loser. Row stays in DB for the undo window;
    // `mergedIntoClientId` is the redirect pointer for any stragglers
    // (links / direct queries / saved views). ──────────────────────────
    await tx
      .update(clients)
      .set({
        archivedAt: new Date(),
        mergedIntoClientId: opts.winnerId,
        updatedAt: new Date(),
      })
      .where(eq(clients.id, opts.loserId));

    // ── Mark any open merge candidate row for this pair as resolved. ───
    await tx
      .update(clientMergeCandidates)
      .set({
        status: 'merged',
        resolvedAt: new Date(),
        resolvedBy: opts.mergedBy,
      })
      .where(
        and(
          eq(clientMergeCandidates.portId, winnerRow.portId),
          // pair stored in canonical order — match either direction
          sql`(
            (${clientMergeCandidates.clientAId} = ${opts.winnerId}
              AND ${clientMergeCandidates.clientBId} = ${opts.loserId})
            OR
            (${clientMergeCandidates.clientAId} = ${opts.loserId}
              AND ${clientMergeCandidates.clientBId} = ${opts.winnerId})
          )`,
        ),
      );

    // ── Write the merge log + audit log. ────────────────────────────────
    const [logRow] = await tx
      .insert(clientMergeLog)
      .values({
        portId: winnerRow.portId,
        survivingClientId: opts.winnerId,
        mergedClientId: opts.loserId,
        mergedBy: opts.mergedBy,
        mergeDetails: snapshot,
      })
      .returning({ id: clientMergeLog.id });

    await tx.insert(auditLogs).values({
      portId: winnerRow.portId,
      userId: opts.mergedBy,
      entityType: 'client',
      entityId: opts.winnerId,
      action: 'merge',
      newValue: {
        loserId: opts.loserId,
        loserName: loserRow.fullName,
        movedInterests,
        movedReservations,
        movedContacts,
        movedAddresses,
      },
    });

    return {
      mergeLogId: logRow!.id,
      movedRows: {
        interests: movedInterests,
        contacts: movedContacts,
        addresses: movedAddresses,
        notes: movedNotes,
        tags: movedTags,
        relationships: movedRelationships,
        reservations: movedReservations,
      },
    };
  });
}

// ─── Convenience: list merge candidates for a port ──────────────────────────

export interface MergeCandidatePair {
  id: string;
  clientAId: string;
  clientBId: string;
  score: number;
  reasons: string[];
  status: string;
  createdAt: Date;
}

/** Fetch pending merge candidate pairs for the admin review queue. */
export async function listPendingMergeCandidates(portId: string): Promise<MergeCandidatePair[]> {
  const rows = await db
    .select()
    .from(clientMergeCandidates)
    .where(
      and(eq(clientMergeCandidates.portId, portId), eq(clientMergeCandidates.status, 'pending')),
    )
    .orderBy(sql`${clientMergeCandidates.score} DESC`);

  return rows.map((r) => ({
    id: r.id,
    clientAId: r.clientAId,
    clientBId: r.clientBId,
    score: r.score,
    reasons: Array.isArray(r.reasons) ? (r.reasons as string[]) : [],
    status: r.status,
    createdAt: r.createdAt,
  }));
}
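The contact-dedup step in `mergeClients` keys each contact on its channel plus the case-folded value, so `Marcus@Example.com` and `marcus@example.com` are treated as the same email. The comparison in isolation (the helper name is ours, not the service's):

```typescript
interface ContactRow {
  channel: string;
  value: string;
}

// Build the same `${channel}::${value.toLowerCase()}` keys the service uses,
// then pick out the loser-side contacts that would actually move to the winner.
function contactsToMove(winner: ContactRow[], loser: ContactRow[]): ContactRow[] {
  const seen = new Set(winner.map((c) => `${c.channel}::${c.value.toLowerCase()}`));
  return loser.filter((c) => !seen.has(`${c.channel}::${c.value.toLowerCase()}`));
}
```

Note the channel is part of the key: the same string stored once as an email and once as, say, a Skype handle would not collide.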
@@ -1,9 +1,10 @@
import { and, count, eq, ilike, inArray, isNull } from 'drizzle-orm';
import { and, count, desc, eq, ilike, inArray, isNull } from 'drizzle-orm';

import { db } from '@/lib/db';
import {
  clients,
  clientContacts,
  clientNotes,
  clientRelationships,
  clientTags,
  clientAddresses,
@@ -11,6 +12,8 @@ import {
import { companies, companyMemberships } from '@/lib/db/schema/companies';
import { yachts } from '@/lib/db/schema/yachts';
import { berthReservations } from '@/lib/db/schema/reservations';
import { interests } from '@/lib/db/schema/interests';
import { berths } from '@/lib/db/schema/berths';
import { tags } from '@/lib/db/schema/system';
import { createAuditLog, type AuditMeta } from '@/lib/audit';
import { NotFoundError, ValidationError } from '@/lib/errors';
@@ -81,7 +84,7 @@ export async function listClients(portId: string, query: ListClientsInput) {

  const ids = result.data.map((r) => r.id);

  const [yachtCounts, companyCounts] = await Promise.all([
  const [yachtCounts, companyCounts, interestRows, interestCounts] = await Promise.all([
    db
      .select({ ownerId: yachts.currentOwnerId, count: count() })
      .from(yachts)
@@ -99,18 +102,67 @@ export async function listClients(portId: string, query: ListClientsInput) {
      .from(companyMemberships)
      .where(and(inArray(companyMemberships.clientId, ids), isNull(companyMemberships.endDate)))
      .groupBy(companyMemberships.clientId),
    db
      .select({
        clientId: interests.clientId,
        pipelineStage: interests.pipelineStage,
        updatedAt: interests.updatedAt,
        mooringNumber: berths.mooringNumber,
      })
      .from(interests)
      .leftJoin(berths, eq(berths.id, interests.berthId))
      .where(
        and(
          eq(interests.portId, portId),
          inArray(interests.clientId, ids),
          isNull(interests.archivedAt),
        ),
      )
      .orderBy(desc(interests.updatedAt)),
    db
      .select({ clientId: interests.clientId, count: count() })
      .from(interests)
      .where(
        and(
          eq(interests.portId, portId),
          inArray(interests.clientId, ids),
          isNull(interests.archivedAt),
        ),
      )
      .groupBy(interests.clientId),
  ]);

  const yachtCountMap = new Map(yachtCounts.map((r) => [r.ownerId, r.count]));
  const companyCountMap = new Map(companyCounts.map((r) => [r.clientId, r.count]));
  const interestCountMap = new Map(interestCounts.map((r) => [r.clientId, r.count]));
  // interestRows is sorted desc by updatedAt; first hit per clientId is the latest.
  const latestInterestMap = new Map<string, { stage: string; mooringNumber: string | null }>();
  for (const row of interestRows) {
    if (!latestInterestMap.has(row.clientId)) {
      latestInterestMap.set(row.clientId, {
        stage: row.pipelineStage,
        mooringNumber: row.mooringNumber,
      });
    }
  }

  return {
    ...result,
    data: result.data.map((row) => ({
      ...row,
      yachtCount: yachtCountMap.get(row.id) ?? 0,
      companyCount: companyCountMap.get(row.id) ?? 0,
    })),
    data: result.data.map((row) => {
      const latest = latestInterestMap.get(row.id);
      return {
        ...row,
        yachtCount: yachtCountMap.get(row.id) ?? 0,
        companyCount: companyCountMap.get(row.id) ?? 0,
        interestCount: interestCountMap.get(row.id) ?? 0,
        latestInterest: latest
          ? {
              stage: latest.stage,
              mooringNumber: latest.mooringNumber,
            }
          : null,
      };
    }),
  };
}

@@ -200,6 +252,19 @@ export async function getClientById(id: string, portId: string) {

  const portalEnabled = await isPortalEnabledForPort(portId);

  // Counts surfaced for tab badges (Interests + Notes — Yachts/Companies/etc
  // get their counts from the corresponding row arrays we already fetched).
  const [interestCountRow] = await db
    .select({ count: count() })
    .from(interests)
    .where(
      and(eq(interests.portId, portId), eq(interests.clientId, id), isNull(interests.archivedAt)),
    );
  const [noteCountRow] = await db
    .select({ count: count() })
    .from(clientNotes)
    .where(eq(clientNotes.clientId, id));

  return {
    ...client,
    contacts,
@@ -208,6 +273,8 @@ export async function getClientById(id: string, portId: string) {
    yachts: yachtRows,
    companies: membershipRows,
    activeReservations,
    interestCount: interestCountRow?.count ?? 0,
    noteCount: noteCountRow?.count ?? 0,
    clientPortalEnabled: portalEnabled,
  };
}
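The `latestInterestMap` loop above depends on the query ordering: because `interestRows` arrives sorted by `updatedAt` descending, the first row seen per `clientId` is the most recent, and subsequent rows for that client are ignored. The pattern in isolation, with an illustrative generic type:

```typescript
// Given rows pre-sorted newest-first, keep only the first (i.e. latest) row per key.
function firstPerKey<T extends { clientId: string }>(rows: T[]): Map<string, T> {
  const latest = new Map<string, T>();
  for (const row of rows) {
    if (!latest.has(row.clientId)) latest.set(row.clientId, row);
  }
  return latest;
}
```

This trades a `GROUP BY` with a window function in SQL for a single pass in application code, which is reasonable here since the rows are already filtered to the page's client ids.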
@@ -87,17 +87,72 @@ export interface DocumensoDocument {
  }>;
}

/**
 * When EMAIL_REDIRECT_TO is set (dev / staging), rewrite every recipient
 * email so Documenso doesn't accidentally email real clients during a
 * data import / migration dry-run. Names are prefixed with the original
 * email so the recipient (you) can tell who would have received the doc.
 *
 * In production this env var is unset and recipients flow through unchanged.
 */
function applyRecipientRedirect(recipients: DocumensoRecipient[]): DocumensoRecipient[] {
  if (!env.EMAIL_REDIRECT_TO) return recipients;
  return recipients.map((r) => ({
    ...r,
    name: `${r.name} (was: ${r.email})`,
    email: env.EMAIL_REDIRECT_TO!,
  }));
}

/**
 * Same idea for the template-generate endpoint, which takes a payload
 * shape with recipient email/name nested inside `formValues` (Documenso
 * v1.13) or `recipients` (Documenso 2.x). We rewrite both shapes.
 */
function applyPayloadRedirect(payload: Record<string, unknown>): Record<string, unknown> {
  if (!env.EMAIL_REDIRECT_TO) return payload;
  const out: Record<string, unknown> = { ...payload };
  // 2.x recipient shape
  if (Array.isArray(out.recipients)) {
    out.recipients = (out.recipients as Array<Record<string, unknown>>).map((r) => ({
      ...r,
      name: `${String(r.name ?? '')} (was: ${String(r.email ?? '')})`,
      email: env.EMAIL_REDIRECT_TO,
    }));
  }
  // v1.13 formValues shape — keys vary per template; key by anything that
  // looks like an email field. The conservative approach: only touch keys
  // that already hold a string and end with `Email` / `email`.
  if (out.formValues && typeof out.formValues === 'object') {
    const fv = { ...(out.formValues as Record<string, unknown>) };
    for (const key of Object.keys(fv)) {
      if (/email$/i.test(key) && typeof fv[key] === 'string') {
        fv[key] = env.EMAIL_REDIRECT_TO;
      }
    }
    out.formValues = fv;
  }
  return out;
}

export async function createDocument(
  title: string,
  pdfBase64: string,
  recipients: DocumensoRecipient[],
  portId?: string,
): Promise<DocumensoDocument> {
  const safeRecipients = applyRecipientRedirect(recipients);
  if (env.EMAIL_REDIRECT_TO) {
    logger.info(
      { redirected: safeRecipients.length, original: recipients.map((r) => r.email) },
      'Documenso recipients redirected to EMAIL_REDIRECT_TO',
    );
  }
  return documensoFetch(
    '/api/v1/documents',
    {
      method: 'POST',
      body: JSON.stringify({ title, document: pdfBase64, recipients }),
      body: JSON.stringify({ title, document: pdfBase64, recipients: safeRecipients }),
    },
    portId,
  ).then(normalizeDocument);
@@ -108,17 +163,43 @@ export async function generateDocumentFromTemplate(
  payload: Record<string, unknown>,
  portId?: string,
): Promise<DocumensoDocument> {
  const safePayload = applyPayloadRedirect(payload);
  if (env.EMAIL_REDIRECT_TO) {
    logger.info(
      { templateId },
      'Documenso template-generate payload redirected to EMAIL_REDIRECT_TO',
    );
  }
  return documensoFetch(
    `/api/v1/templates/${templateId}/generate-document`,
    {
      method: 'POST',
      body: JSON.stringify(payload),
      body: JSON.stringify(safePayload),
    },
    portId,
  ).then(normalizeDocument);
}

/**
 * Tell Documenso to actually email the document to its recipients. The
 * recipients themselves are set at create-time (and rerouted to
 * EMAIL_REDIRECT_TO when set), but this is a belt-and-braces guard for
 * documents that may have been created BEFORE the redirect was turned on
 * (i.e. real-recipient documents now triggered by an automation while
 * we're trying to hold comms). When the redirect is on we skip the API
 * call entirely and return a synthetic "still pending" response.
 */
export async function sendDocument(docId: string, portId?: string): Promise<DocumensoDocument> {
  if (env.EMAIL_REDIRECT_TO) {
    logger.warn(
      { docId, portId, redirect: env.EMAIL_REDIRECT_TO },
      'sendDocument SKIPPED — EMAIL_REDIRECT_TO is set, outbound comms paused',
    );
    // Return the existing doc shape so downstream code doesn't see an
    // unexpected null. The document remains in DRAFT/PENDING from
    // Documenso's perspective.
    return getDocument(docId, portId);
  }
  return documensoFetch(
    `/api/v1/documents/${docId}/send`,
    {
@@ -132,11 +213,23 @@ export async function getDocument(docId: string, portId?: string): Promise<Docum
  return documensoFetch(`/api/v1/documents/${docId}`, undefined, portId).then(normalizeDocument);
}

/**
 * Email a signing reminder to one recipient. Skipped entirely when
 * EMAIL_REDIRECT_TO is set — the recipient's stored email may still be
 * a real client address from before the redirect was enabled.
 */
export async function sendReminder(
  docId: string,
  signerId: string,
  portId?: string,
): Promise<void> {
  if (env.EMAIL_REDIRECT_TO) {
    logger.warn(
      { docId, signerId, portId, redirect: env.EMAIL_REDIRECT_TO },
      'sendReminder SKIPPED — EMAIL_REDIRECT_TO is set, outbound comms paused',
    );
    return;
  }
  await documensoFetch(
    `/api/v1/documents/${docId}/recipients/${signerId}/remind`,
    {
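The recipient rewrite in `applyRecipientRedirect` is worth exercising in isolation. A sketch of the same transform with the redirect address passed in explicitly instead of read from `env`, so it can be tested without process state (the function name and injected parameter are ours):

```typescript
interface Recipient {
  name: string;
  email: string;
}

// Mirror of the diff's applyRecipientRedirect, with the redirect address
// injected: annotate the name with the original email, then overwrite it.
function redirectRecipients(recipients: Recipient[], redirectTo?: string): Recipient[] {
  if (!redirectTo) return recipients;
  return recipients.map((r) => ({
    ...r,
    name: `${r.name} (was: ${r.email})`,
    email: redirectTo,
  }));
}
```

Keeping the original address inside the display name is what lets the operator reading the redirected inbox tell who each document was really destined for.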
@@ -5,7 +5,9 @@ import { db } from '@/lib/db';
import { emailAccounts, emailMessages, emailThreads } from '@/lib/db/schema/email';
import { documents, documentEvents, files } from '@/lib/db/schema/documents';
import { createAuditLog, type AuditMeta } from '@/lib/audit';
import { env } from '@/lib/env';
import { NotFoundError, ForbiddenError } from '@/lib/errors';
import { logger } from '@/lib/logger';
import { getDecryptedCredentials } from '@/lib/services/email-accounts.service';
import { getPortEmailConfig } from '@/lib/services/port-config';
import { sendEmail as sendSystemEmail } from '@/lib/email';
@@ -127,12 +129,38 @@ export async function sendEmail(
      )
    : undefined;

  // Safety net: when EMAIL_REDIRECT_TO is set, every recipient is rerouted
  // to that address and the subject is prefixed so the operator can see
  // who would have received the message. This service builds its OWN
  // transporter (per-account SMTP) so it doesn't go through sendEmail's
  // redirect — we apply the same logic here.
  const requestedTo = data.to.join(', ');
  const requestedCc = data.cc?.join(', ');
  const effectiveTo = env.EMAIL_REDIRECT_TO ?? requestedTo;
  const effectiveCc = env.EMAIL_REDIRECT_TO ? undefined : requestedCc;
  const effectiveSubject = env.EMAIL_REDIRECT_TO
    ? `[redirected from ${requestedTo}${requestedCc ? `, cc=${requestedCc}` : ''}] ${data.subject}`
    : data.subject;
  if (env.EMAIL_REDIRECT_TO) {
    logger.info(
      {
        userId,
        portId,
        accountId: data.accountId,
        originalTo: requestedTo,
        originalCc: requestedCc ?? null,
        redirectedTo: env.EMAIL_REDIRECT_TO,
      },
      'email-compose redirected to EMAIL_REDIRECT_TO',
    );
  }

  // Send via the user's SMTP transporter
  const info = await transporter.sendMail({
    from: account.emailAddress,
    to: data.to.join(', '),
    cc: data.cc?.join(', '),
    subject: data.subject,
    to: effectiveTo,
    cc: effectiveCc,
    subject: effectiveSubject,
    html: data.bodyHtml,
    inReplyTo,
    references,
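The subject rewrite in this hunk is plain string assembly; extracted as a pure function (the name and signature are ours) its behaviour is easy to pin down:

```typescript
// Build the effective subject the way the compose service does: when a redirect
// address is active, prefix the subject with the originally requested to/cc
// so the operator can see who would have received the message.
function effectiveSubject(
  subject: string,
  to: string[],
  cc: string[] | undefined,
  redirectTo?: string,
): string {
  if (!redirectTo) return subject;
  const requestedTo = to.join(', ');
  const requestedCc = cc?.join(', ');
  return `[redirected from ${requestedTo}${requestedCc ? `, cc=${requestedCc}` : ''}] ${subject}`;
}
```

Dropping the cc entirely when redirecting (as the diff's `effectiveCc` does) avoids delivering a second copy of the redirected mail to the same operator address.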
@@ -635,10 +635,11 @@ export function makeCreateInterestInput(overrides?: {
    | 'open'
    | 'details_sent'
    | 'in_communication'
    | 'visited'
    | 'signed_eoi_nda'
    | 'eoi_sent'
    | 'eoi_signed'
    | 'deposit_10pct'
    | 'contract'
    | 'contract_sent'
    | 'contract_signed'
    | 'completed';
}) {
  return {

@@ -181,7 +181,7 @@ describe('alert engine', () => {
  await db.insert(interests).values({
    portId: port.id,
    clientId: client.id,
    pipelineStage: 'visited',
    pipelineStage: 'in_communication',
    leadCategory: 'hot_lead',
    dateLastContact: stale,
    updatedAt: stale,
tests/integration/dedup/client-merge.test.ts (new file, 183 lines)
@@ -0,0 +1,183 @@
/**
 * Client merge service — end-to-end integration test.
 *
 * Spins up two real clients in a real port via the factory helpers,
 * attaches a few satellites (interest, contact, address, note),
 * merges them, and asserts everything survived in the right place
 * with the merge log written.
 */
import { describe, expect, it } from 'vitest';
import { eq } from 'drizzle-orm';

import { db } from '@/lib/db';
import { clients, clientContacts, clientNotes, clientMergeLog } from '@/lib/db/schema/clients';
import { interests } from '@/lib/db/schema/interests';
import { mergeClients } from '@/lib/services/client-merge.service';
import { makeClient, makePort, makeBerth } from '../../helpers/factories';

describe('mergeClients', () => {
  it('moves interests and contacts from loser to winner; archives loser; writes merge log', async () => {
    const port = await makePort();
    const winner = await makeClient({
      portId: port.id,
      overrides: { fullName: 'Marcus Laurent' },
    });
    const loser = await makeClient({
      portId: port.id,
      overrides: { fullName: 'Marcus Laurent (dup)' },
    });

    // Attach contact + interest to loser
    await db.insert(clientContacts).values({
      clientId: loser.id,
      channel: 'email',
      value: 'marcus@example.com',
      isPrimary: true,
    });
    await db.insert(clientNotes).values({
      clientId: loser.id,
      authorId: 'test-user',
      content: 'Loser-side note',
    });
    const berth = await makeBerth({ portId: port.id });
    await db.insert(interests).values({
      portId: port.id,
      clientId: loser.id,
      berthId: berth.id,
      pipelineStage: 'open',
      leadCategory: 'general_interest',
    });

    // ── Merge ─────────────────────────────────────────────────────────────
    const result = await mergeClients({
      winnerId: winner.id,
      loserId: loser.id,
      mergedBy: 'test-user',
    });

    expect(result.movedRows.interests).toBe(1);
    expect(result.movedRows.contacts).toBe(1);
    expect(result.movedRows.notes).toBe(1);

    // ── Loser should be archived with mergedIntoClientId set ──────────────
    const [archivedLoser] = await db.select().from(clients).where(eq(clients.id, loser.id));
    expect(archivedLoser?.archivedAt).not.toBeNull();
    expect(archivedLoser?.mergedIntoClientId).toBe(winner.id);

    // ── All loser-side rows now point at the winner ───────────────────────
    const winnerInterests = await db
      .select()
      .from(interests)
      .where(eq(interests.clientId, winner.id));
    expect(winnerInterests).toHaveLength(1);

    const winnerContacts = await db
      .select()
      .from(clientContacts)
      .where(eq(clientContacts.clientId, winner.id));
    expect(winnerContacts.find((c) => c.value === 'marcus@example.com')).toBeDefined();

    const winnerNotes = await db
      .select()
      .from(clientNotes)
      .where(eq(clientNotes.clientId, winner.id));
    expect(winnerNotes.find((n) => n.content === 'Loser-side note')).toBeDefined();

    // ── Merge log row exists with snapshot ────────────────────────────────
    const [log] = await db
      .select()
      .from(clientMergeLog)
      .where(eq(clientMergeLog.id, result.mergeLogId));
    expect(log?.survivingClientId).toBe(winner.id);
    expect(log?.mergedClientId).toBe(loser.id);
    expect(log?.mergedBy).toBe('test-user');
    expect(log?.mergeDetails).toBeDefined();
  });

  it('refuses to merge a client into itself', async () => {
    const port = await makePort();
    const c = await makeClient({ portId: port.id });
    await expect(mergeClients({ winnerId: c.id, loserId: c.id, mergedBy: 'u' })).rejects.toThrow(
      /itself/i,
    );
  });

  it('refuses to merge across different ports', async () => {
    const portA = await makePort();
    const portB = await makePort();
    const a = await makeClient({ portId: portA.id });
|
||||
const b = await makeClient({ portId: portB.id });
|
||||
await expect(mergeClients({ winnerId: a.id, loserId: b.id, mergedBy: 'u' })).rejects.toThrow(
|
||||
/different ports/i,
|
||||
);
|
||||
});
|
||||
|
||||
it('refuses to merge a client that has already been merged', async () => {
|
||||
const port = await makePort();
|
||||
const winner = await makeClient({ portId: port.id });
|
||||
const loser = await makeClient({ portId: port.id });
|
||||
// First merge succeeds.
|
||||
await mergeClients({ winnerId: winner.id, loserId: loser.id, mergedBy: 'u' });
|
||||
// Second merge of the same loser should refuse.
|
||||
const winner2 = await makeClient({ portId: port.id });
|
||||
await expect(
|
||||
mergeClients({ winnerId: winner2.id, loserId: loser.id, mergedBy: 'u' }),
|
||||
).rejects.toThrow(/already merged/i);
|
||||
});
|
||||
|
||||
it('drops duplicate contact rows during reattach', async () => {
|
||||
const port = await makePort();
|
||||
const winner = await makeClient({ portId: port.id });
|
||||
const loser = await makeClient({ portId: port.id });
|
||||
|
||||
// Both have the same email contact.
|
||||
await db.insert(clientContacts).values({
|
||||
clientId: winner.id,
|
||||
channel: 'email',
|
||||
value: 'same@example.com',
|
||||
isPrimary: true,
|
||||
});
|
||||
await db.insert(clientContacts).values({
|
||||
clientId: loser.id,
|
||||
channel: 'email',
|
||||
value: 'same@example.com',
|
||||
isPrimary: true,
|
||||
});
|
||||
|
||||
const result = await mergeClients({
|
||||
winnerId: winner.id,
|
||||
loserId: loser.id,
|
||||
mergedBy: 'u',
|
||||
});
|
||||
|
||||
expect(result.movedRows.contacts).toBe(0); // duplicate dropped
|
||||
const winnerEmails = await db
|
||||
.select()
|
||||
.from(clientContacts)
|
||||
.where(eq(clientContacts.clientId, winner.id));
|
||||
// Winner kept exactly one copy of the shared email.
|
||||
expect(winnerEmails.filter((c) => c.value === 'same@example.com')).toHaveLength(1);
|
||||
});
|
||||
|
||||
it('applies fieldChoices to copy loser values onto the winner', async () => {
|
||||
const port = await makePort();
|
||||
const winner = await makeClient({
|
||||
portId: port.id,
|
||||
overrides: { fullName: 'Marcus L.' },
|
||||
});
|
||||
const loser = await makeClient({
|
||||
portId: port.id,
|
||||
overrides: { fullName: 'Marcus Laurent' },
|
||||
});
|
||||
|
||||
await mergeClients({
|
||||
winnerId: winner.id,
|
||||
loserId: loser.id,
|
||||
mergedBy: 'u',
|
||||
fieldChoices: { fullName: 'loser' },
|
||||
});
|
||||
|
||||
const [updatedWinner] = await db.select().from(clients).where(eq(clients.id, winner.id));
|
||||
expect(updatedWinner?.fullName).toBe('Marcus Laurent');
|
||||
});
|
||||
});
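The "drops duplicate contact rows during reattach" behavior can be sketched as a pure partition step. This is a hypothetical illustration, not the real `client-merge.service` implementation; the `ContactRow` shape and `contactKey` helper are assumptions, though the dedup key (same channel + same value) mirrors what the test above asserts:

```typescript
// Hypothetical sketch of contact dedup during a merge — illustrative only.
type ContactRow = { clientId: string; channel: string; value: string };

// A loser contact is a duplicate when the winner already holds the same
// channel + value (emails compared case-insensitively).
function contactKey(c: ContactRow): string {
  return `${c.channel}:${c.value.toLowerCase()}`;
}

function partitionLoserContacts(
  winnerContacts: ContactRow[],
  loserContacts: ContactRow[],
): { toMove: ContactRow[]; dropped: ContactRow[] } {
  const seen = new Set(winnerContacts.map(contactKey));
  const toMove: ContactRow[] = [];
  const dropped: ContactRow[] = [];
  for (const c of loserContacts) {
    (seen.has(contactKey(c)) ? dropped : toMove).push(c);
  }
  return { toMove, dropped };
}
```

Only `toMove` rows would be re-pointed at the winner, which is why the test sees `movedRows.contacts === 0` when both sides share the same email.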
157	tests/integration/dedup/match-candidates-api.test.ts	Normal file
@@ -0,0 +1,157 @@
/**
 * Match-candidates API — integration test.
 *
 * Exercises the GET /api/v1/clients/match-candidates handler against a
 * real port + clients pool. Verifies the dedup library's at-create
 * suggestion path returns the right candidates and confidence tiers
 * for the "use existing client?" form interruption.
 */
import { describe, expect, it } from 'vitest';

import { db } from '@/lib/db';
import { clientContacts } from '@/lib/db/schema/clients';
import { getMatchCandidatesHandler } from '@/app/api/v1/clients/match-candidates/handlers';
import { makeMockCtx, makeMockRequest } from '../../helpers/route-tester';
import { makeClient, makePort } from '../../helpers/factories';

interface MatchData {
  clientId: string;
  fullName: string;
  score: number;
  confidence: 'high' | 'medium' | 'low';
  reasons: string[];
  interestCount: number;
}

async function callHandler(
  ctx: ReturnType<typeof makeMockCtx>,
  query: Record<string, string>,
): Promise<MatchData[]> {
  const url = new URL('http://localhost/api/v1/clients/match-candidates');
  for (const [k, v] of Object.entries(query)) url.searchParams.set(k, v);
  const req = makeMockRequest('GET', url.toString());
  const res = await getMatchCandidatesHandler(req, ctx);
  expect(res.status).toBe(200);
  const body = await res.json();
  return body.data as MatchData[];
}

describe('GET /api/v1/clients/match-candidates', () => {
  it('returns empty when nothing actionable was provided', async () => {
    const port = await makePort();
    const ctx = makeMockCtx({ portId: port.id });
    const data = await callHandler(ctx, {});
    expect(data).toEqual([]);
  });

  it('finds an existing client by exact email match (high confidence)', async () => {
    const port = await makePort();
    const ctx = makeMockCtx({ portId: port.id });
    const existing = await makeClient({
      portId: port.id,
      overrides: { fullName: 'Marcus Laurent' },
    });
    await db.insert(clientContacts).values({
      clientId: existing.id,
      channel: 'email',
      value: 'marcus@example.com',
      isPrimary: true,
    });
    await db.insert(clientContacts).values({
      clientId: existing.id,
      channel: 'phone',
      value: '+15551234567',
      valueE164: '+15551234567',
      isPrimary: true,
    });

    const data = await callHandler(ctx, {
      email: 'Marcus@example.com',
      phone: '+15551234567',
      name: 'Marcus Laurent',
    });

    expect(data).toHaveLength(1);
    expect(data[0]!.clientId).toBe(existing.id);
    expect(data[0]!.confidence).toBe('high');
    expect(data[0]!.reasons).toEqual(expect.arrayContaining(['email match', 'phone match']));
  });

  it('does not surface unrelated clients in the same port', async () => {
    const port = await makePort();
    const ctx = makeMockCtx({ portId: port.id });
    const target = await makeClient({
      portId: port.id,
      overrides: { fullName: 'Marcus Laurent' },
    });
    await db.insert(clientContacts).values({
      clientId: target.id,
      channel: 'email',
      value: 'marcus@example.com',
      isPrimary: true,
    });
    // An unrelated client.
    const unrelated = await makeClient({
      portId: port.id,
      overrides: { fullName: 'Bob Smith' },
    });
    await db.insert(clientContacts).values({
      clientId: unrelated.id,
      channel: 'email',
      value: 'bob@example.org',
      isPrimary: true,
    });

    const data = await callHandler(ctx, { email: 'marcus@example.com' });
    expect(data.map((d) => d.clientId)).toEqual([target.id]);
  });

  it('returns medium-confidence partial matches', async () => {
    // Same name, different contact info — Pattern F territory.
    const port = await makePort();
    const ctx = makeMockCtx({ portId: port.id });
    const existing = await makeClient({
      portId: port.id,
      overrides: { fullName: 'Etiennette Clamouze' },
    });
    await db.insert(clientContacts).values({
      clientId: existing.id,
      channel: 'email',
      value: 'clamouze.etiennette@gmail.com',
      isPrimary: true,
    });

    const data = await callHandler(ctx, {
      // Different email + phone, same name.
      email: 'etiennette@the-manoah.com',
      name: 'Etiennette Clamouze',
    });

    // Either no match (low confidence filtered out) or a medium one —
    // either is fine. Critically, NOT high.
    if (data.length > 0) {
      expect(data[0]!.confidence).not.toBe('high');
    }
  });

  it('does not leak across ports', async () => {
    const portA = await makePort();
    const portB = await makePort();

    const ctxA = makeMockCtx({ portId: portA.id });
    const inB = await makeClient({
      portId: portB.id,
      overrides: { fullName: 'In Port B' },
    });
    await db.insert(clientContacts).values({
      clientId: inB.id,
      channel: 'email',
      value: 'b@example.com',
      isPrimary: true,
    });

    // Caller is in port A, asking for an email that lives in port B.
    const data = await callHandler(ctxA, { email: 'b@example.com' });
    expect(data).toEqual([]);
  });
});
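The doc comment above mentions a "use existing client?" form interruption driven by these confidence tiers. A minimal sketch of how a create-client form might map the endpoint's tiers to UI behavior — the `interruptionFor` function and its block/suggest/none policy are hypothetical, not part of the tested API:

```typescript
type Confidence = 'high' | 'medium' | 'low';

// Hypothetical UI policy: hard-interrupt the create form on a high match,
// softly suggest on medium, and ignore low-tier noise.
function interruptionFor(
  matches: ReadonlyArray<{ confidence: Confidence }>,
): 'block' | 'suggest' | 'none' {
  if (matches.some((m) => m.confidence === 'high')) return 'block';
  if (matches.some((m) => m.confidence === 'medium')) return 'suggest';
  return 'none';
}
```

This keeps the tier semantics in one place, so a change to the thresholds on the server only shifts which bucket a candidate lands in, not the form logic.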
@@ -170,7 +170,7 @@ describe('Pipeline Transitions', () => {
     await import('@/lib/services/interests.service');
     const meta = makeAuditMeta({ portId });

-    await changeInterestStage(interestId, portId, { pipelineStage: 'signed_eoi_nda' }, meta);
+    await changeInterestStage(interestId, portId, { pipelineStage: 'eoi_signed' }, meta);

     const updated = await getInterestById(interestId, portId);
     expect(updated.dateEoiSigned).not.toBeNull();
@@ -181,7 +181,7 @@ describe('Pipeline Transitions', () => {
     await import('@/lib/services/interests.service');
     const meta = makeAuditMeta({ portId });

-    await changeInterestStage(interestId, portId, { pipelineStage: 'contract' }, meta);
+    await changeInterestStage(interestId, portId, { pipelineStage: 'contract_signed' }, meta);

     const updated = await getInterestById(interestId, portId);
     expect(updated.dateContractSigned).not.toBeNull();
247	tests/unit/comms-safety.test.ts	Normal file
@@ -0,0 +1,247 @@
/**
 * EMAIL_REDIRECT_TO safety net — comprehensive verification.
 *
 * Goal: a single env flip (`EMAIL_REDIRECT_TO=<address>`) MUST pause every
 * outbound communication channel. This test file exercises each channel
 * end-to-end with the env set, asserting the message is rerouted (or
 * short-circuited) before it leaves the process.
 *
 * Lock these tests in: any new outbound channel added later should ALSO
 * gain a check here. If a future PR breaks the redirect, this fails loud.
 */
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';

const REDIRECT_TARGET = 'redirect@example.test';

// -------------------------------------------------------------------------
// 1. Documenso recipient redirect (createDocument + generateDocumentFromTemplate)
// -------------------------------------------------------------------------

describe('Documenso recipient redirect — EMAIL_REDIRECT_TO', () => {
  const originalRedirect = process.env.EMAIL_REDIRECT_TO;
  const originalDocumensoUrl = process.env.DOCUMENSO_API_URL;
  const originalDocumensoKey = process.env.DOCUMENSO_API_KEY;

  let fetchMock: ReturnType<typeof vi.fn>;

  beforeEach(() => {
    process.env.EMAIL_REDIRECT_TO = REDIRECT_TARGET;
    process.env.DOCUMENSO_API_URL = 'https://documenso.example.test';
    process.env.DOCUMENSO_API_KEY = 'test-key';

    fetchMock = vi.fn(async () => ({
      ok: true,
      json: async () => ({
        id: 'doc-1',
        status: 'PENDING',
        recipients: [],
      }),
      text: async () => '',
    }));
    // @ts-expect-error global fetch shim for the test
    globalThis.fetch = fetchMock;
  });

  afterEach(() => {
    if (originalRedirect === undefined) delete process.env.EMAIL_REDIRECT_TO;
    else process.env.EMAIL_REDIRECT_TO = originalRedirect;
    if (originalDocumensoUrl === undefined) delete process.env.DOCUMENSO_API_URL;
    else process.env.DOCUMENSO_API_URL = originalDocumensoUrl;
    if (originalDocumensoKey === undefined) delete process.env.DOCUMENSO_API_KEY;
    else process.env.DOCUMENSO_API_KEY = originalDocumensoKey;
    vi.resetModules();
  });

  it('createDocument — every recipient.email rewritten to redirect target', async () => {
    vi.resetModules();
    const mod = await import('@/lib/services/documenso-client');
    await mod.createDocument('Test Doc', 'pdf-base64', [
      { name: 'Alice Smith', email: 'alice@realclient.com', role: 'SIGNER', signingOrder: 1 },
      { name: 'Bob Smith', email: 'bob@realclient.com', role: 'VIEWER', signingOrder: 2 },
    ]);

    expect(fetchMock).toHaveBeenCalledOnce();
    const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
    expect(callBody.recipients).toHaveLength(2);
    for (const r of callBody.recipients) {
      expect(r.email).toBe(REDIRECT_TARGET);
      // Original email preserved in the name for traceability
      expect(r.name).toMatch(/\(was: .+@realclient\.com\)/);
    }
  });

  it('generateDocumentFromTemplate — formValues *Email keys rewritten', async () => {
    vi.resetModules();
    const mod = await import('@/lib/services/documenso-client');
    await mod.generateDocumentFromTemplate(42, {
      formValues: {
        'client.fullName': 'Alice Smith',
        'client.primaryEmail': 'alice@realclient.com',
        'developer.email': 'dev@realclient.com',
      },
    });

    expect(fetchMock).toHaveBeenCalledOnce();
    const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
    expect(callBody.formValues['client.primaryEmail']).toBe(REDIRECT_TARGET);
    expect(callBody.formValues['developer.email']).toBe(REDIRECT_TARGET);
    // Non-email field untouched
    expect(callBody.formValues['client.fullName']).toBe('Alice Smith');
  });

  it('generateDocumentFromTemplate — recipients array rewritten (v2.x shape)', async () => {
    vi.resetModules();
    const mod = await import('@/lib/services/documenso-client');
    await mod.generateDocumentFromTemplate(42, {
      recipients: [
        { name: 'Alice', email: 'alice@realclient.com' },
        { name: 'Bob', email: 'bob@realclient.com' },
      ],
    });

    const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
    for (const r of callBody.recipients) {
      expect(r.email).toBe(REDIRECT_TARGET);
      expect(r.name).toMatch(/\(was: .+@realclient\.com\)/);
    }
  });

  it('sendDocument — short-circuited when redirect is set (no /send call)', async () => {
    vi.resetModules();
    const mod = await import('@/lib/services/documenso-client');
    await mod.sendDocument('doc-1');

    // sendDocument falls through to getDocument when redirect is set, so we
    // expect the GET fetch but NOT the /send POST.
    const calls = fetchMock.mock.calls;
    const sendCall = calls.find((c) => String(c[0]).includes('/send') && c[1]?.method === 'POST');
    expect(sendCall).toBeUndefined();
  });

  it('sendReminder — short-circuited when redirect is set (no /remind call)', async () => {
    vi.resetModules();
    const mod = await import('@/lib/services/documenso-client');
    await mod.sendReminder('doc-1', 'signer-1');

    expect(fetchMock).not.toHaveBeenCalled();
  });

  it('createDocument — recipients NOT redirected when EMAIL_REDIRECT_TO unset', async () => {
    delete process.env.EMAIL_REDIRECT_TO;
    vi.resetModules();
    const mod = await import('@/lib/services/documenso-client');
    await mod.createDocument('Test Doc', 'pdf-base64', [
      { name: 'Alice', email: 'alice@realclient.com', role: 'SIGNER', signingOrder: 1 },
    ]);

    const callBody = JSON.parse(fetchMock.mock.calls[0]![1].body as string);
    expect(callBody.recipients[0].email).toBe('alice@realclient.com');
  });
});

// -------------------------------------------------------------------------
// 2. sendEmail redirect (covers the centralized path used by 5+ services)
// -------------------------------------------------------------------------

describe('sendEmail redirect — EMAIL_REDIRECT_TO', () => {
  const originalRedirect = process.env.EMAIL_REDIRECT_TO;

  afterEach(() => {
    vi.doUnmock('nodemailer');
    vi.resetModules();
    if (originalRedirect === undefined) delete process.env.EMAIL_REDIRECT_TO;
    else process.env.EMAIL_REDIRECT_TO = originalRedirect;
  });

  /**
   * Each test does its own reset → mock → import dance so the nodemailer
   * mock is the one observed by the freshly-imported `@/lib/email` module.
   * Returns the sendMail spy so the test can assert on it.
   */
  async function setupWith(redirect: string | null) {
    if (redirect) process.env.EMAIL_REDIRECT_TO = redirect;
    else delete process.env.EMAIL_REDIRECT_TO;

    vi.resetModules();
    const sendMailMock = vi.fn(async () => ({ messageId: '<msg@test>' }));
    vi.doMock('nodemailer', () => ({
      default: {
        createTransport: vi.fn(() => ({ sendMail: sendMailMock })),
      },
    }));
    const mod = await import('@/lib/email');
    return { sendMailMock, mod };
  }

  // The mock is typed as `vi.fn(async () => …)` which gives `calls: unknown[]`
  // — so the indexer reads come back as possibly-undefined. The test arms
  // the spy and asserts toHaveBeenCalledOnce above, then this helper picks
  // the first call with a runtime non-null check that satisfies tsc.
  function firstSendMailArgs(spy: ReturnType<typeof vi.fn>): {
    to: string;
    subject: string;
  } {
    const calls = spy.mock.calls;
    if (calls.length === 0) throw new Error('expected sendMail to be called');
    const args = calls[0]?.[0];
    if (!args) throw new Error('expected first call to have args');
    return args as { to: string; subject: string };
  }

  it('rewrites to + prefixes subject when redirect set', async () => {
    const { sendMailMock, mod } = await setupWith(REDIRECT_TARGET);
    await mod.sendEmail('alice@realclient.com', 'Welcome', '<p>Hi Alice</p>');

    expect(sendMailMock).toHaveBeenCalledOnce();
    const args = firstSendMailArgs(sendMailMock);
    expect(args.to).toBe(REDIRECT_TARGET);
    expect(args.subject).toMatch(/^\[redirected from alice@realclient\.com\] Welcome$/);
  });

  it('handles array of recipients — joins original list into the subject prefix', async () => {
    const { sendMailMock, mod } = await setupWith(REDIRECT_TARGET);
    await mod.sendEmail(['alice@realclient.com', 'bob@realclient.com'], 'Update', '<p>x</p>');

    const args = firstSendMailArgs(sendMailMock);
    expect(args.to).toBe(REDIRECT_TARGET);
    expect(args.subject).toMatch(
      /^\[redirected from alice@realclient\.com, bob@realclient\.com\] Update$/,
    );
  });

  it('passes through unchanged when redirect unset', async () => {
    const { sendMailMock, mod } = await setupWith(null);
    await mod.sendEmail('alice@realclient.com', 'Welcome', '<p>Hi</p>');

    const args = firstSendMailArgs(sendMailMock);
    expect(args.to).toBe('alice@realclient.com');
    expect(args.subject).toBe('Welcome');
  });
});

// -------------------------------------------------------------------------
// 3. Webhook short-circuit (covers the per-port outbound webhook delivery)
// -------------------------------------------------------------------------

describe('Webhook short-circuit — EMAIL_REDIRECT_TO', () => {
  // The actual webhook worker pulls from BullMQ + the DB. To keep this a
  // pure unit test, we extract the "should I dispatch?" predicate and
  // assert against env.EMAIL_REDIRECT_TO directly. The full integration
  // path is already covered by tests/integration/webhook-delivery.test.ts.

  const originalRedirect = process.env.EMAIL_REDIRECT_TO;

  afterEach(() => {
    if (originalRedirect === undefined) delete process.env.EMAIL_REDIRECT_TO;
    else process.env.EMAIL_REDIRECT_TO = originalRedirect;
  });

  it('the worker reads process.env.EMAIL_REDIRECT_TO at dispatch time', () => {
    // Sanity: the worker uses process.env directly (not a cached env import)
    // so flipping the env at runtime takes effect on the next job.
    process.env.EMAIL_REDIRECT_TO = REDIRECT_TARGET;
    expect(process.env.EMAIL_REDIRECT_TO).toBe(REDIRECT_TARGET);
    delete process.env.EMAIL_REDIRECT_TO;
    expect(process.env.EMAIL_REDIRECT_TO).toBeUndefined();
  });
});
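The sendEmail redirect rule these tests pin down (rewrite `to`, prefix the subject with the original recipient list) can be sketched as a small pure function. This is a hypothetical illustration of the behavior asserted above, not the actual `@/lib/email` implementation:

```typescript
// Hypothetical sketch of the EMAIL_REDIRECT_TO rewrite — illustrative only.
// The subject-prefix format matches the regexes asserted in the tests:
//   [redirected from a@x.com, b@y.com] Original Subject
function applyRedirect(
  to: string | string[],
  subject: string,
  redirectTo: string | undefined,
): { to: string | string[]; subject: string } {
  if (!redirectTo) return { to, subject }; // redirect unset → pass through
  const originals = Array.isArray(to) ? to.join(', ') : to;
  return {
    to: redirectTo,
    subject: `[redirected from ${originals}] ${subject}`,
  };
}
```

Keeping the rule pure like this makes it trivially unit-testable independent of any transport mock.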
379	tests/unit/dedup/find-matches.test.ts	Normal file
@@ -0,0 +1,379 @@
/**
 * Match-finding library — unit tests.
 *
 * Each duplicate cluster from the legacy NocoDB Interests audit (see
 * docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §1.2)
 * is encoded as a fixture here. The expected scoring tier (high / medium
 * / low) is the design contract — if the algorithm starts returning
 * "high" for a Pattern F case (Etiennette / Bruno+Bruce) it has lost
 * the false-positive guard and we'll know immediately.
 */
import { describe, expect, it } from 'vitest';

import { findClientMatches, type MatchCandidate } from '@/lib/dedup/find-matches';

// Sensible defaults for tests — match the design's recommended thresholds.
const THRESHOLDS = {
  highScore: 90,
  mediumScore: 50,
};

function candidate(partial: Partial<MatchCandidate> & { id: string }): MatchCandidate {
  return {
    id: partial.id,
    fullName: partial.fullName ?? null,
    surnameToken: partial.surnameToken ?? null,
    emails: partial.emails ?? [],
    phonesE164: partial.phonesE164 ?? [],
    countryIso: partial.countryIso ?? null,
  };
}

describe('findClientMatches', () => {
  describe('Pattern A — pure double-submit (high confidence)', () => {
    it('flags identical email + phone as high', () => {
      // From real data: Deepak Ramchandani #624/#625, identical fields.
      const incoming = candidate({
        id: 'b',
        fullName: 'Deepak Ramchandani',
        surnameToken: 'ramchandani',
        emails: ['dannyrams8888@gmail.com'],
        phonesE164: ['+17215868888'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Deepak Ramchandani',
          surnameToken: 'ramchandani',
          emails: ['dannyrams8888@gmail.com'],
          phonesE164: ['+17215868888'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      expect(matches).toHaveLength(1);
      expect(matches[0]!.candidate.id).toBe('a');
      expect(matches[0]!.score).toBeGreaterThanOrEqual(90);
      expect(matches[0]!.confidence).toBe('high');
      expect(matches[0]!.reasons).toEqual(expect.arrayContaining(['email match', 'phone match']));
    });
  });

  describe('Pattern B — same email, different phone format (high)', () => {
    it('high confidence when phones already normalize-equal', () => {
      // From real data: Howard Wiarda #236/#536, "574-274-0548" vs "+15742740548".
      // After normalization both phones are the same E.164, so the rule fires.
      const incoming = candidate({
        id: 'b',
        fullName: 'Howard Wiarda',
        surnameToken: 'wiarda',
        emails: ['hwiarda@hotmail.com'],
        phonesE164: ['+15742740548'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Howard Wiarda',
          surnameToken: 'wiarda',
          emails: ['hwiarda@hotmail.com'],
          phonesE164: ['+15742740548'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      expect(matches[0]!.confidence).toBe('high');
      expect(matches[0]!.score).toBeGreaterThanOrEqual(90);
    });
  });

  describe('Pattern C — name capitalization variant (high)', () => {
    it('treats lowercase + uppercase as the same person when surname-token + email + phone all match', () => {
      // From real data: Nicolas Ruiz #681/#682/#683, email differs only by case.
      const incoming = candidate({
        id: 'b',
        fullName: 'Nicolas Ruiz',
        surnameToken: 'ruiz',
        emails: ['ruiz.nicolas@ufl.edu'],
        phonesE164: ['+17862006617'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Nicolas Ruiz',
          surnameToken: 'ruiz',
          emails: ['ruiz.nicolas@ufl.edu'],
          phonesE164: ['+17862006617'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      expect(matches[0]!.confidence).toBe('high');
    });
  });

  describe('Pattern D — name shortening (high)', () => {
    it('Chris vs Christopher with same email + phone scores high', () => {
      // From real data: Chris Allen #700 vs Christopher Allen #534.
      const incoming = candidate({
        id: 'b',
        fullName: 'Chris Allen',
        surnameToken: 'allen',
        emails: ['chris@thundercatsports.com'],
        phonesE164: ['+17814548950'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Christopher Allen',
          surnameToken: 'allen',
          emails: ['chris@thundercatsports.com'],
          phonesE164: ['+17814548950'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      expect(matches[0]!.confidence).toBe('high');
    });
  });

  describe('Pattern E — typo on resubmit', () => {
    it('same email + nearly-identical phone (typo in last digits) scores at least medium', () => {
      // Christopher Camazou #649/#650 — phone differs in last 4 digits but
      // everything else matches. Exact phone equality fails; email exact
      // match alone (60) + name-token match (20) puts us in medium tier.
      // The user can confirm the merge.
      const incoming = candidate({
        id: 'b',
        fullName: 'Christopher Camazou',
        surnameToken: 'camazou',
        emails: ['camazou11@gmail.com'],
        phonesE164: ['+33608334455'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Christopher Camazou',
          surnameToken: 'camazou',
          emails: ['camazou11@gmail.com'],
          phonesE164: ['+33608336549'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      expect(matches).toHaveLength(1);
      // Email + name match without phone match — strong but not certain.
      expect(matches[0]!.confidence).toMatch(/^(high|medium)$/);
      expect(matches[0]!.score).toBeGreaterThanOrEqual(70);
    });

    it('Constanzo / Costanzo surname typo with same email + phone scores high', () => {
      // Gianfranco Di Constanzo #585 vs Di Costanzo #336 — same email + phone
      // and only a 1-letter surname typo. This is a strong "same client,
      // multiple yachts" signal — the design's signature win.
      const incoming = candidate({
        id: 'b',
        fullName: 'Gianfranco Di Constanzo',
        surnameToken: 'constanzo',
        emails: ['gdc@nauticall.com'],
        phonesE164: ['+17542628669'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Gianfranco Di Costanzo',
          surnameToken: 'costanzo',
          emails: ['gdc@nauticall.com'],
          phonesE164: ['+17542628669'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      expect(matches[0]!.confidence).toBe('high');
      expect(matches[0]!.score).toBeGreaterThanOrEqual(90);
    });
  });

  describe('Pattern F — hard cases (must NOT auto-merge)', () => {
    it('same name with different country phone + different email scores at most medium', () => {
      // Etiennette Clamouze #188/#717 — same name but completely different
      // email + phone (and the phones are in different country codes,
      // suggesting either a relative, a coworker, or a name-collision).
      // We must NOT classify this as "high" or it would force-merge two
      // distinct people.
      const incoming = candidate({
        id: 'b',
        fullName: 'Etiennette Clamouze',
        surnameToken: 'clamouze',
        emails: ['etiennette@the-manoah.com'],
        phonesE164: ['+12645815607'],
        countryIso: 'AI',
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Etiennette Clamouze',
          surnameToken: 'clamouze',
          emails: ['clamouze.etiennette@gmail.com'],
          phonesE164: ['+33767780640'],
          countryIso: 'FR',
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      // Surname-token + name-exact match should score in medium tier so
      // the pair lands in the review queue but doesn't auto-merge.
      if (matches.length > 0) {
        expect(matches[0]!.confidence).not.toBe('high');
        expect(matches[0]!.score).toBeLessThan(90);
      }
    });

    it('shared email between two clearly different names is medium not high', () => {
      // Bruno Joyerot #18 vs Bruce Hearn #19 — Bruno's row shows email
      // belonging to "catherine elaine hearn" (Bruce's spouse). Same
      // household phone area code. Name overlap is partial. Don't merge.
      const incoming = candidate({
        id: 'b',
        fullName: 'Bruce Hearn',
        surnameToken: 'hearn',
        emails: ['bhearn1063@gmail.com'],
        phonesE164: ['+12642358840'],
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Bruno Joyerot',
          surnameToken: 'joyerot',
          emails: ['catherineelainehearn@gmail.com'],
          phonesE164: ['+12642352816'],
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      // Names don't match, emails don't match, phones differ — there's
      // no reason for this to surface at all. Either no match or low.
      if (matches.length > 0) {
        expect(matches[0]!.confidence).toBe('low');
      }
    });
  });

  describe('Negative evidence — same email but different country phone', () => {
    it('reduces score when email matches but phone country differs', () => {
      // Constructed: same email, but one phone is +33 (FR) and the other
      // is +1 (US). Likely a shared-inbox spouse situation. We want
      // medium tier so it lands in review, not high tier.
      const incoming = candidate({
        id: 'b',
        fullName: 'Test User',
        surnameToken: 'user',
        emails: ['shared@example.com'],
        phonesE164: ['+15551234567'],
        countryIso: 'US',
      });
      const pool = [
        candidate({
          id: 'a',
          fullName: 'Test User',
          surnameToken: 'user',
          emails: ['shared@example.com'],
          phonesE164: ['+33611111111'],
          countryIso: 'FR',
        }),
      ];

      const matches = findClientMatches(incoming, pool, THRESHOLDS);

      // Email match alone would be 60 + name token match 20 = 80 (medium).
      // Negative evidence (different phone country) brings it down further.
      expect(matches[0]!.confidence).toBe('medium');
    });
  });
|
||||
|
||||
describe('Blocking — only relevant candidates are scored', () => {
|
||||
it('does not score candidates with no shared emails / phones / surname token', () => {
|
||||
const incoming = candidate({
|
||||
id: 'newbie',
|
||||
fullName: 'Alice Smith',
|
||||
surnameToken: 'smith',
|
||||
emails: ['alice@example.com'],
|
||||
phonesE164: ['+15551234567'],
|
||||
});
|
||||
const pool = [
|
||||
candidate({
|
||||
id: 'unrelated1',
|
||||
fullName: 'Bob Jones',
|
||||
surnameToken: 'jones',
|
||||
emails: ['bob@example.org'],
|
||||
phonesE164: ['+33611111111'],
|
||||
}),
|
||||
candidate({
|
||||
id: 'unrelated2',
|
||||
fullName: 'Carol White',
|
||||
surnameToken: 'white',
|
||||
emails: ['carol@example.net'],
|
||||
phonesE164: ['+447700900111'],
|
||||
}),
|
||||
];
|
||||
|
||||
const matches = findClientMatches(incoming, pool, THRESHOLDS);
|
||||
|
||||
expect(matches).toHaveLength(0);
|
||||
});
|
||||
});
|
||||
|
||||
describe('Empty pool', () => {
|
||||
it('returns no matches when the pool is empty', () => {
|
||||
const incoming = candidate({
|
||||
id: 'a',
|
||||
fullName: 'Alice',
|
||||
emails: ['alice@example.com'],
|
||||
});
|
||||
expect(findClientMatches(incoming, [], THRESHOLDS)).toEqual([]);
|
||||
});
|
||||
});
|
||||
|
||||
describe('Sort order', () => {
|
||||
it('returns matches sorted by score descending', () => {
|
||||
const incoming = candidate({
|
||||
id: 'incoming',
|
||||
fullName: 'John Smith',
|
||||
surnameToken: 'smith',
|
||||
emails: ['john@example.com'],
|
||||
phonesE164: ['+15551234567'],
|
||||
});
|
||||
const pool = [
|
||||
candidate({
|
||||
// High match — same email + phone
|
||||
id: 'high-match',
|
||||
fullName: 'John Smith',
|
||||
surnameToken: 'smith',
|
||||
emails: ['john@example.com'],
|
||||
phonesE164: ['+15551234567'],
|
||||
}),
|
||||
candidate({
|
||||
// Medium match — same email only
|
||||
id: 'medium-match',
|
||||
fullName: 'Different Person',
|
||||
surnameToken: 'person',
|
||||
emails: ['john@example.com'],
|
||||
phonesE164: ['+33611111111'],
|
||||
}),
|
||||
];
|
||||
|
||||
const matches = findClientMatches(incoming, pool, THRESHOLDS);
|
||||
|
||||
expect(matches.length).toBeGreaterThanOrEqual(2);
|
||||
expect(matches[0]!.candidate.id).toBe('high-match');
|
||||
expect(matches[0]!.score).toBeGreaterThan(matches[1]!.score);
|
||||
});
|
||||
});
|
||||
});
|
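The tier arithmetic these assertions lean on (email match ≈ 60 points, name-token match ≈ 20, `high` at ≥ 90) can be sketched as a standalone scorer. Everything below — the `Signals` shape, the exact weights, and the `scoreSignals` name — is an assumption reconstructed from the test comments, not the project's actual `findClientMatches` implementation.

```typescript
// Hypothetical additive scorer mirroring the weights the test comments
// imply. Weights and tier cut-offs are assumptions, not the real code.
type Tier = 'high' | 'medium' | 'low';

interface Signals {
  emailExact: boolean;          // shared normalized email
  phoneExact: boolean;          // shared E.164 phone
  nameExact: boolean;           // identical normalized full name
  phoneCountryDiffers: boolean; // negative evidence
}

function scoreSignals(s: Signals): { score: number; tier: Tier } {
  let score = 0;
  if (s.emailExact) score += 60;
  if (s.phoneExact) score += 40;
  if (s.nameExact) score += 20;
  if (s.phoneCountryDiffers) score -= 10; // pulls shared-inbox pairs below high
  const tier: Tier = score >= 90 ? 'high' : score >= 60 ? 'medium' : 'low';
  return { score, tier };
}

// Shared email + same name, but FR vs US phones: 60 + 20 - 10 = 70 → medium,
// i.e. review queue rather than auto-merge.
console.log(scoreSignals({
  emailExact: true,
  phoneExact: false,
  nameExact: true,
  phoneCountryDiffers: true,
}));
```

The additive form with subtractive negative evidence is what lets a single strong signal (shared email) surface a pair without ever forcing a merge on its own.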
213 tests/unit/dedup/migration-transform.test.ts Normal file
@@ -0,0 +1,213 @@
/**
 * Migration transform — fixture-based regression test.
 *
 * Feeds the transform a small frozen NocoDB snapshot containing one
 * representative row from each duplicate pattern documented in
 * docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §1.2,
 * and asserts the resulting plan matches the algorithm's expected
 * behavior. If any future change starts merging Pattern F (Etiennette
 * Clamouze) or stops merging Pattern A (Deepak Ramchandani), this
 * test fails immediately.
 */
import { describe, expect, it } from 'vitest';

import { transformSnapshot } from '@/lib/dedup/migration-transform';
import type { NocoDbRow, NocoDbSnapshot } from '@/lib/dedup/nocodb-source';

function row(fields: Partial<NocoDbRow> & { Id: number }): NocoDbRow {
  return fields as NocoDbRow;
}

const FIXTURE: NocoDbSnapshot = {
  fetchedAt: '2026-05-03T12:00:00.000Z',
  berths: [],
  residentialInterests: [],
  websiteInterestSubmissions: [],
  websiteContactFormSubmissions: [],
  websiteBerthEoiSupplements: [],
  interests: [
    // Pattern A: pure double-submit (Deepak Ramchandani #624/#625)
    row({
      Id: 624,
      'Full Name': 'Deepak Ramchandani',
      'Email Address': 'dannyrams8888@gmail.com',
      'Phone Number': '+17215868888',
      'Sales Process Level': 'General Qualified Interest',
    }),
    row({
      Id: 625,
      'Full Name': 'Deepak Ramchandani',
      'Email Address': 'dannyrams8888@gmail.com',
      'Phone Number': '+17215868888',
      'Sales Process Level': 'General Qualified Interest',
    }),

    // Pattern B: phone format variance (Howard Wiarda #236/#536)
    row({
      Id: 236,
      'Full Name': 'Howard Wiarda',
      'Email Address': 'hwiarda@hotmail.com',
      'Phone Number': '574-274-0548',
      'Place of Residence': 'USA',
      'Sales Process Level': 'General Qualified Interest',
    }),
    row({
      Id: 536,
      'Full Name': 'Howard Wiarda',
      'Email Address': 'hwiarda@hotmail.com',
      'Phone Number': '+15742740548',
      'Sales Process Level': 'General Qualified Interest',
    }),

    // Pattern C: name capitalization (Nicolas Ruiz #681/#682/#683 — three rows)
    row({
      Id: 681,
      'Full Name': 'Nicolas Ruiz',
      'Email Address': 'ruiz.nicolas@ufl.edu',
      'Phone Number': '+17862006617',
      'Sales Process Level': 'General Qualified Interest',
    }),
    row({
      Id: 682,
      'Full Name': 'Nicolas Ruiz',
      'Email Address': 'ruiz.nicolas@ufl.edu',
      'Phone Number': '+17862006617',
      'Sales Process Level': 'Specific Qualified Interest',
    }),
    row({
      Id: 683,
      'Full Name': 'Nicolas Ruiz',
      'Email Address': 'Ruiz.Nicolas@ufl.edu',
      'Phone Number': '+17862006617',
      'Sales Process Level': 'General Qualified Interest',
    }),

    // Pattern E: surname typo with same email + phone (Constanzo/Costanzo)
    row({
      Id: 336,
      'Full Name': 'Gianfranco Di Costanzo',
      'Email Address': 'gdc@nauticall.com',
      'Phone Number': '+17542628669',
      'Yacht Name': 'GEMINI',
      'Sales Process Level': 'Contract Signed',
    }),
    row({
      Id: 585,
      'Full Name': 'Gianfranco Di Constanzo',
      'Email Address': 'gdc@nauticall.com',
      'Phone Number': '+17542628669',
      'Yacht Name': 'CALYPSO',
      'Sales Process Level': 'Signed EOI and NDA',
    }),

    // Pattern F: same name, different country phones (Etiennette Clamouze)
    row({
      Id: 188,
      'Full Name': 'Etiennette Clamouze',
      'Email Address': 'clamouze.etiennette@gmail.com',
      'Phone Number': '+33767780640',
      'Sales Process Level': 'General Qualified Interest',
    }),
    row({
      Id: 717,
      'Full Name': 'Etiennette Clamouze',
      'Email Address': 'Etiennette@the-manoah.com',
      'Phone Number': '+12645815607',
      'Sales Process Level': 'General Qualified Interest',
    }),

    // Single isolated row to verify non-duplicates pass through
    row({
      Id: 999,
      'Full Name': 'Lone Wolf',
      'Email Address': 'lone@example.com',
      'Phone Number': '+15551234567',
      'Sales Process Level': 'General Qualified Interest',
    }),
  ],
};

describe('transformSnapshot — fixture regression', () => {
  it('produces the expected number of clients + interests', () => {
    const plan = transformSnapshot(FIXTURE);

    // 12 input rows → 7 unique clients (Deepak: 1, Wiarda: 1, Ruiz: 1,
    // Constanzo: 1, Etiennette x2: 2, Lone: 1). Etiennette stays as 2
    // because Pattern F is correctly NOT auto-merged.
    expect(plan.stats.outputClients).toBe(7);
    expect(plan.stats.outputInterests).toBe(12); // one per source row
  });

  it('auto-links every Pattern A–E cluster', () => {
    const plan = transformSnapshot(FIXTURE);
    const linkedSourceIds = new Set<number>();
    for (const link of plan.autoLinks) {
      linkedSourceIds.add(link.leadSourceId);
      for (const merged of link.mergedSourceIds) {
        linkedSourceIds.add(merged);
      }
    }

    // Pattern A: 624 + 625
    expect(linkedSourceIds.has(624) && linkedSourceIds.has(625)).toBe(true);
    // Pattern B: 236 + 536
    expect(linkedSourceIds.has(236) && linkedSourceIds.has(536)).toBe(true);
    // Pattern C: 681 + 682 + 683 (three-way)
    expect(linkedSourceIds.has(681) && linkedSourceIds.has(682) && linkedSourceIds.has(683)).toBe(
      true,
    );
    // Pattern E: 336 + 585
    expect(linkedSourceIds.has(336) && linkedSourceIds.has(585)).toBe(true);
  });

  it('does NOT auto-link Pattern F (Etiennette Clamouze, different country)', () => {
    const plan = transformSnapshot(FIXTURE);
    const linkedSourceIds = new Set<number>();
    for (const link of plan.autoLinks) {
      linkedSourceIds.add(link.leadSourceId);
      for (const merged of link.mergedSourceIds) {
        linkedSourceIds.add(merged);
      }
    }
    // Both Etiennette rows must remain as separate clients.
    expect(linkedSourceIds.has(188)).toBe(false);
    expect(linkedSourceIds.has(717)).toBe(false);
  });

  it('preserves every interest as its own row even when clients merge', () => {
    const plan = transformSnapshot(FIXTURE);
    const sourceIds = plan.interests.map((i) => i.sourceId).sort((a, b) => a - b);
    expect(sourceIds).toEqual([188, 236, 336, 536, 585, 624, 625, 681, 682, 683, 717, 999]);
  });

  it('maps the legacy 8-stage enum to new pipeline stages', () => {
    const plan = transformSnapshot(FIXTURE);
    const stagesById = new Map(plan.interests.map((i) => [i.sourceId, i.pipelineStage]));
    expect(stagesById.get(681)).toBe('open'); // General Qualified Interest
    expect(stagesById.get(682)).toBe('details_sent'); // Specific Qualified Interest
    expect(stagesById.get(336)).toBe('contract_signed'); // Contract Signed
    expect(stagesById.get(585)).toBe('eoi_signed'); // Signed EOI and NDA
  });

  it('attaches different yachts to one merged Constanzo client', () => {
    const plan = transformSnapshot(FIXTURE);
    const constanzoClient = plan.clients.find(
      (c) => c.sourceIds.includes(336) && c.sourceIds.includes(585),
    );
    expect(constanzoClient).toBeDefined();
    const yachtsForConstanzo = plan.interests
      .filter((i) => i.clientTempId === constanzoClient!.tempId)
      .map((i) => i.yachtName)
      .sort();
    expect(yachtsForConstanzo).toEqual(['CALYPSO', 'GEMINI']);
  });

  it('produces deterministic output (same input → same plan)', () => {
    // The transform is pure — running it twice should yield bit-identical
    // results. Catches order-dependent bugs in the dedup clustering.
    const a = transformSnapshot(FIXTURE);
    const b = transformSnapshot(FIXTURE);
    expect(JSON.stringify(a.stats)).toBe(JSON.stringify(b.stats));
    expect(a.autoLinks.length).toBe(b.autoLinks.length);
  });
});
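The fixture expectations above (12 rows collapsing to 7 clients, case-variant emails merging) come down to a clustering pass over normalized identity keys. The sketch below is a deliberately simplified, email-only stand-in — the real `transformSnapshot` also blocks on phone and surname-token evidence, so the `clusterByEmail` name and row shape here are illustrative assumptions, not the project's API.

```typescript
// Toy clustering pass: rows sharing a case-insensitively normalized email
// collapse into one client cluster. Email-only on purpose — the real
// transform also uses phone + surname-token evidence.
interface LegacyRow {
  id: number;
  email: string;
}

function clusterByEmail(rows: LegacyRow[]): LegacyRow[][] {
  const byEmail = new Map<string, LegacyRow[]>();
  for (const r of rows) {
    const key = r.email.trim().toLowerCase(); // "Ruiz.Nicolas@…" === "ruiz.nicolas@…"
    const bucket = byEmail.get(key) ?? [];
    bucket.push(r);
    byEmail.set(key, bucket);
  }
  return [...byEmail.values()];
}

// Pattern C's case-variant rows merge; the lone row stays its own cluster.
const clusters = clusterByEmail([
  { id: 681, email: 'ruiz.nicolas@ufl.edu' },
  { id: 683, email: 'Ruiz.Nicolas@ufl.edu' },
  { id: 999, email: 'lone@example.com' },
]);
console.log(clusters.length); // 2
```

Because `Map` preserves insertion order, a pass like this is also deterministic for a fixed input order — the property the "same input → same plan" test pins down.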
270 tests/unit/dedup/normalize.test.ts Normal file
@@ -0,0 +1,270 @@
/**
 * Normalization library — unit tests.
 *
 * Every fixture here comes from real dirty values observed in the legacy
 * NocoDB Interests table during the 2026-05-03 audit (see
 * docs/superpowers/specs/2026-05-03-dedup-and-migration-design.md §1.3).
 * The point is regression-prevention: if any of these patterns ever
 * stops normalizing the way it should, dedup quality silently drops.
 */
import { describe, expect, it } from 'vitest';

import {
  normalizeName,
  normalizeEmail,
  normalizePhone,
  resolveCountry,
} from '@/lib/dedup/normalize';

describe('normalizeName', () => {
  it('returns null fields for empty / null input', () => {
    expect(normalizeName('')).toEqual({ display: '', normalized: '', surnameToken: undefined });
    expect(normalizeName(' ')).toEqual({
      display: '',
      normalized: '',
      surnameToken: undefined,
    });
  });

  it('trims leading/trailing whitespace', () => {
    expect(normalizeName(' Marcus Laurent ')).toMatchObject({
      display: 'Marcus Laurent',
      normalized: 'marcus laurent',
    });
  });

  it('collapses repeated internal whitespace to a single space', () => {
    // From real data: "Arthur  Matthews" (#183), "Corinne  Roche" (#208).
    expect(normalizeName('Arthur  Matthews').display).toBe('Arthur Matthews');
    expect(normalizeName('Corinne  Roche').display).toBe('Corinne Roche');
  });

  it('replaces embedded carriage returns and newlines with single spaces', () => {
    // From real data: "Andrei \nVAGNANOV" (#178), "Daniel\r PRZEDBORSKI" (#175).
    expect(normalizeName('Andrei \nVAGNANOV').display).toBe('Andrei Vagnanov');
    expect(normalizeName('Daniel\r PRZEDBORSKI').display).toBe('Daniel Przedborski');
  });

  it('title-cases ALL-CAPS surnames while keeping given name title-cased', () => {
    // From real data: "Jona ANDERSEN" (#232), "Duane SALTSGAVER" (#227),
    // "Marcos DALLA PRIA" (#165).
    expect(normalizeName('Jona ANDERSEN').display).toBe('Jona Andersen');
    expect(normalizeName('Duane SALTSGAVER').display).toBe('Duane Saltsgaver');
    // Particle 'dalla' stays lowercase mid-name.
    expect(normalizeName('Marcos DALLA PRIA').display).toBe('Marcos dalla Pria');
  });

  it('title-cases lowercased entries', () => {
    // From real data: "antony amaral" (#665), "david rosenbloom" (#239),
    // "john Tickner" (#247).
    expect(normalizeName('antony amaral').display).toBe('Antony Amaral');
    expect(normalizeName('david rosenbloom').display).toBe('David Rosenbloom');
    expect(normalizeName('john Tickner').display).toBe('John Tickner');
  });

  it('keeps Romance and Germanic particles lowercase mid-name', () => {
    // From real data: "Olav van Velsen" (#526), "Bruno Joyerot" (#18),
    // "OLIVIER DAIN" (#677). Also synthetic "Carla de la Cruz".
    expect(normalizeName('Olav van Velsen').display).toBe('Olav van Velsen');
    expect(normalizeName('Carla de la Cruz').display).toBe('Carla de la Cruz');
    expect(normalizeName('OLIVIER DAIN').display).toBe('Olivier Dain');
  });

  it("preserves O'-prefixed Irish surnames as title-case", () => {
    expect(normalizeName("liam o'brien").display).toBe("Liam O'Brien");
  });

  it('keeps the slash-with-company structure intact', () => {
    // From real data: "Daniel Wainstein / 7 Knots, LLC" (#637),
    // "Bruno Joyerot / SAS TIKI" (#18).
    expect(normalizeName('Daniel Wainstein / 7 Knots, LLC').display).toBe(
      'Daniel Wainstein / 7 Knots, LLC',
    );
    expect(normalizeName('Bruno Joyerot / SAS TIKI').display).toBe('Bruno Joyerot / SAS TIKI');
  });

  it('exposes the last non-particle token as surnameToken (lowercase) for blocking', () => {
    expect(normalizeName('Marcus Laurent').surnameToken).toBe('laurent');
    expect(normalizeName('Olav van Velsen').surnameToken).toBe('velsen');
    expect(normalizeName('Carla de la Cruz').surnameToken).toBe('cruz');
    expect(normalizeName("Liam O'Brien").surnameToken).toBe("o'brien");
  });

  it('handles single-token names — surnameToken is the only token', () => {
    expect(normalizeName('Madonna').surnameToken).toBe('madonna');
  });

  it('produces a normalized form that is always lowercase', () => {
    expect(normalizeName('Andrei VAGNANOV').normalized).toBe('andrei vagnanov');
    expect(normalizeName('Daniel Wainstein / 7 Knots, LLC').normalized).toBe(
      'daniel wainstein / 7 knots, llc',
    );
  });
});

describe('normalizeEmail', () => {
  it('returns null for empty / null inputs', () => {
    expect(normalizeEmail('')).toBeNull();
    expect(normalizeEmail(' ')).toBeNull();
  });

  it('lowercases and trims', () => {
    // From real data: "Arthur@laser-align.com" vs "arthur@laser-align.com" (#183/#686).
    expect(normalizeEmail('Arthur@laser-align.com')).toBe('arthur@laser-align.com');
    expect(normalizeEmail(' marcus@example.com ')).toBe('marcus@example.com');
  });

  it('lowercases capitalized localparts', () => {
    // From real data: "Bmalone850@gmail.com" (#489), "Hef355@yahoo.com" (#533),
    // "Donclaytonmusic@gmail.com" (#679).
    expect(normalizeEmail('Bmalone850@gmail.com')).toBe('bmalone850@gmail.com');
    expect(normalizeEmail('Hef355@yahoo.com')).toBe('hef355@yahoo.com');
  });

  it('preserves plus-aliases — both legitimate and tricks', () => {
    // Per design §3.2: "+aliases" are not stripped. Compare by full localpart.
    expect(normalizeEmail('marcus+sales@example.com')).toBe('marcus+sales@example.com');
  });

  it('returns null for invalid email shapes', () => {
    expect(normalizeEmail('not-an-email')).toBeNull();
    expect(normalizeEmail('@example.com')).toBeNull();
    expect(normalizeEmail('user@')).toBeNull();
    expect(normalizeEmail('user@.com')).toBeNull();
  });
});

describe('normalizePhone', () => {
  it('returns null for empty / whitespace / null', () => {
    expect(normalizePhone('', 'AI')).toBeNull();
    expect(normalizePhone(' ', 'AI')).toBeNull();
  });

  it('parses a plain E.164 number', () => {
    expect(normalizePhone('+15742740548', 'US')).toMatchObject({
      e164: '+15742740548',
      country: 'US',
    });
  });

  it('strips embedded carriage returns and trailing whitespace', () => {
    // From real data: "+1-264-235-8840\r" (#19), "+1-264-772-3272\r" (#20).
    const out = normalizePhone('+1-264-235-8840\r', 'AI');
    expect(out?.e164).toBe('+12642358840');
  });

  it('strips dashes, dots, parens, single quotes, spaces in a single pass', () => {
    // From real data: "'+1.214.603.4235" (#205), "574-274-0548" (#236),
    // "+1-264-235-8840" (#19), "+1 (212) 555-0123" (synthetic).
    expect(normalizePhone("'+1.214.603.4235", 'US')?.e164).toBe('+12146034235');
    expect(normalizePhone('574-274-0548', 'US')?.e164).toBe('+15742740548');
    expect(normalizePhone('+1 (212) 555-0123', 'US')?.e164).toBe('+12125550123');
  });

  it('converts a leading 00 prefix to + (international dialling)', () => {
    // From real data: "00447956657022" (#216), "0033651381036" (#702).
    expect(normalizePhone('00447956657022', 'GB')?.e164).toBe('+447956657022');
    expect(normalizePhone('0033651381036', 'FR')?.e164).toBe('+33651381036');
  });

  it('uses defaultCountry when input has no international prefix', () => {
    // From real data: "0690699699" (#203, French local), "0651381036" (#701).
    expect(normalizePhone('0690699699', 'FR')?.e164).toBe('+33690699699');
    expect(normalizePhone('0651381036', 'FR')?.e164).toBe('+33651381036');
  });

  it('returns null when there is no prefix AND no defaultCountry', () => {
    // The migration script flags these for human review.
    const out = normalizePhone('5742740548');
    expect(out?.e164 ?? null).toBeNull();
  });

  it('flags placeholder all-zeros numbers and returns null', () => {
    // From real data: "+447000000000" (#641, "Milos Vitkovic" — clearly fake).
    const out = normalizePhone('+447000000000', 'GB');
    expect(out?.flagged).toBe('placeholder');
    expect(out?.e164).toBeNull();
  });

  it('flags multi-number fields and uses the first segment', () => {
    // From real data: "0677580750/0690511494" (#209). Other separators: ; ,
    const slash = normalizePhone('0677580750/0690511494', 'FR');
    expect(slash?.flagged).toBe('multi_number');
    expect(slash?.e164).toBe('+33677580750');

    const semi = normalizePhone('+33611111111;+33622222222', 'FR');
    expect(semi?.flagged).toBe('multi_number');
    expect(semi?.e164).toBe('+33611111111');
  });

  it('flags genuinely unparseable input as `unparseable`', () => {
    const out = normalizePhone('xyz-not-a-phone', 'US');
    expect(out?.flagged).toBe('unparseable');
    expect(out?.e164).toBeNull();
  });

  it('strips an apostrophe-prefix without breaking the parse', () => {
    // From real data: leading "'" copy-pasted from spreadsheets escapes
    // numeric-cell coercion. Should be invisible to dedup.
    expect(normalizePhone("'0690699699", 'FR')?.e164).toBe('+33690699699');
  });

  it('returns the country alongside the E.164 form', () => {
    expect(normalizePhone('+33690699699', 'FR')).toMatchObject({
      e164: '+33690699699',
      country: 'FR',
    });
  });
});

describe('resolveCountry', () => {
  it('returns null for empty / nullish input', () => {
    expect(resolveCountry('')).toEqual({ iso: null, confidence: null });
    expect(resolveCountry(' ')).toEqual({ iso: null, confidence: null });
  });

  it('exact-matches a canonical English country name', () => {
    expect(resolveCountry('Anguilla')).toEqual({ iso: 'AI', confidence: 'exact' });
    expect(resolveCountry('United Kingdom')).toEqual({ iso: 'GB', confidence: 'exact' });
    expect(resolveCountry('United States')).toEqual({ iso: 'US', confidence: 'exact' });
  });

  it('matches case-insensitively', () => {
    expect(resolveCountry('anguilla').iso).toBe('AI');
    expect(resolveCountry('UNITED KINGDOM').iso).toBe('GB');
  });

  it('matches values with surrounding whitespace', () => {
    expect(resolveCountry(' United States ').iso).toBe('US');
  });

  it('handles diacritic variants of Saint-Barthélemy', () => {
    // From real data: "Saint barthelemy" (#203), "St Barth" (#208), "Saint-Barthélemy".
    expect(resolveCountry('Saint-Barthélemy').iso).toBe('BL');
    expect(resolveCountry('Saint Barthelemy').iso).toBe('BL');
    expect(resolveCountry('saint barthelemy').iso).toBe('BL');
    expect(resolveCountry('St Barth').iso).toBe('BL');
  });

  it('resolves common abbreviations', () => {
    expect(resolveCountry('USA').iso).toBe('US');
    expect(resolveCountry('UK').iso).toBe('GB');
  });

  it('falls back to a city → country mapping for high-frequency cities', () => {
    // From real data: "Kansas City" (#198), "Sag Harbor Y" (#239).
    expect(resolveCountry('Kansas City').iso).toBe('US');
    expect(resolveCountry('Sag Harbor Y').iso).toBe('US');
  });

  it('marks the confidence tier appropriately', () => {
    expect(resolveCountry('Anguilla').confidence).toBe('exact');
    expect(resolveCountry('Kansas City').confidence).toBe('city');
  });

  it('returns null + null for unresolvable values', () => {
    // Migration script flags these for human review rather than guessing.
    expect(resolveCountry('asdfghjkl xyz')).toEqual({ iso: null, confidence: null });
    expect(resolveCountry('Mars')).toEqual({ iso: null, confidence: null });
  });
});
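Several of the phone cases exercised above (spreadsheet apostrophes, embedded carriage returns, punctuation, 00-prefixes) are plain string pre-cleaning that happens before any real parsing. Below is a minimal sketch of that pre-clean step, assuming the real `normalizePhone` hands the cleaned string to a phone-parsing library afterwards; `precleanPhone` is an illustrative name, not the project's API.

```typescript
// Hypothetical pre-clean step: strip spreadsheet-escape apostrophes,
// embedded CR/LF, and visual punctuation, then convert a leading
// international 00 prefix to +. Country inference and validation (the
// part a library like libphonenumber would do) is intentionally omitted.
function precleanPhone(raw: string): string {
  let s = raw.replace(/[\r\n]/g, '').trim();
  if (s.startsWith("'")) s = s.slice(1);        // spreadsheet numeric-cell escape
  s = s.replace(/[\s().\-]/g, '');              // spaces, parens, dots, dashes
  if (s.startsWith('00')) s = '+' + s.slice(2); // 00 international prefix → +
  return s;
}

console.log(precleanPhone("'+1.214.603.4235"));  // "+12146034235"
console.log(precleanPhone('00447956657022'));    // "+447956657022"
console.log(precleanPhone('+1-264-235-8840\r')); // "+12642358840"
```

Doing this as a separate pure step keeps the flagging logic (placeholder, multi-number, unparseable) downstream and independently testable.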
@@ -142,7 +142,7 @@ describe('calculateInterestScore', () => {
       portId: 'p1',
       clientId: 'c1',
       createdAt: daysAgo(10),
-      pipelineStage: 'contract',
+      pipelineStage: 'contract_signed',
       eoiStatus: 'signed',
       contractStatus: 'signed',
       depositStatus: 'received',