78 Commits

Author SHA1 Message Date
Matt Ciaccio
ba89b61b3f fix(security): port-scope clientId/berthId/yachtId on interests + clientRelationships
Pass-6 findings — both MEDIUM-severity cross-tenant FK injections.

- interests.service: createInterest/updateInterest/linkBerth accepted
  clientId/berthId/yachtId from the request body without verifying the
  referenced row belongs to the caller's port. getInterestById joins
  clients/berths/yachtTags on these FKs without a port filter, so a
  port-A caller could splice a foreign-port id and surface that
  tenant's clientName, mooringNumber, or yacht ownership on read.
  New assertInterestFksInPort helper guards all three surfaces.

- clients.service.createRelationship: accepted clientBId from the
  body without a port check; the relationship list endpoint joins
  clients without filtering by port, so the foreign client's name
  + email would render in the relationships tab. Now verifies
  clientBId belongs to portId and rejects self-relationships.
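
A guard with this contract can be sketched as follows (a minimal in-memory model; names like `assertFksInPort` are illustrative, and the shipped helper resolves rows via Drizzle rather than a lookup callback):

```typescript
// Hypothetical sketch: every client-supplied FK must resolve to a row
// in the caller's port before it is persisted.
type Row = { id: string; portId: string };

function assertFksInPort(
  portId: string,
  refs: Array<{ fk: string | null; lookup: (id: string) => Row | undefined }>,
): void {
  for (const { fk, lookup } of refs) {
    if (fk == null) continue; // optional FK not supplied in the body
    const row = lookup(fk);
    // One error for "missing" and "foreign-port" alike, so the response
    // doesn't double as an existence oracle for other tenants' UUIDs.
    if (!row || row.portId !== portId) {
      throw new Error('referenced entity not found in port');
    }
  }
}
```

Returning the same error for unknown and foreign-port ids keeps the endpoint from leaking which UUIDs exist in other tenants.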

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 04:14:09 +02:00
Matt Ciaccio
4eea19a85b sec: lock down 5 cross-tenant FK gaps from fifth-pass review
1. HIGH — reminders.create/updateReminder accepted clientId/interestId/
   berthId from the body and persisted them with no port check; getReminder
   then hydrated the row via Drizzle relations (no port filter on the
   join), so a port-A user with reminders:create could exfiltrate any
   port-B client/interest/berth row by guessing its UUID. New
   assertReminderFksInPort gates create + update.

2. HIGH — listRecommendations(interestId, _portId) discarded portId
   entirely; the route GET /api/v1/interests/[id]/recommendations
   forwarded the URL id straight through. A port-A user with
   interests:view could read any other tenant's recommended berths
   (mooring numbers, dimensions, status). Service now verifies the
   interest belongs to portId and joins berths filtered by port.

3. HIGH — Berth waiting list. The PATCH route did not pre-check that
   the berth belonged to ctx.portId — a port-A user with
   manage_waiting_list could reorder a port-B berth's queue. Separately,
   updateWaitingList accepted arbitrary entries[].clientId and inserted
   them without verifying tenancy, polluting the table with foreign-port
   FKs. Both gaps closed.

4. MEDIUM — setEntityTags (clients/companies/yachts/interests/berths)
   accepted any tagId and inserted into the join table. The tags table
   is per-port but the join only carries a single-column FK. The
   downstream getById join `tags ON join.tag_id = tags.id` has no port
   filter, so a foreign tag's name + color render in the requesting port.
   Helper now batch-validates tagIds belong to portId before insert.

5. MEDIUM — /api/v1/custom-fields/[entityId] PUT had no withPermission
   gate (any role, including viewer, could write) and didn't validate
   that the URL entityId pointed at a port-scoped entity of the field
   definition's entityType. Route now uses
   withPermission('clients','view'/'edit',…); service validates the
   entityId per resolved entityType (client/interest/berth/yacht/company)
   against portId.

Test mocks updated to cover the new entity-port-scope check.
818 vitest tests pass.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 03:28:31 +02:00
Matt Ciaccio
47a1a51832 sec: webhook SSRF guard, IMAP-sync owner check, watcher port membership
Three findings from a fourth-pass review:

1. MEDIUM — webhook URL SSRF. The validator only enforced HTTPS+URL
   parse; it accepted private/loopback/link-local/.internal hosts. The
   delivery worker fetched arbitrary URLs and persisted up to 1KB of
   response body into webhook_deliveries.response_body, which is then
   surfaced via the deliveries listing endpoint — a port admin could
   register a webhook to an internal HTTPS endpoint, hit the test
   endpoint to force immediate dispatch, and read the response back.
   Validator now rejects RFC-1918/loopback/link-local/CGNAT/ULA IPs
   (v4 + v6) and .internal/.local/.localhost/.lan/.intranet/.corp
   suffixes; the worker re-resolves the hostname at dispatch time and
   blocks before fetch (DNS rebinding defense). 21-case unit test
   covers the matrix.

2. MEDIUM — POST /api/v1/email/accounts/[id]/sync had no owner check.
   Any user with email:view could enqueue an inbox-sync job for any
   accountId, which the worker would honour using the foreign user's
   decrypted IMAP credentials and advance the account's lastSyncAt
   (data-loss risk on the legitimate owner's next sync). Route now
   asserts account.userId === ctx.userId before enqueueing, matching
   the toggle/disconnect endpoints.

3. MEDIUM — addDocumentWatcher (and the wizard / upload watcher
   inserts) didn't validate the watcher's userId belonged to the
   document's port. notifyDocumentEvent then emitted a real-time
   socket toast + email containing the document title to the foreign
   user. New assertWatchersInPort helper verifies each candidate has
   a userPortRoles row for the port (super-admin bypass).

818 vitest tests pass.
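
The host-blocking rule from finding 1 can be sketched like this (an IPv4-only illustration with hypothetical names; the shipped validator also covers the v6 ranges and, per the commit, re-resolves DNS at dispatch time):

```typescript
// Illustrative SSRF host check: block literal IPv4 addresses in
// RFC-1918/loopback/link-local/CGNAT ranges and internal-looking
// DNS suffixes. Public DNS names still need a post-resolve re-check
// (DNS rebinding defense), which this sketch omits.
const BLOCKED_SUFFIXES = ['.internal', '.local', '.localhost', '.lan', '.intranet', '.corp'];

function ipv4ToInt(host: string): number | null {
  const m = host.match(/^(\d{1,3})\.(\d{1,3})\.(\d{1,3})\.(\d{1,3})$/);
  if (!m) return null;
  const parts = m.slice(1).map(Number);
  if (parts.some((p) => p > 255)) return null;
  return ((parts[0] << 24) | (parts[1] << 16) | (parts[2] << 8) | parts[3]) >>> 0;
}

function inRange(ip: number, base: number, maskBits: number): boolean {
  const mask = maskBits === 0 ? 0 : (~0 << (32 - maskBits)) >>> 0;
  return ((ip & mask) >>> 0) === ((base & mask) >>> 0);
}

function isBlockedHost(host: string): boolean {
  const h = host.toLowerCase();
  if (h === 'localhost' || BLOCKED_SUFFIXES.some((s) => h.endsWith(s))) return true;
  const ip = ipv4ToInt(h);
  if (ip === null) return false; // DNS name: must be re-checked after resolution
  return (
    inRange(ip, ipv4ToInt('10.0.0.0')!, 8) ||     // RFC-1918
    inRange(ip, ipv4ToInt('172.16.0.0')!, 12) ||  // RFC-1918
    inRange(ip, ipv4ToInt('192.168.0.0')!, 16) || // RFC-1918
    inRange(ip, ipv4ToInt('127.0.0.0')!, 8) ||    // loopback
    inRange(ip, ipv4ToInt('169.254.0.0')!, 16) || // link-local
    inRange(ip, ipv4ToInt('100.64.0.0')!, 10)     // CGNAT
  );
}
```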

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 03:15:39 +02:00
Matt Ciaccio
9a5479c2c7 sec: lock down socket.io room subscription + crm-invite cross-tenant ops
1. HIGH — Socket.IO accepted client-supplied `auth.portId` in the
   handshake without verifying the user actually held a role in that
   port, then unconditionally joined the socket to `port:${portId}`.
   The `join:entity` handler also skipped authorization. This let any
   authenticated CRM user receive realtime events from any other
   tenant: invoice numbers + totals + client names, document signer
   emails, registration events with full client name + berth, file
   uploads, etc. Auth middleware now resolves the user's
   userPortRoles (or isSuperAdmin) before honouring portId, and
   join:entity verifies the entity's port matches a port the user
   has access to. Pre-existing pre-branch issue but fixed here given
   the explicit "all data is extremely sensitive" directive.

2. MEDIUM — listCrmInvites issued a global SELECT with no port
   scope. The crm_user_invites table has no portId column (invites
   mint global better-auth users, then port roles are assigned
   later). The previous gating on per-port admin.manage_users let
   any director enumerate every other tenant's pending invitee
   emails + isSuperAdmin flags — a phishing target list and a
   super-admin onboarding timing oracle. Restrict GET (list),
   DELETE (revoke), and POST resend to ctx.isSuperAdmin.
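
The handshake check in finding 1 reduces to a rule like this sketch (hypothetical types; the real middleware reads userPortRoles from the database):

```typescript
// Hypothetical sketch: a client-supplied portId is an untrusted claim.
// Join the socket to the tenant room only after confirming the user
// actually holds a role in that port (or is a super-admin).
type User = { id: string; isSuperAdmin: boolean; portRoles: Set<string> };

function resolveRoomPort(user: User, requestedPortId: unknown): string {
  if (typeof requestedPortId !== 'string' || requestedPortId.length === 0) {
    throw new Error('portId required');
  }
  if (!user.isSuperAdmin && !user.portRoles.has(requestedPortId)) {
    throw new Error('no role in requested port');
  }
  return `port:${requestedPortId}`;
}
```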

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 03:00:55 +02:00
Matt Ciaccio
e06fb9545b sec: lock down 5 cross-tenant IDORs uncovered in second-pass review
1. HIGH — /api/v1/admin/ports/[id] PATCH+GET let any port-admin
   (manage_settings) mutate any other tenant's port row by passing the
   foreign id in the path. Now non-super-admins must target their own
   ctx.portId; listPorts and createPort are super-admin only.

2. HIGH — Invoice create/update accepted arbitrary expenseIds and
   linked them into invoice_expenses with no port check; the GET
   response then re-emitted those foreign expense rows via the
   linkedExpenses join. assertExpensesInPort now validates each id
   belongs to the caller's portId before insert; getInvoiceById's
   join filters by expenses.portId as defense-in-depth.

3. HIGH — Document creation paths (createDocument, createFromWizard,
   createFromUpload) persisted user-supplied clientId/interestId/
   companyId/yachtId/reservationId without verifying those FKs were
   in-port. sendForSigning then loaded the foreign client/interest by
   id alone and pushed their PII into the Documenso payload. New
   assertSubjectFksInPort helper rejects out-of-port FKs at create
   time; sendForSigning's interest+client lookups now also filter by
   portId.

4. MEDIUM — calculateInterestScore read its redis cache before
   verifying portId, and the cache key was interestId-only — a
   foreign-port caller could observe a cached score breakdown.
   Cache key now includes portId, and the port-scope DB lookup runs
   before any cache.get.

5. MEDIUM — AI email-draft job results were retrievable by anyone who
   could guess the BullMQ jobId (default sequential integers). Job
   ids are now random UUIDs, requestEmailDraft validates interestId/
   clientId belong to ctx.portId before enqueueing, the worker's
   client lookup is port-scoped, and getEmailDraftResult requires
   the caller to match the original requester's userId+portId before
   returning the drafted subject/body.

The interest-scoring unit test that asserted "DB is bypassed on cache
hit" is updated to reflect the new (security-correct) ordering.
Two new regression test files cover the email-draft binding (5 tests).
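
The ordering fix in finding 4 can be illustrated with a synchronous sketch (hypothetical names; the Redis and Drizzle calls are stubbed as plain functions):

```typescript
// Sketch of the two fixes: the tenant id is baked into the cache key,
// and the port-scope assertion runs before any cache read, so a cached
// value can never be served cross-tenant.
function scoreCacheKey(portId: string, interestId: string): string {
  return `interest-score:${portId}:${interestId}`;
}

function getScore(
  portId: string,
  interestId: string,
  deps: {
    assertInterestInPort: (portId: string, interestId: string) => void; // throws if foreign
    cacheGet: (key: string) => number | undefined;
    cacheSet: (key: string, score: number) => void;
    compute: (interestId: string) => number;
  },
): number {
  deps.assertInterestInPort(portId, interestId); // DB gate first, always
  const key = scoreCacheKey(portId, interestId);
  const cached = deps.cacheGet(key);
  if (cached !== undefined) return cached;
  const score = deps.compute(interestId);
  deps.cacheSet(key, score);
  return score;
}
```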

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:48:43 +02:00
Matt Ciaccio
4c5334d471 sec: gate super-admin invite minting, OCR settings, and alert mutations
Three findings from the branch security review:

1. HIGH — Privilege escalation via super-admin invite. POST
   /api/v1/admin/invitations was gated only by manage_users (held by the
   port-scoped director role). The body schema accepted isSuperAdmin
   from the request, createCrmInvite persisted it verbatim, and
   consumeCrmInvite copied it into userProfiles.isSuperAdmin — granting
   the new account cross-tenant access. Now the route rejects
   isSuperAdmin=true unless ctx.isSuperAdmin, and createCrmInvite
   requires invitedBy.isSuperAdmin as defense-in-depth.

2. HIGH — Receipt-image exfiltration via OCR settings. The route
   /api/v1/admin/ocr-settings (and the sibling /test) were wrapped only
   in withAuth — any port role including viewer could PUT a swapped
   provider apiKey + flip aiEnabled, redirecting every subsequent
   receipt scan to attacker infrastructure. Both are now wrapped in
   withPermission('admin','manage_settings',…) matching the sibling
   admin routes (ai-budget, settings).

3. MEDIUM — Cross-tenant alert IDOR. dismissAlert / acknowledgeAlert
   issued UPDATE … WHERE id=? with no portId predicate. Any
   authenticated user with a foreign alert UUID could mutate it. Both
   service functions now require portId and add it to the WHERE; the
   route handlers pass ctx.portId.

The dev-trigger-crm-invite script passes a synthetic super-admin caller
identity since it runs out-of-band.

The two public-form tests randomize their IP prefix per run so a fresh
test process doesn't collide with leftover redis sliding-window entries
from a prior run (publicForm limiter pexpires after 1h).

Two new regression test files cover the fixes (6 tests).
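
The escalation gate in finding 1 can be sketched as follows (hypothetical names; the real route also relies on createCrmInvite's defense-in-depth check):

```typescript
// Sketch: honour the isSuperAdmin flag from the request body only when
// the caller is already a super-admin; otherwise reject outright rather
// than silently downgrade.
function gateInviteFlags(
  caller: { isSuperAdmin: boolean },
  body: { isSuperAdmin?: boolean },
): { isSuperAdmin: boolean } {
  if (body.isSuperAdmin && !caller.isSuperAdmin) {
    throw new Error('only super-admins may mint super-admin invites');
  }
  return { isSuperAdmin: body.isSuperAdmin === true };
}
```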

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:27:01 +02:00
Matt Ciaccio
61e40b5e76 chore(ops): split /api/health (liveness) from /api/ready (readiness)
Previously /api/health did deep dependency probes (postgres + redis +
minio) and 503'd on any failure. That's readiness behavior, not
liveness — a transient Redis/MinIO blip would tell the orchestrator to
restart the pod when it should only be dropped from the load balancer.

Make /api/health a thin liveness check (it returns 200 whenever the
process can respond at all) and move the deep checks to a new
/api/ready endpoint with the canonical Kubernetes-style 200/503
contract. Docker-compose healthchecks keep pointing at /api/health,
which is now more conservative (no false-positive container restarts).

Documenso/SMTP are intentionally not probed in /api/ready: each tenant
configures its own credentials, and one tenant's misconfiguration
shouldn't flip the entire shared CRM to unready.
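
The split can be sketched as follows (hypothetical handler shapes, with the dependency probes stubbed as synchronous functions):

```typescript
// Liveness answers 200 whenever the process can run code at all;
// readiness runs the deep dependency probes and returns the
// Kubernetes-style 200/503 contract.
type Probe = () => boolean; // real probes ping postgres/redis/minio

function health(): { status: number } {
  return { status: 200 }; // liveness: responding is the whole contract
}

function ready(probes: Record<string, Probe>): { status: number; checks: Record<string, boolean> } {
  const checks: Record<string, boolean> = {};
  for (const [name, probe] of Object.entries(probes)) {
    try { checks[name] = probe(); } catch { checks[name] = false; }
  }
  return { status: Object.values(checks).every(Boolean) ? 200 : 503, checks };
}
```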

Also tighten the gdpr-bundle-builder casts: replace the scattered
`as unknown as Record<string, unknown>` double-casts with a small
`toJsonRow<T>()` helper that performs the narrow→wide widening in one
place with a single cast hop instead of two.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:03:10 +02:00
Matt Ciaccio
7f9d90ad05 fix(gdpr): cap export-bundle size at 50MB before upload
Article-15 bundles are JSON+HTML only (no receipts/contracts), so even
heavy clients land at <1 MB. Anything larger almost certainly indicates
an unbounded relation we forgot to cap. Fail the worker job before
uploading rather than push a runaway blob to MinIO + email the client a
download link of mystery size.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:00:16 +02:00
Matt Ciaccio
5d29bfc153 refactor(services): centralize AuditMeta + transactional setEntityTags helper
The same `interface AuditMeta { userId; portId; ipAddress; userAgent }`
was duplicated in 26 service files. Move the canonical definition into
`@/lib/audit` next to the related types and update every service to
import it. `ServiceAuditMeta` (the alias used in invoices.ts and
expenses.ts) collapses into the same name.

Tag CRUD across clients/companies/yachts/interests/berths followed an
identical wipe-then-rewrite recipe with two latent issues: the delete
and insert weren't wrapped in a transaction (a partial failure left
the entity with zero tags) and the audit-log payload shape diverged
(`newValue: { tagIds }` for clients/yachts/companies but
`metadata: { type: 'tags_updated', tagIds }` for interests/berths).

Extract a `setEntityTags` helper in `entity-tags.helper.ts` that performs
delete+insert inside a single transaction, normalizes the audit payload
to `newValue: { tagIds }`, and dispatches the per-entity socket event
through a switch so `ServerToClientEvents` typing stays intact.

The five `setXTags(...)` service functions now do parent-row tenant
verification and delegate the join-table work + side effects.
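
The helper's contract can be sketched like this (illustrative names; the real code runs a Drizzle transaction and also dispatches the per-entity socket event, both elided here):

```typescript
// Sketch: the wipe and rewrite share one transaction so a partial
// failure can't leave the entity with zero tags, and every entity type
// emits the same normalized audit payload.
type TagTx = {
  deleteTagsFor: (entityId: string) => void;
  insertTags: (entityId: string, tagIds: string[]) => void;
};

function setEntityTags(
  withTransaction: <T>(fn: (tx: TagTx) => T) => T,
  writeAudit: (payload: { newValue: { tagIds: string[] } }) => void,
  entityId: string,
  tagIds: string[],
): void {
  withTransaction((tx) => {
    tx.deleteTagsFor(entityId);      // wipe...
    tx.insertTags(entityId, tagIds); // ...and rewrite, atomically
  });
  writeAudit({ newValue: { tagIds } }); // one payload shape for all five entities
}
```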

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 01:58:42 +02:00
Matt Ciaccio
43f68ca093 chore(hardening): maintenance jobs, defense-in-depth, redis-backed public rate limit
- maintenance worker now expires GDPR export bundles (db row + MinIO object)
  on the gdpr_exports.expires_at boundary, plus 90-day retention sweep on
  ai_usage_ledger; both jobs scheduled daily.
- portId scoping added to listClientRelationships and listClientExports
  (defense-in-depth — parent-resource gates already prevent cross-tenant
  reads, but service layer should enforce on its own).
- SELECT FOR UPDATE on parent client/company row inside add/update address
  transactions to serialize concurrent isPrimary toggles.
- public /interests + /residential-inquiries endpoints swap their
  in-memory ipHits maps for the redis sliding-window limiter via the
  new rateLimiters.publicForm config (5/hr/IP), so the cap survives
  restarts and is shared across worker processes.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 01:52:41 +02:00
Matt Ciaccio
d9557edfc5 docs(spec): GWS inbox-triage exploratory design (not approved for build)
Surveys what it actually takes to ship the AI inbox-triage feature
gated on Google Workspace integration. Walks through three deployment
models with their real costs:

- Model A (Marketplace OAuth app): 4-6 months calendar, $15k-$75k for
  the required CASA security assessment, recurring re-verification
- Model B (per-customer Internal OAuth app): ~5 weeks engineering, $0
  Google-side, scoped to one workspace per customer
- Model C (forward-to-CRM mailbox): ~1 week, receive-only, no reply
  drafts possible

Recommends Model B for the current customer profile, with B → A
promotion only if 3+ customers ask unprompted.

Documents what's already scaffolded (email_accounts/threads/messages
tables, syncInbox stub, BullMQ email queue, ai_usage_ledger, per-port
aiEnabled flag, withRateLimit('ai')) vs what's new (OAuth flow, Pub/Sub
push receiver, gws_user_tokens + email_triage tables, /inbox UI).

End-to-end flow, schema additions, AI cost interaction with the
Phase 3b token budgets, 5-phase build plan (G1-G5), and 5 open
decisions to resolve before scheduling the build. Explicitly out of
scope: M365, sentiment analysis, smart-drafts, cross-staff triage
queue.

No code changes — this is a design doc to drive a go/no-go decision.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 01:18:15 +02:00
Matt Ciaccio
6eb0d3dc92 docs(ops): backup/restore + email deliverability runbooks
Two new runbooks under docs/runbooks/ plus the automation scripts the
backup runbook references. Both are written so an operator who has only
the off-site backup credentials and the runbook can recover the system
unaided.

Backup/restore (Phase 4a):
- docs/runbooks/backup-and-restore.md — covers what gets backed up
  (Postgres / MinIO / .env+ENCRYPTION_KEY), schedule (hourly DB +
  hourly MinIO mirror, 7-day hourly + 30-day daily retention),
  cold-restore procedure with row-count verification, weekly drill
- scripts/backup/pg-backup.sh — pg_dump → gzip → optional GPG → mc
  upload, fails loud
- scripts/backup/minio-mirror.sh — incremental mc mirror, no --remove
  flag so accidental deletes on the live bucket can't cascade
- scripts/backup/restore.sh — interactive prod restore + --drill mode
  that runs against a sandbox DB and diffs row counts

Email deliverability (Phase 4b):
- docs/runbooks/email-deliverability.md — what the CRM sends, DNS
  records (SPF/DKIM/DMARC/MX), per-port override implications,
  diagnosis flow ("didn't arrive" → 4-step checklist starting with
  EMAIL_REDIRECT_TO), provider migration plan, realapi suite as the
  end-to-end probe

Tests still 778/778 vitest, tsc/lint clean — these phases are docs +
shell scripts, no code changes.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 20:10:30 +02:00
Matt Ciaccio
a3305a94f3 feat(gdpr): staff-triggered client-data export bundle (Article 15)
Adds a full GDPR Article 15 (right of access) workflow. Staff trigger
an export from the client detail; a BullMQ worker assembles every row
keyed to that client (profile, contacts, addresses, notes, tags,
yachts, company memberships, interests, reservations, invoices,
documents, last 500 audit events) into JSON + a self-contained HTML
report, ZIPs them, uploads to MinIO, and optionally emails the client
a 7-day signed download link.

- New table gdpr_exports tracks lifecycle (pending → building → ready
  → sent / failed) with a 30-day cleanup target
- Bundle builder (gdpr-bundle-builder.ts) — pure read-side, tenant-
  scoped, with HTML escaping to block injection from rogue field values
- Worker hook in export queue dispatches on job name 'gdpr-export'
- New audit actions: 'request_gdpr_export', 'send_gdpr_export'
- API: POST/GET /api/v1/clients/:id/gdpr-export (admin-gated, exports
  rate-limit, Article-15 audit on POST); GET /:exportId returns a
  fresh signed URL
- UI: <GdprExportButton> dialog on client detail header — admin-only,
  shows recent exports, supports email-to-client + override recipient,
  polls every 5s while open
- Validation: refuses email-to-client when no primary email + no
  override (rather than silently dropping the send)

Tests: 778/778 vitest (was 771) — +7 covering builder happy path,
HTML escaping, tenant isolation, empty client, request-flow validation,
and audit / queue interaction.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 20:06:31 +02:00
Matt Ciaccio
9dfa04094b feat(rate-limit): per-user limiters for OCR, AI, and exports
Adds three named rate limiters to the existing Redis sliding-window
catalog and a withRateLimit wrapper that composes inside withAuth.
Wires the OCR limiter into the receipt-scan endpoint so a runaway
client can't burn through the AI budget in a tight loop.

- ocr: 10/min/user
- ai: 60/min/user (reserved for future server-side AI surfaces)
- exports: 30/hour/user (reserved for GDPR bundle, PDF, CSV exports)

429 responses include X-RateLimit-* headers and a Retry-After hint.

Tests: 771/771 vitest (was 766) — +5 rate-limit tests covering catalog
shape, sliding window, cross-prefix isolation, cross-user isolation,
and resetAt timestamp.
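
The sliding-window decision can be modeled in memory like this (an illustrative stand-in; the shipped limiter keeps the timestamps in Redis so the window survives restarts and is shared across worker processes):

```typescript
// Sketch: per key, keep the timestamps of recent requests, drop those
// older than the window, and allow only while the count is under the
// limit. resetAt is when the oldest surviving hit ages out.
type WindowState = Map<string, number[]>; // key -> request timestamps (ms)

function slidingWindowAllow(
  state: WindowState, key: string, limit: number, windowMs: number, now: number,
): { allowed: boolean; remaining: number; resetAt: number } {
  const cutoff = now - windowMs;
  const hits = (state.get(key) ?? []).filter((t) => t > cutoff);
  const allowed = hits.length < limit;
  if (allowed) hits.push(now);
  state.set(key, hits);
  const resetAt = hits.length > 0 ? hits[0] + windowMs : now;
  return { allowed, remaining: Math.max(0, limit - hits.length), resetAt };
}
```

The remaining/resetAt fields map naturally onto the X-RateLimit-* and Retry-After headers mentioned above.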

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:56:01 +02:00
Matt Ciaccio
e7d23b254c feat(ai): per-port token budgets + usage ledger for AI features
Adds a token-denominated guardrail in front of every server-side AI call
so a misconfigured port can't run up an unbounded bill. Soft caps surface
a banner; hard caps refuse new requests until the period rolls over.
Usage flows into a feature-typed ledger so future AI surfaces (summary,
embeddings, reply-draft) can drop in without schema changes.

- New table ai_usage_ledger (port, user, feature, provider, model,
  input/output/total tokens, request id) with two indexes for rollup
- New service ai-budget.service.ts: getAiBudget/setAiBudget,
  checkBudget (pre-flight gate), recordAiUsage, currentPeriodTokens,
  periodBreakdown — all token-based, period boundaries in UTC
- runOcr now returns provider usage so the route can record the actual
  spend instead of estimating
- Scan-receipt route gates on checkBudget before invoking AI; returns
  source: manual / reason: budget-exceeded when blocked, surfaces
  softCapWarning on the success path
- Admin UI: new AiBudgetCard on the OCR settings page — shows current
  spend, per-feature breakdown, soft/hard cap inputs, period selector
- Permission: admin.manage_settings on both routes

Tests: 766/766 vitest (was 756) — +10 budget tests covering enforce/
disabled/cap-exceed/estimate-exceed/soft-warn/period boundaries/
cross-port isolation/silent ledger failure.
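
The pre-flight gate can be sketched as follows (hypothetical field and function names): compare the period's ledger total, plus an estimate for the incoming request, against the port's caps.

```typescript
// Sketch: hard cap refuses the request outright; soft cap lets it
// through but flags a warning for the banner.
type Budget = { softCapTokens: number | null; hardCapTokens: number | null };

function checkBudget(
  budget: Budget, periodTokens: number, estimateTokens: number,
): { allowed: boolean; softCapWarning: boolean; reason?: string } {
  const projected = periodTokens + estimateTokens;
  if (budget.hardCapTokens !== null && projected > budget.hardCapTokens) {
    return { allowed: false, softCapWarning: false, reason: 'budget-exceeded' };
  }
  const softCapWarning = budget.softCapTokens !== null && projected > budget.softCapTokens;
  return { allowed: true, softCapWarning };
}
```

A structured reason on denial is what lets the scan route return source: manual / reason: budget-exceeded instead of a bare error.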

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:53:09 +02:00
Matt Ciaccio
2cf1bd9754 feat(ocr): Tesseract.js as default scanner, AI as opt-in per port
The mobile receipt scanner now runs Tesseract.js in-browser by default —
on-device, free, and image bytes never leave the device. AI providers
(OpenAI / Claude) become a per-port opt-in for higher accuracy on
hard-to-read receipts.

- Lazy-load Tesseract WASM in src/lib/ocr/tesseract-client.ts (5 MB
  bundle dynamic-imports on first scan, not in main chunk)
- Heuristic parser src/lib/ocr/parse-receipt-text.ts extracts vendor,
  date, amount, currency, and line items from raw OCR text
- New port-scoped aiEnabled flag on OcrConfig (defaults false). Resolved
  flag never inherits from the global row — each port admin opts in
  independently
- Scan endpoint short-circuits to manual-mode when aiEnabled=false so
  the AI provider is never invoked unless the admin has flipped the
  switch
- Scan UI runs Tesseract first, then asks the server whether AI is
  enabled — uses the AI result only when its confidence beats Tesseract;
  network failures degrade gracefully to the local parse
- Admin OCR-settings form gains the per-port aiEnabled checkbox

Tests: 756/756 vitest (was 747) — +7 parser unit tests, +2 aiEnabled
config tests.
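
The client-side selection rule can be sketched as follows (hypothetical types; a network or provider failure reduces to passing `ai: null`):

```typescript
// Sketch: prefer the AI parse only when the port has opted in AND its
// confidence beats the local Tesseract result; otherwise keep the
// on-device parse.
type Parse = { vendor: string | null; amount: number | null; confidence: number };

function pickResult(local: Parse, ai: Parse | null, aiEnabled: boolean): Parse {
  if (!aiEnabled || ai === null) return local; // opt-in gate + graceful degrade
  return ai.confidence > local.confidence ? ai : local;
}
```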

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:46:29 +02:00
Matt Ciaccio
46937bbcb9 feat(addresses): full CRUD UI for client + company multi-address
Client and company detail pages each gain an Addresses tab with click-to-edit
fields wired to the existing CountryCombobox/SubdivisionCombobox primitives.
Adds a primary toggle that demotes the previous primary inside one transaction
so the partial unique index never trips.

- New service helpers: list/add/update/remove ClientAddress + CompanyAddress
- New routes: /api/v1/clients/[id]/addresses[/addressId], same under companies/
- New shared component: <AddressesEditor> reused by both detail surfaces
- Integration tests cover happy path, primary demotion, and tenant scoping

Tests: 747/747 vitest (was 741, +6 address tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:38:43 +02:00
Matt Ciaccio
27cdbcc695 chore(i18n): drop legacy free-text country/nationality columns
Test-data only — no production migration needed (per earlier decision).
Schema is now ISO-only; readers convert ISO codes to localized names where
human-readable output is required (EOI documents, invoices, portal).

Migration 0016 drops:
  - clients.nationality
  - companies.incorporation_country
  - client_addresses.{state_province, country}
  - company_addresses.{state_province, country}

Code paths that previously read free-text values now read the ISO column
and pass through `getCountryName()` / `getSubdivisionName()` for rendering.
Document templates ({{client.nationality}}), portal client view, EOI/
reservation-agreement contexts, and invoice billing addresses all updated.

Public yacht-interest endpoint (/api/public/interests) drops the legacy
fields from its insert path and writes ISO codes only. The Zod validators
no longer accept the legacy fields — older website builds posting raw
'incorporationCountry' / 'country' / 'stateProvince' will get 400s.
Server-side phone normalization is unchanged.

Seed data updated to use ISO codes (GB/FR/ES/GR/SE/IT/GH/MC/PA), spread
across continents to keep test fixtures realistic.

Test assertions updated to match the new render shape (e.g.
'United States' not 'US', 'California' not 'CA').

Vitest: 741 -> 741 (unchanged count; assertions updated, no new tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:00:57 +02:00
Matt Ciaccio
31fa3d08ec chore(cleanup): Phase 1 — gap closure across audit, alerts, soft-delete, perms
Multi-area cleanup pass closing partial-implementation gaps surfaced by the
post-i18n audit. No behavior changes for happy-path users; closes real
correctness/security holes.

PR1a Public yacht-interest endpoint i18n. /api/public/interests now accepts
     phoneE164/phoneCountry, nationalityIso, address.{countryIso, subdivisionIso},
     and company.{incorporationCountryIso, incorporationSubdivisionIso}.
     Server-side parsePhone() fallback for legacy raw phone strings.

PR1b Alert rule registry trim. Two rule slots ('document.expiring_soon',
     'audit.suspicious_login') were registered but evaluators returned [].
     Both required schema/instrumentation that hadn't landed. Removed from
     the registry; comments record the dependencies needed to revive them.
     Effective rule count: 8 active.

PR1c vi.mock hoist + flake fix. Hoisted vi.mock calls to top-level in 5
     integration test files; webhook-delivery uses vi.hoisted for the
     queue-add ref. Vitest no longer warns about non-top-level mocks.
     Deflaked the 'short value' assertion in security-encryption.test.ts
     by switching plaintext from 'ab' to 'XY' (non-hex chars). 5/5 runs green.

PR1d Soft-delete reference audit. listClientOptions and listYachtsForOwner
     now filter by isNull(archivedAt). Berths use status (no archivedAt).

PR1e Permission-matrix audit script + report. scripts/audit-permissions.ts
     walks every src/app/api/v1/**/route.ts and reports handlers without a
     withPermission() wrapper. Initial run found 33 violations.
     - Allow-listed 17 with explicit reasons (self-data, admin, alerts,
       search, currency, ai, custom-fields — some marked TODO).
     - Wrapped 7 routes with concrete permissions: clients/options
       (clients:view), berths/options (berths:view), dashboard/*
       (reports:view_dashboard), analytics (reports:view_analytics).
     Audit report at docs/runbooks/permission-audit.md. Script exits
     non-zero on any unallow-listed violation so it can become a CI gate.

Vitest: 741 -> 741 (no new tests; existing suite covers the changes).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 18:48:22 +02:00
Matt Ciaccio
16d98d630e feat(i18n): country/phone/timezone/subdivision primitives + form wiring
Cross-cutting i18n polish for forms across the marina + residential + company
domains. Introduces a single source of truth for country/phone/timezone/
subdivision data and replaces every nationality-as-free-text and timezone-
as-string Input with a dedicated combobox.

PR1  Countries — ALL_COUNTRY_CODES (~250 ISO-3166-1 alpha-2), Intl.DisplayNames
     for localized labels, detectDefaultCountry() with navigator-region
     fallback to US, CountryCombobox with regional-indicator flag glyphs +
     compact mode for inline use.
PR2  Phone — libphonenumber-js wrapper (parsePhone / formatAsYouType /
     callingCodeFor), PhoneInput with flag dropdown + national-format
     AsYouType + paste-detect that flips the country dropdown for pasted
     international strings.
PR3  Timezones — country->IANA map (250 entries, multi-zone for AU/BR/CA/CD/
     ID/KZ/MN/MX/RU/US), formatTimezoneLabel ("Europe/London (UTC+1)"),
     TimezoneCombobox with Suggested/All grouping driven by countryHint.
PR4  Subdivisions — wraps the iso-3166-2 npm package (~5000 ISO 3166-2
     codes for every country), per-country cache, SubdivisionCombobox with
     "Pick a country first" / "No regions available" empty states.
PR5  Schema deltas (migration 0015) — clients.nationality_iso, clientContacts
     {value_e164, value_country}, clientAddresses {country_iso, subdivision_iso},
     residentialClients {phone_e164, phone_country, nationality_iso, timezone,
     place_of_residence_country_iso, subdivision_iso}, companies
     {incorporation_country_iso, incorporation_subdivision_iso},
     companyAddresses {country_iso,
     subdivision_iso}. Plus shared zod validators (validators/i18n.ts) used
     by every entity validator + route handler.
PR6  ClientForm + ClientDetail — CountryCombobox replaces nationality Input,
     TimezoneCombobox replaces timezone Input (driven by nationalityIso hint),
     PhoneInput conditionally rendered for phone/whatsapp contacts. Inline
     editors (InlineCountryField / InlineTimezoneField / InlinePhoneField)
     for the detail-page overview rows + ContactsEditor.
PR7  Residential client form + detail — phone -> PhoneInput, nationality/
     timezone/place-of-residence-country/subdivision rows in both create
     sheet and inline-editable detail view. Subdivision wipes when country
     flips since codes are country-scoped.
PR8  Company form + detail — incorporation country -> CountryCombobox,
     incorporation region -> SubdivisionCombobox in both modes.
PR9  Public inquiry endpoint — accepts pre-normalized phoneE164/phoneCountry
     and i18n fields from newer website builds, server-side parsePhone()
     fallback for legacy raw-international submissions. Old Nuxt builds
     keep working unchanged.

Tests: 4 unit suites for the primitives (25 tests), 1 integration spec for
the public phone-normalization path (3 tests), 1 smoke spec asserting the
combobox triggers render in all three create sheets.

Test totals: vitest 713 -> 741 (+28).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 18:13:08 +02:00
Matt Ciaccio
f52d21df83 feat(phase-b): ship analytics dashboard, alerts, scanner PWA, dedup, audit view
Phase B (Insights & Alerts) PR4-11 in one drop. Builds on the schema +
service skeletons committed in PRs 1-3.

PR4  Analytics dashboard — 4 chart types (funnel/timeline/breakdown/source),
     date-range picker (today/7d/30d/90d), CSV+PNG export per card.
PR5  Alert rail UI + /alerts page — topbar bell w/ live count, dashboard
     right-rail, three-tab page (active/dismissed/resolved), socket-driven
     invalidation. Bell lazy-loads list on popover open to keep cold pages
     fast in non-dashboard routes.
PR6  EOI queue tab on documents hub — filters to in-flight EOIs, count
     surfaces in tab label.
PR7  Interests-by-berth tab on berth detail — replaces the stub.
PR8  Expense duplicate detection — BullMQ job runs scan on create, yellow
     banner on detail w/ Merge / Not-a-duplicate, transactional merge
     consolidates receipts and archives the source.
PR9  Receipt scanner PWA + multi-provider AI — port-scoped /scan route in
     its own (scanner) group with no dashboard chrome, dynamic per-port
     manifest, OpenAI + Claude provider abstraction, admin OCR settings
     page (port-level + super-admin global default w/ opt-in fallback),
     test-connection endpoint, manual-entry fallback when no key is
     configured. Verify form always shown before save — no ghost rows.
PR10 Audit log read view — swap to tsvector full-text search on the
     existing GIN index, cursor pagination, filters for entity/action/user
     /date range, batched actor-email resolution.
PR11 Real-API tests — opt-in receipt-ocr.spec (admin save+test, optional
     real-receipt parse via REALAPI_RECEIPT_FIXTURE) and alert-engine
     socket-fanout spec gated behind RUN_ALERT_ENGINE_REALAPI. Both skip
     cleanly without their gate envs so CI stays green.
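
PR10's keyset cursor over (createdAt, id) can be sketched in miniature —
an in-memory stand-in for the real Drizzle query, with assumed names:

```typescript
// Rows strictly "after" the cursor in (createdAt DESC, id DESC) order.
type Row = { id: string; createdAt: number };

function afterCursor(
  rows: Row[],
  cursor: { createdAt: number; id: string } | null,
  limit: number,
): Row[] {
  const sorted = [...rows].sort(
    (a, b) => b.createdAt - a.createdAt || (b.id > a.id ? 1 : -1),
  );
  const start = cursor
    ? sorted.findIndex(
        (r) =>
          r.createdAt < cursor.createdAt ||
          (r.createdAt === cursor.createdAt && r.id < cursor.id),
      )
    : 0;
  return start === -1 ? [] : sorted.slice(start, start + limit);
}
```

Unlike OFFSET pagination, the cursor stays stable when new audit rows land
between page fetches.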

Test totals: vitest 690 -> 713, smoke 130 -> 138, realapi +2 opt-in.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 17:21:55 +02:00
Matt Ciaccio
2fa70f4582 merge: PR3 — analytics snapshot service + refresh job (Phase B)
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m1s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:54:48 +02:00
Matt Ciaccio
01b201e1a2 feat(analytics): real computations + 15-min snapshot refresh job
PR3 of Phase B. Replaces the no-op stubs in analytics.service.ts with
working drizzle queries and adds the recurring BullMQ job that warms
the cache.

Computations:
- computePipelineFunnel: groups interests by pipeline_stage filtered by
  port + range + not archived; emits 8-row stages array with conversion
  pct relative to 'open' as the funnel top.
- computeOccupancyTimeline: per day in range, counts berths covered by
  an active reservation (start_date ≤ day, end_date IS NULL OR ≥ day);
  emits {date, occupied, total, occupancyPct}.
- computeRevenueBreakdown: sums invoices.total grouped by status +
  currency; filters out archived rows.
- computeLeadSourceAttribution: counts interests by source descending;
  null source bucketed as 'unspecified'.
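
The funnel conversion math can be sketched as a pure function (shape per
the commit; rounding to one decimal is an assumption):

```typescript
// Each stage's pct is relative to the 'open' stage, the funnel top.
type StageCount = { stage: string; count: number };

function withConversionPct(stages: StageCount[]) {
  const top = stages.find((s) => s.stage === "open")?.count ?? 0;
  return stages.map((s) => ({
    ...s,
    // Zero-state guard: an empty funnel reports 0% instead of NaN.
    conversionPct: top === 0 ? 0 : Math.round((s.count / top) * 1000) / 10,
  }));
}
```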

Public API (getPipelineFunnel, getOccupancyTimeline, etc.) reads
analytics_snapshots first; falls back to compute + writeSnapshot. TTL
15 minutes (matches the cron interval).

Cron:
- queue/scheduler.ts registers 'analytics-refresh' on maintenance with
  pattern '*/15 * * * *'.
- queue/workers/maintenance.ts dispatches to refreshSnapshotsForPort
  for every port; per-port try/catch so one bad port doesn't kill the
  sweep.

Tests: tests/integration/analytics-service.test.ts (9 cases). Pipeline
funnel math (incl. zero state), occupancy timeline shape/percentages
with seeded reservations, revenue grouped by status + currency, lead
source attribution incl. null bucketing, cache hit (mutate snapshot
directly → next read returns mutated value), refreshSnapshotsForPort
warms every metric×range combo.

Vitest 690/690 (+9). tsc + lint clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:54:46 +02:00
Matt Ciaccio
94f049c8b8 merge: PR2 — alert rules engine + cron + socket (Phase B)
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m3s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:50:57 +02:00
Matt Ciaccio
df495133b7 feat(alerts): rule engine, recurring evaluator, socket fanout
PR2 of Phase B. Wires the alert framework end-to-end:

- alert-rules.ts: 10 rule evaluators implemented as pure async fns over
  the existing schema. Eight fire against real conditions:
  reservation.no_agreement, interest.stale, document.signer_overdue,
  berth.under_offer_stalled, expense.duplicate, expense.unscanned,
  interest.high_value_silent, eoi.unsigned_long.
  document.expiring_soon stays inert until the documents schema gets an
  expires_at column. audit.suspicious_login also stays inert until the
  auth layer logs 'login.failed' rows (TODO noted in the rule body).

- alert-engine.ts: runAlertEngine() walks every port × every rule and
  calls reconcileAlertsForPort. Errors per (port, rule) are collected
  in the summary, not thrown — one bad evaluator can't stop the sweep.
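
A synchronous miniature of that walk (the real evaluators are async and
hit the database; names are assumptions):

```typescript
type Evaluator = (portId: string) => void;

function runSweep(portIds: string[], rules: Record<string, Evaluator>) {
  const summary = { ok: 0, errors: [] as { portId: string; ruleId: string; message: string }[] };
  for (const portId of portIds) {
    for (const [ruleId, evaluate] of Object.entries(rules)) {
      try {
        evaluate(portId);
        summary.ok++;
      } catch (err) {
        // One bad evaluator records an error but never stops the sweep.
        summary.errors.push({ portId, ruleId, message: (err as Error).message });
      }
    }
  }
  return summary;
}
```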

- alerts.service.ts: reconcileAlertsForPort now emits 'alert:created'
  socket events on insert and 'alert:resolved' on auto-resolve;
  dismissAlert emits 'alert:dismissed'. All scoped to port:{portId}
  rooms.

- socket/events.ts: adds the three Server→Client alert event types.

- queue/scheduler.ts: registers 'alerts-evaluate' on the maintenance
  queue with cron */5 * * * * (every 5 min, per spec risk register).

- queue/workers/maintenance.ts: dispatches 'alerts-evaluate' to
  runAlertEngine; logs sweep summary.

Tests:
- tests/integration/alerts-engine.test.ts (6 cases): seeds reservation
  → fires, runs twice → no dupe, adds agreement → auto-resolves; seeds
  stale interest → fires; hot lead silent → critical; engine summary
  shape on no-data port. Socket emit module is vi.mocked.

Vitest 681/681 (was 675; +6). tsc clean. Lint clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:50:55 +02:00
Matt Ciaccio
639025ebf9 merge: PR1 — Phase B schema + service skeletons (Phase B)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:43:03 +02:00
Matt Ciaccio
e77d55ac50 feat(insights): Phase B schema + service skeletons
PR1 of Phase B per docs/superpowers/specs/2026-04-28-phase-b-insights-alerts-design.md.
Lays the foundation that PRs 2-10 will fill in with behaviour.

Schema (migration 0014):
- alerts table with rule-engine fields (rule_id, severity, link,
  entity_type/id, fingerprint, fired/dismissed/acknowledged/resolved
  timestamps, jsonb metadata). Partial-unique fingerprint index keeps
  one open row per (port, rule, entity); separate indexes power
  severity-filtered and time-ordered queries.
- analytics_snapshots (port_id, metric_id) -> jsonb cache + computedAt
  for the 15-min recurring refresh.
- expenses: duplicate_of self-FK, dedup_scanned_at, ocr_status/raw/
  confidence; partial index on (port, vendor, amount, date) where
  duplicate_of IS NULL drives the dedup heuristic.
- audit_logs.search_text: GENERATED ALWAYS tsvector over
  action+entity_type+entity_id+user_id, GIN-indexed (drizzle can't
  model GENERATED ALWAYS in TS yet, so the migration appends manual
  ALTER + the GIN index).

Service skeletons in src/lib/services/:
- alerts.service.ts: fingerprintFor, reconcileAlertsForPort (upsert +
  auto-resolve), dismiss, acknowledge, listAlertsForPort.
- alert-rules.ts: RULE_REGISTRY of 10 rule evaluators (currently no-op);
  PR2 fills in the bodies.
- analytics.service.ts: readSnapshot/writeSnapshot with 15-min TTL +
  no-op compute* stubs for the four chart series; PR3 fills behavior.
- expense-dedup.service.ts: scanForDuplicates + markBestDuplicate
  using the partial dedup index. PR8 wires the BullMQ trigger.
- expense-ocr.service.ts: OcrResult/OcrLineItem types + ocrReceipt
  stub. PR9 wires Claude Vision (Haiku 4.5 + ephemeral system-prompt
  cache).
- audit-search.service.ts: tsvector @@ plainto_tsquery + cursor
  pagination on (createdAt, id). PR10 wires the admin UI.
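
The fingerprint + reconcile pair can be sketched as a set difference (the
key format and helper shapes here are assumptions, not the shipped code):

```typescript
// One stable key per (rule, entity) backs the partial-unique index that
// keeps a single open alert row.
function fingerprintFor(ruleId: string, entityType: string, entityId: string): string {
  return `${ruleId}:${entityType}:${entityId}`;
}

// Upsert + auto-resolve as a diff of open vs currently-firing fingerprints.
function diffAlerts(open: string[], firing: string[]) {
  const openSet = new Set(open);
  const firingSet = new Set(firing);
  return {
    toInsert: firing.filter((f) => !openSet.has(f)),  // newly firing
    toResolve: open.filter((f) => !firingSet.has(f)), // condition cleared
  };
}
```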

tsc clean, lint clean, vitest 675/675 (one unrelated AES random-output
flake passes solo).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:43:01 +02:00
Matt Ciaccio
f1ed2a5f87 docs(spec): Phase B — insights, alerts, and operational awareness
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m4s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Full execution plan for the next phase. Closes the seven priority gaps
the 2026-04-28 Nuxt→Next audit surfaced (analytics, alerts,
interests-by-berth, expense dedup, EOI queue, OCR, audit log read view).

Scope:
- Analytics dashboard with KPI tiles, pipeline funnel, occupancy
  timeline, revenue breakdown, lead-source attribution; cached via
  `analytics_snapshots` recurring job.
- Alert framework: 10-rule v1 catalog, rule engine evaluates on cron,
  fingerprint dedupes, auto-resolves when condition clears, surfaces
  in dashboard right rail + dedicated /alerts page.
- Interests-by-berth tab on berth detail.
- Expense duplicate detection (vendor + amount + date ±3d) with
  merge action.
- OCR for expense receipts via Claude Vision (Haiku 4.5 + ephemeral
  system-prompt cache).
- Audit log admin read view with tsvector search + cursor pagination.
- EOI queue: saved-view tab on the documents hub.
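
The duplicate heuristic named above (vendor + amount + date ±3d) can be
sketched as a pure predicate; field names are illustrative:

```typescript
type Expense = { vendor: string; amountCents: number; date: Date };

const DAY_MS = 24 * 60 * 60 * 1000;

function looksDuplicate(a: Expense, b: Expense): boolean {
  return (
    a.vendor.trim().toLowerCase() === b.vendor.trim().toLowerCase() && // vendor match, case/space-insensitive
    a.amountCents === b.amountCents &&                                 // exact amount
    Math.abs(a.date.getTime() - b.date.getTime()) <= 3 * DAY_MS        // within ±3 days
  );
}
```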

11 PRs, ~10-13 dev days, calendar 2.5-3 weeks. Critical path
graphed. Risk register includes alert false-positive mitigation,
OCR cost ceiling via Haiku + cache, and audit-log scale.

Four open questions for the user in the spec footer.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:00:01 +02:00
Matt Ciaccio
4036c16f39 test(infra): vitest globalSetup teardown purges test-port-* leaks
Integration tests use makePort() which writes ports with slug 'test-port-{rand}'
and never cleans up. Result: 17,564 leaked rows in dev that slowed every page
load fetching the port-switcher list (and contributed to smoke flakes).

Adds tests/global-setup.ts with a teardown() that DELETEs every 'test-port-%'
row plus its dependent rows across 30+ tables in one CTE. Wires it into
vitest.config.ts via globalSetup. Adds closeDb() helper so the teardown can
end the postgres-js pool cleanly (kills the 'Tests closed but Vite server
won't exit' warning).

Also lands docs/superpowers/specs/2026-04-28-country-phone-timezone-design.md
— full-scope agenda for the country dropdown / E.164 phone input /
country-driven timezone autofill work, ~7 dev days across 10 PRs. Per
user request: 'let's do this full-fledged if we're gonna do it'.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 13:28:15 +02:00
Matt Ciaccio
5f9bbb97bd fix(sidebar): replace floating circular collapse button with blended row
User feedback: the circular toggle floating off the sidebar's right
edge looked tacked-on. Replaced with a flush full-width row above the
user footer (right-aligned 'Collapse <' chip when expanded; centered
chevron when collapsed). Same nav-item hover treatment so it merges
visually with the sidebar palette. The <aside> no longer needs to
host an overhanging button.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 13:00:20 +02:00
Matt Ciaccio
4911083d0f fix(visual): KPITile data-testid + restore residential interest casing
Post-PR10c follow-ups discovered during smoke triage:
- KPITile gets data-testid="kpi-tile" so the dashboard smoke spec's
  '[data-testid*="kpi"]' selector matches (test 10-dashboard:27 expected
  >=4 kpi cards; the old Card-based render was matched by the
  '[class*="card"]' branch and didn't need a testid).
- Residential interest detail eyebrow text reverted from "Residential
  Interest" to "Residential interest" (lowercase i). The visual is
  identical because the wrapper has the `uppercase` class; the smoke
  spec at 26-residential:140 looks for the literal lowercase string.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:56:32 +02:00
Matt Ciaccio
3a7fef59b0 fix(visual): dark-mode-safe borders + sidebar relative + ring-background
Code-review follow-up to PR10b-e:
- DetailHeaderStrip + KPITile: border-slate-200 → border-border so dark
  mode doesn't paint a bright halo around the gradient strip.
- Topbar avatar: ring-white → ring-background so the 2px ring tracks
  the surface (matches the sidebar footer pattern).
- KpiTileSkeleton stripe: bg-slate-100 → bg-muted for parity with
  shadcn skeleton tokens in dark mode.
- Sidebar <aside>: add `relative` so the absolute-positioned
  collapse-toggle button anchors to the sidebar itself rather than
  the nearest positioned ancestor.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:24:14 +02:00
Matt Ciaccio
c081334020 merge: PR10e — visual polish (mobile responsive sweep) (Phase A)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:10:22 +02:00
Matt Ciaccio
2d1b50745a style(mobile): responsive tabs + table overflow + hub flex-wrap (Phase A)
Adds <ResponsiveTabs> primitive that swaps the TabsList for a native Select on
phone-sized viewports (<640px). DetailLayout now routes its tab strip through it,
so every tabbed detail page gets the collapse for free. DataTable wraps the
Table in overflow-x-auto so wide column sets scroll horizontally instead of
breaking the layout under 768px. Documents-hub row swaps the fixed
grid-cols-[auto_1fr_auto_auto_auto_auto] for flex-wrap below sm: so signers /
status / dates stack instead of clipping.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:10:21 +02:00
Matt Ciaccio
40ae860a88 merge: PR10d — visual polish (sidebar/topbar) (Phase A)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:10:12 +02:00
Matt Ciaccio
c7ca7c1f96 style(layout): sidebar stripe + topbar gradient + bell spring + search ring
Sidebar active items: 4px brand left-edge stripe (rounded-r-full) replacing the
border-l-2 + bg shift; section header smaller-caps + brand-200 colour; user-footer
avatar gets shadow-sm + ring-2 ring-white/30.

Topbar '+ New' uses bg-gradient-brand with shadow-sm + scale-1.02 hover. User
avatar trigger gets shadow-sm + ring-2 ring-white. Notification badge gets
gradient-brand fill + ring-2 ring-background + animate-badge-pop spring keyframe
(retriggers on count change via key={unreadCount}). Command search gets shadow-xs
inset + brand focus ring (ring-4 ring-brand/15).

Adds badge-pop keyframes to tailwind config.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:10:11 +02:00
Matt Ciaccio
22b019a27e merge: PR10c — visual polish (dashboard) (Phase A)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:10:00 +02:00
Matt Ciaccio
a3424b80d5 style(dashboard): KPITile primitive + gradient PageHeader + tile skeletons
Replaces flat Card-based KPI rendering with KPITile (gradient-brand-soft + accent
stripe). Adds polished gradient PageHeader to DashboardShell with eyebrow, KPI
sub-line, description. Tile-shaped skeletons replace the four CardSkeletons during
KPI load.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:09:59 +02:00
Matt Ciaccio
5bcdfefde3 merge: PR10b — visual polish (detail pages) (Phase A)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:09:54 +02:00
Matt Ciaccio
22f944fde2 style(detail): apply gradient header strip to client/interest/yacht/company/berth/residential/invoice details
Adds shared <DetailHeaderStrip> wrapper (rounded-xl + gradient-brand-soft + shadow-xs)
and applies it to every legacy domain detail header. Residential client/interest and
invoice detail get an inline gradient strip with eyebrow ('Residential Client',
'Residential Interest', 'Invoice'). Residential bodies normalized to lg:grid-cols-[2fr_1fr]
per spec.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 12:09:47 +02:00
Matt Ciaccio
cda44e721b fix(layout): hoist TooltipProvider to wrap full sidebar tree
The collapsed-state user-footer renders a Tooltip that was outside the
TooltipProvider — the provider only wrapped the nav. Once the sidebar
toggled to collapsed, the footer Tooltip threw "Tooltip must be used
within TooltipProvider", surfacing as console errors in exhaustive
click-through tests.

Move TooltipProvider up one level so every Tooltip in the sidebar tree
(nav items + user footer) is covered.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 05:08:01 +02:00
Matt Ciaccio
0406778c44 fix(api): kill currentPortId persist race + dedupe admin/ports stampede
The dashboard and residential interest smoke tests were intermittently
failing with the page rendering empty/skeleton state. Root causes:

1. ui-store persisted currentPortId/Slug, but those are URL-derived state.
   After login lands on /<first-port-by-name>/dashboard, localStorage holds
   that port. Hard-navigating to /port-nimara/... rehydrated the store with
   the stale id, and useQuery fired with the wrong port before
   PortProvider's URL-sync useEffect could correct it. Drop both fields
   from partialize — PortProvider re-derives them from the route every
   navigation.

2. apiFetch's slug-to-port fallback fired N parallel /api/v1/admin/ports
   calls when N components mounted simultaneously with an empty store.
   Dedupe in-flight lookups so a stampede collapses into one round-trip.

Also tightened four flaky smoke tests that depended on a fixed 3s wait or
non-waiting isVisible({timeout}) — replaced with expect(...).toBeVisible
or expect.poll so they handle dev-mode JIT cold-start delays cleanly.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 04:38:57 +02:00
Matt Ciaccio
259cd7b8bb merge: PR11 — realapi spec scaffolds (Phase A) 2026-04-28 02:53:55 +02:00
Matt Ciaccio
e42b8fde84 test(realapi): Phase A integration spec scaffolds
Adds four realapi specs that gate cleanly on env: smtp-system-send
(roundtrip + attachment bytes), email-attachments-roundtrip
(cross-port 403), documenso-cancel (in-flight cancel → status flip),
minio-file-lifecycle (upload/list/download/delete byte-equal).
Specs skip without DOCUMENSO_API_*/SMTP/IMAP/MINIO env so they run as
no-ops when those services aren't reachable.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:53:49 +02:00
Matt Ciaccio
f354f4adab merge: PR10 — visual polish (lists) (Phase A) 2026-04-28 02:52:26 +02:00
Matt Ciaccio
38cd36a616 style(lists): apply gradient PageHeader to client/interest/yacht/company/berth lists
Pulls the polished gradient hero strip into the five primary list
surfaces. PR10b-e (detail polish, dashboard/admin polish, email +
notifications polish, mobile responsive sweep) deferred to a follow-up
release per spec risk register since visual baseline regen needs hands-
on iteration.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:52:17 +02:00
Matt Ciaccio
77b6ef5026 merge: PR9 — reminder framework polish (Phase A) 2026-04-28 02:50:06 +02:00
Matt Ciaccio
978df1c4d7 feat(reminders): cadence-aware framework with auto/manual modes
isReminderDue now keys off doc.remindersDisabled and the effective
cadence (per-doc override → template default), dropping the implicit
interests.reminderEnabled gate so non-EOI docs auto-remind correctly.
sendReminderIfAllowed gains an options bag — auto:true keeps the 9-16
window + cadence cooldown for the cron, auto:false bypasses both for
manual UI sends. signerId targets a specific pending signer (must be
next in sequential mode). 7 unit tests cover the cadence math.
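
The cadence resolution can be sketched as follows (field names per the
commit; the signature is an assumption, not the real helper):

```typescript
// Per-doc override wins, else the template default; disabled docs never remind.
type Doc = { remindersDisabled: boolean; reminderCadenceOverride: number | null };

const DAY_MS = 86_400_000;

function isReminderDue(
  doc: Doc,
  templateCadenceDays: number,
  lastRemindedAt: Date | null,
  now: Date,
): boolean {
  if (doc.remindersDisabled) return false;
  const cadenceDays = doc.reminderCadenceOverride ?? templateCadenceDays;
  if (!lastRemindedAt) return true; // never reminded: due immediately
  return (now.getTime() - lastRemindedAt.getTime()) / DAY_MS >= cadenceDays;
}
```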

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:50:00 +02:00
Matt Ciaccio
df0b408b7a merge: PR8 — email attachments + system/user senderType (Phase A) 2026-04-28 02:48:17 +02:00
Matt Ciaccio
1151768159 feat(email): system/user senderType + attachments
Composer validator now takes senderType (system|user) and an
attachments[] array, and the service dispatches across two paths:
the system path uses lib/email/index.ts with port-config noreply
identity and logs signed_doc_emailed when an attachment matches a
document's signed PDF; the user path stays on the existing personal-
account flow but is gated by the new email.allowPersonalAccountSends
toggle; the attachment fileIds are persisted on email_messages.
sendEmail in lib/email accepts attachments and resolves them from
MinIO with cross-port enforcement.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:48:11 +02:00
Matt Ciaccio
9e69c13202 merge: PR7 — reservation detail + agreement (Phase A) 2026-04-28 02:45:11 +02:00
Matt Ciaccio
6212c118e5 feat(reservations): detail page with agreement flow + contract mirror
Adds /berth-reservations/[id] with state-aware agreement card (none /
in-flight / completed) and the Generate-agreement entry point that
opens the wizard prefilled. handleDocumentCompleted now mirrors a
signed reservation_agreement onto berth_reservations.contractFileId
so the portal can resolve contracts without joining through documents.
Reservation merge tokens (startDate/endDate/tenureType/termSummary/
signedDate) added to the catalog.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:45:05 +02:00
Matt Ciaccio
6795db9aa8 merge: PR6 — create-document wizard MVP (Phase A) 2026-04-28 02:43:05 +02:00
Matt Ciaccio
d8f0cdd7d2 feat(documents): create-document wizard MVP + service dispatch
Implements createFromWizard and createFromUpload service paths covering
the documenso-template, in-app, and upload pathways. Persists subject
FK, signers, watchers, and the per-document reminder controls
(remindersDisabled / reminderCadenceOverride) introduced in PR1. New
POST /api/v1/documents/wizard route and a functional /documents/new UI
with type/source/template/signers/reminders sections. Drag-handle
reorder, watcher autocomplete picker, and PDF preview defer to the
PR10 polish sweep.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:43:00 +02:00
Matt Ciaccio
2dc53842c0 merge: PR5 — document detail page (Phase A) 2026-04-28 02:39:52 +02:00
Matt Ciaccio
aa15807063 feat(documents): detail page with signers, watchers, activity, actions
Replaces the PR4 stub at /documents/[id] with the full Phase A detail
view: gradient header strip, status-aware action bar (Cancel /
Download / Email signatories), per-signer remind + copy-link, watcher
list with remove, and activity timeline. Adds the supporting endpoints
(cancel, compose-completion-email, watchers GET/POST/DELETE) and
listDocumentWatchers / addDocumentWatcher / removeDocumentWatcher
service helpers. The document GET now serves the aggregator shape
when ?detail=true.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:39:46 +02:00
Matt Ciaccio
2a3fae4d6a merge: PR4 — documents hub page (Phase A) 2026-04-28 02:35:43 +02:00
Matt Ciaccio
da7262f18f feat(documents): hub page with tabs, filters, and live counts
Replaces /documents with the Phase A hub: tabs (All/Awaiting them/
Awaiting me/Completed/Expired) backed by per-tab counts via a new
hub-counts endpoint, signature-only chip, type filter, expandable
signer rows, and real-time invalidation across the eight document
socket events. listDocuments grew tab/watcher/signatureOnly/sent-window
filters; the legacy file browser moved to /documents/files where the
sidebar already linked.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:35:36 +02:00
Matt Ciaccio
398d6322f1 merge: PR3 — visual primitives + tokens (Phase A) 2026-04-28 02:25:14 +02:00
Matt Ciaccio
deafc5ef38 feat(ui): visual polish primitives + token additions (Phase A)
Adds the design tokens the polish PRs (10a-e) will draw from:
shadow-xs/sm/md/lg/glow, radius scale tuned to spec, gradient utilities,
spring/smooth eases, and fast/base/slow durations. Introduces
StatusPill, KPITile, and EmptyState primitives plus a polished
PageHeader variant ('gradient') with optional eyebrow + KPI sub-line —
existing PageHeader callers stay on the plain variant.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:25:08 +02:00
Matt Ciaccio
9b87b14c99 merge: PR2 — Documenso v1/v2 abstraction (Phase A) 2026-04-28 02:22:11 +02:00
Matt Ciaccio
da44e8ecbe feat(documenso): version-aware field placement + void abstractions
Adds DOCUMENSO_API_VERSION env (default v1) plus per-port override.
Introduces placeFields, placeDefaultSignatureFields, and voidDocument
that hide v1 (per-field POST, pixel coords) vs v2 (bulk POST, percent +
fieldMeta) differences. cancelDocument now voids in Documenso first and
treats transient void failures as recoverable so the CRM stays the
system of record. 16 unit specs cover dispatch, layout math, idempotent
404, and v1 pixel conversion.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:22:04 +02:00
Matt Ciaccio
af2db06244 merge: PR1 — data model + service skeletons (Phase A) 2026-04-28 02:12:14 +02:00
Matt Ciaccio
0eff6050ae feat(documents): Phase A schema + service skeletons
Adds Phase A data model deltas to documents/templates and the new
document_watchers table. Introduces createFromWizard/createFromUpload
stubs, getDocumentDetail aggregator, cancelDocument flow, signed-doc
email composer, reservation agreement context, and notifyDocumentEvent
fan-out. Validator update accepts new template formats with html-only
bodyHtml requirement. EOI cadence backfilled to 1 day to preserve
current effective behaviour.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 02:12:05 +02:00
Matt Ciaccio
d8ac62f6f4 docs(spec): documents hub + reservation agreements + visual polish (Phase A)
Captures the brainstorm output covering:
- Documents hub at /[port]/documents replacing existing list
- Document detail page with vertical signers panel, watchers, timeline
- Generalised create-document wizard (HTML / PDF AcroForm / PDF overlay /
  Documenso-rendered + ad-hoc PDF upload)
- Reservation agreements as a doc type with new CRM-side detail page
- Email composer attachments + System-vs-User From selector (admin-gated)
- Reminder framework polish (per-template cadence, per-doc override, per-doc
  disable, per-signer manual reminders); drops interests.reminderEnabled gating
- Documenso v1.13.1/v2.x version-aware abstraction for field placement + void
- System-wide visual polish (token additions, primitive components, sweep)
- Test plan including click-everything sweep + expanded realapi round-trip
- Build sequence: 11 PRs, ~3.5 weeks critical path

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 01:51:41 +02:00
Matt Ciaccio
dd138547fb test(e2e): fix admin-nav locator + add residential interest API coverage
- 21-role-based-ui: tighten the Settings link locator. The previous
  `getByRole('link', { name: /settings/i }).first().or(getByText(/.../) .first())`
  chain hit a strict-mode violation once the sidebar Admin section became
  default-expanded — both the section header text node and the Settings
  link matched. Match the link directly with exact: true.
- 26-residential: extend smoke with two API-driven specs covering the
  residential interest pipeline — create+list and detail-page render —
  using preferences-string stamp + heading match for assertions.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 00:19:51 +02:00
Matt Ciaccio
1791dd7319 fix(ui): resolve yacht owner names server-side, real user in topbar
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m1s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Yachts list page rendered each row's Current Owner via OwnerLink, which
fired its own /api/v1/clients/{id} or /companies/{id} fetch — N+1 round-
trips per page load (12+ for the harbor-royale fixture). Worse, until
those fetches resolved each cell showed "Client c68da7..." style raw IDs.

Fix: listYachts now resolves the polymorphic currentOwnerName in two
batched in-array queries after the page query (mirrors the listClients
yachtCount/companyCount pattern), and OwnerLink accepts an optional
preloadedName prop that suppresses the per-row fetch when supplied.
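
The batched polymorphic resolution can be sketched with injected lookups
standing in for the two in-array queries (all names assumed):

```typescript
// Collect ids per owner type, resolve each set in one batch, map back.
type Yacht = { id: string; currentOwnerType: "client" | "company"; currentOwnerId: string };

function resolveOwnerNames(
  yachts: Yacht[],
  lookupClients: (ids: string[]) => Map<string, string>,
  lookupCompanies: (ids: string[]) => Map<string, string>,
): Map<string, string> {
  const clientIds = yachts.filter((y) => y.currentOwnerType === "client").map((y) => y.currentOwnerId);
  const companyIds = yachts.filter((y) => y.currentOwnerType === "company").map((y) => y.currentOwnerId);
  // Two batched lookups replace the old N+1 per-row fetches.
  const clients = lookupClients([...new Set(clientIds)]);
  const companies = lookupCompanies([...new Set(companyIds)]);
  const names = new Map<string, string>();
  for (const y of yachts) {
    const name =
      y.currentOwnerType === "client" ? clients.get(y.currentOwnerId) : companies.get(y.currentOwnerId);
    if (name) names.set(y.id, name);
  }
  return names;
}
```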

Topbar: show real user name + avatar initial from session/profile, and
expand the My-Account dropdown header to include the user's email.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:54:04 +02:00
Matt Ciaccio
0ccc66833d fix(ui): admin settings loading-loop, real user name, expanded admin nav
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m0s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
SettingsFormCard
- Parent components pass `FIELDS.slice(...)` inline, so the prop reference
  changes on every render. The useCallback-wrapped fetch callback was
  re-created each time, its useEffect re-fired, and the resulting loading
  flicker meant the form never rendered. Capture fields in a ref so the
  callback stays stable.

Sidebar
- Show real user name + avatar initial from session/profile, replacing
  the hardcoded "User Name" / "U" placeholder.
- Default the admin-section to expanded so its items are reachable on
  first page load (was collapsed behind a chevron).

Dashboard layout
- Pass {name, email} from the session/profile through to <Sidebar />.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:44:04 +02:00
Matt Ciaccio
4877b97f27 feat(admin): per-port email/Documenso/branding/reminder settings + invitations
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m1s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Centralizes everything operators need to configure into the admin panel,
each setting per-port with env fallback.

New admin pages
- /admin              landing page linking to every admin section as a card
- /admin/email        FROM name+address, reply-to, signature/footer HTML,
                      optional SMTP host/port/user/pass override
- /admin/documenso    API URL+key override, EOI Documenso template ID,
                      default EOI pathway (documenso-template vs inapp),
                      "Test connection" button
- /admin/branding     logo URL, primary color, app name, email
                      header/footer HTML
- /admin/reminders    port-level defaults for new interests +
                      port-wide daily-digest delivery window
- /admin/invitations  send / list / resend / revoke CRM invitations

Per-user reminder digest
- /notifications/preferences gains a Reminder digest card:
  immediate / daily / weekly / off, with HH:MM, day-of-week,
  IANA timezone fields. Stored in user_profiles.preferences.reminders.

Plumbing
- port-config.ts typed accessors (getPortEmailConfig, getPortDocumensoConfig,
  getPortBrandingConfig, getPortReminderConfig) — settings → env fallback.
- sendEmail accepts optional portId; resolves From/SMTP from settings
  when supplied.
- documensoFetch + downloadSignedPdf accept optional portId; each public
  function takes it through. checkDocumensoHealth() backs the test button.
- crm-invite.service gains listCrmInvites / revokeCrmInvite / resendCrmInvite
  with audit-log entries (revoke_invite, resend_invite added to AuditAction).
- AdminLandingPage card grid + shared SettingsFormCard component to remove
  per-page form boilerplate.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:21:54 +02:00
Matt Ciaccio
f2c57c513e feat(queue): implement form-expiry-check maintenance job
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m0s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Marks pending form_submissions whose expires_at has passed
as 'expired'. Logs the count of rows transitioned each run.
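
In-memory sketch of that transition (the real job is a single UPDATE;
names are assumptions):

```typescript
// Pending submissions whose expires_at has passed flip to 'expired';
// returns the transitioned count, which the job logs each run.
type Submission = { status: "pending" | "completed" | "expired"; expiresAt: Date | null };

function expireOverdue(rows: Submission[], now: Date): number {
  let transitioned = 0;
  for (const row of rows) {
    if (row.status === "pending" && row.expiresAt && row.expiresAt < now) {
      row.status = "expired";
      transitioned++;
    }
  }
  return transitioned;
}
```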

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 21:58:14 +02:00
Matt Ciaccio
999622fd08 feat(companies): show member + yacht counts on list page
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 59s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
listCompanies returns memberCount (active companyMemberships)
and yachtCount (yachts where currentOwnerType=company), each
fetched as a parallel grouped count after the main page query.
Two new badge columns in company-columns render them between
the tax-id and status columns.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 21:57:13 +02:00
Matt Ciaccio
e8d61c91c4 feat(platform): residential module + admin UI + reliability fixes
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m2s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Residential platform
- New schema: residentialClients, residentialInterests (separate from
  marina/yacht clients) with migration 0010
- Service layer with CRUD + audit + sockets + per-port portal toggle
- v1 + public API routes (/api/v1/residential/*, /api/public/residential-inquiries)
- List + detail pages with inline editing for clients and interests
- Per-user residentialAccess toggle on userPortRoles (migration 0011)
- Permission keys: residential_clients, residential_interests
- Sidebar nav + role form integration
- Smoke spec covering page loads, UI create flow, public endpoint

Admin & shared UI
- Admin → Forms (form templates CRUD) with validators + service
- Notification preferences page (in-app + email per type)
- Email composition + accounts list + threads view
- Branded auth shell shared across CRM + portal auth surfaces
- Inline editing extended to yacht/company/interest detail pages
- InlineTagEditor + per-entity tags endpoints (yachts, companies)
- Notes service polymorphic across clients/interests/yachts/companies
- Client list columns: yachtCount + companyCount badges
- Reservation file-download via presigned URL (replaces stale <a href>)

Route handler refactor
- Extracted yachts/companies/berths reservation handlers to sibling
  handlers.ts files (Next.js 15 route.ts only allows specific exports)

Reliability fixes
- apiFetch double-stringify bug fixed across 13 components
  (apiFetch already JSON.stringifies its body; passing a stringified
  body produced double-encoded JSON which failed zod validation)
- SocketProvider gated behind useSyncExternalStore-based mount check
  to avoid useSession() SSR crashes under React 19 + Next 15
- apiFetch falls back to URL-pathname → port-id resolution when the
  Zustand store hasn't hydrated yet (fresh contexts, e2e tests)
- CRM invite flow (schema, service, route, email, dev script)
- Dashboard route → [portSlug]/dashboard/page.tsx + redirect
- Document the dev-server restart-after-migration gotcha in CLAUDE.md

Tests
- 5-case residential smoke spec
- Integration test updates for new service signatures

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 21:54:32 +02:00
Matt Ciaccio
fac8021156 docs: reflect testing infra + Documenso/portal auth conventions in CLAUDE.md
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 59s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
- Quick reference: add commands for every Playwright project + dev tsx helpers
- Conventions: document the Documenso webhook auth pattern (X-Documenso-Secret
  plaintext, not HMAC), the v1.13/2.x response shape normalization layer,
  the email template module location + responsive table layout, and the
  PortalAuthShell pattern that unifies the in-app and email branding
- Environment: document EMAIL_REDIRECT_TO and IMAP_* dev/test-only vars
- New Testing section enumerating the five Playwright projects (setup,
  smoke, exhaustive, destructive, realapi, visual) and what each covers

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 15:48:18 +02:00
Matt Ciaccio
ea8181d108 test(visual): regression baselines for stable list/landing pages
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m7s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
New `visual` project covers six low-volatility screens — portal login,
dashboard, and the four core lists (clients/yachts/berths/invoices) —
with full-page screenshots that diff to a 2% pixel-ratio tolerance.
Animations and the cursor caret are disabled inline so transient
rendering doesn't trigger flaky diffs.

Detail screens (yacht detail, EOI dialog, invoice form steps) are
intentionally deferred until we have stable per-id fixtures so
snapshots don't drift with seed data.

Regenerate with: pnpm exec playwright test --project=visual --update-snapshots

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 15:42:40 +02:00
Matt Ciaccio
65b241805e test(portal): IMAP full-lifecycle activation E2E + dev probe helper
New realapi spec walks the entire portal activation loop over real
network: invite via the admin endpoint → wait for the activation email
to land in the IMAP mailbox → extract the token from the body link →
activate the portal user via the public API → sign in with the new
password.

The match logic deliberately doesn't filter on the TO header — the
combination of EMAIL_REDIRECT_TO rewriting and +addressing made TO
matching brittle. Instead we discriminate by sender (noreply@…),
subject keyword, and body link pattern, which is unique enough to find
exactly the email this test triggered.

Companion script scripts/dev-imap-probe.ts dumps the most recent ~10
messages with from/to/subject/date — useful for debugging when an IMAP
match goes wrong.

Skips when IMAP_HOST / IMAP_USER / IMAP_PASS are absent so the suite
stays portable.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 15:40:28 +02:00
Matt Ciaccio
4a859245b7 test(documenso): real-API E2E spec + 2.x response normalization
The documenso-template pathway was returning 201 with documensoId=null
because Documenso 2.x renamed `id` → `documentId` and recipient `id` →
`recipientId` in its API responses. Our DocumensoDocument interface
still expected the legacy v1.13 shape, so destructuring silently yielded
undefined and the documents row got NULL'd.

- Add normalizeDocument() in documenso-client that reads either field
  name and surfaces the legacy `id` form downstream consumers expect
- Apply normalization at every callsite that returns DocumensoDocument
  (createDocument, generateDocumentFromTemplate, sendDocument, getDocument)
- New realapi Playwright project (opt-in: --project=realapi) targeting
  tests/e2e/realapi/, with 2-min timeout for real-network calls
- New spec: documenso-real-api.spec.ts seeds client/yacht/berth/interest
  via the v1 API, fires generate-and-sign through the documenso-template
  pathway, asserts the response carries a documensoId, then GETs the
  document directly from Documenso to confirm it exists with PENDING
  status and recipients populated

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 15:25:06 +02:00
Matt Ciaccio
4441f1177f feat(portal): branded auth pages + legacy email styling + dev redirect override
- New PortalAuthShell component: blurred Port Nimara overhead background +
  circular logo + white rounded card, used by /portal/login,
  /portal/activate, /portal/reset-password
- New email/templates/portal-auth.ts: table-based, responsive (max-width
  600px / width 100%), matching the existing legacy inquiry templates;
  replaces the inline templates that lived in portal-auth.service
- EMAIL_REDIRECT_TO env override: when set, sendEmail routes every
  outbound message to that address regardless of recipient and tags the
  subject with "[redirected from <original>]". Dev/test safety net only;
  unset in production
- Portal password minimum length 12 → 9 (service + both API routes +
  client-side form)
- Dev helper script scripts/dev-trigger-portal-invite.ts: seeds a portal
  user against the first port-nimara client and uses EMAIL_REDIRECT_TO
  as the stored email so the tester can sign in with the address that
  received the activation mail

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 15:04:21 +02:00
Matt Ciaccio
c4085265ff fix(documenso): align webhook receiver with Documenso v1.13 + 2.x protocol
Documenso authenticates outbound webhooks via the X-Documenso-Secret
header carrying the plaintext secret (no HMAC). The previous receiver
verified an HMAC against a non-existent x-documenso-signature header
and switched on parsed.type, neither of which Documenso emits — so
every real delivery was being silently rejected.

- Read X-Documenso-Secret, compare timing-safe to env secret
- Switch on parsed.event with uppercase normalization for both v1.13
  (DOCUMENT_SIGNED) and 2.x (lowercase-dotted UI labels) wire formats
- Alias DOCUMENT_RECIPIENT_COMPLETED to DOCUMENT_SIGNED (same
  semantics across versions)
- Handle DOCUMENT_OPENED / DOCUMENT_REJECTED / DOCUMENT_CANCELLED in
  addition to the existing DOCUMENT_SIGNED + DOCUMENT_COMPLETED paths
- Bypass session middleware for /api/webhooks/* (signature is the auth)

Verified end-to-end against signatures.letsbe.solutions: real
DOCUMENT_RECIPIENT_COMPLETED + DOCUMENT_COMPLETED deliveries now pass
secret verification, dispatch correctly, and the handler updates
state (or warns gracefully when the documensoId is unknown).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 13:46:48 +02:00
433 changed files with 119831 additions and 3149 deletions

.gitignore

@@ -21,3 +21,9 @@ docker-compose.override.yml
.remember/
.DS_Store
eoi/
# Brainstorming companion mockup files
.superpowers/
# Ad-hoc screenshots / scratch artifacts at repo root
/*.png

CLAUDE.md

@@ -13,6 +13,19 @@ pnpm db:generate # Generate Drizzle migrations
pnpm db:push # Push schema to DB
pnpm db:studio # Drizzle Studio GUI
pnpm db:seed # Seed database (tsx src/lib/db/seed.ts)
# Tests
pnpm exec vitest run # Unit + integration (~3s)
pnpm exec playwright test --project=smoke # Click-through smoke (~10min)
pnpm exec playwright test --project=exhaustive # Full UI exhaustive
pnpm exec playwright test --project=destructive # Archive/delete flows
pnpm exec playwright test --project=realapi # Real Documenso/IMAP (opt-in)
pnpm exec playwright test --project=visual # Pixel-diff baselines
pnpm exec playwright test --project=visual --update-snapshots # Regenerate baselines
# Dev helpers
pnpm tsx scripts/dev-trigger-portal-invite.ts # Send a portal activation email
pnpm tsx scripts/dev-imap-probe.ts # Dump recent IMAP inbox messages
```
## Tech stack
@@ -75,13 +88,42 @@ src/
- **Polymorphic ownership:** Yachts and invoice billing-entities use `<entity>_type` + `<entity>_id` column pairs (`'client' | 'company'`). Resolve owner identity through `src/lib/services/yachts.service.ts` / `eoi-context.ts` rather than reading the columns ad hoc — those services apply the type discriminator.
- **EOI generation:** Two pathways share the same `EoiContext` (`src/lib/services/eoi-context.ts`). Documenso pathway calls the template-generate endpoint via `documenso-payload.ts`; in-app pathway fills the same source PDF (`assets/eoi-template.pdf`) via `src/lib/pdf/fill-eoi-form.ts` (pdf-lib AcroForm). Routed through `generateAndSign(...)` in `src/lib/services/document-templates.ts` with a `pathway` parameter.
- **Merge fields:** Token catalog lives in `src/lib/templates/merge-fields.ts`; the `createTemplateSchema` validator uses `VALID_MERGE_TOKENS` as an allow-list, so unknown tokens are rejected at template creation time.
- **Documenso webhooks:** Documenso (both v1.13 and 2.x) authenticates outbound webhooks by sending the configured secret in plaintext via the `X-Documenso-Secret` header — there is no HMAC. The receiver at `src/app/api/webhooks/documenso/route.ts` does a timing-safe equality check via `verifyDocumensoSecret`. Event names arrive as the uppercase Prisma enum on the wire (`DOCUMENT_SIGNED`, `DOCUMENT_COMPLETED`, etc.) even though the UI displays them as lowercase-dotted. The route also normalizes lowercase-dotted variants for forward-compat.
- **Documenso API responses:** 2.x renamed `id` → `documentId` and recipient `id` → `recipientId`; v1.13 still uses `id`. `src/lib/services/documenso-client.ts` runs every response through `normalizeDocument()` which reads either field name and surfaces the legacy `id` form to downstream consumers.
- **Email templates:** Branded HTML lives in `src/lib/email/templates/`. The portal-auth flow uses `portal-auth.ts` (activation + reset). All templates use the legacy table-based layout with the Port Nimara logo + blurred overhead background, max-width 600px and `width:100%` for responsive shrink. The `<img>` URLs reference `s3.portnimara.com` directly (will move to `/public` later).
- **Portal auth pages:** `/portal/login`, `/portal/activate`, `/portal/reset-password` and the CRM `/login`, `/reset-password`, `/set-password` all wrap their content in `<BrandedAuthShell>` (`src/components/shared/branded-auth-shell.tsx`) which renders the same blurred background + logo + white card the email templates use, so the in-app and email surfaces look unified.
- **Inline editing pattern:** detail pages (clients, yachts, companies, interests, residential clients/interests) use `<InlineEditableField>` (`src/components/shared/inline-editable-field.tsx`) for click-to-edit text/select/textarea fields and `<InlineTagEditor>` (`src/components/shared/inline-tag-editor.tsx`) for tag chips. Each entity exposes a `PUT /api/v1/<entity>/[id]/tags` endpoint backed by a `set<Entity>Tags` service helper that wipes-and-rewrites the join table inside a single transaction. There are no separate "Edit" modal forms on detail pages — the entire overview tab is editable in place.
- **Notes (polymorphic across entity types):** `notes.service.ts` dispatches across `clientNotes`, `interestNotes`, `yachtNotes`, `companyNotes` based on an `entityType` discriminator. `<NotesList entityType="…" />` works for all four. `companyNotes` lacks an `updatedAt` column — the service substitutes `createdAt` so callers get a uniform shape.
- **Route handler exports:** Next.js App Router `route.ts` files only allow specific named exports (`GET|POST|…`). Service-tested handler functions live in sibling `handlers.ts` files (e.g. `src/app/api/v1/yachts/[id]/handlers.ts`) and are imported by the colocated `route.ts` for `withAuth(withPermission(...))` wrapping. Integration tests import from `handlers.ts` directly to bypass auth/permission middleware.
- **Routes:** Multi-tenant via `[portSlug]` dynamic segment. Typed routes enabled.
- **Pre-commit:** Husky + lint-staged runs ESLint fix + Prettier on staged `.ts`/`.tsx` files. The hook also blocks `.env*` files (including `.env.example`) from being committed; pass them via a separate workflow if needed.
## Schema migrations during dev
When you run a `db:push` or apply a migration via `psql` against a running dev server, **restart the dev server afterwards**. Drizzle/postgres.js keeps connection-level prepared statements that can hold stale column lists; a stale pool causes `column X does not exist` errors on pages that touch the migrated table even though the column is present in the DB. Symptom: pages return 500 with `errorMissingColumn`/`42703` after a successful migration. Fix: kill `next dev` and restart it.
## Environment
Copy `.env.example` to `.env` for local dev. See `src/lib/env.ts` for the full schema. Set `SKIP_ENV_VALIDATION=1` to bypass validation (used in Docker build).
Optional dev/test-only env vars (not in `.env.example`):
- `EMAIL_REDIRECT_TO=<address>` — when set, every outbound email is rerouted to this address regardless of the requested recipient and the subject is prefixed with `[redirected from <original>]`. Dev safety net so seeded fake-client emails don't escape; **must be unset in production**.
- `IMAP_HOST` / `IMAP_PORT` / `IMAP_USER` / `IMAP_PASS` — read by `tests/e2e/realapi/portal-imap-activation.spec.ts` to fetch the activation email from a real mailbox during the IMAP round-trip test. The spec skips when any are missing.
## Testing
Six Playwright projects, defined in `playwright.config.ts`:
- `setup` — global setup (seeds users, port, berths, system settings).
- `smoke` — fast click-through over every major flow. Run on every change (~10 min, 125 specs).
- `exhaustive` — deeper UI coverage that takes longer.
- `destructive` — archive/delete/cancel paths against throwaway entities.
- `realapi` — opt-in suite that hits real external services (Documenso send-side + IMAP round-trip). Requires `DOCUMENSO_API_*`, `SMTP_*`, `IMAP_*` env. Cloudflared tunnel needs to be running so Documenso can call the local webhook receiver.
- `visual` — pixel-diff baselines for stable list/landing pages. Snapshots committed under `tests/e2e/visual/snapshots.spec.ts-snapshots/`. Regenerate with `--update-snapshots` after intentional UI changes.
Vitest covers unit + integration with mocked external services (`tests/unit/`, `tests/integration/`).
## Docker
- `Dockerfile` - Production multi-stage build (deps -> build -> runner)


@@ -0,0 +1,199 @@
# Backup and restore runbook
This runbook documents what gets backed up, how often, where it lands, and
the exact commands to restore the system from a cold start. The goal is
that any operator who has the off-site backup credentials can bring the
CRM back up on a clean host without help.
## Scope of a "full backup"
The CRM has three stateful surfaces. All three must be captured for a
restore to be useful.
| Surface | Holds | Risk if missing |
| ------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------- |
| **PostgreSQL** (`port_nimara_crm`) | Every relational record: clients, yachts, companies, interests, reservations, invoices, audit log, GDPR exports, AI usage ledger, Documenso webhook receipts, etc. | Total data loss — site is unrecoverable. |
| **MinIO bucket** (`MINIO_BUCKET`, default `crm-files`) | Receipts, signed contracts, EOI PDFs, GDPR export ZIPs, document attachments. | Files reachable by row references in Postgres become 404s. |
| **`.env` + secrets** | DB password, MinIO keys, Documenso webhook secret, SMTP creds, encryption key (`ENCRYPTION_KEY`). | OCR API keys re-resolve from `system_settings` (encrypted at rest), but **without the original `ENCRYPTION_KEY` they're unreadable**. |
The Redis instance is not backed up. It only holds queue state, rate-limit
counters, and Socket.IO presence — all reconstructable. Stop the workers
during a restore so the queue starts clean.
## Backup schedule
Defaults are tuned for a single-port deployment with O(10k) clients. Bump
on the producing side as scale demands.
| Job | Frequency | Retention | Where |
| ---------------------------------- | -------------------- | ----------------------------- | -------------------------------------------------------------------- |
| `pg_dump` (custom format, gzipped) | Hourly | 7 days hourly + 30 days daily | `${BACKUP_BUCKET}/pg/<host>/<UTC date>/<hour>.dump.gz` |
| MinIO mirror | Hourly (incremental) | 30 days versions | `${BACKUP_BUCKET}/minio/` |
| `.env` snapshot (encrypted) | On change (manual) | Forever | Password manager / secrets vault — **never the same bucket as data** |
The hourly cadence is the right answer for this workload — invoices and
contracts cluster around business hours, and an hour of lost work is the
worst-case data loss window most clients will tolerate. Promote to 15-min
WAL streaming if a customer demands tighter RPO.
## Required environment variables
The scripts below read these. Store them in a CI secret store, not the
host's bash profile.
```
# Source (the running CRM database)
DATABASE_URL=postgresql://crm:<pw>@<host>:<port>/port_nimara_crm
# MinIO (source bucket — the live one)
MINIO_ENDPOINT=minio.letsbe.solutions
MINIO_PORT=443
MINIO_USE_SSL=true
MINIO_ACCESS_KEY=<live key>
MINIO_SECRET_KEY=<live secret>
MINIO_BUCKET=crm-files
# Backup destination (a *separate* MinIO/S3 endpoint or a different bucket
# with no IAM overlap with the live keys)
BACKUP_S3_ENDPOINT=https://s3.eu-west-1.amazonaws.com
BACKUP_S3_REGION=eu-west-1
BACKUP_S3_BUCKET=portnimara-backups-prod
BACKUP_S3_ACCESS_KEY=<dedicated read+write key for this bucket only>
BACKUP_S3_SECRET_KEY=<...>
# Optional: encrypts dumps at rest with a passphrase. Limits the blast
# radius if the backup bucket itself is compromised.
BACKUP_GPG_RECIPIENT=ops@portnimara.com
```
## Provisioning the backup destination
1. Create a dedicated S3-compatible bucket in a **different account** from
the live infra. AWS S3, Backblaze B2, or a separately-credentialed
MinIO instance all work.
2. Apply object-lock or versioning so an attacker who steals the backup
write key still can't permanently delete history.
3. Generate IAM credentials scoped to `s3:PutObject`, `s3:GetObject`,
`s3:ListBucket` on this bucket only. Inject them as
`BACKUP_S3_*` above. Do not reuse the live `MINIO_*` keys.
4. Set a 90-day lifecycle rule that transitions objects older than 30
days to cold storage and deletes them at 90 days. Past 90 days it's
cheaper to restart from a snapshot taken outside the system.
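
Step 4, expressed as an S3 lifecycle configuration (a sketch only: the rule ID is illustrative, and MinIO's ILM config syntax differs slightly):

```json
{
  "Rules": [
    {
      "ID": "pncrm-backup-tiering",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [{ "Days": 30, "StorageClass": "GLACIER" }],
      "Expiration": { "Days": 90 }
    }
  ]
}
```

On AWS, apply it with `aws s3api put-bucket-lifecycle-configuration --bucket "$BACKUP_S3_BUCKET" --lifecycle-configuration file://lifecycle.json`.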
## The scripts
Three scripts in `scripts/backup/`:
- `pg-backup.sh` — runs `pg_dump`, gzips, optionally GPG-encrypts, uploads
- `minio-mirror.sh` — `mc mirror` of the live bucket → backup bucket
- `restore.sh` — interactive restore (DB + MinIO) given a snapshot path
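
The shipped scripts aren't reproduced here, but `pg-backup.sh` is roughly this shape: a hypothetical sketch assuming the `DATABASE_URL` / `BACKUP_*` variables above and an `mc` alias named `backup` for the destination endpoint.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of pg-backup.sh, not the shipped script.
set -euo pipefail

object_key() {
  # Builds the bucket key: pg/<host>/<UTC date>/<hour>.dump.gz
  printf 'pg/%s/%s/%s.dump.gz' "$1" "$(date -u +%F)" "$(date -u +%H)"
}

# Guarded so the sketch is a no-op without a database or the pg/mc tools.
if [ -n "${DATABASE_URL:-}" ] && command -v pg_dump >/dev/null && command -v mc >/dev/null; then
  OUT="/tmp/$(date -u +%H).dump.gz"
  KEY="$(object_key "$(hostname)")"

  pg_dump --format=custom "$DATABASE_URL" | gzip >"$OUT"

  # Optional at-rest encryption when a GPG recipient is configured.
  if [ -n "${BACKUP_GPG_RECIPIENT:-}" ]; then
    gpg --encrypt --recipient "$BACKUP_GPG_RECIPIENT" "$OUT"
    OUT="${OUT}.gpg" KEY="${KEY}.gpg"
  fi

  mc cp "$OUT" "backup/${BACKUP_S3_BUCKET}/${KEY}"
fi
```

The key layout matches the `${BACKUP_BUCKET}/pg/<host>/<UTC date>/<hour>.dump.gz` scheme from the schedule table.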
Make them executable and wire them into cron / GitHub Actions / your
scheduler of choice. Sample crontab on the worker host:
```cron
# Hourly DB dump at minute 7
7 * * * * /opt/pncrm/scripts/backup/pg-backup.sh >> /var/log/pncrm-backup.log 2>&1
# Hourly MinIO mirror at minute 17 (offset so the two don't fight for I/O)
17 * * * * /opt/pncrm/scripts/backup/minio-mirror.sh >> /var/log/pncrm-backup.log 2>&1
# Weekly restore drill (smoke-test to a throwaway DB on Sunday at 03:00)
0 3 * * 0 /opt/pncrm/scripts/backup/restore.sh --drill >> /var/log/pncrm-restore-drill.log 2>&1
```
## Restoring from cold
These steps have been rehearsed against the dev environment; expect them
to take 15–30 minutes for a typical port. **The drill (last cron line
above) ensures the runbook stays correct — if the drill fails, the
real restore will too.**
### 0. Stop everything that writes
```bash
docker compose -f docker-compose.prod.yml stop web worker scheduler
# Leave postgres + minio + redis up; we'll point them at restored data.
```
### 1. Restore PostgreSQL
```bash
# Find the dump you want. Prefer the most recent successful hour.
mc ls "$BACKUP_S3_BUCKET/pg/$(hostname)/" | tail
SNAPSHOT="2026-04-28/14.dump.gz"
# Pull it.
mc cp "$BACKUP_S3_BUCKET/pg/$(hostname)/$SNAPSHOT" /tmp/
# Decrypt if BACKUP_GPG_RECIPIENT was set on the producer side.
gpg --decrypt /tmp/14.dump.gz.gpg > /tmp/14.dump.gz
# Drop & recreate the database. Connect to the maintenance `postgres` DB,
# since Postgres refuses to drop the database your own connection targets.
# pg_restore handles FK ordering (e.g. the 'restrict' FK from
# gdpr_exports.requested_by) on its own.
ADMIN_URL="${DATABASE_URL%/*}/postgres"
psql "$ADMIN_URL" -c 'DROP DATABASE IF EXISTS port_nimara_crm WITH (FORCE);'
psql "$ADMIN_URL" -c 'CREATE DATABASE port_nimara_crm;'
gunzip -c /tmp/14.dump.gz | pg_restore --no-owner --no-privileges \
--dbname "$DATABASE_URL"
```
### 2. Restore MinIO
```bash
# Sync the backup bucket back over the live one. --overwrite handles
# files that were modified between snapshots.
mc mirror --overwrite \
"$BACKUP_S3_BUCKET/minio/" \
"live/$MINIO_BUCKET/"
```
### 3. Restore secrets
The `.env` file is **not** in object storage. Pull it from the password
manager / secrets vault. Verify `ENCRYPTION_KEY` matches the value used
when the database was last running — if it doesn't, rows in
`system_settings` (OCR API keys, etc.) decrypt to garbage and the OCR
"Test connection" button will return an opaque error. There is no
recovery path; the keys must be re-entered through the admin UI.
### 4. Bring services back up
```bash
docker compose -f docker-compose.prod.yml up -d
# Watch the worker logs; expect a flurry of socket reconnections, then quiet.
docker compose -f docker-compose.prod.yml logs -f worker
```
### 5. Verify
Tail through the smoke checklist, in order:
1. **DB up** — `psql "$DATABASE_URL" -c 'SELECT count(*) FROM clients;'`
matches the producer-side count from the snapshot's hour.
2. **MinIO up** — open any client with attachments in the CRM, click a
receipt thumbnail; verify the signed URL serves the file.
3. **Documenso webhooks** — re-trigger one in the Documenso admin and
confirm `audit_logs` records the receipt.
4. **Email** — send a portal invite to a real address.
5. **Realtime** — open two browser windows, edit a client in one, watch
the other update via Socket.IO.
6. **AI usage ledger** — `SELECT count(*) FROM ai_usage_ledger;`
non-empty if AI was being used. Old rows survive but the budget gates
reset alongside the period boundary at month rollover.
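
Checks 1 and 6 are scriptable; a hedged helper (the table list is illustrative, and it no-ops unless `DATABASE_URL` and `psql` are available):

```shell
# Hypothetical row-count probe for the verify checklist. Compare its output
# against the producer-side counts from the snapshot's hour.
row_count_sql() { printf 'SELECT count(*) FROM %s;' "$1"; }

if [ -n "${DATABASE_URL:-}" ] && command -v psql >/dev/null; then
  for t in clients ai_usage_ledger; do
    printf '%s: %s rows\n' "$t" "$(psql "$DATABASE_URL" -tAc "$(row_count_sql "$t")")"
  done
fi
```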
## Drill schedule
The weekly drill (cron line above) runs `restore.sh --drill` against a
throwaway database and a sandbox MinIO bucket. It must produce zero diff
between the restored row counts and the live row counts (modulo the
hour-or-so the drill takes to run).
Failure modes the drill catches before they bite production:
- New tables added without inclusion in `pg_dump`'s `--schema=public` (we
use the default, which captures everything in `public` — but a future
developer adding a `tenant_X` schema will silently lose it).
- MinIO bucket-policy changes that block the backup-side `s3:GetObject`
on certain prefixes.
- GPG passphrase rotation that wasn't propagated to the restore host.
- A `pg_restore` version skew with the producer-side `pg_dump`.


@@ -0,0 +1,186 @@
# Email deliverability runbook
The CRM sends transactional email through three different surfaces. Each
has a different failure mode when it lands in spam. This runbook covers
how to diagnose, fix, and verify each path.
## What email the CRM sends
| Surface | Trigger | Template | Default `from` |
| ----------------------------------------- | -------------------------------------------------------------------------------------- | ----------------------------------------------------------------- | ----------------------------------------------------- |
| Portal activation / password-reset | Admin invites a client to the portal | `src/lib/email/templates/portal-auth.ts` | per-port `email_settings.from_address` or `SMTP_FROM` |
| Inquiry confirmation + sales notification | Public website POSTs to `/api/public/interests` or `/api/public/residential-inquiries` | `inquiry-client-confirmation.ts`, `inquiry-sales-notification.ts` | same |
| GDPR export ready | Staff requests an export with `emailToClient=true` | inline in `gdpr-export.service.ts` | same |
| Documenso reminders | Cadence job fires for an unsigned signer | `documenso/reminders/*` | same |
Documenso _itself_ sends signing requests with its own `from` address —
those don't flow through this codebase. SPF/DKIM for the Documenso
sender is the Documenso operator's problem, not yours.
## DNS records
For every domain that appears in a `from:` header you must publish:
### 1. SPF
A single TXT record at the apex authorizing whichever provider is
sending. Multiple SPF records on the same name **break SPF entirely** —
combine into one.
```
v=spf1 include:_spf.google.com include:amazonses.com -all
```
The `-all` (hardfail) is correct for transactional mail. Switch to `~all`
(softfail) only as a temporary diagnostic when migrating providers.
### 2. DKIM
Each provider publishes its own selector. Common shapes:
- Google Workspace: `google._domainkey` → 2048-bit RSA pubkey (rotate every 12 months).
- Amazon SES: `xxxx._domainkey`, `yyyy._domainkey`, `zzzz._domainkey` (three CNAMEs SES gives you).
- Postmark / Resend / Mailgun: one CNAME per selector.
Verify alignment — the `d=` value in the DKIM signature must match the
`From:` domain (relaxed alignment is fine, strict is overkill).
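
For illustration, a Google Workspace-style selector record (the public key is a placeholder your provider supplies):

```
google._domainkey 3600 IN TXT "v=DKIM1; k=rsa; p=<base64 public key>"
```

Check what's actually published with `dig +short TXT google._domainkey.<from-domain>`; the `d=` value to verify alignment against lives in a real message's `DKIM-Signature` header.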
### 3. DMARC
Start at `p=none` while you build deliverability data, then upgrade.
```
_dmarc 14400 IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@portnimara.com; ruf=mailto:dmarc@portnimara.com; fo=1; adkim=r; aspf=r; pct=100"
```
`rua` (aggregate reports) is the diagnostic feed — set it before the
first send so the first weekly report has data.
### 4. MX (only if you also receive)
The CRM's IMAP probe (`scripts/dev-imap-probe.ts`) and the inbound thread
sync rely on a real mailbox. Whoever runs that mailbox publishes the MX
records — typically Google Workspace or a dedicated provider. Don't add
an MX pointing at the CRM host; it doesn't accept SMTP IN.
## Per-port overrides
Each port can override `from_address`, `from_name`, and SMTP creds via
the admin email-settings page. When set, `getPortEmailConfig()` returns
those values and `sendEmail()` uses them in preference to the global
`SMTP_*` env. **The override domain still needs SPF / DKIM / DMARC** on
its own DNS — without them, every send from that port lands in spam.
When a customer reports "our portal invite didn't arrive":
1. Pull the port's email settings from the admin UI. Check `from_address`.
2. Run `dig TXT <from-domain>` and `dig TXT _dmarc.<from-domain>`.
Confirm SPF includes the SMTP provider's domain and DMARC exists.
3. Send a probe through `mail-tester.com`: paste the address into a
test send, click the score breakdown.
4. Score < 8/10 → fix whatever's flagged before doing anything else in
this runbook.
## Diagnosing a "didn't arrive" report
Order matters — go top-down, stop when one of these is the answer.
### Step 1: Was the send attempted?
```bash
# Tail the worker logs for the recipient address.
docker compose logs worker | grep '<recipient>'
```
You'll see one of three patterns:
- **Nothing**: The job didn't run. Check that BullMQ actually queued it.
  `redis-cli LLEN bull:email:wait` — if non-zero, the worker is dead.
`docker compose logs scheduler | tail` to see why.
- **`Email sent`** with a message-id: The provider accepted it. Move to
Step 2.
- **`SendError`**: Provider rejected. The error string says why
(auth, rate limit, blocked recipient).
### Step 2: Is `EMAIL_REDIRECT_TO` set?
In dev/test we set `EMAIL_REDIRECT_TO=ops@portnimara.com` so seeded fake
clients don't get real email. **It must be unset in production.**
```bash
# On the production host:
docker exec pncrm-web printenv EMAIL_REDIRECT_TO
# Should print nothing.
```
If it's set, every email is going to the redirect target with the
original recipient prefixed in the subject — the customer never sees it.
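The redirect behaviour described above can be sketched as a pure function — an assumption about how `sendEmail()` applies `EMAIL_REDIRECT_TO`, not the actual implementation:

```typescript
// When a redirect target is set, all mail goes to it and the original
// recipient survives only as a subject prefix.
function resolveRecipient(
  original: string,
  redirectTo: string | undefined,
): { to: string; subjectPrefix: string } {
  if (redirectTo) {
    return { to: redirectTo, subjectPrefix: `[to: ${original}] ` };
  }
  return { to: original, subjectPrefix: "" };
}
```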
### Step 3: Did it land but get filtered?
Ask the recipient to check:
- Spam / Junk folder
- Gmail "Promotions" tab
- Outlook "Other" folder (vs Focused)
- The Quarantine console if they're on M365 with anti-spam enabled
If found in a spam folder: the email arrived; the recipient's filter
classified it. SPF/DKIM/DMARC alignment is suspect — re-run the
mail-tester probe from above.
### Step 4: Was the recipient on a suppression list?
Some providers (SES, Postmark) maintain a suppression list — once a
domain bounces from an address, future sends are dropped silently.
```bash
# SES example:
aws ses list-suppressed-destinations --region eu-west-1
```
If the recipient is suppressed, remove them and ask them to retry. The
CRM doesn't track suppression locally; that's the provider's job.
## When migrating SMTP providers
1. Add the new provider's DKIM CNAMEs alongside the old ones.
2. Add the new provider's `include:` to the existing SPF record.
3. Wait 48 hours for DNS to propagate and DMARC reports to confirm both
providers align.
4. Switch `SMTP_*` env to the new provider on a single staging host.
5. Send through the staging host for a week. Watch DMARC reports.
6. Cut production over.
7. Wait two weeks before removing the old provider's DNS — undelivered
bounce reports keep arriving for a while.
## Testing a deliverability fix
There's no automated test for "did this email reach the inbox" — that's a
property of the recipient's filter, which we don't control. The closest
proxy is the realapi suite:
```bash
pnpm exec playwright test --project=realapi
```
It runs `tests/e2e/realapi/portal-imap-activation.spec.ts` which sends a
real portal-invite email through SMTP, then polls the configured IMAP
mailbox for the activation link. If it appears within 30 seconds, the
SMTP→DKIM→DMARC chain is alive end-to-end. If the test times out, work
backwards through this runbook.
The realapi suite needs `SMTP_*` and `IMAP_*` env vars — see the
"Optional dev/test-only env vars" block in `CLAUDE.md`.
## Bounce handling
The CRM doesn't currently process bounces. If you start seeing volume:
- Set up the provider's webhook (SES → SNS → Lambda; Postmark → webhook
URL) to POST bounce events to a new `/api/webhooks/email-bounce` route.
- Persist the bounced address into a `email_suppressions` table.
- Have `sendEmail()` consult that table before each send.
That work isn't in scope yet; this runbook just flags it as the next
deliverability gap.
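The shape of that future suppression check might look like this — hypothetical, following the bullets above; none of it exists in the codebase yet, and the real version would hit the `email_suppressions` table rather than an in-memory map:

```typescript
type SuppressionReason = "bounce" | "complaint";

// Stand-in for the proposed email_suppressions table.
const suppressions = new Map<string, SuppressionReason>();

// Called by the /api/webhooks/email-bounce handler on each provider event.
function recordBounce(address: string, reason: SuppressionReason): void {
  suppressions.set(address.toLowerCase(), reason);
}

// sendEmail() would consult this before each send and skip suppressed recipients.
function isSuppressed(address: string): boolean {
  return suppressions.has(address.toLowerCase());
}
```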


@@ -0,0 +1,56 @@
# Permission Matrix Audit
Scanned 182 route files under `src/app/api/v1/`.
**No violations.** Every internal v1 handler is permission-gated.
**Allow-listed:** 46 handler(s) intentionally skip `withPermission`.
| File | Method | Reason |
| ---------------------------------------------------------------- | ------ | --------------------------------------------------------------------------------- |
| `src/app/api/v1/admin/alerts/run-engine/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/connections/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/errors/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/health/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/route.ts` | PUT | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/test/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/[jobId]/retry/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/[jobId]/route.ts` | DELETE | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/users/options/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/ai/email-draft/[jobId]/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/email-draft/route.ts` | POST | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/interest-score/bulk/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/interest-score/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/alerts/[id]/acknowledge/route.ts` | POST | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/[id]/dismiss/route.ts` | POST | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/count/route.ts` | GET | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/route.ts` | GET | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/berth-reservations/[id]/route.ts` | PATCH | TODO: PATCH should map to reservations:edit (not currently in catalog). |
| `src/app/api/v1/currency/convert/route.ts` | POST | Currency reference data; port-scoped, no PII. |
| `src/app/api/v1/currency/rates/refresh/route.ts` | POST | TODO: gate with admin:manage_settings — currently allow-listed. |
| `src/app/api/v1/currency/rates/route.ts` | GET | Currency reference data; port-scoped, no PII. |
| `src/app/api/v1/custom-fields/[entityId]/route.ts` | GET | TODO: needs custom_fields:\* permission. PUT path internally validated. |
| `src/app/api/v1/custom-fields/[entityId]/route.ts` | PUT | TODO: needs custom_fields:\* permission. PUT path internally validated. |
| `src/app/api/v1/expenses/export/parent-company/route.ts` | POST | Internally gated by isSuperAdmin inside the handler. |
| `src/app/api/v1/me/route.ts` | GET | Self-endpoint — auth is sufficient. |
| `src/app/api/v1/me/route.ts` | PATCH | Self-endpoint — auth is sufficient. |
| `src/app/api/v1/notifications/[notificationId]/route.ts` | PATCH | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/preferences/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/preferences/route.ts` | PUT | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/read-all/route.ts` | POST | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/unread-count/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/saved-views/[id]/route.ts` | PATCH | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/[id]/route.ts` | DELETE | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/route.ts` | GET | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/route.ts` | POST | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/search/recent/route.ts` | GET | Port-scoped search — results filtered by auth context (resources have own perms). |
| `src/app/api/v1/search/route.ts` | GET | Port-scoped search — results filtered by auth context (resources have own perms). |
| `src/app/api/v1/settings/feature-flag/route.ts` | GET | Public read of feature-flag bool — no PII; auth is sufficient. |
| `src/app/api/v1/tags/options/route.ts` | GET | Tags are cross-cutting reference data; port-scoped via auth. |
| `src/app/api/v1/tags/route.ts` | GET | Tags are cross-cutting reference data; port-scoped via auth. |
| `src/app/api/v1/users/me/preferences/route.ts` | GET | User-self preferences — caller is the resource owner. |
| `src/app/api/v1/users/me/preferences/route.ts` | PATCH | User-self preferences — caller is the resource owner. |


@@ -0,0 +1,171 @@
# Country / Phone / Timezone — i18n form polish
**Status:** Agenda — awaiting prioritization (likely Phase B or B.5)
**Date:** 2026-04-28
**Phase:** Cross-cutting; touches every form that captures contact data
## Why
Today every CRM form takes free-text strings for nationality, phone, and timezone. That's fine for a marina with one operator typing it in once, but it leaks operator inconsistencies into reports and breaks any later system that consumes these fields (Documenso prefill, public website inquiry, portal sync, exports). For a multi-port platform that's about to onboard non-Polish-speaking residential clients, the data quality matters.
Three coupled UX upgrades:
1. **Nationality → ISO-3166 country dropdown.** Searchable. Stores ISO alpha-2 code (`'GB'`), displays localized country name.
2. **Phone → country-code dropdown + format-as-you-type.** E.164 storage on the wire, formatted display per country.
3. **Timezone → autofilled from country with override dropdown.** Most countries are single-zone; the few that aren't (US, RU, AU, BR, CA, ID, KZ, MN, MX, CD) get a sub-select. Stores IANA TZ string (`'Europe/Warsaw'`).
## Scope
### In scope
- New shared primitives: `<CountryCombobox>`, `<PhoneInput>`, `<TimezoneCombobox>`
- ISO-3166 country list bundled (no API call); names from `Intl.DisplayNames` with locale fallback to English
- Country → primary IANA timezone map (~250 entries, JSON)
- Phone parsing/validation/formatting via `libphonenumber-js` (server + client)
- Wire into every form that captures contact data:
- `<ClientForm>` (name, nationality, phone)
- `<ResidentialClientDetail>` inline editor (nationality, phone, place_of_residence — country-aware)
- `<CompanyForm>` (incorporation_country)
- `<PortalActivateForm>` (phone)
- public inquiry form (form-template renderer, when phone field present)
- DB migration: store ISO codes (`countries`, `nationality_iso`), E.164 phone (`phone_e164`), IANA timezone (`timezone`)
- Backfill: best-effort parse existing free-text into the new columns; keep originals as `_legacy` for one release cycle
- Display: localized country name in tables/detail pages; phone formatted per country (e.g. `+44 20 7946 0958`); timezone shown as friendly `'London (UTC+1)'` when current
- Tests: unit (parser edge cases), integration (form submit → E.164 storage), smoke (typing + selecting flows)
### Out of scope (deferred)
- Multilingual UI surface (only the country _names_ localize via `Intl.DisplayNames`; rest of the UI stays English for now)
- Subdivision picker (states/provinces) — only top-level country
- Phone number geocoding / carrier lookup
- Address autocomplete (Google Places, etc.)
- Currency localization
- RTL layout
## Library choices
| Concern | Library | Why |
| --------------------------- | -------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Phone input + flag dropdown | `omeralpi/shadcn-phone-input` | Built on shadcn-ui's `Input` primitive (zero styling friction with our component library), wraps `libphonenumber-js`, ships with country dropdown + format-as-you-type. Small bundle. |
| Phone parsing/validation | `libphonenumber-js` | Based on Google's libphonenumber; used by every popular React phone input. Server-side validation in zod. |
| Country list | Bundled JSON of ISO-3166 alpha-2 codes + 3-letter codes + display names (English baseline) | No need for the heavier `country-state-city` databases — we don't need cities or states yet. |
| Country → timezone | Hand-curated `country-timezones.json` (250 entries, ~10kb) sourced from `country-tz` or moment-timezone's data | Static, no network call. For multi-zone countries, expose a sub-select. |
| Timezone formatting | `Intl.DateTimeFormat` (built-in) | Browser API; renders `'Europe/Warsaw (UTC+1)'`-style labels. |
| Timezone list | `Intl.supportedValuesOf('timeZone')` (built-in, ~600 entries) | Used as the override dropdown when a user wants a non-primary zone. |
Bundle impact: `libphonenumber-js` mobile build is ~80 KB gz; `shadcn-phone-input` is ~5 KB; country/timezone JSONs ~30 KB. All client-side, lazy-loaded on first form render via `next/dynamic`.
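The country → timezone autofill described above reduces to a small lookup. A sketch with three illustrative entries standing in for the ~250-entry JSON (map contents are examples, not the real data file):

```typescript
// Primary IANA zone(s) per ISO alpha-2 code; multi-zone countries list several.
const COUNTRY_TZ: Record<string, string[]> = {
  PL: ["Europe/Warsaw"],
  GB: ["Europe/London"],
  US: ["America/New_York", "America/Chicago", "America/Denver", "America/Los_Angeles"],
};

// Returns the autofill zone plus whether the multi-zone sub-select should appear.
function primaryTimezone(iso: string): { tz: string | null; multiZone: boolean } {
  const zones = COUNTRY_TZ[iso] ?? [];
  return { tz: zones[0] ?? null, multiZone: zones.length > 1 };
}
```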
## Schema deltas
```sql
-- clients
ALTER TABLE clients ADD COLUMN nationality_iso text; -- 'GB'
ALTER TABLE clients ADD COLUMN timezone text; -- 'Europe/London'
-- existing 'nationality' free-text column stays for a release; new code reads ISO
-- client_contacts (or wherever phone lives)
ALTER TABLE client_contacts ADD COLUMN value_e164 text; -- '+442079460958'
ALTER TABLE client_contacts ADD COLUMN value_country text; -- 'GB' (where the number was parsed against)
-- existing 'value' stays as the human-displayable formatted form
-- residential_clients — same pattern
ALTER TABLE residential_clients ADD COLUMN nationality_iso text;
ALTER TABLE residential_clients ADD COLUMN timezone text;
ALTER TABLE residential_clients ADD COLUMN phone_e164 text;
ALTER TABLE residential_clients ADD COLUMN phone_country text;
-- companies
ALTER TABLE companies ADD COLUMN incorporation_country_iso text;
```
Indexes: `idx_clients_nationality_iso`, `idx_clients_timezone` (cheap; powers analytics filters later).
## Component primitives
```tsx
<CountryCombobox
  value={iso}                        // 'GB' | undefined
  onChange={(iso) => { /* store ISO code */ }}
  locale="en"                        // for name lookup; default to navigator.language
  variant="default"                  // 'default' = name | 'compact' = icon-only flag
/>
<PhoneInput
  value={e164}                       // '+442079460958'
  onChange={({ e164, country }) => { /* store both */ }}
  defaultCountry="GB"                // pre-selects the dropdown
  required={false}
/>
<TimezoneCombobox
  value={iana}                       // 'Europe/London'
  onChange={(iana) => { /* store IANA string */ }}
  countryHint="GB"                   // when set, narrows the dropdown to matching zones first
/>
```
All three are shadcn-styled, keyboard-accessible, support form integration with react-hook-form + zod.
## Validators
```ts
// src/lib/validators/contact.ts
import { z } from 'zod';
import { isValidPhoneNumber } from 'libphonenumber-js';
// ISO_COUNTRIES: Set<string> built from the bundled country list (PR 1)
export const phoneE164Schema = z
.string()
.refine((v) => isValidPhoneNumber(v), 'Invalid phone number');
export const isoCountrySchema = z
.string()
.length(2)
.toUpperCase()
.refine((c) => ISO_COUNTRIES.has(c), 'Unknown country');
export const ianaTimezoneSchema = z
.string()
.refine((tz) => Intl.supportedValuesOf('timeZone').includes(tz), 'Unknown timezone');
```
## Backfill plan
A migration script (`scripts/backfill-iso-and-e164.ts`) that:
1. For each client/residential_client, attempt `libphonenumber-js` `parsePhoneNumber(rawPhone, { defaultCountry: 'PL' })` → if valid, write `phone_e164` + `phone_country`.
2. For each free-text `nationality`, fuzzy-match against the country name list (exact match first, then Levenshtein ≤2). Write `nationality_iso` if confident.
3. For each timezone, exact-match against IANA list. Otherwise leave null and let user fill it.
4. Log unparseable rows to `backfill-iso-report.csv` for manual review.
Run on staging first; require dry-run flag.
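Step 2 of the backfill (exact match first, then Levenshtein ≤ 2) can be sketched like this — an illustrative version; the real script and its country-name inputs may differ:

```typescript
// Classic dynamic-programming edit distance.
function levenshtein(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) => [i, ...Array(b.length).fill(0)]);
  for (let j = 1; j <= b.length; j++) dp[0][j] = j;
  for (let i = 1; i <= a.length; i++)
    for (let j = 1; j <= b.length; j++)
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                      // deletion
        dp[i][j - 1] + 1,                                      // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1),    // substitution
      );
  return dp[a.length][b.length];
}

// names: lowercase English country name -> ISO alpha-2 code.
// Returns the ISO code only when confident (exact, or edit distance <= 2).
function matchCountry(raw: string, names: Map<string, string>): string | null {
  const needle = raw.trim().toLowerCase();
  const exact = names.get(needle);
  if (exact) return exact;
  let best: { iso: string; d: number } | null = null;
  for (const [name, iso] of names) {
    const d = levenshtein(needle, name);
    if (d <= 2 && (best === null || d < best.d)) best = { iso, d };
  }
  return best?.iso ?? null;
}
```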
## Build sequence
| # | PR | Effort | Depends on |
| --- | ------------------------------------------------------------ | ------ | ---------- |
| 1 | Country list JSON + ISO sets + `<CountryCombobox>` primitive | 0.5d | — |
| 2 | `libphonenumber-js` integration + `<PhoneInput>` primitive | 1d | — |
| 3 | Country → timezone JSON + `<TimezoneCombobox>` primitive | 0.5d | 1 |
| 4 | Schema deltas + drizzle migrations + zod validators | 0.5d | — |
| 5 | Wire into ClientForm + ClientDetail inline editors | 1d | 1, 2, 3, 4 |
| 6 | Wire into ResidentialClientDetail | 0.5d | 5 |
| 7 | Wire into CompanyForm | 0.5d | 1 |
| 8 | Public inquiry form template renderer support | 0.5d | 2 |
| 9 | Backfill script + dry-run runbook | 1d | 4 |
| 10  | Smoke + integration tests                                    | 1d     | 5–9        |
Total: ~7 dev days. Self-contained; no external dependencies on Phase B (analytics/alerts).
## Risk register
| Risk | Mitigation |
| --------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- |
| Bundle bloat from libphonenumber data | Use the `mobile` metadata build, lazy-import via `next/dynamic` |
| Existing free-text data is too messy to backfill | Keep the legacy column for one release; expose a "needs review" badge in admin |
| Multi-zone country UX confusion | Sub-select only appears when country is multi-zone; otherwise zone is hidden behind "Override" |
| Public inquiry form breaks if phone is required and user can't find their country | Default to PL, search by country name and dial code |
## Open questions for the user
- Which port's locale should drive the _default_ country in `<PhoneInput>` (Poland for now, or detect from browser)?
- Should existing free-text `nationality` field be removed once backfilled, or kept indefinitely as a fallback?
- Is there an appetite for adding the same treatment to subdivision (state/region/voivodship) selectors, or strictly country-level for now?


@@ -0,0 +1,775 @@
# Documents Hub, Reservation Agreements, and Visual Polish (Phase A)
**Status:** Draft — awaiting final review
**Date:** 2026-04-28
**Phase:** A of D (B = Insights & Alerts; C = Website integration; D = Pre-prod ops)
## Overview
Phase A delivers a unified Documents Hub that tracks every signature-based document (EOI, Reservation Agreements, NDAs, ad-hoc uploads), generalises the existing single-purpose EOI dialog into a multi-format create-document wizard, builds the missing CRM-side reservation detail page with an end-to-end agreement workflow, polishes the reminder framework so non-EOI docs auto-remind correctly, and applies a system-wide visual upgrade to the polished-SaaS aesthetic the project already has tokens for.
The project already ships a usable CRM with auth, multi-tenancy, full client/yacht/company/interest/berth/reservation data model, an EOI dual-path (Documenso template + in-app PDF), socket-driven real-time updates, and 130 smoke specs. What's missing for the next release: a single place to see what documents need signing and chase the people who haven't signed.
## Scope boundaries
### In scope (this spec)
- New `/[port]/documents` hub page replacing the existing list
- New `/[port]/documents/[id]` document detail page
- Generalised create-document wizard supporting four template formats (HTML, PDF AcroForm fillable, PDF overlay-positioned, Documenso-rendered) plus ad-hoc PDF upload
- New `/[port]/berth-reservations/[id]` reservation detail page with agreement-generation flow
- Reservation Agreement as a first-class document type with default template seeded
- Email composer extended with attachments and a System-vs-User From selector (admin-gated)
- Reminder framework: per-template cadence, per-doc override, per-doc disable, per-signer manual reminders
- Documenso version-aware abstraction layer covering field placement and document voiding across v1.13.1 and v2.x
- System-wide visual polish: shadow scale, gradient layer, animation tokens, primitive components (`<StatusPill>`, `<KPITile>`, `<EmptyState>`, polished `<PageHeader>`), applied across all list and detail pages
- Mobile-responsive sweep across every page touched
- Comprehensive test coverage: unit, integration, smoke, exhaustive click-through, real-API round-trips, visual baseline regeneration
### Explicitly out of scope (deferred to later phases)
- Analytics dashboard, alert framework, interests-by-berth view, expense duplicate detection (Phase B)
- Website-side integration: `/api/form/[token]/data` prefill endpoint, `/api/webhook/document-signed` callback receiver, public-endpoint shape compat (Phase C)
- NocoDB to Postgres data migration, email deliverability (DKIM/SPF/DMARC), Sentry error reporting, audit log retention, performance baseline at 5k clients / 50k interests, backup/restore automation, production deploy readiness (Phase D)
- Native in-CRM PDF field-placement editor (deferred until upload-path pain emerges; Phase A v1 ships with auto-placed footer signature fields and a "Customize fields in Documenso" link)
- Word `.docx` template upload (deferred; PDF prioritized because Word adds LibreOffice/CloudConvert toolchain dependency without saving the field-placement step)
- Per-interest "silence all reminders" toggle (was implicit in old `interests.reminderEnabled` gating which this spec drops; can be re-added as a bulk action if anyone misses it)
## Information architecture
### URL surface
```
/[port]/documents hub (replaces existing list)
/[port]/documents/[id] document detail (new)
/[port]/documents/new create-document wizard (new)
/[port]/berth-reservations/[id] reservation detail (new)
/[port]/admin/templates existing; extended for new template formats
/[port]/admin/email existing; one new toggle
```
### Schema deltas
```
documents — additions:
+ reservation_id text null references berth_reservations(id)
+ reminders_disabled boolean default false
+ reminder_cadence_override int null
document_templates — additions:
+ reminder_cadence_days int null (null = no auto-reminders)
+ template_format text default 'html' ('html'|'pdf_form'|'pdf_overlay'|'documenso_render')
+ source_file_id text null references files(id)
+ documenso_template_id text null
+ field_mapping jsonb default '{}' (pdf_form: { acroFieldName: mergeToken })
+ overlay_positions jsonb default '[]' (pdf_overlay: [{token, page, x, y, fontSize}])
document_templates.body_html — relax to nullable (only required when template_format='html')
document_watchers — new table:
document_id text not null references documents(id) on delete cascade
user_id text not null references users(id)
added_by text not null references users(id)
added_at timestamptz default now()
primary key (document_id, user_id)
documents indexes — additions:
+ idx_docs_reservation on (reservation_id)
+ idx_docs_status_port on (port_id, status) — powers tab counts cheaply
document_watchers indexes:
+ idx_doc_watchers_doc on (document_id)
+ idx_doc_watchers_user on (user_id)
documents.documentType enum — already includes 'reservation_agreement'; no migration needed
documents.status enum — already accepts 'expired'; no migration needed
documentSigners.status enum — pending|signed|declined; no migration needed
```
Backfill (one statement, safe to run in same migration):
```sql
UPDATE document_templates SET reminder_cadence_days = 1 WHERE template_type = 'eoi';
```
This preserves the existing 1-day-effective reminder cadence for existing EOI templates. Admins can edit per-template later.
After running migration on a dev/staging server, restart `next dev` to flush postgres.js prepared-statement cache (existing project convention).
### Polymorphic ownership pattern
Documents already use the multi-FK pattern (`interest_id`, `client_id`, `yacht_id`, `company_id` as separate nullable columns). Adding `reservation_id` matches this. No conversion to polymorphic discriminator columns despite yachts and invoices using that pattern; staying consistent with the existing documents shape avoids a destructive migration.
### Service-layer changes
- `documents.service.ts`:
- `createFromWizard(portId, data, meta)` — dispatches across template/upload paths
- `createFromUpload(portId, data, meta)` — new upload-driven path; calls Documenso `createDocument`, stores file in MinIO via `files` service, mirrors to `documents` + `documentSigners`, optionally calls `sendDocument` if `sendImmediately`
- `cancelDocument(documentId, portId, meta)` — user-initiated cancel; calls Documenso void, updates DB status, logs event
- `composeSignedDocEmail(documentId, portId)` — returns prefilled `{ to, cc, subject, body, attachments, defaultSenderType }` for the composer
- `getDocumentDetail(id, portId)` — single-roundtrip aggregator returning doc + signers + events + watchers + linked-entity summary
- `document-templates.ts`:
- `generateAndSign` extended for new `template_format` values
- `fillAcroForm(sourceFile, fieldMapping, mergeContext)` — pdf-lib AcroForm fill
- `drawOverlay(sourceFile, overlayPositions, mergeContext)` — pdf-lib text-draw at positions
- Documenso-render path uses existing `generateDocumentFromTemplate`
- `documenso-client.ts`:
- `placeFields(docId, fields, portId?)` — version-aware bulk field placement
- `placeDefaultSignatureFields(docId, recipientIds, portId?)` — auto-position one SIGNATURE per recipient at footer
- `voidDocument(docId, portId?)` — version-aware doc void/delete
- Coordinate normalization helpers (caller passes percent 0-100; converted to pixels for v1 using cached page dimensions)
- `document-reminders.ts`:
- `sendReminderIfAllowed(documentId, portId, options?)` — extended signature with optional `signerId` and `auto: boolean`
- `processReminderQueue(portId)` — query rewritten around `documents.reminder_cadence_override ?? template.reminder_cadence_days`; drops `interests.reminderEnabled` gating
- `notifications.service.ts`:
- `notifyDocumentEvent(docId, eventType)` — fans out to creator + entity-assignee + watchers; existing socket events keep firing
- New: `reservation-agreement-context.ts`:
- `buildReservationAgreementContext(reservationId, portId)` — joins reservation -> client + yacht + berth -> port; returns context shape for template merge
- `email-compose.service.ts`:
- Validator extended: `{ senderType: 'system'|'user', accountId? (when user), attachments[] }`
- System path: calls `lib/email/index.ts → sendEmail()` with `portId` + attachments; logs `documentEvents` row `signed_doc_emailed`; skips `email_messages`/`email_threads` writes
- User path: existing flow, with attachments resolution from `files` table
- Port-isolation: cross-port `fileId` returns 403
- `lib/email/index.ts`:
- `SendEmailOptions.attachments?: Array<{ fileId, filename? }>` — fetches files from MinIO, passes to nodemailer
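The cadence resolution `processReminderQueue` is rewritten around can be stated as one pure function — a sketch using the field names from the schema deltas (the coalescing expression itself is from the spec; the wrapper is illustrative):

```typescript
// null result = no auto-reminders for this document.
function effectiveCadenceDays(
  doc: { remindersDisabled: boolean; reminderCadenceOverride: number | null },
  template: { reminderCadenceDays: number | null },
): number | null {
  if (doc.remindersDisabled) return null; // per-doc disable beats everything
  return doc.reminderCadenceOverride ?? template.reminderCadenceDays;
}
```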
## Documents hub page
Replaces existing `/[port]/documents` list.
### Layout
```
[ Header strip: title, KPI sub-line, "+ New document" button ]
[ Tabs: All | Awaiting them (count) | Awaiting me (count) | Completed | Expired ]
[ Search · Type · Status · Sent · Watcher filter chips · saved-view selector · overflow ]
[ Table:
checkbox | Document | Type pill | Subject pill | Status (X/Y signed + dot) | Sent
▾ expand row inline to show signers + watchers strip
]
[ Sticky bulk-action bar appears when ≥1 row checked:
"N selected" | Remind unsigned | Cancel | Export | pagination
]
```
### Tab queries
- All — every document in port
- Awaiting them — `status IN ('sent','partially_signed')` AND has pending signer != current user
- Awaiting me — at least one `documentSigners` row matching `signer_email = current user email` AND `status = 'pending'`
- Completed — `status IN ('completed','signed')`
- Expired — `status = 'expired'` OR (`status IN ('sent','partially_signed')` AND `expires_at < now()`)
Counts run cheap thanks to `idx_docs_status_port`.
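The Expired tab — the only one combining a status with a time condition — restated as a plain filter (a sketch; column names per the spec, the real query lives in SQL):

```typescript
type DocRow = { status: string; expiresAt: Date | null };

function inExpiredTab(d: DocRow, now: Date): boolean {
  if (d.status === "expired") return true;
  // Still in-flight but past its expiry timestamp.
  return (
    ["sent", "partially_signed"].includes(d.status) &&
    d.expiresAt !== null &&
    d.expiresAt < now
  );
}
```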
### Filters and saved views
- Search: fuzzy match on title, subject name, signer email
- Type: multi-select doc types
- Status: multi-select status enum
- Sent: date-range chips (Today, 7d, 30d, custom)
- Watcher: filter by watching user
- "Signature-based only" chip defaults to ON; toggle off to see non-signed docs (welcome letters etc.) as well, rendered with a "Delivered" pill
- Saved-view integration: filter combos save to existing `saved_views` table
### Row anatomy
- Collapsed: name (links to detail), type pill (colored per type), subject pill (links to entity), status indicator (X/Y signed with progress dot), sent age
- Expanded: per-signer rows with email, status pill, sent timestamp, signed timestamp, `[Remind]` and overflow `[...]` (resend invite, copy signing link, skip — skip is UI-only flag, not implemented in v1)
- Watchers strip at bottom of expansion: chips + `+ Add watcher` autocomplete
- Hover: row gets soft brand-soft gradient bg
### Real-time
Subscribes to existing `documents.service.ts`-emitted socket events: `document:created`, `document:updated`, `document:deleted`, `document:sent`, `document:completed`, `document:expired`, `document:cancelled`, `document:rejected`, `document:signer:signed`, `document:signer:opened`. All already fire today.
### Empty states
- No docs yet: illustration + 1-line explanation + `[+ New document]` CTA
- Filtered empty: "No docs match these filters. Clear filters?"
### Mobile (< 768px)
- Tabs collapse into `<select>`
- Filters collapse behind `[Filters]` button into a sheet
- Rows stack as cards: title + status + age, expand to show signers
- "+ New document" floats as FAB bottom-right
## Document detail page
New `/[port]/documents/[id]` page. No detail page exists today.
### Layout
```
[ Breadcrumb: All documents ]
[ Header strip with gradient: title (editable inline), type pill, status pill, subtitle (subject link, creator, age) ]
[ Action bar — context-aware ]
[ Two-column body:
Left (2fr):
Signers panel (vertical list, replaces existing horizontal SigningProgress)
Linked entity card
Right (1fr):
Watchers panel (chips + add)
Activity timeline (from documentEvents)
Notes (auto-saving editable text)
Preview (PDF; tabbed Original/Signed when completed)
]
```
### Action bar by status
- `draft` → `[Send for signing]` `[Edit signers]` `[Delete]`
- `sent | partially_signed` → `[Send reminder to all]` `[Resend invite]` `[Cancel]`
- `completed` → `[Download signed PDF]` `[Email signed PDF to all signatories]`
- `cancelled | rejected | expired` → `[Duplicate]`
- Always `[...]` overflow: Duplicate, Move to other entity, View Documenso URL, Audit log
### Signers panel (vertical, replaces horizontal stepper)
Per-row:
- Numbered status circle (pending grey, signed green, declined red)
- Name, email, role
- Sent age, last-reminded age, signed timestamp
- `[Remind]` button — disabled with countdown if cooldown active (24h-or-cadence) for auto mode; bypassed in manual mode
- `[Copy signing link]` — copies `signingUrl` (hosted Documenso); overflow offers "Copy embed link" if `embeddedUrl` present (used by website embed at `/sign/[type]/[token]`)
- `[...]` overflow: Resend invite, View signing history, Replace email (draft only)
- Sequential mode: only current pending signer's `[Remind]` active; others greyed with tooltip
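The `[Remind]` cooldown (24h-or-cadence for auto reminders; manual reminders pass `auto: false` and bypass it) can be sketched as follows — names are illustrative, not the service's actual signature:

```typescript
function remindAllowed(
  lastRemindedAt: Date | null,
  cadenceDays: number | null,
  now: Date,
  auto: boolean,
): boolean {
  if (!auto) return true;           // manual remind bypasses the cooldown
  if (!lastRemindedAt) return true; // never reminded yet
  const cooldownMs = (cadenceDays ?? 1) * 24 * 3600 * 1000; // cadence, else 24h
  return now.getTime() - lastRemindedAt.getTime() >= cooldownMs;
}
```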
### Send-signed-PDF email flow
Action visible only when `status='completed' AND signedFileId IS NOT NULL`.
Click opens email composer drawer prefilled:
- From: dropdown defaulting to System (port-config noreply identity); Personal accounts available only when port admin enables `email.allowPersonalAccountSends`
- To: union of `documentSigners.signerEmail` for the doc
- Cc: empty; "Cc watchers" toggle adds users from `document_watchers`
- Subject: `"Signed {document type} — {document title}"`
- Body: from `signed_doc_completion` per-port template (new template type; default seeded for new ports)
- Attachments: signed PDF auto-attached from `documents.signedFileId` (chip with filename + size; removable)
Send dispatch:
- System path: `lib/email/index.ts → sendEmail()` with portId + attachments; writes `documentEvents` row; skips email_messages/threads writes (no IMAP sync expected)
- User path: `email-compose.service.ts` existing flow; writes email_messages + thread; subject to `allowPersonalAccountSends` gate (server-side enforces 403 on user senderType when toggle off)
### Backend additions
- `POST /api/v1/documents/[id]/cancel` — calls `cancelDocument` service; service calls Documenso void via new client function
- `POST /api/v1/documents/[id]/remind` — accepts optional `{ signerId }`; passes `auto: false` to service
- `GET /api/v1/documents/[id]/watchers` — list
- `POST /api/v1/documents/[id]/watchers` — add `{ userId }`
- `DELETE /api/v1/documents/[id]/watchers/[userId]` — remove
- `POST /api/v1/documents/[id]/compose-completion-email` — returns prefilled draft
## Create-document wizard
Replaces `<EoiGenerateDialog>`. Single drawer/dialog, three steps.
### Step 1 — Type and source
```
Render: ● Generate the PDF here (using template format below)
○ Use a Documenso-stored template (Documenso renders + signs)
Format (when "Generate the PDF here" selected):
● HTML (write inline)
○ PDF (AcroForm fillable upload)
○ PDF (overlay positioning)
Template: [ pick from port's templates of selected format ]
OR
Upload PDF: [ drop or pick file; preview renders inline ]
Document type: [ auto-derived from template, or picked from DOCUMENT_TYPES enum ]
```
Signing destination is always Documenso. The "Render in CRM" vs "Render in Documenso" axis is about PDF generation only.
### Step 2 — Recipients
```
Attached to: [ Interest #142 — Smith family Change ]
↑ pre-filled if launched from a detail page
Signers: (hidden for documenso-render path; signers embedded in template)
① name email role [✕]
② name email role [✕]
[+ Add signer] (autocomplete from clients/companies/users; or manual entry)
Drag to reorder; signing-order assigned by row position
Signing mode: ● Sequential ○ Parallel
Watchers (optional): [chips] [+ Add watcher] (CRM users)
Reminder cadence:
● Use template default (every 7 days)
○ Override: [_____] days
○ Disable for this document
[ For upload path only ]
☑ Auto-place signature fields at footer (default; refine later in Documenso)
```
### Step 3 — Review and send
```
Title: [ EOI — Smith family ____________ ] (editable; default rendered from merge tokens)
Notes (internal): [_____________]
Preview: [ rendered PDF inline · 4 pages · scrollable ]
Signing-order banner (multi-signer in-app/upload only): "Sequential — Carol must sign before Bob" [Switch to parallel]
[← Back] [Save as draft] [Send →]
```
Save as draft → status='draft'; `[Send for signing]` available later from detail page. Send → calls Documenso, status='sent', socket event fires.
### Documenso version-aware field placement
For upload path, `placeDefaultSignatureFields` auto-positions one SIGNATURE per recipient at last-page footer (staggered to avoid overlap). User can refine in Documenso via "Customize fields in Documenso" link on detail page.
`placeFields` and `placeDefaultSignatureFields` in `documenso-client.ts` hide v1/v2 differences:
- v1: `POST /api/v1/documents/{id}/fields` per field; pixel coordinates; requires page dimension lookup
- v2: `POST /api/v2/envelope/field/create-many` bulk; percentage 0-100 coordinates; rich `fieldMeta`
- Caller passes percentage; abstraction converts for v1 using cached page dimensions
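The percentage-first convention can be sketched as follows; `PageDims`, `PctField`, `toV1Pixels`, and `defaultFooterFields` are illustrative names, not the actual `documenso-client.ts` exports:

```typescript
// Callers always pass v2-style percentage coordinates (0-100); the v1 path
// converts to pixels using the cached page dimensions.
interface PageDims { widthPx: number; heightPx: number }
interface PctField { page: number; xPct: number; yPct: number; wPct: number; hPct: number }

function toV1Pixels(f: PctField, dims: PageDims) {
  return {
    page: f.page,
    x: Math.round((f.xPct / 100) * dims.widthPx),
    y: Math.round((f.yPct / 100) * dims.heightPx),
    width: Math.round((f.wPct / 100) * dims.widthPx),
    height: Math.round((f.hPct / 100) * dims.heightPx),
  };
}

// Default footer placement: one SIGNATURE per recipient on the last page,
// staggered horizontally so fields don't overlap (exact margins are a sketch).
function defaultFooterFields(recipientCount: number, lastPage: number): PctField[] {
  const wPct = Math.min(25, 90 / recipientCount); // shrink when many signers
  return Array.from({ length: recipientCount }, (_, i) => ({
    page: lastPage,
    xPct: 5 + i * (wPct + 2), // 2% gutter between fields
    yPct: 88,                 // footer band near page bottom
    wPct,
    hPct: 6,
  }));
}
```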
### `createDocumentSchema` extension
```ts
export const createDocumentSchema = z.object({
source: z.enum(['template', 'upload']),
templateId: z.string().uuid().optional(),
uploadedFileId: z.string().uuid().optional(),
documentType: z.enum(DOCUMENT_TYPES),
title: z.string().min(1).max(200),
notes: z.string().optional(),
// Subject (exactly one required)
interestId: z.string().uuid().optional(),
reservationId: z.string().uuid().optional(),
clientId: z.string().uuid().optional(),
companyId: z.string().uuid().optional(),
yachtId: z.string().uuid().optional(),
// Signers (required when render=in-app or source=upload)
signers: z.array(z.object({
signerName: z.string().min(1),
signerEmail: z.string().email(),
signerRole: z.enum(['client', 'sales', 'approver', 'developer', 'other']),
signingOrder: z.number().int().min(1),
})).optional(),
signingMode: z.enum(['sequential', 'parallel']).default('sequential'),
pathway: z.enum(['documenso-template', 'inapp', 'upload']).optional(),
watchers: z.array(z.string().uuid()).optional(),
reminderCadenceOverride: z.number().int().min(1).max(365).nullable().optional(),
remindersDisabled: z.boolean().default(false),
autoPlaceFields: z.boolean().default(true),
sendImmediately: z.boolean().default(true),
}).refine(
  (d) =>
    [d.interestId, d.reservationId, d.clientId, d.companyId, d.yachtId]
      .filter(Boolean).length === 1,
  { message: 'Exactly one subject FK is required' },
);
```
## Template formats
### Authoring paths
| Format | Authoring | Merge fields | Best for |
| ---------------------------- | ------------------------------------------------------------------------------------------- | --------------------------------------------------- | ------------------------------------------------ |
| HTML (existing) | Inline rich-text editor with merge tokens | Server-side substitution, rendered to PDF via pdfme | Welcome letters, acknowledgments, correspondence |
| PDF (AcroForm fillable) | Admin uploads fillable PDF; UI scans AcroForm field names; admin maps each to a merge token | pdf-lib fills form at gen time | EOI, Reservation Agreement, NDA |
| PDF (overlay positioning) | Admin uploads any PDF; UI specifies merge token positions per page+x+y+fontSize | pdf-lib draws text over PDF at positions | Quick wins where preparing AcroForm is overkill |
| Documenso template reference | Admin enters Documenso template ID + label | None in CRM; Documenso owns it | Documenso-rendered signing flows |
### Generator dispatch
```ts
switch (template.template_format) {
  case 'html': return generatePdf(template.body_html, mergeContext);
  case 'pdf_form': return fillAcroForm(template.source_file_id, template.field_mapping, mergeContext);
  case 'pdf_overlay': return drawOverlay(template.source_file_id, template.overlay_positions, mergeContext);
  case 'documenso_render': return documenso.generateDocumentFromTemplate(template.documenso_template_id, ...);
}
```
All four formats end at Documenso for signing — only PDF generation location differs. Non-signature templates (welcome letters etc.) skip the upload-to-Documenso step entirely; they render to PDF then get emailed.
### Admin template editor extension
Format picker added to `/admin/templates` editor:
- For PDF (AcroForm): file upload field, then two-column mapping UI (AcroForm field names ↔ merge tokens autocomplete from existing `MERGE_FIELDS` catalog)
- For PDF (overlay): file upload, then per-token form with page/x/y/fontSize inputs (visual placement editor deferred)
- For Documenso template: single text input + Test connection button calling `getDocumensoTemplate`
- For HTML: existing inline editor unchanged
### Word (.docx) deferred
Reasons: LibreOffice headless adds significant install/memory/security surface; CloudConvert adds paid dependency and third-party data exposure; `docxtemplater` merge syntax incompatible with existing `{{token}}` convention; field placement still needs PDF flow afterwards. If marinas push back, the feasible path is `.docx → server-side conversion → PDF → existing AcroForm/overlay flow`. Not worth the engineering until requested.
## Reservation agreements as a doc type
### What differs from EOI's pattern
| Aspect | EOI | Reservation Agreement |
| --------------------- | ----------------------------- | ------------------------------------------------------------------------------------------ |
| Subject FK | `interestId` | `reservationId` |
| Default template | Documenso EOI per port | Documenso reservation_agreement per port (seeded) |
| Default signers | client + sales/approver | client + port admin |
| Trigger | Manual on interest detail | Manual on reservation detail |
| Lifecycle integration | None | Active reservations without an agreement get flagged in dashboard alert |
| Final-PDF storage | `documents.signedFileId` only | `documents.signedFileId` AND mirrored to `berth_reservations.contractFileId` on completion |
### New CRM-side reservation detail page
`/[port]/berth-reservations/[id]` doesn't exist today (only the portal's `/portal/my-reservations`). Phase A builds it.
Layout:
```
[ Header: "Reservation #88 · M/Y Tate" status pill subtitle: berth, client, dates, tenure ]
[ Action bar: Activate | Generate agreement | Cancel | ... ]
[ Two columns:
Left: Reservation details card
Linked interest card
Activity timeline
Right: Agreement card (state-dependent: no agreement / in-flight / completed)
]
```
Agreement card states:
- No agreement yet: warning + `[Generate agreement →]`
- In-flight (sent/partially_signed): "X/Y signed", per-signer status, `[View document →]` `[Send reminder]` `[Cancel]`
- Completed: "Completed YYYY-MM-DD", `[Download signed PDF]` `[Email to all signatories]`, "Signed contract attached to reservation."
Generate-agreement button launches the wizard with prefills:
- `documentType='reservation_agreement'`
- `templateId=<port's default>`
- `reservationId=<current>`
- Default signers from linked client + configurable port-admin user
- Wizard step 1 pre-validated; user lands on step 2
### Backend additions
- Merge field catalog extended in `src/lib/templates/merge-fields.ts`:
- `{{reservation.startDate}}` `{{reservation.endDate}}` `{{reservation.tenureType}}` `{{reservation.termSummary}}` `{{reservation.signedDate}}`
- New service `reservation-agreement-context.ts` exposing `buildReservationAgreementContext(reservationId, portId)`
- New seeder for default `reservation_agreement` template on port creation (HTML format; admins can switch to AcroForm/overlay later); template stored at `assets/templates/reservation-agreement-default.html`
- Webhook handler extension: `handleDocumentCompleted` detects `documentType='reservation_agreement'` and sets `berth_reservations.contractFileId = doc.signedFileId` for the linked reservation
- Dashboard alert query: active reservations without a completed agreement (LEFT JOIN against documents filtered on type+status); rows surface as a warning card
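The alert query described above might look like this (table and column names assumed from the schema conventions elsewhere in this spec, not verified against the real schema):

```sql
-- Active reservations with no completed reservation_agreement document.
SELECT r.*
FROM berth_reservations r
LEFT JOIN documents d
  ON d.reservation_id = r.id
 AND d.document_type = 'reservation_agreement'
 AND d.status = 'completed'
WHERE r.port_id = $1
  AND r.status = 'active'
  AND d.id IS NULL;
```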
### Trade-off
`berth_reservations.contractFileId` becomes a denormalized convenience pointer duplicated with `documents.signedFileId` for the linked reservation. Updating it on completion costs one extra UPDATE. Benefit: anyone querying reservations directly (portal "My Reservations") doesn't need to join through documents to know which file is the contract.
## Reminder framework polish
### Problems with today's logic
1. Eligibility gated by `interests.reminderEnabled` — reservation agreements, NDAs, ad-hoc upload docs (no interest link) never auto-remind
2. Hardcoded 24h cooldown — effective cadence is 1 day; can't slow down for low-urgency docs
3. Always reminds lowest-pending signer — parallel-signing docs can't nudge a specific signer
4. No per-doc disable
### New eligibility logic
```
const DAY_MS = 24 * 60 * 60 * 1000;

function isReminderDue(doc, template, lastReminderAt) {
  if (!['sent', 'partially_signed'].includes(doc.status)) return false;
  if (doc.documenso_id == null) return false;
  if (doc.reminders_disabled) return false;
  const effectiveCadence = doc.reminder_cadence_override ?? template.reminder_cadence_days;
  if (effectiveCadence == null) return false;
  if (lastReminderAt == null) return true;
  return Date.now() - lastReminderAt.getTime() >= effectiveCadence * DAY_MS;
}
```
`processReminderQueue` query rewritten:
```sql
SELECT d.* FROM documents d
LEFT JOIN document_templates t ON t.id = d.template_id
WHERE d.port_id = $1
AND d.status IN ('sent','partially_signed')
AND d.documenso_id IS NOT NULL
AND d.reminders_disabled = false
AND COALESCE(d.reminder_cadence_override, t.reminder_cadence_days) IS NOT NULL;
```
`interests.reminderEnabled` is dropped from the gating logic but the column stays for now (no migration). Future cleanup PR can drop the column.
### `sendReminderIfAllowed` extended signature
```ts
export async function sendReminderIfAllowed(
documentId: string,
portId: string,
options: {
auto?: boolean; // true = cron; false (default) = manual
signerId?: string; // optional — target a specific pending signer
} = {},
): Promise<{ sent: boolean; reason?: string; signerId?: string }>;
```
Behaviour matrix:
| Mode | 9-16 window | Cadence cooldown | Manual cooldown |
| ----------- | ----------- | ---------------- | ------------------------ |
| auto: true | enforced | enforced | n/a |
| auto: false | bypassed | bypassed | 30s client-side debounce |
Per-signer logic:
- If `signerId` provided in sequential-mode doc, signer must be the lowest-pending signer (otherwise reason='Signer is not next in sequence')
- In parallel-mode doc, any pending signer can be reminded independently
- Returns `{ sent, reason }` so caller can show toast on skip
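The sequential gate can be sketched as follows; the `Signer` shape and the `canRemindSigner` helper are illustrative, not the actual service internals:

```typescript
interface Signer { id: string; signingOrder: number; signedAt: Date | null }

// Decide whether a manual per-signer reminder is allowed: in sequential mode
// only the lowest-order pending signer may be nudged; parallel mode allows any
// pending signer.
function canRemindSigner(
  mode: 'sequential' | 'parallel',
  signers: Signer[],
  signerId: string,
): { sent: boolean; reason?: string } {
  const target = signers.find((s) => s.id === signerId);
  if (!target) return { sent: false, reason: 'Unknown signer' };
  if (target.signedAt) return { sent: false, reason: 'Signer already signed' };
  if (mode === 'sequential') {
    const nextPending = signers
      .filter((s) => s.signedAt == null)
      .sort((a, b) => a.signingOrder - b.signingOrder)[0];
    if (nextPending.id !== signerId) {
      return { sent: false, reason: 'Signer is not next in sequence' };
    }
  }
  return { sent: true };
}
```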
### Admin and per-doc UI
Admin `/admin/templates` editor:
```
Auto-reminders for this template:
☑ Enabled Cadence: every [_____] days (1-365; default 7)
☐ Disabled (manual reminders only)
```
Doc detail page (Section 3) "Reminders" panel under signers, with edit drawer for per-doc override.
## Visual polish system
### Token additions
```
--radius-sm: 0.375rem (existing)
--radius-md: 0.5rem (NEW — default cards)
--radius-lg: 0.625rem (NEW — sheets, dialogs)
--radius-xl: 0.875rem (NEW — KPI tiles, hero strips)
--shadow-xs: 0 1px 2px 0 rgb(15 23 42 / 0.04)
--shadow-sm: 0 2px 4px -1px rgb(15 23 42 / 0.06)
--shadow-md: 0 4px 12px -2px rgb(15 23 42 / 0.08)
--shadow-lg: 0 12px 32px -8px rgb(15 23 42 / 0.12)
--shadow-glow: 0 0 0 4px rgb(58 123 200 / 0.12)
--gradient-brand: linear-gradient(135deg, #3a7bc8 0%, #2f6ab5 100%)
--gradient-brand-soft: linear-gradient(135deg, #d8e5f4 0%, #ffffff 100%)
--gradient-success: linear-gradient(135deg, #e8f5e9 0%, #ffffff 100%)
--gradient-warning: linear-gradient(135deg, #fef3c7 0%, #ffffff 100%)
--ease-spring: cubic-bezier(0.34, 1.56, 0.64, 1)
--ease-smooth: cubic-bezier(0.4, 0, 0.2, 1)
--duration-fast: 150ms
--duration-base: 200ms
--duration-slow: 300ms
```
All exposed as Tailwind utilities.
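One way the tokens might surface as utilities (a sketch; the key names and config layout are illustrative, not the project's actual `tailwind.config`):

```javascript
// tailwind.config extension sketch: map the new CSS custom properties to
// utility names so e.g. `shadow-glow`, `ease-spring`, `duration-base` work.
module.exports = {
  theme: {
    extend: {
      borderRadius: {
        md: 'var(--radius-md)',
        lg: 'var(--radius-lg)',
        xl: 'var(--radius-xl)',
      },
      boxShadow: {
        xs: 'var(--shadow-xs)',
        sm: 'var(--shadow-sm)',
        md: 'var(--shadow-md)',
        lg: 'var(--shadow-lg)',
        glow: 'var(--shadow-glow)',
      },
      backgroundImage: {
        'gradient-brand': 'var(--gradient-brand)',
        'gradient-brand-soft': 'var(--gradient-brand-soft)',
      },
      transitionTimingFunction: {
        spring: 'var(--ease-spring)',
        smooth: 'var(--ease-smooth)',
      },
      transitionDuration: {
        fast: 'var(--duration-fast)',
        base: 'var(--duration-base)',
        slow: 'var(--duration-slow)',
      },
    },
  },
};
```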
### Existing token foundation (already in place; not changing)
- Full HSL shadcn token system (primary, secondary, muted, accent, destructive, border, input, ring, popover, card)
- Brand palette `brand` (50-700, default `#3a7bc8`)
- Navy palette `navy` (50-600, default `#1e2844` for sidebar)
- Maritime accents: `sage`, `mint`, `teal`, `purple` with light/default/dark variants
- Semantic `success` / `warning` with bg+border
- Recharts chart-1 through chart-6 token system
- Dark mode wired
- Sidebar tokens separate from main palette
### New primitive components
- `<StatusPill status="...">` — colored-by-state pill (pending grey, sent brand, partial teal, completed success, expired warning, rejected destructive, cancelled muted-darker, active success, archived muted)
- `<KPITile title value delta sparkline?>` — rounded-xl, shadow-sm, gradient-brand-soft border-top accent stripe; recharts mini sparkline using `--chart-1`
- `<EmptyState icon title body actions>` — large icon in brand-soft circle, title, body, action buttons
- `<PageHeader>` polished — gradient-brand-soft background, eyebrow optional, KPI sub-line, primary action right-aligned
### Component pattern updates
- List rows: hover gradient (subtle brand-soft 4% opacity), shadow-xs lift, animation `transition-all duration-base ease-smooth`; row-update from socket events animates 1s fade-in highlight
- Detail pages: two-column responsive grammar (header strip → 2fr main + 1fr side; cards stack vertical < 768px)
- Sidebar (already dark navy): active item gets 4px brand left-edge stripe instead of bg shift; section headers smaller-caps + brand-200 text
- Topbar: search inset shadow + brand focus ring; "+ New" trigger gets `bg-gradient-brand`; notification bell gets badge spring animation; user avatar gets shadow-sm + 2px white ring
- Forms: focus ring uses `--shadow-glow`; primary submit buttons get `bg-gradient-brand` with hover scale-1.01; inline validation gets destructive-bg pill with caret pointing up
### Loading skeleton system
- List pages: 8 skeleton rows matching column widths with subtle pulse
- Detail pages: header strip skeleton + 2-column section skeletons
- Dashboard: KPI tile skeletons + chart skeletons
- Replaces today's mix of "Loading..." text and spinners
### Mobile responsive (full sweep)
Breakpoints:
- < 640px (phone): single column, sticky bottom action bar, sheet overlays for filters
- 640-1024px (tablet): single column with wider gutters, side column under main
- ≥ 1024px (desktop): full two-column
Per-page rules:
- List tables → card stack < 768px
- Detail page header collapses subtitle to "Show more"
- Tabs collapse to `<select>` < 640px
- Sidebar slides over content < 1024px
- Primary "+ New" actions float as FAB bottom-right < 640px
## Test plan
### Unit (`tests/unit/`)
- `document-reminders-cadence.test.ts` — `isReminderDue` math; manual-vs-auto window/cooldown bypass
- `documenso-place-fields.test.ts` — v1/v2 dispatch (mocked HTTP); coord normalization; default field staggering for 1/2/3/5 recipients
- `email-attachments-resolver.test.ts` — fileId → MinIO buffer; cross-port 403; 10 MB cap warning
### Integration (`tests/integration/`)
- Extend `document-templates-generate-and-sign.test.ts` — new template formats (`pdf_form`, `pdf_overlay`, `documenso_render`); upload-path test
- New `document-watchers.test.ts` — add/remove endpoints; notification fan-out; port isolation
- New `document-cancel.test.ts` — user-initiated cancel; mocked Documenso void; status + event log; reject 409 if completed
- New `reservation-agreement-contract-mirror.test.ts` — `handleDocumentCompleted` mirrors `signedFileId` to `berth_reservations.contractFileId` only for `reservation_agreement` type
- New `reminder-cron-cadence.test.ts` — seed varied templates; simulated time advance; assert correct docs reminded
### E2E smoke (`tests/e2e/smoke/`)
- Extend `04-documents.spec.ts` — hub tabs, expand row, per-signer remind with cooldown, type/status filters, saved-view round-trip, bulk-remind with per-row toast reasons
- Extend `05-eoi-generate.spec.ts` — wizard invocation prefills (template, interest); existing flow regression
- New `27-document-create-wizard.spec.ts` — template path full flow; upload path full flow; watcher addition; reminder-override radios produce correct DB state
- New `28-reservation-agreements.spec.ts` — reservation detail → Generate agreement → wizard prefilled → Send → agreement section state transitions; post-completion contract attached + email button visible
- New `29-email-attachments.spec.ts` — system path send (documentEvents row, no email_messages); user path send when toggle on (email_messages with attachment_file_ids); cross-port 403
### E2E exhaustive (`tests/e2e/exhaustive/`) — click-everything sweep
- New `10-documents-hub.spec.ts` — crawl each tab, filter dropdowns, saved-view, expand row, signer-row buttons, bulk-action bar
- New `11-document-detail.spec.ts` — crawl in three states (draft/sent/completed); watcher add/remove; notes auto-save; preview download; "Email signed PDF" launch
- New `12-document-create-wizard.spec.ts` — crawl each wizard step under both template and upload paths; picker dropdowns, signer add/remove, drag-handle, reminder-cadence radios
- New `13-reservation-detail.spec.ts` — crawl in three states (pending no agreement / agreement-in-flight / agreement-completed); Activate/Cancel/Generate buttons; inline notes
- New `14-email-composer.spec.ts` — crawl composer drawer with attachments; From dropdown; attach button; recipient chips
- Extend exhaustive `05-eoi-generate.spec.ts` — parallel-mode + signing-order edge cases (greyed-out reminder buttons; out-of-order remind rejection)
### E2E real-API (`tests/e2e/realapi/`)
Each spec gates on env vars; clean skip if missing.
- Extend `documenso-real-api.spec.ts`:
- Generate from Documenso template (real send) and assert in real Documenso
- Generate from in-app PDF AcroForm fill, upload to real Documenso, assert
- Generate from upload path with auto-placed signature fields, assert fields visible in Documenso
- v1 and v2 explicit version-flag tests (via `DOCUMENSO_API_VERSION`)
- Manually sign in real Documenso (or simulate webhook) and assert local DB updates
- Cancel real in-flight doc, assert local + remote state
- Send reminder via real Documenso, assert HTTP + documentEvents row
- New `smtp-system-send.spec.ts` — system-path send → IMAP fetch → assert subject + attachment; verify port-config from-identity; cleanup via IMAP delete
- New `smtp-user-send.spec.ts` — user-path send (requires connected account, allowPersonalAccountSends=true) → IMAP fetch → email_messages row with attachment_file_ids
- New `minio-file-lifecycle.spec.ts` — upload, list, preview, download (byte-equal), delete; port isolation; mime-type validation
- New `documenso-webhook-ingress.spec.ts` — requires cloudflared tunnel; configure tunnel URL as Documenso webhook target; trigger doc completion; assert webhook fires + handler updates DB; verify timing-safe secret check rejects wrong secret with 401; verify event normalisation (uppercase enum + lowercase-dotted both accepted)
- New `email-attachments-roundtrip.spec.ts` — compose with fileId attachment; SMTP send; IMAP fetch; assert attachment bytes match; reject cross-port fileId with 403 before SMTP touched
### Visual baselines (`tests/e2e/visual/`)
`snapshots.spec.ts-snapshots/` regenerated as polish ships per page; one PR per surface group, baselines reviewed in PR diff. New baselines added: documents hub, doc detail, create-document wizard (each step), reservation detail, email composer with attachments.
### Test data fixtures
`global-setup.ts` extended with:
- Seed default `reservation_agreement` template (HTML format)
- Seed default `signed_doc_completion` template
- Seed one in-flight EOI doc with two pending signers (for hub-tab tests)
- Seed one `berth_reservation` with `status='active'` and no agreement (for lifecycle alert query)
### CI vs local runs
| Project | When |
| ---------------------------------------------- | ----------------------------------------------------------------------------------------------------------- |
| `setup` + `smoke` (~14 min) | Every PR via CI |
| `exhaustive` (with new click-everything specs) | Every PR via CI; ~25 min budget |
| `visual` | Every PR; baselines reviewed in PR diffs |
| `realapi` | Locally before merging touch-points; pre-release; not on CI (avoids burning Documenso quota and SMTP costs) |
## Build sequence
| # | Title | Effort | Depends on |
| ----- | ------------------------------------------------- | ------ | -------------- |
| 1 | Data model + service skeletons | 1d | — |
| 2 | Documenso v1/v2 abstraction layer | 1d | — |
| 3 | Visual primitives + token additions | 1.5d | — |
| 4 | Documents hub page | 2d | 1, 3 |
| 5 | Document detail page | 2d | 1, 3 |
| 6 | Create-document wizard + new template formats | 2.5d | 1, 2, 3 |
| 7 | Reservation detail + agreement flow | 1.5d | 1, 6 |
| 8 | Email composer attachments + From selector | 1d | 1, 3 |
| 9 | Reminder framework polish | 1d | 1 |
| 10a-e | Visual polish sweep (5 PRs across surface groups) | 3-4d | 3 |
| 11 | Real-API integration tests | 1.5d | 2, 4-9 shipped |
### Critical path
```
1 → 2 → 6 → 7 (data model → Documenso → wizard → reservation)
1 → 3 → 4 → 5 → 9 (data model → primitives → hub → detail → reminders)
1 → 8 (composer)
3 → 10a-e (sweep)
all → 11 (realapi)
```
Wall-clock minimum ~9 days; realistic with overhead ~17 days; calendar ~3.5-5 weeks.
### Acceptance gates per PR
- `pnpm tsc --noEmit` and `pnpm lint` clean
- Vitest unit + integration green
- Playwright smoke green for surface touched
- Visual baselines regenerated and reviewed in PR diff
- For PRs touching external integrations (2, 6 upload, 7 contract mirror, 8 SMTP, 11): relevant `realapi` spec verified locally before merge
### Risk register
| Risk | Mitigation |
| ---------------------------------------------------------- | ----------------------------------------------------------------------------------------- |
| Documenso v2 endpoint shape drifts from docs | PR2 validates against real v2 instance during dev; realapi spec re-runs nightly post-ship |
| Visual polish scope creeps | One PR per surface group (10a-e), each independently shippable |
| Cron migration changes effective behaviour | Backfill sets EOI cadence to 1 day matching today's effective; run on staging first |
| Mobile responsive regressions | Visual baselines include phone-viewport snapshots; PR10e is the responsive sweep |
| EOI dialog → wizard migration breaks "Generate EOI" button | Wizard launched with prefills from interest detail; PR6 includes regression spec |
| AcroForm template format confuses non-technical admins | HTML default; inline help; default templates seeded |
| Phase A wall-clock past 5 weeks | Tier-2 sweep items + optional realapi specs deferrable to follow-up release |
## Glossary
- **Documenso** — open-source document signing service, self-hosted instance at `signatures.portnimara.dev`
- **EOI** — Expression of Interest, a pre-reservation signed document
- **Reservation Agreement** — contract signed when a berth reservation is committed
- **Hub** — the new `/[port]/documents` page
- **Watcher** — a CRM user added to a doc to receive notifications on signature events without being a signer themselves
- **Signing order** — sequential index across signers; sequential mode requires lower order to sign first; parallel mode lets all sign concurrently
- **Cadence** — interval in days between auto-reminders to unsigned signers
- **System send / User send** — email dispatch identity: System uses port-config noreply SMTP; User uses connected personal email account (gated by admin toggle)
- **Render location** — where the PDF is generated (CRM-local via HTML/AcroForm/overlay, or in Documenso). Signing is always Documenso; render location is independent.

# Phase B — Insights, Alerts, and Operational Awareness
**Status:** Draft — awaiting review
**Date:** 2026-04-28
**Phase:** B of D (A = Documents hub + visual polish ✓ shipped; C = Website integration; D = Pre-prod ops)
## Overview
Phase A made the CRM look polished and finished the documents/signing surface. Phase B turns it into a tool that _tells operators what's happening_ — instead of forcing them to navigate every list to find pipeline drift, expiring documents, or stalled reservations. It also closes the seven highest-priority Nuxt→Next gaps the 2026-04-28 audit surfaced (analytics, berth-interests, EOI queue, OCR, alerts, audit log, expense dedup).
The product story changes from "system of record" to "system of attention." Operators land on the dashboard and immediately see what needs them today — not a flat list they have to filter.
## Scope boundaries
### In scope (this spec)
- **Analytics dashboard** — chart-driven KPI page replacing the current 4-tile placeholder; pipeline funnel, occupancy timeline, revenue breakdown, lead-source attribution, with date-range and per-port filters
- **Alert framework** — rule engine that evaluates conditions on a schedule and surfaces actionable cards (alerts) in the dashboard's right rail; dismissible per-user; deep-links into the offending entity
- **Interests-by-berth view** — `/[port]/berths/[id]/interests` panel showing every interest targeting a berth, sortable by stage/score/age
- **Expense duplicate detection** — heuristic match on (vendor + amount + date ± 3 days); surfaces in expense detail with "Merge" action; background scan on new expense
- **EOI queue** — saved-view filter on the existing documents hub for `documentType='eoi' AND status IN ('sent','partially_signed')`, surfaced as a hub tab and a dashboard alert link
- **OCR for expense receipts** — Claude Vision integration on the existing `/expenses/scan` route to extract vendor, amount, date, currency, line items from uploaded receipts; user confirms before save
- **Audit log read view** — admin-gated UI for the existing `audit_logs` table with filters (user, action, entity type, date range, entity id search) and per-port + global (super-admin) scopes
### Explicitly out of scope (deferred to later phases)
- Custom user-defined alert rules (Phase B v1 ships with a fixed catalog of ~10 rules; user-rule creation deferred to Phase D)
- Real-time alert push notifications (only socket-fired updates of the alert list; SMS/email push deferred)
- Alert grouping / digests (each alert is its own card)
- Predictive analytics, ML scoring (separate from existing AI feature flag)
- Cross-port roll-up dashboards for super-admins (per-port only in v1)
- Full audit-log retention / archival policy (Phase D)
- OCR for PDF receipts (only image formats: jpg/png/heic; PDF expense uploads bypass OCR and stay manual until Phase D)
- Excel/CSV import for bulk expense backfill
- Country / phone / timezone work (separate cross-cutting agenda at `2026-04-28-country-phone-timezone-design.md`)
## Information architecture
### URL surface
```
/[port]/dashboard replaces existing; analytics-driven
/[port]/insights deep-link analytics page (charts only, no alerts)
/[port]/alerts full alert list (admin filter, dismissed history)
/[port]/berths/[id]/interests new tab on berth detail
/[port]/expenses/scan extend existing route with Claude Vision OCR
/[port]/admin/audit admin-gated audit log viewer
/[port]/documents extended: 'EOI queue' tab pre-filters to EOI in flight
```
### Schema deltas
```sql
-- alerts: surfaces operational warnings the user should act on
CREATE TABLE alerts (
id text PRIMARY KEY DEFAULT generate_id('alrt'),
port_id text NOT NULL REFERENCES ports(id) ON DELETE CASCADE,
rule_id text NOT NULL, -- 'reservation.no_agreement', 'interest.stale', ...
severity text NOT NULL, -- 'info' | 'warning' | 'critical'
title text NOT NULL,
body text,
link text NOT NULL, -- relative path the card deep-links to
entity_type text, -- optional FK target ('interest', 'reservation', ...)
entity_id text,
fingerprint text NOT NULL, -- hash of (rule_id + entity_type + entity_id) — dedupe
fired_at timestamptz NOT NULL DEFAULT now(),
dismissed_at timestamptz,
dismissed_by text REFERENCES users(id),
acknowledged_at timestamptz, -- "I'm on it" without dismissing
acknowledged_by text REFERENCES users(id),
resolved_at timestamptz, -- auto-set when underlying condition clears
metadata jsonb DEFAULT '{}' -- per-rule extras (e.g. days_stale, amount_at_risk)
);
CREATE UNIQUE INDEX idx_alerts_fingerprint_open ON alerts (port_id, fingerprint) WHERE resolved_at IS NULL;
CREATE INDEX idx_alerts_port_fired ON alerts (port_id, fired_at DESC);
CREATE INDEX idx_alerts_port_severity_open ON alerts (port_id, severity) WHERE resolved_at IS NULL AND dismissed_at IS NULL;
-- expense duplicate detection (column-only, no new table)
ALTER TABLE expenses ADD COLUMN duplicate_of text REFERENCES expenses(id);
ALTER TABLE expenses ADD COLUMN dedup_scanned_at timestamptz;
CREATE INDEX idx_expenses_dedup ON expenses (port_id, vendor_name, amount, expense_date)
WHERE duplicate_of IS NULL;
-- analytics support: materialized refresh tracking (avoids recomputing on every dashboard hit)
CREATE TABLE analytics_snapshots (
port_id text NOT NULL REFERENCES ports(id) ON DELETE CASCADE,
metric_id text NOT NULL, -- 'pipeline_funnel.30d', 'occupancy_timeline.90d', ...
computed_at timestamptz NOT NULL DEFAULT now(),
data jsonb NOT NULL,
PRIMARY KEY (port_id, metric_id)
);
-- audit_logs already exists; add a tsvector column for fast search
ALTER TABLE audit_logs ADD COLUMN search_text tsvector
GENERATED ALWAYS AS (
to_tsvector('simple',
coalesce(action, '') || ' ' ||
coalesce(entity_type, '') || ' ' ||
coalesce(entity_id::text, '') || ' ' ||
coalesce(actor_email, ''))
) STORED;
CREATE INDEX idx_audit_search ON audit_logs USING gin(search_text);
-- ocr extracted fields on receipt files (most fields already on expenses)
ALTER TABLE expenses ADD COLUMN ocr_status text DEFAULT 'pending'; -- 'pending'|'ok'|'failed'|'low_confidence'
ALTER TABLE expenses ADD COLUMN ocr_raw jsonb; -- the model's full response
ALTER TABLE expenses ADD COLUMN ocr_confidence numeric; -- 0..1
```
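The fingerprint plus the partial unique index give idempotent alert firing; a sketch, where `alertFingerprint` is an assumed helper name rather than the real one:

```typescript
import { createHash } from 'node:crypto';

// Stable dedupe key: hash of (rule_id, entity_type, entity_id), matching the
// fingerprint column described in the schema above.
function alertFingerprint(
  ruleId: string,
  entityType: string | null,
  entityId: string | null,
): string {
  return createHash('sha256')
    .update([ruleId, entityType ?? '', entityId ?? ''].join('|'))
    .digest('hex');
}
```

On insert, `INSERT ... ON CONFLICT (port_id, fingerprint) WHERE resolved_at IS NULL DO NOTHING` targets the partial index `idx_alerts_fingerprint_open`, so re-evaluating a rule while its alert is still open is a no-op, while a resolved alert can fire again later.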
After running migration on dev/staging, restart `next dev` to flush postgres.js prepared-statement cache (project convention).
### Service-layer changes
**New services:**
- `alerts.service.ts` — CRUD + fanout: `evaluateRules(portId)`, `dismissAlert(id, userId)`, `acknowledgeAlert(id, userId)`, `resolveStaleAlerts(portId)`
- `alert-rules.ts` — fixed catalog of evaluator functions, each takes `(portId, db)` and returns `Array<{ rule_id, severity, fingerprint, ... }>`
- `analytics.service.ts` — `getPipelineFunnel(portId, range)`, `getOccupancyTimeline(portId, range)`, `getRevenueBreakdown(portId, range)`, `getLeadSourceAttribution(portId, range)`; reads `analytics_snapshots` first, recomputes if stale
- `analytics-snapshot-job.ts` — BullMQ recurring job that recomputes snapshots every 15 min per port
- `expense-dedup.service.ts` — `scanForDuplicates(expenseId)`, returns candidate matches with confidence; called from BullMQ on `expense:created`
- `expense-ocr.service.ts` — Claude Vision wrapper: takes file URL, returns parsed expense fields; uses prompt caching for the system prompt to keep cost down
- `audit-search.service.ts` — wraps drizzle query with tsvector match + filters
**Extended services:**
- `documents.service.ts` — adds `getEoiQueueRows(portId, opts)` that joins documents + signers + last-reminder for the EOI queue tab
- `expenses.service.ts` — `createExpense` triggers OCR + dedup BullMQ jobs after row insert
- `notifications.service.ts` — fires `alert:created` and `alert:resolved` socket events
### Alert rule catalog (v1)
| Rule ID | Severity | Trigger | Resolves when | Why it matters |
| ---------------------------- | -------- | -------------------------------------------------------------------------------------------- | -------------------------------------------- | ---------------------------- |
| `reservation.no_agreement` | warning | active reservation > 3d old without a `reservation_agreement` doc in any non-cancelled state | doc reaches `sent` | flagged in Phase A spec |
| `interest.stale` | info | `pipelineStage IN ('details_sent','in_communication','visited')` AND last activity > 14d | activity timestamp updates | dropped leads |
| `document.expiring_soon` | warning | `expires_at` within 7 days, `status IN ('sent','partially_signed')` | doc completed/cancelled or expires_at passes | nudge before contracts lapse |
| `document.signer_overdue` | warning | signer pending > 14d AND last reminder > 7d ago | signer signs/declines | classic chase target |
| `berth.under_offer_stalled` | info | berth `status='under_offer'` > 30d | status changes | reservation never closed |
| `expense.duplicate` | info | `expense.duplicate_of IS NOT NULL` | merged or marked-not-duplicate | bookkeeping cleanup |
| `expense.unscanned` | info | expense with file but `ocr_status='pending'` > 1h | `ocr_status='ok'` | OCR failed silently |
| `interest.high_value_silent` | critical | `leadCategory='hot_lead'` AND last activity > 7d | activity update | revenue at risk |
| `eoi.unsigned_long` | warning | EOI doc `status='sent'` > 21d | doc completed/cancelled | EOI funnel leak |
| `audit.suspicious_login` | critical | >3 failed logins from same IP in 1h | manual dismiss | security awareness |
Rules are pure functions; the engine takes their outputs, upserts on `(port_id, fingerprint)` to avoid spam, and auto-resolves alerts whose rule no longer fires.
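The upsert/auto-resolve step can be sketched as a pure reconciliation pass. Names here are illustrative, not the real service API; the `fingerprint` helper follows the glossary definition `hash(rule_id + entity_type + entity_id)`:

```typescript
import { createHash } from 'crypto';

interface Candidate { ruleId: string; entityType: string; entityId: string; severity: string }
interface OpenAlert { fingerprint: string }

// hash(rule_id + entity_type + entity_id), stable across re-evaluations
export function fingerprint(c: Candidate): string {
  return createHash('sha256')
    .update(`${c.ruleId}:${c.entityType}:${c.entityId}`)
    .digest('hex');
}

// Decide what to insert and what to auto-resolve. Open alerts whose
// condition still fires are left untouched, so re-runs don't spam.
export function reconcile(open: OpenAlert[], fired: Candidate[]) {
  const openSet = new Set(open.map((a) => a.fingerprint));
  const firedSet = new Set(fired.map(fingerprint));
  return {
    toInsert: fired.filter((c) => !openSet.has(fingerprint(c))),
    toResolve: open.filter((a) => !firedSet.has(a.fingerprint)),
  };
}
```

The partial unique index on `(port_id, fingerprint) WHERE resolved_at IS NULL` backs the same invariant at the database level.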
## Per-feature design
### Analytics dashboard
Replaces the current 4-tile dashboard. Layout:
```
[ Gradient PageHeader: "Dashboard" · last-updated stamp · Date range picker (Today / 7d / 30d / 90d / custom) ]
[ KPI row (4 KPITiles, sparkline + delta vs prior period):
Total Clients Active Interests Pipeline Value Occupancy Rate
]
[ Pipeline funnel (recharts FunnelChart): | Alert rail (right column):
horizontal bars per stage with conversion % | Critical (red) cards
click bar → filtered interests list | Warning (amber) cards
| Info (blue) cards
| "Show dismissed" toggle
] |
[ Revenue breakdown (recharts BarChart, stacked by source) ] | (continues)
[ Occupancy timeline (recharts AreaChart, daily/weekly) ] |
[ Lead source attribution (recharts PieChart with legend) ]
```
Charts are rendered with recharts, which is already in the bundle. Data comes from `analytics.service.ts`, which reads `analytics_snapshots` (refreshed every 15 min by cron) — first hit warms the cache, subsequent hits are sub-100ms.
Date-range picker re-runs `analytics.service` queries with the selected range; cache key includes the range so 30d and 90d don't fight.
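The read-through behavior can be sketched as follows; the helper names and TTL constant are illustrative, not the real service's API:

```typescript
// Read-through snapshot cache: serve cached data when fresh, recompute when stale.
const SNAPSHOT_TTL_MS = 15 * 60 * 1000; // matches the 15-min cron cadence

export function snapshotKey(metric: string, range: string): string {
  // range is part of the key so 30d and 90d don't evict each other
  return `${metric}.${range}`;
}

export function isStale(computedAt: Date, now: Date = new Date()): boolean {
  return now.getTime() - computedAt.getTime() > SNAPSHOT_TTL_MS;
}

export async function readThrough<T>(
  cached: { computedAt: Date; data: T } | undefined,
  recompute: () => Promise<T>,
): Promise<T> {
  if (cached && !isStale(cached.computedAt)) return cached.data;
  return recompute(); // caller upserts the fresh row into analytics_snapshots
}
```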
Export: each chart card has a `[...]` overflow menu with "Download as CSV" and "Download as PNG"; PNG export serializes the chart SVG to a canvas and calls `HTMLCanvasElement.toDataURL()`, since recharts has no built-in image export.
### Alert rail
Right column on `/dashboard`, full page at `/alerts`. Each alert is a card:
```
[severity-color stripe-left]
[rule-icon] Title (entity name)
Body — body text describing the condition
Last fired N days ago · entity: link
[Acknowledge] [Dismiss] [Open →]
```
- Acknowledge: marks `acknowledged_at` but stays visible (someone's on it)
- Dismiss: hides from the rail; appears in `/alerts` "Dismissed" tab
- Auto-resolve: when the rule re-evaluates and the condition no longer fires, alert moves to "Resolved" history
Real-time: socket emits `alert:created` / `alert:resolved` from the cron worker; React Query invalidates the alert list.
### Interests-by-berth view
New tab on `/[port]/berths/[id]` called "Interests" — count badge in tab.
```
[ Berth header (existing) ]
[ Tabs: Overview | Reservations | Interests (N) | Notes | Files | Activity ]
[ Interests tab body:
[Filter: All stages | Active only | Lost] [Sort: Newest | Stage progress | Lead score]
Table: client name | stage pill | source | category | last activity | score badge
Click row → interest detail
]
```
Pure read; no mutations. The list filters interests where `interest.berthId = berth.id`. Already exists in DB; just needs the UI tab.
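The sort menu maps to simple comparators over the listed columns. Field names below are assumptions inferred from the table layout, not the real schema:

```typescript
// Comparators for the Interests tab sort options.
interface InterestRow {
  createdAt: Date;
  stageIndex: number; // position of pipelineStage in the funnel order
  leadScore: number;
}

type SortKey = 'newest' | 'stage' | 'score';

export function interestComparator(key: SortKey) {
  const cmp: Record<SortKey, (a: InterestRow, b: InterestRow) => number> = {
    newest: (a, b) => b.createdAt.getTime() - a.createdAt.getTime(),
    stage: (a, b) => b.stageIndex - a.stageIndex, // furthest along the pipeline first
    score: (a, b) => b.leadScore - a.leadScore,
  };
  return cmp[key];
}
```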
### Expense duplicate detection
When a new expense is created, BullMQ job `expense.dedup` runs:
```ts
import { and, between, eq, ne } from 'drizzle-orm';
import { addDays } from 'date-fns';
// `db` and the `expenses` table come from the app's existing drizzle setup

async function scanForDuplicates(expenseId: string) {
  const e = await db.query.expenses.findFirst({ where: eq(expenses.id, expenseId) });
  if (!e) return; // expense was deleted before the job ran
  const candidates = await db.query.expenses.findMany({
    where: and(
      eq(expenses.portId, e.portId),
      eq(expenses.vendorName, e.vendorName),
      eq(expenses.amount, e.amount),
      between(expenses.expenseDate, addDays(e.expenseDate, -3), addDays(e.expenseDate, 3)),
      ne(expenses.id, e.id),
    ),
  });
  if (candidates.length > 0) {
    await db
      .update(expenses)
      .set({ duplicateOf: candidates[0].id, dedupScannedAt: new Date() })
      .where(eq(expenses.id, expenseId));
    // fires `expense.duplicate` alert via rule engine on next sweep
  }
}
```
Detail page: when `duplicate_of` is set, show a yellow banner: "Looks like a duplicate of {linked expense}. [Merge them] [Mark as not duplicate]". Merge: deletes the new expense and merges any line items into the original.
### EOI queue tab
Documents hub gets a new tab between "Awaiting them" and "Awaiting me":
```
Tabs: All | EOI queue (N) | Awaiting them | Awaiting me | Completed | Expired
```
`EOI queue` filters: `documentType='eoi' AND status IN ('sent','partially_signed')`. Same row chrome as the rest of the hub. Bulk-action bar adds an "EOI bulk reminder" preset that respects the rule engine's reminder cooldown.
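The saved-view filter is a one-liner; sketched here as a pure predicate mirroring the SQL condition above:

```typescript
// Predicate for the EOI queue tab: documentType='eoi' AND status IN ('sent','partially_signed')
interface DocRow { documentType: string; status: string }

export function isEoiQueueRow(d: DocRow): boolean {
  return d.documentType === 'eoi' && ['sent', 'partially_signed'].includes(d.status);
}
```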
### OCR for expense receipts
Existing `/expenses/scan` route — extend to call Claude Vision on upload:
```ts
// expense-ocr.service.ts (uses Anthropic SDK; already in deps)
import Anthropic from '@anthropic-ai/sdk';
const client = new Anthropic();
const SYSTEM_PROMPT = `You extract structured expense data from receipts...
Output JSON: { vendor, amount, currency, date (ISO), lineItems: [...], confidence (0-1) }
`; /* cached via ephemeral cache_control for cost savings */
export async function ocrReceipt(fileUrl: string) {
  const file = await fetch(fileUrl);
  if (!file.ok) throw new Error(`receipt fetch failed: ${file.status}`);
  const base64 = Buffer.from(await file.arrayBuffer()).toString('base64');
  const message = await client.messages.create({
    model: 'claude-haiku-4-5-20251001', // haiku for cost; sonnet if quality needed
    max_tokens: 1024,
    system: [{ type: 'text', text: SYSTEM_PROMPT, cache_control: { type: 'ephemeral' } }],
    messages: [
      {
        role: 'user',
        content: [
          { type: 'image', source: { type: 'base64', media_type: 'image/jpeg', data: base64 } },
          { type: 'text', text: 'Extract expense fields from this receipt.' },
        ],
      },
    ],
  });
  const block = message.content[0];
  if (block?.type !== 'text') throw new Error('unexpected OCR response shape');
  return parseAndValidate(block.text);
}
```
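`parseAndValidate` is referenced but not specified; a hedged sketch (an assumption, not the real implementation) that enforces the JSON shape the system prompt asks for:

```typescript
// Sketch of the parseAndValidate helper: validates the OCR JSON contract.
interface OcrResult {
  vendor: string;
  amount: number;
  currency: string;
  date: string; // ISO
  lineItems: unknown[];
  confidence: number; // 0..1
}

export function parseAndValidate(raw: string): OcrResult {
  // models sometimes wrap JSON in a fenced block; strip it defensively
  const text = raw.replace(/^```(?:json)?\s*|\s*```$/g, '');
  const p = JSON.parse(text);
  if (typeof p.vendor !== 'string') throw new Error('OCR response missing vendor');
  if (typeof p.amount !== 'number') throw new Error('OCR response missing amount');
  if (typeof p.confidence !== 'number' || p.confidence < 0 || p.confidence > 1)
    throw new Error('OCR confidence out of range');
  return p as OcrResult;
}
```

A thrown error here maps to `ocr_status='failed'`; a confidence below 0.6 maps to `'low_confidence'`.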
UI: the existing scan page becomes a 4-step flow:
1. Upload receipt photo
2. Wait for OCR (spinner; ~3s avg with Haiku)
3. Confirm extracted fields (pre-filled form, user can edit)
4. Save → existing expense create flow
Low-confidence (< 0.6) extractions show a yellow banner "Please verify all fields" and pre-select the file uploader.
### Audit log read view
Admin route `/[port]/admin/audit`:
```
[ PageHeader: "Audit Log" · "Last 30 days · 12,847 events" ]
[ Filter row:
Search [tsvector] Actor [combobox of users] Action [pills] Entity type [select]
Date range [picker] Severity [pills] [Reset]
]
[ Table:
Timestamp | Actor | Action | Entity | Diff button | IP | User-agent
Click row → expand to show before/after JSON diff
]
[ Pagination · Export CSV button (admin-gated) ]
```
Server-side: `audit-search.service.ts` builds a drizzle query with the tsvector match + filters; supports cursor pagination on `(created_at, id)`.
Super-admin sees a port toggle that switches between current port and "All ports" view.
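Cursor pagination on `(created_at, id)` can be sketched as below; the base64url-JSON token encoding is illustrative, not the project's actual wire format:

```typescript
// Keyset-cursor helpers for audit-log pagination on (created_at, id).
interface Cursor { createdAt: string; id: string }

export function encodeCursor(c: Cursor): string {
  return Buffer.from(JSON.stringify(c)).toString('base64url');
}

export function decodeCursor(token: string): Cursor {
  return JSON.parse(Buffer.from(token, 'base64url').toString('utf8'));
}

// Rows strictly "after" the cursor in (created_at DESC, id DESC) order.
// This is the predicate the SQL WHERE clause implements; unlike OFFSET,
// it stays fast regardless of page depth.
export function isAfter(row: Cursor, cur: Cursor): boolean {
  if (row.createdAt !== cur.createdAt) return row.createdAt < cur.createdAt;
  return row.id < cur.id;
}
```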
## Test plan
### Unit (`tests/unit/`)
- `alert-rules-evaluators.test.ts` — each rule tested with seeded data; covers fire/no-fire cases and resolution conditions
- `expense-dedup-heuristic.test.ts` — vendor/amount/date matching with edge cases (case-insensitive, ±3d window, currency mismatch ignored)
- `analytics-pipeline-funnel.test.ts` — funnel math against fixture interests
- `analytics-occupancy-timeline.test.ts` — daily aggregation against fixture berth status changes
- `audit-search-filters.test.ts` — tsvector + filter composition
- `ocr-prompt-caching.test.ts` — assert cache_control presence on system prompt; mocked Claude response
### Integration (`tests/integration/`)
- `alerts-engine.test.ts` — full evaluation cycle: seed conditions, run engine, assert correct alerts upserted, run again to assert dedupe via fingerprint, mutate state, assert auto-resolve
- `analytics-snapshot-refresh.test.ts` — recurring job: snapshot row written, served from cache on next read, refreshed on next tick
- `expense-dedup-flow.test.ts` — create A, create matching B, assert B.duplicate_of=A; merge B → A absorbs line items, B archived
- `audit-search-tsvector.test.ts` — seed audit_logs, query by free-text, assert returned ids
- `eoi-queue-listing.test.ts` — extends documents-hub test; assert EOI tab returns correct subset
### E2E smoke (`tests/e2e/smoke/`)
- New `27-analytics-dashboard.spec.ts` — dashboard renders charts; date-range picker re-renders; KPI tiles show non-zero data after seed
- New `28-alerts.spec.ts` — alert appears after seeding stale-interest condition; click-to-deep-link; dismiss persists; resolve hides
- New `29-interests-by-berth.spec.ts` — tab visible on berth detail; lists interests; sort works
- New `30-expense-dedup.spec.ts` — create two matching expenses; banner appears; merge button works
- New `31-ocr-flow.spec.ts` — uploads fixture receipt image; extracted fields pre-filled; user can edit and save
- New `32-audit-log.spec.ts` — admin page loads; search by entity id returns expected row; date filter narrows
- Extend `04-documents.spec.ts` — EOI queue tab presence + count badge
### E2E exhaustive (`tests/e2e/exhaustive/`)
- `15-analytics-dashboard.spec.ts` — crawl every chart's hover tooltips, legend toggles, export menu
- `16-alerts.spec.ts` — crawl alert card actions, severity filters, dismissed history, real-time arrival via socket emit
- `17-audit-log.spec.ts` — crawl filter combos, expand row diffs, super-admin all-ports toggle
### E2E real-API (`tests/e2e/realapi/`)
- New `claude-vision-receipt-ocr.spec.ts` — gates on `ANTHROPIC_API_KEY`; uploads two real fixture receipts (one clean, one blurry); asserts Haiku response shape and confidence score; verifies `cache_control` headers in HTTP trace; cleanup deletes test expense
### Test data fixtures
`global-setup.ts` extends:
- Seed one stale interest in `details_sent` stage with `last_activity_at = now - 20d` (fires `interest.stale`)
- Seed one active reservation without an agreement (fires `reservation.no_agreement`)
- Seed two matching expenses (fires `expense.duplicate`)
- Seed 90 days of pipeline activity for analytics charts
- Add a `tests/e2e/fixtures/receipts/` dir with two .jpg receipts for OCR tests
## Build sequence
| # | Title | Effort | Depends on |
| --- | ------------------------------------------------------------ | ------ | ----------------- |
| 1 | Schema + alert/analytics service skeletons | 1d | — |
| 2 | Alert rules engine + recurring evaluator + socket | 1.5d | 1 |
| 3 | Analytics snapshot job + service layer | 1d | 1 |
| 4 | Analytics dashboard page (KPI tiles + 4 charts + date-range) | 2.5d | 1, 3, A's KPITile |
| 5 | Alert rail UI + `/alerts` page | 1.5d | 2 |
| 6 | EOI queue tab on documents hub | 0.5d | A's hub |
| 7 | Interests-by-berth tab on berth detail | 0.5d | — |
| 8 | Expense duplicate detection (job + UI banner + merge) | 1.5d | 1 |
| 9 | OCR for expense receipts (Claude Vision + 3-step UI) | 1.5d | — |
| 10 | Audit log read view (admin page + filters + tsvector search) | 1.5d | 1 |
| 11 | Real-API integration tests | 1d | 9 |
### Critical path
```
1 → 2 → 5 (data → alert engine → alert UI)
1 → 3 → 4 (data → analytics service → analytics page)
8 → 2 (alert rule) (dedup populates the data the alert reads)
9 (OCR) → 11 (realapi)
```
Wall-clock minimum ~10 days (one engineer, sequential critical path); realistic with overhead ~13 days; calendar 2.5–3 weeks.
### Acceptance gates per PR
- `pnpm tsc --noEmit` and `pnpm lint` clean
- Vitest unit + integration green (incl. new tests)
- Playwright smoke green for the surface touched
- Visual baselines regenerated and reviewed in PR diff
- For PRs touching external integrations (9 OCR, 11 realapi): relevant `realapi` spec verified locally before merge
### Risk register
| Risk | Mitigation |
| ------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Alert engine false positives spam users | Each rule has a "snooze" window in metadata; rules ship behind a feature flag `alerts.{rule_id}.enabled`; QA seeds production-shape data before flipping flags on |
| Analytics queries slow on large datasets | `analytics_snapshots` materialized cache; cron recomputes off the request path; queries use existing per-port indexes |
| Claude Vision OCR cost spirals | Default to Haiku 4.5 (~10× cheaper than Sonnet); ephemeral system-prompt cache hits ~80%; per-port quota with admin-visible meter |
| OCR low-quality on blurry receipts | Confidence threshold (< 0.6) flips to "verify mode" — user must touch every field before save; failure metric tracked in admin/monitoring |
| Audit log table large (millions of rows) | Already partitioned-friendly via the GIN tsvector index; pagination uses cursor on `(created_at, id)` not OFFSET |
| Alert socket fanout overwhelms client | Throttle the engine cron to once per 5min; client debounces React Query refetches |
| Interest stale rule fires for legitimately paused leads | Add a per-interest `paused_until` field as a follow-up if operators ask; v1 ships without |
## Glossary
- **Alert** — operator-facing actionable card, rule-fired, dismissible
- **Rule** — a pure-function evaluator that takes (port, db) and returns alert candidates
- **Fingerprint** — `hash(rule_id + entity_type + entity_id)` used to dedupe alerts across re-evaluations
- **Snapshot** — cached chart data row in `analytics_snapshots`, refreshed on cron
- **EOI queue** — saved-view filter on the documents hub, not a separate page
- **OCR** — Claude Vision extraction of structured expense fields from receipt images
- **Audit log** — read view of the existing `audit_logs` table; no schema change beyond a tsvector column
## Open questions for the user
- Which port should be the **default landing dashboard** when a super-admin logs in (currently first-port-by-name; analytics page works the same)?
- Should the alert rail be **always visible on all dashboard pages** or only on `/dashboard` (currently spec'd as the latter)?
- Do you want the **Audit log retention policy** (delete > N days old) wired in v1 or deferred to Phase D?
- Should **OCR be opt-in per port** (admin toggle) or always-on with a quota?

# Google Workspace inbox-triage integration (exploratory)
**Status:** Exploratory — not approved for build
**Date:** 2026-04-29
**Tracks:** AI inbox-triage, Google Workspace email connection
## What this spec is for
The user has flagged inbox-triage as the most valuable AI surface left to
build, but conditioned email integration on it being via Google Workspace
specifically (not generic IMAP), with a per-port toggle so clients who
don't use GWS aren't billed for capability they can't reach.
This document captures what that build actually costs — especially on
the Google side, which is where most teams underestimate the work — so
we can decide whether to commit before writing any code. **Nothing in
this spec is approved for implementation.** The deliverable is a go /
no-go decision and, if go, a scope choice between three deployment
models that cost wildly different amounts of calendar time.
## What inbox-triage actually does for the user
Concretely, on the staff member's desktop:
1. **Linked-inbox panel on the client detail page.** When you open
`/[port]/clients/<id>` you see the last N email threads with that
client, pulled from the staff member's own Gmail. Each thread has
the latest message preview, an "open in Gmail" deep-link, and a
"draft reply" button (Phase 2+).
2. **Inbox triage queue.** A new top-level page `/[port]/inbox` that
lists unread/unanswered threads ranked by AI-assessed importance
(high-value client, contractual urgency, chase-overdue). Each row
has one-click actions: "log this as a note on the client",
"create a follow-up reminder", "draft reply".
3. **Email-driven alerts.** When a high-value client emails and no one
responds within X hours, the existing alerts engine fires a
`inbox.unanswered_high_value` rule (slots into the alert framework
from Phase B without schema change).
4. **Reply drafts (Phase 3).** AI generates a reply draft grounded in
the client's CRM record (open interests, pending reservations,
recent invoices). Staff edit and send through Gmail.
The value is selective: a port with three staff members fielding 50
client emails a day saves maybe an hour a day collectively if the
ranking is right. Below that volume the build doesn't pay back.
## What already exists in the codebase
The CRM is roughly halfway scaffolded for this:
| Surface | Status | Notes |
| ----------------------------------------------- | ----------------------- | ------------------------------------------------------------------------------------------------------------------------ |
| `email_accounts` table | ✅ Exists | Has `provider: 'google' \| 'outlook' \| 'custom'` discriminator and `imap_*` / `smtp_*` cols. Built for IMAP, not OAuth. |
| `email_threads` / `email_messages` tables | ✅ Exists | Already linked to `clientId`. Schema is good as-is for Gmail. |
| `email-threads.service.ts` `syncInbox()` | ⚠ Stub-ish | IMAP-flow only. Won't reach Gmail without OAuth + Gmail API rewrite. |
| `email` BullMQ queue + `inbox-sync` job name | ✅ Exists | Worker dispatches on the job name; new sync impl drops in. |
| `google_calendar_tokens` table | ✅ Exists | OAuth token storage shape we can mirror for Gmail. |
| Per-port email override (port `email_settings`) | ✅ Exists | Used for outbound only today; Gmail integration is per-staff-user, not per-port. |
| `ai_usage_ledger` + per-port `aiEnabled` flag | ✅ Exists (Phase 3a/3b) | Triage AI calls book against the same ledger. |
| `withRateLimit('ai', ...)` wrapper | ✅ Exists (Phase 3c) | Caps triage AI traffic at 60/min/user out of the box. |
Net: schemas are mostly right. The OAuth flow, Gmail API client, push
notification receiver, and triage classifier are the new builds.
## Why Google Workspace specifically
The user's stated constraint: "I don't think we need email integration
unless we connect it to Google Workspace." Reasons that hold up:
- **No password storage.** OAuth tokens are revocable, scoped, and
rotate. IMAP requires app passwords, which Google has been actively
deprecating since 2024 — they'll be gone for the workspace plans
this product targets.
- **Push notifications, not polling.** Gmail's `users.watch` API plus
Google Pub/Sub means we get an HTTP callback within seconds of a new
message landing. IMAP requires polling on a 30-60 second cadence,
which costs more and lags worse.
- **Search and labels.** The Gmail API exposes label management and
full-text search natively; IMAP search is much weaker.
- **Threading.** Gmail's `threadId` is canonical. Reconstructing
threads over IMAP from `In-Reply-To` / `References` headers is
reliable in theory, painful in practice.
Microsoft 365 is the obvious peer integration but is out of scope here.
The Graph API model is similar enough that a future M365 path can reuse
most of the storage shape.
## Three deployment models — pick one before building
This is the most important decision in the spec. Each model has
different OAuth-verification consequences, which dominate everything
else.
### Model A — Marketplace-published OAuth app
A single OAuth client owned by Port Nimara, listed in the Google
Workspace Marketplace, that any GWS customer can install. Each staff
member clicks "Connect Gmail," consents to the scopes, and the CRM
stores their refresh token.
**Google-side work:**
1. Build the OAuth flow in CRM (~1 week).
2. Submit for OAuth verification. Gmail's `gmail.readonly` /
`gmail.modify` scopes are **restricted scopes** — they require:
- Domain-verified production URLs
- A homepage with a privacy policy that explicitly enumerates which
scopes are used and why
- A demo video (literally a screen recording) showing the consent
screen and what happens next
- **A third-party security assessment from a Google-approved
vendor** ($15k–$75k, 6–12 weeks)
- A Cloud Application Security Assessment (CASA) report
3. Marketplace listing review (~2 weeks after CASA passes).
**Calendar time:** 4–6 months.
**Money:** $15k–$75k for the security assessment alone.
**Recurring:** Re-verification every 12 months.
Right answer if Port Nimara wants to be the marina-CRM that ships GWS
out of the box for _any_ customer. Wrong answer if there are <5
customers who'd use it.
### Model B — Per-customer "Internal" OAuth app
Each customer's GWS admin creates an OAuth client _inside their own
workspace_ and gives Port Nimara the client ID + secret. Because the
app is "Internal," Google skips verification entirely — the consent
screen is unverified-but-permitted. Tokens never cross workspace
boundaries.
**Google-side work per customer:**
1. Customer's GWS admin enables the Gmail API in their Cloud project.
2. Creates an OAuth 2.0 client ID with type "Internal" + your CRM's
redirect URI.
3. Hands the client ID + secret to Port Nimara out-of-band.
4. Staff connect their Gmail through that client.
**Calendar time per customer:** ~1 hour of admin work.
**Money:** $0.
**Limit:** Doesn't span across GWS workspaces. A user with two GWS
accounts (e.g. the marina + a personal workspace) can only connect the
one matching the OAuth client.
This is the **clear winner for the current customer base**: small
number of customers, each with their own GWS workspace, and each
buying the integration as part of an onboarding conversation.
### Model C — Forward-to-CRM mailbox
The CRM exposes a per-port email alias (e.g.
`port-nimara-NN@inbox.portnimara.com`). Customers configure a Gmail
filter or mailing rule that BCCs that alias on relevant threads. The
CRM ingests via SMTP and runs the same triage pipeline.
**Google-side work:** None. Customer does it as a Gmail filter.
**Calendar time:** ~1 week of CRM-side build.
**Limit:** Receive-only — no reply drafts, no thread state changes,
no labels. The "draft reply" feature in Phase 3 above is impossible
under this model.
Model C is the right answer if the user wants to ship inbox-triage
_now_ and decide on bidirectional Gmail integration later. The schema
is designed so the model can be upgraded to A or B without data
migration.
### Recommendation
**Build Model B first.** It costs nothing on the Google side, takes
~3 weeks of CRM work, and matches the actual customer profile.
**Promote to Model A only after 3+ paying customers ask for it
unprompted.** Until then, the security-assessment cost can't justify
itself.
Model C as a fallback for customers who refuse to set up an Internal
OAuth app. Build it last, lazily — the schema accommodates it.
## End-to-end flow (Model B)
### 1. Per-port OAuth-app config
New admin page `/[port]/admin/google-workspace`:
- Field: "OAuth client ID" (their internal client ID)
- Field: "OAuth client secret" (encrypted at rest using `ENCRYPTION_KEY`)
- Field: "Authorized redirect URI" (read-only; we display the value
they need to paste into their Google Cloud Console)
- Toggle: "Enable Gmail integration for this port"
Stored in `system_settings` under key `gws.config`, port-scoped.
Resolution mirrors the existing OCR config service.
### 2. Per-staff connect flow
Staff member visits `/[port]/me/integrations`, clicks "Connect Gmail."
```
GET /api/v1/auth/gws/start
→ looks up port's gws.config
→ builds Google authorize URL with port's client_id + state token
→ 302 to Google
[ user consents ]
→ 302 back to /api/v1/auth/gws/callback?code=…&state=…
→ exchanges code for tokens via port's client_secret
→ stores in new `gws_user_tokens` table (encrypted)
→ schedules an `inbox-watch` job
```
### 3. Push notification subscription
After tokens are stored, the worker calls
`gmail.users.watch({ topicName: <Pub/Sub topic>, labelIds: ['INBOX'] })`.
Gmail then posts to a Pub/Sub topic on every inbox change. The CRM
exposes a Pub/Sub push subscription endpoint at
`/api/webhooks/gmail-push` which fetches the changed messages via the
delta `historyId` and writes them into `email_messages`.
Watch subscriptions expire every 7 days. A maintenance job
re-establishes them daily.
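The maintenance job only needs a cheap staleness check per token row before calling `users.watch` again. The 48-hour renewal window below is an assumption (chosen so one missed daily run doesn't drop a watch), not a stated requirement:

```typescript
// Watch subscriptions expire after 7 days; the daily job renews any that
// expire within the next 48h, so a single missed run is survivable.
const RENEW_WINDOW_MS = 48 * 60 * 60 * 1000;

export function needsRenewal(watchExpiresAt: Date | null, now: Date = new Date()): boolean {
  if (!watchExpiresAt) return true; // never established: set it up
  return watchExpiresAt.getTime() - now.getTime() < RENEW_WINDOW_MS;
}
```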
### 4. Triage pipeline
For each new inbound message:
1. Match against `clients` and `companies` by `from_address` against
`client_contacts` (email channel). Persist a thread→client link if
found.
2. If port has `aiEnabled` AND `gws.triageEnabled`, queue an `ai`
job that classifies the thread:
- `urgency`: low / medium / high
- `category`: invoice-question / availability / contract / other
- `requires_response`: boolean
3. AI call records into `ai_usage_ledger` with `feature='inbox_triage'`.
The existing per-port budget gates apply automatically.
4. Triage output written to a new `email_triage` table keyed on
`email_messages.id`.
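The `/inbox` ranking over triage output can be sketched as a scoring function; the weights are assumptions for illustration, to be tuned in G3:

```typescript
// Rank an email_triage row for /inbox sorting (higher = more urgent).
interface Triage {
  urgency: 'low' | 'medium' | 'high';
  requiresResponse: boolean;
  category: 'invoice-question' | 'availability' | 'contract' | 'other';
}

export function triageRank(t: Triage): number {
  const urgency = { low: 0, medium: 10, high: 20 }[t.urgency];
  const contract = t.category === 'contract' ? 5 : 0; // contractual urgency boost
  return urgency + contract + (t.requiresResponse ? 3 : 0);
}
```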
### 5. UI surfaces
- `/[port]/inbox` — sorted by triage rank, port-wide view.
- Linked-inbox panel on `client-tabs.tsx` — adds a new "Email" tab
pulling from `email_threads` filtered to that client.
- Alert rule `inbox.unanswered_high_value` slots into Phase B's
alert engine; no schema change.
## Schema additions
Three new tables, all port-scoped where it matters:
```ts
// Per-staff Gmail tokens. Mirror of google_calendar_tokens.
gws_user_tokens {
id, userId (UNIQUE), portId, emailAddress,
accessTokenEnc, refreshTokenEnc, tokenExpiry,
scope, watchExpiresAt, watchHistoryId,
connectedAt, lastSyncAt, syncEnabled, createdAt, updatedAt
}
// Triage classifications keyed to messages.
email_triage {
messageId (PK, FK email_messages.id ON DELETE CASCADE),
urgency, category, requiresResponse,
modelVersion, tokensUsed, classifiedAt
}
// Pub/Sub idempotency log. Gmail re-delivers; we dedupe.
gws_push_log {
messageId (Pub/Sub message id, PK),
historyId, receivedAt
}
```
Plus extensions to `email_messages`:
- `googleMessageId` (text, indexed) — Gmail's own ID for thread ops.
- `googleThreadId` (text, indexed).
- `gmailLabels` (text[]) — for "is unread" checks without hitting Gmail.
The existing `emailAccounts.provider='google'` discriminator is reused
unchanged; the IMAP fields become nullable, since OAuth-flow accounts won't
populate them.
## AI cost interaction
Triage AI is opt-in **twice**: the port admin must turn on
`aiEnabled` (Phase 3a flag, default off) **and** `gws.triageEnabled`
(this spec, default off). Either toggle off and the inbox sync still
runs but skips classification, so staff can manually scan threads
without burning tokens.
Per-message token cost on a current Haiku-class model is roughly
1500–2500 tokens including the system prompt. A port doing 200 inbound
emails a day at the upper bound is ~500k tokens/day. The default
hard-cap is 500k/month, so triage will trip it inside a day. Two
mitigations baked in:
- The system prompt is short (<500 tokens) and prompt-cached on the
Anthropic side, so most tokens are output.
- Triage runs only on threads not already classified — re-syncs from
the watch loop don't re-bill.
The admin UI shows triage as its own line in the per-feature breakdown
so customers can see how much their inbox is costing them and tune
caps accordingly.
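The budget interaction above reduces to simple arithmetic:

```typescript
// Back-of-envelope check for the cap interaction described above.
const tokensPerMessageUpperBound = 2500;
const emailsPerDay = 200;
const dailyTokens = emailsPerDay * tokensPerMessageUpperBound; // 500_000
const monthlyCap = 500_000; // default hard-cap
const daysUntilCapHit = monthlyCap / dailyTokens; // 1: the cap trips on day one
```

Any port above ~7 inbound emails/day at the upper bound needs its cap raised before enabling triage.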
## Phased build (assuming Model B)
| Phase | Scope | Effort | Ships when |
| ---------------------------- | ------------------------------------------------------------------------------------------------------- | ------ | ----------------------------------------------------------- |
| **G1** Connect | OAuth flow + per-port config + per-user token storage. No sync yet. Staff can connect; nothing happens. | 1 week | Standalone |
| **G2** Read-only sync | Pub/Sub push receiver + delta sync into `email_messages`. Linked-inbox tab on client detail. No AI. | 1 week | After G1 |
| **G3** Triage classification | AI classifier, `email_triage` writes, `/inbox` page sorting. Per-port toggle. | 1 week | After G2; depends on Phase 3b budgets being live (they are) |
| **G4** Reply drafts | Gmail API send + draft creation. "Draft reply" button on the client detail Email tab. | 1 week | After G3 |
| **G5** Alerts | New `inbox.unanswered_high_value` rule. Hooks into Phase B alert engine. | 2 days | After G3 |
Total: ~5 weeks for a single engineer, assuming the user provides one
real GWS workspace to test against during G1.
## Open decisions for the user
These are the questions to resolve before scheduling the build, in
priority order:
1. **Deployment model — A, B, or C?** Default recommendation B.
2. **Single user or domain-wide delegation?** Per-staff connect (one
token per user) is simpler. Domain-wide delegation lets the port
admin connect once on behalf of every staff member but requires
the customer to grant a service account broader access. Default
recommendation: per-staff.
3. **Scope set.** Minimal viable scope is `gmail.readonly`. To send
replies (G4) we need `gmail.send`. To manage labels (e.g. mark
"triaged-by-CRM") we need `gmail.modify`. Each scope expansion
widens the consent screen scariness but doesn't add new
verification steps under Model B.
4. **Pub/Sub topic ownership.** Pub/Sub topics live in _some_ GCP
project. Under Model B the customer's project owns the topic —
they pay for Pub/Sub (cents/month) and grant our service account
subscriber access. Alternative: Port Nimara owns the topic and
the customer's Gmail publishes cross-project (allowed, slightly
more setup). Default: customer-owned topic, fewer moving parts.
5. **Triage model.** Haiku 4.5 is right for cost; Sonnet 4.6 is
right if the ranking quality on Haiku turns out to be poor.
Defer this until G3 has real-world tuning data.
## Things that are NOT in this spec
- **Microsoft 365 / Outlook integration.** Same shape, different API.
Once Model B is proven on GWS, Graph API takes another ~3 weeks.
- **Reply drafts grounded in CRM context.** That's G4 and depends on
the work in this spec, but the prompt engineering for "good replies
citing this client's open interests + reservations + invoices"
deserves its own design pass before building.
- **Cross-staff triage queue (i.e. "show me all unanswered emails
across the team").** That requires either domain-wide delegation
(decision #2 above) or per-staff opt-in to a shared view. Punt
until staff actually ask for it.
- **Sentiment / urgency tone analysis.** Tempting; almost always
wrong; skip in v1.
- **"Smart drafts" using the recipient's past replies as context.**
Every customer asks for this and almost no one uses it once
built. Skip.
## Cost summary at a glance
| Item | Model A | Model B | Model C |
| ------------------------------- | ------------------------------- | -------------------------------------- | ------------------------------------ |
| Build effort | 3–4 weeks | ~5 weeks (over G1–G5) | ~1 week (receive-only) |
| Calendar time to first customer | 4–6 months | 1 hour of customer admin work | 1 hour of customer Gmail-filter work |
| Up-front cash | $15k–$75k (CASA) | $0 | $0 |
| Recurring | Re-verification annually | None | None |
| Best for | 50+ customers, Marketplace play | 1–10 customers, white-glove onboarding | Customers who refuse OAuth setup |
The recommendation stands: build Model B for G1 + G2 + G3, ship that,
and let real customer demand decide whether G4/G5 and Model A
promotion are worth the calendar time.

docs/website-refactor.md

@@ -0,0 +1,160 @@
# Website → CRM wiring refactor
The `website/` subrepo (Nuxt) currently writes inquiry submissions to NocoDB.
The new CRM exposes its own public ingestion endpoints, so the website needs
to be re-pointed at the CRM and the website's local server-side helpers can
eventually be retired.
This document describes **what needs to change in the website repo**. Nothing
here applies to the CRM repo — that side is already done.
## Endpoints the CRM now exposes
Both are unauthenticated, IP-rate-limited (5/hour), and require an explicit
port id (query param `?portId=…` or header `X-Port-Id`).
| Form intent | New CRM endpoint | Old NocoDB target |
| -------------------- | ---------------------------------------- | ------------------------ |
| Berth interest | `POST /api/public/interests` | `Interests` (NocoDB) |
| Residential interest | `POST /api/public/residential-inquiries` | `Interests (Residences)` |
Notification emails (client confirmation + sales-team alert) are sent by the
CRM itself when these endpoints succeed, so the website's
`sendRegistrationEmails` helper (`server/utils/email.ts`) is no longer
required for these flows.
## Required changes in the website repo
### 1. New env vars
Add to `.env` and the deploy environment:
```
PN_CRM_BASE_URL=https://crm.portnimara.com
PN_CRM_PORT_ID=<uuid of the Port Nimara port row in CRM>
```
`PN_CRM_BASE_URL` defaults to the prod CRM. In dev it can point to the local
tunnel (`shoulder-contain-…trycloudflare.com`) so submissions hit a dev DB.
### 2. Refactor `server/api/register.ts`
Today the file owns both the berth and residence branches and writes to
NocoDB directly. After the refactor, both branches just relay to the CRM:
```ts
const baseUrl = process.env.PN_CRM_BASE_URL;
const portId = process.env.PN_CRM_PORT_ID;
if (category === 'Residences') {
await $fetch(`${baseUrl}/api/public/residential-inquiries?portId=${portId}`, {
method: 'POST',
body: {
firstName: body.first_name,
lastName: body.last_name,
email: body.email,
phone: body.phone,
placeOfResidence: body.address,
preferredContactMethod: body.method_of_contact, // 'email' | 'phone'
notes: body.notes,
// preferences: collect via new optional textarea (see section 4)
},
});
return { success: true };
}
// Berth branch
await $fetch(`${baseUrl}/api/public/interests?portId=${portId}`, {
method: 'POST',
body: {
// map to the CRM's publicInterestSchema (see src/lib/validators/interests.ts)
firstName: body.first_name,
lastName: body.last_name,
email: body.email,
phone: body.phone,
address: body.address,
berthSize: body.berth_size,
berthMinLength: body.berth_min_length,
berthMinWidth: body.berth_min_width,
berthMinDraught: body.berth_min_draught,
yachtName: body.berth_yacht_name,
preferredMethodOfContact: body.method_of_contact,
specificBerthMooring: body.berth, // optional, links interest to a specific berth
},
});
return { success: true };
```
The reCAPTCHA verification stays in the website handler — the CRM trusts the
website to gate its public endpoints.
### 3. Retire dead code
After step 2, the following can be deleted from the website:
- `server/utils/websiteInterests.ts`
- `server/utils/residentialInterests.ts`
- `server/utils/nocodb.ts`
- The NocoDB-specific call sites in `server/utils/email.ts` (the CRM
sends its own confirmation/alert emails)
- NocoDB env vars (`NOCODB_*`)
The Nuxt `/api/berths` route stays as-is — it reads from the
`directus_items.berths` collection for the public site, not the CRM.
### 4. Form additions on `pages/register.vue`
The current residence branch only collects contact info. The CRM accepts an
optional `preferences` field (free-text) and `notes` field. Add a
"Preferences" textarea inside the residences block of
`components/pn/specific/website/register/form.vue`:
```vue
<transition name="fade-down">
<div v-show="interest === 'residences'">
<vee-field
as="textarea"
class="form-input py-3 px-0 md:text-lg border-0 border-t border-davysgrey ..."
placeholder="Tell us what you're looking for (unit type, budget, timeline)"
name="residence_preferences"
:disabled="loading"
/>
</div>
</transition>
```
Append `preferences: body.residence_preferences` in the POST body in
`server/api/register.ts`.
### 5. Stand up a residential-only `residences.vue` form (optional)
Today the residences interest is captured on `register.vue` via a radio. If
the marketing team wants a dedicated CTA on `residences.vue`, add a small
inline form using the same submit handler from step 2. No new endpoint —
this is purely a UX addition.
## Deployment order
1. **CRM first**: deploy this repo, ensure `/api/public/interests` and
`/api/public/residential-inquiries` are reachable from the website host.
2. **Verify in CRM**: configure `Inquiry Contact Email` and (for residential)
`Residential Notification Recipients` per port in
admin → settings.
3. **Smoke test from a dev tunnel** (curl the public endpoints with a JSON
payload). Confirm rows land in `clients`/`residential_clients` and
notification emails are received.
4. **Then deploy website changes** (sections 1–3 above). The form
submissions immediately start landing in the new CRM.
5. **Cut-over note**: once the website is pointed at the CRM, leave the
NocoDB tables read-only as a historical archive. Don't delete them until
prod data has been imported into the new CRM (see "Prod data import
strategy" task #59 in the task list).
## Open questions
- **Port routing for multi-port deploys**: today the website only knows about
Port Nimara. If/when the website serves multiple ports, the `portId`
resolution needs to happen per-domain or per-route, not a single env var.
- **Brand/email domain**: confirm whether residential confirmations should
send from the same `noreply@letsbe.solutions` address as marina, or a
dedicated residential mailbox. The CRM uses `SMTP_FROM`, which is global.

package.json

@@ -52,6 +52,7 @@
     "@tanstack/react-query": "^5.62.0",
     "@tanstack/react-query-devtools": "^5.62.0",
     "@tanstack/react-table": "^8.21.3",
+    "archiver": "^7.0.1",
     "better-auth": "^1.2.0",
     "bullmq": "^5.25.0",
     "class-variance-authority": "^0.7.0",
@@ -61,7 +62,9 @@
     "drizzle-orm": "^0.38.0",
     "imapflow": "^1.2.13",
     "ioredis": "^5.4.0",
+    "iso-3166-2": "^1.0.0",
     "jose": "^6.2.1",
+    "libphonenumber-js": "^1.12.42",
     "lucide-react": "^0.460.0",
     "mailparser": "^3.9.4",
     "minio": "^8.0.0",
@@ -83,12 +86,15 @@
     "sonner": "^1.7.0",
     "tailwind-merge": "^2.6.0",
     "tailwindcss-animate": "^1.0.7",
+    "tesseract.js": "^7.0.0",
     "zod": "^3.24.0",
     "zustand": "^5.0.0"
   },
   "devDependencies": {
     "@eslint/eslintrc": "^3.3.5",
     "@playwright/test": "^1.58.2",
+    "@types/archiver": "^7.0.0",
+    "@types/iso-3166-2": "^1.0.4",
     "@types/mailparser": "^3.4.6",
     "@types/node": "^22.0.0",
     "@types/nodemailer": "^6.4.0",


@@ -51,6 +51,30 @@ export default defineConfig({
       viewport: { width: 1440, height: 900 },
     },
   },
+  {
+    // Real-API tests hit live external services (Documenso, IMAP, etc.).
+    // Opt-in only: pnpm exec playwright test --project=realapi
+    name: 'realapi',
+    testMatch: /realapi\/.*\.spec\.ts/,
+    dependencies: ['setup'],
+    timeout: 120_000,
+    use: {
+      ...devices['Desktop Chrome'],
+      viewport: { width: 1440, height: 900 },
+    },
+  },
+  {
+    // Visual regression baselines. Regenerate with --update-snapshots after
+    // intentional UI changes; otherwise pnpm exec playwright test --project=visual
+    // diffs against the committed PNGs.
+    name: 'visual',
+    testMatch: /visual\/.*\.spec\.ts/,
+    dependencies: ['setup'],
+    use: {
+      ...devices['Desktop Chrome'],
+      viewport: { width: 1440, height: 900 },
+    },
+  },
 ],
 // Don't start the dev server — we expect it to already be running

pnpm-lock.yaml (generated)

File diff suppressed because it is too large Load Diff

scripts/audit-permissions.ts

@@ -0,0 +1,188 @@
/**
* Permission-matrix audit.
*
* Walks every src/app/api/v1/** /route.ts file and reports each exported HTTP
* handler (GET/POST/PUT/PATCH/DELETE) that is *not* wrapped in withPermission().
* Internal v1 routes should be permission-gated; routes that intentionally use
* withAuth() alone (e.g. user-self endpoints) can be allow-listed below.
*
* Run:
* pnpm tsx scripts/audit-permissions.ts
*
* Exit code:
* 0 — every handler is permission-gated or in the allow-list
* 1 — at least one handler is missing both a withPermission wrapper and an
* allow-list entry. CI should fail.
*/
import { readdir, readFile } from 'node:fs/promises';
import { join, relative } from 'node:path';
const ROOT = join(process.cwd(), 'src/app/api/v1');
const HTTP_METHODS = ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'] as const;
/**
* Routes intentionally exempt from withPermission. Each entry should explain
* why — typically because the route operates on the caller's own resources
* (no port-level permission semantics) or is admin-only and gated by
* isSuperAdmin inside the handler.
*/
const ALLOW_LIST: ReadonlyArray<{ pattern: RegExp; reason: string }> = [
// Self / admin / public
{ pattern: /\/me\/route\.ts$/, reason: 'Self-endpoint — auth is sufficient.' },
{ pattern: /\/admin\//, reason: 'Admin-only — gated by isSuperAdmin inside handler.' },
{
pattern: /\/notifications\//,
reason: 'User-scoped notifications — caller is the resource owner.',
},
{ pattern: /\/socket\//, reason: 'Socket auth handshake.' },
{ pattern: /\/health\//, reason: 'Public health check.' },
{ pattern: /\/users\/me\//, reason: 'User-self preferences — caller is the resource owner.' },
{ pattern: /\/saved-views\//, reason: 'User-self saved views — caller is the resource owner.' },
{
pattern: /\/settings\/feature-flag\//,
reason: 'Public read of feature-flag bool — no PII; auth is sufficient.',
},
// Cross-cutting / port-scoped reference data
{ pattern: /\/tags\//, reason: 'Tags are cross-cutting reference data; port-scoped via auth.' },
{
pattern: /\/currency\/(convert|rates)\/route\.ts$/,
reason: 'Currency reference data; port-scoped, no PII.',
},
{
pattern: /\/currency\/rates\/refresh\//,
reason: 'TODO: gate with admin:manage_settings — currently allow-listed.',
},
{
pattern: /\/search\//,
reason: 'Port-scoped search — results filtered by auth context (resources have own perms).',
},
// Alerts surface in topbar/dashboard for every signed-in user; per-port not per-resource.
{ pattern: /\/alerts\//, reason: 'Alerts are user-scoped; port-filtered via auth context.' },
// Internally gated by isSuperAdmin
{
pattern: /\/expenses\/export\/parent-company\//,
reason: 'Internally gated by isSuperAdmin inside the handler.',
},
// Pending dedicated permissions
{
pattern: /\/ai\//,
reason: 'TODO: needs ai:* permission catalog entry. Currently allow-listed.',
},
{
pattern: /\/custom-fields\/\[entityId\]\//,
reason: 'TODO: needs custom_fields:* permission. PUT path internally validated.',
},
{
pattern: /\/berth-reservations\/\[id\]\/route\.ts$/,
reason: 'TODO: PATCH should map to reservations:edit (not currently in catalog).',
},
];
interface Finding {
file: string;
method: string;
reason: 'no-withPermission' | 'no-withAuth' | 'allow-listed';
allowReason?: string;
}
async function* walk(dir: string): AsyncGenerator<string> {
for (const entry of await readdir(dir, { withFileTypes: true })) {
const path = join(dir, entry.name);
if (entry.isDirectory()) yield* walk(path);
else if (entry.isFile() && entry.name === 'route.ts') yield path;
}
}
function isAllowListed(file: string): { allowed: boolean; reason?: string } {
for (const { pattern, reason } of ALLOW_LIST) {
if (pattern.test(file)) return { allowed: true, reason };
}
return { allowed: false };
}
async function auditFile(file: string): Promise<Finding[]> {
const src = await readFile(file, 'utf-8');
const findings: Finding[] = [];
for (const method of HTTP_METHODS) {
// Match: export const GET = withAuth(...
const declRe = new RegExp(`export\\s+const\\s+${method}\\s*=\\s*(.+?);`, 's');
const m = declRe.exec(src);
if (!m) continue;
const block = m[1] ?? '';
const hasAuth = /withAuth\s*\(/.test(block);
const hasPerm = /withPermission\s*\(/.test(block);
const allow = isAllowListed(file);
if (!hasAuth) {
findings.push({ file, method, reason: 'no-withAuth' });
continue;
}
if (!hasPerm) {
if (allow.allowed) {
findings.push({ file, method, reason: 'allow-listed', allowReason: allow.reason });
} else {
findings.push({ file, method, reason: 'no-withPermission' });
}
}
}
return findings;
}
async function main() {
const files: string[] = [];
for await (const f of walk(ROOT)) files.push(f);
files.sort();
const all: Finding[] = [];
for (const f of files) all.push(...(await auditFile(f)));
const violations = all.filter(
(f) => f.reason === 'no-withPermission' || f.reason === 'no-withAuth',
);
const allowListed = all.filter((f) => f.reason === 'allow-listed');
// Markdown report
const lines: string[] = [];
lines.push('# Permission Matrix Audit');
lines.push('');
lines.push(`Scanned ${files.length} route files under \`src/app/api/v1/\`.`);
lines.push('');
if (violations.length === 0) {
lines.push('**No violations.** Every internal v1 handler is permission-gated.');
} else {
lines.push(`**${violations.length} violation(s):**`);
lines.push('');
lines.push('| File | Method | Issue |');
lines.push('| --- | --- | --- |');
for (const v of violations) {
const rel = relative(process.cwd(), v.file);
lines.push(`| \`${rel}\` | ${v.method} | ${v.reason} |`);
}
}
lines.push('');
lines.push(
`**Allow-listed:** ${allowListed.length} handler(s) intentionally skip \`withPermission\`.`,
);
if (allowListed.length > 0) {
lines.push('');
lines.push('| File | Method | Reason |');
lines.push('| --- | --- | --- |');
for (const a of allowListed) {
const rel = relative(process.cwd(), a.file);
lines.push(`| \`${rel}\` | ${a.method} | ${a.allowReason} |`);
}
}
process.stdout.write(lines.join('\n') + '\n');
process.exit(violations.length > 0 ? 1 : 0);
}
main().catch((err) => {
console.error(err);
process.exit(2);
});


@@ -0,0 +1,51 @@
#!/usr/bin/env bash
# Hourly MinIO mirror for Port Nimara CRM.
#
# Mirrors the live `MINIO_BUCKET` to the backup destination. `mc mirror`
# is incremental — only changed objects transfer — so this is cheap.
#
# Versioning on the destination bucket is what protects against object
# deletes / overwrites; we don't try to roll our own.
set -euo pipefail
: "${MINIO_ENDPOINT:?MINIO_ENDPOINT not set}"
: "${MINIO_ACCESS_KEY:?MINIO_ACCESS_KEY not set}"
: "${MINIO_SECRET_KEY:?MINIO_SECRET_KEY not set}"
: "${MINIO_BUCKET:?MINIO_BUCKET not set}"
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"
# Default scheme: live MinIO is plain HTTP unless MINIO_USE_SSL=true.
if [[ "${MINIO_USE_SSL:-false}" == "true" ]]; then
LIVE_URL="https://${MINIO_ENDPOINT}:${MINIO_PORT:-443}"
else
LIVE_URL="http://${MINIO_ENDPOINT}:${MINIO_PORT:-9000}"
fi
LIVE_ALIAS="live-$$"
BACKUP_ALIAS="bk-$$"
trap 'mc alias remove "$LIVE_ALIAS" 2>/dev/null || true; mc alias remove "$BACKUP_ALIAS" 2>/dev/null || true' EXIT
mc alias set "$LIVE_ALIAS" "$LIVE_URL" \
"$MINIO_ACCESS_KEY" "$MINIO_SECRET_KEY" --api S3v4 >/dev/null
mc alias set "$BACKUP_ALIAS" "$BACKUP_S3_ENDPOINT" \
"$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" --api S3v4 >/dev/null
SOURCE="${LIVE_ALIAS}/${MINIO_BUCKET}/"
DEST="${BACKUP_ALIAS}/${BACKUP_S3_BUCKET}/minio/"
echo "[$(date -u +%FT%TZ)] Mirroring $SOURCE → $DEST"
# `--remove` would delete objects from the destination that no longer
# exist in source — we DON'T pass it, because that would let an
# accidental delete on the live bucket cascade into permanent loss on
# the backup side. Versioning + lifecycle handle stale-object cleanup.
mc mirror --quiet --overwrite "$SOURCE" "$DEST"
# Print byte / count diff for the operator.
echo "[$(date -u +%FT%TZ)] Done. Destination summary:"
mc du "$DEST"
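The mirror deliberately leans on destination-side versioning instead of `--remove`. A one-time setup sketch for the backup bucket (alias and bucket names are hypothetical, and the exact `mc ilm` flags should be checked against the installed `mc` version):

```shell
# Versioning protects against deletes/overwrites cascading from the live side.
mc version enable backup/crm-backups

# Lifecycle rule so noncurrent (overwritten) object versions age out,
# e.g. after 30 days — this is the "stale-object cleanup" the script mentions.
mc ilm rule add backup/crm-backups --noncurrent-expire-days 30
```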


@@ -0,0 +1,63 @@
#!/usr/bin/env bash
# Hourly PostgreSQL backup for Port Nimara CRM.
#
# Reads DATABASE_URL and BACKUP_S3_* from the environment. Dumps to a
# tmpfile, gzips, optionally GPG-encrypts to BACKUP_GPG_RECIPIENT, and
# uploads to s3://${BACKUP_S3_BUCKET}/pg/<hostname>/<UTC-date>/<hour>.dump.gz[.gpg].
#
# Designed to fail loud: any non-zero exit halts the script and propagates
# to the cron / CI runner so the operator sees the failure.
set -euo pipefail
: "${DATABASE_URL:?DATABASE_URL not set}"
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"
HOST="${BACKUP_HOST_OVERRIDE:-$(hostname -s)}"
DATE_UTC="$(date -u +%Y-%m-%d)"
HOUR_UTC="$(date -u +%H)"
WORKDIR="$(mktemp -d)"
trap 'rm -rf "$WORKDIR"' EXIT
DUMP_FILE="$WORKDIR/${HOUR_UTC}.dump"
ARCHIVE_NAME="${HOUR_UTC}.dump.gz"
echo "[$(date -u +%FT%TZ)] Dumping $DATABASE_URL → $DUMP_FILE"
pg_dump --format=custom --compress=9 --no-owner --no-privileges \
--file="$DUMP_FILE" "$DATABASE_URL"
# pg_dump's `custom` format is already compressed, but we wrap in gzip so
# the file looks the same regardless of the dump format on disk.
gzip -n "$DUMP_FILE"
GZ_FILE="${DUMP_FILE}.gz"
# Optional GPG layer. Only encrypt if the recipient is configured.
if [[ -n "${BACKUP_GPG_RECIPIENT:-}" ]]; then
echo "[$(date -u +%FT%TZ)] Encrypting for $BACKUP_GPG_RECIPIENT"
gpg --batch --yes --trust-model always \
--recipient "$BACKUP_GPG_RECIPIENT" \
--encrypt --output "${GZ_FILE}.gpg" "$GZ_FILE"
rm "$GZ_FILE"
GZ_FILE="${GZ_FILE}.gpg"
ARCHIVE_NAME="${ARCHIVE_NAME}.gpg"
fi
# Configure mc client for the backup destination.
MC_ALIAS="bk-$$"
mc alias set "$MC_ALIAS" "$BACKUP_S3_ENDPOINT" \
"$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" \
--api S3v4 >/dev/null
REMOTE_PATH="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${DATE_UTC}/${ARCHIVE_NAME}"
echo "[$(date -u +%FT%TZ)] Uploading → $REMOTE_PATH"
mc cp --quiet "$GZ_FILE" "$REMOTE_PATH"
# Tag with retention metadata so lifecycle rules can decide what to expire.
mc tag set "$REMOTE_PATH" "kind=hourly&host=${HOST}&date=${DATE_UTC}" >/dev/null
mc alias remove "$MC_ALIAS" >/dev/null
echo "[$(date -u +%FT%TZ)] OK ${ARCHIVE_NAME} ($(du -h "$GZ_FILE" | cut -f1))"
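Both backup scripts describe themselves as hourly; a crontab sketch wiring them up (the env-file path, script paths, and minute offsets are hypothetical — both scripts read their credentials from the environment, so source an env file first):

```shell
# m h dom mon dow  command
17 * * * *  . /etc/crm-backup.env && /opt/crm/scripts/backup/pg-backup.sh    >>/var/log/crm-backup.log 2>&1
43 * * * *  . /etc/crm-backup.env && /opt/crm/scripts/backup/minio-mirror.sh >>/var/log/crm-backup.log 2>&1
```

Staggering the minutes keeps the two jobs from competing for bandwidth, and the shared log gives the operator one place to check when a run fails loud.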

scripts/backup/restore.sh

@@ -0,0 +1,121 @@
#!/usr/bin/env bash
# Cold-restore script for Port Nimara CRM.
#
# Two modes:
# --drill Restore to a sandbox DB ($DRILL_DATABASE_URL) + a tagged
# sandbox path on the live MinIO bucket. Used by the weekly
# cron drill so the runbook stays accurate.
# (no --drill) Interactive production restore. Prompts before each
# destructive step; refuses to run if the live DB has
# non-empty tables (caller is expected to drop first).
#
# Common args:
# --snapshot YYYY-MM-DD/HH Specific dump to restore. Defaults to "latest".
set -euo pipefail
DRILL=0
SNAPSHOT="latest"
while [[ $# -gt 0 ]]; do
case "$1" in
--drill) DRILL=1; shift ;;
--snapshot) SNAPSHOT="$2"; shift 2 ;;
*) echo "unknown arg: $1" >&2; exit 2 ;;
esac
done
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"
if [[ "$DRILL" -eq 1 ]]; then
: "${DRILL_DATABASE_URL:?DRILL_DATABASE_URL not set}"
TARGET_DB="$DRILL_DATABASE_URL"
echo "[drill] target DB = $TARGET_DB"
else
: "${DATABASE_URL:?DATABASE_URL not set}"
TARGET_DB="$DATABASE_URL"
read -rp "About to overwrite $TARGET_DB. Type 'restore' to continue: " confirm
[[ "$confirm" == "restore" ]] || { echo "aborted"; exit 1; }
fi
HOST="${BACKUP_HOST_OVERRIDE:-$(hostname -s)}"
WORKDIR="$(mktemp -d)"
trap 'rm -rf "$WORKDIR"' EXIT
MC_ALIAS="bk-$$"
mc alias set "$MC_ALIAS" "$BACKUP_S3_ENDPOINT" \
"$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" --api S3v4 >/dev/null
trap 'rm -rf "$WORKDIR"; mc alias remove "$MC_ALIAS" 2>/dev/null || true' EXIT
# Resolve the snapshot path.
if [[ "$SNAPSHOT" == "latest" ]]; then
REMOTE=$(mc ls --recursive "${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/" \
| awk '{print $NF}' | sort | tail -1)
if [[ -z "$REMOTE" ]]; then
echo "no snapshots found under ${BACKUP_S3_BUCKET}/pg/${HOST}/" >&2
exit 1
fi
REMOTE="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${REMOTE}"
else
REMOTE="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${SNAPSHOT}.dump.gz"
# If GPG was used, the file lives at .dump.gz.gpg. Try both.
if ! mc stat "$REMOTE" >/dev/null 2>&1; then
REMOTE="${REMOTE}.gpg"
fi
fi
echo "[$(date -u +%FT%TZ)] Pulling $REMOTE"
LOCAL="$WORKDIR/$(basename "$REMOTE")"
mc cp --quiet "$REMOTE" "$LOCAL"
# Decrypt if needed.
if [[ "$LOCAL" == *.gpg ]]; then
echo "[$(date -u +%FT%TZ)] Decrypting"
gpg --batch --yes --decrypt --output "${LOCAL%.gpg}" "$LOCAL"
rm "$LOCAL"
LOCAL="${LOCAL%.gpg}"
fi
# Decompress.
gunzip "$LOCAL"
LOCAL="${LOCAL%.gz}"
echo "[$(date -u +%FT%TZ)] Restoring into $TARGET_DB"
# Drop & recreate to guarantee no half-state from a prior run.
DB_NAME=$(echo "$TARGET_DB" | sed -E 's|.*/([^?]+).*|\1|')
ADMIN_URL=$(echo "$TARGET_DB" | sed -E "s|/${DB_NAME}|/postgres|")
psql "$ADMIN_URL" -v ON_ERROR_STOP=1 <<SQL
SELECT pg_terminate_backend(pid) FROM pg_stat_activity
WHERE datname = '${DB_NAME}' AND pid <> pg_backend_pid();
DROP DATABASE IF EXISTS "${DB_NAME}";
CREATE DATABASE "${DB_NAME}";
SQL
pg_restore --no-owner --no-privileges --dbname "$TARGET_DB" "$LOCAL"
# Drill mode: compare row counts vs the live producer for parity.
if [[ "$DRILL" -eq 1 ]]; then
echo "[$(date -u +%FT%TZ)] Drill row-count diff (live vs restored):"
TABLES=$(psql -At "$TARGET_DB" -c \
"SELECT tablename FROM pg_tables WHERE schemaname='public' ORDER BY tablename;")
diff_count=0
while IFS= read -r tbl; do
[[ -z "$tbl" ]] && continue
live=$(psql -At "${LIVE_DATABASE_URL:-$DATABASE_URL}" -c "SELECT count(*) FROM \"$tbl\";")
restored=$(psql -At "$TARGET_DB" -c "SELECT count(*) FROM \"$tbl\";")
delta=$((live - restored))
if [[ "$delta" -ne 0 ]]; then
echo "$tbl: live=$live restored=$restored delta=$delta"
diff_count=$((diff_count + 1))
fi
done <<< "$TABLES"
if [[ "$diff_count" -eq 0 ]]; then
echo " ✓ row counts match across all tables"
fi
fi
echo "[$(date -u +%FT%TZ)] Restore complete."
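The drop-and-recreate step derives the database name and an admin URL from `$TARGET_DB` with `sed`. A quick standalone check of those exact expressions against a sample URL (credentials are made up):

```shell
TARGET_DB='postgres://crm:secret@db.internal:5432/crm_prod?sslmode=require'

# Same two expressions as the script.
DB_NAME=$(echo "$TARGET_DB" | sed -E 's|.*/([^?]+).*|\1|')
ADMIN_URL=$(echo "$TARGET_DB" | sed -E "s|/${DB_NAME}|/postgres|")

echo "$DB_NAME"    # crm_prod
echo "$ADMIN_URL"  # postgres://crm:secret@db.internal:5432/postgres?sslmode=require
```

Note the substitution rewrites the first `/<db-name>` occurrence; that is fine for typical URLs, but worth eyeballing if a database name ever collides with part of the user or host segment.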


@@ -0,0 +1,102 @@
/**
* Dev-only helper: create (or upsert) a CRM better-auth user and mark them
* super_admin. Idempotent — re-running with the same email will reset the
* password.
*
* Run: pnpm tsx scripts/dev-create-crm-user.ts <email> <password> [displayName]
*/
import 'dotenv/config';
import postgres from 'postgres';
import { auth } from '@/lib/auth';
import { db } from '@/lib/db';
import { userProfiles } from '@/lib/db/schema/users';
import { env } from '@/lib/env';
import { eq } from 'drizzle-orm';
async function main() {
const [email, password, displayNameArg] = process.argv.slice(2);
if (!email || !password) {
console.error(
'Usage: pnpm tsx scripts/dev-create-crm-user.ts <email> <password> [displayName]',
);
process.exit(1);
}
const displayName = displayNameArg ?? email.split('@')[0] ?? 'User';
const sql = postgres(env.DATABASE_URL);
try {
// 1. Check if better-auth user already exists.
const existing = await sql<{ id: string }[]>`
SELECT id FROM "user" WHERE email = ${email} LIMIT 1
`;
let userId: string;
if (existing.length > 0) {
const row = existing[0];
if (!row) throw new Error('unreachable');
userId = row.id;
console.log(`User ${email} exists (id=${userId}); resetting password.`);
// Use better-auth's internal context to hash and update the credential.
const ctx = await auth.$context;
const hash = await ctx.password.hash(password);
await sql`
UPDATE account
SET password = ${hash}, updated_at = NOW()
WHERE user_id = ${userId} AND provider_id = 'credential'
`;
} else {
console.log(`Creating better-auth user ${email}`);
const result = await auth.api.signUpEmail({
body: { email, password, name: displayName },
});
userId = result.user.id;
console.log(`Created user_id=${userId}`);
}
// 2. Upsert user_profiles entry as super admin.
const profile = await db
.select()
.from(userProfiles)
.where(eq(userProfiles.userId, userId))
.limit(1);
if (profile.length === 0) {
await db.insert(userProfiles).values({
id: crypto.randomUUID(),
userId,
displayName,
avatarUrl: null,
phone: null,
isSuperAdmin: true,
isActive: true,
lastLoginAt: null,
preferences: {},
});
console.log(`Created super_admin profile for ${userId}`);
} else {
await db
.update(userProfiles)
.set({ displayName, isSuperAdmin: true, isActive: true })
.where(eq(userProfiles.userId, userId));
console.log(`Updated profile for ${userId} (super_admin=true)`);
}
console.log('');
console.log(`✓ Done. Sign in at http://localhost:3000/login with`);
console.log(` email: ${email}`);
console.log(` password: ${password}`);
} finally {
await sql.end();
}
// Exit explicitly on success; an error thrown above propagates to the catch
// below instead of being masked by an exit(0) inside finally.
process.exit(0);
}
main().catch((e) => {
console.error(e);
process.exit(1);
});

scripts/dev-imap-probe.ts

@@ -0,0 +1,66 @@
/**
* Dev diagnostic: connect to IMAP and print the most recent ~10 messages,
* showing TO/FROM/subject/date so we can see what the dev mailbox is
* actually receiving.
*
* Run: pnpm tsx scripts/dev-imap-probe.ts
*/
import 'dotenv/config';
import { ImapFlow } from 'imapflow';
import { simpleParser } from 'mailparser';
async function main(): Promise<void> {
const host = process.env.IMAP_HOST!;
const port = Number(process.env.IMAP_PORT ?? 993);
const user = process.env.IMAP_USER!;
const pass = process.env.IMAP_PASS!;
if (!host || !user || !pass) {
throw new Error('IMAP_HOST / IMAP_USER / IMAP_PASS not set');
}
console.log(`Connecting to ${user}@${host}:${port}`);
const client = new ImapFlow({
host,
port,
secure: port === 993,
auth: { user, pass },
logger: false,
});
await client.connect();
console.log('Connected. Inbox status:');
const lock = await client.getMailboxLock('INBOX');
try {
const status = await client.status('INBOX', { messages: true, recent: true });
console.log(' total:', status.messages, '| recent:', status.recent);
// Search the last 30 minutes and keep the newest 10 matches.
const since = new Date(Date.now() - 30 * 60 * 1000);
const result = await client.search({ since });
const uids = Array.isArray(result) ? result.slice(-10).reverse() : [];
console.log(`Found ${uids.length} messages in last 30min:`);
for (const uid of uids) {
const msg = await client.fetchOne(String(uid), { source: true, envelope: true });
if (!msg || !msg.source) continue;
const parsed = await simpleParser(msg.source);
const tos = (Array.isArray(parsed.to) ? parsed.to : parsed.to ? [parsed.to] : [])
.flatMap((a) => a.value.map((v) => v.address ?? ''))
.join(', ');
console.log(
` uid=${uid} date=${parsed.date?.toISOString()} from=${parsed.from?.text} to=${tos} subject=${parsed.subject}`,
);
}
} finally {
lock.release();
}
await client.logout();
console.log('Done.');
process.exit(0);
}
main().catch((err) => {
console.error('Probe failed:', err);
process.exit(1);
});

scripts/dev-list-users.ts

@@ -0,0 +1,25 @@
import 'dotenv/config';
import postgres from 'postgres';
import { env } from '@/lib/env';
async function main() {
const sql = postgres(env.DATABASE_URL);
const users =
await sql`SELECT id, email, name, email_verified, created_at FROM "user" ORDER BY created_at DESC LIMIT 20`;
console.log('--- user ---');
console.log(JSON.stringify(users, null, 2));
const profiles =
await sql`SELECT user_id, display_name, is_super_admin, is_active FROM user_profiles ORDER BY created_at DESC LIMIT 20`;
console.log('--- user_profiles ---');
console.log(JSON.stringify(profiles, null, 2));
const accounts =
await sql`SELECT user_id, provider_id, account_id FROM account ORDER BY created_at DESC LIMIT 20`;
console.log('--- account ---');
console.log(JSON.stringify(accounts, null, 2));
await sql.end();
}
main().catch((e) => {
console.error(e);
process.exit(1);
});


@@ -0,0 +1,44 @@
/**
* Dev-only helper: issue a CRM admin invite and send the activation email.
* The email gets routed via EMAIL_REDIRECT_TO if that's set, so it always
* lands in the dev inbox.
*
* Run: pnpm tsx scripts/dev-trigger-crm-invite.ts <email> [name] [--super]
*/
import 'dotenv/config';
import { createCrmInvite } from '@/lib/services/crm-invite.service';
async function main() {
const args = process.argv.slice(2);
const email = args[0];
if (!email) {
console.error('Usage: pnpm tsx scripts/dev-trigger-crm-invite.ts <email> [name] [--super]');
process.exit(1);
}
const isSuperAdmin = args.includes('--super');
const name = args.find((a, i) => i > 0 && !a.startsWith('--'));
// Dev script runs out-of-band (no HTTP request, no session). The service's
// super-admin gate requires `invitedBy.isSuperAdmin === true` for super
// invites; the script bypasses that with a synthetic caller identity.
const { inviteId, link } = await createCrmInvite({
email,
name,
isSuperAdmin,
invitedBy: { userId: 'cli-script', isSuperAdmin: true },
});
console.log(`✓ Invite created (id=${inviteId})`);
console.log(` email: ${email}`);
console.log(` super_admin: ${isSuperAdmin}`);
console.log(` activation link: ${link}`);
console.log('');
console.log('Email sent (redirected via EMAIL_REDIRECT_TO if set).');
process.exit(0);
}
main().catch((e) => {
console.error(e);
process.exit(1);
});


@@ -0,0 +1,59 @@
/**
* Dev-only helper: pick an existing client and trigger a portal-invite email.
* The activation email gets routed to EMAIL_REDIRECT_TO (set in .env) regardless
* of the per-portal-user `email` field — so we can use any throwaway address
* here without conflicting with seed data.
*
* Run: pnpm tsx scripts/dev-trigger-portal-invite.ts
*/
import 'dotenv/config';
import { db } from '@/lib/db';
import { clients } from '@/lib/db/schema/clients';
import { portalUsers } from '@/lib/db/schema/portal';
import { createPortalUser } from '@/lib/services/portal-auth.service';
import { env } from '@/lib/env';
import { eq } from 'drizzle-orm';
async function main(): Promise<void> {
  if (!env.EMAIL_REDIRECT_TO) {
    throw new Error(
      'EMAIL_REDIRECT_TO is not set — refusing to send a real activation email to a real client.',
    );
  }
  console.log(`EMAIL_REDIRECT_TO is set: ${env.EMAIL_REDIRECT_TO}`);
  const client = await db.query.clients.findFirst({
    where: eq(clients.portId, '294c8240-49a7-403e-92e8-fc3a524c00b4'),
  });
  if (!client) throw new Error('No client found in port-nimara');
  // Use the redirect target as the portal user's actual email, so the
  // tester can sign in with the same address that received the activation mail.
  const portalEmail = env.EMAIL_REDIRECT_TO;
  console.log(
    `Creating portal user for client ${client.fullName} (${client.id}) with email ${portalEmail}`,
  );
  // Clear any prior dev-script seed so uniqueness checks don't trip.
  await db.delete(portalUsers).where(eq(portalUsers.clientId, client.id));
  await db.delete(portalUsers).where(eq(portalUsers.email, portalEmail));
  const result = await createPortalUser({
    clientId: client.id,
    portId: client.portId,
    email: portalEmail,
    name: client.fullName,
    createdBy: 'dev-script',
  });
  console.log('Portal user created:', result);
  console.log(`Activation email enqueued — should arrive at ${portalEmail}.`);
  process.exit(0);
}
main().catch((err) => {
  console.error('Script failed:', err);
  process.exit(1);
});

View File

@@ -8,14 +8,5 @@ export const metadata: Metadata = {
 };
 export default function AuthLayout({ children }: { children: React.ReactNode }) {
-  return (
-    <div
-      className="min-h-screen flex items-center justify-center wave-watermark"
-      style={{ backgroundColor: '#1e2844' }}
-    >
-      <div className="w-full max-w-md px-4">
-        {children}
-      </div>
-    </div>
-  );
+  return <>{children}</>;
 }

View File

@@ -10,9 +10,9 @@ import { toast } from 'sonner';
 import { authClient } from '@/lib/auth/client';
 import { cn } from '@/lib/utils';
 import { Button } from '@/components/ui/button';
-import { Card, CardContent, CardHeader } from '@/components/ui/card';
 import { Input } from '@/components/ui/input';
 import { Label } from '@/components/ui/label';
+import { BrandedAuthShell } from '@/components/shared/branded-auth-shell';
 const loginSchema = z.object({
   email: z.string().email('Please enter a valid email address'),
@@ -55,64 +55,53 @@
   }
   return (
-    <div
-      className="min-h-screen flex items-center justify-center px-4"
-      style={{ backgroundColor: '#1e2844' }}
-    >
-      <Card className="w-full max-w-md">
-        <CardHeader className="space-y-1 text-center pb-6">
-          <h1 className="text-2xl font-bold tracking-tight text-foreground">Port Nimara</h1>
-          <p className="text-sm text-muted-foreground">Marina CRM</p>
-        </CardHeader>
-        <CardContent>
-          <form onSubmit={handleSubmit(onSubmit)} className="space-y-4" noValidate>
-            <div className="space-y-2">
-              <Label htmlFor="email">Email</Label>
-              <Input
-                id="email"
-                type="email"
-                autoComplete="email"
-                placeholder="you@example.com"
-                disabled={isLoading}
-                className={cn(errors.email && 'border-destructive focus-visible:ring-destructive')}
-                {...register('email')}
-              />
-              {errors.email && (
-                <p className="text-sm text-destructive">{errors.email.message}</p>
-              )}
-            </div>
-            <div className="space-y-2">
-              <div className="flex items-center justify-between">
-                <Label htmlFor="password">Password</Label>
-                <Link
-                  href="/reset-password"
-                  className="text-sm text-muted-foreground hover:text-foreground transition-colors"
-                >
-                  Forgot password?
-                </Link>
-              </div>
-              <Input
-                id="password"
-                type="password"
-                autoComplete="current-password"
-                disabled={isLoading}
-                className={cn(
-                  errors.password && 'border-destructive focus-visible:ring-destructive',
-                )}
-                {...register('password')}
-              />
-              {errors.password && (
-                <p className="text-sm text-destructive">{errors.password.message}</p>
-              )}
-            </div>
-            <Button type="submit" className="w-full" disabled={isLoading}>
-              {isLoading ? 'Signing in…' : 'Sign in'}
-            </Button>
-          </form>
-        </CardContent>
-      </Card>
-    </div>
+    <BrandedAuthShell>
+      <div className="text-center mb-6">
+        <h1 className="text-xl font-semibold text-gray-900">Port Nimara CRM</h1>
+        <p className="text-sm text-gray-500 mt-1">Sign in to continue</p>
+      </div>
+      <form onSubmit={handleSubmit(onSubmit)} className="space-y-4" noValidate>
+        <div className="space-y-1.5">
+          <Label htmlFor="email">Email</Label>
+          <Input
+            id="email"
+            type="email"
+            autoComplete="email"
+            placeholder="you@example.com"
+            disabled={isLoading}
+            className={cn(errors.email && 'border-destructive focus-visible:ring-destructive')}
+            {...register('email')}
+          />
+          {errors.email && <p className="text-sm text-destructive">{errors.email.message}</p>}
+        </div>
+        <div className="space-y-1.5">
+          <div className="flex items-center justify-between">
+            <Label htmlFor="password">Password</Label>
+            <Link href="/reset-password" className="text-xs text-[#007bff] hover:underline">
+              Forgot password?
+            </Link>
+          </div>
+          <Input
+            id="password"
+            type="password"
+            autoComplete="current-password"
+            disabled={isLoading}
+            className={cn(errors.password && 'border-destructive focus-visible:ring-destructive')}
+            {...register('password')}
+          />
+          {errors.password && <p className="text-sm text-destructive">{errors.password.message}</p>}
+        </div>
+        <Button
+          type="submit"
+          className="w-full bg-[#007bff] hover:bg-[#0069d9] text-white"
+          disabled={isLoading}
+        >
+          {isLoading ? 'Signing in…' : 'Sign in'}
+        </Button>
+      </form>
+    </BrandedAuthShell>
   );
 }

View File

@@ -7,9 +7,9 @@ import { zodResolver } from '@hookform/resolvers/zod';
 import { z } from 'zod';
 import { toast } from 'sonner';
 import { Button } from '@/components/ui/button';
-import { Card, CardContent, CardHeader } from '@/components/ui/card';
 import { Input } from '@/components/ui/input';
 import { Label } from '@/components/ui/label';
+import { BrandedAuthShell } from '@/components/shared/branded-auth-shell';
 import { cn } from '@/lib/utils';
 const resetSchema = z.object({
@@ -49,69 +49,55 @@
   }
   return (
-    <div
-      className="min-h-screen flex items-center justify-center px-4"
-      style={{ backgroundColor: '#1e2844' }}
-    >
-      <Card className="w-full max-w-md">
-        <CardHeader className="space-y-1 text-center pb-6">
-          <h1 className="text-2xl font-bold tracking-tight text-foreground">Port Nimara</h1>
-          <p className="text-sm text-muted-foreground">Reset your password</p>
-        </CardHeader>
-        <CardContent>
-          {submitted ? (
-            <div className="space-y-4 text-center">
-              <div className="space-y-2">
-                <p className="font-medium text-foreground">Check your email</p>
-                <p className="text-sm text-muted-foreground">
-                  If an account exists for that email address, we have sent a password reset link.
-                  Please check your inbox and spam folder.
-                </p>
-              </div>
-              <Link
-                href="/login"
-                className="inline-block text-sm text-muted-foreground hover:text-foreground transition-colors"
-              >
-                Back to sign in
-              </Link>
-            </div>
-          ) : (
-            <form onSubmit={handleSubmit(onSubmit)} className="space-y-4" noValidate>
-              <div className="space-y-2">
-                <Label htmlFor="email">Email</Label>
-                <Input
-                  id="email"
-                  type="email"
-                  autoComplete="email"
-                  placeholder="you@example.com"
-                  disabled={isLoading}
-                  className={cn(
-                    errors.email && 'border-destructive focus-visible:ring-destructive',
-                  )}
-                  {...register('email')}
-                />
-                {errors.email && (
-                  <p className="text-sm text-destructive">{errors.email.message}</p>
-                )}
-              </div>
-              <Button type="submit" className="w-full" disabled={isLoading}>
-                {isLoading ? 'Sending…' : 'Send reset link'}
-              </Button>
-              <p className="text-center text-sm text-muted-foreground">
-                Remember your password?{' '}
-                <Link
-                  href="/login"
-                  className="text-foreground underline-offset-4 hover:underline"
-                >
-                  Sign in
-                </Link>
-              </p>
-            </form>
-          )}
-        </CardContent>
-      </Card>
-    </div>
+    <BrandedAuthShell>
+      <div className="text-center mb-6">
+        <h1 className="text-xl font-semibold text-gray-900">Reset your password</h1>
+        <p className="text-sm text-gray-500 mt-1">We&apos;ll email you a link</p>
+      </div>
+      {submitted ? (
+        <div className="space-y-4 text-center">
+          <p className="font-medium text-gray-900">Check your email</p>
+          <p className="text-sm text-gray-500">
+            If an account exists for that email address, we have sent a password reset link. Please
+            check your inbox and spam folder.
+          </p>
+          <Link href="/login" className="inline-block text-sm text-[#007bff] hover:underline">
+            Back to sign in
+          </Link>
+        </div>
+      ) : (
+        <form onSubmit={handleSubmit(onSubmit)} className="space-y-4" noValidate>
+          <div className="space-y-1.5">
+            <Label htmlFor="email">Email</Label>
+            <Input
+              id="email"
+              type="email"
+              autoComplete="email"
+              placeholder="you@example.com"
+              disabled={isLoading}
+              className={cn(errors.email && 'border-destructive focus-visible:ring-destructive')}
+              {...register('email')}
+            />
+            {errors.email && <p className="text-sm text-destructive">{errors.email.message}</p>}
+          </div>
+          <Button
+            type="submit"
+            className="w-full bg-[#007bff] hover:bg-[#0069d9] text-white"
+            disabled={isLoading}
+          >
+            {isLoading ? 'Sending…' : 'Send reset link'}
+          </Button>
+          <p className="text-center text-sm text-gray-500">
+            Remember your password?{' '}
+            <Link href="/login" className="text-[#007bff] hover:underline">
+              Sign in
+            </Link>
+          </p>
+        </form>
+      )}
+    </BrandedAuthShell>
   );
 }

View File

@@ -1,27 +1,23 @@
 'use client';
 import { Suspense, useState } from 'react';
+import Link from 'next/link';
 import { useRouter, useSearchParams } from 'next/navigation';
 import { useForm } from 'react-hook-form';
 import { zodResolver } from '@hookform/resolvers/zod';
 import { z } from 'zod';
 import { toast } from 'sonner';
-import { CheckCircle2, Circle } from 'lucide-react';
 import { cn } from '@/lib/utils';
 import { Button } from '@/components/ui/button';
-import { Card, CardContent, CardHeader } from '@/components/ui/card';
 import { Input } from '@/components/ui/input';
 import { Label } from '@/components/ui/label';
+import { BrandedAuthShell } from '@/components/shared/branded-auth-shell';
+const MIN_LENGTH = 9;
 const passwordSchema = z
   .object({
-    password: z
-      .string()
-      .min(12, 'Must be at least 12 characters')
-      .regex(/[A-Z]/, 'Must contain an uppercase letter')
-      .regex(/[a-z]/, 'Must contain a lowercase letter')
-      .regex(/[0-9]/, 'Must contain a number')
-      .regex(/[^A-Za-z0-9]/, 'Must contain a special character'),
+    password: z.string().min(MIN_LENGTH, `Must be at least ${MIN_LENGTH} characters`),
     confirmPassword: z.string().min(1, 'Please confirm your password'),
   })
   .refine((data) => data.password === data.confirmPassword, {
@@ -31,25 +27,11 @@
 type SetPasswordFormData = z.infer<typeof passwordSchema>;
-type Requirement = {
-  label: string;
-  test: (value: string) => boolean;
-};
-const requirements: Requirement[] = [
-  { label: 'At least 12 characters', test: (v) => v.length >= 12 },
-  { label: 'Uppercase letter', test: (v) => /[A-Z]/.test(v) },
-  { label: 'Lowercase letter', test: (v) => /[a-z]/.test(v) },
-  { label: 'Number', test: (v) => /[0-9]/.test(v) },
-  { label: 'Special character', test: (v) => /[^A-Za-z0-9]/.test(v) },
-];
 function SetPasswordInner() {
   const router = useRouter();
   const searchParams = useSearchParams();
   const token = searchParams.get('token');
   const [isLoading, setIsLoading] = useState(false);
-  const [passwordValue, setPasswordValue] = useState('');
   const {
     register,
@@ -61,7 +43,7 @@
   async function onSubmit(data: SetPasswordFormData) {
     if (!token) {
-      toast.error('Invalid or missing reset token. Please request a new password reset link.');
+      toast.error('Invalid or missing reset token. Please request a new link.');
       return;
     }
@@ -75,7 +57,7 @@
     if (!response.ok) {
       const body = await response.json().catch(() => ({}));
-      toast.error(body.message ?? 'Failed to set password. Please try again.');
+      toast.error(body.message ?? body.error ?? 'Failed to set password. Please try again.');
       return;
     }
@@ -88,102 +70,77 @@
     }
   }
+  if (!token) {
+    return (
+      <BrandedAuthShell>
+        <div className="text-center space-y-3">
+          <h1 className="text-xl font-semibold text-gray-900">Link is missing or invalid</h1>
+          <p className="text-sm text-gray-500">
+            Please use the link from the email we sent you. If the link is broken, ask your
+            administrator for a new one.
+          </p>
+          <Link href="/login" className="inline-block text-sm text-[#007bff] hover:underline">
+            Back to sign in
+          </Link>
+        </div>
+      </BrandedAuthShell>
+    );
+  }
   return (
-    <div
-      className="min-h-screen flex items-center justify-center px-4"
-      style={{ backgroundColor: '#1e2844' }}
-    >
-      <Card className="w-full max-w-md">
-        <CardHeader className="space-y-1 text-center pb-6">
-          <h1 className="text-2xl font-bold tracking-tight text-foreground">Port Nimara</h1>
-          <p className="text-sm text-muted-foreground">Set your password</p>
-        </CardHeader>
-        <CardContent>
-          {!token ? (
-            <p className="text-center text-sm text-destructive">
-              Invalid or missing token. Please request a new password reset link.
-            </p>
-          ) : (
-            <form onSubmit={handleSubmit(onSubmit)} className="space-y-4" noValidate>
-              <div className="space-y-2">
-                <Label htmlFor="password">New Password</Label>
-                <Input
-                  id="password"
-                  type="password"
-                  autoComplete="new-password"
-                  disabled={isLoading}
-                  className={cn(
-                    errors.password && 'border-destructive focus-visible:ring-destructive',
-                  )}
-                  {...register('password', {
-                    onChange: (e) => setPasswordValue(e.target.value),
-                  })}
-                />
-                {errors.password && (
-                  <p className="text-sm text-destructive">{errors.password.message}</p>
-                )}
-                <ul className="space-y-1 pt-1">
-                  {requirements.map((req) => {
-                    const met = req.test(passwordValue);
-                    return (
-                      <li
-                        key={req.label}
-                        className={cn(
-                          'flex items-center gap-2 text-xs',
-                          met ? 'text-green-600 dark:text-green-400' : 'text-muted-foreground',
-                        )}
-                      >
-                        {met ? (
-                          <CheckCircle2 className="h-3.5 w-3.5 shrink-0" />
-                        ) : (
-                          <Circle className="h-3.5 w-3.5 shrink-0" />
-                        )}
-                        {req.label}
-                      </li>
-                    );
-                  })}
-                </ul>
-              </div>
-              <div className="space-y-2">
-                <Label htmlFor="confirmPassword">Confirm Password</Label>
-                <Input
-                  id="confirmPassword"
-                  type="password"
-                  autoComplete="new-password"
-                  disabled={isLoading}
-                  className={cn(
-                    errors.confirmPassword && 'border-destructive focus-visible:ring-destructive',
-                  )}
-                  {...register('confirmPassword')}
-                />
-                {errors.confirmPassword && (
-                  <p className="text-sm text-destructive">{errors.confirmPassword.message}</p>
-                )}
-              </div>
-              <Button type="submit" className="w-full" disabled={isLoading}>
-                {isLoading ? 'Setting password…' : 'Set password'}
-              </Button>
-            </form>
-          )}
-        </CardContent>
-      </Card>
-    </div>
+    <BrandedAuthShell>
+      <div className="text-center mb-6">
+        <h1 className="text-xl font-semibold text-gray-900">Set your password</h1>
+        <p className="text-sm text-gray-500 mt-1">Choose a password for your CRM account</p>
+      </div>
+      <form onSubmit={handleSubmit(onSubmit)} className="space-y-4" noValidate>
+        <div className="space-y-1.5">
+          <Label htmlFor="password">New password</Label>
+          <Input
+            id="password"
+            type="password"
+            autoComplete="new-password"
+            disabled={isLoading}
+            className={cn(errors.password && 'border-destructive focus-visible:ring-destructive')}
+            {...register('password')}
+          />
+          <p className="text-xs text-gray-500">At least {MIN_LENGTH} characters.</p>
+          {errors.password && <p className="text-sm text-destructive">{errors.password.message}</p>}
+        </div>
+        <div className="space-y-1.5">
+          <Label htmlFor="confirmPassword">Confirm password</Label>
+          <Input
+            id="confirmPassword"
+            type="password"
+            autoComplete="new-password"
+            disabled={isLoading}
+            className={cn(
+              errors.confirmPassword && 'border-destructive focus-visible:ring-destructive',
+            )}
+            {...register('confirmPassword')}
+          />
+          {errors.confirmPassword && (
+            <p className="text-sm text-destructive">{errors.confirmPassword.message}</p>
+          )}
+        </div>
+        <Button
+          type="submit"
+          className="w-full bg-[#007bff] hover:bg-[#0069d9] text-white"
+          disabled={isLoading}
+        >
+          {isLoading ? 'Setting password…' : 'Set password'}
+        </Button>
+      </form>
+    </BrandedAuthShell>
   );
 }
 export default function SetPasswordPage() {
   return (
-    <Suspense
-      fallback={
-        <div
-          className="min-h-screen flex items-center justify-center px-4"
-          style={{ backgroundColor: '#1e2844' }}
-        />
-      }
-    >
+    <Suspense fallback={<BrandedAuthShell>{null}</BrandedAuthShell>}>
      <SetPasswordInner />
    </Suspense>
  );

View File

@@ -0,0 +1,69 @@
import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
const FIELDS: SettingFieldDef[] = [
{
key: 'branding_app_name',
label: 'App name',
description: 'Shown in the email subject prefix and the in-app header.',
type: 'string',
placeholder: 'Port Nimara CRM',
defaultValue: '',
},
{
key: 'branding_logo_url',
label: 'Logo URL',
description:
'Public HTTPS URL of the logo used in email headers and the branded auth shell. Recommended size: 240×80 PNG with transparent background.',
type: 'string',
placeholder: 'https://example.com/logo.png',
defaultValue: '',
},
{
key: 'branding_primary_color',
label: 'Primary color',
description: 'Used for buttons and links in transactional email templates.',
type: 'color',
defaultValue: '#1e293b',
},
{
key: 'branding_email_header_html',
label: 'Email header HTML',
description: 'Optional HTML rendered above each email body. Leave blank to use the default.',
type: 'html',
defaultValue: '',
},
{
key: 'branding_email_footer_html',
label: 'Email footer HTML',
description: 'Optional HTML rendered at the very bottom of each email (above the signature).',
type: 'html',
defaultValue: '',
},
];
export default function BrandingSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Branding</h1>
<p className="text-sm text-muted-foreground">
Logo, primary color, app name, and email header/footer HTML used by the branded auth shell
and outgoing email templates.
</p>
</div>
<SettingsFormCard
title="Identity"
description="App name, logo, and primary color."
fields={FIELDS.slice(0, 3)}
/>
<SettingsFormCard
title="Email branding"
description="HTML fragments rendered around every transactional email."
fields={FIELDS.slice(3)}
/>
</div>
);
}

View File

@@ -0,0 +1,73 @@
import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
import { DocumensoTestButton } from '@/components/admin/documenso/documenso-test-button';
const API_FIELDS: SettingFieldDef[] = [
{
key: 'documenso_api_url_override',
label: 'API URL override',
description: 'Optional. Falls back to DOCUMENSO_API_URL env when blank.',
type: 'string',
placeholder: 'https://documenso.example.com',
defaultValue: '',
},
{
key: 'documenso_api_key_override',
label: 'API key override',
description: 'Optional. Falls back to DOCUMENSO_API_KEY env when blank. Stored in plain text.',
type: 'password',
defaultValue: '',
},
];
const EOI_FIELDS: SettingFieldDef[] = [
{
key: 'documenso_eoi_template_id',
label: 'EOI Documenso template ID',
description: 'Numeric template ID used by the Documenso EOI pathway.',
type: 'string',
placeholder: '12345',
defaultValue: '',
},
{
key: 'eoi_default_pathway',
label: 'Default EOI pathway',
description:
'Which pathway is used when an EOI is generated without an explicit choice. Documenso = signed via Documenso, In-app = filled locally with pdf-lib.',
type: 'select',
options: [
{ value: 'documenso-template', label: 'Documenso template' },
{ value: 'inapp', label: 'In-app (pdf-lib)' },
],
defaultValue: 'documenso-template',
},
];
export default function DocumensoSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Documenso & EOI</h1>
<p className="text-sm text-muted-foreground">
API credentials and default EOI generation pathway. Use the test-connection button to
verify a saved configuration before relying on it.
</p>
</div>
<SettingsFormCard
title="Documenso API"
description="Per-port API credentials. Leave blank to use the global env defaults."
fields={API_FIELDS}
extra={<DocumensoTestButton />}
/>
<SettingsFormCard
title="EOI generation"
description="Default pathway and template used when an interest's EOI is generated."
fields={EOI_FIELDS}
/>
</div>
);
}

View File

@@ -0,0 +1,101 @@
import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
const FIELDS: SettingFieldDef[] = [
{
key: 'email_from_name',
label: 'From name',
description: 'Display name shown in the From: header on outgoing email.',
type: 'string',
placeholder: 'Port Nimara',
defaultValue: '',
},
{
key: 'email_from_address',
label: 'From address',
description: 'Sender email address. Falls back to SMTP_FROM env when blank.',
type: 'string',
placeholder: 'noreply@example.com',
defaultValue: '',
},
{
key: 'email_reply_to',
label: 'Reply-to address',
description: 'Optional Reply-To: header for replies (e.g. sales@example.com).',
type: 'string',
placeholder: 'sales@example.com',
defaultValue: '',
},
{
key: 'email_signature_html',
label: 'Default signature (HTML)',
description: 'Appended to the bottom of system-generated emails.',
type: 'html',
placeholder: '<p>—<br>The Port Nimara team</p>',
defaultValue: '',
},
{
key: 'email_footer_html',
label: 'Email footer (HTML)',
description: 'Legal/contact footer rendered at the very bottom of all emails.',
type: 'html',
placeholder: '<p style="font-size:11px;color:#888;">© Port Nimara · ul. ...</p>',
defaultValue: '',
},
{
key: 'smtp_host_override',
label: 'SMTP host override',
description: 'Optional. Falls back to SMTP_HOST env when blank.',
type: 'string',
placeholder: 'mail.example.com',
defaultValue: '',
},
{
key: 'smtp_port_override',
label: 'SMTP port override',
description: 'Optional. Falls back to SMTP_PORT env when blank.',
type: 'number',
placeholder: '587',
defaultValue: null,
},
{
key: 'smtp_user_override',
label: 'SMTP username override',
description: 'Optional. Falls back to SMTP_USER env when blank.',
type: 'string',
defaultValue: '',
},
{
key: 'smtp_pass_override',
label: 'SMTP password override',
description: 'Optional. Stored in plain text — only set when overriding env credentials.',
type: 'password',
defaultValue: '',
},
];
export default function EmailSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Email Settings</h1>
<p className="text-sm text-muted-foreground">
Per-port outgoing email configuration. SMTP credentials and the From address default to
environment variables when these fields are blank.
</p>
</div>
<SettingsFormCard
title="From address & signature"
description="Identity headers and shared HTML used by system-generated emails."
fields={FIELDS.slice(0, 5)}
/>
<SettingsFormCard
title="SMTP transport overrides"
description="Optional per-port SMTP credentials. Leave blank to use the global env defaults."
fields={FIELDS.slice(5)}
/>
</div>
);
}
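Several field descriptions above repeat the same rule: a blank per-port override falls back to the corresponding env value. A minimal sketch of that resolution logic, under the assumption that a saved empty string (or null, for numbers) means "not set" (the names here are illustrative, not the project's actual settings API):

```typescript
// Hypothetical sketch of the "blank override falls back to env" rule used by
// the SMTP fields above. Empty string / null mean "no override configured".
interface SmtpConfig {
  host: string;
  port: number;
  user: string;
}

interface SmtpOverrides {
  host?: string;
  port?: number | null;
  user?: string;
}

function resolveSmtp(envDefaults: SmtpConfig, overrides: SmtpOverrides): SmtpConfig {
  return {
    // A whitespace-only string counts as blank and falls through to env.
    host: overrides.host?.trim() ? overrides.host : envDefaults.host,
    port: overrides.port ?? envDefaults.port,
    user: overrides.user?.trim() ? overrides.user : envDefaults.user,
  };
}
```

Resolving per send (rather than at save time) keeps the env defaults authoritative: clearing an override immediately restores the global configuration.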

View File

@@ -1,16 +1,5 @@
+import { FormTemplateList } from '@/components/admin/forms/form-template-list';
 export default function FormTemplatesPage() {
-  return (
-    <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-bold text-foreground">Form Templates</h1>
-        <p className="text-muted-foreground">Create and manage intake form templates</p>
-      </div>
-      <div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
-        <p className="text-lg font-medium text-muted-foreground">Coming in Layer 3</p>
-        <p className="text-sm text-muted-foreground">
-          This feature will be implemented in the next phase.
-        </p>
-      </div>
-    </div>
-  );
+  return <FormTemplateList />;
 }

View File

@@ -0,0 +1,16 @@
import { InvitationsManager } from '@/components/admin/invitations/invitations-manager';
export default function InvitationsPage() {
  return (
    <div className="space-y-6">
      <div>
        <h1 className="text-2xl font-semibold">Invitations</h1>
        <p className="text-sm text-muted-foreground">
          Send a single-use invitation to a new CRM user. The recipient sets their own password via
          the link in the email.
        </p>
      </div>
      <InvitationsManager />
    </div>
  );
}

View File

@@ -0,0 +1,5 @@
import { OcrSettingsForm } from '@/components/admin/ocr-settings-form';
export default function OcrSettingsPage() {
  return <OcrSettingsForm />;
}

View File

@@ -0,0 +1,202 @@
import Link from 'next/link';
import {
Bell,
Briefcase,
Database,
FileText,
HardDrive,
Key,
LayoutDashboard,
Mail,
Palette,
ScrollText,
Settings,
Shield,
Sliders,
Tag,
Upload,
Users,
Webhook,
} from 'lucide-react';
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
interface AdminSection {
href: string;
label: string;
description: string;
icon: typeof Settings;
}
const SECTIONS: AdminSection[] = [
{
href: 'users',
label: 'Users',
description: 'CRM accounts, role assignments, and per-user residential access toggles.',
icon: Users,
},
{
href: 'invitations',
label: 'Invitations',
description: 'Send invitations, track pending invites, and resend or revoke them.',
icon: Mail,
},
{
href: 'roles',
label: 'Roles & Permissions',
description: 'Default permission sets and per-port role overrides.',
icon: Shield,
},
{
href: 'audit',
label: 'Audit Log',
description: 'Searchable log of every authenticated mutation in the system.',
icon: ScrollText,
},
{
href: 'email',
label: 'Email Settings',
description: 'From address, signatures, and per-port SMTP overrides.',
icon: Mail,
},
{
href: 'documenso',
label: 'Documenso & EOI',
description: 'API credentials, EOI template, and default in-app vs Documenso pathway.',
icon: FileText,
},
{
href: 'reminders',
label: 'Reminders',
description: 'Default reminder behaviour and the daily-digest delivery window.',
icon: Bell,
},
{
href: 'branding',
label: 'Branding',
description: 'App name, logo, primary color, and email header/footer HTML.',
icon: Palette,
},
{
href: 'settings',
label: 'System Settings',
description: 'Generic key/value configuration store for advanced flags.',
icon: Settings,
},
{
href: 'webhooks',
label: 'Webhooks',
description: 'Outgoing webhook subscriptions, secrets, and delivery log.',
icon: Webhook,
},
{
href: 'forms',
label: 'Forms',
description: 'Form templates used by client-facing inquiry and intake flows.',
icon: Sliders,
},
{
href: 'templates',
label: 'Document Templates',
description: 'PDF + email templates with merge-field placeholders.',
icon: FileText,
},
{
href: 'tags',
label: 'Tags',
description: 'Color-coded tags applied to clients, yachts, companies, and interests.',
icon: Tag,
},
{
href: 'custom-fields',
label: 'Custom Fields',
description: 'Tenant-defined fields for clients, yachts, and reservations.',
icon: Key,
},
{
href: 'reports',
label: 'Reports',
description: 'Saved analytics views and ad-hoc query results.',
icon: LayoutDashboard,
},
{
href: 'monitoring',
label: 'Queue Monitoring',
description: 'BullMQ queue health, throughput, and retry diagnostics.',
icon: Database,
},
{
href: 'import',
label: 'Bulk Import',
description: 'CSV-driven imports for clients, yachts, and reservations.',
icon: Upload,
},
{
href: 'backup',
label: 'Backup & Restore',
description: 'Database snapshots and on-demand exports.',
icon: HardDrive,
},
{
href: 'ports',
label: 'Ports',
description: 'Manage the marinas/ports this installation serves.',
icon: Briefcase,
},
{
href: 'onboarding',
label: 'Onboarding',
description: 'Initial-setup wizard for fresh ports.',
icon: LayoutDashboard,
},
{
href: 'ocr',
label: 'Receipt OCR',
description: 'Configure the AI provider used by the mobile receipt scanner.',
icon: ScrollText,
},
];
export default async function AdminLandingPage({
params,
}: {
params: Promise<{ portSlug: string }>;
}) {
const { portSlug } = await params;
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Administration</h1>
<p className="text-sm text-muted-foreground">
Per-port configuration and system administration. Each card below opens a dedicated
settings page.
</p>
</div>
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{SECTIONS.map((s) => {
const Icon = s.icon;
return (
<Link
key={s.href}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
href={`/${portSlug}/admin/${s.href}` as any}
className="block group"
>
<Card className="h-full transition-colors group-hover:border-primary/50 group-hover:bg-muted/30">
<CardHeader className="flex flex-row items-start gap-3 space-y-0 pb-2">
<Icon className="h-5 w-5 mt-0.5 text-muted-foreground group-hover:text-primary" />
<div className="flex-1">
<CardTitle className="text-base">{s.label}</CardTitle>
</div>
</CardHeader>
<CardContent>
<CardDescription>{s.description}</CardDescription>
</CardContent>
</Card>
</Link>
);
})}
</div>
</div>
);
}

View File

@@ -0,0 +1,78 @@
import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
const DEFAULT_FIELDS: SettingFieldDef[] = [
{
key: 'reminder_default_enabled',
label: 'Enable reminders by default on new interests',
description:
'When on, newly-created interests inherit reminderEnabled=true. Users can still toggle it on a per-interest basis.',
type: 'boolean',
defaultValue: false,
},
{
key: 'reminder_default_days',
label: 'Default inactivity days',
description:
"Default value for an interest's reminderDays field. Reminders fire after this many days of no contact.",
type: 'number',
placeholder: '7',
defaultValue: 7,
},
];
const DIGEST_FIELDS: SettingFieldDef[] = [
{
key: 'reminder_digest_enabled',
label: 'Batch reminders into a daily digest',
description:
'Off (default): reminders fire as soon as the threshold is hit. On: pending reminders are accumulated and delivered once per day at the digest time.',
type: 'boolean',
defaultValue: false,
},
{
key: 'reminder_digest_time',
label: 'Digest delivery time',
description: '24-hour HH:MM in the digest timezone.',
type: 'string',
placeholder: '09:00',
defaultValue: '09:00',
},
{
key: 'reminder_digest_timezone',
label: 'Digest timezone',
description: 'IANA timezone name used to interpret the delivery time (e.g. Europe/Warsaw).',
type: 'string',
placeholder: 'Europe/Warsaw',
defaultValue: 'Europe/Warsaw',
},
];
export default function ReminderSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Reminders</h1>
<p className="text-sm text-muted-foreground">
Default reminder behaviour for new interests and the optional daily-digest delivery
window. Individual users can still configure their own digest preferences in Notifications
Preferences.
</p>
</div>
<SettingsFormCard
title="Defaults for new interests"
description="Applied when an interest is created without an explicit reminder configuration."
fields={DEFAULT_FIELDS}
/>
<SettingsFormCard
title="Daily digest"
description="Optional batching window so reminder notifications go out once per day instead of as they fire."
fields={DIGEST_FIELDS}
/>
</div>
);
}
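The two digest fields above pair a 24-hour `HH:MM` string with an IANA timezone name. One minimal way a scheduler could interpret that pair, sketched here with a hypothetical helper that is not part of this changeset:

```typescript
// Hypothetical helper (illustration only): decide whether the daily digest
// should fire now, by rendering the current instant as wall-clock HH:MM in
// the configured IANA timezone and comparing it to the stored setting.
function isDigestDue(now: Date, digestTime: string, timeZone: string): boolean {
  const wallClock = new Intl.DateTimeFormat('en-GB', {
    timeZone,
    hour: '2-digit',
    minute: '2-digit',
    hour12: false,
  }).format(now); // e.g. "09:00"
  return wallClock === digestTime;
}
```

Comparing formatted wall-clock strings keeps the check correct across DST transitions without any manual UTC-offset arithmetic.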

View File

@@ -0,0 +1,5 @@
import { AlertsPageShell } from '@/components/alerts/alerts-page-shell';
export default function AlertsPage() {
return <AlertsPageShell />;
}

View File

@@ -0,0 +1,10 @@
import { ReservationDetail } from '@/components/reservations/reservation-detail';
interface PageProps {
params: Promise<{ portSlug: string; id: string }>;
}
export default async function ReservationDetailPage({ params }: PageProps) {
const { portSlug, id } = await params;
return <ReservationDetail reservationId={id} portSlug={portSlug} />;
}

View File

@@ -0,0 +1,5 @@
import { DashboardShell } from '@/components/dashboard/dashboard-shell';
export default function DashboardPage() {
return <DashboardShell />;
}

View File

@@ -0,0 +1,10 @@
import { DocumentDetail } from '@/components/documents/document-detail';
interface PageProps {
params: Promise<{ portSlug: string; id: string }>;
}
export default async function DocumentDetailPage({ params }: PageProps) {
const { portSlug, id } = await params;
return <DocumentDetail documentId={id} portSlug={portSlug} />;
}

View File

@@ -0,0 +1,138 @@
'use client';
import { useState } from 'react';
import { Grid, List, Upload } from 'lucide-react';
import { useQueryClient } from '@tanstack/react-query';
import { Button } from '@/components/ui/button';
import { PageHeader } from '@/components/shared/page-header';
import { PermissionGate } from '@/components/shared/permission-gate';
import { FileGrid } from '@/components/files/file-grid';
import { FolderTree } from '@/components/files/folder-tree';
import { FileUploadZone } from '@/components/files/file-upload-zone';
import { FilePreviewDialog } from '@/components/files/file-preview-dialog';
import { usePaginatedQuery } from '@/hooks/use-paginated-query';
import { useRealtimeInvalidation } from '@/hooks/use-realtime-invalidation';
import { useFileBrowserStore } from '@/stores/file-browser-store';
import { apiFetch } from '@/lib/api/client';
import type { FileRow } from '@/components/files/file-grid';
export default function DocumentsPage() {
const queryClient = useQueryClient();
const { viewMode, setViewMode, currentFolder, setCurrentFolder } = useFileBrowserStore();
const [showUpload, setShowUpload] = useState(false);
const [previewFile, setPreviewFile] = useState<FileRow | null>(null);
const [, setRenameFile] = useState<FileRow | null>(null);
const { data, isLoading } = usePaginatedQuery<FileRow & { storagePath: string }>({
queryKey: ['files'],
endpoint: '/api/v1/files',
filterDefinitions: [],
});
useRealtimeInvalidation({
'file:uploaded': [['files']],
'file:updated': [['files']],
'file:deleted': [['files']],
});
const filesInFolder = currentFolder
? data.filter((f) => f.storagePath?.includes(currentFolder))
: data;
const handleDownload = async (file: FileRow) => {
try {
const res = await apiFetch<{ data: { url: string; filename: string } }>(
`/api/v1/files/${file.id}/download`,
);
const a = document.createElement('a');
a.href = res.data.url;
a.download = res.data.filename;
a.click();
} catch {
// silent
}
};
const handleDelete = async (file: FileRow) => {
if (!confirm(`Delete "${file.filename}"? This cannot be undone.`)) return;
try {
await apiFetch(`/api/v1/files/${file.id}`, { method: 'DELETE' });
queryClient.invalidateQueries({ queryKey: ['files'] });
} catch {
// silent
}
};
return (
<div className="flex h-full flex-col gap-4">
<PageHeader
title="Documents"
description="Store and manage port documents and attachments"
actions={
<div className="flex items-center gap-2">
<Button
variant="outline"
size="icon"
onClick={() => setViewMode(viewMode === 'grid' ? 'list' : 'grid')}
>
{viewMode === 'grid' ? <List className="h-4 w-4" /> : <Grid className="h-4 w-4" />}
</Button>
<PermissionGate resource="files" action="upload">
<Button size="sm" onClick={() => setShowUpload((v) => !v)}>
<Upload className="mr-1.5 h-4 w-4" />
Upload
</Button>
</PermissionGate>
</div>
}
/>
{showUpload && (
<PermissionGate resource="files" action="upload">
<FileUploadZone
onUploadComplete={() => {
queryClient.invalidateQueries({ queryKey: ['files'] });
setShowUpload(false);
}}
/>
</PermissionGate>
)}
<div className="flex flex-1 gap-4 overflow-hidden">
{/* Folder tree sidebar */}
<aside className="w-48 shrink-0 overflow-y-auto rounded-lg border bg-card p-2">
<p className="mb-1 px-2 text-xs font-medium text-muted-foreground uppercase tracking-wide">
Folders
</p>
<FolderTree
files={data}
currentFolder={currentFolder}
onFolderSelect={setCurrentFolder}
/>
</aside>
{/* Main content */}
<main className="flex-1 overflow-y-auto rounded-lg border bg-card p-4">
<FileGrid
files={filesInFolder}
onDownload={handleDownload}
onPreview={setPreviewFile}
onRename={setRenameFile}
onDelete={handleDelete}
isLoading={isLoading}
/>
</main>
</div>
<FilePreviewDialog
open={!!previewFile}
onOpenChange={(open) => !open && setPreviewFile(null)}
fileId={previewFile?.id}
fileName={previewFile?.filename}
mimeType={previewFile?.mimeType ?? undefined}
/>
</div>
);
}

View File

@@ -0,0 +1,10 @@
import { CreateDocumentWizard } from '@/components/documents/create-document-wizard';
interface PageProps {
params: Promise<{ portSlug: string }>;
}
export default async function NewDocumentPage({ params }: PageProps) {
const { portSlug } = await params;
return <CreateDocumentWizard portSlug={portSlug} />;
}

View File

@@ -1,142 +1,10 @@
-'use client';
-import { useState } from 'react';
-import { Grid, List, Upload } from 'lucide-react';
-import { useQueryClient } from '@tanstack/react-query';
-import { Button } from '@/components/ui/button';
-import { PageHeader } from '@/components/shared/page-header';
-import { PermissionGate } from '@/components/shared/permission-gate';
-import { FileGrid } from '@/components/files/file-grid';
-import { FolderTree } from '@/components/files/folder-tree';
-import { FileUploadZone } from '@/components/files/file-upload-zone';
-import { FilePreviewDialog } from '@/components/files/file-preview-dialog';
-import { usePaginatedQuery } from '@/hooks/use-paginated-query';
-import { useRealtimeInvalidation } from '@/hooks/use-realtime-invalidation';
-import { useFileBrowserStore } from '@/stores/file-browser-store';
-import { apiFetch } from '@/lib/api/client';
-import type { FileRow } from '@/components/files/file-grid';
-
-export default function DocumentsPage() {
-  const queryClient = useQueryClient();
-  const { viewMode, setViewMode, currentFolder, setCurrentFolder } = useFileBrowserStore();
-  const [showUpload, setShowUpload] = useState(false);
-  const [previewFile, setPreviewFile] = useState<FileRow | null>(null);
-  const [, setRenameFile] = useState<FileRow | null>(null);
-  const { data, isLoading } = usePaginatedQuery<FileRow & { storagePath: string }>({
-    queryKey: ['files'],
-    endpoint: '/api/v1/files',
-    filterDefinitions: [],
-  });
-  useRealtimeInvalidation({
-    'file:uploaded': [['files']],
-    'file:updated': [['files']],
-    'file:deleted': [['files']],
-  });
-  const filesInFolder = currentFolder
-    ? data.filter((f) => f.storagePath?.includes(currentFolder))
-    : data;
-  const handleDownload = async (file: FileRow) => {
-    try {
-      const res = await apiFetch<{ data: { url: string; filename: string } }>(
-        `/api/v1/files/${file.id}/download`,
-      );
-      const a = document.createElement('a');
-      a.href = res.data.url;
-      a.download = res.data.filename;
-      a.click();
-    } catch {
-      // silent
-    }
-  };
-  const handleDelete = async (file: FileRow) => {
-    if (!confirm(`Delete "${file.filename}"? This cannot be undone.`)) return;
-    try {
-      await apiFetch(`/api/v1/files/${file.id}`, { method: 'DELETE' });
-      queryClient.invalidateQueries({ queryKey: ['files'] });
-    } catch {
-      // silent
-    }
-  };
-  return (
-    <div className="flex h-full flex-col gap-4">
-      <PageHeader
-        title="Documents"
-        description="Store and manage port documents and attachments"
-        actions={
-          <div className="flex items-center gap-2">
-            <Button
-              variant="outline"
-              size="icon"
-              onClick={() => setViewMode(viewMode === 'grid' ? 'list' : 'grid')}
-            >
-              {viewMode === 'grid' ? (
-                <List className="h-4 w-4" />
-              ) : (
-                <Grid className="h-4 w-4" />
-              )}
-            </Button>
-            <PermissionGate resource="files" action="upload">
-              <Button size="sm" onClick={() => setShowUpload((v) => !v)}>
-                <Upload className="mr-1.5 h-4 w-4" />
-                Upload
-              </Button>
-            </PermissionGate>
-          </div>
-        }
-      />
-      {showUpload && (
-        <PermissionGate resource="files" action="upload">
-          <FileUploadZone
-            onUploadComplete={() => {
-              queryClient.invalidateQueries({ queryKey: ['files'] });
-              setShowUpload(false);
-            }}
-          />
-        </PermissionGate>
-      )}
-      <div className="flex flex-1 gap-4 overflow-hidden">
-        {/* Folder tree sidebar */}
-        <aside className="w-48 shrink-0 overflow-y-auto rounded-lg border bg-card p-2">
-          <p className="mb-1 px-2 text-xs font-medium text-muted-foreground uppercase tracking-wide">
-            Folders
-          </p>
-          <FolderTree
-            files={data}
-            currentFolder={currentFolder}
-            onFolderSelect={setCurrentFolder}
-          />
-        </aside>
-        {/* Main content */}
-        <main className="flex-1 overflow-y-auto rounded-lg border bg-card p-4">
-          <FileGrid
-            files={filesInFolder}
-            onDownload={handleDownload}
-            onPreview={setPreviewFile}
-            onRename={setRenameFile}
-            onDelete={handleDelete}
-            isLoading={isLoading}
-          />
-        </main>
-      </div>
-      <FilePreviewDialog
-        open={!!previewFile}
-        onOpenChange={(open) => !open && setPreviewFile(null)}
-        fileId={previewFile?.id}
-        fileName={previewFile?.filename}
-        mimeType={previewFile?.mimeType ?? undefined}
-      />
-    </div>
-  );
-}
+import { DocumentsHub } from '@/components/documents/documents-hub';
+
+interface PageProps {
+  params: Promise<{ portSlug: string }>;
+}
+
+export default async function DocumentsPage({ params }: PageProps) {
+  const { portSlug } = await params;
+  return <DocumentsHub portSlug={portSlug} />;
+}

View File

@@ -1,16 +1,47 @@
 'use client';
+import { useState } from 'react';
+import { Send } from 'lucide-react';
+import { Button } from '@/components/ui/button';
+import { Tabs, TabsList, TabsTrigger, TabsContent } from '@/components/ui/tabs';
+import { EmailAccountsList } from '@/components/email/email-accounts-list';
+import { EmailThreadsList } from '@/components/email/email-threads-list';
+import { ComposeDialog } from '@/components/email/compose-dialog';

 export default function EmailPage() {
+  const [tab, setTab] = useState('threads');
+  const [composeOpen, setComposeOpen] = useState(false);
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-bold text-foreground">Email</h1>
-        <p className="text-muted-foreground">Send and manage client communications</p>
-      </div>
-      <div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
-        <p className="text-lg font-medium text-muted-foreground">Coming in Layer 3</p>
-        <p className="text-sm text-muted-foreground">
-          This feature will be implemented in the next phase.
-        </p>
-      </div>
+      <div className="flex items-start justify-between gap-4">
+        <div>
+          <h1 className="text-2xl font-bold text-foreground">Email</h1>
+          <p className="text-muted-foreground">Send and manage client communications</p>
+        </div>
+        <Button onClick={() => setComposeOpen(true)}>
+          <Send className="h-4 w-4 mr-1.5" />
+          Compose
+        </Button>
+      </div>
+      <Tabs value={tab} onValueChange={setTab}>
+        <TabsList>
+          <TabsTrigger value="threads">Inbox</TabsTrigger>
+          <TabsTrigger value="accounts">Accounts</TabsTrigger>
+        </TabsList>
+        <TabsContent value="threads" className="pt-4">
+          <EmailThreadsList />
+        </TabsContent>
+        <TabsContent value="accounts" className="pt-4">
+          <EmailAccountsList />
+        </TabsContent>
+      </Tabs>
+      <ComposeDialog open={composeOpen} onOpenChange={setComposeOpen} />
     </div>
   );
 }

View File

@@ -0,0 +1,17 @@
import { NotificationPreferencesForm } from '@/components/notifications/notification-preferences-form';
import { ReminderDigestForm } from '@/components/notifications/reminder-digest-form';
export default function NotificationPreferencesPage() {
return (
<div className="max-w-2xl mx-auto py-6 space-y-6">
<div>
<h1 className="text-2xl font-bold">Notification Preferences</h1>
<p className="text-sm text-muted-foreground">
Choose which notifications you receive and how.
</p>
</div>
<NotificationPreferencesForm />
<ReminderDigestForm />
</div>
);
}

View File

@@ -1,5 +1,7 @@
-import { DashboardShell } from '@/components/dashboard/dashboard-shell';
-
-export default function DashboardPage() {
-  return <DashboardShell />;
+import { redirect } from 'next/navigation';
+
+export default async function PortIndexPage({ params }: { params: Promise<{ portSlug: string }> }) {
+  const { portSlug } = await params;
+  // eslint-disable-next-line @typescript-eslint/no-explicit-any
+  redirect(`/${portSlug}/dashboard` as any);
 }

View File

@@ -0,0 +1,10 @@
import { ResidentialClientDetail } from '@/components/residential/residential-client-detail';
interface Props {
params: Promise<{ id: string }>;
}
export default async function ResidentialClientDetailPage({ params }: Props) {
const { id } = await params;
return <ResidentialClientDetail clientId={id} />;
}

View File

@@ -0,0 +1,5 @@
import { ResidentialClientsList } from '@/components/residential/residential-clients-list';
export default function ResidentialClientsPage() {
return <ResidentialClientsList />;
}

View File

@@ -0,0 +1,10 @@
import { ResidentialInterestDetail } from '@/components/residential/residential-interest-detail';
interface Props {
params: Promise<{ id: string }>;
}
export default async function ResidentialInterestDetailPage({ params }: Props) {
const { id } = await params;
return <ResidentialInterestDetail interestId={id} />;
}

View File

@@ -0,0 +1,5 @@
import { ResidentialInterestsList } from '@/components/residential/residential-interests-list';
export default function ResidentialInterestsPage() {
return <ResidentialInterestsList />;
}

View File

@@ -4,7 +4,8 @@ import { eq } from 'drizzle-orm';
 import { auth } from '@/lib/auth';
 import { db } from '@/lib/db';
-import { userPortRoles } from '@/lib/db/schema/users';
+import { ports as portsTable } from '@/lib/db/schema/ports';
+import { userPortRoles, userProfiles } from '@/lib/db/schema/users';
 import { QueryProvider } from '@/providers/query-provider';
 import { SocketProvider } from '@/providers/socket-provider';
 import { PortProvider } from '@/providers/port-provider';

@@ -16,26 +17,44 @@ export default async function DashboardLayout({ children }: { children: React.Re
   const session = await auth.api.getSession({ headers: await headers() });
   if (!session?.user) redirect('/login');

-  // Load user's port assignments for PortProvider
+  // Super admins have implicit access to every port; everyone else only sees
+  // ports they have an explicit user_port_roles row for.
+  const profile = await db.query.userProfiles.findFirst({
+    where: eq(userProfiles.userId, session.user.id),
+  });
   const portRoles = await db.query.userPortRoles.findMany({
     where: eq(userPortRoles.userId, session.user.id),
     with: { port: true, role: true },
   });
-  const ports = portRoles.map((pr) => pr.port);
+  const ports = profile?.isSuperAdmin
+    ? await db.query.ports.findMany({ orderBy: portsTable.name })
+    : portRoles.map((pr) => pr.port);

   return (
     <QueryProvider>
-      <PortProvider ports={ports} defaultPortId={portRoles[0]?.port.id ?? null}>
+      <PortProvider ports={ports} defaultPortId={ports[0]?.id ?? null}>
         <PermissionsProvider>
           <SocketProvider>
             <div className="flex h-screen overflow-hidden bg-background">
-              <Sidebar portRoles={portRoles} />
+              <Sidebar
+                portRoles={portRoles}
+                isSuperAdmin={profile?.isSuperAdmin ?? false}
+                user={{
+                  name: profile?.displayName ?? session.user.name ?? session.user.email,
+                  email: session.user.email,
+                }}
+              />
               <div className="flex-1 flex flex-col overflow-hidden min-w-0">
-                <Topbar ports={ports} />
-                <main className="flex-1 overflow-y-auto bg-background p-6">
-                  {children}
-                </main>
+                <Topbar
+                  ports={ports}
+                  user={{
+                    name: profile?.displayName ?? session.user.name ?? session.user.email,
+                    email: session.user.email,
+                  }}
+                />
+                <main className="flex-1 overflow-y-auto bg-background p-6">{children}</main>
               </div>
             </div>
           </SocketProvider>
View File

@@ -2,11 +2,12 @@
 import Link from 'next/link';
 import { useState } from 'react';
-import { Loader2, Mail } from 'lucide-react';
+import { CheckCircle2, Loader2 } from 'lucide-react';
 import { Button } from '@/components/ui/button';
 import { Input } from '@/components/ui/input';
 import { Label } from '@/components/ui/label';
+import { BrandedAuthShell } from '@/components/shared/branded-auth-shell';

 export default function PortalForgotPasswordPage() {
   const [email, setEmail] = useState('');

@@ -31,77 +32,74 @@ export default function PortalForgotPasswordPage() {
   if (submitted) {
     return (
-      <div className="min-h-screen flex items-center justify-center bg-gray-50 px-4">
-        <div className="w-full max-w-md text-center">
+      <BrandedAuthShell>
+        <div className="text-center">
           <div className="inline-flex items-center justify-center w-14 h-14 rounded-full bg-green-50 mb-4">
-            <Mail className="h-7 w-7 text-green-600" />
+            <CheckCircle2 className="h-7 w-7 text-green-600" />
           </div>
           <h1 className="text-xl font-semibold text-gray-900 mb-2">Check your email</h1>
-          <p className="text-gray-500 text-sm leading-relaxed">
+          <p className="text-sm text-gray-500 leading-relaxed">
             If <strong>{email}</strong> matches a portal account, we&apos;ve sent a reset link. The
             link expires in 30 minutes.
           </p>
           <Link
             href="/portal/login"
-            className="mt-6 inline-block text-sm text-[#1e2844] hover:underline"
+            className="mt-6 inline-block text-sm text-[#007bff] hover:underline"
           >
             Back to sign in
           </Link>
         </div>
-      </div>
+      </BrandedAuthShell>
     );
   }

   return (
-    <div className="min-h-screen flex items-center justify-center bg-gray-50 px-4">
-      <div className="w-full max-w-sm">
-        <div className="bg-white rounded-lg border p-8 shadow-sm">
-          <div className="text-center mb-6">
-            <h1 className="text-xl font-semibold text-gray-900">Reset your password</h1>
-            <p className="text-sm text-gray-500 mt-1">
-              Enter your email and we&apos;ll send you a reset link.
-            </p>
-          </div>
-          <form onSubmit={handleSubmit} className="space-y-4">
-            <div className="space-y-1.5">
-              <Label htmlFor="email">Email address</Label>
-              <Input
-                id="email"
-                type="email"
-                placeholder="you@example.com"
-                value={email}
-                onChange={(e) => setEmail(e.target.value)}
-                required
-                autoFocus
-                disabled={loading}
-              />
-            </div>
-            <Button
-              type="submit"
-              className="w-full bg-[#1e2844] hover:bg-[#1e2844]/90 text-white"
-              disabled={loading || !email}
-            >
-              {loading ? (
-                <>
-                  <Loader2 className="h-4 w-4 mr-2 animate-spin" />
-                  Sending
-                </>
-              ) : (
-                'Send reset link'
-              )}
-            </Button>
-          </form>
-          <Link
-            href="/portal/login"
-            className="block mt-4 text-center text-xs text-gray-500 hover:underline"
-          >
-            Back to sign in
-          </Link>
-        </div>
-      </div>
-    </div>
+    <BrandedAuthShell>
+      <div className="text-center mb-6">
+        <h1 className="text-xl font-semibold text-gray-900">Reset your password</h1>
+        <p className="text-sm text-gray-500 mt-1">
+          Enter your email and we&apos;ll send you a reset link.
+        </p>
+      </div>
+      <form onSubmit={handleSubmit} className="space-y-4">
+        <div className="space-y-1.5">
+          <Label htmlFor="email">Email address</Label>
+          <Input
+            id="email"
+            type="email"
+            placeholder="you@example.com"
+            value={email}
+            onChange={(e) => setEmail(e.target.value)}
+            required
+            autoFocus
+            autoComplete="email"
+            disabled={loading}
+          />
+        </div>
+        <Button
+          type="submit"
+          className="w-full bg-[#007bff] hover:bg-[#0069d9] text-white"
+          disabled={loading || !email}
+        >
+          {loading ? (
+            <>
+              <Loader2 className="h-4 w-4 mr-2 animate-spin" />
+              Sending
+            </>
+          ) : (
+            'Send reset link'
+          )}
+        </Button>
+        <p className="text-center text-sm text-gray-500">
+          Remember your password?{' '}
+          <Link href="/portal/login" className="text-[#007bff] hover:underline">
+            Sign in
+          </Link>
+        </p>
+      </form>
+    </BrandedAuthShell>
   );
 }

View File

@@ -8,6 +8,7 @@ import { Loader2 } from 'lucide-react';
 import { Button } from '@/components/ui/button';
 import { Input } from '@/components/ui/input';
 import { Label } from '@/components/ui/label';
+import { BrandedAuthShell } from '@/components/shared/branded-auth-shell';

 export default function PortalLoginPage() {
   const router = useRouter();

@@ -48,74 +49,67 @@ export default function PortalLoginPage() {
   }

   return (
-    <div className="min-h-screen flex items-center justify-center bg-gray-50 px-4">
-      <div className="w-full max-w-sm">
-        <div className="bg-white rounded-lg border p-8 shadow-sm">
-          <div className="text-center mb-6">
-            <h1 className="text-xl font-semibold text-gray-900">Client Portal</h1>
-            <p className="text-sm text-gray-500 mt-1">Sign in to your account</p>
-          </div>
-          <form onSubmit={handleSubmit} className="space-y-4">
-            <div className="space-y-1.5">
-              <Label htmlFor="email">Email address</Label>
-              <Input
-                id="email"
-                type="email"
-                placeholder="you@example.com"
-                value={email}
-                onChange={(e) => setEmail(e.target.value)}
-                required
-                autoFocus
-                autoComplete="email"
-                disabled={loading}
-              />
-            </div>
-            <div className="space-y-1.5">
-              <div className="flex items-center justify-between">
-                <Label htmlFor="password">Password</Label>
-                <Link
-                  href="/portal/forgot-password"
-                  className="text-xs text-[#1e2844] hover:underline"
-                >
-                  Forgot password?
-                </Link>
-              </div>
-              <Input
-                id="password"
-                type="password"
-                value={password}
-                onChange={(e) => setPassword(e.target.value)}
-                required
-                autoComplete="current-password"
-                disabled={loading}
-              />
-            </div>
-            {error && <p className="text-sm text-red-600">{error}</p>}
-            <Button
-              type="submit"
-              className="w-full bg-[#1e2844] hover:bg-[#1e2844]/90 text-white"
-              disabled={loading || !email || !password}
-            >
-              {loading ? (
-                <>
-                  <Loader2 className="h-4 w-4 mr-2 animate-spin" />
-                  Signing in
-                </>
-              ) : (
-                'Sign in'
-              )}
-            </Button>
-          </form>
-        </div>
-        <p className="text-center text-xs text-gray-400 mt-4">
-          This portal is for existing clients only.
-        </p>
-      </div>
-    </div>
+    <BrandedAuthShell>
+      <div className="text-center mb-6">
+        <h1 className="text-xl font-semibold text-gray-900">Client Portal</h1>
+        <p className="text-sm text-gray-500 mt-1">Sign in to your account</p>
+      </div>
+      <form onSubmit={handleSubmit} className="space-y-4">
+        <div className="space-y-1.5">
+          <Label htmlFor="email">Email address</Label>
+          <Input
+            id="email"
+            type="email"
+            placeholder="you@example.com"
+            value={email}
+            onChange={(e) => setEmail(e.target.value)}
+            required
+            autoFocus
+            autoComplete="email"
+            disabled={loading}
+          />
+        </div>
+        <div className="space-y-1.5">
+          <div className="flex items-center justify-between">
+            <Label htmlFor="password">Password</Label>
+            <Link href="/portal/forgot-password" className="text-xs text-[#007bff] hover:underline">
+              Forgot password?
+            </Link>
+          </div>
+          <Input
+            id="password"
+            type="password"
+            value={password}
+            onChange={(e) => setPassword(e.target.value)}
+            required
+            autoComplete="current-password"
+            disabled={loading}
+          />
+        </div>
+        {error && <p className="text-sm text-red-600">{error}</p>}
+        <Button
+          type="submit"
+          className="w-full bg-[#007bff] hover:bg-[#0069d9] text-white"
+          disabled={loading || !email || !password}
+        >
+          {loading ? (
+            <>
+              <Loader2 className="h-4 w-4 mr-2 animate-spin" />
+              Signing in
+            </>
+          ) : (
+            'Sign in'
+          )}
+        </Button>
+      </form>
+      <p className="text-center text-xs text-gray-400 mt-6">
+        This portal is for existing clients only.
+      </p>
+    </BrandedAuthShell>
   );
 }

View File

@@ -0,0 +1,50 @@
import { redirect } from 'next/navigation';
import { headers } from 'next/headers';
import { auth } from '@/lib/auth';
import { db } from '@/lib/db';
import { ports as portsTable } from '@/lib/db/schema/ports';
import { QueryProvider } from '@/providers/query-provider';
import { PortProvider } from '@/providers/port-provider';
import { eq } from 'drizzle-orm';
/**
* Minimal layout for the mobile receipt-scanner PWA. No sidebar, no
* topbar — the scanner is its own contained surface. Adds the PWA
* manifest link + theme color so iOS/Android pick up "Add to Home
* Screen". Auth check matches the dashboard layout so unauthorized
* users still bounce to /login.
*/
export default async function ScannerLayout({
children,
params,
}: {
children: React.ReactNode;
params: Promise<{ portSlug: string }>;
}) {
const session = await auth.api.getSession({ headers: await headers() });
if (!session?.user) redirect('/login');
const { portSlug } = await params;
const port = await db.query.ports.findFirst({
where: eq(portsTable.slug, portSlug),
});
if (!port) redirect('/login');
return (
<QueryProvider>
<PortProvider ports={port ? [port] : []} defaultPortId={port?.id ?? null}>
<head>
<link rel="manifest" href={`/${portSlug}/scan/manifest.webmanifest`} />
<meta name="theme-color" content="#3a7bc8" />
<meta name="mobile-web-app-capable" content="yes" />
<meta name="apple-mobile-web-app-capable" content="yes" />
<meta name="apple-mobile-web-app-status-bar-style" content="default" />
<meta name="apple-mobile-web-app-title" content="PN Scanner" />
<meta name="viewport" content="width=device-width, initial-scale=1, viewport-fit=cover" />
</head>
<div className="min-h-[100dvh] bg-background">{children}</div>
</PortProvider>
</QueryProvider>
);
}

View File

@@ -0,0 +1,45 @@
import { NextResponse } from 'next/server';
import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
/**
* Per-port PWA manifest. Scoped to `/<portSlug>/scan` so the install
* only covers the scanner page, not the rest of the CRM. Each port
* gets its own homescreen icon labeled with its name.
*/
export async function GET(_req: Request, { params }: { params: Promise<{ portSlug: string }> }) {
const { portSlug } = await params;
const port = await db.query.ports.findFirst({ where: eq(ports.slug, portSlug) });
const portName = port?.name ?? 'Port Nimara';
const manifest = {
name: `${portName} — Scanner`,
short_name: 'Scanner',
description: `Capture and submit expense receipts for ${portName}.`,
start_url: `/${portSlug}/scan`,
scope: `/${portSlug}/scan`,
display: 'standalone',
orientation: 'portrait',
background_color: '#ffffff',
theme_color: '#3a7bc8',
icons: [
{ src: '/icon-192.png', sizes: '192x192', type: 'image/png' },
{ src: '/icon-512.png', sizes: '512x512', type: 'image/png' },
{
src: '/icon-512-maskable.png',
sizes: '512x512',
type: 'image/png',
purpose: 'maskable',
},
],
};
return NextResponse.json(manifest, {
headers: {
'Content-Type': 'application/manifest+json',
'Cache-Control': 'public, max-age=300, must-revalidate',
},
});
}

View File

@@ -0,0 +1,11 @@
import type { Metadata } from 'next';
import { ScanShell } from '@/components/scan/scan-shell';
export const metadata: Metadata = {
title: 'Scan receipt — Port Nimara',
};
export default function ScanPage() {
return <ScanShell />;
}

View File

@@ -0,0 +1,37 @@
import { NextRequest, NextResponse } from 'next/server';
import { z } from 'zod';
import { errorResponse } from '@/lib/errors';
import { consumeCrmInvite } from '@/lib/services/crm-invite.service';
const bodySchema = z.object({
token: z.string().min(1),
password: z.string().min(9),
});
export async function POST(req: NextRequest): Promise<NextResponse> {
let body: unknown;
try {
body = await req.json();
} catch {
return NextResponse.json({ message: 'Invalid request body' }, { status: 400 });
}
const parsed = bodySchema.safeParse(body);
if (!parsed.success) {
return NextResponse.json(
{ message: parsed.error.errors[0]?.message ?? 'Invalid input' },
{ status: 400 },
);
}
try {
const result = await consumeCrmInvite({
token: parsed.data.token,
password: parsed.data.password,
});
return NextResponse.json({ success: true, email: result.email });
} catch (err) {
return errorResponse(err);
}
}

View File

@@ -1,68 +1,15 @@
 import { NextResponse } from 'next/server';
-import { db } from '@/lib/db';
-import { redis } from '@/lib/redis';
-import { minioClient } from '@/lib/minio';
-import { env } from '@/lib/env';
-import { sql } from 'drizzle-orm';
-
-type CheckStatus = 'ok' | 'error';
-
-interface HealthChecks {
-  postgres: CheckStatus;
-  redis: CheckStatus;
-  minio: CheckStatus;
-}
-
-interface HealthResponse {
-  status: 'healthy' | 'degraded';
-  checks: HealthChecks;
-  timestamp: string;
-}
-
-export async function GET(): Promise<NextResponse<HealthResponse>> {
-  const checks: HealthChecks = {
-    postgres: 'error',
-    redis: 'error',
-    minio: 'error',
-  };
-
-  await Promise.allSettled([
-    db
-      .execute(sql`SELECT 1`)
-      .then(() => {
-        checks.postgres = 'ok';
-      })
-      .catch(() => {
-        checks.postgres = 'error';
-      }),
-    redis
-      .ping()
-      .then(() => {
-        checks.redis = 'ok';
-      })
-      .catch(() => {
-        checks.redis = 'error';
-      }),
-    minioClient
-      .bucketExists(env.MINIO_BUCKET)
-      .then(() => {
-        checks.minio = 'ok';
-      })
-      .catch(() => {
-        checks.minio = 'error';
-      }),
-  ]);
-
-  const allHealthy = Object.values(checks).every((s) => s === 'ok');
-  const status: HealthResponse['status'] = allHealthy ? 'healthy' : 'degraded';
-
-  const body: HealthResponse = {
-    status,
-    checks,
-    timestamp: new Date().toISOString(),
-  };
-
-  return NextResponse.json(body, { status: allHealthy ? 200 : 503 });
+
+/**
+ * Liveness probe — confirms the Next.js process is responding.
+ *
+ * Returns 200 unconditionally; if the process is wedged or has crashed
+ * the request never lands here at all. Do NOT include database/Redis/MinIO
+ * checks in this endpoint — a transient downstream blip should drop the
+ * pod from the load balancer (readiness), not restart the pod (liveness).
+ *
+ * For deep dependency checks, hit `/api/ready` instead.
+ */
+export async function GET() {
+  return NextResponse.json({ status: 'ok', timestamp: new Date().toISOString() });
 }
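The aggregation rule the removed handler used (all dependency checks `ok` yields 200, any failure yields 503) is the behaviour the comment defers to a readiness endpoint. A minimal sketch of that rule as a pure function; `readinessStatus` is a hypothetical name, and the actual `/api/ready` handler is not shown in this diff:

```typescript
type CheckStatus = 'ok' | 'error';

// Hypothetical readiness aggregation: mirrors the removed deep-check logic so
// a /api/ready route could return 503 and drop the pod from rotation without
// triggering a liveness restart.
function readinessStatus(checks: Record<string, CheckStatus>): {
  status: 'healthy' | 'degraded';
  httpStatus: 200 | 503;
} {
  const allHealthy = Object.values(checks).every((s) => s === 'ok');
  return allHealthy
    ? { status: 'healthy', httpStatus: 200 }
    : { status: 'degraded', httpStatus: 503 };
}
```

Keeping this as a pure function makes the 200/503 decision trivially testable, independent of how the individual probes are run.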

View File

@@ -6,7 +6,7 @@ import { activateAccount } from '@/lib/services/portal-auth.service';
 const bodySchema = z.object({
   token: z.string().min(1),
-  password: z.string().min(12),
+  password: z.string().min(9),
 });

 export async function POST(req: NextRequest): Promise<NextResponse> {

View File

@@ -6,7 +6,7 @@ import { resetPassword } from '@/lib/services/portal-auth.service';
 const bodySchema = z.object({
   token: z.string().min(1),
-  password: z.string().min(12),
+  password: z.string().min(9),
 });

 export async function POST(req: NextRequest): Promise<NextResponse> {

View File

@@ -12,31 +12,23 @@ import { yachts, yachtOwnershipHistory } from '@/lib/db/schema/yachts';
 import { companies, companyMemberships } from '@/lib/db/schema/companies';
 import { createAuditLog } from '@/lib/audit';
 import { errorResponse, RateLimitError } from '@/lib/errors';
+import { checkRateLimit, rateLimiters } from '@/lib/rate-limit';
 import { publicInterestSchema } from '@/lib/validators/interests';
 import { sendInquiryNotifications } from '@/lib/services/inquiry-notifications.service';
+import { parsePhone } from '@/lib/i18n/phone';
+import type { CountryCode } from '@/lib/i18n/countries';

-// ─── Simple in-memory rate limiter ───────────────────────────────────────────
-// Max 5 requests per hour per IP
-
-const ipHits = new Map<string, { count: number; resetAt: number }>();
-const WINDOW_MS = 60 * 60 * 1000; // 1 hour
-const MAX_HITS = 5;
-
-function checkRateLimit(ip: string): void {
-  const now = Date.now();
-  const entry = ipHits.get(ip);
-  if (!entry || now > entry.resetAt) {
-    ipHits.set(ip, { count: 1, resetAt: now + WINDOW_MS });
-    return;
-  }
-  if (entry.count >= MAX_HITS) {
-    const retryAfter = Math.ceil((entry.resetAt - now) / 1000);
-    throw new RateLimitError(retryAfter);
-  }
-  entry.count += 1;
-}
+/**
+ * Throws RateLimitError if the IP has exceeded the public-form quota.
+ * Backed by the Redis sliding-window limiter so the cap survives restarts
+ * and is shared across worker processes.
+ */
+async function gateRateLimit(ip: string): Promise<void> {
+  const result = await checkRateLimit(ip, rateLimiters.publicForm);
+  if (!result.allowed) {
+    const retryAfter = Math.max(1, Math.ceil((result.resetAt - Date.now()) / 1000));
+    throw new RateLimitError(retryAfter);
+  }
+}

 type PublicInterestData = z.infer<typeof publicInterestSchema>;
@@ -50,7 +42,7 @@ type Tx = typeof db;
 export async function POST(req: NextRequest) {
   try {
     const ip = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';
-    checkRateLimit(ip);
+    await gateRateLimit(ip);

     const body = await req.json();
     const data = publicInterestSchema.parse(body);
@@ -61,6 +53,16 @@ export async function POST(req: NextRequest) {
       return NextResponse.json({ error: 'Port context required' }, { status: 400 });
     }

+    // Server-side phone normalization for older website builds that post raw
+    // international/national strings. Newer builds may pre-fill phoneE164/Country.
+    let phoneE164 = data.phoneE164 ?? null;
+    let phoneCountry: CountryCode | null = (data.phoneCountry as CountryCode | null) ?? null;
+    if (!phoneE164) {
+      const parsed = parsePhone(data.phone, phoneCountry ?? undefined);
+      phoneE164 = parsed.e164;
+      phoneCountry = parsed.country ?? phoneCountry;
+    }
+
     const fullName =
       data.firstName && data.lastName
         ? `${data.firstName} ${data.lastName}`
@@ -96,17 +98,21 @@ export async function POST(req: NextRequest) {
       });

       if (existingClient && existingClient.portId === portId) {
         clientId = existingClient.id;
+        const updates: Partial<typeof clients.$inferInsert> = {};
         if (data.preferredContactMethod) {
-          await tx
-            .update(clients)
-            .set({ preferredContactMethod: data.preferredContactMethod })
-            .where(eq(clients.id, clientId));
+          updates.preferredContactMethod = data.preferredContactMethod;
+        }
+        if (data.nationalityIso && !existingClient.nationalityIso) {
+          updates.nationalityIso = data.nationalityIso;
+        }
+        if (Object.keys(updates).length > 0) {
+          await tx.update(clients).set(updates).where(eq(clients.id, clientId));
         }
       } else {
-        clientId = await createClientInTx(tx, portId, fullName, data);
+        clientId = await createClientInTx(tx, portId, fullName, data, phoneE164, phoneCountry);
       }
     } else {
-      clientId = await createClientInTx(tx, portId, fullName, data);
+      clientId = await createClientInTx(tx, portId, fullName, data, phoneE164, phoneCountry);
     }

     // 2. Optional: upsert company + add membership
@@ -128,7 +134,8 @@ export async function POST(req: NextRequest) {
           name: data.company.name,
           legalName: data.company.legalName ?? null,
           taxId: data.company.taxId ?? null,
-          incorporationCountry: data.company.incorporationCountry ?? null,
+          incorporationCountryIso: data.company.incorporationCountryIso ?? null,
+          incorporationSubdivisionIso: data.company.incorporationSubdivisionIso ?? null,
           status: 'active',
         })
         .returning();
@@ -198,9 +205,9 @@ export async function POST(req: NextRequest) {
         label: 'Primary',
         streetAddress: data.address.street ?? null,
         city: data.address.city ?? null,
-        stateProvince: data.address.stateProvince ?? null,
+        subdivisionIso: data.address.subdivisionIso ?? null,
         postalCode: data.address.postalCode ?? null,
-        country: data.address.country ?? null,
+        countryIso: data.address.countryIso ?? null,
         isPrimary: true,
       });
     }
@@ -279,7 +286,9 @@ async function createClientInTx(
   tx: Tx,
   portId: string,
   fullName: string,
-  data: Pick<PublicInterestData, 'email' | 'phone' | 'preferredContactMethod'>,
+  data: Pick<PublicInterestData, 'email' | 'phone' | 'preferredContactMethod' | 'nationalityIso'>,
+  phoneE164: string | null,
+  phoneCountry: CountryCode | null,
 ): Promise<string> {
   const [newClient] = await tx
     .insert(clients)
@@ -287,6 +296,7 @@ async function createClientInTx(
       portId,
       fullName,
       preferredContactMethod: data.preferredContactMethod,
+      nationalityIso: data.nationalityIso ?? null,
       source: 'website',
     })
     .returning();
@@ -303,6 +313,8 @@ async function createClientInTx(
       clientId,
       channel: 'phone',
       value: data.phone,
+      valueE164: phoneE164,
+      valueCountry: phoneCountry,
       isPrimary: false,
     });
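The commit replaces a per-process fixed-window `Map` limiter with the shared Redis-backed `checkRateLimit(ip, rateLimiters.publicForm)`. As a rough illustration of the sliding-window behaviour that swap buys (the cap tracks the last N requests in a rolling window rather than resetting on a fixed boundary), here is a self-contained sketch; the names, the config shape, and the in-memory array standing in for a Redis sorted set are all illustrative, not the repo's real helpers.

```typescript
type LimiterConfig = { windowMs: number; maxHits: number };

// Stand-in for the Redis sorted set: one timestamp list per key.
const hitLog = new Map<string, number[]>();

function checkSlidingWindow(
  key: string,
  cfg: LimiterConfig,
  now: number = Date.now(),
): { allowed: boolean; resetAt: number } {
  // Drop timestamps that have slid out of the window
  // (the ZREMRANGEBYSCORE step in a real Redis implementation).
  const recent = (hitLog.get(key) ?? []).filter((t) => t > now - cfg.windowMs);
  if (recent.length >= cfg.maxHits) {
    // The oldest surviving hit decides when capacity frees up again.
    return { allowed: false, resetAt: recent[0] + cfg.windowMs };
  }
  recent.push(now);
  hitLog.set(key, recent);
  return { allowed: true, resetAt: now + cfg.windowMs };
}
```

Unlike the removed fixed-window version, a burst straddling a window boundary cannot double the effective quota, and (in the real Redis form) the counts survive restarts and are shared across workers.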

View File

@@ -0,0 +1,189 @@
import { NextRequest, NextResponse } from 'next/server';
import { and, eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { withTransaction } from '@/lib/db/utils';
import { ports } from '@/lib/db/schema/ports';
import { residentialClients, residentialInterests } from '@/lib/db/schema/residential';
import { systemSettings } from '@/lib/db/schema/system';
import { sendEmail } from '@/lib/email';
import {
residentialClientConfirmation,
residentialSalesAlert,
} from '@/lib/email/templates/residential-inquiry';
import { env } from '@/lib/env';
import { errorResponse, RateLimitError, ValidationError } from '@/lib/errors';
import { logger } from '@/lib/logger';
import { checkRateLimit, rateLimiters } from '@/lib/rate-limit';
import { publicResidentialInquirySchema } from '@/lib/validators/residential';
import { emitToRoom } from '@/lib/socket/server';
import { parsePhone } from '@/lib/i18n/phone';
import type { CountryCode } from '@/lib/i18n/countries';
/**
* Throws RateLimitError if the IP has exceeded the public-form quota.
* Backed by the Redis sliding-window limiter so the cap survives restarts
* and is shared across worker processes.
*/
async function gateRateLimit(ip: string): Promise<void> {
const result = await checkRateLimit(ip, rateLimiters.publicForm);
if (!result.allowed) {
const retryAfter = Math.max(1, Math.ceil((result.resetAt - Date.now()) / 1000));
throw new RateLimitError(retryAfter);
}
}
/**
* POST /api/public/residential-inquiries — unauthenticated entry point for
* the public website's residential interest form. Creates a
* `residential_clients` row and an opening `residential_interests` row in a
* single transaction.
*
* Required: `portId` query param or `X-Port-Id` header.
*/
export async function POST(req: NextRequest) {
try {
const ip = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';
await gateRateLimit(ip);
const body = await req.json();
const data = publicResidentialInquirySchema.parse(body);
const portId = req.nextUrl.searchParams.get('portId') ?? req.headers.get('X-Port-Id');
if (!portId) {
throw new ValidationError('portId is required');
}
const port = await db.query.ports.findFirst({ where: eq(ports.id, portId) });
if (!port) {
throw new ValidationError('Unknown port');
}
// If the website didn't pre-normalize, parse server-side. International
// strings parse without a hint; national-format submissions need a country.
let phoneE164 = data.phoneE164 ?? null;
let phoneCountry: CountryCode | null = (data.phoneCountry as CountryCode | null) ?? null;
if (!phoneE164) {
const parsed = parsePhone(data.phone, phoneCountry ?? undefined);
phoneE164 = parsed.e164;
phoneCountry = parsed.country ?? phoneCountry;
}
const result = await withTransaction(async (tx) => {
const [client] = await tx
.insert(residentialClients)
.values({
portId,
fullName: `${data.firstName.trim()} ${data.lastName.trim()}`.trim(),
email: data.email,
phone: data.phone,
phoneE164,
phoneCountry,
nationalityIso: data.nationalityIso ?? null,
timezone: data.timezone ?? null,
placeOfResidence: data.placeOfResidence,
placeOfResidenceCountryIso: data.placeOfResidenceCountryIso ?? null,
subdivisionIso: data.subdivisionIso ?? null,
preferredContactMethod: data.preferredContactMethod,
source: 'website',
status: 'prospect',
notes: data.notes,
})
.returning();
if (!client) throw new Error('Failed to create residential client');
const [interest] = await tx
.insert(residentialInterests)
.values({
portId,
residentialClientId: client.id,
pipelineStage: 'new',
source: 'website',
notes: data.notes,
preferences: data.preferences,
})
.returning();
if (!interest) throw new Error('Failed to create residential interest');
return { clientId: client.id, interestId: interest.id };
});
emitToRoom(`port:${portId}`, 'residential_client:created', { id: result.clientId });
emitToRoom(`port:${portId}`, 'residential_interest:created', { id: result.interestId });
// Send notification emails (non-blocking — failures shouldn't 500 the
// public form).
void sendResidentialNotifications({
portId,
data,
crmDeepLink: `${env.APP_URL}/${port.slug}/residential/clients/${result.clientId}`,
}).catch((err) => logger.error({ err }, 'Failed to send residential inquiry notifications'));
return NextResponse.json({ success: true, ...result }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
}
async function sendResidentialNotifications(args: {
portId: string;
data: {
firstName: string;
lastName: string;
email: string;
phone: string;
placeOfResidence?: string;
preferredContactMethod?: 'email' | 'phone';
notes?: string;
preferences?: string;
};
crmDeepLink: string;
}): Promise<void> {
const { portId, data, crmDeepLink } = args;
// Client confirmation
const confirmation = residentialClientConfirmation({
firstName: data.firstName,
contactEmail: 'sales@portnimara.com',
});
await sendEmail(data.email, confirmation.subject, confirmation.html);
// Sales-team alert — pull recipients from system_settings if configured;
// fall back to the inquiry_contact_email if available.
const recipientsRow = await db.query.systemSettings.findFirst({
where: and(
eq(systemSettings.key, 'residential_notification_recipients'),
eq(systemSettings.portId, portId),
),
});
const fallbackRow = await db.query.systemSettings.findFirst({
where: and(eq(systemSettings.key, 'inquiry_contact_email'), eq(systemSettings.portId, portId)),
});
const configured = Array.isArray(recipientsRow?.value) ? (recipientsRow!.value as string[]) : [];
const fallback =
typeof fallbackRow?.value === 'string' && fallbackRow.value.length > 0
? [fallbackRow.value]
: [];
const recipients = configured.length > 0 ? configured : fallback;
if (recipients.length === 0) {
logger.warn(
{ portId },
'No residential_notification_recipients or inquiry_contact_email configured; skipping sales alert',
);
return;
}
const alert = residentialSalesAlert({
fullName: `${data.firstName} ${data.lastName}`.trim(),
email: data.email,
phone: data.phone,
placeOfResidence: data.placeOfResidence,
preferredContactMethod: data.preferredContactMethod,
notes: data.notes,
preferences: data.preferences,
crmDeepLink,
});
await sendEmail(recipients, alert.subject, alert.html);
}
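The recipient-resolution rule in `sendResidentialNotifications` (configured list wins, otherwise fall back to the single contact email, otherwise skip the alert) can be condensed into a pure function. This is a sketch for illustration only; the route reads the two values from `system_settings` rows rather than taking them as parameters.

```typescript
// Resolve sales-alert recipients: a configured string[] wins; otherwise a
// non-empty fallback email; an empty result means "skip the alert".
function resolveRecipients(configuredValue: unknown, fallbackValue: unknown): string[] {
  const configured = Array.isArray(configuredValue)
    ? configuredValue.filter((v): v is string => typeof v === 'string')
    : [];
  const fallback =
    typeof fallbackValue === 'string' && fallbackValue.length > 0 ? [fallbackValue] : [];
  return configured.length > 0 ? configured : fallback;
}
```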

View File

@@ -0,0 +1,82 @@
import { NextResponse } from 'next/server';
import { sql } from 'drizzle-orm';
import { db } from '@/lib/db';
import { redis } from '@/lib/redis';
import { minioClient } from '@/lib/minio';
import { env } from '@/lib/env';
type CheckStatus = 'ok' | 'error';
interface ReadyChecks {
postgres: CheckStatus;
redis: CheckStatus;
minio: CheckStatus;
}
interface ReadyResponse {
status: 'ready' | 'degraded';
checks: ReadyChecks;
timestamp: string;
}
/**
* Readiness probe — verifies that every backing service this process
* needs to serve traffic is reachable. A 503 should drop the pod from the
* load balancer until the next probe succeeds; it should not trigger a
* pod restart (that's what `/api/health` is for).
*
* Checks:
* - postgres: `SELECT 1` against the primary
* - redis: `PING`
* - minio: `bucketExists(<configured-bucket>)`
*
* Documenso + SMTP are intentionally not probed here: they're optional
* integrations, and each tenant configures its own credentials. A
* tenant-misconfigured Documenso instance shouldn't deadline the entire
* shared CRM.
*/
export async function GET(): Promise<NextResponse<ReadyResponse>> {
const checks: ReadyChecks = {
postgres: 'error',
redis: 'error',
minio: 'error',
};
await Promise.allSettled([
db
.execute(sql`SELECT 1`)
.then(() => {
checks.postgres = 'ok';
})
.catch(() => {
checks.postgres = 'error';
}),
redis
.ping()
.then(() => {
checks.redis = 'ok';
})
.catch(() => {
checks.redis = 'error';
}),
minioClient
.bucketExists(env.MINIO_BUCKET)
.then(() => {
checks.minio = 'ok';
})
.catch(() => {
checks.minio = 'error';
}),
]);
const allReady = Object.values(checks).every((s) => s === 'ok');
const status: ReadyResponse['status'] = allReady ? 'ready' : 'degraded';
return NextResponse.json(
{ status, checks, timestamp: new Date().toISOString() },
{ status: allReady ? 200 : 503 },
);
}
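The probe's aggregation pattern (run every dependency check via `Promise.allSettled` so one failure never rejects the whole sweep, then report per-check status) generalizes beyond Postgres/Redis/MinIO. A minimal generic sketch, with stub probes in place of real clients:

```typescript
type CheckStatus = 'ok' | 'error';

// Run every probe concurrently; allSettled guarantees we wait for all of
// them and never throw, so a failing dependency degrades rather than crashes.
async function runChecks(
  probes: Record<string, () => Promise<unknown>>,
): Promise<{ checks: Record<string, CheckStatus>; healthy: boolean }> {
  const checks: Record<string, CheckStatus> = {};
  await Promise.allSettled(
    Object.entries(probes).map(([name, probe]) =>
      probe()
        .then(() => {
          checks[name] = 'ok';
        })
        .catch(() => {
          checks[name] = 'error';
        }),
    ),
  );
  return { checks, healthy: Object.values(checks).every((s) => s === 'ok') };
}
```

In the route this feeds the 200-vs-503 decision, which is what actually pulls the pod out of the load balancer.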

View File

@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import {
getAiBudget,
setAiBudget,
currentPeriodTokens,
periodBreakdown,
} from '@/lib/services/ai-budget.service';
const saveSchema = z.object({
enabled: z.boolean().optional(),
softCapTokens: z.number().int().nonnegative().max(100_000_000).optional(),
hardCapTokens: z.number().int().nonnegative().max(100_000_000).optional(),
period: z.enum(['day', 'week', 'month']).optional(),
});
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const [budget, used, breakdown] = await Promise.all([
getAiBudget(ctx.portId),
currentPeriodTokens(ctx.portId),
periodBreakdown(ctx.portId),
]);
return NextResponse.json({ data: { budget, used, breakdown } });
} catch (error) {
return errorResponse(error);
}
}),
);
export const PUT = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const body = await parseBody(req, saveSchema);
const next = await setAiBudget(ctx.portId, body, ctx.userId);
return NextResponse.json({ data: next });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,25 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { runAlertEngineForPorts } from '@/lib/services/alert-engine';
/**
* Admin trigger for an immediate alert engine sweep over the caller's port.
* Useful for manual ops ("re-evaluate now after I fixed a rule") and
* exercised by the realapi socket fanout test.
*
* Requires super_admin or per-port admin permissions; the engine itself
 * Requires super_admin (enforced below); the engine itself
* is idempotent — duplicate runs only re-evaluate, never duplicate rows.
*/
export const POST = withAuth(async (_req, ctx) => {
try {
if (!ctx.isSuperAdmin) {
return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
}
const summary = await runAlertEngineForPorts([ctx.portId]);
return NextResponse.json({ data: summary });
} catch (error) {
return errorResponse(error);
}
});

View File

@@ -1,29 +1,76 @@
 import { NextResponse } from 'next/server';
 import { z } from 'zod';
+import { inArray } from 'drizzle-orm';
 import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseQuery } from '@/lib/api/route-helpers';
-import { listAuditLogs } from '@/lib/services/audit.service';
+import { searchAuditLogs } from '@/lib/services/audit-search.service';
+import { db } from '@/lib/db';
+import { user } from '@/lib/db/schema/users';
 import { errorResponse } from '@/lib/errors';

 const auditQuerySchema = z.object({
-  page: z.coerce.number().int().min(1).default(1),
-  limit: z.coerce.number().int().min(1).max(100).default(50),
+  limit: z.coerce.number().int().min(1).max(200).default(50),
   entityType: z.string().optional(),
   action: z.string().optional(),
   userId: z.string().optional(),
   entityId: z.string().optional(),
   dateFrom: z.string().optional(),
   dateTo: z.string().optional(),
+  /** Free-text query against the tsvector `search_text` column. */
   search: z.string().optional(),
+  /** Cursor pair from the previous page's response. */
+  cursorAt: z.string().optional(),
+  cursorId: z.string().optional(),
 });
 export const GET = withAuth(
   withPermission('admin', 'view_audit_log', async (req, ctx) => {
     try {
       const query = parseQuery(req, auditQuerySchema);
-      const result = await listAuditLogs(ctx.portId, query);
-      return NextResponse.json(result);
+      const cursor =
+        query.cursorAt && query.cursorId
+          ? { createdAt: new Date(query.cursorAt), id: query.cursorId }
+          : undefined;
+      const { rows, nextCursor } = await searchAuditLogs({
+        portId: ctx.portId,
+        q: query.search,
+        userId: query.userId,
+        action: query.action,
+        entityType: query.entityType,
+        entityId: query.entityId,
+        from: query.dateFrom ? new Date(query.dateFrom) : undefined,
+        to: query.dateTo ? new Date(query.dateTo) : undefined,
+        cursor,
+        limit: query.limit,
+      });
+
+      // Resolve actor emails in one batched query so the table can show
+      // who did what without N+1 round trips.
+      const userIds = Array.from(
+        new Set(rows.map((r) => r.userId).filter((id): id is string => Boolean(id))),
+      );
+      const userRows = userIds.length
+        ? await db
+            .select({ id: user.id, email: user.email, name: user.name })
+            .from(user)
+            .where(inArray(user.id, userIds))
+        : [];
+      const userMap = new Map(userRows.map((u) => [u.id, u]));
+
+      const data = rows.map((r) => ({
+        ...r,
+        actor: r.userId ? (userMap.get(r.userId) ?? null) : null,
+      }));
+
+      return NextResponse.json({
+        data,
+        pagination: {
+          nextCursor: nextCursor
+            ? { createdAt: nextCursor.createdAt.toISOString(), id: nextCursor.id }
+            : null,
+        },
+      });
     } catch (error) {
       return errorResponse(error);
     }
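The audit route's switch from page/offset to a `(createdAt, id)` cursor is classic keyset pagination: the id tiebreaker makes the sort total, so rows sharing a timestamp are never skipped or repeated across pages. A self-contained sketch of the mechanics, assuming the service orders by `(createdAt DESC, id DESC)`; the row shape and helper name are illustrative, not the real `searchAuditLogs` API.

```typescript
interface AuditRow {
  id: string;
  createdAt: Date;
}

// rows must already be sorted by (createdAt DESC, id DESC).
function pageAfterCursor(
  rows: AuditRow[],
  limit: number,
  cursor?: { createdAt: Date; id: string },
): { page: AuditRow[]; nextCursor: { createdAt: Date; id: string } | null } {
  // Keyset predicate: strictly "after" the cursor in the total order,
  // breaking timestamp ties on id.
  const after = cursor
    ? rows.filter(
        (r) =>
          r.createdAt.getTime() < cursor.createdAt.getTime() ||
          (r.createdAt.getTime() === cursor.createdAt.getTime() && r.id < cursor.id),
      )
    : rows;
  const page = after.slice(0, limit);
  const hasMore = after.length > limit;
  const last = page[page.length - 1];
  return {
    page,
    nextCursor: hasMore && last ? { createdAt: last.createdAt, id: last.id } : null,
  };
}
```

In SQL this predicate is a single indexed tuple comparison, which is why keyset pages stay O(limit) no matter how deep the caller scrolls, unlike OFFSET.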

View File

@@ -0,0 +1,20 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { checkDocumensoHealth } from '@/lib/services/documenso-client';
/**
* Admin probe — calls Documenso /api/v1/health using the port's effective
* config. Used by the "Test connection" button on /admin/documenso.
*/
export const POST = withAuth(
withPermission('admin', 'manage_settings', async (_req, ctx) => {
try {
const result = await checkDocumensoHealth(ctx.portId);
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,58 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse, NotFoundError } from '@/lib/errors';
import {
deleteFormTemplate,
getFormTemplateById,
updateFormTemplate,
} from '@/lib/services/form-templates.service';
import { updateFormTemplateSchema } from '@/lib/validators/form-templates';
export const GET = withAuth(
withPermission('admin', 'manage_forms', async (_req, ctx, params) => {
try {
if (!params.id) throw new NotFoundError('Form template');
const tpl = await getFormTemplateById(params.id, ctx.portId);
return NextResponse.json({ data: tpl });
} catch (error) {
return errorResponse(error);
}
}),
);
export const PATCH = withAuth(
withPermission('admin', 'manage_forms', async (req, ctx, params) => {
try {
if (!params.id) throw new NotFoundError('Form template');
const body = await parseBody(req, updateFormTemplateSchema);
const tpl = await updateFormTemplate(params.id, ctx.portId, body, {
userId: ctx.userId,
portId: ctx.portId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return NextResponse.json({ data: tpl });
} catch (error) {
return errorResponse(error);
}
}),
);
export const DELETE = withAuth(
withPermission('admin', 'manage_forms', async (_req, ctx, params) => {
try {
if (!params.id) throw new NotFoundError('Form template');
await deleteFormTemplate(params.id, ctx.portId, {
userId: ctx.userId,
portId: ctx.portId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return new NextResponse(null, { status: 204 });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,35 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { createFormTemplate, listFormTemplates } from '@/lib/services/form-templates.service';
import { createFormTemplateSchema } from '@/lib/validators/form-templates';
export const GET = withAuth(
withPermission('admin', 'manage_forms', async (_req, ctx) => {
try {
const data = await listFormTemplates(ctx.portId);
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}),
);
export const POST = withAuth(
withPermission('admin', 'manage_forms', async (req, ctx) => {
try {
const body = await parseBody(req, createFormTemplateSchema);
const tpl = await createFormTemplate(ctx.portId, body, {
userId: ctx.userId,
portId: ctx.portId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return NextResponse.json({ data: tpl }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,28 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { resendCrmInvite } from '@/lib/services/crm-invite.service';
// Resend mints a fresh token + new email on a global invite row;
// restrict to super-admins to match revoke/list and avoid cross-tenant
// re-issuance of foreign-port invitations.
export const POST = withAuth(
withPermission('admin', 'manage_users', async (_req, ctx, params) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Resending CRM invites requires super-admin');
}
const id = params.id ?? '';
const result = await resendCrmInvite(id, {
userId: ctx.userId,
portId: ctx.portId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return NextResponse.json({ data: result });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,28 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { revokeCrmInvite } from '@/lib/services/crm-invite.service';
// Invites are a global resource (no portId column). Revoking a foreign
// tenant's pending invite by id would be cross-tenant tampering;
// restrict to super-admins to match the listing endpoint.
export const DELETE = withAuth(
withPermission('admin', 'manage_users', async (_req, ctx, params) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Revoking CRM invites requires super-admin');
}
const id = params.id ?? '';
await revokeCrmInvite(id, {
userId: ctx.userId,
portId: ctx.portId,
ipAddress: ctx.ipAddress,
userAgent: ctx.userAgent,
});
return NextResponse.json({ success: true });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { createCrmInvite, listCrmInvites } from '@/lib/services/crm-invite.service';
export const GET = withAuth(
withPermission('admin', 'manage_users', async (_req, ctx) => {
try {
// crm_user_invites is a global table (no per-port column) — invites
// mint better-auth users that may later be assigned roles in any
// port. Listing it cross-tenant would let a port-A director
// enumerate pending invitee emails, names, and isSuperAdmin flags
// for every other tenant. Restrict the listing to super-admins.
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Listing CRM invites requires super-admin');
}
const data = await listCrmInvites();
return NextResponse.json({ data });
} catch (error) {
return errorResponse(error);
}
}),
);
const createInviteSchema = z.object({
email: z.string().email(),
name: z.string().min(1).max(200).optional(),
isSuperAdmin: z.boolean().optional().default(false),
});
export const POST = withAuth(
withPermission('admin', 'manage_users', async (req, ctx) => {
try {
const body = await parseBody(req, createInviteSchema);
// Only existing super-admins can mint super-admin invitations. The
// manage_users permission is granted to port-scoped director roles,
// which must not be able to elevate themselves cross-tenant by
// inviting a fresh super_admin.
if (body.isSuperAdmin && !ctx.isSuperAdmin) {
throw new ForbiddenError('Only super admins can mint super-admin invitations');
}
const result = await createCrmInvite({ ...body, invitedBy: ctx });
return NextResponse.json({ data: result }, { status: 201 });
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -0,0 +1,72 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { getPublicOcrConfig, saveOcrConfig, OCR_MODELS } from '@/lib/services/ocr-config.service';
const saveSchema = z.object({
/** When 'global', requires super_admin and stores at port_id=null. */
scope: z.enum(['port', 'global']),
provider: z.enum(['openai', 'claude']),
model: z.string().min(1),
apiKey: z.string().optional(),
clearApiKey: z.boolean().optional(),
useGlobal: z.boolean().optional(),
aiEnabled: z.boolean().optional(),
});
// Only role tiers that hold `admin.manage_settings` (director / super_admin)
// may read or write the OCR config: the apiKey is stored encrypted but is
// passed straight into the receipt-scan handler, so a swapped key would
// exfiltrate every subsequent receipt image to whatever endpoint that key
// authenticates with.
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const url = new URL(req.url);
const scope = url.searchParams.get('scope') ?? 'port';
if (scope === 'global' && !ctx.isSuperAdmin) {
return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
}
const config = await getPublicOcrConfig(scope === 'global' ? null : ctx.portId);
return NextResponse.json({ data: config, models: OCR_MODELS });
} catch (error) {
return errorResponse(error);
}
}),
);
export const PUT = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const body = await parseBody(req, saveSchema);
if (body.scope === 'global' && !ctx.isSuperAdmin) {
return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
}
const validModels = OCR_MODELS[body.provider];
if (!validModels.includes(body.model)) {
return NextResponse.json(
{ error: `Invalid model for provider ${body.provider}` },
{ status: 400 },
);
}
await saveOcrConfig(
body.scope === 'global' ? null : ctx.portId,
{
provider: body.provider,
model: body.model,
apiKey: body.apiKey,
clearApiKey: body.clearApiKey,
useGlobal: body.useGlobal,
aiEnabled: body.aiEnabled,
},
ctx.userId,
);
return NextResponse.json({ ok: true });
} catch (error) {
return errorResponse(error);
}
}),
);
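Both OCR routes reject a model that is not on the chosen provider's allow-list before any key is stored or tested. The check reduces to a lookup against a per-provider table; the model names below are placeholders, since the real lists live in `OCR_MODELS` in `ocr-config.service`.

```typescript
// Hypothetical allow-list mirroring the shape of OCR_MODELS.
const MODELS: Record<'openai' | 'claude', string[]> = {
  openai: ['example-openai-model'],
  claude: ['example-claude-model'],
};

// A model is valid only for the provider whose list contains it, which
// stops a caller pairing provider A's key with provider B's model string.
function isValidModel(provider: 'openai' | 'claude', model: string): boolean {
  return MODELS[provider].includes(model);
}
```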

View File

@@ -0,0 +1,31 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { OCR_MODELS } from '@/lib/services/ocr-config.service';
import { testProvider } from '@/lib/services/ocr-providers';
const schema = z.object({
provider: z.enum(['openai', 'claude']),
model: z.string().min(1),
apiKey: z.string().min(1),
});
// `manage_settings`-gated for parity with the parent OCR settings route —
// triggers outbound AI provider auth requests using a caller-supplied key.
export const POST = withAuth(
withPermission('admin', 'manage_settings', async (req) => {
try {
const body = await parseBody(req, schema);
if (!OCR_MODELS[body.provider].includes(body.model)) {
return NextResponse.json({ error: 'Invalid model' }, { status: 400 });
}
const result = await testProvider(body.provider, body.apiKey, body.model);
return NextResponse.json(result);
} catch (error) {
return errorResponse(error);
}
}),
);

View File

@@ -4,11 +4,25 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { getPort, updatePort } from '@/lib/services/ports.service';
 import { updatePortSchema } from '@/lib/validators/ports';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
+
+/**
+ * Non-super-admin callers (e.g. port directors holding admin.manage_settings)
+ * may only read/mutate THEIR OWN port row. The path id is therefore
+ * compared against ctx.portId and a foreign target is rejected before the
+ * service is touched. Super-admins retain unrestricted access.
+ */
+function assertPortInScope(targetPortId: string, ctx: { portId: string; isSuperAdmin: boolean }) {
+  if (ctx.isSuperAdmin) return;
+  if (targetPortId !== ctx.portId) {
+    throw new ForbiddenError('Cross-tenant port access denied');
+  }
+}

 export const GET = withAuth(
-  withPermission('admin', 'manage_settings', async (_req, _ctx, params) => {
+  withPermission('admin', 'manage_settings', async (_req, ctx, params) => {
     try {
+      assertPortInScope(params.id!, ctx);
       const data = await getPort(params.id!);
       return NextResponse.json({ data });
     } catch (error) {
@@ -20,6 +34,7 @@ export const GET = withAuth(
export const PATCH = withAuth( export const PATCH = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx, params) => { withPermission('admin', 'manage_settings', async (req, ctx, params) => {
try { try {
assertPortInScope(params.id!, ctx);
const body = await parseBody(req, updatePortSchema); const body = await parseBody(req, updatePortSchema);
const data = await updatePort(params.id!, body, { const data = await updatePort(params.id!, body, {
userId: ctx.userId, userId: ctx.userId,
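The scope guard in this file is self-contained enough to check in isolation. A minimal sketch, with a stand-in `ForbiddenError` (the real class lives in `@/lib/errors`; this only assumes it is an `Error` subclass):

```typescript
// Stand-in for the project's ForbiddenError (assumed shape, not the real class).
class ForbiddenError extends Error {}

interface AuthCtx {
  portId: string;
  isSuperAdmin: boolean;
}

// Mirrors the guard added in the diff: super-admins pass unconditionally,
// everyone else may only target their own port row.
function assertPortInScope(targetPortId: string, ctx: AuthCtx): void {
  if (ctx.isSuperAdmin) return;
  if (targetPortId !== ctx.portId) {
    throw new ForbiddenError('Cross-tenant port access denied');
  }
}

const director: AuthCtx = { portId: 'port-a', isSuperAdmin: false };

assertPortInScope('port-a', director); // own port: passes silently
assertPortInScope('port-b', { portId: 'port-a', isSuperAdmin: true }); // super-admin: passes

let threw = false;
try {
  assertPortInScope('port-b', director); // foreign port: rejected
} catch (e) {
  threw = e instanceof ForbiddenError;
}
console.log(threw); // true
```

Running the check before touching the service means a foreign-port id fails fast with 403 rather than leaking timing or existence information through the service layer.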

View File

@@ -4,11 +4,18 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { listPorts, createPort } from '@/lib/services/ports.service';
 import { createPortSchema } from '@/lib/validators/ports';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
+
+// Listing every tenant and creating new tenants are super-admin operations:
+// a port director must not be able to enumerate other ports (target
+// discovery for cross-tenant attacks) or spin up new tenants whose admin
+// they implicitly become.
 
 export const GET = withAuth(
-  withPermission('admin', 'manage_settings', async () => {
+  withPermission('admin', 'manage_settings', async (_req, ctx) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Listing all ports requires super-admin');
+      }
       const data = await listPorts();
       return NextResponse.json({ data });
     } catch (error) {
@@ -20,6 +27,9 @@ export const GET = withAuth(
 export const POST = withAuth(
   withPermission('admin', 'manage_settings', async (req, ctx) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Creating ports requires super-admin');
+      }
       const body = await parseBody(req, createPortSchema);
       const data = await createPort(body, {
         userId: ctx.userId,

View File

@@ -4,14 +4,17 @@ import { withAuth } from '@/lib/api/helpers';
 import { getEmailDraftResult } from '@/lib/services/email-draft.service';
 import { errorResponse } from '@/lib/errors';
 
-export const GET = withAuth(async (_req, _ctx, params) => {
+export const GET = withAuth(async (_req, ctx, params) => {
   try {
     const { jobId } = params;
     if (!jobId) {
       return NextResponse.json({ error: 'jobId is required' }, { status: 400 });
     }
-    const result = await getEmailDraftResult(jobId);
+    const result = await getEmailDraftResult(jobId, {
+      userId: ctx.userId,
+      portId: ctx.portId,
+    });
     if (result === null) {
       return NextResponse.json({ status: 'processing' });

View File

@@ -0,0 +1,11 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { acknowledgeAlert } from '@/lib/services/alerts.service';

export const POST = withAuth(async (_req, ctx, params) => {
  const id = params.id;
  if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
  await acknowledgeAlert(id, ctx.portId, ctx.userId);
  return NextResponse.json({ ok: true });
});

View File

@@ -0,0 +1,11 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { dismissAlert } from '@/lib/services/alerts.service';

export const POST = withAuth(async (_req, ctx, params) => {
  const id = params.id;
  if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
  await dismissAlert(id, ctx.portId, ctx.userId);
  return NextResponse.json({ ok: true });
});

View File

@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';
import { and, eq, isNull, sql } from 'drizzle-orm';
import { withAuth } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { alerts } from '@/lib/db/schema/insights';

export const GET = withAuth(async (_req, ctx) => {
  const rows = await db
    .select({ severity: alerts.severity, count: sql<number>`count(*)::int` })
    .from(alerts)
    .where(
      and(eq(alerts.portId, ctx.portId), isNull(alerts.resolvedAt), isNull(alerts.dismissedAt)),
    )
    .groupBy(alerts.severity);
  const bySeverity = { info: 0, warning: 0, critical: 0 } as Record<string, number>;
  let total = 0;
  for (const r of rows) {
    bySeverity[r.severity] = r.count;
    total += r.count;
  }
  return NextResponse.json({ total, bySeverity });
});
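The severity fold in this route is plain data-shaping and can be sketched without the database. A standalone version (the `SeverityRow` shape is a hypothetical mirror of the Drizzle select) shows why seeding the record keeps all three keys present even when a severity has no open alerts:

```typescript
type SeverityRow = { severity: string; count: number };

// Mirrors the route's aggregation: seed the known severities at 0 so the
// payload always carries all three keys, then fold the grouped rows in.
function summarize(rows: SeverityRow[]): { total: number; bySeverity: Record<string, number> } {
  const bySeverity: Record<string, number> = { info: 0, warning: 0, critical: 0 };
  let total = 0;
  for (const r of rows) {
    bySeverity[r.severity] = r.count;
    total += r.count;
  }
  return { total, bySeverity };
}

const out = summarize([
  { severity: 'warning', count: 3 },
  { severity: 'critical', count: 1 },
]);
console.log(out.total); // 4
console.log(out.bySeverity.info); // 0 — seeded key survives with no rows
```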

View File

@@ -0,0 +1,26 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { listAlertsForPort } from '@/lib/services/alerts.service';

type AlertStatus = 'open' | 'dismissed' | 'resolved';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const url = new URL(req.url);
  const status = (url.searchParams.get('status') ?? 'open') as AlertStatus;
  const rows = await listAlertsForPort(ctx.portId, {
    includeDismissed: status !== 'open',
    includeResolved: status !== 'open',
  });
  // Filter to the requested status bucket so callers don't see overlap.
  const filtered = rows.filter((a) => {
    if (status === 'open') return !a.dismissedAt && !a.resolvedAt;
    if (status === 'dismissed') return Boolean(a.dismissedAt) && !a.resolvedAt;
    if (status === 'resolved') return Boolean(a.resolvedAt);
    return true;
  });
  return NextResponse.json({ data: filtered });
});
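The three status buckets are mutually exclusive by construction: `resolved` wins over `dismissed`, and `open` means neither timestamp is set. A standalone sketch of the predicate (row shape assumed; the real rows come from `listAlertsForPort`):

```typescript
type AlertStatus = 'open' | 'dismissed' | 'resolved';
interface AlertRow {
  dismissedAt: string | null;
  resolvedAt: string | null;
}

// Mirrors the route's bucket filter: an alert that was dismissed and later
// resolved lands only in the "resolved" bucket, never in "dismissed".
function inBucket(a: AlertRow, status: AlertStatus): boolean {
  if (status === 'open') return !a.dismissedAt && !a.resolvedAt;
  if (status === 'dismissed') return Boolean(a.dismissedAt) && !a.resolvedAt;
  return Boolean(a.resolvedAt);
}

const openRow: AlertRow = { dismissedAt: null, resolvedAt: null };
const dismissedRow: AlertRow = { dismissedAt: '2026-04-01', resolvedAt: null };
const resolvedRow: AlertRow = { dismissedAt: '2026-04-01', resolvedAt: '2026-04-02' };

console.log(inBucket(openRow, 'open')); // true
console.log(inBucket(resolvedRow, 'dismissed')); // false — resolution removes it from the dismissed bucket
```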

View File

@@ -0,0 +1,37 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import {
  ALL_RANGES,
  getLeadSourceAttribution,
  getOccupancyTimeline,
  getPipelineFunnel,
  getRevenueBreakdown,
  type DateRange,
  type MetricBase,
} from '@/lib/services/analytics.service';

const METRICS: Record<MetricBase, (portId: string, range: DateRange) => Promise<unknown>> = {
  pipeline_funnel: getPipelineFunnel,
  occupancy_timeline: getOccupancyTimeline,
  revenue_breakdown: getRevenueBreakdown,
  lead_source_attribution: getLeadSourceAttribution,
};

export const GET = withAuth(
  withPermission('reports', 'view_analytics', async (req: NextRequest, ctx) => {
    const url = new URL(req.url);
    const metric = url.searchParams.get('metric') as MetricBase | null;
    const range = (url.searchParams.get('range') ?? '30d') as DateRange;
    if (!metric || !(metric in METRICS)) {
      return NextResponse.json({ error: 'Invalid or missing metric' }, { status: 400 });
    }
    if (!ALL_RANGES.includes(range)) {
      return NextResponse.json({ error: 'Invalid range' }, { status: 400 });
    }
    const data = await METRICS[metric](ctx.portId, range);
    return NextResponse.json({ metric, range, data });
  }),
);
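The dispatch-table pattern above — the lookup table doubles as the allow-list — can be sketched standalone. The metric handlers and range values below are stand-ins, not the real service's (only the validation flow is mirrored):

```typescript
// Hypothetical mirror of the route's dispatch-table validation: unknown
// metric or range values are rejected before any handler runs.
const METRICS: Record<string, (portId: string, range: string) => string> = {
  pipeline_funnel: (p, r) => `funnel:${p}:${r}`,
  occupancy_timeline: (p, r) => `occupancy:${p}:${r}`,
};
const ALL_RANGES = ['7d', '30d', '90d']; // stand-in range set

function dispatch(metric: string | null, range: string | null): string | { error: string } {
  const r = range ?? '30d'; // same default as the route
  if (!metric || !(metric in METRICS)) return { error: 'Invalid or missing metric' };
  if (!ALL_RANGES.includes(r)) return { error: 'Invalid range' };
  return METRICS[metric]('port-a', r);
}

console.log(dispatch('pipeline_funnel', null)); // funnel:port-a:30d
console.log(dispatch('unknown_metric', '30d')); // { error: 'Invalid or missing metric' }
```

Because the dispatch table is keyed by the validated metric name, adding a new metric is a one-line table entry; there is no switch statement to keep in sync with the validation.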

View File

@@ -0,0 +1,65 @@
import { and, eq } from 'drizzle-orm';
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody, parseQuery } from '@/lib/api/route-helpers';
import { db } from '@/lib/db';
import { berths } from '@/lib/db/schema/berths';
import { NotFoundError, errorResponse } from '@/lib/errors';
import { createPending, listReservations } from '@/lib/services/berth-reservations.service';
import { createPendingSchema, listReservationsSchema } from '@/lib/validators/reservations';

// URL berthId is authoritative; make body berthId optional (ignored anyway).
const createPendingBodySchema = createPendingSchema
  .omit({ berthId: true })
  .extend({ berthId: createPendingSchema.shape.berthId.optional() });

async function assertBerthInPort(berthId: string, portId: string): Promise<void> {
  const berth = await db.query.berths.findFirst({
    where: and(eq(berths.id, berthId), eq(berths.portId, portId)),
  });
  if (!berth) throw new NotFoundError('Berth');
}

export const listHandler: RouteHandler = async (req, ctx, params) => {
  try {
    await assertBerthInPort(params.id!, ctx.portId);
    const query = parseQuery(req, listReservationsSchema);
    const result = await listReservations(ctx.portId, { ...query, berthId: params.id! });
    const { page, limit } = query;
    const totalPages = Math.ceil(result.total / limit);
    return NextResponse.json({
      data: result.data,
      pagination: {
        page,
        pageSize: limit,
        total: result.total,
        totalPages,
        hasNextPage: page < totalPages,
        hasPreviousPage: page > 1,
      },
    });
  } catch (error) {
    return errorResponse(error);
  }
};

export const createHandler: RouteHandler = async (req, ctx, params) => {
  try {
    await assertBerthInPort(params.id!, ctx.portId);
    const body = await parseBody(req, createPendingBodySchema);
    const reservation = await createPending(
      ctx.portId,
      { ...body, berthId: params.id! },
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return NextResponse.json({ data: reservation }, { status: 201 });
  } catch (error) {
    return errorResponse(error);
  }
};
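The pagination envelope that `listHandler` builds is pure arithmetic; a sketch with the same fields (the field names are taken from the route's response shape):

```typescript
// Mirrors the pagination envelope computed in listHandler.
function paginate(total: number, page: number, limit: number) {
  const totalPages = Math.ceil(total / limit);
  return {
    page,
    pageSize: limit,
    total,
    totalPages,
    hasNextPage: page < totalPages,
    hasPreviousPage: page > 1,
  };
}

const p = paginate(45, 2, 20); // 45 rows, 20 per page → 3 pages
console.log(p.totalPages); // 3
console.log(p.hasNextPage); // true
console.log(paginate(0, 1, 20).totalPages); // 0 — empty result sets yield zero pages
```

Note the empty-set case: `totalPages` is 0, so `hasNextPage` is false even on page 1, which keeps clients from paging past nothing.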

View File

@@ -1,72 +1,6 @@
-import { and, eq } from 'drizzle-orm';
-import { NextResponse } from 'next/server';
-import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
-import { parseBody, parseQuery } from '@/lib/api/route-helpers';
-import { db } from '@/lib/db';
-import { berths } from '@/lib/db/schema/berths';
-import { NotFoundError, errorResponse } from '@/lib/errors';
-import { createPending, listReservations } from '@/lib/services/berth-reservations.service';
-import { createPendingSchema, listReservationsSchema } from '@/lib/validators/reservations';
-
-// URL berthId is authoritative; make body berthId optional (ignored anyway).
-const createPendingBodySchema = createPendingSchema
-  .omit({ berthId: true })
-  .extend({ berthId: createPendingSchema.shape.berthId.optional() });
-
-async function assertBerthInPort(berthId: string, portId: string): Promise<void> {
-  const berth = await db.query.berths.findFirst({
-    where: and(eq(berths.id, berthId), eq(berths.portId, portId)),
-  });
-  if (!berth) throw new NotFoundError('Berth');
-}
-
-export const listHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    await assertBerthInPort(params.id!, ctx.portId);
-    const query = parseQuery(req, listReservationsSchema);
-    // URL berthId is authoritative; override any client-supplied value.
-    const result = await listReservations(ctx.portId, { ...query, berthId: params.id! });
-    const { page, limit } = query;
-    const totalPages = Math.ceil(result.total / limit);
-    return NextResponse.json({
-      data: result.data,
-      pagination: {
-        page,
-        pageSize: limit,
-        total: result.total,
-        totalPages,
-        hasNextPage: page < totalPages,
-        hasPreviousPage: page > 1,
-      },
-    });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-
-export const createHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    await assertBerthInPort(params.id!, ctx.portId);
-    const body = await parseBody(req, createPendingBodySchema);
-    // URL berthId is authoritative; any body-supplied berthId is ignored.
-    const reservation = await createPending(
-      ctx.portId,
-      { ...body, berthId: params.id! },
-      {
-        userId: ctx.userId,
-        portId: ctx.portId,
-        ipAddress: ctx.ipAddress,
-        userAgent: ctx.userAgent,
-      },
-    );
-    return NextResponse.json({ data: reservation }, { status: 201 });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
+import { withAuth, withPermission } from '@/lib/api/helpers';
+import { listHandler, createHandler } from './handlers';
 
 export const GET = withAuth(withPermission('reservations', 'view', listHandler));
 export const POST = withAuth(withPermission('reservations', 'create', createHandler));

View File

@@ -8,7 +8,7 @@ import { reorderWaitingListSchema } from '@/lib/validators/interests';
 import { getWaitingList, updateWaitingList } from '@/lib/services/berths.service';
 import { errorResponse, NotFoundError } from '@/lib/errors';
 import { db } from '@/lib/db';
-import { berthWaitingList } from '@/lib/db/schema/berths';
+import { berths, berthWaitingList } from '@/lib/db/schema/berths';
 
 // GET /api/v1/berths/[id]/waiting-list
 export const GET = withAuth(
@@ -47,11 +47,17 @@ export const PATCH = withAuth(
       const body = await parseBody(req, reorderWaitingListSchema);
       const berthId = params.id!;
+      // Tenant scope: refuse to reorder a foreign-port berth's waiting
+      // list. The route's URL id and the entry id are otherwise enough
+      // for any user with manage_waiting_list to mutate any tenant's
+      // queue ordering.
+      const berthRow = await db.query.berths.findFirst({
+        where: and(eq(berths.id, berthId), eq(berths.portId, ctx.portId)),
+      });
+      if (!berthRow) throw new NotFoundError('Berth');
       const entry = await db.query.berthWaitingList.findFirst({
-        where: and(
-          eq(berthWaitingList.id, body.entryId),
-          eq(berthWaitingList.berthId, berthId),
-        ),
+        where: and(eq(berthWaitingList.id, body.entryId), eq(berthWaitingList.berthId, berthId)),
       });
       if (!entry) throw new NotFoundError('Waiting list entry');

View File

@@ -1,15 +1,17 @@
 import { NextResponse } from 'next/server';
-import { withAuth } from '@/lib/api/helpers';
+import { withAuth, withPermission } from '@/lib/api/helpers';
 import { getBerthOptions } from '@/lib/services/berths.service';
 import { errorResponse } from '@/lib/errors';
 
 // GET /api/v1/berths/options — lightweight list for selects/comboboxes
-export const GET = withAuth(async (req, ctx) => {
-  try {
-    const options = await getBerthOptions(ctx.portId);
-    return NextResponse.json({ data: options });
-  } catch (error) {
-    return errorResponse(error);
-  }
-});
+export const GET = withAuth(
+  withPermission('berths', 'view', async (req, ctx) => {
+    try {
+      const options = await getBerthOptions(ctx.portId);
+      return NextResponse.json({ data: options });
+    } catch (error) {
+      return errorResponse(error);
+    }
+  }),
+);

View File

@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { updateClientAddress, removeClientAddress } from '@/lib/services/clients.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const updateAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const PATCH = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, updateAddressSchema);
      const row = await updateClientAddress(params.addressId!, params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      await removeClientAddress(params.addressId!, params.id!, ctx.portId, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listClientAddresses, addClientAddress } from '@/lib/services/clients.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const addAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx, params) => {
    try {
      const rows = await listClientAddresses(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, addAddressSchema);
      const row = await addClientAddress(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -5,10 +5,13 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { errorResponse } from '@/lib/errors';
 import { updateContact, removeContact } from '@/lib/services/clients.service';
+import { optionalCountryIsoSchema, optionalPhoneE164Schema } from '@/lib/validators/i18n';
 
 const updateContactSchema = z.object({
   channel: z.enum(['email', 'phone', 'whatsapp', 'other']).optional(),
   value: z.string().min(1).optional(),
+  valueE164: optionalPhoneE164Schema.optional(),
+  valueCountry: optionalCountryIsoSchema.optional(),
   label: z.string().optional(),
   isPrimary: z.boolean().optional(),
   notes: z.string().optional(),
@@ -18,18 +21,12 @@ export const PATCH = withAuth(
   withPermission('clients', 'edit', async (req, ctx, params) => {
     try {
       const body = await parseBody(req, updateContactSchema);
-      const contact = await updateContact(
-        params.contactId!,
-        params.id!,
-        ctx.portId,
-        body,
-        {
-          userId: ctx.userId,
-          portId: ctx.portId,
-          ipAddress: ctx.ipAddress,
-          userAgent: ctx.userAgent,
-        },
-      );
+      const contact = await updateContact(params.contactId!, params.id!, ctx.portId, body, {
+        userId: ctx.userId,
+        portId: ctx.portId,
+        ipAddress: ctx.ipAddress,
+        userAgent: ctx.userAgent,
+      });
       return NextResponse.json({ data: contact });
     } catch (error) {
       return errorResponse(error);

View File

@@ -5,10 +5,13 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { errorResponse } from '@/lib/errors';
 import { listContacts, addContact } from '@/lib/services/clients.service';
+import { optionalCountryIsoSchema, optionalPhoneE164Schema } from '@/lib/validators/i18n';
 
 const addContactSchema = z.object({
   channel: z.enum(['email', 'phone', 'whatsapp', 'other']),
   value: z.string().min(1),
+  valueE164: optionalPhoneE164Schema.optional(),
+  valueCountry: optionalCountryIsoSchema.optional(),
   label: z.string().optional(),
   isPrimary: z.boolean().optional().default(false),
   notes: z.string().optional(),

View File

@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { getExportDownloadUrl } from '@/lib/services/gdpr-export.service';

/**
 * Returns a fresh signed URL for an existing GDPR export. Staff use this
 * from the admin UI; the email path embeds its own signed URL.
 */
export const GET = withAuth(
  withPermission(
    'admin',
    'manage_settings',
    withRateLimit('exports', async (req, ctx, params) => {
      try {
        const url = await getExportDownloadUrl(params.exportId!, ctx.portId);
        return NextResponse.json({ data: { url } });
      } catch (error) {
        return errorResponse(error);
      }
    }),
  ),
);

View File

@@ -0,0 +1,49 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { requestGdprExport, listClientExports } from '@/lib/services/gdpr-export.service';

const requestSchema = z.object({
  /** When true, the bundle is emailed to the client once it finishes building. */
  emailToClient: z.boolean().optional().default(false),
  /** Optional override recipient (e.g. legal counsel). Skips the primary-email lookup. */
  emailOverride: z.string().email().optional().nullable(),
});

export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx, params) => {
    try {
      const rows = await listClientExports(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission(
    'admin',
    'manage_settings',
    withRateLimit('exports', async (req, ctx, params) => {
      try {
        const body = await parseBody(req, requestSchema);
        const result = await requestGdprExport({
          clientId: params.id!,
          portId: ctx.portId,
          requestedBy: ctx.userId,
          emailToClient: body.emailToClient,
          emailOverride: body.emailOverride ?? null,
          ipAddress: ctx.ipAddress,
          userAgent: ctx.userAgent,
        });
        return NextResponse.json({ data: result.export }, { status: 202 });
      } catch (error) {
        return errorResponse(error);
      }
    }),
  ),
);

View File

@@ -1,15 +1,17 @@
 import { NextResponse } from 'next/server';
-import { withAuth } from '@/lib/api/helpers';
+import { withAuth, withPermission } from '@/lib/api/helpers';
 import { errorResponse } from '@/lib/errors';
 import { listClientOptions } from '@/lib/services/clients.service';
 
-export const GET = withAuth(async (req, ctx) => {
-  try {
-    const search = req.nextUrl.searchParams.get('search') ?? undefined;
-    const data = await listClientOptions(ctx.portId, search);
-    return NextResponse.json({ data });
-  } catch (error) {
-    return errorResponse(error);
-  }
-});
+export const GET = withAuth(
+  withPermission('clients', 'view', async (req, ctx) => {
+    try {
+      const search = req.nextUrl.searchParams.get('search') ?? undefined;
+      const data = await listClientOptions(ctx.portId, search);
+      return NextResponse.json({ data });
+    } catch (error) {
+      return errorResponse(error);
+    }
+  }),
+);

View File

@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { updateCompanyAddress, removeCompanyAddress } from '@/lib/services/companies.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const updateAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const PATCH = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, updateAddressSchema);
      const row = await updateCompanyAddress(params.addressId!, params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      await removeCompanyAddress(params.addressId!, params.id!, ctx.portId, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listCompanyAddresses, addCompanyAddress } from '@/lib/services/companies.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const addAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const GET = withAuth(
  withPermission('companies', 'view', async (req, ctx, params) => {
    try {
      const rows = await listCompanyAddresses(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, addAddressSchema);
      const row = await addCompanyAddress(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,45 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { getCompanyById, updateCompany, archiveCompany } from '@/lib/services/companies.service';
import { updateCompanySchema } from '@/lib/validators/companies';

export const getHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const company = await getCompanyById(params.id!, ctx.portId);
    return NextResponse.json({ data: company });
  } catch (error) {
    return errorResponse(error);
  }
};

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, updateCompanySchema);
    const updated = await updateCompany(params.id!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: updated });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (req, ctx, params) => {
  try {
    await archiveCompany(params.id!, ctx.portId, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};

View File

@@ -0,0 +1,63 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { createAuditLog } from '@/lib/audit';
import { errorResponse, NotFoundError } from '@/lib/errors';
import { updateNoteSchema } from '@/lib/validators/notes';
import * as notesService from '@/lib/services/notes.service';

export const PATCH = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const companyId = params.id;
      const noteId = params.noteId;
      if (!companyId) throw new NotFoundError('Company');
      if (!noteId) throw new NotFoundError('Note');
      const body = await parseBody(req, updateNoteSchema);
      const note = await notesService.update(ctx.portId, 'companies', companyId, noteId, body);
      void createAuditLog({
        userId: ctx.userId,
        portId: ctx.portId,
        action: 'update',
        entityType: 'company_note',
        entityId: noteId,
        metadata: { companyId },
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: note });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('companies', 'edit', async (_req, ctx, params) => {
    try {
      const companyId = params.id;
      const noteId = params.noteId;
      if (!companyId) throw new NotFoundError('Company');
      if (!noteId) throw new NotFoundError('Note');
      await notesService.deleteNote(ctx.portId, 'companies', companyId, noteId);
      void createAuditLog({
        userId: ctx.userId,
        portId: ctx.portId,
        action: 'delete',
        entityType: 'company_note',
        entityId: noteId,
        metadata: { companyId },
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,47 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { createAuditLog } from '@/lib/audit';
import { errorResponse, NotFoundError } from '@/lib/errors';
import { createNoteSchema } from '@/lib/validators/notes';
import * as notesService from '@/lib/services/notes.service';

export const GET = withAuth(
  withPermission('companies', 'view', async (_req, ctx, params) => {
    try {
      const companyId = params.id;
      if (!companyId) throw new NotFoundError('Company');
      const notes = await notesService.listForEntity(ctx.portId, 'companies', companyId);
      return NextResponse.json({ data: notes });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const companyId = params.id;
      if (!companyId) throw new NotFoundError('Company');
      const body = await parseBody(req, createNoteSchema);
      const note = await notesService.create(ctx.portId, 'companies', companyId, ctx.userId, body);
      void createAuditLog({
        userId: ctx.userId,
        portId: ctx.portId,
        action: 'create',
        entityType: 'company_note',
        entityId: note.id,
        metadata: { companyId },
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: note }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -1,48 +1,6 @@
-import { NextResponse } from 'next/server';
-import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
-import { parseBody } from '@/lib/api/route-helpers';
-import { errorResponse } from '@/lib/errors';
-import { getCompanyById, updateCompany, archiveCompany } from '@/lib/services/companies.service';
-import { updateCompanySchema } from '@/lib/validators/companies';
-
-export const getHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    const company = await getCompanyById(params.id!, ctx.portId);
-    return NextResponse.json({ data: company });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-
-export const patchHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    const body = await parseBody(req, updateCompanySchema);
-    const updated = await updateCompany(params.id!, ctx.portId, body, {
-      userId: ctx.userId,
-      portId: ctx.portId,
-      ipAddress: ctx.ipAddress,
-      userAgent: ctx.userAgent,
-    });
-    return NextResponse.json({ data: updated });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-
-export const deleteHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    await archiveCompany(params.id!, ctx.portId, {
-      userId: ctx.userId,
-      portId: ctx.portId,
-      ipAddress: ctx.ipAddress,
-      userAgent: ctx.userAgent,
-    });
-    return new NextResponse(null, { status: 204 });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
+import { withAuth, withPermission } from '@/lib/api/helpers';
+import { getHandler, patchHandler, deleteHandler } from './handlers';
 
 export const GET = withAuth(withPermission('companies', 'view', getHandler));
 export const PATCH = withAuth(withPermission('companies', 'edit', patchHandler));


@@ -0,0 +1,28 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { setCompanyTags } from '@/lib/services/companies.service';

const setTagsSchema = z.object({
  tagIds: z.array(z.string()),
});

export const PUT = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const { tagIds } = await parseBody(req, setTagsSchema);
      await setCompanyTags(params.id!, ctx.portId, tagIds, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ success: true });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);


@@ -0,0 +1,44 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseQuery, parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listCompanies, createCompany } from '@/lib/services/companies.service';
import { listCompaniesSchema, createCompanySchema } from '@/lib/validators/companies';

export const listHandler: RouteHandler = async (req, ctx) => {
  try {
    const query = parseQuery(req, listCompaniesSchema);
    const result = await listCompanies(ctx.portId, query);
    const { page, limit } = query;
    const totalPages = Math.ceil(result.total / limit);
    return NextResponse.json({
      data: result.data,
      pagination: {
        page,
        pageSize: limit,
        total: result.total,
        totalPages,
        hasNextPage: page < totalPages,
        hasPreviousPage: page > 1,
      },
    });
  } catch (error) {
    return errorResponse(error);
  }
};

export const createHandler: RouteHandler = async (req, ctx) => {
  try {
    const body = await parseBody(req, createCompanySchema);
    const company = await createCompany(ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: company }, { status: 201 });
  } catch (error) {
    return errorResponse(error);
  }
};
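The pagination envelope built in listHandler is plain arithmetic over `total`, `page`, and `limit`. As a standalone sketch (the `buildPagination` helper name is hypothetical, not part of this codebase):

```typescript
// Hypothetical helper mirroring the pagination math in listHandler.
interface Pagination {
  page: number;
  pageSize: number;
  total: number;
  totalPages: number;
  hasNextPage: boolean;
  hasPreviousPage: boolean;
}

function buildPagination(total: number, page: number, limit: number): Pagination {
  // Ceiling division: 45 rows at a page size of 20 yields 3 pages.
  const totalPages = Math.ceil(total / limit);
  return {
    page,
    pageSize: limit,
    total,
    totalPages,
    hasNextPage: page < totalPages,
    hasPreviousPage: page > 1,
  };
}
```

Note that `hasNextPage` is derived from `totalPages` rather than from the returned row count, so a short final page and an exactly-full final page are handled identically.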


@@ -1,47 +1,6 @@
-import { NextResponse } from 'next/server';
-import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
-import { parseQuery, parseBody } from '@/lib/api/route-helpers';
-import { errorResponse } from '@/lib/errors';
-import { listCompanies, createCompany } from '@/lib/services/companies.service';
-import { listCompaniesSchema, createCompanySchema } from '@/lib/validators/companies';
-export const listHandler: RouteHandler = async (req, ctx) => {
-  try {
-    const query = parseQuery(req, listCompaniesSchema);
-    const result = await listCompanies(ctx.portId, query);
-    const { page, limit } = query;
-    const totalPages = Math.ceil(result.total / limit);
-    return NextResponse.json({
-      data: result.data,
-      pagination: {
-        page,
-        pageSize: limit,
-        total: result.total,
-        totalPages,
-        hasNextPage: page < totalPages,
-        hasPreviousPage: page > 1,
-      },
-    });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-export const createHandler: RouteHandler = async (req, ctx) => {
-  try {
-    const body = await parseBody(req, createCompanySchema);
-    const company = await createCompany(ctx.portId, body, {
-      userId: ctx.userId,
-      portId: ctx.portId,
-      ipAddress: ctx.ipAddress,
-      userAgent: ctx.userAgent,
-    });
-    return NextResponse.json({ data: company }, { status: 201 });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
+import { withAuth, withPermission } from '@/lib/api/helpers';
+import { listHandler, createHandler } from './handlers';
 export const GET = withAuth(withPermission('companies', 'view', listHandler));
 export const POST = withAuth(withPermission('companies', 'create', createHandler));
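The tenancy rule running through these routes — every service call resolves rows by id *and* the caller's portId, never by id alone — can be sketched as a small pure guard. All names here are illustrative, not the repo's actual helpers:

```typescript
// Hypothetical sketch of the port-scoping guard pattern used by the
// service layer; assumes every tenant-owned row carries a portId column.
class NotFoundError extends Error {
  constructor(entity: string) {
    super(`${entity} not found`);
  }
}

interface PortScoped {
  id: string;
  portId: string;
}

// Returns the row only when it belongs to the caller's port. A foreign-port
// id is treated exactly like a missing one, so callers cannot distinguish
// "does not exist" from "exists in another tenant".
function assertInPort<T extends PortScoped>(
  row: T | undefined,
  portId: string,
  entity: string,
): T {
  if (!row || row.portId !== portId) throw new NotFoundError(entity);
  return row;
}
```

Responding with a 404 rather than a 403 for cross-tenant ids matches the intent of the fixes described in the commit messages: the foreign row never surfaces, and its existence is not leaked to UUID probing.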
