86 Commits

Author SHA1 Message Date
Matt Ciaccio
21868ee5fc feat(berths,seed): polish detail display + prune ports to Port Nimara + Amador
Berth detail (src/components/berths/berth-tabs.tsx):
- Numeric display polish, exposed by the new NocoDB-sourced seed:
  - Power capacity now renders with kW unit (e.g. "330 kW")
  - Voltage now renders with V unit (e.g. "480 V")
  - All metric/imperial values truncated to <= 2 decimals
    (was: "62.999112 m" -> now: "62.99 m")
  - Nominal Boat Size shows full ft + m pair (was: ft only)

Seed ports (src/lib/db/seed.ts):
- Drop Marina Azzurra and Harbor Royale; install now seeds only:
  - Port Nimara  (the real install)
  - Port Amador  (secondary, for multi-tenant isolation tests / Panama
                  scaffolding)
- Existing dev DBs are not touched; this only affects fresh `pnpm db:seed`
  runs. Users wanting to migrate should drop existing rows in the obsolete
  ports manually before re-seeding.

Verification:
- lint clean, tsc unchanged from baseline (36 pre-existing errors), 858/858
  vitest passing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 15:59:36 +02:00
Matt Ciaccio
c7ab816c99 feat(seed): replace 12 hand-rolled berths with 117-row NocoDB snapshot
The old seed only had 12 berths with made-up area names ("North Pier",
"Central Basin", etc.) and placeholder dimensions. Devs now get the real
117 berths exported from the legacy NocoDB Berths table — every editable
column populated with real production values.

What's in the snapshot (src/lib/db/seed-data/berths.json):
- 117 berths total (61 available / 45 under_offer / 11 sold)
- Areas A through E (matches NocoDB single-select)
- All numeric fields filled: length / width / draft (ft + m), water depth,
  nominal boat size, power capacity (kW), voltage (V)
- All NocoDB single-selects filled where present: side pontoon,
  mooring type, cleat/bollard type+capacity, access
- Bow facing, status_override_mode, berth_approved carried forward as-is
- Status normalized to lowercase snake_case ("Under Offer" -> "under_offer")
- Mooring numbers reformatted A1 -> A-01 to keep the existing "Letter-NN"
  convention used elsewhere in the codebase
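Both normalizations are simple string transforms. A minimal sketch (helper names are illustrative, not the actual seed code):

```typescript
// Hypothetical sketches of the two normalizations described above.

// "Under Offer" -> "under_offer": lowercase, whitespace/hyphens -> underscores.
function normalizeStatus(raw: string): string {
  return raw.trim().toLowerCase().replace(/[\s-]+/g, "_");
}

// "A1" -> "A-01": split the area letter from the number, zero-pad to 2 digits.
function formatMooringNumber(raw: string): string {
  const match = raw.match(/^([A-Za-z]+)-?(\d+)$/);
  if (!match) return raw; // leave unrecognized values untouched
  const [, letter, digits] = match;
  return `${letter.toUpperCase()}-${digits.padStart(2, "0")}`;
}
```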

Pre-sorted to preserve seed semantics:
  idx 0..4   -> 5 available  (small)   -- "open" / "details_sent" interests
  idx 5..9   -> 5 under_offer (medium) -- "eoi_signed" / "deposit" / "contract"
  idx 10..11 -> 2 sold (large)         -- "completed" interests
This means existing interest/reservation seeds that index berthRows[0..11]
keep their semantic alignment without code changes.

End-to-end verified by clearing Marina Azzurra and re-seeding:
  Port "Marina Azzurra" -- 117 berths, 8 clients, 3 companies, 12 yachts,
                           15 interests, 8 reservations

Future devs running `pnpm db:seed` on a fresh DB will now get realistic
berth data automatically.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 15:41:12 +02:00
Matt Ciaccio
e40b6c3d99 feat(berths): full NocoDB field parity, numeric types, sales edit access
Aligns the berths schema with the 117 production rows in NocoDB and exposes
every field for editing via the BerthForm sheet.

Schema (migration 0020):
- power_capacity / voltage / nominal_boat_size / nominal_boat_size_m: text -> numeric
  (NocoDB stores plain numbers; text was the wrong shape and broke filter/sort)
- ADD status_override_mode text (1/117 legacy rows have a value; carried
  forward for parity but not yet wired into the UI)
- USING NULLIF(TRIM(...), '')::numeric so legacy whitespace and empty
  strings convert cleanly
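The USING clause maps empty and whitespace-only strings to NULL before the cast; an app-side TypeScript analogue of the same coercion (illustrative, not code from this commit) would be:

```typescript
// Mirrors NULLIF(TRIM(value), '')::numeric: blank-ish strings become null,
// anything else must parse as a number.
function toNumericOrNull(value: string | null): number | null {
  if (value == null) return null;
  const trimmed = value.trim();
  if (trimmed === "") return null;
  const n = Number(trimmed);
  if (Number.isNaN(n)) throw new Error(`not numeric: ${JSON.stringify(value)}`);
  return n;
}
```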

Validator + service:
- updateBerthSchema / createBerthSchema use z.coerce.number() for the
  four numeric fields
- berths.service stringifies numeric values for Drizzle's numeric type

Form (src/components/berths/berth-form.tsx):
- adds: nominal boat size (ft/m), water depth (ft/m) + "is minimum" flag,
  side pontoon, cleat type/capacity, bollard type/capacity, bow facing
- converts to typed selects (with NocoDB option lists in src/lib/constants):
  area, side pontoon, mooring type, cleat type/capacity, bollard type/capacity,
  access
- power capacity / voltage become numeric inputs (with kW / V hints)

Permissions (seed.ts + dev DB):
- sales_manager and sales_agent: berths.edit false -> true
  ("sales will sometimes have to update these and I cannot be the only one")
- super_admin / director already had it; viewer stays read-only
- dev DB updated in-place via UPDATE roles ... jsonb_set

Verification:
- pnpm exec vitest run: 858/858 passing
- pnpm exec tsc --noEmit: same 36 errors as baseline (all pre-existing
  on feat/mobile-foundation, none introduced)
- lint clean

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 15:30:32 +02:00
Matt Ciaccio
e2398099c4 test(audit-fixes): cover the new permission and webhook surfaces
Adds integration coverage for the routes / handlers shipped in the
preceding audit-fix commits, plus refactors two route files to expose
inner handlers from a sibling `handlers.ts` (the pattern used elsewhere
in `src/app/api/v1`) so tests can call them without the
`withAuth(withPermission(…))` wrapper.

New tests (18 cases across 4 files):
- `tests/integration/portal-auth.test.ts` (6) — verifyPortalToken
  rejects tokens missing `aud: 'portal'` or `iss: 'pn-crm'`, with the
  wrong audience (CRM-session-replay shape) or wrong issuer, plus a
  round-trip happy path. Locks in the portal-vs-CRM token isolation.
- `tests/integration/api/saved-views-ownership.test.ts` (6) — patch
  and delete handlers return 403 for a different user, 404 for an
  unknown id or cross-port id, and 200 for the owner. Ownership is
  enforced at the route layer regardless of the service's internal
  filtering.
- `tests/integration/api/berth-reservations-list.test.ts` (3) — the
  new global list returns rows for the current port only and honors
  pagination params. A reservation in a different port never leaks.
- `tests/integration/documents-expired-webhook.test.ts` (3) —
  handleDocumentExpired flips the document to `expired`, also flips
  the linked interest's `eoiStatus`, writes a `documentEvents` row,
  and is a no-op (not a throw) when the documensoId is unknown.

Refactors:
- `src/app/api/v1/saved-views/[id]/route.ts` extracts `patchHandler` /
  `deleteHandler` (and the shared `assertViewOwner`) into
  `handlers.ts`. The route file is now a 4-line `withAuth(handler)`
  wrapper.
- `src/app/api/v1/berth-reservations/route.ts` extracts `listHandler`
  similarly. Tests import directly from `handlers.ts`.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 23:17:08 +02:00
Matt Ciaccio
d364b09885 fix(realtime): keep socket through reconnects, stop re-subscribe storm
Two correctness bugs in the real-time stack — both silent failures, both
session-wide once they trigger.

(1) `SocketProvider` was setting the React context to null on every
`disconnect` event. socket.io's built-in reconnection re-establishes the
underlying transport and replays handlers, but the React tree had
already lost its reference to the socket — so every `useSocket()`
consumer saw null until a session/port change forced a remount. Effect:
after the first transient drop (laptop sleep, wifi blip, server
restart), realtime invalidation and toasts went dead session-wide with
no user-visible signal.

Fix: keep the socket reference stable for the lifetime of the
session+port, and surface a separate `isConnected` boolean for any UI
that wants to render an offline indicator. Exposed as a new
`useIsSocketConnected()` hook; `useSocket()` signature is unchanged.

(2) `useRealtimeInvalidation` captured `eventMap` as a useEffect
dependency. Every caller passes a fresh `{ ... }` object literal on each
render, so the effect re-ran every render → `socket.off`/`socket.on`
storm on pages with many subscribed events.

Fix: extract the subscription logic into a pure helper
(`realtime-invalidation-core.ts`, JSX-free for vitest). The hook now
keeps the latest map in a ref and only re-subscribes when the SET of
event names changes (joined-keys signature, not object identity). The
handler reads `ref.current` at fire time, so callers still see fresh
queryKey lists without re-binding.

Helper is unit-tested with a stub socket: registration count,
fire-time map lookup, cleanup deregistration, missing-event safety.
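The joined-keys approach can be sketched as follows (stub types; names are illustrative, not the actual hook code):

```typescript
type Handler = (payload: unknown) => void;

interface StubSocket {
  on(event: string, h: Handler): void;
  off(event: string, h: Handler): void;
}

type EventMap = Record<string, Handler>;

// Re-binds socket listeners only when the SET of event names changes;
// the latest map is read through a ref at fire time.
function createSubscriber(socket: StubSocket) {
  const ref: { current: EventMap } = { current: {} };
  let signature = "";
  let bound: Array<[string, Handler]> = [];

  return function subscribe(map: EventMap): void {
    ref.current = map; // always keep the latest map for fire-time lookup
    const nextSignature = Object.keys(map).sort().join(",");
    if (nextSignature === signature) return; // same event set: no re-bind

    for (const [event, h] of bound) socket.off(event, h);
    bound = Object.keys(map).map((event) => {
      const h: Handler = (payload) => ref.current[event]?.(payload);
      socket.on(event, h);
      return [event, h];
    });
    signature = nextSignature;
  };
}
```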

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 23:11:52 +02:00
Matt Ciaccio
57a099acc4 fix(ui): humanize enum labels, format dates, resolve actor names, loading skeleton
- Documents hub signer status now renders via a label map (`Pending`,
  `Signed`, `Declined`, …) instead of the raw lowercase enum value.
- Invoice detail formats `dueDate` and `paymentDate` as `MMM d, yyyy`
  via `date-fns` instead of leaking raw `2025-03-14` ISO strings, and
  swaps the "Payment Method" free-text input for a `Select` of labelled
  options (`Bank transfer`, `Credit card`, …) so we never store
  `bank_transfer` from a hand-typed field again.
- Interest tabs `MilestoneSection` status badge uses a `humanizeStatus`
  helper so values like `waiting_for_signatures` show as
  `Waiting For Signatures` (correctly title-cased) instead of being a
  lower-snake-case fragment inside an ALL-CAPS pill.
- `OUTCOME_BADGE` in the interest header now has a fall-through that
  renders any unknown outcome as a closed-state badge, preventing a
  closed interest from looking open just because its enum was added
  upstream without a matching label entry.
- Interest timeline route joins the `user` table and returns
  `userName` alongside `userId`; the client renders the resolved name
  instead of a 36-char UUID. Falls back to `'a teammate'` if the user
  row was deleted.
- Invoice "New / Step 3 — Review" replaces the truncated UUID display
  with a server-resolved client/company name via a small `useQuery`,
  so users can confirm they picked the right billing entity before
  submitting.
- New `loading.tsx` for client detail renders a header / tab strip /
  card skeleton during the server-component / initial-query window
  that previously flashed empty.
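A humanizeStatus helper like the one referenced above can be as small as (sketch; the committed implementation may differ):

```typescript
// "waiting_for_signatures" -> "Waiting For Signatures"
function humanizeStatus(status: string): string {
  return status
    .split("_")
    .map((w) => (w ? w[0].toUpperCase() + w.slice(1) : w))
    .join(" ");
}
```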

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 23:01:35 +02:00
Matt Ciaccio
a391934b73 feat(marina): end-reservation UI + global list, yacht tabs, dashboard distinct count
- End-reservation: API handler existed but had no UI surface. Adds an
  "End reservation" button + date dialog on the reservation detail page,
  visible only when status is `active`.
- New port-scoped `GET /api/v1/berth-reservations` list endpoint and
  `[portSlug]/berth-reservations` page so users can see all reservations
  across all berths from one place (was 404).
- Berths "Edit" menu pushed `/berths/{id}?edit=true` but the detail page
  never read the param — it now auto-opens the edit sheet on mount and
  strips `edit` from the URL.
- Reservation detail no longer shows raw 8-char UUIDs for Berth / Yacht
  / Client; reuses the lazy-fetching link components from the list view.
- Yacht "Interests" and "Reservations" tabs replaced their "Coming soon"
  stubs with real lists fetched from the existing service routes.
- Dashboard "Pipeline Value" KPI used `select(berthId, price)` and
  summed per active interest, so a berth with three open interests was
  counted three times. Switched to `selectDistinct(berthId, price)`.
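The distinct-sum semantics look like this in plain TypeScript (the actual fix uses Drizzle's selectDistinct at the query layer; this is just an illustrative equivalent):

```typescript
// Sum each berth's price once, no matter how many open interests point at it.
function pipelineValue(rows: Array<{ berthId: string; price: number }>): number {
  const byBerth = new Map<string, number>();
  for (const { berthId, price } of rows) byBerth.set(berthId, price);
  let total = 0;
  for (const price of byBerth.values()) total += price;
  return total;
}
```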

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 23:01:15 +02:00
Matt Ciaccio
e3e0e69c04 fix(documenso): expired event, real signer emails, query invalidation, double-fire
- Wire the `DOCUMENT_EXPIRED` webhook event to `handleDocumentExpired`.
  Previously the handler existed but was never called, leaving expired
  EOIs stuck in `sent` / `partially_signed` forever.
- `sendForSigning` now resolves real port-configured signer emails via
  `getPortEoiSigners(portId)` instead of fabricating
  `developer@{slug}.com` / `sales@{slug}.com`. The Documenso-template
  pathway was already using these; the upload-PDF pathway now matches.
- `handleRecipientSigned` logs a warning when the email match returns
  zero rows so a misconfigured signer isn't a silent no-op.
- `handleDocumentCompleted` skips berth-rule re-evaluation when the
  interest is already at or past `eoi_signed`, preventing a double-fire
  when `DOCUMENT_SIGNED` and `DOCUMENT_COMPLETED` arrive close together.
- EOI generate dialog now invalidates by predicate (any queryKey
  starting with `'documents'`) so the Documents tab and hub counts
  refresh after generation, instead of missing because the actual
  query key shape didn't match the targeted invalidation.
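Predicate invalidation matches on key prefix rather than exact key shape. A sketch of such a predicate (react-query's invalidateQueries accepts a predicate option; the helper name here is illustrative):

```typescript
// Matches any query whose key starts with 'documents', regardless of the
// rest of the key shape (tab, filters, pagination, ...).
function matchesDocuments(queryKey: readonly unknown[]): boolean {
  return queryKey.length > 0 && queryKey[0] === "documents";
}
```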

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 23:00:58 +02:00
Matt Ciaccio
6af2ac9680 fix(auth): harden admin gate, X-Port-Id, portal JWT, saved-views
- Add server-side `<admin>/layout.tsx` that redirects non-super-admins to
  `/[portSlug]/dashboard`. Closes the gap where any authed user could
  guess the URL and reach Users / Roles / Audit Log / Backup.
- `withAuth` super-admin branch now 404s when the requested portId does
  not match a real port row, preventing a compromised super-admin
  session from operating against a fabricated portId.
- Portal JWTs now carry `aud: 'portal'` + `iss: 'pn-crm'` claims and
  `verifyPortalToken` requires both, so a portal token can no longer be
  replayed against the CRM session path or vice versa. In-flight tokens
  (≤24h) will be invalidated once on deploy.
- `saved-views/[id]` PATCH and DELETE now do an explicit ownership
  check before the service call, returning 403 instead of relying on
  the service's internal userId filter.
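The portal-claim requirement reduces to a check on the decoded payload (illustrative sketch only; real code must verify the JWT signature via a JWT library before trusting any claim):

```typescript
interface TokenPayload {
  aud?: string;
  iss?: string;
  [claim: string]: unknown;
}

// Both claims must match exactly; a CRM-shaped token fails either way.
function hasPortalClaims(payload: TokenPayload): boolean {
  return payload.aud === "portal" && payload.iss === "pn-crm";
}
```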

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 23:00:42 +02:00
Matt Ciaccio
a767652d74 feat(sales-ux): triage signals, reminders, realtime toasts, mobile FAB
Sales-CRM workflow batch — closes audit recommendations #2, #3, #4, #6,
#7, #8, #9, #10, #13, #15. Skips #11 (My-pipeline filter — needs a real
assignee column on interests, defer until ownership model lands) and #12
(keyboard shortcuts — explicit user call).

  Interest list (the rep's main triage surface):

    - Last activity column replaces Created (sortable by
      dateLastContact). Postgres NULLs-last on DESC means
      never-contacted leads sort to the bottom — exactly the right
      triage default.
    - Comment-icon next to client name when notesCount > 0, with a
      tooltip showing the count. Cheap, glanceable signal that the
      lead has correspondence to peek at.
    - Urgency badges under stage when criteria fire: "Silent Nd"
      for mid-funnel interests with no contact in 7+ days,
      "EOI Nd" for EOIs awaiting signature 14+ days, "Deposit Nd"
      for eoi_signed interests with no deposit after 21 days.
      Pure derived — no extra fetch, computed from the dates the
      row already returns.
    - Bulk select checkbox column with bulk-archive (existing
      DataTable.bulkActions API; just wired with a confirm-dialog
      and a Promise.all fan-out).
    - Mobile FAB (+) for new interest, anchored above the bottom-tab
      bar with safe-area inset awareness.

    All four signals mirrored on the mobile InterestCard (comment
    icon, urgency badges, last-activity footer).
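    The urgency derivation might look roughly like this (simplified to a
    single days-since-contact input; the real logic reads several date
    columns, and the stage names here are assumptions):

```typescript
// Returns a badge label when a triage criterion fires, else null.
// Thresholds (7/14/21 days) are from the commit message.
function urgencyBadge(stage: string, lastContactDaysAgo: number): string | null {
  if (stage === "eoi_sent" && lastContactDaysAgo >= 14)
    return `EOI ${lastContactDaysAgo}d`;
  if (stage === "eoi_signed" && lastContactDaysAgo >= 21)
    return `Deposit ${lastContactDaysAgo}d`;
  if (lastContactDaysAgo >= 7) return `Silent ${lastContactDaysAgo}d`;
  return null;
}
```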

  Interest detail:

    - Reminder bell badge in the header showing pending/snoozed
      reminder count linked to the interest. Surfaced via
      getInterestById's new `activeReminderCount`.
    - "Latest note" teaser on the Overview tab — truncated 3-line
      preview of the most recent threaded note + relative time +
      "View all" link to the Notes tab. Saves a click for the
      common "what was discussed last?" peek.
    - Color-block swatches in InlineStagePicker dropdown (rounded-sm
      mini-bars in the stage's progressive saturation color, replacing
      the previous tiny dots). Reads as a visual scan instead of a
      list.

  Dashboard:

    - MyRemindersRail on the right sidebar above the existing
      AlertRail. Shows pending+snoozed reminders for the current
      user (overdue first), each with priority pill, relative due
      time, and click-through to the linked interest/client/berth.

  Berth detail:

    - BerthInterestPulse card at the top of the Overview tab,
      replacing the old "buried in tab" pattern. Shows up to 5
      active interests with avatar, stage pill, urgency badges, and
      last-activity. Mirrors the old Nuxt CRM's beloved "Interested
      Parties" panel but with the new triage signals.

  Realtime toasts:

    - New <RealtimeToasts /> mounted inside SocketProvider in the
      dashboard layout. Subscribes to interest:stageChanged,
      document:completed, document:signer:signed, and
      interest:outcomeSet — fires sonner toasts so reps watching any
      page learn about pipeline events without refreshing.

  Service layer:

    - listInterests: notesCount per row (left join + count + groupBy).
    - getInterestById: clientPrimaryPhone + clientPrimaryPhoneE164
      (for the Email/Call/WhatsApp buttons added last commit; phone
      pieces were missing), notesCount, recentNote, activeReminderCount.
    - sortColumn switch handles 'dateLastContact' explicitly; default
      stays 'updatedAt'.

tsc clean. vitest 835/835 pass. ESLint clean on every file touched.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 04:09:51 +02:00
Matt Ciaccio
c824b2df12 feat(interests): Email / Call / WhatsApp deep-links on interest header
The interest detail is the rep's workbench — but until now, calling or
emailing the lead meant navigating away to the client page first. Surface
the same Email / Call / WhatsApp affordances that already live on the
client header right where the work is happening.

  - getInterestById: extended to also resolve the linked client's primary
    phone (display value + canonical E.164 form for wa.me).
    `clientPrimaryEmail` is the same column we surfaced earlier for the
    EOI prereq checklist; this commit just adds the phone columns
    alongside it.

  - InterestDetailHeader: new contact-actions row tucked under the meta
    line. Each button is asChild over a real <a href> so middle-click,
    Cmd-click, and screen-readers behave correctly. Renders only the
    buttons whose underlying contact channel is present (Email-only when
    no phone is on file, etc.). The whole row is hidden when the client
    has no contacts at all.

  - WhatsApp number prefers the E.164 form; falls back to digits-stripped
    display value when the canonical form is missing.
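  The preference order is a two-step fallback; a sketch (wa.me links take
  the number without the leading '+'; the helper name is illustrative):

```typescript
// Prefer the canonical E.164 number; fall back to stripping non-digits
// from the display value; null when neither yields anything usable.
function waTarget(e164: string | null, display: string | null): string | null {
  if (e164) return e164.replace(/^\+/, "");
  if (display) {
    const digits = display.replace(/\D/g, "");
    return digits || null;
  }
  return null;
}
```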

tsc clean. vitest 835/835 pass. ESLint clean on every file touched.

Closes audit recommendation #1 (top-of-list — biggest sales-workflow
win per click saved).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 03:33:13 +02:00
Matt Ciaccio
d197f8b321 feat(eoi): align prerequisites with EOI document structure
Match the gate to the actual EOI's structure (Section 2 vs Section 3) so
the rep can generate the document the moment they have what they need —
and not before.

  Required (Section 2 — top paragraph):
    - Client name
    - Client primary email
    - Client primary address

  Optional (Section 3 — left blank when absent):
    - Linked yacht (name, dimensions)
    - Linked berth (mooring number)

Previously the dialog blocked generation unless yacht AND berth were both
linked, which was overzealous — early-stage EOIs are routinely sent before
a specific berth is pinned down.

  - eoi-context.ts: yacht and berth are now nullable in the returned
    context. The hard ValidationError is now driven by the EOI's Section
    2 fields (name/email/address) rather than yacht/berth presence. The
    owner block falls back to the interest's client when no yacht is
    linked, so signing parties remain resolvable.

  - documenso-payload.ts + fill-eoi-form.ts: Section 3 form values
    render as empty strings when yacht or berth are absent, so the
    rendered PDF leaves those template inputs blank.

  - document-templates.ts: yacht.* and berth.* tokens fall back to
    empty strings; the legacy-fallback catch handler also recognises
    the new "missing required client details" error.

  - interests.service.ts: getInterestById now also returns
    `clientPrimaryEmail` and `clientHasAddress` so the Documents tab
    can compute the EOI prerequisites checklist client-side without an
    extra fetch.

  - eoi-generate-dialog.tsx: prereqs split into two groups visually —
    Required (with red ✗ when missing) and Optional (with grey – when
    absent). The Generate button only requires the Required block to
    pass. A small amber banner surfaces when Required is incomplete so
    the rep knows where to add the missing data.
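The Required-block check reduces to the three Section 2 fields. A sketch (the field names clientPrimaryEmail / clientHasAddress come from this commit; the helper shape is illustrative):

```typescript
interface EoiPrereqs {
  clientName: string | null;
  clientPrimaryEmail: string | null;
  clientHasAddress: boolean;
  yachtId: string | null; // optional -- Section 3 renders blank when absent
  berthId: string | null; // optional -- Section 3 renders blank when absent
}

// Generation is gated only on the Required (Section 2) fields.
function canGenerateEoi(p: EoiPrereqs): { ok: boolean; missing: string[] } {
  const missing: string[] = [];
  if (!p.clientName) missing.push("Client name");
  if (!p.clientPrimaryEmail) missing.push("Client primary email");
  if (!p.clientHasAddress) missing.push("Client primary address");
  return { ok: missing.length === 0, missing };
}
```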

Tests: 835/835 pass. Replaces the obsolete "throws on missing yacht/
berth" tests with parity coverage for the new behaviour ("builds a
valid context when yacht/berth missing", "throws when client email/
address missing"). Adds a payload test for the empty-Section-3 case.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 03:11:14 +02:00
Matt Ciaccio
76a7387dcc fix(ux): batch UX audit fixes across spine pages
Comprehensive audit findings rolled up into one pass.

Bugs:

  - dialog.tsx — sm-breakpoint centering classes (sm:left-[50%] /
    sm:top-[50%]) were being silently stripped by tailwind-merge because
    the base inset-0 + sm:inset-auto pair counted as a conflict. Replaced
    with explicit per-side utilities (top-0 right-0 bottom-0 left-0 +
    sm:right-auto sm:bottom-auto). Every Dialog instance now centers
    correctly on desktop. (Affected 16 dialog consumers.)

  - interest-documents-tab.tsx — useQuery shared the queryKey
    ['interests', interestId] with the parent InterestDetail's query but
    returned a different shape ({ data: ... } envelope vs unwrapped).
    They clobbered each other's cache on tab mount, degenerating the
    parent header to "Unknown Client" / "Open" briefly. Unified the
    queryFn shape so the cache stays consistent.

  - interest-tabs.tsx — milestone steps now derive done-state from
    PIPELINE_STAGES.indexOf(currentStage) >= step.advanceStage_idx as
    well as from the date stamp. Stage truth > date truth. Seeded /
    imported interests that arrived past `open` without per-step dates
    now correctly show their milestone steps as checked.

  - interest-detail.tsx — wires useMobileChrome so the mobile topbar
    shows the client name instead of the interest UUID.

  - interest-documents-tab.tsx — empty state restructured to a centered
    "No documents yet — Generate EOI" CTA card instead of a small
    primary button floating in the corner.

  - timeline/route.ts — synthesizes a "Created at <stage>" event when
    no audit-log rows exist for the interest, so the Activity tab
    isn't empty for seeded interests.

  - lead-source-chart.tsx — pie radii switched from fixed 90px/50px
    to "70%"/"40%" so the pie scales with the container instead of
    being clipped at narrow widths; reserved 40px for the legend.

Visual / clarity:

  - interest-detail-header.tsx — Won/Lost rendered as branded text
    buttons on desktop ("Mark won", "Close as lost") and icon-only on
    mobile via `hidden sm:inline`. Edit/Archive stay icon-only. Reopen
    promoted to a labeled button when the interest is closed. Added
    "Last contact Xd ago" to the meta row.

  - detail-header-strip.tsx — py-4 → py-3 (tighter strip).

  - interest-tabs.tsx — milestone cards: the next pending milestone
    gets a brand-blue ring + "NEXT" pill so the user can see at a
    glance which lifecycle to act on. Its primary action gets the
    filled button variant.

  - interest-tabs.tsx — Deposit milestone: invoice flow promoted to
    primary CTA ("Create deposit invoice"), manual stage advance
    demoted to a small text link ("Mark received manually"). Reflects
    the actual recommended path now that recordPayment auto-advances
    on payment.

  - inline-editable-field.tsx — pencil affordance shown faintly
    (opacity-20) at rest so users discover that fields are editable
    without having to hover-test every label. Lifts to opacity-60 on
    hover.

  - constants.ts — STAGE_SHORT_LABELS map for cramped contexts;
    pipeline-chart.tsx + pipeline-funnel-chart.tsx use them on mobile
    via useIsMobile, so the rotated 9-stage axis isn't a wall of
    overlap on a 393px screen.

  - client-pipeline-summary.tsx — StageStepper rebuilt as a single
    segmented progress bar instead of 9 micro-dots + connectors that
    rendered inconsistently at tight widths. Each stage is an equal
    slice that lights up as the interest reaches it; tooltips on hover
    give the full stage name. Also dropped a pre-existing dead `br`
    variable.

  - dashboard empty states — Lead Source, Revenue Breakdown, Pipeline
    Funnel, and Recent Activity now have helpful descriptions explaining
    what populates them, instead of bare "No interests in range".

  - use-paginated-query.ts — reuses `&` when the endpoint already has
    `?`, so callers like the documents hub don't generate
    `…?tab=eoi_queue&signatureOnly=true?page=1&limit=25` (which the API
    rejected as 400). Caught while testing the now-removed EOI route
    but applies broadly.

tsc clean. vitest 832/832 pass. eslint 0 errors (down from 1
pre-existing) on every file touched.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 01:24:15 +02:00
Matt Ciaccio
868b1f40c0 fix(nav): drop dedicated EOI route + alerts sidebar entry, fix paginated-URL bug
Trimmed two surfaces that didn't earn their nav weight:

  - The /[port]/documents/eoi route added in the previous commit was
    redundant with the per-interest EOI status milestones already on
    the interest detail and the existing eoi_queue tab inside the
    Documents hub. Removed the route + the "EOI queue" sidebar entry.
  - The Alerts sidebar entry was promoting a mostly-empty page that
    duplicated the dashboard alert rail. Dropped the entry; the
    /[port]/alerts route stays accessible via the dashboard rail's
    "View all" link and the topbar bell, which is enough for the
    audit-trail use case.

While testing the EOI tab, found and fixed a real bug: usePaginatedQuery
was producing malformed URLs like `…?tab=eoi_queue&signatureOnly=true?page=1&limit=25`
(two `?` separators) when the endpoint string already carried query
params. The API rejected those with 400, so the EOI tab in the
documents hub was silently broken. The hook now uses `&` when the
endpoint already contains a `?`.
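The separator choice is a one-liner; a sketch of the corrected behaviour (illustrative helper, not the hook's actual code):

```typescript
// Use '&' when the endpoint already carries query params, '?' otherwise.
function appendParams(endpoint: string, query: string): string {
  const sep = endpoint.includes("?") ? "&" : "?";
  return `${endpoint}${sep}${query}`;
}
```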

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 00:30:27 +02:00
Matt Ciaccio
dbbd03fd22 feat(sales): admin-configurable EOI signers + richer timeline events
1. Per-port EOI signer config

     - New `eoi_signers` system_settings key (JSON: { developer, approver },
       each `{ name, email }`). Settings UI exposes it under Admin → Settings.
     - getPortEoiSigners(portId) reads the setting with a typed validator;
       falls back to the legacy David Mizrahi / Abbie May defaults if the
       row is missing or malformed (so older ports keep working until an
       admin saves a value).
     - Both EOI generation pathways now read from the helper instead of
       hardcoded constants:
         * documenso-template path (generateAndSignViaDocumensoTemplate)
         * in-app PDF-fill path (generateAndSignViaInApp)

  2. Timeline upgrades

     The interest detail Activity tab now distinguishes the new automation
     events that arrived with sessions 1+2:

     - Stage auto-advances (userId='system') get a small "Auto" pill and
       carry their reason into the description (e.g. "Stage advanced to
       EOI Signed (auto-advanced — EOI signed via Documenso)").
     - outcome_set events show "Marked as Won" / "Marked as Lost — went
       to another marina" with optional reason; trophy/X icons.
     - outcome_cleared events show "Reopened to {stage}" with a refresh
       icon.
     - Document events humanized: "Document 'X' fully signed" instead
       of "Document X: completed".
     - Stage labels run through stageLabel() so the timeline shows the
       human label, not the enum key.
     - Timestamps switched to relative-time with full-date tooltip.
     - "by system" is rendered plainly (no longer the literal user-id).

tsc clean. vitest 832/832 pass.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 00:19:55 +02:00
Matt Ciaccio
ba5fb6db5e feat(sales): EOI queue route + invoice→deposit auto-advance + won/lost outcomes
Three independent strengthenings of the sales spine that the prior coherence
sweep made it possible to do cleanly.

  1. EOI queue page

     - Sidebar entry under Documents → "EOI queue".
     - Route /[port]/documents/eoi renders DocumentsHub with the existing
       eoi_queue tab pre-selected (filters in-flight EOIs only).
     - .gitignore: tightened root-only `eoi/` ignore so the documents/eoi
       route is no longer silently excluded.

  2. Invoice ↔ deposit link

     - invoices.interestId (FK, ON DELETE SET NULL) + invoices.kind
       ('general' | 'deposit'). Indexed on (port_id, interest_id).
     - createInvoiceSchema requires interestId when kind === 'deposit';
       the service validates the linked interest belongs to the same port
       before insert.
     - recordPayment auto-advances pipelineStage to deposit_10pct (via
       advanceStageIfBehind) when a paid invoice is kind=deposit and has
       an interestId. No-op if the interest is already further along.
     - "Create deposit invoice" link added to the Deposit milestone on the
       interest detail. Links to /invoices/new?interestId=…&kind=deposit;
       the form prefills the billing entity from the linked interest's
       client and shows a context banner.

  3. Won / lost terminal outcomes

     - interests.outcome ('won' | 'lost_other_marina' | 'lost_unqualified'
       | 'lost_no_response' | 'cancelled') + outcomeReason text +
       outcomeAt timestamp. Indexed on (port_id, outcome).
     - setInterestOutcome / clearInterestOutcome services + POST/DELETE
       /api/v1/interests/:id/outcome endpoints (gated by change_stage
       permission). Setting an outcome moves the interest to `completed`
       in the same write; clearing reopens to `in_communication` (or a
       caller-specified stage).
     - Mark Won / Mark Lost icon buttons on the interest detail header,
       plus an outcome badge that replaces the stage pill once a terminal
       outcome is set, plus a Reopen button.
     - Funnel + dashboard math updated to exclude lost/cancelled outcomes
       from active calculations (KPIs.activeInterests, pipelineValueUsd,
       getPipelineCounts, computePipelineFunnel, getRevenueForecast).
       The funnel now also returns a `lost` summary so callers can
       surface leakage without polluting conversion percentages.

Schema changes shipped via 0019_lazy_vampiro.sql; applied to dev DB
manually via psql because drizzle-kit push hits a pre-existing zod
parsing issue on the companies index. Dev server may need a restart
to flush prepared-statement caches.

tsc clean. vitest 832/832 pass. ESLint clean on every file touched.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 00:01:33 +02:00
Matt Ciaccio
886119cbde refactor(sales): consolidate pipeline stages + wire EOI auto-advance
The 8→9 stage refresh from earlier today only updated constants.ts and the DB —
20 component/service files still hardcoded the old enum, leaving labels blank,
filter dropdowns wrong, kanban columns mismatched, and the analytics funnel
silently dropping new-stage rows. The platform also never advanced
pipelineStage on EOI lifecycle events: documents.service.ts wrote eoiStatus
but left the user-visible stage stuck.

This commit closes both gaps:

  1. Single source of truth in src/lib/constants.ts — adds STAGE_LABELS,
     STAGE_BADGE, STAGE_DOT, STAGE_WEIGHTS, STAGE_TRANSITIONS plus
     stageLabel / stageBadgeClass / stageDotClass / safeStage /
     canTransitionStage helpers. components/clients/pipeline-constants.ts
     becomes a re-export shim so existing imports keep working.

  2. 18 stale-enum surfaces migrated — interest list (table, card, filters,
     form, stage picker), pipeline board, client card, berth interests tab,
     portal client interests page, dashboard pipeline / funnel / revenue-
     forecast charts, settings pipeline_weights default, dashboard.service
     weights, analytics.service funnel stages, alert-rules stale-interest
     filter, interest-scoring stage rank.

  3. Documents tab wired into interest detail — replaced the placeholder in
     interest-tabs.tsx with InterestDocumentsTab + InterestFilesTab so the
     EOI launcher is back where salespeople work.

  4. Auto-advance — new advanceStageIfBehind() in interests.service.ts
     (forward-only, no-op if interest is already past the target). Called
     from documents.service.ts on send (→ eoi_sent), Documenso completed
     webhook (→ eoi_signed), and manual signed-EOI upload (→ eoi_signed).

  5. Transition guard — canTransitionStage() blocks egregious skips
     (e.g. completed → open, open → contract_signed). Enforced in
     changeInterestStage before the DB write.
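The two guards can be sketched as pure functions over a stage ranking (abbreviated stage list and a simplified transition rule — the shipped order and matrix live in src/lib/constants.ts):

```typescript
// Abbreviated stage order for illustration; the real list has 9 stages.
const STAGE_ORDER = [
  'open', 'in_communication', 'eoi_sent', 'eoi_signed', 'contract_signed', 'completed',
] as const;
type Stage = (typeof STAGE_ORDER)[number];

const rank = (s: Stage) => STAGE_ORDER.indexOf(s);

// Transition guard: block egregious skips (e.g. open -> contract_signed,
// completed -> open). The delta rule here is illustrative, not the shipped matrix.
function canTransitionStage(from: Stage, to: Stage): boolean {
  const delta = rank(to) - rank(from);
  return delta >= -1 && delta <= 1;
}

// Forward-only auto-advance: no-op when already at or past the target.
function advanceStageIfBehind(current: Stage, target: Stage): Stage {
  return rank(current) >= rank(target) ? current : target;
}
```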

Tests updated to reflect the 9-stage model. tsc clean, vitest 832/832,
ESLint clean on every file touched.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 23:33:53 +02:00
Matt Ciaccio
0d357731ad chore(dev): install and wire react-grab
react-grab lets you point at any DOM element on the page and press
Cmd+C to copy the file name, React component, and HTML source — then
paste straight into a coding agent (Claude Code, Cursor, etc.) for
much higher-fidelity context.

Wiring (auto-detected by `npx grab@latest init --force`): a Next
<Script> tag in src/app/layout.tsx that loads the bundle from unpkg
in development only. Production builds skip the script entirely so
no extra weight ships to end users.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 16:37:40 +02:00
Matt Ciaccio
a75d4f5d69 feat(mobile): redesign topbar + collapse cumbersome page-header on mobile
Topbar (mobile-topbar.tsx):
  - Bumped to 56px to match standard mobile-app proportions.
  - Deep-navy gradient surface (#1e2844 -> #171f35) with white type —
    matches the desktop sidebar identity, gives the app a premium
    finish instead of generic white-with-text.
  - Brand "PN" wordmark mark on the left when no back affordance is
    needed (rounded brand-blue square, inset highlight + drop shadow).
  - Soft glow shadow underneath for elevation depth instead of a hard
    bottom border.
  - White-on-navy back arrow with active-state translucent fill.

PageHeader (page-header.tsx):
  - On mobile, the gradient hero strip + duplicate title + description
    block now collapses entirely — the topbar already shows the title,
    so duplicating it in the body wasted a third of the viewport.
  - The actions slot remains rendered as a flush right-aligned row so
    primary buttons (date-range pickers, "+ New X") stay accessible.
  - Desktop rendering is untouched.

Mobile shell (mobile-layout.tsx):
  - Top buffer 16px below the topbar so content doesn't ride flush.
  - Bottom buffer 32px above the tab bar so the last card breathes.

CSS (globals.css):
  - Hide the react-query-devtools floating button below lg: — it was
    overlapping the bottom-tab bar's "More" affordance.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 16:34:28 +02:00
Matt Ciaccio
0fb7920db5 fix(auth/mobile): support LAN-IP access in dev + edge-to-edge auth bg
  - branded-auth-shell: split the background image into a separate
    fixed-positioned layer behind the layout. Previously the bg was on
    a min-h-screen container and iOS Safari left visible whitespace at
    the top/bottom when the URL bar showed/hid (the container's height
    didn't match the visual viewport). Now the bg pins to the actual
    visible viewport via `fixed inset-0`. min-h-[100dvh] also added
    so the layout layer matches.
  - auth client: derive baseURL from window.location.origin instead of
    NEXT_PUBLIC_APP_URL. Same dev build now works whether opened on
    localhost (Mac) or the LAN IP (iPhone on Wi-Fi).
  - auth server: dynamic trustedOrigins function that allows
    localhost / 127.x / 192.168.x / 10.x in dev (function form
    inspects the incoming request's Origin). Production stays locked
    to NEXT_PUBLIC_APP_URL.
  - new dev helper: scripts/dev-set-password.ts to set a user's
    better-auth password directly (bypasses the email-reset flow);
    used to bootstrap matt@letsbe.solutions for mobile testing.
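The dev-origin policy can be sketched as follows (the real better-auth trustedOrigins function inspects the incoming request's Origin header; the host prefixes mirror the ranges listed above):

```typescript
// Dev accepts localhost / 127.x / 192.168.x / 10.x origins so the same
// build works on the Mac and on a phone hitting the LAN IP.
function isDevTrustedHost(hostname: string): boolean {
  return (
    hostname === 'localhost' ||
    hostname.startsWith('127.') ||
    hostname.startsWith('192.168.') ||
    hostname.startsWith('10.')
  );
}

// Production stays locked to the configured app URL.
function isTrustedOrigin(origin: string, env: string, appUrl: string): boolean {
  if (env !== 'development') return origin === appUrl;
  try {
    return isDevTrustedHost(new URL(origin).hostname);
  } catch {
    return false; // unparseable Origin header
  }
}
```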

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 16:21:59 +02:00
Matt Ciaccio
16ad61ce15 fix(mobile): hide duplicate detail-header title on mobile
The mobile topbar already shows the entity name pushed via
useMobileChrome, so the gradient detail-header strip was rendering it
a second time. Hides the inline h1 below sm: while keeping the source
/ email / phone meta and action buttons visible — the strip's
practical content (actions + meta) stays where users need it, and the
title responsibility moves cleanly to the topbar.

Affects: clients, yachts, companies, berths detail headers.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 16:09:32 +02:00
Matt Ciaccio
d080bc52fa feat(mobile): touch up new-invoice + scan-receipt forms
  - new invoice: push "New Invoice" to mobile topbar, hide the
    redundant inline back+title row on mobile.
  - scan receipt: dedicated "Take photo" primary button on mobile
    (uses input capture="environment" to open the camera directly)
    plus "Choose from library" secondary. Drop-zone framing kept on
    desktop. Push "Scan Receipt" title to mobile topbar.

Both forms now take their entity title from the topbar and free up
real-estate at the top for actual content.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 16:06:09 +02:00
Matt Ciaccio
a653c8e039 fix(mobile): wrap detail-header actions on narrow viewports
Action buttons in entity detail headers (Invite/GDPR/Archive on
clients, similar sets elsewhere) overflowed off-screen at 393px
because the actions row was flex without flex-wrap. Adds flex-wrap
so buttons drop to a second/third row instead of clipping.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 15:48:51 +02:00
Matt Ciaccio
7e8110b2ff feat(mobile): show entity name in mobile topbar on detail pages
Detail pages (clients, yachts, companies, berths, invoices, expenses)
now push their entity name + a back-button toggle to the mobile
topbar via useMobileChrome, replacing the URL UUID fallback that was
rendering before.

Supporting changes:
  - useMobileChrome() no longer throws when called outside the
    MobileLayoutProvider — desktop-tree consumers get a no-op
    setChrome so callers don't have to branch on shell type.
  - setChrome is now stable across renders (useCallback) so callers'
    useEffect dependency arrays don't infinite-loop.
  - DetailPageShell now also pushes its entityName + cleans up on
    unmount, and hides its desktop-only sticky header on mobile so it
    doesn't double up with the topbar (no current callers, prep for
    Phase 4 migration).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 15:46:32 +02:00
Matt Ciaccio
9eadaf035e fix(mobile): widen ListCard href type to Route
Project has experimental.typedRoutes enabled; passing template-literal
URLs through the Link href prop requires the wider Route type. Cast
at the Link boundary inside ListCard so callers can keep the simpler
string-typed href API.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 15:40:52 +02:00
Matt Ciaccio
bcea28cd71 feat(mobile): mobile cards for reminders, audit log, users
Three new <EntityCard> files using the shared <ListCard> shell, wired
into each list page's <DataTable> via cardRender.

  - ReminderCard:   Bell icon, related-entity subtitle (User/Anchor/
                    FileText icon by entity type), due-date meta with
                    past-due flag, accent bar (rose=past-due,
                    amber=pending, slate=snoozed, emerald=done).
                    Snooze/Complete/Edit/Delete in actions menu.
  - AuditLogCard:   Action icon (Plus/Pencil/Trash2/Eye), entity
                    title, "{verb} by {actor}" subtitle, timestamp
                    meta, optional changed-field chip line. Accent
                    bar by action (created=emerald, updated=blue,
                    deleted=rose). Immutable, no actions menu.
  - UserCard:       Initials avatar, displayName/email, role meta
                    (Shield icon), last-login distance, "Inactive"
                    pill when deactivated. Accent bar (violet=
                    super_admin, slate=inactive, none=active).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 15:39:06 +02:00
Matt Ciaccio
722491a9dd feat(mobile): mobile cards for yachts, companies, berths, invoices, expenses
Five new <EntityCard> files using the shared <ListCard> shell, wired
into each list page's <DataTable> via cardRender. Desktop view
(lg+) is unchanged.

  - YachtCard:    Ship icon, owner subtitle (User/Building2 icon by
                  ownerType), dimensions in meters preferred, hull #,
                  status pill. No accent bar (status is free-text).
  - CompanyCard:  Building2 icon, legalName subtitle, country (MapPin)
                  + tax id (Hash) meta, member/yacht count line.
  - BerthCard:    Anchor icon, area subtitle (MapPin), dimensions
                  meta, status pill. Status-encoded accent bar
                  (emerald=available, amber=under_offer, slate=sold).
  - InvoiceCard:  FileText icon, client subtitle, due date (Calendar)
                  meta, prominent currency-formatted amount. Status
                  accent bar (emerald=paid, orange=overdue, ...).
  - ExpenseCard:  Receipt icon, category subtitle, expense date meta,
                  prominent amount, payment-status pill, "Possible
                  duplicate" pill when duplicateOf is set. Accent bar
                  by paymentStatus, overridden to amber when flagged
                  as duplicate.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 15:34:04 +02:00
Matt Ciaccio
6009ccb7de feat(mobile): mobile card view for clients + interests lists
Adds optional cardRender prop to <DataTable> that switches the layout
to a vertical card list below lg: while keeping the same TanStack
table instance powering both views (pagination, sort, selection).

New shared shell:
  - <ListCard>          rounded card with optional left status accent bar,
                        whole-card link to detail page, top-right actions
                        slot, and tactile hover/active states.
  - <ListCardAvatar>    40px brand-tinted circle (initials or domain icon).
  - <ListCardMeta>      inline icon + muted text segment.
  - deriveInitials()    shared helper that ignores numeric tokens (so
                        "Recovery Test 1777" -> "RT", not "R1").

Clients and interests pages now render mobile cards via cardRender
using this shell; desktop view (lg+) is unchanged. Interests cards
encode pipeline stage as a left-edge accent strip whose saturation
deepens with pipeline progression (open -> completed). Berths display
with an Anchor icon; null-berth interests fall back to a Compass +
"General interest" italic label. Hot leads get a discreet "Hot" pill.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 15:27:53 +02:00
Matt Ciaccio
71da6e8fdc feat(mobile): swap admin page headers to PageHeader
Mechanical sweep replacing the plain h1+p header markup with the
mobile-aware PageHeader primitive across 12 admin pages: index,
backup, branding, documenso, email, import, invitations, monitoring,
onboarding, reminders, reports, webhooks. Webhooks "Add Webhook"
button preserved via the actions slot.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 12:57:52 +02:00
Matt Ciaccio
c405124bc3 feat(mobile): swap reports header to PageHeader
Plain h1 + p replaced with the mobile-aware PageHeader primitive so
the reports landing matches dashboard/settings.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 12:55:11 +02:00
Matt Ciaccio
53cbee1d3d fix(mobile): tighten Card padding on mobile (p-4 sm:p-6)
CardHeader/CardContent/CardFooter were uniformly p-6 (24px), which on
top of the mobile shell's 16px outer padding pushed form content 40px
inward — making cards feel content-shifted on a 393px viewport. Drops
to p-4 (16px) below sm and keeps p-6 from sm+ so desktop is unchanged.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 12:55:09 +02:00
Matt Ciaccio
ac7f1db62c fix(mobile): add horizontal padding to mobile shell main
Content cards/lists were rendering edge-to-edge on mobile because the
mobile shell's <main> had no horizontal padding (only safe-area top/
bottom). Adds px-4 to match the breathing room desktop gets from p-6.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 12:45:20 +02:00
Matt Ciaccio
5d44f3cfa4 fix(test): raise mobile-audit timeout to 30min for 4-viewport runs
Task 24 audit run hit the 10-minute test.setTimeout ceiling after capturing
2 of 4 viewport passes (iphone-se complete, iphone-16 complete-ish, 16-pro
partial, pro-max not started). 4 viewports × ~45 routes × slowMo: 200 needs
more headroom than 600s gave. 30 minutes is comfortable; the per-test
project timeout is raised to match.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 15:15:26 +02:00
Matt Ciaccio
d0540dca55 fix(build): extract route.ts handlers to handlers.ts (CLAUDE.md convention)
8 API route files were exporting handler functions directly from route.ts,
which Next.js 15 rejects with "$NAME is not a valid Route export field".
Per CLAUDE.md convention, service-tested handler functions live in sibling
handlers.ts files and route.ts only re-exports the GET/POST/etc. wrapped
in withAuth(withPermission(...)).

Discovered during the mobile-foundation Task 24 build validation; the route
files predate this branch but the build was never re-run on data-model.

Files:
- berth-reservations/[id], companies/autocomplete, companies/[id]/members
  + nested mid/set-primary, yachts/autocomplete, yachts/[id]/transfer,
  yachts/[id]/ownership-history
- Integration tests updated to import from handlers.ts (companies,
  memberships, reservations, yachts-detail)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 15:14:40 +02:00
Matt Ciaccio
0e9c24e222 test(visual): add mobile shell snapshot baselines (dashboard + more sheet)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-29 14:47:26 +02:00
Matt Ciaccio
3aba2181dc feat(test): extract anchor iPhone device descriptors to shared fixture
Move the four iPhone viewport descriptors (SE, 15/16, 16/17 Pro, Pro Max)
into tests/e2e/fixtures/devices.ts so the upcoming visual spec (Task 23)
can share the same anchors. The mobile-audit spec now spreads each
descriptor and adds a slug `name` plus a human `label` for the run header.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 14:31:51 +02:00
Matt Ciaccio
6237ad1567 feat(mobile): add FilterChips primitive (horizontal chip row with Add-filter trigger) 2026-04-29 14:28:33 +02:00
Matt Ciaccio
34916d855e feat(mobile): add DataView (TanStack table on lg+, card list below) with cardRender callback 2026-04-29 14:27:17 +02:00
Matt Ciaccio
41ae8a328f feat(mobile): add DetailPageShell with sticky header + mobile sticky-action shelf 2026-04-29 14:25:45 +02:00
Matt Ciaccio
1ff3160eac feat(mobile): add ActionRow with horizontal-scroll-snap on mobile, wrap on desktop 2026-04-29 14:24:43 +02:00
Matt Ciaccio
5698d742d3 feat(mobile): make PageHeader mobile-aware (stack below sm, hide description when actions present) 2026-04-29 14:23:40 +02:00
Matt Ciaccio
e6ce265be0 fix(mobile): drop positive display rule that overrode desktop shell's flex layout 2026-04-29 14:20:11 +02:00
Matt Ciaccio
19bc2f2a54 feat(mobile): mount MobileLayout alongside desktop shell, remove legacy sidebar mobile-drawer 2026-04-29 14:18:28 +02:00
Matt Ciaccio
b0a11f1785 feat(mobile): add MobileLayout shell composing topbar + content + bottom tabs + more sheet 2026-04-29 14:16:30 +02:00
Matt Ciaccio
3cbf2444fe feat(mobile): add MoreSheet (3-column grid of long-tail nav items in a bottom drawer) 2026-04-29 14:15:25 +02:00
Matt Ciaccio
0330be1312 feat(mobile): add Drawer (vaul wrapper) for native-feel bottom sheets 2026-04-29 14:14:18 +02:00
Matt Ciaccio
210360738d feat(mobile): add MobileBottomTabs with 5 fixed tabs (Dashboard/Clients/Yachts/Berths/More) 2026-04-29 14:13:09 +02:00
Matt Ciaccio
4df04e1a58 feat(mobile): add MobileTopbar with title, back-button, and primary-action slots 2026-04-29 14:12:15 +02:00
Matt Ciaccio
0c3baf04c5 feat(mobile): add MobileLayoutProvider context + useMobileChrome hook 2026-04-29 14:11:27 +02:00
Matt Ciaccio
79667b24da chore(pwa): add placeholder icons (icon-192/512/512-maskable, apple-touch-icon) 2026-04-29 14:10:14 +02:00
Matt Ciaccio
c4fdb29bbe feat(mobile): render Dialog full-screen below sm, centered modal at sm+ 2026-04-29 14:08:14 +02:00
Matt Ciaccio
38527d71fc feat(mobile): bump touch-target heights on Button/Input/Textarea, keep 16px to prevent iOS zoom 2026-04-29 14:06:59 +02:00
Matt Ciaccio
3fbfba6598 chore(deps): add vaul for native-feel bottom sheets 2026-04-29 14:05:10 +02:00
Matt Ciaccio
e3a835675b feat(mobile): add useIsMobile() hook backed by matchMedia (visual-test-only) 2026-04-29 14:04:02 +02:00
Matt Ciaccio
1b085f81ed feat(mobile): add CSS rules to switch shells based on data-form-factor + viewport 2026-04-29 14:00:49 +02:00
Matt Ciaccio
9f786fbcf3 feat(mobile): set data-form-factor body attr from User-Agent in root layout 2026-04-29 13:59:03 +02:00
Matt Ciaccio
906127a292 feat(mobile): add safe-area spacing utilities (pt-safe-top, pb-safe-bottom, etc.) 2026-04-29 13:56:53 +02:00
Matt Ciaccio
737b43589b feat(mobile): add viewport meta, theme-color, and PWA metadata to root layout 2026-04-29 13:55:37 +02:00
Matt Ciaccio
fbb1f1f366 scaffold(mobile): branch setup — audit harness, spec, plan, gitignore + client-portal cleanup
Pre-execution baseline for the mobile foundation PR:

- Mobile audit harness (tests/e2e/audit/mobile.spec.ts + mobile-audit Playwright project) — visits every page at four anchor iPhone viewports (375/393/402/440), screenshots full-page to .audit/mobile/, generates index.md
- Design spec (docs/superpowers/specs/2026-04-29-mobile-optimization-design.md) — adaptive shell + responsive content; full active-iPhone-range coverage; foundation + per-page migration phases
- Implementation plan (docs/superpowers/plans/2026-04-29-mobile-foundation.md) — 24 TDD tasks for the foundation PR
- .gitignore: ignore /client-portal/ (legacy nested Nuxt repo) and /.audit/ (regenerable screenshots)
- Remove phantom client-portal gitlink (mode 160000 with no .gitmodules)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 13:49:38 +02:00
Matt Ciaccio
ba89b61b3f fix(security): port-scope clientId/berthId/yachtId on interests + clientRelationships
Pass-6 findings — both MEDIUM cross-tenant FK injection.

- interests.service: createInterest/updateInterest/linkBerth accepted
  clientId/berthId/yachtId from the request body without verifying the
  referenced row belongs to the caller's port. getInterestById joins
  clients/berths/yachtTags on these FKs without a port filter, so a
  port-A caller could splice a foreign-port id and surface that
  tenant's clientName, mooringNumber, or yacht ownership on read.
  New assertInterestFksInPort helper guards all three surfaces.

- clients.service.createRelationship: accepted clientBId from the
  body without a port check; the relationship list endpoint joins
  clients without filtering by port, so the foreign client's name
  + email would render in the relationships tab. Now verifies
  clientBId belongs to portId and rejects self-relationships.
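Both fixes follow the same guard shape — verify each caller-supplied FK's tenancy before persisting. A generic sketch (the `lookup` callback stands in for the per-table, port-filtered SELECTs the real helpers issue):

```typescript
// Returns the portId the row belongs to, or null if the row doesn't exist.
type PortLookup = (table: string, id: string) => string | null;

// Reject any supplied FK whose referenced row is not in the caller's port.
function assertFksInPort(
  portId: string,
  fks: Array<{ table: string; id: string | null }>,
  lookup: PortLookup,
): void {
  for (const { table, id } of fks) {
    if (id === null) continue; // optional FK not supplied
    if (lookup(table, id) !== portId) {
      throw new Error(`${table} row is not in the caller's port`);
    }
  }
}
```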

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 04:14:09 +02:00
Matt Ciaccio
4eea19a85b sec: lock down 5 cross-tenant FK gaps from fifth-pass review
1. HIGH — reminders.create/updateReminder accepted clientId/interestId/
   berthId from the body and persisted them with no port check; getReminder
   then hydrated the row via Drizzle relations (no port filter on the
   join), so a port-A user with reminders:create could exfiltrate any
   port-B client/interest/berth row by guessing its UUID. New
   assertReminderFksInPort gates create + update.

2. HIGH — listRecommendations(interestId, _portId) discarded portId
   entirely; the route GET /api/v1/interests/[id]/recommendations
   forwarded the URL id straight through. A port-A user with
   interests:view could read any other tenant's recommended berths
   (mooring numbers, dimensions, status). Service now verifies the
   interest belongs to portId and joins berths filtered by port.

3. HIGH — Berth waiting list. The PATCH route did not pre-check that
   the berth belonged to ctx.portId — a port-A user with
   manage_waiting_list could reorder a port-B berth's queue. Separately,
   updateWaitingList accepted arbitrary entries[].clientId and inserted
   them without verifying tenancy, polluting the table with foreign-port
   FKs. Both gaps closed.

4. MEDIUM — setEntityTags (clients/companies/yachts/interests/berths)
   accepted any tagId and inserted into the join table. The tags table
   is per-port but the join only carries a single-column FK. The
   downstream getById join `tags ON join.tag_id = tags.id` has no port
   filter, so a foreign tag's name + color render in the requesting port.
   Helper now batch-validates tagIds belong to portId before insert.

5. MEDIUM — /api/v1/custom-fields/[entityId] PUT had no withPermission
   gate (any role, including viewer, could write) and didn't validate
   that the URL entityId pointed at a port-scoped entity of the field
   definition's entityType. Route now uses
   withPermission('clients','view'/'edit',…); service validates the
   entityId per resolved entityType (client/interest/berth/yacht/company)
   against portId.

Test mocks updated to cover the new entity-port-scope check.
818 vitest tests pass.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 03:28:31 +02:00
Matt Ciaccio
47a1a51832 sec: webhook SSRF guard, IMAP-sync owner check, watcher port membership
Three findings from a fourth-pass review:

1. MEDIUM — webhook URL SSRF. The validator only enforced HTTPS+URL
   parse; it accepted private/loopback/link-local/.internal hosts. The
   delivery worker fetched arbitrary URLs and persisted up to 1KB of
   response body into webhook_deliveries.response_body, which is then
   surfaced via the deliveries listing endpoint — a port admin could
   register a webhook to an internal HTTPS endpoint, hit the test
   endpoint to force immediate dispatch, and read the response back.
   Validator now rejects RFC-1918/loopback/link-local/CGNAT/ULA IPs
   (v4 + v6) and .internal/.local/.localhost/.lan/.intranet/.corp
   suffixes; the worker re-resolves the hostname at dispatch time and
   blocks before fetch (DNS rebinding defense). 21-case unit test
   covers the matrix.

2. MEDIUM — POST /api/v1/email/accounts/[id]/sync had no owner check.
   Any user with email:view could enqueue an inbox-sync job for any
   accountId, which the worker would honour using the foreign user's
   decrypted IMAP credentials and advance the account's lastSyncAt
   (data-loss risk on the legitimate owner's next sync). Route now
   asserts account.userId === ctx.userId before enqueueing, matching
   the toggle/disconnect endpoints.

3. MEDIUM — addDocumentWatcher (and the wizard / upload watcher
   inserts) didn't validate the watcher's userId belonged to the
   document's port. notifyDocumentEvent then emitted a real-time
   socket toast + email containing the document title to the foreign
   user. New assertWatchersInPort helper verifies each candidate has
   a userPortRoles row for the port (super-admin bypass).
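The host check from finding (1) can be sketched as follows (v4 ranges + suffix list only; the shipped validator also covers v6, CGNAT/ULA, and re-resolves DNS at dispatch time):

```typescript
const BLOCKED_SUFFIXES = ['.internal', '.local', '.localhost', '.lan', '.intranet', '.corp'];

// RFC-1918 + loopback + link-local v4 ranges.
function isPrivateV4(host: string): boolean {
  const octets = host.split('.').map(Number);
  if (octets.length !== 4 || octets.some((o) => Number.isNaN(o) || o < 0 || o > 255)) {
    return false; // not a v4 literal
  }
  const [a, b] = octets;
  return (
    a === 10 ||                            // 10.0.0.0/8
    a === 127 ||                           // loopback
    (a === 172 && b! >= 16 && b! <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||            // 192.168.0.0/16
    (a === 169 && b === 254)               // link-local
  );
}

function isWebhookHostAllowed(url: string): boolean {
  let parsed: URL;
  try {
    parsed = new URL(url);
  } catch {
    return false;
  }
  if (parsed.protocol !== 'https:') return false; // HTTPS-only, as before
  const host = parsed.hostname.toLowerCase();
  if (host === 'localhost') return false;
  if (BLOCKED_SUFFIXES.some((s) => host.endsWith(s))) return false;
  return !isPrivateV4(host);
}
```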

818 vitest tests pass.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 03:15:39 +02:00
Matt Ciaccio
9a5479c2c7 sec: lock down socket.io room subscription + crm-invite cross-tenant ops
1. HIGH — Socket.IO accepted client-supplied `auth.portId` in the
   handshake without verifying the user actually held a role in that
   port, then unconditionally joined the socket to `port:${portId}`.
   The `join:entity` handler also skipped authorization. This let any
   authenticated CRM user receive realtime events from any other
   tenant: invoice numbers + totals + client names, document signer
   emails, registration events with full client name + berth, file
   uploads, etc. Auth middleware now resolves the user's
   userPortRoles (or isSuperAdmin) before honouring portId, and
   join:entity verifies the entity's port matches a port the user
   has access to. The issue pre-dates this branch but is fixed here given
   the explicit "all data is extremely sensitive" directive.

2. MEDIUM — listCrmInvites issued a global SELECT with no port
   scope. The crm_user_invites table has no portId column (invites
   mint global better-auth users, then port roles are assigned
   later). The previous gating on per-port admin.manage_users let
   any director enumerate every other tenant's pending invitee
   emails + isSuperAdmin flags — a phishing target list and a
   super-admin onboarding timing oracle. Restrict GET (list),
   DELETE (revoke), and POST resend to ctx.isSuperAdmin.
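The handshake rule from finding (1) reduces to a pure authorization check (illustrative names; the real middleware resolves userPortRoles from the DB):

```typescript
interface UserPortRole { userId: string; portId: string }

// Only honour a client-supplied portId when the user actually holds a role
// there (or is a super admin). A null result rejects the handshake instead
// of joining the socket to `port:${portId}`.
function resolveRoomPortId(
  requestedPortId: string,
  userId: string,
  roles: UserPortRole[],
  isSuperAdmin: boolean,
): string | null {
  if (isSuperAdmin) return requestedPortId;
  const allowed = roles.some((r) => r.userId === userId && r.portId === requestedPortId);
  return allowed ? requestedPortId : null;
}
```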

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 03:00:55 +02:00
Matt Ciaccio
e06fb9545b sec: lock down 5 cross-tenant IDORs uncovered in second-pass review
1. HIGH — /api/v1/admin/ports/[id] PATCH+GET let any port-admin
   (manage_settings) mutate any other tenant's port row by passing the
   foreign id in the path. Now non-super-admins must target their own
   ctx.portId; listPorts and createPort are super-admin only.

2. HIGH — Invoice create/update accepted arbitrary expenseIds and
   linked them into invoice_expenses with no port check; the GET
   response then re-emitted those foreign expense rows via the
   linkedExpenses join. assertExpensesInPort now validates each id
   belongs to the caller's portId before insert; getInvoiceById's
   join filters by expenses.portId as defense-in-depth.

3. HIGH — Document creation paths (createDocument, createFromWizard,
   createFromUpload) persisted user-supplied clientId/interestId/
   companyId/yachtId/reservationId without verifying those FKs were
   in-port. sendForSigning then loaded the foreign client/interest by
   id alone and pushed their PII into the Documenso payload. New
   assertSubjectFksInPort helper rejects out-of-port FKs at create
   time; sendForSigning's interest+client lookups now also filter by
   portId.

4. MEDIUM — calculateInterestScore read its redis cache before
   verifying portId, and the cache key was interestId-only — a
   foreign-port caller could observe a cached score breakdown.
   Cache key now includes portId, and the port-scope DB lookup runs
   before any cache.get.

5. MEDIUM — AI email-draft job results were retrievable by anyone who
   could guess the BullMQ jobId (default sequential integers). Job
   ids are now random UUIDs, requestEmailDraft validates interestId/
   clientId belong to ctx.portId before enqueueing, the worker's
   client lookup is port-scoped, and getEmailDraftResult requires
   the caller to match the original requester's userId+portId before
   returning the drafted subject/body.
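Finding (4)'s fix can be sketched as follows (a stand-in Map plays the redis client; the predicate stands in for the port-scoped DB lookup):

```typescript
const cache = new Map<string, number>();

// Cache key now embeds portId, so tenants can never share an entry.
const scoreCacheKey = (portId: string, interestId: string) =>
  `interest-score:${portId}:${interestId}`;

function getCachedScore(
  portId: string,
  interestId: string,
  isInterestInPort: (interestId: string, portId: string) => boolean,
): number | null {
  // Tenancy is verified BEFORE the cache is consulted, so a foreign-port
  // caller cannot observe another tenant's cached breakdown.
  if (!isInterestInPort(interestId, portId)) return null;
  return cache.get(scoreCacheKey(portId, interestId)) ?? null;
}
```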

The interest-scoring unit test that asserted "DB is bypassed on cache
hit" is updated to reflect the new (security-correct) ordering.
Two new regression test files cover the email-draft binding (5 tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:48:43 +02:00
Matt Ciaccio
4c5334d471 sec: gate super-admin invite minting, OCR settings, and alert mutations
Three findings from the branch security review:

1. HIGH — Privilege escalation via super-admin invite. POST
   /api/v1/admin/invitations was gated only by manage_users (held by the
   port-scoped director role). The body schema accepted isSuperAdmin
   from the request, createCrmInvite persisted it verbatim, and
   consumeCrmInvite copied it into userProfiles.isSuperAdmin — granting
   the new account cross-tenant access. Now the route rejects
   isSuperAdmin=true unless ctx.isSuperAdmin, and createCrmInvite
   requires invitedBy.isSuperAdmin as defense-in-depth.

2. HIGH — Receipt-image exfiltration via OCR settings. The route
   /api/v1/admin/ocr-settings (and the sibling /test) were wrapped only
   in withAuth — any port role including viewer could PUT a swapped
   provider apiKey + flip aiEnabled, redirecting every subsequent
   receipt scan to attacker infrastructure. Both are now wrapped in
   withPermission('admin','manage_settings',…) matching the sibling
   admin routes (ai-budget, settings).

3. MEDIUM — Cross-tenant alert IDOR. dismissAlert / acknowledgeAlert
   issued UPDATE … WHERE id=? with no portId predicate. Any
   authenticated user with a foreign alert UUID could mutate it. Both
   service functions now require portId and add it to the WHERE; the
   route handlers pass ctx.portId.

The dev-trigger-crm-invite script passes a synthetic super-admin caller
identity since it runs out-of-band.

The two public-form tests randomize their IP prefix per run so a fresh
test process doesn't collide with leftover redis sliding-window entries
from a prior run (publicForm limiter pexpires after 1h).

Two new regression test files cover the fixes (6 tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:27:01 +02:00
Matt Ciaccio
61e40b5e76 chore(ops): split /api/health (liveness) from /api/ready (readiness)
Previously /api/health did deep dependency probes (postgres + redis +
minio) and 503'd on any failure. That's readiness behavior, not
liveness — a transient Redis/MinIO blip would tell the orchestrator to
restart the pod when it should only be dropped from the load balancer.

Make /api/health a thin liveness check (returns 200 unconditionally if
the process is responding) and move the deep checks to a new
/api/ready endpoint with the canonical Kubernetes-style 200/503
contract. Docker-compose healthchecks keep pointing at /api/health,
which is now more conservative (no false-positive container restarts).

Documenso/SMTP are intentionally not probed in /api/ready: each tenant
configures its own credentials and a tenant misconfiguration shouldn't
mark the entire shared CRM unready.
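The contract split can be sketched as follows (probe callbacks stand in for the real postgres/redis/minio checks):

```typescript
type Probe = () => boolean;

// Liveness: 200 unconditionally while the process can respond at all.
function healthHandler(): { status: number } {
  return { status: 200 };
}

// Readiness: canonical 200/503 contract over the deep dependency probes.
function readyHandler(probes: Record<string, Probe>): { status: number; failed: string[] } {
  const failed = Object.entries(probes)
    .filter(([, probe]) => {
      try {
        return !probe();
      } catch {
        return true; // a throwing probe counts as failed
      }
    })
    .map(([name]) => name);
  return { status: failed.length === 0 ? 200 : 503, failed };
}
```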

Also tighten the gdpr-bundle-builder casts: replace the scattered
`as unknown as Record<string, unknown>` double-casts with a small
`toJsonRow<T>()` helper that performs the narrow→wide widening in one
place, with one cast hop at each call site instead of two.
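A sketch of the helper's shape (signature assumed from the description above):

```typescript
// The double cast lives here once; call sites pay a single hop (the call)
// instead of repeating `as unknown as Record<string, unknown>`.
function toJsonRow<T extends object>(row: T): Record<string, unknown> {
  return row as unknown as Record<string, unknown>;
}
```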

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:03:10 +02:00
Matt Ciaccio
7f9d90ad05 fix(gdpr): cap export-bundle size at 50MB before upload
Article-15 bundles are JSON+HTML only (no receipts/contracts), so even
heavy clients land at <1 MB. Anything larger almost certainly indicates
an unbounded relation we forgot to cap. Fail the worker job before
uploading rather than push a runaway blob to MinIO + email the client a
download link of mystery size.
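A minimal sketch of that guard (constant name and error text are assumptions; only the 50 MB figure comes from the commit):

```typescript
const MAX_BUNDLE_BYTES = 50 * 1024 * 1024; // 50 MB hard cap

function assertBundleWithinCap(bundle: Uint8Array): void {
  if (bundle.byteLength > MAX_BUNDLE_BYTES) {
    // Fail the worker job here, before any MinIO upload or client email.
    throw new Error(
      `GDPR bundle is ${bundle.byteLength} bytes (cap ${MAX_BUNDLE_BYTES}): ` +
        "refusing to upload; an unbounded relation is probably uncapped",
    );
  }
}
```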

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 02:00:16 +02:00
Matt Ciaccio
5d29bfc153 refactor(services): centralize AuditMeta + transactional setEntityTags helper
The same `interface AuditMeta { userId; portId; ipAddress; userAgent }`
was duplicated in 26 service files. Move the canonical definition into
`@/lib/audit` next to the related types and update every service to
import it. `ServiceAuditMeta` (the alias used in invoices.ts and
expenses.ts) collapses into the same name.

Tag CRUD across clients/companies/yachts/interests/berths followed an
identical wipe-then-rewrite recipe with two latent issues: the delete
and insert weren't wrapped in a transaction (a partial failure left
the entity with zero tags) and the audit-log payload shape diverged
(`newValue: { tagIds }` for clients/yachts/companies but
`metadata: { type: 'tags_updated', tagIds }` for interests/berths).

Extract `setEntityTags` in `entity-tags.helper.ts` that performs the
delete+insert inside a single transaction, normalizes the audit payload
to `newValue: { tagIds }`, and dispatches the per-entity socket event
through a switch so `ServerToClientEvents` typing stays intact.

The five `setXTags(...)` service functions now do parent-row tenant
verification and delegate the join-table work + side effects.
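The atomicity property the helper fixes can be sketched with an in-memory store standing in for the join table (all names are illustrative; the real helper runs the delete+insert inside a Drizzle transaction and also dispatches socket events):

```typescript
type TagStore = Map<string, string[]>; // entityId -> tagIds
type AuditEvent = { action: string; newValue: { tagIds: string[] } };

function setEntityTags(
  store: TagStore,
  entityId: string,
  tagIds: string[],
  audit: AuditEvent[],
): void {
  // "Transaction": mutate a copy, swap it in only on success, so a failed
  // insert can never leave the entity with zero tags.
  const next = new Map(store);
  next.delete(entityId); // wipe
  if (tagIds.some((id) => id.length === 0)) {
    throw new Error("invalid tag id"); // simulated insert failure = rollback
  }
  next.set(entityId, [...tagIds]); // rewrite
  store.clear();
  for (const [k, v] of next) store.set(k, v);
  // Normalized audit payload: always newValue.tagIds, never metadata.*
  audit.push({ action: "update_tags", newValue: { tagIds } });
}
```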

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 01:58:42 +02:00
Matt Ciaccio
43f68ca093 chore(hardening): maintenance jobs, defense-in-depth, redis-backed public rate limit
- maintenance worker now expires GDPR export bundles (db row + MinIO object)
  on the gdpr_exports.expires_at boundary, plus 90-day retention sweep on
  ai_usage_ledger; both jobs scheduled daily.
- portId scoping added to listClientRelationships and listClientExports
  (defense-in-depth — parent-resource gates already prevent cross-tenant
  reads, but service layer should enforce on its own).
- SELECT FOR UPDATE on parent client/company row inside add/update address
  transactions to serialize concurrent isPrimary toggles.
- public /interests + /residential-inquiries endpoints swap their
  in-memory ipHits maps for the redis sliding-window limiter via the
  new rateLimiters.publicForm config (5/hr/IP), so the cap survives
  restarts and is shared across worker processes.
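The sliding-window algorithm behind rateLimiters.publicForm looks roughly like this (an in-memory sketch of the 5/hr/IP config; in production the timestamps live in a Redis sorted set so the window survives restarts and is shared across workers):

```typescript
const WINDOW_MS = 60 * 60 * 1000; // 1 hour
const LIMIT = 5;

const hits = new Map<string, number[]>(); // key -> request timestamps

function allowRequest(ip: string, now: number): boolean {
  const key = `publicForm:${ip}`;
  // Drop entries that fell out of the sliding window (ZREMRANGEBYSCORE
  // in the Redis version).
  const recent = (hits.get(key) ?? []).filter((t) => t > now - WINDOW_MS);
  if (recent.length >= LIMIT) {
    hits.set(key, recent);
    return false; // caller responds 429 with a Retry-After hint
  }
  recent.push(now); // ZADD in the Redis version
  hits.set(key, recent);
  return true;
}
```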

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 01:52:41 +02:00
Matt Ciaccio
d9557edfc5 docs(spec): GWS inbox-triage exploratory design (not approved for build)
Surveys what it actually takes to ship the AI inbox-triage feature
gated on Google Workspace integration. Walks through three deployment
models with their real costs:

- Model A (Marketplace OAuth app): 4-6 months calendar, $15k-$75k for
  the required CASA security assessment, recurring re-verification
- Model B (per-customer Internal OAuth app): ~5 weeks engineering, $0
  Google-side, scoped to one workspace per customer
- Model C (forward-to-CRM mailbox): ~1 week, receive-only, no reply
  drafts possible

Recommends Model B for the current customer profile, with B → A
promotion only if 3+ customers ask unprompted.

Documents what's already scaffolded (email_accounts/threads/messages
tables, syncInbox stub, BullMQ email queue, ai_usage_ledger, per-port
  aiEnabled flag, withRateLimit('ai')) vs what's new (OAuth flow,
  Pub/Sub push receiver, gws_user_tokens + email_triage tables, /inbox UI).

End-to-end flow, schema additions, AI cost interaction with the
Phase 3b token budgets, 5-phase build plan (G1-G5), and 5 open
decisions to resolve before scheduling the build. Explicitly out of
scope: M365, sentiment analysis, smart-drafts, cross-staff triage
queue.

No code changes — this is a design doc to drive a go/no-go decision.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 01:18:15 +02:00
Matt Ciaccio
6eb0d3dc92 docs(ops): backup/restore + email deliverability runbooks
Two new runbooks under docs/runbooks/ plus the automation scripts the
backup runbook references. Both are written so an operator who has only
the off-site backup credentials and the runbook can recover the system
unaided.

Backup/restore (Phase 4a):
- docs/runbooks/backup-and-restore.md — covers what gets backed up
  (Postgres / MinIO / .env+ENCRYPTION_KEY), schedule (hourly DB +
  hourly MinIO mirror, 7-day hourly + 30-day daily retention),
  cold-restore procedure with row-count verification, weekly drill
- scripts/backup/pg-backup.sh — pg_dump → gzip → optional GPG → mc
  upload, fails loud
- scripts/backup/minio-mirror.sh — incremental mc mirror, no --remove
  flag so accidental deletes on the live bucket can't cascade
- scripts/backup/restore.sh — interactive prod restore + --drill mode
  that runs against a sandbox DB and diffs row counts

Email deliverability (Phase 4b):
- docs/runbooks/email-deliverability.md — what the CRM sends, DNS
  records (SPF/DKIM/DMARC/MX), per-port override implications,
  diagnosis flow ("didn't arrive" → 4-step checklist starting with
  EMAIL_REDIRECT_TO), provider migration plan, realapi suite as the
  end-to-end probe

Tests still 778/778 vitest, tsc/lint clean — these phases are docs +
shell scripts, no code changes.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 20:10:30 +02:00
Matt Ciaccio
a3305a94f3 feat(gdpr): staff-triggered client-data export bundle (Article 15)
Adds a full GDPR Article 15 (right of access) workflow. Staff trigger
an export from the client detail; a BullMQ worker assembles every row
keyed to that client (profile, contacts, addresses, notes, tags,
yachts, company memberships, interests, reservations, invoices,
documents, last 500 audit events) into JSON + a self-contained HTML
report, ZIPs them, uploads to MinIO, and optionally emails the client
a 7-day signed download link.

- New table gdpr_exports tracks lifecycle (pending → building → ready
  → sent / failed) with a 30-day cleanup target
- Bundle builder (gdpr-bundle-builder.ts) — pure read-side, tenant-
  scoped, with HTML escaping to block injection from rogue field values
- Worker hook in export queue dispatches on job name 'gdpr-export'
- New audit actions: 'request_gdpr_export', 'send_gdpr_export'
- API: POST/GET /api/v1/clients/:id/gdpr-export (admin-gated, exports
  rate-limit, Article-15 audit on POST); GET /:exportId returns a
  fresh signed URL
- UI: <GdprExportButton> dialog on client detail header — admin-only,
  shows recent exports, supports email-to-client + override recipient,
  polls every 5s while open
- Validation: refuses email-to-client when no primary email + no
  override (rather than silently dropping the send)
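A minimal version of the HTML escaping the bundle builder is described as doing: any client-supplied field value is entity-escaped before being interpolated into the self-contained report, so a rogue value can't inject markup.

```typescript
function escapeHtml(value: string): string {
  return value
    .replace(/&/g, "&amp;") // must run first so later entities aren't doubled
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
```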

Tests: 778/778 vitest (was 771) — +7 covering builder happy path,
HTML escaping, tenant isolation, empty client, request-flow validation,
and audit / queue interaction.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 20:06:31 +02:00
Matt Ciaccio
9dfa04094b feat(rate-limit): per-user limiters for OCR, AI, and exports
Adds three named rate limiters to the existing Redis sliding-window
catalog and a withRateLimit wrapper that composes inside withAuth.
Wires the OCR limiter into the receipt-scan endpoint so a runaway
client can't burn through the AI budget in a tight loop.

- ocr: 10/min/user
- ai: 60/min/user (reserved for future server-side AI surfaces)
- exports: 30/hour/user (reserved for GDPR bundle, PDF, CSV exports)

429 responses include X-RateLimit-* headers and a Retry-After hint.

Tests: 771/771 vitest (was 766) — +5 rate-limit tests covering catalog
shape, sliding window, cross-prefix isolation, cross-user isolation,
and resetAt timestamp.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:56:01 +02:00
Matt Ciaccio
e7d23b254c feat(ai): per-port token budgets + usage ledger for AI features
Adds a token-denominated guardrail in front of every server-side AI call
so a misconfigured port can't run up an unbounded bill. Soft caps surface
a banner; hard caps refuse new requests until the period rolls over.
Usage flows into a feature-typed ledger so future AI surfaces (summary,
embeddings, reply-draft) can drop in without schema changes.
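The soft/hard cap decision can be sketched as a pure pre-flight gate (field names are assumptions; the real checkBudget also loads the port's budget row and current-period ledger sum):

```typescript
type Budget = { softCapTokens: number; hardCapTokens: number };
type GateResult = { allowed: boolean; softCapWarning: boolean };

function checkBudget(budget: Budget, periodTokens: number, estimate: number): GateResult {
  const projected = periodTokens + estimate;
  if (projected > budget.hardCapTokens) {
    // Route answers with source: manual / reason: budget-exceeded.
    return { allowed: false, softCapWarning: false };
  }
  // Soft cap lets the request through but surfaces a banner.
  return { allowed: true, softCapWarning: projected > budget.softCapTokens };
}
```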

- New table ai_usage_ledger (port, user, feature, provider, model,
  input/output/total tokens, request id) with two indexes for rollup
- New service ai-budget.service.ts: getAiBudget/setAiBudget,
  checkBudget (pre-flight gate), recordAiUsage, currentPeriodTokens,
  periodBreakdown — all token-based, period boundaries in UTC
- runOcr now returns provider usage so the route can record the actual
  spend instead of estimating
- Scan-receipt route gates on checkBudget before invoking AI; returns
  source: manual / reason: budget-exceeded when blocked, surfaces
  softCapWarning on the success path
- Admin UI: new AiBudgetCard on the OCR settings page — shows current
  spend, per-feature breakdown, soft/hard cap inputs, period selector
- Permission: admin.manage_settings on both routes

Tests: 766/766 vitest (was 756) — +10 budget tests covering enforce/
disabled/cap-exceed/estimate-exceed/soft-warn/period boundaries/
cross-port isolation/silent ledger failure.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:53:09 +02:00
Matt Ciaccio
2cf1bd9754 feat(ocr): Tesseract.js as default scanner, AI as opt-in per port
The mobile receipt scanner now runs Tesseract.js in-browser by default —
on-device, free, and image bytes never leave the device. AI providers
(OpenAI / Claude) become a per-port opt-in for higher accuracy on
hard-to-read receipts.

- Lazy-load Tesseract WASM in src/lib/ocr/tesseract-client.ts (5 MB
  bundle dynamic-imports on first scan, not in main chunk)
- Heuristic parser src/lib/ocr/parse-receipt-text.ts extracts vendor,
  date, amount, currency, and line items from raw OCR text
- New port-scoped aiEnabled flag on OcrConfig (defaults false). Resolved
  flag never inherits from the global row — each port admin opts in
  independently
- Scan endpoint short-circuits to manual-mode when aiEnabled=false so
  the AI provider is never invoked unless the admin has flipped the
  switch
- Scan UI runs Tesseract first, then asks the server whether AI is
  enabled — uses the AI result only when its confidence beats Tesseract;
  network failures degrade gracefully to the local parse
- Admin OCR-settings form gains the per-port aiEnabled checkbox
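The flavor of heuristic parse-receipt-text.ts applies can be sketched for the amount field (illustrative only; the real parser also handles vendor, date, and line items). Taking the largest money-looking token is a common receipt heuristic, since the total usually dwarfs individual line items:

```typescript
function extractAmount(ocrText: string): { amount: number; currency: string } | null {
  // Match optional currency marker + a decimal amount like 42.50 or 42,50.
  const matches = [...ocrText.matchAll(/(EUR|USD|\$|€)?\s*(\d+[.,]\d{2})/gi)];
  if (matches.length === 0) return null;
  let best = { amount: 0, currency: "EUR" }; // EUR default is an assumption
  for (const m of matches) {
    const amount = parseFloat(m[2].replace(",", ".")); // normalize comma decimals
    if (amount > best.amount) {
      const c = (m[1] ?? "").toUpperCase();
      best = { amount, currency: c === "$" ? "USD" : c === "" || c === "€" ? "EUR" : c };
    }
  }
  return best;
}
```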

Tests: 756/756 vitest (was 747) — +7 parser unit tests, +2 aiEnabled
config tests.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:46:29 +02:00
Matt Ciaccio
46937bbcb9 feat(addresses): full CRUD UI for client + company multi-address
Client and company detail pages each gain an Addresses tab with click-to-edit
fields wired to the existing CountryCombobox/SubdivisionCombobox primitives.
Adds a primary toggle that demotes the previous primary inside one transaction
so the partial unique index never trips.

- New service helpers: list/add/update/remove ClientAddress + CompanyAddress
- New routes: /api/v1/clients/[id]/addresses[/addressId], same under companies/
- New shared component: <AddressesEditor> reused by both detail surfaces
- Integration tests cover happy path, primary demotion, and tenant scoping

Tests: 747/747 vitest (was 741, +6 address tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:38:43 +02:00
Matt Ciaccio
27cdbcc695 chore(i18n): drop legacy free-text country/nationality columns
Test-data only — no production migration needed (per earlier decision).
Schema is now ISO-only; readers convert ISO codes to localized names where
human-readable output is required (EOI documents, invoices, portal).

Migration 0016 drops:
  - clients.nationality
  - companies.incorporation_country
  - client_addresses.{state_province, country}
  - company_addresses.{state_province, country}

Code paths that previously read free-text values now read the ISO column
and pass through `getCountryName()` / `getSubdivisionName()` for rendering.
Document templates ({{client.nationality}}), portal client view, EOI/
reservation-agreement contexts, and invoice billing addresses all updated.
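This is how an ISO-only schema renders human-readable names without storing free text: Intl.DisplayNames localizes region codes on the fly. A minimal helper in the spirit of getCountryName() (signature is an assumption):

```typescript
function getCountryName(iso: string, locale = "en"): string {
  // Intl.DisplayNames ships with the JS runtime; no country-name table needed.
  const names = new Intl.DisplayNames([locale], { type: "region" });
  return names.of(iso.toUpperCase()) ?? iso; // fall back to the raw code
}
```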

Public yacht-interest endpoint (/api/public/interests) drops the legacy
fields from its insert path and writes ISO codes only. The Zod validators
no longer accept the legacy fields — older website builds posting raw
'incorporationCountry' / 'country' / 'stateProvince' will get 400s.
Server-side phone normalization is unchanged.

Seed data updated to use ISO codes (GB/FR/ES/GR/SE/IT/GH/MC/PA), spread
across continents to keep test fixtures realistic.

Test assertions updated to match the new render shape (e.g.
'United States' not 'US', 'California' not 'CA').

Vitest: 741 -> 741 (unchanged count; assertions updated, no new tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 19:00:57 +02:00
Matt Ciaccio
31fa3d08ec chore(cleanup): Phase 1 — gap closure across audit, alerts, soft-delete, perms
Multi-area cleanup pass closing partial-implementation gaps surfaced by the
post-i18n audit. No behavior changes for happy-path users; closes real
correctness/security holes.

PR1a Public yacht-interest endpoint i18n. /api/public/interests now accepts
     phoneE164/phoneCountry, nationalityIso, address.{countryIso, subdivisionIso},
     and company.{incorporationCountryIso, incorporationSubdivisionIso}.
     Server-side parsePhone() fallback for legacy raw phone strings.

PR1b Alert rule registry trim. Two rule slots ('document.expiring_soon',
     'audit.suspicious_login') were registered but evaluators returned [].
     Both required schema/instrumentation that hadn't landed. Removed from
     the registry; comments record the dependencies needed to revive them.
     Effective rule count: 8 active.

PR1c vi.mock hoist + flake fix. Hoisted vi.mock calls to top-level in 5
     integration test files; webhook-delivery uses vi.hoisted for the
     queue-add ref. Vitest no longer warns about non-top-level mocks.
     Deflaked the 'short value' assertion in security-encryption.test.ts
     by switching plaintext from 'ab' to 'XY' (non-hex chars). 5/5 runs green.

PR1d Soft-delete reference audit. listClientOptions and listYachtsForOwner
     now filter by isNull(archivedAt). Berths use status (no archivedAt).

PR1e Permission-matrix audit script + report. scripts/audit-permissions.ts
     walks every src/app/api/v1/**/route.ts and reports handlers without a
     withPermission() wrapper. Initial run found 33 violations.
     - Allow-listed 17 with explicit reasons (self-data, admin, alerts,
       search, currency, ai, custom-fields — some marked TODO).
     - Wrapped 7 routes with concrete permissions: clients/options
       (clients:view), berths/options (berths:view), dashboard/*
       (reports:view_dashboard), analytics (reports:view_analytics).
     Audit report at docs/runbooks/permission-audit.md. Script exits
     non-zero on any unallow-listed violation so it can become a CI gate.

Vitest: 741 -> 741 (no new tests; existing suite covers the changes).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 18:48:22 +02:00
Matt Ciaccio
16d98d630e feat(i18n): country/phone/timezone/subdivision primitives + form wiring
Cross-cutting i18n polish for forms across the marina + residential + company
domains. Introduces a single source of truth for country/phone/timezone/
subdivision data and replaces every nationality-as-free-text and timezone-
as-string Input with a dedicated combobox.

PR1  Countries — ALL_COUNTRY_CODES (~250 ISO-3166-1 alpha-2), Intl.DisplayNames
     for localized labels, detectDefaultCountry() with navigator-region
     fallback to US, CountryCombobox with regional-indicator flag glyphs +
     compact mode for inline use.
PR2  Phone — libphonenumber-js wrapper (parsePhone / formatAsYouType /
     callingCodeFor), PhoneInput with flag dropdown + national-format
     AsYouType + paste-detect that flips the country dropdown for pasted
     international strings.
PR3  Timezones — country->IANA map (250 entries, multi-zone for AU/BR/CA/CD/
     ID/KZ/MN/MX/RU/US), formatTimezoneLabel ("Europe/London (UTC+1)"),
     TimezoneCombobox with Suggested/All grouping driven by countryHint.
PR4  Subdivisions — wraps the iso-3166-2 npm package (~5000 ISO 3166-2
     codes for every country), per-country cache, SubdivisionCombobox with
     "Pick a country first" / "No regions available" empty states.
PR5  Schema deltas (migration 0015) — clients.nationality_iso, clientContacts
     {value_e164, value_country}, clientAddresses {country_iso, subdivision_iso},
     residentialClients {phone_e164, phone_country, nationality_iso, timezone,
     place_of_residence_country_iso, subdivision_iso}, companies {incorporation_
     country_iso, incorporation_subdivision_iso}, companyAddresses {country_iso,
     subdivision_iso}. Plus shared zod validators (validators/i18n.ts) used
     by every entity validator + route handler.
PR6  ClientForm + ClientDetail — CountryCombobox replaces nationality Input,
     TimezoneCombobox replaces timezone Input (driven by nationalityIso hint),
     PhoneInput conditionally rendered for phone/whatsapp contacts. Inline
     editors (InlineCountryField / InlineTimezoneField / InlinePhoneField)
     for the detail-page overview rows + ContactsEditor.
PR7  Residential client form + detail — phone -> PhoneInput, nationality/
     timezone/place-of-residence-country/subdivision rows in both create
     sheet and inline-editable detail view. Subdivision wipes when country
     flips since codes are country-scoped.
PR8  Company form + detail — incorporation country -> CountryCombobox,
     incorporation region -> SubdivisionCombobox in both modes.
PR9  Public inquiry endpoint — accepts pre-normalized phoneE164/phoneCountry
     and i18n fields from newer website builds, server-side parsePhone()
     fallback for legacy raw-international submissions. Old Nuxt builds
     keep working unchanged.
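The "regional-indicator flag glyphs" from PR1 need no image assets: an ISO 3166-1 alpha-2 code maps to a flag emoji by shifting each ASCII letter into the Unicode regional-indicator block (U+1F1E6..U+1F1FF). A sketch of that conversion:

```typescript
function flagEmoji(alpha2: string): string {
  return [...alpha2.toUpperCase()]
    .map((ch) => String.fromCodePoint(0x1f1e6 + ch.charCodeAt(0) - 65)) // 'A' -> U+1F1E6
    .join("");
}
```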

Tests: 4 unit suites for the primitives (25 tests), 1 integration spec for
the public phone-normalization path (3 tests), 1 smoke spec asserting the
combobox triggers render in all three create sheets.

Test totals: vitest 713 -> 741 (+28).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 18:13:08 +02:00
Matt Ciaccio
f52d21df83 feat(phase-b): ship analytics dashboard, alerts, scanner PWA, dedup, audit view
Phase B (Insights & Alerts) PR4-11 in one drop. Builds on the schema +
service skeletons committed in PRs 1-3.

PR4  Analytics dashboard — 4 chart types (funnel/timeline/breakdown/source),
     date-range picker (today/7d/30d/90d), CSV+PNG export per card.
PR5  Alert rail UI + /alerts page — topbar bell w/ live count, dashboard
     right-rail, three-tab page (active/dismissed/resolved), socket-driven
     invalidation. Bell lazy-loads list on popover open to keep cold pages
     fast in non-dashboard routes.
PR6  EOI queue tab on documents hub — filters to in-flight EOIs, count
     surfaces in tab label.
PR7  Interests-by-berth tab on berth detail — replaces the stub.
PR8  Expense duplicate detection — BullMQ job runs scan on create, yellow
     banner on detail w/ Merge / Not-a-duplicate, transactional merge
     consolidates receipts and archives the source.
PR9  Receipt scanner PWA + multi-provider AI — port-scoped /scan route in
     its own (scanner) group with no dashboard chrome, dynamic per-port
     manifest, OpenAI + Claude provider abstraction, admin OCR settings
     page (port-level + super-admin global default w/ opt-in fallback),
     test-connection endpoint, manual-entry fallback when no key is
     configured. Verify form always shown before save — no ghost rows.
PR10 Audit log read view — swap to tsvector full-text search on the
     existing GIN index, cursor pagination, filters for entity/action/user
     /date range, batched actor-email resolution.
PR11 Real-API tests — opt-in receipt-ocr.spec (admin save+test, optional
     real-receipt parse via REALAPI_RECEIPT_FIXTURE) and alert-engine
     socket-fanout spec gated behind RUN_ALERT_ENGINE_REALAPI. Both skip
     cleanly without their gate envs so CI stays green.

Test totals: vitest 690 -> 713, smoke 130 -> 138, realapi +2 opt-in.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 17:21:55 +02:00
Matt Ciaccio
2fa70f4582 merge: PR3 — analytics snapshot service + refresh job (Phase B)
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m1s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:54:48 +02:00
Matt Ciaccio
01b201e1a2 feat(analytics): real computations + 15-min snapshot refresh job
PR3 of Phase B. Replaces the no-op stubs in analytics.service.ts with
working drizzle queries and adds the recurring BullMQ job that warms
the cache.

Computations:
- computePipelineFunnel: groups interests by pipeline_stage filtered by
  port + range + not archived; emits 8-row stages array with conversion
  pct relative to 'open' as the funnel top.
- computeOccupancyTimeline: per day in range, counts berths covered by
  an active reservation (start_date ≤ day, end_date IS NULL OR ≥ day);
  emits {date, occupied, total, occupancyPct}.
- computeRevenueBreakdown: sums invoices.total grouped by status +
  currency; filters out archived rows.
- computeLeadSourceAttribution: counts interests by source descending;
  null source bucketed as 'unspecified'.

Public API (getPipelineFunnel, getOccupancyTimeline, etc.) reads
analytics_snapshots first; falls back to compute + writeSnapshot. TTL
15 minutes (matches the cron interval).
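The read-through pattern described above, with an in-memory map standing in for the analytics_snapshots table (names assumed):

```typescript
const TTL_MS = 15 * 60 * 1000; // matches the */15 cron interval

type Snapshot = { value: unknown; computedAt: number };
const snapshots = new Map<string, Snapshot>();

function readThrough(key: string, now: number, compute: () => unknown): unknown {
  const hit = snapshots.get(key);
  if (hit && now - hit.computedAt < TTL_MS) return hit.value; // fresh cache hit
  const value = compute(); // stale or missing: recompute and write back
  snapshots.set(key, { value, computedAt: now });
  return value;
}
```

Because the cron refresh shares the same writeSnapshot path, dashboard reads almost always hit a snapshot that is at most 15 minutes old.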

Cron:
- queue/scheduler.ts registers 'analytics-refresh' on maintenance with
  pattern '*/15 * * * *'.
- queue/workers/maintenance.ts dispatches to refreshSnapshotsForPort
  for every port; per-port try/catch so one bad port doesn't kill the
  sweep.

Tests: tests/integration/analytics-service.test.ts (9 cases). Pipeline
funnel math (incl. zero state), occupancy timeline shape/percentages
with seeded reservations, revenue grouped by status + currency, lead
source attribution incl. null bucketing, cache hit (mutate snapshot
directly → next read returns mutated value), refreshSnapshotsForPort
warms every metric×range combo.

Vitest 690/690 (+9). tsc + lint clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:54:46 +02:00
Matt Ciaccio
94f049c8b8 merge: PR2 — alert rules engine + cron + socket (Phase B)
All checks were successful
Build & Push Docker Images / lint (pull_request) Successful in 1m3s
Build & Push Docker Images / build-and-push (pull_request) Has been skipped
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:50:57 +02:00
Matt Ciaccio
df495133b7 feat(alerts): rule engine, recurring evaluator, socket fanout
PR2 of Phase B. Wires the alert framework end-to-end:

- alert-rules.ts: 10 rule evaluators implemented as pure async fns over
  the existing schema. reservation.no_agreement, interest.stale,
  document.signer_overdue, berth.under_offer_stalled, expense.duplicate,
  expense.unscanned, interest.high_value_silent, eoi.unsigned_long fire
  against real conditions. document.expiring_soon stays inert until the
  documents schema gets an expires_at column, and audit.suspicious_login
  stays inert until the auth layer logs 'login.failed' rows (TODO noted
  in the rule body).

- alert-engine.ts: runAlertEngine() walks every port × every rule and
  calls reconcileAlertsForPort. Errors per (port, rule) are collected
  in the summary, not thrown — one bad evaluator can't stop the sweep.

- alerts.service.ts: reconcileAlertsForPort now emits 'alert:created'
  socket events on insert and 'alert:resolved' on auto-resolve;
  dismissAlert emits 'alert:dismissed'. All scoped to port:{portId}
  rooms.

- socket/events.ts: adds the three Server→Client alert event types.

- queue/scheduler.ts: registers 'alerts-evaluate' on the maintenance
  queue with cron */5 * * * * (every 5 min, per spec risk register).

- queue/workers/maintenance.ts: dispatches 'alerts-evaluate' to
  runAlertEngine; logs sweep summary.

Tests:
- tests/integration/alerts-engine.test.ts (6 cases): seeds reservation
  → fires, runs twice → no dupe, adds agreement → auto-resolves; seeds
  stale interest → fires; hot lead silent → critical; engine summary
  shape on no-data port. Socket emit module is vi.mocked.

Vitest 681/681 (was 675; +6). tsc clean. Lint clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:50:55 +02:00
Matt Ciaccio
639025ebf9 merge: PR1 — Phase B schema + service skeletons (Phase B)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:43:03 +02:00
Matt Ciaccio
e77d55ac50 feat(insights): Phase B schema + service skeletons
PR1 of Phase B per docs/superpowers/specs/2026-04-28-phase-b-insights-alerts-design.md.
Lays the foundation that PRs 2-10 will fill in with behaviour.

Schema (migration 0014):
- alerts table with rule-engine fields (rule_id, severity, link,
  entity_type/id, fingerprint, fired/dismissed/acknowledged/resolved
  timestamps, jsonb metadata). Partial-unique fingerprint index keeps
  one open row per (port, rule, entity); separate indexes power
  severity-filtered and time-ordered queries.
- analytics_snapshots (port_id, metric_id) -> jsonb cache + computedAt
  for the 15-min recurring refresh.
- expenses: duplicate_of self-FK, dedup_scanned_at, ocr_status/raw/
  confidence; partial index on (port, vendor, amount, date) where
  duplicate_of IS NULL drives the dedup heuristic.
- audit_logs.search_text: GENERATED ALWAYS tsvector over
  action+entity_type+entity_id+user_id, GIN-indexed (drizzle can't
  model GENERATED ALWAYS in TS yet, so the migration appends manual
  ALTER + the GIN index).

Service skeletons in src/lib/services/:
- alerts.service.ts: fingerprintFor, reconcileAlertsForPort (upsert +
  auto-resolve), dismiss, acknowledge, listAlertsForPort.
- alert-rules.ts: RULE_REGISTRY of 10 rule evaluators (currently no-op);
  PR2 fills in the bodies.
- analytics.service.ts: readSnapshot/writeSnapshot with 15-min TTL +
  no-op compute* stubs for the four chart series; PR3 fills behavior.
- expense-dedup.service.ts: scanForDuplicates + markBestDuplicate
  using the partial dedup index. PR8 wires the BullMQ trigger.
- expense-ocr.service.ts: OcrResult/OcrLineItem types + ocrReceipt
  stub. PR9 wires Claude Vision (Haiku 4.5 + ephemeral system-prompt
  cache).
- audit-search.service.ts: tsvector @@ plainto_tsquery + cursor
  pagination on (createdAt, id). PR10 wires the admin UI.
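A plausible shape for fingerprintFor: a stable digest over the tuple the partial-unique index protects, so re-running a rule upserts instead of duplicating an open alert. (Hash choice and field order are assumptions.)

```typescript
import { createHash } from "node:crypto";

function fingerprintFor(
  portId: string,
  ruleId: string,
  entityType: string,
  entityId: string,
): string {
  // Same (port, rule, entity) always yields the same fingerprint, which the
  // partial-unique index uses to keep at most one open row per tuple.
  return createHash("sha256")
    .update([portId, ruleId, entityType, entityId].join("|"))
    .digest("hex");
}
```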

tsc clean, lint clean, vitest 675/675 (one unrelated AES random-output
flake passes solo).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 14:43:01 +02:00
410 changed files with 102864 additions and 3005 deletions

.gitignore

@@ -20,10 +20,17 @@ tsconfig.tsbuildinfo
docker-compose.override.yml
.remember/
.DS_Store
eoi/
# Root-only ad-hoc EOI scratch dir; routes under src/app/.../eoi/ must NOT match.
/eoi/
# Brainstorming companion mockup files
.superpowers/
# Ad-hoc screenshots / scratch artifacts at repo root
/*.png
# Legacy Nuxt portal — kept on disk for reference, not tracked here
/client-portal/
# Mobile audit screenshots — generated locally, regenerable
/.audit/

Submodule client-portal deleted from 84f89f9409


@@ -0,0 +1,199 @@
# Backup and restore runbook
This runbook documents what gets backed up, how often, where it lands, and
the exact commands to restore the system from a cold start. The goal is
that any operator who has the off-site backup credentials can bring the
CRM back up on a clean host without help.
## Scope of a "full backup"
The CRM has three stateful surfaces. All three must be captured for a
restore to be useful.
| Surface | Holds | Risk if missing |
| ------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------- |
| **PostgreSQL** (`port_nimara_crm`) | Every relational record: clients, yachts, companies, interests, reservations, invoices, audit log, GDPR exports, AI usage ledger, Documenso webhook receipts, etc. | Total data loss — site is unrecoverable. |
| **MinIO bucket** (`MINIO_BUCKET`, default `crm-files`) | Receipts, signed contracts, EOI PDFs, GDPR export ZIPs, document attachments. | Files reachable by row references in Postgres become 404s. |
| **`.env` + secrets** | DB password, MinIO keys, Documenso webhook secret, SMTP creds, encryption key (`ENCRYPTION_KEY`). | OCR API keys re-resolve from `system_settings` (encrypted at rest), but **without the original `ENCRYPTION_KEY` they're unreadable**. |
The Redis instance is not backed up. It only holds queue state, rate-limit
counters, and Socket.IO presence — all reconstructable. Stop the workers
during a restore so the queue starts clean.
## Backup schedule
Defaults are tuned for a single-port deployment with O(10k) clients. Bump
on the producing side as scale demands.
| Job | Frequency | Retention | Where |
| ---------------------------------- | -------------------- | ----------------------------- | -------------------------------------------------------------------- |
| `pg_dump` (custom format, gzipped) | Hourly | 7 days hourly + 30 days daily | `${BACKUP_BUCKET}/pg/<host>/<UTC date>/<hour>.dump.gz` |
| MinIO mirror | Hourly (incremental) | 30 days versions | `${BACKUP_BUCKET}/minio/` |
| `.env` snapshot (encrypted) | On change (manual) | Forever | Password manager / secrets vault — **never the same bucket as data** |
The hourly cadence is the right answer for this workload — invoices and
contracts cluster around business hours, and an hour of lost work is the
worst-case data loss window most clients will tolerate. Promote to 15-min
WAL streaming if a customer demands tighter RPO.
## Required environment variables
The scripts below read these. Store them in a CI secret store, not the
host's bash profile.
```
# Source (the running CRM database)
DATABASE_URL=postgresql://crm:<pw>@<host>:<port>/port_nimara_crm
# MinIO (source bucket — the live one)
MINIO_ENDPOINT=minio.letsbe.solutions
MINIO_PORT=443
MINIO_USE_SSL=true
MINIO_ACCESS_KEY=<live key>
MINIO_SECRET_KEY=<live secret>
MINIO_BUCKET=crm-files
# Backup destination (a *separate* MinIO/S3 endpoint or a different bucket
# with no IAM overlap with the live keys)
BACKUP_S3_ENDPOINT=https://s3.eu-west-1.amazonaws.com
BACKUP_S3_REGION=eu-west-1
BACKUP_S3_BUCKET=portnimara-backups-prod
BACKUP_S3_ACCESS_KEY=<dedicated read+write key for this bucket only>
BACKUP_S3_SECRET_KEY=<...>
# Optional: GPG-encrypts dumps at rest to this recipient. Narrows the blast
# radius if the backup bucket itself is compromised.
BACKUP_GPG_RECIPIENT=ops@portnimara.com
```
## Provisioning the backup destination
1. Create a dedicated S3-compatible bucket in a **different account** from
the live infra. AWS S3, Backblaze B2, or a separately-credentialed
MinIO instance all work.
2. Apply object-lock or versioning so an attacker who steals the backup
write key still can't permanently delete history.
3. Generate IAM credentials scoped to `s3:PutObject`, `s3:GetObject`,
`s3:ListBucket` on this bucket only. Inject them as
`BACKUP_S3_*` above. Do not reuse the live `MINIO_*` keys.
4. Set a 90-day lifecycle rule that transitions objects older than 30
days to cold storage and deletes them at 90 days. Past 90 days it's
cheaper to restart from a snapshot taken outside the system.
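Step 4 can be pinned down declaratively. A sketch of the rule set in the shape AWS's lifecycle API expects — the rule ID and the `GLACIER` storage class are assumptions here, and other S3-compatible stores spell these fields differently:

```typescript
// Lifecycle policy matching the 30-day cold transition + 90-day expiry above.
// Field names follow the AWS S3 API (PutBucketLifecycleConfiguration);
// Backblaze B2 / MinIO equivalents differ slightly.
const backupLifecycle = {
  Rules: [
    {
      ID: "backups-cold-then-expire", // assumed name
      Status: "Enabled",
      Filter: { Prefix: "" }, // whole bucket
      Transitions: [{ Days: 30, StorageClass: "GLACIER" }], // cold after 30 days
      Expiration: { Days: 90 }, // deleted at 90 days
    },
  ],
};
```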
## The scripts
Three scripts in `scripts/backup/`:
- `pg-backup.sh` — runs `pg_dump`, gzips, optionally GPG-encrypts, uploads
- `minio-mirror.sh` — `mc mirror` of the live bucket → backup bucket
- `restore.sh` — interactive restore (DB + MinIO) given a snapshot path
Make them executable and wire them into cron / GitHub Actions / your
scheduler of choice. Sample crontab on the worker host:
```cron
# Hourly DB dump at minute 7
7 * * * * /opt/pncrm/scripts/backup/pg-backup.sh >> /var/log/pncrm-backup.log 2>&1
# Hourly MinIO mirror at minute 17 (offset so the two don't fight for I/O)
17 * * * * /opt/pncrm/scripts/backup/minio-mirror.sh >> /var/log/pncrm-backup.log 2>&1
# Weekly restore drill (smoke-test to a throwaway DB on Sunday at 03:00)
0 3 * * 0 /opt/pncrm/scripts/backup/restore.sh --drill >> /var/log/pncrm-restore-drill.log 2>&1
```
## Restoring from cold
These steps have been rehearsed against the dev environment; expect them
to take 15–30 minutes for a typical port. **The drill (last cron line
above) ensures the runbook stays correct — if the drill fails, the
real restore will too.**
### 0. Stop everything that writes
```bash
docker compose -f docker-compose.prod.yml stop web worker scheduler
# Leave postgres + minio + redis up; we'll point them at restored data.
```
### 1. Restore PostgreSQL
```bash
# Find the dump you want. Prefer the most recent successful hour.
mc ls "$BACKUP_S3_BUCKET/pg/$(hostname)/" | tail
SNAPSHOT="2026-04-28/14.dump.gz"
# Pull it.
mc cp "$BACKUP_S3_BUCKET/pg/$(hostname)/$SNAPSHOT" /tmp/
# If BACKUP_GPG_RECIPIENT was set on the producer side, the object carries a
# .gpg suffix — decrypt it first.
gpg --decrypt /tmp/14.dump.gz.gpg > /tmp/14.dump.gz
# Drop & recreate the database. Connect to the 'postgres' maintenance DB —
# Postgres refuses to drop the database the current session is connected to.
MAINT_URL="${DATABASE_URL%/*}/postgres"
psql "$MAINT_URL" -c 'DROP DATABASE IF EXISTS port_nimara_crm WITH (FORCE);'
psql "$MAINT_URL" -c 'CREATE DATABASE port_nimara_crm;'
# The 'restrict' FK on gdpr_exports.requested_by means tables must restore in
# dependency order — pg_restore handles this.
gunzip -c /tmp/14.dump.gz | pg_restore --no-owner --no-privileges \
--dbname "$DATABASE_URL"
```
### 2. Restore MinIO
```bash
# Sync the backup bucket back over the live one. --overwrite handles
# files that were modified between snapshots.
mc mirror --overwrite \
"$BACKUP_S3_BUCKET/minio/" \
"live/$MINIO_BUCKET/"
```
### 3. Restore secrets
The `.env` file is **not** in object storage. Pull it from the password
manager / secrets vault. Verify `ENCRYPTION_KEY` matches the value used
when the database was last running — if it doesn't, rows in
`system_settings` (OCR API keys, etc.) decrypt to garbage and the OCR
"Test connection" button will return an opaque error. There is no
recovery path; the keys must be re-entered through the admin UI.
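The failure mode is easy to demonstrate. A minimal sketch assuming AES-256-GCM (the CRM's actual cipher is not specified here): decrypting with any key other than the one that encrypted throws an authentication error, which is exactly the kind of opaque failure described above.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Encrypt a value the way an ENCRYPTION_KEY-protected setting might be stored.
function encrypt(plaintext: string, key: Buffer) {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decrypt(enc: { iv: Buffer; tag: Buffer; data: Buffer }, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, enc.iv);
  decipher.setAuthTag(enc.tag);
  return Buffer.concat([decipher.update(enc.data), decipher.final()]).toString("utf8");
}

const rightKey = randomBytes(32);
const wrongKey = randomBytes(32);
const stored = encrypt("ocr-api-key-123", rightKey);

decrypt(stored, rightKey); // round-trips fine
// decrypt(stored, wrongKey) — throws an auth error, not a wrong-looking string
```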
### 4. Bring services back up
```bash
docker compose -f docker-compose.prod.yml up -d
# Watch the worker logs; expect a flurry of socket reconnections, then quiet.
docker compose -f docker-compose.prod.yml logs -f worker
```
### 5. Verify
Tail through the smoke checklist, in order:
1. **DB up** — `psql "$DATABASE_URL" -c 'SELECT count(*) FROM clients;'`
matches the producer-side count from the snapshot's hour.
2. **MinIO up** — open any client with attachments in the CRM, click a
receipt thumbnail; verify the signed URL serves the file.
3. **Documenso webhooks** — re-trigger one in the Documenso admin and
confirm `audit_logs` records the receipt.
4. **Email** — send a portal invite to a real address.
5. **Realtime** — open two browser windows, edit a client in one, watch
the other update via Socket.IO.
6. **AI usage ledger** — `SELECT count(*) FROM ai_usage_ledger;` is
non-empty if AI was being used. Old rows survive, but the budget gates
reset at the next month rollover.
## Drill schedule
The weekly drill (cron line above) runs `restore.sh --drill` against a
throwaway database and a sandbox MinIO bucket. It must produce zero diff
between the restored row counts and the live row counts (modulo the
hour-or-so the drill takes to run).
Failure modes the drill catches before they bite production:
- New tables created outside the `public` schema (we run `pg_dump` with its
default schema selection, which captures everything in `public` — but a
future developer adding a `tenant_X` schema will silently lose it).
- MinIO bucket-policy changes that block the backup-side `s3:GetObject`
on certain prefixes.
- GPG passphrase rotation that wasn't propagated to the restore host.
- A `pg_restore` version skew with the producer-side `pg_dump`.

# Email deliverability runbook
The CRM sends transactional email through three different surfaces. Each
has a different failure mode when it lands in spam. This runbook covers
how to diagnose, fix, and verify each path.
## What email the CRM sends
| Surface | Trigger | Template | Default `from` |
| ----------------------------------------- | -------------------------------------------------------------------------------------- | ----------------------------------------------------------------- | ----------------------------------------------------- |
| Portal activation / password-reset | Admin invites a client to the portal | `src/lib/email/templates/portal-auth.ts` | per-port `email_settings.from_address` or `SMTP_FROM` |
| Inquiry confirmation + sales notification | Public website POSTs to `/api/public/interests` or `/api/public/residential-inquiries` | `inquiry-client-confirmation.ts`, `inquiry-sales-notification.ts` | same |
| GDPR export ready | Staff requests an export with `emailToClient=true` | inline in `gdpr-export.service.ts` | same |
| Documenso reminders | Cadence job fires for an unsigned signer | `documenso/reminders/*` | same |
Documenso _itself_ sends signing requests with its own `from` address —
those don't flow through this codebase. SPF/DKIM for the Documenso
sender is the Documenso operator's problem, not yours.
## DNS records
For every domain that appears in a `from:` header you must publish:
### 1. SPF
A single TXT record at the apex authorizing whichever provider is
sending. Multiple SPF records on the same name **break SPF entirely** —
combine them into one.
```
v=spf1 include:_spf.google.com include:amazonses.com -all
```
The `-all` (hardfail) is correct for transactional mail. Switch to `~all`
(softfail) only as a temporary diagnostic when migrating providers.
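Because multiple SPF TXT records invalidate SPF outright, the merge is mechanical: collect every mechanism from each record, dedupe, and emit a single `v=spf1` string with one `all` qualifier. A hypothetical helper (not part of the CRM) sketching that:

```typescript
// Merge several SPF records into one. Mechanisms keep first-appearance
// order; the single "all" qualifier comes from the strictest input record.
function mergeSpf(records: string[]): string {
  const mechanisms: string[] = [];
  let all = "?all";
  const strictness = ["?all", "~all", "-all"]; // ascending strictness
  for (const record of records) {
    for (const token of record.split(/\s+/)) {
      if (token === "v=spf1") continue;
      if (token.endsWith("all")) {
        if (strictness.indexOf(token) > strictness.indexOf(all)) all = token;
      } else if (!mechanisms.includes(token)) {
        mechanisms.push(token);
      }
    }
  }
  return ["v=spf1", ...mechanisms, all].join(" ");
}
```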
### 2. DKIM
Each provider publishes its own selector. Common shapes:
- Google Workspace: `google._domainkey` → 2048-bit RSA pubkey (rotate every 12 months).
- Amazon SES: `xxxx._domainkey`, `yyyy._domainkey`, `zzzz._domainkey` (three CNAMEs SES gives you).
- Postmark / Resend / Mailgun: one CNAME per selector.
Verify alignment — the `d=` value in the DKIM signature must match the
`From:` domain (relaxed alignment is fine, strict is overkill).
### 3. DMARC
Start at `p=none` while you build deliverability data, then upgrade.
```
_dmarc 14400 IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@portnimara.com; ruf=mailto:dmarc@portnimara.com; fo=1; adkim=r; aspf=r; pct=100"
```
`rua` (aggregate reports) is the diagnostic feed — set it before the
first send so the first weekly report has data.
### 4. MX (only if you also receive)
The CRM's IMAP probe (`scripts/dev-imap-probe.ts`) and the inbound thread
sync rely on a real mailbox. Whoever runs that mailbox publishes the MX
records — typically Google Workspace or a dedicated provider. Don't add
an MX pointing at the CRM host; it doesn't accept SMTP IN.
## Per-port overrides
Each port can override `from_address`, `from_name`, and SMTP creds via
the admin email-settings page. When set, `getPortEmailConfig()` returns
those values and `sendEmail()` uses them in preference to the global
`SMTP_*` env. **The override domain still needs SPF / DKIM / DMARC** on
its own DNS — without them, every send from that port lands in spam.
When a customer reports "our portal invite didn't arrive":
1. Pull the port's email settings from the admin UI. Check `from_address`.
2. Run `dig TXT <from-domain>` and `dig TXT _dmarc.<from-domain>`.
Confirm SPF includes the SMTP provider's domain and DMARC exists.
3. Send a probe through `mail-tester.com`: paste the address into a
test send, click the score breakdown.
4. Score < 8/10 → fix whatever's flagged before doing anything else in
this runbook.
## Diagnosing a "didn't arrive" report
Order matters — go top-down, stop when one of these is the answer.
### Step 1: Was the send attempted?
```bash
# Tail the worker logs for the recipient address.
docker compose logs worker | grep '<recipient>'
```
You'll see one of three patterns:
- **Nothing**: The job didn't run. Check that BullMQ actually queued it.
`redis-cli LLEN bull:email:wait` — if non-zero, the worker is dead.
`docker compose logs scheduler | tail` to see why.
- **`Email sent`** with a message-id: The provider accepted it. Move to
Step 2.
- **`SendError`**: Provider rejected. The error string says why
(auth, rate limit, blocked recipient).
### Step 2: Is `EMAIL_REDIRECT_TO` set?
In dev/test we set `EMAIL_REDIRECT_TO=ops@portnimara.com` so seeded fake
clients don't get real email. **It must be unset in production.**
```bash
# On the production host:
docker exec pncrm-web printenv EMAIL_REDIRECT_TO
# Should print nothing.
```
If it's set, every email is going to the redirect target with the
original recipient prefixed in the subject — the customer never sees it.
### Step 3: Did it land but get filtered?
Ask the recipient to check:
- Spam / Junk folder
- Gmail "Promotions" tab
- Outlook "Other" folder (vs Focused)
- The Quarantine console if they're on M365 with anti-spam enabled
If found in a spam folder: the email arrived; the recipient's filter
classified it. SPF/DKIM/DMARC alignment is suspect — re-run the
mail-tester probe from above.
### Step 4: Was the recipient on a suppression list?
Some providers (SES, Postmark) maintain a suppression list — once a
domain bounces from an address, future sends are dropped silently.
```bash
# SES example:
aws ses list-suppressed-destinations --region eu-west-1
```
If the recipient is suppressed, remove them and ask them to retry. The
CRM doesn't track suppression locally; that's the provider's job.
## When migrating SMTP providers
1. Add the new provider's DKIM CNAMEs alongside the old ones.
2. Add the new provider's `include:` to the existing SPF record.
3. Wait 48 hours for DNS to propagate and DMARC reports to confirm both
providers align.
4. Switch `SMTP_*` env to the new provider on a single staging host.
5. Send through the staging host for a week. Watch DMARC reports.
6. Cut production over.
7. Wait two weeks before removing the old provider's DNS — undelivered
bounce reports keep arriving for a while.
## Testing a deliverability fix
There's no automated test for "did this email reach the inbox" — that's a
property of the recipient's filter, which we don't control. The closest
proxy is the realapi suite:
```bash
pnpm exec playwright test --project=realapi
```
It runs `tests/e2e/realapi/portal-imap-activation.spec.ts` which sends a
real portal-invite email through SMTP, then polls the configured IMAP
mailbox for the activation link. If it appears within 30 seconds, the
SMTP→DKIM→DMARC chain is alive end-to-end. If the test times out, work
backwards through this runbook.
The realapi suite needs `SMTP_*` and `IMAP_*` env vars — see the
"Optional dev/test-only env vars" block in `CLAUDE.md`.
## Bounce handling
The CRM doesn't currently process bounces. If you start seeing volume:
- Set up the provider's webhook (SES → SNS → Lambda; Postmark → webhook
URL) to POST bounce events to a new `/api/webhooks/email-bounce` route.
- Persist the bounced address into a `email_suppressions` table.
- Have `sendEmail()` consult that table before each send.
That work isn't in scope yet; this runbook just flags it as the next
deliverability gap.
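A sketch of what that gate inside `sendEmail()` could look like — the function and table names here are hypothetical, not existing CRM code:

```typescript
// Hypothetical suppression gate. In the real build the Set would be a
// SELECT against the proposed email_suppressions table.
type SendResult = { sent: boolean; reason?: string };

function sendWithSuppressionCheck(
  to: string,
  suppressions: Set<string>,
  deliver: (to: string) => void,
): SendResult {
  if (suppressions.has(to.toLowerCase())) {
    // Skip the send instead of bouncing again; log for visibility.
    return { sent: false, reason: "suppressed" };
  }
  deliver(to);
  return { sent: true };
}
```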

# Permission Matrix Audit
Scanned 182 route files under `src/app/api/v1/`.
**No violations.** Every internal v1 handler is permission-gated.
**Allow-listed:** 46 handler(s) intentionally skip `withPermission`.
| File | Method | Reason |
| ---------------------------------------------------------------- | ------ | --------------------------------------------------------------------------------- |
| `src/app/api/v1/admin/alerts/run-engine/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/connections/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/errors/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/health/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/route.ts` | PUT | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/test/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/[jobId]/retry/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/[jobId]/route.ts` | DELETE | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/users/options/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/ai/email-draft/[jobId]/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/email-draft/route.ts` | POST | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/interest-score/bulk/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/interest-score/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/alerts/[id]/acknowledge/route.ts` | POST | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/[id]/dismiss/route.ts` | POST | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/count/route.ts` | GET | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/route.ts` | GET | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/berth-reservations/[id]/route.ts` | PATCH | TODO: PATCH should map to reservations:edit (not currently in catalog). |
| `src/app/api/v1/currency/convert/route.ts` | POST | Currency reference data; port-scoped, no PII. |
| `src/app/api/v1/currency/rates/refresh/route.ts` | POST | TODO: gate with admin:manage_settings — currently allow-listed. |
| `src/app/api/v1/currency/rates/route.ts` | GET | Currency reference data; port-scoped, no PII. |
| `src/app/api/v1/custom-fields/[entityId]/route.ts` | GET | TODO: needs custom_fields:\* permission. PUT path internally validated. |
| `src/app/api/v1/custom-fields/[entityId]/route.ts` | PUT | TODO: needs custom_fields:\* permission. PUT path internally validated. |
| `src/app/api/v1/expenses/export/parent-company/route.ts` | POST | Internally gated by isSuperAdmin inside the handler. |
| `src/app/api/v1/me/route.ts` | GET | Self-endpoint — auth is sufficient. |
| `src/app/api/v1/me/route.ts` | PATCH | Self-endpoint — auth is sufficient. |
| `src/app/api/v1/notifications/[notificationId]/route.ts` | PATCH | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/preferences/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/preferences/route.ts` | PUT | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/read-all/route.ts` | POST | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/unread-count/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/saved-views/[id]/route.ts` | PATCH | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/[id]/route.ts` | DELETE | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/route.ts` | GET | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/route.ts` | POST | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/search/recent/route.ts` | GET | Port-scoped search — results filtered by auth context (resources have own perms). |
| `src/app/api/v1/search/route.ts` | GET | Port-scoped search — results filtered by auth context (resources have own perms). |
| `src/app/api/v1/settings/feature-flag/route.ts` | GET | Public read of feature-flag bool — no PII; auth is sufficient. |
| `src/app/api/v1/tags/options/route.ts` | GET | Tags are cross-cutting reference data; port-scoped via auth. |
| `src/app/api/v1/tags/route.ts` | GET | Tags are cross-cutting reference data; port-scoped via auth. |
| `src/app/api/v1/users/me/preferences/route.ts` | GET | User-self preferences — caller is the resource owner. |
| `src/app/api/v1/users/me/preferences/route.ts` | PATCH | User-self preferences — caller is the resource owner. |

# Google Workspace inbox-triage integration (exploratory)
**Status:** Exploratory — not approved for build
**Date:** 2026-04-29
**Tracks:** AI inbox-triage, Google Workspace email connection
## What this spec is for
The user has flagged inbox-triage as the most valuable AI surface left to
build, but conditioned email integration on it being via Google Workspace
specifically (not generic IMAP), with a per-port toggle so clients who
don't use GWS aren't billed for capability they can't reach.
This document captures what that build actually costs — especially on
the Google side, which is where most teams underestimate the work — so
we can decide whether to commit before writing any code. **Nothing in
this spec is approved for implementation.** The deliverable is a go /
no-go decision and, if go, a scope choice between three deployment
models that cost wildly different amounts of calendar time.
## What inbox-triage actually does for the user
Concretely, on the staff member's desktop:
1. **Linked-inbox panel on the client detail page.** When you open
`/[port]/clients/<id>` you see the last N email threads with that
client, pulled from the staff member's own Gmail. Each thread has
the latest message preview, an "open in Gmail" deep-link, and a
"draft reply" button (Phase 2+).
2. **Inbox triage queue.** A new top-level page `/[port]/inbox` that
lists unread/unanswered threads ranked by AI-assessed importance
(high-value client, contractual urgency, chase-overdue). Each row
has one-click actions: "log this as a note on the client",
"create a follow-up reminder", "draft reply".
3. **Email-driven alerts.** When a high-value client emails and no one
responds within X hours, the existing alerts engine fires a
`inbox.unanswered_high_value` rule (slots into the alert framework
from Phase B without schema change).
4. **Reply drafts (Phase 3).** AI generates a reply draft grounded in
the client's CRM record (open interests, pending reservations,
recent invoices). Staff edit and send through Gmail.
The value is selective: a port with three staff members fielding 50
client emails a day saves maybe an hour a day collectively if the
ranking is right. Below that volume the build doesn't pay back.
## What already exists in the codebase
The CRM is roughly halfway scaffolded for this:
| Surface | Status | Notes |
| ----------------------------------------------- | ----------------------- | ------------------------------------------------------------------------------------------------------------------------ |
| `email_accounts` table | ✅ Exists | Has `provider: 'google' \| 'outlook' \| 'custom'` discriminator and `imap_*` / `smtp_*` cols. Built for IMAP, not OAuth. |
| `email_threads` / `email_messages` tables | ✅ Exists | Already linked to `clientId`. Schema is good as-is for Gmail. |
| `email-threads.service.ts` `syncInbox()` | ⚠ Stub-ish | IMAP-flow only. Won't reach Gmail without OAuth + Gmail API rewrite. |
| `email` BullMQ queue + `inbox-sync` job name | ✅ Exists | Worker dispatches on the job name; new sync impl drops in. |
| `google_calendar_tokens` table | ✅ Exists | OAuth token storage shape we can mirror for Gmail. |
| Per-port email override (port `email_settings`) | ✅ Exists | Used for outbound only today; Gmail integration is per-staff-user, not per-port. |
| `ai_usage_ledger` + per-port `aiEnabled` flag | ✅ Exists (Phase 3a/3b) | Triage AI calls book against the same ledger. |
| `withRateLimit('ai', ...)` wrapper | ✅ Exists (Phase 3c) | Caps triage AI traffic at 60/min/user out of the box. |
Net: schemas are mostly right. The OAuth flow, Gmail API client, push
notification receiver, and triage classifier are the new builds.
## Why Google Workspace specifically
The user's stated constraint: "I don't think we need email integration
unless we connect it to Google Workspace." Reasons that hold up:
- **No password storage.** OAuth tokens are revocable, scoped, and
rotate. IMAP requires app passwords, which Google has been actively
deprecating since 2024 — they'll be gone for the workspace plans
this product targets.
- **Push notifications, not polling.** Gmail's `users.watch` API plus
Google Pub/Sub means we get an HTTP callback within seconds of a new
message landing. IMAP requires polling on a 30-60 second cadence,
which costs more and lags worse.
- **Search and labels.** The Gmail API exposes label management and
full-text search natively; IMAP search is much weaker.
- **Threading.** Gmail's `threadId` is canonical. Reconstructing
threads over IMAP from `In-Reply-To` / `References` headers is
reliable in theory, painful in practice.
Microsoft 365 is the obvious peer integration but is out of scope here.
The Graph API model is similar enough that a future M365 path can reuse
most of the storage shape.
## Three deployment models — pick one before building
This is the most important decision in the spec. Each model has
different OAuth-verification consequences, which dominate everything
else.
### Model A — Marketplace-published OAuth app
A single OAuth client owned by Port Nimara, listed in the Google
Workspace Marketplace, that any GWS customer can install. Each staff
member clicks "Connect Gmail," consents to the scopes, and the CRM
stores their refresh token.
**Google-side work:**
1. Build the OAuth flow in CRM (~1 week).
2. Submit for OAuth verification. Gmail's `gmail.readonly` /
`gmail.modify` scopes are **restricted scopes** — they require:
- Domain-verified production URLs
- A homepage with a privacy policy that explicitly enumerates which
scopes are used and why
- A demo video (literally a screen recording) showing the consent
screen and what happens next
- **A third-party security assessment from a Google-approved
vendor** ($15k–$75k, 6–12 weeks)
- A Cloud Application Security Assessment (CASA) report
3. Marketplace listing review (~2 weeks after CASA passes).
**Calendar time:** 4–6 months.
**Money:** $15k–$75k for the security assessment alone.
**Recurring:** Re-verification every 12 months.
Right answer if Port Nimara wants to be the marina-CRM that ships GWS
out of the box for _any_ customer. Wrong answer if there are <5
customers who'd use it.
### Model B — Per-customer "Internal" OAuth app
Each customer's GWS admin creates an OAuth client _inside their own
workspace_ and gives Port Nimara the client ID + secret. Because the
app is "Internal," Google skips verification entirely — the consent
screen is unverified-but-permitted. Tokens never cross workspace
boundaries.
**Google-side work per customer:**
1. Customer's GWS admin enables the Gmail API in their Cloud project.
2. Creates an OAuth 2.0 client ID with type "Internal" + your CRM's
redirect URI.
3. Hands the client ID + secret to Port Nimara out-of-band.
4. Staff connect their Gmail through that client.
**Calendar time per customer:** ~1 hour of admin work.
**Money:** $0.
**Limit:** Doesn't span across GWS workspaces. A user with two GWS
accounts (e.g. the marina + a personal workspace) can only connect the
one matching the OAuth client.
This is the **clear winner for the current customer base**: small
number of customers, each with their own GWS workspace, and each
buying the integration as part of an onboarding conversation.
### Model C — Forward-to-CRM mailbox
The CRM exposes a per-port email alias (e.g.
`port-nimara-NN@inbox.portnimara.com`). Customers configure a Gmail
filter or mailing rule that BCCs that alias on relevant threads. The
CRM ingests via SMTP and runs the same triage pipeline.
**Google-side work:** None. Customer does it as a Gmail filter.
**Calendar time:** ~1 week of CRM-side build.
**Limit:** Receive-only — no reply drafts, no thread state changes,
no labels. The "draft reply" feature in Phase 3 above is impossible
under this model.
Model C is the right answer if the user wants to ship inbox-triage
_now_ and decide on bidirectional Gmail integration later. The schema
is designed so the model can be upgraded to A or B without data
migration.
### Recommendation
**Build Model B first.** It costs nothing on the Google side, takes
~3 weeks of CRM work, and matches the actual customer profile.
**Promote to Model A only after 3+ paying customers ask for it
unprompted.** Until then, the security-assessment cost can't justify
itself.
Model C as a fallback for customers who refuse to set up an Internal
OAuth app. Build it last, lazily — the schema accommodates it.
## End-to-end flow (Model B)
### 1. Per-port OAuth-app config
New admin page `/[port]/admin/google-workspace`:
- Field: "OAuth client ID" (their internal client ID)
- Field: "OAuth client secret" (encrypted at rest using `ENCRYPTION_KEY`)
- Field: "Authorized redirect URI" (read-only; we display the value
they need to paste into their Google Cloud Console)
- Toggle: "Enable Gmail integration for this port"
Stored in `system_settings` under key `gws.config`, port-scoped.
Resolution mirrors the existing OCR config service.
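A sketch of the `gws.config` value shape — field names are assumed from the admin-page fields above, and the redirect path from the connect flow in the next section:

```typescript
// Hypothetical per-port config stored under system_settings key "gws.config".
type GwsPortConfig = {
  clientId: string;
  clientSecretEnc: string; // encrypted at rest with ENCRYPTION_KEY, like OCR keys
  redirectUri: string;     // read-only; pasted into the customer's Cloud Console
  gmailEnabled: boolean;   // the per-port toggle
};

// The redirect URI the admin page would display for pasting into
// Google Cloud Console (path taken from the connect flow below).
function redirectUriFor(baseUrl: string): string {
  return `${baseUrl}/api/v1/auth/gws/callback`;
}
```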
### 2. Per-staff connect flow
Staff member visits `/[port]/me/integrations`, clicks "Connect Gmail."
```
GET /api/v1/auth/gws/start
→ looks up port's gws.config
→ builds Google authorize URL with port's client_id + state token
→ 302 to Google
[ user consents ]
→ 302 back to /api/v1/auth/gws/callback?code=…&state=…
→ exchanges code for tokens via port's client_secret
→ stores in new `gws_user_tokens` table (encrypted)
→ schedules an `inbox-watch` job
```
### 3. Push notification subscription
After tokens are stored, the worker calls
`gmail.users.watch({ topicName: <Pub/Sub topic>, labelIds: ['INBOX'] })`.
Gmail then posts to a Pub/Sub topic on every inbox change. The CRM
exposes a Pub/Sub push subscription endpoint at
`/api/webhooks/gmail-push` which fetches the changed messages via the
delta `historyId` and writes them into `email_messages`.
Watch subscriptions expire every 7 days. A maintenance job
re-establishes them daily.
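The push payload Gmail delivers is a Pub/Sub push envelope whose `message.data` is base64-encoded JSON carrying the mailbox address and the new `historyId`; the `gws_push_log` dedupe keys on the Pub/Sub message id. A sketch of the receiver's decode step, with the route wiring omitted and an in-memory Set standing in for the table:

```typescript
type GmailNotification = { emailAddress: string; historyId: number };

// Pub/Sub push envelope: { message: { data: base64(JSON), messageId }, ... }.
function decodeGmailPush(body: { message: { data: string; messageId: string } }) {
  const json = Buffer.from(body.message.data, "base64").toString("utf8");
  return {
    notification: JSON.parse(json) as GmailNotification,
    pubsubMessageId: body.message.messageId,
  };
}

const seen = new Set<string>(); // stands in for gws_push_log

function handlePush(body: { message: { data: string; messageId: string } }): boolean {
  const { notification, pubsubMessageId } = decodeGmailPush(body);
  if (seen.has(pubsubMessageId)) return false; // Gmail re-delivers; dedupe
  seen.add(pubsubMessageId);
  // ...here: queue a delta sync from notification.historyId...
  void notification;
  return true;
}
```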
### 4. Triage pipeline
For each new inbound message:
1. Match the sender: look up `from_address` in `client_contacts` (email
channel) to resolve the linked `clients` / `companies` record. Persist a
thread→client link if found.
2. If port has `aiEnabled` AND `gws.triageEnabled`, queue an `ai`
job that classifies the thread:
- `urgency`: low / medium / high
- `category`: invoice-question / availability / contract / other
- `requires_response`: boolean
3. AI call records into `ai_usage_ledger` with `feature='inbox_triage'`.
The existing per-port budget gates apply automatically.
4. Triage output written to a new `email_triage` table keyed on
`email_messages.id`.
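The classifier output in step 2 needs validating before it hits `email_triage` — models drift off-catalog. A hypothetical normalizer, with the enum values taken from this spec and everything else assumed:

```typescript
type Urgency = "low" | "medium" | "high";
type Category = "invoice-question" | "availability" | "contract" | "other";

type TriageRow = {
  messageId: string;
  urgency: Urgency;
  category: Category;
  requiresResponse: boolean;
};

const URGENCIES: Urgency[] = ["low", "medium", "high"];
const CATEGORIES: Category[] = ["invoice-question", "availability", "contract", "other"];

// Coerce raw model output into a valid row, defaulting anything
// off-catalog rather than rejecting the whole message.
function normalizeTriage(messageId: string, raw: Record<string, unknown>): TriageRow {
  const urgency = URGENCIES.includes(raw.urgency as Urgency) ? (raw.urgency as Urgency) : "low";
  const category = CATEGORIES.includes(raw.category as Category) ? (raw.category as Category) : "other";
  return { messageId, urgency, category, requiresResponse: raw.requires_response === true };
}
```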
### 5. UI surfaces
- `/[port]/inbox` — sorted by triage rank, port-wide view.
- Linked-inbox panel on `client-tabs.tsx` — adds a new "Email" tab
pulling from `email_threads` filtered to that client.
- Alert rule `inbox.unanswered_high_value` slots into Phase B's
alert engine; no schema change.
## Schema additions
Three new tables, all port-scoped where it matters:
```ts
// Per-staff Gmail tokens. Mirror of google_calendar_tokens.
gws_user_tokens {
id, userId (UNIQUE), portId, emailAddress,
accessTokenEnc, refreshTokenEnc, tokenExpiry,
scope, watchExpiresAt, watchHistoryId,
connectedAt, lastSyncAt, syncEnabled, createdAt, updatedAt
}
// Triage classifications keyed to messages.
email_triage {
messageId (PK, FK email_messages.id ON DELETE CASCADE),
urgency, category, requiresResponse,
modelVersion, tokensUsed, classifiedAt
}
// Pub/Sub idempotency log. Gmail re-delivers; we dedupe.
gws_push_log {
messageId (Pub/Sub message id, PK),
historyId, receivedAt
}
```
Plus extensions to `email_messages`:
- `googleMessageId` (text, indexed) — Gmail's own ID for thread ops.
- `googleThreadId` (text, indexed).
- `gmailLabels` (text[]) — for "is unread" checks without hitting Gmail.
The existing `emailAccounts.provider='google'` value is reused unchanged;
the IMAP fields become nullable, since OAuth-flow accounts won't populate
them.
## AI cost interaction
Triage AI is opt-in **twice**: the port admin must turn on
`aiEnabled` (Phase 3a flag, default off) **and** `gws.triageEnabled`
(this spec, default off). Either toggle off and the inbox sync still
runs but skips classification, so staff can manually scan threads
without burning tokens.
Per-message token cost on a current Haiku-class model is roughly
1,500–2,500 tokens including the system prompt. A port doing 200 inbound
emails a day at the upper bound is ~500k tokens/day. The default
hard-cap is 500k/month, so triage will trip it inside a day. Two
mitigations baked in:
- The system prompt is short (<500 tokens) and prompt-cached on the
Anthropic side, so most tokens are output.
- Triage runs only on threads not already classified — re-syncs from
the watch loop don't re-bill.
The admin UI shows triage as its own line in the per-feature breakdown
so customers can see how much their inbox is costing them and tune
caps accordingly.
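The budget arithmetic above is worth sanity-checking in code, using the numbers from this section (2,500-token upper bound per message, 500k/month default hard-cap):

```typescript
// Days of triage before the monthly hard-cap trips, at a given volume.
function daysUntilCap(
  emailsPerDay: number,
  tokensPerEmail: number,
  monthlyCapTokens: number,
): number {
  return monthlyCapTokens / (emailsPerDay * tokensPerEmail);
}

daysUntilCap(200, 2500, 500_000); // → 1: the default cap trips within a day
```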
## Phased build (assuming Model B)
| Phase | Scope | Effort | Ships when |
| ---------------------------- | ------------------------------------------------------------------------------------------------------- | ------ | ----------------------------------------------------------- |
| **G1** Connect | OAuth flow + per-port config + per-user token storage. No sync yet. Staff can connect; nothing happens. | 1 week | Standalone |
| **G2** Read-only sync | Pub/Sub push receiver + delta sync into `email_messages`. Linked-inbox tab on client detail. No AI. | 1 week | After G1 |
| **G3** Triage classification | AI classifier, `email_triage` writes, `/inbox` page sorting. Per-port toggle. | 1 week | After G2; depends on Phase 3b budgets being live (they are) |
| **G4** Reply drafts | Gmail API send + draft creation. "Draft reply" button on the client detail Email tab. | 1 week | After G3 |
| **G5** Alerts | New `inbox.unanswered_high_value` rule. Hooks into Phase B alert engine. | 2 days | After G3 |
Total: ~5 weeks for a single engineer, assuming the user provides one
real GWS workspace to test against during G1.
## Open decisions for the user
These are the questions to resolve before scheduling the build, in
priority order:
1. **Deployment model — A, B, or C?** Default recommendation B.
2. **Single user or domain-wide delegation?** Per-staff connect (one
token per user) is simpler. Domain-wide delegation lets the port
admin connect once on behalf of every staff member but requires
the customer to grant a service account broader access. Default
recommendation: per-staff.
3. **Scope set.** Minimal viable scope is `gmail.readonly`. To send
replies (G4) we need `gmail.send`. To manage labels (e.g. mark
"triaged-by-CRM") we need `gmail.modify`. Each scope expansion
widens the consent screen scariness but doesn't add new
verification steps under Model B.
4. **Pub/Sub topic ownership.** Pub/Sub topics live in _some_ GCP
project. Under Model B the customer's project owns the topic —
they pay for Pub/Sub (cents/month) and grant our service account
subscriber access. Alternative: Port Nimara owns the topic and
the customer's Gmail publishes cross-project (allowed, slightly
more setup). Default: customer-owned topic, fewer moving parts.
5. **Triage model.** Haiku 4.5 is right for cost; Sonnet 4.6 is
right if the ranking quality on Haiku turns out to be poor.
Defer this until G3 has real-world tuning data.
## Things that are NOT in this spec
- **Microsoft 365 / Outlook integration.** Same shape, different API.
Once Model B is proven on GWS, Graph API takes another ~3 weeks.
- **Reply drafts grounded in CRM context.** That's G4 and depends on
the work in this spec, but the prompt engineering for "good replies
citing this client's open interests + reservations + invoices"
deserves its own design pass before building.
- **Cross-staff triage queue (i.e. "show me all unanswered emails
across the team").** That requires either domain-wide delegation
(decision #2 above) or per-staff opt-in to a shared view. Punt
until staff actually ask for it.
- **Sentiment / urgency tone analysis.** Tempting; almost always
wrong; skip in v1.
- **"Smart drafts" using the recipient's past replies as context.**
Every customer asks for this and almost no one uses it once
built. Skip.
## Cost summary at a glance
| Item | Model A | Model B | Model C |
| ------------------------------- | ------------------------------- | -------------------------------------- | ------------------------------------ |
| Build effort                    | 3–4 weeks                       | ~5 weeks (over G1–G5)                  | ~1 week (receive-only)               |
| Calendar time to first customer | 4–6 months                      | 1 hour of customer admin work          | 1 hour of customer Gmail-filter work |
| Up-front cash                   | $15k–$75k (CASA)                | $0                                     | $0                                   |
| Recurring                       | Re-verification annually        | None                                   | None                                 |
| Best for                        | 50+ customers, Marketplace play | 1–10 customers, white-glove onboarding | Customers who refuse OAuth setup     |
The recommendation stands: build Model B for G1 + G2 + G3, ship that,
and let real customer demand decide whether G4/G5 and Model A
promotion are worth the calendar time.

View File

@@ -0,0 +1,189 @@
# Mobile Optimization Design
**Status**: Design approved 2026-04-29 — pending plan.
**Plan decomposition**: Foundation PR (§3) is one implementation plan; per-page migration phases (§5) become follow-up plans, scoped per phase.
**Branch base**: stacks on `refactor/data-model`.
**Out of scope**: Phase B/C features, desktop redesign, Capacitor wrapper, swipe-actions on rows, native menus, server-driven UI.
---
## 1. Background
The CRM was built desktop-first. A 2026-04-29 mobile audit captured every authenticated and public page across the active iPhone viewport range. Findings:
1. **No `viewport` meta in the root layout** (one exists only in the scanner PWA sub-layout, `src/app/(scanner)/[portSlug]/scan/layout.tsx`). Without it, iOS Safari renders pages at the default 980px logical width and zooms out to fit — text becomes unreadable and touch targets sub-tappable. Playwright's `isMobile` emulation in the audit forces 393px-wide rendering, which exposes the layout breakage you'd otherwise have to discover by pinching to zoom.
2. **Topbar overflows**. Search input + port switcher + sign-out button cram into one row; sign-out clips off the right edge as a half-visible blue bar on every authenticated page.
3. **Tables render as desktop tables**. Every list page (clients, yachts, companies, invoices, expenses, interests, audit, users, etc.) shows truncated columns with horizontal scroll.
4. **Page headers don't downsize**. Titles like "Dashboard" truncate to "Dash..."; primary action buttons (`+ New Client`) overlap their subtitles.
5. **Detail page action chips overflow**. The chip row ("Invite to portal | GDPR export | Archive | …") horizontally overflows on every detail page.
6. **One half-good pattern**: detail pages already collapse their tabs to a `<select>` dropdown on small screens. Worth extending.
7. **Auth + scanner pages are already mobile-first** (`/login`, `/[portSlug]/scan`). Reference for the "what good looks like" target.
The audit harness (`tests/e2e/audit/mobile.spec.ts` + `mobile-audit` Playwright project) is added on this branch (not yet committed); re-runs regenerate `.audit/mobile/` (gitignored).
## 2. Approach
**Adaptive shell + responsive content** — chosen over (a) per-page conditional render, (b) a separate `(mobile)` route group, and (c) Tailwind-only responsive.
The "native feel" the user wants comes from the chrome — bottom tab bar, sheet modals, sticky compact header, safe-area awareness. Page content (forms, lists, details) doesn't need duplication; it gets responsive via shared mobile-aware primitives. This concentrates the dedicated-mobile work in ~10 components and keeps content single-source.
**Breakpoint**: Tailwind `lg` (1024px). Below `lg`, the mobile shell renders. At and above, the existing desktop shell is untouched.
### 2.1 Target iPhone viewport range
The mobile shell + content primitives must look correct across the full active iPhone viewport range (portrait):
| Tier | Models | Viewport |
| ------------------------------------------ | ----------------------------------------------- | -------- |
| Narrowest | iPhone SE 2nd / 3rd gen | 375×667 |
| Standard | iPhone 12/13/14 (and Mini) | 390×844 |
| Standard newer | iPhone 15 / 15 Pro / 16 | 393×852 |
| Pro newer (Dynamic Island, thinner bezels) | iPhone 16 Pro / 17 Pro | 402×874 |
| Plus / older Max | iPhone 14 Plus / 15 Plus / 15 Pro Max / 16 Plus | 430×932 |
| Pro Max | iPhone 16 Pro Max / 17 Pro Max | 440×956 |
**Anchors used by audit and design validation**: 375×667 (worst-case narrow + short), 393×852 (most common current), 402×874 (current Pro), 440×956 (current Pro Max). Models within ±5px of an anchor (390, 430) are skipped — primitives that look correct at the anchors will look correct at neighbors.
**Dynamic Island**: iPhone 14 Pro and later have a larger top safe-area inset (~59px vs ~47px on classic-notch models). The CSS `env(safe-area-inset-top)` we expose as `pt-safe` handles this transparently — no per-model code paths.
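As an illustrative sketch (not the shipped config), the `pt-safe`-family utilities could be declared like this; the utility names come from this design, while the plugin wiring is an assumption:

```typescript
// Safe-area utilities as plain CSS-in-JS declarations. env() resolves
// per-device at paint time, so the larger Dynamic Island inset needs no
// special-casing in code.
const safeAreaUtilities: Record<string, Record<string, string>> = {
  '.pt-safe': { paddingTop: 'env(safe-area-inset-top)' },
  '.pb-safe': { paddingBottom: 'env(safe-area-inset-bottom)' },
  '.pl-safe': { paddingLeft: 'env(safe-area-inset-left)' },
  '.pr-safe': { paddingRight: 'env(safe-area-inset-right)' },
};

// In tailwind.config.ts this would be registered roughly as:
//   plugin(({ addUtilities }) => addUtilities(safeAreaUtilities))
```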
**Landscape**: out of scope for this design. Phones in landscape are rare for CRM-style work; if needed later, the mobile shell at landscape widths would still fall under `lg` and would just stretch. Tablet landscape is addressed in the §5 tablet-pass phase.
**Routing**: no new route group. URLs and middleware unchanged. RBAC, services, queries, validators, RHF/zod forms, TanStack Query stores, socket.io — all unchanged.
## 3. Foundation PR
A single branch lands the infra + shell + primitives before any per-page work. After this merges, every authenticated page already gains: real viewport meta, no clipped topbar, bottom tab navigation, safe-area handling, and 44px touch targets — without any per-page edits.
### 3.1 Infrastructure
- `viewport` export in `src/app/layout.tsx` → `width=device-width, initial-scale=1, viewport-fit=cover`.
- `theme-color` meta + `apple-mobile-web-app-capable` meta + `apple-mobile-web-app-status-bar-style` for PWA-ish status-bar integration.
- Safe-area CSS variables (`env(safe-area-inset-*)`) exposed as Tailwind utilities (`pt-safe`, `pb-safe`, `pl-safe`, `pr-safe`).
- `useIsMobile()` hook in `src/hooks/use-is-mobile.ts` — backed by `window.matchMedia('(max-width: 1023.98px)')`, no resize listener.
- Server-side body-class detection: the root layout (`src/app/layout.tsx`) reads the `user-agent` request header via `next/headers`'s `headers()`, runs a small known-mobile-token check (Mobile / iPhone / iPad / Android — no library), and renders `<body data-form-factor="mobile|desktop">`. No middleware needed. CSS `[data-form-factor="mobile"]` reveals the mobile shell. The CSS media-query fallback (`@media (max-width: 1023.98px)`) handles UA misclassification (e.g., desktop browser resized to narrow width, or stripped UA).
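A minimal sketch of that known-mobile-token check, assuming the token list named above (the function name is illustrative):

```typescript
// Known-mobile-token check as described above: no UA-parsing library,
// just the tokens this design names. A missing or stripped UA falls back
// to "desktop"; the CSS media-query fallback covers misclassification.
const MOBILE_TOKENS = ['Mobile', 'iPhone', 'iPad', 'Android'] as const;

function formFactorFromUserAgent(ua: string | null): 'mobile' | 'desktop' {
  if (!ua) return 'desktop';
  return MOBILE_TOKENS.some((t) => ua.includes(t)) ? 'mobile' : 'desktop';
}

// In src/app/layout.tsx this would feed the body attribute, roughly:
//   <body data-form-factor={formFactorFromUserAgent(headers().get('user-agent'))}>
```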
### 3.2 Mobile shell
Both desktop and mobile shells are rendered to the DOM by the root layout; CSS reveals one and hides the other based on `[data-form-factor="mobile"]` plus a `@media (max-width: 1023.98px)` fallback. The existing `<Sidebar>` and `<Topbar>` components stay unchanged for the desktop shell. The mobile shell is wholly new:
- **`<MobileLayout>`** (`src/components/layout/mobile/mobile-layout.tsx`)
Fixed 52px compact topbar (safe-area aware) + scrollable content + fixed 56px bottom tab bar (safe-area inset). Renders instead of the desktop sidebar+topbar shell when the form factor resolves to mobile.
- **`<MobileTopbar>`**
Page title (auto-truncating, single-line) + back button when route depth > 1 + single primary action slot (passed via context from the page) + port-switcher behind a `<Sheet>` trigger.
- **`<MobileBottomTabs>`**
Fixed 5 tabs: **Dashboard / Clients / Yachts / Berths / More**. Active state from current path. Lucide icons (no emoji). Badge support for the alerts count.
- **`<MoreSheet>`**
Bottom sheet opened by the More tab. Holds the long tail in a scrollable list grouped by section: Companies, Interests, Invoices, Expenses, Documents, Email, Alerts, Reports, Reminders, Settings, Admin (with admin nesting one level deep into a child sheet).
- **`<MobileLayoutProvider>`**
React context that lets each page push its title, back button, and primary action slot to `<MobileTopbar>` via a hook (`useMobileChrome({ title, action })`).
### 3.3 Primitives
All built once in `src/components/shared/`. Render desktop-style above `lg`, mobile-style below.
- **`<Sheet>`** — vaul-based bottom sheet on mobile, falls through to existing Radix `<Dialog>` on desktop. Same API as `<Dialog>` so adoption is mechanical.
- **`<DataView>`** — accepts the same column defs the codebase uses today via TanStack Table. Above `lg`: renders the existing table. Below `lg`: renders a card list with a per-row `cardRender({ row }) => ReactNode` callback. Filter chips stay above the list; sort moves into a `<Sheet>` opened by a sort button.
- **`<PageHeader>`** — title + optional subtitle + actions. Truncates title to one line, stacks actions to a second row on mobile, hides subtitle below `sm` if action row is present.
- **`<ActionRow>`** — chip-style action group; `flex-nowrap overflow-x-auto scroll-smooth snap-x` on mobile, no overflow on desktop.
- **`<DetailPageShell>`** — wraps detail pages with: sticky compact header (entity name, primary status), tab dropdown selector (existing pattern, extracted), scrollable content area, optional sticky bottom action bar (Save / Archive / etc.) on mobile that pins above the bottom tab bar.
- **`<FilterChips>`** — chip-row filter UI used by `<DataView>`. Active filters are dismissable chips; "Add filter" opens a `<Sheet>`.
### 3.4 Default style adjustments
- `<Button>` and `<Input>` defaults: `min-h-11` (44px, Apple HIG touch-target).
- `<Input>` and `<Textarea>` body text: `text-base` (16px) so iOS doesn't zoom on focus.
- `<Dialog>` default base styling tweaked so any remaining unmigrated dialogs render full-screen on mobile (until they get migrated to `<Sheet>`).
### 3.5 Bundle impact
Both shells render server-side and switch via the `data-form-factor` body attribute, so both ship to every client (dynamic-importing one would cause a hydration flash). Rough estimate ~40KB gzipped added to the layout subtree for the mobile shell + new primitives (vaul ≈ 5KB gz, the rest is in-house components). Verify post-build with `pnpm build` and adjust if it's materially higher. Acceptable trade for no flash and no UA-based render-time branching.
### 3.6 PWA assets
The PWA scanner already references `icon-192.png`, `icon-512.png`, `icon-512-maskable.png` from `public/`, but those files don't exist yet (separate flagged blocker). The mobile shell adds an `apple-touch-icon` reference too. The Foundation PR includes placeholder PNGs so home-screen install works; production-quality icons can replace them without a code change.
## 4. Per-page playbook
Once foundation lands, each page follows the same workflow:
1. Open the page in headed Playwright at the anchor viewports per §2.1 (start at 393×852 for the iteration loop, spot-check 375 and 440 before declaring done).
2. Replace any `<Dialog>` with `<Sheet>`.
3. If list page: wrap the table in `<DataView>` and provide a `cardRender` callback. The 2-3 fields shown on the card are decided per page during migration with the user.
4. Replace the ad-hoc page header with `<PageHeader>`.
5. Replace ad-hoc action button rows with `<ActionRow>`.
6. Touch up any custom embedded widgets the page uses (rare for simple pages, common for `email`, `documents`, `expenses/scan`).
7. User reviews live in the headed browser, points out tweaks, iterate.
Most pages take 5–15 minutes in this loop. Heavy pages (email inbox, documents hub) may take 30–60 because the embedded widgets need their own mobile treatment beyond the primitives.
## 5. Migration sequence
After foundation PR:
1. **Quick-win sweep** (~half day) — pages mostly fixed by foundation alone. Just need `<PageHeader>` swap-in (no list-card conversion, no detail-shell wrap):
`dashboard` (overview), `settings` (user-profile), `reports`, and the admin sub-pages that are forms or stat cards: `admin/settings`, `admin/branding`, `admin/forms`, `admin/ocr`, `admin/roles`, `admin/tags`, `admin/documenso`, `admin/templates`, `admin/custom-fields`, `admin/monitoring`, `admin/backup`, `admin/webhooks`, `admin/import`, `admin/ports`.
2. **List pages** (~1–2 days) — convert via `<DataView>` + per-page `cardRender`:
`clients`, `yachts`, `companies`, `berths`, `interests`, `invoices`, `expenses`, `alerts`, `reminders`, `admin/audit`, `admin/users`.
3. **Heavy pages** (~1 day each) — embedded widgets need their own mobile treatment beyond the primitives:
`documents` (sig-tracking + filters from Phase A), `email` (thread list + reader + composer).
4. **Detail pages** (~1–2 days) — wrap in `<DetailPageShell>`, extend the tab-dropdown pattern, add sticky bottom action shelf:
`clients/[clientId]`, `yachts/[yachtId]`, `companies/[companyId]`, `berths/[berthId]`, `invoices/[id]`, `expenses/[id]`.
5. **Forms & wizards** — touch-up only, since `<Input>`/`<Button>` defaults handle the bulk:
`invoices/new` (3-step wizard), `expenses/scan` (already mobile-first, just verify).
6. **Portal** — same patterns, smaller scope:
authenticated: `portal/dashboard`, `portal/invoices`, `portal/my-yachts`, `portal/documents`, `portal/interests`, `portal/my-reservations`. Public: `portal/login`, `portal/activate`, `portal/forgot-password`, `portal/reset-password` (already styled by `<BrandedAuthShell>` — just verify).
7. **Tablet pass** — re-audit at iPad Air 11" portrait (820×1180) and landscape (1180×820), iPad Air 13" portrait (1024×1366) and landscape (1366×1024). The 820 portrait case will hit the mobile shell (820 < 1024) and probably want a "tablet-portrait" treatment with sidebar visible — flagged for design refinement at that phase, not now. The other three viewports fall above `lg` and use the desktop shell unchanged.
## 6. Testing
- **Mobile audit project** (`mobile-audit` in `playwright.config.ts`) is the regression harness. Re-runs after every page-migration PR; output goes to `.audit/mobile/` (gitignored). Audit covers the four anchor viewports defined in §2.1: 375×667, 393×852, 402×874, 440×956. Run time ~14 min headed.
- **Smoke project** gets a curated mobile-viewport variant (~10 pages at the 393×852 anchor) — adds ~2 min to CI; full audit stays out of CI to avoid the ~14 min cost.
- **Visual baselines** — `visual` project gets new mobile snapshots at the 393×852 anchor for: dashboard, clients-list, clients-detail, invoices-list, invoices-new, scan, documents, login. Regenerate with `--update-snapshots` after intentional changes (existing convention).
- **Anchor device descriptors** lifted into a shared fixture at `tests/e2e/fixtures/devices.ts` (one per anchor in §2.1) so specs don't redefine viewport.
- **No new unit tests** for the primitives — they are presentational. Coverage comes from visual + integration runs.
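The shared fixture could look like this sketch (the viewports are the §2.1 anchors; the descriptor shape mirrors Playwright's built-in devices and is an assumption):

```typescript
// Anchor viewports from §2.1, one descriptor per anchor. In practice this
// would live in tests/e2e/fixtures/devices.ts and be imported by specs
// instead of redefining viewport inline.
const anchorDevices = {
  iphoneSE:       { viewport: { width: 375, height: 667 }, isMobile: true, hasTouch: true },
  iphone15:       { viewport: { width: 393, height: 852 }, isMobile: true, hasTouch: true },
  iphone16Pro:    { viewport: { width: 402, height: 874 }, isMobile: true, hasTouch: true },
  iphone16ProMax: { viewport: { width: 440, height: 956 }, isMobile: true, hasTouch: true },
} as const;
```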
## 7. Open questions
- **Bottom-tab taxonomy**: locked at Dashboard / Clients / Yachts / Berths / More for now. The More sheet holds everything else losslessly, so this is reversible — if real usage suggests a different top-5 (e.g., Interests or Invoices in the tabs), swap them later without code restructure.
- **`refactor/data-model` push order**: 155 commits unpushed. Foundation PR can stack on top and rebase, or wait until that branch merges. Decision deferred to user.
- **Desktop touch-target adjustments**: bumping `<Button>`/`<Input>` to `min-h-11` will affect desktop too. Verify visually that no desktop layout breaks; if any does, scope the bump to mobile-only via the `data-form-factor` attribute.
## 8. Files to create
```
src/hooks/use-is-mobile.ts
src/components/layout/mobile/
mobile-layout.tsx
mobile-topbar.tsx
mobile-bottom-tabs.tsx
more-sheet.tsx
mobile-layout-provider.tsx
src/components/shared/
sheet.tsx (new — vaul wrapper)
data-view.tsx (new — table↔card)
page-header.tsx (new)
action-row.tsx (new)
detail-page-shell.tsx (new)
filter-chips.tsx (new)
src/app/layout.tsx (modified — viewport export, theme-color, UA-derived data-form-factor body attribute via headers())
public/icon-192.png (placeholder PWA asset)
public/icon-512.png (placeholder PWA asset)
public/icon-512-maskable.png (placeholder PWA asset)
public/apple-touch-icon.png (placeholder PWA asset)
tailwind.config.ts (modified — safe-area utilities, touch-target defaults)
tests/e2e/fixtures/devices.ts (new — shared device descriptors)
```
## 9. Files to modify per page
Per the playbook in §4, each page typically needs:
- One swap of header markup → `<PageHeader>`.
- For list pages: one wrap of table → `<DataView>` + add `cardRender` callback.
- For detail pages: wrap in `<DetailPageShell>`.
- Replace `<Dialog>` imports with `<Sheet>`.
- No service, validator, query, or schema changes anywhere.

View File

@@ -52,6 +52,7 @@
"@tanstack/react-query": "^5.62.0",
"@tanstack/react-query-devtools": "^5.62.0",
"@tanstack/react-table": "^8.21.3",
"archiver": "^7.0.1",
"better-auth": "^1.2.0",
"bullmq": "^5.25.0",
"class-variance-authority": "^0.7.0",
@@ -61,7 +62,9 @@
"drizzle-orm": "^0.38.0",
"imapflow": "^1.2.13",
"ioredis": "^5.4.0",
"iso-3166-2": "^1.0.0",
"jose": "^6.2.1",
"libphonenumber-js": "^1.12.42",
"lucide-react": "^0.460.0",
"mailparser": "^3.9.4",
"minio": "^8.0.0",
@@ -83,12 +86,16 @@
"sonner": "^1.7.0",
"tailwind-merge": "^2.6.0",
"tailwindcss-animate": "^1.0.7",
"tesseract.js": "^7.0.0",
"vaul": "^1.1.2",
"zod": "^3.24.0",
"zustand": "^5.0.0"
},
"devDependencies": {
"@eslint/eslintrc": "^3.3.5",
"@playwright/test": "^1.58.2",
"@types/archiver": "^7.0.0",
"@types/iso-3166-2": "^1.0.4",
"@types/mailparser": "^3.4.6",
"@types/node": "^22.0.0",
"@types/nodemailer": "^6.4.0",
@@ -106,6 +113,7 @@
"lint-staged": "^15.2.0",
"postcss": "^8.4.0",
"prettier": "^3.4.0",
"react-grab": "^0.1.32",
"tailwindcss": "^3.4.0",
"tsx": "^4.19.0",
"typescript": "^5.7.0",

View File

@@ -75,6 +75,24 @@ export default defineConfig({
viewport: { width: 1440, height: 900 },
},
},
{
// Mobile / tablet audit — visits every page in headed Chromium at iPhone
// viewports (portrait), screenshots full-page to .audit/mobile/<viewport>/,
// and writes an index.md. Depends on `setup` for seeded admin + port-role.
name: 'mobile-audit',
testMatch: /audit\/mobile\.spec\.ts/,
dependencies: ['setup'],
// Single test walks 4 viewports × ~45 routes sequentially with slowMo;
// a 30 min timeout leaves comfortable headroom over the actual wall-clock cost.
timeout: 1_800_000,
use: {
headless: false,
launchOptions: { slowMo: 200 },
screenshot: 'off',
video: 'off',
trace: 'off',
},
},
],
// Don't start the dev server — we expect it to already be running

781
pnpm-lock.yaml generated

File diff suppressed because it is too large Load Diff

BIN
public/apple-touch-icon.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 654 B

BIN
public/icon-192.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 688 B

BIN
public/icon-512-maskable.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 2.4 KiB

BIN
public/icon-512.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 2.4 KiB

View File

@@ -0,0 +1,188 @@
/**
* Permission-matrix audit.
*
* Walks every src/app/api/v1/** /route.ts file and reports each exported HTTP
* handler (GET/POST/PUT/PATCH/DELETE) that is *not* wrapped in withPermission().
* Internal v1 routes should be permission-gated; routes that intentionally use
* withAuth() alone (e.g. user-self endpoints) can be allow-listed below.
*
* Run:
* pnpm tsx scripts/audit-permissions.ts
*
* Exit code:
* 0 — every handler is permission-gated or in the allow-list
* 1 — at least one handler is missing both a withPermission wrapper and an
* allow-list entry. CI should fail.
*/
import { readdir, readFile } from 'node:fs/promises';
import { join, relative } from 'node:path';
const ROOT = join(process.cwd(), 'src/app/api/v1');
const HTTP_METHODS = ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'] as const;
/**
* Routes intentionally exempt from withPermission. Each entry should explain
* why — typically because the route operates on the caller's own resources
* (no port-level permission semantics) or is admin-only and gated by
* isSuperAdmin inside the handler.
*/
const ALLOW_LIST: ReadonlyArray<{ pattern: RegExp; reason: string }> = [
// Self / admin / public
{ pattern: /\/me\/route\.ts$/, reason: 'Self-endpoint — auth is sufficient.' },
{ pattern: /\/admin\//, reason: 'Admin-only — gated by isSuperAdmin inside handler.' },
{
pattern: /\/notifications\//,
reason: 'User-scoped notifications — caller is the resource owner.',
},
{ pattern: /\/socket\//, reason: 'Socket auth handshake.' },
{ pattern: /\/health\//, reason: 'Public health check.' },
{ pattern: /\/users\/me\//, reason: 'User-self preferences — caller is the resource owner.' },
{ pattern: /\/saved-views\//, reason: 'User-self saved views — caller is the resource owner.' },
{
pattern: /\/settings\/feature-flag\//,
reason: 'Public read of feature-flag bool — no PII; auth is sufficient.',
},
// Cross-cutting / port-scoped reference data
{ pattern: /\/tags\//, reason: 'Tags are cross-cutting reference data; port-scoped via auth.' },
{
pattern: /\/currency\/(convert|rates)\/route\.ts$/,
reason: 'Currency reference data; port-scoped, no PII.',
},
{
pattern: /\/currency\/rates\/refresh\//,
reason: 'TODO: gate with admin:manage_settings — currently allow-listed.',
},
{
pattern: /\/search\//,
reason: 'Port-scoped search — results filtered by auth context (resources have own perms).',
},
// Alerts surface in topbar/dashboard for every signed-in user; per-port not per-resource.
{ pattern: /\/alerts\//, reason: 'Alerts are user-scoped; port-filtered via auth context.' },
// Internally gated by isSuperAdmin
{
pattern: /\/expenses\/export\/parent-company\//,
reason: 'Internally gated by isSuperAdmin inside the handler.',
},
// Pending dedicated permissions
{
pattern: /\/ai\//,
reason: 'TODO: needs ai:* permission catalog entry. Currently allow-listed.',
},
{
pattern: /\/custom-fields\/\[entityId\]\//,
reason: 'TODO: needs custom_fields:* permission. PUT path internally validated.',
},
{
pattern: /\/berth-reservations\/\[id\]\/route\.ts$/,
reason: 'TODO: PATCH should map to reservations:edit (not currently in catalog).',
},
];
interface Finding {
file: string;
method: string;
reason: 'no-withPermission' | 'no-withAuth' | 'allow-listed';
allowReason?: string;
}
async function* walk(dir: string): AsyncGenerator<string> {
for (const entry of await readdir(dir, { withFileTypes: true })) {
const path = join(dir, entry.name);
if (entry.isDirectory()) yield* walk(path);
else if (entry.isFile() && entry.name === 'route.ts') yield path;
}
}
function isAllowListed(file: string): { allowed: boolean; reason?: string } {
for (const { pattern, reason } of ALLOW_LIST) {
if (pattern.test(file)) return { allowed: true, reason };
}
return { allowed: false };
}
async function auditFile(file: string): Promise<Finding[]> {
const src = await readFile(file, 'utf-8');
const findings: Finding[] = [];
for (const method of HTTP_METHODS) {
// Match: export const GET = withAuth(...
const declRe = new RegExp(`export\\s+const\\s+${method}\\s*=\\s*(.+?);`, 's');
const m = declRe.exec(src);
if (!m) continue;
const block = m[1] ?? '';
const hasAuth = /withAuth\s*\(/.test(block);
const hasPerm = /withPermission\s*\(/.test(block);
const allow = isAllowListed(file);
if (!hasAuth) {
findings.push({ file, method, reason: 'no-withAuth' });
continue;
}
if (!hasPerm) {
if (allow.allowed) {
findings.push({ file, method, reason: 'allow-listed', allowReason: allow.reason });
} else {
findings.push({ file, method, reason: 'no-withPermission' });
}
}
}
return findings;
}
async function main() {
const files: string[] = [];
for await (const f of walk(ROOT)) files.push(f);
files.sort();
const all: Finding[] = [];
for (const f of files) all.push(...(await auditFile(f)));
const violations = all.filter(
(f) => f.reason === 'no-withPermission' || f.reason === 'no-withAuth',
);
const allowListed = all.filter((f) => f.reason === 'allow-listed');
// Markdown report
const lines: string[] = [];
lines.push('# Permission Matrix Audit');
lines.push('');
lines.push(`Scanned ${files.length} route files under \`src/app/api/v1/\`.`);
lines.push('');
if (violations.length === 0) {
lines.push('**No violations.** Every internal v1 handler is permission-gated.');
} else {
lines.push(`**${violations.length} violation(s):**`);
lines.push('');
lines.push('| File | Method | Issue |');
lines.push('| --- | --- | --- |');
for (const v of violations) {
const rel = relative(process.cwd(), v.file);
lines.push(`| \`${rel}\` | ${v.method} | ${v.reason} |`);
}
}
lines.push('');
lines.push(
`**Allow-listed:** ${allowListed.length} handler(s) intentionally skip \`withPermission\`.`,
);
if (allowListed.length > 0) {
lines.push('');
lines.push('| File | Method | Reason |');
lines.push('| --- | --- | --- |');
for (const a of allowListed) {
const rel = relative(process.cwd(), a.file);
lines.push(`| \`${rel}\` | ${a.method} | ${a.allowReason} |`);
}
}
process.stdout.write(lines.join('\n') + '\n');
process.exit(violations.length > 0 ? 1 : 0);
}
main().catch((err) => {
console.error(err);
process.exit(2);
});

View File

@@ -0,0 +1,51 @@
#!/usr/bin/env bash
# Hourly MinIO mirror for Port Nimara CRM.
#
# Mirrors the live `MINIO_BUCKET` to the backup destination. `mc mirror`
# is incremental — only changed objects transfer — so this is cheap.
#
# Versioning on the destination bucket is what protects against object
# deletes / overwrites; we don't try to roll our own.
set -euo pipefail
: "${MINIO_ENDPOINT:?MINIO_ENDPOINT not set}"
: "${MINIO_ACCESS_KEY:?MINIO_ACCESS_KEY not set}"
: "${MINIO_SECRET_KEY:?MINIO_SECRET_KEY not set}"
: "${MINIO_BUCKET:?MINIO_BUCKET not set}"
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"
# Default scheme: live MinIO is plain HTTP unless MINIO_USE_SSL=true.
if [[ "${MINIO_USE_SSL:-false}" == "true" ]]; then
  LIVE_URL="https://${MINIO_ENDPOINT}:${MINIO_PORT:-443}"
else
  LIVE_URL="http://${MINIO_ENDPOINT}:${MINIO_PORT:-9000}"
fi
LIVE_ALIAS="live-$$"
BACKUP_ALIAS="bk-$$"
trap 'mc alias remove "$LIVE_ALIAS" 2>/dev/null || true; mc alias remove "$BACKUP_ALIAS" 2>/dev/null || true' EXIT
mc alias set "$LIVE_ALIAS" "$LIVE_URL" \
"$MINIO_ACCESS_KEY" "$MINIO_SECRET_KEY" --api S3v4 >/dev/null
mc alias set "$BACKUP_ALIAS" "$BACKUP_S3_ENDPOINT" \
"$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" --api S3v4 >/dev/null
SOURCE="${LIVE_ALIAS}/${MINIO_BUCKET}/"
DEST="${BACKUP_ALIAS}/${BACKUP_S3_BUCKET}/minio/"
echo "[$(date -u +%FT%TZ)] Mirroring $SOURCE$DEST"
# `--remove` would delete objects from the destination that no longer
# exist in source — we DON'T pass it, because that would let an
# accidental delete on the live bucket cascade into permanent loss on
# the backup side. Versioning + lifecycle handle stale-object cleanup.
mc mirror --quiet --overwrite "$SOURCE" "$DEST"
# Print byte / count diff for the operator.
echo "[$(date -u +%FT%TZ)] Done. Destination summary:"
mc du "$DEST"

View File

@@ -0,0 +1,63 @@
#!/usr/bin/env bash
# Hourly PostgreSQL backup for Port Nimara CRM.
#
# Reads DATABASE_URL and BACKUP_S3_* from the environment. Dumps to a
# tmpfile, gzips, optionally GPG-encrypts to BACKUP_GPG_RECIPIENT, and
# uploads to s3://${BACKUP_S3_BUCKET}/pg/<hostname>/<UTC-date>/<hour>.dump.gz[.gpg].
#
# Designed to fail loud: any non-zero exit halts the script and propagates
# to the cron / CI runner so the operator sees the failure.
set -euo pipefail
: "${DATABASE_URL:?DATABASE_URL not set}"
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"
HOST="${BACKUP_HOST_OVERRIDE:-$(hostname -s)}"
DATE_UTC="$(date -u +%Y-%m-%d)"
HOUR_UTC="$(date -u +%H)"
WORKDIR="$(mktemp -d)"
trap 'rm -rf "$WORKDIR"' EXIT
DUMP_FILE="$WORKDIR/${HOUR_UTC}.dump"
ARCHIVE_NAME="${HOUR_UTC}.dump.gz"
echo "[$(date -u +%FT%TZ)] Dumping $DATABASE_URL$DUMP_FILE"
pg_dump --format=custom --compress=9 --no-owner --no-privileges \
--file="$DUMP_FILE" "$DATABASE_URL"
# pg_dump's `custom` format is already compressed, but we wrap in gzip so
# the file looks the same regardless of the dump format on disk.
gzip -n "$DUMP_FILE"
GZ_FILE="${DUMP_FILE}.gz"
# Optional GPG layer. Only encrypt if the recipient is configured.
if [[ -n "${BACKUP_GPG_RECIPIENT:-}" ]]; then
echo "[$(date -u +%FT%TZ)] Encrypting for $BACKUP_GPG_RECIPIENT"
gpg --batch --yes --trust-model always \
--recipient "$BACKUP_GPG_RECIPIENT" \
--encrypt --output "${GZ_FILE}.gpg" "$GZ_FILE"
rm "$GZ_FILE"
GZ_FILE="${GZ_FILE}.gpg"
ARCHIVE_NAME="${ARCHIVE_NAME}.gpg"
fi
# Configure mc client for the backup destination.
MC_ALIAS="bk-$$"
mc alias set "$MC_ALIAS" "$BACKUP_S3_ENDPOINT" \
"$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" \
--api S3v4 >/dev/null
REMOTE_PATH="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${DATE_UTC}/${ARCHIVE_NAME}"
echo "[$(date -u +%FT%TZ)] Uploading → $REMOTE_PATH"
mc cp --quiet "$GZ_FILE" "$REMOTE_PATH"
# Tag with retention metadata so lifecycle rules can decide what to expire.
mc tag set "$REMOTE_PATH" "kind=hourly&host=${HOST}&date=${DATE_UTC}" >/dev/null
mc alias remove "$MC_ALIAS" >/dev/null
echo "[$(date -u +%FT%TZ)] OK ${ARCHIVE_NAME} ($(du -h "$GZ_FILE" | cut -f1))"

121
scripts/backup/restore.sh Normal file
View File

@@ -0,0 +1,121 @@
#!/usr/bin/env bash
# Cold-restore script for Port Nimara CRM.
#
# Two modes:
# --drill Restore to a sandbox DB ($DRILL_DATABASE_URL) + a tagged
# sandbox path on the live MinIO bucket. Used by the weekly
# cron drill so the runbook stays accurate.
# (no --drill) Interactive production restore. Prompts before each
# destructive step; refuses to run if the live DB has
# non-empty tables (caller is expected to drop first).
#
# Common args:
# --snapshot YYYY-MM-DD/HH Specific dump to restore. Defaults to "latest".
set -euo pipefail
DRILL=0
SNAPSHOT="latest"
while [[ $# -gt 0 ]]; do
case "$1" in
--drill) DRILL=1; shift ;;
--snapshot) SNAPSHOT="$2"; shift 2 ;;
*) echo "unknown arg: $1" >&2; exit 2 ;;
esac
done
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"
if [[ "$DRILL" -eq 1 ]]; then
: "${DRILL_DATABASE_URL:?DRILL_DATABASE_URL not set}"
TARGET_DB="$DRILL_DATABASE_URL"
echo "[drill] target DB = $TARGET_DB"
else
: "${DATABASE_URL:?DATABASE_URL not set}"
TARGET_DB="$DATABASE_URL"
read -rp "About to overwrite $TARGET_DB. Type 'restore' to continue: " confirm
[[ "$confirm" == "restore" ]] || { echo "aborted"; exit 1; }
fi
HOST="${BACKUP_HOST_OVERRIDE:-$(hostname -s)}"
WORKDIR="$(mktemp -d)"
trap 'rm -rf "$WORKDIR"' EXIT
MC_ALIAS="bk-$$"
mc alias set "$MC_ALIAS" "$BACKUP_S3_ENDPOINT" \
"$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" --api S3v4 >/dev/null
# Re-arm the trap now that the alias exists so cleanup covers both.
trap 'rm -rf "$WORKDIR"; mc alias remove "$MC_ALIAS" 2>/dev/null || true' EXIT
# Resolve the snapshot path.
if [[ "$SNAPSHOT" == "latest" ]]; then
REMOTE=$(mc ls --recursive "${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/" \
| awk '{print $NF}' | sort | tail -1)
if [[ -z "$REMOTE" ]]; then
echo "no snapshots found under ${BACKUP_S3_BUCKET}/pg/${HOST}/" >&2
exit 1
fi
REMOTE="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${REMOTE}"
else
REMOTE="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${SNAPSHOT}.dump.gz"
# If GPG was used, the file lives at .dump.gz.gpg. Try both.
if ! mc stat "$REMOTE" >/dev/null 2>&1; then
REMOTE="${REMOTE}.gpg"
fi
fi
echo "[$(date -u +%FT%TZ)] Pulling $REMOTE"
LOCAL="$WORKDIR/$(basename "$REMOTE")"
mc cp --quiet "$REMOTE" "$LOCAL"
# Decrypt if needed.
if [[ "$LOCAL" == *.gpg ]]; then
echo "[$(date -u +%FT%TZ)] Decrypting"
gpg --batch --yes --decrypt --output "${LOCAL%.gpg}" "$LOCAL"
rm "$LOCAL"
LOCAL="${LOCAL%.gpg}"
fi
# Decompress.
gunzip "$LOCAL"
LOCAL="${LOCAL%.gz}"
echo "[$(date -u +%FT%TZ)] Restoring into $TARGET_DB"
# Drop & recreate to guarantee no half-state from a prior run.
# NOTE: this sed-based parse assumes the database name is the final path
# segment of the URL and does not also appear in the credentials or host.
DB_NAME=$(echo "$TARGET_DB" | sed -E 's|.*/([^?]+).*|\1|')
ADMIN_URL=$(echo "$TARGET_DB" | sed -E "s|/${DB_NAME}|/postgres|")
psql "$ADMIN_URL" -v ON_ERROR_STOP=1 <<SQL
SELECT pg_terminate_backend(pid) FROM pg_stat_activity
WHERE datname = '${DB_NAME}' AND pid <> pg_backend_pid();
DROP DATABASE IF EXISTS "${DB_NAME}";
CREATE DATABASE "${DB_NAME}";
SQL
pg_restore --no-owner --no-privileges --dbname "$TARGET_DB" "$LOCAL"
# Drill mode: compare row counts vs the live producer for parity.
if [[ "$DRILL" -eq 1 ]]; then
echo "[$(date -u +%FT%TZ)] Drill row-count diff (live vs restored):"
TABLES=$(psql -At "$TARGET_DB" -c \
"SELECT tablename FROM pg_tables WHERE schemaname='public' ORDER BY tablename;")
diff_count=0
while IFS= read -r tbl; do
[[ -z "$tbl" ]] && continue
live=$(psql -At "${LIVE_DATABASE_URL:-$DATABASE_URL}" -c "SELECT count(*) FROM \"$tbl\";")
restored=$(psql -At "$TARGET_DB" -c "SELECT count(*) FROM \"$tbl\";")
delta=$((live - restored))
if [[ "$delta" -ne 0 ]]; then
echo "$tbl: live=$live restored=$restored delta=$delta"
diff_count=$((diff_count + 1))
fi
done <<< "$TABLES"
if [[ "$diff_count" -eq 0 ]]; then
echo " ✓ row counts match across all tables"
fi
fi
echo "[$(date -u +%FT%TZ)] Restore complete."
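The `latest` branch above works because every path component under `pg/<host>/` is zero-padded UTC (`YYYY-MM-DD/HH`), so plain lexicographic `sort` is also chronological. A minimal sketch with made-up object names:

```shell
# Lexicographic order equals chronological order for zero-padded UTC
# date/hour path segments, so `sort | tail -1` yields the newest snapshot.
# Object names below are illustrative, not real bucket contents.
printf '%s\n' \
  'pg/crm-01/2026-05-02/23/crm.dump.gz' \
  'pg/crm-01/2026-05-03/09/crm.dump.gz' \
  'pg/crm-01/2026-05-03/14/crm.dump.gz' |
  sort | tail -1
# → pg/crm-01/2026-05-03/14/crm.dump.gz
```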


@@ -0,0 +1,40 @@
/**
* Dev helper: set a user's password directly (bypasses email reset).
* Usage: pnpm tsx scripts/dev-set-password.ts <email> <password>
*/
import 'dotenv/config';
import { hashPassword } from 'better-auth/crypto';
import { eq, and } from 'drizzle-orm';
import { db } from '@/lib/db';
import { user, account } from '@/lib/db/schema/users';
async function main() {
const [, , email, password] = process.argv;
if (!email || !password) {
console.error('Usage: pnpm tsx scripts/dev-set-password.ts <email> <password>');
process.exit(1);
}
const u = await db.query.user.findFirst({ where: eq(user.email, email) });
if (!u) {
console.error(`User not found: ${email}`);
process.exit(1);
}
const hash = await hashPassword(password);
const result = await db
.update(account)
.set({ password: hash, updatedAt: new Date() })
.where(and(eq(account.userId, u.id), eq(account.providerId, 'credential')))
.returning({ id: account.id });
if (result.length === 0) {
console.error(`No credential account row for ${email}`);
process.exit(1);
}
console.log(`Updated password for ${email} (account id ${result[0]?.id}).`);
process.exit(0);
}
main().catch((err) => {
console.error(err);
process.exit(1);
});


@@ -20,7 +20,15 @@ async function main() {
const isSuperAdmin = args.includes('--super');
const name = args.find((a, i) => i > 0 && !a.startsWith('--'));
const { inviteId, link } = await createCrmInvite({ email, name, isSuperAdmin });
// Dev script runs out-of-band (no HTTP request, no session). The service's
// super-admin gate requires `invitedBy.isSuperAdmin === true` for super
// invites; the script bypasses that with a synthetic caller identity.
const { inviteId, link } = await createCrmInvite({
email,
name,
isSuperAdmin,
invitedBy: { userId: 'cli-script', isSuperAdmin: true },
});
console.log(`✓ Invite created (id=${inviteId})`);
console.log(` email: ${email}`);
console.log(` super_admin: ${isSuperAdmin}`);


@@ -1,10 +1,9 @@
import { PageHeader } from '@/components/shared/page-header';
export default function BackupManagementPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-bold text-foreground">Backup Management</h1>
<p className="text-muted-foreground">Manage system backups and restoration</p>
</div>
<PageHeader title="Backup Management" description="Manage system backups and restoration" />
<div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
<p className="text-lg font-medium text-muted-foreground">Coming in Layer 4</p>
<p className="text-sm text-muted-foreground">


@@ -2,6 +2,7 @@ import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
import { PageHeader } from '@/components/shared/page-header';
const FIELDS: SettingFieldDef[] = [
{
@@ -47,13 +48,10 @@ const FIELDS: SettingFieldDef[] = [
export default function BrandingSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Branding</h1>
<p className="text-sm text-muted-foreground">
Logo, primary color, app name, and email header/footer HTML used by the branded auth shell
and outgoing email templates.
</p>
</div>
<PageHeader
title="Branding"
description="Logo, primary color, app name, and email header/footer HTML used by the branded auth shell and outgoing email templates."
/>
<SettingsFormCard
title="Identity"
description="App name, logo, and primary color."


@@ -3,6 +3,7 @@ import {
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
import { DocumensoTestButton } from '@/components/admin/documenso/documenso-test-button';
import { PageHeader } from '@/components/shared/page-header';
const API_FIELDS: SettingFieldDef[] = [
{
@@ -48,13 +49,10 @@ const EOI_FIELDS: SettingFieldDef[] = [
export default function DocumensoSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Documenso & EOI</h1>
<p className="text-sm text-muted-foreground">
API credentials and default EOI generation pathway. Use the test-connection button to
verify a saved configuration before relying on it.
</p>
</div>
<PageHeader
title="Documenso & EOI"
description="API credentials and default EOI generation pathway. Use the test-connection button to verify a saved configuration before relying on it."
/>
<SettingsFormCard
title="Documenso API"


@@ -2,6 +2,7 @@ import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
import { PageHeader } from '@/components/shared/page-header';
const FIELDS: SettingFieldDef[] = [
{
@@ -79,13 +80,10 @@ const FIELDS: SettingFieldDef[] = [
export default function EmailSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Email Settings</h1>
<p className="text-sm text-muted-foreground">
Per-port outgoing email configuration. SMTP credentials and the From address default to
environment variables when these fields are blank.
</p>
</div>
<PageHeader
title="Email Settings"
description="Per-port outgoing email configuration. SMTP credentials and the From address default to environment variables when these fields are blank."
/>
<SettingsFormCard
title="From address & signature"
description="Identity headers and shared HTML used by system-generated emails."


@@ -1,10 +1,9 @@
import { PageHeader } from '@/components/shared/page-header';
export default function DataImportPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-bold text-foreground">Data Import</h1>
<p className="text-muted-foreground">Import data from external sources</p>
</div>
<PageHeader title="Data Import" description="Import data from external sources" />
<div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
<p className="text-lg font-medium text-muted-foreground">Coming in Layer 4</p>
<p className="text-sm text-muted-foreground">


@@ -1,15 +1,13 @@
import { InvitationsManager } from '@/components/admin/invitations/invitations-manager';
import { PageHeader } from '@/components/shared/page-header';
export default function InvitationsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Invitations</h1>
<p className="text-sm text-muted-foreground">
Send a single-use invitation to a new CRM user. The recipient sets their own password via
the link in the email.
</p>
</div>
<PageHeader
title="Invitations"
description="Send a single-use invitation to a new CRM user. The recipient sets their own password via the link in the email."
/>
<InvitationsManager />
</div>
);


@@ -0,0 +1,36 @@
import { redirect } from 'next/navigation';
import { headers } from 'next/headers';
import { eq } from 'drizzle-orm';
import { auth } from '@/lib/auth';
import { db } from '@/lib/db';
import { userProfiles } from '@/lib/db/schema/users';
/**
* Guard: only super-admins (isSuperAdmin === true in user_profiles) may access
* any page under /[portSlug]/admin. Everyone else is redirected to their dashboard.
*/
export default async function AdminLayout({
children,
params,
}: {
children: React.ReactNode;
params: Promise<{ portSlug: string }>;
}) {
const { portSlug } = await params;
const session = await auth.api.getSession({ headers: await headers() });
if (!session?.user) {
redirect('/login');
}
const profile = await db.query.userProfiles.findFirst({
where: eq(userProfiles.userId, session.user.id),
});
if (!profile?.isSuperAdmin) {
redirect(`/${portSlug}/dashboard`);
}
return <>{children}</>;
}


@@ -0,0 +1,5 @@
import { OcrSettingsForm } from '@/components/admin/ocr-settings-form';
export default function OcrSettingsPage() {
return <OcrSettingsForm />;
}


@@ -1,10 +1,9 @@
import { PageHeader } from '@/components/shared/page-header';
export default function OnboardingPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-bold text-foreground">Onboarding</h1>
<p className="text-muted-foreground">Guided setup for new port configurations</p>
</div>
<PageHeader title="Onboarding" description="Guided setup for new port configurations" />
<div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
<p className="text-lg font-medium text-muted-foreground">Coming in Layer 4</p>
<p className="text-sm text-muted-foreground">


@@ -20,6 +20,7 @@ import {
} from 'lucide-react';
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
import { PageHeader } from '@/components/shared/page-header';
interface AdminSection {
href: string;
@@ -149,6 +150,12 @@ const SECTIONS: AdminSection[] = [
description: 'Initial-setup wizard for fresh ports.',
icon: LayoutDashboard,
},
{
href: 'ocr',
label: 'Receipt OCR',
description: 'Configure the AI provider used by the mobile receipt scanner.',
icon: ScrollText,
},
];
export default async function AdminLandingPage({
@@ -159,13 +166,10 @@ export default async function AdminLandingPage({
const { portSlug } = await params;
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Administration</h1>
<p className="text-sm text-muted-foreground">
Per-port configuration and system administration. Each card below opens a dedicated
settings page.
</p>
</div>
<PageHeader
title="Administration"
description="Per-port configuration and system administration. Each card below opens a dedicated settings page."
/>
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{SECTIONS.map((s) => {
const Icon = s.icon;


@@ -2,6 +2,7 @@ import {
SettingsFormCard,
type SettingFieldDef,
} from '@/components/admin/shared/settings-form-card';
import { PageHeader } from '@/components/shared/page-header';
const DEFAULT_FIELDS: SettingFieldDef[] = [
{
@@ -53,14 +54,10 @@ const DIGEST_FIELDS: SettingFieldDef[] = [
export default function ReminderSettingsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold">Reminders</h1>
<p className="text-sm text-muted-foreground">
Default reminder behaviour for new interests and the optional daily-digest delivery
window. Individual users can still configure their own digest preferences in Notifications
Preferences.
</p>
</div>
<PageHeader
title="Reminders"
description="Default reminder behaviour for new interests and the optional daily-digest delivery window. Individual users can still configure their own digest preferences in Notifications → Preferences."
/>
<SettingsFormCard
title="Defaults for new interests"


@@ -1,10 +1,12 @@
import { PageHeader } from '@/components/shared/page-header';
export default function ScheduledReportsPage() {
return (
<div className="space-y-6">
<div>
<h1 className="text-2xl font-bold text-foreground">Scheduled Reports</h1>
<p className="text-muted-foreground">Configure and manage automated report delivery</p>
</div>
<PageHeader
title="Scheduled Reports"
description="Configure and manage automated report delivery"
/>
<div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
<p className="text-lg font-medium text-muted-foreground">Coming in Layer 3</p>
<p className="text-sm text-muted-foreground">


@@ -2,6 +2,7 @@
import { useCallback, useEffect, useState } from 'react';
import { Button } from '@/components/ui/button';
import { PageHeader } from '@/components/shared/page-header';
import { Badge } from '@/components/ui/badge';
import {
AlertDialog,
@@ -36,7 +37,11 @@ export default function WebhooksPage() {
const [deleteTarget, setDeleteTarget] = useState<Webhook | null>(null);
const [expandedId, setExpandedId] = useState<string | null>(null);
const [regenerating, setRegenerating] = useState<string | null>(null);
const [newSecret, setNewSecret] = useState<{ webhookId: string; secret: string; masked: string } | null>(null);
const [newSecret, setNewSecret] = useState<{
webhookId: string;
secret: string;
masked: string;
} | null>(null);
const loadWebhooks = useCallback(async () => {
try {
@@ -98,15 +103,20 @@ export default function WebhooksPage() {
return (
<div className="space-y-6">
<div className="flex items-center justify-between">
<div>
<h1 className="text-2xl font-bold text-foreground">Webhooks</h1>
<p className="text-muted-foreground">Configure outgoing webhook integrations</p>
</div>
<Button onClick={() => { setEditTarget(null); setFormOpen(true); }}>
Add Webhook
</Button>
</div>
<PageHeader
title="Webhooks"
description="Configure outgoing webhook integrations"
actions={
<Button
onClick={() => {
setEditTarget(null);
setFormOpen(true);
}}
>
Add Webhook
</Button>
}
/>
{loading ? (
<p className="text-sm text-muted-foreground">Loading...</p>
@@ -116,7 +126,13 @@ export default function WebhooksPage() {
<p className="text-sm text-muted-foreground mt-1">
Add a webhook to receive real-time notifications of CRM events.
</p>
<Button className="mt-4" onClick={() => { setEditTarget(null); setFormOpen(true); }}>
<Button
className="mt-4"
onClick={() => {
setEditTarget(null);
setFormOpen(true);
}}
>
Add Webhook
</Button>
</div>
@@ -141,17 +157,16 @@ export default function WebhooksPage() {
</div>
<div className="flex items-center gap-2 shrink-0">
<Button
variant="ghost"
size="sm"
onClick={() => handleToggleActive(webhook)}
>
<Button variant="ghost" size="sm" onClick={() => handleToggleActive(webhook)}>
{webhook.isActive ? 'Disable' : 'Enable'}
</Button>
<Button
variant="ghost"
size="sm"
onClick={() => { setEditTarget(webhook); setFormOpen(true); }}
onClick={() => {
setEditTarget(webhook);
setFormOpen(true);
}}
>
Edit
</Button>
@@ -163,11 +178,7 @@ export default function WebhooksPage() {
>
Delete
</Button>
<Button
variant="ghost"
size="sm"
onClick={() => toggleExpand(webhook.id)}
>
<Button variant="ghost" size="sm" onClick={() => toggleExpand(webhook.id)}>
{expandedId === webhook.id ? 'Collapse' : 'Details'}
</Button>
</div>
@@ -228,18 +239,26 @@ export default function WebhooksPage() {
onSuccess={loadWebhooks}
/>
<AlertDialog open={!!deleteTarget} onOpenChange={(open) => { if (!open) setDeleteTarget(null); }}>
<AlertDialog
open={!!deleteTarget}
onOpenChange={(open) => {
if (!open) setDeleteTarget(null);
}}
>
<AlertDialogContent>
<AlertDialogHeader>
<AlertDialogTitle>Delete Webhook</AlertDialogTitle>
<AlertDialogDescription>
Delete &quot;{deleteTarget?.name}&quot;? This will also delete all delivery history. This action
cannot be undone.
Delete &quot;{deleteTarget?.name}&quot;? This will also delete all delivery history.
This action cannot be undone.
</AlertDialogDescription>
</AlertDialogHeader>
<AlertDialogFooter>
<AlertDialogCancel>Cancel</AlertDialogCancel>
<AlertDialogAction onClick={handleDelete} className="bg-destructive text-destructive-foreground">
<AlertDialogAction
onClick={handleDelete}
className="bg-destructive text-destructive-foreground"
>
Delete
</AlertDialogAction>
</AlertDialogFooter>


@@ -0,0 +1,5 @@
import { AlertsPageShell } from '@/components/alerts/alerts-page-shell';
export default function AlertsPage() {
return <AlertsPageShell />;
}


@@ -0,0 +1,5 @@
import { BerthReservationsList } from '@/components/reservations/berth-reservations-list';
export default function BerthReservationsPage() {
return <BerthReservationsList />;
}


@@ -0,0 +1,41 @@
import { Skeleton } from '@/components/ui/skeleton';
import { CardSkeleton } from '@/components/shared/loading-skeleton';
/**
* Route-level loading UI for the client detail page. Renders while the
* server component resolves the session and the client component bootstraps
* its initial query — replaces the previous empty-header flash on direct
* URL visits.
*/
export default function Loading() {
return (
<div className="space-y-6">
{/* Header strip — title, badges, action buttons */}
<div className="rounded-xl border border-border bg-card px-5 py-4 shadow-sm space-y-3">
<div className="flex items-center gap-3">
<Skeleton className="h-7 w-56" />
<Skeleton className="h-5 w-16 rounded-full" />
</div>
<div className="flex flex-wrap gap-2">
<Skeleton className="h-9 w-20 rounded-md" />
<Skeleton className="h-9 w-20 rounded-md" />
<Skeleton className="h-9 w-24 rounded-md" />
<Skeleton className="h-9 w-32 rounded-md" />
</div>
</div>
{/* Tab strip */}
<div className="flex gap-2 border-b border-border pb-1">
{Array.from({ length: 8 }).map((_, i) => (
<Skeleton key={i} className="h-8 w-20 rounded-md" />
))}
</div>
{/* Two-column overview */}
<div className="grid grid-cols-1 gap-6 md:grid-cols-2">
<CardSkeleton />
<CardSkeleton />
</div>
</div>
);
}


@@ -20,6 +20,7 @@ import { TableSkeleton } from '@/components/shared/loading-skeleton';
import { ArchiveConfirmDialog } from '@/components/shared/archive-confirm-dialog';
import { PermissionGate } from '@/components/shared/permission-gate';
import { ExpenseFormDialog } from '@/components/expenses/expense-form-dialog';
import { ExpenseCard } from '@/components/expenses/expense-card';
import { expenseFilterDefinitions } from '@/components/expenses/expense-filters';
import { getExpenseColumns, type ExpenseRow } from '@/components/expenses/expense-columns';
import { usePaginatedQuery } from '@/hooks/use-paginated-query';
@@ -60,8 +61,7 @@ export default function ExpensesPage() {
});
const archiveMutation = useMutation({
mutationFn: (id: string) =>
apiFetch(`/api/v1/expenses/${id}`, { method: 'DELETE' }),
mutationFn: (id: string) => apiFetch(`/api/v1/expenses/${id}`, { method: 'DELETE' }),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ['expenses'] });
setArchiveExpense(null);
@@ -151,6 +151,14 @@ export default function ExpensesPage() {
onSortChange={setSort}
isLoading={isFetching && !isLoading}
getRowId={(row) => row.id}
cardRender={(row) => (
<ExpenseCard
expense={row.original}
portSlug={portSlug}
onEdit={setEditExpense}
onArchive={setArchiveExpense}
/>
)}
emptyState={
<EmptyState
title="No expenses found"


@@ -1,9 +1,11 @@
'use client';
import { useState, useRef } from 'react';
import { useEffect, useRef, useState } from 'react';
import { useParams, useRouter } from 'next/navigation';
import { useMutation } from '@tanstack/react-query';
import { Upload, Loader2, ScanLine } from 'lucide-react';
import { Camera, Loader2, ScanLine, Upload } from 'lucide-react';
import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider';
import { Button } from '@/components/ui/button';
import { Input } from '@/components/ui/input';
@@ -33,9 +35,16 @@ export default function ScanReceiptPage() {
const router = useRouter();
const fileInputRef = useRef<HTMLInputElement>(null);
const cameraInputRef = useRef<HTMLInputElement>(null);
const [scanResult, setScanResult] = useState<ScanResult | null>(null);
const [previewUrl, setPreviewUrl] = useState<string | null>(null);
const { setChrome } = useMobileChrome();
useEffect(() => {
setChrome({ title: 'Scan Receipt', showBackButton: true });
return () => setChrome({ title: null, showBackButton: false });
}, [setChrome]);
// Editable fields from scan
const [establishment, setEstablishment] = useState('');
const [amount, setAmount] = useState('');
@@ -94,7 +103,7 @@ export default function ScanReceiptPage() {
return (
<div className="max-w-2xl mx-auto space-y-6">
<div>
<div className="hidden sm:block">
<h1 className="text-2xl font-bold">Scan Receipt</h1>
<p className="text-muted-foreground mt-1">
Upload a receipt image and we will extract the expense details automatically.
@@ -109,28 +118,44 @@ export default function ScanReceiptPage() {
</CardTitle>
</CardHeader>
<CardContent>
<div
className="border-2 border-dashed rounded-lg p-8 text-center cursor-pointer hover:bg-muted/50 transition-colors"
onClick={() => fileInputRef.current?.click()}
>
{previewUrl ? (
{previewUrl ? (
<div
className="border-2 border-dashed rounded-lg p-4 text-center cursor-pointer hover:bg-muted/50 transition-colors"
onClick={() => fileInputRef.current?.click()}
>
<img
src={previewUrl}
alt="Receipt preview"
className="max-h-64 mx-auto rounded object-contain"
/>
) : (
<div className="space-y-2">
<Upload className="h-8 w-8 mx-auto text-muted-foreground" />
<p className="text-sm text-muted-foreground">
Click to upload or drag and drop
</p>
<p className="text-xs text-muted-foreground">
JPEG, PNG, WebP up to 10MB
</p>
</div>
)}
</div>
</div>
) : (
<div className="grid gap-2 sm:grid-cols-2">
<Button
type="button"
size="lg"
className="w-full h-14 sm:hidden"
onClick={() => cameraInputRef.current?.click()}
>
<Camera className="mr-2 h-5 w-5" />
Take photo
</Button>
<Button
type="button"
variant="outline"
size="lg"
className="w-full h-14"
onClick={() => fileInputRef.current?.click()}
>
<Upload className="mr-2 h-5 w-5" />
<span className="sm:hidden">Choose from library</span>
<span className="hidden sm:inline">Click to upload or drag and drop</span>
</Button>
<p className="text-xs text-muted-foreground sm:col-span-2 text-center">
JPEG, PNG, WebP up to 10MB
</p>
</div>
)}
<input
ref={fileInputRef}
type="file"
@@ -138,6 +163,14 @@ export default function ScanReceiptPage() {
className="hidden"
onChange={handleFileChange}
/>
<input
ref={cameraInputRef}
type="file"
accept="image/*"
capture="environment"
className="hidden"
onChange={handleFileChange}
/>
{scanMutation.isPending && (
<div className="flex items-center justify-center gap-2 mt-4 text-muted-foreground">
@@ -222,25 +255,18 @@ export default function ScanReceiptPage() {
</div>
{saveMutation.isError && (
<p className="text-sm text-destructive">
{(saveMutation.error as Error).message}
</p>
<p className="text-sm text-destructive">{(saveMutation.error as Error).message}</p>
)}
<div className="flex gap-2 pt-2">
<Button
variant="outline"
onClick={() => router.push(`/${params.portSlug}/expenses`)}
>
<Button variant="outline" onClick={() => router.push(`/${params.portSlug}/expenses`)}>
Cancel
</Button>
<Button
onClick={() => saveMutation.mutate()}
disabled={saveMutation.isPending || !amount}
>
{saveMutation.isPending && (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
)}
{saveMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Save as Expense
</Button>
</div>


@@ -1,11 +1,13 @@
'use client';
import { useState } from 'react';
import { useParams, useRouter } from 'next/navigation';
import { useEffect, useState } from 'react';
import { useParams, useRouter, useSearchParams } from 'next/navigation';
import { useForm, FormProvider } from 'react-hook-form';
import { zodResolver } from '@hookform/resolvers/zod';
import { useMutation } from '@tanstack/react-query';
import { ChevronLeft, ChevronRight, Check, Loader2 } from 'lucide-react';
import { useMutation, useQuery } from '@tanstack/react-query';
import { ChevronLeft, ChevronRight, Check, Loader2, Wallet } from 'lucide-react';
import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider';
import { Button } from '@/components/ui/button';
import { Input } from '@/components/ui/input';
@@ -43,9 +45,35 @@ export default function NewInvoicePage() {
const params = useParams<{ portSlug: string }>();
const portSlug = params?.portSlug ?? '';
const router = useRouter();
const searchParams = useSearchParams();
const prefilledInterestId = searchParams.get('interestId') ?? undefined;
const prefilledKind =
searchParams.get('kind') === 'deposit' ? ('deposit' as const) : ('general' as const);
const [step, setStep] = useState(1);
const { setChrome } = useMobileChrome();
useEffect(() => {
setChrome({ title: 'New Invoice', showBackButton: true });
return () => setChrome({ title: null, showBackButton: false });
}, [setChrome]);
// When the form is launched from an interest detail with `?interestId=…&kind=deposit`,
// fetch enough of the interest to display "Deposit for {client} — Berth {n}" in
// the review step. Doubles as the source of truth for the billing entity prefill.
const { data: prefilledInterest } = useQuery<{
data: {
id: string;
clientId: string;
clientName: string | null;
berthMooringNumber: string | null;
};
}>({
queryKey: ['interest-prefill', prefilledInterestId],
queryFn: () => apiFetch(`/api/v1/interests/${prefilledInterestId}`),
enabled: !!prefilledInterestId,
});
const methods = useForm<CreateInvoiceInput>({
resolver: zodResolver(createInvoiceSchema),
defaultValues: {
@@ -53,6 +81,8 @@ export default function NewInvoicePage() {
currency: 'USD',
lineItems: [],
expenseIds: [],
interestId: prefilledInterestId,
kind: prefilledKind,
},
});
@@ -65,6 +95,43 @@ export default function NewInvoicePage() {
} = methods;
const watchedValues = watch();
const isDepositInvoice = watchedValues.kind === 'deposit';
// Resolve the selected billing entity to a human name so the review step
// shows "Acme Yacht Charters" instead of "company 4f2a1b…".
const billingEntityRef = watchedValues.billingEntity ?? null;
const { data: billingEntityName } = useQuery<{ name: string }>({
queryKey: ['billing-entity-name', billingEntityRef?.type, billingEntityRef?.id],
queryFn: async () => {
if (!billingEntityRef) return { name: '' };
const path =
billingEntityRef.type === 'company'
? `/api/v1/companies/${billingEntityRef.id}`
: `/api/v1/clients/${billingEntityRef.id}`;
const res = await apiFetch<{
data: { fullName?: string; name?: string };
}>(path);
return {
name: res?.data?.fullName ?? res?.data?.name ?? '',
};
},
enabled: !!billingEntityRef?.id,
staleTime: 60_000,
});
// Pre-fill the billing entity from the linked interest's client on launch.
useEffect(() => {
if (prefilledInterest?.data && !watchedValues.billingEntity) {
setValue(
'billingEntity',
{ type: 'client', id: prefilledInterest.data.clientId },
{ shouldValidate: true },
);
}
// We only want this to run when the interest data first arrives.
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [prefilledInterest?.data?.clientId]);
const lineItems = watchedValues.lineItems ?? [];
const subtotal = lineItems.reduce(
(sum, li) => sum + (Number(li.quantity) || 0) * (Number(li.unitPrice) || 0),
@@ -117,8 +184,8 @@ export default function NewInvoicePage() {
return (
<div className="max-w-2xl mx-auto space-y-6">
{/* Header */}
<div className="flex items-center gap-3">
{/* Header — desktop only; mobile gets the title from the topbar */}
<div className="hidden sm:flex items-center gap-3">
<Button variant="ghost" size="sm" onClick={() => router.push(`/${portSlug}/invoices`)}>
<ChevronLeft className="h-4 w-4" />
</Button>
@@ -157,6 +224,23 @@ export default function NewInvoicePage() {
<CardTitle className="text-base">Client Information</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
{isDepositInvoice ? (
<div className="flex items-start gap-3 rounded-md border border-amber-200 bg-amber-50 px-3 py-2 text-sm text-amber-900">
<Wallet className="mt-0.5 h-4 w-4 shrink-0" />
<div className="min-w-0">
<p className="font-medium">Deposit invoice</p>
<p className="text-xs text-amber-800">
{prefilledInterest?.data
? `Linked to ${prefilledInterest.data.clientName ?? 'interest'}${
prefilledInterest.data.berthMooringNumber
? ` — Berth ${prefilledInterest.data.berthMooringNumber}`
: ''
}. Marking this invoice as paid will advance the interest to "Deposit 10%".`
: 'Marking this invoice as paid will advance the linked interest to "Deposit 10%".'}
</p>
</div>
</div>
) : null}
<div className="space-y-2">
<Label>
Billing entity <span className="text-destructive">*</span>
@@ -294,9 +378,13 @@ export default function NewInvoicePage() {
<p className="font-medium mt-0.5">
{watchedValues.billingEntity ? (
<>
<span className="capitalize">{watchedValues.billingEntity.type}</span>{' '}
<span className="text-xs opacity-60">
{watchedValues.billingEntity.id.slice(0, 12)}
{billingEntityName?.name ? (
<span>{billingEntityName.name}</span>
) : (
<span className="text-muted-foreground">Loading</span>
)}{' '}
<span className="text-xs text-muted-foreground capitalize">
({watchedValues.billingEntity.type})
</span>
</>
) : (


@@ -12,6 +12,7 @@ import { PageHeader } from '@/components/shared/page-header';
import { EmptyState } from '@/components/shared/empty-state';
import { TableSkeleton } from '@/components/shared/loading-skeleton';
import { PermissionGate } from '@/components/shared/permission-gate';
import { InvoiceCard } from '@/components/invoices/invoice-card';
import { invoiceFilterDefinitions } from '@/components/invoices/invoice-filters';
import { getInvoiceColumns, type InvoiceRow } from '@/components/invoices/invoice-columns';
import { usePaginatedQuery } from '@/hooks/use-paginated-query';
@@ -63,8 +64,7 @@ export default function InvoicesPage() {
});
const deleteMutation = useMutation({
mutationFn: (id: string) =>
apiFetch(`/api/v1/invoices/${id}`, { method: 'DELETE' }),
mutationFn: (id: string) => apiFetch(`/api/v1/invoices/${id}`, { method: 'DELETE' }),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ['invoices'] });
setDeleteTarget(null);
@@ -72,8 +72,7 @@ export default function InvoicesPage() {
});
const sendMutation = useMutation({
mutationFn: (id: string) =>
apiFetch(`/api/v1/invoices/${id}/send`, { method: 'POST' }),
mutationFn: (id: string) => apiFetch(`/api/v1/invoices/${id}/send`, { method: 'POST' }),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ['invoices'] });
},
@@ -82,8 +81,7 @@ export default function InvoicesPage() {
const columns = getInvoiceColumns({
portSlug,
onSend: (invoice) => sendMutation.mutate(invoice.id),
onRecordPayment: (invoice) =>
router.push(`/${portSlug}/invoices/${invoice.id}?tab=payment`),
onRecordPayment: (invoice) => router.push(`/${portSlug}/invoices/${invoice.id}?tab=payment`),
onDelete: (invoice) => setDeleteTarget(invoice),
});
@@ -141,6 +139,17 @@ export default function InvoicesPage() {
onSortChange={setSort}
isLoading={isFetching && !isLoading}
getRowId={(row) => row.id}
cardRender={(row) => (
<InvoiceCard
invoice={row.original}
portSlug={portSlug}
onSend={(invoice) => sendMutation.mutate(invoice.id)}
onRecordPayment={(invoice) =>
router.push(`/${portSlug}/invoices/${invoice.id}?tab=payment`)
}
onDelete={setDeleteTarget}
/>
)}
emptyState={
<EmptyState
title="No invoices found"
@@ -161,15 +170,11 @@ export default function InvoicesPage() {
<h3 className="font-semibold">Delete Invoice?</h3>
<p className="text-sm text-muted-foreground">
This will permanently delete invoice{' '}
<span className="font-mono font-medium">{deleteTarget.invoiceNumber}</span>.
This action cannot be undone.
<span className="font-mono font-medium">{deleteTarget.invoiceNumber}</span>. This
action cannot be undone.
</p>
<div className="flex items-center gap-2 justify-end">
<Button
variant="outline"
size="sm"
onClick={() => setDeleteTarget(null)}
>
<Button variant="outline" size="sm" onClick={() => setDeleteTarget(null)}>
Cancel
</Button>
<Button


@@ -12,6 +12,8 @@ import { PortProvider } from '@/providers/port-provider';
import { PermissionsProvider } from '@/providers/permissions-provider';
import { Sidebar } from '@/components/layout/sidebar';
import { Topbar } from '@/components/layout/topbar';
import { MobileLayout } from '@/components/layout/mobile/mobile-layout';
import { RealtimeToasts } from '@/components/shared/realtime-toasts';
export default async function DashboardLayout({ children }: { children: React.ReactNode }) {
const session = await auth.api.getSession({ headers: await headers() });
@@ -37,7 +39,9 @@ export default async function DashboardLayout({ children }: { children: React.Re
<PortProvider ports={ports} defaultPortId={ports[0]?.id ?? null}>
<PermissionsProvider>
<SocketProvider>
<div className="flex h-screen overflow-hidden bg-background">
<RealtimeToasts />
{/* Desktop shell — hidden by CSS on mobile */}
<div data-shell="desktop" className="flex h-screen overflow-hidden bg-background">
<Sidebar
portRoles={portRoles}
isSuperAdmin={profile?.isSuperAdmin ?? false}
@@ -57,6 +61,9 @@ export default async function DashboardLayout({ children }: { children: React.Re
<main className="flex-1 overflow-y-auto bg-background p-6">{children}</main>
</div>
</div>
{/* Mobile shell — hidden by CSS on desktop */}
<MobileLayout>{children}</MobileLayout>
</SocketProvider>
</PermissionsProvider>
</PortProvider>


@@ -5,28 +5,19 @@ import type { Metadata } from 'next';
import { getPortalSession } from '@/lib/portal/auth';
import { getClientInterests } from '@/lib/services/portal.service';
import { Badge } from '@/components/ui/badge';
import { stageLabel, safeStage, type PipelineStage } from '@/lib/constants';
export const metadata: Metadata = { title: 'Interests' };
const STAGE_LABELS: Record<string, string> = {
open: 'Open',
details_sent: 'Details Sent',
in_communication: 'In Communication',
visited: 'Visited',
signed_eoi_nda: 'EOI / NDA Signed',
deposit_10pct: 'Deposit Received',
contract: 'Contract Stage',
completed: 'Completed',
};
const STAGE_COLORS: Record<string, 'default' | 'secondary' | 'destructive' | 'outline'> = {
const STAGE_VARIANT: Record<PipelineStage, 'default' | 'secondary' | 'destructive' | 'outline'> = {
open: 'secondary',
details_sent: 'secondary',
in_communication: 'default',
visited: 'default',
signed_eoi_nda: 'default',
eoi_sent: 'default',
eoi_signed: 'default',
deposit_10pct: 'default',
contract: 'default',
contract_sent: 'default',
contract_signed: 'default',
completed: 'outline',
};
@@ -40,9 +31,7 @@ export default async function PortalInterestsPage() {
<div className="space-y-6">
<div>
<h1 className="text-2xl font-semibold text-gray-900">Berth Interests</h1>
<p className="text-sm text-gray-500 mt-1">
Your berth enquiries and applications
</p>
<p className="text-sm text-gray-500 mt-1">Your berth enquiries and applications</p>
</div>
{interests.length === 0 ? (
@@ -56,10 +45,7 @@ export default async function PortalInterestsPage() {
) : (
<div className="space-y-3">
{interests.map((interest) => (
<div
key={interest.id}
className="bg-white rounded-lg border p-5"
>
<div key={interest.id} className="bg-white rounded-lg border p-5">
<div className="flex items-start justify-between gap-4">
<div className="flex-1 min-w-0">
<div className="flex items-center gap-2 mb-1">
@@ -98,8 +84,8 @@ export default async function PortalInterestsPage() {
)}
</div>
</div>
<Badge variant={STAGE_COLORS[interest.pipelineStage] ?? 'default'}>
{STAGE_LABELS[interest.pipelineStage] ?? interest.pipelineStage}
<Badge variant={STAGE_VARIANT[safeStage(interest.pipelineStage)]}>
{stageLabel(interest.pipelineStage)}
</Badge>
</div>
</div>


@@ -0,0 +1,50 @@
import { redirect } from 'next/navigation';
import { headers } from 'next/headers';
import { auth } from '@/lib/auth';
import { db } from '@/lib/db';
import { ports as portsTable } from '@/lib/db/schema/ports';
import { QueryProvider } from '@/providers/query-provider';
import { PortProvider } from '@/providers/port-provider';
import { eq } from 'drizzle-orm';
/**
* Minimal layout for the mobile receipt-scanner PWA. No sidebar, no
* topbar — the scanner is its own contained surface. Adds the PWA
* manifest link + theme color so iOS/Android pick up "Add to Home
* Screen". Auth check matches the dashboard layout so unauthorized
* users still bounce to /login.
*/
export default async function ScannerLayout({
children,
params,
}: {
children: React.ReactNode;
params: Promise<{ portSlug: string }>;
}) {
const session = await auth.api.getSession({ headers: await headers() });
if (!session?.user) redirect('/login');
const { portSlug } = await params;
const port = await db.query.ports.findFirst({
where: eq(portsTable.slug, portSlug),
});
if (!port) redirect('/login');
return (
<QueryProvider>
<PortProvider ports={port ? [port] : []} defaultPortId={port?.id ?? null}>
<head>
<link rel="manifest" href={`/${portSlug}/scan/manifest.webmanifest`} />
<meta name="theme-color" content="#3a7bc8" />
<meta name="mobile-web-app-capable" content="yes" />
<meta name="apple-mobile-web-app-capable" content="yes" />
<meta name="apple-mobile-web-app-status-bar-style" content="default" />
<meta name="apple-mobile-web-app-title" content="PN Scanner" />
<meta name="viewport" content="width=device-width, initial-scale=1, viewport-fit=cover" />
</head>
<div className="min-h-[100dvh] bg-background">{children}</div>
</PortProvider>
</QueryProvider>
);
}


@@ -0,0 +1,45 @@
import { NextResponse } from 'next/server';
import { eq } from 'drizzle-orm';
import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';
/**
* Per-port PWA manifest. Scoped to `/<portSlug>/scan` so the install
* only covers the scanner page, not the rest of the CRM. Each port
* gets its own homescreen icon labeled with its name.
*/
export async function GET(_req: Request, { params }: { params: Promise<{ portSlug: string }> }) {
const { portSlug } = await params;
const port = await db.query.ports.findFirst({ where: eq(ports.slug, portSlug) });
const portName = port?.name ?? 'Port Nimara';
const manifest = {
name: `${portName} — Scanner`,
short_name: 'Scanner',
description: `Capture and submit expense receipts for ${portName}.`,
start_url: `/${portSlug}/scan`,
scope: `/${portSlug}/scan`,
display: 'standalone',
orientation: 'portrait',
background_color: '#ffffff',
theme_color: '#3a7bc8',
icons: [
{ src: '/icon-192.png', sizes: '192x192', type: 'image/png' },
{ src: '/icon-512.png', sizes: '512x512', type: 'image/png' },
{
src: '/icon-512-maskable.png',
sizes: '512x512',
type: 'image/png',
purpose: 'maskable',
},
],
};
return NextResponse.json(manifest, {
headers: {
'Content-Type': 'application/manifest+json',
'Cache-Control': 'public, max-age=300, must-revalidate',
},
});
}


@@ -0,0 +1,11 @@
import type { Metadata } from 'next';
import { ScanShell } from '@/components/scan/scan-shell';
export const metadata: Metadata = {
title: 'Scan receipt — Port Nimara',
};
export default function ScanPage() {
return <ScanShell />;
}


@@ -1,68 +1,15 @@
import { NextResponse } from 'next/server';
import { db } from '@/lib/db';
import { redis } from '@/lib/redis';
import { minioClient } from '@/lib/minio';
import { env } from '@/lib/env';
import { sql } from 'drizzle-orm';
type CheckStatus = 'ok' | 'error';
interface HealthChecks {
postgres: CheckStatus;
redis: CheckStatus;
minio: CheckStatus;
}
interface HealthResponse {
status: 'healthy' | 'degraded';
checks: HealthChecks;
timestamp: string;
}
export async function GET(): Promise<NextResponse<HealthResponse>> {
const checks: HealthChecks = {
postgres: 'error',
redis: 'error',
minio: 'error',
};
await Promise.allSettled([
db
.execute(sql`SELECT 1`)
.then(() => {
checks.postgres = 'ok';
})
.catch(() => {
checks.postgres = 'error';
}),
redis
.ping()
.then(() => {
checks.redis = 'ok';
})
.catch(() => {
checks.redis = 'error';
}),
minioClient
.bucketExists(env.MINIO_BUCKET)
.then(() => {
checks.minio = 'ok';
})
.catch(() => {
checks.minio = 'error';
}),
]);
const allHealthy = Object.values(checks).every((s) => s === 'ok');
const status: HealthResponse['status'] = allHealthy ? 'healthy' : 'degraded';
const body: HealthResponse = {
status,
checks,
timestamp: new Date().toISOString(),
};
return NextResponse.json(body, { status: allHealthy ? 200 : 503 });
/**
* Liveness probe — confirms the Next.js process is responding.
*
* Returns 200 unconditionally; if the process is wedged or has crashed
* the request never lands here at all. Do NOT include database/Redis/MinIO
* checks in this endpoint — a transient downstream blip should drop the
* pod from the load balancer (readiness), not restart the pod (liveness).
*
* For deep dependency checks, hit `/api/ready` instead.
*/
export async function GET() {
return NextResponse.json({ status: 'ok', timestamp: new Date().toISOString() });
}
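The liveness/readiness split documented above maps directly onto the two probe types in a container orchestrator. As a sketch, assuming a Kubernetes deployment (the manifests are not part of this diff; port and timings are placeholder values):

```yaml
# Hypothetical pod-spec fragment — only the probe/endpoint pairing reflects
# this commit: /api/health is safe for liveness, /api/ready does deep checks.
livenessProbe:
  httpGet: { path: /api/health, port: 3000 }
  periodSeconds: 10
  failureThreshold: 3       # repeated misses => restart the container
readinessProbe:
  httpGet: { path: /api/ready, port: 3000 }
  periodSeconds: 10
  failureThreshold: 1       # a single 503 drops the pod from the LB
```

Wiring the deep checks into the liveness probe instead would restart pods on every transient Redis or MinIO blip, which is exactly what the comment warns against.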


@@ -12,31 +12,23 @@ import { yachts, yachtOwnershipHistory } from '@/lib/db/schema/yachts';
import { companies, companyMemberships } from '@/lib/db/schema/companies';
import { createAuditLog } from '@/lib/audit';
import { errorResponse, RateLimitError } from '@/lib/errors';
import { checkRateLimit, rateLimiters } from '@/lib/rate-limit';
import { publicInterestSchema } from '@/lib/validators/interests';
import { sendInquiryNotifications } from '@/lib/services/inquiry-notifications.service';
import { parsePhone } from '@/lib/i18n/phone';
import type { CountryCode } from '@/lib/i18n/countries';
// ─── Simple in-memory rate limiter ───────────────────────────────────────────
// Max 5 requests per hour per IP
const ipHits = new Map<string, { count: number; resetAt: number }>();
const WINDOW_MS = 60 * 60 * 1000; // 1 hour
const MAX_HITS = 5;
function checkRateLimit(ip: string): void {
const now = Date.now();
const entry = ipHits.get(ip);
if (!entry || now > entry.resetAt) {
ipHits.set(ip, { count: 1, resetAt: now + WINDOW_MS });
return;
}
if (entry.count >= MAX_HITS) {
const retryAfter = Math.ceil((entry.resetAt - now) / 1000);
/**
* Throws RateLimitError if the IP has exceeded the public-form quota.
* Backed by the Redis sliding-window limiter so the cap survives restarts
* and is shared across worker processes.
*/
async function gateRateLimit(ip: string): Promise<void> {
const result = await checkRateLimit(ip, rateLimiters.publicForm);
if (!result.allowed) {
const retryAfter = Math.max(1, Math.ceil((result.resetAt - Date.now()) / 1000));
throw new RateLimitError(retryAfter);
}
entry.count += 1;
}
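The replaced in-memory counter was fixed-window; the Redis limiter behind `rateLimiters.publicForm` is sliding-window. Its implementation is not shown in this diff, but the semantics can be sketched in pure TypeScript (production keeps the hit timestamps in a Redis sorted set so the count survives restarts and is shared across workers — the in-memory `Map` here is a stand-in):

```typescript
// Sliding-window sketch: a hit only falls out of the quota once it is a
// full window old, unlike a fixed window that resets all at once.
const WINDOW_MS = 60 * 60 * 1000; // 1 hour
const MAX_HITS = 5;

const hits = new Map<string, number[]>(); // ip -> timestamps of recent hits

function checkRateLimit(ip: string, now = Date.now()) {
  // Keep only hits still inside the window (Redis: ZREMRANGEBYSCORE).
  const recent = (hits.get(ip) ?? []).filter((t) => now - t < WINDOW_MS);
  if (recent.length >= MAX_HITS) {
    // Quota frees up when the oldest in-window hit expires.
    return { allowed: false, resetAt: recent[0] + WINDOW_MS };
  }
  recent.push(now);
  hits.set(ip, recent);
  return { allowed: true, resetAt: now + WINDOW_MS };
}
```

The route's `retryAfter` then falls out as `ceil((resetAt - now) / 1000)`, matching `gateRateLimit` above.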
type PublicInterestData = z.infer<typeof publicInterestSchema>;
@@ -50,7 +42,7 @@ type Tx = typeof db;
export async function POST(req: NextRequest) {
try {
const ip = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';
checkRateLimit(ip);
await gateRateLimit(ip);
const body = await req.json();
const data = publicInterestSchema.parse(body);
@@ -61,6 +53,16 @@ export async function POST(req: NextRequest) {
return NextResponse.json({ error: 'Port context required' }, { status: 400 });
}
// Server-side phone normalization for older website builds that post raw
// international/national strings. Newer builds may pre-fill phoneE164/Country.
let phoneE164 = data.phoneE164 ?? null;
let phoneCountry: CountryCode | null = (data.phoneCountry as CountryCode | null) ?? null;
if (!phoneE164) {
const parsed = parsePhone(data.phone, phoneCountry ?? undefined);
phoneE164 = parsed.e164;
phoneCountry = parsed.country ?? phoneCountry;
}
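The project's `parsePhone` wrapper from `@/lib/i18n/phone` is not shown in this diff; a toy stand-in illustrating the fallback contract the block above relies on (the dial-code table and normalization are deliberately minimal — the real helper would sit on a full phone-number library):

```typescript
// Hypothetical sketch: "+..." international strings normalize without a
// hint; national-format digits need a country to supply the dialing code.
const DIAL_CODES: Record<string, string> = { PA: '507', US: '1' }; // sample only

function parsePhone(raw: string, country?: string) {
  const digits = raw.replace(/[^\d+]/g, ''); // strip spaces, dashes, parens
  if (digits.startsWith('+')) {
    return { e164: digits, country: country ?? null };
  }
  const code = country ? DIAL_CODES[country] : undefined;
  if (!code) return { e164: null, country: country ?? null }; // can't resolve
  return { e164: `+${code}${digits}`, country };
}
```

This is why `phoneE164` stays nullable: an older website build posting a national-format number with no country hint simply leaves the E.164 column empty rather than guessing.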
const fullName =
data.firstName && data.lastName
? `${data.firstName} ${data.lastName}`
@@ -96,17 +98,21 @@ export async function POST(req: NextRequest) {
});
if (existingClient && existingClient.portId === portId) {
clientId = existingClient.id;
const updates: Partial<typeof clients.$inferInsert> = {};
if (data.preferredContactMethod) {
await tx
.update(clients)
.set({ preferredContactMethod: data.preferredContactMethod })
.where(eq(clients.id, clientId));
updates.preferredContactMethod = data.preferredContactMethod;
}
if (data.nationalityIso && !existingClient.nationalityIso) {
updates.nationalityIso = data.nationalityIso;
}
if (Object.keys(updates).length > 0) {
await tx.update(clients).set(updates).where(eq(clients.id, clientId));
}
} else {
clientId = await createClientInTx(tx, portId, fullName, data);
clientId = await createClientInTx(tx, portId, fullName, data, phoneE164, phoneCountry);
}
} else {
clientId = await createClientInTx(tx, portId, fullName, data);
clientId = await createClientInTx(tx, portId, fullName, data, phoneE164, phoneCountry);
}
// 2. Optional: upsert company + add membership
@@ -128,7 +134,8 @@ export async function POST(req: NextRequest) {
name: data.company.name,
legalName: data.company.legalName ?? null,
taxId: data.company.taxId ?? null,
incorporationCountry: data.company.incorporationCountry ?? null,
incorporationCountryIso: data.company.incorporationCountryIso ?? null,
incorporationSubdivisionIso: data.company.incorporationSubdivisionIso ?? null,
status: 'active',
})
.returning();
@@ -198,9 +205,9 @@ export async function POST(req: NextRequest) {
label: 'Primary',
streetAddress: data.address.street ?? null,
city: data.address.city ?? null,
stateProvince: data.address.stateProvince ?? null,
subdivisionIso: data.address.subdivisionIso ?? null,
postalCode: data.address.postalCode ?? null,
country: data.address.country ?? null,
countryIso: data.address.countryIso ?? null,
isPrimary: true,
});
}
@@ -279,7 +286,9 @@ async function createClientInTx(
tx: Tx,
portId: string,
fullName: string,
data: Pick<PublicInterestData, 'email' | 'phone' | 'preferredContactMethod'>,
data: Pick<PublicInterestData, 'email' | 'phone' | 'preferredContactMethod' | 'nationalityIso'>,
phoneE164: string | null,
phoneCountry: CountryCode | null,
): Promise<string> {
const [newClient] = await tx
.insert(clients)
@@ -287,6 +296,7 @@ async function createClientInTx(
portId,
fullName,
preferredContactMethod: data.preferredContactMethod,
nationalityIso: data.nationalityIso ?? null,
source: 'website',
})
.returning();
@@ -303,6 +313,8 @@ async function createClientInTx(
clientId,
channel: 'phone',
value: data.phone,
valueE164: phoneE164,
valueCountry: phoneCountry,
isPrimary: false,
});


@@ -14,26 +14,23 @@ import {
import { env } from '@/lib/env';
import { errorResponse, RateLimitError, ValidationError } from '@/lib/errors';
import { logger } from '@/lib/logger';
import { checkRateLimit, rateLimiters } from '@/lib/rate-limit';
import { publicResidentialInquirySchema } from '@/lib/validators/residential';
import { emitToRoom } from '@/lib/socket/server';
import { parsePhone } from '@/lib/i18n/phone';
import type { CountryCode } from '@/lib/i18n/countries';
// ─── Rate limiter (5 per hour per IP) ────────────────────────────────────────
const ipHits = new Map<string, { count: number; resetAt: number }>();
const WINDOW_MS = 60 * 60 * 1000;
const MAX_HITS = 5;
function checkRateLimit(ip: string): void {
const now = Date.now();
const entry = ipHits.get(ip);
if (!entry || now > entry.resetAt) {
ipHits.set(ip, { count: 1, resetAt: now + WINDOW_MS });
return;
/**
* Throws RateLimitError if the IP has exceeded the public-form quota.
* Backed by the Redis sliding-window limiter so the cap survives restarts
* and is shared across worker processes.
*/
async function gateRateLimit(ip: string): Promise<void> {
const result = await checkRateLimit(ip, rateLimiters.publicForm);
if (!result.allowed) {
const retryAfter = Math.max(1, Math.ceil((result.resetAt - Date.now()) / 1000));
throw new RateLimitError(retryAfter);
}
if (entry.count >= MAX_HITS) {
throw new RateLimitError(Math.ceil((entry.resetAt - now) / 1000));
}
entry.count += 1;
}
/**
@@ -47,7 +44,7 @@ function checkRateLimit(ip: string): void {
export async function POST(req: NextRequest) {
try {
const ip = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';
checkRateLimit(ip);
await gateRateLimit(ip);
const body = await req.json();
const data = publicResidentialInquirySchema.parse(body);
@@ -61,6 +58,16 @@ export async function POST(req: NextRequest) {
throw new ValidationError('Unknown port');
}
// If the website didn't pre-normalize, parse server-side. International
// strings parse without a hint; national-format submissions need a country.
let phoneE164 = data.phoneE164 ?? null;
let phoneCountry: CountryCode | null = (data.phoneCountry as CountryCode | null) ?? null;
if (!phoneE164) {
const parsed = parsePhone(data.phone, phoneCountry ?? undefined);
phoneE164 = parsed.e164;
phoneCountry = parsed.country ?? phoneCountry;
}
const result = await withTransaction(async (tx) => {
const [client] = await tx
.insert(residentialClients)
@@ -69,7 +76,13 @@ export async function POST(req: NextRequest) {
fullName: `${data.firstName.trim()} ${data.lastName.trim()}`.trim(),
email: data.email,
phone: data.phone,
phoneE164,
phoneCountry,
nationalityIso: data.nationalityIso ?? null,
timezone: data.timezone ?? null,
placeOfResidence: data.placeOfResidence,
placeOfResidenceCountryIso: data.placeOfResidenceCountryIso ?? null,
subdivisionIso: data.subdivisionIso ?? null,
preferredContactMethod: data.preferredContactMethod,
source: 'website',
status: 'prospect',


@@ -0,0 +1,82 @@
import { NextResponse } from 'next/server';
import { sql } from 'drizzle-orm';
import { db } from '@/lib/db';
import { redis } from '@/lib/redis';
import { minioClient } from '@/lib/minio';
import { env } from '@/lib/env';
type CheckStatus = 'ok' | 'error';
interface ReadyChecks {
postgres: CheckStatus;
redis: CheckStatus;
minio: CheckStatus;
}
interface ReadyResponse {
status: 'ready' | 'degraded';
checks: ReadyChecks;
timestamp: string;
}
/**
* Readiness probe — verifies that every backing service this process
* needs to serve traffic is reachable. A 503 should drop the pod from the
* load balancer until the next probe succeeds; it should not trigger a
* pod restart (that's what `/api/health` is for).
*
* Checks:
* - postgres: `SELECT 1` against the primary
* - redis: `PING`
* - minio: `bucketExists(<configured-bucket>)`
*
* Documenso + SMTP are intentionally not probed here: they're optional
* integrations, and each tenant configures its own credentials. A
 * tenant-misconfigured Documenso instance shouldn't take the entire
 * shared CRM out of rotation.
*/
export async function GET(): Promise<NextResponse<ReadyResponse>> {
const checks: ReadyChecks = {
postgres: 'error',
redis: 'error',
minio: 'error',
};
await Promise.allSettled([
db
.execute(sql`SELECT 1`)
.then(() => {
checks.postgres = 'ok';
})
.catch(() => {
checks.postgres = 'error';
}),
redis
.ping()
.then(() => {
checks.redis = 'ok';
})
.catch(() => {
checks.redis = 'error';
}),
minioClient
.bucketExists(env.MINIO_BUCKET)
.then(() => {
checks.minio = 'ok';
})
.catch(() => {
checks.minio = 'error';
}),
]);
const allReady = Object.values(checks).every((s) => s === 'ok');
const status: ReadyResponse['status'] = allReady ? 'ready' : 'degraded';
return NextResponse.json(
{ status, checks, timestamp: new Date().toISOString() },
{ status: allReady ? 200 : 503 },
);
}


@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import {
getAiBudget,
setAiBudget,
currentPeriodTokens,
periodBreakdown,
} from '@/lib/services/ai-budget.service';
const saveSchema = z.object({
enabled: z.boolean().optional(),
softCapTokens: z.number().int().nonnegative().max(100_000_000).optional(),
hardCapTokens: z.number().int().nonnegative().max(100_000_000).optional(),
period: z.enum(['day', 'week', 'month']).optional(),
});
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const [budget, used, breakdown] = await Promise.all([
getAiBudget(ctx.portId),
currentPeriodTokens(ctx.portId),
periodBreakdown(ctx.portId),
]);
return NextResponse.json({ data: { budget, used, breakdown } });
} catch (error) {
return errorResponse(error);
}
}),
);
export const PUT = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const body = await parseBody(req, saveSchema);
const next = await setAiBudget(ctx.portId, body, ctx.userId);
return NextResponse.json({ data: next });
} catch (error) {
return errorResponse(error);
}
}),
);


@@ -0,0 +1,25 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { runAlertEngineForPorts } from '@/lib/services/alert-engine';
/**
* Admin trigger for an immediate alert engine sweep over the caller's port.
* Useful for manual ops ("re-evaluate now after I fixed a rule") and
 * exercised by the real-API socket fanout test.
*
* Requires super_admin or per-port admin permissions; the engine itself
* is idempotent — duplicate runs only re-evaluate, never duplicate rows.
*/
export const POST = withAuth(async (_req, ctx) => {
try {
if (!ctx.isSuperAdmin) {
return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
}
const summary = await runAlertEngineForPorts([ctx.portId]);
return NextResponse.json({ data: summary });
} catch (error) {
return errorResponse(error);
}
});


@@ -1,29 +1,76 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { inArray } from 'drizzle-orm';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseQuery } from '@/lib/api/route-helpers';
import { listAuditLogs } from '@/lib/services/audit.service';
import { searchAuditLogs } from '@/lib/services/audit-search.service';
import { db } from '@/lib/db';
import { user } from '@/lib/db/schema/users';
import { errorResponse } from '@/lib/errors';
const auditQuerySchema = z.object({
page: z.coerce.number().int().min(1).default(1),
limit: z.coerce.number().int().min(1).max(100).default(50),
limit: z.coerce.number().int().min(1).max(200).default(50),
entityType: z.string().optional(),
action: z.string().optional(),
userId: z.string().optional(),
entityId: z.string().optional(),
dateFrom: z.string().optional(),
dateTo: z.string().optional(),
/** Free-text query against the tsvector `search_text` column. */
search: z.string().optional(),
/** Cursor pair from the previous page's response. */
cursorAt: z.string().optional(),
cursorId: z.string().optional(),
});
export const GET = withAuth(
withPermission('admin', 'view_audit_log', async (req, ctx) => {
try {
const query = parseQuery(req, auditQuerySchema);
const result = await listAuditLogs(ctx.portId, query);
return NextResponse.json(result);
const cursor =
query.cursorAt && query.cursorId
? { createdAt: new Date(query.cursorAt), id: query.cursorId }
: undefined;
const { rows, nextCursor } = await searchAuditLogs({
portId: ctx.portId,
q: query.search,
userId: query.userId,
action: query.action,
entityType: query.entityType,
entityId: query.entityId,
from: query.dateFrom ? new Date(query.dateFrom) : undefined,
to: query.dateTo ? new Date(query.dateTo) : undefined,
cursor,
limit: query.limit,
});
// Resolve actor emails in one batched query so the table can show
// who did what without N+1 round trips.
const userIds = Array.from(
new Set(rows.map((r) => r.userId).filter((id): id is string => Boolean(id))),
);
const userRows = userIds.length
? await db
.select({ id: user.id, email: user.email, name: user.name })
.from(user)
.where(inArray(user.id, userIds))
: [];
const userMap = new Map(userRows.map((u) => [u.id, u]));
const data = rows.map((r) => ({
...r,
actor: r.userId ? (userMap.get(r.userId) ?? null) : null,
}));
return NextResponse.json({
data,
pagination: {
nextCursor: nextCursor
? { createdAt: nextCursor.createdAt.toISOString(), id: nextCursor.id }
: null,
},
});
} catch (error) {
return errorResponse(error);
}
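A consumer of this endpoint pages by echoing the previous response's `nextCursor` pair back as `cursorAt`/`cursorId`. A small sketch of the client-side URL construction (param names match the zod schema above; the base path is an assumption):

```typescript
// Keyset pagination: the cursor is (createdAt, id) of the last row seen,
// so deep pages stay cheap and rows inserted mid-scroll can't shift pages.
type Cursor = { createdAt: string; id: string } | null;

function auditLogsUrl(base: string, limit: number, cursor: Cursor): string {
  const params = new URLSearchParams({ limit: String(limit) });
  if (cursor) {
    params.set('cursorAt', cursor.createdAt); // ISO string from nextCursor
    params.set('cursorId', cursor.id);        // tiebreaker for equal timestamps
  }
  return `${base}?${params.toString()}`;
}
```

When the response's `pagination.nextCursor` comes back `null`, the scan is complete.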


@@ -1,12 +1,18 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { resendCrmInvite } from '@/lib/services/crm-invite.service';
// Resend mints a fresh token + new email on a global invite row;
// restrict to super-admins to match revoke/list and avoid cross-tenant
// re-issuance of foreign-port invitations.
export const POST = withAuth(
withPermission('admin', 'manage_users', async (_req, ctx, params) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Resending CRM invites requires super-admin');
}
const id = params.id ?? '';
const result = await resendCrmInvite(id, {
userId: ctx.userId,


@@ -1,12 +1,18 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { revokeCrmInvite } from '@/lib/services/crm-invite.service';
// Invites are a global resource (no portId column). Revoking a foreign
// tenant's pending invite by id would be cross-tenant tampering;
// restrict to super-admins to match the listing endpoint.
export const DELETE = withAuth(
withPermission('admin', 'manage_users', async (_req, ctx, params) => {
try {
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Revoking CRM invites requires super-admin');
}
const id = params.id ?? '';
await revokeCrmInvite(id, {
userId: ctx.userId,


@@ -3,12 +3,20 @@ import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { errorResponse, ForbiddenError } from '@/lib/errors';
import { createCrmInvite, listCrmInvites } from '@/lib/services/crm-invite.service';
export const GET = withAuth(
withPermission('admin', 'manage_users', async (_req, _ctx) => {
withPermission('admin', 'manage_users', async (_req, ctx) => {
try {
// crm_user_invites is a global table (no per-port column) — invites
// mint better-auth users that may later be assigned roles in any
// port. Listing it cross-tenant would let a port-A director
// enumerate pending invitee emails, names, and isSuperAdmin flags
// for every other tenant. Restrict the listing to super-admins.
if (!ctx.isSuperAdmin) {
throw new ForbiddenError('Listing CRM invites requires super-admin');
}
const data = await listCrmInvites();
return NextResponse.json({ data });
} catch (error) {
@@ -24,10 +32,17 @@ const createInviteSchema = z.object({
});
export const POST = withAuth(
withPermission('admin', 'manage_users', async (req, _ctx) => {
withPermission('admin', 'manage_users', async (req, ctx) => {
try {
const body = await parseBody(req, createInviteSchema);
const result = await createCrmInvite(body);
// Only existing super-admins can mint super-admin invitations. The
// manage_users permission is granted to port-scoped director roles,
// which must not be able to elevate themselves cross-tenant by
// inviting a fresh super_admin.
if (body.isSuperAdmin && !ctx.isSuperAdmin) {
throw new ForbiddenError('Only super admins can mint super-admin invitations');
}
const result = await createCrmInvite({ ...body, invitedBy: ctx });
return NextResponse.json({ data: result }, { status: 201 });
} catch (error) {
return errorResponse(error);


@@ -0,0 +1,72 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { getPublicOcrConfig, saveOcrConfig, OCR_MODELS } from '@/lib/services/ocr-config.service';
const saveSchema = z.object({
/** When 'global', requires super_admin and stores at port_id=null. */
scope: z.enum(['port', 'global']),
provider: z.enum(['openai', 'claude']),
model: z.string().min(1),
apiKey: z.string().optional(),
clearApiKey: z.boolean().optional(),
useGlobal: z.boolean().optional(),
aiEnabled: z.boolean().optional(),
});
// Only role tiers that hold `admin.manage_settings` (director / super_admin)
// may read or write the OCR config: the apiKey is stored encrypted but is
// passed straight into the receipt-scan handler, so a swapped key would
// exfiltrate every subsequent receipt image to whatever endpoint that key
// authenticates with.
export const GET = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const url = new URL(req.url);
const scope = url.searchParams.get('scope') ?? 'port';
if (scope === 'global' && !ctx.isSuperAdmin) {
return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
}
const config = await getPublicOcrConfig(scope === 'global' ? null : ctx.portId);
return NextResponse.json({ data: config, models: OCR_MODELS });
} catch (error) {
return errorResponse(error);
}
}),
);
export const PUT = withAuth(
withPermission('admin', 'manage_settings', async (req, ctx) => {
try {
const body = await parseBody(req, saveSchema);
if (body.scope === 'global' && !ctx.isSuperAdmin) {
return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
}
const validModels = OCR_MODELS[body.provider];
if (!validModels.includes(body.model)) {
return NextResponse.json(
{ error: `Invalid model for provider ${body.provider}` },
{ status: 400 },
);
}
await saveOcrConfig(
body.scope === 'global' ? null : ctx.portId,
{
provider: body.provider,
model: body.model,
apiKey: body.apiKey,
clearApiKey: body.clearApiKey,
useGlobal: body.useGlobal,
aiEnabled: body.aiEnabled,
},
ctx.userId,
);
return NextResponse.json({ ok: true });
} catch (error) {
return errorResponse(error);
}
}),
);


@@ -0,0 +1,31 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { OCR_MODELS } from '@/lib/services/ocr-config.service';
import { testProvider } from '@/lib/services/ocr-providers';

const schema = z.object({
  provider: z.enum(['openai', 'claude']),
  model: z.string().min(1),
  apiKey: z.string().min(1),
});

// `manage_settings`-gated for parity with the parent OCR settings route —
// triggers outbound AI provider auth requests using a caller-supplied key.
export const POST = withAuth(
  withPermission('admin', 'manage_settings', async (req) => {
    try {
      const body = await parseBody(req, schema);
      if (!OCR_MODELS[body.provider].includes(body.model)) {
        return NextResponse.json({ error: 'Invalid model' }, { status: 400 });
      }
      const result = await testProvider(body.provider, body.apiKey, body.model);
      return NextResponse.json(result);
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
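The provider-to-model guard shared by both OCR routes can be exercised in isolation. A minimal sketch, assuming a hypothetical two-provider catalog — the model names below are illustrative placeholders, not the real contents of `OCR_MODELS`:

```typescript
// Hypothetical stand-in for the OCR_MODELS catalog; the real module
// exports the actual per-provider model lists.
const OCR_MODELS: Record<'openai' | 'claude', string[]> = {
  openai: ['model-a', 'model-b'], // illustrative placeholders
  claude: ['model-c'],
};

// Same shape as the guard in the routes above: reject any model name
// that is not registered for the chosen provider.
function isValidModel(provider: keyof typeof OCR_MODELS, model: string): boolean {
  return OCR_MODELS[provider].includes(model);
}
```

Validating against a server-side allowlist, rather than trusting the submitted model string, keeps the test endpoint from probing arbitrary model identifiers with a caller-supplied key.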

View File

@@ -4,11 +4,25 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { getPort, updatePort } from '@/lib/services/ports.service';
 import { updatePortSchema } from '@/lib/validators/ports';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
+/**
+ * Non-super-admin callers (e.g. port directors holding admin.manage_settings)
+ * may only read/mutate THEIR OWN port row. The path id is therefore
+ * compared against ctx.portId and a foreign target is rejected before the
+ * service is touched. Super-admins retain unrestricted access.
+ */
+function assertPortInScope(targetPortId: string, ctx: { portId: string; isSuperAdmin: boolean }) {
+  if (ctx.isSuperAdmin) return;
+  if (targetPortId !== ctx.portId) {
+    throw new ForbiddenError('Cross-tenant port access denied');
+  }
+}
 export const GET = withAuth(
-  withPermission('admin', 'manage_settings', async (_req, _ctx, params) => {
+  withPermission('admin', 'manage_settings', async (_req, ctx, params) => {
     try {
+      assertPortInScope(params.id!, ctx);
       const data = await getPort(params.id!);
       return NextResponse.json({ data });
     } catch (error) {
@@ -20,6 +34,7 @@ export const GET = withAuth(
 export const PATCH = withAuth(
   withPermission('admin', 'manage_settings', async (req, ctx, params) => {
     try {
+      assertPortInScope(params.id!, ctx);
       const body = await parseBody(req, updatePortSchema);
       const data = await updatePort(params.id!, body, {
         userId: ctx.userId,

View File

@@ -4,11 +4,18 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { listPorts, createPort } from '@/lib/services/ports.service';
 import { createPortSchema } from '@/lib/validators/ports';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
+// Listing every tenant and creating new tenants are super-admin operations:
+// a port director must not be able to enumerate other ports (target
+// discovery for cross-tenant attacks) or spin up new tenants whose admin
+// they implicitly become.
 export const GET = withAuth(
-  withPermission('admin', 'manage_settings', async () => {
+  withPermission('admin', 'manage_settings', async (_req, ctx) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Listing all ports requires super-admin');
+      }
       const data = await listPorts();
       return NextResponse.json({ data });
     } catch (error) {
@@ -20,6 +27,9 @@ export const GET = withAuth(
 export const POST = withAuth(
   withPermission('admin', 'manage_settings', async (req, ctx) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Creating ports requires super-admin');
+      }
       const body = await parseBody(req, createPortSchema);
       const data = await createPort(body, {
         userId: ctx.userId,

View File

@@ -4,14 +4,17 @@ import { withAuth } from '@/lib/api/helpers';
 import { getEmailDraftResult } from '@/lib/services/email-draft.service';
 import { errorResponse } from '@/lib/errors';
-export const GET = withAuth(async (_req, _ctx, params) => {
+export const GET = withAuth(async (_req, ctx, params) => {
   try {
     const { jobId } = params;
     if (!jobId) {
       return NextResponse.json({ error: 'jobId is required' }, { status: 400 });
     }
-    const result = await getEmailDraftResult(jobId);
+    const result = await getEmailDraftResult(jobId, {
+      userId: ctx.userId,
+      portId: ctx.portId,
+    });
     if (result === null) {
       return NextResponse.json({ status: 'processing' });

View File

@@ -0,0 +1,11 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { acknowledgeAlert } from '@/lib/services/alerts.service';

export const POST = withAuth(async (_req, ctx, params) => {
  const id = params.id;
  if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
  await acknowledgeAlert(id, ctx.portId, ctx.userId);
  return NextResponse.json({ ok: true });
});

View File

@@ -0,0 +1,11 @@
import { NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { dismissAlert } from '@/lib/services/alerts.service';

export const POST = withAuth(async (_req, ctx, params) => {
  const id = params.id;
  if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
  await dismissAlert(id, ctx.portId, ctx.userId);
  return NextResponse.json({ ok: true });
});

View File

@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';
import { and, eq, isNull, sql } from 'drizzle-orm';
import { withAuth } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { alerts } from '@/lib/db/schema/insights';

export const GET = withAuth(async (_req, ctx) => {
  const rows = await db
    .select({ severity: alerts.severity, count: sql<number>`count(*)::int` })
    .from(alerts)
    .where(
      and(eq(alerts.portId, ctx.portId), isNull(alerts.resolvedAt), isNull(alerts.dismissedAt)),
    )
    .groupBy(alerts.severity);
  const bySeverity = { info: 0, warning: 0, critical: 0 } as Record<string, number>;
  let total = 0;
  for (const r of rows) {
    bySeverity[r.severity] = r.count;
    total += r.count;
  }
  return NextResponse.json({ total, bySeverity });
});
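The severity roll-up above is easy to verify in isolation. A minimal sketch with the aggregation pulled out as a pure function — `SeverityRow` is a hypothetical stand-in for the shape of the grouped Drizzle result:

```typescript
// Pure version of the roll-up performed in the route above. Severities
// absent from the grouped rows keep their zero default.
type SeverityRow = { severity: 'info' | 'warning' | 'critical'; count: number };

function summarize(rows: SeverityRow[]) {
  const bySeverity: Record<SeverityRow['severity'], number> = {
    info: 0,
    warning: 0,
    critical: 0,
  };
  let total = 0;
  for (const r of rows) {
    bySeverity[r.severity] = r.count;
    total += r.count;
  }
  return { total, bySeverity };
}
```

Seeding all three severities to zero means the badge UI never has to null-check a missing bucket.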

View File

@@ -0,0 +1,26 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { listAlertsForPort } from '@/lib/services/alerts.service';

type AlertStatus = 'open' | 'dismissed' | 'resolved';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const url = new URL(req.url);
  const status = (url.searchParams.get('status') ?? 'open') as AlertStatus;
  const rows = await listAlertsForPort(ctx.portId, {
    includeDismissed: status !== 'open',
    includeResolved: status !== 'open',
  });
  // Filter to the requested status bucket so callers don't see overlap.
  const filtered = rows.filter((a) => {
    if (status === 'open') return !a.dismissedAt && !a.resolvedAt;
    if (status === 'dismissed') return Boolean(a.dismissedAt) && !a.resolvedAt;
    if (status === 'resolved') return Boolean(a.resolvedAt);
    return true;
  });
  return NextResponse.json({ data: filtered });
});
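The bucket rules in that filter (resolved wins over dismissed, open means neither) can be sketched as a standalone predicate — `Alertish` is a hypothetical minimal shape, the real rows carry more fields:

```typescript
// Pure form of the status-bucket filter used in the listing route.
// An alert belongs to exactly one bucket: 'resolved' takes precedence
// over 'dismissed', and 'open' means neither timestamp is set.
type Alertish = { dismissedAt: Date | null; resolvedAt: Date | null };
type AlertStatus = 'open' | 'dismissed' | 'resolved';

function inBucket(a: Alertish, status: AlertStatus): boolean {
  if (status === 'open') return !a.dismissedAt && !a.resolvedAt;
  if (status === 'dismissed') return Boolean(a.dismissedAt) && !a.resolvedAt;
  return Boolean(a.resolvedAt);
}
```

Because the buckets are mutually exclusive, an alert that was both dismissed and later resolved only appears under `resolved`.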

View File

@@ -0,0 +1,37 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import {
  ALL_RANGES,
  getLeadSourceAttribution,
  getOccupancyTimeline,
  getPipelineFunnel,
  getRevenueBreakdown,
  type DateRange,
  type MetricBase,
} from '@/lib/services/analytics.service';

const METRICS: Record<MetricBase, (portId: string, range: DateRange) => Promise<unknown>> = {
  pipeline_funnel: getPipelineFunnel,
  occupancy_timeline: getOccupancyTimeline,
  revenue_breakdown: getRevenueBreakdown,
  lead_source_attribution: getLeadSourceAttribution,
};

export const GET = withAuth(
  withPermission('reports', 'view_analytics', async (req: NextRequest, ctx) => {
    const url = new URL(req.url);
    const metric = url.searchParams.get('metric') as MetricBase | null;
    const range = (url.searchParams.get('range') ?? '30d') as DateRange;
    if (!metric || !(metric in METRICS)) {
      return NextResponse.json({ error: 'Invalid or missing metric' }, { status: 400 });
    }
    if (!ALL_RANGES.includes(range)) {
      return NextResponse.json({ error: 'Invalid range' }, { status: 400 });
    }
    const data = await METRICS[metric](ctx.portId, range);
    return NextResponse.json({ metric, range, data });
  }),
);

View File

@@ -0,0 +1,107 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { requirePermission } from '@/lib/auth/permissions';
import { errorResponse } from '@/lib/errors';
import {
  activate,
  cancel,
  endReservation,
  getById,
} from '@/lib/services/berth-reservations.service';

// ─── PATCH body schema (action-based discriminated union) ────────────────────
const patchBodySchema = z.discriminatedUnion('action', [
  z.object({
    action: z.literal('activate'),
    contractFileId: z.string().optional(),
    effectiveDate: z.coerce.date().optional(),
  }),
  z.object({
    action: z.literal('end'),
    endDate: z.coerce.date(),
    notes: z.string().optional(),
  }),
  z.object({
    action: z.literal('cancel'),
    reason: z.string().optional(),
  }),
]);

// ─── Handlers ────────────────────────────────────────────────────────────────
export const getHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const reservation = await getById(params.id!, ctx.portId);
    return NextResponse.json({ data: reservation });
  } catch (error) {
    return errorResponse(error);
  }
};

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, patchBodySchema);
    const meta = {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    };
    if (body.action === 'activate') {
      requirePermission(ctx, 'reservations', 'activate');
      const result = await activate(
        params.id!,
        ctx.portId,
        {
          contractFileId: body.contractFileId,
          effectiveDate: body.effectiveDate,
        },
        meta,
      );
      return NextResponse.json({ data: result });
    }
    if (body.action === 'end') {
      // `end` is lifecycle progression; same privilege as activate.
      requirePermission(ctx, 'reservations', 'activate');
      const result = await endReservation(
        params.id!,
        ctx.portId,
        { endDate: body.endDate, notes: body.notes },
        meta,
      );
      return NextResponse.json({ data: result });
    }
    // action === 'cancel'
    requirePermission(ctx, 'reservations', 'cancel');
    const result = await cancel(params.id!, ctx.portId, { reason: body.reason }, meta);
    return NextResponse.json({ data: result });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    await cancel(
      params.id!,
      ctx.portId,
      {},
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};

View File

@@ -1,110 +1,6 @@
-import { NextResponse } from 'next/server';
-import { z } from 'zod';
-import { withAuth, withPermission } from '@/lib/api/helpers';
+import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
-import { parseBody } from '@/lib/api/route-helpers';
-import { requirePermission } from '@/lib/auth/permissions';
-import { errorResponse } from '@/lib/errors';
-import {
-  activate,
-  cancel,
-  endReservation,
-  getById,
-} from '@/lib/services/berth-reservations.service';
-// ─── PATCH body schema (action-based discriminated union) ────────────────────
-const patchBodySchema = z.discriminatedUnion('action', [
-  z.object({
-    action: z.literal('activate'),
-    contractFileId: z.string().optional(),
-    effectiveDate: z.coerce.date().optional(),
-  }),
-  z.object({
-    action: z.literal('end'),
-    endDate: z.coerce.date(),
-    notes: z.string().optional(),
-  }),
-  z.object({
-    action: z.literal('cancel'),
-    reason: z.string().optional(),
-  }),
-]);
-// ─── Handlers ────────────────────────────────────────────────────────────────
-export const getHandler: RouteHandler = async (_req, ctx, params) => {
-  try {
-    const reservation = await getById(params.id!, ctx.portId);
-    return NextResponse.json({ data: reservation });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-export const patchHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    const body = await parseBody(req, patchBodySchema);
-    const meta = {
-      userId: ctx.userId,
-      portId: ctx.portId,
-      ipAddress: ctx.ipAddress,
-      userAgent: ctx.userAgent,
-    };
-    if (body.action === 'activate') {
-      requirePermission(ctx, 'reservations', 'activate');
-      const result = await activate(
-        params.id!,
-        ctx.portId,
-        {
-          contractFileId: body.contractFileId,
-          effectiveDate: body.effectiveDate,
-        },
-        meta,
-      );
-      return NextResponse.json({ data: result });
-    }
-    if (body.action === 'end') {
-      // `end` is lifecycle progression; same privilege as activate.
-      requirePermission(ctx, 'reservations', 'activate');
-      const result = await endReservation(
-        params.id!,
-        ctx.portId,
-        { endDate: body.endDate, notes: body.notes },
-        meta,
-      );
-      return NextResponse.json({ data: result });
-    }
-    // action === 'cancel'
-    requirePermission(ctx, 'reservations', 'cancel');
-    const result = await cancel(params.id!, ctx.portId, { reason: body.reason }, meta);
-    return NextResponse.json({ data: result });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-export const deleteHandler: RouteHandler = async (_req, ctx, params) => {
-  try {
-    await cancel(
-      params.id!,
-      ctx.portId,
-      {},
-      {
-        userId: ctx.userId,
-        portId: ctx.portId,
-        ipAddress: ctx.ipAddress,
-        userAgent: ctx.userAgent,
-      },
-    );
-    return new NextResponse(null, { status: 204 });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
+import { getHandler, patchHandler, deleteHandler } from './handlers';
+export const GET = withAuth(withPermission('reservations', 'view', getHandler));
+// PATCH cannot use `withPermission` wrapper — the required permission depends

View File

@@ -0,0 +1,35 @@
import { NextResponse } from 'next/server';
import type { AuthContext } from '@/lib/api/helpers';
import { parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listReservations } from '@/lib/services/berth-reservations.service';
import { listReservationsSchema } from '@/lib/validators/reservations';

/**
 * Port-scoped global list of reservations across all berths. Inner handler
 * lives here so it can be invoked directly from integration tests without
 * the `withAuth(withPermission(...))` wrappers (matches the convention
 * used throughout `src/app/api/v1/*`).
 */
export async function listHandler(req: Request, ctx: AuthContext): Promise<NextResponse> {
  try {
    const query = parseQuery(req as never, listReservationsSchema);
    const result = await listReservations(ctx.portId, query);
    const { page, limit } = query;
    const totalPages = Math.ceil(result.total / limit);
    return NextResponse.json({
      data: result.data,
      pagination: {
        page,
        pageSize: limit,
        total: result.total,
        totalPages,
        hasNextPage: page < totalPages,
        hasPreviousPage: page > 1,
      },
    });
  } catch (error) {
    return errorResponse(error);
  }
}
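The pagination envelope math in `listHandler` can be factored into a pure helper for illustration. A sketch; the route computes the same fields inline:

```typescript
// Builds the pagination envelope returned alongside the data array.
// `totalPages` rounds up so a final partial page still counts.
function paginate(total: number, page: number, limit: number) {
  const totalPages = Math.ceil(total / limit);
  return {
    page,
    pageSize: limit,
    total,
    totalPages,
    hasNextPage: page < totalPages,
    hasPreviousPage: page > 1,
  };
}
```

Deriving `hasNextPage`/`hasPreviousPage` server-side spares every client from repeating the boundary arithmetic.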

View File

@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { listHandler } from './handlers';

export const GET = withAuth(withPermission('reservations', 'view', listHandler));

View File

@@ -8,7 +8,7 @@ import { reorderWaitingListSchema } from '@/lib/validators/interests';
 import { getWaitingList, updateWaitingList } from '@/lib/services/berths.service';
 import { errorResponse, NotFoundError } from '@/lib/errors';
 import { db } from '@/lib/db';
-import { berthWaitingList } from '@/lib/db/schema/berths';
+import { berths, berthWaitingList } from '@/lib/db/schema/berths';
 // GET /api/v1/berths/[id]/waiting-list
 export const GET = withAuth(
@@ -47,11 +47,17 @@ export const PATCH = withAuth(
       const body = await parseBody(req, reorderWaitingListSchema);
       const berthId = params.id!;
+      // Tenant scope: refuse to reorder a foreign-port berth's waiting
+      // list. The route's URL id and the entry id are otherwise enough
+      // for any user with manage_waiting_list to mutate any tenant's
+      // queue ordering.
+      const berthRow = await db.query.berths.findFirst({
+        where: and(eq(berths.id, berthId), eq(berths.portId, ctx.portId)),
+      });
+      if (!berthRow) throw new NotFoundError('Berth');
       const entry = await db.query.berthWaitingList.findFirst({
-        where: and(
-          eq(berthWaitingList.id, body.entryId),
-          eq(berthWaitingList.berthId, berthId),
-        ),
+        where: and(eq(berthWaitingList.id, body.entryId), eq(berthWaitingList.berthId, berthId)),
       });
       if (!entry) throw new NotFoundError('Waiting list entry');

View File

@@ -1,15 +1,17 @@
 import { NextResponse } from 'next/server';
-import { withAuth } from '@/lib/api/helpers';
+import { withAuth, withPermission } from '@/lib/api/helpers';
 import { getBerthOptions } from '@/lib/services/berths.service';
 import { errorResponse } from '@/lib/errors';
 // GET /api/v1/berths/options — lightweight list for selects/comboboxes
-export const GET = withAuth(async (req, ctx) => {
-  try {
-    const options = await getBerthOptions(ctx.portId);
-    return NextResponse.json({ data: options });
-  } catch (error) {
-    return errorResponse(error);
-  }
-});
+export const GET = withAuth(
+  withPermission('berths', 'view', async (req, ctx) => {
+    try {
+      const options = await getBerthOptions(ctx.portId);
+      return NextResponse.json({ data: options });
+    } catch (error) {
+      return errorResponse(error);
+    }
+  }),
+);

View File

@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { updateClientAddress, removeClientAddress } from '@/lib/services/clients.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const updateAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const PATCH = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, updateAddressSchema);
      const row = await updateClientAddress(params.addressId!, params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      await removeClientAddress(params.addressId!, params.id!, ctx.portId, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listClientAddresses, addClientAddress } from '@/lib/services/clients.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const addAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx, params) => {
    try {
      const rows = await listClientAddresses(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, addAddressSchema);
      const row = await addClientAddress(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -5,10 +5,13 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { errorResponse } from '@/lib/errors';
 import { updateContact, removeContact } from '@/lib/services/clients.service';
+import { optionalCountryIsoSchema, optionalPhoneE164Schema } from '@/lib/validators/i18n';
 const updateContactSchema = z.object({
   channel: z.enum(['email', 'phone', 'whatsapp', 'other']).optional(),
   value: z.string().min(1).optional(),
+  valueE164: optionalPhoneE164Schema.optional(),
+  valueCountry: optionalCountryIsoSchema.optional(),
   label: z.string().optional(),
   isPrimary: z.boolean().optional(),
   notes: z.string().optional(),
@@ -18,18 +21,12 @@ export const PATCH = withAuth(
   withPermission('clients', 'edit', async (req, ctx, params) => {
     try {
       const body = await parseBody(req, updateContactSchema);
-      const contact = await updateContact(
-        params.contactId!,
-        params.id!,
-        ctx.portId,
-        body,
-        {
-          userId: ctx.userId,
-          portId: ctx.portId,
-          ipAddress: ctx.ipAddress,
-          userAgent: ctx.userAgent,
-        },
-      );
+      const contact = await updateContact(params.contactId!, params.id!, ctx.portId, body, {
+        userId: ctx.userId,
+        portId: ctx.portId,
+        ipAddress: ctx.ipAddress,
+        userAgent: ctx.userAgent,
+      });
       return NextResponse.json({ data: contact });
     } catch (error) {
       return errorResponse(error);
View File

@@ -5,10 +5,13 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { errorResponse } from '@/lib/errors';
 import { listContacts, addContact } from '@/lib/services/clients.service';
+import { optionalCountryIsoSchema, optionalPhoneE164Schema } from '@/lib/validators/i18n';
 const addContactSchema = z.object({
   channel: z.enum(['email', 'phone', 'whatsapp', 'other']),
   value: z.string().min(1),
+  valueE164: optionalPhoneE164Schema.optional(),
+  valueCountry: optionalCountryIsoSchema.optional(),
   label: z.string().optional(),
   isPrimary: z.boolean().optional().default(false),
   notes: z.string().optional(),

View File

@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { getExportDownloadUrl } from '@/lib/services/gdpr-export.service';

/**
 * Returns a fresh signed URL for an existing GDPR export. Staff use this
 * from the admin UI; the email path embeds its own signed URL.
 */
export const GET = withAuth(
  withPermission(
    'admin',
    'manage_settings',
    withRateLimit('exports', async (req, ctx, params) => {
      try {
        const url = await getExportDownloadUrl(params.exportId!, ctx.portId);
        return NextResponse.json({ data: { url } });
      } catch (error) {
        return errorResponse(error);
      }
    }),
  ),
);

View File

@@ -0,0 +1,49 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { requestGdprExport, listClientExports } from '@/lib/services/gdpr-export.service';

const requestSchema = z.object({
  /** When true, the bundle is emailed to the client once it finishes building. */
  emailToClient: z.boolean().optional().default(false),
  /** Optional override recipient (e.g. legal counsel). Skips the primary-email lookup. */
  emailOverride: z.string().email().optional().nullable(),
});

export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx, params) => {
    try {
      const rows = await listClientExports(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission(
    'admin',
    'manage_settings',
    withRateLimit('exports', async (req, ctx, params) => {
      try {
        const body = await parseBody(req, requestSchema);
        const result = await requestGdprExport({
          clientId: params.id!,
          portId: ctx.portId,
          requestedBy: ctx.userId,
          emailToClient: body.emailToClient,
          emailOverride: body.emailOverride ?? null,
          ipAddress: ctx.ipAddress,
          userAgent: ctx.userAgent,
        });
        return NextResponse.json({ data: result.export }, { status: 202 });
      } catch (error) {
        return errorResponse(error);
      }
    }),
  ),
);

View File

@@ -1,15 +1,17 @@
 import { NextResponse } from 'next/server';
-import { withAuth } from '@/lib/api/helpers';
+import { withAuth, withPermission } from '@/lib/api/helpers';
 import { errorResponse } from '@/lib/errors';
 import { listClientOptions } from '@/lib/services/clients.service';
-export const GET = withAuth(async (req, ctx) => {
-  try {
-    const search = req.nextUrl.searchParams.get('search') ?? undefined;
-    const data = await listClientOptions(ctx.portId, search);
-    return NextResponse.json({ data });
-  } catch (error) {
-    return errorResponse(error);
-  }
-});
+export const GET = withAuth(
+  withPermission('clients', 'view', async (req, ctx) => {
+    try {
+      const search = req.nextUrl.searchParams.get('search') ?? undefined;
+      const data = await listClientOptions(ctx.portId, search);
+      return NextResponse.json({ data });
+    } catch (error) {
+      return errorResponse(error);
+    }
+  }),
+);

View File

@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { updateCompanyAddress, removeCompanyAddress } from '@/lib/services/companies.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const updateAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const PATCH = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, updateAddressSchema);
      const row = await updateCompanyAddress(params.addressId!, params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      await removeCompanyAddress(params.addressId!, params.id!, ctx.portId, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listCompanyAddresses, addCompanyAddress } from '@/lib/services/companies.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const addAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const GET = withAuth(
  withPermission('companies', 'view', async (req, ctx, params) => {
    try {
      const rows = await listCompanyAddresses(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, addAddressSchema);
      const row = await addCompanyAddress(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,47 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { endMembership, updateMembership } from '@/lib/services/company-memberships.service';
import { endMembershipSchema, updateMembershipSchema } from '@/lib/validators/company-memberships';

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, updateMembershipSchema);
    const updated = await updateMembership(params.mid!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: updated });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (req, ctx, params) => {
  try {
    let endDate = new Date();
    const text = await req.text();
    if (text.length > 0) {
      const parsed = endMembershipSchema.parse(JSON.parse(text));
      endDate = parsed.endDate;
    }
    await endMembership(
      params.mid!,
      ctx.portId,
      { endDate },
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};

View File

@@ -1,50 +1,6 @@
-import { NextResponse } from 'next/server';
-import { withAuth, withPermission } from '@/lib/api/helpers';
+import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
-import { parseBody } from '@/lib/api/route-helpers';
-import { errorResponse } from '@/lib/errors';
-import { endMembership, updateMembership } from '@/lib/services/company-memberships.service';
-import { endMembershipSchema, updateMembershipSchema } from '@/lib/validators/company-memberships';
-export const patchHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    const body = await parseBody(req, updateMembershipSchema);
-    const updated = await updateMembership(params.mid!, ctx.portId, body, {
-      userId: ctx.userId,
-      portId: ctx.portId,
-      ipAddress: ctx.ipAddress,
-      userAgent: ctx.userAgent,
-    });
-    return NextResponse.json({ data: updated });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
-export const deleteHandler: RouteHandler = async (req, ctx, params) => {
-  try {
-    let endDate = new Date();
-    const text = await req.text();
-    if (text.length > 0) {
-      const parsed = endMembershipSchema.parse(JSON.parse(text));
-      endDate = parsed.endDate;
-    }
-    await endMembership(
-      params.mid!,
-      ctx.portId,
-      { endDate },
-      {
-        userId: ctx.userId,
-        portId: ctx.portId,
-        ipAddress: ctx.ipAddress,
-        userAgent: ctx.userAgent,
-      },
-    );
-    return new NextResponse(null, { status: 204 });
-  } catch (error) {
-    return errorResponse(error);
-  }
-};
+import { patchHandler, deleteHandler } from './handlers';
+export const PATCH = withAuth(withPermission('memberships', 'manage', patchHandler));
+export const DELETE = withAuth(withPermission('memberships', 'manage', deleteHandler));

View File

@@ -0,0 +1,19 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { setPrimary } from '@/lib/services/company-memberships.service';

export const setPrimaryHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const membership = await setPrimary(params.mid!, ctx.portId, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership });
  } catch (error) {
    return errorResponse(error);
  }
};

View File

@@ -1,21 +1,5 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { setPrimary } from '@/lib/services/company-memberships.service';
export const setPrimaryHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const membership = await setPrimary(params.mid!, ctx.portId, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership });
  } catch (error) {
    return errorResponse(error);
  }
};
import { setPrimaryHandler } from './handlers';
export const POST = withAuth(withPermission('memberships', 'manage', setPrimaryHandler));

View File

@@ -0,0 +1,40 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody, parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { addMembership, listByCompany } from '@/lib/services/company-memberships.service';
import { addMembershipSchema } from '@/lib/validators/company-memberships';
const listQuerySchema = z.object({
  activeOnly: z
    .enum(['true', 'false'])
    .transform((v) => v === 'true')
    .default('true'),
});
export const listHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const { activeOnly } = parseQuery(req, listQuerySchema);
    const memberships = await listByCompany(params.id!, ctx.portId, { activeOnly });
    return NextResponse.json({ data: memberships });
  } catch (error) {
    return errorResponse(error);
  }
};
export const createHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, addMembershipSchema);
    const membership = await addMembership(params.id!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership }, { status: 201 });
  } catch (error) {
    return errorResponse(error);
  }
};

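The `listQuerySchema` above handles the usual query-string awkwardness: params arrive as strings, so `activeOnly` is restricted to the literals `'true'`/`'false'`, transformed to a real boolean, and defaulted to `true` when absent. A dependency-free sketch of the same semantics (the real code uses zod; this stand-in only illustrates the behavior):

```typescript
// Hand-rolled equivalent of the zod chain above, for illustration only:
// null models a missing query param.
function parseActiveOnly(raw: string | null): boolean {
  if (raw === null) return true;           // .default('true')
  if (raw !== 'true' && raw !== 'false') { // z.enum(['true', 'false'])
    throw new Error(`invalid activeOnly: ${raw}`);
  }
  return raw === 'true';                   // .transform((v) => v === 'true')
}
```

Rejecting junk values outright (rather than coercing anything truthy) keeps `?activeOnly=1` from silently meaning something the caller did not intend.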
View File

@@ -1,43 +1,6 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { parseBody, parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { addMembership, listByCompany } from '@/lib/services/company-memberships.service';
import { addMembershipSchema } from '@/lib/validators/company-memberships';
const listQuerySchema = z.object({
  activeOnly: z
    .enum(['true', 'false'])
    .transform((v) => v === 'true')
    .default('true'),
});
export const listHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const { activeOnly } = parseQuery(req, listQuerySchema);
    const memberships = await listByCompany(params.id!, ctx.portId, { activeOnly });
    return NextResponse.json({ data: memberships });
  } catch (error) {
    return errorResponse(error);
  }
};
export const createHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, addMembershipSchema);
    const membership = await addMembership(params.id!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership }, { status: 201 });
  } catch (error) {
    return errorResponse(error);
  }
};
import { listHandler, createHandler } from './handlers';
export const GET = withAuth(withPermission('memberships', 'view', listHandler));
export const POST = withAuth(withPermission('memberships', 'manage', createHandler));

View File

@@ -0,0 +1,18 @@
import { NextResponse } from 'next/server';
import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { autocomplete } from '@/lib/services/companies.service';
export const autocompleteHandler: RouteHandler = async (req, ctx) => {
  try {
    const q = req.nextUrl.searchParams.get('q');
    if (!q) {
      return NextResponse.json({ data: [] });
    }
    const companies = await autocomplete(ctx.portId, q);
    return NextResponse.json({ data: companies });
  } catch (error) {
    return errorResponse(error);
  }
};

View File

@@ -1,20 +1,5 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { autocomplete } from '@/lib/services/companies.service';
export const autocompleteHandler: RouteHandler = async (req, ctx) => {
  try {
    const q = req.nextUrl.searchParams.get('q');
    if (!q) {
      return NextResponse.json({ data: [] });
    }
    const companies = await autocomplete(ctx.portId, q);
    return NextResponse.json({ data: companies });
  } catch (error) {
    return errorResponse(error);
  }
};
import { autocompleteHandler } from './handlers';
export const GET = withAuth(withPermission('companies', 'view', autocompleteHandler));

View File

@@ -1,45 +1,55 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse, NotFoundError } from '@/lib/errors';
import { setValuesSchema } from '@/lib/validators/custom-fields';
import { getValues, setValues } from '@/lib/services/custom-fields.service';
export const GET = withAuth(async (_req: NextRequest, ctx, params) => {
  try {
    const { entityId } = params;
    if (!entityId) throw new NotFoundError('Entity');
// Custom-field values live on top of a port-scoped entity (client, yacht,
// interest, berth, company). Reading the values is in scope for any role
// that can view clients (the most common surface); writing requires the
// equivalent edit permission. The service layer also re-validates the
// entityId against the field definition's entityType + portId so a
// caller cannot poke values onto an arbitrary or foreign-port entity.
export const GET = withAuth(
  withPermission('clients', 'view', async (_req: NextRequest, ctx, params) => {
    try {
      const { entityId } = params;
      if (!entityId) throw new NotFoundError('Entity');
    const data = await getValues(entityId, ctx.portId);
    return NextResponse.json({ data });
  } catch (error) {
    return errorResponse(error);
  }
});
      const data = await getValues(entityId, ctx.portId);
      return NextResponse.json({ data });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
export const PUT = withAuth(async (req: NextRequest, ctx, params) => {
  try {
    const { entityId } = params;
    if (!entityId) throw new NotFoundError('Entity');
export const PUT = withAuth(
  withPermission('clients', 'edit', async (req: NextRequest, ctx, params) => {
    try {
      const { entityId } = params;
      if (!entityId) throw new NotFoundError('Entity');
    const body = await req.json();
    const { values } = setValuesSchema.parse(body);
      const body = await req.json();
      const { values } = setValuesSchema.parse(body);
    const result = await setValues(
      entityId,
      ctx.portId,
      ctx.userId,
      values as Array<{ fieldId: string; value: unknown }>,
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
      const result = await setValues(
        entityId,
        ctx.portId,
        ctx.userId,
        values as Array<{ fieldId: string; value: unknown }>,
        {
          userId: ctx.userId,
          portId: ctx.portId,
          ipAddress: ctx.ipAddress,
          userAgent: ctx.userAgent,
        },
      );
    return NextResponse.json({ data: result });
  } catch (error) {
    return errorResponse(error);
  }
});
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

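The comment in this diff leans on the service layer re-validating `entityId` against the field definition's `entityType` and `portId`. A hedged sketch of what such a guard looks like (the lookup objects here are in-memory stand-ins for the real database rows; names are illustrative, not the project's actual service code):

```typescript
// Illustrative guard in the spirit of the comment above: before writing a
// custom-field value, confirm the target entity belongs to the caller's port
// and matches the field definition's entityType.
type EntityType = 'client' | 'yacht' | 'interest' | 'berth' | 'company';
interface EntityRow { id: string; portId: string; type: EntityType }
interface FieldDef { id: string; portId: string; entityType: EntityType }

function assertWritable(entity: EntityRow | undefined, field: FieldDef, callerPortId: string): void {
  // Foreign-port rows are reported as "not found" rather than "forbidden",
  // so the response does not leak that the id exists in another port.
  if (!entity || entity.portId !== callerPortId) throw new Error('Entity not found');
  if (field.portId !== callerPortId) throw new Error('Field not found');
  if (field.entityType !== entity.type) throw new Error('Field does not apply to this entity type');
}
```

Treating a foreign-port hit the same as a missing row is the standard multi-tenant move: authorization failures and nonexistence become indistinguishable to the caller.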
View File

@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getRecentActivity } from '@/lib/services/dashboard.service';
export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getRecentActivity(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getRecentActivity(ctx.portId);
    return NextResponse.json(result);
  }),
);

View File

@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getRevenueForecast } from '@/lib/services/dashboard.service';
export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getRevenueForecast(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getRevenueForecast(ctx.portId);
    return NextResponse.json(result);
  }),
);

View File

@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getKpis } from '@/lib/services/dashboard.service';
export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getKpis(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getKpis(ctx.portId);
    return NextResponse.json(result);
  }),
);

View File

@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';
import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getPipelineCounts } from '@/lib/services/dashboard.service';
export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getPipelineCounts(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getPipelineCounts(ctx.portId);
    return NextResponse.json(result);
  }),
);

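All four dashboard routes now gain the same `withPermission('reports', 'view_dashboard', …)` wrapper. A minimal sketch of how such a higher-order wrapper composes (an assumed shape for illustration, not the project's actual helpers, which also thread the request, params, and audit context through):

```typescript
// Simplified, synchronous model of the wrapping pattern: withPermission takes
// a resource/action pair plus a handler and returns a handler that rejects
// callers lacking that permission before any service code runs.
type Ctx = { portId: string; permissions: Set<string> };
type Handler = (ctx: Ctx) => { status: number; body: unknown };

function withPermission(resource: string, action: string, handler: Handler): Handler {
  return (ctx) => {
    if (!ctx.permissions.has(`${resource}:${action}`)) {
      return { status: 403, body: { error: 'Forbidden' } };
    }
    return handler(ctx);
  };
}

// Usage mirroring the dashboard routes above (stub service call):
const getKpisRoute = withPermission('reports', 'view_dashboard', () => ({
  status: 200,
  body: { kpis: [] },
}));
```

Because the wrapper returns another handler of the same type, it stacks cleanly with `withAuth` (and, as the scan route later in this diff shows, with `withRateLimit`).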
View File

@@ -1,14 +1,32 @@
import { NextResponse } from 'next/server';
import { eq } from 'drizzle-orm';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { db } from '@/lib/db';
import { emailAccounts } from '@/lib/db/schema/email';
import { errorResponse, ForbiddenError, NotFoundError } from '@/lib/errors';
import { getQueue } from '@/lib/queue';
export const POST = withAuth(
  withPermission('email', 'view', async (_req, _ctx, params) => {
  withPermission('email', 'view', async (_req, ctx, params) => {
    try {
      const accountId = params.accountId!;
      // Owner check: the sibling toggle/disconnect endpoints already enforce
      // account.userId === ctx.userId. Without the same check here, any
      // user with `email:view` could force IMAP sync against a foreign
      // account, advancing lastSyncAt (data-loss risk on the legitimate
      // owner's next sync) and triggering work using the foreign user's
      // decrypted credentials.
      const account = await db.query.emailAccounts.findFirst({
        where: eq(emailAccounts.id, accountId),
      });
      if (!account) throw new NotFoundError('Email account');
      if (account.userId !== ctx.userId) {
        throw new ForbiddenError('You do not own this email account');
      }
      const queue = getQueue('email');
      const job = await queue.add('inbox-sync', { accountId: params.accountId! });
      const job = await queue.add('inbox-sync', { accountId });
      return NextResponse.json({ data: { jobId: job.id } }, { status: 202 });
    } catch (error) {
      return errorResponse(error);

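The ownership gate this diff adds follows a common resolve-then-authorize shape: load the record first, treat unknown ids as 404, and reject non-owners with 403 before any side effect (here, enqueuing the sync job). A hedged sketch of that decision, with illustrative types rather than the project's real schema:

```typescript
// Sketch of the owner check described in the comment above, reduced to a pure
// decision function over an already-fetched row.
interface EmailAccount { id: string; userId: string }

function authorizeSync(account: EmailAccount | undefined, callerUserId: string): number {
  if (!account) return 404;                        // NotFoundError: unknown account id
  if (account.userId !== callerUserId) return 403; // ForbiddenError: not the owner
  return 202;                                      // ok to enqueue the inbox-sync job
}
```

Keeping the check in the route (rather than only in the sibling toggle/disconnect endpoints) matters because the queue worker runs with the account's decrypted credentials and would otherwise act on whatever id the caller supplied.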
View File

@@ -0,0 +1,18 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { clearDuplicate } from '@/lib/services/expense-dedup.service';
export const POST = withAuth(
  withPermission('expenses', 'edit', async (_req, ctx, params) => {
    try {
      const id = params.id;
      if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
      await clearDuplicate(id, ctx.portId);
      return NextResponse.json({ ok: true });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -0,0 +1,28 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { mergeDuplicate } from '@/lib/services/expense-dedup.service';
const mergeSchema = z.object({
  /** Surviving expense id — typically the row's existing `duplicateOf` pointer. */
  targetId: z.string().min(1),
});
export const POST = withAuth(
  withPermission('expenses', 'edit', async (req, ctx, params) => {
    try {
      const sourceId = params.id;
      if (!sourceId) {
        return NextResponse.json({ error: 'Missing id' }, { status: 400 });
      }
      const body = await parseBody(req, mergeSchema);
      await mergeDuplicate(sourceId, body.targetId, ctx.portId);
      return NextResponse.json({ ok: true });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -1,27 +1,117 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { scanReceipt } from '@/lib/services/receipt-scanner';
import { logger } from '@/lib/logger';
import { getResolvedOcrConfig } from '@/lib/services/ocr-config.service';
import {
runOcr,
type ParsedReceipt,
OCR_FEATURE,
OCR_ESTIMATED_TOKENS,
} from '@/lib/services/ocr-providers';
import { checkBudget, recordAiUsage } from '@/lib/services/ai-budget.service';
const EMPTY: ParsedReceipt = {
  establishment: null,
  date: null,
  amount: null,
  currency: null,
  lineItems: [],
  confidence: 0,
};
export const POST = withAuth(
  withPermission('expenses', 'create', async (req, _ctx) => {
    try {
      const formData = await req.formData();
      const file = formData.get('file') as File | null;
  withPermission(
    'expenses',
    'create',
    withRateLimit('ocr', async (req, ctx) => {
      try {
        const formData = await req.formData();
        const file = formData.get('file') as File | null;
        if (!file) {
          return NextResponse.json({ error: 'No file provided' }, { status: 400 });
        }
        const buffer = Buffer.from(await file.arrayBuffer());
        const mimeType = file.type || 'image/jpeg';
      if (!file) {
        return NextResponse.json({ error: 'No file provided' }, { status: 400 });
        const config = await getResolvedOcrConfig(ctx.portId);
        // Tesseract.js (in-browser) is the default. The server only invokes
        // an AI provider when (a) the port admin has flipped `aiEnabled` on
        // and (b) a key resolves. Otherwise the client falls back to its
        // local Tesseract result.
        if (!config.aiEnabled) {
          return NextResponse.json({
            data: { parsed: EMPTY, source: 'manual', reason: 'ai-disabled' },
          });
        }
        if (!config.apiKey) {
          return NextResponse.json({
            data: { parsed: EMPTY, source: 'manual', reason: 'no-ocr-configured' },
          });
        }
        // Per-port budget gate — refuse the call before we spend tokens
        // when the port has already hit its hard cap, or when the request
        // would push it past the cap. Soft-cap warnings ride along on the
        // success response so the UI can show a banner without blocking.
        const budget = await checkBudget({
          portId: ctx.portId,
          estimatedTokens: OCR_ESTIMATED_TOKENS,
        });
        if (!budget.ok) {
          return NextResponse.json({
            data: {
              parsed: EMPTY,
              source: 'manual',
              reason: 'budget-exceeded',
              providerError: `AI budget reached (${budget.usedTokens}/${budget.capTokens} tokens this period).`,
            },
          });
        }
        try {
          const result = await runOcr({
            provider: config.provider,
            model: config.model,
            apiKey: config.apiKey,
            imageBuffer: buffer,
            mimeType,
          });
          await recordAiUsage({
            portId: ctx.portId,
            userId: ctx.userId,
            feature: OCR_FEATURE,
            provider: config.provider,
            model: config.model,
            inputTokens: result.usage.inputTokens,
            outputTokens: result.usage.outputTokens,
            requestId: result.usage.requestId,
          });
          return NextResponse.json({
            data: {
              parsed: result.parsed,
              source: 'ai',
              provider: config.provider,
              model: config.model,
              softCapWarning: budget.softCap,
            },
          });
        } catch (err) {
          logger.error({ err, provider: config.provider }, 'OCR provider call failed');
          // Provider hiccup — degrade to manual entry rather than 500-ing.
          return NextResponse.json({
            data: {
              parsed: EMPTY,
              source: 'manual',
              reason: 'provider-error',
              providerError: err instanceof Error ? err.message.slice(0, 200) : 'Unknown error',
            },
          });
        }
      } catch (error) {
        return errorResponse(error);
      }
      const buffer = Buffer.from(await file.arrayBuffer());
      const mimeType = file.type || 'image/jpeg';
      const result = await scanReceipt(buffer, mimeType);
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
    }),
  ),
);

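The budget gate in this route follows a check-before-spend, record-after pattern: refuse when the estimated tokens would cross the hard cap, and attach a non-blocking soft-cap warning to successful responses. A hedged sketch of `checkBudget`'s apparent semantics (the 80% soft-cap threshold is an assumption for illustration; the real service reads per-port state from the database):

```typescript
// Illustrative, in-memory model of the budget decision in the route above.
interface BudgetState { usedTokens: number; capTokens: number }
interface BudgetCheck { ok: boolean; softCap: boolean; usedTokens: number; capTokens: number }

function checkBudget(state: BudgetState, estimatedTokens: number): BudgetCheck {
  // Hard cap: refuse if this request would push usage past the cap.
  const ok = state.usedTokens + estimatedTokens <= state.capTokens;
  // Soft cap (assumed 80%): warn but do not block, so the UI can show a banner.
  const softCap = ok && state.usedTokens + estimatedTokens >= state.capTokens * 0.8;
  return { ok, softCap, usedTokens: state.usedTokens, capTokens: state.capTokens };
}
```

Estimating the spend up front (here `OCR_ESTIMATED_TOKENS`) is what lets the gate reject a request that would overshoot, instead of only noticing after `recordAiUsage` has booked the real token counts.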
View File

@@ -0,0 +1,41 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { clearInterestOutcome, setInterestOutcome } from '@/lib/services/interests.service';
import { clearOutcomeSchema, setOutcomeSchema } from '@/lib/validators/interests';
export const POST = withAuth(
  withPermission('interests', 'change_stage', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, setOutcomeSchema);
      const result = await setInterestOutcome(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
export const DELETE = withAuth(
  withPermission('interests', 'change_stage', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, clearOutcomeSchema);
      const result = await clearInterestOutcome(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

View File

@@ -7,6 +7,26 @@ import { db } from '@/lib/db';
import { interests } from '@/lib/db/schema/interests';
import { auditLogs } from '@/lib/db/schema/system';
import { documents, documentEvents } from '@/lib/db/schema/documents';
import { user } from '@/lib/db/schema/users';
import { stageLabel } from '@/lib/constants';
const OUTCOME_LABELS: Record<string, string> = {
  won: 'Won',
  lost_other_marina: 'Lost — went to another marina',
  lost_unqualified: 'Lost — unqualified',
  lost_no_response: 'Lost — no response',
  cancelled: 'Cancelled',
};
const DOC_EVENT_LABELS: Record<string, string> = {
  sent: 'sent for signing',
  completed: 'fully signed',
  signed: 'signed by recipient',
  rejected: 'rejected',
  expired: 'expired',
  cancelled: 'cancelled',
  reminder_sent: 'reminder sent',
};
interface TimelineEvent {
  id: string;
@@ -14,6 +34,10 @@ interface TimelineEvent {
  action: string;
  description: string;
  userId: string | null;
  /** Resolved display name for `userId`. `'system'` for auto-events; null when
   * the user has been deleted or the event has no actor. Falls back to
   * email-localpart if the user has no display name. */
  userName: string | null;
  createdAt: Date;
  metadata: Record<string, unknown>;
}
@@ -33,12 +57,7 @@ export const GET = withAuth(
    const auditRows = await db
      .select()
      .from(auditLogs)
      .where(
        and(
          eq(auditLogs.entityType, 'interest'),
          eq(auditLogs.entityId, interestId),
        ),
      )
      .where(and(eq(auditLogs.entityType, 'interest'), eq(auditLogs.entityId, interestId)))
      .orderBy(desc(auditLogs.createdAt))
      .limit(50);
@@ -67,28 +86,82 @@ export const GET = withAuth(
    const docTitles = Object.fromEntries(interestDocs.map((d) => [d.id, d.title]));
    // Resolve display names for any `userId` that is a real user row (the
    // sentinel value 'system' is used for auto-events and isn't joined).
    const realUserIds = Array.from(
      new Set(auditRows.map((r) => r.userId).filter((u): u is string => !!u && u !== 'system')),
    );
    const userRows =
      realUserIds.length > 0
        ? await db
            .select({ id: user.id, name: user.name, email: user.email })
            .from(user)
            .where(inArray(user.id, realUserIds))
        : [];
    const userNameById = new Map<string, string>(
      userRows.map((u) => [u.id, u.name?.trim() || u.email.split('@')[0] || 'User']),
    );
    const resolveUserName = (userId: string | null): string | null => {
      if (!userId) return null;
      if (userId === 'system') return 'system';
      return userNameById.get(userId) ?? null;
    };
    // Union and sort
    const auditEvents: TimelineEvent[] = auditRows.map((row) => ({
      id: row.id,
      type: 'audit',
      action: row.action,
      description: buildAuditDescription(row.action, row.newValue as Record<string, unknown> | null),
      description: buildAuditDescription(
        row.action,
        row.newValue as Record<string, unknown> | null,
        (row.metadata as Record<string, unknown>) ?? {},
        row.userId,
      ),
      userId: row.userId,
      userName: resolveUserName(row.userId),
      createdAt: row.createdAt,
      metadata: (row.metadata as Record<string, unknown>) ?? {},
    }));
    const docEvents: TimelineEvent[] = docEventRows.map((row) => ({
      id: row.id,
      type: 'document_event',
      action: row.eventType,
      description: `Document "${docTitles[row.documentId] ?? row.documentId}": ${row.eventType}`,
      userId: null,
      createdAt: row.createdAt,
      metadata: (row.eventData as Record<string, unknown>) ?? {},
    }));
    const docEvents: TimelineEvent[] = docEventRows.map((row) => {
      const title = docTitles[row.documentId] ?? row.documentId;
      const action = DOC_EVENT_LABELS[row.eventType] ?? row.eventType;
      return {
        id: row.id,
        type: 'document_event',
        action: row.eventType,
        description: `Document "${title}" ${action}`,
        userId: null,
        userName: null,
        createdAt: row.createdAt,
        metadata: (row.eventData as Record<string, unknown>) ?? {},
      };
    });
    const allEvents = [...auditEvents, ...docEvents];
    // Fallback: when no audit-log entries exist for this interest (typical
    // for seed/imported data inserted directly into the table without going
    // through the service), synthesize a "Created at <stage>" event so the
    // tab isn't empty when the interest is clearly past `open`.
    const hasCreateAudit = allEvents.some((e) => e.action === 'create');
    if (!hasCreateAudit) {
      const stage = stageLabel(interest.pipelineStage);
      const created = interest.createdAt ?? new Date();
      allEvents.push({
        id: `synth-${interest.id}-create`,
        type: 'audit',
        action: 'create',
        description:
          interest.pipelineStage === 'open' ? 'Interest created' : `Interest created at ${stage}`,
        userId: null,
        userName: null,
        createdAt: created,
        metadata: { synthetic: true },
      });
    }
    allEvents.sort((a, b) => new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime());
    return NextResponse.json({ data: allEvents.slice(0, 50) });
@@ -101,12 +174,39 @@ export const GET = withAuth(
function buildAuditDescription(
  action: string,
  newValue: Record<string, unknown> | null,
  metadata: Record<string, unknown>,
  userId: string | null,
): string {
  if (action === 'create') return 'Interest created';
  if (action === 'archive') return 'Interest archived';
  if (action === 'restore') return 'Interest restored';
  const type = metadata.type;
  if (type === 'outcome_set') {
    const outcomeKey = (newValue?.outcome as string | undefined) ?? '';
    const label = OUTCOME_LABELS[outcomeKey] ?? (outcomeKey || 'Closed');
    const reason = (newValue?.reason as string | undefined) ?? '';
    return reason ? `Marked as ${label}: ${reason}` : `Marked as ${label}`;
  }
  if (type === 'outcome_cleared') {
    const stage = (newValue?.pipelineStage as string | undefined) ?? '';
    return stage ? `Reopened to ${stageLabel(stage)}` : 'Reopened';
  }
  if (type === 'stage_change' && newValue?.pipelineStage) {
    const stage = stageLabel(newValue.pipelineStage as string);
    const reason = (newValue.reason as string | undefined) ?? '';
    const auto = userId === 'system';
    if (auto) {
      return reason ? `${stage} (auto-advanced — ${reason})` : `Stage advanced to ${stage}`;
    }
    return reason ? `Stage changed to ${stage}: ${reason}` : `Stage changed to ${stage}`;
  }
  if (action === 'update' && newValue?.pipelineStage) {
    return `Stage changed to "${newValue.pipelineStage}"`;
    return `Stage changed to ${stageLabel(newValue.pipelineStage as string)}`;
  }
  if (action === 'update') return 'Interest updated';
  return action;

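The timeline assembly above can be condensed into one testable shape: union the audit and document event streams, synthesize a "created" event when no create audit exists (seed or imported rows), then sort newest-first and cap the page at 50. A simplified sketch (event shapes reduced to what the ordering logic needs; not the route's full types):

```typescript
// Condensed model of the timeline union in the diff above.
interface TimelineEvent { id: string; action: string; createdAt: Date }

function assembleTimeline(
  auditEvents: TimelineEvent[],
  docEvents: TimelineEvent[],
  interestCreatedAt: Date,
): TimelineEvent[] {
  const all = [...auditEvents, ...docEvents];
  // Fallback for rows inserted without going through the service: make sure
  // the timeline always shows a creation event.
  if (!all.some((e) => e.action === 'create')) {
    all.push({ id: 'synth-create', action: 'create', createdAt: interestCreatedAt });
  }
  all.sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
  return all.slice(0, 50);
}
```

Pushing the synthetic event before sorting means it lands in chronological position (usually last) rather than being pinned to either end of the list.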
Some files were not shown because too many files have changed in this diff.