Compare commits: f1ed2a5f87...feat/mobil (84 commits)
`.gitignore` (vendored, 9 lines changed)

@@ -20,10 +20,17 @@ tsconfig.tsbuildinfo
docker-compose.override.yml
.remember/
.DS_Store
eoi/
# Root-only ad-hoc EOI scratch dir; routes under src/app/.../eoi/ must NOT match.
/eoi/

# Brainstorming companion mockup files
.superpowers/

# Ad-hoc screenshots / scratch artifacts at repo root
/*.png

# Legacy Nuxt portal — kept on disk for reference, not tracked here
/client-portal/

# Mobile audit screenshots — generated locally, regenerable
/.audit/
Submodule client-portal deleted from 84f89f9409
docs/runbooks/backup-and-restore.md (new file, 199 lines)

@@ -0,0 +1,199 @@
# Backup and restore runbook

This runbook documents what gets backed up, how often, where it lands, and
the exact commands to restore the system from a cold start. The goal is
that any operator who has the off-site backup credentials can bring the
CRM back up on a clean host without help.

## Scope of a "full backup"

The CRM has three stateful surfaces. All three must be captured for a
restore to be useful.
| Surface | Holds | Risk if missing |
| --- | --- | --- |
| **PostgreSQL** (`port_nimara_crm`) | Every relational record: clients, yachts, companies, interests, reservations, invoices, audit log, GDPR exports, AI usage ledger, Documenso webhook receipts, etc. | Total data loss — site is unrecoverable. |
| **MinIO bucket** (`MINIO_BUCKET`, default `crm-files`) | Receipts, signed contracts, EOI PDFs, GDPR export ZIPs, document attachments. | Files reachable by row references in Postgres become 404s. |
| **`.env` + secrets** | DB password, MinIO keys, Documenso webhook secret, SMTP creds, encryption key (`ENCRYPTION_KEY`). | OCR API keys re-resolve from `system_settings` (encrypted at rest), but **without the original `ENCRYPTION_KEY` they're unreadable**. |

The Redis instance is not backed up. It only holds queue state, rate-limit
counters, and Socket.IO presence — all reconstructable. Stop the workers
during a restore so the queue starts clean.
## Backup schedule

Defaults are tuned for a single-port deployment with O(10k) clients. Bump
the frequencies on the producing side as scale demands.

| Job | Frequency | Retention | Where |
| --- | --- | --- | --- |
| `pg_dump` (custom format, gzipped) | Hourly | 7 days hourly + 30 days daily | `${BACKUP_S3_BUCKET}/pg/<host>/<UTC date>/<hour>.dump.gz` |
| MinIO mirror | Hourly (incremental) | 30 days versions | `${BACKUP_S3_BUCKET}/minio/` |
| `.env` snapshot (encrypted) | On change (manual) | Forever | Password manager / secrets vault — **never the same bucket as data** |

The hourly cadence is the right answer for this workload — invoices and
contracts cluster around business hours, and an hour of lost work is the
worst-case data loss window most clients will tolerate. Promote to 15-min
WAL streaming if a customer demands tighter RPO.
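The "7 days hourly + 30 days daily" retention rule reduces to a small pure function of a snapshot's UTC date and hour. A hypothetical sketch (the name `keep_snapshot` is ours; the real pruning logic lives with the producing scripts):

```shell
# Hypothetical retention helper: decides a snapshot's fate from the UTC
# date and hour embedded in its object key. Policy: keep every hourly
# dump younger than 7 days, keep the 00-hour dump up to 30 days, delete
# the rest. Requires GNU date.
keep_snapshot() {
  local snap_date="$1" snap_hour="$2" today="$3"
  local age_days=$(( ( $(date -u -d "$today" +%s) - $(date -u -d "$snap_date" +%s) ) / 86400 ))
  if [ "$age_days" -lt 7 ]; then
    echo keep      # inside the hourly window
  elif [ "$age_days" -lt 30 ] && [ "$snap_hour" = "00" ]; then
    echo keep      # daily representative inside the 30-day window
  else
    echo delete    # past retention; the lifecycle rule reaps it anyway
  fi
}

# keep_snapshot 2026-04-28 14 2026-04-29  -> keep
```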
## Required environment variables

The scripts below read these. Store them in a CI secret store, not the
host's bash profile.

```
# Source (the running CRM database)
DATABASE_URL=postgresql://crm:<pw>@<host>:<port>/port_nimara_crm

# MinIO (source bucket — the live one)
MINIO_ENDPOINT=minio.letsbe.solutions
MINIO_PORT=443
MINIO_USE_SSL=true
MINIO_ACCESS_KEY=<live key>
MINIO_SECRET_KEY=<live secret>
MINIO_BUCKET=crm-files

# Backup destination (a *separate* MinIO/S3 endpoint or a different bucket
# with no IAM overlap with the live keys)
BACKUP_S3_ENDPOINT=https://s3.eu-west-1.amazonaws.com
BACKUP_S3_REGION=eu-west-1
BACKUP_S3_BUCKET=portnimara-backups-prod
BACKUP_S3_ACCESS_KEY=<dedicated read+write key for this bucket only>
BACKUP_S3_SECRET_KEY=<...>

# Optional: encrypts dumps at rest with a passphrase. Limits the blast
# radius if the backup bucket itself is compromised.
BACKUP_GPG_RECIPIENT=ops@portnimara.com
```
## Provisioning the backup destination

1. Create a dedicated S3-compatible bucket in a **different account** from
   the live infra. AWS S3, Backblaze B2, or a separately-credentialed
   MinIO instance all work.
2. Apply object-lock or versioning so an attacker who steals the backup
   write key still can't permanently delete history.
3. Generate IAM credentials scoped to `s3:PutObject`, `s3:GetObject`, and
   `s3:ListBucket` on this bucket only. Inject them as the `BACKUP_S3_*`
   variables above. Do not reuse the live `MINIO_*` keys.
4. Set a 90-day lifecycle rule that transitions objects older than 30
   days to cold storage and deletes them at 90 days. Past 90 days it's
   cheaper to restart from a snapshot taken outside the system.
## The scripts

Three scripts in `scripts/backup/`:

- `pg-backup.sh` — runs `pg_dump`, gzips, optionally GPG-encrypts, uploads
- `minio-mirror.sh` — `mc mirror` of the live bucket → backup bucket
- `restore.sh` — interactive restore (DB + MinIO) given a snapshot path

Make them executable and wire them into cron / GitHub Actions / your
scheduler of choice. Sample crontab on the worker host:

```cron
# Hourly DB dump at minute 7
7 * * * * /opt/pncrm/scripts/backup/pg-backup.sh >> /var/log/pncrm-backup.log 2>&1

# Hourly MinIO mirror at minute 17 (offset so the two don't fight for I/O)
17 * * * * /opt/pncrm/scripts/backup/minio-mirror.sh >> /var/log/pncrm-backup.log 2>&1

# Weekly restore drill (smoke-test to a throwaway DB on Sunday at 03:00)
0 3 * * 0 /opt/pncrm/scripts/backup/restore.sh --drill >> /var/log/pncrm-restore-drill.log 2>&1
```
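For orientation, the core of `pg-backup.sh` reduces to a dump-compress-encrypt-upload pipeline. A minimal sketch, not the actual script: it assumes an `mc` alias named `backup` pointing at `BACKUP_S3_ENDPOINT`, and the `snapshot_path` / `run_backup` names are ours:

```shell
#!/usr/bin/env bash
# Sketch of the pg-backup.sh flow described above — illustrative only.
set -euo pipefail

# Builds the object key from the schedule table:
#   <bucket>/pg/<host>/<UTC date>/<hour>.dump.gz
snapshot_path() {
  printf '%s/pg/%s/%s/%s.dump.gz' "$1" "$2" "$(date -u +%F)" "$(date -u +%H)"
}

run_backup() {
  local tmp out
  tmp="$(mktemp)"
  out="$(snapshot_path "$BACKUP_S3_BUCKET" "$(hostname)")"

  # Custom-format dump, gzipped.
  pg_dump --format=custom "$DATABASE_URL" | gzip > "$tmp"

  # Optional at-rest encryption with the ops key.
  if [ -n "${BACKUP_GPG_RECIPIENT:-}" ]; then
    gpg --encrypt --recipient "$BACKUP_GPG_RECIPIENT" --output "$tmp.gpg" "$tmp"
    rm -f "$tmp"; tmp="$tmp.gpg"; out="$out.gpg"
  fi

  # 'backup' is an mc alias configured for BACKUP_S3_ENDPOINT.
  mc cp "$tmp" "backup/$out"
  rm -f "$tmp"
}

# Cron would invoke run_backup; left uncalled here so the file can be sourced.
```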
## Restoring from cold

These steps have been rehearsed against the dev environment; expect them
to take 15–30 minutes for a typical port. **The drill (last cron line
above) ensures the runbook stays correct — if the drill fails, the
real restore will too.**
### 0. Stop everything that writes

```bash
docker compose -f docker-compose.prod.yml stop web worker scheduler
# Leave postgres + minio + redis up; we'll point them at restored data.
```
### 1. Restore PostgreSQL

```bash
# Find the dump you want. Prefer the most recent successful hour.
# 'backup' is the mc alias for the BACKUP_S3_* destination.
mc ls "backup/$BACKUP_S3_BUCKET/pg/$(hostname)/" | tail
SNAPSHOT="2026-04-28/14.dump.gz"

# Pull it.
mc cp "backup/$BACKUP_S3_BUCKET/pg/$(hostname)/$SNAPSHOT" /tmp/

# Decrypt if BACKUP_GPG_RECIPIENT was set on the producer side.
gpg --decrypt /tmp/14.dump.gz.gpg > /tmp/14.dump.gz

# Drop & recreate the database. Connect to the 'postgres' maintenance DB —
# you can't drop the database your own connection points at. (The URL
# rewrite below assumes DATABASE_URL carries no query string.) pg_restore
# handles dependency order, including the 'restrict' FK from
# gdpr_exports.requested_by to users.
psql "${DATABASE_URL%/*}/postgres" -c 'DROP DATABASE IF EXISTS port_nimara_crm WITH (FORCE);'
psql "${DATABASE_URL%/*}/postgres" -c 'CREATE DATABASE port_nimara_crm;'
gunzip -c /tmp/14.dump.gz | pg_restore --no-owner --no-privileges \
  --dbname "$DATABASE_URL"
```
### 2. Restore MinIO

```bash
# Sync the backup bucket back over the live one. --overwrite handles
# files that were modified between snapshots. 'backup' and 'live' are
# mc aliases for the two endpoints.
mc mirror --overwrite \
  "backup/$BACKUP_S3_BUCKET/minio/" \
  "live/$MINIO_BUCKET/"
```
### 3. Restore secrets

The `.env` file is **not** in object storage. Pull it from the password
manager / secrets vault. Verify `ENCRYPTION_KEY` matches the value used
when the database was last running — if it doesn't, rows in
`system_settings` (OCR API keys, etc.) decrypt to garbage and the OCR
"Test connection" button will return an opaque error. There is no
recovery path; the keys must be re-entered through the admin UI.
### 4. Bring services back up

```bash
docker compose -f docker-compose.prod.yml up -d
# Watch the worker logs; expect a flurry of socket reconnections, then quiet.
docker compose -f docker-compose.prod.yml logs -f worker
```
### 5. Verify

Work through the smoke checklist, in order:

1. **DB up** — `psql "$DATABASE_URL" -c 'SELECT count(*) FROM clients;'`
   matches the producer-side count from the snapshot's hour.
2. **MinIO up** — open any client with attachments in the CRM, click a
   receipt thumbnail; verify the signed URL serves the file.
3. **Documenso webhooks** — re-trigger one in the Documenso admin and
   confirm `audit_logs` records the receipt.
4. **Email** — send a portal invite to a real address.
5. **Realtime** — open two browser windows, edit a client in one, watch
   the other update via Socket.IO.
6. **AI usage ledger** — `SELECT count(*) FROM ai_usage_ledger;` returns a
   non-zero count if AI was in use. Old rows survive; the budget gates
   reset at the next month-rollover period boundary.
## Drill schedule

The weekly drill (cron line above) runs `restore.sh --drill` against a
throwaway database and a sandbox MinIO bucket. It must produce zero diff
between the restored row counts and the live row counts (modulo the
hour-or-so the drill takes to run).

Failure modes the drill catches before they bite production:

- New schemas that fall outside `pg_dump`'s scope (we use the default,
  which captures everything in `public` — but a future developer adding
  a `tenant_X` schema will silently lose it).
- MinIO bucket-policy changes that block the backup-side `s3:GetObject`
  on certain prefixes.
- GPG passphrase rotation that wasn't propagated to the restore host.
- A `pg_restore` version skew with the producer-side `pg_dump`.
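The zero-diff check itself can be reduced to comparing two sorted `table<TAB>count` listings, one from the live DB and one from the restored drill DB (e.g. via `psql -At -F $'\t' -c 'SELECT relname, n_live_tup FROM pg_stat_user_tables ORDER BY relname'`). A hypothetical helper; `count_diff` is our name, not a function in `restore.sh`:

```shell
# Prints the tables whose row counts differ between two sorted
# "table<TAB>count" listings. Empty output = drill passed.
count_diff() {
  local tab; tab="$(printf '\t')"
  join -t "$tab" "$1" "$2" | awk -F '\t' '$2 != $3 { print $1 }'
}
```

Note that `join`'s inner-join semantics hide tables present on only one side; `join -v 1` / `-v 2` surfaces those, which is exactly the "new table not captured" failure mode above.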
docs/runbooks/email-deliverability.md (new file, 186 lines)

@@ -0,0 +1,186 @@
# Email deliverability runbook

The CRM sends transactional email through four different surfaces. Each
has a different failure mode when it lands in spam. This runbook covers
how to diagnose, fix, and verify each path.

## What email the CRM sends

| Surface | Trigger | Template | Default `from` |
| --- | --- | --- | --- |
| Portal activation / password-reset | Admin invites a client to the portal | `src/lib/email/templates/portal-auth.ts` | per-port `email_settings.from_address` or `SMTP_FROM` |
| Inquiry confirmation + sales notification | Public website POSTs to `/api/public/interests` or `/api/public/residential-inquiries` | `inquiry-client-confirmation.ts`, `inquiry-sales-notification.ts` | same |
| GDPR export ready | Staff requests an export with `emailToClient=true` | inline in `gdpr-export.service.ts` | same |
| Documenso reminders | Cadence job fires for an unsigned signer | `documenso/reminders/*` | same |

Documenso _itself_ sends signing requests with its own `from` address —
those don't flow through this codebase. SPF/DKIM for the Documenso
sender is the Documenso operator's problem, not yours.
## DNS records

For every domain that appears in a `from:` header you must publish:

### 1. SPF

A single TXT record at the apex authorizing whichever provider is
sending. Multiple SPF records on the same name **break SPF entirely** —
combine them into one.

```
v=spf1 include:_spf.google.com include:amazonses.com -all
```

The `-all` (hardfail) is correct for transactional mail. Switch to `~all`
(softfail) only as a temporary diagnostic when migrating providers.
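The single-record rule can be checked mechanically. A hypothetical helper (`check_spf` is our name, not part of the codebase); feed it the domain's TXT strings, e.g. from `dig +short TXT <domain>`:

```shell
# Validates the SPF shape this runbook requires: exactly one v=spf1
# record among the domain's TXT strings, ending in -all (hardfail).
check_spf() {
  local txt spf_count=0 record=""
  for txt in "$@"; do
    case "$txt" in
      "v=spf1 "*) spf_count=$((spf_count + 1)); record="$txt" ;;
    esac
  done
  if [ "$spf_count" -ne 1 ]; then
    echo "FAIL: found $spf_count SPF records (must be exactly 1)"
    return 1
  fi
  case "$record" in
    *" -all") echo "OK: hardfail SPF: $record" ;;
    *" ~all") echo "WARN: softfail SPF (acceptable only mid-migration)"; return 2 ;;
    *)        echo "FAIL: SPF does not end in -all"; return 1 ;;
  esac
}

# Usage (dig assumed available on the ops host):
#   mapfile -t records < <(dig +short TXT example.com | tr -d '"')
#   check_spf "${records[@]}"
```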
### 2. DKIM

Each provider publishes its own selector. Common shapes:

- Google Workspace: `google._domainkey` → 2048-bit RSA pubkey (rotate every 12 months).
- Amazon SES: `xxxx._domainkey`, `yyyy._domainkey`, `zzzz._domainkey` (three CNAMEs SES gives you).
- Postmark / Resend / Mailgun: one CNAME per selector.

Verify alignment — the `d=` value in the DKIM signature must match the
`From:` domain (relaxed alignment is fine, strict is overkill).
### 3. DMARC

Start at `p=none` while you build deliverability data, then upgrade.

```
_dmarc 14400 IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@portnimara.com; ruf=mailto:dmarc@portnimara.com; fo=1; adkim=r; aspf=r; pct=100"
```

`rua` (aggregate reports) is the diagnostic feed — set it before the
first send so the first weekly report has data.
### 4. MX (only if you also receive)

The CRM's IMAP probe (`scripts/dev-imap-probe.ts`) and the inbound thread
sync rely on a real mailbox. Whoever runs that mailbox publishes the MX
records — typically Google Workspace or a dedicated provider. Don't add
an MX pointing at the CRM host; it doesn't accept inbound SMTP.
## Per-port overrides

Each port can override `from_address`, `from_name`, and SMTP creds via
the admin email-settings page. When set, `getPortEmailConfig()` returns
those values and `sendEmail()` uses them in preference to the global
`SMTP_*` env. **The override domain still needs SPF / DKIM / DMARC** on
its own DNS — without them, every send from that port lands in spam.

When a customer reports "our portal invite didn't arrive":

1. Pull the port's email settings from the admin UI. Check `from_address`.
2. Run `dig TXT <from-domain>` and `dig TXT _dmarc.<from-domain>`.
   Confirm SPF includes the SMTP provider's domain and DMARC exists.
3. Send a probe through mail-tester.com: send a test message to the
   address it gives you, then open the score breakdown.
4. If the score is below 8/10, fix whatever's flagged before doing
   anything else in this runbook.
## Diagnosing a "didn't arrive" report

Order matters — go top-down, stop when one of these is the answer.

### Step 1: Was the send attempted?

```bash
# Tail the worker logs for the recipient address.
docker compose logs worker | grep '<recipient>'
```

You'll see one of three patterns:

- **Nothing**: The job didn't run. Check that BullMQ actually queued it.
  `redis-cli LLEN bull:email:wait` — a non-zero backlog means the worker
  isn't consuming. `docker compose logs scheduler | tail` to see why.
- **`Email sent`** with a message-id: The provider accepted it. Move to
  Step 2.
- **`SendError`**: Provider rejected. The error string says why
  (auth, rate limit, blocked recipient).
### Step 2: Is `EMAIL_REDIRECT_TO` set?

In dev/test we set `EMAIL_REDIRECT_TO=ops@portnimara.com` so seeded fake
clients don't get real email. **It must be unset in production.**

```bash
# On the production host:
docker exec pncrm-web printenv EMAIL_REDIRECT_TO
# Should print nothing.
```

If it's set, every email is going to the redirect target with the
original recipient prefixed in the subject — the customer never sees it.
### Step 3: Did it land but get filtered?

Ask the recipient to check:

- Spam / Junk folder
- Gmail "Promotions" tab
- Outlook "Other" folder (vs Focused)
- The Quarantine console if they're on M365 with anti-spam enabled

If found in a spam folder: the email arrived; the recipient's filter
classified it. SPF/DKIM/DMARC alignment is suspect — re-run the
mail-tester probe from above.
### Step 4: Was the recipient on a suppression list?

Some providers (SES, Postmark) maintain a suppression list — once an
address hard-bounces, future sends to it are dropped silently.

```bash
# SES example (the suppression API lives in the v2 CLI namespace):
aws sesv2 list-suppressed-destinations --region eu-west-1
```

If the recipient is suppressed, remove them and ask them to retry. The
CRM doesn't track suppression locally; that's the provider's job.
## When migrating SMTP providers

1. Add the new provider's DKIM CNAMEs alongside the old ones.
2. Add the new provider's `include:` to the existing SPF record.
3. Wait 48 hours for DNS to propagate and DMARC reports to confirm both
   providers align.
4. Switch `SMTP_*` env to the new provider on a single staging host.
5. Send through the staging host for a week. Watch DMARC reports.
6. Cut production over.
7. Wait two weeks before removing the old provider's DNS — undelivered
   bounce reports keep arriving for a while.
## Testing a deliverability fix

There's no automated test for "did this email reach the inbox" — that's a
property of the recipient's filter, which we don't control. The closest
proxy is the realapi suite:

```bash
pnpm exec playwright test --project=realapi
```

It runs `tests/e2e/realapi/portal-imap-activation.spec.ts`, which sends a
real portal-invite email through SMTP, then polls the configured IMAP
mailbox for the activation link. If it appears within 30 seconds, the
SMTP→DKIM→DMARC chain is alive end-to-end. If the test times out, work
backwards through this runbook.

The realapi suite needs `SMTP_*` and `IMAP_*` env vars — see the
"Optional dev/test-only env vars" block in `CLAUDE.md`.
## Bounce handling

The CRM doesn't currently process bounces. If you start seeing volume:

- Set up the provider's webhook (SES → SNS → Lambda; Postmark → webhook
  URL) to POST bounce events to a new `/api/webhooks/email-bounce` route.
- Persist the bounced address into an `email_suppressions` table.
- Have `sendEmail()` consult that table before each send.

That work isn't in scope yet; this runbook just flags it as the next
deliverability gap.
docs/runbooks/permission-audit.md (new file, 56 lines)

@@ -0,0 +1,56 @@
# Permission Matrix Audit

Scanned 182 route files under `src/app/api/v1/`.

**No violations.** Every internal v1 handler is permission-gated.

**Allow-listed:** 46 handlers intentionally skip `withPermission`.

| File | Method | Reason |
| --- | --- | --- |
| `src/app/api/v1/admin/alerts/run-engine/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/connections/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/errors/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/health/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/route.ts` | PUT | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/ocr-settings/test/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/[jobId]/retry/route.ts` | POST | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/[jobId]/route.ts` | DELETE | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/[queueName]/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/queues/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/admin/users/options/route.ts` | GET | Admin-only — gated by isSuperAdmin inside handler. |
| `src/app/api/v1/ai/email-draft/[jobId]/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/email-draft/route.ts` | POST | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/interest-score/bulk/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/ai/interest-score/route.ts` | GET | TODO: needs ai:\* permission catalog entry. Currently allow-listed. |
| `src/app/api/v1/alerts/[id]/acknowledge/route.ts` | POST | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/[id]/dismiss/route.ts` | POST | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/count/route.ts` | GET | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/alerts/route.ts` | GET | Alerts are user-scoped; port-filtered via auth context. |
| `src/app/api/v1/berth-reservations/[id]/route.ts` | PATCH | TODO: PATCH should map to reservations:edit (not currently in catalog). |
| `src/app/api/v1/currency/convert/route.ts` | POST | Currency reference data; port-scoped, no PII. |
| `src/app/api/v1/currency/rates/refresh/route.ts` | POST | TODO: gate with admin:manage_settings — currently allow-listed. |
| `src/app/api/v1/currency/rates/route.ts` | GET | Currency reference data; port-scoped, no PII. |
| `src/app/api/v1/custom-fields/[entityId]/route.ts` | GET | TODO: needs custom_fields:\* permission. PUT path internally validated. |
| `src/app/api/v1/custom-fields/[entityId]/route.ts` | PUT | TODO: needs custom_fields:\* permission. PUT path internally validated. |
| `src/app/api/v1/expenses/export/parent-company/route.ts` | POST | Internally gated by isSuperAdmin inside the handler. |
| `src/app/api/v1/me/route.ts` | GET | Self-endpoint — auth is sufficient. |
| `src/app/api/v1/me/route.ts` | PATCH | Self-endpoint — auth is sufficient. |
| `src/app/api/v1/notifications/[notificationId]/route.ts` | PATCH | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/preferences/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/preferences/route.ts` | PUT | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/read-all/route.ts` | POST | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/notifications/unread-count/route.ts` | GET | User-scoped notifications — caller is the resource owner. |
| `src/app/api/v1/saved-views/[id]/route.ts` | PATCH | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/[id]/route.ts` | DELETE | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/route.ts` | GET | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/saved-views/route.ts` | POST | User-self saved views — caller is the resource owner. |
| `src/app/api/v1/search/recent/route.ts` | GET | Port-scoped search — results filtered by auth context (resources have own perms). |
| `src/app/api/v1/search/route.ts` | GET | Port-scoped search — results filtered by auth context (resources have own perms). |
| `src/app/api/v1/settings/feature-flag/route.ts` | GET | Public read of feature-flag bool — no PII; auth is sufficient. |
| `src/app/api/v1/tags/options/route.ts` | GET | Tags are cross-cutting reference data; port-scoped via auth. |
| `src/app/api/v1/tags/route.ts` | GET | Tags are cross-cutting reference data; port-scoped via auth. |
| `src/app/api/v1/users/me/preferences/route.ts` | GET | User-self preferences — caller is the resource owner. |
| `src/app/api/v1/users/me/preferences/route.ts` | PATCH | User-self preferences — caller is the resource owner. |
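For reference, the core of a scan like this can be a `grep -L` pass, which lists files that never mention the gate. A sketch only, assuming the gate is literally the string `withPermission`; the allow-list filtering and per-method analysis are omitted, and `ungated_routes` is a hypothetical name:

```shell
# Lists route files under the given root that never reference
# withPermission — candidates for the allow-list table above.
ungated_routes() {
  grep -RL "withPermission" --include="route.ts" "$1"
}

# ungated_routes src/app/api/v1
```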
docs/superpowers/plans/2026-04-29-mobile-foundation.md (new file, 1918 lines)

File diff suppressed because it is too large.
docs/superpowers/specs/2026-04-29-gws-inbox-triage-design.md (new file, 376 lines)

@@ -0,0 +1,376 @@
# Google Workspace inbox-triage integration (exploratory)

**Status:** Exploratory — not approved for build
**Date:** 2026-04-29
**Tracks:** AI inbox-triage, Google Workspace email connection

## What this spec is for

The user has flagged inbox-triage as the most valuable AI surface left to
build, but conditioned email integration on it being via Google Workspace
specifically (not generic IMAP), with a per-port toggle so clients who
don't use GWS aren't billed for capability they can't reach.

This document captures what that build actually costs — especially on
the Google side, which is where most teams underestimate the work — so
we can decide whether to commit before writing any code. **Nothing in
this spec is approved for implementation.** The deliverable is a go /
no-go decision and, if go, a scope choice between three deployment
models that cost wildly different amounts of calendar time.
## What inbox-triage actually does for the user

Concretely, on the staff member's desktop:

1. **Linked-inbox panel on the client detail page.** When you open
   `/[port]/clients/<id>` you see the last N email threads with that
   client, pulled from the staff member's own Gmail. Each thread has
   the latest message preview, an "open in Gmail" deep-link, and a
   "draft reply" button (Phase 2+).
2. **Inbox triage queue.** A new top-level page `/[port]/inbox` that
   lists unread/unanswered threads ranked by AI-assessed importance
   (high-value client, contractual urgency, chase-overdue). Each row
   has one-click actions: "log this as a note on the client",
   "create a follow-up reminder", "draft reply".
3. **Email-driven alerts.** When a high-value client emails and no one
   responds within X hours, the existing alerts engine fires an
   `inbox.unanswered_high_value` rule (slots into the alert framework
   from Phase B without schema change).
4. **Reply drafts (Phase 3).** AI generates a reply draft grounded in
   the client's CRM record (open interests, pending reservations,
   recent invoices). Staff edit and send through Gmail.

The value is selective: a port with three staff members fielding 50
client emails a day saves maybe an hour a day collectively if the
ranking is right. Below that volume the build doesn't pay back.
## What already exists in the codebase

The CRM is roughly halfway scaffolded for this:

| Surface | Status | Notes |
| --- | --- | --- |
| `email_accounts` table | ✅ Exists | Has `provider: 'google' \| 'outlook' \| 'custom'` discriminator and `imap_*` / `smtp_*` cols. Built for IMAP, not OAuth. |
| `email_threads` / `email_messages` tables | ✅ Exists | Already linked to `clientId`. Schema is good as-is for Gmail. |
| `email-threads.service.ts` `syncInbox()` | ⚠ Stub-ish | IMAP-flow only. Won't reach Gmail without OAuth + Gmail API rewrite. |
| `email` BullMQ queue + `inbox-sync` job name | ✅ Exists | Worker dispatches on the job name; new sync impl drops in. |
| `google_calendar_tokens` table | ✅ Exists | OAuth token storage shape we can mirror for Gmail. |
| Per-port email override (port `email_settings`) | ✅ Exists | Used for outbound only today; Gmail integration is per-staff-user, not per-port. |
| `ai_usage_ledger` + per-port `aiEnabled` flag | ✅ Exists (Phase 3a/3b) | Triage AI calls book against the same ledger. |
| `withRateLimit('ai', ...)` wrapper | ✅ Exists (Phase 3c) | Caps triage AI traffic at 60/min/user out of the box. |

Net: schemas are mostly right. The OAuth flow, Gmail API client, push
notification receiver, and triage classifier are the new builds.
## Why Google Workspace specifically

The user's stated constraint: "I don't think we need email integration
unless we connect it to Google Workspace." Reasons that hold up:

- **No password storage.** OAuth tokens are revocable, scoped, and
  rotate. IMAP requires app passwords, which Google has been actively
  deprecating since 2024 — they'll be gone for the workspace plans
  this product targets.
- **Push notifications, not polling.** Gmail's `users.watch` API plus
  Google Pub/Sub means we get an HTTP callback within seconds of a new
  message landing. IMAP requires polling on a 30–60 second cadence,
  which costs more and lags worse.
- **Search and labels.** The Gmail API exposes label management and
  full-text search natively; IMAP search is much weaker.
- **Threading.** Gmail's `threadId` is canonical. Reconstructing
  threads over IMAP from `In-Reply-To` / `References` headers is
  reliable in theory, painful in practice.

Microsoft 365 is the obvious peer integration but is out of scope here.
The Graph API model is similar enough that a future M365 path can reuse
most of the storage shape.
## Three deployment models — pick one before building

This is the most important decision in the spec. Each model has
different OAuth-verification consequences, which dominate everything
else.
### Model A — Marketplace-published OAuth app

A single OAuth client owned by Port Nimara, listed in the Google
Workspace Marketplace, that any GWS customer can install. Each staff
member clicks "Connect Gmail," consents to the scopes, and the CRM
stores their refresh token.

**Google-side work:**

1. Build the OAuth flow in CRM (~1 week).
2. Submit for OAuth verification. Gmail's `gmail.readonly` /
   `gmail.modify` scopes are **restricted scopes** — they require:
   - Domain-verified production URLs
   - A homepage with a privacy policy that explicitly enumerates which
     scopes are used and why
   - A demo video (literally a screen recording) showing the consent
     screen and what happens next
   - **A third-party security assessment from a Google-approved
     vendor** ($15k–$75k, 6–12 weeks)
   - A Cloud Application Security Assessment (CASA) report
3. Marketplace listing review (~2 weeks after CASA passes).

**Calendar time:** 4–6 months.
**Money:** $15k–$75k for the security assessment alone.
**Recurring:** Re-verification every 12 months.

Right answer if Port Nimara wants to be the marina-CRM that ships GWS
out of the box for _any_ customer. Wrong answer if there are <5
customers who'd use it.
### Model B — Per-customer "Internal" OAuth app

Each customer's GWS admin creates an OAuth client _inside their own
workspace_ and gives Port Nimara the client ID + secret. Because the
app is "Internal," Google skips verification entirely — the consent
screen is unverified-but-permitted. Tokens never cross workspace
boundaries.

**Google-side work per customer:**

1. The customer's GWS admin enables the Gmail API in their Cloud project.
2. Creates an OAuth 2.0 client ID with the consent screen set to
   "Internal" + your CRM's redirect URI.
3. Hands the client ID + secret to Port Nimara out-of-band.
4. Staff connect their Gmail through that client.

**Calendar time per customer:** ~1 hour of admin work.
**Money:** $0.
**Limit:** Doesn't span GWS workspaces. A user with two GWS
accounts (e.g. the marina's + a personal workspace) can only connect
the one matching the OAuth client.

This is the **clear winner for the current customer base**: a small
number of customers, each with their own GWS workspace, and each
buying the integration as part of an onboarding conversation.
### Model C — Forward-to-CRM mailbox

The CRM exposes a per-port email alias (e.g.
`port-nimara-NN@inbox.portnimara.com`). Customers configure a Gmail
filter or mailing rule that BCCs that alias on relevant threads. The
CRM ingests via SMTP and runs the same triage pipeline.

**Google-side work:** None. The customer does it all with a Gmail filter.
**Calendar time:** ~1 week of CRM-side build.
**Limit:** Receive-only — no reply drafts, no thread state changes,
no labels. The "draft reply" feature in Phase 3 above is impossible
under this model.

Model C is the right answer if the user wants to ship inbox triage
_now_ and decide on bidirectional Gmail integration later. The schema
is designed so the model can be upgraded to A or B without data
migration.
### Recommendation

**Build Model B first.** It costs nothing on the Google side, takes
~3 weeks of CRM work, and matches the actual customer profile.
**Promote to Model A only after 3+ paying customers ask for it
unprompted.** Until then, the security-assessment cost can't justify
itself.

Keep Model C as a fallback for customers who refuse to set up an
Internal OAuth app. Build it last, lazily — the schema accommodates it.
## End-to-end flow (Model B)

### 1. Per-port OAuth-app config

New admin page `/[port]/admin/google-workspace`:

- Field: "OAuth client ID" (their internal client ID)
- Field: "OAuth client secret" (encrypted at rest using `ENCRYPTION_KEY`)
- Field: "Authorized redirect URI" (read-only; we display the value
  they need to paste into their Google Cloud Console)
- Toggle: "Enable Gmail integration for this port"

Stored in `system_settings` under key `gws.config`, port-scoped.
Resolution mirrors the existing OCR config service.

### 2. Per-staff connect flow

Staff member visits `/[port]/me/integrations`, clicks "Connect Gmail."

```
GET /api/v1/auth/gws/start
  → looks up port's gws.config
  → builds Google authorize URL with port's client_id + state token
  → 302 to Google
[ user consents ]
  → 302 back to /api/v1/auth/gws/callback?code=…&state=…
  → exchanges code for tokens via port's client_secret
  → stores in new `gws_user_tokens` table (encrypted)
  → schedules an `inbox-watch` job
```
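The start-handler's URL construction can be sketched as a pure function. The route names come from the diagram above; the function name and options shape are illustrative, and the scope here assumes the minimal `gmail.readonly` scope from the open-decisions list:

```typescript
// Sketch of the authorize-URL builder behind /api/v1/auth/gws/start.
// `access_type=offline` + `prompt=consent` are what make Google return a
// refresh token on first consent.
function buildGoogleAuthUrl(opts: {
  clientId: string;
  redirectUri: string;
  state: string; // opaque CSRF token, round-tripped through the callback
}): string {
  const url = new URL("https://accounts.google.com/o/oauth2/v2/auth");
  url.searchParams.set("client_id", opts.clientId);
  url.searchParams.set("redirect_uri", opts.redirectUri);
  url.searchParams.set("response_type", "code");
  url.searchParams.set(
    "scope",
    "https://www.googleapis.com/auth/gmail.readonly",
  );
  url.searchParams.set("access_type", "offline");
  url.searchParams.set("prompt", "consent");
  url.searchParams.set("state", opts.state);
  return url.toString();
}

const authUrl = buildGoogleAuthUrl({
  clientId: "example-client-id.apps.googleusercontent.com",
  redirectUri: "https://crm.example.com/api/v1/auth/gws/callback",
  state: "opaque-state-token",
});
console.log(authUrl);
```

The callback handler would verify `state` against the session before exchanging the code.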
### 3. Push notification subscription

After tokens are stored, the worker calls
`gmail.users.watch({ topicName: <Pub/Sub topic>, labelIds: ['INBOX'] })`.
Gmail then posts to the Pub/Sub topic on every inbox change. The CRM
exposes a Pub/Sub push subscription endpoint at
`/api/webhooks/gmail-push`, which fetches the changed messages via the
delta `historyId` and writes them into `email_messages`.

Watch subscriptions expire after 7 days, so a maintenance job
re-establishes them daily.
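The push endpoint receives a standard Pub/Sub envelope whose `message.data` is base64-encoded JSON containing `emailAddress` and `historyId`. A minimal decode step might look like this (the envelope interface is a trimmed-down assumption; the field names match Gmail's push-notification format):

```typescript
// Sketch of the payload handling inside /api/webhooks/gmail-push.
interface PubSubPushEnvelope {
  message: { data: string; messageId: string; publishTime: string };
  subscription: string;
}

function decodeGmailPush(envelope: PubSubPushEnvelope): {
  emailAddress: string;
  historyId: string;
} {
  // Gmail publishes base64-encoded JSON: { emailAddress, historyId }.
  const json = Buffer.from(envelope.message.data, "base64").toString("utf8");
  const parsed = JSON.parse(json);
  // historyId may arrive as a number or string depending on serialization.
  return {
    emailAddress: parsed.emailAddress,
    historyId: String(parsed.historyId),
  };
}
```

The `envelope.message.messageId` is what gets written to `gws_push_log` for idempotency; a duplicate delivery with the same ID is dropped before any Gmail API call.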
### 4. Triage pipeline

For each new inbound message:

1. Match against `clients` and `companies` by checking `from_address`
   against `client_contacts` (email channel). Persist a thread→client
   link if found.
2. If the port has `aiEnabled` AND `gws.triageEnabled`, queue an `ai`
   job that classifies the thread:
   - `urgency`: low / medium / high
   - `category`: invoice-question / availability / contract / other
   - `requires_response`: boolean
3. The AI call records into `ai_usage_ledger` with `feature='inbox_triage'`.
   The existing per-port budget gates apply automatically.
4. Triage output is written to a new `email_triage` table keyed on
   `email_messages.id`.
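The classifier's JSON output should be validated before it reaches `email_triage`. A defensive parse, under the assumption that the model returns the three fields above as a JSON object (the function name and fallback policy are illustrative):

```typescript
// Sketch: validate model output against the enums in step 2 above.
type Urgency = "low" | "medium" | "high";
type Category = "invoice-question" | "availability" | "contract" | "other";

interface TriageResult {
  urgency: Urgency;
  category: Category;
  requires_response: boolean;
}

function parseTriage(raw: string): TriageResult {
  const parsed = JSON.parse(raw);
  const urgencies: Urgency[] = ["low", "medium", "high"];
  const categories: Category[] = [
    "invoice-question",
    "availability",
    "contract",
    "other",
  ];
  // Bad urgency is a hard error (the sort key depends on it) …
  if (!urgencies.includes(parsed.urgency)) {
    throw new Error(`unexpected urgency: ${parsed.urgency}`);
  }
  // … but an unknown category degrades gracefully to "other".
  const category: Category = categories.includes(parsed.category)
    ? parsed.category
    : "other";
  return {
    urgency: parsed.urgency,
    category,
    requires_response: Boolean(parsed.requires_response),
  };
}
```

In production this would live in the `ai` job handler, with a retry on `JSON.parse` failure before giving up on the message.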
### 5. UI surfaces

- `/[port]/inbox` — sorted by triage rank, port-wide view.
- Linked-inbox panel on `client-tabs.tsx` — adds a new "Email" tab
  pulling from `email_threads` filtered to that client.
- Alert rule `inbox.unanswered_high_value` slots into Phase B's
  alert engine; no schema change.
## Schema additions

Three new tables, all port-scoped where it matters:

```ts
// Per-staff Gmail tokens. Mirror of google_calendar_tokens.
gws_user_tokens {
  id, userId (UNIQUE), portId, emailAddress,
  accessTokenEnc, refreshTokenEnc, tokenExpiry,
  scope, watchExpiresAt, watchHistoryId,
  connectedAt, lastSyncAt, syncEnabled, createdAt, updatedAt
}

// Triage classifications keyed to messages.
email_triage {
  messageId (PK, FK → email_messages.id ON DELETE CASCADE),
  urgency, category, requiresResponse,
  modelVersion, tokensUsed, classifiedAt
}

// Pub/Sub idempotency log. Gmail re-delivers; we dedupe.
gws_push_log {
  messageId (Pub/Sub message id, PK),
  historyId, receivedAt
}
```

Plus extensions to `email_messages`:

- `googleMessageId` (text, indexed) — Gmail's own ID for thread ops.
- `googleThreadId` (text, indexed).
- `gmailLabels` (text[]) — for "is unread" checks without hitting Gmail.

The existing `emailAccounts.provider='google'` value is reused
unchanged; the IMAP fields go nullable, since OAuth-flow accounts won't
populate them.
## AI cost interaction

Triage AI is opt-in **twice**: the port admin must turn on
`aiEnabled` (Phase 3a flag, default off) **and** `gws.triageEnabled`
(this spec, default off). With either toggle off, the inbox sync still
runs but skips classification, so staff can manually scan threads
without burning tokens.

Per-message token cost on a current Haiku-class model is roughly
1500–2500 tokens including the system prompt. A port doing 200 inbound
emails a day at the upper bound is ~500k tokens/day. The default
hard-cap is 500k/month, so triage will trip it inside a day. Two
mitigations are baked in:

- The system prompt is short (<500 tokens) and prompt-cached on the
  Anthropic side, so most tokens are output.
- Triage runs only on threads not already classified — re-syncs from
  the watch loop don't re-bill.

The admin UI shows triage as its own line in the per-feature breakdown,
so customers can see how much their inbox is costing them and tune
caps accordingly.
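The budget arithmetic above is worth making explicit — at the stated upper bound, a busy port exhausts the default monthly cap in a single day. A back-of-envelope check (constants mirror the prose; they are assumptions, not values read from config):

```typescript
// Budget sanity check for the numbers in this section.
const tokensPerMessage = 2_500; // upper bound, incl. system prompt
const inboundPerDay = 200; // busy-port scenario from the prose
const monthlyCapTokens = 500_000; // default hard-cap from Phase 3b

const tokensPerDay = tokensPerMessage * inboundPerDay; // 500,000/day
const daysUntilCap = monthlyCapTokens / tokensPerDay; // cap trips after 1 day

console.log({ tokensPerDay, daysUntilCap });
```

This is why the per-feature breakdown line matters: the first thing a high-volume customer will need to do is raise (or consciously accept) the cap.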
## Phased build (assuming Model B)

| Phase | Scope | Effort | Ships when |
| --- | --- | --- | --- |
| **G1** Connect | OAuth flow + per-port config + per-user token storage. No sync yet. Staff can connect; nothing happens. | 1 week | Standalone |
| **G2** Read-only sync | Pub/Sub push receiver + delta sync into `email_messages`. Linked-inbox tab on client detail. No AI. | 1 week | After G1 |
| **G3** Triage classification | AI classifier, `email_triage` writes, `/inbox` page sorting. Per-port toggle. | 1 week | After G2; depends on Phase 3b budgets being live (they are) |
| **G4** Reply drafts | Gmail API send + draft creation. "Draft reply" button on the client detail Email tab. | 1 week | After G3 |
| **G5** Alerts | New `inbox.unanswered_high_value` rule. Hooks into Phase B alert engine. | 2 days | After G3 |

Total: ~5 weeks for a single engineer, assuming the user provides one
real GWS workspace to test against during G1.
## Open decisions for the user

These are the questions to resolve before scheduling the build, in
priority order:

1. **Deployment model — A, B, or C?** Default recommendation: B.
2. **Single user or domain-wide delegation?** Per-staff connect (one
   token per user) is simpler. Domain-wide delegation lets the port
   admin connect once on behalf of every staff member but requires
   the customer to grant a service account broader access. Default
   recommendation: per-staff.
3. **Scope set.** The minimal viable scope is `gmail.readonly`. To send
   replies (G4) we need `gmail.send`. To manage labels (e.g. mark
   "triaged-by-CRM") we need `gmail.modify`. Each scope expansion
   makes the consent screen scarier but adds no new verification
   steps under Model B.
4. **Pub/Sub topic ownership.** Pub/Sub topics live in _some_ GCP
   project. Under Model B the customer's project owns the topic —
   they pay for Pub/Sub (cents/month) and grant our service account
   subscriber access. Alternative: Port Nimara owns the topic and
   the customer's Gmail publishes cross-project (allowed, slightly
   more setup). Default: customer-owned topic, fewer moving parts.
5. **Triage model.** Haiku 4.5 is right for cost; Sonnet 4.6 is
   right if the ranking quality on Haiku turns out to be poor.
   Defer this until G3 has real-world tuning data.
## Things that are NOT in this spec

- **Microsoft 365 / Outlook integration.** Same shape, different API.
  Once Model B is proven on GWS, the Graph API takes another ~3 weeks.
- **Reply drafts grounded in CRM context.** That's G4 and depends on
  the work in this spec, but the prompt engineering for "good replies
  citing this client's open interests + reservations + invoices"
  deserves its own design pass before building.
- **Cross-staff triage queue (i.e. "show me all unanswered emails
  across the team").** That requires either domain-wide delegation
  (decision #2 above) or per-staff opt-in to a shared view. Punt
  until staff actually ask for it.
- **Sentiment / urgency tone analysis.** Tempting; almost always
  wrong; skip in v1.
- **"Smart drafts" using the recipient's past replies as context.**
  Every customer asks for this and almost no one uses it once
  built. Skip.
## Cost summary at a glance

| Item | Model A | Model B | Model C |
| --- | --- | --- | --- |
| Build effort | 3–4 weeks | ~5 weeks (over G1–G5) | ~1 week (receive-only) |
| Calendar time to first customer | 4–6 months | 1 hour of customer admin work | 1 hour of customer Gmail-filter work |
| Up-front cash | $15k–$75k (CASA) | $0 | $0 |
| Recurring | Re-verification annually | None | None |
| Best for | 50+ customers, Marketplace play | 1–10 customers, white-glove onboarding | Customers who refuse OAuth setup |

The recommendation stands: build Model B for G1 + G2 + G3, ship that,
and let real customer demand decide whether G4/G5 and Model A
promotion are worth the calendar time.
189 docs/superpowers/specs/2026-04-29-mobile-optimization-design.md (Normal file)
@@ -0,0 +1,189 @@
# Mobile Optimization Design

**Status**: Design approved 2026-04-29 — pending plan.
**Plan decomposition**: Foundation PR (§3) is one implementation plan; per-page migration phases (§5) become follow-up plans, scoped per phase.
**Branch base**: stacks on `refactor/data-model`.
**Out of scope**: Phase B/C features, desktop redesign, Capacitor wrapper, swipe-actions on rows, native menus, server-driven UI.

---
## 1. Background

The CRM was built desktop-first. A 2026-04-29 mobile audit captured every authenticated and public page across the active iPhone viewport range. Findings:

1. **No `viewport` meta in the root layout** (one exists only in the scanner PWA sub-layout, `src/app/(scanner)/[portSlug]/scan/layout.tsx`). Without it, iOS Safari renders pages at the default 980px logical width and zooms out to fit — text becomes unreadable and touch targets sub-tappable. Playwright's `isMobile` emulation in the audit forces 393px-wide rendering, which exposes the layout breakage you'd otherwise have to discover by pinching to zoom.
2. **Topbar overflows**. Search input + port switcher + sign-out button cram into one row; sign-out clips off the right edge as a half-visible blue bar on every authenticated page.
3. **Tables render as desktop tables**. Every list page (clients, yachts, companies, invoices, expenses, interests, audit, users, etc.) shows truncated columns with horizontal scroll.
4. **Page headers don't downsize**. Titles like "Dashboard" truncate to "Dash..."; primary action buttons (`+ New Client`) overlap their subtitles.
5. **Detail page action chips overflow**. The chip row ("Invite to portal | GDPR export | Archive | …") horizontally overflows on every detail page.
6. **One half-good pattern**: detail pages already collapse their tabs to a `<select>` dropdown on small screens. Worth extending.
7. **Auth + scanner pages are already mobile-first** (`/login`, `/[portSlug]/scan`). Reference for the "what good looks like" target.

The audit harness (`tests/e2e/audit/mobile.spec.ts` + `mobile-audit` Playwright project) is added on this branch (not yet committed); re-runs regenerate `.audit/mobile/` (gitignored).
## 2. Approach

**Adaptive shell + responsive content** — chosen over (a) per-page conditional render, (b) a separate `(mobile)` route group, and (c) Tailwind-only responsive.

The "native feel" the user wants comes from the chrome — bottom tab bar, sheet modals, sticky compact header, safe-area awareness. Page content (forms, lists, details) doesn't need duplication; it gets responsive via shared mobile-aware primitives. This concentrates the dedicated-mobile work in ~10 components and keeps content single-source.

**Breakpoint**: Tailwind `lg` (1024px). Below `lg`, the mobile shell renders. At and above, the existing desktop shell is untouched.
### 2.1 Target iPhone viewport range

The mobile shell + content primitives must look correct across the full active iPhone viewport range (portrait):

| Tier | Models | Viewport |
| --- | --- | --- |
| Narrowest | iPhone SE 2nd / 3rd gen | 375×667 |
| Standard | iPhone 12/13/14 (and Mini) | 390×844 |
| Standard newer | iPhone 15 / 15 Pro / 16 | 393×852 |
| Pro newer (Dynamic Island, thinner bezels) | iPhone 16 Pro / 17 Pro | 402×874 |
| Plus / older Max | iPhone 14 Plus / 15 Plus / 15 Pro Max / 16 Plus | 430×932 |
| Pro Max | iPhone 16 Pro Max / 17 Pro Max | 440×956 |

**Anchors used by audit and design validation**: 375×667 (worst-case narrow + short), 393×852 (most common current), 402×874 (current Pro), 440×956 (current Pro Max). Models within ±5px of an anchor (390, 430) are skipped — primitives that look correct at the anchors will look correct at neighbors.

**Dynamic Island**: iPhone 14 Pro and later have a larger top safe-area inset (~59px vs ~47px on classic-notch models). The CSS `env(safe-area-inset-top)` we expose as `pt-safe` handles this transparently — no per-model code paths.

**Landscape**: out of scope for this design. Phones in landscape are rare for CRM-style work; if needed later, the mobile shell at landscape widths would still fall under `lg` and would just stretch. Tablet landscape is addressed in the §5 tablet-pass phase.

**Routing**: no new route group. URLs and middleware unchanged. RBAC, services, queries, validators, RHF/zod forms, TanStack Query stores, socket.io — all unchanged.
## 3. Foundation PR

A single branch lands the infra + shell + primitives before any per-page work. After this merges, every authenticated page already gains: real viewport meta, no clipped topbar, bottom tab navigation, safe-area handling, and 44px touch targets — without any per-page edits.

### 3.1 Infrastructure

- `viewport` export in `src/app/layout.tsx` — `width=device-width, initial-scale=1, viewport-fit=cover`.
- `theme-color` meta + `apple-mobile-web-app-capable` meta + `apple-mobile-web-app-status-bar-style` for PWA-ish status-bar integration.
- Safe-area CSS variables (`env(safe-area-inset-*)`) exposed as Tailwind utilities (`pt-safe`, `pb-safe`, `pl-safe`, `pr-safe`).
- `useIsMobile()` hook in `src/hooks/use-is-mobile.ts` — backed by `window.matchMedia('(max-width: 1023.98px)')`, no resize listener.
- Server-side body-class detection: the root layout (`src/app/layout.tsx`) reads the `user-agent` request header via `next/headers`'s `headers()`, runs a small known-mobile-token check (Mobile / iPhone / iPad / Android — no library), and renders `<body data-form-factor="mobile|desktop">`. No middleware needed. CSS `[data-form-factor="mobile"]` reveals the mobile shell. The CSS media-query fallback (`@media (max-width: 1023.98px)`) handles UA misclassification (e.g., desktop browser resized to narrow width, or stripped UA).
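The known-mobile-token check in the last bullet is small enough to sketch in full. The token list is the one named above; the function name is illustrative:

```typescript
// Sketch of the server-side form-factor check used for the body attribute.
// A null/stripped UA resolves to "desktop" — the CSS media-query fallback
// still reveals the mobile shell at narrow widths in that case.
function detectFormFactor(userAgent: string | null): "mobile" | "desktop" {
  if (!userAgent) return "desktop";
  return /Mobile|iPhone|iPad|Android/.test(userAgent) ? "mobile" : "desktop";
}
```

In the root layout this would be called with `(await headers()).get("user-agent")` and the result written to `data-form-factor` on `<body>`.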
### 3.2 Mobile shell

Both desktop and mobile shells are rendered to the DOM by the root layout; CSS reveals one and hides the other based on `[data-form-factor="mobile"]` plus a `@media (max-width: 1023.98px)` fallback. The existing `<Sidebar>` and `<Topbar>` components stay unchanged for the desktop shell. The mobile shell is wholly new:

- **`<MobileLayout>`** (`src/components/layout/mobile/mobile-layout.tsx`)
  Fixed 52px compact topbar (safe-area aware) + scrollable content + fixed 56px bottom tab bar (safe-area inset). Renders instead of the desktop sidebar+topbar shell when the form factor resolves to mobile.

- **`<MobileTopbar>`**
  Page title (auto-truncating, single-line) + back button when route depth > 1 + single primary action slot (passed via context from the page) + port-switcher behind a `<Sheet>` trigger.

- **`<MobileBottomTabs>`**
  Fixed 5 tabs: **Dashboard / Clients / Yachts / Berths / More**. Active state from current path. Lucide icons (no emoji). Badge support for the alerts count.

- **`<MoreSheet>`**
  Bottom sheet opened by the More tab. Holds the long tail in a scrollable list grouped by section: Companies, Interests, Invoices, Expenses, Documents, Email, Alerts, Reports, Reminders, Settings, Admin (with admin nesting one level deep into a child sheet).

- **`<MobileLayoutProvider>`**
  React context that lets each page push its title, back button, and primary action slot to `<MobileTopbar>` via a hook (`useMobileChrome({ title, action })`).
### 3.3 Primitives

All built once in `src/components/shared/`. Render desktop-style above `lg`, mobile-style below.

- **`<Sheet>`** — vaul-based bottom sheet on mobile, falls through to existing Radix `<Dialog>` on desktop. Same API as `<Dialog>` so adoption is mechanical.
- **`<DataView>`** — accepts the same column defs the codebase uses today via TanStack Table. Above `lg`: renders the existing table. Below `lg`: renders a card list with a per-row `cardRender({ row }) => ReactNode` callback. Filter chips stay above the list; sort moves into a `<Sheet>` opened by a sort button.
- **`<PageHeader>`** — title + optional subtitle + actions. Truncates title to one line, stacks actions to a second row on mobile, hides subtitle below `sm` if action row is present.
- **`<ActionRow>`** — chip-style action group; `flex-nowrap overflow-x-auto scroll-smooth snap-x` on mobile, no overflow on desktop.
- **`<DetailPageShell>`** — wraps detail pages with: sticky compact header (entity name, primary status), tab dropdown selector (existing pattern, extracted), scrollable content area, optional sticky bottom action bar (Save / Archive / etc.) on mobile that pins above the bottom tab bar.
- **`<FilterChips>`** — chip-row filter UI used by `<DataView>`. Active filters are dismissable chips; "Add filter" opens a `<Sheet>`.
### 3.4 Default style adjustments

- `<Button>` and `<Input>` defaults: `min-h-11` (44px, Apple HIG touch-target).
- `<Input>` and `<Textarea>` body text: `text-base` (16px) so iOS doesn't zoom on focus.
- `<Dialog>` default base styling tweaked so any remaining unmigrated dialogs render full-screen on mobile (until they get migrated to `<Sheet>`).
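The safe-area utilities named in §3.1 could be wired up in `tailwind.config.ts` roughly like this — a sketch using Tailwind's plugin API; the real config may choose a different shape (e.g. `spacing` theme extensions instead of a plugin):

```typescript
import plugin from "tailwindcss/plugin";

// Assumed plugin sketch: exposes pt-safe / pb-safe / pl-safe / pr-safe,
// each mapped to the corresponding env(safe-area-inset-*) value.
export const safeAreaPlugin = plugin(({ addUtilities }) => {
  addUtilities({
    ".pt-safe": { paddingTop: "env(safe-area-inset-top)" },
    ".pb-safe": { paddingBottom: "env(safe-area-inset-bottom)" },
    ".pl-safe": { paddingLeft: "env(safe-area-inset-left)" },
    ".pr-safe": { paddingRight: "env(safe-area-inset-right)" },
  });
});
```

These utilities only take effect once the viewport meta includes `viewport-fit=cover` (§3.1); without it, iOS reports zero insets.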
### 3.5 Bundle impact

Both shells render server-side and switch via the `data-form-factor` body attribute, so both ship to every client (dynamic-importing one would cause a hydration flash). Rough estimate: ~40KB gzipped added to the layout subtree for the mobile shell + new primitives (vaul ≈ 5KB gz, the rest is in-house components). Verify post-build with `pnpm build` and adjust if it's materially higher. An acceptable trade for no flash and no UA-based render-time branching.
### 3.6 PWA assets

The PWA scanner already references `icon-192.png`, `icon-512.png`, `icon-512-maskable.png` from `public/`, but those files don't exist yet (separate flagged blocker). The mobile shell adds an `apple-touch-icon` reference too. The Foundation PR includes placeholder PNGs so home-screen install works; production-quality icons can replace them without a code change.
## 4. Per-page playbook

Once the foundation lands, each page follows the same workflow:

1. Open the page in headed Playwright at the anchor viewports per §2.1 (start at 393×852 for the iteration loop, spot-check 375 and 440 before declaring done).
2. Replace any `<Dialog>` with `<Sheet>`.
3. If it's a list page: wrap the table in `<DataView>` and provide a `cardRender` callback. The 2–3 fields shown on the card are decided per page during migration with the user.
4. Replace the ad-hoc page header with `<PageHeader>`.
5. Replace ad-hoc action button rows with `<ActionRow>`.
6. Touch up any custom embedded widgets the page uses (rare for simple pages, common for `email`, `documents`, `expenses/scan`).
7. The user reviews live in the headed browser, points out tweaks; iterate.

Most pages take 5–15 minutes in this loop. Heavy pages (email inbox, documents hub) may take 30–60 minutes because the embedded widgets need their own mobile treatment beyond the primitives.
## 5. Migration sequence

After the foundation PR:

1. **Quick-win sweep** (~half day) — pages mostly fixed by foundation alone. They just need the `<PageHeader>` swap-in (no list-card conversion, no detail-shell wrap):
   `dashboard` (overview), `settings` (user-profile), `reports`, and the admin sub-pages that are forms or stat cards: `admin/settings`, `admin/branding`, `admin/forms`, `admin/ocr`, `admin/roles`, `admin/tags`, `admin/documenso`, `admin/templates`, `admin/custom-fields`, `admin/monitoring`, `admin/backup`, `admin/webhooks`, `admin/import`, `admin/ports`.
2. **List pages** (~1–2 days) — convert via `<DataView>` + per-page `cardRender`:
   `clients`, `yachts`, `companies`, `berths`, `interests`, `invoices`, `expenses`, `alerts`, `reminders`, `admin/audit`, `admin/users`.
3. **Heavy pages** (~1 day each) — embedded widgets need their own mobile treatment beyond the primitives:
   `documents` (sig-tracking + filters from Phase A), `email` (thread list + reader + composer).
4. **Detail pages** (~1–2 days) — wrap in `<DetailPageShell>`, extend the tab-dropdown pattern, add the sticky bottom action shelf:
   `clients/[clientId]`, `yachts/[yachtId]`, `companies/[companyId]`, `berths/[berthId]`, `invoices/[id]`, `expenses/[id]`.
5. **Forms & wizards** — touch-up only, since the `<Input>`/`<Button>` defaults handle the bulk:
   `invoices/new` (3-step wizard), `expenses/scan` (already mobile-first, just verify).
6. **Portal** — same patterns, smaller scope:
   authenticated: `portal/dashboard`, `portal/invoices`, `portal/my-yachts`, `portal/documents`, `portal/interests`, `portal/my-reservations`. Public: `portal/login`, `portal/activate`, `portal/forgot-password`, `portal/reset-password` (already styled by `<BrandedAuthShell>` — just verify).
7. **Tablet pass** — re-audit at iPad Air 11" portrait (820×1180) and landscape (1180×820), iPad Air 13" portrait (1024×1366) and landscape (1366×1024). The 820 portrait case will hit the mobile shell (820 < 1024) and probably wants a "tablet-portrait" treatment with the sidebar visible — flagged for design refinement at that phase, not now. The other three viewports fall at or above `lg` and use the desktop shell unchanged.
## 6. Testing

- **Mobile audit project** (`mobile-audit` in `playwright.config.ts`) is the regression harness. Re-runs after every page-migration PR; output goes to `.audit/mobile/` (gitignored). The audit covers the four anchor viewports defined in §2.1: 375×667, 393×852, 402×874, 440×956. Run time ~14 min headed.
- **Smoke project** gets a curated mobile-viewport variant (~10 pages at the 393×852 anchor) — adds ~2 min to CI; the full audit stays out of CI to avoid the ~14 min cost.
- **Visual baselines** — the `visual` project gets new mobile snapshots at the 393×852 anchor for: dashboard, clients-list, clients-detail, invoices-list, invoices-new, scan, documents, login. Regenerate with `--update-snapshots` after intentional changes (existing convention).
- **Anchor device descriptors** lifted into a shared fixture at `tests/e2e/fixtures/devices.ts` (one per anchor in §2.1) so specs don't redefine viewports.
- **No new unit tests** for the primitives — they are presentational. Coverage comes from visual + integration runs.
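The shared device fixture could look roughly like this — names are assumed, values are the §2.1 anchors:

```typescript
// Sketch of tests/e2e/fixtures/devices.ts: one viewport per §2.1 anchor,
// so specs reference `anchorViewports.standard` instead of raw numbers.
export const anchorViewports = {
  seNarrow: { width: 375, height: 667 }, // iPhone SE 2nd/3rd gen — worst case
  standard: { width: 393, height: 852 }, // iPhone 15/16 — most common current
  pro: { width: 402, height: 874 }, // iPhone 16/17 Pro
  proMax: { width: 440, height: 956 }, // iPhone 16/17 Pro Max
} as const;
```

A spec would then pass e.g. `test.use({ viewport: anchorViewports.standard })` rather than redefining dimensions inline.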
## 7. Open questions

- **Bottom-tab taxonomy**: locked at Dashboard / Clients / Yachts / Berths / More for now. The More sheet holds everything else losslessly, so this is reversible — if real usage suggests a different top-5 (e.g., Interests or Invoices in the tabs), swap them later without code restructure.
- **`refactor/data-model` push order**: 155 commits unpushed. The foundation PR can stack on top and rebase, or wait until that branch merges. Decision deferred to user.
- **Desktop touch-target adjustments**: bumping `<Button>`/`<Input>` to `min-h-11` will affect desktop too. Verify visually that no desktop layout breaks; if any does, scope the bump to mobile-only via the `data-form-factor` attribute.
## 8. Files to create
|
||||
|
||||
```
src/hooks/use-is-mobile.ts
src/components/layout/mobile/
  mobile-layout.tsx
  mobile-topbar.tsx
  mobile-bottom-tabs.tsx
  more-sheet.tsx
  mobile-layout-provider.tsx
src/components/shared/
  sheet.tsx (new — vaul wrapper)
  data-view.tsx (new — table↔card)
  page-header.tsx (new)
  action-row.tsx (new)
  detail-page-shell.tsx (new)
  filter-chips.tsx (new)
src/app/layout.tsx (modified — viewport export, theme-color, UA-derived data-form-factor body attribute via headers())
public/icon-192.png (placeholder PWA asset)
public/icon-512.png (placeholder PWA asset)
public/icon-512-maskable.png (placeholder PWA asset)
public/apple-touch-icon.png (placeholder PWA asset)
tailwind.config.ts (modified — safe-area utilities, touch-target defaults)
tests/e2e/fixtures/devices.ts (new — shared device descriptors)
```
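The UA-derived `data-form-factor` attribute in `src/app/layout.tsx` implies a small server-side classifier; a minimal sketch (the function name and regex are assumptions, not the actual implementation):

```typescript
// Hypothetical helper for src/app/layout.tsx: classifies the request's form
// factor from the User-Agent header so the server can stamp
// <body data-form-factor="mobile" | "desktop">.
export type FormFactor = 'mobile' | 'desktop';

export function deriveFormFactor(userAgent: string | null): FormFactor {
  // No UA (prerender, unusual clients): fall back to desktop and let CSS
  // breakpoints handle anything smaller.
  if (!userAgent) return 'desktop';
  return /Mobi|Android|iPhone|iPod/i.test(userAgent) ? 'mobile' : 'desktop';
}
```

Note that iPadOS reports a desktop-class UA by default, which is one reason the attribute should stay a hint rather than the only gate.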

## 9. Files to modify per page

Per the playbook in §4, each page typically needs:

- One swap of header markup → `<PageHeader>`.
- For list pages: one wrap of table → `<DataView>` + add `cardRender` callback.
- For detail pages: wrap in `<DetailPageShell>`.
- Replace `<Dialog>` imports with `<Sheet>`.
- No service, validator, query, or schema changes anywhere.

```
@@ -52,6 +52,7 @@
    "@tanstack/react-query": "^5.62.0",
    "@tanstack/react-query-devtools": "^5.62.0",
    "@tanstack/react-table": "^8.21.3",
    "archiver": "^7.0.1",
    "better-auth": "^1.2.0",
    "bullmq": "^5.25.0",
    "class-variance-authority": "^0.7.0",
@@ -61,7 +62,9 @@
    "drizzle-orm": "^0.38.0",
    "imapflow": "^1.2.13",
    "ioredis": "^5.4.0",
    "iso-3166-2": "^1.0.0",
    "jose": "^6.2.1",
    "libphonenumber-js": "^1.12.42",
    "lucide-react": "^0.460.0",
    "mailparser": "^3.9.4",
    "minio": "^8.0.0",
@@ -83,12 +86,16 @@
    "sonner": "^1.7.0",
    "tailwind-merge": "^2.6.0",
    "tailwindcss-animate": "^1.0.7",
    "tesseract.js": "^7.0.0",
    "vaul": "^1.1.2",
    "zod": "^3.24.0",
    "zustand": "^5.0.0"
  },
  "devDependencies": {
    "@eslint/eslintrc": "^3.3.5",
    "@playwright/test": "^1.58.2",
    "@types/archiver": "^7.0.0",
    "@types/iso-3166-2": "^1.0.4",
    "@types/mailparser": "^3.4.6",
    "@types/node": "^22.0.0",
    "@types/nodemailer": "^6.4.0",
@@ -106,6 +113,7 @@
    "lint-staged": "^15.2.0",
    "postcss": "^8.4.0",
    "prettier": "^3.4.0",
    "react-grab": "^0.1.32",
    "tailwindcss": "^3.4.0",
    "tsx": "^4.19.0",
    "typescript": "^5.7.0",
```

```diff
@@ -75,6 +75,24 @@ export default defineConfig({
         viewport: { width: 1440, height: 900 },
       },
     },
+    {
+      // Mobile / tablet audit — visits every page in headed Chromium at iPhone
+      // viewports (portrait), screenshots full-page to .audit/mobile/<viewport>/,
+      // and writes an index.md. Depends on `setup` for seeded admin + port-role.
+      name: 'mobile-audit',
+      testMatch: /audit\/mobile\.spec\.ts/,
+      dependencies: ['setup'],
+      // Single test walks 4 viewports × ~45 routes sequentially with slowMo;
+      // the 30 min timeout leaves ample headroom over the ~14 min wall-clock cost.
+      timeout: 1_800_000,
+      use: {
+        headless: false,
+        launchOptions: { slowMo: 200 },
+        screenshot: 'off',
+        video: 'off',
+        trace: 'off',
+      },
+    },
   ],
 
   // Don't start the dev server — we expect it to already be running
```

pnpm-lock.yaml (781 lines, generated; diff suppressed because it is too large)
public/apple-touch-icon.png (new binary file, 654 B)
public/icon-192.png (new binary file, 688 B)
public/icon-512-maskable.png (new binary file, 2.4 KiB)
public/icon-512.png (new binary file, 2.4 KiB)

scripts/audit-permissions.ts (new file, 188 lines):
```ts
/**
 * Permission-matrix audit.
 *
 * Walks every src/app/api/v1/** /route.ts file and reports each exported HTTP
 * handler (GET/POST/PUT/PATCH/DELETE) that is *not* wrapped in withPermission().
 * Internal v1 routes should be permission-gated; routes that intentionally use
 * withAuth() alone (e.g. user-self endpoints) can be allow-listed below.
 *
 * Run:
 *   pnpm tsx scripts/audit-permissions.ts
 *
 * Exit code:
 *   0 — every handler is permission-gated or in the allow-list
 *   1 — at least one handler is missing both a withPermission wrapper and an
 *       allow-list entry. CI should fail.
 */

import { readdir, readFile } from 'node:fs/promises';
import { join, relative } from 'node:path';

const ROOT = join(process.cwd(), 'src/app/api/v1');
const HTTP_METHODS = ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'] as const;

/**
 * Routes intentionally exempt from withPermission. Each entry should explain
 * why — typically because the route operates on the caller's own resources
 * (no port-level permission semantics) or is admin-only and gated by
 * isSuperAdmin inside the handler.
 */
const ALLOW_LIST: ReadonlyArray<{ pattern: RegExp; reason: string }> = [
  // Self / admin / public
  { pattern: /\/me\/route\.ts$/, reason: 'Self-endpoint — auth is sufficient.' },
  { pattern: /\/admin\//, reason: 'Admin-only — gated by isSuperAdmin inside handler.' },
  {
    pattern: /\/notifications\//,
    reason: 'User-scoped notifications — caller is the resource owner.',
  },
  { pattern: /\/socket\//, reason: 'Socket auth handshake.' },
  { pattern: /\/health\//, reason: 'Public health check.' },
  { pattern: /\/users\/me\//, reason: 'User-self preferences — caller is the resource owner.' },
  { pattern: /\/saved-views\//, reason: 'User-self saved views — caller is the resource owner.' },
  {
    pattern: /\/settings\/feature-flag\//,
    reason: 'Public read of feature-flag bool — no PII; auth is sufficient.',
  },
  // Cross-cutting / port-scoped reference data
  { pattern: /\/tags\//, reason: 'Tags are cross-cutting reference data; port-scoped via auth.' },
  {
    pattern: /\/currency\/(convert|rates)\/route\.ts$/,
    reason: 'Currency reference data; port-scoped, no PII.',
  },
  {
    pattern: /\/currency\/rates\/refresh\//,
    reason: 'TODO: gate with admin:manage_settings — currently allow-listed.',
  },
  {
    pattern: /\/search\//,
    reason: 'Port-scoped search — results filtered by auth context (resources have own perms).',
  },
  // Alerts surface in topbar/dashboard for every signed-in user; per-port not per-resource.
  { pattern: /\/alerts\//, reason: 'Alerts are user-scoped; port-filtered via auth context.' },
  // Internally gated by isSuperAdmin
  {
    pattern: /\/expenses\/export\/parent-company\//,
    reason: 'Internally gated by isSuperAdmin inside the handler.',
  },
  // Pending dedicated permissions
  {
    pattern: /\/ai\//,
    reason: 'TODO: needs ai:* permission catalog entry. Currently allow-listed.',
  },
  {
    pattern: /\/custom-fields\/\[entityId\]\//,
    reason: 'TODO: needs custom_fields:* permission. PUT path internally validated.',
  },
  {
    pattern: /\/berth-reservations\/\[id\]\/route\.ts$/,
    reason: 'TODO: PATCH should map to reservations:edit (not currently in catalog).',
  },
];

interface Finding {
  file: string;
  method: string;
  reason: 'no-withPermission' | 'no-withAuth' | 'allow-listed';
  allowReason?: string;
}

async function* walk(dir: string): AsyncGenerator<string> {
  for (const entry of await readdir(dir, { withFileTypes: true })) {
    const path = join(dir, entry.name);
    if (entry.isDirectory()) yield* walk(path);
    else if (entry.isFile() && entry.name === 'route.ts') yield path;
  }
}

function isAllowListed(file: string): { allowed: boolean; reason?: string } {
  for (const { pattern, reason } of ALLOW_LIST) {
    if (pattern.test(file)) return { allowed: true, reason };
  }
  return { allowed: false };
}

async function auditFile(file: string): Promise<Finding[]> {
  const src = await readFile(file, 'utf-8');
  const findings: Finding[] = [];

  for (const method of HTTP_METHODS) {
    // Match: export const GET = withAuth(...
    const declRe = new RegExp(`export\\s+const\\s+${method}\\s*=\\s*(.+?);`, 's');
    const m = declRe.exec(src);
    if (!m) continue;
    const block = m[1] ?? '';

    const hasAuth = /withAuth\s*\(/.test(block);
    const hasPerm = /withPermission\s*\(/.test(block);
    const allow = isAllowListed(file);

    if (!hasAuth) {
      findings.push({ file, method, reason: 'no-withAuth' });
      continue;
    }
    if (!hasPerm) {
      if (allow.allowed) {
        findings.push({ file, method, reason: 'allow-listed', allowReason: allow.reason });
      } else {
        findings.push({ file, method, reason: 'no-withPermission' });
      }
    }
  }

  return findings;
}

async function main() {
  const files: string[] = [];
  for await (const f of walk(ROOT)) files.push(f);
  files.sort();

  const all: Finding[] = [];
  for (const f of files) all.push(...(await auditFile(f)));

  const violations = all.filter(
    (f) => f.reason === 'no-withPermission' || f.reason === 'no-withAuth',
  );
  const allowListed = all.filter((f) => f.reason === 'allow-listed');

  // Markdown report
  const lines: string[] = [];
  lines.push('# Permission Matrix Audit');
  lines.push('');
  lines.push(`Scanned ${files.length} route files under \`src/app/api/v1/\`.`);
  lines.push('');

  if (violations.length === 0) {
    lines.push('**No violations.** Every internal v1 handler is permission-gated.');
  } else {
    lines.push(`**${violations.length} violation(s):**`);
    lines.push('');
    lines.push('| File | Method | Issue |');
    lines.push('| --- | --- | --- |');
    for (const v of violations) {
      const rel = relative(process.cwd(), v.file);
      lines.push(`| \`${rel}\` | ${v.method} | ${v.reason} |`);
    }
  }
  lines.push('');
  lines.push(
    `**Allow-listed:** ${allowListed.length} handler(s) intentionally skip \`withPermission\`.`,
  );
  if (allowListed.length > 0) {
    lines.push('');
    lines.push('| File | Method | Reason |');
    lines.push('| --- | --- | --- |');
    for (const a of allowListed) {
      const rel = relative(process.cwd(), a.file);
      lines.push(`| \`${rel}\` | ${a.method} | ${a.allowReason} |`);
    }
  }

  process.stdout.write(lines.join('\n') + '\n');
  process.exit(violations.length > 0 ? 1 : 0);
}

main().catch((err) => {
  console.error(err);
  process.exit(2);
});
```
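As a sanity check on the detection heuristic, the declaration and wrapper patterns can be exercised against a toy route source (the sample route below is made up; the regexes mirror the ones in `scripts/audit-permissions.ts`):

```typescript
// Toy route source: one gated handler, one auth-only handler.
const sample = `
export const GET = withPermission('clients:view', async (req) => {
  return Response.json([]);
});
export const POST = withAuth(async (req) => {
  return new Response(null, { status: 201 });
});
`;

// Same declaration pattern as the audit script: capture everything between
// "export const <METHOD> =" and the first semicolon.
const declRe = (method: string) => new RegExp(`export\\s+const\\s+${method}\\s*=\\s*(.+?);`, 's');

const getBlock = declRe('GET').exec(sample)?.[1] ?? '';
const postBlock = declRe('POST').exec(sample)?.[1] ?? '';

const getGated = /withPermission\s*\(/.test(getBlock);   // → true
const postGated = /withPermission\s*\(/.test(postBlock); // → false
```

Note the lazy `(.+?);` stops at the first semicolon inside the handler body, which is enough to see the outermost wrapper name.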

scripts/backup/minio-mirror.sh (new file, 51 lines):

```bash
#!/usr/bin/env bash
# Hourly MinIO mirror for Port Nimara CRM.
#
# Mirrors the live `MINIO_BUCKET` to the backup destination. `mc mirror`
# is incremental — only changed objects transfer — so this is cheap.
#
# Versioning on the destination bucket is what protects against object
# deletes / overwrites; we don't try to roll our own.

set -euo pipefail

: "${MINIO_ENDPOINT:?MINIO_ENDPOINT not set}"
: "${MINIO_ACCESS_KEY:?MINIO_ACCESS_KEY not set}"
: "${MINIO_SECRET_KEY:?MINIO_SECRET_KEY not set}"
: "${MINIO_BUCKET:?MINIO_BUCKET not set}"
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"

# Default scheme: live MinIO is plain HTTP unless MINIO_USE_SSL=true.
LIVE_URL="${MINIO_ENDPOINT}"
if [[ "${MINIO_USE_SSL:-false}" == "true" ]]; then
  LIVE_URL="https://${MINIO_ENDPOINT}:${MINIO_PORT:-443}"
else
  LIVE_URL="http://${MINIO_ENDPOINT}:${MINIO_PORT:-9000}"
fi

LIVE_ALIAS="live-$$"
BACKUP_ALIAS="bk-$$"
trap 'mc alias remove "$LIVE_ALIAS" 2>/dev/null || true; mc alias remove "$BACKUP_ALIAS" 2>/dev/null || true' EXIT

mc alias set "$LIVE_ALIAS" "$LIVE_URL" \
  "$MINIO_ACCESS_KEY" "$MINIO_SECRET_KEY" --api S3v4 >/dev/null
mc alias set "$BACKUP_ALIAS" "$BACKUP_S3_ENDPOINT" \
  "$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" --api S3v4 >/dev/null

SOURCE="${LIVE_ALIAS}/${MINIO_BUCKET}/"
DEST="${BACKUP_ALIAS}/${BACKUP_S3_BUCKET}/minio/"

echo "[$(date -u +%FT%TZ)] Mirroring $SOURCE → $DEST"

# `--remove` would delete objects from the destination that no longer
# exist in source — we DON'T pass it, because that would let an
# accidental delete on the live bucket cascade into permanent loss on
# the backup side. Versioning + lifecycle handle stale-object cleanup.
mc mirror --quiet --overwrite "$SOURCE" "$DEST"

# Print byte / count diff for the operator.
echo "[$(date -u +%FT%TZ)] Done. Destination summary:"
mc du "$DEST"
```

scripts/backup/pg-backup.sh (new file, 63 lines):

```bash
#!/usr/bin/env bash
# Hourly PostgreSQL backup for Port Nimara CRM.
#
# Reads DATABASE_URL and BACKUP_S3_* from the environment. Dumps to a
# tmpfile, gzips, optionally GPG-encrypts to BACKUP_GPG_RECIPIENT, and
# uploads to s3://${BACKUP_S3_BUCKET}/pg/<hostname>/<UTC-date>/<hour>.dump.gz[.gpg].
#
# Designed to fail loud: any non-zero exit halts the script and propagates
# to the cron / CI runner so the operator sees the failure.

set -euo pipefail

: "${DATABASE_URL:?DATABASE_URL not set}"
: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"

HOST="${BACKUP_HOST_OVERRIDE:-$(hostname -s)}"
DATE_UTC="$(date -u +%Y-%m-%d)"
HOUR_UTC="$(date -u +%H)"
WORKDIR="$(mktemp -d)"
trap 'rm -rf "$WORKDIR"' EXIT

DUMP_FILE="$WORKDIR/${HOUR_UTC}.dump"
ARCHIVE_NAME="${HOUR_UTC}.dump.gz"

echo "[$(date -u +%FT%TZ)] Dumping $DATABASE_URL → $DUMP_FILE"
pg_dump --format=custom --compress=9 --no-owner --no-privileges \
  --file="$DUMP_FILE" "$DATABASE_URL"

# pg_dump's `custom` format is already compressed, but we wrap in gzip so
# the file looks the same regardless of the dump format on disk.
gzip -n "$DUMP_FILE"
GZ_FILE="${DUMP_FILE}.gz"

# Optional GPG layer. Only encrypt if the recipient is configured.
if [[ -n "${BACKUP_GPG_RECIPIENT:-}" ]]; then
  echo "[$(date -u +%FT%TZ)] Encrypting for $BACKUP_GPG_RECIPIENT"
  gpg --batch --yes --trust-model always \
    --recipient "$BACKUP_GPG_RECIPIENT" \
    --encrypt --output "${GZ_FILE}.gpg" "$GZ_FILE"
  rm "$GZ_FILE"
  GZ_FILE="${GZ_FILE}.gpg"
  ARCHIVE_NAME="${ARCHIVE_NAME}.gpg"
fi

# Configure mc client for the backup destination.
MC_ALIAS="bk-$$"
mc alias set "$MC_ALIAS" "$BACKUP_S3_ENDPOINT" \
  "$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" \
  --api S3v4 >/dev/null

REMOTE_PATH="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${DATE_UTC}/${ARCHIVE_NAME}"
echo "[$(date -u +%FT%TZ)] Uploading → $REMOTE_PATH"
mc cp --quiet "$GZ_FILE" "$REMOTE_PATH"

# Tag with retention metadata so lifecycle rules can decide what to expire.
mc tag set "$REMOTE_PATH" "kind=hourly&host=${HOST}&date=${DATE_UTC}" >/dev/null

mc alias remove "$MC_ALIAS" >/dev/null

echo "[$(date -u +%FT%TZ)] OK ${ARCHIVE_NAME} ($(du -h "$GZ_FILE" | cut -f1))"
```
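The upload layout that the restore script later sorts through lexicographically can be captured as a pure path builder (a sketch mirroring the script's `pg/<host>/<date>/<hour>` convention; the function name is an assumption):

```typescript
// Mirrors pg-backup.sh's key: pg/<host>/<UTC date>/<hour>.dump.gz[.gpg].
// UTC date + zero-padded hour keep keys sortable, so "latest" resolution
// can be a plain lexicographic sort.
function pgBackupKey(host: string, when: Date, encrypted: boolean): string {
  const date = when.toISOString().slice(0, 10); // YYYY-MM-DD in UTC
  const hour = String(when.getUTCHours()).padStart(2, '0');
  return `pg/${host}/${date}/${hour}.dump.gz${encrypted ? '.gpg' : ''}`;
}
```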

scripts/backup/restore.sh (new file, 121 lines):

```bash
#!/usr/bin/env bash
# Cold-restore script for Port Nimara CRM.
#
# Two modes:
#   --drill       Restore to a sandbox DB ($DRILL_DATABASE_URL) + a tagged
#                 sandbox path on the live MinIO bucket. Used by the weekly
#                 cron drill so the runbook stays accurate.
#   (no --drill)  Interactive production restore. Prompts for explicit
#                 confirmation, then drops and recreates the target
#                 database before restoring.
#
# Common args:
#   --snapshot YYYY-MM-DD/HH   Specific dump to restore. Defaults to "latest".

set -euo pipefail

DRILL=0
SNAPSHOT="latest"
while [[ $# -gt 0 ]]; do
  case "$1" in
    --drill) DRILL=1; shift ;;
    --snapshot) SNAPSHOT="$2"; shift 2 ;;
    *) echo "unknown arg: $1" >&2; exit 2 ;;
  esac
done

: "${BACKUP_S3_BUCKET:?BACKUP_S3_BUCKET not set}"
: "${BACKUP_S3_ENDPOINT:?BACKUP_S3_ENDPOINT not set}"
: "${BACKUP_S3_ACCESS_KEY:?BACKUP_S3_ACCESS_KEY not set}"
: "${BACKUP_S3_SECRET_KEY:?BACKUP_S3_SECRET_KEY not set}"

if [[ "$DRILL" -eq 1 ]]; then
  : "${DRILL_DATABASE_URL:?DRILL_DATABASE_URL not set}"
  TARGET_DB="$DRILL_DATABASE_URL"
  echo "[drill] target DB = $TARGET_DB"
else
  : "${DATABASE_URL:?DATABASE_URL not set}"
  TARGET_DB="$DATABASE_URL"
  read -rp "About to overwrite $TARGET_DB. Type 'restore' to continue: " confirm
  [[ "$confirm" == "restore" ]] || { echo "aborted"; exit 1; }
fi

HOST="${BACKUP_HOST_OVERRIDE:-$(hostname -s)}"
WORKDIR="$(mktemp -d)"
trap 'rm -rf "$WORKDIR"' EXIT

MC_ALIAS="bk-$$"
mc alias set "$MC_ALIAS" "$BACKUP_S3_ENDPOINT" \
  "$BACKUP_S3_ACCESS_KEY" "$BACKUP_S3_SECRET_KEY" --api S3v4 >/dev/null
trap 'rm -rf "$WORKDIR"; mc alias remove "$MC_ALIAS" 2>/dev/null || true' EXIT

# Resolve the snapshot path.
if [[ "$SNAPSHOT" == "latest" ]]; then
  REMOTE=$(mc ls --recursive "${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/" \
    | awk '{print $NF}' | sort | tail -1)
  if [[ -z "$REMOTE" ]]; then
    echo "no snapshots found under ${BACKUP_S3_BUCKET}/pg/${HOST}/" >&2
    exit 1
  fi
  REMOTE="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${REMOTE}"
else
  REMOTE="${MC_ALIAS}/${BACKUP_S3_BUCKET}/pg/${HOST}/${SNAPSHOT}.dump.gz"
  # If GPG was used, the file lives at .dump.gz.gpg. Try both.
  if ! mc stat "$REMOTE" >/dev/null 2>&1; then
    REMOTE="${REMOTE}.gpg"
  fi
fi

echo "[$(date -u +%FT%TZ)] Pulling $REMOTE"
LOCAL="$WORKDIR/$(basename "$REMOTE")"
mc cp --quiet "$REMOTE" "$LOCAL"

# Decrypt if needed.
if [[ "$LOCAL" == *.gpg ]]; then
  echo "[$(date -u +%FT%TZ)] Decrypting"
  gpg --batch --yes --decrypt --output "${LOCAL%.gpg}" "$LOCAL"
  rm "$LOCAL"
  LOCAL="${LOCAL%.gpg}"
fi

# Decompress.
gunzip "$LOCAL"
LOCAL="${LOCAL%.gz}"

echo "[$(date -u +%FT%TZ)] Restoring into $TARGET_DB"

# Drop & recreate to guarantee no half-state from a prior run.
DB_NAME=$(echo "$TARGET_DB" | sed -E 's|.*/([^?]+).*|\1|')
ADMIN_URL=$(echo "$TARGET_DB" | sed -E "s|/${DB_NAME}|/postgres|")

psql "$ADMIN_URL" -v ON_ERROR_STOP=1 <<SQL
SELECT pg_terminate_backend(pid) FROM pg_stat_activity
WHERE datname = '${DB_NAME}' AND pid <> pg_backend_pid();
DROP DATABASE IF EXISTS "${DB_NAME}";
CREATE DATABASE "${DB_NAME}";
SQL

pg_restore --no-owner --no-privileges --dbname "$TARGET_DB" "$LOCAL"

# Drill mode: compare row counts vs the live producer for parity.
if [[ "$DRILL" -eq 1 ]]; then
  echo "[$(date -u +%FT%TZ)] Drill row-count diff (live vs restored):"
  TABLES=$(psql -At "$TARGET_DB" -c \
    "SELECT tablename FROM pg_tables WHERE schemaname='public' ORDER BY tablename;")
  diff_count=0
  while IFS= read -r tbl; do
    [[ -z "$tbl" ]] && continue
    live=$(psql -At "${LIVE_DATABASE_URL:-$DATABASE_URL}" -c "SELECT count(*) FROM \"$tbl\";")
    restored=$(psql -At "$TARGET_DB" -c "SELECT count(*) FROM \"$tbl\";")
    delta=$((live - restored))
    if [[ "$delta" -ne 0 ]]; then
      echo "  ⚠ $tbl: live=$live restored=$restored delta=$delta"
      diff_count=$((diff_count + 1))
    fi
  done <<< "$TABLES"
  if [[ "$diff_count" -eq 0 ]]; then
    echo "  ✓ row counts match across all tables"
  fi
fi

echo "[$(date -u +%FT%TZ)] Restore complete."
```

scripts/dev-set-password.ts (new file, 40 lines):

```ts
/**
 * Dev helper: set a user's password directly (bypasses email reset).
 * Usage: pnpm tsx scripts/dev-set-password.ts <email> <password>
 */
import 'dotenv/config';
import { hashPassword } from 'better-auth/crypto';
import { eq, and } from 'drizzle-orm';
import { db } from '@/lib/db';
import { user, account } from '@/lib/db/schema/users';

async function main() {
  const [, , email, password] = process.argv;
  if (!email || !password) {
    console.error('Usage: pnpm tsx scripts/dev-set-password.ts <email> <password>');
    process.exit(1);
  }

  const u = await db.query.user.findFirst({ where: eq(user.email, email) });
  if (!u) {
    console.error(`User not found: ${email}`);
    process.exit(1);
  }

  const hash = await hashPassword(password);
  const result = await db
    .update(account)
    .set({ password: hash, updatedAt: new Date() })
    .where(and(eq(account.userId, u.id), eq(account.providerId, 'credential')))
    .returning({ id: account.id });

  if (result.length === 0) {
    console.error(`No credential account row for ${email}`);
    process.exit(1);
  }

  console.log(`Updated password for ${email} (account id ${result[0]?.id}).`);
  process.exit(0);
}

main();
```

```diff
@@ -20,7 +20,15 @@ async function main() {
   const isSuperAdmin = args.includes('--super');
   const name = args.find((a, i) => i > 0 && !a.startsWith('--'));
 
-  const { inviteId, link } = await createCrmInvite({ email, name, isSuperAdmin });
+  // Dev script runs out-of-band (no HTTP request, no session). The service's
+  // super-admin gate requires `invitedBy.isSuperAdmin === true` for super
+  // invites; the script bypasses that with a synthetic caller identity.
+  const { inviteId, link } = await createCrmInvite({
+    email,
+    name,
+    isSuperAdmin,
+    invitedBy: { userId: 'cli-script', isSuperAdmin: true },
+  });
   console.log(`✓ Invite created (id=${inviteId})`);
   console.log(`  email: ${email}`);
   console.log(`  super_admin: ${isSuperAdmin}`);
```

```diff
@@ -1,10 +1,9 @@
+import { PageHeader } from '@/components/shared/page-header';
+
 export default function BackupManagementPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-bold text-foreground">Backup Management</h1>
-        <p className="text-muted-foreground">Manage system backups and restoration</p>
-      </div>
+      <PageHeader title="Backup Management" description="Manage system backups and restoration" />
       <div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
         <p className="text-lg font-medium text-muted-foreground">Coming in Layer 4</p>
         <p className="text-sm text-muted-foreground">
```

```diff
@@ -2,6 +2,7 @@ import {
   SettingsFormCard,
   type SettingFieldDef,
 } from '@/components/admin/shared/settings-form-card';
+import { PageHeader } from '@/components/shared/page-header';
 
 const FIELDS: SettingFieldDef[] = [
   {
@@ -47,13 +48,10 @@
 export default function BrandingSettingsPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-semibold">Branding</h1>
-        <p className="text-sm text-muted-foreground">
-          Logo, primary color, app name, and email header/footer HTML used by the branded auth shell
-          and outgoing email templates.
-        </p>
-      </div>
+      <PageHeader
+        title="Branding"
+        description="Logo, primary color, app name, and email header/footer HTML used by the branded auth shell and outgoing email templates."
+      />
       <SettingsFormCard
         title="Identity"
         description="App name, logo, and primary color."
```

```diff
@@ -3,6 +3,7 @@ import {
   type SettingFieldDef,
 } from '@/components/admin/shared/settings-form-card';
 import { DocumensoTestButton } from '@/components/admin/documenso/documenso-test-button';
+import { PageHeader } from '@/components/shared/page-header';
 
 const API_FIELDS: SettingFieldDef[] = [
   {
@@ -48,13 +49,10 @@ const EOI_FIELDS: SettingFieldDef[] = [
 export default function DocumensoSettingsPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-semibold">Documenso & EOI</h1>
-        <p className="text-sm text-muted-foreground">
-          API credentials and default EOI generation pathway. Use the test-connection button to
-          verify a saved configuration before relying on it.
-        </p>
-      </div>
+      <PageHeader
+        title="Documenso & EOI"
+        description="API credentials and default EOI generation pathway. Use the test-connection button to verify a saved configuration before relying on it."
+      />
 
       <SettingsFormCard
         title="Documenso API"
```

```diff
@@ -2,6 +2,7 @@ import {
   SettingsFormCard,
   type SettingFieldDef,
 } from '@/components/admin/shared/settings-form-card';
+import { PageHeader } from '@/components/shared/page-header';
 
 const FIELDS: SettingFieldDef[] = [
   {
@@ -79,13 +80,10 @@
 export default function EmailSettingsPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-semibold">Email Settings</h1>
-        <p className="text-sm text-muted-foreground">
-          Per-port outgoing email configuration. SMTP credentials and the From address default to
-          environment variables when these fields are blank.
-        </p>
-      </div>
+      <PageHeader
+        title="Email Settings"
+        description="Per-port outgoing email configuration. SMTP credentials and the From address default to environment variables when these fields are blank."
+      />
       <SettingsFormCard
         title="From address & signature"
         description="Identity headers and shared HTML used by system-generated emails."
```

```diff
@@ -1,10 +1,9 @@
+import { PageHeader } from '@/components/shared/page-header';
+
 export default function DataImportPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-bold text-foreground">Data Import</h1>
-        <p className="text-muted-foreground">Import data from external sources</p>
-      </div>
+      <PageHeader title="Data Import" description="Import data from external sources" />
       <div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
         <p className="text-lg font-medium text-muted-foreground">Coming in Layer 4</p>
         <p className="text-sm text-muted-foreground">
```

```diff
@@ -1,15 +1,13 @@
 import { InvitationsManager } from '@/components/admin/invitations/invitations-manager';
+import { PageHeader } from '@/components/shared/page-header';
 
 export default function InvitationsPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-semibold">Invitations</h1>
-        <p className="text-sm text-muted-foreground">
-          Send a single-use invitation to a new CRM user. The recipient sets their own password via
-          the link in the email.
-        </p>
-      </div>
+      <PageHeader
+        title="Invitations"
+        description="Send a single-use invitation to a new CRM user. The recipient sets their own password via the link in the email."
+      />
       <InvitationsManager />
     </div>
   );
```

src/app/(dashboard)/[portSlug]/admin/layout.tsx (new file, 36 lines):

```tsx
import { redirect } from 'next/navigation';
import { headers } from 'next/headers';
import { eq } from 'drizzle-orm';

import { auth } from '@/lib/auth';
import { db } from '@/lib/db';
import { userProfiles } from '@/lib/db/schema/users';

/**
 * Guard: only super-admins (isSuperAdmin === true in user_profiles) may access
 * any page under /[portSlug]/admin. Everyone else is redirected to their dashboard.
 */
export default async function AdminLayout({
  children,
  params,
}: {
  children: React.ReactNode;
  params: Promise<{ portSlug: string }>;
}) {
  const { portSlug } = await params;
  const session = await auth.api.getSession({ headers: await headers() });

  if (!session?.user) {
    redirect('/login');
  }

  const profile = await db.query.userProfiles.findFirst({
    where: eq(userProfiles.userId, session.user.id),
  });

  if (!profile?.isSuperAdmin) {
    redirect(`/${portSlug}/dashboard`);
  }

  return <>{children}</>;
}
```

src/app/(dashboard)/[portSlug]/admin/ocr/page.tsx (new file, 5 lines):

```tsx
import { OcrSettingsForm } from '@/components/admin/ocr-settings-form';

export default function OcrSettingsPage() {
  return <OcrSettingsForm />;
}
```

```diff
@@ -1,10 +1,9 @@
+import { PageHeader } from '@/components/shared/page-header';
+
 export default function OnboardingPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-bold text-foreground">Onboarding</h1>
-        <p className="text-muted-foreground">Guided setup for new port configurations</p>
-      </div>
+      <PageHeader title="Onboarding" description="Guided setup for new port configurations" />
       <div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
         <p className="text-lg font-medium text-muted-foreground">Coming in Layer 4</p>
         <p className="text-sm text-muted-foreground">
```

```diff
@@ -20,6 +20,7 @@ import {
 } from 'lucide-react';
 
 import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
+import { PageHeader } from '@/components/shared/page-header';
 
 interface AdminSection {
   href: string;
@@ -149,6 +150,12 @@
     description: 'Initial-setup wizard for fresh ports.',
     icon: LayoutDashboard,
   },
+  {
+    href: 'ocr',
+    label: 'Receipt OCR',
+    description: 'Configure the AI provider used by the mobile receipt scanner.',
+    icon: ScrollText,
+  },
 ];
 
 export default async function AdminLandingPage({
@@ -159,13 +166,10 @@
   const { portSlug } = await params;
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-semibold">Administration</h1>
-        <p className="text-sm text-muted-foreground">
-          Per-port configuration and system administration. Each card below opens a dedicated
-          settings page.
-        </p>
-      </div>
+      <PageHeader
+        title="Administration"
+        description="Per-port configuration and system administration. Each card below opens a dedicated settings page."
+      />
       <div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
         {SECTIONS.map((s) => {
           const Icon = s.icon;
```
@@ -2,6 +2,7 @@ import {
   SettingsFormCard,
   type SettingFieldDef,
 } from '@/components/admin/shared/settings-form-card';
+import { PageHeader } from '@/components/shared/page-header';

 const DEFAULT_FIELDS: SettingFieldDef[] = [
   {

@@ -53,14 +54,10 @@ const DIGEST_FIELDS: SettingFieldDef[] = [
 export default function ReminderSettingsPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-semibold">Reminders</h1>
-        <p className="text-sm text-muted-foreground">
-          Default reminder behaviour for new interests and the optional daily-digest delivery
-          window. Individual users can still configure their own digest preferences in Notifications
-          → Preferences.
-        </p>
-      </div>
+      <PageHeader
+        title="Reminders"
+        description="Default reminder behaviour for new interests and the optional daily-digest delivery window. Individual users can still configure their own digest preferences in Notifications → Preferences."
+      />

       <SettingsFormCard
         title="Defaults for new interests"

@@ -1,10 +1,12 @@
+import { PageHeader } from '@/components/shared/page-header';
+
 export default function ScheduledReportsPage() {
   return (
     <div className="space-y-6">
-      <div>
-        <h1 className="text-2xl font-bold text-foreground">Scheduled Reports</h1>
-        <p className="text-muted-foreground">Configure and manage automated report delivery</p>
-      </div>
+      <PageHeader
+        title="Scheduled Reports"
+        description="Configure and manage automated report delivery"
+      />
       <div className="flex flex-col items-center justify-center rounded-lg border border-dashed p-12">
         <p className="text-lg font-medium text-muted-foreground">Coming in Layer 3</p>
         <p className="text-sm text-muted-foreground">

@@ -2,6 +2,7 @@

 import { useCallback, useEffect, useState } from 'react';
 import { Button } from '@/components/ui/button';
+import { PageHeader } from '@/components/shared/page-header';
 import { Badge } from '@/components/ui/badge';
 import {
   AlertDialog,

@@ -36,7 +37,11 @@ export default function WebhooksPage() {
   const [deleteTarget, setDeleteTarget] = useState<Webhook | null>(null);
   const [expandedId, setExpandedId] = useState<string | null>(null);
   const [regenerating, setRegenerating] = useState<string | null>(null);
-  const [newSecret, setNewSecret] = useState<{ webhookId: string; secret: string; masked: string } | null>(null);
+  const [newSecret, setNewSecret] = useState<{
+    webhookId: string;
+    secret: string;
+    masked: string;
+  } | null>(null);

   const loadWebhooks = useCallback(async () => {
     try {

@@ -98,15 +103,20 @@ export default function WebhooksPage() {

   return (
     <div className="space-y-6">
-      <div className="flex items-center justify-between">
-        <div>
-          <h1 className="text-2xl font-bold text-foreground">Webhooks</h1>
-          <p className="text-muted-foreground">Configure outgoing webhook integrations</p>
-        </div>
-        <Button onClick={() => { setEditTarget(null); setFormOpen(true); }}>
-          Add Webhook
-        </Button>
-      </div>
+      <PageHeader
+        title="Webhooks"
+        description="Configure outgoing webhook integrations"
+        actions={
+          <Button
+            onClick={() => {
+              setEditTarget(null);
+              setFormOpen(true);
+            }}
+          >
+            Add Webhook
+          </Button>
+        }
+      />

       {loading ? (
         <p className="text-sm text-muted-foreground">Loading...</p>

@@ -116,7 +126,13 @@ export default function WebhooksPage() {
           <p className="text-sm text-muted-foreground mt-1">
             Add a webhook to receive real-time notifications of CRM events.
           </p>
-          <Button className="mt-4" onClick={() => { setEditTarget(null); setFormOpen(true); }}>
+          <Button
+            className="mt-4"
+            onClick={() => {
+              setEditTarget(null);
+              setFormOpen(true);
+            }}
+          >
             Add Webhook
           </Button>
         </div>

@@ -141,17 +157,16 @@ export default function WebhooksPage() {
                 </div>

                 <div className="flex items-center gap-2 shrink-0">
-                  <Button
-                    variant="ghost"
-                    size="sm"
-                    onClick={() => handleToggleActive(webhook)}
-                  >
+                  <Button variant="ghost" size="sm" onClick={() => handleToggleActive(webhook)}>
                     {webhook.isActive ? 'Disable' : 'Enable'}
                   </Button>
                   <Button
                     variant="ghost"
                     size="sm"
-                    onClick={() => { setEditTarget(webhook); setFormOpen(true); }}
+                    onClick={() => {
+                      setEditTarget(webhook);
+                      setFormOpen(true);
+                    }}
                   >
                     Edit
                   </Button>

@@ -163,11 +178,7 @@ export default function WebhooksPage() {
                   >
                     Delete
                   </Button>
-                  <Button
-                    variant="ghost"
-                    size="sm"
-                    onClick={() => toggleExpand(webhook.id)}
-                  >
+                  <Button variant="ghost" size="sm" onClick={() => toggleExpand(webhook.id)}>
                     {expandedId === webhook.id ? 'Collapse' : 'Details'}
                   </Button>
                 </div>

@@ -228,18 +239,26 @@ export default function WebhooksPage() {
         onSuccess={loadWebhooks}
       />

-      <AlertDialog open={!!deleteTarget} onOpenChange={(open) => { if (!open) setDeleteTarget(null); }}>
+      <AlertDialog
+        open={!!deleteTarget}
+        onOpenChange={(open) => {
+          if (!open) setDeleteTarget(null);
+        }}
+      >
         <AlertDialogContent>
           <AlertDialogHeader>
             <AlertDialogTitle>Delete Webhook</AlertDialogTitle>
             <AlertDialogDescription>
-              Delete "{deleteTarget?.name}"? This will also delete all delivery history. This action
-              cannot be undone.
+              Delete "{deleteTarget?.name}"? This will also delete all delivery history.
+              This action cannot be undone.
             </AlertDialogDescription>
           </AlertDialogHeader>
           <AlertDialogFooter>
             <AlertDialogCancel>Cancel</AlertDialogCancel>
-            <AlertDialogAction onClick={handleDelete} className="bg-destructive text-destructive-foreground">
+            <AlertDialogAction
+              onClick={handleDelete}
+              className="bg-destructive text-destructive-foreground"
+            >
               Delete
             </AlertDialogAction>
           </AlertDialogFooter>

5
src/app/(dashboard)/[portSlug]/alerts/page.tsx
Normal file
@@ -0,0 +1,5 @@
+import { AlertsPageShell } from '@/components/alerts/alerts-page-shell';
+
+export default function AlertsPage() {
+  return <AlertsPageShell />;
+}

@@ -0,0 +1,5 @@
+import { BerthReservationsList } from '@/components/reservations/berth-reservations-list';
+
+export default function BerthReservationsPage() {
+  return <BerthReservationsList />;
+}

@@ -0,0 +1,41 @@
+import { Skeleton } from '@/components/ui/skeleton';
+import { CardSkeleton } from '@/components/shared/loading-skeleton';
+
+/**
+ * Route-level loading UI for the client detail page. Renders while the
+ * server component resolves the session and the client component bootstraps
+ * its initial query — replaces the previous empty-header flash on direct
+ * URL visits.
+ */
+export default function Loading() {
+  return (
+    <div className="space-y-6">
+      {/* Header strip — title, badges, action buttons */}
+      <div className="rounded-xl border border-border bg-card px-5 py-4 shadow-sm space-y-3">
+        <div className="flex items-center gap-3">
+          <Skeleton className="h-7 w-56" />
+          <Skeleton className="h-5 w-16 rounded-full" />
+        </div>
+        <div className="flex flex-wrap gap-2">
+          <Skeleton className="h-9 w-20 rounded-md" />
+          <Skeleton className="h-9 w-20 rounded-md" />
+          <Skeleton className="h-9 w-24 rounded-md" />
+          <Skeleton className="h-9 w-32 rounded-md" />
+        </div>
+      </div>
+
+      {/* Tab strip */}
+      <div className="flex gap-2 border-b border-border pb-1">
+        {Array.from({ length: 8 }).map((_, i) => (
+          <Skeleton key={i} className="h-8 w-20 rounded-md" />
+        ))}
+      </div>
+
+      {/* Two-column overview */}
+      <div className="grid grid-cols-1 gap-6 md:grid-cols-2">
+        <CardSkeleton />
+        <CardSkeleton />
+      </div>
+    </div>
+  );
+}

@@ -20,6 +20,7 @@ import { TableSkeleton } from '@/components/shared/loading-skeleton';
 import { ArchiveConfirmDialog } from '@/components/shared/archive-confirm-dialog';
 import { PermissionGate } from '@/components/shared/permission-gate';
 import { ExpenseFormDialog } from '@/components/expenses/expense-form-dialog';
+import { ExpenseCard } from '@/components/expenses/expense-card';
 import { expenseFilterDefinitions } from '@/components/expenses/expense-filters';
 import { getExpenseColumns, type ExpenseRow } from '@/components/expenses/expense-columns';
 import { usePaginatedQuery } from '@/hooks/use-paginated-query';

@@ -60,8 +61,7 @@ export default function ExpensesPage() {
   });

   const archiveMutation = useMutation({
-    mutationFn: (id: string) =>
-      apiFetch(`/api/v1/expenses/${id}`, { method: 'DELETE' }),
+    mutationFn: (id: string) => apiFetch(`/api/v1/expenses/${id}`, { method: 'DELETE' }),
     onSuccess: () => {
       queryClient.invalidateQueries({ queryKey: ['expenses'] });
       setArchiveExpense(null);

@@ -151,6 +151,14 @@ export default function ExpensesPage() {
         onSortChange={setSort}
         isLoading={isFetching && !isLoading}
         getRowId={(row) => row.id}
+        cardRender={(row) => (
+          <ExpenseCard
+            expense={row.original}
+            portSlug={portSlug}
+            onEdit={setEditExpense}
+            onArchive={setArchiveExpense}
+          />
+        )}
         emptyState={
           <EmptyState
             title="No expenses found"

@@ -1,9 +1,11 @@
 'use client';

-import { useState, useRef } from 'react';
+import { useEffect, useRef, useState } from 'react';
 import { useParams, useRouter } from 'next/navigation';
 import { useMutation } from '@tanstack/react-query';
-import { Upload, Loader2, ScanLine } from 'lucide-react';
+import { Camera, Loader2, ScanLine, Upload } from 'lucide-react';
+
+import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider';

 import { Button } from '@/components/ui/button';
 import { Input } from '@/components/ui/input';

@@ -33,9 +35,16 @@ export default function ScanReceiptPage() {
   const router = useRouter();
   const fileInputRef = useRef<HTMLInputElement>(null);

+  const cameraInputRef = useRef<HTMLInputElement>(null);
   const [scanResult, setScanResult] = useState<ScanResult | null>(null);
   const [previewUrl, setPreviewUrl] = useState<string | null>(null);

+  const { setChrome } = useMobileChrome();
+  useEffect(() => {
+    setChrome({ title: 'Scan Receipt', showBackButton: true });
+    return () => setChrome({ title: null, showBackButton: false });
+  }, [setChrome]);
+
   // Editable fields from scan
   const [establishment, setEstablishment] = useState('');
   const [amount, setAmount] = useState('');

@@ -94,7 +103,7 @@ export default function ScanReceiptPage() {

   return (
     <div className="max-w-2xl mx-auto space-y-6">
-      <div>
+      <div className="hidden sm:block">
         <h1 className="text-2xl font-bold">Scan Receipt</h1>
         <p className="text-muted-foreground mt-1">
           Upload a receipt image and we will extract the expense details automatically.

@@ -109,28 +118,44 @@ export default function ScanReceiptPage() {
           </CardTitle>
         </CardHeader>
         <CardContent>
-          <div
-            className="border-2 border-dashed rounded-lg p-8 text-center cursor-pointer hover:bg-muted/50 transition-colors"
-            onClick={() => fileInputRef.current?.click()}
-          >
-            {previewUrl ? (
+          {previewUrl ? (
+            <div
+              className="border-2 border-dashed rounded-lg p-4 text-center cursor-pointer hover:bg-muted/50 transition-colors"
+              onClick={() => fileInputRef.current?.click()}
+            >
               <img
                 src={previewUrl}
                 alt="Receipt preview"
                 className="max-h-64 mx-auto rounded object-contain"
               />
-            ) : (
-              <div className="space-y-2">
-                <Upload className="h-8 w-8 mx-auto text-muted-foreground" />
-                <p className="text-sm text-muted-foreground">
-                  Click to upload or drag and drop
-                </p>
-                <p className="text-xs text-muted-foreground">
-                  JPEG, PNG, WebP up to 10MB
-                </p>
-              </div>
-            )}
-          </div>
+            </div>
+          ) : (
+            <div className="grid gap-2 sm:grid-cols-2">
+              <Button
+                type="button"
+                size="lg"
+                className="w-full h-14 sm:hidden"
+                onClick={() => cameraInputRef.current?.click()}
+              >
+                <Camera className="mr-2 h-5 w-5" />
+                Take photo
+              </Button>
+              <Button
+                type="button"
+                variant="outline"
+                size="lg"
+                className="w-full h-14"
+                onClick={() => fileInputRef.current?.click()}
+              >
+                <Upload className="mr-2 h-5 w-5" />
+                <span className="sm:hidden">Choose from library</span>
+                <span className="hidden sm:inline">Click to upload or drag and drop</span>
+              </Button>
+              <p className="text-xs text-muted-foreground sm:col-span-2 text-center">
+                JPEG, PNG, WebP up to 10MB
+              </p>
+            </div>
+          )}
           <input
             ref={fileInputRef}
             type="file"

@@ -138,6 +163,14 @@ export default function ScanReceiptPage() {
             className="hidden"
             onChange={handleFileChange}
           />
+          <input
+            ref={cameraInputRef}
+            type="file"
+            accept="image/*"
+            capture="environment"
+            className="hidden"
+            onChange={handleFileChange}
+          />

           {scanMutation.isPending && (
             <div className="flex items-center justify-center gap-2 mt-4 text-muted-foreground">

@@ -222,25 +255,18 @@ export default function ScanReceiptPage() {
           </div>

           {saveMutation.isError && (
-            <p className="text-sm text-destructive">
-              {(saveMutation.error as Error).message}
-            </p>
+            <p className="text-sm text-destructive">{(saveMutation.error as Error).message}</p>
           )}

           <div className="flex gap-2 pt-2">
-            <Button
-              variant="outline"
-              onClick={() => router.push(`/${params.portSlug}/expenses`)}
-            >
+            <Button variant="outline" onClick={() => router.push(`/${params.portSlug}/expenses`)}>
               Cancel
             </Button>
             <Button
               onClick={() => saveMutation.mutate()}
               disabled={saveMutation.isPending || !amount}
             >
-              {saveMutation.isPending && (
-                <Loader2 className="mr-2 h-4 w-4 animate-spin" />
-              )}
+              {saveMutation.isPending && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
               Save as Expense
             </Button>
           </div>

@@ -1,11 +1,13 @@
 'use client';

-import { useState } from 'react';
-import { useParams, useRouter } from 'next/navigation';
+import { useEffect, useState } from 'react';
+import { useParams, useRouter, useSearchParams } from 'next/navigation';
 import { useForm, FormProvider } from 'react-hook-form';
 import { zodResolver } from '@hookform/resolvers/zod';
-import { useMutation } from '@tanstack/react-query';
-import { ChevronLeft, ChevronRight, Check, Loader2 } from 'lucide-react';
+import { useMutation, useQuery } from '@tanstack/react-query';
+import { ChevronLeft, ChevronRight, Check, Loader2, Wallet } from 'lucide-react';
+
+import { useMobileChrome } from '@/components/layout/mobile/mobile-layout-provider';

 import { Button } from '@/components/ui/button';
 import { Input } from '@/components/ui/input';

@@ -43,9 +45,35 @@ export default function NewInvoicePage() {
   const params = useParams<{ portSlug: string }>();
   const portSlug = params?.portSlug ?? '';
   const router = useRouter();
+  const searchParams = useSearchParams();
+  const prefilledInterestId = searchParams.get('interestId') ?? undefined;
+  const prefilledKind =
+    searchParams.get('kind') === 'deposit' ? ('deposit' as const) : ('general' as const);

   const [step, setStep] = useState(1);

+  const { setChrome } = useMobileChrome();
+  useEffect(() => {
+    setChrome({ title: 'New Invoice', showBackButton: true });
+    return () => setChrome({ title: null, showBackButton: false });
+  }, [setChrome]);
+
+  // When the form is launched from an interest detail with `?interestId=…&kind=deposit`,
+  // fetch enough of the interest to display "Deposit for {client} — Berth {n}" in
+  // the review step. Doubles as the source of truth for the billing entity prefill.
+  const { data: prefilledInterest } = useQuery<{
+    data: {
+      id: string;
+      clientId: string;
+      clientName: string | null;
+      berthMooringNumber: string | null;
+    };
+  }>({
+    queryKey: ['interest-prefill', prefilledInterestId],
+    queryFn: () => apiFetch(`/api/v1/interests/${prefilledInterestId}`),
+    enabled: !!prefilledInterestId,
+  });
+
   const methods = useForm<CreateInvoiceInput>({
     resolver: zodResolver(createInvoiceSchema),
     defaultValues: {

@@ -53,6 +81,8 @@ export default function NewInvoicePage() {
       currency: 'USD',
       lineItems: [],
       expenseIds: [],
+      interestId: prefilledInterestId,
+      kind: prefilledKind,
     },
   });

@@ -65,6 +95,43 @@ export default function NewInvoicePage() {
   } = methods;

   const watchedValues = watch();
+  const isDepositInvoice = watchedValues.kind === 'deposit';
+
+  // Resolve the selected billing entity to a human name so the review step
+  // shows "Acme Yacht Charters" instead of "company 4f2a1b…".
+  const billingEntityRef = watchedValues.billingEntity ?? null;
+  const { data: billingEntityName } = useQuery<{ name: string }>({
+    queryKey: ['billing-entity-name', billingEntityRef?.type, billingEntityRef?.id],
+    queryFn: async () => {
+      if (!billingEntityRef) return { name: '' };
+      const path =
+        billingEntityRef.type === 'company'
+          ? `/api/v1/companies/${billingEntityRef.id}`
+          : `/api/v1/clients/${billingEntityRef.id}`;
+      const res = await apiFetch<{
+        data: { fullName?: string; name?: string };
+      }>(path);
+      return {
+        name: res?.data?.fullName ?? res?.data?.name ?? '',
+      };
+    },
+    enabled: !!billingEntityRef?.id,
+    staleTime: 60_000,
+  });
+
+  // Pre-fill the billing entity from the linked interest's client on launch.
+  useEffect(() => {
+    if (prefilledInterest?.data && !watchedValues.billingEntity) {
+      setValue(
+        'billingEntity',
+        { type: 'client', id: prefilledInterest.data.clientId },
+        { shouldValidate: true },
+      );
+    }
+    // We only want this to run when the interest data first arrives.
+    // eslint-disable-next-line react-hooks/exhaustive-deps
+  }, [prefilledInterest?.data?.clientId]);
+
   const lineItems = watchedValues.lineItems ?? [];
   const subtotal = lineItems.reduce(
     (sum, li) => sum + (Number(li.quantity) || 0) * (Number(li.unitPrice) || 0),

@@ -117,8 +184,8 @@ export default function NewInvoicePage() {

   return (
     <div className="max-w-2xl mx-auto space-y-6">
-      {/* Header */}
-      <div className="flex items-center gap-3">
+      {/* Header — desktop only; mobile gets the title from the topbar */}
+      <div className="hidden sm:flex items-center gap-3">
        <Button variant="ghost" size="sm" onClick={() => router.push(`/${portSlug}/invoices`)}>
          <ChevronLeft className="h-4 w-4" />
        </Button>

@@ -157,6 +224,23 @@ export default function NewInvoicePage() {
         <CardTitle className="text-base">Client Information</CardTitle>
       </CardHeader>
       <CardContent className="space-y-4">
+        {isDepositInvoice ? (
+          <div className="flex items-start gap-3 rounded-md border border-amber-200 bg-amber-50 px-3 py-2 text-sm text-amber-900">
+            <Wallet className="mt-0.5 h-4 w-4 shrink-0" />
+            <div className="min-w-0">
+              <p className="font-medium">Deposit invoice</p>
+              <p className="text-xs text-amber-800">
+                {prefilledInterest?.data
+                  ? `Linked to ${prefilledInterest.data.clientName ?? 'interest'}${
+                      prefilledInterest.data.berthMooringNumber
+                        ? ` — Berth ${prefilledInterest.data.berthMooringNumber}`
+                        : ''
+                    }. Marking this invoice as paid will advance the interest to "Deposit 10%".`
+                  : 'Marking this invoice as paid will advance the linked interest to "Deposit 10%".'}
+              </p>
+            </div>
+          </div>
+        ) : null}
         <div className="space-y-2">
           <Label>
             Billing entity <span className="text-destructive">*</span>

@@ -294,9 +378,13 @@ export default function NewInvoicePage() {
                 <p className="font-medium mt-0.5">
                   {watchedValues.billingEntity ? (
                     <>
-                      <span className="capitalize">{watchedValues.billingEntity.type}</span>{' '}
-                      <span className="text-xs opacity-60">
-                        {watchedValues.billingEntity.id.slice(0, 12)}
+                      {billingEntityName?.name ? (
+                        <span>{billingEntityName.name}</span>
+                      ) : (
+                        <span className="text-muted-foreground">Loading…</span>
+                      )}{' '}
+                      <span className="text-xs text-muted-foreground capitalize">
+                        ({watchedValues.billingEntity.type})
                       </span>
                     </>
                   ) : (

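The `?interestId=…&kind=deposit` prefill above reduces to a small pure function. A sketch, using `URLSearchParams` in place of Next.js `useSearchParams` (they share the same `get` API); the function name is hypothetical:

```typescript
// Isolated sketch of the deposit-invoice prefill parsing shown in the hunk above.
function parseInvoicePrefill(search: string): {
  interestId: string | undefined;
  kind: 'deposit' | 'general';
} {
  const sp = new URLSearchParams(search);
  const interestId = sp.get('interestId') ?? undefined;
  // Anything other than an explicit kind=deposit falls back to a general invoice.
  const kind = sp.get('kind') === 'deposit' ? 'deposit' : 'general';
  return { interestId, kind };
}
```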
@@ -12,6 +12,7 @@ import { PageHeader } from '@/components/shared/page-header';
 import { EmptyState } from '@/components/shared/empty-state';
 import { TableSkeleton } from '@/components/shared/loading-skeleton';
 import { PermissionGate } from '@/components/shared/permission-gate';
+import { InvoiceCard } from '@/components/invoices/invoice-card';
 import { invoiceFilterDefinitions } from '@/components/invoices/invoice-filters';
 import { getInvoiceColumns, type InvoiceRow } from '@/components/invoices/invoice-columns';
 import { usePaginatedQuery } from '@/hooks/use-paginated-query';

@@ -63,8 +64,7 @@ export default function InvoicesPage() {
   });

   const deleteMutation = useMutation({
-    mutationFn: (id: string) =>
-      apiFetch(`/api/v1/invoices/${id}`, { method: 'DELETE' }),
+    mutationFn: (id: string) => apiFetch(`/api/v1/invoices/${id}`, { method: 'DELETE' }),
     onSuccess: () => {
       queryClient.invalidateQueries({ queryKey: ['invoices'] });
       setDeleteTarget(null);

@@ -72,8 +72,7 @@ export default function InvoicesPage() {
   });

   const sendMutation = useMutation({
-    mutationFn: (id: string) =>
-      apiFetch(`/api/v1/invoices/${id}/send`, { method: 'POST' }),
+    mutationFn: (id: string) => apiFetch(`/api/v1/invoices/${id}/send`, { method: 'POST' }),
     onSuccess: () => {
       queryClient.invalidateQueries({ queryKey: ['invoices'] });
     },

@@ -82,8 +81,7 @@ export default function InvoicesPage() {
   const columns = getInvoiceColumns({
     portSlug,
     onSend: (invoice) => sendMutation.mutate(invoice.id),
-    onRecordPayment: (invoice) =>
-      router.push(`/${portSlug}/invoices/${invoice.id}?tab=payment`),
+    onRecordPayment: (invoice) => router.push(`/${portSlug}/invoices/${invoice.id}?tab=payment`),
     onDelete: (invoice) => setDeleteTarget(invoice),
   });

@@ -141,6 +139,17 @@ export default function InvoicesPage() {
         onSortChange={setSort}
         isLoading={isFetching && !isLoading}
         getRowId={(row) => row.id}
+        cardRender={(row) => (
+          <InvoiceCard
+            invoice={row.original}
+            portSlug={portSlug}
+            onSend={(invoice) => sendMutation.mutate(invoice.id)}
+            onRecordPayment={(invoice) =>
+              router.push(`/${portSlug}/invoices/${invoice.id}?tab=payment`)
+            }
+            onDelete={setDeleteTarget}
+          />
+        )}
         emptyState={
           <EmptyState
             title="No invoices found"

@@ -161,15 +170,11 @@ export default function InvoicesPage() {
             <h3 className="font-semibold">Delete Invoice?</h3>
             <p className="text-sm text-muted-foreground">
               This will permanently delete invoice{' '}
-              <span className="font-mono font-medium">{deleteTarget.invoiceNumber}</span>.
-              This action cannot be undone.
+              <span className="font-mono font-medium">{deleteTarget.invoiceNumber}</span>. This
+              action cannot be undone.
             </p>
             <div className="flex items-center gap-2 justify-end">
-              <Button
-                variant="outline"
-                size="sm"
-                onClick={() => setDeleteTarget(null)}
-              >
+              <Button variant="outline" size="sm" onClick={() => setDeleteTarget(null)}>
                 Cancel
               </Button>
               <Button

@@ -12,6 +12,8 @@ import { PortProvider } from '@/providers/port-provider';
 import { PermissionsProvider } from '@/providers/permissions-provider';
 import { Sidebar } from '@/components/layout/sidebar';
 import { Topbar } from '@/components/layout/topbar';
+import { MobileLayout } from '@/components/layout/mobile/mobile-layout';
+import { RealtimeToasts } from '@/components/shared/realtime-toasts';

 export default async function DashboardLayout({ children }: { children: React.ReactNode }) {
   const session = await auth.api.getSession({ headers: await headers() });

@@ -37,7 +39,9 @@ export default async function DashboardLayout({ children }: { children: React.Re
     <PortProvider ports={ports} defaultPortId={ports[0]?.id ?? null}>
       <PermissionsProvider>
         <SocketProvider>
-          <div className="flex h-screen overflow-hidden bg-background">
+          <RealtimeToasts />
+          {/* Desktop shell — hidden by CSS on mobile */}
+          <div data-shell="desktop" className="flex h-screen overflow-hidden bg-background">
            <Sidebar
              portRoles={portRoles}
              isSuperAdmin={profile?.isSuperAdmin ?? false}

@@ -57,6 +61,9 @@ export default async function DashboardLayout({ children }: { children: React.Re
             <main className="flex-1 overflow-y-auto bg-background p-6">{children}</main>
           </div>
         </div>
+
+        {/* Mobile shell — hidden by CSS on desktop */}
+        <MobileLayout>{children}</MobileLayout>
       </SocketProvider>
     </PermissionsProvider>
   </PortProvider>

@@ -5,28 +5,19 @@ import type { Metadata } from 'next';
 import { getPortalSession } from '@/lib/portal/auth';
 import { getClientInterests } from '@/lib/services/portal.service';
 import { Badge } from '@/components/ui/badge';
+import { stageLabel, safeStage, type PipelineStage } from '@/lib/constants';

 export const metadata: Metadata = { title: 'Interests' };

-const STAGE_LABELS: Record<string, string> = {
-  open: 'Open',
-  details_sent: 'Details Sent',
-  in_communication: 'In Communication',
-  visited: 'Visited',
-  signed_eoi_nda: 'EOI / NDA Signed',
-  deposit_10pct: 'Deposit Received',
-  contract: 'Contract Stage',
-  completed: 'Completed',
-};
-
-const STAGE_COLORS: Record<string, 'default' | 'secondary' | 'destructive' | 'outline'> = {
+const STAGE_VARIANT: Record<PipelineStage, 'default' | 'secondary' | 'destructive' | 'outline'> = {
   open: 'secondary',
   details_sent: 'secondary',
   in_communication: 'default',
   visited: 'default',
   signed_eoi_nda: 'default',
+  eoi_sent: 'default',
+  eoi_signed: 'default',
   deposit_10pct: 'default',
   contract: 'default',
+  contract_sent: 'default',
+  contract_signed: 'default',
   completed: 'outline',
 };

@@ -40,9 +31,7 @@ export default async function PortalInterestsPage() {
     <div className="space-y-6">
       <div>
         <h1 className="text-2xl font-semibold text-gray-900">Berth Interests</h1>
-        <p className="text-sm text-gray-500 mt-1">
-          Your berth enquiries and applications
-        </p>
+        <p className="text-sm text-gray-500 mt-1">Your berth enquiries and applications</p>
       </div>

       {interests.length === 0 ? (

@@ -56,10 +45,7 @@ export default async function PortalInterestsPage() {
       ) : (
         <div className="space-y-3">
           {interests.map((interest) => (
-            <div
-              key={interest.id}
-              className="bg-white rounded-lg border p-5"
-            >
+            <div key={interest.id} className="bg-white rounded-lg border p-5">
               <div className="flex items-start justify-between gap-4">
                 <div className="flex-1 min-w-0">
                   <div className="flex items-center gap-2 mb-1">

@@ -98,8 +84,8 @@ export default async function PortalInterestsPage() {
                   )}
                 </div>
               </div>
-              <Badge variant={STAGE_COLORS[interest.pipelineStage] ?? 'default'}>
-                {STAGE_LABELS[interest.pipelineStage] ?? interest.pipelineStage}
+              <Badge variant={STAGE_VARIANT[safeStage(interest.pipelineStage)]}>
+                {stageLabel(interest.pipelineStage)}
               </Badge>
             </div>
           </div>

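The `safeStage` and `stageLabel` helpers this page switches to live in `@/lib/constants` and are not shown in this diff. A hedged sketch of what they plausibly look like, with the stage union mirroring the keys of the variant map above; the label strings for the new `eoi_*` and `contract_*` stages are assumptions:

```typescript
// Hypothetical reconstruction of the '@/lib/constants' helpers — NOT from the diff.
type PipelineStage =
  | 'open'
  | 'details_sent'
  | 'in_communication'
  | 'visited'
  | 'signed_eoi_nda'
  | 'eoi_sent'
  | 'eoi_signed'
  | 'deposit_10pct'
  | 'contract'
  | 'contract_sent'
  | 'contract_signed'
  | 'completed';

const LABELS: Record<PipelineStage, string> = {
  open: 'Open',
  details_sent: 'Details Sent',
  in_communication: 'In Communication',
  visited: 'Visited',
  signed_eoi_nda: 'EOI / NDA Signed',
  eoi_sent: 'EOI Sent', // assumed label
  eoi_signed: 'EOI Signed', // assumed label
  deposit_10pct: 'Deposit Received',
  contract: 'Contract Stage',
  contract_sent: 'Contract Sent', // assumed label
  contract_signed: 'Contract Signed', // assumed label
  completed: 'Completed',
};

// Clamp unknown DB values to a known stage so typed Record lookups never miss.
function safeStage(stage: string): PipelineStage {
  return (stage in LABELS ? stage : 'open') as PipelineStage;
}

function stageLabel(stage: string): string {
  return LABELS[safeStage(stage)];
}
```

The point of the indirection is that `STAGE_VARIANT[safeStage(...)]` can never return `undefined`, which is why the page drops the old `?? 'default'` fallbacks.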
50
src/app/(scanner)/[portSlug]/scan/layout.tsx
Normal file
@@ -0,0 +1,50 @@
+import { redirect } from 'next/navigation';
+import { headers } from 'next/headers';
+
+import { auth } from '@/lib/auth';
+import { db } from '@/lib/db';
+import { ports as portsTable } from '@/lib/db/schema/ports';
+import { QueryProvider } from '@/providers/query-provider';
+import { PortProvider } from '@/providers/port-provider';
+import { eq } from 'drizzle-orm';
+
+/**
+ * Minimal layout for the mobile receipt-scanner PWA. No sidebar, no
+ * topbar — the scanner is its own contained surface. Adds the PWA
+ * manifest link + theme color so iOS/Android pick up "Add to Home
+ * Screen". Auth check matches the dashboard layout so unauthorized
+ * users still bounce to /login.
+ */
+export default async function ScannerLayout({
+  children,
+  params,
+}: {
+  children: React.ReactNode;
+  params: Promise<{ portSlug: string }>;
+}) {
+  const session = await auth.api.getSession({ headers: await headers() });
+  if (!session?.user) redirect('/login');
+
+  const { portSlug } = await params;
+  const port = await db.query.ports.findFirst({
+    where: eq(portsTable.slug, portSlug),
+  });
+  if (!port) redirect('/login');
+
+  return (
+    <QueryProvider>
+      <PortProvider ports={port ? [port] : []} defaultPortId={port?.id ?? null}>
+        <head>
+          <link rel="manifest" href={`/${portSlug}/scan/manifest.webmanifest`} />
+          <meta name="theme-color" content="#3a7bc8" />
+          <meta name="mobile-web-app-capable" content="yes" />
+          <meta name="apple-mobile-web-app-capable" content="yes" />
+          <meta name="apple-mobile-web-app-status-bar-style" content="default" />
+          <meta name="apple-mobile-web-app-title" content="PN Scanner" />
+          <meta name="viewport" content="width=device-width, initial-scale=1, viewport-fit=cover" />
+        </head>
+        <div className="min-h-[100dvh] bg-background">{children}</div>
+      </PortProvider>
+    </QueryProvider>
+  );
+}

@@ -0,0 +1,45 @@
import { NextResponse } from 'next/server';
import { eq } from 'drizzle-orm';

import { db } from '@/lib/db';
import { ports } from '@/lib/db/schema/ports';

/**
 * Per-port PWA manifest. Scoped to `/<portSlug>/scan` so the install
 * only covers the scanner page, not the rest of the CRM. Each port
 * gets its own homescreen icon labeled with its name.
 */
export async function GET(_req: Request, { params }: { params: Promise<{ portSlug: string }> }) {
  const { portSlug } = await params;
  const port = await db.query.ports.findFirst({ where: eq(ports.slug, portSlug) });
  const portName = port?.name ?? 'Port Nimara';

  const manifest = {
    name: `${portName} — Scanner`,
    short_name: 'Scanner',
    description: `Capture and submit expense receipts for ${portName}.`,
    start_url: `/${portSlug}/scan`,
    scope: `/${portSlug}/scan`,
    display: 'standalone',
    orientation: 'portrait',
    background_color: '#ffffff',
    theme_color: '#3a7bc8',
    icons: [
      { src: '/icon-192.png', sizes: '192x192', type: 'image/png' },
      { src: '/icon-512.png', sizes: '512x512', type: 'image/png' },
      {
        src: '/icon-512-maskable.png',
        sizes: '512x512',
        type: 'image/png',
        purpose: 'maskable',
      },
    ],
  };

  return NextResponse.json(manifest, {
    headers: {
      'Content-Type': 'application/manifest+json',
      'Cache-Control': 'public, max-age=300, must-revalidate',
    },
  });
}
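The scope/start_url relationship the docblock describes can be checked in isolation. A hedged sketch, with an invented port slug and a hypothetical `buildScannerManifest` helper that only mirrors the shape the route returns:

```typescript
// Sketch only — mirrors the object the route above builds; the slug and
// name are made-up example values, not real tenants.
function buildScannerManifest(portSlug: string, portName: string) {
  return {
    name: `${portName} — Scanner`,
    short_name: 'Scanner',
    start_url: `/${portSlug}/scan`,
    scope: `/${portSlug}/scan`,
    display: 'standalone' as const,
  };
}

const m = buildScannerManifest('marina-alta', 'Marina Alta');
// Any URL outside scope (e.g. /marina-alta/clients) opens in the browser,
// not in the installed app, so the install never captures the rest of the CRM.
console.log(m.start_url.startsWith(m.scope)); // true
```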
11  src/app/(scanner)/[portSlug]/scan/page.tsx  Normal file
@@ -0,0 +1,11 @@
import type { Metadata } from 'next';

import { ScanShell } from '@/components/scan/scan-shell';

export const metadata: Metadata = {
  title: 'Scan receipt — Port Nimara',
};

export default function ScanPage() {
  return <ScanShell />;
}
@@ -1,68 +1,15 @@
 import { NextResponse } from 'next/server';
-import { db } from '@/lib/db';
-import { redis } from '@/lib/redis';
-import { minioClient } from '@/lib/minio';
-import { env } from '@/lib/env';
-import { sql } from 'drizzle-orm';
 
-type CheckStatus = 'ok' | 'error';
-
-interface HealthChecks {
-  postgres: CheckStatus;
-  redis: CheckStatus;
-  minio: CheckStatus;
-}
-
-interface HealthResponse {
-  status: 'healthy' | 'degraded';
-  checks: HealthChecks;
-  timestamp: string;
-}
-
-export async function GET(): Promise<NextResponse<HealthResponse>> {
-  const checks: HealthChecks = {
-    postgres: 'error',
-    redis: 'error',
-    minio: 'error',
-  };
-
-  await Promise.allSettled([
-    db
-      .execute(sql`SELECT 1`)
-      .then(() => {
-        checks.postgres = 'ok';
-      })
-      .catch(() => {
-        checks.postgres = 'error';
-      }),
-
-    redis
-      .ping()
-      .then(() => {
-        checks.redis = 'ok';
-      })
-      .catch(() => {
-        checks.redis = 'error';
-      }),
-
-    minioClient
-      .bucketExists(env.MINIO_BUCKET)
-      .then(() => {
-        checks.minio = 'ok';
-      })
-      .catch(() => {
-        checks.minio = 'error';
-      }),
-  ]);
-
-  const allHealthy = Object.values(checks).every((s) => s === 'ok');
-  const status: HealthResponse['status'] = allHealthy ? 'healthy' : 'degraded';
-
-  const body: HealthResponse = {
-    status,
-    checks,
-    timestamp: new Date().toISOString(),
-  };
-
-  return NextResponse.json(body, { status: allHealthy ? 200 : 503 });
+/**
+ * Liveness probe — confirms the Next.js process is responding.
+ *
+ * Returns 200 unconditionally; if the process is wedged or has crashed
+ * the request never lands here at all. Do NOT include database/Redis/MinIO
+ * checks in this endpoint — a transient downstream blip should drop the
+ * pod from the load balancer (readiness), not restart the pod (liveness).
+ *
+ * For deep dependency checks, hit `/api/ready` instead.
+ */
+export async function GET() {
+  return NextResponse.json({ status: 'ok', timestamp: new Date().toISOString() });
 }
@@ -12,31 +12,23 @@ import { yachts, yachtOwnershipHistory } from '@/lib/db/schema/yachts';
 import { companies, companyMemberships } from '@/lib/db/schema/companies';
 import { createAuditLog } from '@/lib/audit';
 import { errorResponse, RateLimitError } from '@/lib/errors';
+import { checkRateLimit, rateLimiters } from '@/lib/rate-limit';
 import { publicInterestSchema } from '@/lib/validators/interests';
 import { sendInquiryNotifications } from '@/lib/services/inquiry-notifications.service';
+import { parsePhone } from '@/lib/i18n/phone';
+import type { CountryCode } from '@/lib/i18n/countries';
 
-// ─── Simple in-memory rate limiter ───────────────────────────────────────────
-// Max 5 requests per hour per IP
-
-const ipHits = new Map<string, { count: number; resetAt: number }>();
-const WINDOW_MS = 60 * 60 * 1000; // 1 hour
-const MAX_HITS = 5;
-
-function checkRateLimit(ip: string): void {
-  const now = Date.now();
-  const entry = ipHits.get(ip);
-
-  if (!entry || now > entry.resetAt) {
-    ipHits.set(ip, { count: 1, resetAt: now + WINDOW_MS });
-    return;
-  }
-
-  if (entry.count >= MAX_HITS) {
-    const retryAfter = Math.ceil((entry.resetAt - now) / 1000);
-    throw new RateLimitError(retryAfter);
-  }
-
-  entry.count += 1;
-}
+/**
+ * Throws RateLimitError if the IP has exceeded the public-form quota.
+ * Backed by the Redis sliding-window limiter so the cap survives restarts
+ * and is shared across worker processes.
+ */
+async function gateRateLimit(ip: string): Promise<void> {
+  const result = await checkRateLimit(ip, rateLimiters.publicForm);
+  if (!result.allowed) {
+    const retryAfter = Math.max(1, Math.ceil((result.resetAt - Date.now()) / 1000));
+    throw new RateLimitError(retryAfter);
+  }
+}
 
 type PublicInterestData = z.infer<typeof publicInterestSchema>;
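The new docblock references a Redis sliding-window limiter in `@/lib/rate-limit`, whose implementation is not part of this diff. A hedged sketch of how such a window behaves, using an in-memory array where Redis would use a sorted set (`slidingWindowCheck` and `Store` are illustrative names, not the real module):

```typescript
// Sketch only — the real limiter presumably keeps one sorted set per IP in
// Redis. A request is allowed while fewer than `max` events fall inside the
// trailing window; old events age out as the window slides.
type Store = { events: number[] };

function slidingWindowCheck(
  store: Store,
  now: number,
  windowMs: number,
  max: number,
): { allowed: boolean; resetAt: number } {
  // Drop events older than the window (Redis would use ZREMRANGEBYSCORE).
  store.events = store.events.filter((t) => t > now - windowMs);
  if (store.events.length >= max) {
    // The quota frees up when the oldest in-window event expires.
    return { allowed: false, resetAt: store.events[0] + windowMs };
  }
  store.events.push(now);
  return { allowed: true, resetAt: now + windowMs };
}

const s: Store = { events: [] };
for (let i = 0; i < 5; i++) slidingWindowCheck(s, 1_000 + i, 60_000, 5);
console.log(slidingWindowCheck(s, 1_010, 60_000, 5).allowed); // false — 5 hits already in window
```

Unlike the deleted fixed-window `Map`, the count is continuous over the trailing hour rather than resetting in bursts, and storing it in Redis is what makes it survive restarts and apply across workers.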
@@ -50,7 +42,7 @@ type Tx = typeof db;
 export async function POST(req: NextRequest) {
   try {
     const ip = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';
-    checkRateLimit(ip);
+    await gateRateLimit(ip);
 
     const body = await req.json();
     const data = publicInterestSchema.parse(body);
@@ -61,6 +53,16 @@ export async function POST(req: NextRequest) {
       return NextResponse.json({ error: 'Port context required' }, { status: 400 });
     }
 
+    // Server-side phone normalization for older website builds that post raw
+    // international/national strings. Newer builds may pre-fill phoneE164/Country.
+    let phoneE164 = data.phoneE164 ?? null;
+    let phoneCountry: CountryCode | null = (data.phoneCountry as CountryCode | null) ?? null;
+    if (!phoneE164) {
+      const parsed = parsePhone(data.phone, phoneCountry ?? undefined);
+      phoneE164 = parsed.e164;
+      phoneCountry = parsed.country ?? phoneCountry;
+    }
+
     const fullName =
       data.firstName && data.lastName
         ? `${data.firstName} ${data.lastName}`
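The normalization fallback above delegates to `parsePhone`, which this diff does not show. A deliberately simplified stand-in illustrating the two cases the comment distinguishes (the dial-code table and `toE164` are invented for illustration, not the real `@/lib/i18n/phone` API):

```typescript
// Simplified sketch, NOT the real parsePhone: an international "+…" string
// needs no hint, while a national-format string needs a country to pick
// the dial code. Real libraries also validate length per region.
const DIAL_CODES: Record<string, string> = { ES: '34', GB: '44' }; // example subset

function toE164(raw: string, country?: string): string | null {
  const digits = raw.replace(/[\s().-]/g, '');
  if (digits.startsWith('+')) return digits;           // already international
  const code = country ? DIAL_CODES[country] : undefined;
  if (!code) return null;                              // national format, no hint
  return `+${code}${digits.replace(/^0/, '')}`;        // drop trunk zero
}

console.log(toE164('+34 612 34 56 78'));  // '+34612345678'
console.log(toE164('612 345 678', 'ES')); // '+34612345678'
console.log(toE164('612 345 678'));       // null — needs a country hint
```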
@@ -96,17 +98,21 @@ export async function POST(req: NextRequest) {
       });
       if (existingClient && existingClient.portId === portId) {
         clientId = existingClient.id;
+        const updates: Partial<typeof clients.$inferInsert> = {};
         if (data.preferredContactMethod) {
-          await tx
-            .update(clients)
-            .set({ preferredContactMethod: data.preferredContactMethod })
-            .where(eq(clients.id, clientId));
+          updates.preferredContactMethod = data.preferredContactMethod;
         }
+        if (data.nationalityIso && !existingClient.nationalityIso) {
+          updates.nationalityIso = data.nationalityIso;
+        }
+        if (Object.keys(updates).length > 0) {
+          await tx.update(clients).set(updates).where(eq(clients.id, clientId));
+        }
       } else {
-        clientId = await createClientInTx(tx, portId, fullName, data);
+        clientId = await createClientInTx(tx, portId, fullName, data, phoneE164, phoneCountry);
       }
     } else {
-      clientId = await createClientInTx(tx, portId, fullName, data);
+      clientId = await createClientInTx(tx, portId, fullName, data, phoneE164, phoneCountry);
     }
 
     // 2. Optional: upsert company + add membership
@@ -128,7 +134,8 @@ export async function POST(req: NextRequest) {
             name: data.company.name,
             legalName: data.company.legalName ?? null,
             taxId: data.company.taxId ?? null,
-            incorporationCountry: data.company.incorporationCountry ?? null,
+            incorporationCountryIso: data.company.incorporationCountryIso ?? null,
+            incorporationSubdivisionIso: data.company.incorporationSubdivisionIso ?? null,
             status: 'active',
           })
           .returning();
@@ -198,9 +205,9 @@ export async function POST(req: NextRequest) {
           label: 'Primary',
           streetAddress: data.address.street ?? null,
           city: data.address.city ?? null,
-          stateProvince: data.address.stateProvince ?? null,
+          subdivisionIso: data.address.subdivisionIso ?? null,
          postalCode: data.address.postalCode ?? null,
-          country: data.address.country ?? null,
+          countryIso: data.address.countryIso ?? null,
           isPrimary: true,
         });
       }
@@ -279,7 +286,9 @@ async function createClientInTx(
   tx: Tx,
   portId: string,
   fullName: string,
-  data: Pick<PublicInterestData, 'email' | 'phone' | 'preferredContactMethod'>,
+  data: Pick<PublicInterestData, 'email' | 'phone' | 'preferredContactMethod' | 'nationalityIso'>,
+  phoneE164: string | null,
+  phoneCountry: CountryCode | null,
 ): Promise<string> {
   const [newClient] = await tx
     .insert(clients)
@@ -287,6 +296,7 @@ async function createClientInTx(
       portId,
       fullName,
       preferredContactMethod: data.preferredContactMethod,
+      nationalityIso: data.nationalityIso ?? null,
       source: 'website',
     })
     .returning();
@@ -303,6 +313,8 @@ async function createClientInTx(
      clientId,
       channel: 'phone',
       value: data.phone,
+      valueE164: phoneE164,
+      valueCountry: phoneCountry,
       isPrimary: false,
     });

@@ -14,26 +14,23 @@ import {
 import { env } from '@/lib/env';
 import { errorResponse, RateLimitError, ValidationError } from '@/lib/errors';
 import { logger } from '@/lib/logger';
+import { checkRateLimit, rateLimiters } from '@/lib/rate-limit';
 import { publicResidentialInquirySchema } from '@/lib/validators/residential';
 import { emitToRoom } from '@/lib/socket/server';
+import { parsePhone } from '@/lib/i18n/phone';
+import type { CountryCode } from '@/lib/i18n/countries';
 
-// ─── Rate limiter (5 per hour per IP) ────────────────────────────────────────
-
-const ipHits = new Map<string, { count: number; resetAt: number }>();
-const WINDOW_MS = 60 * 60 * 1000;
-const MAX_HITS = 5;
-
-function checkRateLimit(ip: string): void {
-  const now = Date.now();
-  const entry = ipHits.get(ip);
-  if (!entry || now > entry.resetAt) {
-    ipHits.set(ip, { count: 1, resetAt: now + WINDOW_MS });
-    return;
-  }
-  if (entry.count >= MAX_HITS) {
-    throw new RateLimitError(Math.ceil((entry.resetAt - now) / 1000));
-  }
-  entry.count += 1;
-}
+/**
+ * Throws RateLimitError if the IP has exceeded the public-form quota.
+ * Backed by the Redis sliding-window limiter so the cap survives restarts
+ * and is shared across worker processes.
+ */
+async function gateRateLimit(ip: string): Promise<void> {
+  const result = await checkRateLimit(ip, rateLimiters.publicForm);
+  if (!result.allowed) {
+    const retryAfter = Math.max(1, Math.ceil((result.resetAt - Date.now()) / 1000));
+    throw new RateLimitError(retryAfter);
+  }
+}
 
 /**
@@ -47,7 +44,7 @@ function checkRateLimit(ip: string): void {
 export async function POST(req: NextRequest) {
   try {
     const ip = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'unknown';
-    checkRateLimit(ip);
+    await gateRateLimit(ip);
 
     const body = await req.json();
     const data = publicResidentialInquirySchema.parse(body);
@@ -61,6 +58,16 @@ export async function POST(req: NextRequest) {
       throw new ValidationError('Unknown port');
     }
 
+    // If the website didn't pre-normalize, parse server-side. International
+    // strings parse without a hint; national-format submissions need a country.
+    let phoneE164 = data.phoneE164 ?? null;
+    let phoneCountry: CountryCode | null = (data.phoneCountry as CountryCode | null) ?? null;
+    if (!phoneE164) {
+      const parsed = parsePhone(data.phone, phoneCountry ?? undefined);
+      phoneE164 = parsed.e164;
+      phoneCountry = parsed.country ?? phoneCountry;
+    }
+
     const result = await withTransaction(async (tx) => {
       const [client] = await tx
         .insert(residentialClients)
@@ -69,7 +76,13 @@ export async function POST(req: NextRequest) {
         fullName: `${data.firstName.trim()} ${data.lastName.trim()}`.trim(),
         email: data.email,
         phone: data.phone,
+        phoneE164,
+        phoneCountry,
+        nationalityIso: data.nationalityIso ?? null,
+        timezone: data.timezone ?? null,
         placeOfResidence: data.placeOfResidence,
+        placeOfResidenceCountryIso: data.placeOfResidenceCountryIso ?? null,
+        subdivisionIso: data.subdivisionIso ?? null,
         preferredContactMethod: data.preferredContactMethod,
         source: 'website',
         status: 'prospect',
82  src/app/api/ready/route.ts  Normal file
@@ -0,0 +1,82 @@
import { NextResponse } from 'next/server';
import { sql } from 'drizzle-orm';

import { db } from '@/lib/db';
import { redis } from '@/lib/redis';
import { minioClient } from '@/lib/minio';
import { env } from '@/lib/env';

type CheckStatus = 'ok' | 'error';

interface ReadyChecks {
  postgres: CheckStatus;
  redis: CheckStatus;
  minio: CheckStatus;
}

interface ReadyResponse {
  status: 'ready' | 'degraded';
  checks: ReadyChecks;
  timestamp: string;
}

/**
 * Readiness probe — verifies that every backing service this process
 * needs to serve traffic is reachable. A 503 should drop the pod from the
 * load balancer until the next probe succeeds; it should not trigger a
 * pod restart (that's what `/api/health` is for).
 *
 * Checks:
 * - postgres: `SELECT 1` against the primary
 * - redis: `PING`
 * - minio: `bucketExists(<configured-bucket>)`
 *
 * Documenso + SMTP are intentionally not probed here: they're optional
 * integrations, and each tenant configures its own credentials. A
 * tenant-misconfigured Documenso instance shouldn't deadline the entire
 * shared CRM.
 */
export async function GET(): Promise<NextResponse<ReadyResponse>> {
  const checks: ReadyChecks = {
    postgres: 'error',
    redis: 'error',
    minio: 'error',
  };

  await Promise.allSettled([
    db
      .execute(sql`SELECT 1`)
      .then(() => {
        checks.postgres = 'ok';
      })
      .catch(() => {
        checks.postgres = 'error';
      }),

    redis
      .ping()
      .then(() => {
        checks.redis = 'ok';
      })
      .catch(() => {
        checks.redis = 'error';
      }),

    minioClient
      .bucketExists(env.MINIO_BUCKET)
      .then(() => {
        checks.minio = 'ok';
      })
      .catch(() => {
        checks.minio = 'error';
      }),
  ]);

  const allReady = Object.values(checks).every((s) => s === 'ok');
  const status: ReadyResponse['status'] = allReady ? 'ready' : 'degraded';

  return NextResponse.json(
    { status, checks, timestamp: new Date().toISOString() },
    { status: allReady ? 200 : 503 },
  );
}
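The tail of the route folds three independent checks into one HTTP status: any failed check degrades the whole response to 503. That fold, restated standalone so it can be exercised on its own:

```typescript
// Same rule the route applies after Promise.allSettled resolves:
// every check must be 'ok' for the pod to count as ready.
type CheckStatus = 'ok' | 'error';

function readiness(checks: Record<string, CheckStatus>): { status: string; code: number } {
  const allReady = Object.values(checks).every((s) => s === 'ok');
  return { status: allReady ? 'ready' : 'degraded', code: allReady ? 200 : 503 };
}

console.log(readiness({ postgres: 'ok', redis: 'ok', minio: 'ok' }).code);    // 200
console.log(readiness({ postgres: 'ok', redis: 'error', minio: 'ok' }).code); // 503
```

Because the checks run under `Promise.allSettled`, one slow or failing dependency never rejects the whole probe; it just flips its own flag to `'error'`.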
46  src/app/api/v1/admin/ai-budget/route.ts  Normal file
@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import {
  getAiBudget,
  setAiBudget,
  currentPeriodTokens,
  periodBreakdown,
} from '@/lib/services/ai-budget.service';

const saveSchema = z.object({
  enabled: z.boolean().optional(),
  softCapTokens: z.number().int().nonnegative().max(100_000_000).optional(),
  hardCapTokens: z.number().int().nonnegative().max(100_000_000).optional(),
  period: z.enum(['day', 'week', 'month']).optional(),
});

export const GET = withAuth(
  withPermission('admin', 'manage_settings', async (req, ctx) => {
    try {
      const [budget, used, breakdown] = await Promise.all([
        getAiBudget(ctx.portId),
        currentPeriodTokens(ctx.portId),
        periodBreakdown(ctx.portId),
      ]);
      return NextResponse.json({ data: { budget, used, breakdown } });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const PUT = withAuth(
  withPermission('admin', 'manage_settings', async (req, ctx) => {
    try {
      const body = await parseBody(req, saveSchema);
      const next = await setAiBudget(ctx.portId, body, ctx.userId);
      return NextResponse.json({ data: next });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
25  src/app/api/v1/admin/alerts/run-engine/route.ts  Normal file
@@ -0,0 +1,25 @@
import { NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { runAlertEngineForPorts } from '@/lib/services/alert-engine';

/**
 * Admin trigger for an immediate alert engine sweep over the caller's port.
 * Useful for manual ops ("re-evaluate now after I fixed a rule") and
 * exercised by the realapi socket fanout test.
 *
 * Requires super_admin or per-port admin permissions; the engine itself
 * is idempotent — duplicate runs only re-evaluate, never duplicate rows.
 */
export const POST = withAuth(async (_req, ctx) => {
  try {
    if (!ctx.isSuperAdmin) {
      return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
    }
    const summary = await runAlertEngineForPorts([ctx.portId]);
    return NextResponse.json({ data: summary });
  } catch (error) {
    return errorResponse(error);
  }
});
@@ -1,29 +1,76 @@
 import { NextResponse } from 'next/server';
 import { z } from 'zod';
+import { inArray } from 'drizzle-orm';
 
 import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseQuery } from '@/lib/api/route-helpers';
-import { listAuditLogs } from '@/lib/services/audit.service';
+import { searchAuditLogs } from '@/lib/services/audit-search.service';
+import { db } from '@/lib/db';
+import { user } from '@/lib/db/schema/users';
 import { errorResponse } from '@/lib/errors';
 
 const auditQuerySchema = z.object({
   page: z.coerce.number().int().min(1).default(1),
-  limit: z.coerce.number().int().min(1).max(100).default(50),
+  limit: z.coerce.number().int().min(1).max(200).default(50),
   entityType: z.string().optional(),
   action: z.string().optional(),
   userId: z.string().optional(),
   entityId: z.string().optional(),
   dateFrom: z.string().optional(),
   dateTo: z.string().optional(),
+  /** Free-text query against the tsvector `search_text` column. */
+  search: z.string().optional(),
+  /** Cursor pair from the previous page's response. */
+  cursorAt: z.string().optional(),
+  cursorId: z.string().optional(),
 });
 
 export const GET = withAuth(
   withPermission('admin', 'view_audit_log', async (req, ctx) => {
     try {
       const query = parseQuery(req, auditQuerySchema);
-      const result = await listAuditLogs(ctx.portId, query);
-      return NextResponse.json(result);
+      const cursor =
+        query.cursorAt && query.cursorId
+          ? { createdAt: new Date(query.cursorAt), id: query.cursorId }
+          : undefined;
+      const { rows, nextCursor } = await searchAuditLogs({
+        portId: ctx.portId,
+        q: query.search,
+        userId: query.userId,
+        action: query.action,
+        entityType: query.entityType,
+        entityId: query.entityId,
+        from: query.dateFrom ? new Date(query.dateFrom) : undefined,
+        to: query.dateTo ? new Date(query.dateTo) : undefined,
+        cursor,
+        limit: query.limit,
+      });
+
+      // Resolve actor emails in one batched query so the table can show
+      // who did what without N+1 round trips.
+      const userIds = Array.from(
+        new Set(rows.map((r) => r.userId).filter((id): id is string => Boolean(id))),
+      );
+      const userRows = userIds.length
+        ? await db
+            .select({ id: user.id, email: user.email, name: user.name })
+            .from(user)
+            .where(inArray(user.id, userIds))
+        : [];
+      const userMap = new Map(userRows.map((u) => [u.id, u]));
+
+      const data = rows.map((r) => ({
+        ...r,
+        actor: r.userId ? (userMap.get(r.userId) ?? null) : null,
+      }));
+
+      return NextResponse.json({
+        data,
+        pagination: {
+          nextCursor: nextCursor
+            ? { createdAt: nextCursor.createdAt.toISOString(), id: nextCursor.id }
+            : null,
+        },
+      });
     } catch (error) {
       return errorResponse(error);
     }
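The `(createdAt, id)` cursor threaded through this route implements keyset pagination: the compound comparison stays stable even when many rows share a timestamp, which a plain `OFFSET` would not. A hedged sketch of the comparison rule over an in-memory array (the real SQL lives in `audit-search.service` and is not shown in this diff):

```typescript
// Sketch of the keyset rule: rows sort newest-first on (createdAt, id);
// the next page starts strictly after the cursor row under that ordering.
interface Row { createdAt: number; id: string }

function pageAfter(rows: Row[], cursor: Row | undefined, limit: number): Row[] {
  const sorted = [...rows].sort(
    (a, b) => b.createdAt - a.createdAt || b.id.localeCompare(a.id),
  );
  const start = cursor
    ? sorted.findIndex(
        (r) =>
          r.createdAt < cursor.createdAt ||
          (r.createdAt === cursor.createdAt && r.id < cursor.id),
      )
    : 0;
  return start === -1 ? [] : sorted.slice(start, start + limit);
}

const rows: Row[] = [
  { createdAt: 3, id: 'c' },
  { createdAt: 3, id: 'b' },
  { createdAt: 2, id: 'a' },
  { createdAt: 1, id: 'z' },
];
const page1 = pageAfter(rows, undefined, 2);               // c, b (tie broken by id)
const page2 = pageAfter(rows, page1[page1.length - 1], 2); // a, z — no overlap, no gap
console.log(page2.map((r) => r.id).join(',')); // a,z
```

This is also why the response echoes `nextCursor` as a `{ createdAt, id }` pair rather than a page number.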
@@ -1,12 +1,18 @@
 import { NextResponse } from 'next/server';
 
 import { withAuth, withPermission } from '@/lib/api/helpers';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
 import { resendCrmInvite } from '@/lib/services/crm-invite.service';
 
+// Resend mints a fresh token + new email on a global invite row;
+// restrict to super-admins to match revoke/list and avoid cross-tenant
+// re-issuance of foreign-port invitations.
 export const POST = withAuth(
   withPermission('admin', 'manage_users', async (_req, ctx, params) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Resending CRM invites requires super-admin');
+      }
       const id = params.id ?? '';
       const result = await resendCrmInvite(id, {
         userId: ctx.userId,
@@ -1,12 +1,18 @@
 import { NextResponse } from 'next/server';
 
 import { withAuth, withPermission } from '@/lib/api/helpers';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
 import { revokeCrmInvite } from '@/lib/services/crm-invite.service';
 
+// Invites are a global resource (no portId column). Revoking a foreign
+// tenant's pending invite by id would be cross-tenant tampering;
+// restrict to super-admins to match the listing endpoint.
 export const DELETE = withAuth(
   withPermission('admin', 'manage_users', async (_req, ctx, params) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Revoking CRM invites requires super-admin');
+      }
      const id = params.id ?? '';
       await revokeCrmInvite(id, {
         userId: ctx.userId,
@@ -3,12 +3,20 @@ import { z } from 'zod';
 
 import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
 import { createCrmInvite, listCrmInvites } from '@/lib/services/crm-invite.service';
 
 export const GET = withAuth(
-  withPermission('admin', 'manage_users', async (_req, _ctx) => {
+  withPermission('admin', 'manage_users', async (_req, ctx) => {
     try {
+      // crm_user_invites is a global table (no per-port column) — invites
+      // mint better-auth users that may later be assigned roles in any
+      // port. Listing it cross-tenant would let a port-A director
+      // enumerate pending invitee emails, names, and isSuperAdmin flags
+      // for every other tenant. Restrict the listing to super-admins.
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Listing CRM invites requires super-admin');
+      }
       const data = await listCrmInvites();
       return NextResponse.json({ data });
     } catch (error) {
@@ -24,10 +32,17 @@ const createInviteSchema = z.object({
 });
 
 export const POST = withAuth(
-  withPermission('admin', 'manage_users', async (req, _ctx) => {
+  withPermission('admin', 'manage_users', async (req, ctx) => {
     try {
       const body = await parseBody(req, createInviteSchema);
-      const result = await createCrmInvite(body);
+      // Only existing super-admins can mint super-admin invitations. The
+      // manage_users permission is granted to port-scoped director roles,
+      // which must not be able to elevate themselves cross-tenant by
+      // inviting a fresh super_admin.
+      if (body.isSuperAdmin && !ctx.isSuperAdmin) {
+        throw new ForbiddenError('Only super admins can mint super-admin invitations');
+      }
+      const result = await createCrmInvite({ ...body, invitedBy: ctx });
       return NextResponse.json({ data: result }, { status: 201 });
     } catch (error) {
       return errorResponse(error);
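The escalation guard added to POST can be restated as a pure predicate (`canCreateInvite` is an illustrative name, not part of the codebase): a port-scoped admin holding `manage_users` may invite normal users but never mint a super-admin invite.

```typescript
// Same boolean the route enforces before calling createCrmInvite:
// a super-admin flag on the body requires a super-admin caller.
function canCreateInvite(
  body: { isSuperAdmin?: boolean },
  ctx: { isSuperAdmin: boolean },
): boolean {
  return !(body.isSuperAdmin && !ctx.isSuperAdmin);
}

console.log(canCreateInvite({ isSuperAdmin: true }, { isSuperAdmin: false })); // false — blocked
console.log(canCreateInvite({}, { isSuperAdmin: false }));                     // true — plain invite ok
```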
72  src/app/api/v1/admin/ocr-settings/route.ts  Normal file
@@ -0,0 +1,72 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { getPublicOcrConfig, saveOcrConfig, OCR_MODELS } from '@/lib/services/ocr-config.service';

const saveSchema = z.object({
  /** When 'global', requires super_admin and stores at port_id=null. */
  scope: z.enum(['port', 'global']),
  provider: z.enum(['openai', 'claude']),
  model: z.string().min(1),
  apiKey: z.string().optional(),
  clearApiKey: z.boolean().optional(),
  useGlobal: z.boolean().optional(),
  aiEnabled: z.boolean().optional(),
});

// Only role tiers that hold `admin.manage_settings` (director / super_admin)
// may read or write the OCR config: the apiKey is stored encrypted but is
// passed straight into the receipt-scan handler, so a swapped key would
// exfiltrate every subsequent receipt image to whatever endpoint that key
// authenticates with.
export const GET = withAuth(
  withPermission('admin', 'manage_settings', async (req, ctx) => {
    try {
      const url = new URL(req.url);
      const scope = url.searchParams.get('scope') ?? 'port';
      if (scope === 'global' && !ctx.isSuperAdmin) {
        return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
      }
      const config = await getPublicOcrConfig(scope === 'global' ? null : ctx.portId);
      return NextResponse.json({ data: config, models: OCR_MODELS });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const PUT = withAuth(
  withPermission('admin', 'manage_settings', async (req, ctx) => {
    try {
      const body = await parseBody(req, saveSchema);
      if (body.scope === 'global' && !ctx.isSuperAdmin) {
        return NextResponse.json({ error: 'Super admin only' }, { status: 403 });
      }
      const validModels = OCR_MODELS[body.provider];
      if (!validModels.includes(body.model)) {
        return NextResponse.json(
          { error: `Invalid model for provider ${body.provider}` },
          { status: 400 },
        );
      }
      await saveOcrConfig(
        body.scope === 'global' ? null : ctx.portId,
        {
          provider: body.provider,
          model: body.model,
          apiKey: body.apiKey,
          clearApiKey: body.clearApiKey,
          useGlobal: body.useGlobal,
          aiEnabled: body.aiEnabled,
        },
        ctx.userId,
      );
      return NextResponse.json({ ok: true });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
31  src/app/api/v1/admin/ocr-settings/test/route.ts  Normal file
@@ -0,0 +1,31 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { OCR_MODELS } from '@/lib/services/ocr-config.service';
import { testProvider } from '@/lib/services/ocr-providers';

const schema = z.object({
  provider: z.enum(['openai', 'claude']),
  model: z.string().min(1),
  apiKey: z.string().min(1),
});

// `manage_settings`-gated for parity with the parent OCR settings route —
// triggers outbound AI provider auth requests using a caller-supplied key.
export const POST = withAuth(
  withPermission('admin', 'manage_settings', async (req) => {
    try {
      const body = await parseBody(req, schema);
      if (!OCR_MODELS[body.provider].includes(body.model)) {
        return NextResponse.json({ error: 'Invalid model' }, { status: 400 });
      }
      const result = await testProvider(body.provider, body.apiKey, body.model);
      return NextResponse.json(result);
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
@@ -4,11 +4,25 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { getPort, updatePort } from '@/lib/services/ports.service';
 import { updatePortSchema } from '@/lib/validators/ports';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
 
+/**
+ * Non-super-admin callers (e.g. port directors holding admin.manage_settings)
+ * may only read/mutate THEIR OWN port row. The path id is therefore
+ * compared against ctx.portId and a foreign target is rejected before the
+ * service is touched. Super-admins retain unrestricted access.
+ */
+function assertPortInScope(targetPortId: string, ctx: { portId: string; isSuperAdmin: boolean }) {
+  if (ctx.isSuperAdmin) return;
+  if (targetPortId !== ctx.portId) {
+    throw new ForbiddenError('Cross-tenant port access denied');
+  }
+}
+
 export const GET = withAuth(
-  withPermission('admin', 'manage_settings', async (_req, _ctx, params) => {
+  withPermission('admin', 'manage_settings', async (_req, ctx, params) => {
     try {
+      assertPortInScope(params.id!, ctx);
       const data = await getPort(params.id!);
       return NextResponse.json({ data });
     } catch (error) {
@@ -20,6 +34,7 @@ export const GET = withAuth(
 export const PATCH = withAuth(
   withPermission('admin', 'manage_settings', async (req, ctx, params) => {
     try {
+      assertPortInScope(params.id!, ctx);
       const body = await parseBody(req, updatePortSchema);
       const data = await updatePort(params.id!, body, {
         userId: ctx.userId,
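The tenant-scope guard added in this hunk can be exercised in isolation. A sketch with `ForbiddenError` stubbed as a plain `Error` subclass (the real class lives in `@/lib/errors`):

```typescript
// Stand-in for the project's ForbiddenError; only the throw/no-throw
// behavior matters for this sketch.
class ForbiddenError extends Error {}

// Super-admins bypass the check; everyone else must target their own port.
function assertPortInScope(
  targetPortId: string,
  ctx: { portId: string; isSuperAdmin: boolean },
): void {
  if (ctx.isSuperAdmin) return;
  if (targetPortId !== ctx.portId) {
    throw new ForbiddenError('Cross-tenant port access denied');
  }
}

assertPortInScope('p1', { portId: 'p1', isSuperAdmin: false }); // own port: ok
assertPortInScope('p2', { portId: 'p1', isSuperAdmin: true }); // super-admin: ok
// assertPortInScope('p2', { portId: 'p1', isSuperAdmin: false }) would throw.
```

Throwing rather than returning a response lets the shared `errorResponse(error)` catch block translate the rejection into a 403 consistently across routes.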
@@ -4,11 +4,18 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { listPorts, createPort } from '@/lib/services/ports.service';
 import { createPortSchema } from '@/lib/validators/ports';
-import { errorResponse } from '@/lib/errors';
+import { errorResponse, ForbiddenError } from '@/lib/errors';
 
+// Listing every tenant and creating new tenants are super-admin operations:
+// a port director must not be able to enumerate other ports (target
+// discovery for cross-tenant attacks) or spin up new tenants whose admin
+// they implicitly become.
 export const GET = withAuth(
-  withPermission('admin', 'manage_settings', async () => {
+  withPermission('admin', 'manage_settings', async (_req, ctx) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Listing all ports requires super-admin');
+      }
       const data = await listPorts();
       return NextResponse.json({ data });
     } catch (error) {
@@ -20,6 +27,9 @@ export const GET = withAuth(
 export const POST = withAuth(
   withPermission('admin', 'manage_settings', async (req, ctx) => {
     try {
+      if (!ctx.isSuperAdmin) {
+        throw new ForbiddenError('Creating ports requires super-admin');
+      }
       const body = await parseBody(req, createPortSchema);
       const data = await createPort(body, {
         userId: ctx.userId,
@@ -4,14 +4,17 @@ import { withAuth } from '@/lib/api/helpers';
 import { getEmailDraftResult } from '@/lib/services/email-draft.service';
 import { errorResponse } from '@/lib/errors';
 
-export const GET = withAuth(async (_req, _ctx, params) => {
+export const GET = withAuth(async (_req, ctx, params) => {
   try {
     const { jobId } = params;
     if (!jobId) {
       return NextResponse.json({ error: 'jobId is required' }, { status: 400 });
     }
 
-    const result = await getEmailDraftResult(jobId);
+    const result = await getEmailDraftResult(jobId, {
+      userId: ctx.userId,
+      portId: ctx.portId,
+    });
 
     if (result === null) {
       return NextResponse.json({ status: 'processing' });
11 src/app/api/v1/alerts/[id]/acknowledge/route.ts Normal file
@@ -0,0 +1,11 @@
import { NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { acknowledgeAlert } from '@/lib/services/alerts.service';

export const POST = withAuth(async (_req, ctx, params) => {
  const id = params.id;
  if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
  await acknowledgeAlert(id, ctx.portId, ctx.userId);
  return NextResponse.json({ ok: true });
});
11 src/app/api/v1/alerts/[id]/dismiss/route.ts Normal file
@@ -0,0 +1,11 @@
import { NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { dismissAlert } from '@/lib/services/alerts.service';

export const POST = withAuth(async (_req, ctx, params) => {
  const id = params.id;
  if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
  await dismissAlert(id, ctx.portId, ctx.userId);
  return NextResponse.json({ ok: true });
});
24 src/app/api/v1/alerts/count/route.ts Normal file
@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';
import { and, eq, isNull, sql } from 'drizzle-orm';

import { withAuth } from '@/lib/api/helpers';
import { db } from '@/lib/db';
import { alerts } from '@/lib/db/schema/insights';

export const GET = withAuth(async (_req, ctx) => {
  const rows = await db
    .select({ severity: alerts.severity, count: sql<number>`count(*)::int` })
    .from(alerts)
    .where(
      and(eq(alerts.portId, ctx.portId), isNull(alerts.resolvedAt), isNull(alerts.dismissedAt)),
    )
    .groupBy(alerts.severity);

  const bySeverity = { info: 0, warning: 0, critical: 0 } as Record<string, number>;
  let total = 0;
  for (const r of rows) {
    bySeverity[r.severity] = r.count;
    total += r.count;
  }
  return NextResponse.json({ total, bySeverity });
});
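The count route folds the grouped rows into a fixed severity map so severities with no open alerts still serialize as 0. The aggregation step on its own, with the database rows replaced by a plain array:

```typescript
type Row = { severity: string; count: number };

// Fold grouped count rows into { total, bySeverity }, pre-seeding the
// known severities so the response shape is stable for clients.
function summarize(rows: Row[]): { total: number; bySeverity: Record<string, number> } {
  const bySeverity: Record<string, number> = { info: 0, warning: 0, critical: 0 };
  let total = 0;
  for (const r of rows) {
    bySeverity[r.severity] = r.count;
    total += r.count;
  }
  return { total, bySeverity };
}

const summary = summarize([{ severity: 'critical', count: 2 }]);
console.log(summary.total); // 2
console.log(summary.bySeverity.info); // 0
```

Pre-seeding the map avoids `undefined` keys in the JSON payload when a severity has no matching rows.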
26 src/app/api/v1/alerts/route.ts Normal file
@@ -0,0 +1,26 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { listAlertsForPort } from '@/lib/services/alerts.service';

type AlertStatus = 'open' | 'dismissed' | 'resolved';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const url = new URL(req.url);
  const status = (url.searchParams.get('status') ?? 'open') as AlertStatus;

  const rows = await listAlertsForPort(ctx.portId, {
    includeDismissed: status !== 'open',
    includeResolved: status !== 'open',
  });

  // Filter to the requested status bucket so callers don't see overlap.
  const filtered = rows.filter((a) => {
    if (status === 'open') return !a.dismissedAt && !a.resolvedAt;
    if (status === 'dismissed') return Boolean(a.dismissedAt) && !a.resolvedAt;
    if (status === 'resolved') return Boolean(a.resolvedAt);
    return true;
  });

  return NextResponse.json({ data: filtered });
});
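The status buckets above are mutually exclusive: an alert carrying both timestamps reports as resolved, never dismissed. The predicate extracted as a standalone sketch:

```typescript
type AlertRow = { dismissedAt: Date | null; resolvedAt: Date | null };
type AlertStatus = 'open' | 'dismissed' | 'resolved';

// Same bucket logic as the route's filter callback: 'resolved' wins over
// 'dismissed' when both timestamps are set.
function matchesStatus(a: AlertRow, status: AlertStatus): boolean {
  if (status === 'open') return !a.dismissedAt && !a.resolvedAt;
  if (status === 'dismissed') return Boolean(a.dismissedAt) && !a.resolvedAt;
  return Boolean(a.resolvedAt); // status === 'resolved'
}

const both = { dismissedAt: new Date(), resolvedAt: new Date() };
console.log(matchesStatus(both, 'dismissed')); // false: resolved wins
console.log(matchesStatus(both, 'resolved')); // true
```

Because each row matches exactly one bucket, the three status queries partition the port's alerts with no double counting.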
37 src/app/api/v1/analytics/route.ts Normal file
@@ -0,0 +1,37 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth, withPermission } from '@/lib/api/helpers';
import {
  ALL_RANGES,
  getLeadSourceAttribution,
  getOccupancyTimeline,
  getPipelineFunnel,
  getRevenueBreakdown,
  type DateRange,
  type MetricBase,
} from '@/lib/services/analytics.service';

const METRICS: Record<MetricBase, (portId: string, range: DateRange) => Promise<unknown>> = {
  pipeline_funnel: getPipelineFunnel,
  occupancy_timeline: getOccupancyTimeline,
  revenue_breakdown: getRevenueBreakdown,
  lead_source_attribution: getLeadSourceAttribution,
};

export const GET = withAuth(
  withPermission('reports', 'view_analytics', async (req: NextRequest, ctx) => {
    const url = new URL(req.url);
    const metric = url.searchParams.get('metric') as MetricBase | null;
    const range = (url.searchParams.get('range') ?? '30d') as DateRange;

    if (!metric || !(metric in METRICS)) {
      return NextResponse.json({ error: 'Invalid or missing metric' }, { status: 400 });
    }
    if (!ALL_RANGES.includes(range)) {
      return NextResponse.json({ error: 'Invalid range' }, { status: 400 });
    }

    const data = await METRICS[metric](ctx.portId, range);
    return NextResponse.json({ metric, range, data });
  }),
);
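Dispatching through a `Record` keyed by metric name lets the route validate the query parameter with a single `in` check instead of a switch. A reduced sketch with the metric handlers stubbed (the real ones query the database):

```typescript
type Range = '7d' | '30d' | '90d';

// Stub handlers standing in for the analytics service functions.
const METRICS: Record<string, (portId: string, range: Range) => Promise<number>> = {
  pipeline_funnel: async () => 1,
  occupancy_timeline: async () => 2,
};

// Unknown metric names short-circuit to null (the route returns 400 there).
async function run(metric: string, range: Range): Promise<number | null> {
  if (!(metric in METRICS)) return null;
  return METRICS[metric]('port-1', range);
}

run('pipeline_funnel', '30d').then((v) => console.log(v)); // 1
run('unknown_metric', '30d').then((v) => console.log(v)); // null
```

Adding a metric then only means adding one map entry; the validation and dispatch code stays untouched.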
107 src/app/api/v1/berth-reservations/[id]/handlers.ts Normal file
@@ -0,0 +1,107 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { requirePermission } from '@/lib/auth/permissions';
import { errorResponse } from '@/lib/errors';
import {
  activate,
  cancel,
  endReservation,
  getById,
} from '@/lib/services/berth-reservations.service';

// ─── PATCH body schema (action-based discriminated union) ────────────────────

const patchBodySchema = z.discriminatedUnion('action', [
  z.object({
    action: z.literal('activate'),
    contractFileId: z.string().optional(),
    effectiveDate: z.coerce.date().optional(),
  }),
  z.object({
    action: z.literal('end'),
    endDate: z.coerce.date(),
    notes: z.string().optional(),
  }),
  z.object({
    action: z.literal('cancel'),
    reason: z.string().optional(),
  }),
]);

// ─── Handlers ────────────────────────────────────────────────────────────────

export const getHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const reservation = await getById(params.id!, ctx.portId);
    return NextResponse.json({ data: reservation });
  } catch (error) {
    return errorResponse(error);
  }
};

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, patchBodySchema);
    const meta = {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    };

    if (body.action === 'activate') {
      requirePermission(ctx, 'reservations', 'activate');
      const result = await activate(
        params.id!,
        ctx.portId,
        {
          contractFileId: body.contractFileId,
          effectiveDate: body.effectiveDate,
        },
        meta,
      );
      return NextResponse.json({ data: result });
    }

    if (body.action === 'end') {
      // `end` is lifecycle progression; same privilege as activate.
      requirePermission(ctx, 'reservations', 'activate');
      const result = await endReservation(
        params.id!,
        ctx.portId,
        { endDate: body.endDate, notes: body.notes },
        meta,
      );
      return NextResponse.json({ data: result });
    }

    // action === 'cancel'
    requirePermission(ctx, 'reservations', 'cancel');
    const result = await cancel(params.id!, ctx.portId, { reason: body.reason }, meta);
    return NextResponse.json({ data: result });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    await cancel(
      params.id!,
      ctx.portId,
      {},
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};
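The PATCH handler's action dispatch relies on the discriminant field to narrow the body type, which is why `body.endDate` is only reachable on the `end` branch. The same narrowing works with a plain TypeScript union, sketched here without zod:

```typescript
// Plain-union analogue of patchBodySchema's discriminated union.
type PatchBody =
  | { action: 'activate'; contractFileId?: string }
  | { action: 'end'; endDate: string; notes?: string }
  | { action: 'cancel'; reason?: string };

// After each `action` check TypeScript narrows `body` to the matching
// member, so branch-specific fields type-check without casts.
function describe(body: PatchBody): string {
  if (body.action === 'activate') return 'activate';
  if (body.action === 'end') return `end at ${body.endDate}`;
  return `cancel (${body.reason ?? 'no reason'})`;
}

console.log(describe({ action: 'end', endDate: '2025-01-01' })); // "end at 2025-01-01"
```

zod's `discriminatedUnion` gives the same branch-level typing at runtime: a body failing every branch is rejected in `parseBody` before any permission check runs.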
@@ -1,110 +1,6 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';

import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { requirePermission } from '@/lib/auth/permissions';
import { errorResponse } from '@/lib/errors';
import {
  activate,
  cancel,
  endReservation,
  getById,
} from '@/lib/services/berth-reservations.service';

// ─── PATCH body schema (action-based discriminated union) ────────────────────

const patchBodySchema = z.discriminatedUnion('action', [
  z.object({
    action: z.literal('activate'),
    contractFileId: z.string().optional(),
    effectiveDate: z.coerce.date().optional(),
  }),
  z.object({
    action: z.literal('end'),
    endDate: z.coerce.date(),
    notes: z.string().optional(),
  }),
  z.object({
    action: z.literal('cancel'),
    reason: z.string().optional(),
  }),
]);

// ─── Handlers ────────────────────────────────────────────────────────────────

export const getHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const reservation = await getById(params.id!, ctx.portId);
    return NextResponse.json({ data: reservation });
  } catch (error) {
    return errorResponse(error);
  }
};

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, patchBodySchema);
    const meta = {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    };

    if (body.action === 'activate') {
      requirePermission(ctx, 'reservations', 'activate');
      const result = await activate(
        params.id!,
        ctx.portId,
        {
          contractFileId: body.contractFileId,
          effectiveDate: body.effectiveDate,
        },
        meta,
      );
      return NextResponse.json({ data: result });
    }

    if (body.action === 'end') {
      // `end` is lifecycle progression; same privilege as activate.
      requirePermission(ctx, 'reservations', 'activate');
      const result = await endReservation(
        params.id!,
        ctx.portId,
        { endDate: body.endDate, notes: body.notes },
        meta,
      );
      return NextResponse.json({ data: result });
    }

    // action === 'cancel'
    requirePermission(ctx, 'reservations', 'cancel');
    const result = await cancel(params.id!, ctx.portId, { reason: body.reason }, meta);
    return NextResponse.json({ data: result });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    await cancel(
      params.id!,
      ctx.portId,
      {},
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};
import { getHandler, patchHandler, deleteHandler } from './handlers';

export const GET = withAuth(withPermission('reservations', 'view', getHandler));
// PATCH cannot use `withPermission` wrapper — the required permission depends
35 src/app/api/v1/berth-reservations/handlers.ts Normal file
@@ -0,0 +1,35 @@
import { NextResponse } from 'next/server';

import type { AuthContext } from '@/lib/api/helpers';
import { parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listReservations } from '@/lib/services/berth-reservations.service';
import { listReservationsSchema } from '@/lib/validators/reservations';

/**
 * Port-scoped global list of reservations across all berths. Inner handler
 * lives here so it can be invoked directly from integration tests without
 * the `withAuth(withPermission(...))` wrappers (matches the convention
 * used throughout `src/app/api/v1/*`).
 */
export async function listHandler(req: Request, ctx: AuthContext): Promise<NextResponse> {
  try {
    const query = parseQuery(req as never, listReservationsSchema);
    const result = await listReservations(ctx.portId, query);
    const { page, limit } = query;
    const totalPages = Math.ceil(result.total / limit);
    return NextResponse.json({
      data: result.data,
      pagination: {
        page,
        pageSize: limit,
        total: result.total,
        totalPages,
        hasNextPage: page < totalPages,
        hasPreviousPage: page > 1,
      },
    });
  } catch (error) {
    return errorResponse(error);
  }
}
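`listHandler` derives the whole pagination envelope from `page`, `limit`, and `total`. The arithmetic on its own:

```typescript
// Build the pagination envelope returned alongside `data`.
function paginate(page: number, limit: number, total: number) {
  const totalPages = Math.ceil(total / limit);
  return {
    page,
    pageSize: limit,
    total,
    totalPages,
    hasNextPage: page < totalPages,
    hasPreviousPage: page > 1,
  };
}

console.log(paginate(2, 20, 45));
// { page: 2, pageSize: 20, total: 45, totalPages: 3,
//   hasNextPage: true, hasPreviousPage: true }
```

`Math.ceil` makes a partial final page count as a page, so `hasNextPage` stays true until the last partial page is reached.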
4 src/app/api/v1/berth-reservations/route.ts Normal file
@@ -0,0 +1,4 @@
import { withAuth, withPermission } from '@/lib/api/helpers';
import { listHandler } from './handlers';

export const GET = withAuth(withPermission('reservations', 'view', listHandler));
@@ -8,7 +8,7 @@ import { reorderWaitingListSchema } from '@/lib/validators/interests';
 import { getWaitingList, updateWaitingList } from '@/lib/services/berths.service';
 import { errorResponse, NotFoundError } from '@/lib/errors';
 import { db } from '@/lib/db';
-import { berthWaitingList } from '@/lib/db/schema/berths';
+import { berths, berthWaitingList } from '@/lib/db/schema/berths';
 
 // GET /api/v1/berths/[id]/waiting-list
 export const GET = withAuth(
@@ -47,11 +47,17 @@ export const PATCH = withAuth(
       const body = await parseBody(req, reorderWaitingListSchema);
       const berthId = params.id!;
 
+      // Tenant scope: refuse to reorder a foreign-port berth's waiting
+      // list. The route's URL id and the entry id are otherwise enough
+      // for any user with manage_waiting_list to mutate any tenant's
+      // queue ordering.
+      const berthRow = await db.query.berths.findFirst({
+        where: and(eq(berths.id, berthId), eq(berths.portId, ctx.portId)),
+      });
+      if (!berthRow) throw new NotFoundError('Berth');
+
       const entry = await db.query.berthWaitingList.findFirst({
-        where: and(
-          eq(berthWaitingList.id, body.entryId),
-          eq(berthWaitingList.berthId, berthId),
-        ),
+        where: and(eq(berthWaitingList.id, body.entryId), eq(berthWaitingList.berthId, berthId)),
       });
       if (!entry) throw new NotFoundError('Waiting list entry');
 
@@ -1,15 +1,17 @@
 import { NextResponse } from 'next/server';
 
-import { withAuth } from '@/lib/api/helpers';
+import { withAuth, withPermission } from '@/lib/api/helpers';
 import { getBerthOptions } from '@/lib/services/berths.service';
 import { errorResponse } from '@/lib/errors';
 
 // GET /api/v1/berths/options — lightweight list for selects/comboboxes
-export const GET = withAuth(async (req, ctx) => {
-  try {
-    const options = await getBerthOptions(ctx.portId);
-    return NextResponse.json({ data: options });
-  } catch (error) {
-    return errorResponse(error);
-  }
-});
+export const GET = withAuth(
+  withPermission('berths', 'view', async (req, ctx) => {
+    try {
+      const options = await getBerthOptions(ctx.portId);
+      return NextResponse.json({ data: options });
+    } catch (error) {
+      return errorResponse(error);
+    }
+  }),
+);
51 src/app/api/v1/clients/[id]/addresses/[addressId]/route.ts Normal file
@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { updateClientAddress, removeClientAddress } from '@/lib/services/clients.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const updateAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const PATCH = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, updateAddressSchema);
      const row = await updateClientAddress(params.addressId!, params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      await removeClientAddress(params.addressId!, params.id!, ctx.portId, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
46 src/app/api/v1/clients/[id]/addresses/route.ts Normal file
@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listClientAddresses, addClientAddress } from '@/lib/services/clients.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const addAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx, params) => {
    try {
      const rows = await listClientAddresses(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('clients', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, addAddressSchema);
      const row = await addClientAddress(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
@@ -5,10 +5,13 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { errorResponse } from '@/lib/errors';
 import { updateContact, removeContact } from '@/lib/services/clients.service';
+import { optionalCountryIsoSchema, optionalPhoneE164Schema } from '@/lib/validators/i18n';
 
 const updateContactSchema = z.object({
   channel: z.enum(['email', 'phone', 'whatsapp', 'other']).optional(),
   value: z.string().min(1).optional(),
+  valueE164: optionalPhoneE164Schema.optional(),
+  valueCountry: optionalCountryIsoSchema.optional(),
   label: z.string().optional(),
   isPrimary: z.boolean().optional(),
   notes: z.string().optional(),
@@ -18,18 +21,12 @@ export const PATCH = withAuth(
   withPermission('clients', 'edit', async (req, ctx, params) => {
     try {
       const body = await parseBody(req, updateContactSchema);
-      const contact = await updateContact(
-        params.contactId!,
-        params.id!,
-        ctx.portId,
-        body,
-        {
-          userId: ctx.userId,
-          portId: ctx.portId,
-          ipAddress: ctx.ipAddress,
-          userAgent: ctx.userAgent,
-        },
-      );
+      const contact = await updateContact(params.contactId!, params.id!, ctx.portId, body, {
+        userId: ctx.userId,
+        portId: ctx.portId,
+        ipAddress: ctx.ipAddress,
+        userAgent: ctx.userAgent,
+      });
       return NextResponse.json({ data: contact });
     } catch (error) {
       return errorResponse(error);
@@ -5,10 +5,13 @@ import { withAuth, withPermission } from '@/lib/api/helpers';
 import { parseBody } from '@/lib/api/route-helpers';
 import { errorResponse } from '@/lib/errors';
 import { listContacts, addContact } from '@/lib/services/clients.service';
+import { optionalCountryIsoSchema, optionalPhoneE164Schema } from '@/lib/validators/i18n';
 
 const addContactSchema = z.object({
   channel: z.enum(['email', 'phone', 'whatsapp', 'other']),
   value: z.string().min(1),
+  valueE164: optionalPhoneE164Schema.optional(),
+  valueCountry: optionalCountryIsoSchema.optional(),
   label: z.string().optional(),
   isPrimary: z.boolean().optional().default(false),
   notes: z.string().optional(),
24 src/app/api/v1/clients/[id]/gdpr-export/[exportId]/route.ts Normal file
@@ -0,0 +1,24 @@
import { NextResponse } from 'next/server';

import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { getExportDownloadUrl } from '@/lib/services/gdpr-export.service';

/**
 * Returns a fresh signed URL for an existing GDPR export. Staff use this
 * from the admin UI; the email path embeds its own signed URL.
 */
export const GET = withAuth(
  withPermission(
    'admin',
    'manage_settings',
    withRateLimit('exports', async (req, ctx, params) => {
      try {
        const url = await getExportDownloadUrl(params.exportId!, ctx.portId);
        return NextResponse.json({ data: { url } });
      } catch (error) {
        return errorResponse(error);
      }
    }),
  ),
);
49 src/app/api/v1/clients/[id]/gdpr-export/route.ts Normal file
@@ -0,0 +1,49 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { requestGdprExport, listClientExports } from '@/lib/services/gdpr-export.service';

const requestSchema = z.object({
  /** When true, the bundle is emailed to the client once it finishes building. */
  emailToClient: z.boolean().optional().default(false),
  /** Optional override recipient (e.g. legal counsel). Skips the primary-email lookup. */
  emailOverride: z.string().email().optional().nullable(),
});

export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx, params) => {
    try {
      const rows = await listClientExports(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission(
    'admin',
    'manage_settings',
    withRateLimit('exports', async (req, ctx, params) => {
      try {
        const body = await parseBody(req, requestSchema);
        const result = await requestGdprExport({
          clientId: params.id!,
          portId: ctx.portId,
          requestedBy: ctx.userId,
          emailToClient: body.emailToClient,
          emailOverride: body.emailOverride ?? null,
          ipAddress: ctx.ipAddress,
          userAgent: ctx.userAgent,
        });
        return NextResponse.json({ data: result.export }, { status: 202 });
      } catch (error) {
        return errorResponse(error);
      }
    }),
  ),
);
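Returning 202 with the export row reflects that the bundle is built asynchronously: the POST records the request, and the GET above is polled for a fresh signed URL once a worker finishes. A minimal in-memory sketch of that request/complete contract; all names here are hypothetical, not the gdpr-export.service API:

```typescript
// Hypothetical stand-in for the export job lifecycle.
type ExportJob = { id: string; status: 'pending' | 'ready'; url?: string };

const jobs = new Map<string, ExportJob>();

// The route's POST analogue: accept the request, do not build the bundle yet.
function requestExport(id: string): { statusCode: 202; job: ExportJob } {
  const job: ExportJob = { id, status: 'pending' };
  jobs.set(id, job);
  return { statusCode: 202, job };
}

// The worker's completion step: attach the signed URL and flip the status.
function completeExport(id: string, url: string): void {
  const job = jobs.get(id);
  if (job) Object.assign(job, { status: 'ready', url });
}

const res = requestExport('exp-1');
console.log(res.statusCode, res.job.status); // 202 pending
completeExport('exp-1', 'https://example.com/signed');
console.log(jobs.get('exp-1')?.status); // ready
```

Using 202 instead of 201 signals that the resource exists but its payload is not yet available, which matches the polling GET.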
@@ -1,15 +1,17 @@
import { NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { listClientOptions } from '@/lib/services/clients.service';

export const GET = withAuth(async (req, ctx) => {
  try {
    const search = req.nextUrl.searchParams.get('search') ?? undefined;
    const data = await listClientOptions(ctx.portId, search);
    return NextResponse.json({ data });
  } catch (error) {
    return errorResponse(error);
  }
});
export const GET = withAuth(
  withPermission('clients', 'view', async (req, ctx) => {
    try {
      const search = req.nextUrl.searchParams.get('search') ?? undefined;
      const data = await listClientOptions(ctx.portId, search);
      return NextResponse.json({ data });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
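The refactor above is mechanical: every bare `withAuth(handler)` becomes `withAuth(withPermission(resource, action, handler))`. As a rough sketch of how such wrappers compose, with simplified stand-ins for the project's real helpers (the actual ones in `@/lib/api/helpers` deal with sessions, `NextRequest`, and route params):

```typescript
// Simplified stand-ins, not the project's actual helpers.
type Ctx = { userId: string; portId: string; permissions: Set<string> };
type Handler = (ctx: Ctx) => { status: number; body?: unknown };

// withAuth: reject when there is no authenticated user on the context.
const withAuth =
  (handler: Handler): Handler =>
  (ctx) =>
    ctx.userId ? handler(ctx) : { status: 401 };

// withPermission: reject when `${resource}:${action}` is missing, else delegate.
const withPermission =
  (resource: string, action: string, handler: Handler): Handler =>
  (ctx) =>
    ctx.permissions.has(`${resource}:${action}`) ? handler(ctx) : { status: 403 };

// Composed exactly like the route above: auth first, then the permission gate.
const GET = withAuth(
  withPermission('clients', 'view', (ctx) => ({ status: 200, body: { portId: ctx.portId } })),
);
```

With this shape, a caller lacking `clients:view` gets a 403 before the handler body runs, which is the behavioural change this diff makes to the options endpoint.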
51  src/app/api/v1/companies/[id]/addresses/[addressId]/route.ts  Normal file
@@ -0,0 +1,51 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { updateCompanyAddress, removeCompanyAddress } from '@/lib/services/companies.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const updateAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const PATCH = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, updateAddressSchema);
      const row = await updateCompanyAddress(params.addressId!, params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      await removeCompanyAddress(params.addressId!, params.id!, ctx.portId, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return new NextResponse(null, { status: 204 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
46  src/app/api/v1/companies/[id]/addresses/route.ts  Normal file
@@ -0,0 +1,46 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { listCompanyAddresses, addCompanyAddress } from '@/lib/services/companies.service';
import { optionalCountryIsoSchema, optionalSubdivisionIsoSchema } from '@/lib/validators/i18n';

const addAddressSchema = z.object({
  label: z.string().min(1).max(80).optional(),
  streetAddress: z.string().max(500).optional().nullable(),
  city: z.string().max(120).optional().nullable(),
  subdivisionIso: optionalSubdivisionIsoSchema.optional(),
  postalCode: z.string().max(40).optional().nullable(),
  countryIso: optionalCountryIsoSchema.optional(),
  isPrimary: z.boolean().optional(),
});

export const GET = withAuth(
  withPermission('companies', 'view', async (req, ctx, params) => {
    try {
      const rows = await listCompanyAddresses(params.id!, ctx.portId);
      return NextResponse.json({ data: rows });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const POST = withAuth(
  withPermission('companies', 'edit', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, addAddressSchema);
      const row = await addCompanyAddress(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: row }, { status: 201 });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
47  src/app/api/v1/companies/[id]/members/[mid]/handlers.ts  Normal file
@@ -0,0 +1,47 @@
import { NextResponse } from 'next/server';

import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { endMembership, updateMembership } from '@/lib/services/company-memberships.service';
import { endMembershipSchema, updateMembershipSchema } from '@/lib/validators/company-memberships';

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, updateMembershipSchema);
    const updated = await updateMembership(params.mid!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: updated });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (req, ctx, params) => {
  try {
    let endDate = new Date();
    const text = await req.text();
    if (text.length > 0) {
      const parsed = endMembershipSchema.parse(JSON.parse(text));
      endDate = parsed.endDate;
    }
    await endMembership(
      params.mid!,
      ctx.portId,
      { endDate },
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};
@@ -1,50 +1,6 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';

import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { endMembership, updateMembership } from '@/lib/services/company-memberships.service';
import { endMembershipSchema, updateMembershipSchema } from '@/lib/validators/company-memberships';

export const patchHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, updateMembershipSchema);
    const updated = await updateMembership(params.mid!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: updated });
  } catch (error) {
    return errorResponse(error);
  }
};

export const deleteHandler: RouteHandler = async (req, ctx, params) => {
  try {
    let endDate = new Date();
    const text = await req.text();
    if (text.length > 0) {
      const parsed = endMembershipSchema.parse(JSON.parse(text));
      endDate = parsed.endDate;
    }
    await endMembership(
      params.mid!,
      ctx.portId,
      { endDate },
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
    return new NextResponse(null, { status: 204 });
  } catch (error) {
    return errorResponse(error);
  }
};
import { patchHandler, deleteHandler } from './handlers';

export const PATCH = withAuth(withPermission('memberships', 'manage', patchHandler));
export const DELETE = withAuth(withPermission('memberships', 'manage', deleteHandler));
@@ -0,0 +1,19 @@
import { NextResponse } from 'next/server';

import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { setPrimary } from '@/lib/services/company-memberships.service';

export const setPrimaryHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const membership = await setPrimary(params.mid!, ctx.portId, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership });
  } catch (error) {
    return errorResponse(error);
  }
};
@@ -1,21 +1,5 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';

import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { setPrimary } from '@/lib/services/company-memberships.service';

export const setPrimaryHandler: RouteHandler = async (_req, ctx, params) => {
  try {
    const membership = await setPrimary(params.mid!, ctx.portId, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership });
  } catch (error) {
    return errorResponse(error);
  }
};
import { setPrimaryHandler } from './handlers';

export const POST = withAuth(withPermission('memberships', 'manage', setPrimaryHandler));
40  src/app/api/v1/companies/[id]/members/handlers.ts  Normal file
@@ -0,0 +1,40 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { type RouteHandler } from '@/lib/api/helpers';
import { parseBody, parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { addMembership, listByCompany } from '@/lib/services/company-memberships.service';
import { addMembershipSchema } from '@/lib/validators/company-memberships';

const listQuerySchema = z.object({
  activeOnly: z
    .enum(['true', 'false'])
    .transform((v) => v === 'true')
    .default('true'),
});

export const listHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const { activeOnly } = parseQuery(req, listQuerySchema);
    const memberships = await listByCompany(params.id!, ctx.portId, { activeOnly });
    return NextResponse.json({ data: memberships });
  } catch (error) {
    return errorResponse(error);
  }
};

export const createHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, addMembershipSchema);
    const membership = await addMembership(params.id!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership }, { status: 201 });
  } catch (error) {
    return errorResponse(error);
  }
};
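`listQuerySchema` turns the `?activeOnly=` query string into a real boolean, defaulting to `'true'` when the parameter is absent. A dependency-free sketch of the same coercion (a hypothetical helper, not the zod schema itself):

```typescript
// Hypothetical stand-in for the zod pipeline
// z.enum(['true', 'false']).transform((v) => v === 'true').default('true'):
// accept only the literal strings 'true'/'false', default missing values,
// reject anything else, and hand the caller a boolean.
function parseActiveOnly(raw: string | null): boolean {
  const value = raw ?? 'true'; // .default('true') applies when the param is absent
  if (value !== 'true' && value !== 'false') {
    throw new Error(`activeOnly must be 'true' or 'false', got '${value}'`);
  }
  return value === 'true'; // .transform(...) yields the boolean
}
```

The important detail is that the default is the *string* `'true'`, applied before the transform, so a missing parameter and an explicit `activeOnly=true` take the same path.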
@@ -1,43 +1,6 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';
import { withAuth, withPermission } from '@/lib/api/helpers';

import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { parseBody, parseQuery } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { addMembership, listByCompany } from '@/lib/services/company-memberships.service';
import { addMembershipSchema } from '@/lib/validators/company-memberships';

const listQuerySchema = z.object({
  activeOnly: z
    .enum(['true', 'false'])
    .transform((v) => v === 'true')
    .default('true'),
});

export const listHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const { activeOnly } = parseQuery(req, listQuerySchema);
    const memberships = await listByCompany(params.id!, ctx.portId, { activeOnly });
    return NextResponse.json({ data: memberships });
  } catch (error) {
    return errorResponse(error);
  }
};

export const createHandler: RouteHandler = async (req, ctx, params) => {
  try {
    const body = await parseBody(req, addMembershipSchema);
    const membership = await addMembership(params.id!, ctx.portId, body, {
      userId: ctx.userId,
      portId: ctx.portId,
      ipAddress: ctx.ipAddress,
      userAgent: ctx.userAgent,
    });
    return NextResponse.json({ data: membership }, { status: 201 });
  } catch (error) {
    return errorResponse(error);
  }
};
import { listHandler, createHandler } from './handlers';

export const GET = withAuth(withPermission('memberships', 'view', listHandler));
export const POST = withAuth(withPermission('memberships', 'manage', createHandler));
18  src/app/api/v1/companies/autocomplete/handlers.ts  Normal file
@@ -0,0 +1,18 @@
import { NextResponse } from 'next/server';

import { type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { autocomplete } from '@/lib/services/companies.service';

export const autocompleteHandler: RouteHandler = async (req, ctx) => {
  try {
    const q = req.nextUrl.searchParams.get('q');
    if (!q) {
      return NextResponse.json({ data: [] });
    }
    const companies = await autocomplete(ctx.portId, q);
    return NextResponse.json({ data: companies });
  } catch (error) {
    return errorResponse(error);
  }
};
@@ -1,20 +1,5 @@
import { NextResponse } from 'next/server';
import { withAuth, withPermission } from '@/lib/api/helpers';

import { withAuth, withPermission, type RouteHandler } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { autocomplete } from '@/lib/services/companies.service';

export const autocompleteHandler: RouteHandler = async (req, ctx) => {
  try {
    const q = req.nextUrl.searchParams.get('q');
    if (!q) {
      return NextResponse.json({ data: [] });
    }
    const companies = await autocomplete(ctx.portId, q);
    return NextResponse.json({ data: companies });
  } catch (error) {
    return errorResponse(error);
  }
};
import { autocompleteHandler } from './handlers';

export const GET = withAuth(withPermission('companies', 'view', autocompleteHandler));
@@ -1,45 +1,55 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse, NotFoundError } from '@/lib/errors';
import { setValuesSchema } from '@/lib/validators/custom-fields';
import { getValues, setValues } from '@/lib/services/custom-fields.service';

export const GET = withAuth(async (_req: NextRequest, ctx, params) => {
  try {
    const { entityId } = params;
    if (!entityId) throw new NotFoundError('Entity');
// Custom-field values live on top of a port-scoped entity (client, yacht,
// interest, berth, company). Reading the values is in scope for any role
// that can view clients (the most common surface); writing requires the
// equivalent edit permission. The service layer also re-validates the
// entityId against the field definition's entityType + portId so a
// caller cannot poke values onto an arbitrary or foreign-port entity.
export const GET = withAuth(
  withPermission('clients', 'view', async (_req: NextRequest, ctx, params) => {
    try {
      const { entityId } = params;
      if (!entityId) throw new NotFoundError('Entity');

    const data = await getValues(entityId, ctx.portId);
    return NextResponse.json({ data });
  } catch (error) {
    return errorResponse(error);
  }
});
      const data = await getValues(entityId, ctx.portId);
      return NextResponse.json({ data });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const PUT = withAuth(async (req: NextRequest, ctx, params) => {
  try {
    const { entityId } = params;
    if (!entityId) throw new NotFoundError('Entity');
export const PUT = withAuth(
  withPermission('clients', 'edit', async (req: NextRequest, ctx, params) => {
    try {
      const { entityId } = params;
      if (!entityId) throw new NotFoundError('Entity');

    const body = await req.json();
    const { values } = setValuesSchema.parse(body);
      const body = await req.json();
      const { values } = setValuesSchema.parse(body);

    const result = await setValues(
      entityId,
      ctx.portId,
      ctx.userId,
      values as Array<{ fieldId: string; value: unknown }>,
      {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      },
    );
      const result = await setValues(
        entityId,
        ctx.portId,
        ctx.userId,
        values as Array<{ fieldId: string; value: unknown }>,
        {
          userId: ctx.userId,
          portId: ctx.portId,
          ipAddress: ctx.ipAddress,
          userAgent: ctx.userAgent,
        },
      );

    return NextResponse.json({ data: result });
  } catch (error) {
    return errorResponse(error);
  }
});
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getRecentActivity } from '@/lib/services/dashboard.service';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getRecentActivity(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getRecentActivity(ctx.portId);
    return NextResponse.json(result);
  }),
);
@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getRevenueForecast } from '@/lib/services/dashboard.service';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getRevenueForecast(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getRevenueForecast(ctx.portId);
    return NextResponse.json(result);
  }),
);
@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getKpis } from '@/lib/services/dashboard.service';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getKpis(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getKpis(ctx.portId);
    return NextResponse.json(result);
  }),
);
@@ -1,9 +1,11 @@
import { NextRequest, NextResponse } from 'next/server';

import { withAuth } from '@/lib/api/helpers';
import { withAuth, withPermission } from '@/lib/api/helpers';
import { getPipelineCounts } from '@/lib/services/dashboard.service';

export const GET = withAuth(async (req: NextRequest, ctx) => {
  const result = await getPipelineCounts(ctx.portId);
  return NextResponse.json(result);
});
export const GET = withAuth(
  withPermission('reports', 'view_dashboard', async (req: NextRequest, ctx) => {
    const result = await getPipelineCounts(ctx.portId);
    return NextResponse.json(result);
  }),
);
@@ -1,14 +1,32 @@
import { NextResponse } from 'next/server';
import { eq } from 'drizzle-orm';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { db } from '@/lib/db';
import { emailAccounts } from '@/lib/db/schema/email';
import { errorResponse, ForbiddenError, NotFoundError } from '@/lib/errors';
import { getQueue } from '@/lib/queue';

export const POST = withAuth(
  withPermission('email', 'view', async (_req, _ctx, params) => {
  withPermission('email', 'view', async (_req, ctx, params) => {
    try {
      const accountId = params.accountId!;
      // Owner check: the sibling toggle/disconnect endpoints already enforce
      // account.userId === ctx.userId. Without the same check here, any
      // user with `email:view` could force IMAP sync against a foreign
      // account, advancing lastSyncAt (data-loss risk on the legitimate
      // owner's next sync) and triggering work using the foreign user's
      // decrypted credentials.
      const account = await db.query.emailAccounts.findFirst({
        where: eq(emailAccounts.id, accountId),
      });
      if (!account) throw new NotFoundError('Email account');
      if (account.userId !== ctx.userId) {
        throw new ForbiddenError('You do not own this email account');
      }

      const queue = getQueue('email');
      const job = await queue.add('inbox-sync', { accountId: params.accountId! });
      const job = await queue.add('inbox-sync', { accountId });
      return NextResponse.json({ data: { jobId: job.id } }, { status: 202 });
    } catch (error) {
      return errorResponse(error);
18  src/app/api/v1/expenses/[id]/clear-duplicate/route.ts  Normal file
@@ -0,0 +1,18 @@
import { NextResponse } from 'next/server';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { clearDuplicate } from '@/lib/services/expense-dedup.service';

export const POST = withAuth(
  withPermission('expenses', 'edit', async (_req, ctx, params) => {
    try {
      const id = params.id;
      if (!id) return NextResponse.json({ error: 'Missing id' }, { status: 400 });
      await clearDuplicate(id, ctx.portId);
      return NextResponse.json({ ok: true });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
28  src/app/api/v1/expenses/[id]/merge/route.ts  Normal file
@@ -0,0 +1,28 @@
import { NextResponse } from 'next/server';
import { z } from 'zod';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { mergeDuplicate } from '@/lib/services/expense-dedup.service';

const mergeSchema = z.object({
  /** Surviving expense id — typically the row's existing `duplicateOf` pointer. */
  targetId: z.string().min(1),
});

export const POST = withAuth(
  withPermission('expenses', 'edit', async (req, ctx, params) => {
    try {
      const sourceId = params.id;
      if (!sourceId) {
        return NextResponse.json({ error: 'Missing id' }, { status: 400 });
      }
      const body = await parseBody(req, mergeSchema);
      await mergeDuplicate(sourceId, body.targetId, ctx.portId);
      return NextResponse.json({ ok: true });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
@@ -1,27 +1,117 @@
import { NextResponse } from 'next/server';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { withAuth, withPermission, withRateLimit } from '@/lib/api/helpers';
import { errorResponse } from '@/lib/errors';
import { scanReceipt } from '@/lib/services/receipt-scanner';
import { logger } from '@/lib/logger';
import { getResolvedOcrConfig } from '@/lib/services/ocr-config.service';
import {
  runOcr,
  type ParsedReceipt,
  OCR_FEATURE,
  OCR_ESTIMATED_TOKENS,
} from '@/lib/services/ocr-providers';
import { checkBudget, recordAiUsage } from '@/lib/services/ai-budget.service';

const EMPTY: ParsedReceipt = {
  establishment: null,
  date: null,
  amount: null,
  currency: null,
  lineItems: [],
  confidence: 0,
};

export const POST = withAuth(
  withPermission('expenses', 'create', async (req, _ctx) => {
    try {
      const formData = await req.formData();
      const file = formData.get('file') as File | null;
  withPermission(
    'expenses',
    'create',
    withRateLimit('ocr', async (req, ctx) => {
      try {
        const formData = await req.formData();
        const file = formData.get('file') as File | null;
        if (!file) {
          return NextResponse.json({ error: 'No file provided' }, { status: 400 });
        }
        const buffer = Buffer.from(await file.arrayBuffer());
        const mimeType = file.type || 'image/jpeg';

      if (!file) {
        return NextResponse.json({ error: 'No file provided' }, { status: 400 });
        const config = await getResolvedOcrConfig(ctx.portId);
        // Tesseract.js (in-browser) is the default. The server only invokes
        // an AI provider when (a) the port admin has flipped `aiEnabled` on
        // and (b) a key resolves. Otherwise the client falls back to its
        // local Tesseract result.
        if (!config.aiEnabled) {
          return NextResponse.json({
            data: { parsed: EMPTY, source: 'manual', reason: 'ai-disabled' },
          });
        }
        if (!config.apiKey) {
          return NextResponse.json({
            data: { parsed: EMPTY, source: 'manual', reason: 'no-ocr-configured' },
          });
        }

        // Per-port budget gate — refuse the call before we spend tokens
        // when the port has already hit its hard cap, or when the request
        // would push it past the cap. Soft-cap warnings ride along on the
        // success response so the UI can show a banner without blocking.
        const budget = await checkBudget({
          portId: ctx.portId,
          estimatedTokens: OCR_ESTIMATED_TOKENS,
        });
        if (!budget.ok) {
          return NextResponse.json({
            data: {
              parsed: EMPTY,
              source: 'manual',
              reason: 'budget-exceeded',
              providerError: `AI budget reached (${budget.usedTokens}/${budget.capTokens} tokens this period).`,
            },
          });
        }

        try {
          const result = await runOcr({
            provider: config.provider,
            model: config.model,
            apiKey: config.apiKey,
            imageBuffer: buffer,
            mimeType,
          });
          await recordAiUsage({
            portId: ctx.portId,
            userId: ctx.userId,
            feature: OCR_FEATURE,
            provider: config.provider,
            model: config.model,
            inputTokens: result.usage.inputTokens,
            outputTokens: result.usage.outputTokens,
            requestId: result.usage.requestId,
          });
          return NextResponse.json({
            data: {
              parsed: result.parsed,
              source: 'ai',
              provider: config.provider,
              model: config.model,
              softCapWarning: budget.softCap,
            },
          });
        } catch (err) {
          logger.error({ err, provider: config.provider }, 'OCR provider call failed');
          // Provider hiccup — degrade to manual entry rather than 500-ing.
          return NextResponse.json({
            data: {
              parsed: EMPTY,
              source: 'manual',
              reason: 'provider-error',
              providerError: err instanceof Error ? err.message.slice(0, 200) : 'Unknown error',
            },
          });
        }
      } catch (error) {
        return errorResponse(error);
      }

      const buffer = Buffer.from(await file.arrayBuffer());
      const mimeType = file.type || 'image/jpeg';

      const result = await scanReceipt(buffer, mimeType);

      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
    }),
  ),
);
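The budget gate above checks the estimated spend before the provider call and records actual usage afterwards. A minimal sketch of the hard-cap/soft-cap arithmetic (field names like `usedTokens`/`capTokens` follow the response strings in the diff; the real `checkBudget` is async and DB-backed, and the soft-cap ratio here is an assumed parameter):

```typescript
// Hypothetical in-memory stand-in for the port's AI budget state.
interface BudgetState { usedTokens: number; capTokens: number; softCapRatio: number }

type BudgetCheck =
  | { ok: true; softCap: boolean; usedTokens: number; capTokens: number }
  | { ok: false; usedTokens: number; capTokens: number };

// Refuse when the estimated request would cross the hard cap; otherwise
// allow it, flagging softCap once projected usage passes the warning
// threshold (capTokens * softCapRatio), so the UI can warn without blocking.
function checkBudget(state: BudgetState, estimatedTokens: number): BudgetCheck {
  const { usedTokens, capTokens, softCapRatio } = state;
  if (usedTokens + estimatedTokens > capTokens) {
    return { ok: false, usedTokens, capTokens };
  }
  const softCap = usedTokens + estimatedTokens >= capTokens * softCapRatio;
  return { ok: true, softCap, usedTokens, capTokens };
}
```

Gating on the *estimate* means a request near the cap is refused up front instead of overshooting; the route then records the provider's actual token usage via `recordAiUsage` so the next check works from real numbers.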
41  src/app/api/v1/interests/[id]/outcome/route.ts  Normal file
@@ -0,0 +1,41 @@
import { NextResponse } from 'next/server';

import { withAuth, withPermission } from '@/lib/api/helpers';
import { parseBody } from '@/lib/api/route-helpers';
import { errorResponse } from '@/lib/errors';
import { clearInterestOutcome, setInterestOutcome } from '@/lib/services/interests.service';
import { clearOutcomeSchema, setOutcomeSchema } from '@/lib/validators/interests';

export const POST = withAuth(
  withPermission('interests', 'change_stage', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, setOutcomeSchema);
      const result = await setInterestOutcome(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);

export const DELETE = withAuth(
  withPermission('interests', 'change_stage', async (req, ctx, params) => {
    try {
      const body = await parseBody(req, clearOutcomeSchema);
      const result = await clearInterestOutcome(params.id!, ctx.portId, body, {
        userId: ctx.userId,
        portId: ctx.portId,
        ipAddress: ctx.ipAddress,
        userAgent: ctx.userAgent,
      });
      return NextResponse.json({ data: result });
    } catch (error) {
      return errorResponse(error);
    }
  }),
);
@@ -7,6 +7,26 @@ import { db } from '@/lib/db';
import { interests } from '@/lib/db/schema/interests';
import { auditLogs } from '@/lib/db/schema/system';
import { documents, documentEvents } from '@/lib/db/schema/documents';
import { user } from '@/lib/db/schema/users';
import { stageLabel } from '@/lib/constants';

const OUTCOME_LABELS: Record<string, string> = {
  won: 'Won',
  lost_other_marina: 'Lost — went to another marina',
  lost_unqualified: 'Lost — unqualified',
  lost_no_response: 'Lost — no response',
  cancelled: 'Cancelled',
};

const DOC_EVENT_LABELS: Record<string, string> = {
  sent: 'sent for signing',
  completed: 'fully signed',
  signed: 'signed by recipient',
  rejected: 'rejected',
  expired: 'expired',
  cancelled: 'cancelled',
  reminder_sent: 'reminder sent',
};

interface TimelineEvent {
  id: string;
@@ -14,6 +34,10 @@ interface TimelineEvent {
  action: string;
  description: string;
  userId: string | null;
  /** Resolved display name for `userId`. `'system'` for auto-events; null when
   * the user has been deleted or the event has no actor. Falls back to
   * email-localpart if the user has no display name. */
  userName: string | null;
  createdAt: Date;
  metadata: Record<string, unknown>;
}
@@ -33,12 +57,7 @@ export const GET = withAuth(
    const auditRows = await db
      .select()
      .from(auditLogs)
      .where(
        and(
          eq(auditLogs.entityType, 'interest'),
          eq(auditLogs.entityId, interestId),
        ),
      )
      .where(and(eq(auditLogs.entityType, 'interest'), eq(auditLogs.entityId, interestId)))
      .orderBy(desc(auditLogs.createdAt))
      .limit(50);
@@ -67,28 +86,82 @@ export const GET = withAuth(

    const docTitles = Object.fromEntries(interestDocs.map((d) => [d.id, d.title]));

    // Resolve display names for any `userId` that is a real user row (the
    // sentinel value 'system' is used for auto-events and isn't joined).
    const realUserIds = Array.from(
      new Set(auditRows.map((r) => r.userId).filter((u): u is string => !!u && u !== 'system')),
    );
    const userRows =
      realUserIds.length > 0
        ? await db
            .select({ id: user.id, name: user.name, email: user.email })
            .from(user)
            .where(inArray(user.id, realUserIds))
        : [];
    const userNameById = new Map<string, string>(
      userRows.map((u) => [u.id, u.name?.trim() || u.email.split('@')[0] || 'User']),
    );
    const resolveUserName = (userId: string | null): string | null => {
      if (!userId) return null;
      if (userId === 'system') return 'system';
      return userNameById.get(userId) ?? null;
    };

    // Union and sort
    const auditEvents: TimelineEvent[] = auditRows.map((row) => ({
      id: row.id,
      type: 'audit',
      action: row.action,
      description: buildAuditDescription(row.action, row.newValue as Record<string, unknown> | null),
      description: buildAuditDescription(
        row.action,
        row.newValue as Record<string, unknown> | null,
        (row.metadata as Record<string, unknown>) ?? {},
        row.userId,
      ),
      userId: row.userId,
      userName: resolveUserName(row.userId),
      createdAt: row.createdAt,
      metadata: (row.metadata as Record<string, unknown>) ?? {},
    }));

    const docEvents: TimelineEvent[] = docEventRows.map((row) => ({
      id: row.id,
      type: 'document_event',
      action: row.eventType,
      description: `Document "${docTitles[row.documentId] ?? row.documentId}": ${row.eventType}`,
      userId: null,
      createdAt: row.createdAt,
      metadata: (row.eventData as Record<string, unknown>) ?? {},
    }));
    const docEvents: TimelineEvent[] = docEventRows.map((row) => {
      const title = docTitles[row.documentId] ?? row.documentId;
      const action = DOC_EVENT_LABELS[row.eventType] ?? row.eventType;
      return {
        id: row.id,
        type: 'document_event',
        action: row.eventType,
        description: `Document "${title}" ${action}`,
        userId: null,
        userName: null,
        createdAt: row.createdAt,
        metadata: (row.eventData as Record<string, unknown>) ?? {},
      };
    });

    const allEvents = [...auditEvents, ...docEvents];

    // Fallback: when no audit-log entries exist for this interest (typical
    // for seed/imported data inserted directly into the table without going
    // through the service), synthesize a "Created at <stage>" event so the
    // tab isn't empty when the interest is clearly past `open`.
    const hasCreateAudit = allEvents.some((e) => e.action === 'create');
    if (!hasCreateAudit) {
      const stage = stageLabel(interest.pipelineStage);
      const created = interest.createdAt ?? new Date();
      allEvents.push({
        id: `synth-${interest.id}-create`,
        type: 'audit',
        action: 'create',
        description:
          interest.pipelineStage === 'open' ? 'Interest created' : `Interest created at ${stage}`,
        userId: null,
        userName: null,
        createdAt: created,
        metadata: { synthetic: true },
      });
    }

    allEvents.sort((a, b) => new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime());

    return NextResponse.json({ data: allEvents.slice(0, 50) });
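The union, synthetic-create fallback, and newest-first sort above can be exercised standalone. Below is a minimal reduction under stated assumptions: a simplified event shape, and the raw stage string in place of `stageLabel` (which lives in `@/lib/constants`):

```typescript
// Reduced event shape for illustration only; the real TimelineEvent also
// carries type/userId/userName/metadata.
interface TimelineEventLite {
  id: string;
  action: string;
  description: string;
  createdAt: Date;
}

// Union audit and document events, synthesize a "create" event when none is
// present (e.g. seed/imported rows), then sort newest-first and cap at 50.
function buildTimeline(
  auditEvents: TimelineEventLite[],
  docEvents: TimelineEventLite[],
  interest: { id: string; pipelineStage: string; createdAt: Date | null },
): TimelineEventLite[] {
  const all = [...auditEvents, ...docEvents];
  if (!all.some((e) => e.action === 'create')) {
    all.push({
      id: `synth-${interest.id}-create`,
      action: 'create',
      description:
        interest.pipelineStage === 'open'
          ? 'Interest created'
          : `Interest created at ${interest.pipelineStage}`,
      createdAt: interest.createdAt ?? new Date(),
    });
  }
  all.sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
  return all.slice(0, 50);
}
```

Sorting after the push matters: the synthetic event carries the interest's own `createdAt`, so it lands at the bottom of the timeline rather than on top.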
@@ -101,12 +174,39 @@ export const GET = withAuth(
function buildAuditDescription(
  action: string,
  newValue: Record<string, unknown> | null,
  metadata: Record<string, unknown>,
  userId: string | null,
): string {
  if (action === 'create') return 'Interest created';
  if (action === 'archive') return 'Interest archived';
  if (action === 'restore') return 'Interest restored';

  const type = metadata.type;

  if (type === 'outcome_set') {
    const outcomeKey = (newValue?.outcome as string | undefined) ?? '';
    const label = OUTCOME_LABELS[outcomeKey] ?? outcomeKey ?? 'Closed';
    const reason = (newValue?.reason as string | undefined) ?? '';
    return reason ? `Marked as ${label} — ${reason}` : `Marked as ${label}`;
  }

  if (type === 'outcome_cleared') {
    const stage = (newValue?.pipelineStage as string | undefined) ?? '';
    return stage ? `Reopened to ${stageLabel(stage)}` : 'Reopened';
  }

  if (type === 'stage_change' && newValue?.pipelineStage) {
    const stage = stageLabel(newValue.pipelineStage as string);
    const reason = (newValue.reason as string | undefined) ?? '';
    const auto = userId === 'system';
    if (auto) {
      return reason ? `${stage} (auto-advanced — ${reason})` : `Stage advanced to ${stage}`;
    }
    return reason ? `Stage changed to ${stage} — ${reason}` : `Stage changed to ${stage}`;
  }

  if (action === 'update' && newValue?.pipelineStage) {
    return `Stage changed to "${newValue.pipelineStage}"`;
    return `Stage changed to ${stageLabel(newValue.pipelineStage as string)}`;
  }
  if (action === 'update') return 'Interest updated';
  return action;
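`buildAuditDescription` is a pure string mapper, so its rules are easy to check in isolation. The sketch below reproduces a subset of them under stated assumptions: a stubbed `stageLabel` (the real one comes from `@/lib/constants`) and a reduced label map; the `archive`/`restore`/`outcome_cleared` branches are omitted:

```typescript
// Illustrative subset of the outcome/stage-change description rules.
const LABELS: Record<string, string> = { won: 'Won', cancelled: 'Cancelled' };
// Stubbed stage formatter: 'viewing_booked' -> 'viewing booked'.
const toStageLabel = (s: string): string => s.replace(/_/g, ' ');

function describeAudit(
  action: string,
  newValue: Record<string, unknown> | null,
  metadata: Record<string, unknown>,
  userId: string | null,
): string {
  if (action === 'create') return 'Interest created';
  if (metadata.type === 'outcome_set') {
    const key = (newValue?.outcome as string | undefined) ?? '';
    const label = LABELS[key] ?? key;
    const reason = (newValue?.reason as string | undefined) ?? '';
    return reason ? `Marked as ${label} — ${reason}` : `Marked as ${label}`;
  }
  if (metadata.type === 'stage_change' && newValue?.pipelineStage) {
    const stage = toStageLabel(newValue.pipelineStage as string);
    const reason = (newValue.reason as string | undefined) ?? '';
    // The 'system' sentinel marks auto-advanced stages, worded differently.
    if (userId === 'system') {
      return reason ? `${stage} (auto-advanced — ${reason})` : `Stage advanced to ${stage}`;
    }
    return reason ? `Stage changed to ${stage} — ${reason}` : `Stage changed to ${stage}`;
  }
  return action === 'update' ? 'Interest updated' : action;
}
```

Note the `userId === 'system'` split: the same `stage_change` metadata produces "auto-advanced" wording for system events and plain "Stage changed" wording for human actors.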
Some files were not shown because too many files have changed in this diff.