feat(emails): sales send-out flows + brochures + email-from settings
Phase 7 of the berth-recommender refactor (plan §3.3, §4.8, §4.9, §5.7,
§5.8, §5.9, §11.1, §14.7, §14.9). Adds the rep-driven send-out path for
per-berth PDFs and port-wide brochures, the per-port sales SMTP/IMAP
config + body templates, and the supporting admin UI.
Migration: 0031_brochures_and_document_sends.sql
Schema additions:
- brochures (port-wide, with isDefault marker + archive)
- brochure_versions (versioned uploads, storageKey per §4.7a)
- document_sends (audit log of every rep-initiated send; failures
captured with failedAt + errorReason). berthPdfVersionId is a plain
text column (no FK) — loose-coupled to Phase 6b's berth_pdf_versions
so the two phases stay independent.
§14.7 critical mitigations:
- Body XSS: rep-authored markdown goes through renderEmailBody()
(HTML-escape first, then a tight allowlist of bold/italic/code/link
rules). https:// + mailto: only — javascript:/data: URLs stripped.
Tested against script/img/iframe/svg/onerror polyglots.
- Recipient typo: strict email regex + two-step confirm modal that
shows the exact recipient before send.
- Unresolved merge fields: pre-send dry-run /preview endpoint blocks
submission until findUnresolvedTokens() returns empty.
- SMTP failure: every transport rejection writes a document_sends row
with failedAt + errorReason; UI surfaces the message.
- Hourly per-user rate limit: 50 sends/user/hour via existing
checkRateLimit().
- Size threshold fallback (§11.1): files above
email_attach_threshold_mb (default 15) ship as a 24h signed-URL
download link in the body instead of an attachment. Storage stream
flows directly to nodemailer to avoid buffering 20MB+.
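A minimal sketch of the escape-then-allowlist shape described above; `renderEmailBodySketch`, `escapeHtml`, and the exact rule set are illustrative assumptions, not the production `renderEmailBody()`:

```typescript
// Escape everything first, then re-introduce a tight allowlist of markup.
function escapeHtml(s: string): string {
  return s
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function renderEmailBodySketch(markdown: string): string {
  let html = escapeHtml(markdown);
  // Allowlist: bold, italic, inline code.
  html = html.replace(/\*\*([^*]+)\*\*/g, '<strong>$1</strong>');
  html = html.replace(/\*([^*]+)\*/g, '<em>$1</em>');
  html = html.replace(/`([^`]+)`/g, '<code>$1</code>');
  // Links: https:// and mailto: only; other schemes collapse to plain text.
  html = html.replace(/\[([^\]]+)\]\(([^)]+)\)/g, (_m, text, url) =>
    /^(https:\/\/|mailto:)/i.test(url) ? `<a href="${url}">${text}</a>` : text,
  );
  return html;
}
```

Because escaping runs first, raw `<script>` never survives; because the link rule checks the scheme, `javascript:`/`data:` URLs degrade to their label text.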
§14.10 critical mitigation:
- SMTP/IMAP passwords encrypted at rest via the existing
EMAIL_CREDENTIAL_KEY (AES-256-GCM). The /api/v1/admin/email/
sales-config GET endpoint never returns the decrypted value — only
a *PassIsSet boolean. PATCH treats empty string as "leave unchanged"
and explicit null as "clear", so the masked-placeholder UI round-
trips without forcing re-entry on every save.
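The empty-string/null PATCH semantics can be sketched as a pure helper (`resolvePasswordPatch` and the `encrypt` callback are hypothetical names, not the real API):

```typescript
// Hypothetical helper mirroring the PATCH semantics above:
//   undefined or ''  → keep the existing encrypted value (masked-placeholder round-trip)
//   null             → clear the stored credential
//   any other string → encrypt and store the new value
function resolvePasswordPatch(
  incoming: string | null | undefined,
  existingEncrypted: string | null,
  encrypt: (plain: string) => string,
): string | null {
  if (incoming === undefined || incoming === '') return existingEncrypted;
  if (incoming === null) return null;
  return encrypt(incoming);
}
```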
system_settings keys (per-port unless noted):
- sales_from_address, sales_smtp_{host,port,secure,user,pass_encrypted}
- sales_imap_{host,port,user,pass_encrypted}
- sales_auth_method (default app_password)
- noreply_from_address
- email_template_send_berth_pdf_body, email_template_send_brochure_body
- brochure_max_upload_mb (default 50)
- email_attach_threshold_mb (default 15)
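The §11.1 attach-vs-link decision driven by email_attach_threshold_mb reduces to a size comparison; `deliveryMode` is an illustrative name, not the real send-path function:

```typescript
// Sketch: files at or under the threshold go out as attachments; larger
// files are delivered as a 24h signed-URL link in the body instead.
function deliveryMode(
  fileSizeBytes: number,
  thresholdMb = 15, // assumed default for email_attach_threshold_mb
): 'attachment' | 'signed-link' {
  return fileSizeBytes > thresholdMb * 1024 * 1024 ? 'signed-link' : 'attachment';
}
```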
UI surfaces (per §5.7, §5.8, §5.9):
- <SendDocumentDialog> shared 2-step compose+confirm flow.
- <SendBerthPdfDialog>, <SendDocumentsDialog>, <SendFromInterestButton>
wrappers per detail page.
- /[portSlug]/admin/brochures: list, upload (direct-to-storage
presigned PUT for the 20MB+ files per §11.1), default toggle,
archive.
- /[portSlug]/admin/email extended with <SalesEmailConfigCard>:
SMTP + IMAP creds, body templates, threshold/max settings.
Storage: every upload + download goes through getStorageBackend() —
no direct minio imports, per Phase 6a contract.
Tests: 1145 vitest passing (+ 50 new in
markdown-email-sanitization.test.ts, document-sends-validators.test.ts,
sales-email-config-validators.test.ts).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-05 03:38:47 +02:00
/**
 * Brochures + brochure-versions service (Phase 7 — see plan §3.3 / §4.7).
 *
 * Brochures are port-wide marketing PDFs (the sample `Port-Nimara-Brochure-March-2025`
 * is 10.26 MB). Each `brochures` row groups a logical brochure (e.g.
 * "Investor Pack"); each `brochure_versions` row is an immutable upload tied
 * to that brochure. The default brochure is the one the send-out flow picks
 * when the rep doesn't pick explicitly (§14.7).
 *
 * Storage goes through `getStorageBackend()` (Phase 6a) — never minio
 * directly. The version row's `storageKey` follows the §4.7a convention.
 */
import { and, asc, desc, eq, isNull } from 'drizzle-orm';

import { db } from '@/lib/db';
import { brochures, brochureVersions, ports } from '@/lib/db/schema';
import type { Brochure, BrochureVersion } from '@/lib/db/schema';
import { ForbiddenError, NotFoundError, ValidationError } from '@/lib/errors';
import { getStorageBackend } from '@/lib/storage';
import { buildStoragePath } from '@/lib/minio';
import { logger } from '@/lib/logger';

// ─── Types ───────────────────────────────────────────────────────────────────

export interface BrochureWithCurrentVersion extends Brochure {
  currentVersion: BrochureVersion | null;
  versionCount: number;
}

// ─── Internal helpers ────────────────────────────────────────────────────────

async function loadPortSlug(portId: string): Promise<string> {
  const port = await db.query.ports.findFirst({ where: eq(ports.id, portId) });
  if (!port) throw new NotFoundError('Port');
  return port.slug;
}

// ─── List ────────────────────────────────────────────────────────────────────

/**
 * List all brochures for a port. By default returns only non-archived rows;
 * pass `{ includeArchived: true }` for the admin manage page.
 */
export async function listBrochures(
  portId: string,
  opts: { includeArchived?: boolean } = {},
): Promise<BrochureWithCurrentVersion[]> {
  const baseRows = await db.query.brochures.findMany({
    where: opts.includeArchived
      ? eq(brochures.portId, portId)
      : and(eq(brochures.portId, portId), isNull(brochures.archivedAt)),
    orderBy: [desc(brochures.isDefault), asc(brochures.label)],
  });

  if (baseRows.length === 0) return [];

  // Pull all versions for these brochures in one round trip, newest first.
  const ids = baseRows.map((r) => r.id);
  const allVersions = await db.query.brochureVersions.findMany({
    where: (bv, { inArray }) => inArray(bv.brochureId, ids),
    orderBy: [desc(brochureVersions.uploadedAt)],
  });

  return baseRows.map((row) => {
    // `filter` preserves the newest-first order from the query above.
    const versionsForRow = allVersions.filter((v) => v.brochureId === row.id);
    return {
      ...row,
      currentVersion: versionsForRow[0] ?? null,
      versionCount: versionsForRow.length,
    };
  });
}

export async function getBrochure(
  portId: string,
  brochureId: string,
): Promise<BrochureWithCurrentVersion> {
  const row = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, brochureId), eq(brochures.portId, portId)),
  });
  if (!row) throw new NotFoundError('Brochure');

  const versions = await db.query.brochureVersions.findMany({
    where: eq(brochureVersions.brochureId, brochureId),
    orderBy: [desc(brochureVersions.uploadedAt)],
  });
  return { ...row, currentVersion: versions[0] ?? null, versionCount: versions.length };
}

/**
 * Resolve the brochure that the send-out flow should default to. Returns the
 * flagged default when it is non-archived and has a version; falls back to
 * the first non-archived brochure with a version in list order (defaults
 * first, then by label); null when the port has no usable brochures (the
 * send UI hides the button — §14.7).
 */
export async function getDefaultBrochure(
  portId: string,
): Promise<BrochureWithCurrentVersion | null> {
  const all = await listBrochures(portId, { includeArchived: false });
  const usable = all.filter((b) => b.currentVersion !== null);
  if (usable.length === 0) return null;
  const flaggedDefault = usable.find((b) => b.isDefault);
  if (flaggedDefault) return flaggedDefault;
  return usable[0]!;
}

// ─── Mutations ───────────────────────────────────────────────────────────────

export interface CreateBrochureInput {
  portId: string;
  label: string;
  description?: string | null;
  isDefault?: boolean;
  createdBy: string;
}

export async function createBrochure(input: CreateBrochureInput): Promise<Brochure> {
  if (!input.label.trim()) throw new ValidationError('Brochure label is required');

  // If this is being created as default, clear any existing default first
  // so we maintain the invariant: at most one default per port.
  return db.transaction(async (tx) => {
    if (input.isDefault) {
      await tx
        .update(brochures)
        .set({ isDefault: false })
        .where(and(eq(brochures.portId, input.portId), eq(brochures.isDefault, true)));
    }
    const [row] = await tx
      .insert(brochures)
      .values({
        portId: input.portId,
        label: input.label.trim(),
        description: input.description ?? null,
        isDefault: input.isDefault ?? false,
        createdBy: input.createdBy,
      })
      .returning();
    if (!row) throw new Error('Failed to create brochure');
    return row;
  });
}

export interface UpdateBrochureInput {
  label?: string;
  description?: string | null;
  isDefault?: boolean;
}

export async function updateBrochure(
  portId: string,
  brochureId: string,
  patch: UpdateBrochureInput,
): Promise<Brochure> {
  const existing = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, brochureId), eq(brochures.portId, portId)),
  });
  if (!existing) throw new NotFoundError('Brochure');

  return db.transaction(async (tx) => {
    if (patch.isDefault === true) {
      await tx
        .update(brochures)
        .set({ isDefault: false })
        .where(and(eq(brochures.portId, portId), eq(brochures.isDefault, true)));
    }
    const updates: Partial<Brochure> = {};
    if (patch.label !== undefined) updates.label = patch.label.trim();
    if (patch.description !== undefined) updates.description = patch.description;
    if (patch.isDefault !== undefined) updates.isDefault = patch.isDefault;
    // An empty patch would make drizzle's .set() throw; nothing to change.
    if (Object.keys(updates).length === 0) return existing;
    const [row] = await tx
      .update(brochures)
      .set(updates)
      .where(eq(brochures.id, brochureId))
      .returning();
    if (!row) throw new Error('Failed to update brochure');
    return row;
  });
}

export async function archiveBrochure(portId: string, brochureId: string): Promise<void> {
  const existing = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, brochureId), eq(brochures.portId, portId)),
  });
  if (!existing) throw new NotFoundError('Brochure');
  await db
    .update(brochures)
    .set({ archivedAt: new Date(), isDefault: false })
    .where(eq(brochures.id, brochureId));
}

// ─── Versions ────────────────────────────────────────────────────────────────

export interface RegisterBrochureVersionInput {
  portId: string;
  brochureId: string;
  storageKey: string;
  fileName: string;
  fileSizeBytes: number;
  contentSha256: string;
  uploadedBy: string;
}

/**
 * After a presigned upload completes, the browser POSTs the metadata back
 * here. We HEAD the storage key to verify the object exists at the claimed
 * size, probe its magic bytes (per §11.1 "Server-side validation"), then
 * write the version row with the next version number.
 */
export async function registerBrochureVersion(
  input: RegisterBrochureVersionInput,
): Promise<BrochureVersion> {
  const brochure = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, input.brochureId), eq(brochures.portId, input.portId)),
  });
  if (!brochure) throw new NotFoundError('Brochure');
  if (brochure.archivedAt) {
    throw new ForbiddenError('Cannot upload a version to an archived brochure');
  }

  const storage = await getStorageBackend();
  const head = await storage.head(input.storageKey);
  if (!head) throw new ValidationError('Uploaded object not found in storage');
  if (head.sizeBytes !== input.fileSizeBytes) {
    logger.warn(
      { expected: input.fileSizeBytes, actual: head.sizeBytes, key: input.storageKey },
      'Brochure upload size mismatch',
    );
    throw new ValidationError('Uploaded object size does not match metadata');
  }

  // Magic-byte check (§14.6 critical) - the presign path doesn't see the
  // bytes until upload completes. Read the first 5 bytes; abort + delete
  // on mismatch so a malicious uploader can't smuggle a non-PDF that the
  // CRM would later email as `application/pdf`.
  const stream = await storage.get(input.storageKey);
  const chunks: Buffer[] = [];
  let total = 0;
  for await (const chunk of stream as AsyncIterable<Buffer | string>) {
    const buf = typeof chunk === 'string' ? Buffer.from(chunk) : chunk;
    chunks.push(buf);
    total += buf.length;
    if (total >= 5) break;
  }
  if (typeof (stream as { destroy?: () => void }).destroy === 'function') {
    (stream as unknown as { destroy: () => void }).destroy();
  }
  const probe = Buffer.concat(chunks).subarray(0, 5);
  if (probe.length < 5 || probe.toString('utf8', 0, 5) !== '%PDF-') {
    await storage.delete(input.storageKey).catch(() => undefined);
    throw new ValidationError(
      'Uploaded file failed PDF magic-byte check (does not start with %PDF-).',
    );
  }
  // Determine the next version number for this brochure.
  const existing = await db.query.brochureVersions.findMany({
    where: eq(brochureVersions.brochureId, input.brochureId),
    orderBy: [desc(brochureVersions.versionNumber)],
    limit: 1,
  });
  const nextVersion = (existing[0]?.versionNumber ?? 0) + 1;

  const [row] = await db
    .insert(brochureVersions)
    .values({
      brochureId: input.brochureId,
      versionNumber: nextVersion,
      storageKey: input.storageKey,
      fileName: input.fileName,
      fileSizeBytes: input.fileSizeBytes,
      contentSha256: input.contentSha256,
      uploadedBy: input.uploadedBy,
    })
    .returning();
  if (!row) throw new Error('Failed to record brochure version');
  return row;
}

/**
 * Generate a storage key the client should PUT to. Caller hands the returned
 * key + URL to the browser; after upload the browser calls
 * `registerBrochureVersion` with the same key.
 */
export async function generateBrochureStorageKey(
  portId: string,
  brochureId: string,
): Promise<string> {
  const portSlug = await loadPortSlug(portId);
  const fileId = crypto.randomUUID();
  return buildStoragePath(portSlug, 'brochures', brochureId, fileId, 'pdf');
}