pn-new-crm/src/lib/services/brochures.service.ts

feat(emails): sales send-out flows + brochures + email-from settings
2026-05-05 03:38:47 +02:00

Phase 7 of the berth-recommender refactor (plan §3.3, §4.8, §4.9, §5.7, §5.8,
§5.9, §11.1, §14.7, §14.9). Adds the rep-driven send-out path for per-berth
PDFs and port-wide brochures, the per-port sales SMTP/IMAP config + body
templates, and the supporting admin UI.

Migration: 0031_brochures_and_document_sends.sql

Schema additions:
- brochures (port-wide, with isDefault marker + archive)
- brochure_versions (versioned uploads, storageKey per §4.7a)
- document_sends (audit log of every rep-initiated send; failures captured
  with failedAt + errorReason). berthPdfVersionId is a plain text column
  (no FK) — loose-coupled to Phase 6b's berth_pdf_versions so the two phases
  stay independent.

§14.7 critical mitigations:
- Body XSS: rep-authored markdown goes through renderEmailBody() (HTML-escape
  first, then a tight allowlist of bold/italic/code/link rules). https:// +
  mailto: only — javascript:/data: URLs stripped. Tested against
  script/img/iframe/svg/onerror polyglots.
- Recipient typo: strict email regex + two-step confirm modal that shows the
  exact recipient before send.
- Unresolved merge fields: pre-send dry-run /preview endpoint blocks
  submission until findUnresolvedTokens() returns empty.
- SMTP failure: every transport rejection writes a document_sends row with
  failedAt + errorReason; UI surfaces the message.
- Hourly per-user rate limit: 50 sends/user/hour via existing
  checkRateLimit().
- Size threshold fallback (§11.1): files above email_attach_threshold_mb
  (default 15) ship as a 24h signed-URL download link in the body instead of
  an attachment. The storage stream flows directly to nodemailer to avoid
  buffering 20MB+.

§14.10 critical mitigation:
- SMTP/IMAP passwords encrypted at rest via the existing EMAIL_CREDENTIAL_KEY
  (AES-256-GCM). The /api/v1/admin/email/sales-config GET endpoint never
  returns the decrypted value — only a *PassIsSet boolean. PATCH treats empty
  string as "leave unchanged" and explicit null as "clear", so the
  masked-placeholder UI round-trips without forcing re-entry on every save.

system_settings keys (per-port unless noted):
- sales_from_address, sales_smtp_{host,port,secure,user,pass_encrypted}
- sales_imap_{host,port,user,pass_encrypted}
- sales_auth_method (default app_password)
- noreply_from_address
- email_template_send_berth_pdf_body, email_template_send_brochure_body
- brochure_max_upload_mb (default 50)
- email_attach_threshold_mb (default 15)

UI surfaces (per §5.7, §5.8, §5.9):
- <SendDocumentDialog> shared 2-step compose+confirm flow.
- <SendBerthPdfDialog>, <SendDocumentsDialog>, <SendFromInterestButton>
  wrappers per detail page.
- /[portSlug]/admin/brochures: list, upload (direct-to-storage presigned PUT
  for the 20MB+ files per §11.1), default toggle, archive.
- /[portSlug]/admin/email extended with <SalesEmailConfigCard>: SMTP + IMAP
  creds, body templates, threshold/max settings.

Storage: every upload + download goes through getStorageBackend() — no direct
minio imports, per Phase 6a contract.

Tests: 1145 vitest passing (+50 new in markdown-email-sanitization.test.ts,
document-sends-validators.test.ts, sales-email-config-validators.test.ts).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
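The escape-first-then-allowlist ordering described for `renderEmailBody()` can be sketched roughly as below. This is a minimal illustration of the technique, not the actual implementation: the function and regex rules here are assumptions (and cover only a subset — bold, inline code, and links — of the real rule set).

```typescript
// Illustrative sketch of escape-then-allowlist email-body sanitization.
// Step order matters: escaping FIRST guarantees no rep-authored HTML
// survives as markup; the allowlist then re-introduces only known-safe tags.
function escapeHtml(s: string): string {
  return s
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function renderEmailBodySketch(markdown: string): string {
  // 1. Escape everything.
  let html = escapeHtml(markdown);
  // 2. Re-introduce a tight allowlist of formatting.
  html = html.replace(/\*\*([^*]+)\*\*/g, '<strong>$1</strong>');
  html = html.replace(/`([^`]+)`/g, '<code>$1</code>');
  // 3. Links: only https:// and mailto: schemes match; javascript:/data:
  //    URLs never become an href and render as plain text.
  html = html.replace(
    /\[([^\]]+)\]\((https:\/\/[^\s)]+|mailto:[^\s)]+)\)/g,
    '<a href="$2">$1</a>',
  );
  return html;
}
```

Because the escape pass runs before any tag is emitted, a payload like `<script>` can only ever appear as the text `&lt;script&gt;`, regardless of what the allowlist rules later match.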
/**
 * Brochures + brochure-versions service (Phase 7; see plan §3.3 / §4.7).
 *
 * Brochures are port-wide marketing PDFs (the sample `Port-Nimara-Brochure-March-2025`
 * is 10.26 MB). Each `brochures` row groups a logical brochure (e.g.
 * "Investor Pack"); each `brochure_versions` row is an immutable upload tied
 * to that brochure. The default brochure is the one the send-out flow falls
 * back to when the rep doesn't choose one explicitly (§14.7).
 *
 * Storage goes through `getStorageBackend()` (Phase 6a), never minio
 * directly. The version row's `storageKey` follows the §4.7a convention.
 */
import { and, asc, desc, eq, isNull } from 'drizzle-orm';
import { db } from '@/lib/db';
import { brochures, brochureVersions, ports } from '@/lib/db/schema';
import type { Brochure, BrochureVersion } from '@/lib/db/schema';
import { ForbiddenError, NotFoundError, ValidationError } from '@/lib/errors';
import { getStorageBackend } from '@/lib/storage';
import { buildStoragePath } from '@/lib/minio';
import { logger } from '@/lib/logger';
// ─── Types ───────────────────────────────────────────────────────────────────
export interface BrochureWithCurrentVersion extends Brochure {
  currentVersion: BrochureVersion | null;
  versionCount: number;
}
// ─── Internal helpers ────────────────────────────────────────────────────────
async function loadPortSlug(portId: string): Promise<string> {
  const port = await db.query.ports.findFirst({ where: eq(ports.id, portId) });
  if (!port) throw new NotFoundError('Port');
  return port.slug;
}
// ─── List ────────────────────────────────────────────────────────────────────
/**
 * List all brochures for a port. By default returns only non-archived rows;
 * pass `{ includeArchived: true }` for the admin manage page.
 */
export async function listBrochures(
  portId: string,
  opts: { includeArchived?: boolean } = {},
): Promise<BrochureWithCurrentVersion[]> {
  const baseRows = await db.query.brochures.findMany({
    where: opts.includeArchived
      ? eq(brochures.portId, portId)
      : and(eq(brochures.portId, portId), isNull(brochures.archivedAt)),
    orderBy: [desc(brochures.isDefault), asc(brochures.label)],
  });
  if (baseRows.length === 0) return [];
  const ids = baseRows.map((r) => r.id);
  // One round-trip fetches every version for the page, ordered newest-first
  // so the per-row `currentVersion` lookup below is just `[0]`.
  const allVersions = await db.query.brochureVersions.findMany({
    where: (bv, { inArray }) => inArray(bv.brochureId, ids),
    orderBy: [desc(brochureVersions.uploadedAt)],
  });
  return baseRows.map((row) => {
    const versionsForRow = allVersions.filter((v) => v.brochureId === row.id);
    // `filter()` preserves the query's newest-first ordering; this re-sort is
    // purely defensive in case that orderBy ever changes.
    versionsForRow.sort(
      (a, b) => new Date(b.uploadedAt).getTime() - new Date(a.uploadedAt).getTime(),
    );
    return {
      ...row,
      currentVersion: versionsForRow[0] ?? null,
      versionCount: versionsForRow.length,
    };
  });
}
export async function getBrochure(
  portId: string,
  brochureId: string,
): Promise<BrochureWithCurrentVersion> {
  const row = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, brochureId), eq(brochures.portId, portId)),
  });
  if (!row) throw new NotFoundError('Brochure');
  const versions = await db.query.brochureVersions.findMany({
    where: eq(brochureVersions.brochureId, brochureId),
    orderBy: [desc(brochureVersions.uploadedAt)],
  });
  return { ...row, currentVersion: versions[0] ?? null, versionCount: versions.length };
}
/**
 * Resolve the brochure that the send-out flow should default to. Returns the
 * flagged default when it exists, is non-archived, and has a version; falls
 * back to the first non-archived brochure (in the list ordering, i.e. by
 * label) that has a version; returns null when the port has no usable
 * brochures (the send UI hides the button; §14.7).
 */
export async function getDefaultBrochure(
  portId: string,
): Promise<BrochureWithCurrentVersion | null> {
  const all = await listBrochures(portId, { includeArchived: false });
  const usable = all.filter((b) => b.currentVersion !== null);
  if (usable.length === 0) return null;
  const flaggedDefault = usable.find((b) => b.isDefault);
  if (flaggedDefault) return flaggedDefault;
  return usable[0]!;
}
// ─── Mutations ───────────────────────────────────────────────────────────────
export interface CreateBrochureInput {
  portId: string;
  label: string;
  description?: string | null;
  isDefault?: boolean;
  createdBy: string;
}
export async function createBrochure(input: CreateBrochureInput): Promise<Brochure> {
  if (!input.label.trim()) throw new ValidationError('Brochure label is required');
  // If this is being created as default, clear any existing default first
  // so we maintain the invariant: at most one default per port.
  return db.transaction(async (tx) => {
    if (input.isDefault) {
      await tx
        .update(brochures)
        .set({ isDefault: false })
        .where(and(eq(brochures.portId, input.portId), eq(brochures.isDefault, true)));
    }
    const [row] = await tx
      .insert(brochures)
      .values({
        portId: input.portId,
        label: input.label.trim(),
        description: input.description ?? null,
        isDefault: input.isDefault ?? false,
        createdBy: input.createdBy,
      })
      .returning();
    if (!row) throw new Error('Failed to create brochure');
    return row;
  });
}
export interface UpdateBrochureInput {
  label?: string;
  description?: string | null;
  isDefault?: boolean;
}
export async function updateBrochure(
  portId: string,
  brochureId: string,
  patch: UpdateBrochureInput,
): Promise<Brochure> {
  const existing = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, brochureId), eq(brochures.portId, portId)),
  });
  if (!existing) throw new NotFoundError('Brochure');
  // Mirror createBrochure's rule: a provided label must be non-blank.
  if (patch.label !== undefined && !patch.label.trim()) {
    throw new ValidationError('Brochure label is required');
  }
  return db.transaction(async (tx) => {
    if (patch.isDefault === true) {
      await tx
        .update(brochures)
        .set({ isDefault: false })
        .where(and(eq(brochures.portId, portId), eq(brochures.isDefault, true)));
    }
    const updates: Partial<Brochure> = {};
    if (patch.label !== undefined) updates.label = patch.label.trim();
    if (patch.description !== undefined) updates.description = patch.description;
    if (patch.isDefault !== undefined) updates.isDefault = patch.isDefault;
    // Drizzle's `.set()` throws on an empty object; treat a no-op patch as
    // "no change" and return the existing row.
    if (Object.keys(updates).length === 0) return existing;
    const [row] = await tx
      .update(brochures)
      .set(updates)
      .where(eq(brochures.id, brochureId))
      .returning();
    if (!row) throw new Error('Failed to update brochure');
    return row;
  });
}
export async function archiveBrochure(portId: string, brochureId: string): Promise<void> {
  const existing = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, brochureId), eq(brochures.portId, portId)),
  });
  if (!existing) throw new NotFoundError('Brochure');
  await db
    .update(brochures)
    .set({ archivedAt: new Date(), isDefault: false })
    .where(eq(brochures.id, brochureId));
}
// ─── Versions ────────────────────────────────────────────────────────────────
export interface RegisterBrochureVersionInput {
  portId: string;
  brochureId: string;
  storageKey: string;
  fileName: string;
  fileSizeBytes: number;
  contentSha256: string;
  uploadedBy: string;
}
/**
 * After a presigned upload completes, the browser POSTs the metadata back
 * here. We HEAD the storage key to verify the file exists at the claimed
 * size (per §11.1 "Server-side validation"), run the magic-byte check, then
 * write the version row with the next version number.
 */
export async function registerBrochureVersion(
  input: RegisterBrochureVersionInput,
): Promise<BrochureVersion> {
  const brochure = await db.query.brochures.findFirst({
    where: and(eq(brochures.id, input.brochureId), eq(brochures.portId, input.portId)),
  });
  if (!brochure) throw new NotFoundError('Brochure');
  if (brochure.archivedAt) {
    throw new ForbiddenError('Cannot upload a version to an archived brochure');
  }
  const storage = await getStorageBackend();
  const head = await storage.head(input.storageKey);
  if (!head) throw new ValidationError('Uploaded object not found in storage');
  if (head.sizeBytes !== input.fileSizeBytes) {
    logger.warn(
      { expected: input.fileSizeBytes, actual: head.sizeBytes, key: input.storageKey },
      'Brochure upload size mismatch',
    );
    throw new ValidationError('Uploaded object size does not match metadata');
  }
  // Magic-byte check (§14.6 critical) - the presign path doesn't see the
  // bytes until upload completes. Read the first 5 bytes; abort + delete
  // on mismatch so a malicious uploader can't smuggle a non-PDF that the
  // CRM would later email as `application/pdf`.
  const stream = await storage.get(input.storageKey);
  const chunks: Buffer[] = [];
  let total = 0;
  for await (const chunk of stream as AsyncIterable<Buffer | string>) {
    const buf = typeof chunk === 'string' ? Buffer.from(chunk) : chunk;
    chunks.push(buf);
    total += buf.length;
    if (total >= 5) break;
  }
  // Release the underlying connection; not every storage backend's stream
  // exposes `destroy`, so probe for it first.
  const destroy = (stream as { destroy?: () => void }).destroy;
  if (typeof destroy === 'function') destroy.call(stream);
  const probe = Buffer.concat(chunks).subarray(0, 5);
  if (probe.length < 5 || probe.toString('utf8', 0, 5) !== '%PDF-') {
    await storage.delete(input.storageKey).catch(() => undefined);
    throw new ValidationError(
      'Uploaded file failed PDF magic-byte check (does not start with %PDF-).',
    );
  }
  // Determine the next version number for this brochure. Note this is a
  // read-then-insert without a lock, so two concurrent uploads to the same
  // brochure could race to the same versionNumber (berth-pdf.service.ts
  // guards its equivalent path with pg_advisory_xact_lock).
  const existing = await db.query.brochureVersions.findMany({
    where: eq(brochureVersions.brochureId, input.brochureId),
    orderBy: [desc(brochureVersions.versionNumber)],
    limit: 1,
  });
  const nextVersion = (existing[0]?.versionNumber ?? 0) + 1;
  const [row] = await db
    .insert(brochureVersions)
    .values({
      brochureId: input.brochureId,
      versionNumber: nextVersion,
      storageKey: input.storageKey,
      fileName: input.fileName,
      fileSizeBytes: input.fileSizeBytes,
      contentSha256: input.contentSha256,
      uploadedBy: input.uploadedBy,
    })
    .returning();
  if (!row) throw new Error('Failed to record brochure version');
  return row;
}
/**
 * Generate a storage key the client should PUT to. Caller hands the returned
 * key + URL to the browser; after upload the browser calls
 * `registerBrochureVersion` with the same key.
 */
export async function generateBrochureStorageKey(
  portId: string,
  brochureId: string,
): Promise<string> {
  const portSlug = await loadPortSlug(portId);
  const fileId = crypto.randomUUID();
  return buildStoragePath(portSlug, 'brochures', brochureId, fileId, 'pdf');
}