fix(audit-v3): platform-wide deferred-list cleanup (rounds 1-4)
Working through the audit-v2 deferred backlog. Each round was tested
(typecheck + 1168/1168 vitest) before moving on.
Round 1 — DB performance + AI cost visibility:
- Add missing FK indexes Postgres doesn't auto-create on
berth_reservations.{interest_id, contract_file_id},
documents.{file_id, signed_file_id}, document_events.signer_id,
document_templates.source_file_id, form_submissions.{form_template_id,
client_id}, document_sends.{brochure_id, brochure_version_id,
sent_by_user_id}. Without them, RESTRICT checks on parent deletes and
reverse lookups degrade to full scans of the child tables. Migration 0037.
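The index additions boil down to plain CREATE INDEX statements of this shape (an illustrative excerpt of migration 0037, not the full file; index names are assumed):

```sql
-- Illustrative excerpt of migration 0037 (index names assumed).
-- Postgres indexes the referenced (parent) side of an FK via its
-- PK/unique constraint, but never the referencing column itself.
CREATE INDEX IF NOT EXISTS "berth_reservations_interest_id_idx"
  ON "berth_reservations" ("interest_id");
CREATE INDEX IF NOT EXISTS "documents_file_id_idx"
  ON "documents" ("file_id");
CREATE INDEX IF NOT EXISTS "document_events_signer_id_idx"
  ON "document_events" ("signer_id");
```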
- AI worker now writes one ai_usage_ledger row per OpenAI call so admins
can audit spend per port/user/feature and future per-port budgets have
history to read from. Failure to write is logged-not-thrown so the
user-facing email draft is unaffected.
Round 2 — Boot-time + transport hardening:
- S3 backend verifies the bucket exists at startup (or auto-creates
when MINIO_AUTO_CREATE_BUCKET=true). A typo'd bucket name now
surfaces with a clear boot error instead of a vague Minio error
inside the first user-facing request.
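The boot-time check amounts to logic like the sketch below. `verifyBucket` and the minimal `BucketClient` interface are illustrative names, with the MinIO client's `bucketExists`/`makeBucket` calls as the assumed real-world shape:

```typescript
// Sketch of the boot-time bucket verification (names illustrative).
interface BucketClient {
  bucketExists(name: string): Promise<boolean>;
  makeBucket(name: string): Promise<void>;
}

async function verifyBucket(
  client: BucketClient,
  bucket: string,
  autoCreate: boolean,
): Promise<void> {
  if (await client.bucketExists(bucket)) return;
  if (!autoCreate) {
    // Fail the boot loudly instead of deferring to the first request.
    throw new Error(
      `S3 bucket "${bucket}" not found; check the bucket name or set MINIO_AUTO_CREATE_BUCKET=true`,
    );
  }
  await client.makeBucket(bucket);
}
```

Failing at startup turns a configuration mistake into a one-line boot error rather than a runtime surprise.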
- Documenso v1 placeFields: 3-attempt exponential-backoff retry on 5xx
+ network errors, fail-fast on 4xx. Stops one transient flake from
leaving a document with a partial field set.
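The retry policy reduces to a small helper along these lines (a sketch, not the actual code; `HttpError`, `withRetry`, and the delay base are assumptions):

```typescript
// Sketch of the placeFields retry policy: up to 3 attempts with
// exponential backoff on 5xx / network errors, fail-fast on 4xx.
// HttpError and withRetry are illustrative names.
class HttpError extends Error {
  constructor(public readonly status: number) {
    super(`HTTP ${status}`);
  }
}

async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      // A 4xx means the request itself is wrong; retrying cannot help.
      if (err instanceof HttpError && err.status < 500) throw err;
      lastErr = err;
      if (attempt < attempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastErr;
}
```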
- FilesystemBackend logs a structured warn-once at boot when the dev
HMAC fallback is in effect, so two processes started with different
BETTER_AUTH_SECRET values become observable (the symptom otherwise is
random 401s on file downloads).
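Warn-once gating is just a module-level seen-set (a sketch; `warnOnce` and the key string are illustrative names):

```typescript
// Sketch of a warn-once guard: the first hit logs, repeats are
// swallowed so a boot-time fallback warning doesn't flood the logs.
const warnedKeys = new Set<string>();

function warnOnce(
  key: string,
  warn: (msg: string) => void,
  msg: string,
): void {
  if (warnedKeys.has(key)) return;
  warnedKeys.add(key);
  warn(msg);
}
```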
- Logger redact paths extended to cover *.headers.{authorization,
cookie}, *.config.headers.authorization, encrypted-credential blobs
(secretKeyEncrypted, smtpPassEncrypted, etc.), the Documenso
X-Documenso-Secret header, and 2-level nested forms.
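Assuming a pino-based logger (an assumption; the commit doesn't name the library), the extended redact list looks roughly like this config fragment. The exact path set in the codebase is longer:

```typescript
import pino from 'pino';

// Sketch of the extended redact configuration (paths taken from the
// commit text above; the full list in the codebase is longer).
export const logger = pino({
  redact: {
    paths: [
      '*.headers.authorization',
      '*.headers.cookie',
      '*.config.headers.authorization',
      'secretKeyEncrypted',
      'smtpPassEncrypted',
      '*.secretKeyEncrypted',
      '*.smtpPassEncrypted',
    ],
    censor: '[redacted]',
  },
});
```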
Round 3 — UI feedback + permission gates:
- Storage admin migrate dialog: success toast with row count + error
toast on both dryRun and migrate mutations.
- Invoice detail Send + Record-payment buttons wrapped in
PermissionGate (invoices.send / invoices.record_payment); both
mutations now toast on success/error.
- Admin user list Edit button wrapped in PermissionGate(admin.manage_users).
- Scan-receipt page surfaces an amber warning when OCR fails so reps
know they can fill the form manually instead of staring at a stalled
spinner; the editable form now also opens on scanMutation.isError
/ uploadedFile, not only on success.
- Email threads list now renders skeleton rows during load + shared
EmptyState for the empty case (was a single "Loading…" line).
Round 4 — Service / route correctness:
- documentSends.sent_by_user_id was a free-text NOT NULL column with no
FK. Now nullable + FK to user(id) ON DELETE SET NULL so the audit row
survives a user being hard-deleted. Migration 0038 with a defensive
null-out for any orphan ids before attaching the constraint.
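Migration 0038's defensive shape is roughly the following (illustrative SQL; the constraint name is assumed):

```sql
-- Illustrative excerpt of migration 0038 (constraint name assumed).
ALTER TABLE "document_sends" ALTER COLUMN "sent_by_user_id" DROP NOT NULL;

-- Defensive: null out any ids that no longer resolve to a user so the
-- FK can attach cleanly.
UPDATE "document_sends" SET "sent_by_user_id" = NULL
WHERE "sent_by_user_id" IS NOT NULL
  AND "sent_by_user_id" NOT IN (SELECT "id" FROM "user");

ALTER TABLE "document_sends"
  ADD CONSTRAINT "document_sends_sent_by_user_id_fk"
  FOREIGN KEY ("sent_by_user_id") REFERENCES "user"("id")
  ON DELETE SET NULL;
```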
- Saved-views route: documented why withAuth alone is correct (the
service strictly filters by (portId, userId) — owner-only by design).
- Public-interests audit log: replaced "userId: null as unknown as
string" cast with userId: null; AuditLogParams already accepts null
for system-generated events.
- EOI in-app PDF fill: extracted setBerthRange() that, when the
AcroForm field is missing AND the context has a non-empty range
string, logs a structured warn so the deployment gap (live Documenso
template needs the field) is observable instead of silently dropping
the multi-berth range.
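The extracted helper reduces to this decision table (a sketch against assumed shapes; `setBerthRange` matches the commit, but the `FormLike` interface and the warn callback are illustrative, not the real pdf-form API in the codebase):

```typescript
// Sketch of setBerthRange: fill the AcroForm text field when present,
// emit a structured warn when the field is missing but a range exists,
// and stay silent when there is nothing to fill. Shapes are assumed.
interface FormLike {
  getTextField(name: string): { setText(v: string): void } | null;
}

function setBerthRange(
  form: FormLike,
  fieldName: string,
  range: string | null,
  warn: (ctx: Record<string, unknown>, msg: string) => void,
): void {
  const field = form.getTextField(fieldName);
  if (field) {
    if (range) field.setText(range);
    return;
  }
  if (range) {
    // The live template is missing the field: make the deployment gap
    // observable instead of silently dropping the multi-berth range.
    warn({ fieldName, range }, 'AcroForm field missing; berth range not rendered');
  }
}
```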
Test status: 1168/1168 vitest. tsc clean. Two new migrations
(0037/0038) need pnpm db:push (or migration apply) on the dev DB.
Deferred-doc updated with the remaining open items (bigger refactors).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@@ -9,6 +9,40 @@ import { QUEUE_CONFIGS } from '@/lib/queue';
const MAX_OUTPUT_BYTES = 10 * 1024; // 10 KB
const OPENAI_TIMEOUT_MS = 30_000; // 30 s

interface RecordAiUsageArgs {
  portId: string;
  userId: string;
  feature: string;
  provider: 'openai' | 'claude' | 'tesseract';
  model: string;
  inputTokens: number;
  outputTokens: number;
  totalTokens: number;
  requestId: string | null;
}

/**
 * Insert one ai_usage_ledger row per provider call. Best-effort — the
 * draft generation is the user-facing artefact, the ledger is
 * observability. Imports are lazy so this module loads cleanly inside
 * the worker bundle without dragging the DB layer in at import time.
 */
async function recordAiUsage(args: RecordAiUsageArgs): Promise<void> {
  const { db } = await import('@/lib/db');
  const { aiUsageLedger } = await import('@/lib/db/schema/ai-usage');
  await db.insert(aiUsageLedger).values({
    portId: args.portId,
    userId: args.userId,
    feature: args.feature,
    provider: args.provider,
    model: args.model,
    inputTokens: args.inputTokens,
    outputTokens: args.outputTokens,
    totalTokens: args.totalTokens,
    requestId: args.requestId,
  });
}

interface GenerateEmailDraftPayload {
  interestId: string;
  clientId: string;

@@ -150,7 +184,13 @@ async function generateEmailDraft(payload: GenerateEmailDraftPayload): Promise<D
  }

  const data = (await response.json()) as {
    id?: string;
    choices: Array<{ message: { content: string } }>;
    usage?: {
      prompt_tokens?: number;
      completion_tokens?: number;
      total_tokens?: number;
    };
  };

  const content = data.choices[0]?.message?.content ?? '{}';

@@ -160,6 +200,24 @@ async function generateEmailDraft(payload: GenerateEmailDraftPayload): Promise<D
    throw new Error('AI output exceeded 10 KB cap');
  }

  // Record token usage so admins can audit spend + future per-port
  // budget caps have a history to read from. Failure here must not
  // bubble up — the email draft is the user-facing artefact, the
  // ledger is observability.
  void recordAiUsage({
    portId,
    userId: payload.requestedBy,
    feature: 'reply_draft',
    provider: 'openai',
    model: 'gpt-4o-mini',
    inputTokens: data.usage?.prompt_tokens ?? 0,
    outputTokens: data.usage?.completion_tokens ?? 0,
    totalTokens: data.usage?.total_tokens ?? 0,
    requestId: data.id ?? null,
  }).catch((err) => {
    logger.warn({ err, interestId }, 'Failed to record AI usage ledger row');
  });

  const parsed = JSON.parse(content) as { subject?: string; body?: string };
  subject = parsed.subject ?? `Follow-up: ${client.fullName}`;
  body = parsed.body ?? '';