fix(audit-v3): platform-wide deferred-list cleanup (rounds 1-4)
Working through the audit-v2 deferred backlog. Each round was tested
(typecheck + 1168/1168 vitest) before moving on.
Round 1 — DB performance + AI cost visibility:
- Add missing FK indexes Postgres doesn't auto-create on
berth_reservations.{interest_id, contract_file_id},
documents.{file_id, signed_file_id}, document_events.signer_id,
document_templates.source_file_id, form_submissions.{form_template_id,
client_id}, document_sends.{brochure_id, brochure_version_id,
sent_by_user_id}. Without them, FK RESTRICT checks on parent deletes and
reverse lookups full-scan the child tables. Migration 0037.
- AI worker now writes one ai_usage_ledger row per OpenAI call so admins
can audit spend per port/user/feature and future per-port budgets have
history to read from. A failed ledger write is logged, not thrown, so
the user-facing email draft is unaffected.
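The best-effort ledger write above (log the failure, never throw) can be sketched as follows. This is a minimal illustration, not the worker's actual code — `insertRow`, `log`, and the row shape are hypothetical stand-ins for the real Drizzle insert and pino logger:

```typescript
// Hypothetical sketch of the "logged, not thrown" ledger write: the ledger
// is best-effort observability, so a failed insert must never break the
// user-facing draft. `insertRow` / `log` stand in for the real dependencies.
interface AiUsageRow {
  portId: string;
  userId: string;
  feature: string;
  totalTokens: number;
}

async function recordAiUsage(
  insertRow: (row: AiUsageRow) => Promise<void>,
  log: { warn: (ctx: object, msg: string) => void },
  row: AiUsageRow,
): Promise<void> {
  try {
    await insertRow(row);
  } catch (err) {
    // Swallow and log: callers never see ledger failures.
    log.warn({ err, portId: row.portId }, 'ai_usage_ledger write failed; continuing');
  }
}
```

The caller simply `await`s (or `void`s) this after each provider call; a database outage degrades spend auditing, not email drafting.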
Round 2 — Boot-time + transport hardening:
- S3 backend verifies the bucket exists at startup (or auto-creates
when MINIO_AUTO_CREATE_BUCKET=true). A typo'd bucket name now
surfaces as a clear boot error instead of a vague MinIO error
inside the first user-facing request.
- Documenso v1 placeFields: 3-attempt exponential-backoff retry on 5xx
+ network errors, fail-fast on 4xx. Stops one transient flake from
leaving a document with a partial field set.
- FilesystemBackend logs a structured warn-once at boot when the dev
HMAC fallback is in effect, so two processes started with different
BETTER_AUTH_SECRET values are observable (random 401s on file
downloads otherwise).
- Logger redact paths extended to cover *.headers.{authorization,
cookie}, *.config.headers.authorization, encrypted-credential blobs
(secretKeyEncrypted, smtpPassEncrypted, etc.), the Documenso
X-Documenso-Secret header, and 2-level nested forms.
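The placeFields retry policy described above can be sketched roughly like this. A minimal sketch under stated assumptions: the error object is assumed to carry a numeric `status`, and the helper name, signature, and delay schedule are illustrative, not the shipped Documenso client:

```typescript
// Rough sketch of the retry policy: up to `attempts` tries with exponential
// backoff on 5xx / network errors, immediate failure on 4xx. The `status`
// property on thrown errors and the 250 ms base delay are assumptions.
async function retryOnTransient<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      const status = (err as { status?: number }).status;
      // 4xx means a caller bug or bad payload — retrying cannot help.
      if (status !== undefined && status >= 400 && status < 500) throw err;
      lastErr = err;
      // Back off before the next attempt: 250 ms, 500 ms, ...
      if (i < attempts - 1) await sleep(baseDelayMs * 2 ** i);
    }
  }
  throw lastErr;
}
```

The injectable `sleep` keeps the helper unit-testable without real delays; wrapping each placeFields call in `retryOnTransient(() => client.placeFields(...))` is the shape of the fix.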
Round 3 — UI feedback + permission gates:
- Storage admin migrate dialog: success toast with row count + error
toast on both dryRun and migrate mutations.
- Invoice detail Send + Record-payment buttons wrapped in
PermissionGate (invoices.send / invoices.record_payment); both
mutations now toast on success/error.
- Admin user list Edit button wrapped in PermissionGate(admin.manage_users).
- Scan-receipt page surfaces an amber warning when OCR fails so reps
know they can fill the form manually instead of staring at a stalled
spinner; the editable form now also opens on scanMutation.isError
/ uploadedFile, not only on success.
- Email threads list now renders skeleton rows during load + shared
EmptyState for the empty case (was a single "Loading…" line).
Round 4 — Service / route correctness:
- documentSends.sent_by_user_id was a free-text NOT NULL column with no
FK. Now nullable + FK to user(id) ON DELETE SET NULL so the audit row
survives a user being hard-deleted. Migration 0038 with a defensive
null-out for any orphan ids before attaching the constraint.
- Saved-views route: documented why withAuth alone is correct (the
service strictly filters by (portId, userId) — owner-only by design).
- Public-interests audit log: replaced "userId: null as unknown as
string" cast with userId: null; AuditLogParams already accepts null
for system-generated events.
- EOI in-app PDF fill: extracted setBerthRange() that, when the
AcroForm field is missing AND the context has a non-empty range
string, logs a structured warning so the deployment gap (live Documenso
template needs the field) is observable instead of silently dropping
the multi-berth range.
Test status: 1168/1168 vitest. tsc clean. Two new migrations
(0037/0038) need pnpm db:push (or migration apply) on the dev DB.
Deferred-doc updated with the remaining open items (bigger refactors).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@@ -1,5 +1,13 @@
 # Final audit deferred findings
 
+> **Status update (audit-v3 round)**: most of the v2 deferred items have
+> now landed. Items struck through below are completed. The remaining
+> open items are bigger refactors (custom-fields per-entity routes,
+> systemSettings PK reconciliation, Documenso v2 voidDocument verification,
+> partial-vs-composite archived index conversion, storage-proxy port_id
+> claim, Documenso webhook port_id enforcement, response-shape
+> standardization, berths.current_pdf_version_id Drizzle FK).
+
 The pre-merge audit on `feat/berth-recommender` produced ~30 findings. The
 critical + high-severity items were fixed in-branch. The items below are
 medium / low severity and deferred to follow-up issues so the merge isn't
@@ -268,10 +268,18 @@ export default function ScanReceiptPage() {
             <span className="text-sm">Scanning receipt...</span>
           </div>
         )}
+
+        {scanMutation.isError && (
+          <div className="mt-4 rounded-md border border-amber-300 bg-amber-50 p-3 text-xs text-amber-900 dark:border-amber-900 dark:bg-amber-950/40 dark:text-amber-200">
+            <span className="font-medium">Couldn't read this receipt automatically.</span>{' '}
+            You can still fill in the details manually below — the receipt image will save with
+            the expense.
+          </div>
+        )}
       </CardContent>
     </Card>
 
-    {(scanResult || scanMutation.isSuccess) && (
+    {(scanResult || scanMutation.isSuccess || scanMutation.isError || uploadedFile) && (
       <Card>
         <CardHeader>
           <CardTitle className="text-base">
@@ -250,8 +250,12 @@ export async function POST(req: NextRequest) {
   });
 
   // ─── Post-commit side-effects (fire-and-forget) ─────────────────────────
+  // `AuditLogParams.userId` is `string | null`; null is the documented
+  // "system-generated" sentinel and matches `audit_logs.user_id` being
+  // nullable in the schema. The earlier `null as unknown as string`
+  // cast was a relic from before the type was widened.
   void createAuditLog({
-    userId: null as unknown as string,
+    userId: null,
     portId,
    action: 'create',
    entityType: 'interest',
@@ -11,6 +11,11 @@ const listQuerySchema = z.object({
   entityType: z.string().optional(),
 });
 
+// Saved views are owner-only by design: every service call filters by
+// (portId, userId), so any authenticated user can manage exactly their
+// own views. We deliberately skip `withPermission(...)` here — there is
+// no resource-level permission to add. See `savedViewsService` for the
+// ownership filter that backs this route.
 export const GET = withAuth(async (req, ctx) => {
   try {
     const { entityType } = parseQuery(req, listQuerySchema);
@@ -3,6 +3,7 @@
 import { useState } from 'react';
 import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
 import { CheckCircle2, HardDrive, Loader2, RefreshCw, ServerCog, XCircle } from 'lucide-react';
+import { toast } from 'sonner';
 
 import { PageHeader } from '@/components/shared/page-header';
 import { Button } from '@/components/ui/button';
@@ -56,6 +57,8 @@ export function StorageAdminPanel() {
       setDryRun(result.data);
       setConfirmOpen(true);
     },
+    onError: (e) =>
+      toast.error(e instanceof Error ? e.message : 'Storage migration dry-run failed'),
   });
 
   const migrateMutation = useMutation({
@@ -64,11 +67,14 @@ export function StorageAdminPanel() {
       method: 'POST',
       body: JSON.stringify({ ...opts, dryRun: false }),
     }),
-    onSuccess: () => {
+    onSuccess: (result) => {
       setConfirmOpen(false);
       setDryRun(null);
+      const copied = result.data.rowsMigrated ?? 0;
+      toast.success(`Storage migration complete (${copied} file${copied === 1 ? '' : 's'} copied)`);
       queryClient.invalidateQueries({ queryKey: ['admin', 'storage', 'status'] });
     },
+    onError: (e) => toast.error(e instanceof Error ? e.message : 'Storage migration failed'),
   });
 
   const testMutation = useMutation({
@@ -7,6 +7,7 @@ import { Pencil, Trash2, Plus, ShieldCheck, ShieldOff } from 'lucide-react';
 import { DataTable } from '@/components/shared/data-table';
 import { PageHeader } from '@/components/shared/page-header';
 import { ConfirmationDialog } from '@/components/shared/confirmation-dialog';
+import { PermissionGate } from '@/components/shared/permission-gate';
 import { Button } from '@/components/ui/button';
 import { Badge } from '@/components/ui/badge';
 import { apiFetch } from '@/lib/api/client';
@@ -111,10 +112,12 @@ export function UserList() {
       header: '',
       cell: ({ row }) => (
         <div className="flex items-center justify-end gap-1">
+          <PermissionGate resource="admin" action="manage_users">
           <Button variant="ghost" size="sm" onClick={() => handleEditUser(row.original)}>
             <Pencil className="h-4 w-4" />
             <span className="sr-only">Edit</span>
           </Button>
+          </PermissionGate>
           <ConfirmationDialog
             trigger={
               <Button variant="ghost" size="sm" className="text-destructive hover:text-destructive">
@@ -5,6 +5,8 @@ import { formatDistanceToNow } from 'date-fns';
 import { Mail } from 'lucide-react';
 
 import { apiFetch } from '@/lib/api/client';
+import { EmptyState } from '@/components/shared/empty-state';
+import { Skeleton } from '@/components/ui/skeleton';
 
 interface Thread {
   id: string;
@@ -27,20 +29,32 @@ export function EmailThreadsList() {
   });
 
   if (isLoading) {
-    return <p className="text-sm text-muted-foreground">Loading threads…</p>;
+    // Skeleton rows shaped like the real list so the layout doesn't pop.
+    return (
+      <div className="rounded-lg border divide-y">
+        {Array.from({ length: 4 }).map((_, i) => (
+          <div key={i} className="p-3 space-y-2">
+            <div className="flex items-center justify-between gap-2">
+              <Skeleton className="h-4 w-1/3" />
+              <Skeleton className="h-3 w-16" />
+            </div>
+            <Skeleton className="h-3 w-1/2" />
+            <Skeleton className="h-3 w-2/3" />
+          </div>
+        ))}
+      </div>
+    );
   }
 
   const threads = data?.data ?? [];
 
   if (threads.length === 0) {
     return (
-      <div className="rounded-lg border border-dashed p-8 text-center text-muted-foreground">
-        <Mail className="mx-auto h-6 w-6 mb-2" />
-        <p className="text-sm">No email threads yet.</p>
-        <p className="text-xs">
-          Connect an account and trigger a sync to see incoming threads here.
-        </p>
-      </div>
+      <EmptyState
+        icon={Mail}
+        title="No email threads yet"
+        description="Connect an account and trigger a sync to see incoming threads here."
+      />
     );
   }
 
@@ -11,6 +11,8 @@ import { format } from 'date-fns';
 import { Button } from '@/components/ui/button';
 import { Badge } from '@/components/ui/badge';
 import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
+import { PermissionGate } from '@/components/shared/permission-gate';
+import { toast } from 'sonner';
 import { Label } from '@/components/ui/label';
 import { Input } from '@/components/ui/input';
 import {
@@ -93,9 +95,11 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
   const sendMutation = useMutation({
     mutationFn: () => apiFetch(`/api/v1/invoices/${invoiceId}/send`, { method: 'POST' }),
     onSuccess: () => {
+      toast.success('Invoice sent');
       queryClient.invalidateQueries({ queryKey: ['invoices', invoiceId] });
       queryClient.invalidateQueries({ queryKey: ['invoices'] });
     },
+    onError: (e) => toast.error(e instanceof Error ? e.message : 'Could not send invoice'),
   });
 
   const paymentForm = useForm<RecordPaymentInput>({
@@ -110,9 +114,11 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
       body: values,
     }),
     onSuccess: () => {
+      toast.success('Payment recorded');
       queryClient.invalidateQueries({ queryKey: ['invoices', invoiceId] });
       queryClient.invalidateQueries({ queryKey: ['invoices'] });
     },
+    onError: (e) => toast.error(e instanceof Error ? e.message : 'Could not record payment'),
   });
 
   if (isLoading) {
@@ -150,6 +156,7 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
         </div>
         <div className="flex items-center gap-2">
           {invoice.status === 'draft' && (
+            <PermissionGate resource="invoices" action="send">
             <Button
               variant="outline"
               size="sm"
@@ -163,6 +170,7 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
               )}
               Send Invoice
             </Button>
+            </PermissionGate>
           )}
         </div>
       </div>
@@ -347,6 +355,7 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
             </CardContent>
           </Card>
         ) : (
+          <PermissionGate resource="invoices" action="record_payment">
           <Card>
             <CardHeader>
               <CardTitle className="text-sm font-medium">Record Payment</CardTitle>
@@ -358,7 +367,11 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
             >
               <div className="space-y-1">
                 <Label htmlFor="paymentDate">Payment Date</Label>
-                <Input id="paymentDate" type="date" {...paymentForm.register('paymentDate')} />
+                <Input
+                  id="paymentDate"
+                  type="date"
+                  {...paymentForm.register('paymentDate')}
+                />
                 {paymentForm.formState.errors.paymentDate && (
                   <p className="text-xs text-destructive">
                     {paymentForm.formState.errors.paymentDate.message}
@@ -404,6 +417,7 @@ export function InvoiceDetail({ invoiceId }: InvoiceDetailProps) {
             </form>
           </CardContent>
         </Card>
+          </PermissionGate>
       )}
     </TabsContent>
   </Tabs>
src/lib/db/migrations/0037_missing_fk_indexes.sql (new file, 38 lines)
@@ -0,0 +1,38 @@
+-- Audit-final v2 follow-up: cover the FK columns Postgres doesn't auto-index.
+-- Without these, deleting a parent row (or RESTRICT-checking on update) walks
+-- the child table fully. CREATE INDEX IF NOT EXISTS keeps the migration safe
+-- to re-run.
+
+-- berth_reservations
+CREATE INDEX IF NOT EXISTS idx_br_interest
+  ON berth_reservations(interest_id);
+CREATE INDEX IF NOT EXISTS idx_br_contract_file
+  ON berth_reservations(contract_file_id);
+
+-- documents (file FKs)
+CREATE INDEX IF NOT EXISTS idx_docs_file_id
+  ON documents(file_id);
+CREATE INDEX IF NOT EXISTS idx_docs_signed_file_id
+  ON documents(signed_file_id);
+
+-- document_events
+CREATE INDEX IF NOT EXISTS idx_de_signer
+  ON document_events(signer_id);
+
+-- document_templates
+CREATE INDEX IF NOT EXISTS idx_dt_source_file
+  ON document_templates(source_file_id);
+
+-- form_submissions
+CREATE INDEX IF NOT EXISTS idx_fs_template
+  ON form_submissions(form_template_id);
+CREATE INDEX IF NOT EXISTS idx_fs_client
+  ON form_submissions(client_id);
+
+-- document_sends
+CREATE INDEX IF NOT EXISTS idx_ds_brochure
+  ON document_sends(brochure_id);
+CREATE INDEX IF NOT EXISTS idx_ds_brochure_version
+  ON document_sends(brochure_version_id);
+CREATE INDEX IF NOT EXISTS idx_ds_sent_by
+  ON document_sends(sent_by_user_id);
@@ -0,0 +1,23 @@
+-- Audit-final v2 follow-up: document_sends.sent_by_user_id was a free-text
+-- column with no FK. If a user is hard-deleted (rare; we soft-delete), an
+-- orphan id remained without any ON DELETE handling. Add the FK with
+-- SET NULL semantics so the audit row keeps recipient + timestamp + body
+-- even when the originating user is removed.
+
+-- Drop the NOT NULL so SET NULL is legal.
+ALTER TABLE document_sends
+  ALTER COLUMN sent_by_user_id DROP NOT NULL;
+
+-- Defensive: if any historical rows have sent_by_user_id values that don't
+-- match an existing user (dev-only), null them out so the FK can attach.
+UPDATE document_sends
+  SET sent_by_user_id = NULL
+  WHERE sent_by_user_id IS NOT NULL
+    AND sent_by_user_id NOT IN (SELECT id FROM "user");
+
+ALTER TABLE document_sends
+  ADD CONSTRAINT document_sends_sent_by_user_id_user_id_fk
+  FOREIGN KEY (sent_by_user_id) REFERENCES "user"(id) ON DELETE SET NULL;
+
+CREATE INDEX IF NOT EXISTS idx_ds_sent_by
+  ON document_sends(sent_by_user_id);
@@ -260,6 +260,20 @@
       "when": 1778100000000,
       "tag": "0036_polymorphic_check_constraints",
       "breakpoints": true
+    },
+    {
+      "idx": 37,
+      "version": "7",
+      "when": 1778150000000,
+      "tag": "0037_missing_fk_indexes",
+      "breakpoints": true
+    },
+    {
+      "idx": 38,
+      "version": "7",
+      "when": 1778200000000,
+      "tag": "0038_document_sends_sent_by_user_fk",
+      "breakpoints": true
     }
   ]
 }
@@ -13,6 +13,7 @@ import { ports } from './ports';
 import { clients } from './clients';
 import { interests } from './interests';
 import { berths } from './berths';
+import { user } from './users';
 
 /**
  * Port-wide brochures (Phase 7 — see plan §3.3 / §4.8).
@@ -123,7 +124,12 @@ export const documentSends = pgTable(
     }),
     /** Exact body used (after merge-field expansion + sanitization). */
     bodyMarkdown: text('body_markdown'),
-    sentByUserId: text('sent_by_user_id').notNull(),
+    /**
+     * better-auth user id of the sender. SET NULL on user delete so the
+     * audit row keeps `recipientEmail` + timestamp + body for compliance
+     * even when the originating user is removed from the system.
+     */
+    sentByUserId: text('sent_by_user_id').references(() => user.id, { onDelete: 'set null' }),
     fromAddress: text('from_address').notNull(),
     sentAt: timestamp('sent_at', { withTimezone: true }).notNull().defaultNow(),
     /** SMTP provider message-id for deliverability tracking. */
@@ -143,6 +149,11 @@ export const documentSends = pgTable(
     index('idx_ds_interest').on(t.interestId, t.sentAt),
     index('idx_ds_berth').on(t.berthId, t.sentAt),
     index('idx_ds_port').on(t.portId, t.sentAt),
+    // Reverse-lookups: "what sends used this brochure / version" and
+    // FK-RESTRICT scans on brochure delete.
+    index('idx_ds_brochure').on(t.brochureId),
+    index('idx_ds_brochure_version').on(t.brochureVersionId),
+    index('idx_ds_sent_by').on(t.sentByUserId),
   ],
 );
 
@@ -80,6 +80,11 @@ export const documents = pgTable(
     index('idx_docs_reservation').on(table.reservationId),
     index('idx_docs_type').on(table.portId, table.documentType),
     index('idx_docs_status_port').on(table.portId, table.status),
+    // Cover the file FKs Postgres doesn't auto-index. Without these,
+    // deleting (or RESTRICT-checking) a referenced files row scans
+    // the documents table fully.
+    index('idx_docs_file_id').on(table.fileId),
+    index('idx_docs_signed_file_id').on(table.signedFileId),
   ],
 );
 
@@ -122,6 +127,8 @@ export const documentEvents = pgTable(
   },
   (table) => [
     index('idx_de_doc').on(table.documentId),
+    // Reverse-lookup signer→events without scanning the events table.
+    index('idx_de_signer').on(table.signerId),
     uniqueIndex('idx_de_dedup')
       .on(table.documentId, table.signatureHash)
       .where(sql`${table.signatureHash} IS NOT NULL`),
@@ -161,6 +168,7 @@ export const documentTemplates = pgTable(
   (table) => [
     index('idx_dt_port').on(table.portId),
     index('idx_dt_type').on(table.portId, table.templateType),
+    index('idx_dt_source_file').on(table.sourceFileId),
   ],
 );
 
@@ -221,7 +229,11 @@ export const formSubmissions = pgTable(
     submittedAt: timestamp('submitted_at', { withTimezone: true }),
     createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
   },
-  (table) => [uniqueIndex('idx_fs_token').on(table.token)],
+  (table) => [
+    uniqueIndex('idx_fs_token').on(table.token),
+    index('idx_fs_template').on(table.formTemplateId),
+    index('idx_fs_client').on(table.clientId),
+  ],
 );
 
 export type File = typeof files.$inferSelect;
@@ -41,6 +41,11 @@ export const berthReservations = pgTable(
     index('idx_br_client').on(table.clientId),
     index('idx_br_yacht').on(table.yachtId),
     index('idx_br_port').on(table.portId),
+    // Cover the FKs Postgres doesn't auto-index. Without these, deleting
+    // (or restrict-checking) the parent interest / contract file row
+    // requires a full scan of berth_reservations.
+    index('idx_br_interest').on(table.interestId),
+    index('idx_br_contract_file').on(table.contractFileId),
     uniqueIndex('idx_br_active')
       .on(table.berthId)
       .where(sql`${table.status} = 'active'`),
@@ -15,6 +15,31 @@ export const logger = pino({
       '*.secret',
       '*.accessKey',
       '*.secretKey',
+      // Encrypted credential blobs surface in storage / smtp config logs
+      // unintentionally; redact them defensively even though they're
+      // already AES-encrypted at rest.
+      '*.secretKeyEncrypted',
+      '*.smtpPassEncrypted',
+      '*.imapPassEncrypted',
+      '*.proxyHmacSecretEncrypted',
+      // HTTP authorization headers (Bearer tokens, Basic creds) leak via
+      // err.config.headers on http-client error logs.
+      '*.headers.authorization',
+      '*.headers.Authorization',
+      '*.headers["x-documenso-secret"]',
+      '*.config.headers.Authorization',
+      '*.config.headers.authorization',
+      // Cookie headers can carry session tokens.
+      '*.headers.cookie',
+      '*.headers.Cookie',
+      // Two-level nesting for things like `req.headers.authorization` or
+      // `cfg.s3.secretKeyEncrypted`.
+      '*.*.password',
+      '*.*.token',
+      '*.*.secret',
+      '*.*.secretKeyEncrypted',
+      '*.*.headers.authorization',
+      '*.*.headers.Authorization',
     ],
     censor: '[REDACTED]',
   },
@@ -4,6 +4,7 @@ import path from 'node:path';
|
|||||||
import { PDFDocument } from 'pdf-lib';
|
 import { PDFDocument } from 'pdf-lib';
 import type { EoiContext } from '@/lib/services/eoi-context';
+import { logger } from '@/lib/logger';
 
 /**
  * Source PDF for the in-app EOI pathway. Must contain AcroForm fields whose
@@ -48,6 +49,28 @@ function setText(form: ReturnType<PDFDocument['getForm']>, name: string, value:
   }
 }
 
+/**
+ * Special-cased setter for the multi-berth `Berth Range` field. When the
+ * caller has a non-empty range and the AcroForm field is missing, we log
+ * a warning so the deployment gap is observable (the in-app pathway is
+ * intentionally tolerant of older PDF templates, but ops needs to know
+ * when ranges are silently dropped — otherwise a customer's multi-berth
+ * EOI ships with only the primary mooring visible).
+ */
+function setBerthRange(form: ReturnType<PDFDocument['getForm']>, value: string): void {
+  try {
+    form.getTextField('Berth Range').setText(value);
+  } catch {
+    if (value && value.trim().length > 0) {
+      logger.warn(
+        { berthRange: value },
+        'EOI in-app PDF template is missing the "Berth Range" AcroForm field — ' +
+          'multi-berth bundle range string was dropped. Update the source template.',
+      );
+    }
+  }
+}
+
 function setCheckbox(
   form: ReturnType<PDFDocument['getForm']>,
   name: string,
@@ -88,9 +111,12 @@ export async function fillEoiFormFields(
   setText(form, 'Draft', context.yacht?.draftFt ?? '');
   setText(form, 'Berth Number', context.berth?.mooringNumber ?? '');
   // Multi-berth EOI: compact range string from the interest's EOI bundle.
-  // Falls back silently when the AcroForm field doesn't exist (older
-  // template revisions without the field still fill cleanly).
-  setText(form, 'Berth Range', context.eoiBerthRange);
+  // The AcroForm field may be absent on an older template revision —
+  // when the context HAS a non-empty range string but the field is
+  // missing we surface a structured warning so the deployment gap is
+  // observable (the CRM dataset has multi-berth bundles but the live
+  // PDF template needs the field added before they render correctly).
+  setBerthRange(form, context.eoiBerthRange);
 
   setCheckbox(form, 'Purchase', true);
   setCheckbox(form, 'Lease_10', false);

@@ -9,6 +9,40 @@ import { QUEUE_CONFIGS } from '@/lib/queue';
 const MAX_OUTPUT_BYTES = 10 * 1024; // 10 KB
 const OPENAI_TIMEOUT_MS = 30_000; // 30 s
 
+interface RecordAiUsageArgs {
+  portId: string;
+  userId: string;
+  feature: string;
+  provider: 'openai' | 'claude' | 'tesseract';
+  model: string;
+  inputTokens: number;
+  outputTokens: number;
+  totalTokens: number;
+  requestId: string | null;
+}
+
+/**
+ * Insert one ai_usage_ledger row per provider call. Best-effort — the
+ * draft generation is the user-facing artefact, the ledger is
+ * observability. Imports are lazy so this module loads cleanly inside
+ * the worker bundle without dragging the DB layer in at import time.
+ */
+async function recordAiUsage(args: RecordAiUsageArgs): Promise<void> {
+  const { db } = await import('@/lib/db');
+  const { aiUsageLedger } = await import('@/lib/db/schema/ai-usage');
+  await db.insert(aiUsageLedger).values({
+    portId: args.portId,
+    userId: args.userId,
+    feature: args.feature,
+    provider: args.provider,
+    model: args.model,
+    inputTokens: args.inputTokens,
+    outputTokens: args.outputTokens,
+    totalTokens: args.totalTokens,
+    requestId: args.requestId,
+  });
+}
+
 interface GenerateEmailDraftPayload {
   interestId: string;
   clientId: string;
@@ -150,7 +184,13 @@ async function generateEmailDraft(payload: GenerateEmailDraftPayload): Promise<D
   }
 
   const data = (await response.json()) as {
+    id?: string;
     choices: Array<{ message: { content: string } }>;
+    usage?: {
+      prompt_tokens?: number;
+      completion_tokens?: number;
+      total_tokens?: number;
+    };
   };
 
   const content = data.choices[0]?.message?.content ?? '{}';
@@ -160,6 +200,24 @@ async function generateEmailDraft(payload: GenerateEmailDraftPayload): Promise<D
     throw new Error('AI output exceeded 10 KB cap');
   }
 
+  // Record token usage so admins can audit spend + future per-port
+  // budget caps have a history to read from. Failure here must not
+  // bubble up — the email draft is the user-facing artefact, the
+  // ledger is observability.
+  void recordAiUsage({
+    portId,
+    userId: payload.requestedBy,
+    feature: 'reply_draft',
+    provider: 'openai',
+    model: 'gpt-4o-mini',
+    inputTokens: data.usage?.prompt_tokens ?? 0,
+    outputTokens: data.usage?.completion_tokens ?? 0,
+    totalTokens: data.usage?.total_tokens ?? 0,
+    requestId: data.id ?? null,
+  }).catch((err) => {
+    logger.warn({ err, interestId }, 'Failed to record AI usage ledger row');
+  });
+
   const parsed = JSON.parse(content) as { subject?: string; body?: string };
   subject = parsed.subject ?? `Follow-up: ${client.fullName}`;
   body = parsed.body ?? '';

@@ -382,6 +382,11 @@ export async function placeFields(
       pageWidth: Math.round((f.pageWidth / 100) * dims.width),
       pageHeight: Math.round((f.pageHeight / 100) * dims.height),
     };
+    // Retry transient failures so one flaky 5xx mid-loop doesn't leave
+    // the document with a partial field set. 3 attempts at 250 / 500 /
+    // 1000 ms; 4xx responses (validation errors) fail-fast.
+    let lastError: { status: number; body: string } | null = null;
+    for (let attempt = 0; attempt < 3; attempt += 1) {
     const res = await fetch(`${baseUrl}/api/v1/documents/${docId}/fields`, {
       method: 'POST',
       headers: {
@@ -390,10 +395,25 @@ export async function placeFields(
       },
       body: JSON.stringify(body),
     });
-    if (!res.ok) {
-      const err = await res.text();
-      logger.error({ docId, status: res.status, err, portId }, 'Documenso v1 placeField error');
-      throw new Error(`Documenso v1 placeField error: ${res.status}`);
+      if (res.ok) {
+        lastError = null;
+        break;
+      }
+      const errBody = await res.text().catch(() => '');
+      lastError = { status: res.status, body: errBody };
+      // Don't retry on 4xx — that's a validation error, won't change.
+      if (res.status >= 400 && res.status < 500) break;
+      // Backoff: 250ms, 500ms (skipped on the 3rd iteration because we exit).
+      if (attempt < 2) {
+        await new Promise((r) => setTimeout(r, 250 * Math.pow(2, attempt)));
+      }
+    }
+    if (lastError) {
+      logger.error(
+        { docId, status: lastError.status, err: lastError.body, portId },
+        'Documenso v1 placeField error',
+      );
+      throw new Error(`Documenso v1 placeField error: ${lastError.status}`);
     }
   }
 }

@@ -362,7 +362,21 @@ function resolveHmacSecret(encryptedSecret: string | null): string {
   // Dev fallback: derive a stable per-process secret so the filesystem
   // backend works without explicit configuration during local development.
   const seed = process.env.BETTER_AUTH_SECRET ?? env.BETTER_AUTH_SECRET ?? 'storage-default';
-  return createHash('sha256').update(`storage-proxy:${seed}`).digest('hex');
+  const derived = createHash('sha256').update(`storage-proxy:${seed}`).digest('hex');
+  // Warn once at boot so two processes started with different
+  // `BETTER_AUTH_SECRET` values are observable: tokens minted by one
+  // wouldn't validate on the other otherwise — which surfaces as random
+  // 401s on file downloads in dev.
+  logger.warn(
+    {
+      hint:
+        'Storage proxy HMAC derived from BETTER_AUTH_SECRET. ' +
+        'Multi-process dev setups must share the same secret value.',
+      secretFingerprint: derived.slice(0, 8),
+    },
+    'FilesystemBackend: using DEV HMAC fallback (no storage_proxy_hmac_secret_encrypted set)',
+  );
+  return derived;
 }
 
 async function streamToBuffer(stream: NodeJS.ReadableStream): Promise<Buffer> {

@@ -107,6 +107,37 @@ export class S3Backend implements StorageBackend {
       secretKey: resolved.secretKey,
       region: resolved.region,
     });
+    // Verify the bucket exists at boot so a typo / missing-bucket admin
+    // error surfaces with a clear message instead of as a vague Minio
+    // error inside the first user-facing request that touches storage.
+    // Logged-not-thrown when MINIO_AUTO_CREATE_BUCKET=true and the bucket
+    // is missing — we'll create it. Otherwise we throw so the boot fails
+    // fast and the deployment-time misconfig is loud.
+    try {
+      const exists = await client.bucketExists(resolved.bucket);
+      if (!exists) {
+        if (process.env.MINIO_AUTO_CREATE_BUCKET === 'true') {
+          await client.makeBucket(resolved.bucket, resolved.region);
+          logger.info(
+            { bucket: resolved.bucket, endpoint: resolved.endpoint },
+            'S3 bucket auto-created (MINIO_AUTO_CREATE_BUCKET=true)',
+          );
+        } else {
+          throw new Error(
+            `S3 bucket "${resolved.bucket}" does not exist on ${resolved.endpoint}. ` +
+              `Create it manually or set MINIO_AUTO_CREATE_BUCKET=true.`,
+          );
+        }
+      }
+    } catch (err) {
+      if (err instanceof Error && err.message.includes('does not exist')) throw err;
+      // Connection / auth errors get re-thrown with extra context.
+      logger.error(
+        { err, bucket: resolved.bucket, endpoint: resolved.endpoint },
+        'S3 bucket existence check failed at backend boot',
+      );
+      throw err;
+    }
     return new S3Backend(client, resolved.bucket);
   }