fix(audit-2): integration regressions + data-integrity from second-pass review

Two reviewer agents did a second-pass deep audit of the 21-commit
refactor. Eight findings; four fixed here (one was deferred with a
schema comment, three were 🟡 nice-to-haves left for follow-up).

Integration regressions (🟠 high):
- Outbound webhook `interest.berth_linked` now fires from the new
  junction-add handler; previously it emitted a socket-only event,
  leaving external integrations silent post-refactor.
- Two new webhook events `interest.berth_unlinked` and
  `interest.berth_link_updated` added to WEBHOOK_EVENTS +
  INTERNAL_TO_WEBHOOK_MAP. PATCH and DELETE handlers now dispatch them
  alongside the existing socket emits — lifecycle parity restored.
- BerthInterestPulse adds useRealtimeInvalidation for berth-link
  events. The query key was berth-scoped while the linked-berths
  dialog invalidates interest-scoped keys (no prefix match), so the
  pulse went stale. Bridges via the realtime hook now.
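
The stale-pulse mechanism comes down to prefix matching of query keys. A minimal sketch of that matching (hypothetical key shapes, not the app's real keys; TanStack Query's `invalidateQueries` compares leading key segments structurally):

```typescript
// Simplified prefix match in the spirit of invalidateQueries: a cached
// key is invalidated when its leading segments equal the given prefix.
function matchesPrefix(cached: readonly unknown[], prefix: readonly unknown[]): boolean {
  return prefix.every((seg, i) => JSON.stringify(cached[i]) === JSON.stringify(seg));
}

// Hypothetical keys: the pulse caches under a berth-scoped key, while the
// linked-berths dialog invalidates an interest-scoped prefix. No shared
// prefix, so the pulse's cache entry is never marked stale.
const pulseKey = ['berths', 'berth-123', 'interest-pulse'];
const dialogPrefix = ['interests', 'interest-456', 'berths'];

const pulseInvalidated = matchesPrefix(pulseKey, dialogPrefix); // false
const berthScopedHit = matchesPrefix(pulseKey, ['berths']);     // true
```

Bridging through a realtime-event hook sidesteps the mismatch: each consumer invalidates its own keys in response to the event instead of one side having to guess the other's key shape.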

Recommender semantic fix (🟠 medium-high):
- aggregates CTE: active_interest_count now filters on
  `ib.is_specific_interest = true`, matching the public-map "Under
  Offer" derivation. EOI-bundle-only links no longer demote a berth
  to Tier C for other reps. Smoke test confirms previously-all-Tier-C
  results now correctly classify as Tier A.
- Same CTE: `total_interest_count` uses COUNT(ib.berth_id) instead of
  COUNT(*) so a berth with no junction rows reports 0 (not 1 from
  the LEFT JOIN's NULL-right-side row). Prevents heat over-counting.
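
The COUNT(*) pitfall is easy to reproduce outside SQL. A minimal TypeScript model of the LEFT JOIN (illustrative names, not the app's schema):

```typescript
// A LEFT JOIN preserves the left row even with no right-side matches,
// emitting one joined row whose right-side columns are all NULL.
// COUNT(*) tallies that row; COUNT(ib.berth_id) skips NULLs, reporting 0.
type JoinedRow = { berthId: string; junctionBerthId: string | null };

function leftJoin(berthId: string, junctionRows: string[]): JoinedRow[] {
  if (junctionRows.length === 0) {
    return [{ berthId, junctionBerthId: null }]; // the NULL-right-side row
  }
  return junctionRows.map((j) => ({ berthId, junctionBerthId: j }));
}

const rows = leftJoin('b1', []); // a berth with no interest history
const countStar = rows.length;                                          // COUNT(*)           -> 1
const countCol = rows.filter((r) => r.junctionBerthId !== null).length; // COUNT(ib.berth_id) -> 0
```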

Data integrity (🟠):
- AcroForm tier rejects negative numerics in coerceFieldValue (was
  letting through `length_ft="-50"` which would poison the
  recommender feasibility filter on apply).
- FilesystemBackend.resolveHmacSecret throws in production when
  storage_proxy_hmac_secret_encrypted is null. Dev still derives from
  BETTER_AUTH_SECRET for ergonomics; prod must explicitly configure.
- Documented the circular FK between berths.current_pdf_version_id
  and berth_pdf_versions.id. Drizzle's `.references()` can't express
  the cycle so the schema column is plain text + a comment; the FK
  is authoritatively maintained by migration 0030.
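
The coercion guard itself is small; a standalone sketch of its semantics (the real `coerceFieldValue` dispatches on field kind first, so this covers only the numeric branch):

```typescript
// Strip currency symbols, unit suffixes, and commas, then reject anything
// non-finite or negative: berth dimensions, capacities, and prices are all
// non-negative, so "-50" maps to null instead of reaching the recommender.
function coerceNumeric(raw: string): number | null {
  const numeric = Number(raw.replace(/[^0-9.\-]/g, ''));
  if (!Number.isFinite(numeric)) return null;
  return numeric < 0 ? null : numeric;
}

const a = coerceNumeric('$1,250'); // 1250
const b = coerceNumeric('-50');    // null (negative rejected)
const c = coerceNumeric('1.2.3');  // null (NaN after strip)
```

One caveat when reusing this pattern: a string with no digits at all strips to `''`, and `Number('')` is 0, so fully non-numeric input coerces to 0 rather than null.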

Tests still 1163/1163. tsc clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Author: Matt Ciaccio
Date: 2026-05-05 04:20:38 +02:00
Commit: a3e002852b (parent 312ebf1a88)
8 changed files with 85 additions and 8 deletions


@@ -92,6 +92,17 @@ export const berths = pgTable(
   ],
 );
 
+// Note: `berths.current_pdf_version_id` has an `ON DELETE SET NULL` FK to
+// `berth_pdf_versions.id` installed by migration 0030. The column is left
+// without a `.references()` / `foreignKey()` declaration in the Drizzle
+// schema because the two tables form a circular FK (berth_pdf_versions →
+// berths), and Drizzle's relation inference doesn't tolerate the cycle
+// when both sides are declared via column-level `.references()`. The
+// migration chain authoritatively maintains the constraint; a fresh
+// `db:push` against an empty DB would skip the FK and require a follow-up
+// generated migration to add it back. This is acceptable because we
+// always apply migrations in order in dev/CI/prod.
+
 export const berthMapData = pgTable(
   'berth_map_data',
   {


@@ -470,9 +470,13 @@ function coerceFieldValue(key: keyof ExtractedBerthFields, raw: string): string
     }
     return raw;
   }
-  // Numeric columns: strip currency / unit suffixes and commas.
+  // Numeric columns: strip currency / unit suffixes and commas. Berth
+  // dimensions / capacities / prices are all non-negative — reject
+  // negatives outright so an AcroForm with `length_ft="-50"` doesn't
+  // poison the recommender feasibility filter when applied.
   const numeric = Number(raw.replace(/[^0-9.\-]/g, ''));
-  return Number.isFinite(numeric) ? numeric : null;
+  if (!Number.isFinite(numeric)) return null;
+  return numeric < 0 ? null : numeric;
 }
 
 /** Parse a human date like "September 15 2025" → "2025-09-15". */


@@ -448,7 +448,15 @@ export async function recommendBerths(args: RecommendBerthsArgs): Promise<Recomm
     aggregates AS (
       SELECT
         f.id AS berth_id,
-        COUNT(*) FILTER (WHERE i.archived_at IS NULL AND i.outcome IS NULL) AS active_interest_count,
+        -- Active = is_specific_interest=true junction rows only (matches
+        -- the public-map "Under Offer" filter). An EOI-bundle-only link
+        -- (is_specific_interest=false, is_in_eoi_bundle=true) is legal
+        -- coverage, not a pitch, and shouldn't demote the berth.
+        COUNT(*) FILTER (
+          WHERE i.archived_at IS NULL
+            AND i.outcome IS NULL
+            AND ib.is_specific_interest = true
+        ) AS active_interest_count,
         COUNT(*) FILTER (
           WHERE i.outcome IS NOT NULL AND (i.outcome::text LIKE 'lost%' OR i.outcome = 'cancelled')
         ) AS lost_count,
@@ -483,7 +491,11 @@ export async function recommendBerths(args: RecommendBerthsArgs): Promise<Recomm
         ) FILTER (WHERE i.outcome IS NOT NULL AND (i.outcome::text LIKE 'lost%' OR i.outcome = 'cancelled')),
           0
         ) AS fallthrough_max_stage,
-        COUNT(*) AS total_interest_count,
+        -- COUNT(ib.berth_id) (not COUNT(*)) so a berth with no junction
+        -- rows reports 0 — the LEFT JOIN otherwise produces a single
+        -- NULL-right-side row that COUNT(*) would tally as 1 and inflate
+        -- the heat interest-count component for berths with no history.
+        COUNT(ib.berth_id) AS total_interest_count,
         COUNT(*) FILTER (WHERE i.eoi_status = 'signed') AS eoi_signed_count
       FROM feasible f
       LEFT JOIN interest_berths ib ON ib.berth_id = f.id


@@ -11,6 +11,8 @@ export const WEBHOOK_EVENTS = [
   'interest.created',
   'interest.stage_changed',
   'interest.berth_linked',
+  'interest.berth_unlinked',
+  'interest.berth_link_updated',
   'berth.status_changed',
   'berth.updated',
   'document.sent',
@@ -51,6 +53,8 @@ export const INTERNAL_TO_WEBHOOK_MAP: Record<string, WebhookEvent> = {
   'interest:created': 'interest.created',
   'interest:stageChanged': 'interest.stage_changed',
   'interest:berthLinked': 'interest.berth_linked',
+  'interest:berthUnlinked': 'interest.berth_unlinked',
+  'interest:berthLinkUpdated': 'interest.berth_link_updated',
   'berth:statusChanged': 'berth.status_changed',
   'berth:updated': 'berth.updated',
   'document:sent': 'document.sent',


@@ -314,9 +314,19 @@ function resolveHmacSecret(encryptedSecret: string | null): string {
       logger.error({ err }, 'Failed to decrypt storage_proxy_hmac_secret_encrypted');
     }
   }
-  // Derive a stable per-process secret from BETTER_AUTH_SECRET so dev mode
-  // works without explicit configuration. In production the admin UI writes
-  // an encrypted random secret.
+  // Production refuses to derive: an admin must have explicitly configured
+  // `storage_proxy_hmac_secret_encrypted` before flipping the storage
+  // backend to filesystem. Conflating this trust domain with the auth
+  // cookie HMAC (BETTER_AUTH_SECRET) is acceptable in dev for ergonomics
+  // but a deployment-time misconfig in prod.
+  if (process.env.NODE_ENV === 'production') {
+    throw new Error(
+      'FilesystemBackend: storage_proxy_hmac_secret_encrypted must be set in production. ' +
+        'Generate a random secret in admin > storage and persist it before flipping the backend.',
+    );
+  }
+  // Dev fallback: derive a stable per-process secret so the filesystem
+  // backend works without explicit configuration during local development.
   const seed = process.env.BETTER_AUTH_SECRET ?? env.BETTER_AUTH_SECRET ?? 'storage-default';
   return createHash('sha256').update(`storage-proxy:${seed}`).digest('hex');
 }