Compare commits


20 Commits

Author SHA1 Message Date
Matt
e5b7cdf670 Add document analysis: page count, text extraction & language detection
Some checks failed
Build and Push Docker Image / build (push) Failing after 11s
Introduces a document analyzer service that extracts page count (via pdf-parse),
text preview, and detected language (via franc) from uploaded files. Analysis runs
automatically on upload (configurable via SystemSettings) and can be triggered
retroactively for existing files. Results are displayed as badges in the FileViewer
and fed to AI screening for language-based filtering criteria.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 10:08:20 +01:00
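The badge logic described in the commit above can be sketched as follows. The `DocumentAnalysis` shape and the label map are assumptions for illustration; `language` stands in for the ISO 639-3 code that franc returns, where `"und"` means undetermined:

```typescript
// Hypothetical shape of an analysis result; field names are assumptions.
interface DocumentAnalysis {
  pageCount: number | null;
  textPreview: string;
  language: string | null; // ISO 639-3 code from franc, e.g. "eng"
}

// franc reports "und" for short or ambiguous text, so a guard like this
// keeps noisy badges out of the FileViewer.
function languageBadge(analysis: DocumentAnalysis): string | null {
  if (!analysis.language || analysis.language === "und") return null;
  const labels: Record<string, string> = { eng: "English", deu: "German", fra: "French" };
  return labels[analysis.language] ?? analysis.language.toUpperCase();
}
```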
Matt
90f36ac9b2 Retroactive auto-PASS for projects with complete documents
Wire batchCheckRequirementsAndTransition into round activation and reopen
so pre-existing projects that already have all required docs get auto-
passed. Also adds checkDocumentCompletion endpoint for manual sweeps on
already-active rounds.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 10:08:20 +01:00
Matt
a921731c52 Pass tag confidence scores to AI assignment for weighted matching
The AI assignment path was receiving project tags as flat strings, losing
the confidence scores from AI tagging. Now both the GPT path and the
fallback algorithm weight tag matches by confidence — a 0.9 tag matters
more than a 0.5 one.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 10:08:20 +01:00
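The confidence weighting described above can be sketched like this; the types and function name are illustrative, not the actual assignment code:

```typescript
interface WeightedTag { name: string; confidence: number } // confidence in [0, 1]

// Sum of confidences over tags shared between project and juror expertise:
// a 0.9 tag contributes almost twice what a 0.5 tag does, instead of both
// counting as a flat match.
function weightedTagScore(projectTags: WeightedTag[], jurorTags: Set<string>): number {
  return projectTags
    .filter((t) => jurorTags.has(t.name))
    .reduce((sum, t) => sum + t.confidence, 0);
}
```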
Matt
fc8e58f985 Auto-transition projects to PASSED when all required documents uploaded
Add checkRequirementsAndTransition() to round-engine that checks if all
required FileRequirements for a round are satisfied by uploaded files.
When all are met and the project is PENDING/IN_PROGRESS, it auto-
transitions to PASSED. Also adds batchCheckRequirementsAndTransition()
for bulk operations.

Wired into:
- file.adminUploadForRoundRequirement (admin bulk upload)
- applicant.saveFileMetadata (applicant self-upload)

Non-fatal: failures in the check never break the upload itself.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 10:08:20 +01:00
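The transition rule above reduces to a small pure function. This is a sketch, not the round-engine implementation; the empty-requirements guard is an added assumption:

```typescript
type ProjectStatus = "PENDING" | "IN_PROGRESS" | "PASSED" | "REJECTED";

// Auto-pass only when every required FileRequirement has an uploaded file
// and the project is still in a pre-decision status.
// Assumption: a round with zero requirements does not auto-pass anyone.
function nextStatus(
  status: ProjectStatus,
  requiredIds: string[],
  uploadedForRequirement: Set<string>,
): ProjectStatus {
  const allMet =
    requiredIds.length > 0 && requiredIds.every((id) => uploadedForRequirement.has(id));
  const eligible = status === "PENDING" || status === "IN_PROGRESS";
  return allMet && eligible ? "PASSED" : status;
}
```

The "non-fatal" note means the upload paths would wrap this check in a try/catch so a failure in it can never fail the upload itself.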
Matt
e547d2bd03 Add auto-pass & advance for intake rounds (no manual marking needed)
Some checks failed
Build and Push Docker Image / build (push) Failing after 7s
For INTAKE, SUBMISSION, and MENTORING rounds, the Advance Projects dialog
now shows a simplified "Advance All" flow that auto-passes all pending
projects and advances them in one click. Backend accepts autoPassPending
flag to bulk-set PENDING→PASSED before advancing. Jury/evaluation rounds
keep the existing per-project selection workflow.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 19:09:23 +01:00
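The `autoPassPending` flag amounts to a bulk status promotion before advancement; a minimal sketch (the function name and list-of-statuses shape are invented for illustration):

```typescript
type ProjectStatus = "PENDING" | "PASSED" | "REJECTED";

// With autoPassPending set, every PENDING project is promoted to PASSED
// before the advance step; other statuses are left alone.
function applyAutoPass(statuses: ProjectStatus[], autoPassPending: boolean): ProjectStatus[] {
  if (!autoPassPending) return statuses;
  return statuses.map((s) => (s === "PENDING" ? "PASSED" : s));
}
```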
Matt
f731f96a0a Hide jury stat card in header for non-jury rounds (INTAKE, FILTERING, etc.)
Some checks failed
Build and Push Docker Image / build (push) Failing after 8s
The jury selector card in the stats bar was still visible on round types
where juries don't apply. Now conditionally rendered based on hasJury,
with the grid adjusting from 4 to 3 columns accordingly.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 18:33:08 +01:00
Matt
09049d2911 Round management: tab cleanup, date pickers, advancement workflow
- Remove Document Windows tab (round dates + file requirements in
  Config are sufficient, separate SubmissionWindow was redundant)
- Restrict Jury and Awards tabs to round types that use them
  (EVALUATION, LIVE_FINAL, DELIBERATION only)
- Add Round Dates card in Config tab with DateTimePicker for
  start/end dates (supports past and future dates)
- Make Advance Projects button always visible when projects exist
  (dimmed with guidance when no projects are PASSED yet)
- Add Close & Advance combined quick action to streamline round
  progression workflow
- Add target round selector to Advance Projects dialog so admin
  can pick which round to advance projects into

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 16:43:23 +01:00
Matt
3fb0d128a1 Fix missing query invalidations across member management
Add utils.user.list.invalidate() after mutations that change user
status to ensure member lists refresh without manual page reload:
- Member detail page: after update and send invitation
- User mobile actions: after send invitation
- Add member dialog: after send invitation in jury group flow

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 16:16:23 +01:00
Matt
5965f7889d Platform-wide UX fixes: assignment dialog, invalidation, settings, dashboard
1. Assignment dialog overhaul: replace raw UUID inputs with searchable
   juror Combobox (shows name, email, capacity) and multi-select project
   checklist with bulk assignment support

2. Query invalidation sweep: fix missing invalidations in
   assignment-preview-sheet (roundAssignment.execute) and
   filtering-dashboard (filtering.finalizeResults) so data refreshes
   without page reload

3. Rename Submissions tab to Document Windows with descriptive
   header explaining upload window configuration

4. Connect 6 disconnected settings: storage_provider, local_storage_path,
   avatar_max_size_mb, allowed_image_types, whatsapp_enabled,
   whatsapp_provider - all now accessible in Settings UI

5. Admin dashboard redesign: branded Editorial Command Center with
   Dark Blue gradient header, colored border-l-4 stat cards, staggered
   animations, 2-column layout, action-required panel, activity timeline

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 16:05:25 +01:00
Matt
b2279067e2 Add LiteLLM proxy support for ChatGPT subscription AI access
- Add ai_provider setting: 'openai' (API key) or 'litellm' (ChatGPT subscription proxy)
- Auto-strip max_tokens/max_completion_tokens for chatgpt/ prefix models
  (ChatGPT subscription backend rejects token limit fields)
- LiteLLM mode: dummy API key when none configured, base URL required
- isOpenAIConfigured() checks base URL instead of API key for LiteLLM
- listAvailableModels() returns manualEntry flag for LiteLLM (no models.list)
- Settings UI: conditional fields, info banner, manual model input with
  chatgpt/ prefix examples when LiteLLM selected
- All 7 AI services work transparently via buildCompletionParams()

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 15:48:34 +01:00
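The token-limit stripping for `chatgpt/` models might look like this. `buildCompletionParams` is the name from the commit, but its real signature is not shown here, so this shape is an assumption:

```typescript
// Strip token-limit fields for chatgpt/-prefixed models, since the
// ChatGPT subscription backend rejects them; pass everything else through.
function buildCompletionParams(
  model: string,
  params: Record<string, unknown>,
): Record<string, unknown> {
  if (model.startsWith("chatgpt/")) {
    const { max_tokens, max_completion_tokens, ...rest } = params;
    return { model, ...rest };
  }
  return { model, ...params };
}
```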
Matt
014bb15890 Reduce AI costs: switch tagging to gpt-4o-mini, add custom base URL support
Some checks failed
Build and Push Docker Image / build (push) Failing after 7s
- Change AI tagging to use AI_MODELS.QUICK (gpt-4o-mini) instead of gpt-4o for
  10-15x cost reduction on classification tasks
- Add openai_base_url system setting for OpenAI-compatible providers
  (OpenRouter, Groq, Together AI, local models)
- Reset OpenAI client singleton when API key, base URL, or model changes
- Add base URL field to AI settings form with provider examples

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 15:34:59 +01:00
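The singleton reset can be sketched as a cache keyed on the settings it was built from; a plain object stands in for the real OpenAI client construction:

```typescript
type Client = { apiKey: string; baseUrl: string };

let cached: { fingerprint: string; client: Client } | null = null;

// Rebuild the client whenever the API key or base URL changes, so a
// settings update takes effect without a process restart.
function getClient(apiKey: string, baseUrl = "https://api.openai.com/v1"): Client {
  const fingerprint = `${apiKey}|${baseUrl}`;
  if (!cached || cached.fingerprint !== fingerprint) {
    cached = { fingerprint, client: { apiKey, baseUrl } };
  }
  return cached.client;
}
```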
Matt
f12c29103c Fix project detail crash: replace dynamic hooks with single query
Some checks failed
Build and Push Docker Image / build (push) Failing after 8s
The project detail page called useQuery inside .map() to fetch file
requirements per round, violating React's rules of hooks. When
competitionRounds changed from [] to [round1, round2], the hook count
changed, causing React to crash with "Cannot read properties of
undefined (reading 'length')".

Fix: Add listRequirementsByRounds endpoint that accepts multiple
roundIds in one query, replacing the dynamic hook pattern with a
single stable useQuery call.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 15:30:44 +01:00
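React requires the hook count to stay constant across renders, so the fix replaces N per-round queries with one query over all roundIds. Client-side, the combined result can then be regrouped per round; the row shape below is an assumption about what `listRequirementsByRounds` returns:

```typescript
interface RequirementRow { roundId: string; requirementId: string }

// Regroup the single combined result by round, so the UI can render
// per-round sections from one stable useQuery call.
function groupByRound(rows: RequirementRow[]): Map<string, string[]> {
  const byRound = new Map<string, string[]>();
  for (const { roundId, requirementId } of rows) {
    const list = byRound.get(roundId) ?? [];
    list.push(requirementId);
    byRound.set(roundId, list);
  }
  return byRound;
}
```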
Matt
65a22e6f19 Optimize all AI functions for efficiency and speed
Some checks failed
Build and Push Docker Image / build (push) Failing after 7s
- AI Tagging: batch 10 projects per API call with 3 concurrent batches (~10x faster)
  - New `tagProjectsBatch()` with `getAISuggestionsBatch()` for multi-project prompts
  - Single DB query for all projects, single anonymization pass
  - Compact JSON in prompts (no pretty-print) saves tokens
- AI Shortlist: run STARTUP and BUSINESS_CONCEPT categories in parallel (2x faster)
- AI Filtering: increase default parallel batches from 1 to 3 (3x faster)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 14:02:38 +01:00
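The batching scheme above (10 projects per call, up to 3 batches in flight) can be sketched as a chunk-then-window loop. The batch size and parallelism are the values from the commit message; the helper names are invented:

```typescript
// Split items into batches of `size`.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

// Run at most `parallel` batches concurrently, window by window.
async function runBatches<T, R>(
  items: T[],
  size: number,
  parallel: number,
  worker: (batch: T[]) => Promise<R>,
): Promise<R[]> {
  const batches = chunk(items, size);
  const results: R[] = [];
  for (let i = 0; i < batches.length; i += parallel) {
    const window = batches.slice(i, i + parallel);
    results.push(...(await Promise.all(window.map(worker))));
  }
  return results;
}
```

For AI tagging this would be called with `size = 10` and `parallel = 3`, with `worker` sending one multi-project prompt per batch.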
Matt
989db4dc14 Allow AI tagging dialog to close during processing, show background progress
- Remove blocking guard on dialog close when tagging is in progress
- Change Cancel button to "Run in Background" during processing
- Add amber border + spinner + progress % on AI Tags button when job runs in background
- Job already runs server-side and sends in-app notification on completion

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 13:58:03 +01:00
Matt
5e0c8b2dfe Add schema reconciliation migration and file removal in bulk upload
Migration:
- Add standalone hasConflict index on ConflictOfInterest
- Ensure roundId is nullable on ConflictOfInterest
- Drop stale composite roundId_hasConflict index

Bulk upload:
- Add trash icon button to remove uploaded files
- Uses existing file.delete endpoint with audit logging

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 13:46:12 +01:00
Matt
85a0fa5016 Make bulk upload documents clickable with storage verification
- Add bucket/objectKey to file select in listProjectsByRoundRequirements
- Add verifyFilesExist endpoint to bulk-check file existence in MinIO
- Make uploaded filenames clickable links that open presigned download URLs
- Verify files exist in storage on page load, show re-upload button if missing

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 13:32:23 +01:00
Matt
c707899179 Add missing migration for ProjectFile.pageCount column
Column was in Prisma schema but had no migration file, causing
'column does not exist' errors on file uploads.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 13:23:18 +01:00
Matt
4d40afec6e Improve Project Pool button contrast in dark header
Give button a subtle bg-white/15 default background so it's visible
without hovering, and stronger hover state (bg-white/30).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 13:21:35 +01:00
Matt
effc078918 Make all migration SQL files idempotent for clean prod deploys
Added IF NOT EXISTS, IF EXISTS, and DO $$ EXCEPTION guards to all
migration files from 20260205 onwards so they survive partial application
and work correctly on both fresh databases and existing deployments.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 13:09:41 +01:00
Matt
763b2ef0f5 Jury management: create, delete, add/remove members from round detail page
- Added Jury tab to round detail page with full jury management inline
- Create new jury groups (auto-assigns to current round)
- Delete jury groups with confirmation dialog
- Add/remove members with inline member list
- Assign/switch jury groups via dropdown selector
- Added delete endpoint to juryGroup router (unlinks rounds, deletes members)
- Removed CHAIR/OBSERVER role selectors from add-member dialog (all members)
- Removed role column from jury-members-table

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 12:46:01 +01:00
48 changed files with 4067 additions and 1301 deletions

package-lock.json (generated)

@@ -49,6 +49,7 @@
 "cmdk": "^1.0.4",
 "csv-parse": "^6.1.0",
 "date-fns": "^4.1.0",
+"franc": "^6.2.0",
 "html2canvas": "^1.4.1",
 "jspdf": "^4.1.0",
 "jspdf-autotable": "^5.0.7",
@@ -6147,6 +6148,16 @@
 "react-dom": "^18 || ^19 || ^19.0.0-rc"
 }
 },
+"node_modules/collapse-white-space": {
+"version": "2.1.0",
+"resolved": "https://registry.npmjs.org/collapse-white-space/-/collapse-white-space-2.1.0.tgz",
+"integrity": "sha512-loKTxY1zCOuG4j9f6EPnuyyYkf58RnhhWTvRoZEokgB+WbdXehfjFviyOVYkqzEWz1Q5kRiZdBYS5SwxbQYwzw==",
+"license": "MIT",
+"funding": {
+"type": "github",
+"url": "https://github.com/sponsors/wooorm"
+}
+},
 "node_modules/color-convert": {
 "version": "2.0.1",
 "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
@@ -7736,6 +7747,19 @@
 }
 }
 },
+"node_modules/franc": {
+"version": "6.2.0",
+"resolved": "https://registry.npmjs.org/franc/-/franc-6.2.0.tgz",
+"integrity": "sha512-rcAewP7PSHvjq7Kgd7dhj82zE071kX5B4W1M4ewYMf/P+i6YsDQmj62Xz3VQm9zyUzUXwhIde/wHLGCMrM+yGg==",
+"license": "MIT",
+"dependencies": {
+"trigram-utils": "^2.0.0"
+},
+"funding": {
+"type": "github",
+"url": "https://github.com/sponsors/wooorm"
+}
+},
 "node_modules/fsevents": {
 "version": "2.3.2",
 "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.2.tgz",
@@ -10441,6 +10465,16 @@
 "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
 "license": "MIT"
 },
+"node_modules/n-gram": {
+"version": "2.0.2",
+"resolved": "https://registry.npmjs.org/n-gram/-/n-gram-2.0.2.tgz",
+"integrity": "sha512-S24aGsn+HLBxUGVAUFOwGpKs7LBcG4RudKU//eWzt/mQ97/NMKQxDWHyHx63UNWk/OOdihgmzoETn1tf5nQDzQ==",
+"license": "MIT",
+"funding": {
+"type": "github",
+"url": "https://github.com/sponsors/wooorm"
+}
+},
 "node_modules/nanoid": {
 "version": "3.3.11",
 "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz",
@@ -13110,6 +13144,20 @@
 "integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw==",
 "license": "MIT"
 },
+"node_modules/trigram-utils": {
+"version": "2.0.1",
+"resolved": "https://registry.npmjs.org/trigram-utils/-/trigram-utils-2.0.1.tgz",
+"integrity": "sha512-nfWIXHEaB+HdyslAfMxSqWKDdmqY9I32jS7GnqpdWQnLH89r6A5sdk3fDVYqGAZ0CrT8ovAFSAo6HRiWcWNIGQ==",
+"license": "MIT",
+"dependencies": {
+"collapse-white-space": "^2.0.0",
+"n-gram": "^2.0.0"
+},
+"funding": {
+"type": "github",
+"url": "https://github.com/sponsors/wooorm"
+}
+},
 "node_modules/trim-lines": {
 "version": "3.0.1",
 "resolved": "https://registry.npmjs.org/trim-lines/-/trim-lines-3.0.1.tgz",


@@ -62,6 +62,7 @@
 "cmdk": "^1.0.4",
 "csv-parse": "^6.1.0",
 "date-fns": "^4.1.0",
+"franc": "^6.2.0",
 "html2canvas": "^1.4.1",
 "jspdf": "^4.1.0",
 "jspdf-autotable": "^5.0.7",


@@ -16,105 +16,143 @@
 -- the enum.
-ALTER TYPE "SettingCategory" ADD VALUE 'DIGEST';
-ALTER TYPE "SettingCategory" ADD VALUE 'ANALYTICS';
-ALTER TYPE "SettingCategory" ADD VALUE 'AUDIT_CONFIG';
-ALTER TYPE "SettingCategory" ADD VALUE 'INTEGRATIONS';
-ALTER TYPE "SettingCategory" ADD VALUE 'LOCALIZATION';
-ALTER TYPE "SettingCategory" ADD VALUE 'COMMUNICATION';
+DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'DIGEST'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
+DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'ANALYTICS'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
+DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'AUDIT_CONFIG'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
+DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'INTEGRATIONS'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
+DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'LOCALIZATION'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
+DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'COMMUNICATION'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
 -- DropForeignKey
-ALTER TABLE "ApplicationForm" DROP CONSTRAINT "ApplicationForm_programId_fkey";
+ALTER TABLE "ApplicationForm" DROP CONSTRAINT IF EXISTS "ApplicationForm_programId_fkey";
 -- DropForeignKey
-ALTER TABLE "ApplicationForm" DROP CONSTRAINT "ApplicationForm_roundId_fkey";
+ALTER TABLE "ApplicationForm" DROP CONSTRAINT IF EXISTS "ApplicationForm_roundId_fkey";
 -- DropForeignKey
-ALTER TABLE "ApplicationFormField" DROP CONSTRAINT "ApplicationFormField_formId_fkey";
+ALTER TABLE "ApplicationFormField" DROP CONSTRAINT IF EXISTS "ApplicationFormField_formId_fkey";
 -- DropForeignKey
-ALTER TABLE "ApplicationFormField" DROP CONSTRAINT "ApplicationFormField_stepId_fkey";
+ALTER TABLE "ApplicationFormField" DROP CONSTRAINT IF EXISTS "ApplicationFormField_stepId_fkey";
 -- DropForeignKey
-ALTER TABLE "ApplicationFormSubmission" DROP CONSTRAINT "ApplicationFormSubmission_formId_fkey";
+ALTER TABLE "ApplicationFormSubmission" DROP CONSTRAINT IF EXISTS "ApplicationFormSubmission_formId_fkey";
 -- DropForeignKey
-ALTER TABLE "OnboardingStep" DROP CONSTRAINT "OnboardingStep_formId_fkey";
+ALTER TABLE "OnboardingStep" DROP CONSTRAINT IF EXISTS "OnboardingStep_formId_fkey";
 -- DropForeignKey
-ALTER TABLE "SubmissionFile" DROP CONSTRAINT "SubmissionFile_submissionId_fkey";
+ALTER TABLE "SubmissionFile" DROP CONSTRAINT IF EXISTS "SubmissionFile_submissionId_fkey";
 -- DropIndex
-DROP INDEX "User_email_idx";
+DROP INDEX IF EXISTS "User_email_idx";
 -- AlterTable
-ALTER TABLE "AssignmentJob" ALTER COLUMN "updatedAt" DROP DEFAULT;
+DO $$ BEGIN ALTER TABLE "AssignmentJob" ALTER COLUMN "updatedAt" DROP DEFAULT; EXCEPTION WHEN others THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "AuditLog" ADD COLUMN "previousDataJson" JSONB,
-ADD COLUMN "sessionId" TEXT;
+DO $$ BEGIN
+ALTER TABLE "AuditLog" ADD COLUMN "previousDataJson" JSONB;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "AuditLog" ADD COLUMN "sessionId" TEXT;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "FilteringJob" ALTER COLUMN "updatedAt" DROP DEFAULT;
+DO $$ BEGIN ALTER TABLE "FilteringJob" ALTER COLUMN "updatedAt" DROP DEFAULT; EXCEPTION WHEN others THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "LiveVote" ADD COLUMN "isAudienceVote" BOOLEAN NOT NULL DEFAULT false;
+DO $$ BEGIN
+ALTER TABLE "LiveVote" ADD COLUMN "isAudienceVote" BOOLEAN NOT NULL DEFAULT false;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "LiveVotingSession" ADD COLUMN "allowAudienceVotes" BOOLEAN NOT NULL DEFAULT false,
-ADD COLUMN "audienceVoteWeight" DOUBLE PRECISION NOT NULL DEFAULT 0,
-ADD COLUMN "presentationSettingsJson" JSONB,
-ADD COLUMN "tieBreakerMethod" TEXT NOT NULL DEFAULT 'admin_decides';
+DO $$ BEGIN
+ALTER TABLE "LiveVotingSession" ADD COLUMN "allowAudienceVotes" BOOLEAN NOT NULL DEFAULT false;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "LiveVotingSession" ADD COLUMN "audienceVoteWeight" DOUBLE PRECISION NOT NULL DEFAULT 0;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "LiveVotingSession" ADD COLUMN "presentationSettingsJson" JSONB;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "LiveVotingSession" ADD COLUMN "tieBreakerMethod" TEXT NOT NULL DEFAULT 'admin_decides';
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "MentorAssignment" ADD COLUMN "completionStatus" TEXT NOT NULL DEFAULT 'in_progress',
-ADD COLUMN "lastViewedAt" TIMESTAMP(3);
+DO $$ BEGIN
+ALTER TABLE "MentorAssignment" ADD COLUMN "completionStatus" TEXT NOT NULL DEFAULT 'in_progress';
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "MentorAssignment" ADD COLUMN "lastViewedAt" TIMESTAMP(3);
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "NotificationEmailSetting" ALTER COLUMN "updatedAt" DROP DEFAULT;
+DO $$ BEGIN ALTER TABLE "NotificationEmailSetting" ALTER COLUMN "updatedAt" DROP DEFAULT; EXCEPTION WHEN others THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "Project" ADD COLUMN "draftDataJson" JSONB,
-ADD COLUMN "draftExpiresAt" TIMESTAMP(3),
-ADD COLUMN "isDraft" BOOLEAN NOT NULL DEFAULT false;
+DO $$ BEGIN
+ALTER TABLE "Project" ADD COLUMN "draftDataJson" JSONB;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "Project" ADD COLUMN "draftExpiresAt" TIMESTAMP(3);
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "Project" ADD COLUMN "isDraft" BOOLEAN NOT NULL DEFAULT false;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "ProjectFile" ADD COLUMN "isLate" BOOLEAN NOT NULL DEFAULT false,
-ADD COLUMN "replacedById" TEXT,
-ADD COLUMN "roundId" TEXT,
-ADD COLUMN "version" INTEGER NOT NULL DEFAULT 1;
+DO $$ BEGIN
+ALTER TABLE "ProjectFile" ADD COLUMN "isLate" BOOLEAN NOT NULL DEFAULT false;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "ProjectFile" ADD COLUMN "replacedById" TEXT;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "ProjectFile" ADD COLUMN "roundId" TEXT;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "ProjectFile" ADD COLUMN "version" INTEGER NOT NULL DEFAULT 1;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "TaggingJob" ALTER COLUMN "updatedAt" DROP DEFAULT;
+DO $$ BEGIN ALTER TABLE "TaggingJob" ALTER COLUMN "updatedAt" DROP DEFAULT; EXCEPTION WHEN others THEN NULL; END $$;
 -- AlterTable
-ALTER TABLE "User" ADD COLUMN "availabilityJson" JSONB,
-ADD COLUMN "digestFrequency" TEXT NOT NULL DEFAULT 'none',
-ADD COLUMN "preferredWorkload" INTEGER;
+DO $$ BEGIN
+ALTER TABLE "User" ADD COLUMN "availabilityJson" JSONB;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "User" ADD COLUMN "digestFrequency" TEXT NOT NULL DEFAULT 'none';
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
+DO $$ BEGIN
+ALTER TABLE "User" ADD COLUMN "preferredWorkload" INTEGER;
+EXCEPTION WHEN duplicate_column THEN NULL; END $$;
 -- DropTable
-DROP TABLE "ApplicationForm";
+DROP TABLE IF EXISTS "ApplicationForm";
 -- DropTable
-DROP TABLE "ApplicationFormField";
+DROP TABLE IF EXISTS "ApplicationFormField";
 -- DropTable
-DROP TABLE "ApplicationFormSubmission";
+DROP TABLE IF EXISTS "ApplicationFormSubmission";
 -- DropTable
-DROP TABLE "OnboardingStep";
+DROP TABLE IF EXISTS "OnboardingStep";
 -- DropTable
-DROP TABLE "SubmissionFile";
+DROP TABLE IF EXISTS "SubmissionFile";
 -- DropEnum
-DROP TYPE "FormFieldType";
+DROP TYPE IF EXISTS "FormFieldType";
 -- DropEnum
-DROP TYPE "SpecialFieldType";
+DROP TYPE IF EXISTS "SpecialFieldType";
 -- CreateTable
-CREATE TABLE "ReminderLog" (
+CREATE TABLE IF NOT EXISTS "ReminderLog" (
 "id" TEXT NOT NULL,
 "roundId" TEXT NOT NULL,
 "userId" TEXT NOT NULL,
@@ -125,7 +163,7 @@ CREATE TABLE "ReminderLog" (
 );
 -- CreateTable
-CREATE TABLE "ConflictOfInterest" (
+CREATE TABLE IF NOT EXISTS "ConflictOfInterest" (
 "id" TEXT NOT NULL,
 "assignmentId" TEXT NOT NULL,
 "userId" TEXT NOT NULL,
@@ -143,7 +181,7 @@ CREATE TABLE "ConflictOfInterest" (
 );
 -- CreateTable
-CREATE TABLE "EvaluationSummary" (
+CREATE TABLE IF NOT EXISTS "EvaluationSummary" (
 "id" TEXT NOT NULL,
 "projectId" TEXT NOT NULL,
 "roundId" TEXT NOT NULL,
@@ -157,7 +195,7 @@ CREATE TABLE "EvaluationSummary" (
 );
 -- CreateTable
-CREATE TABLE "ProjectStatusHistory" (
+CREATE TABLE IF NOT EXISTS "ProjectStatusHistory" (
 "id" TEXT NOT NULL,
 "projectId" TEXT NOT NULL,
 "status" "ProjectStatus" NOT NULL,
@@ -168,7 +206,7 @@ CREATE TABLE "ProjectStatusHistory" (
 );
 -- CreateTable
-CREATE TABLE "MentorMessage" (
+CREATE TABLE IF NOT EXISTS "MentorMessage" (
 "id" TEXT NOT NULL,
 "projectId" TEXT NOT NULL,
 "senderId" TEXT NOT NULL,
@@ -180,7 +218,7 @@ CREATE TABLE "MentorMessage" (
 );
 -- CreateTable
-CREATE TABLE "DigestLog" (
+CREATE TABLE IF NOT EXISTS "DigestLog" (
 "id" TEXT NOT NULL,
 "userId" TEXT NOT NULL,
 "digestType" TEXT NOT NULL,
@@ -191,7 +229,7 @@ CREATE TABLE "DigestLog" (
 );
 -- CreateTable
-CREATE TABLE "RoundTemplate" (
+CREATE TABLE IF NOT EXISTS "RoundTemplate" (
 "id" TEXT NOT NULL,
 "name" TEXT NOT NULL,
 "description" TEXT,
@@ -208,7 +246,7 @@ CREATE TABLE "RoundTemplate" (
 );
 -- CreateTable
-CREATE TABLE "MentorNote" (
+CREATE TABLE IF NOT EXISTS "MentorNote" (
 "id" TEXT NOT NULL,
 "mentorAssignmentId" TEXT NOT NULL,
 "authorId" TEXT NOT NULL,
@@ -221,7 +259,7 @@ CREATE TABLE "MentorNote" (
 );
 -- CreateTable
-CREATE TABLE "MentorMilestone" (
+CREATE TABLE IF NOT EXISTS "MentorMilestone" (
 "id" TEXT NOT NULL,
 "programId" TEXT NOT NULL,
 "name" TEXT NOT NULL,
@@ -236,7 +274,7 @@ CREATE TABLE "MentorMilestone" (
 );
 -- CreateTable
-CREATE TABLE "MentorMilestoneCompletion" (
+CREATE TABLE IF NOT EXISTS "MentorMilestoneCompletion" (
 "id" TEXT NOT NULL,
 "milestoneId" TEXT NOT NULL,
 "mentorAssignmentId" TEXT NOT NULL,
@@ -247,7 +285,7 @@ CREATE TABLE "MentorMilestoneCompletion" (
 );
 -- CreateTable
-CREATE TABLE "Message" (
+CREATE TABLE IF NOT EXISTS "Message" (
 "id" TEXT NOT NULL,
 "senderId" TEXT NOT NULL,
 "recipientType" TEXT NOT NULL,
@@ -266,7 +304,7 @@ CREATE TABLE "Message" (
 );
 -- CreateTable
-CREATE TABLE "MessageTemplate" (
+CREATE TABLE IF NOT EXISTS "MessageTemplate" (
 "id" TEXT NOT NULL,
 "name" TEXT NOT NULL,
 "category" TEXT NOT NULL,
@@ -282,7 +320,7 @@ CREATE TABLE "MessageTemplate" (
 );
 -- CreateTable
-CREATE TABLE "MessageRecipient" (
+CREATE TABLE IF NOT EXISTS "MessageRecipient" (
 "id" TEXT NOT NULL,
 "messageId" TEXT NOT NULL,
 "userId" TEXT NOT NULL,
@@ -295,7 +333,7 @@ CREATE TABLE "MessageRecipient" (
 );
 -- CreateTable
-CREATE TABLE "Webhook" (
+CREATE TABLE IF NOT EXISTS "Webhook" (
 "id" TEXT NOT NULL,
 "name" TEXT NOT NULL,
 "url" TEXT NOT NULL,
@@ -312,7 +350,7 @@ CREATE TABLE "Webhook" (
 );
 -- CreateTable
-CREATE TABLE "WebhookDelivery" (
+CREATE TABLE IF NOT EXISTS "WebhookDelivery" (
 "id" TEXT NOT NULL,
 "webhookId" TEXT NOT NULL,
 "event" TEXT NOT NULL,
@@ -328,7 +366,7 @@ CREATE TABLE "WebhookDelivery" (
 );
 -- CreateTable
-CREATE TABLE "EvaluationDiscussion" (
+CREATE TABLE IF NOT EXISTS "EvaluationDiscussion" (
 "id" TEXT NOT NULL,
 "projectId" TEXT NOT NULL,
 "roundId" TEXT NOT NULL,
@@ -341,7 +379,7 @@ CREATE TABLE "EvaluationDiscussion" (
 );
 -- CreateTable
-CREATE TABLE "DiscussionComment" (
+CREATE TABLE IF NOT EXISTS "DiscussionComment" (
 "id" TEXT NOT NULL,
 "discussionId" TEXT NOT NULL,
 "userId" TEXT NOT NULL,
@@ -352,199 +390,257 @@ CREATE TABLE "DiscussionComment" (
 );
 -- CreateIndex
-CREATE INDEX "ReminderLog_roundId_idx" ON "ReminderLog"("roundId");
+CREATE INDEX IF NOT EXISTS "ReminderLog_roundId_idx" ON "ReminderLog"("roundId");
 -- CreateIndex
-CREATE UNIQUE INDEX "ReminderLog_roundId_userId_type_key" ON "ReminderLog"("roundId", "userId", "type");
+CREATE UNIQUE INDEX IF NOT EXISTS "ReminderLog_roundId_userId_type_key" ON "ReminderLog"("roundId", "userId", "type");
 -- CreateIndex
-CREATE UNIQUE INDEX "ConflictOfInterest_assignmentId_key" ON "ConflictOfInterest"("assignmentId");
+CREATE UNIQUE INDEX IF NOT EXISTS "ConflictOfInterest_assignmentId_key" ON "ConflictOfInterest"("assignmentId");
 -- CreateIndex
-CREATE INDEX "ConflictOfInterest_userId_idx" ON "ConflictOfInterest"("userId");
+CREATE INDEX IF NOT EXISTS "ConflictOfInterest_userId_idx" ON "ConflictOfInterest"("userId");
 -- CreateIndex
-CREATE INDEX "ConflictOfInterest_roundId_hasConflict_idx" ON "ConflictOfInterest"("roundId", "hasConflict");
+CREATE INDEX IF NOT EXISTS "ConflictOfInterest_roundId_hasConflict_idx" ON "ConflictOfInterest"("roundId", "hasConflict");
 -- CreateIndex
-CREATE INDEX "EvaluationSummary_roundId_idx" ON "EvaluationSummary"("roundId");
+CREATE INDEX IF NOT EXISTS "EvaluationSummary_roundId_idx" ON "EvaluationSummary"("roundId");
 -- CreateIndex
-CREATE UNIQUE INDEX "EvaluationSummary_projectId_roundId_key" ON "EvaluationSummary"("projectId", "roundId");
+CREATE UNIQUE INDEX IF NOT EXISTS "EvaluationSummary_projectId_roundId_key" ON "EvaluationSummary"("projectId", "roundId");
 -- CreateIndex
-CREATE INDEX "ProjectStatusHistory_projectId_changedAt_idx" ON "ProjectStatusHistory"("projectId", "changedAt");
+CREATE INDEX IF NOT EXISTS "ProjectStatusHistory_projectId_changedAt_idx" ON "ProjectStatusHistory"("projectId", "changedAt");
 -- CreateIndex
-CREATE INDEX "MentorMessage_projectId_createdAt_idx" ON "MentorMessage"("projectId", "createdAt");
+CREATE INDEX IF NOT EXISTS "MentorMessage_projectId_createdAt_idx" ON "MentorMessage"("projectId", "createdAt");
 -- CreateIndex
-CREATE INDEX "DigestLog_userId_idx" ON "DigestLog"("userId");
+CREATE INDEX IF NOT EXISTS "DigestLog_userId_idx" ON "DigestLog"("userId");
 -- CreateIndex
-CREATE INDEX "DigestLog_sentAt_idx" ON "DigestLog"("sentAt");
+CREATE INDEX IF NOT EXISTS "DigestLog_sentAt_idx" ON "DigestLog"("sentAt");
 -- CreateIndex
-CREATE INDEX "RoundTemplate_programId_idx" ON "RoundTemplate"("programId");
+CREATE INDEX IF NOT EXISTS "RoundTemplate_programId_idx" ON "RoundTemplate"("programId");
 -- CreateIndex
-CREATE INDEX "MentorNote_mentorAssignmentId_idx" ON "MentorNote"("mentorAssignmentId");
+CREATE INDEX IF NOT EXISTS "MentorNote_mentorAssignmentId_idx" ON "MentorNote"("mentorAssignmentId");
 -- CreateIndex
-CREATE INDEX "MentorMilestone_programId_idx" ON "MentorMilestone"("programId");
+CREATE INDEX IF NOT EXISTS "MentorMilestone_programId_idx" ON "MentorMilestone"("programId");
 -- CreateIndex
-CREATE INDEX "MentorMilestone_sortOrder_idx" ON "MentorMilestone"("sortOrder");
+CREATE INDEX IF NOT EXISTS "MentorMilestone_sortOrder_idx" ON "MentorMilestone"("sortOrder");
 -- CreateIndex
-CREATE INDEX "MentorMilestoneCompletion_mentorAssignmentId_idx" ON "MentorMilestoneCompletion"("mentorAssignmentId");
+CREATE INDEX IF NOT EXISTS "MentorMilestoneCompletion_mentorAssignmentId_idx" ON "MentorMilestoneCompletion"("mentorAssignmentId");
 -- CreateIndex
-CREATE UNIQUE INDEX "MentorMilestoneCompletion_milestoneId_mentorAssignmentId_key" ON "MentorMilestoneCompletion"("milestoneId", "mentorAssignmentId");
+CREATE UNIQUE INDEX IF NOT EXISTS "MentorMilestoneCompletion_milestoneId_mentorAssignmentId_key" ON "MentorMilestoneCompletion"("milestoneId", "mentorAssignmentId");
 -- CreateIndex
-CREATE INDEX "Message_senderId_idx" ON "Message"("senderId");
+CREATE INDEX IF NOT EXISTS "Message_senderId_idx" ON "Message"("senderId");
 -- CreateIndex
-CREATE INDEX "Message_sentAt_idx" ON "Message"("sentAt");
+CREATE INDEX IF NOT EXISTS "Message_sentAt_idx" ON "Message"("sentAt");
 -- CreateIndex
-CREATE INDEX "Message_scheduledAt_idx" ON "Message"("scheduledAt");
+CREATE INDEX IF NOT EXISTS "Message_scheduledAt_idx" ON "Message"("scheduledAt");
 -- CreateIndex
-CREATE INDEX "MessageTemplate_category_idx" ON "MessageTemplate"("category");
+CREATE INDEX IF NOT EXISTS "MessageTemplate_category_idx" ON "MessageTemplate"("category");
 -- CreateIndex
-CREATE INDEX "MessageTemplate_isActive_idx" ON "MessageTemplate"("isActive");
+CREATE INDEX IF NOT EXISTS "MessageTemplate_isActive_idx" ON "MessageTemplate"("isActive");
 -- CreateIndex
-CREATE INDEX "MessageRecipient_messageId_idx" ON "MessageRecipient"("messageId");
+CREATE INDEX IF NOT EXISTS "MessageRecipient_messageId_idx" ON "MessageRecipient"("messageId");
 -- CreateIndex
-CREATE INDEX "MessageRecipient_userId_isRead_idx" ON "MessageRecipient"("userId", "isRead");
+CREATE INDEX IF NOT EXISTS "MessageRecipient_userId_isRead_idx" ON "MessageRecipient"("userId", "isRead");
 -- CreateIndex
-CREATE INDEX "Webhook_isActive_idx" ON "Webhook"("isActive");
+CREATE INDEX IF NOT EXISTS "Webhook_isActive_idx" ON "Webhook"("isActive");
 -- CreateIndex
-CREATE INDEX "WebhookDelivery_webhookId_idx" ON "WebhookDelivery"("webhookId");
+CREATE INDEX IF NOT EXISTS "WebhookDelivery_webhookId_idx" ON "WebhookDelivery"("webhookId");
-- CreateIndex -- CreateIndex
CREATE INDEX "WebhookDelivery_status_idx" ON "WebhookDelivery"("status"); CREATE INDEX IF NOT EXISTS "WebhookDelivery_status_idx" ON "WebhookDelivery"("status");
-- CreateIndex -- CreateIndex
CREATE INDEX "WebhookDelivery_createdAt_idx" ON "WebhookDelivery"("createdAt"); CREATE INDEX IF NOT EXISTS "WebhookDelivery_createdAt_idx" ON "WebhookDelivery"("createdAt");
-- CreateIndex -- CreateIndex
CREATE INDEX "EvaluationDiscussion_roundId_idx" ON "EvaluationDiscussion"("roundId"); CREATE INDEX IF NOT EXISTS "EvaluationDiscussion_roundId_idx" ON "EvaluationDiscussion"("roundId");
-- CreateIndex -- CreateIndex
CREATE INDEX "EvaluationDiscussion_status_idx" ON "EvaluationDiscussion"("status"); CREATE INDEX IF NOT EXISTS "EvaluationDiscussion_status_idx" ON "EvaluationDiscussion"("status");
-- CreateIndex -- CreateIndex
CREATE UNIQUE INDEX "EvaluationDiscussion_projectId_roundId_key" ON "EvaluationDiscussion"("projectId", "roundId"); CREATE UNIQUE INDEX IF NOT EXISTS "EvaluationDiscussion_projectId_roundId_key" ON "EvaluationDiscussion"("projectId", "roundId");
-- CreateIndex -- CreateIndex
CREATE INDEX "DiscussionComment_discussionId_createdAt_idx" ON "DiscussionComment"("discussionId", "createdAt"); CREATE INDEX IF NOT EXISTS "DiscussionComment_discussionId_createdAt_idx" ON "DiscussionComment"("discussionId", "createdAt");
-- CreateIndex -- CreateIndex
CREATE INDEX "AuditLog_entityType_entityId_timestamp_idx" ON "AuditLog"("entityType", "entityId", "timestamp"); CREATE INDEX IF NOT EXISTS "AuditLog_entityType_entityId_timestamp_idx" ON "AuditLog"("entityType", "entityId", "timestamp");
-- CreateIndex -- CreateIndex
CREATE INDEX "Evaluation_status_formId_idx" ON "Evaluation"("status", "formId"); CREATE INDEX IF NOT EXISTS "Evaluation_status_formId_idx" ON "Evaluation"("status", "formId");
-- CreateIndex -- CreateIndex
CREATE INDEX "GracePeriod_roundId_userId_extendedUntil_idx" ON "GracePeriod"("roundId", "userId", "extendedUntil"); CREATE INDEX IF NOT EXISTS "GracePeriod_roundId_userId_extendedUntil_idx" ON "GracePeriod"("roundId", "userId", "extendedUntil");
-- CreateIndex -- CreateIndex
CREATE INDEX "LiveVote_isAudienceVote_idx" ON "LiveVote"("isAudienceVote"); CREATE INDEX IF NOT EXISTS "LiveVote_isAudienceVote_idx" ON "LiveVote"("isAudienceVote");
-- CreateIndex -- CreateIndex
CREATE INDEX "ProjectFile_roundId_idx" ON "ProjectFile"("roundId"); CREATE INDEX IF NOT EXISTS "ProjectFile_roundId_idx" ON "ProjectFile"("roundId");
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_winnerOverriddenBy_fkey" FOREIGN KEY ("winnerOverriddenBy") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ReminderLog" ADD CONSTRAINT "ReminderLog_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ReminderLog" ADD CONSTRAINT "ReminderLog_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ConflictOfInterest" ADD CONSTRAINT "ConflictOfInterest_assignmentId_fkey" FOREIGN KEY ("assignmentId") REFERENCES "Assignment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ConflictOfInterest" ADD CONSTRAINT "ConflictOfInterest_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ConflictOfInterest" ADD CONSTRAINT "ConflictOfInterest_reviewedById_fkey" FOREIGN KEY ("reviewedById") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "EvaluationSummary" ADD CONSTRAINT "EvaluationSummary_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "EvaluationSummary" ADD CONSTRAINT "EvaluationSummary_generatedById_fkey" FOREIGN KEY ("generatedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ProjectStatusHistory" ADD CONSTRAINT "ProjectStatusHistory_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorMessage" ADD CONSTRAINT "MentorMessage_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorMessage" ADD CONSTRAINT "MentorMessage_senderId_fkey" FOREIGN KEY ("senderId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "DigestLog" ADD CONSTRAINT "DigestLog_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorNote" ADD CONSTRAINT "MentorNote_mentorAssignmentId_fkey" FOREIGN KEY ("mentorAssignmentId") REFERENCES "MentorAssignment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorNote" ADD CONSTRAINT "MentorNote_authorId_fkey" FOREIGN KEY ("authorId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorMilestone" ADD CONSTRAINT "MentorMilestone_programId_fkey" FOREIGN KEY ("programId") REFERENCES "Program"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorMilestoneCompletion" ADD CONSTRAINT "MentorMilestoneCompletion_milestoneId_fkey" FOREIGN KEY ("milestoneId") REFERENCES "MentorMilestone"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorMilestoneCompletion" ADD CONSTRAINT "MentorMilestoneCompletion_mentorAssignmentId_fkey" FOREIGN KEY ("mentorAssignmentId") REFERENCES "MentorAssignment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MentorMilestoneCompletion" ADD CONSTRAINT "MentorMilestoneCompletion_completedById_fkey" FOREIGN KEY ("completedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "Message" ADD CONSTRAINT "Message_senderId_fkey" FOREIGN KEY ("senderId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "Message" ADD CONSTRAINT "Message_templateId_fkey" FOREIGN KEY ("templateId") REFERENCES "MessageTemplate"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MessageRecipient" ADD CONSTRAINT "MessageRecipient_messageId_fkey" FOREIGN KEY ("messageId") REFERENCES "Message"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "MessageRecipient" ADD CONSTRAINT "MessageRecipient_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "Webhook" ADD CONSTRAINT "Webhook_createdById_fkey" FOREIGN KEY ("createdById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "WebhookDelivery" ADD CONSTRAINT "WebhookDelivery_webhookId_fkey" FOREIGN KEY ("webhookId") REFERENCES "Webhook"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "EvaluationDiscussion" ADD CONSTRAINT "EvaluationDiscussion_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "EvaluationDiscussion" ADD CONSTRAINT "EvaluationDiscussion_closedById_fkey" FOREIGN KEY ("closedById") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "DiscussionComment" ADD CONSTRAINT "DiscussionComment_discussionId_fkey" FOREIGN KEY ("discussionId") REFERENCES "EvaluationDiscussion"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "DiscussionComment" ADD CONSTRAINT "DiscussionComment_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;

View File

@@ -6,36 +6,46 @@
-- Missing Foreign Keys
-- =====================================================
-- RoundTemplate -> Program
DO $$ BEGIN
ALTER TABLE "RoundTemplate" ADD CONSTRAINT "RoundTemplate_programId_fkey"
FOREIGN KEY ("programId") REFERENCES "Program"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- RoundTemplate -> User (creator)
DO $$ BEGIN
ALTER TABLE "RoundTemplate" ADD CONSTRAINT "RoundTemplate_createdBy_fkey"
FOREIGN KEY ("createdBy") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- Message -> Round
DO $$ BEGIN
ALTER TABLE "Message" ADD CONSTRAINT "Message_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- EvaluationDiscussion -> Round
DO $$ BEGIN
ALTER TABLE "EvaluationDiscussion" ADD CONSTRAINT "EvaluationDiscussion_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ProjectFile -> ProjectFile (self-relation for file versioning)
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_replacedById_fkey"
FOREIGN KEY ("replacedById") REFERENCES "ProjectFile"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- =====================================================
-- Missing Indexes
-- =====================================================
CREATE INDEX IF NOT EXISTS "RoundTemplate_roundType_idx" ON "RoundTemplate"("roundType");
CREATE INDEX IF NOT EXISTS "MentorNote_authorId_idx" ON "MentorNote"("authorId");
CREATE INDEX IF NOT EXISTS "MentorMilestoneCompletion_completedById_idx" ON "MentorMilestoneCompletion"("completedById");
CREATE INDEX IF NOT EXISTS "Webhook_createdById_idx" ON "Webhook"("createdById");
CREATE INDEX IF NOT EXISTS "WebhookDelivery_event_idx" ON "WebhookDelivery"("event");
CREATE INDEX IF NOT EXISTS "Message_roundId_idx" ON "Message"("roundId");
CREATE INDEX IF NOT EXISTS "EvaluationDiscussion_closedById_idx" ON "EvaluationDiscussion"("closedById");
CREATE INDEX IF NOT EXISTS "DiscussionComment_discussionId_idx" ON "DiscussionComment"("discussionId");
CREATE INDEX IF NOT EXISTS "DiscussionComment_userId_idx" ON "DiscussionComment"("userId");

View File

@@ -3,11 +3,15 @@
-- Add SET NULL on ProjectFile.roundId so deleting Round nullifies the reference
-- AlterTable: Evaluation.formId -> onDelete CASCADE
ALTER TABLE "Evaluation" DROP CONSTRAINT IF EXISTS "Evaluation_formId_fkey";
DO $$ BEGIN
ALTER TABLE "Evaluation" ADD CONSTRAINT "Evaluation_formId_fkey"
FOREIGN KEY ("formId") REFERENCES "EvaluationForm"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AlterTable: ProjectFile.roundId -> onDelete SET NULL
ALTER TABLE "ProjectFile" DROP CONSTRAINT IF EXISTS "ProjectFile_roundId_fkey";
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;

View File

@@ -1,5 +1,5 @@
-- CreateTable
CREATE TABLE IF NOT EXISTS "FileRequirement" (
"id" TEXT NOT NULL,
"roundId" TEXT NOT NULL,
"name" TEXT NOT NULL,
@@ -15,16 +15,22 @@ CREATE TABLE "FileRequirement" (
);
-- CreateIndex
CREATE INDEX IF NOT EXISTS "FileRequirement_roundId_idx" ON "FileRequirement"("roundId");
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "FileRequirement" ADD CONSTRAINT "FileRequirement_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AlterTable: add requirementId to ProjectFile
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD COLUMN "requirementId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
-- CreateIndex
CREATE INDEX IF NOT EXISTS "ProjectFile_requirementId_idx" ON "ProjectFile"("requirementId");
-- AddForeignKey
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_requirementId_fkey" FOREIGN KEY ("requirementId") REFERENCES "FileRequirement"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;

View File

@@ -1,2 +1,2 @@
-- CreateIndex
CREATE INDEX IF NOT EXISTS "AwardVote_awardId_userId_idx" ON "AwardVote"("awardId", "userId");

View File

@@ -1,20 +1,26 @@
-- Simplify RoutingMode enum: remove POST_MAIN, rename PARALLEL -> SHARED
-- Drop RoutingRule table (routing is now handled via award assignment)
-- 1. Update existing PARALLEL values to SHARED, POST_MAIN to SHARED
-- (safe to run even if no rows match)
UPDATE "Track" SET "routingMode" = 'PARALLEL' WHERE "routingMode" = 'POST_MAIN';
-- 2. Rename PARALLEL -> SHARED in the enum (only if PARALLEL still exists)
DO $$ BEGIN
ALTER TYPE "RoutingMode" RENAME VALUE 'PARALLEL' TO 'SHARED';
EXCEPTION WHEN invalid_parameter_value THEN NULL; WHEN others THEN NULL; END $$;
-- 3. Remove POST_MAIN from the enum
-- PostgreSQL doesn't support DROP VALUE directly, so we recreate the enum
-- Since we already converted POST_MAIN values to PARALLEL (now SHARED), this is safe
-- Only recreate if the old enum still has POST_MAIN (i.e., hasn't been done yet)
DO $$ BEGIN
IF EXISTS (
SELECT 1 FROM pg_enum
WHERE enumlabel = 'POST_MAIN'
AND enumtypid = (SELECT oid FROM pg_type WHERE typname = 'RoutingMode')
) THEN
CREATE TYPE "RoutingMode_new" AS ENUM ('SHARED', 'EXCLUSIVE');
ALTER TABLE "Track"
@@ -23,6 +29,8 @@ ALTER TABLE "Track"
DROP TYPE "RoutingMode";
ALTER TYPE "RoutingMode_new" RENAME TO "RoutingMode";
END IF;
END $$;
-- 4. Drop the RoutingRule table (no longer needed)
DROP TABLE IF EXISTS "RoutingRule";
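The guard used throughout these migrations is the same everywhere: DDL that has no native `IF NOT EXISTS` form is wrapped in a PL/pgSQL `DO` block that traps the "already exists" error. A minimal sketch of the pattern, using a hypothetical `Example` table and enum (not taken from these migration files):

```sql
-- Idempotent CREATE TYPE: a pre-existing enum raises duplicate_object, which we swallow
DO $$ BEGIN
  CREATE TYPE "ExampleStatus" AS ENUM ('OPEN', 'CLOSED');
EXCEPTION WHEN duplicate_object THEN NULL; END $$;

-- Idempotent ADD COLUMN: note the condition is duplicate_column, not duplicate_object
DO $$ BEGIN
  ALTER TABLE "Example" ADD COLUMN "status" "ExampleStatus";
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
```

Statements that do have a native guard (`CREATE INDEX IF NOT EXISTS`, `DROP CONSTRAINT IF EXISTS`, `DROP TABLE IF EXISTS`) are used directly, without the wrapper.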

View File

@@ -1,36 +1,36 @@
-- ============================================================================= -- =============================================================================
-- Phase 0+1: Add Competition/Round Architecture (additive no breaking changes) -- Phase 0+1: Add Competition/Round Architecture (additive -- no breaking changes)
-- ============================================================================= -- =============================================================================
-- New enums, new tables, new optional columns on existing tables. -- New enums, new tables, new optional columns on existing tables.
-- Old Pipeline/Track/Stage tables are untouched. -- Old Pipeline/Track/Stage tables are untouched.
-- ─── New Enum Types ────────────────────────────────────────────────────────── -- --- New Enum Types ---
CREATE TYPE "CompetitionStatus" AS ENUM ('DRAFT', 'ACTIVE', 'CLOSED', 'ARCHIVED'); DO $$ BEGIN CREATE TYPE "CompetitionStatus" AS ENUM ('DRAFT', 'ACTIVE', 'CLOSED', 'ARCHIVED'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
CREATE TYPE "RoundType" AS ENUM ('INTAKE', 'FILTERING', 'EVALUATION', 'SUBMISSION', 'MENTORING', 'LIVE_FINAL', 'DELIBERATION'); DO $$ BEGIN CREATE TYPE "RoundType" AS ENUM ('INTAKE', 'FILTERING', 'EVALUATION', 'SUBMISSION', 'MENTORING', 'LIVE_FINAL', 'DELIBERATION'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
CREATE TYPE "RoundStatus" AS ENUM ('ROUND_DRAFT', 'ROUND_ACTIVE', 'ROUND_CLOSED', 'ROUND_ARCHIVED'); DO $$ BEGIN CREATE TYPE "RoundStatus" AS ENUM ('ROUND_DRAFT', 'ROUND_ACTIVE', 'ROUND_CLOSED', 'ROUND_ARCHIVED'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
CREATE TYPE "ProjectRoundStateValue" AS ENUM ('PENDING', 'IN_PROGRESS', 'PASSED', 'REJECTED', 'COMPLETED', 'WITHDRAWN'); DO $$ BEGIN CREATE TYPE "ProjectRoundStateValue" AS ENUM ('PENDING', 'IN_PROGRESS', 'PASSED', 'REJECTED', 'COMPLETED', 'WITHDRAWN'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
CREATE TYPE "AdvancementRuleType" AS ENUM ('AUTO_ADVANCE', 'SCORE_THRESHOLD', 'TOP_N', 'ADMIN_SELECTION', 'AI_RECOMMENDED'); DO $$ BEGIN CREATE TYPE "AdvancementRuleType" AS ENUM ('AUTO_ADVANCE', 'SCORE_THRESHOLD', 'TOP_N', 'ADMIN_SELECTION', 'AI_RECOMMENDED'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
CREATE TYPE "CapMode" AS ENUM ('HARD', 'SOFT', 'NONE'); DO $$ BEGIN CREATE TYPE "CapMode" AS ENUM ('HARD', 'SOFT', 'NONE'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "DeadlinePolicy" AS ENUM ('HARD_DEADLINE', 'FLAG', 'GRACE'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "JuryGroupMemberRole" AS ENUM ('CHAIR', 'MEMBER', 'OBSERVER'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "AssignmentIntentSource" AS ENUM ('INVITE', 'ADMIN', 'SYSTEM'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "AssignmentIntentStatus" AS ENUM ('INTENT_PENDING', 'HONORED', 'OVERRIDDEN', 'EXPIRED', 'CANCELLED'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "MentorMessageRole" AS ENUM ('MENTOR_ROLE', 'APPLICANT_ROLE', 'ADMIN_ROLE'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "SubmissionPromotionSource" AS ENUM ('MENTOR_FILE', 'ADMIN_REPLACEMENT'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "DeliberationMode" AS ENUM ('SINGLE_WINNER_VOTE', 'FULL_RANKING'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "DeliberationStatus" AS ENUM ('DELIB_OPEN', 'VOTING', 'TALLYING', 'RUNOFF', 'DELIB_LOCKED'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "TieBreakMethod" AS ENUM ('TIE_RUNOFF', 'TIE_ADMIN_DECIDES', 'SCORE_FALLBACK'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "DeliberationParticipantStatus" AS ENUM ('REQUIRED', 'ABSENT_EXCUSED', 'REPLACED', 'REPLACEMENT_ACTIVE'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN CREATE TYPE "AwardEligibilityMode" AS ENUM ('SEPARATE_POOL', 'STAY_IN_MAIN'); EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- Add FEATURE_FLAGS to SettingCategory enum
DO $$ BEGIN ALTER TYPE "SettingCategory" ADD VALUE 'FEATURE_FLAGS'; EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- New Tables ---
-- Competition (replaces Pipeline)
CREATE TABLE IF NOT EXISTS "Competition" (
    "id" TEXT NOT NULL,
    "programId" TEXT NOT NULL,
    "name" TEXT NOT NULL,
@@ -49,7 +49,7 @@ CREATE TABLE "Competition" (
);
-- Round (replaces Stage)
CREATE TABLE IF NOT EXISTS "Round" (
    "id" TEXT NOT NULL,
    "competitionId" TEXT NOT NULL,
    "name" TEXT NOT NULL,
@@ -70,7 +70,7 @@ CREATE TABLE "Round" (
);
-- ProjectRoundState
CREATE TABLE IF NOT EXISTS "ProjectRoundState" (
    "id" TEXT NOT NULL,
    "projectId" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
@@ -85,7 +85,7 @@ CREATE TABLE "ProjectRoundState" (
);
-- AdvancementRule
CREATE TABLE IF NOT EXISTS "AdvancementRule" (
    "id" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
    "targetRoundId" TEXT,
@@ -99,7 +99,7 @@ CREATE TABLE "AdvancementRule" (
);
-- JuryGroup
CREATE TABLE IF NOT EXISTS "JuryGroup" (
    "id" TEXT NOT NULL,
    "competitionId" TEXT NOT NULL,
    "name" TEXT NOT NULL,
@@ -120,7 +120,7 @@ CREATE TABLE "JuryGroup" (
);
-- JuryGroupMember
CREATE TABLE IF NOT EXISTS "JuryGroupMember" (
    "id" TEXT NOT NULL,
    "juryGroupId" TEXT NOT NULL,
    "userId" TEXT NOT NULL,
@@ -138,7 +138,7 @@ CREATE TABLE "JuryGroupMember" (
);
-- SubmissionWindow
CREATE TABLE IF NOT EXISTS "SubmissionWindow" (
    "id" TEXT NOT NULL,
    "competitionId" TEXT NOT NULL,
    "name" TEXT NOT NULL,
@@ -158,7 +158,7 @@ CREATE TABLE "SubmissionWindow" (
);
-- SubmissionFileRequirement
CREATE TABLE IF NOT EXISTS "SubmissionFileRequirement" (
    "id" TEXT NOT NULL,
    "submissionWindowId" TEXT NOT NULL,
    "label" TEXT NOT NULL,
@@ -175,7 +175,7 @@ CREATE TABLE "SubmissionFileRequirement" (
);
-- RoundSubmissionVisibility
CREATE TABLE IF NOT EXISTS "RoundSubmissionVisibility" (
    "id" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
    "submissionWindowId" TEXT NOT NULL,
@@ -186,7 +186,7 @@ CREATE TABLE "RoundSubmissionVisibility" (
);
-- AssignmentIntent
CREATE TABLE IF NOT EXISTS "AssignmentIntent" (
    "id" TEXT NOT NULL,
    "juryGroupMemberId" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
@@ -200,7 +200,7 @@ CREATE TABLE "AssignmentIntent" (
);
-- AssignmentException
CREATE TABLE IF NOT EXISTS "AssignmentException" (
    "id" TEXT NOT NULL,
    "assignmentId" TEXT NOT NULL,
    "reason" TEXT NOT NULL,
@@ -212,7 +212,7 @@ CREATE TABLE "AssignmentException" (
);
-- MentorFile
CREATE TABLE IF NOT EXISTS "MentorFile" (
    "id" TEXT NOT NULL,
    "mentorAssignmentId" TEXT NOT NULL,
    "uploadedByUserId" TEXT NOT NULL,
@@ -232,7 +232,7 @@ CREATE TABLE "MentorFile" (
);
-- MentorFileComment
CREATE TABLE IF NOT EXISTS "MentorFileComment" (
    "id" TEXT NOT NULL,
    "mentorFileId" TEXT NOT NULL,
    "authorId" TEXT NOT NULL,
@@ -245,7 +245,7 @@ CREATE TABLE "MentorFileComment" (
);
-- SubmissionPromotionEvent
CREATE TABLE IF NOT EXISTS "SubmissionPromotionEvent" (
    "id" TEXT NOT NULL,
    "projectId" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
@@ -259,7 +259,7 @@ CREATE TABLE "SubmissionPromotionEvent" (
);
-- DeliberationSession
CREATE TABLE IF NOT EXISTS "DeliberationSession" (
    "id" TEXT NOT NULL,
    "competitionId" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
@@ -277,7 +277,7 @@ CREATE TABLE "DeliberationSession" (
);
-- DeliberationVote
CREATE TABLE IF NOT EXISTS "DeliberationVote" (
    "id" TEXT NOT NULL,
    "sessionId" TEXT NOT NULL,
    "juryMemberId" TEXT NOT NULL,
@@ -291,7 +291,7 @@ CREATE TABLE "DeliberationVote" (
);
-- DeliberationResult
CREATE TABLE IF NOT EXISTS "DeliberationResult" (
    "id" TEXT NOT NULL,
    "sessionId" TEXT NOT NULL,
    "projectId" TEXT NOT NULL,
@@ -304,7 +304,7 @@ CREATE TABLE "DeliberationResult" (
);
-- DeliberationParticipant
CREATE TABLE IF NOT EXISTS "DeliberationParticipant" (
    "id" TEXT NOT NULL,
    "sessionId" TEXT NOT NULL,
    "userId" TEXT NOT NULL,
@@ -315,7 +315,7 @@ CREATE TABLE "DeliberationParticipant" (
);
-- ResultLock
CREATE TABLE IF NOT EXISTS "ResultLock" (
    "id" TEXT NOT NULL,
    "competitionId" TEXT NOT NULL,
    "roundId" TEXT NOT NULL,
@@ -328,7 +328,7 @@ CREATE TABLE "ResultLock" (
);
-- ResultUnlockEvent
CREATE TABLE IF NOT EXISTS "ResultUnlockEvent" (
    "id" TEXT NOT NULL,
    "resultLockId" TEXT NOT NULL,
    "unlockedById" TEXT NOT NULL,
@@ -338,235 +338,365 @@ CREATE TABLE "ResultUnlockEvent" (
    CONSTRAINT "ResultUnlockEvent_pkey" PRIMARY KEY ("id")
);
-- --- Add Columns to Existing Tables ---
-- Assignment: add juryGroupId
DO $$ BEGIN
ALTER TABLE "Assignment" ADD COLUMN "juryGroupId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
-- SpecialAward: add competition/round architecture fields
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD COLUMN "competitionId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD COLUMN "evaluationRoundId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD COLUMN "juryGroupId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD COLUMN "eligibilityMode" "AwardEligibilityMode" NOT NULL DEFAULT 'STAY_IN_MAIN';
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD COLUMN "decisionMode" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
-- MentorAssignment: add workspace fields
DO $$ BEGIN
ALTER TABLE "MentorAssignment" ADD COLUMN "workspaceEnabled" BOOLEAN NOT NULL DEFAULT false;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorAssignment" ADD COLUMN "workspaceOpenAt" TIMESTAMP(3);
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorAssignment" ADD COLUMN "workspaceCloseAt" TIMESTAMP(3);
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
-- MentorMessage: add workspace fields
DO $$ BEGIN
ALTER TABLE "MentorMessage" ADD COLUMN "workspaceId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorMessage" ADD COLUMN "senderRole" "MentorMessageRole";
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
-- ProjectFile: add submission window link
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD COLUMN "submissionWindowId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD COLUMN "submissionFileRequirementId" TEXT;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
-- --- Unique Constraints ---
CREATE UNIQUE INDEX IF NOT EXISTS "Competition_slug_key" ON "Competition"("slug");
CREATE UNIQUE INDEX IF NOT EXISTS "Round_competitionId_slug_key" ON "Round"("competitionId", "slug");
CREATE UNIQUE INDEX IF NOT EXISTS "Round_competitionId_sortOrder_key" ON "Round"("competitionId", "sortOrder");
CREATE UNIQUE INDEX IF NOT EXISTS "ProjectRoundState_projectId_roundId_key" ON "ProjectRoundState"("projectId", "roundId");
CREATE UNIQUE INDEX IF NOT EXISTS "JuryGroup_competitionId_slug_key" ON "JuryGroup"("competitionId", "slug");
CREATE UNIQUE INDEX IF NOT EXISTS "JuryGroupMember_juryGroupId_userId_key" ON "JuryGroupMember"("juryGroupId", "userId");
CREATE UNIQUE INDEX IF NOT EXISTS "SubmissionWindow_competitionId_slug_key" ON "SubmissionWindow"("competitionId", "slug");
CREATE UNIQUE INDEX IF NOT EXISTS "SubmissionWindow_competitionId_roundNumber_key" ON "SubmissionWindow"("competitionId", "roundNumber");
CREATE UNIQUE INDEX IF NOT EXISTS "RoundSubmissionVisibility_roundId_submissionWindowId_key" ON "RoundSubmissionVisibility"("roundId", "submissionWindowId");
CREATE UNIQUE INDEX IF NOT EXISTS "AssignmentIntent_juryGroupMemberId_roundId_projectId_key" ON "AssignmentIntent"("juryGroupMemberId", "roundId", "projectId");
CREATE UNIQUE INDEX IF NOT EXISTS "MentorFile_promotedToFileId_key" ON "MentorFile"("promotedToFileId");
CREATE UNIQUE INDEX IF NOT EXISTS "DeliberationVote_sessionId_juryMemberId_projectId_runoffRo_key" ON "DeliberationVote"("sessionId", "juryMemberId", "projectId", "runoffRound");
CREATE UNIQUE INDEX IF NOT EXISTS "DeliberationResult_sessionId_projectId_key" ON "DeliberationResult"("sessionId", "projectId");
CREATE UNIQUE INDEX IF NOT EXISTS "DeliberationParticipant_sessionId_userId_key" ON "DeliberationParticipant"("sessionId", "userId");
CREATE UNIQUE INDEX IF NOT EXISTS "SubmissionFileRequirement_submissionWindowId_slug_key" ON "SubmissionFileRequirement"("submissionWindowId", "slug");
CREATE UNIQUE INDEX IF NOT EXISTS "AdvancementRule_roundId_sortOrder_key" ON "AdvancementRule"("roundId", "sortOrder");
-- --- Indexes ---
-- Competition
CREATE INDEX IF NOT EXISTS "Competition_programId_idx" ON "Competition"("programId");
CREATE INDEX IF NOT EXISTS "Competition_status_idx" ON "Competition"("status");
-- Round
CREATE INDEX IF NOT EXISTS "Round_competitionId_idx" ON "Round"("competitionId");
CREATE INDEX IF NOT EXISTS "Round_roundType_idx" ON "Round"("roundType");
CREATE INDEX IF NOT EXISTS "Round_status_idx" ON "Round"("status");
-- ProjectRoundState
CREATE INDEX IF NOT EXISTS "ProjectRoundState_projectId_idx" ON "ProjectRoundState"("projectId");
CREATE INDEX IF NOT EXISTS "ProjectRoundState_roundId_idx" ON "ProjectRoundState"("roundId");
CREATE INDEX IF NOT EXISTS "ProjectRoundState_state_idx" ON "ProjectRoundState"("state");
-- AdvancementRule
CREATE INDEX IF NOT EXISTS "AdvancementRule_roundId_idx" ON "AdvancementRule"("roundId");
-- JuryGroup
CREATE INDEX IF NOT EXISTS "JuryGroup_competitionId_idx" ON "JuryGroup"("competitionId");
-- JuryGroupMember
CREATE INDEX IF NOT EXISTS "JuryGroupMember_juryGroupId_idx" ON "JuryGroupMember"("juryGroupId");
CREATE INDEX IF NOT EXISTS "JuryGroupMember_userId_idx" ON "JuryGroupMember"("userId");
-- SubmissionWindow
CREATE INDEX IF NOT EXISTS "SubmissionWindow_competitionId_idx" ON "SubmissionWindow"("competitionId");
-- SubmissionFileRequirement
CREATE INDEX IF NOT EXISTS "SubmissionFileRequirement_submissionWindowId_idx" ON "SubmissionFileRequirement"("submissionWindowId");
-- RoundSubmissionVisibility
CREATE INDEX IF NOT EXISTS "RoundSubmissionVisibility_roundId_idx" ON "RoundSubmissionVisibility"("roundId");
-- AssignmentIntent
CREATE INDEX IF NOT EXISTS "AssignmentIntent_roundId_idx" ON "AssignmentIntent"("roundId");
CREATE INDEX IF NOT EXISTS "AssignmentIntent_projectId_idx" ON "AssignmentIntent"("projectId");
CREATE INDEX IF NOT EXISTS "AssignmentIntent_status_idx" ON "AssignmentIntent"("status");
-- AssignmentException
CREATE INDEX IF NOT EXISTS "AssignmentException_assignmentId_idx" ON "AssignmentException"("assignmentId");
CREATE INDEX IF NOT EXISTS "AssignmentException_approvedById_idx" ON "AssignmentException"("approvedById");
-- MentorFile
CREATE INDEX IF NOT EXISTS "MentorFile_mentorAssignmentId_idx" ON "MentorFile"("mentorAssignmentId");
CREATE INDEX IF NOT EXISTS "MentorFile_uploadedByUserId_idx" ON "MentorFile"("uploadedByUserId");
-- MentorFileComment
CREATE INDEX IF NOT EXISTS "MentorFileComment_mentorFileId_idx" ON "MentorFileComment"("mentorFileId");
CREATE INDEX IF NOT EXISTS "MentorFileComment_authorId_idx" ON "MentorFileComment"("authorId");
CREATE INDEX IF NOT EXISTS "MentorFileComment_parentCommentId_idx" ON "MentorFileComment"("parentCommentId");
-- SubmissionPromotionEvent
CREATE INDEX IF NOT EXISTS "SubmissionPromotionEvent_projectId_idx" ON "SubmissionPromotionEvent"("projectId");
CREATE INDEX IF NOT EXISTS "SubmissionPromotionEvent_roundId_idx" ON "SubmissionPromotionEvent"("roundId");
CREATE INDEX IF NOT EXISTS "SubmissionPromotionEvent_sourceFileId_idx" ON "SubmissionPromotionEvent"("sourceFileId");
-- DeliberationSession
CREATE INDEX IF NOT EXISTS "DeliberationSession_competitionId_idx" ON "DeliberationSession"("competitionId");
CREATE INDEX IF NOT EXISTS "DeliberationSession_roundId_idx" ON "DeliberationSession"("roundId");
CREATE INDEX IF NOT EXISTS "DeliberationSession_status_idx" ON "DeliberationSession"("status");
-- DeliberationVote
CREATE INDEX IF NOT EXISTS "DeliberationVote_sessionId_idx" ON "DeliberationVote"("sessionId");
CREATE INDEX IF NOT EXISTS "DeliberationVote_juryMemberId_idx" ON "DeliberationVote"("juryMemberId");
CREATE INDEX IF NOT EXISTS "DeliberationVote_projectId_idx" ON "DeliberationVote"("projectId");
-- DeliberationResult
CREATE INDEX IF NOT EXISTS "DeliberationResult_sessionId_idx" ON "DeliberationResult"("sessionId");
CREATE INDEX IF NOT EXISTS "DeliberationResult_projectId_idx" ON "DeliberationResult"("projectId");
-- DeliberationParticipant
CREATE INDEX IF NOT EXISTS "DeliberationParticipant_sessionId_idx" ON "DeliberationParticipant"("sessionId");
CREATE INDEX IF NOT EXISTS "DeliberationParticipant_userId_idx" ON "DeliberationParticipant"("userId");
-- ResultLock
CREATE INDEX IF NOT EXISTS "ResultLock_competitionId_idx" ON "ResultLock"("competitionId");
CREATE INDEX IF NOT EXISTS "ResultLock_roundId_idx" ON "ResultLock"("roundId");
CREATE INDEX IF NOT EXISTS "ResultLock_category_idx" ON "ResultLock"("category");
-- ResultUnlockEvent
CREATE INDEX IF NOT EXISTS "ResultUnlockEvent_resultLockId_idx" ON "ResultUnlockEvent"("resultLockId");
CREATE INDEX IF NOT EXISTS "ResultUnlockEvent_unlockedById_idx" ON "ResultUnlockEvent"("unlockedById");
-- Indexes on modified existing tables
CREATE INDEX IF NOT EXISTS "Assignment_juryGroupId_idx" ON "Assignment"("juryGroupId");
CREATE INDEX IF NOT EXISTS "SpecialAward_competitionId_idx" ON "SpecialAward"("competitionId");
CREATE INDEX IF NOT EXISTS "SpecialAward_evaluationRoundId_idx" ON "SpecialAward"("evaluationRoundId");
CREATE INDEX IF NOT EXISTS "MentorMessage_workspaceId_idx" ON "MentorMessage"("workspaceId");
CREATE INDEX IF NOT EXISTS "ProjectFile_submissionWindowId_idx" ON "ProjectFile"("submissionWindowId");
CREATE INDEX IF NOT EXISTS "ProjectFile_submissionFileRequirementId_idx" ON "ProjectFile"("submissionFileRequirementId");
-- --- Foreign Keys ---
-- Competition
DO $$ BEGIN
ALTER TABLE "Competition" ADD CONSTRAINT "Competition_programId_fkey" FOREIGN KEY ("programId") REFERENCES "Program"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- Round
DO $$ BEGIN
ALTER TABLE "Round" ADD CONSTRAINT "Round_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "Round" ADD CONSTRAINT "Round_juryGroupId_fkey" FOREIGN KEY ("juryGroupId") REFERENCES "JuryGroup"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "Round" ADD CONSTRAINT "Round_submissionWindowId_fkey" FOREIGN KEY ("submissionWindowId") REFERENCES "SubmissionWindow"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ProjectRoundState
DO $$ BEGIN
ALTER TABLE "ProjectRoundState" ADD CONSTRAINT "ProjectRoundState_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ProjectRoundState" ADD CONSTRAINT "ProjectRoundState_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AdvancementRule
DO $$ BEGIN
ALTER TABLE "AdvancementRule" ADD CONSTRAINT "AdvancementRule_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- JuryGroup
DO $$ BEGIN
ALTER TABLE "JuryGroup" ADD CONSTRAINT "JuryGroup_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- JuryGroupMember
DO $$ BEGIN
ALTER TABLE "JuryGroupMember" ADD CONSTRAINT "JuryGroupMember_juryGroupId_fkey" FOREIGN KEY ("juryGroupId") REFERENCES "JuryGroup"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "JuryGroupMember" ADD CONSTRAINT "JuryGroupMember_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- SubmissionWindow
DO $$ BEGIN
ALTER TABLE "SubmissionWindow" ADD CONSTRAINT "SubmissionWindow_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- SubmissionFileRequirement
DO $$ BEGIN
ALTER TABLE "SubmissionFileRequirement" ADD CONSTRAINT "SubmissionFileRequirement_submissionWindowId_fkey" FOREIGN KEY ("submissionWindowId") REFERENCES "SubmissionWindow"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- RoundSubmissionVisibility
DO $$ BEGIN
ALTER TABLE "RoundSubmissionVisibility" ADD CONSTRAINT "RoundSubmissionVisibility_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "RoundSubmissionVisibility" ADD CONSTRAINT "RoundSubmissionVisibility_submissionWindowId_fkey" FOREIGN KEY ("submissionWindowId") REFERENCES "SubmissionWindow"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AssignmentIntent
DO $$ BEGIN
ALTER TABLE "AssignmentIntent" ADD CONSTRAINT "AssignmentIntent_juryGroupMemberId_fkey" FOREIGN KEY ("juryGroupMemberId") REFERENCES "JuryGroupMember"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "AssignmentIntent" ADD CONSTRAINT "AssignmentIntent_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "AssignmentIntent" ADD CONSTRAINT "AssignmentIntent_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- AssignmentException
DO $$ BEGIN
ALTER TABLE "AssignmentException" ADD CONSTRAINT "AssignmentException_assignmentId_fkey" FOREIGN KEY ("assignmentId") REFERENCES "Assignment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "AssignmentException" ADD CONSTRAINT "AssignmentException_approvedById_fkey" FOREIGN KEY ("approvedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- MentorFile
DO $$ BEGIN
ALTER TABLE "MentorFile" ADD CONSTRAINT "MentorFile_mentorAssignmentId_fkey" FOREIGN KEY ("mentorAssignmentId") REFERENCES "MentorAssignment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorFile" ADD CONSTRAINT "MentorFile_uploadedByUserId_fkey" FOREIGN KEY ("uploadedByUserId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorFile" ADD CONSTRAINT "MentorFile_promotedByUserId_fkey" FOREIGN KEY ("promotedByUserId") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "MentorFile" ADD CONSTRAINT "MentorFile_promotedByUserId_fkey" FOREIGN KEY ("promotedByUserId") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorFile" ADD CONSTRAINT "MentorFile_promotedToFileId_fkey" FOREIGN KEY ("promotedToFileId") REFERENCES "ProjectFile"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "MentorFile" ADD CONSTRAINT "MentorFile_promotedToFileId_fkey" FOREIGN KEY ("promotedToFileId") REFERENCES "ProjectFile"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- MentorFileComment -- MentorFileComment
DO $$ BEGIN
ALTER TABLE "MentorFileComment" ADD CONSTRAINT "MentorFileComment_mentorFileId_fkey" FOREIGN KEY ("mentorFileId") REFERENCES "MentorFile"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "MentorFileComment" ADD CONSTRAINT "MentorFileComment_mentorFileId_fkey" FOREIGN KEY ("mentorFileId") REFERENCES "MentorFile"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorFileComment" ADD CONSTRAINT "MentorFileComment_authorId_fkey" FOREIGN KEY ("authorId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "MentorFileComment" ADD CONSTRAINT "MentorFileComment_authorId_fkey" FOREIGN KEY ("authorId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorFileComment" ADD CONSTRAINT "MentorFileComment_parentCommentId_fkey" FOREIGN KEY ("parentCommentId") REFERENCES "MentorFileComment"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "MentorFileComment" ADD CONSTRAINT "MentorFileComment_parentCommentId_fkey" FOREIGN KEY ("parentCommentId") REFERENCES "MentorFileComment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- SubmissionPromotionEvent -- SubmissionPromotionEvent
DO $$ BEGIN
ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_sourceFileId_fkey" FOREIGN KEY ("sourceFileId") REFERENCES "MentorFile"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_sourceFileId_fkey" FOREIGN KEY ("sourceFileId") REFERENCES "MentorFile"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_promotedById_fkey" FOREIGN KEY ("promotedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "SubmissionPromotionEvent" ADD CONSTRAINT "SubmissionPromotionEvent_promotedById_fkey" FOREIGN KEY ("promotedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- DeliberationSession -- DeliberationSession
DO $$ BEGIN
ALTER TABLE "DeliberationSession" ADD CONSTRAINT "DeliberationSession_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationSession" ADD CONSTRAINT "DeliberationSession_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "DeliberationSession" ADD CONSTRAINT "DeliberationSession_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationSession" ADD CONSTRAINT "DeliberationSession_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- DeliberationVote -- DeliberationVote
DO $$ BEGIN
ALTER TABLE "DeliberationVote" ADD CONSTRAINT "DeliberationVote_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "DeliberationSession"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationVote" ADD CONSTRAINT "DeliberationVote_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "DeliberationSession"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "DeliberationVote" ADD CONSTRAINT "DeliberationVote_juryMemberId_fkey" FOREIGN KEY ("juryMemberId") REFERENCES "JuryGroupMember"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationVote" ADD CONSTRAINT "DeliberationVote_juryMemberId_fkey" FOREIGN KEY ("juryMemberId") REFERENCES "JuryGroupMember"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "DeliberationVote" ADD CONSTRAINT "DeliberationVote_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationVote" ADD CONSTRAINT "DeliberationVote_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- DeliberationResult -- DeliberationResult
DO $$ BEGIN
ALTER TABLE "DeliberationResult" ADD CONSTRAINT "DeliberationResult_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "DeliberationSession"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationResult" ADD CONSTRAINT "DeliberationResult_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "DeliberationSession"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "DeliberationResult" ADD CONSTRAINT "DeliberationResult_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationResult" ADD CONSTRAINT "DeliberationResult_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- DeliberationParticipant -- DeliberationParticipant
DO $$ BEGIN
ALTER TABLE "DeliberationParticipant" ADD CONSTRAINT "DeliberationParticipant_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "DeliberationSession"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationParticipant" ADD CONSTRAINT "DeliberationParticipant_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "DeliberationSession"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "DeliberationParticipant" ADD CONSTRAINT "DeliberationParticipant_userId_fkey" FOREIGN KEY ("userId") REFERENCES "JuryGroupMember"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "DeliberationParticipant" ADD CONSTRAINT "DeliberationParticipant_userId_fkey" FOREIGN KEY ("userId") REFERENCES "JuryGroupMember"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "DeliberationParticipant" ADD CONSTRAINT "DeliberationParticipant_replacedById_fkey" FOREIGN KEY ("replacedById") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "DeliberationParticipant" ADD CONSTRAINT "DeliberationParticipant_replacedById_fkey" FOREIGN KEY ("replacedById") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ResultLock -- ResultLock
DO $$ BEGIN
ALTER TABLE "ResultLock" ADD CONSTRAINT "ResultLock_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "ResultLock" ADD CONSTRAINT "ResultLock_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ResultLock" ADD CONSTRAINT "ResultLock_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "ResultLock" ADD CONSTRAINT "ResultLock_roundId_fkey" FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ResultLock" ADD CONSTRAINT "ResultLock_lockedById_fkey" FOREIGN KEY ("lockedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "ResultLock" ADD CONSTRAINT "ResultLock_lockedById_fkey" FOREIGN KEY ("lockedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ResultUnlockEvent -- ResultUnlockEvent
DO $$ BEGIN
ALTER TABLE "ResultUnlockEvent" ADD CONSTRAINT "ResultUnlockEvent_resultLockId_fkey" FOREIGN KEY ("resultLockId") REFERENCES "ResultLock"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "ResultUnlockEvent" ADD CONSTRAINT "ResultUnlockEvent_resultLockId_fkey" FOREIGN KEY ("resultLockId") REFERENCES "ResultLock"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ResultUnlockEvent" ADD CONSTRAINT "ResultUnlockEvent_unlockedById_fkey" FOREIGN KEY ("unlockedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "ResultUnlockEvent" ADD CONSTRAINT "ResultUnlockEvent_unlockedById_fkey" FOREIGN KEY ("unlockedById") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- FKs on modified existing tables -- FKs on modified existing tables
DO $$ BEGIN
ALTER TABLE "Assignment" ADD CONSTRAINT "Assignment_juryGroupId_fkey" FOREIGN KEY ("juryGroupId") REFERENCES "JuryGroup"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "Assignment" ADD CONSTRAINT "Assignment_juryGroupId_fkey" FOREIGN KEY ("juryGroupId") REFERENCES "JuryGroup"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_competitionId_fkey" FOREIGN KEY ("competitionId") REFERENCES "Competition"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_evaluationRoundId_fkey" FOREIGN KEY ("evaluationRoundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_evaluationRoundId_fkey" FOREIGN KEY ("evaluationRoundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_juryGroupId_fkey" FOREIGN KEY ("juryGroupId") REFERENCES "JuryGroup"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "SpecialAward" ADD CONSTRAINT "SpecialAward_juryGroupId_fkey" FOREIGN KEY ("juryGroupId") REFERENCES "JuryGroup"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "MentorMessage" ADD CONSTRAINT "MentorMessage_workspaceId_fkey" FOREIGN KEY ("workspaceId") REFERENCES "MentorAssignment"("id") ON DELETE CASCADE ON UPDATE CASCADE; ALTER TABLE "MentorMessage" ADD CONSTRAINT "MentorMessage_workspaceId_fkey" FOREIGN KEY ("workspaceId") REFERENCES "MentorAssignment"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_submissionWindowId_fkey" FOREIGN KEY ("submissionWindowId") REFERENCES "SubmissionWindow"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_submissionWindowId_fkey" FOREIGN KEY ("submissionWindowId") REFERENCES "SubmissionWindow"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_submissionFileRequirementId_fkey" FOREIGN KEY ("submissionFileRequirementId") REFERENCES "SubmissionFileRequirement"("id") ON DELETE SET NULL ON UPDATE CASCADE; ALTER TABLE "ProjectFile" ADD CONSTRAINT "ProjectFile_submissionFileRequirementId_fkey" FOREIGN KEY ("submissionFileRequirementId") REFERENCES "SubmissionFileRequirement"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;


@@ -1,3 +1,7 @@
-- AlterTable
DO $$ BEGIN
ALTER TABLE "JuryGroupMember" ADD COLUMN "selfServiceCap" INTEGER;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;
DO $$ BEGIN
ALTER TABLE "JuryGroupMember" ADD COLUMN "selfServiceRatio" DOUBLE PRECISION;
EXCEPTION WHEN duplicate_column THEN NULL; END $$;


@@ -1,14 +1,14 @@
-- =============================================================================
-- Phase 7/8 Migration Part 1: Rename stageId -> roundId on 15 tables
-- =============================================================================
-- This migration renames stageId columns to roundId and updates FK constraints
-- to point to the Round table instead of Stage table.
--
-- NOTE: After the pipeline migration (20260213), most tables have BOTH a
-- nullable roundId column (legacy, no FK) AND a stageId column. We must
-- drop the old roundId column before renaming stageId -> roundId.
-- --- 1. EvaluationForm ---
-- Drop old roundId column (nullable, no FK since 20260213 migration)
ALTER TABLE "EvaluationForm" DROP COLUMN IF EXISTS "roundId";
@@ -20,18 +20,22 @@ ALTER TABLE "EvaluationForm" DROP CONSTRAINT IF EXISTS "EvaluationForm_stageId_f
DROP INDEX IF EXISTS "EvaluationForm_stageId_version_key";
DROP INDEX IF EXISTS "EvaluationForm_stageId_isActive_idx";
-- Rename column (only if stageId exists)
DO $$ BEGIN
ALTER TABLE "EvaluationForm" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
-- Recreate indexes with new name
CREATE UNIQUE INDEX IF NOT EXISTS "EvaluationForm_roundId_version_key" ON "EvaluationForm"("roundId", "version");
CREATE INDEX IF NOT EXISTS "EvaluationForm_roundId_isActive_idx" ON "EvaluationForm"("roundId", "isActive");
-- Recreate FK pointing to Round
DO $$ BEGIN
ALTER TABLE "EvaluationForm" ADD CONSTRAINT "EvaluationForm_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 2. FileRequirement ---
ALTER TABLE "FileRequirement" DROP COLUMN IF EXISTS "roundId";
@@ -39,14 +43,18 @@ ALTER TABLE "FileRequirement" DROP CONSTRAINT IF EXISTS "FileRequirement_stageId
DROP INDEX IF EXISTS "FileRequirement_stageId_idx";
DO $$ BEGIN
ALTER TABLE "FileRequirement" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX IF NOT EXISTS "FileRequirement_roundId_idx" ON "FileRequirement"("roundId");
DO $$ BEGIN
ALTER TABLE "FileRequirement" ADD CONSTRAINT "FileRequirement_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 3. Assignment ---
ALTER TABLE "Assignment" DROP COLUMN IF EXISTS "roundId";
@@ -55,15 +63,19 @@ ALTER TABLE "Assignment" DROP CONSTRAINT IF EXISTS "Assignment_stageId_fkey";
DROP INDEX IF EXISTS "Assignment_userId_projectId_stageId_key";
DROP INDEX IF EXISTS "Assignment_stageId_idx";
DO $$ BEGIN
ALTER TABLE "Assignment" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX IF NOT EXISTS "Assignment_userId_projectId_roundId_key" ON "Assignment"("userId", "projectId", "roundId");
CREATE INDEX IF NOT EXISTS "Assignment_roundId_idx" ON "Assignment"("roundId");
DO $$ BEGIN
ALTER TABLE "Assignment" ADD CONSTRAINT "Assignment_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 4. GracePeriod ---
ALTER TABLE "GracePeriod" DROP COLUMN IF EXISTS "roundId";
@@ -72,15 +84,19 @@ ALTER TABLE "GracePeriod" DROP CONSTRAINT IF EXISTS "GracePeriod_stageId_fkey";
DROP INDEX IF EXISTS "GracePeriod_stageId_idx";
DROP INDEX IF EXISTS "GracePeriod_stageId_userId_extendedUntil_idx";
DO $$ BEGIN
ALTER TABLE "GracePeriod" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX IF NOT EXISTS "GracePeriod_roundId_idx" ON "GracePeriod"("roundId");
CREATE INDEX IF NOT EXISTS "GracePeriod_roundId_userId_extendedUntil_idx" ON "GracePeriod"("roundId", "userId", "extendedUntil");
DO $$ BEGIN
ALTER TABLE "GracePeriod" ADD CONSTRAINT "GracePeriod_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 5. LiveVotingSession ---
ALTER TABLE "LiveVotingSession" DROP COLUMN IF EXISTS "roundId";
@@ -88,14 +104,18 @@ ALTER TABLE "LiveVotingSession" DROP CONSTRAINT IF EXISTS "LiveVotingSession_sta
DROP INDEX IF EXISTS "LiveVotingSession_stageId_key";
DO $$ BEGIN
ALTER TABLE "LiveVotingSession" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX IF NOT EXISTS "LiveVotingSession_roundId_key" ON "LiveVotingSession"("roundId");
DO $$ BEGIN
ALTER TABLE "LiveVotingSession" ADD CONSTRAINT "LiveVotingSession_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 6. FilteringRule ---
ALTER TABLE "FilteringRule" DROP COLUMN IF EXISTS "roundId";
@@ -103,14 +123,18 @@ ALTER TABLE "FilteringRule" DROP CONSTRAINT IF EXISTS "FilteringRule_stageId_fke
DROP INDEX IF EXISTS "FilteringRule_stageId_idx";
DO $$ BEGIN
ALTER TABLE "FilteringRule" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX IF NOT EXISTS "FilteringRule_roundId_idx" ON "FilteringRule"("roundId");
DO $$ BEGIN
ALTER TABLE "FilteringRule" ADD CONSTRAINT "FilteringRule_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 7. FilteringResult ---
ALTER TABLE "FilteringResult" DROP COLUMN IF EXISTS "roundId";
@@ -119,15 +143,19 @@ ALTER TABLE "FilteringResult" DROP CONSTRAINT IF EXISTS "FilteringResult_stageId
DROP INDEX IF EXISTS "FilteringResult_stageId_projectId_key";
DROP INDEX IF EXISTS "FilteringResult_stageId_idx";
DO $$ BEGIN
ALTER TABLE "FilteringResult" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX IF NOT EXISTS "FilteringResult_roundId_projectId_key" ON "FilteringResult"("roundId", "projectId");
CREATE INDEX IF NOT EXISTS "FilteringResult_roundId_idx" ON "FilteringResult"("roundId");
DO $$ BEGIN
ALTER TABLE "FilteringResult" ADD CONSTRAINT "FilteringResult_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 8. FilteringJob ---
ALTER TABLE "FilteringJob" DROP COLUMN IF EXISTS "roundId";
@@ -135,14 +163,18 @@ ALTER TABLE "FilteringJob" DROP CONSTRAINT IF EXISTS "FilteringJob_stageId_fkey"
DROP INDEX IF EXISTS "FilteringJob_stageId_idx";
DO $$ BEGIN
ALTER TABLE "FilteringJob" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX IF NOT EXISTS "FilteringJob_roundId_idx" ON "FilteringJob"("roundId");
DO $$ BEGIN
ALTER TABLE "FilteringJob" ADD CONSTRAINT "FilteringJob_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 9. AssignmentJob ---
ALTER TABLE "AssignmentJob" DROP COLUMN IF EXISTS "roundId";
@@ -150,14 +182,18 @@ ALTER TABLE "AssignmentJob" DROP CONSTRAINT IF EXISTS "AssignmentJob_stageId_fke
DROP INDEX IF EXISTS "AssignmentJob_stageId_idx";
DO $$ BEGIN
ALTER TABLE "AssignmentJob" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX IF NOT EXISTS "AssignmentJob_roundId_idx" ON "AssignmentJob"("roundId");
DO $$ BEGIN
ALTER TABLE "AssignmentJob" ADD CONSTRAINT "AssignmentJob_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 10. ReminderLog ---
ALTER TABLE "ReminderLog" DROP COLUMN IF EXISTS "roundId";
@@ -166,15 +202,19 @@ ALTER TABLE "ReminderLog" DROP CONSTRAINT IF EXISTS "ReminderLog_stageId_fkey";
DROP INDEX IF EXISTS "ReminderLog_stageId_userId_type_key";
DROP INDEX IF EXISTS "ReminderLog_stageId_idx";
DO $$ BEGIN
ALTER TABLE "ReminderLog" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX IF NOT EXISTS "ReminderLog_roundId_userId_type_key" ON "ReminderLog"("roundId", "userId", "type");
CREATE INDEX IF NOT EXISTS "ReminderLog_roundId_idx" ON "ReminderLog"("roundId");
DO $$ BEGIN
ALTER TABLE "ReminderLog" ADD CONSTRAINT "ReminderLog_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 11. EvaluationSummary ---
ALTER TABLE "EvaluationSummary" DROP COLUMN IF EXISTS "roundId";
@@ -183,15 +223,19 @@ ALTER TABLE "EvaluationSummary" DROP CONSTRAINT IF EXISTS "EvaluationSummary_sta
DROP INDEX IF EXISTS "EvaluationSummary_projectId_stageId_key";
DROP INDEX IF EXISTS "EvaluationSummary_stageId_idx";
DO $$ BEGIN
ALTER TABLE "EvaluationSummary" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX IF NOT EXISTS "EvaluationSummary_projectId_roundId_key" ON "EvaluationSummary"("projectId", "roundId");
CREATE INDEX IF NOT EXISTS "EvaluationSummary_roundId_idx" ON "EvaluationSummary"("roundId");
DO $$ BEGIN
ALTER TABLE "EvaluationSummary" ADD CONSTRAINT "EvaluationSummary_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- --- 12. EvaluationDiscussion ---
ALTER TABLE "EvaluationDiscussion" DROP COLUMN IF EXISTS "roundId";
@@ -200,15 +244,19 @@ ALTER TABLE "EvaluationDiscussion" DROP CONSTRAINT IF EXISTS "EvaluationDiscussi
DROP INDEX IF EXISTS "EvaluationDiscussion_projectId_stageId_key"; DROP INDEX IF EXISTS "EvaluationDiscussion_projectId_stageId_key";
DROP INDEX IF EXISTS "EvaluationDiscussion_stageId_idx"; DROP INDEX IF EXISTS "EvaluationDiscussion_stageId_idx";
DO $$ BEGIN
ALTER TABLE "EvaluationDiscussion" RENAME COLUMN "stageId" TO "roundId"; ALTER TABLE "EvaluationDiscussion" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX "EvaluationDiscussion_projectId_roundId_key" ON "EvaluationDiscussion"("projectId", "roundId"); CREATE UNIQUE INDEX IF NOT EXISTS "EvaluationDiscussion_projectId_roundId_key" ON "EvaluationDiscussion"("projectId", "roundId");
CREATE INDEX "EvaluationDiscussion_roundId_idx" ON "EvaluationDiscussion"("roundId"); CREATE INDEX IF NOT EXISTS "EvaluationDiscussion_roundId_idx" ON "EvaluationDiscussion"("roundId");
DO $$ BEGIN
ALTER TABLE "EvaluationDiscussion" ADD CONSTRAINT "EvaluationDiscussion_roundId_fkey" ALTER TABLE "EvaluationDiscussion" ADD CONSTRAINT "EvaluationDiscussion_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ─── 13. Message ───────────────────────────────────────────────────────────── -- --- 13. Message ---
-- Message has roundId (from init, nullable) and stageId (from pipeline, nullable) -- Message has roundId (from init, nullable) and stageId (from pipeline, nullable)
ALTER TABLE "Message" DROP COLUMN IF EXISTS "roundId"; ALTER TABLE "Message" DROP COLUMN IF EXISTS "roundId";
@@ -217,42 +265,54 @@ ALTER TABLE "Message" DROP CONSTRAINT IF EXISTS "Message_stageId_fkey";
DROP INDEX IF EXISTS "Message_stageId_idx"; DROP INDEX IF EXISTS "Message_stageId_idx";
DO $$ BEGIN
ALTER TABLE "Message" RENAME COLUMN "stageId" TO "roundId"; ALTER TABLE "Message" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX "Message_roundId_idx" ON "Message"("roundId"); CREATE INDEX IF NOT EXISTS "Message_roundId_idx" ON "Message"("roundId");
DO $$ BEGIN
ALTER TABLE "Message" ADD CONSTRAINT "Message_roundId_fkey" ALTER TABLE "Message" ADD CONSTRAINT "Message_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE; FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ─── 14. Cohort ────────────────────────────────────────────────────────────── -- --- 14. Cohort ---
-- Cohort was created in pipeline migration with stageId only (no roundId) -- Cohort was created in pipeline migration with stageId only (no roundId)
ALTER TABLE "Cohort" DROP CONSTRAINT IF EXISTS "Cohort_stageId_fkey"; ALTER TABLE "Cohort" DROP CONSTRAINT IF EXISTS "Cohort_stageId_fkey";
DROP INDEX IF EXISTS "Cohort_stageId_idx"; DROP INDEX IF EXISTS "Cohort_stageId_idx";
DO $$ BEGIN
ALTER TABLE "Cohort" RENAME COLUMN "stageId" TO "roundId"; ALTER TABLE "Cohort" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE INDEX "Cohort_roundId_idx" ON "Cohort"("roundId"); CREATE INDEX IF NOT EXISTS "Cohort_roundId_idx" ON "Cohort"("roundId");
DO $$ BEGIN
ALTER TABLE "Cohort" ADD CONSTRAINT "Cohort_roundId_fkey" ALTER TABLE "Cohort" ADD CONSTRAINT "Cohort_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ─── 15. LiveProgressCursor ────────────────────────────────────────────────── -- --- 15. LiveProgressCursor ---
-- LiveProgressCursor was created in pipeline migration with stageId only (no roundId) -- LiveProgressCursor was created in pipeline migration with stageId only (no roundId)
ALTER TABLE "LiveProgressCursor" DROP CONSTRAINT IF EXISTS "LiveProgressCursor_stageId_fkey"; ALTER TABLE "LiveProgressCursor" DROP CONSTRAINT IF EXISTS "LiveProgressCursor_stageId_fkey";
DROP INDEX IF EXISTS "LiveProgressCursor_stageId_key"; DROP INDEX IF EXISTS "LiveProgressCursor_stageId_key";
DO $$ BEGIN
ALTER TABLE "LiveProgressCursor" RENAME COLUMN "stageId" TO "roundId"; ALTER TABLE "LiveProgressCursor" RENAME COLUMN "stageId" TO "roundId";
EXCEPTION WHEN undefined_column THEN NULL; END $$;
CREATE UNIQUE INDEX "LiveProgressCursor_roundId_key" ON "LiveProgressCursor"("roundId"); CREATE UNIQUE INDEX IF NOT EXISTS "LiveProgressCursor_roundId_key" ON "LiveProgressCursor"("roundId");
DO $$ BEGIN
ALTER TABLE "LiveProgressCursor" ADD CONSTRAINT "LiveProgressCursor_roundId_fkey" ALTER TABLE "LiveProgressCursor" ADD CONSTRAINT "LiveProgressCursor_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ─── 16. SpecialAward: Drop trackId column ─────────────────────────────────── -- --- 16. SpecialAward: Drop trackId column ---
ALTER TABLE "SpecialAward" DROP CONSTRAINT IF EXISTS "SpecialAward_trackId_fkey"; ALTER TABLE "SpecialAward" DROP CONSTRAINT IF EXISTS "SpecialAward_trackId_fkey";
@@ -260,12 +320,16 @@ DROP INDEX IF EXISTS "SpecialAward_trackId_key";
ALTER TABLE "SpecialAward" DROP COLUMN IF EXISTS "trackId"; ALTER TABLE "SpecialAward" DROP COLUMN IF EXISTS "trackId";
-- ─── 17. ConflictOfInterest: roundId was made nullable in pipeline migration -- --- 17. ConflictOfInterest: roundId was made nullable in pipeline migration ---
-- It still exists, just restore FK to new Round table -- It still exists, just restore FK to new Round table
DO $$ BEGIN
ALTER TABLE "ConflictOfInterest" ADD CONSTRAINT "ConflictOfInterest_roundId_fkey" ALTER TABLE "ConflictOfInterest" ADD CONSTRAINT "ConflictOfInterest_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE; FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE CASCADE ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;
-- ─── 18. TaggingJob: roundId was made nullable in pipeline migration ───────── -- --- 18. TaggingJob: roundId was made nullable in pipeline migration ---
-- Restore FK to new Round table -- Restore FK to new Round table
DO $$ BEGIN
ALTER TABLE "TaggingJob" ADD CONSTRAINT "TaggingJob_roundId_fkey" ALTER TABLE "TaggingJob" ADD CONSTRAINT "TaggingJob_roundId_fkey"
FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE; FOREIGN KEY ("roundId") REFERENCES "Round"("id") ON DELETE SET NULL ON UPDATE CASCADE;
EXCEPTION WHEN duplicate_object THEN NULL; END $$;

View File

@@ -0,0 +1,2 @@
-- Add pageCount column to ProjectFile (was in schema but missing migration)
ALTER TABLE "ProjectFile" ADD COLUMN IF NOT EXISTS "pageCount" INTEGER;

View File

@@ -0,0 +1,19 @@
-- =============================================================================
-- Schema Reconciliation: Fill remaining gaps between migrations and schema.prisma
-- =============================================================================
-- All statements are idempotent (safe to re-run on any database state).
-- 1. ConflictOfInterest: add standalone hasConflict index (schema has @@index([hasConflict]))
-- Migration 20260205223133 only created composite (roundId, hasConflict) index.
CREATE INDEX IF NOT EXISTS "ConflictOfInterest_hasConflict_idx" ON "ConflictOfInterest"("hasConflict");
-- 2. Ensure ConflictOfInterest.roundId is nullable (schema says String?)
-- Pipeline migration (20260213) makes it nullable, but guard for safety.
DO $$ BEGIN
ALTER TABLE "ConflictOfInterest" ALTER COLUMN "roundId" DROP NOT NULL;
EXCEPTION WHEN others THEN NULL;
END $$;
-- 3. Drop stale composite index that no longer matches schema
-- Schema only has @@index([hasConflict]) and @@index([userId]), not (roundId, hasConflict).
DROP INDEX IF EXISTS "ConflictOfInterest_roundId_hasConflict_idx";
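The "safe to re-run" property stated in this migration can be illustrated in miniature. This is a TypeScript toy model, not project code: a guarded create-index that is a no-op on re-run, the analogue of `CREATE INDEX IF NOT EXISTS` and the `DO $$ ... EXCEPTION` guards.

```typescript
// Toy model of idempotent DDL: applying the same guarded statement twice
// leaves the schema state unchanged.
type Schema = { indexes: Set<string> }

function createIndexIfNotExists(schema: Schema, name: string): void {
  // Guard mirrors IF NOT EXISTS: only mutate when the index is absent
  if (!schema.indexes.has(name)) schema.indexes.add(name)
}

const schema: Schema = { indexes: new Set<string>() }
createIndexIfNotExists(schema, 'ConflictOfInterest_hasConflict_idx')
createIndexIfNotExists(schema, 'ConflictOfInterest_hasConflict_idx') // safe re-run
console.log(schema.indexes.size) // 1
```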

View File

@@ -0,0 +1,5 @@
-- AlterTable
ALTER TABLE "ProjectFile" ADD COLUMN "textPreview" TEXT;
ALTER TABLE "ProjectFile" ADD COLUMN "detectedLang" TEXT;
ALTER TABLE "ProjectFile" ADD COLUMN "langConfidence" DOUBLE PRECISION;
ALTER TABLE "ProjectFile" ADD COLUMN "analyzedAt" TIMESTAMP(3);

View File

@@ -689,6 +689,12 @@ model ProjectFile {
size Int // bytes
pageCount Int? // Number of pages (PDFs, presentations, etc.)
// Document analysis (optional, populated by document-analyzer service)
textPreview String? @db.Text // First ~2000 chars of extracted text
detectedLang String? // ISO 639-3 code (e.g. 'eng', 'fra', 'und')
langConfidence Float? // 0.0-1.0 confidence
analyzedAt DateTime? // When analysis last ran
// MinIO location
bucket String
objectKey String
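As a rough sketch of how these analysis fields could feed the language-based screening criteria mentioned in the commit message, a hypothetical predicate (the function name and the 0.5 threshold are assumptions, not project code):

```typescript
// Hypothetical helper: decide whether a file satisfies a language criterion
// using detectedLang (ISO 639-3) and langConfidence (0.0-1.0).
interface AnalyzedFile {
  detectedLang: string | null
  langConfidence: number | null
}

function matchesLanguage(file: AnalyzedFile, wanted: string, minConfidence = 0.5): boolean {
  // 'und' means the detector could not determine a language; treat as non-matching
  if (!file.detectedLang || file.detectedLang === 'und') return false
  if ((file.langConfidence ?? 0) < minConfidence) return false
  return file.detectedLang === wanted
}

console.log(matchesLanguage({ detectedLang: 'eng', langConfidence: 0.92 }, 'eng')) // true
console.log(matchesLanguage({ detectedLang: 'und', langConfidence: 0.9 }, 'eng'))  // false
```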

File diff suppressed because it is too large

View File

@@ -58,6 +58,7 @@ import {
export default function MemberDetailPage() {
const params = useParams()
const router = useRouter()
const utils = trpc.useUtils()
const userId = params.id as string
const { data: user, isLoading, error, refetch } = trpc.user.get.useQuery({ id: userId })
@@ -103,6 +104,8 @@ export default function MemberDetailPage() {
expertiseTags,
maxAssignments: maxAssignments ? parseInt(maxAssignments) : null,
})
utils.user.get.invalidate({ id: userId })
utils.user.list.invalidate()
toast.success('Member updated successfully')
router.push('/admin/members')
} catch (error) {
@@ -115,6 +118,7 @@ export default function MemberDetailPage() {
await sendInvitation.mutateAsync({ userId })
toast.success('Invitation email sent successfully')
refetch()
utils.user.list.invalidate()
} catch (error) {
toast.error(error instanceof Error ? error.message : 'Failed to send invitation')
}

View File

@@ -49,7 +49,10 @@ import {
Heart,
Crown,
UserPlus,
Loader2,
ScanSearch,
} from 'lucide-react'
import { toast } from 'sonner'
import { formatDate, formatDateOnly } from '@/lib/utils'
interface PageProps {
@@ -105,14 +108,13 @@ function ProjectDetailContent({ projectId }: { projectId: string }) {
// Extract all rounds from the competition
const competitionRounds = competition?.rounds || []
// Fetch requirements for all rounds in a single query (avoids dynamic hook violation)
const roundIds = competitionRounds.map((r: { id: string }) => r.id)
const { data: allRequirements = [] } = trpc.file.listRequirementsByRounds.useQuery(
{ roundIds },
{ enabled: roundIds.length > 0 }
)
// Combine requirements from all rounds
const allRequirements = requirementQueries.flatMap((q: { data?: unknown[] }) => q.data || [])
const utils = trpc.useUtils()
if (isLoading) {
@@ -530,6 +532,8 @@ function ProjectDetailContent({ projectId }: { projectId: string }) {
<AnimatedCard index={4}>
<Card>
<CardHeader>
<div className="flex items-center justify-between">
<div>
<CardTitle className="flex items-center gap-2.5 text-lg">
<div className="rounded-lg bg-rose-500/10 p-1.5">
<FileText className="h-4 w-4 text-rose-500" />
@@ -539,6 +543,9 @@ function ProjectDetailContent({ projectId }: { projectId: string }) {
<CardDescription>
Project documents and materials organized by competition round
</CardDescription>
</div>
<AnalyzeDocumentsButton projectId={projectId} onComplete={() => utils.file.listByProject.invalidate({ projectId })} />
</div>
</CardHeader>
<CardContent className="space-y-6">
{/* Requirements organized by round */}
@@ -592,7 +599,7 @@ function ProjectDetailContent({ projectId }: { projectId: string }) {
</p>
)}
<div className="flex items-center gap-2 text-xs text-muted-foreground mt-0.5">
{req.acceptedMimeTypes?.length > 0 && (
<span>
{req.acceptedMimeTypes.map((mime: string) => {
if (mime === 'application/pdf') return 'PDF'
@@ -665,6 +672,11 @@ function ProjectDetailContent({ projectId }: { projectId: string }) {
size: f.size,
bucket: f.bucket,
objectKey: f.objectKey,
pageCount: f.pageCount,
textPreview: f.textPreview,
detectedLang: f.detectedLang,
langConfidence: f.langConfidence,
analyzedAt: f.analyzedAt ? String(f.analyzedAt) : null,
}))}
/>
</div>
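The `acceptedMimeTypes.map` in the hunk above renders MIME types as short labels; only the PDF case is visible in the diff. A hypothetical standalone version (every non-PDF entry and the fallback are illustrative assumptions, not the project's actual mapping):

```typescript
// Hypothetical sketch of a MIME-to-label mapping for listing accepted file types.
function mimeLabel(mime: string): string {
  const labels: Record<string, string> = {
    'application/pdf': 'PDF', // the one case shown in the diff
    'image/png': 'PNG',       // assumed entry
    'image/jpeg': 'JPEG',     // assumed entry
  }
  // Assumed fallback: upper-case the subtype, e.g. 'text/csv' -> 'CSV'
  return labels[mime] ?? (mime.split('/')[1] ?? mime).toUpperCase()
}

console.log(mimeLabel('application/pdf')) // PDF
console.log(mimeLabel('text/csv'))        // CSV
```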
@@ -848,6 +860,36 @@ function ProjectDetailSkeleton() {
)
}
function AnalyzeDocumentsButton({ projectId, onComplete }: { projectId: string; onComplete: () => void }) {
const analyzeMutation = trpc.file.analyzeProjectFiles.useMutation({
onSuccess: (result) => {
toast.success(
`Analyzed ${result.analyzed} file${result.analyzed !== 1 ? 's' : ''}${result.failed > 0 ? ` (${result.failed} failed)` : ''}`
)
onComplete()
},
onError: (error) => {
toast.error(error.message || 'Analysis failed')
},
})
return (
<Button
variant="outline"
size="sm"
onClick={() => analyzeMutation.mutate({ projectId })}
disabled={analyzeMutation.isPending}
>
{analyzeMutation.isPending ? (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
) : (
<ScanSearch className="mr-2 h-4 w-4" />
)}
{analyzeMutation.isPending ? 'Analyzing...' : 'Analyze Documents'}
</Button>
)
}
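The toast message assembled in `onSuccess` above can be factored into a pure helper; this sketch reuses the exact template from the component (only the helper name is new):

```typescript
// Pure extraction of the analyze-result toast message built in onSuccess above.
function analysisSummary(analyzed: number, failed: number): string {
  return `Analyzed ${analyzed} file${analyzed !== 1 ? 's' : ''}${failed > 0 ? ` (${failed} failed)` : ''}`
}

console.log(analysisSummary(1, 0)) // Analyzed 1 file
console.log(analysisSummary(3, 2)) // Analyzed 3 files (2 failed)
```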
export default function ProjectDetailPage({ params }: PageProps) {
const { id } = use(params)

View File

@@ -1,6 +1,6 @@
'use client'
import { useState, useCallback, useRef, useEffect, useMemo } from 'react'
import Link from 'next/link'
import { trpc } from '@/lib/trpc/client'
import { toast } from 'sonner'
@@ -46,6 +46,8 @@ import {
Loader2,
FileUp,
AlertCircle,
ExternalLink,
Trash2,
} from 'lucide-react'
import { cn, formatFileSize } from '@/lib/utils'
import { Pagination } from '@/components/shared/pagination'
@@ -77,12 +79,13 @@ export default function BulkUploadPage() {
label: string
mimeTypes: string[]
required: boolean
file: { id: string; fileName: string; bucket: string; objectKey: string } | null
}>
} | null>(null)
const [bulkFiles, setBulkFiles] = useState<Record<string, File | null>>({})
const fileInputRefs = useRef<Record<string, HTMLInputElement | null>>({})
const utils = trpc.useUtils()
// Debounce search
const searchTimer = useRef<ReturnType<typeof setTimeout>>(undefined)
@@ -109,6 +112,70 @@ export default function BulkUploadPage() {
{ enabled: !!roundId }
)
// Collect all files from current data for existence verification
const filesToVerify = useMemo(() => {
if (!data?.projects) return []
const files: { bucket: string; objectKey: string }[] = []
for (const row of data.projects) {
for (const req of row.requirements) {
if (req.file?.bucket && req.file?.objectKey) {
files.push({ bucket: req.file.bucket, objectKey: req.file.objectKey })
}
}
}
return files
}, [data])
// Verify files actually exist in storage
const { data: fileExistence } = trpc.file.verifyFilesExist.useQuery(
{ files: filesToVerify },
{ enabled: filesToVerify.length > 0, staleTime: 30_000 }
)
// Track which files are missing from storage (objectKey → true means missing)
const missingFiles = useMemo(() => {
if (!fileExistence) return new Set<string>()
const missing = new Set<string>()
for (const [key, exists] of Object.entries(fileExistence)) {
if (!exists) missing.add(key)
}
return missing
}, [fileExistence])
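The `missingFiles` memo above reduces the existence map to a set of missing object keys. As a standalone function (assuming, as the iteration above suggests, that `verifyFilesExist` returns a plain objectKey-to-boolean record):

```typescript
// Pure version of the missingFiles computation: collect keys whose files
// are absent from storage.
function collectMissing(existence: Record<string, boolean>): Set<string> {
  const missing = new Set<string>()
  for (const [key, exists] of Object.entries(existence)) {
    if (!exists) missing.add(key)
  }
  return missing
}

const example = collectMissing({ 'a/doc.pdf': true, 'b/deck.pdf': false })
console.log([...example]) // ['b/deck.pdf']
```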
// Open file in new tab via presigned URL
const handleViewFile = useCallback(
async (bucket: string, objectKey: string) => {
try {
const { url } = await utils.file.getDownloadUrl.fetch({ bucket, objectKey })
window.open(url, '_blank')
} catch {
toast.error('Failed to open file. It may have been deleted from storage.')
refetch()
}
},
[utils, refetch]
)
// Delete a file
const deleteMutation = trpc.file.delete.useMutation({
onSuccess: () => {
toast.success('File removed')
refetch()
},
onError: (err) => {
toast.error(`Failed to remove file: ${err.message}`)
},
})
const handleDeleteFile = useCallback(
(fileId: string) => {
if (confirm('Remove this uploaded file?')) {
deleteMutation.mutate({ id: fileId })
}
},
[deleteMutation]
)
const uploadMutation = trpc.file.adminUploadForRoundRequirement.useMutation()
// Upload a single file for a project requirement
@@ -390,7 +457,7 @@ export default function BulkUploadPage() {
<TableBody>
{data.projects.map((row) => {
const missingRequired = row.requirements.filter(
(r) => r.required && (!r.file || (r.file?.objectKey && missingFiles.has(r.file.objectKey)))
)
return (
<TableRow
@@ -446,12 +513,57 @@ export default function BulkUploadPage() {
Retry
</Button>
</div>
) : req.file && req.file.objectKey && missingFiles.has(req.file.objectKey) ? (
<div className="flex flex-col items-center gap-1">
<AlertCircle className="h-4 w-4 text-amber-500" />
<span className="text-[10px] text-amber-600 font-medium">Missing</span>
<Button
variant="outline"
size="sm"
className="h-6 px-2 text-[10px]"
onClick={() =>
handleCellUpload(
row.project.id,
req.requirementId,
req.mimeTypes
)
}
>
Re-upload
</Button>
</div>
) : req.file || uploadState?.status === 'complete' ? (
<div className="flex flex-col items-center gap-1">
<div className="flex items-center gap-1">
<CheckCircle2 className="h-4 w-4 text-green-600" />
{req.file && (
<button
type="button"
className="text-muted-foreground hover:text-destructive transition-colors cursor-pointer"
title="Remove file"
onClick={() => handleDeleteFile(req.file!.id)}
disabled={deleteMutation.isPending}
>
<Trash2 className="h-3 w-3" />
</button>
)}
</div>
{req.file?.bucket && req.file?.objectKey ? (
<button
type="button"
className="text-[10px] text-teal-600 hover:text-teal-800 hover:underline truncate max-w-[120px] flex items-center gap-0.5 cursor-pointer"
onClick={() =>
handleViewFile(req.file!.bucket, req.file!.objectKey)
}
>
{req.file.fileName}
<ExternalLink className="h-2.5 w-2.5 shrink-0" />
</button>
) : (
<span className="text-[10px] text-muted-foreground truncate max-w-[120px]">
{req.file?.fileName ?? 'Uploaded'}
</span>
)}
</div>
) : (
<Button

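The `missingRequired` filter near the top of the table body treats a required slot as unmet when no file record exists or when the stored object is gone from storage. A pure sketch of that predicate (the type and helper names are illustrative):

```typescript
interface ReqRow {
  required: boolean
  file: { objectKey: string } | null
}

// Mirrors the filter above: a required slot counts as missing when no file
// record exists, or its object key is in the missing-from-storage set.
function isUnmet(req: ReqRow, missingKeys: Set<string>): boolean {
  return req.required && (!req.file || missingKeys.has(req.file.objectKey))
}

const missingKeys = new Set(['gone.pdf'])
console.log(isUnmet({ required: true, file: null }, missingKeys))                       // true
console.log(isUnmet({ required: true, file: { objectKey: 'gone.pdf' } }, missingKeys))  // true
console.log(isUnmet({ required: true, file: { objectKey: 'ok.pdf' } }, missingKeys))    // false
```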
View File

@@ -366,8 +366,9 @@ export default function ProjectsPage() {
}
const handleCloseTaggingDialog = () => {
setAiTagDialogOpen(false)
// Only reset job state if not in progress (preserve polling for background jobs)
if (!taggingInProgress) {
setActiveTaggingJobId(null)
setSelectedRoundForTagging('')
setSelectedProgramForTagging('')
@@ -618,9 +619,22 @@ export default function ProjectsPage() {
</p>
</div>
<div className="flex flex-wrap gap-2">
<Button
variant="outline"
onClick={() => setAiTagDialogOpen(true)}
className={taggingInProgress ? 'border-amber-400 bg-amber-50 dark:bg-amber-950/20' : ''}
>
{taggingInProgress ? (
<Loader2 className="mr-2 h-4 w-4 animate-spin text-amber-600" />
) : (
<Bot className="mr-2 h-4 w-4" />
)}
AI Tags
{taggingInProgress && (
<span className="ml-1.5 text-[10px] text-amber-600 font-medium">
{taggingProgressPercent}%
</span>
)}
</Button>
<Button variant="outline" asChild>
<Link href="/admin/projects/pool">
@@ -1833,9 +1847,8 @@ export default function ProjectsPage() {
<Button
variant="outline"
onClick={handleCloseTaggingDialog}
disabled={taggingInProgress}
>
{taggingInProgress ? 'Run in Background' : 'Cancel'}
</Button>
<Button
onClick={handleStartTagging}

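The close handler and button label above implement a "run in background" behaviour: closing the dialog while a tagging job is running keeps the job id (and thus the polling) alive instead of resetting it. A minimal sketch of that state transition (the types and function name are assumptions):

```typescript
// Toy reducer for the dialog-close behaviour: the dialog always closes, but
// job state is only reset when no tagging job is in progress.
type TaggingState = { dialogOpen: boolean; activeJobId: string | null }

function closeDialog(state: TaggingState, inProgress: boolean): TaggingState {
  return {
    dialogOpen: false,
    activeJobId: inProgress ? state.activeJobId : null, // preserve background job
  }
}

const background = closeDialog({ dialogOpen: true, activeJobId: 'job-1' }, true)
console.log(background.activeJobId) // job-1
```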
File diff suppressed because it is too large

View File

@@ -9,7 +9,7 @@ import { Skeleton } from '@/components/ui/skeleton'
import { SettingsContent } from '@/components/settings/settings-content'
// Categories that only super admins can access
const SUPER_ADMIN_CATEGORIES = new Set(['AI', 'EMAIL', 'STORAGE', 'SECURITY', 'WHATSAPP'])
async function SettingsLoader({ isSuperAdmin }: { isSuperAdmin: boolean }) {
const settings = await prisma.systemSettings.findMany({
View File
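A minimal sketch of the access check implied by `SUPER_ADMIN_CATEGORIES` above (the helper name is hypothetical; only the set contents come from the diff):

```typescript
// Categories that only super admins can access (as in the diff above).
const SUPER_ADMIN_CATEGORIES = new Set(['AI', 'EMAIL', 'STORAGE', 'SECURITY', 'WHATSAPP'])

// Hypothetical gate: super admins see everything, others only non-restricted categories.
function canAccessCategory(category: string, isSuperAdmin: boolean): boolean {
  return isSuperAdmin || !SUPER_ADMIN_CATEGORIES.has(category)
}

console.log(canAccessCategory('WHATSAPP', false)) // false
console.log(canAccessCategory('GENERAL', false))  // true
```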

@@ -45,6 +45,8 @@ export function AssignmentPreviewSheet({
toast.success(`Created ${result.created} assignments`)
utils.roundAssignment.coverageReport.invalidate({ roundId })
utils.roundAssignment.unassignedQueue.invalidate({ roundId })
utils.assignment.listByStage.invalidate({ roundId })
utils.roundEngine.getProjectStates.invalidate({ roundId })
onOpenChange(false)
},
onError: (err) => {

View File

@@ -36,14 +36,12 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
// Search existing user state
const [searchQuery, setSearchQuery] = useState('')
const [selectedUserId, setSelectedUserId] = useState<string>('')
const [role, setRole] = useState<'CHAIR' | 'MEMBER' | 'OBSERVER'>('MEMBER')
const [maxAssignments, setMaxAssignments] = useState<string>('')
const [capMode, setCapMode] = useState<string>('')
// Invite new user state
const [inviteName, setInviteName] = useState('')
const [inviteEmail, setInviteEmail] = useState('')
const [inviteRole, setInviteRole] = useState<'CHAIR' | 'MEMBER' | 'OBSERVER'>('MEMBER')
const [inviteMaxAssignments, setInviteMaxAssignments] = useState<string>('')
const [inviteCapMode, setInviteCapMode] = useState<string>('')
const [inviteExpertise, setInviteExpertise] = useState('')
@@ -75,7 +73,7 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
addMember({
juryGroupId,
userId: newUser.id,
role: 'MEMBER',
maxAssignmentsOverride: inviteMaxAssignments ? parseInt(inviteMaxAssignments, 10) : null,
capModeOverride: inviteCapMode && inviteCapMode !== 'DEFAULT' ? (inviteCapMode as 'HARD' | 'SOFT' | 'NONE') : null,
})
@@ -90,6 +88,7 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
const { mutate: sendInvitation } = trpc.user.sendInvitation.useMutation({
onSuccess: (result) => {
toast.success(`Invitation sent to ${result.email}`)
utils.user.list.invalidate()
},
onError: (err) => {
// Don't block — user was created and added, just invitation failed
@@ -100,12 +99,10 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
const resetForm = () => {
setSearchQuery('')
setSelectedUserId('')
setRole('MEMBER')
setMaxAssignments('')
setCapMode('')
setInviteName('')
setInviteEmail('')
setInviteRole('MEMBER')
setInviteMaxAssignments('')
setInviteCapMode('')
setInviteExpertise('')
@@ -122,7 +119,7 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
addMember({
juryGroupId,
userId: selectedUserId,
role: 'MEMBER',
maxAssignmentsOverride: maxAssignments ? parseInt(maxAssignments, 10) : null,
capModeOverride: capMode && capMode !== 'DEFAULT' ? (capMode as 'HARD' | 'SOFT' | 'NONE') : null,
})
@@ -215,20 +212,6 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
</div>
<div className="grid grid-cols-2 gap-3">
<div className="space-y-2">
<Label htmlFor="role">Group Role</Label>
<Select value={role} onValueChange={(val) => setRole(val as typeof role)}>
<SelectTrigger id="role">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="MEMBER">Member</SelectItem>
<SelectItem value="CHAIR">Chair</SelectItem>
<SelectItem value="OBSERVER">Observer</SelectItem>
</SelectContent>
</Select>
</div>
<div className="space-y-2">
<Label htmlFor="capMode">Cap Mode</Label>
<Select value={capMode || 'DEFAULT'} onValueChange={setCapMode}>
@@ -298,20 +281,6 @@ export function AddMemberDialog({ juryGroupId, open, onOpenChange }: AddMemberDi
</div>
<div className="grid grid-cols-2 gap-3">
<div className="space-y-2">
<Label htmlFor="inviteGroupRole">Group Role</Label>
<Select value={inviteRole} onValueChange={(val) => setInviteRole(val as typeof inviteRole)}>
<SelectTrigger id="inviteGroupRole">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="MEMBER">Member</SelectItem>
<SelectItem value="CHAIR">Chair</SelectItem>
<SelectItem value="OBSERVER">Observer</SelectItem>
</SelectContent>
</Select>
</div>
<div className="space-y-2">
<Label htmlFor="inviteCapMode">Cap Mode</Label>
<Select value={inviteCapMode || 'DEFAULT'} onValueChange={setInviteCapMode}>

View File

@@ -29,15 +29,15 @@ import { AddMemberDialog } from './add-member-dialog'
interface JuryMember {
id: string
userId: string
role?: string
user: {
id: string
name: string | null
email: string
}
maxAssignmentsOverride?: number | null
capModeOverride?: string | null
preferredStartupRatio?: number | null
}
interface JuryMembersTableProps {
@@ -81,7 +81,6 @@ export function JuryMembersTable({ juryGroupId, members }: JuryMembersTableProps
<TableRow>
<TableHead>Name</TableHead>
<TableHead>Email</TableHead>
<TableHead className="hidden md:table-cell">Role</TableHead>
<TableHead className="hidden sm:table-cell">Max Assignments</TableHead>
<TableHead className="hidden lg:table-cell">Cap Mode</TableHead>
<TableHead>Actions</TableHead>
@@ -90,7 +89,7 @@ export function JuryMembersTable({ juryGroupId, members }: JuryMembersTableProps
<TableBody> <TableBody>
{members.length === 0 ? ( {members.length === 0 ? (
<TableRow> <TableRow>
<TableCell colSpan={6} className="text-center text-muted-foreground"> <TableCell colSpan={5} className="text-center text-muted-foreground">
No members yet. Add members to get started. No members yet. Add members to get started.
</TableCell> </TableCell>
</TableRow> </TableRow>
@@ -103,11 +102,6 @@ export function JuryMembersTable({ juryGroupId, members }: JuryMembersTableProps
<TableCell className="text-sm text-muted-foreground"> <TableCell className="text-sm text-muted-foreground">
{member.user.email} {member.user.email}
</TableCell> </TableCell>
<TableCell className="hidden md:table-cell">
<Badge variant={member.role === 'CHAIR' ? 'default' : 'secondary'}>
{member.role}
</Badge>
</TableCell>
<TableCell className="hidden sm:table-cell"> <TableCell className="hidden sm:table-cell">
{member.maxAssignmentsOverride ?? '—'} {member.maxAssignmentsOverride ?? '—'}
</TableCell> </TableCell>


@@ -198,6 +198,8 @@ export function FilteringDashboard({ competitionId, roundId }: FilteringDashboar
 onSuccess: (data) => {
   utils.filtering.getResults.invalidate()
   utils.filtering.getResultStats.invalidate({ roundId })
+  utils.roundEngine.getProjectStates.invalidate({ roundId })
+  utils.project.list.invalidate()
   toast.success(
     `Finalized: ${data.passed} passed, ${data.filteredOut} filtered out` +
     (data.advancedToStageName ? `. Next round: ${data.advancedToStageName}` : '')
@@ -1597,7 +1599,7 @@ function FilteringRulesSection({ roundId }: { roundId: string }) {
   className="text-sm"
 />
 <p className="text-xs text-muted-foreground mt-1">
-  The AI has access to: category, country, region, founded year, ocean issue, tags, description, file details (type, page count, size), and team size.
+  The AI has access to: category, country, region, founded year, ocean issue, tags, description, file details (type, page count, size, detected language), and team size.
 </p>
 </div>


@@ -250,6 +250,7 @@ export function UserMobileActions({
 try {
   await sendInvitation.mutateAsync({ userId })
   toast.success(`Invitation sent to ${userEmail}`)
+  utils.user.list.invalidate()
 } catch (error) {
   toast.error(error instanceof Error ? error.message : 'Failed to send invitation')
 } finally {


@@ -4,7 +4,7 @@ import { useForm } from 'react-hook-form'
 import { zodResolver } from '@hookform/resolvers/zod'
 import { z } from 'zod'
 import { toast } from 'sonner'
-import { Cog, Loader2, Zap, AlertCircle, RefreshCw, SlidersHorizontal } from 'lucide-react'
+import { Cog, Loader2, Zap, AlertCircle, RefreshCw, SlidersHorizontal, Info } from 'lucide-react'
 import { trpc } from '@/lib/trpc/client'
 import { Button } from '@/components/ui/button'
 import { Input } from '@/components/ui/input'
@@ -36,6 +36,7 @@ const formSchema = z.object({
   ai_model: z.string(),
   ai_send_descriptions: z.boolean(),
   openai_api_key: z.string().optional(),
+  openai_base_url: z.string().optional(),
 })
 type FormValues = z.infer<typeof formSchema>
@@ -47,6 +48,7 @@ interface AISettingsFormProps {
   ai_model?: string
   ai_send_descriptions?: string
   openai_api_key?: string
+  openai_base_url?: string
   }
 }
@@ -61,10 +63,14 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
   ai_model: settings.ai_model || 'gpt-4o',
   ai_send_descriptions: settings.ai_send_descriptions === 'true',
   openai_api_key: '',
+  openai_base_url: settings.openai_base_url || '',
   },
 })
-// Fetch available models from OpenAI API
+const watchProvider = form.watch('ai_provider')
+const isLiteLLM = watchProvider === 'litellm'
+// Fetch available models from OpenAI API (skip for LiteLLM — no models.list support)
 const {
   data: modelsData,
   isLoading: modelsLoading,
@@ -73,6 +79,7 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
 } = trpc.settings.listAIModels.useQuery(undefined, {
   staleTime: 5 * 60 * 1000, // Cache for 5 minutes
   retry: false,
+  enabled: !isLiteLLM,
 })
 const updateSettings = trpc.settings.updateMultiple.useMutation({
@@ -113,6 +120,9 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
   settingsToUpdate.push({ key: 'openai_api_key', value: data.openai_api_key })
 }
+// Save base URL (empty string clears it)
+settingsToUpdate.push({ key: 'openai_base_url', value: data.openai_base_url?.trim() || '' })
 updateSettings.mutate({ settings: settingsToUpdate })
 }
@@ -176,11 +186,50 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
   </SelectTrigger>
 </FormControl>
 <SelectContent>
-  <SelectItem value="openai">OpenAI</SelectItem>
+  <SelectItem value="openai">OpenAI (API Key)</SelectItem>
+  <SelectItem value="litellm">LiteLLM Proxy (ChatGPT Subscription)</SelectItem>
 </SelectContent>
 </Select>
 <FormDescription>
-  AI provider for smart assignment suggestions
+  {field.value === 'litellm'
+    ? 'Route AI calls through a LiteLLM proxy connected to your ChatGPT Plus/Pro subscription'
+    : 'Direct OpenAI API access using your API key'}
+</FormDescription>
+<FormMessage />
+</FormItem>
+)}
+/>
+{isLiteLLM && (
+  <Alert>
+    <Info className="h-4 w-4" />
+    <AlertDescription>
+      <strong>LiteLLM Proxy Mode</strong> AI calls will be routed through your LiteLLM proxy
+      using your ChatGPT subscription. Token limits are automatically stripped (not supported by ChatGPT backend).
+      Make sure your LiteLLM proxy is running and accessible.
+    </AlertDescription>
+  </Alert>
+)}
+<FormField
+  control={form.control}
+  name="openai_api_key"
+  render={({ field }) => (
+    <FormItem>
+      <FormLabel>{isLiteLLM ? 'API Key (Optional)' : 'API Key'}</FormLabel>
+      <FormControl>
+        <Input
+          type="password"
+          placeholder={isLiteLLM
+            ? 'Optional — leave blank for default'
+            : (settings.openai_api_key ? '••••••••' : 'Enter API key')}
+          {...field}
+        />
+      </FormControl>
+      <FormDescription>
+        {isLiteLLM
+          ? 'LiteLLM proxy usually does not require an API key. Leave blank to use default.'
+          : 'Your OpenAI API key. Leave blank to keep the existing key.'}
       </FormDescription>
       <FormMessage />
     </FormItem>
@@ -189,19 +238,29 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
 <FormField
   control={form.control}
-  name="openai_api_key"
+  name="openai_base_url"
   render={({ field }) => (
     <FormItem>
-      <FormLabel>API Key</FormLabel>
+      <FormLabel>{isLiteLLM ? 'LiteLLM Proxy URL' : 'API Base URL (Optional)'}</FormLabel>
       <FormControl>
         <Input
-          type="password"
-          placeholder={settings.openai_api_key ? '••••••••' : 'Enter API key'}
+          placeholder={isLiteLLM ? 'http://localhost:4000' : 'https://api.openai.com/v1'}
          {...field}
        />
      </FormControl>
      <FormDescription>
-       Your OpenAI API key. Leave blank to keep the existing key.
+       {isLiteLLM ? (
+         <>
+           URL of your LiteLLM proxy. Typically{' '}
+           <code className="text-xs bg-muted px-1 rounded">http://localhost:4000</code>{' '}
+           or your server address.
+         </>
+       ) : (
+         <>
+           Custom base URL for OpenAI-compatible providers. Leave blank for OpenAI.
+           Use <code className="text-xs bg-muted px-1 rounded">https://openrouter.ai/api/v1</code> for OpenRouter.
+         </>
+       )}
      </FormDescription>
      <FormMessage />
    </FormItem>
@@ -215,7 +274,7 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
 <FormItem>
   <div className="flex items-center justify-between">
     <FormLabel>Model</FormLabel>
-    {modelsData?.success && (
+    {!isLiteLLM && modelsData?.success && !modelsData?.manualEntry && (
       <Button
         type="button"
         variant="ghost"
@@ -229,7 +288,13 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
     )}
   </div>
-  {modelsLoading ? (
+  {isLiteLLM || modelsData?.manualEntry ? (
+    <Input
+      value={field.value}
+      onChange={(e) => field.onChange(e.target.value)}
+      placeholder="chatgpt/gpt-5.2"
+    />
+  ) : modelsLoading ? (
     <Skeleton className="h-10 w-full" />
   ) : modelsError || !modelsData?.success ? (
     <div className="space-y-2">
@@ -276,7 +341,15 @@ export function AISettingsForm({ settings }: AISettingsFormProps) {
   </Select>
 )}
 <FormDescription>
-  {form.watch('ai_model')?.startsWith('o') ? (
+  {isLiteLLM ? (
+    <>
+      Enter the model ID with the{' '}
+      <code className="text-xs bg-muted px-1 rounded">chatgpt/</code> prefix.
+      Examples:{' '}
+      <code className="text-xs bg-muted px-1 rounded">chatgpt/gpt-5.2</code>,{' '}
+      <code className="text-xs bg-muted px-1 rounded">chatgpt/gpt-5.2-codex</code>
+    </>
+  ) : form.watch('ai_model')?.startsWith('o') ? (
     <span className="flex items-center gap-1 text-purple-600">
       <SlidersHorizontal className="h-3 w-3" />
       Reasoning model - optimized for complex analysis tasks


@@ -25,6 +25,7 @@ import {
   ShieldAlert,
   Globe,
   Webhook,
+  MessageCircle,
 } from 'lucide-react'
 import Link from 'next/link'
 import { AnimatedCard } from '@/components/shared/animated-container'
@@ -84,6 +85,7 @@ export function SettingsContent({ initialSettings, isSuperAdmin = true }: Settin
   'ai_model',
   'ai_send_descriptions',
   'openai_api_key',
+  'openai_base_url',
 ])
 const brandingSettings = getSettingsByKeys([
@@ -102,8 +104,12 @@ export function SettingsContent({ initialSettings, isSuperAdmin = true }: Settin
 ])
 const storageSettings = getSettingsByKeys([
+  'storage_provider',
+  'local_storage_path',
   'max_file_size_mb',
+  'avatar_max_size_mb',
   'allowed_file_types',
+  'allowed_image_types',
 ])
 const securitySettings = getSettingsByKeys([
@@ -146,6 +152,11 @@ export function SettingsContent({ initialSettings, isSuperAdmin = true }: Settin
   'anomaly_off_hours_end',
 ])
+const whatsappSettings = getSettingsByKeys([
+  'whatsapp_enabled',
+  'whatsapp_provider',
+])
 const localizationSettings = getSettingsByKeys([
   'localization_enabled_locales',
   'localization_default_locale',
@@ -182,6 +193,12 @@ export function SettingsContent({ initialSettings, isSuperAdmin = true }: Settin
   <Newspaper className="h-4 w-4" />
   Digest
 </TabsTrigger>
+{isSuperAdmin && (
+  <TabsTrigger value="whatsapp" className="gap-2 shrink-0">
+    <MessageCircle className="h-4 w-4" />
+    WhatsApp
+  </TabsTrigger>
+)}
 {isSuperAdmin && (
   <TabsTrigger value="security" className="gap-2 shrink-0">
     <Shield className="h-4 w-4" />
@@ -258,6 +275,12 @@ export function SettingsContent({ initialSettings, isSuperAdmin = true }: Settin
   <Newspaper className="h-4 w-4" />
   Digest
 </TabsTrigger>
+{isSuperAdmin && (
+  <TabsTrigger value="whatsapp" className="justify-start gap-2 w-full px-3 py-2 h-auto data-[state=active]:bg-muted">
+    <MessageCircle className="h-4 w-4" />
+    WhatsApp
+  </TabsTrigger>
+)}
 </TabsList>
 </div>
 <div>
@@ -501,6 +524,24 @@ export function SettingsContent({ initialSettings, isSuperAdmin = true }: Settin
   </Card>
 </AnimatedCard>
 </TabsContent>
+{isSuperAdmin && (
+  <TabsContent value="whatsapp" className="space-y-6">
+    <AnimatedCard>
+      <Card>
+        <CardHeader>
+          <CardTitle>WhatsApp Notifications</CardTitle>
+          <CardDescription>
+            Configure WhatsApp messaging for notifications
+          </CardDescription>
+        </CardHeader>
+        <CardContent>
+          <WhatsAppSettingsSection settings={whatsappSettings} />
+        </CardContent>
+      </Card>
+    </AnimatedCard>
+  </TabsContent>
+)}
 </div>{/* end content area */}
 </div>{/* end lg:flex */}
 </Tabs>
@@ -793,6 +834,29 @@ function AuditSettingsSection({ settings }: { settings: Record<string, string> }
 )
 }
+function WhatsAppSettingsSection({ settings }: { settings: Record<string, string> }) {
+  return (
+    <div className="space-y-4">
+      <SettingToggle
+        label="Enable WhatsApp Notifications"
+        description="Send notifications via WhatsApp in addition to email"
+        settingKey="whatsapp_enabled"
+        value={settings.whatsapp_enabled || 'false'}
+      />
+      <SettingSelect
+        label="WhatsApp Provider"
+        description="Select the API provider for sending WhatsApp messages"
+        settingKey="whatsapp_provider"
+        value={settings.whatsapp_provider || 'META'}
+        options={[
+          { value: 'META', label: 'Meta (WhatsApp Business API)' },
+          { value: 'TWILIO', label: 'Twilio' },
+        ]}
+      />
+    </div>
+  )
+}
 function LocalizationSettingsSection({ settings }: { settings: Record<string, string> }) {
   const mutation = useSettingsMutation()
   const enabledLocales = (settings.localization_enabled_locales || 'en').split(',')


@@ -22,6 +22,14 @@ import {
 } from '@/components/ui/form'
 // Note: Storage provider cache is cleared server-side when settings are updated
+const COMMON_IMAGE_TYPES = [
+  { value: 'image/png', label: 'PNG (.png)' },
+  { value: 'image/jpeg', label: 'JPEG (.jpg, .jpeg)' },
+  { value: 'image/webp', label: 'WebP (.webp)' },
+  { value: 'image/gif', label: 'GIF (.gif)' },
+  { value: 'image/svg+xml', label: 'SVG (.svg)' },
+]
 const COMMON_FILE_TYPES = [
   { value: 'application/pdf', label: 'PDF Documents (.pdf)' },
   { value: 'video/mp4', label: 'MP4 Video (.mp4)' },
@@ -41,6 +49,7 @@ const formSchema = z.object({
   max_file_size_mb: z.string().regex(/^\d+$/, 'Must be a number'),
   avatar_max_size_mb: z.string().regex(/^\d+$/, 'Must be a number'),
   allowed_file_types: z.array(z.string()).min(1, 'Select at least one file type'),
+  allowed_image_types: z.array(z.string()).min(1, 'Select at least one image type'),
 })
 type FormValues = z.infer<typeof formSchema>
@@ -52,6 +61,7 @@ interface StorageSettingsFormProps {
   max_file_size_mb?: string
   avatar_max_size_mb?: string
   allowed_file_types?: string
+  allowed_image_types?: string
   }
 }
@@ -68,6 +78,16 @@ export function StorageSettingsForm({ settings }: StorageSettingsFormProps) {
   allowedTypes = ['application/pdf', 'video/mp4', 'video/quicktime', 'image/png', 'image/jpeg']
 }
+// Parse allowed image types from JSON string
+let allowedImageTypes: string[] = []
+try {
+  allowedImageTypes = settings.allowed_image_types
+    ? JSON.parse(settings.allowed_image_types)
+    : ['image/png', 'image/jpeg', 'image/webp']
+} catch {
+  allowedImageTypes = ['image/png', 'image/jpeg', 'image/webp']
+}
 const form = useForm<FormValues>({
   resolver: zodResolver(formSchema),
   defaultValues: {
@@ -76,6 +96,7 @@ export function StorageSettingsForm({ settings }: StorageSettingsFormProps) {
   max_file_size_mb: settings.max_file_size_mb || '500',
   avatar_max_size_mb: settings.avatar_max_size_mb || '5',
   allowed_file_types: allowedTypes,
+  allowed_image_types: allowedImageTypes,
   },
 })
@@ -99,6 +120,7 @@ export function StorageSettingsForm({ settings }: StorageSettingsFormProps) {
   { key: 'max_file_size_mb', value: data.max_file_size_mb },
   { key: 'avatar_max_size_mb', value: data.avatar_max_size_mb },
   { key: 'allowed_file_types', value: JSON.stringify(data.allowed_file_types) },
+  { key: 'allowed_image_types', value: JSON.stringify(data.allowed_image_types) },
 ],
 })
 }
@@ -255,6 +277,57 @@ export function StorageSettingsForm({ settings }: StorageSettingsFormProps) {
   )}
 />
+<FormField
+  control={form.control}
+  name="allowed_image_types"
+  render={() => (
+    <FormItem>
+      <div className="mb-4">
+        <FormLabel>Allowed Image Types (Avatars/Logos)</FormLabel>
+        <FormDescription>
+          Select which image formats can be used for profile pictures and project logos
+        </FormDescription>
+      </div>
+      <div className="grid gap-3 md:grid-cols-2">
+        {COMMON_IMAGE_TYPES.map((type) => (
+          <FormField
+            key={type.value}
+            control={form.control}
+            name="allowed_image_types"
+            render={({ field }) => {
+              return (
+                <FormItem
+                  key={type.value}
+                  className="flex items-start space-x-3 space-y-0"
+                >
+                  <FormControl>
+                    <Checkbox
+                      checked={field.value?.includes(type.value)}
+                      onCheckedChange={(checked) => {
+                        return checked
+                          ? field.onChange([...field.value, type.value])
+                          : field.onChange(
+                              field.value?.filter(
+                                (value) => value !== type.value
+                              )
+                            )
+                      }}
+                    />
+                  </FormControl>
+                  <FormLabel className="cursor-pointer text-sm font-normal">
+                    {type.label}
+                  </FormLabel>
+                </FormItem>
+              )
+            }}
+          />
+        ))}
+      </div>
+      <FormMessage />
+    </FormItem>
+  )}
+/>
 {storageProvider === 's3' && (
   <div className="rounded-lg border border-muted bg-muted/50 p-4">
     <p className="text-sm text-muted-foreground">


@@ -65,6 +65,12 @@ interface ProjectFile {
   isLate?: boolean
   requirementId?: string | null
   requirement?: FileRequirementInfo | null
+  // Document analysis fields
+  pageCount?: number | null
+  textPreview?: string | null
+  detectedLang?: string | null
+  langConfidence?: number | null
+  analyzedAt?: Date | string | null
 }
 interface RoundGroup {
@@ -270,6 +276,25 @@ function FileItem({ file }: { file: ProjectFile }) {
   </Badge>
 )}
 <span>{formatFileSize(file.size)}</span>
+{file.pageCount != null && (
+  <Badge variant="outline" className="text-xs gap-1">
+    <FileText className="h-3 w-3" />
+    {file.pageCount} {file.pageCount === 1 ? 'page' : 'pages'}
+  </Badge>
+)}
+{file.detectedLang && file.detectedLang !== 'und' && (
+  <Badge
+    variant="outline"
+    className={cn('text-xs font-mono uppercase', {
+      'border-green-300 text-green-700 bg-green-50': file.langConfidence != null && file.langConfidence >= 0.8,
+      'border-amber-300 text-amber-700 bg-amber-50': file.langConfidence != null && file.langConfidence >= 0.4 && file.langConfidence < 0.8,
+      'border-red-300 text-red-700 bg-red-50': file.langConfidence != null && file.langConfidence < 0.4,
+    })}
+    title={`Language: ${file.detectedLang} (${Math.round((file.langConfidence ?? 0) * 100)}% confidence)`}
+  >
+    {file.detectedLang.toUpperCase()}
+  </Badge>
+)}
 </div>
 </div>
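The badge styling above tiers the franc confidence score at 0.8 and 0.4. As an illustrative sketch (not code from this diff; `confidenceTier` is a hypothetical helper name), the same thresholds factor out to:

```typescript
// Hypothetical helper mirroring the badge color thresholds in the diff above:
// >= 0.8 green (high), >= 0.4 amber (medium), below that red (low).
function confidenceTier(confidence: number | null): 'high' | 'medium' | 'low' | 'unknown' {
  if (confidence == null) return 'unknown'
  if (confidence >= 0.8) return 'high'
  if (confidence >= 0.4) return 'medium'
  return 'low'
}
```

Extracting the tier keeps the three `cn()` class conditions from drifting apart if the thresholds ever change.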


@@ -8,6 +8,33 @@ const globalForOpenAI = globalThis as unknown as {
   openaiInitialized: boolean
 }
+// ─── Provider Detection ─────────────────────────────────────────────────────
+/**
+ * Get the configured AI provider from SystemSettings.
+ * Returns 'openai' (default) or 'litellm' (ChatGPT subscription proxy).
+ */
+export async function getConfiguredProvider(): Promise<'openai' | 'litellm'> {
+  try {
+    const setting = await prisma.systemSettings.findUnique({
+      where: { key: 'ai_provider' },
+    })
+    const value = setting?.value || 'openai'
+    return value === 'litellm' ? 'litellm' : 'openai'
+  } catch {
+    return 'openai'
+  }
+}
+/**
+ * Check if a model ID indicates LiteLLM ChatGPT subscription routing.
+ * Models like 'chatgpt/gpt-5.2' use the chatgpt/ prefix.
+ * Used by buildCompletionParams (sync) to strip unsupported token limit fields.
+ */
+export function isLiteLLMChatGPTModel(model: string): boolean {
+  return model.toLowerCase().startsWith('chatgpt/')
+}
 // ─── Model Type Detection ────────────────────────────────────────────────────
 /**
@@ -168,6 +195,12 @@ export function buildCompletionParams(
   params.response_format = { type: 'json_object' }
 }
+// LiteLLM ChatGPT subscription models reject token limit fields
+if (isLiteLLMChatGPTModel(model)) {
+  delete params.max_tokens
+  delete params.max_completion_tokens
+}
 return params
 }
@@ -187,18 +220,47 @@ async function getOpenAIApiKey(): Promise<string | null> {
 }
 /**
- * Create OpenAI client instance
+ * Get custom base URL for OpenAI-compatible providers.
+ * Supports OpenRouter, Together AI, Groq, local models, etc.
+ * Set via Settings → AI or OPENAI_BASE_URL env var.
+ */
+async function getBaseURL(): Promise<string | undefined> {
+  try {
+    const setting = await prisma.systemSettings.findUnique({
+      where: { key: 'openai_base_url' },
+    })
+    return setting?.value || process.env.OPENAI_BASE_URL || undefined
+  } catch {
+    return process.env.OPENAI_BASE_URL || undefined
+  }
+}
+/**
+ * Create OpenAI client instance.
+ * Supports custom baseURL for OpenAI-compatible providers
+ * (OpenRouter, Groq, Together AI, local models, etc.)
 */
 async function createOpenAIClient(): Promise<OpenAI | null> {
   const apiKey = await getOpenAIApiKey()
+  const provider = await getConfiguredProvider()
-  if (!apiKey) {
+  // LiteLLM proxy may not require a real API key
+  const effectiveApiKey = apiKey || (provider === 'litellm' ? 'sk-litellm' : null)
+  if (!effectiveApiKey) {
     console.warn('OpenAI API key not configured')
     return null
   }
+  const baseURL = await getBaseURL()
+  if (baseURL) {
+    console.log(`[OpenAI] Using custom base URL: ${baseURL} (provider: ${provider})`)
+  }
   return new OpenAI({
-    apiKey,
+    apiKey: effectiveApiKey,
+    ...(baseURL ? { baseURL } : {}),
   })
 }
@@ -221,10 +283,25 @@ export async function getOpenAI(): Promise<OpenAI | null> {
   return client
 }
+/**
+ * Reset the OpenAI client singleton (e.g., after settings change).
+ * Next call to getOpenAI() will create a fresh client.
+ */
+export function resetOpenAIClient(): void {
+  globalForOpenAI.openai = undefined
+  globalForOpenAI.openaiInitialized = false
+}
 /**
  * Check if OpenAI is configured and available
  */
 export async function isOpenAIConfigured(): Promise<boolean> {
+  const provider = await getConfiguredProvider()
+  if (provider === 'litellm') {
+    // LiteLLM just needs a base URL configured
+    const baseURL = await getBaseURL()
+    return !!baseURL
+  }
   const apiKey = await getOpenAIApiKey()
   return !!apiKey
 }
@@ -236,8 +313,20 @@ export async function listAvailableModels(): Promise<{
   success: boolean
   models?: string[]
   error?: string
+  manualEntry?: boolean
 }> {
 try {
+  const provider = await getConfiguredProvider()
+  // LiteLLM proxy for ChatGPT subscription doesn't support models.list()
+  if (provider === 'litellm') {
+    return {
+      success: true,
+      models: [],
+      manualEntry: true,
+    }
+  }
   const client = await getOpenAI()
   if (!client) {
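The token-limit stripping added to `buildCompletionParams` can be exercised in isolation. The following is a self-contained sketch using the same `chatgpt/` prefix convention from the diff; the `CompletionParams` shape is simplified for illustration and is not the real type from the codebase:

```typescript
// Same prefix check as in the diff: chatgpt/ models route via LiteLLM's ChatGPT backend
function isLiteLLMChatGPTModel(model: string): boolean {
  return model.toLowerCase().startsWith('chatgpt/')
}

// Simplified stand-in for the completion params object (illustrative, not the real type)
interface CompletionParams {
  model: string
  max_tokens?: number
  max_completion_tokens?: number
}

// The ChatGPT subscription backend rejects token limit fields, so drop them
// for chatgpt/ models and pass everything else through untouched
function stripTokenLimits(params: CompletionParams): CompletionParams {
  if (!isLiteLLMChatGPTModel(params.model)) return params
  const { max_tokens, max_completion_tokens, ...rest } = params
  return rest
}
```

Doing this in a pure synchronous function (as the diff's comment on `isLiteLLMChatGPTModel` notes) avoids an extra SystemSettings lookup on every completion call.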


@@ -6,6 +6,7 @@ import { getPresignedUrl, generateObjectKey } from '@/lib/minio'
 import { sendStyledNotificationEmail, sendTeamMemberInviteEmail } from '@/lib/email'
 import { logAudit } from '@/server/utils/audit'
 import { createNotification } from '../services/in-app-notification'
+import { checkRequirementsAndTransition } from '../services/round-engine'
 // Bucket for applicant submissions
 export const SUBMISSIONS_BUCKET = 'mopc-submissions'
@@ -410,6 +411,24 @@ export const applicantRouter = router({
   },
 })
+// Auto-transition: if uploading against a round requirement, check completion
+if (roundId && requirementId) {
+  await checkRequirementsAndTransition(
+    projectId,
+    roundId,
+    ctx.user.id,
+    ctx.prisma,
+  )
+}
+// Auto-analyze document (fire-and-forget, delayed for presigned upload)
+import('../services/document-analyzer').then(({ analyzeFileDelayed, isAutoAnalysisEnabled }) =>
+  isAutoAnalysisEnabled().then((enabled) => {
+    if (enabled) analyzeFileDelayed(file.id).catch((err) =>
+      console.warn('[DocAnalyzer] Post-upload analysis failed:', err))
+  })
+).catch(() => {})
 return file
 }),


@@ -74,10 +74,22 @@ async function runAIAssignmentJob(jobId: string, roundId: string, userId: string
   description: true,
   tags: true,
   teamName: true,
+  projectTags: {
+    select: { tag: { select: { name: true } }, confidence: true },
+  },
   _count: { select: { assignments: { where: { roundId } } } },
   },
 })
+// Enrich projects with tag confidence data for AI matching
+const projectsWithConfidence = projects.map((p) => ({
+  ...p,
+  tagConfidences: p.projectTags.map((pt) => ({
+    name: pt.tag.name,
+    confidence: pt.confidence,
+  })),
+}))
 const existingAssignments = await prisma.assignment.findMany({
   where: { roundId },
   select: { userId: true, projectId: true },
@@ -124,7 +136,7 @@ async function runAIAssignmentJob(jobId: string, roundId: string, userId: string
 const result = await generateAIAssignments(
   jurors,
-  projects,
+  projectsWithConfidence,
   constraints,
   userId,
   roundId,
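The commit message says both the GPT path and the fallback algorithm now weight tag matches by confidence, so a 0.9 tag matters more than a 0.5 one. The weighting itself is not shown in this hunk; as a hedged sketch of the idea (names hypothetical, not the real fallback code):

```typescript
// Shape matching the tagConfidences enrichment built in the diff above
interface TagConfidence {
  name: string
  confidence: number
}

// Score a juror/project pair by summing confidence over shared tags,
// so a 0.9-confidence tag contributes more than a 0.5 one.
function weightedTagScore(jurorExpertise: string[], projectTags: TagConfidence[]): number {
  const expertise = new Set(jurorExpertise.map((t) => t.toLowerCase()))
  return projectTags
    .filter((t) => expertise.has(t.name.toLowerCase()))
    .reduce((sum, t) => sum + t.confidence, 0)
}
```

With flat string tags every overlap counted equally; summing confidences instead lets low-certainty AI tags influence assignments without dominating them.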


@@ -3,6 +3,7 @@ import { TRPCError } from '@trpc/server'
import { router, protectedProcedure, adminProcedure } from '../trpc' import { router, protectedProcedure, adminProcedure } from '../trpc'
import { getPresignedUrl, generateObjectKey, deleteObject, BUCKET_NAME } from '@/lib/minio' import { getPresignedUrl, generateObjectKey, deleteObject, BUCKET_NAME } from '@/lib/minio'
import { logAudit } from '../utils/audit' import { logAudit } from '../utils/audit'
import { checkRequirementsAndTransition } from '../services/round-engine'
export const fileRouter = router({ export const fileRouter = router({
/** /**
@@ -205,6 +206,14 @@ export const fileRouter = router({
userAgent: ctx.userAgent, userAgent: ctx.userAgent,
}) })
// Auto-analyze document (fire-and-forget, delayed for presigned upload)
import('../services/document-analyzer').then(({ analyzeFileDelayed, isAutoAnalysisEnabled }) =>
isAutoAnalysisEnabled().then((enabled) => {
if (enabled) analyzeFileDelayed(file.id).catch((err) =>
console.warn('[DocAnalyzer] Post-upload analysis failed:', err))
})
).catch(() => {})
return { return {
uploadUrl, uploadUrl,
file, file,
@@ -818,6 +827,20 @@ export const fileRouter = router({
}) })
}), }),
/**
* List file requirements for multiple rounds in a single query.
* Avoids dynamic hook violations when fetching requirements per-round.
*/
listRequirementsByRounds: protectedProcedure
.input(z.object({ roundIds: z.array(z.string()).max(50) }))
.query(async ({ ctx, input }) => {
if (input.roundIds.length === 0) return []
return ctx.prisma.fileRequirement.findMany({
where: { roundId: { in: input.roundIds } },
orderBy: { sortOrder: 'asc' },
})
}),
/** /**
* Create a file requirement for a stage (admin only) * Create a file requirement for a stage (admin only)
*/ */
@@ -1186,6 +1209,14 @@ export const fileRouter = router({
userAgent: ctx.userAgent, userAgent: ctx.userAgent,
}) })
// Auto-analyze document (fire-and-forget, delayed for presigned upload)
import('../services/document-analyzer').then(({ analyzeFileDelayed, isAutoAnalysisEnabled }) =>
isAutoAnalysisEnabled().then((enabled) => {
if (enabled) analyzeFileDelayed(file.id).catch((err) =>
console.warn('[DocAnalyzer] Post-upload analysis failed:', err))
})
).catch(() => {})
return { uploadUrl, file } return { uploadUrl, file }
}), }),
@@ -1295,6 +1326,8 @@ export const fileRouter = router({
size: true, size: true,
createdAt: true, createdAt: true,
requirementId: true, requirementId: true,
bucket: true,
objectKey: true,
}, },
}, },
}, },
@@ -1485,6 +1518,76 @@ export const fileRouter = router({
userAgent: ctx.userAgent, userAgent: ctx.userAgent,
}) })
// Auto-transition: check if all required documents are now uploaded
await checkRequirementsAndTransition(
input.projectId,
input.roundId,
ctx.user.id,
ctx.prisma,
)
// Auto-analyze document (fire-and-forget, delayed for presigned upload)
import('../services/document-analyzer').then(({ analyzeFileDelayed, isAutoAnalysisEnabled }) =>
isAutoAnalysisEnabled().then((enabled) => {
if (enabled) analyzeFileDelayed(file.id).catch((err) =>
console.warn('[DocAnalyzer] Post-upload analysis failed:', err))
})
).catch(() => {})
return { uploadUrl, file } return { uploadUrl, file }
}), }),
/**
* Verify that files actually exist in storage (MinIO/S3).
* Returns a map of objectKey → exists boolean.
*/
verifyFilesExist: adminProcedure
.input(
z.object({
files: z.array(
z.object({
bucket: z.string(),
objectKey: z.string(),
})
).max(200),
})
)
.query(async ({ input }) => {
const { getMinioClient } = await import('@/lib/minio')
const client = getMinioClient()
const results: Record<string, boolean> = {}
await Promise.all(
input.files.map(async ({ bucket, objectKey }) => {
try {
await client.statObject(bucket, objectKey)
results[objectKey] = true
} catch {
results[objectKey] = false
}
})
)
return results
}),
/**
* Analyze all files for a specific project (page count, language, text preview).
* Retroactive: re-analyzes even previously analyzed files.
*/
analyzeProjectFiles: adminProcedure
.input(z.object({ projectId: z.string() }))
.mutation(async ({ input }) => {
const { analyzeProjectFiles } = await import('../services/document-analyzer')
return analyzeProjectFiles(input.projectId)
}),
/**
* Batch analyze all unanalyzed files across the platform.
* For retroactive analysis of files uploaded before this feature.
*/
analyzeAllFiles: adminProcedure
.mutation(async () => {
const { analyzeAllUnanalyzed } = await import('../services/document-analyzer')
return analyzeAllUnanalyzed()
}),
})
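The fan-out inside `verifyFilesExist` can be sketched independently of MinIO: probe each key concurrently, never let a single failure reject the whole batch, and collect a key-to-boolean map. Here `probe` is a hypothetical stand-in for `client.statObject`; any rejection is read as "object missing".

```typescript
// Sketch of the existence-check fan-out used by verifyFilesExist.
// `probe` stands in for client.statObject: a rejection means "missing".
async function checkExistence(
  keys: string[],
  probe: (key: string) => Promise<void>
): Promise<Record<string, boolean>> {
  const results: Record<string, boolean> = {}
  await Promise.all(
    keys.map(async (key) => {
      try {
        await probe(key)
        results[key] = true
      } catch {
        results[key] = false
      }
    })
  )
  return results
}

// Usage with a stub where only "a.pdf" exists.
const stub = async (key: string) => {
  if (key !== 'a.pdf') throw new Error('NotFound')
}
```

Because each probe catches its own error, one missing object cannot short-circuit `Promise.all` for the rest of the batch.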

View File

@@ -69,6 +69,8 @@ export async function runFilteringJob(jobId: string, roundId: string, userId: st
mimeType: true,
size: true,
pageCount: true,
detectedLang: true,
langConfidence: true,
objectKey: true,
roundId: true,
createdAt: true,

View File

@@ -249,6 +249,50 @@ export const juryGroupRouter = router({
return existing
}),
/**
* Delete a jury group entirely
*/
delete: adminProcedure
.input(z.object({ id: z.string() }))
.mutation(async ({ ctx, input }) => {
const group = await ctx.prisma.juryGroup.findUniqueOrThrow({
where: { id: input.id },
include: {
_count: { select: { assignments: true, rounds: true } },
},
})
// Unlink any rounds that reference this jury group
await ctx.prisma.round.updateMany({
where: { juryGroupId: input.id },
data: { juryGroupId: null },
})
// Delete all members first (cascade should handle this, but be explicit)
await ctx.prisma.juryGroupMember.deleteMany({
where: { juryGroupId: input.id },
})
await ctx.prisma.juryGroup.delete({ where: { id: input.id } })
await logAudit({
prisma: ctx.prisma,
userId: ctx.user.id,
action: 'DELETE',
entityType: 'JuryGroup',
entityId: input.id,
detailsJson: {
name: group.name,
competitionId: group.competitionId,
memberCount: group._count.assignments,
},
ipAddress: ctx.ip,
userAgent: ctx.userAgent,
})
return { success: true, name: group.name }
}),
/**
* Update a jury group member's role/overrides
*/

View File

@@ -243,10 +243,11 @@ export const roundRouter = router({
roundId: z.string(),
targetRoundId: z.string().optional(),
projectIds: z.array(z.string()).optional(),
autoPassPending: z.boolean().optional(),
})
)
.mutation(async ({ ctx, input }) => {
const { roundId, targetRoundId, projectIds, autoPassPending } = input
// Get current round with competition context
const currentRound = await ctx.prisma.round.findUniqueOrThrow({
@@ -280,6 +281,16 @@ export const roundRouter = router({
targetRound = nextRound
}
// Auto-pass all PENDING projects first (for intake/bulk workflows)
let autoPassedCount = 0
if (autoPassPending) {
const result = await ctx.prisma.projectRoundState.updateMany({
where: { roundId, state: 'PENDING' },
data: { state: 'PASSED' },
})
autoPassedCount = result.count
}
// Determine which projects to advance
let idsToAdvance: string[]
if (projectIds && projectIds.length > 0) {
@@ -346,6 +357,7 @@ export const roundRouter = router({
toRound: targetRound.name,
targetRoundId: targetRound.id,
projectCount: idsToAdvance.length,
autoPassedCount,
projectIds: idsToAdvance,
},
ipAddress: ctx.ip,
@@ -354,6 +366,7 @@ export const roundRouter = router({
return {
advancedCount: idsToAdvance.length,
autoPassedCount,
targetRoundId: targetRound.id,
targetRoundName: targetRound.name,
}

View File

@@ -263,4 +263,41 @@ export const roundEngineRouter = router({
return { success: true, removedCount: deleted.count }
}),
/**
* Retroactive document check: auto-PASS any PENDING/IN_PROGRESS projects
* that already have all required documents uploaded for this round.
* Useful for rounds activated before the auto-transition feature was deployed.
*/
checkDocumentCompletion: adminProcedure
.input(z.object({ roundId: z.string() }))
.mutation(async ({ ctx, input }) => {
const { batchCheckRequirementsAndTransition } = await import('../services/round-engine')
const projectStates = await ctx.prisma.projectRoundState.findMany({
where: {
roundId: input.roundId,
state: { in: ['PENDING', 'IN_PROGRESS'] },
},
select: { projectId: true },
})
if (projectStates.length === 0) {
return { transitionedCount: 0, checkedCount: 0, projectIds: [] }
}
const projectIds = projectStates.map((ps: { projectId: string }) => ps.projectId)
const result = await batchCheckRequirementsAndTransition(
input.roundId,
projectIds,
ctx.user.id,
ctx.prisma,
)
return {
transitionedCount: result.transitionedCount,
checkedCount: projectIds.length,
projectIds: result.projectIds,
}
}),
})

View File

@@ -201,6 +201,12 @@ export const settingsRouter = router({
clearStorageProviderCache()
}
// Reset OpenAI client if API key, base URL, model, or provider changed
if (input.settings.some((s) => s.key === 'openai_api_key' || s.key === 'openai_base_url' || s.key === 'ai_model' || s.key === 'ai_provider')) {
const { resetOpenAIClient } = await import('@/lib/openai')
resetOpenAIClient()
}
// Audit log
await logAudit({
prisma: ctx.prisma,
@@ -241,6 +247,15 @@ export const settingsRouter = router({
listAIModels: superAdminProcedure.query(async () => {
const result = await listAvailableModels()
// LiteLLM mode: manual model entry, no listing available
if (result.manualEntry) {
return {
success: true,
models: [],
manualEntry: true,
}
}
if (!result.success || !result.models) {
return {
success: false,

View File

@@ -5,6 +5,7 @@ import { prisma } from '@/lib/prisma'
import { logAudit } from '../utils/audit'
import {
tagProject,
tagProjectsBatch,
getTagSuggestions,
addProjectTag,
removeProjectTag,
@@ -17,7 +18,7 @@ import {
NotificationTypes,
} from '../services/in-app-notification'
// Background job runner for tagging — uses batched API calls for efficiency
async function runTaggingJob(jobId: string, userId: string) {
const job = await prisma.taggingJob.findUnique({
where: { id: jobId },
@@ -28,7 +29,7 @@ async function runTaggingJob(jobId: string, userId: string) {
return
}
console.log(`[AI Tagging Job] Starting job ${jobId} (batched mode)...`)
// Mark as running
await prisma.taggingJob.update({
@@ -56,7 +57,7 @@ async function runTaggingJob(jobId: string, userId: string) {
const allProjects = await prisma.project.findMany({
where: whereClause,
select: { id: true, title: true, tags: true, projectTags: { select: { tagId: true } } },
})
const untaggedProjects = allProjects.filter(p => p.tags.length === 0)
@@ -83,48 +84,33 @@ async function runTaggingJob(jobId: string, userId: string) {
return
}
const startTime = Date.now()
// Use batched tagging — processes 10 projects per API call, 3 concurrent calls
const { results, totalTokens } = await tagProjectsBatch(
untaggedProjects,
userId,
async (processed, total) => {
// Update job progress on each batch completion.
// Note: `results` is not yet assigned while the batch run is still in
// flight, so report the processed count directly here; the final
// taggedCount is computed after tagProjectsBatch resolves.
await prisma.taggingJob.update({
where: { id: jobId },
data: {
processedCount: processed,
taggedCount: processed,
},
})
const elapsed = ((Date.now() - startTime) / 1000).toFixed(0)
console.log(`[AI Tagging Job] Progress: ${processed}/${total} (${elapsed}s elapsed)`)
}
)
const taggedCount = results.filter(r => r.applied.length > 0).length
const failedCount = untaggedProjects.length - results.length
const totalTime = ((Date.now() - startTime) / 1000).toFixed(1)
console.log(`[AI Tagging Job] Complete: ${taggedCount} tagged, ${failedCount} failed in ${totalTime}s (${totalTokens} tokens)`)
// Mark as completed
await prisma.taggingJob.update({
@@ -132,7 +118,9 @@ async function runTaggingJob(jobId: string, userId: string) {
data: {
status: 'COMPLETED',
completedAt: new Date(),
processedCount: results.length,
taggedCount,
failedCount,
},
})
@@ -144,7 +132,7 @@ async function runTaggingJob(jobId: string, userId: string) {
linkUrl: '/admin/projects',
linkLabel: 'View Projects',
priority: 'normal',
metadata: { jobId, taggedCount, failedCount, skippedCount, totalTokens },
})
} catch (error) {

View File

@@ -38,7 +38,7 @@ const ASSIGNMENT_SYSTEM_PROMPT = `You are an expert jury assignment optimizer fo
Match jurors to projects based on expertise alignment, workload balance, and coverage requirements.
## Matching Criteria (Weighted)
- Expertise Match (50%): How well juror tags/expertise align with project topics. Project tags include a confidence score (0-1) — weight higher-confidence tags more heavily as they are more reliably assigned. A tag with confidence 0.9 is a strong signal; one with 0.5 is uncertain.
- Workload Balance (30%): Distribute assignments evenly; prefer jurors below capacity
- Minimum Target (20%): Prioritize jurors who haven't reached their minimum assignment count
@@ -99,6 +99,7 @@ interface ProjectForAssignment {
title: string
description?: string | null
tags: string[]
tagConfidences?: Array<{ name: string; confidence: number }>
teamName?: string | null
_count?: {
assignments: number
@@ -539,7 +540,7 @@ export function generateFallbackAssignments(
return {
juror,
score: calculateExpertiseScore(juror.expertiseTags, project.tags, project.tagConfidences),
loadScore: calculateLoadScore(currentLoad, maxLoad),
underMinBonus: calculateUnderMinBonus(currentLoad, minTarget),
}
@@ -586,24 +587,44 @@ export function generateFallbackAssignments(
/**
* Calculate expertise match score based on tag overlap
* When tagConfidences are available, weights matches by confidence
*/
function calculateExpertiseScore(
jurorTags: string[],
projectTags: string[],
tagConfidences?: Array<{ name: string; confidence: number }>
): number {
if (jurorTags.length === 0 || projectTags.length === 0) {
return 0.5 // Neutral score if no tags
}
const jurorTagsLower = new Set(jurorTags.map((t) => t.toLowerCase()))
// If we have confidence data, use weighted scoring
if (tagConfidences && tagConfidences.length > 0) {
let weightedMatches = 0
let totalWeight = 0
for (const tc of tagConfidences) {
totalWeight += tc.confidence
if (jurorTagsLower.has(tc.name.toLowerCase())) {
weightedMatches += tc.confidence
}
}
if (totalWeight === 0) return 0.5
const weightedRatio = weightedMatches / totalWeight
const hasExpertise = weightedMatches > 0 ? 0.2 : 0
return Math.min(1, weightedRatio * 0.8 + hasExpertise)
}
// Fallback: unweighted matching using flat tags
const matchingTags = projectTags.filter((t) =>
jurorTagsLower.has(t.toLowerCase())
)
// Score based on percentage of project tags matched
const matchRatio = matchingTags.length / projectTags.length
// Boost for having expertise, even if not all match
const hasExpertise = matchingTags.length > 0 ? 0.2 : 0
return Math.min(1, matchRatio * 0.8 + hasExpertise)
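A worked example makes the weighted branch concrete. With a juror who knows Aquaculture and a project tagged Aquaculture at 0.9 and Plastics at 0.5, the match ratio is 0.9 / 1.4 and the final score is that ratio scaled by 0.8 plus the 0.2 expertise bonus. This standalone sketch reimplements only the weighted path of `calculateExpertiseScore` for illustration:

```typescript
// Standalone sketch of the confidence-weighted expertise score:
// each project tag contributes its confidence to the total weight,
// and to the matched weight when the juror has that tag.
function weightedExpertiseScore(
  jurorTags: string[],
  tagConfidences: Array<{ name: string; confidence: number }>
): number {
  const jurorSet = new Set(jurorTags.map((t) => t.toLowerCase()))
  let weightedMatches = 0
  let totalWeight = 0
  for (const tc of tagConfidences) {
    totalWeight += tc.confidence
    if (jurorSet.has(tc.name.toLowerCase())) weightedMatches += tc.confidence
  }
  if (totalWeight === 0) return 0.5 // neutral when no weights exist
  const ratio = weightedMatches / totalWeight
  return Math.min(1, ratio * 0.8 + (weightedMatches > 0 ? 0.2 : 0))
}
```

Matching only the 0.9 tag yields a noticeably higher score than matching only the 0.5 tag would, which is exactly the behavior the commit describes.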

View File

@@ -142,7 +142,7 @@ interface FilteringRuleInput {
const DEFAULT_BATCH_SIZE = 20
const MAX_BATCH_SIZE = 50
const MIN_BATCH_SIZE = 1
const DEFAULT_PARALLEL_BATCHES = 3
const MAX_PARALLEL_BATCHES = 10
// Structured system prompt for AI screening
@@ -179,10 +179,11 @@ Return a JSON object with this exact structure:
- founded_year: when the company/initiative was founded (use for age checks)
- ocean_issue: the ocean conservation area
- file_count, file_types: uploaded documents summary
- files[]: per-file details with file_type, page_count (if known), size_kb, detected_lang (ISO 639-3 language code like 'eng', 'fra'), lang_confidence (0-1), round_name (which round the file was submitted for), and is_current_round flag
- description: project summary text
- tags: topic tags
- If document content is provided (text_content field in files), use it for deeper analysis. Pay SPECIAL ATTENTION to files from the current round (is_current_round=true) as they are the most recent and relevant submissions.
- If detected_lang is provided, use it to evaluate language requirements (e.g. 'eng' = English, 'fra' = French). lang_confidence indicates detection reliability.
## Guidelines
- Evaluate ONLY against the provided criteria, not your own standards

View File

@@ -344,8 +344,8 @@ export async function generateShortlist(
let totalTokens = 0
const allErrors: string[] = []
// Run categories in parallel for efficiency
const categoryPromises = categories.map(async (cat) => {
const catTopN = cat === 'STARTUP'
? (startupTopN ?? topN)
: (conceptTopN ?? topN)
@@ -357,6 +357,12 @@ export async function generateShortlist(
prisma,
)
return { cat, result }
})
const categoryResults = await Promise.all(categoryPromises)
for (const { cat, result } of categoryResults) {
if (cat === 'STARTUP') {
allRecommendations.STARTUP = result.recommendations
} else {

View File

@@ -5,7 +5,7 @@
*
* Features:
* - Single project tagging (on-submit or manual)
* - Batch tagging with concurrent processing (10 projects/batch, 3 concurrent)
* - Confidence scores for each tag
* - Additive only - never removes existing tags
*
@@ -16,7 +16,7 @@
*/
import { prisma } from '@/lib/prisma'
import { getOpenAI, getConfiguredModel, buildCompletionParams, AI_MODELS } from '@/lib/openai'
import { logAIUsage, extractTokenUsage } from '@/server/utils/ai-usage'
import { classifyAIError, createParseError, logAIError } from './ai-errors'
import {
@@ -53,8 +53,10 @@ interface AvailableTag {
const CONFIDENCE_THRESHOLD = 0.5
const DEFAULT_MAX_TAGS = 5
const BATCH_SIZE = 10 // Projects per API call
const BATCH_CONCURRENCY = 3 // Concurrent API calls
// System prompt optimized for single-project tag suggestion
const TAG_SUGGESTION_SYSTEM_PROMPT = `You are an expert at categorizing ocean conservation and sustainability projects.
Analyze the project and suggest the most relevant expertise tags from the provided list.
@@ -78,6 +80,36 @@ Rules:
- Maximum 7 suggestions per project
- Be conservative - only suggest tags that truly apply`
// System prompt optimized for batch tagging (multiple projects in one call)
const BATCH_TAG_SYSTEM_PROMPT = `You are an expert at categorizing ocean conservation and sustainability projects.
Analyze EACH project and suggest the most relevant expertise tags from the provided list.
Consider each project's focus areas, technology, methodology, and domain.
Return JSON with this format:
{
"projects": [
{
"project_id": "PROJECT_001",
"suggestions": [
{
"tag_name": "exact tag name from list",
"confidence": 0.0-1.0,
"reasoning": "brief explanation"
}
]
}
]
}
Rules:
- Only suggest tags from the provided list (exact names)
- Order by relevance (most relevant first)
- Confidence should reflect how well the tag matches
- Maximum 7 suggestions per project
- Be conservative - only suggest tags that truly apply
- Return results for ALL projects provided`
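Even with `jsonMode` enabled, models sometimes return the bare `projects` array instead of the documented `{ "projects": [...] }` envelope, which is why the batch parser normalizes both shapes before mapping results back. That normalization can be sketched on its own; the types here are illustrative, not the service's actual interfaces:

```typescript
// Illustrative shape of one project's result in the batch response.
interface BatchProjectResult {
  project_id: string
  suggestions: Array<{ tag_name: string; confidence: number; reasoning: string }>
}

// Accept either { projects: [...] } or a bare array; anything else
// degrades to an empty result set rather than throwing.
function normalizeBatchResponse(raw: unknown): { projects: BatchProjectResult[] } {
  if (raw && typeof raw === 'object' && 'projects' in raw) {
    return raw as { projects: BatchProjectResult[] }
  }
  return { projects: Array.isArray(raw) ? (raw as BatchProjectResult[]) : [] }
}
```

Degrading malformed output to an empty list keeps one bad batch from failing the whole job, matching the per-batch error handling in `tagProjectsBatch`.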
// ─── Helper Functions ────────────────────────────────────────────────────────
/**
@@ -132,7 +164,8 @@ export async function getAvailableTags(): Promise<AvailableTag[]> {
// ─── AI Tagging Core ─────────────────────────────────────────────────────────
/**
* Call OpenAI to get tag suggestions for a single project
* Used for on-demand single-project tagging
*/
async function getAISuggestions(
anonymizedProject: AnonymizedProjectForAI,
@@ -145,9 +178,10 @@ async function getAISuggestions(
return { suggestions: [], tokensUsed: 0 }
}
// Use QUICK model — tag classification is simple, doesn't need expensive reasoning
const model = await getConfiguredModel(AI_MODELS.QUICK)
// Build compact tag list for prompt
const tagList = availableTags.map((t) => ({
name: t.name,
category: t.category,
@@ -155,10 +189,10 @@ async function getAISuggestions(
}))
const userPrompt = `PROJECT:
${JSON.stringify(anonymizedProject)}
AVAILABLE TAGS:
${JSON.stringify(tagList)}
Suggest relevant tags for this project.`
@@ -246,6 +280,161 @@ Suggest relevant tags for this project.`
}
}
/**
* Call OpenAI to get tag suggestions for a batch of projects in one API call.
* Returns a map of project_id -> TagSuggestion[].
*/
async function getAISuggestionsBatch(
anonymizedProjects: AnonymizedProjectForAI[],
availableTags: AvailableTag[],
userId?: string
): Promise<{ suggestionsMap: Map<string, TagSuggestion[]>; tokensUsed: number }> {
const openai = await getOpenAI()
if (!openai) {
console.warn('[AI Tagging] OpenAI not configured')
return { suggestionsMap: new Map(), tokensUsed: 0 }
}
// Use QUICK model — tag classification is simple, doesn't need expensive reasoning
const model = await getConfiguredModel(AI_MODELS.QUICK)
const suggestionsMap = new Map<string, TagSuggestion[]>()
// Build compact tag list (sent once for entire batch)
const tagList = availableTags.map((t) => ({
name: t.name,
category: t.category,
description: t.description,
}))
const userPrompt = `PROJECTS (${anonymizedProjects.length}):
${JSON.stringify(anonymizedProjects)}
AVAILABLE TAGS:
${JSON.stringify(tagList)}
Suggest relevant tags for each project.`
const MAX_PARSE_RETRIES = 2
let parseAttempts = 0
try {
const params = buildCompletionParams(model, {
messages: [
{ role: 'system', content: BATCH_TAG_SYSTEM_PROMPT },
{ role: 'user', content: userPrompt },
],
jsonMode: true,
temperature: 0.1,
maxTokens: Math.min(4000, anonymizedProjects.length * 500),
})
let response = await openai.chat.completions.create(params)
let usage = extractTokenUsage(response)
let totalTokens = usage.totalTokens
// Parse with retry logic
let parsed: {
projects: Array<{
project_id: string
suggestions: Array<{
tag_name: string
confidence: number
reasoning: string
}>
}>
}
while (true) {
try {
const content = response.choices[0]?.message?.content
if (!content) throw new Error('Empty response from AI')
const raw = JSON.parse(content)
parsed = raw.projects ? raw : { projects: Array.isArray(raw) ? raw : [] }
break
} catch (parseError) {
if (parseError instanceof SyntaxError && parseAttempts < MAX_PARSE_RETRIES) {
parseAttempts++
console.warn(`[AI Tagging Batch] JSON parse failed, retrying (${parseAttempts}/${MAX_PARSE_RETRIES})`)
const retryParams = buildCompletionParams(model, {
messages: [
{ role: 'system', content: BATCH_TAG_SYSTEM_PROMPT },
{ role: 'user', content: userPrompt + '\n\nIMPORTANT: Please ensure valid JSON output.' },
],
jsonMode: true,
temperature: 0.1,
maxTokens: Math.min(4000, anonymizedProjects.length * 500),
})
response = await openai.chat.completions.create(retryParams)
const retryUsage = extractTokenUsage(response)
totalTokens += retryUsage.totalTokens
continue
}
throw parseError
}
}
// Log usage for the entire batch
await logAIUsage({
userId,
action: 'PROJECT_TAGGING',
entityType: 'Project',
model,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
totalTokens,
batchSize: anonymizedProjects.length,
itemsProcessed: parsed.projects?.length || 0,
status: 'SUCCESS',
})
// Map results back to TagSuggestion format
for (const projectResult of parsed.projects || []) {
const suggestions: TagSuggestion[] = []
for (const s of projectResult.suggestions || []) {
const tag = availableTags.find(
(t) => t.name.toLowerCase() === s.tag_name.toLowerCase()
)
if (tag) {
suggestions.push({
tagId: tag.id,
tagName: tag.name,
confidence: Math.max(0, Math.min(1, s.confidence)),
reasoning: s.reasoning || '',
})
}
}
suggestionsMap.set(projectResult.project_id, suggestions)
}
return { suggestionsMap, tokensUsed: totalTokens }
} catch (error) {
if (error instanceof SyntaxError) {
const parseError = createParseError(error.message)
logAIError('Tagging', 'getAISuggestionsBatch', parseError)
}
const classified = classifyAIError(error)
logAIError('Tagging', 'getAISuggestionsBatch', classified)
await logAIUsage({
userId,
action: 'PROJECT_TAGGING',
entityType: 'Project',
model: 'unknown',
promptTokens: 0,
completionTokens: 0,
totalTokens: 0,
batchSize: anonymizedProjects.length,
itemsProcessed: 0,
status: 'ERROR',
errorMessage: error instanceof Error ? error.message : 'Unknown error',
})
throw error
}
}
// ─── Public API ──────────────────────────────────────────────────────────────
/**
@@ -355,6 +544,153 @@ export async function tagProject(
}
}
/**
* Tag a batch of projects using batched API calls with concurrency.
* Much more efficient than tagging one-by-one for bulk operations.
*
* @param projects Array of { id, projectTags } to tag
* @param userId The user initiating the tagging
* @param onProgress Callback for progress updates
* @returns Array of TaggingResult
*/
export async function tagProjectsBatch(
projects: Array<{
id: string
title: string
projectTags: Array<{ tagId: string }>
}>,
userId: string,
onProgress?: (processed: number, total: number) => Promise<void>
): Promise<{ results: TaggingResult[]; totalTokens: number }> {
const settings = await getTaggingSettings()
if (!settings.enabled) {
return { results: [], totalTokens: 0 }
}
const availableTags = await getAvailableTags()
if (availableTags.length === 0) {
return { results: [], totalTokens: 0 }
}
// Fetch full project data for all projects at once (single DB query)
const fullProjects = await prisma.project.findMany({
where: { id: { in: projects.map((p) => p.id) } },
include: {
projectTags: true,
files: { select: { fileType: true } },
_count: { select: { teamMembers: true, files: true } },
},
})
const projectMap = new Map(fullProjects.map((p) => [p.id, p]))
// Anonymize all projects at once
const projectsWithRelations = fullProjects.map(toProjectWithRelations)
const { anonymized, mappings } = anonymizeProjectsForAI(projectsWithRelations, 'FILTERING')
if (!validateAnonymizedProjects(anonymized)) {
throw new Error('GDPR compliance check failed: PII detected in anonymized data')
}
// Build mapping from anonymous ID to real project
const anonToRealMap = new Map<string, string>()
for (const mapping of mappings) {
anonToRealMap.set(mapping.anonymousId, mapping.realId)
}
// Split into batches
const batches: AnonymizedProjectForAI[][] = []
for (let i = 0; i < anonymized.length; i += BATCH_SIZE) {
batches.push(anonymized.slice(i, i + BATCH_SIZE))
}
const allResults: TaggingResult[] = []
let totalTokens = 0
let processedCount = 0
// Process batches with concurrency
for (let i = 0; i < batches.length; i += BATCH_CONCURRENCY) {
const concurrentBatches = batches.slice(i, i + BATCH_CONCURRENCY)
const batchPromises = concurrentBatches.map(async (batch) => {
try {
const { suggestionsMap, tokensUsed } = await getAISuggestionsBatch(
batch,
availableTags,
userId
)
return { suggestionsMap, tokensUsed, error: null }
} catch (error) {
console.error('[AI Tagging Batch] Batch failed:', error)
return { suggestionsMap: new Map<string, TagSuggestion[]>(), tokensUsed: 0, error }
}
})
const batchResults = await Promise.all(batchPromises)
// Process results from all concurrent batches
for (const { suggestionsMap, tokensUsed } of batchResults) {
totalTokens += tokensUsed
for (const [anonId, suggestions] of suggestionsMap) {
const realId = anonToRealMap.get(anonId)
if (!realId) continue
const project = projectMap.get(realId)
if (!project) continue
// Filter by confidence
const validSuggestions = suggestions.filter(
(s) => s.confidence >= settings.confidenceThreshold
)
// Get existing tags
const existingTagIds = new Set(project.projectTags.map((pt) => pt.tagId))
const currentTagCount = project.projectTags.length
const remainingSlots = Math.max(0, settings.maxTags - currentTagCount)
const newSuggestions = validSuggestions
.filter((s) => !existingTagIds.has(s.tagId))
.slice(0, remainingSlots)
// Apply tags
const applied: TagSuggestion[] = []
for (const suggestion of newSuggestions) {
try {
await prisma.projectTag.create({
data: {
projectId: realId,
tagId: suggestion.tagId,
confidence: suggestion.confidence,
source: 'AI',
},
})
applied.push(suggestion)
} catch {
// Skip duplicates
}
}
allResults.push({
projectId: realId,
suggestions,
applied,
tokensUsed: 0, // Token tracking is per-batch, not per-project
})
processedCount++
}
}
// Report progress after each concurrent chunk
if (onProgress) {
await onProgress(processedCount, projects.length)
}
}
return { results: allResults, totalTokens }
}
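The batch loop above combines two independent ideas: fixed-size chunking (`BATCH_SIZE`) and bounded concurrency (`BATCH_CONCURRENCY`). The generic pattern can be sketched in isolation, with a hypothetical `worker` standing in for `getAISuggestionsBatch`:

```typescript
// Generic sketch of the chunk-then-bound-concurrency pattern:
// split items into fixed-size batches, then await at most
// `concurrency` batch workers at a time.
async function processInBatches<T, R>(
  items: T[],
  batchSize: number,
  concurrency: number,
  worker: (batch: T[]) => Promise<R>
): Promise<R[]> {
  const batches: T[][] = []
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize))
  }
  const out: R[] = []
  for (let i = 0; i < batches.length; i += concurrency) {
    const chunk = batches.slice(i, i + concurrency)
    out.push(...(await Promise.all(chunk.map(worker))))
  }
  return out
}
```

One design caveat of this shape, shared by the service code: each concurrency window waits for its slowest batch before the next window starts, so throughput is bounded by the stragglers. A work-stealing pool would avoid that at the cost of more bookkeeping.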
/**
* Get tag suggestions for a project without applying them
* Useful for preview/review before applying

View File

@@ -52,7 +52,7 @@ export interface AnonymizedProject {
anonymousId: string
title: string
description: string | null
tags: Array<{ name: string; confidence: number }>
teamName: string | null
}
@@ -83,6 +83,8 @@ export interface AnonymizedFileInfo {
file_type: string // FileType enum value file_type: string // FileType enum value
page_count: number | null // Number of pages if known page_count: number | null // Number of pages if known
size_kb: number // File size in KB size_kb: number // File size in KB
detected_lang?: string | null // ISO 639-3 language code (e.g. 'eng', 'fra')
lang_confidence?: number | null // 0.01.0 confidence score
round_name?: string | null // Which round the file was submitted for round_name?: string | null // Which round the file was submitted for
is_current_round?: boolean // Whether this file belongs to the current filtering/evaluation round is_current_round?: boolean // Whether this file belongs to the current filtering/evaluation round
text_content?: string // Extracted text content (when aiParseFiles is enabled) text_content?: string // Extracted text content (when aiParseFiles is enabled)
@@ -209,6 +211,7 @@ interface ProjectInput {
title: string title: string
description?: string | null description?: string | null
tags: string[] tags: string[]
tagConfidences?: Array<{ name: string; confidence: number }>
teamName?: string | null teamName?: string | null
} }
@@ -253,7 +256,9 @@ export function anonymizeForAI(
description: project.description description: project.description
? truncateAndSanitize(project.description, DESCRIPTION_LIMITS.ASSIGNMENT) ? truncateAndSanitize(project.description, DESCRIPTION_LIMITS.ASSIGNMENT)
: null, : null,
tags: project.tags, tags: project.tagConfidences && project.tagConfidences.length > 0
? project.tagConfidences
: project.tags.map((t) => ({ name: t, confidence: 1.0 })),
teamName: project.teamName ? `Team ${index + 1}` : null, teamName: project.teamName ? `Team ${index + 1}` : null,
} }
} }
@@ -306,6 +311,8 @@ export function anonymizeProjectForAI(
file_type: f.fileType ?? 'OTHER', file_type: f.fileType ?? 'OTHER',
page_count: f.pageCount ?? null, page_count: f.pageCount ?? null,
size_kb: Math.round((f.size ?? 0) / 1024), size_kb: Math.round((f.size ?? 0) / 1024),
...(f.detectedLang ? { detected_lang: f.detectedLang } : {}),
...(f.langConfidence != null ? { lang_confidence: f.langConfidence } : {}),
...(f.roundName ? { round_name: f.roundName } : {}), ...(f.roundName ? { round_name: f.roundName } : {}),
...(f.isCurrentRound !== undefined ? { is_current_round: f.isCurrentRound } : {}), ...(f.isCurrentRound !== undefined ? { is_current_round: f.isCurrentRound } : {}),
...(f.textContent ? { text_content: f.textContent } : {}), ...(f.textContent ? { text_content: f.textContent } : {}),
@@ -524,7 +531,7 @@ export function validateAnonymization(data: AnonymizationResult): boolean {
if (!checkText(project.title)) return false if (!checkText(project.title)) return false
if (!checkText(project.description)) return false if (!checkText(project.description)) return false
for (const tag of project.tags) { for (const tag of project.tags) {
if (!checkText(tag)) return false if (!checkText(typeof tag === 'string' ? tag : tag.name)) return false
} }
} }
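The confidence-weighted tags introduced in this diff feed the assignment scoring. A minimal sketch of how confidence-weighted matching could work, assuming a simple overlap score (the function name and shape are illustrative, not the actual assignment code):

```typescript
type WeightedTag = { name: string; confidence: number }

// Hypothetical sketch: score how well a reviewer's expertise tags overlap a
// project's tags, weighting each match by the tag's AI confidence so a
// 0.9-confidence tag counts more than a 0.5 one.
function weightedTagScore(projectTags: WeightedTag[], reviewerTags: string[]): number {
  const reviewerSet = new Set(reviewerTags.map((t) => t.toLowerCase()))
  let score = 0
  for (const tag of projectTags) {
    if (reviewerSet.has(tag.name.toLowerCase())) {
      score += tag.confidence
    }
  }
  return score
}
```

With this shape, manually assigned tags (confidence 1.0, per the fallback mapping above) naturally outrank low-confidence AI tags.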

View File

@@ -0,0 +1,367 @@
/**
* Document Analyzer Service
*
* Extracts metadata from uploaded files:
* - Page count (PDFs)
* - Text preview (first ~2000 chars)
* - Language detection via franc
*
* Runs optionally on upload (controlled by SystemSettings) and
* retroactively via admin endpoint.
*/
import { getStorageProvider } from '@/lib/storage'
import { isParseableMimeType } from './file-content-extractor'
import { prisma } from '@/lib/prisma'
const TEXT_PREVIEW_LIMIT = 2000
const BATCH_SIZE = 10
// ─── Types ──────────────────────────────────────────────────────────────────
export type AnalysisResult = {
fileId: string
pageCount: number | null
textPreview: string | null
detectedLang: string | null
langConfidence: number | null
error?: string
}
// ─── Language Detection ──────────────────────────────────────────────────────
/**
 * Detect language using franc. Returns ISO 639-3 code and confidence.
 * francAll returns normalized scores in [0, 1] where 1 = best match,
 * so the top score maps directly onto a 0-1 confidence value.
 */
async function detectLanguage(
text: string
): Promise<{ lang: string; confidence: number }> {
if (!text || text.trim().length < 20) {
return { lang: 'und', confidence: 0 }
}
// Use a reasonable sample for detection (first 5000 chars)
const sample = text.slice(0, 5000)
const { francAll } = await import('franc')
const results = francAll(sample, { minLength: 10 })
if (!results || results.length === 0 || results[0][0] === 'und') {
return { lang: 'und', confidence: 0 }
}
const topLang = results[0][0]
const topScore = results[0][1] // 1.0 = best match, 0.0 = worst
  // Clamp into [0, 1] to guard against out-of-range scores
const confidence = Math.max(0, Math.min(1, topScore))
return { lang: topLang, confidence: Math.round(confidence * 100) / 100 }
}
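The clamp-and-round step above can be isolated as a tiny helper (illustrative only; the service inlines it):

```typescript
// Sketch of the score-to-confidence conversion used in detectLanguage:
// clamp francAll's top score into [0, 1], then round to two decimals.
function toConfidence(score: number): number {
  const clamped = Math.max(0, Math.min(1, score))
  return Math.round(clamped * 100) / 100
}
```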
// ─── Core Analysis ──────────────────────────────────────────────────────────
/**
* Analyze a single file: extract page count, text preview, and detect language.
* Downloads the file from storage, parses it, and returns results.
*/
export async function analyzeFileContent(
objectKey: string,
bucket: string,
mimeType: string,
fileName: string,
fileId: string
): Promise<AnalysisResult> {
const result: AnalysisResult = {
fileId,
pageCount: null,
textPreview: null,
detectedLang: null,
langConfidence: null,
}
if (!isParseableMimeType(mimeType)) {
return { ...result, error: 'Unsupported mime type for analysis' }
}
try {
const storage = await getStorageProvider()
const buffer = await storage.getObject(objectKey)
let text = ''
let pageCount: number | null = null
if (mimeType === 'application/pdf') {
const pdfParseModule = await import('pdf-parse')
const pdfParse =
typeof pdfParseModule === 'function'
? pdfParseModule
: (pdfParseModule as any).default ?? pdfParseModule
const pdf = await pdfParse(buffer)
text = pdf.text || ''
pageCount = pdf.numpages ?? null
} else {
// Text-based files (plain text, CSV, markdown, HTML, RTF)
text = buffer.toString('utf-8')
}
result.pageCount = pageCount
// Text preview
if (text.trim()) {
result.textPreview =
text.length > TEXT_PREVIEW_LIMIT
? text.slice(0, TEXT_PREVIEW_LIMIT)
: text
}
// Language detection
if (text.trim().length >= 20) {
const langResult = await detectLanguage(text)
result.detectedLang = langResult.lang
result.langConfidence = langResult.confidence
}
return result
} catch (error) {
console.warn(
`[DocAnalyzer] Failed to analyze ${fileName}:`,
error instanceof Error ? error.message : error
)
return {
...result,
error: error instanceof Error ? error.message : 'Analysis failed',
}
}
}
// ─── DB-Integrated Operations ───────────────────────────────────────────────
/**
* Analyze a single file by ID and persist results to DB.
*/
export async function analyzeFile(fileId: string): Promise<AnalysisResult> {
const file = await prisma.projectFile.findUnique({
where: { id: fileId },
select: {
id: true,
objectKey: true,
bucket: true,
mimeType: true,
fileName: true,
},
})
if (!file) {
return {
fileId,
pageCount: null,
textPreview: null,
detectedLang: null,
langConfidence: null,
error: 'File not found',
}
}
const result = await analyzeFileContent(
file.objectKey,
file.bucket,
file.mimeType,
file.fileName,
file.id
)
// Persist results
await prisma.projectFile.update({
where: { id: fileId },
data: {
pageCount: result.pageCount,
textPreview: result.textPreview,
detectedLang: result.detectedLang,
langConfidence: result.langConfidence,
analyzedAt: new Date(),
},
})
return result
}
/**
* Analyze a single file by ID with a delay (for post-upload use).
* The delay accounts for presigned URL uploads where the file
* may not be in storage yet when the DB record is created.
*/
export async function analyzeFileDelayed(
fileId: string,
delayMs = 3000
): Promise<AnalysisResult> {
await new Promise((resolve) => setTimeout(resolve, delayMs))
return analyzeFile(fileId)
}
/**
* Analyze all files for a specific project.
*/
export async function analyzeProjectFiles(
projectId: string
): Promise<{ analyzed: number; failed: number; total: number }> {
const files = await prisma.projectFile.findMany({
where: { projectId },
select: {
id: true,
objectKey: true,
bucket: true,
mimeType: true,
fileName: true,
},
})
let analyzed = 0
let failed = 0
// Process in batches
for (let i = 0; i < files.length; i += BATCH_SIZE) {
const batch = files.slice(i, i + BATCH_SIZE)
const results = await Promise.allSettled(
batch.map(async (file) => {
if (!isParseableMimeType(file.mimeType)) {
// Mark non-parseable files as analyzed with no data
await prisma.projectFile.update({
where: { id: file.id },
data: { analyzedAt: new Date() },
})
return 'skipped'
}
const result = await analyzeFileContent(
file.objectKey,
file.bucket,
file.mimeType,
file.fileName,
file.id
)
await prisma.projectFile.update({
where: { id: file.id },
data: {
pageCount: result.pageCount,
textPreview: result.textPreview,
detectedLang: result.detectedLang,
langConfidence: result.langConfidence,
analyzedAt: new Date(),
},
})
return result.error ? 'failed' : 'analyzed'
})
)
for (const r of results) {
if (r.status === 'fulfilled') {
if (r.value === 'analyzed') analyzed++
else if (r.value === 'failed') failed++
} else {
failed++
}
}
}
return { analyzed, failed, total: files.length }
}
/**
* Retroactive batch analysis: analyze all files that haven't been analyzed yet.
* Returns counts. Processes in batches to avoid memory issues.
*/
export async function analyzeAllUnanalyzed(): Promise<{
analyzed: number
failed: number
skipped: number
total: number
}> {
const files = await prisma.projectFile.findMany({
where: { analyzedAt: null },
select: {
id: true,
objectKey: true,
bucket: true,
mimeType: true,
fileName: true,
},
orderBy: { createdAt: 'desc' },
})
let analyzed = 0
let failed = 0
let skipped = 0
for (let i = 0; i < files.length; i += BATCH_SIZE) {
const batch = files.slice(i, i + BATCH_SIZE)
const results = await Promise.allSettled(
batch.map(async (file) => {
if (!isParseableMimeType(file.mimeType)) {
await prisma.projectFile.update({
where: { id: file.id },
data: { analyzedAt: new Date() },
})
return 'skipped'
}
const result = await analyzeFileContent(
file.objectKey,
file.bucket,
file.mimeType,
file.fileName,
file.id
)
await prisma.projectFile.update({
where: { id: file.id },
data: {
pageCount: result.pageCount,
textPreview: result.textPreview,
detectedLang: result.detectedLang,
langConfidence: result.langConfidence,
analyzedAt: new Date(),
},
})
return result.error ? 'failed' : 'analyzed'
})
)
for (const r of results) {
if (r.status === 'fulfilled') {
if (r.value === 'analyzed') analyzed++
else if (r.value === 'failed') failed++
else if (r.value === 'skipped') skipped++
} else {
failed++
}
}
console.log(
`[DocAnalyzer] Batch progress: ${i + batch.length}/${files.length} (${analyzed} analyzed, ${skipped} skipped, ${failed} failed)`
)
}
return { analyzed, failed, skipped, total: files.length }
}
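Both `analyzeProjectFiles` and `analyzeAllUnanalyzed` share the same chunked `Promise.allSettled` loop. A generic distillation of that pattern (names and the outcome union are illustrative), where one failing item never aborts its batch:

```typescript
// Process items in fixed-size chunks; Promise.allSettled isolates failures so
// a single rejected worker only increments the failure count.
async function processInBatches<T>(
  items: T[],
  batchSize: number,
  worker: (item: T) => Promise<'analyzed' | 'failed' | 'skipped'>
): Promise<{ analyzed: number; failed: number; skipped: number }> {
  const counts = { analyzed: 0, failed: 0, skipped: 0 }
  for (let i = 0; i < items.length; i += batchSize) {
    const settled = await Promise.allSettled(items.slice(i, i + batchSize).map(worker))
    for (const r of settled) {
      if (r.status === 'fulfilled') counts[r.value]++
      else counts.failed++ // a rejected worker counts as a failure
    }
  }
  return counts
}
```

Keeping `BATCH_SIZE` small bounds concurrent downloads and PDF parses, which is the memory concern the docstring mentions.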
/**
* Check if auto-analysis is enabled via SystemSettings.
*/
export async function isAutoAnalysisEnabled(): Promise<boolean> {
try {
const setting = await prisma.systemSettings.findUnique({
where: { key: 'file_analysis_auto_enabled' },
})
// Default to true if setting doesn't exist
return setting?.value !== 'false'
} catch {
return true
}
}
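The default-true convention above (`setting?.value !== 'false'`) means a missing row enables the feature; only the literal string `'false'` disables it. As a standalone sketch of that check:

```typescript
// Mirrors isAutoAnalysisEnabled's check: any stored value other than the
// literal string 'false' (including a missing/undefined setting) is enabled.
function settingEnabled(value: string | null | undefined): boolean {
  return value !== 'false'
}
```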

View File

@@ -143,6 +143,24 @@ export async function activateRound(
     detailsJson: { name: round.name, roundType: round.roundType },
   })

+  // Retroactive check: auto-PASS any projects that already have all required docs uploaded
+  // Non-fatal — runs after activation so it never blocks the transition
+  try {
+    const projectStates = await prisma.projectRoundState.findMany({
+      where: { roundId, state: { in: ['PENDING', 'IN_PROGRESS'] } },
+      select: { projectId: true },
+    })
+    if (projectStates.length > 0) {
+      const projectIds = projectStates.map((ps: { projectId: string }) => ps.projectId)
+      const result = await batchCheckRequirementsAndTransition(roundId, projectIds, actorId, prisma)
+      if (result.transitionedCount > 0) {
+        console.log(`[RoundEngine] On activation: auto-passed ${result.transitionedCount} projects with complete documents`)
+      }
+    }
+  } catch (retroError) {
+    console.error('[RoundEngine] Retroactive document check failed (non-fatal):', retroError)
+  }
+
   return {
     success: true,
     round: { id: updated.id, status: updated.status },
@@ -429,6 +447,23 @@ export async function reopenRound(
     },
   })

+  // Retroactive check: auto-PASS any projects that already have all required docs
+  try {
+    const projectStates = await prisma.projectRoundState.findMany({
+      where: { roundId, state: { in: ['PENDING', 'IN_PROGRESS'] } },
+      select: { projectId: true },
+    })
+    if (projectStates.length > 0) {
+      const projectIds = projectStates.map((ps: { projectId: string }) => ps.projectId)
+      const batchResult = await batchCheckRequirementsAndTransition(roundId, projectIds, actorId, prisma)
+      if (batchResult.transitionedCount > 0) {
+        console.log(`[RoundEngine] On reopen: auto-passed ${batchResult.transitionedCount} projects with complete documents`)
+      }
+    }
+  } catch (retroError) {
+    console.error('[RoundEngine] Retroactive document check on reopen failed (non-fatal):', retroError)
+  }
+
   return {
     success: true,
     round: { id: result.updated.id, status: result.updated.status },
@@ -625,6 +660,109 @@ export async function getProjectRoundState(
   })
 }
// ─── Auto-Transition on Document Completion ─────────────────────────────────
/**
* Check if a project has fulfilled all required FileRequirements for a round.
 * If yes, and the project is currently PENDING or IN_PROGRESS, transition it to PASSED.
*
* Called after file uploads (admin bulk upload or applicant upload).
* Non-fatal: errors are logged but never propagated to callers.
*/
export async function checkRequirementsAndTransition(
projectId: string,
roundId: string,
actorId: string,
prisma: PrismaClient | any,
): Promise<{ transitioned: boolean; newState?: string }> {
try {
// Get all required FileRequirements for this round
const requirements = await prisma.fileRequirement.findMany({
where: { roundId, isRequired: true },
select: { id: true },
})
// If the round has no file requirements, nothing to check
if (requirements.length === 0) {
return { transitioned: false }
}
// Check which requirements this project has satisfied (has a file uploaded)
const fulfilledFiles = await prisma.projectFile.findMany({
where: {
projectId,
roundId,
requirementId: { in: requirements.map((r: { id: string }) => r.id) },
},
select: { requirementId: true },
})
const fulfilledIds = new Set(
fulfilledFiles
.map((f: { requirementId: string | null }) => f.requirementId)
.filter(Boolean)
)
// Check if all required requirements are met
const allMet = requirements.every((r: { id: string }) => fulfilledIds.has(r.id))
if (!allMet) {
return { transitioned: false }
}
// Check current state — only transition if PENDING or IN_PROGRESS
const currentState = await prisma.projectRoundState.findUnique({
where: { projectId_roundId: { projectId, roundId } },
select: { state: true },
})
const eligibleStates = ['PENDING', 'IN_PROGRESS']
if (!currentState || !eligibleStates.includes(currentState.state)) {
return { transitioned: false }
}
// All requirements met — transition to PASSED
const result = await transitionProject(projectId, roundId, 'PASSED' as ProjectRoundStateValue, actorId, prisma)
if (result.success) {
console.log(`[RoundEngine] Auto-transitioned project ${projectId} to PASSED in round ${roundId} (all ${requirements.length} requirements met)`)
return { transitioned: true, newState: 'PASSED' }
}
return { transitioned: false }
} catch (error) {
// Non-fatal — log and continue
console.error('[RoundEngine] checkRequirementsAndTransition failed:', error)
return { transitioned: false }
}
}
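The completion test at the heart of `checkRequirementsAndTransition` reduces to a set-membership check. A self-contained sketch (hypothetical helper, not an actual export):

```typescript
// A project is complete when every required requirement id appears among the
// requirementIds of its uploaded files. An empty requirement list is treated
// as "nothing to check", matching the early return above.
function allRequirementsMet(
  requiredIds: string[],
  fileRequirementIds: Array<string | null>
): boolean {
  if (requiredIds.length === 0) return false
  const fulfilled = new Set(fileRequirementIds.filter((id): id is string => id !== null))
  return requiredIds.every((id) => fulfilled.has(id))
}
```

Filtering out `null` first matters: files uploaded outside any requirement carry `requirementId: null` and must not satisfy a requirement.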
/**
* Batch version: check all projects in a round and transition any that
* have all required documents uploaded. Useful after bulk upload.
*/
export async function batchCheckRequirementsAndTransition(
roundId: string,
projectIds: string[],
actorId: string,
prisma: PrismaClient | any,
): Promise<{ transitionedCount: number; projectIds: string[] }> {
const transitioned: string[] = []
for (const projectId of projectIds) {
const result = await checkRequirementsAndTransition(projectId, roundId, actorId, prisma)
if (result.transitioned) {
transitioned.push(projectId)
}
}
if (transitioned.length > 0) {
console.log(`[RoundEngine] Batch auto-transition: ${transitioned.length}/${projectIds.length} projects moved to PASSED in round ${roundId}`)
}
return { transitionedCount: transitioned.length, projectIds: transitioned }
}
// ─── Internals ──────────────────────────────────────────────────────────────

function isTerminalState(state: ProjectRoundStateValue): boolean {