
Below is a “technical requirements” rendering (not an architecture diagram), structured so you can hand it to a dev team and derive system architecture + backlog from it. Phase 1 is the only critical deliverable; Phase 2+ are explicitly extendable.


0) Product scope and phasing

Phase 1 (critical, delivery in ~2 weeks)

Secure Jury Online Voting Module to run two selection rounds:

  • Round 1: ~130 projects → ~60 semi-finalists (Feb 18–23 voting window)
  • Round 2: ~60 projects → 6 finalists (voting window: week of ~April 13)
  • Voting is asynchronous, online, with assigned project access, scoring + feedback capture, and reporting dashboards.

Phase 2+ (mid-term)

Centralized MOPC platform:

  • Applications/projects database
  • Document management (MinIO S3)
  • Jury spaces (history, comments, scoring)
  • Learning hub / resources
  • Communication workflows (email + possibly WhatsApp)
  • Partner/sponsor visibility modules
  • Potential website integration / shared back office

1) Users, roles, permissions (RBAC)

Core roles

  1. Platform Super Admin

    • Full system configuration, security policies, integrations, user/role management, data export, audit access.
  2. Program Admin (MOPC Admin)

    • Manages cycles/rounds, projects, jury members, assignments, voting windows, criteria forms, dashboards, exports.
  3. Jury Member

    • Can access only assigned projects for active rounds; submit evaluations; view own submitted evaluations; optionally view aggregated results only if permitted.
  4. Read-only Observer (optional)

    • Internal meeting viewer: can see dashboards/aggregates but cannot edit votes.

Permission model requirements

  • Least privilege by default
  • Round-scoped permissions: access can be constrained per selection round/cycle.
  • Project-scoped access control: jury sees only assigned projects (unless admin toggles “all projects visible”).
  • Admin override controls: reassign projects, revoke access, reopen/lock evaluations, extend voting windows, invalidate votes with reason logging.
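The access rules above can be sketched as a single authorization check. This is an illustrative sketch, not the implementation: the role names, `User` shape, and the `all_visible` toggle are assumptions derived from the requirements.

```python
from dataclasses import dataclass, field
from enum import Enum

class Role(Enum):
    SUPER_ADMIN = "super_admin"
    PROGRAM_ADMIN = "program_admin"
    JURY = "jury"
    OBSERVER = "observer"

@dataclass
class User:
    id: str
    role: Role
    # Project IDs this juror is assigned to, keyed by round ID.
    assignments: dict[str, set[str]] = field(default_factory=dict)

def can_view_project(user: User, round_id: str, project_id: str,
                     round_active: bool, all_visible: bool = False) -> bool:
    """Least privilege by default: admins see everything; jurors see only
    assigned projects in active rounds (unless the admin toggles visibility)."""
    if user.role in (Role.SUPER_ADMIN, Role.PROGRAM_ADMIN):
        return True
    if user.role is Role.OBSERVER:
        return False  # observers see dashboards/aggregates, not project detail
    if not round_active:
        return False
    if all_visible:
        return True
    return project_id in user.assignments.get(round_id, set())
```

Keeping the check round-scoped and project-scoped in one function makes admin overrides (reassign, revoke) take effect immediately: they just mutate `assignments`.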

2) Core domain objects (data model concepts)

Entities

  • Program (e.g., “MOPC 2026”)

  • Selection Cycle / Round

    • Attributes: name, start/end of voting window, status (draft/active/closed/archived), required reviews per project (default ≥3), scoring form version, jury cohort.
  • Project

    • Attributes: title, team name, description, tags, status (submitted/eligible/assigned/semi-finalist/finalist/etc.), submission metadata, external IDs (Typeform/Notion), files (exec summary, PDF deck, intro video).
  • File Asset

    • Stored in MinIO (S3-compatible): object key, bucket, version/etag, mime type, size, upload timestamp, retention policy, access policy.
  • Jury Member

    • Profile: name, email, organization (optional), role, expertise tags, status (invited/active/suspended).
  • Expertise Tag

    • Managed vocabulary or free-form with admin approval.
  • Assignment

    • Connects Jury Member ↔ Project ↔ Round
    • Attributes: assignment method (manual/auto), created by, created timestamp, required review flag, completion status.
  • Evaluation (Vote)

    • Per assignment: criterion scores + global score + binary decision + qualitative feedback
    • Metadata: submitted_at, last_edited_at, finalization flag, versioning, IP/user-agent logging (optional), conflict handling.
  • Audit Log

    • Immutable events: login, permission changes, voting window changes, assignments, overrides, exports, vote invalidations.
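The core entities above could be sketched as dataclasses; field names and defaults here are assumptions lifted directly from the attribute lists (only a subset is shown).

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Round:
    name: str
    voting_start: datetime
    voting_end: datetime
    status: str = "draft"        # draft / active / closed / archived
    required_reviews: int = 3    # configurable N, default >= 3
    form_version: str = "v1"     # scoring form version

@dataclass
class Project:
    title: str
    team_name: str
    tags: list[str] = field(default_factory=list)
    status: str = "submitted"
    external_ids: dict[str, str] = field(default_factory=dict)  # Typeform/Notion

@dataclass
class Assignment:
    juror_id: str
    project_id: str
    round_id: str
    method: str = "manual"       # manual / auto
    completed: bool = False

@dataclass
class Evaluation:
    assignment_id: str
    criterion_scores: dict[str, int]
    global_score: int
    select_yes: bool             # binary "semi-finalist?" decision
    feedback: str = ""
    submitted_at: Optional[datetime] = None
    finalized: bool = False
```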

3) Phase 1 functional requirements

3.1 Jury authentication & access

  • Invite flow:

    • Admin imports jury list (CSV) or adds manually.
    • System sends invitation email with secure link + account activation.
  • Authentication options (choose one for Phase 1, keep others pluggable):

    • Email magic link (recommended for speed)
    • Password login, with optional MFA
  • Session requirements:

    • Configurable session duration
    • Forced logout on role revocation
  • Access gating:

    • Jury can only view projects for active rounds and only those assigned.

3.2 Project ingestion & management

Phase 1 can support either:

  • Option A (fastest): Manual import

    • Admin uploads CSV with project metadata + file links or uploads.
  • Option B (semi-integrated): Sync from Notion/Typeform

    • Read projects from existing Notion DB and/or Typeform export.

Minimum capabilities:

  • Admin CRUD on projects (create/update/archive)

  • Project tagging (from “Which issue does your project address?” + additional admin tags)

  • Attach required assets:

    • Executive summary (PDF/doc)
    • PDF presentation
    • 30s intro video (mp4)
  • File storage via MinIO (see Section 6)
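Option A's manual import is essentially CSV validation plus tag splitting. A sketch under an assumed minimal column schema (`title`, `team_name`, semicolon-separated `tags`) — the real schema would mirror the Typeform/Notion fields:

```python
import csv
import io

REQUIRED_COLUMNS = {"title", "team_name", "tags"}  # assumed minimal schema

def import_projects(csv_text: str) -> list[dict]:
    """Parse an admin-uploaded CSV into project records, validating headers
    and splitting the semicolon-separated tag column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"CSV missing columns: {sorted(missing)}")
    projects = []
    for row in reader:
        projects.append({
            "title": row["title"].strip(),
            "team_name": row["team_name"].strip(),
            "tags": [t.strip() for t in row["tags"].split(";") if t.strip()],
            "status": "submitted",
        })
    return projects
```

Failing fast on missing columns (rather than importing partial rows) keeps the admin's CSV round-trip predictable under the 2-week timeline.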

3.3 Assignment system (≥3 reviews/project)

Admin can:

  • Manually assign projects to jury members (bulk assign supported)

  • Auto-assign (optional but strongly recommended):

    • Input: jury expertise tags + project tags + constraints

    • Constraints:

      • Each project assigned to at least N jurors (N configurable; default 3)
      • Load balancing across jurors (minimize variance)
      • Avoid conflicts (optional): disallow assignment if juror marked conflict with project
    • Output: assignment set + summary metrics (coverage, per-juror load, unmatched tags)

  • Reassignment rules:

    • Admin can reassign at any time

    • If an evaluation exists, admin can:

      • keep existing evaluation tied to original juror
      • or invalidate/lock it (requires reason + audit event)

3.4 Evaluation form & scoring logic

Per project evaluation must capture:

  • Criterion scores (scale-based; exact scale configurable, e.g., 1–5 or 1–10)

    1. Need clarity
    2. Solution relevance
    3. Gap analysis (market/competitors)
    4. Target customers clarity
    5. Ocean impact
  • Global score: 1–10

  • Binary decision: “Select as semi-finalist?” (Yes/No)

  • Qualitative feedback: long text

Form requirements:

  • Admin-configurable criteria text, ordering, scales, and whether fields are mandatory
  • Autosave drafts
  • Final submit locks evaluation by default (admin can allow edits until window closes)
  • Support multiple rounds with potentially different forms (versioned forms per round)
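Server-side validation of a submitted evaluation follows straight from the form definition. A sketch assuming 1–5 criterion scales and the five criteria above (in a real build the criteria, scales, and mandatory flags come from the versioned form config, not constants):

```python
# Assumed form config; in practice loaded per round from the versioned form.
CRITERIA = ["need_clarity", "solution_relevance", "gap_analysis",
            "target_customers", "ocean_impact"]
CRITERION_SCALE = (1, 5)
GLOBAL_SCALE = (1, 10)

def validate_evaluation(payload: dict) -> list[str]:
    """Return a list of validation errors for a submitted evaluation;
    an empty list means the submission is acceptable."""
    errors = []
    scores = payload.get("criterion_scores", {})
    lo, hi = CRITERION_SCALE
    for c in CRITERIA:
        s = scores.get(c)
        if not isinstance(s, int) or not lo <= s <= hi:
            errors.append(f"{c}: integer score {lo}-{hi} required")
    g = payload.get("global_score")
    if not isinstance(g, int) or not GLOBAL_SCALE[0] <= g <= GLOBAL_SCALE[1]:
        errors.append("global_score: integer 1-10 required")
    if payload.get("select_semifinalist") not in (True, False):
        errors.append("select_semifinalist: Yes/No decision required")
    return errors
```

Autosaved drafts would skip this check; it runs only on final submit, which is also the point where the evaluation locks by default.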

3.5 Voting windows and enforcement (must-have)

Admins must be able to configure and enforce:

  • Voting window start/end per round (date-time, timezone-aware)

  • States:

    • Draft (admins only)
    • Active (jury can submit)
    • Closed (jury read-only)
    • Archived (admin/export only)
  • Enforcement rules:

    • Jury cannot submit outside the active window
    • Admin “grace period” toggle to accept late submissions for specific jurors/projects
    • Admin can extend the window (global or subset) with audit logging
  • Dashboard countdown + clear messaging for jurors

3.6 Dashboards & outputs

Must produce:

  • Jury member view

    • Assigned projects list, completion status, quick access to files, evaluation status (not started/draft/submitted)
  • Admin dashboards

    • Coverage: projects with <N evaluations

    • Progress: submission rates by juror

    • Aggregates per project:

      • Average per criterion
      • Average global score
      • Distribution (min/max, std dev optional)
      • Count of “Yes” votes
      • Qualitative comments list (with juror identity visible only to admins, configurable)
    • Shortlisting tools:

      • Filter/sort by aggregate score, yes-vote ratio, tag, missing reviews
      • Export shortlist (e.g., top 60 / top 6) with manual override controls
  • Exports (Phase 1):

    • CSV/Excel export for:

      • Evaluations (row per evaluation)
      • Aggregates (row per project)
      • Assignment matrix
    • PDF export (optional) for meeting packs
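The per-project aggregates listed above are a small reduction over submitted evaluations; a sketch (the evaluation dict shape is assumed, matching the form fields):

```python
from statistics import mean, pstdev

def aggregate_project(evals: list[dict]) -> dict:
    """Compute the admin-dashboard aggregates for one project:
    per-criterion averages, global average, spread, and yes-vote count."""
    if not evals:
        return {"count": 0}
    criteria = evals[0]["criterion_scores"].keys()
    globals_ = [e["global_score"] for e in evals]
    return {
        "count": len(evals),
        "avg_per_criterion": {
            c: round(mean(e["criterion_scores"][c] for e in evals), 2)
            for c in criteria
        },
        "avg_global": round(mean(globals_), 2),
        "min_global": min(globals_),
        "max_global": max(globals_),
        "stdev_global": round(pstdev(globals_), 2),  # optional per the spec
        "yes_votes": sum(1 for e in evals if e["select_semifinalist"]),
    }
```

Sorting projects by `avg_global` or `yes_votes / count` and filtering `count < N` gives the shortlisting and coverage views; the same dicts serialize straight into the per-project CSV export rows.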


4) Admin console requirements (robust)

4.1 Governance & configuration

  • Create/manage Programs and Rounds

  • Set:

    • Required reviews per project (N)
    • Voting windows (start/end) + grace rules
    • Evaluation form version
    • Visibility rules (whether jurors can see aggregates, whether jurors can see their past submissions after close)
  • Manage tags:

    • Tag taxonomy, synonyms/merging, locked tags

4.2 User management & security controls

  • Bulk invite/import

  • Role assignment & revocation

  • Force password reset / disable account

  • View user activity logs

  • Configure:

    • Allowed email domains (optional)
    • MFA requirement (optional)
    • Session lifetime (optional)

4.3 Assignment controls

  • Manual assignment UI (single + bulk)

  • Auto-assignment wizard:

    • select round
    • choose balancing strategy (e.g., “maximize tag match”, “balance load first”)
    • preview results
    • apply
  • Conflict of interest handling:

    • Admin can mark conflicts (juror ↔ project)
    • Auto-assign must respect conflicts

4.4 Data integrity controls

  • Vote invalidation (requires reason)
  • Reopen evaluation (admin-only, logged)
  • Freeze round (hard lock)
  • Immutable audit log export
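One way to make the exported audit log tamper-evident is hash chaining, where each entry embeds the hash of its predecessor. This is a sketch of that idea, not a prescribed design; an append-only database table with restricted permissions is the simpler baseline:

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only log; each entry embeds the previous entry's hash,
    so edits to history are detectable when the export is re-verified."""

    def __init__(self):
        self.entries: list[dict] = []
        self._last_hash = "0" * 64

    def append(self, actor: str, action: str, detail: dict) -> dict:
        entry = {
            "ts": time.time(),
            "actor": actor,
            "action": action,      # e.g. "extend_window", "invalidate_vote"
            "detail": detail,      # e.g. the required invalidation reason
            "prev_hash": self._last_hash,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev:
                return False
            if hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```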

4.5 Integrations management

  • Connectors toggles (Typeform/Notion/email provider/WhatsApp) with credentials stored securely
  • MinIO bucket configuration + retention policies
  • Webhook management (optional)

5) Non-functional requirements (Phase 1)

Security

  • TLS everywhere

  • RBAC + project-level access control

  • Secure file access (pre-signed URLs with short TTL; no public buckets)

  • Audit logging for admin actions + exports

  • Basic anti-abuse:

    • rate limiting login endpoints
    • brute-force protection if password auth used

Reliability & performance

  • Support:

    • Round 1: 15 jurors, 130 projects, min 390 evaluations
    • Round 2: ~30 jurors, 60 projects
  • Fast page load for dashboards and project pages

  • File streaming for PDFs/videos (avoid timeouts)

Compliance & privacy (baseline)

  • Store only necessary personal data for jurors/candidates
  • Retention policies configurable (especially for candidate files)
  • Access logs available for security review

6) File storage requirements (MinIO S3)

Storage design (requirements-level)

  • Use MinIO as S3-compatible object store for:

    • project documents (exec summary, deck)
    • video files
    • optional assets (logos, export packs)
  • Buckets:

    • Separate buckets or prefixes by Program/Round to simplify retention + permissions
  • Access pattern:

    • Upload: direct-to-S3 (preferred) or via backend proxy
    • Download/view: pre-signed URLs generated by backend per authorized user
  • Optional features:

    • Object versioning enabled
    • Antivirus scanning hook (Phase 2)
    • Lifecycle rules (auto-expire after X months)
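The "pre-signed URL with short TTL" pattern is illustrated below in simplified form: an HMAC over bucket/key/expiry that the backend mints per authorized user. In a real deployment you would not roll your own signing — the MinIO/S3 SDK's pre-signed GET (SigV4) does this for you; the hostname and key name here are placeholders:

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SIGNING_KEY = b"backend-only-key"  # hypothetical; real setups use the S3 SDK

def presign_download(bucket: str, key: str, ttl: int = 300) -> str:
    """Simplified illustration of a short-TTL signed download URL."""
    expires = int(time.time()) + ttl
    msg = f"{bucket}/{key}:{expires}".encode()
    sig = hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest()
    return (f"https://files.example.org/{bucket}/{key}?"
            + urlencode({"expires": expires, "sig": sig}))

def check_signature(bucket: str, key: str, expires: int, sig: str) -> bool:
    """Reject expired or forged links; buckets stay private throughout."""
    if expires < time.time():
        return False
    msg = f"{bucket}/{key}:{expires}".encode()
    return hmac.compare_digest(
        sig, hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest())
```

The key property is that authorization happens in the backend at URL-generation time (RBAC + assignment check), and the object store itself never exposes a public endpoint.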

7) “Current process” integration mapping (future-proof)

Existing flow

  • Typeform application → confirmation email → Tally upload → Notion tracking → Google Drive manual upload

Platform integration targets

Phase 1 (minimal):

  • Allow admin to ingest projects and upload assets (replace Drive for jury-facing access)

Phase 2 options:

  • Typeform: pull submissions via API/webhooks
  • Tally: capture uploads directly to MinIO (or via platform upload portal)
  • Notion: sync project status + metadata (one-way or two-way)
  • Email automation: reminder workflows for incomplete applications

8) Additional ideas as “technical backlog candidates”

Automated follow-ups for incomplete applications (Phase 2)

  • State machine for applications: registered → awaiting docs → complete → expired

  • Scheduler:

    • send reminders at configurable intervals (e.g., +2d, +5d, +7d)
    • stop on completion
  • Channels:

    • Email must-have
    • WhatsApp optional (requires compliance + provider; store consent + opt-out)
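The scheduler logic above is mostly bookkeeping: given the configured offsets and what was already sent, decide what is due now and stop once the application leaves the incomplete states. A sketch (state names taken from the state machine above):

```python
from datetime import datetime, timedelta

# Configurable reminder schedule from the spec: +2d, +5d, +7d.
REMINDER_OFFSETS = [timedelta(days=2), timedelta(days=5), timedelta(days=7)]

def due_reminders(registered_at: datetime, now: datetime,
                  sent_count: int, state: str) -> int:
    """How many reminders are due for an application right now.
    Stops entirely once the application is complete or expired."""
    if state not in ("registered", "awaiting_docs"):
        return 0
    due = sum(1 for off in REMINDER_OFFSETS if registered_at + off <= now)
    return max(0, due - sent_count)
```

A periodic job (cron or task queue) runs this per application and dispatches via the enabled channels; WhatsApp sends would additionally check stored consent/opt-out before dispatching.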

Learning hub access (semi-finalists only)

  • Resource library stored in MinIO + metadata in DB
  • Access controlled by cohort + passwordless login or access tokens
  • Expiring invite links

Website integration

  • Shared identity/back office (SSO-ready) OR separate admin domains
  • Public-facing site remains content-only; platform is operational hub
  • Requirement: clear separation between “public content” and “private jury/applicant data”

9) Acceptance criteria checklist (Phase 1)

  1. Admin can create a round, set voting window (start/end), and activate it.
  2. Admin can import projects + upload/attach required files to MinIO.
  3. Admin can import jurors, invite them, and jurors can log in securely.
  4. Admin can assign projects (manual + bulk). Auto-assign is optional but if included must guarantee ≥3 reviews/project.
  5. Juror sees only assigned projects, can view files, and submit evaluation form.
  6. System blocks submissions outside the voting window (unless admin-granted exception).
  7. Admin dashboard shows progress + aggregates per project; admin can export results.
  8. All critical admin actions are audit-logged.
  9. File access is protected (no public links; pre-signed URLs with TTL).

If you want, I can turn this into:

  • a clean PRD-style document (Dev-ready) plus
  • a ticket breakdown (Epics → user stories → acceptance tests) for Phase 1 delivery.