From 7ee7a1441b9989a12a645319762da435af6004f6 Mon Sep 17 00:00:00 2001
From: William Hill
Date: Tue, 31 Mar 2026 18:48:35 -0400
Subject: [PATCH 01/16] docs: add implementation plan for self-service data
 upload (#86)

12 tasks covering schema registry, file parsing, API routes, UI wizard,
upload history, nav update, and test data generation. TDD approach with
Vitest for pure-logic modules.
---
 .../2026-03-31-self-service-data-upload.md | 2786 +++++++++++++++++
 1 file changed, 2786 insertions(+)
 create mode 100644 docs/superpowers/plans/2026-03-31-self-service-data-upload.md

diff --git a/docs/superpowers/plans/2026-03-31-self-service-data-upload.md b/docs/superpowers/plans/2026-03-31-self-service-data-upload.md
new file mode 100644
index 0000000..bc51a49
--- /dev/null
+++ b/docs/superpowers/plans/2026-03-31-self-service-data-upload.md
@@ -0,0 +1,2786 @@
# Self-Service Data Upload Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Allow admin/IR users to upload PDP cohort, course enrollment, and ML prediction files via a 3-step wizard that auto-detects file type, previews data with column mapping, and batch-upserts into Postgres.

**Architecture:** 3-step client-side wizard (`Upload → Preview & Map → Confirm`) backed by three API routes (preview, commit, history). Schema detection in `lib/upload-schemas.ts` scores file headers against known PDP/AR column signatures. Pure-logic modules are unit-tested with Vitest.
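The header-scoring idea behind auto-detection can be sketched standalone. This is illustrative only — the real registry, aliases, and thresholds are built in `lib/upload-schemas.ts` (Task 2); the two trimmed-down signatures below are assumptions for the sake of the sketch:

```typescript
// Illustrative sketch: normalize incoming headers, then score each known
// signature by the fraction of the file's headers it recognizes.
const signatures: Record<string, string[]> = {
  pdp_cohort_ar: ["student_guid", "cohort", "cohort_term", "retention", "persistence"],
  ml_predictions: ["student_guid", "prediction_type", "prediction_value"],
}

const normalize = (h: string) => h.trim().toLowerCase().replace(/[\s\-\/]+/g, "_")

function bestMatch(headers: string[]): { id: string | null; score: number } {
  const norm = headers.map(normalize)
  let best: { id: string | null; score: number } = { id: null, score: 0 }
  for (const [id, cols] of Object.entries(signatures)) {
    const known = new Set(cols)
    const score = norm.filter((h) => known.has(h)).length / norm.length
    if (score > best.score) best = { id, score }
  }
  // Below a confidence threshold, treat the file as unrecognized.
  return best.score >= 0.6 ? best : { id: null, score: best.score }
}

console.log(bestMatch(["Student_GUID", "Cohort", "Cohort Term", "Retention"]))
// best match is pdp_cohort_ar with score 1
```

The real module adds per-column aliases, required-column checks, and a middle confidence band, but the scoring core is this small.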
+ +**Tech Stack:** Next.js 16 App Router, React 19, TypeScript, Tailwind CSS, shadcn/ui, pg, papaparse, xlsx, Vitest + +**Spec:** `docs/superpowers/specs/2026-03-31-self-service-data-upload-design.md` + +**Worktree:** `.worktrees/feature-86-data-upload` (branch: `feature/86-self-service-data-upload`, based on `origin/main`) + +**Working directory:** All `codebenders-dashboard/` paths are relative to the worktree root. + +--- + +## File Structure + +### New Files + +| File | Responsibility | +|------|---------------| +| `codebenders-dashboard/lib/upload-schemas.ts` | Schema registry, header normalization, auto-detection scoring | +| `codebenders-dashboard/lib/upload-parser.ts` | CSV/XLSX parsing to uniform row arrays | +| `codebenders-dashboard/lib/__tests__/upload-schemas.test.ts` | Unit tests for schema detection | +| `codebenders-dashboard/lib/__tests__/upload-parser.test.ts` | Unit tests for file parsing | +| `codebenders-dashboard/vitest.config.ts` | Vitest configuration | +| `codebenders-dashboard/app/admin/layout.tsx` | Admin sub-layout (shared heading) | +| `codebenders-dashboard/app/admin/upload/page.tsx` | Upload wizard (3-step, client component) | +| `codebenders-dashboard/app/admin/upload/history/page.tsx` | Upload history table | +| `codebenders-dashboard/app/api/admin/upload/preview/route.ts` | POST: parse file, detect schema, return preview | +| `codebenders-dashboard/app/api/admin/upload/commit/route.ts` | POST: validate rows, batch upsert, log to history | +| `codebenders-dashboard/app/api/admin/upload/history/route.ts` | GET: paginated upload history | +| `codebenders-dashboard/components/upload/drop-zone.tsx` | Drag-and-drop file input | +| `codebenders-dashboard/components/upload/column-mapper.tsx` | Column mapping table with dropdowns | +| `codebenders-dashboard/components/upload/data-preview.tsx` | Sample rows table | +| `codebenders-dashboard/components/upload/upload-summary.tsx` | Completion card with row counts | +| 
`operations/generate_test_data.py` | Python script generating synthetic PDP/AR/course test files |

### Modified Files

| File | Change |
|------|--------|
| `codebenders-dashboard/lib/roles.ts:6-14` | Add `/admin` and `/api/admin` to `ROUTE_PERMISSIONS` |
| `codebenders-dashboard/components/nav-header.tsx:15-19` | Add "Admin" link to `NAV_LINKS` (conditionally shown for admin/ir) |
| `codebenders-dashboard/package.json` | Add `papaparse`, `@types/papaparse`, `xlsx`, `vitest` |

---

## Task 1: Install Dependencies & Configure Vitest

**Files:**
- Modify: `codebenders-dashboard/package.json`
- Create: `codebenders-dashboard/vitest.config.ts`

- [ ] **Step 1: Install npm packages**

```bash
cd codebenders-dashboard
npm install papaparse xlsx
npm install -D @types/papaparse vitest
```

- [ ] **Step 2: Create Vitest config**

Create `codebenders-dashboard/vitest.config.ts`:

```typescript
import { defineConfig } from "vitest/config"
import path from "path"

export default defineConfig({
  test: {
    environment: "node",
    include: ["lib/__tests__/**/*.test.ts"],
  },
  resolve: {
    alias: {
      "@": path.resolve(__dirname),
    },
  },
})
```

- [ ] **Step 3: Add test script to package.json**

Add to the `"scripts"` section in `package.json`:

```json
"test": "vitest run",
"test:watch": "vitest"
```

- [ ] **Step 4: Verify vitest runs (no tests yet)**

```bash
cd codebenders-dashboard
npx vitest run
```

Expected: "No test files found". Note that Vitest exits non-zero when no test files exist; pass `--passWithNoTests` if a clean exit is needed at this step.
+ +- [ ] **Step 5: Commit** + +```bash +git add codebenders-dashboard/package.json codebenders-dashboard/package-lock.json codebenders-dashboard/vitest.config.ts +git commit -m "chore: add papaparse, xlsx, vitest dependencies for upload feature" +``` + +--- + +## Task 2: Schema Registry & Detection Logic + +**Files:** +- Create: `codebenders-dashboard/lib/upload-schemas.ts` +- Create: `codebenders-dashboard/lib/__tests__/upload-schemas.test.ts` + +- [ ] **Step 1: Write failing tests for header normalization and schema detection** + +Create `codebenders-dashboard/lib/__tests__/upload-schemas.test.ts`: + +```typescript +import { describe, it, expect } from "vitest" +import { + normalizeHeader, + detectSchema, + mapColumns, + SCHEMAS, +} from "../upload-schemas" + +describe("normalizeHeader", () => { + it("lowercases and replaces spaces with underscores", () => { + expect(normalizeHeader("Cohort Term")).toBe("cohort_term") + }) + + it("trims whitespace", () => { + expect(normalizeHeader(" Student_GUID ")).toBe("student_guid") + }) + + it("replaces hyphens with underscores", () => { + expect(normalizeHeader("Co-requisite Course")).toBe("co_requisite_course") + }) + + it("collapses multiple separators", () => { + expect(normalizeHeader("Some Weird--Header")).toBe("some_weird_header") + }) +}) + +describe("detectSchema", () => { + it("detects PDP cohort AR file from its headers", () => { + const headers = [ + "Student_GUID", "Cohort", "Cohort_Term", "Enrollment_Type", + "Retention", "Persistence", "GPA_Group_Year_1", + "Gateway_Math_Status", "Gateway_English_Status", + "Number_of_Credits_Attempted_Year_1", + ] + const result = detectSchema(headers) + expect(result.schema?.id).toBe("pdp_cohort_ar") + expect(result.confidence).toBeGreaterThan(0.6) + }) + + it("detects PDP cohort submission file from spaced headers", () => { + const headers = [ + "Student ID", "Cohort", "Cohort Term", "First Name", "Last Name", + "Date of Birth", "Enrollment Type", "Math Placement", + 
"English Placement", "Gateway Math Status", + ] + const result = detectSchema(headers) + expect(result.schema?.id).toBe("pdp_cohort_submission") + expect(result.confidence).toBeGreaterThan(0.6) + }) + + it("detects course AR file", () => { + const headers = [ + "Student_GUID", "Course_Prefix", "Course_Number", "Grade", + "Academic_Year", "Academic_Term", "Course_Name", + "Number_of_Credits_Attempted", "Number_of_Credits_Earned", + ] + const result = detectSchema(headers) + expect(result.schema?.id).toBe("course_ar") + expect(result.confidence).toBeGreaterThan(0.6) + }) + + it("detects course submission file", () => { + const headers = [ + "Student ID", "Course Prefix", "Course Number", "Grade", + "Course Name", "Course CIP", "Section ID", + "Semester/Session GPA", "Overall GPA", + ] + const result = detectSchema(headers) + expect(result.schema?.id).toBe("course_submission") + expect(result.confidence).toBeGreaterThan(0.6) + }) + + it("detects ML predictions file", () => { + const headers = [ + "student_guid", "prediction_type", "prediction_value", + "model_version", "confidence_score", + ] + const result = detectSchema(headers) + expect(result.schema?.id).toBe("ml_predictions") + expect(result.confidence).toBeGreaterThan(0.6) + }) + + it("returns null schema with low confidence for unknown headers", () => { + const headers = ["foo", "bar", "baz", "qux"] + const result = detectSchema(headers) + expect(result.schema).toBeNull() + expect(result.confidence).toBeLessThan(0.3) + }) + + it("handles mixed casing headers", () => { + const headers = [ + "STUDENT_GUID", "cohort", "COHORT_TERM", "enrollment_type", + "retention", "PERSISTENCE", "gpa_group_year_1", + ] + const result = detectSchema(headers) + expect(result.schema?.id).toBe("pdp_cohort_ar") + }) +}) + +describe("mapColumns", () => { + it("maps matched headers to canonical column names", () => { + const headers = ["Student_GUID", "Cohort_Term", "Unknown_Col"] + const schema = SCHEMAS.find((s) => s.id === 
"pdp_cohort_ar")! + const result = mapColumns(headers, schema) + + const matched = result.filter((c) => c.status === "matched") + const unmapped = result.filter((c) => c.status === "unmapped") + + expect(matched.length).toBe(2) + expect(matched[0].mappedTo).toBe("student_guid") + expect(unmapped.length).toBe(1) + expect(unmapped[0].header).toBe("Unknown_Col") + }) +}) +``` + +- [ ] **Step 2: Run tests to verify they fail** + +```bash +cd codebenders-dashboard +npx vitest run +``` + +Expected: FAIL — module `../upload-schemas` not found. + +- [ ] **Step 3: Implement the schema registry** + +Create `codebenders-dashboard/lib/upload-schemas.ts`: + +```typescript +// ── Types ──────────────────────────────────────────────────────────────────── + +export interface SchemaColumn { + name: string + aliases: string[] + type: "text" | "numeric" | "date" | "enum" + required: boolean + validValues?: string[] + transform?: (value: string) => string +} + +export interface UploadSchema { + id: string + label: string + targetTable: string + upsertKey: string[] + columns: SchemaColumn[] +} + +export interface DetectionResult { + schema: UploadSchema | null + confidence: number + scores: Array<{ schemaId: string; label: string; score: number }> +} + +export interface ColumnMapping { + header: string + mappedTo: string | null + status: "matched" | "unmapped" +} + +// ── Value Transforms ───────────────────────────────────────────────────────── + +const ENROLLMENT_TYPE_MAP: Record = { + F: "First-Time", + R: "Re-admit", + T: "Transfer-In", +} + +const RACE_MAP: Record = { + W: "White", + B: "Black or African American", + A: "Asian", + AN: "American Indian or Alaska Native", + IA: "American Indian or Alaska Native", + HP: "Native Hawaiian or Other Pacific Islander", + TM: "Two or More Races", + UK: "Unknown", +} + +const ETHNICITY_MAP: Record = { + H: "Hispanic", + N: "Not Hispanic", + UK: "Unknown", +} + +const GENDER_MAP: Record = { + M: "Male", + F: "Female", + P: "Non-binary", + X: 
"Other", + UK: "Unknown", +} + +function mapLookup(map: Record) { + return (value: string) => map[value.trim()] ?? value +} + +// ── Schema Definitions ─────────────────────────────────────────────────────── + +export const SCHEMAS: UploadSchema[] = [ + { + id: "pdp_cohort_ar", + label: "PDP Cohort AR File", + targetTable: "student_level_with_predictions", + upsertKey: ["student_guid"], + columns: [ + // Identity + { name: "student_guid", aliases: ["student_guid", "student_id"], type: "text", required: true }, + { name: "institution_id", aliases: ["institution_id"], type: "text", required: false }, + { name: "cohort", aliases: ["cohort"], type: "text", required: true }, + { name: "cohort_term", aliases: ["cohort_term"], type: "enum", required: true, validValues: ["Fall", "Winter", "Spring", "Summer"] }, + // Demographics + { name: "student_age", aliases: ["student_age"], type: "text", required: false }, + { name: "enrollment_type", aliases: ["enrollment_type"], type: "text", required: true }, + { name: "enrollment_intensity_first_term", aliases: ["enrollment_intensity_first_term"], type: "text", required: false }, + { name: "race", aliases: ["race"], type: "text", required: false }, + { name: "ethnicity", aliases: ["ethnicity"], type: "text", required: false }, + { name: "gender", aliases: ["gender"], type: "text", required: false }, + { name: "first_gen", aliases: ["first_gen"], type: "text", required: false }, + { name: "pell_status_first_year", aliases: ["pell_status_first_year"], type: "text", required: false }, + // Academic + { name: "math_placement", aliases: ["math_placement"], type: "text", required: false }, + { name: "english_placement", aliases: ["english_placement"], type: "text", required: false }, + { name: "reading_placement", aliases: ["reading_placement"], type: "text", required: false }, + { name: "gateway_math_status", aliases: ["gateway_math_status"], type: "text", required: false }, + { name: "gateway_english_status", aliases: 
["gateway_english_status"], type: "text", required: false }, + { name: "credential_type_sought_year_1", aliases: ["credential_type_sought_year_1"], type: "text", required: false }, + { name: "gpa_group_term_1", aliases: ["gpa_group_term_1"], type: "text", required: false }, + { name: "gpa_group_year_1", aliases: ["gpa_group_year_1"], type: "text", required: false }, + // Credits + { name: "number_of_credits_attempted_year_1", aliases: ["number_of_credits_attempted_year_1"], type: "numeric", required: false }, + { name: "number_of_credits_earned_year_1", aliases: ["number_of_credits_earned_year_1"], type: "numeric", required: false }, + { name: "number_of_credits_attempted_year_2", aliases: ["number_of_credits_attempted_year_2"], type: "numeric", required: false }, + { name: "number_of_credits_earned_year_2", aliases: ["number_of_credits_earned_year_2"], type: "numeric", required: false }, + // Outcomes (AR-specific — these discriminate AR from submission) + { name: "retention", aliases: ["retention"], type: "numeric", required: false }, + { name: "persistence", aliases: ["persistence"], type: "numeric", required: false }, + { name: "time_to_credential", aliases: ["time_to_credential"], type: "text", required: false }, + { name: "years_to_bachelors_at_cohort_inst_", aliases: ["years_to_bachelors_at_cohort_inst_"], type: "text", required: false }, + { name: "years_to_associates_or_certificate_at_cohort_inst_", aliases: ["years_to_associates_or_certificate_at_cohort_inst_"], type: "text", required: false }, + // Metadata + { name: "school", aliases: ["school"], type: "text", required: false }, + { name: "dataset_type", aliases: ["dataset_type"], type: "text", required: false }, + ], + }, + { + id: "pdp_cohort_submission", + label: "PDP Cohort Submission File", + targetTable: "student_level_with_predictions", + upsertKey: ["student_guid"], + columns: [ + // PII fields — discriminate submission from AR + { name: "student_guid", aliases: ["student_id", "student id"], 
type: "text", required: true }, + { name: "cohort", aliases: ["cohort"], type: "text", required: true }, + { name: "cohort_term", aliases: ["cohort_term", "cohort term"], type: "text", required: true }, + { name: "first_name", aliases: ["first_name", "first name"], type: "text", required: false }, + { name: "last_name", aliases: ["last_name", "last name"], type: "text", required: false }, + { name: "date_of_birth", aliases: ["date_of_birth", "date of birth"], type: "date", required: false }, + { name: "enrollment_type", aliases: ["enrollment_type", "enrollment type"], type: "text", required: true, transform: mapLookup(ENROLLMENT_TYPE_MAP) }, + { name: "race", aliases: ["race"], type: "text", required: false, transform: mapLookup(RACE_MAP) }, + { name: "ethnicity", aliases: ["ethnicity"], type: "text", required: false, transform: mapLookup(ETHNICITY_MAP) }, + { name: "gender", aliases: ["gender"], type: "text", required: false, transform: mapLookup(GENDER_MAP) }, + { name: "math_placement", aliases: ["math_placement", "math placement"], type: "text", required: false }, + { name: "english_placement", aliases: ["english_placement", "english placement"], type: "text", required: false }, + { name: "gateway_math_status", aliases: ["gateway_math_status", "gateway math status"], type: "text", required: false }, + { name: "gateway_english_status", aliases: ["gateway_english_status", "gateway english status"], type: "text", required: false }, + { name: "first_gen", aliases: ["first_gen", "first gen"], type: "text", required: false }, + { name: "dual_and_summer_enrollment", aliases: ["dual_and_summer_enrollment", "dual and summer enrollment"], type: "text", required: false }, + ], + }, + { + id: "course_ar", + label: "Course Enrollment AR File", + targetTable: "course_enrollments", + upsertKey: ["student_guid", "course_prefix", "course_number", "academic_term", "academic_year"], + columns: [ + { name: "student_guid", aliases: ["student_guid", "student_id"], type: "text", 
required: true }, + { name: "cohort", aliases: ["cohort"], type: "text", required: false }, + { name: "cohort_term", aliases: ["cohort_term"], type: "text", required: false }, + { name: "academic_year", aliases: ["academic_year"], type: "text", required: true }, + { name: "academic_term", aliases: ["academic_term"], type: "text", required: true }, + { name: "course_prefix", aliases: ["course_prefix"], type: "text", required: true }, + { name: "course_number", aliases: ["course_number"], type: "text", required: true }, + { name: "section_id", aliases: ["section_id"], type: "text", required: false }, + { name: "course_name", aliases: ["course_name"], type: "text", required: false }, + { name: "course_cip", aliases: ["course_cip"], type: "text", required: false }, + { name: "course_type", aliases: ["course_type"], type: "text", required: false }, + { name: "math_or_english_gateway", aliases: ["math_or_english_gateway"], type: "text", required: false }, + { name: "grade", aliases: ["grade"], type: "text", required: true }, + { name: "number_of_credits_attempted", aliases: ["number_of_credits_attempted"], type: "numeric", required: false }, + { name: "number_of_credits_earned", aliases: ["number_of_credits_earned"], type: "numeric", required: false }, + { name: "delivery_method", aliases: ["delivery_method"], type: "text", required: false }, + { name: "course_begin_date", aliases: ["course_begin_date"], type: "date", required: false }, + { name: "course_end_date", aliases: ["course_end_date"], type: "date", required: false }, + // Demographics (joined from cohort in AR files) + { name: "student_age", aliases: ["student_age"], type: "text", required: false }, + { name: "race", aliases: ["race"], type: "text", required: false }, + { name: "ethnicity", aliases: ["ethnicity"], type: "text", required: false }, + { name: "gender", aliases: ["gender"], type: "text", required: false }, + { name: "institution_id", aliases: ["institution_id"], type: "text", required: false }, + { 
name: "school", aliases: ["school"], type: "text", required: false }, + ], + }, + { + id: "course_submission", + label: "Course Enrollment Submission File", + targetTable: "course_enrollments", + upsertKey: ["student_guid", "course_prefix", "course_number", "academic_term", "academic_year"], + columns: [ + { name: "student_guid", aliases: ["student_id", "student id"], type: "text", required: true }, + { name: "cohort", aliases: ["cohort"], type: "text", required: false }, + { name: "cohort_term", aliases: ["cohort_term", "cohort term"], type: "text", required: false }, + { name: "academic_year", aliases: ["academic_year", "academic year"], type: "text", required: true }, + { name: "academic_term", aliases: ["term", "academic_term", "academic term"], type: "text", required: true }, + { name: "course_prefix", aliases: ["course_prefix", "course prefix"], type: "text", required: true }, + { name: "course_number", aliases: ["course_number", "course number"], type: "text", required: true }, + { name: "section_id", aliases: ["section_id", "section id"], type: "text", required: false }, + { name: "course_name", aliases: ["course_name", "course name"], type: "text", required: false }, + { name: "course_cip", aliases: ["course_cip", "course cip"], type: "text", required: false }, + { name: "course_type", aliases: ["course_type", "course type"], type: "text", required: false }, + { name: "grade", aliases: ["grade"], type: "text", required: true }, + { name: "number_of_credits_attempted", aliases: ["number_of_credits_attempted", "number of credits attempted"], type: "numeric", required: false }, + { name: "number_of_credits_earned", aliases: ["number_of_credits_earned", "number of credits earned"], type: "numeric", required: false }, + // PII fields — discriminate submission from AR + { name: "first_name", aliases: ["first_name", "first name"], type: "text", required: false }, + { name: "last_name", aliases: ["last_name", "last name"], type: "text", required: false }, + { 
name: "date_of_birth", aliases: ["date_of_birth", "date of birth"], type: "date", required: false }, + { name: "semester_session_gpa", aliases: ["semester_session_gpa", "semester/session gpa"], type: "numeric", required: false }, + { name: "overall_gpa", aliases: ["overall_gpa", "overall gpa"], type: "numeric", required: false }, + ], + }, + { + id: "ml_predictions", + label: "ML Predictions", + targetTable: "student_level_with_predictions", + upsertKey: ["student_guid"], + columns: [ + { name: "student_guid", aliases: ["student_guid", "student_id"], type: "text", required: true }, + { name: "prediction_type", aliases: ["prediction_type"], type: "text", required: true }, + { name: "prediction_value", aliases: ["prediction_value"], type: "text", required: true }, + { name: "model_version", aliases: ["model_version"], type: "text", required: false }, + { name: "confidence_score", aliases: ["confidence_score", "confidence"], type: "numeric", required: false }, + ], + }, +] + +// ── Header Normalization ───────────────────────────────────────────────────── + +export function normalizeHeader(header: string): string { + return header + .trim() + .toLowerCase() + .replace(/[/]/g, "_") + .replace(/[\s\-]+/g, "_") +} + +// ── Schema Detection ───────────────────────────────────────────────────────── + +export function detectSchema(headers: string[]): DetectionResult { + const normalized = headers.map(normalizeHeader) + + const scores = SCHEMAS.map((schema) => { + const schemaNames = new Set( + schema.columns.flatMap((col) => [col.name, ...col.aliases.map(normalizeHeader)]) + ) + const matched = normalized.filter((h) => schemaNames.has(h)).length + const score = schema.columns.length > 0 ? 
matched / schema.columns.length : 0 + return { schemaId: schema.id, label: schema.label, score } + }) + + scores.sort((a, b) => b.score - a.score) + const best = scores[0] + + if (best.score >= 0.6) { + return { + schema: SCHEMAS.find((s) => s.id === best.schemaId)!, + confidence: best.score, + scores, + } + } + + if (best.score >= 0.3) { + return { schema: SCHEMAS.find((s) => s.id === best.schemaId)!, confidence: best.score, scores } + } + + return { schema: null, confidence: best.score, scores } +} + +// ── Column Mapping ─────────────────────────────────────────────────────────── + +export function mapColumns(headers: string[], schema: UploadSchema): ColumnMapping[] { + return headers.map((header) => { + const norm = normalizeHeader(header) + const col = schema.columns.find( + (c) => c.name === norm || c.aliases.some((a) => normalizeHeader(a) === norm) + ) + return col + ? { header, mappedTo: col.name, status: "matched" as const } + : { header, mappedTo: null, status: "unmapped" as const } + }) +} +``` + +- [ ] **Step 4: Run tests to verify they pass** + +```bash +cd codebenders-dashboard +npx vitest run +``` + +Expected: All tests PASS. 
+ +- [ ] **Step 5: Commit** + +```bash +git add codebenders-dashboard/lib/upload-schemas.ts codebenders-dashboard/lib/__tests__/upload-schemas.test.ts +git commit -m "feat(upload): schema registry with auto-detection and column mapping" +``` + +--- + +## Task 3: File Parser Utility + +**Files:** +- Create: `codebenders-dashboard/lib/upload-parser.ts` +- Create: `codebenders-dashboard/lib/__tests__/upload-parser.test.ts` + +- [ ] **Step 1: Write failing tests for CSV parsing** + +Create `codebenders-dashboard/lib/__tests__/upload-parser.test.ts`: + +```typescript +import { describe, it, expect } from "vitest" +import { parseFileBuffer, validateFileSize, getFileType } from "../upload-parser" + +describe("getFileType", () => { + it("returns csv for .csv files", () => { + expect(getFileType("data.csv")).toBe("csv") + }) + + it("returns xlsx for .xlsx files", () => { + expect(getFileType("data.xlsx")).toBe("xlsx") + }) + + it("returns null for unsupported extensions", () => { + expect(getFileType("data.pdf")).toBeNull() + expect(getFileType("data.json")).toBeNull() + }) + + it("handles uppercase extensions", () => { + expect(getFileType("DATA.CSV")).toBe("csv") + expect(getFileType("DATA.XLSX")).toBe("xlsx") + }) +}) + +describe("validateFileSize", () => { + it("returns true for files under 50MB", () => { + expect(validateFileSize(1024 * 1024)).toBe(true) // 1MB + expect(validateFileSize(50 * 1024 * 1024 - 1)).toBe(true) // just under 50MB + }) + + it("returns false for files at or over 50MB", () => { + expect(validateFileSize(50 * 1024 * 1024)).toBe(false) + expect(validateFileSize(100 * 1024 * 1024)).toBe(false) + }) +}) + +describe("parseFileBuffer", () => { + it("parses CSV content into headers and rows", async () => { + const csv = "Name,Age,City\nAlice,30,Mobile\nBob,25,Birmingham\n" + const buffer = Buffer.from(csv) + const result = await parseFileBuffer(buffer, "csv") + + expect(result.headers).toEqual(["Name", "Age", "City"]) + 
expect(result.rows).toHaveLength(2) + expect(result.rows[0]).toEqual({ Name: "Alice", Age: "30", City: "Mobile" }) + expect(result.rows[1]).toEqual({ Name: "Bob", Age: "25", City: "Birmingham" }) + }) + + it("handles CSV with quoted fields containing commas", async () => { + const csv = 'Name,Description\nAlice,"Has, commas"\n' + const buffer = Buffer.from(csv) + const result = await parseFileBuffer(buffer, "csv") + + expect(result.rows[0].Description).toBe("Has, commas") + }) + + it("respects maxRows parameter", async () => { + const csv = "X\na\nb\nc\nd\ne\n" + const buffer = Buffer.from(csv) + const result = await parseFileBuffer(buffer, "csv", 3) + + expect(result.rows).toHaveLength(3) + expect(result.totalRows).toBe(5) + }) + + it("trims whitespace from headers", async () => { + const csv = " Name , Age \nAlice,30\n" + const buffer = Buffer.from(csv) + const result = await parseFileBuffer(buffer, "csv") + + expect(result.headers).toEqual(["Name", "Age"]) + }) +}) +``` + +- [ ] **Step 2: Run tests to verify they fail** + +```bash +cd codebenders-dashboard +npx vitest run +``` + +Expected: FAIL — module `../upload-parser` not found. 
- [ ] **Step 3: Implement the file parser**

Create `codebenders-dashboard/lib/upload-parser.ts`:

```typescript
import Papa from "papaparse"
import * as XLSX from "xlsx"

const MAX_FILE_SIZE = 50 * 1024 * 1024 // 50 MB

export interface ParseResult {
  headers: string[]
  rows: Record<string, string>[]
  totalRows: number
}

export function getFileType(filename: string): "csv" | "xlsx" | null {
  const ext = filename.toLowerCase().split(".").pop()
  if (ext === "csv") return "csv"
  if (ext === "xlsx" || ext === "xls") return "xlsx"
  return null
}

export function validateFileSize(bytes: number): boolean {
  return bytes < MAX_FILE_SIZE
}

export async function parseFileBuffer(
  buffer: Buffer,
  fileType: "csv" | "xlsx",
  maxRows?: number
): Promise<ParseResult> {
  if (fileType === "xlsx") {
    return parseXlsx(buffer, maxRows)
  }
  return parseCsv(buffer, maxRows)
}

function parseCsv(buffer: Buffer, maxRows?: number): Promise<ParseResult> {
  return new Promise<ParseResult>((resolve, reject) => {
    const text = buffer.toString("utf-8")

    const result = Papa.parse<Record<string, string>>(text, {
      header: true,
      skipEmptyLines: true,
      transformHeader: (h: string) => h.trim(),
    })

    if (result.errors.length > 0 && result.data.length === 0) {
      reject(new Error(`CSV parse error: ${result.errors[0].message}`))
      return
    }

    const headers = result.meta.fields ?? []
    const allRows = result.data
    const rows = maxRows ?
allRows.slice(0, maxRows) : allRows

    resolve({ headers, rows, totalRows: allRows.length })
  })
}

function parseXlsx(buffer: Buffer, maxRows?: number): Promise<ParseResult> {
  return new Promise<ParseResult>((resolve, reject) => {
    try {
      const workbook = XLSX.read(buffer, { type: "buffer" })
      const sheetName = workbook.SheetNames[0]
      if (!sheetName) {
        reject(new Error("Excel file has no sheets"))
        return
      }

      const sheet = workbook.Sheets[sheetName]
      const jsonData = XLSX.utils.sheet_to_json<Record<string, string>>(sheet, {
        defval: "",
        raw: false,
      })

      if (jsonData.length === 0) {
        resolve({ headers: [], rows: [], totalRows: 0 })
        return
      }

      const headers = Object.keys(jsonData[0]).map((h) => h.trim())
      const rows = maxRows ? jsonData.slice(0, maxRows) : jsonData

      resolve({ headers, rows, totalRows: jsonData.length })
    } catch (err) {
      reject(new Error(`Excel parse error: ${(err as Error).message}`))
    }
  })
}
```

- [ ] **Step 4: Run tests to verify they pass**

```bash
cd codebenders-dashboard
npx vitest run
```

Expected: All tests PASS.
+ +- [ ] **Step 5: Commit** + +```bash +git add codebenders-dashboard/lib/upload-parser.ts codebenders-dashboard/lib/__tests__/upload-parser.test.ts +git commit -m "feat(upload): CSV and XLSX file parser with size validation" +``` + +--- + +## Task 4: Auth, Routing & Database Migration + +**Files:** +- Modify: `codebenders-dashboard/lib/roles.ts:6-14` +- Create: `codebenders-dashboard/app/admin/layout.tsx` + +- [ ] **Step 1: Add admin routes to ROUTE_PERMISSIONS** + +In `codebenders-dashboard/lib/roles.ts`, add two entries to the `ROUTE_PERMISSIONS` array, after the existing entries (before the closing `]`): + +```typescript + { prefix: "/admin", roles: ["admin", "ir"] }, + { prefix: "/api/admin", roles: ["admin", "ir"] }, +``` + +- [ ] **Step 2: Create the upload_history table in Supabase** + +Run this SQL against the Supabase database (via the Supabase dashboard SQL editor or `psql`): + +```sql +CREATE TABLE IF NOT EXISTS public.upload_history ( + id BIGSERIAL PRIMARY KEY, + user_id UUID NOT NULL, + user_email TEXT NOT NULL, + filename TEXT NOT NULL, + file_type TEXT NOT NULL, + rows_inserted INT DEFAULT 0, + rows_skipped INT DEFAULT 0, + error_count INT DEFAULT 0, + status TEXT CHECK (status IN ('success', 'partial', 'failed')) NOT NULL, + uploaded_at TIMESTAMPTZ NOT NULL DEFAULT now() +); + +CREATE INDEX IF NOT EXISTS idx_upload_history_uploaded_at + ON public.upload_history (uploaded_at DESC); +``` + +Verify: `SELECT * FROM upload_history LIMIT 1;` should return 0 rows, no error. + +- [ ] **Step 3: Create admin layout** + +Create `codebenders-dashboard/app/admin/layout.tsx`: + +```tsx +export default function AdminLayout({ children }: { children: React.ReactNode }) { + return <>{children} +} +``` + +This is a passthrough layout. The admin pages inherit the root layout's NavHeader. We keep this file so the `/admin` route segment exists for future admin pages. 
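The gate added in Step 1 is enforced by the project's existing middleware (not shown in this plan). Its intended semantics can be sketched with a hypothetical `canAccess` helper over the same `{ prefix, roles }` shape — the helper name and the `staff` role below are invented for illustration, not actual `roles.ts` exports:

```typescript
// Hypothetical check mirroring ROUTE_PERMISSIONS semantics: a route is gated
// by the first entry whose prefix matches the pathname; ungated routes are
// open to any authenticated role.
type Role = "admin" | "ir" | "staff"

const ROUTE_PERMISSIONS: Array<{ prefix: string; roles: Role[] }> = [
  { prefix: "/admin", roles: ["admin", "ir"] },
  { prefix: "/api/admin", roles: ["admin", "ir"] },
]

function canAccess(pathname: string, role: Role): boolean {
  const rule = ROUTE_PERMISSIONS.find((r) => pathname.startsWith(r.prefix))
  return rule ? rule.roles.includes(role) : true
}
```

Under this reading, `canAccess("/admin/upload", "staff")` is `false` while `ir` and `admin` pass, and routes with no matching prefix stay open.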
+ +- [ ] **Step 4: Verify build passes** + +```bash +cd codebenders-dashboard +npm run build +``` + +Expected: Build succeeds. The middleware now blocks non-admin/ir users from `/admin` paths. + +- [ ] **Step 5: Commit** + +```bash +git add codebenders-dashboard/lib/roles.ts codebenders-dashboard/app/admin/layout.tsx +git commit -m "feat(upload): add admin/ir role gate and upload_history migration" +``` + +--- + +## Task 5: API Route — Preview + +**Files:** +- Create: `codebenders-dashboard/app/api/admin/upload/preview/route.ts` + +- [ ] **Step 1: Create the preview API route** + +Create `codebenders-dashboard/app/api/admin/upload/preview/route.ts`: + +```typescript +import { NextRequest, NextResponse } from "next/server" +import { parseFileBuffer, getFileType, validateFileSize } from "@/lib/upload-parser" +import { detectSchema, mapColumns } from "@/lib/upload-schemas" + +export async function POST(request: NextRequest) { + try { + const formData = await request.formData() + const file = formData.get("file") as File | null + + if (!file) { + return NextResponse.json({ error: "No file provided" }, { status: 400 }) + } + + if (!validateFileSize(file.size)) { + return NextResponse.json( + { error: "File exceeds 50 MB limit" }, + { status: 413 } + ) + } + + const fileType = getFileType(file.name) + if (!fileType) { + return NextResponse.json( + { error: "Unsupported file type. Please upload a .csv or .xlsx file." }, + { status: 400 } + ) + } + + const buffer = Buffer.from(await file.arrayBuffer()) + const { headers, rows, totalRows } = await parseFileBuffer(buffer, fileType, 50) + + const detection = detectSchema(headers) + + const columns = detection.schema + ? mapColumns(headers, detection.schema) + : headers.map((h) => ({ header: h, mappedTo: null, status: "unmapped" as const })) + + const missingRequired = detection.schema + ? 
detection.schema.columns + .filter((c) => c.required) + .filter((c) => !columns.some((col) => col.mappedTo === c.name)) + .map((c) => `Missing required column: ${c.name}`) + : [] + + const warnings = detection.schema + ? detection.schema.columns + .filter((c) => !c.required) + .filter((c) => !columns.some((col) => col.mappedTo === c.name)) + .slice(0, 5) + .map((c) => `Missing optional column: ${c.name}`) + : [] + + return NextResponse.json({ + detectedSchema: detection.schema?.id ?? null, + detectedSchemaLabel: detection.schema?.label ?? null, + confidence: Math.round(detection.confidence * 100) / 100, + scores: detection.scores, + columns, + sampleRows: rows.slice(0, 10), + totalRows, + warnings, + errors: missingRequired, + }) + } catch (err) { + console.error("Upload preview error:", err) + return NextResponse.json( + { error: `Failed to parse file: ${(err as Error).message}` }, + { status: 500 } + ) + } +} +``` + +- [ ] **Step 2: Verify build passes** + +```bash +cd codebenders-dashboard +npm run build +``` + +Expected: Build succeeds with new route listed. 
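The required/optional column checks in the route above are pure logic and straightforward to unit-test in isolation. A standalone sketch (types inlined here; `deriveIssues` is an illustrative name, not an export of the route):

```typescript
interface SchemaColumn {
  name: string
  required: boolean
}

interface MappedColumn {
  header: string
  mappedTo: string | null
}

// Mirrors the route: required schema columns with no mapped header become errors,
// unmapped optional columns become warnings (capped at 5, as in the route).
function deriveIssues(schemaCols: SchemaColumn[], columns: MappedColumn[]) {
  const mapped = new Set(columns.map((c) => c.mappedTo))
  const errors = schemaCols
    .filter((c) => c.required && !mapped.has(c.name))
    .map((c) => `Missing required column: ${c.name}`)
  const warnings = schemaCols
    .filter((c) => !c.required && !mapped.has(c.name))
    .slice(0, 5)
    .map((c) => `Missing optional column: ${c.name}`)
  return { errors, warnings }
}
```

This keeps the messages byte-identical to the route, so a Vitest spec can assert on them directly.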

- [ ] **Step 3: Commit**

```bash
git add codebenders-dashboard/app/api/admin/upload/preview/route.ts
git commit -m "feat(upload): preview API route with schema detection"
```

---

## Task 6: API Routes — Commit & History

**Files:**
- Create: `codebenders-dashboard/app/api/admin/upload/commit/route.ts`
- Create: `codebenders-dashboard/app/api/admin/upload/history/route.ts`

- [ ] **Step 1: Create the commit API route**

Create `codebenders-dashboard/app/api/admin/upload/commit/route.ts`:

```typescript
import { NextRequest, NextResponse } from "next/server"
import { parseFileBuffer, getFileType, validateFileSize } from "@/lib/upload-parser"
import { SCHEMAS, type UploadSchema, type ColumnMapping } from "@/lib/upload-schemas"
import { getPool } from "@/lib/db"

const BATCH_SIZE = 500

export async function POST(request: NextRequest) {
  const userId = request.headers.get("x-user-id") ?? ""
  const userEmail = request.headers.get("x-user-email") ??
"" + + try { + const formData = await request.formData() + const file = formData.get("file") as File | null + const schemaId = formData.get("schemaId") as string | null + const mappingJson = formData.get("columnMapping") as string | null + + if (!file || !schemaId || !mappingJson) { + return NextResponse.json( + { error: "Missing file, schemaId, or columnMapping" }, + { status: 400 } + ) + } + + if (!validateFileSize(file.size)) { + return NextResponse.json({ error: "File exceeds 50 MB limit" }, { status: 413 }) + } + + const schema = SCHEMAS.find((s) => s.id === schemaId) + if (!schema) { + return NextResponse.json({ error: `Unknown schema: ${schemaId}` }, { status: 400 }) + } + + const columnMapping: ColumnMapping[] = JSON.parse(mappingJson) + + const fileType = getFileType(file.name) + if (!fileType) { + return NextResponse.json({ error: "Unsupported file type" }, { status: 400 }) + } + + const buffer = Buffer.from(await file.arrayBuffer()) + const { rows } = await parseFileBuffer(buffer, fileType) + + const result = await upsertRows(rows, columnMapping, schema) + + // Log to upload_history + const pool = getPool() + const status = + result.errors.length > 0 && result.inserted === 0 + ? "failed" + : result.errors.length > 0 + ? 
"partial" + : "success" + + const { rows: historyRows } = await pool.query( + `INSERT INTO upload_history (user_id, user_email, filename, file_type, rows_inserted, rows_skipped, error_count, status) + VALUES ($1, $2, $3, $4, $5, $6, $7, $8) RETURNING id`, + [userId, userEmail, file.name, schemaId, result.inserted, result.skipped, result.errors.length, status] + ) + + return NextResponse.json({ + inserted: result.inserted, + skipped: result.skipped, + errors: result.errors.slice(0, 50), + uploadId: historyRows[0].id, + }) + } catch (err) { + console.error("Upload commit error:", err) + return NextResponse.json( + { error: `Upload failed: ${(err as Error).message}` }, + { status: 500 } + ) + } +} + +interface UpsertResult { + inserted: number + skipped: number + errors: Array<{ row: number; column?: string; message: string }> +} + +async function upsertRows( + rows: Record[], + columnMapping: ColumnMapping[], + schema: UploadSchema +): Promise { + const pool = getPool() + let inserted = 0 + let skipped = 0 + const errors: UpsertResult["errors"] = [] + + // Build the header→dbColumn map from the user-confirmed mapping + const headerToDb = new Map() + for (const col of columnMapping) { + if (col.mappedTo) { + headerToDb.set(col.header, col.mappedTo) + } + } + + // Build transform lookup from schema + const transforms = new Map string>() + for (const col of schema.columns) { + if (col.transform) { + transforms.set(col.name, col.transform) + } + } + + // Check required columns are mapped + const mappedDbCols = new Set(headerToDb.values()) + for (const col of schema.columns) { + if (col.required && !mappedDbCols.has(col.name)) { + errors.push({ row: 0, column: col.name, message: `Required column not mapped: ${col.name}` }) + } + } + if (errors.length > 0) return { inserted: 0, skipped: 0, errors } + + // Get the actual DB columns for the target table to filter to valid columns only + const dbColNames = Array.from(mappedDbCols) + + // Process in batches + for (let i = 0; i 
< rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE)

    for (let j = 0; j < batch.length; j++) {
      const row = batch[j]
      const rowIndex = i + j + 1 // 1-based for user display

      try {
        const dbRow: Record<string, string> = {}
        for (const [header, dbCol] of headerToDb) {
          let value = row[header] ?? ""
          const transform = transforms.get(dbCol)
          if (transform) value = transform(value)
          dbRow[dbCol] = value
        }

        // Check required fields have values
        const missingRequired = schema.columns
          .filter((c) => c.required && (!dbRow[c.name] || dbRow[c.name].trim() === ""))
        if (missingRequired.length > 0) {
          skipped++
          errors.push({
            row: rowIndex,
            column: missingRequired[0].name,
            message: `Empty required field: ${missingRequired[0].name}`,
          })
          continue
        }

        const cols = Object.keys(dbRow)
        const vals = Object.values(dbRow)
        const placeholders = cols.map((_, idx) => `$${idx + 1}`)
        const updateSet = cols
          .filter((c) => !schema.upsertKey.includes(c))
          .map((c) => `${c} = EXCLUDED.${c}`)
          .join(", ")

        const conflictClause = schema.upsertKey.join(", ")
        const sql = updateSet
          ?
`INSERT INTO ${schema.targetTable} (${cols.join(", ")}) + VALUES (${placeholders.join(", ")}) + ON CONFLICT (${conflictClause}) DO UPDATE SET ${updateSet}` + : `INSERT INTO ${schema.targetTable} (${cols.join(", ")}) + VALUES (${placeholders.join(", ")}) + ON CONFLICT (${conflictClause}) DO NOTHING` + + await pool.query(sql, vals) + inserted++ + } catch (err) { + skipped++ + errors.push({ + row: rowIndex, + message: (err as Error).message, + }) + } + } + } + + return { inserted, skipped, errors } +} +``` + +- [ ] **Step 2: Create the history API route** + +Create `codebenders-dashboard/app/api/admin/upload/history/route.ts`: + +```typescript +import { NextRequest, NextResponse } from "next/server" +import { getPool } from "@/lib/db" + +export async function GET(request: NextRequest) { + try { + const { searchParams } = new URL(request.url) + const page = Math.max(1, parseInt(searchParams.get("page") ?? "1")) + const pageSize = Math.min(50, Math.max(1, parseInt(searchParams.get("pageSize") ?? 
"20"))) + const offset = (page - 1) * pageSize + + const pool = getPool() + + const [dataResult, countResult] = await Promise.all([ + pool.query( + `SELECT id, user_email, filename, file_type, rows_inserted, rows_skipped, + error_count, status, uploaded_at + FROM upload_history + ORDER BY uploaded_at DESC + LIMIT $1 OFFSET $2`, + [pageSize, offset] + ), + pool.query(`SELECT COUNT(*)::int AS total FROM upload_history`), + ]) + + const total = countResult.rows[0].total + + return NextResponse.json({ + data: dataResult.rows.map((row) => ({ + id: row.id, + userEmail: row.user_email, + filename: row.filename, + fileType: row.file_type, + rowsInserted: row.rows_inserted, + rowsSkipped: row.rows_skipped, + errorCount: row.error_count, + status: row.status, + uploadedAt: row.uploaded_at, + })), + total, + page, + pageSize, + }) + } catch (err) { + console.error("Upload history error:", err) + return NextResponse.json( + { error: `Failed to fetch upload history: ${(err as Error).message}` }, + { status: 500 } + ) + } +} +``` + +- [ ] **Step 3: Verify build passes** + +```bash +cd codebenders-dashboard +npm run build +``` + +Expected: Build succeeds with all three API routes listed. 

- [ ] **Step 4: Commit**

```bash
git add codebenders-dashboard/app/api/admin/upload/commit/route.ts codebenders-dashboard/app/api/admin/upload/history/route.ts
git commit -m "feat(upload): commit and history API routes with batch upsert"
```

---

## Task 7: UI Components — Drop Zone, Column Mapper, Data Preview, Upload Summary

**Files:**
- Create: `codebenders-dashboard/components/upload/drop-zone.tsx`
- Create: `codebenders-dashboard/components/upload/column-mapper.tsx`
- Create: `codebenders-dashboard/components/upload/data-preview.tsx`
- Create: `codebenders-dashboard/components/upload/upload-summary.tsx`

- [ ] **Step 1: Create the DropZone component**

Create `codebenders-dashboard/components/upload/drop-zone.tsx`:

```tsx
"use client"

import { useCallback, useState, useRef } from "react"
import { Upload } from "lucide-react"
import { Button } from "@/components/ui/button"

interface DropZoneProps {
  onFile: (file: File) => void
  disabled?: boolean
}

export function DropZone({ onFile, disabled }: DropZoneProps) {
  const [dragging, setDragging] = useState(false)
  const inputRef = useRef<HTMLInputElement>(null)

  const handleDrop = useCallback(
    (e: React.DragEvent) => {
      e.preventDefault()
      setDragging(false)
      if (disabled) return
      const file = e.dataTransfer.files[0]
      if (file) onFile(file)
    },
    [onFile, disabled]
  )

  const handleDragOver = useCallback((e: React.DragEvent) => {
    e.preventDefault()
    setDragging(true)
  }, [])

  const handleDragLeave = useCallback(() => setDragging(false), [])

  const handleInputChange = useCallback(
    (e: React.ChangeEvent<HTMLInputElement>) => {
      const file = e.target.files?.[0]
      if (file) onFile(file)
      e.target.value = ""
    },
    [onFile]
  )

  return (
    <div
      onDrop={handleDrop}
      onDragOver={handleDragOver}
      onDragLeave={handleDragLeave}
      onClick={() => inputRef.current?.click()}
      className={`flex cursor-pointer flex-col items-center justify-center rounded-lg border-2 border-dashed p-12 text-center transition-colors ${
        dragging ? "border-primary bg-primary/5" : "border-muted-foreground/25"
      } ${disabled ? "pointer-events-none opacity-50" : ""}`}
    >
      <Upload className="mb-3 h-10 w-10 text-muted-foreground" />
      <p className="text-base font-medium">Drag &amp; drop your file here</p>
      <p className="mb-4 text-sm text-muted-foreground">.csv or .xlsx up to 50 MB</p>
      <Button type="button" variant="outline" disabled={disabled}>
        Browse Files
      </Button>
      <input
        ref={inputRef}
        type="file"
        accept=".csv,.xlsx"
        className="hidden"
        onChange={handleInputChange}
        disabled={disabled}
      />
    </div>
  )
}
```

- [ ] **Step 2: Create the ColumnMapper component**

Create `codebenders-dashboard/components/upload/column-mapper.tsx`:

```tsx
"use client"

import { useState } from "react"
import type { ColumnMapping, UploadSchema } from "@/lib/upload-schemas"

interface ColumnMapperProps {
  columns: ColumnMapping[]
  schema: UploadSchema | null
  onMappingChange: (columns: ColumnMapping[]) => void
}

export function ColumnMapper({ columns, schema, onMappingChange }: ColumnMapperProps) {
  const [showAll, setShowAll] = useState(false)

  const matched = columns.filter((c) => c.status === "matched")
  const unmapped = columns.filter((c) => c.status === "unmapped")

  const availableTargets = schema
    ? schema.columns
        .map((c) => c.name)
        .filter((name) => !columns.some((col) => col.mappedTo === name))
    : []

  function handleRemap(header: string, newTarget: string | null) {
    const updated = columns.map((col) =>
      col.header === header
        ? { ...col, mappedTo: newTarget, status: (newTarget ? "matched" : "unmapped") as "matched" | "unmapped" }
        : col
    )
    onMappingChange(updated)
  }

  return (
    <div className="divide-y rounded-lg border text-sm">
      {/* Header */}
      <div className="grid grid-cols-3 gap-2 bg-muted/50 px-4 py-2 font-medium">
        <span>File Column</span>
        <span>Maps To</span>
        <span>Status</span>
      </div>

      {/* Unmapped columns (always shown) */}
      {unmapped.map((col) => (
        <div key={col.header} className="grid grid-cols-3 items-center gap-2 px-4 py-2">
          <span className="truncate">{col.header}</span>
          <select
            className="rounded border px-2 py-1"
            value={col.mappedTo ?? ""}
            onChange={(e) => handleRemap(col.header, e.target.value || null)}
          >
            <option value="">— ignore —</option>
            {availableTargets.map((t) => (
              <option key={t} value={t}>
                {t}
              </option>
            ))}
          </select>
          <span className="text-amber-600">unmapped</span>
        </div>
      ))}

      {/* Matched columns (collapsed by default) */}
      {matched.length > 0 && !showAll && (
        <button
          type="button"
          className="w-full px-4 py-2 text-left text-muted-foreground hover:bg-muted/50"
          onClick={() => setShowAll(true)}
        >
          Show {matched.length} matched columns
        </button>
      )}

      {showAll &&
        matched.map((col) => (
          <div key={col.header} className="grid grid-cols-3 items-center gap-2 px-4 py-2">
            <span className="truncate">{col.header}</span>
            <span className="truncate text-muted-foreground">{col.mappedTo}</span>
            <span className="text-green-600">matched</span>
          </div>
        ))}

      {showAll && matched.length > 0 && (
        <button
          type="button"
          className="w-full px-4 py-2 text-left text-muted-foreground hover:bg-muted/50"
          onClick={() => setShowAll(false)}
        >
          Hide matched columns
        </button>
      )}
    </div>
  )
}
```

- [ ] **Step 3: Create the DataPreview component**

Create `codebenders-dashboard/components/upload/data-preview.tsx`:

```tsx
"use client"

interface DataPreviewProps {
  headers: string[]
  rows: Record<string, string>[]
}

export function DataPreview({ headers, rows }: DataPreviewProps) {
  if (rows.length === 0) return null

  // Show at most 8 columns to prevent horizontal overflow; user can scroll
  const displayHeaders = headers.slice(0, 8)
  const hasMore = headers.length > 8

  return (
    <div className="rounded-lg border">
      <div className="border-b px-4 py-2">
        <h3 className="text-sm font-semibold">Data Preview</h3>
      </div>
      <div className="overflow-x-auto">
        <table className="w-full text-sm">
          <thead>
            <tr className="border-b bg-muted/50 text-left">
              {displayHeaders.map((h) => (
                <th key={h} className="whitespace-nowrap px-3 py-2 font-medium">
                  {h}
                </th>
              ))}
              {hasMore && (
                <th className="px-3 py-2 font-medium text-muted-foreground">
                  +{headers.length - 8} more
                </th>
              )}
            </tr>
          </thead>
          <tbody>
            {rows.map((row, i) => (
              <tr key={i} className="border-b last:border-0">
                {displayHeaders.map((h) => (
                  <td key={h} className="whitespace-nowrap px-3 py-2">
                    {row[h] ?? ""}
                  </td>
                ))}
                {hasMore && <td className="px-3 py-2 text-muted-foreground">…</td>}
              </tr>
            ))}
          </tbody>
        </table>
      </div>
    </div>
  )
}
```

- [ ] **Step 4: Create the UploadSummary component**

Create `codebenders-dashboard/components/upload/upload-summary.tsx`:

```tsx
"use client"

import { Button } from "@/components/ui/button"
import { CheckCircle } from "lucide-react"

interface UploadSummaryProps {
  filename: string
  schemaLabel: string
  inserted: number
  skipped: number
  errorCount: number
  onUploadAnother: () => void
  onViewHistory: () => void
}

export function UploadSummary({
  filename,
  schemaLabel,
  inserted,
  skipped,
  errorCount,
  onUploadAnother,
  onViewHistory,
}: UploadSummaryProps) {
  return (
    <div className="mx-auto max-w-md rounded-lg border p-8 text-center">
      <CheckCircle className="mx-auto mb-4 h-12 w-12 text-green-500" />
      <h2 className="text-xl font-semibold">Upload Complete</h2>
      <p className="mt-2 text-sm text-muted-foreground">
        {filename} — {schemaLabel}
      </p>
      <div className="mt-6 grid grid-cols-3 gap-4">
        <div>
          <div className="text-2xl font-bold text-green-600">{inserted}</div>
          <div className="text-xs text-muted-foreground">inserted</div>
        </div>
        <div>
          <div className="text-2xl font-bold text-amber-600">{skipped}</div>
          <div className="text-xs text-muted-foreground">skipped</div>
        </div>
        <div>
          <div className="text-2xl font-bold text-red-600">{errorCount}</div>
          <div className="text-xs text-muted-foreground">errors</div>
        </div>
      </div>
      <div className="mt-8 flex justify-center gap-3">
        <Button variant="outline" onClick={onUploadAnother}>
          Upload Another
        </Button>
        <Button onClick={onViewHistory}>View History</Button>
      </div>
    </div>
+ ) +} +``` + +- [ ] **Step 5: Verify build passes** + +```bash +cd codebenders-dashboard +npm run build +``` + +Expected: Build succeeds (components are not yet imported by pages, but should compile). + +- [ ] **Step 6: Commit** + +```bash +git add codebenders-dashboard/components/upload/ +git commit -m "feat(upload): drop-zone, column-mapper, data-preview, upload-summary components" +``` + +--- + +## Task 8: Upload Wizard Page + +**Files:** +- Create: `codebenders-dashboard/app/admin/upload/page.tsx` + +- [ ] **Step 1: Create the upload wizard page** + +Create `codebenders-dashboard/app/admin/upload/page.tsx`: + +```tsx +"use client" + +import { useState, useCallback, useEffect } from "react" +import { useRouter } from "next/navigation" +import { DropZone } from "@/components/upload/drop-zone" +import { ColumnMapper } from "@/components/upload/column-mapper" +import { DataPreview } from "@/components/upload/data-preview" +import { UploadSummary } from "@/components/upload/upload-summary" +import { Button } from "@/components/ui/button" +import { AlertCircle, CheckCircle, Loader2 } from "lucide-react" +import type { ColumnMapping } from "@/lib/upload-schemas" + +type Step = "upload" | "preview" | "complete" + +interface PreviewData { + detectedSchema: string | null + detectedSchemaLabel: string | null + confidence: number + scores: Array<{ schemaId: string; label: string; score: number }> + columns: ColumnMapping[] + sampleRows: Record[] + totalRows: number + warnings: string[] + errors: string[] +} + +interface CommitResult { + inserted: number + skipped: number + errors: Array<{ row: number; message: string }> + uploadId: number +} + +interface HistoryEntry { + id: number + filename: string + fileType: string + rowsInserted: number + status: string + uploadedAt: string +} + +export default function UploadPage() { + const router = useRouter() + const [step, setStep] = useState("upload") + const [file, setFile] = useState(null) + const [preview, setPreview] = 
useState(null) + const [columns, setColumns] = useState([]) + const [selectedSchema, setSelectedSchema] = useState(null) + const [selectedSchemaLabel, setSelectedSchemaLabel] = useState(null) + const [commitResult, setCommitResult] = useState(null) + const [loading, setLoading] = useState(false) + const [error, setError] = useState(null) + const [recentUploads, setRecentUploads] = useState([]) + + // Fetch recent uploads on mount + useEffect(() => { + fetch("/api/admin/upload/history?pageSize=5") + .then((r) => r.json()) + .then((d) => setRecentUploads(d.data ?? [])) + .catch(() => {}) + }, []) + + const handleFile = useCallback(async (f: File) => { + setFile(f) + setError(null) + setLoading(true) + + try { + const formData = new FormData() + formData.append("file", f) + + const res = await fetch("/api/admin/upload/preview", { + method: "POST", + body: formData, + }) + + if (!res.ok) { + const data = await res.json() + setError(data.error ?? "Preview failed") + setLoading(false) + return + } + + const data: PreviewData = await res.json() + setPreview(data) + setColumns(data.columns) + setSelectedSchema(data.detectedSchema) + setSelectedSchemaLabel(data.detectedSchemaLabel) + setStep("preview") + } catch (err) { + setError((err as Error).message) + } finally { + setLoading(false) + } + }, []) + + const handleSchemaOverride = useCallback( + async (schemaId: string, label: string) => { + if (!file) return + setSelectedSchema(schemaId) + setSelectedSchemaLabel(label) + // Re-run preview with overridden schema detection isn't needed — + // the column mapping will be recalculated on the server during commit. + // For the preview, we just re-map columns client-side. + // This is a simplification — the preview route already returns all columns. 
+ }, + [file] + ) + + const handleCommit = useCallback(async () => { + if (!file || !selectedSchema) return + setLoading(true) + setError(null) + + try { + const formData = new FormData() + formData.append("file", file) + formData.append("schemaId", selectedSchema) + formData.append("columnMapping", JSON.stringify(columns)) + + const res = await fetch("/api/admin/upload/commit", { + method: "POST", + body: formData, + }) + + if (!res.ok) { + const data = await res.json() + setError(data.error ?? "Upload failed") + setLoading(false) + return + } + + const data: CommitResult = await res.json() + setCommitResult(data) + setStep("complete") + } catch (err) { + setError((err as Error).message) + } finally { + setLoading(false) + } + }, [file, selectedSchema, columns]) + + const resetWizard = useCallback(() => { + setStep("upload") + setFile(null) + setPreview(null) + setColumns([]) + setSelectedSchema(null) + setSelectedSchemaLabel(null) + setCommitResult(null) + setError(null) + }, []) + + const headers = preview?.sampleRows?.[0] ? Object.keys(preview.sampleRows[0]) : [] + const hasRequiredErrors = (preview?.errors?.length ?? 0) > 0 + const stepLabels = ["Upload", "Preview & Map", "Complete"] + const stepIndex = step === "upload" ? 0 : step === "preview" ? 1 : 2 + + return ( +
+
+

Upload Data

+

+ Drop a PDP, course, or prediction file — we'll detect the format + automatically +

+
+ + {/* Step indicator */} +
+ {stepLabels.map((label, i) => ( + + {i > 0 && } + + {i < stepIndex ? `${label} ✓` : `${i + 1}. ${label}`} + + + ))} +
+ + {/* Error banner */} + {error && ( +
+ + {error} +
+ )} + + {/* Step 1: Upload */} + {step === "upload" && ( +
+ + {loading && ( +
+ Parsing file… +
+ )} + + {/* Recent uploads */} + {recentUploads.length > 0 && ( +
+

Recent Uploads

+
+ {recentUploads.map((u) => ( +
+
+ {u.filename} + + {u.fileType} + +
+
+ {u.rowsInserted} rows + + {new Date(u.uploadedAt).toLocaleDateString()} + + +
+
+ ))} +
+
+ )} +
+ )} + + {/* Step 2: Preview & Map */} + {step === "preview" && preview && ( +
+ {/* Detection banner */} + {preview.confidence >= 0.6 ? ( +
+
+ + {selectedSchemaLabel} + + — {file?.name} — {preview.totalRows} rows,{" "} + {columns.filter((c) => c.status === "matched").length}/ + {columns.length} columns matched + +
+ +
+ ) : ( +
+
+ + + Couldn't confidently detect the file type + +
+

+ {file?.name} has {columns.length} columns — it partially matches + multiple schemas. Please select the correct type: +

+
+ {preview.scores + .filter((s) => s.score > 0.1) + .map((s) => ( + + ))} +
+
+ )} + + {/* Column mapping */} +
+

Column Mapping

+ +
+ + {/* Data preview */} + + + {/* Validation summary + actions */} +
+
+ + 📊 {preview.totalRows} rows + + + ✓ {columns.filter((c) => c.status === "matched").length} matched + + {columns.filter((c) => c.status === "unmapped").length > 0 && ( + + ⚠{" "} + {columns.filter((c) => c.status === "unmapped").length}{" "} + unmapped + + )} + {hasRequiredErrors && ( + + ✗ {preview.errors.length} errors + + )} +
+
+ + +
+
+
+ )} + + {/* Step 3: Complete */} + {step === "complete" && commitResult && ( + router.push("/admin/upload/history")} + /> + )} +
+ ) +} +``` + +- [ ] **Step 2: Verify build passes** + +```bash +cd codebenders-dashboard +npm run build +``` + +Expected: Build succeeds. `/admin/upload` should appear in the route list. + +- [ ] **Step 3: Commit** + +```bash +git add codebenders-dashboard/app/admin/upload/page.tsx +git commit -m "feat(upload): 3-step upload wizard page with auto-detection" +``` + +--- + +## Task 9: Upload History Page + +**Files:** +- Create: `codebenders-dashboard/app/admin/upload/history/page.tsx` + +- [ ] **Step 1: Create the upload history page** + +Create `codebenders-dashboard/app/admin/upload/history/page.tsx`: + +```tsx +"use client" + +import { useState, useEffect, useCallback } from "react" +import { useRouter } from "next/navigation" +import { Button } from "@/components/ui/button" +import { Loader2 } from "lucide-react" + +interface UploadEntry { + id: number + userEmail: string + filename: string + fileType: string + rowsInserted: number + rowsSkipped: number + errorCount: number + status: "success" | "partial" | "failed" + uploadedAt: string +} + +const FILE_TYPE_COLORS: Record = { + pdp_cohort_ar: "bg-green-50 text-green-700", + pdp_cohort_submission: "bg-green-50 text-green-700", + course_ar: "bg-blue-50 text-blue-700", + course_submission: "bg-blue-50 text-blue-700", + ml_predictions: "bg-purple-50 text-purple-700", +} + +const FILE_TYPE_LABELS: Record = { + pdp_cohort_ar: "PDP Cohort AR", + pdp_cohort_submission: "PDP Cohort Submission", + course_ar: "Course AR", + course_submission: "Course Submission", + ml_predictions: "ML Predictions", +} + +const STATUS_STYLES: Record = { + success: "bg-green-100 text-green-700", + partial: "bg-amber-100 text-amber-700", + failed: "bg-red-100 text-red-700", +} + +export default function UploadHistoryPage() { + const router = useRouter() + const [entries, setEntries] = useState([]) + const [total, setTotal] = useState(0) + const [page, setPage] = useState(1) + const [loading, setLoading] = useState(true) + const pageSize = 
20

  const fetchHistory = useCallback(async (p: number) => {
    setLoading(true)
    try {
      const res = await fetch(
        `/api/admin/upload/history?page=${p}&pageSize=${pageSize}`
      )
      const data = await res.json()
      setEntries(data.data ?? [])
      setTotal(data.total ?? 0)
    } catch {
      setEntries([])
    } finally {
      setLoading(false)
    }
  }, [])

  useEffect(() => {
    fetchHistory(page)
  }, [page, fetchHistory])

  const pageCount = Math.ceil(total / pageSize)

  const statusCounts = entries.reduce(
    (acc, e) => {
      acc[e.status] = (acc[e.status] ?? 0) + 1
      return acc
    },
    {} as Record<string, number>
  )

  return (
    <div className="mx-auto max-w-5xl space-y-6 p-6">
      <div className="flex items-center justify-between">
        <div>
          <h1 className="text-2xl font-bold">Upload History</h1>
          <p className="text-sm text-muted-foreground">
            All data file uploads by admin and IR users
          </p>
        </div>
        <Button onClick={() => router.push("/admin/upload")}>+ New Upload</Button>
      </div>

      {/* Stat cards */}
      <div className="grid grid-cols-4 gap-4">
        <div className="rounded-lg border p-4">
          <div className="text-xs text-muted-foreground">Total Uploads</div>
          <div className="text-2xl font-bold">{total}</div>
        </div>
        <div className="rounded-lg border p-4">
          <div className="text-xs text-muted-foreground">Successful</div>
          <div className="text-2xl font-bold text-green-600">
            {statusCounts.success ?? 0}
          </div>
        </div>
        <div className="rounded-lg border p-4">
          <div className="text-xs text-muted-foreground">Partial</div>
          <div className="text-2xl font-bold text-amber-600">
            {statusCounts.partial ?? 0}
          </div>
        </div>
        <div className="rounded-lg border p-4">
          <div className="text-xs text-muted-foreground">Failed</div>
          <div className="text-2xl font-bold text-red-600">
            {statusCounts.failed ?? 0}
          </div>
        </div>
      </div>

      {/* Table */}
      {loading ? (
        <div className="flex items-center gap-2 py-8 text-sm text-muted-foreground">
          <Loader2 className="h-4 w-4 animate-spin" /> Loading…
        </div>
      ) : entries.length === 0 ? (
        <div className="rounded-lg border py-12 text-center text-sm text-muted-foreground">
          No uploads yet. Click &quot;+ New Upload&quot; to get started.
        </div>
      ) : (
        <div className="overflow-x-auto rounded-lg border">
          <table className="w-full text-sm">
            <thead>
              <tr className="border-b bg-muted/50 text-left">
                <th className="px-3 py-2 font-medium">File</th>
                <th className="px-3 py-2 font-medium">Type</th>
                <th className="px-3 py-2 text-right font-medium">Inserted</th>
                <th className="px-3 py-2 text-right font-medium">Skipped</th>
                <th className="px-3 py-2 text-right font-medium">Errors</th>
                <th className="px-3 py-2 font-medium">Status</th>
                <th className="px-3 py-2 font-medium">Uploaded By</th>
                <th className="px-3 py-2 font-medium">Date</th>
              </tr>
            </thead>
            <tbody>
              {entries.map((e, i) => (
                <tr key={e.id ?? i} className="border-b last:border-0">
                  <td className="px-3 py-2 font-medium">{e.filename}</td>
                  <td className="px-3 py-2">
                    <span
                      className={`rounded px-2 py-0.5 text-xs ${FILE_TYPE_COLORS[e.fileType] ?? "bg-muted"}`}
                    >
                      {FILE_TYPE_LABELS[e.fileType] ?? e.fileType}
                    </span>
                  </td>
                  <td className="px-3 py-2 text-right">
                    {e.rowsInserted.toLocaleString()}
                  </td>
                  <td className="px-3 py-2 text-right">{e.rowsSkipped}</td>
                  <td className="px-3 py-2 text-right">{e.errorCount}</td>
                  <td className="px-3 py-2">
                    <span className={`rounded px-2 py-0.5 text-xs ${STATUS_STYLES[e.status]}`}>
                      {e.status}
                    </span>
                  </td>
                  <td className="px-3 py-2">{e.userEmail}</td>
                  <td className="px-3 py-2">
                    {new Date(e.uploadedAt).toLocaleDateString()}
                  </td>
                </tr>
              ))}
            </tbody>
          </table>
        </div>
      )}

      {/* Pagination */}
      {pageCount > 1 && (
        <div className="flex items-center justify-between text-sm">
          <span className="text-muted-foreground">
            Showing {(page - 1) * pageSize + 1}–
            {Math.min(page * pageSize, total)} of {total} uploads
          </span>
          <div className="flex gap-1">
            <Button
              variant="outline"
              size="sm"
              disabled={page <= 1}
              onClick={() => setPage(page - 1)}
            >
              Previous
            </Button>
            {Array.from({ length: pageCount }, (_, i) => i + 1)
              .slice(0, 5)
              .map((p) => (
                <Button
                  key={p}
                  size="sm"
                  variant={p === page ? "default" : "outline"}
                  onClick={() => setPage(p)}
                >
                  {p}
                </Button>
              ))}
            <Button
              variant="outline"
              size="sm"
              disabled={page >= pageCount}
              onClick={() => setPage(page + 1)}
            >
              Next
            </Button>
          </div>
        </div>
      )}
    </div>
+ ) +} +``` + +- [ ] **Step 2: Verify build passes** + +```bash +cd codebenders-dashboard +npm run build +``` + +Expected: Build succeeds with `/admin/upload/history` in the route list. + +- [ ] **Step 3: Commit** + +```bash +git add codebenders-dashboard/app/admin/upload/history/page.tsx +git commit -m "feat(upload): upload history page with stats and pagination" +``` + +--- + +## Task 10: Nav Header Update + +**Files:** +- Modify: `codebenders-dashboard/components/nav-header.tsx:1-77` + +- [ ] **Step 1: Add Admin link to NavHeader** + +In `codebenders-dashboard/components/nav-header.tsx`, replace the `NAV_LINKS` constant and update the component to conditionally show the Admin link: + +Replace: + +```typescript +const NAV_LINKS = [ + { href: "/", label: "Dashboard" }, + { href: "/courses", label: "Courses" }, + { href: "/students", label: "Students" }, + { href: "/query", label: "Query" }, +] +``` + +With: + +```typescript +const NAV_LINKS = [ + { href: "/", label: "Dashboard" }, + { href: "/courses", label: "Courses" }, + { href: "/students", label: "Students" }, + { href: "/query", label: "Query" }, + { href: "/admin/upload", label: "Admin", roles: ["admin", "ir"] as const }, +] +``` + +Then update the nav rendering to filter by role. Replace the `