Overview
Allow authorized users (admin, ir) to upload institutional data files directly from the dashboard — PDP cohort files, AR files, student-level CSVs, and course enrollment CSVs — without needing direct database or server access.
Supported File Types
| File Type | Format | Target Table |
| --- | --- | --- |
| PDP cohort / student data | CSV | student_level_with_predictions |
| PDP AR files | CSV or Excel (.xlsx) | student_level_with_predictions |
| Course enrollment data | CSV | course_enrollments |
| ML predictions (pre-computed) | CSV | student_level_with_predictions, course_predictions |
UI: /admin/upload page
Access: admin and ir roles only
Flow
- User selects file type from a dropdown (PDP Cohort, AR File, Course Data, Custom CSV)
- Drag-and-drop or file picker (accepts .csv, .xlsx)
- Preview step — shows first 10 rows in a table with column mapping confirmation
- Column mapping UI — auto-detect known columns, allow user to remap unknowns
- Validation summary — row count, detected schema, any warnings (missing required columns, unexpected values)
- Confirm & Upload — streams file to server, inserts in batches
- Progress bar + completion summary (rows inserted, rows skipped, errors)
API Routes
POST /api/admin/upload/preview
- Accepts multipart form upload
- Parses first 50 rows
- Returns: detected columns, sample rows, schema match confidence, warnings
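The "schema match confidence" in the preview response could be computed as the share of a file type's required columns found among the detected headers. A minimal sketch, assuming this definition (the function name and the required-column lists are illustrative, not part of the spec):

```typescript
// Hypothetical confidence score for the preview response: the fraction of a
// file type's required columns that appear among the detected headers.
function schemaConfidence(detected: string[], required: string[]): number {
  const have = new Set(detected.map((c) => c.trim().toLowerCase()));
  const hits = required.filter((c) => have.has(c)).length;
  return required.length === 0 ? 1 : hits / required.length;
}

// e.g. a file missing one of three required columns scores ~0.67
console.log(schemaConfidence(["Student_ID", "cohort"], ["student_id", "cohort", "gpa"]));
```

A score below some threshold (say 1.0 for required columns) would drive the warnings list in the same response.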
POST /api/admin/upload/commit
- Accepts multipart form upload + column mapping JSON
- Streams CSV/Excel parsing server-side (avoid loading entire file in memory)
- Batch-inserts in chunks of 500 rows
- Returns: { inserted, skipped, errors[] }
- Uses upsert to be idempotent on re-upload
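The 500-row batching above can be sketched with a simple chunking helper (a hypothetical name; the real commit handler would feed each chunk to one upsert statement):

```typescript
// Hypothetical helper: split parsed rows into groups of `size` (500 per the
// spec) so each batch upsert stays bounded regardless of file size.
function chunk<T>(rows: T[], size = 500): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// 1,200 parsed rows -> batches of 500, 500, and 200
console.log(chunk(Array.from({ length: 1200 }, (_, i) => i)).map((b) => b.length));
```

Keeping batches bounded also makes the { inserted, skipped, errors[] } summary easy to accumulate per chunk.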
GET /api/admin/upload/history
- Returns log of past uploads: filename, type, rows inserted, timestamp, uploader user_id
Backend Considerations
- Excel parsing: use the xlsx or exceljs npm package for .xlsx support
- Large file handling: stream parse with csv-parse in async iterator mode; never buffer the full file
- Column normalization: trim whitespace, normalize header casing before mapping
- Schema validation: check required columns present for each file type; surface clear errors (not stack traces) to the UI
- Upload size limit: configure the Next.js bodyParser limit (suggest 50 MB)
- Role guard: enforce admin/ir via the x-user-role header on all upload routes
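The streaming constraint can be illustrated with Node's built-in readline as a stand-in for csv-parse's async-iterator mode (a self-contained sketch, not the real parser: csv-parse would additionally handle quoting and embedded newlines):

```typescript
import { createInterface } from "node:readline";
import { Readable } from "node:stream";

// Rows are consumed one at a time from the stream, so the full file is never
// held in memory; csv-parse's async iterator follows the same pattern.
async function countDataRows(source: Readable): Promise<number> {
  let rows = -1; // offsets the header line
  for await (const line of createInterface({ input: source })) {
    if (line.trim() !== "") rows += 1;
  }
  return Math.max(rows, 0);
}

const csv = "student_id,gpa\n1,3.2\n2,2.8\n";
countDataRows(Readable.from([csv])).then((n) => console.log(n)); // 2 data rows
```

In the real commit route, the body of the loop would validate and map each row, then hand it to the batch upserter instead of just counting.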
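Column normalization before mapping might look like this; the exact rules (collapsing spaces and hyphens to underscores) are an assumption:

```typescript
// Hypothetical normalizer: trim, lowercase, and unify separators so
// "Student ID", "student-id", and " STUDENT_ID " all map to the same key.
function normalizeHeader(raw: string): string {
  return raw.trim().toLowerCase().replace(/[\s\-]+/g, "_");
}

console.log(normalizeHeader("  Student ID ")); // student_id
```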
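The role guard on the upload routes might reduce to a small predicate over the x-user-role header (names here are illustrative; the real check would run in route middleware before any file is accepted):

```typescript
// Hypothetical guard matching the admin/ir rule above; any other role
// (leadership, advisor, faculty) or a missing header is rejected with 403.
const UPLOAD_ROLES = new Set(["admin", "ir"]);

function canUpload(roleHeader: string | undefined): boolean {
  return roleHeader !== undefined && UPLOAD_ROLES.has(roleHeader.trim().toLowerCase());
}

console.log(canUpload("admin"), canUpload("advisor"), canUpload(undefined)); // true false false
```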
Upload History Table (optional migration)
```sql
CREATE TABLE public.upload_history (
  id BIGSERIAL PRIMARY KEY,
  user_id UUID REFERENCES auth.users(id),
  filename TEXT NOT NULL,
  file_type TEXT NOT NULL,
  rows_inserted INT,
  rows_skipped INT,
  error_count INT,
  status TEXT CHECK (status IN ('success', 'partial', 'failed')),
  uploaded_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
```
Acceptance Criteria
- Only admin and ir roles can access /admin/upload (leadership, advisor, and faculty get 403)